Problem: training one GPT-3 model emits as much carbon as five cars in their lifetimes

Solution: solar panels in the Arctic Ocean (24/7 sun during the polar summer) to train AI models sustainably

Vision: sustainably scale the computational power of our planet

Mission: rent out GPUs at a competitive price point with a 100% renewable energy guarantee

- Roadmap
- YC Application
- Vision


🎯 Problem: training one GPT-3 model emits as much carbon as five cars in their lifetimes

Let's connect with people who feel/understand the problem.

There is also a cryptocurrency energy problem, but we feel more connected to artificial intelligence, so we will focus our research there for now.

📝 Energy and Policy Considerations for Deep Learning in NLP (Strubell et al., 2019)

The most-cited paper (669 citations at the time of writing) on the energy consumption of NLP.

They apply Power Usage Effectiveness (PUE): with the world-average PUE of 1.58 from the Uptime Institute Global Data Center Survey, about 37% of the power drawn during training (0.58 / 1.58) goes to overhead rather than compute, mainly cooling.

From the paper: "We estimate total power consumption as combined GPU, CPU and DRAM consumption, then multiply this by Power Usage Effectiveness (PUE), which accounts for the additional energy required to support the compute infrastructure (mainly cooling)."
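To make that estimate concrete, here is a minimal Python sketch of the calculation the paper describes: IT power (CPU + DRAM + GPUs) scaled by PUE and integrated over training time. The wattages, GPU count, and run length below are hypothetical placeholders for illustration, not measurements from the paper.

```python
# Sketch of the energy estimate described in Strubell et al. (2019):
# total draw = PUE * (CPU + DRAM + GPUs), integrated over training time.

PUE = 1.58  # world-average Power Usage Effectiveness (Uptime Institute survey)

def training_energy_kwh(hours, cpu_watts, dram_watts, gpu_watts, num_gpus):
    """Estimate total facility energy (kWh) for a training run."""
    it_watts = cpu_watts + dram_watts + num_gpus * gpu_watts
    return PUE * it_watts * hours / 1000.0

# Example: a hypothetical 8-GPU run for 72 hours (placeholder values).
energy = training_energy_kwh(hours=72, cpu_watts=150, dram_watts=50,
                             gpu_watts=300, num_gpus=8)
print(f"Estimated energy: {energy:,.0f} kWh")

# Share of facility power that is overhead (mainly cooling) at PUE 1.58:
print(f"Overhead share: {(PUE - 1) / PUE:.0%}")  # ~37%
```

Multiplying the resulting kWh by a grid carbon-intensity factor is what turns this energy figure into the CO2 estimates the paper reports.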