Sustainably scale the computational power of our planet

From Richard Sutton, "The Bitter Lesson"

The biggest lesson that can be read from 70 years of AI research is that general methods that leverage computation are ultimately the most effective, and by a large margin. [...] breakthrough progress eventually arrives by an opposing approach based on scaling computation by search and learning.

From Dario Amodei and Danny Hernandez, "AI and Compute" (OpenAI blog)

Since 2012, the amount of compute used in the largest AI training runs has been increasing exponentially with a 3.4-month doubling time (by comparison, Moore’s Law had a 2-year doubling period). By 2019 it had grown by 300,000x (a 2-year doubling period would yield only a 7x increase).
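The gap between those two doubling times can be checked with a quick calculation. The sketch below assumes a span of roughly 62 months between AlexNet (late 2012) and the largest 2017–18 training runs; the post's own dates and figures are approximate, so the exact span is an assumption here.

```python
def growth_factor(months: float, doubling_time_months: float) -> float:
    """Total exponential growth after `months`, given a fixed doubling time."""
    return 2 ** (months / doubling_time_months)

# Assumed span: ~62 months between AlexNet (late 2012) and the
# largest 2017-18 runs. The post's figures are rough, so this is
# an illustrative choice, not a quoted number.
span = 62.0

ai_growth = growth_factor(span, 3.4)      # 3.4-month doubling (AI training compute)
moore_growth = growth_factor(span, 24.0)  # 2-year doubling (Moore's Law)

print(f"3.4-month doubling over {span:.0f} months: ~{ai_growth:,.0f}x")
print(f"24-month doubling over {span:.0f} months: ~{moore_growth:.0f}x")
```

With a 3.4-month doubling time, ~62 months yields roughly 300,000x growth, while a 2-year doubling time over the same span yields only a single-digit multiple, which is the contrast the post draws.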

Scaling compute in an environmentally unsustainable way means we are going to start hitting bottlenecks.

The cost of compute will become more and more prohibitive as fossil fuels get harder and harder to extract.

We will be forced to trade off the benefits of compute against its negative environmental impact.

Non-sustainable compute is scarcity.

Sustainable compute is abundance.