13 May 23
The enormous energy requirement of these brute-force statistical models is due to the following attributes (a rough back-of-envelope sketch follows the list):
- Requires millions or billions of training examples
- Requires many training cycles
- Requires retraining when presented with new information
- Requires many weights and lots of multiplication
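To give a feel for how these attributes add up, here is a hedged back-of-envelope sketch in Python. Every figure (parameter count, token count, accelerator efficiency, datacentre overhead) is an assumption chosen for illustration, not a number from the article:

```python
# Back-of-envelope training-energy estimate. A common approximation is
# ~6 FLOPs per parameter per training token; energy then follows from an
# assumed accelerator efficiency and datacentre overhead (PUE).
# All figures below are illustrative assumptions.

params = 175e9                  # model parameters (GPT-3-scale, assumed)
training_tokens = 300e9         # training tokens (assumed)
total_flops = 6 * params * training_tokens

flops_per_joule = 100e9         # assumed effective accelerator efficiency (~100 GFLOP/s per watt)
pue = 1.2                       # assumed datacentre power usage effectiveness

energy_joules = total_flops / flops_per_joule * pue
energy_kwh = energy_joules / 3.6e6
print(f"Rough training energy: {energy_kwh:,.0f} kWh")  # on the order of 1 GWh with these assumptions
```

Retraining on new information means paying a cost of this order again, which is why the number of training cycles matters as much as the model size.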
ChatGPT and other AI applications such as Midjourney have pushed “Artificial Intelligence” high on the hype cycle. In this article, I want to focus specifically on the energy cost of training and using applications like ChatGPT, what their widespread adoption could mean for global CO₂ emissions, and what we could do to limit these emissions.
Key points
- Training of large AI models is not the problem
- Large-scale use of large AI models would be unsustainable (illustrated in the sketch after this list)
- Renewables are not making AI more sustainable
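The first two points come down to scaling: training is a one-off energy cost, while inference energy grows linearly with the number of queries. The per-query energy and query volume below are assumed figures for illustration only, not data from the article:

```python
# Illustrative training-vs-inference comparison. None of these figures
# come from the article; they are assumptions to show the scaling.

training_energy_kwh = 1.3e6        # assumed one-off training cost (~1.3 GWh)
energy_per_query_kwh = 0.003       # assumed ~3 Wh of electricity per query
queries_per_day = 500e6            # assumed global daily query volume

inference_kwh_per_year = energy_per_query_kwh * queries_per_day * 365
print(f"Training (one-off):      {training_energy_kwh:,.0f} kWh")
print(f"Inference (per year):    {inference_kwh_per_year:,.0f} kWh")
print(f"Yearly use vs. training: {inference_kwh_per_year / training_energy_kwh:.0f}x")
```

With these assumptions, a single year of large-scale use dwarfs the training cost by two orders of magnitude, which is why widespread adoption, not training, dominates the emissions picture.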
On the need for low-carbon and sustainable computing and the path towards zero-carbon computing.
This website is a solar-powered, self-hosted version of Low-tech Magazine. It has been designed to radically reduce the energy use associated with accessing our content.
This paper describes how principles derived from degrowth can serve as a useful heuristic for designing an ICT system within energy limits. It does so by discussing the design choices behind https://solar.lowtechmagazine.com, an ongoing design research project that set out to build a 'low-tech website'. The research resulted in a design that is lightweight, tailored to older and lower-powered devices, powered by off-grid solar energy, and thus designed with energy scarcity in mind. The project shows that values and frameworks theorized within the Computing within Limits community are technically applicable to web development practice, but it also identifies hurdles to their more widespread applicability.
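As a rough illustration of what "designed with energy scarcity in mind" can mean in practice, here is a hedged sketch of an off-grid solar budget. The panel rating, sun hours, and server draw are assumed figures, not the project's actual specifications:

```python
# Back-of-envelope sketch of designing against an off-grid solar budget.
# Panel size, sun hours and server power draw are illustrative assumptions,
# not figures from the paper.

panel_watts = 50                 # assumed solar panel rating
sun_hours_per_day = 4            # assumed average useful sun hours
server_watts = 2                 # assumed draw of a small single-board server

daily_budget_wh = panel_watts * sun_hours_per_day        # energy harvested per day
server_use_wh = server_watts * 24                        # energy used by an always-on server

print(f"Daily solar budget: {daily_budget_wh} Wh")
print(f"Always-on server:   {server_use_wh} Wh "
      f"({server_use_wh / daily_budget_wh:.0%} of budget)")
# Whatever margin remains has to carry the site through cloudy days via the
# battery, which is why small pages and low server load matter in this design.
```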