For the last few years, generative AI has been all the proverbial rage and is forecast to grow tenfold over the coming years. One dark side of AI is that it is fundamentally antithetical to sustainability: ever-larger models, and the hardware needed to train and run them, demand substantial electricity. According to one report, ChatGPT requires "3-30 times more electricity than a Google search query and often leads to follow-up queries." Moreover, the same article points out that data centers in the United States, which are rapidly expanding to meet AI demand, are expected to account for 7.5% of all electricity used in the country, roughly as much as a third of all U.S. homes consume. Globally, within two years, AI might use as much electricity as Japan.
So, where will all this power come from? Sam Altman, the CEO of OpenAI, claims that substantial advances in nuclear fusion will be required, alongside a ramp-up of renewable energy sources. In the same article, AI companies argue that rising power demand is spurring efficiency gains in data centers, GPUs, and related hardware, while others counter that greater efficiency will only exacerbate demand, since cheaper computation invites more of it (the rebound effect known as Jevons paradox).