OpenAI CEO Sam Altman Defends AI Resource Usage, Arguing Humans Use Energy Too

9News



Description

OpenAI CEO Sam Altman has downplayed concerns about AI's energy cost, arguing it takes a lot of energy to train a human too.

Summary

The article highlights OpenAI CEO Sam Altman's defense of AI's substantial energy consumption by comparing it to the energy needed to 'train' a human being over a lifetime. Altman minimizes concerns about AI's environmental impact, emphasizing the need for cleaner energy sources like nuclear or renewables due to growing AI usage. The piece touches on the energy demands of data centers and Altman's dismissal of exaggerated water usage claims associated with AI. While the article illustrates AI's significant energy footprint, it provides limited insights into how this may relate to existential or catastrophic AI risks, focusing mainly on energy efficiency and sustainability.

Body

OpenAI CEO Sam Altman has downplayed concerns about AI's energy cost, arguing it takes a lot of energy to train a human too.

Altman, who appeared at a Q&A session hosted by newspaper The Indian Express this week, pushed back on comparisons between humans and artificial intelligence.

"One of the things that is always unfair in this comparison is that people talk about how much energy it takes to train an AI model relative to how much it takes for a human to do an inference query," he said.

"But it also takes a lot of energy to train a human.

"It takes like 20 years of life and all of the food you eat during that time before you get smart.

"Not only that, it took the very widespread evolution of the 100 billion people that have ever lived and learned not to get eaten by predators and learned how to figure out science and whatever, to produce you.

"Then you took whatever you took."

According to US research, a single ChatGPT query consumes nearly 10 times as much energy as a typical Google search, while it takes about half a litre of water to process 20 to 50 queries.

Generative AI models need massive computational power to train and run. Data centres, which contain buildings full of computer servers, churn through huge amounts of electricity all over the globe.

The world's data centres consumed more electricity than all of Australia in 2022: about 460 terawatt-hours (TWh), according to the International Energy Agency (IEA), while all of Australia consumed less than 300TWh that same year.

Altman argued concerns about water usage were "totally fake" but conceded "we used to do evaporative cooling in data centres".

"Now we don't do that," he said.

"You see these things on the internet where [a post says] 'don't use ChatGPT, it's 17 gallons of water for each query' or whatever.

"This is completely untrue, totally insane, no connection to reality.

"What is fair though is the energy consumption, not per query, but in total because the world is now using so much AI.

"It is real and we need to move towards nuclear or wind and solar very quickly."
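For readers who want to sanity-check these comparisons, they reduce to simple arithmetic. The sketch below uses only the figures quoted in the article (the IEA's 460TWh data-centre total, the sub-300TWh Australian total, and the half-litre-per-20-to-50-queries water estimate); the litres-per-US-gallon conversion is the one external constant.

```python
# Back-of-envelope check of the figures quoted in the article.
GALLON_L = 3.78541  # litres per US gallon (standard conversion)

# Global data centres vs all of Australia, 2022 (TWh, per the IEA figures cited)
data_centres_twh = 460
australia_twh = 300  # the article says "less than 300TWh", so this is an upper bound
print(f"Data centres / Australia: at least {data_centres_twh / australia_twh:.2f}x")

# Water per ChatGPT query: about 0.5 L per 20-50 queries (per the article)
low_ml = 0.5 / 50 * 1000   # 10 mL per query
high_ml = 0.5 / 20 * 1000  # 25 mL per query
print(f"Cited water use: {low_ml:.0f}-{high_ml:.0f} mL per query")

# The viral "17 gallons per query" claim Altman disputes
viral_ml = 17 * GALLON_L * 1000
print(f"Viral claim: {viral_ml:,.0f} mL per query "
      f"(over {viral_ml / high_ml:,.0f}x the cited upper bound)")
```

On the article's own numbers, the "17 gallons" figure works out to roughly 64 litres per query, thousands of times the 10-25 millilitres implied by the research the article cites.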