OpenAI CEO Sam Altman Defends AI Resource Usage, Arguing Humans Use Energy Too

9News

Description

OpenAI CEO Sam Altman has downplayed concerns about AI's energy cost, arguing it takes a lot of energy to train a human too.

Summary

OpenAI CEO Sam Altman addressed concerns about the energy usage of AI systems, comparing it to the substantial energy required for human development and activity. Altman noted that significant computational power is needed to train and operate generative AI models such as ChatGPT, which demand considerable electricity and water. He suggested that moving quickly to energy sources like nuclear, wind, and solar is essential to address these concerns. While Altman dismissed fears about water usage as exaggerated, he acknowledged the substantial electricity demands AI imposes globally, underscoring the need for sustainable energy solutions.

Body

OpenAI CEO Sam Altman has downplayed concerns about AI's energy cost, arguing it takes a lot of energy to train a human too.

Altman, who appeared at a Q&A session hosted by newspaper The Indian Express this week, pushed back on comparisons between humans and artificial intelligence.

"One of the things that is always unfair in this comparison is that people talk about how much energy it takes to train an AI model relative to how much it takes for a human to do an inference query," he said.

"But it also takes a lot of energy to train a human. It takes like 20 years of life and all of the food you eat during that time before you get smart.

"Not only that, it took the very widespread evolution of the 100 billion people that have ever lived and learned not to get eaten by predators and learned how to figure out science and whatever, to produce you. Then you took whatever you took."

According to US research, a single ChatGPT query consumes nearly 10 times as much energy as a typical Google search, while it takes about half a litre of water to process 20 to 50 queries.

Generative AI models need massive computational power to train and run. Data centres, which contain buildings full of computer servers, churn through huge amounts of electricity all over the globe.

The world's data centres consumed more electricity than all of Australia in 2022. Data centres consumed about 460 terawatt-hours (TWh) in 2022 alone, according to the International Energy Agency (IEA), while all of Australia consumed less than 300TWh that same year.

Altman argued concerns about water usage were "totally fake" but conceded "we used to do evaporative cooling in data centres".

"Now that we don't do that," he said.

"You see these things on the internet where [a post says] 'don't use ChatGPT, it's 17 gallons of water for each query' or whatever. This is completely untrue, totally insane, no connection to reality.

"What is fair though is the energy consumption, not per query, but in total, because the world is now using so much AI. It is real and we need to move towards nuclear or wind and solar very quickly."
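As a rough illustration only, the figures quoted above can be turned into back-of-the-envelope arithmetic. The numbers (460TWh for data centres, under 300TWh for Australia, half a litre of water per 20 to 50 queries) come from the article; the calculation itself is just a sketch, not an official estimate.

```python
# Back-of-the-envelope arithmetic using the figures quoted in the article.
# These inputs are the article's numbers, not independently verified data.

data_centre_twh = 460   # global data-centre electricity use, 2022 (IEA, per article)
australia_twh = 300     # upper bound on Australia's 2022 consumption (per article)

# Data centres used at least ~1.5x Australia's national consumption.
ratio = data_centre_twh / australia_twh
print(f"Data centres vs Australia: at least {ratio:.2f}x")

# Half a litre of water covers 20 to 50 queries, giving a per-query range.
litres = 0.5
per_query_low = litres / 50    # fewest litres per query
per_query_high = litres / 20   # most litres per query
print(f"Water per query: {per_query_low:.3f} to {per_query_high:.3f} litres")
```

On these figures, each query accounts for roughly 0.01 to 0.025 litres of water, far below the "17 gallons per query" claim Altman disputes.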