AI is booming – so is its carbon footprint

AI uses more energy than other forms of computing, and training a single model can gobble up more electricity than 100 US homes use in an entire year

by Bloomberg
April 16, 2023

Artificial intelligence (AI) has become the tech industry’s shiny new toy, with expectations it will revolutionise trillion-dollar industries from retail to medicine. But the creation of every new chatbot and image generator requires a lot of electricity, which means the technology may be responsible for a massive amount of planet-warming carbon emissions.

Microsoft, Alphabet’s Google and ChatGPT maker OpenAI use cloud computing that relies on thousands of chips inside servers in massive data centres across the globe to train AI algorithms called models, which analyse data to help them “learn” to perform tasks.

AI uses more energy than other forms of computing, and training a single model can consume more electricity than 100 US homes use in an entire year. The emissions also vary widely depending on what type of power plants supply that electricity: a data centre that draws its power from a coal- or natural gas-fired plant will be responsible for much higher emissions than one that draws it from solar or wind farms.

While researchers have tallied the emissions from the creation of individual models, and some companies have provided data about their energy use, there is no overall estimate of the total amount of power the technology consumes. Sasha Luccioni, a researcher at AI company Hugging Face, wrote a paper quantifying the carbon impact of her company’s Bloom model, a competitor of OpenAI’s GPT-3. She has also tried to estimate the same for OpenAI’s viral hit ChatGPT, based on a limited set of publicly available data.

Greater transparency

Researchers like Luccioni say we need transparency on the power usage and emissions of AI models.
Greater transparency might also bring more scrutiny; the crypto industry could provide a cautionary tale.

Training GPT-3, a single general-purpose AI programme that can generate language and has many different uses, took 1.287 gigawatt hours, according to a research paper published in 2021 – about as much electricity as 120 US homes consume in a year. That training generated 502 tonnes of carbon emissions, according to the same paper. And that is for just one programme, or model.

While training a model carries a huge upfront power cost, researchers found that in some cases it amounts to only about 40 per cent of the power burned by the actual use of the model, with billions of requests pouring in for popular programmes. The models are also growing: OpenAI’s GPT-3 uses 175 billion parameters, or variables, that the system has learned through its training and retraining; its predecessor used just 1.5 billion.

Another relative measure comes from Google, where researchers found that AI made up 10 to 15 per cent of the company’s total electricity consumption, which was 18.3 terawatt hours in 2021. That would mean Google’s AI burns around 2.3 terawatt hours annually.

Net-zero pledges

While the models are getting larger in many cases, AI companies are also constantly working on improvements that make them run more efficiently. Microsoft, Google and Amazon all have carbon-negative or carbon-neutral pledges. Google said in a statement that it is pursuing net-zero emissions across its operations by 2030, with a goal to run its offices and data centres entirely on carbon-free energy.

OpenAI cited work it has done to make the application programming interface for ChatGPT more efficient, cutting electricity usage and prices for customers. “We take our responsibility to stop and reverse climate change very seriously, and we think a lot about how to make the best use of our computing power,” an OpenAI spokesperson said in a statement.
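The quoted figures can be sanity-checked with a little arithmetic. The sketch below does so in Python; the roughly 10,700 kWh-per-year average US household consumption is an assumption for illustration (it is not stated in the article), while the 1.287 GWh, 18.3 TWh and 10-15 per cent figures come from the reporting above.

```python
# Sanity-check of the electricity figures quoted in the article.

GPT3_TRAINING_MWH = 1_287       # 1.287 GWh, per the 2021 research paper
HOME_KWH_PER_YEAR = 10_700      # assumed average US household use (not from the article)

# How many homes' annual consumption does one training run equal?
homes_equivalent = GPT3_TRAINING_MWH * 1_000 / HOME_KWH_PER_YEAR
print(round(homes_equivalent))  # -> 120, matching the "120 US homes" comparison

GOOGLE_TOTAL_TWH = 18.3         # Google's total 2021 electricity consumption
AI_SHARE_LOW, AI_SHARE_HIGH = 0.10, 0.15  # AI's reported share of that total

ai_twh_low = GOOGLE_TOTAL_TWH * AI_SHARE_LOW
ai_twh_high = GOOGLE_TOTAL_TWH * AI_SHARE_HIGH
midpoint = (ai_twh_low + ai_twh_high) / 2
print(f"{ai_twh_low:.2f}-{ai_twh_high:.2f} TWh, midpoint {midpoint:.1f}")
# -> 1.83-2.75 TWh, midpoint 2.3 -- consistent with "around 2.3 terawatt hours"
```

The midpoint of the 10-15 per cent range is how the article's "around 2.3 terawatt hours" figure falls out of Google's disclosed total.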
Microsoft noted it is buying renewable energy and taking other steps to meet its previously announced goal of being carbon negative by 2030. “As part of our commitment to create a more sustainable future, Microsoft is investing in research to measure the energy use and carbon impact of AI while working on ways to make large systems more efficient, in both training and application,” the company said in a statement.

There are ways to make AI run more efficiently. Since AI training can happen at any time, developers or data centres could schedule it for times when power is cheaper or in surplus, making their operations greener, said Ben Hertz-Shargel of energy consultancy Wood Mackenzie.

One of the bigger mysteries in AI is the total accounting of carbon emissions associated with the chips being used. Nvidia, the biggest manufacturer of graphics processing units, said its chips can complete AI tasks more quickly, making them more efficient overall. But while Nvidia has disclosed its direct emissions and the indirect ones related to its energy use, it has not revealed all of the emissions it is indirectly responsible for, said Luccioni, who asked for that data for her research. When Nvidia does share that information, Luccioni thinks it will turn out that GPUs burn up as much power as a small country. “It’s going to be bananas,” she said.