Sustainable Computing for AI Processing
TBMG-53017
05/01/2025
As artificial intelligence (AI) and high-performance computing (HPC) workloads continue to surge, traditional semiconductor technology is reaching its limits. Beyond needing more raw computing power, AI demands more electricity than the grid can readily supply. AI data centers alone are expected to consume up to 17 percent of U.S. electricity by 2030,(1) more than triple the amount used in 2023, largely driven by generative AI. A single ChatGPT query requires nearly 10 times as much electricity as a standard Google search.(2) These figures raise urgent concerns about sustainability, especially as Goldman Sachs has forecast a 160 percent increase in data center electricity usage by 2030.(2)
- "Sustainable Computing for AI Processing," Mobility Engineering, May 1, 2025.