Run AI and edge workloads on processors


Partner Content For more than a decade, data center power and cooling have been challenges for businesses trying to meet their computing needs while keeping costs down.


These challenges have become more acute as the computing environment has extended beyond the data center into the cloud and out to the edge, and as workloads have fragmented, ranging from AI and HPC to microservices and the Internet of Things.

The rapid innovation and adoption of AI – especially in this new era of generative AI (GenAI) – will have a significant impact. Data centers in the United States used about 3 percent of the nation's power in 2022, a figure that could triple by the end of the decade.

A May report by the Electric Power Research Institute (EPRI) predicted that by 2030, data centers could consume up to 9 percent of US electricity generation, driven largely by AI workloads. According to EPRI's Powering Intelligence: Analyzing Artificial Intelligence and Datacenter Energy Consumption report, AI queries require about 10 times the electricity of traditional internet searches, and generating music, photos, and video requires even more. By comparison, a traditional Google search uses about 0.3 watt-hours (Wh), while a question on OpenAI's ChatGPT requires about 2.9 Wh.
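The scale of that gap is easy to check with back-of-the-envelope arithmetic. The sketch below uses only the per-query figures cited above; the million-queries-per-day volume is a hypothetical assumption for illustration, not a number from the report.

```python
# Per-query energy figures cited in the EPRI report (watt-hours)
google_search_wh = 0.3   # traditional Google search
chatgpt_query_wh = 2.9   # question posed to OpenAI's ChatGPT

# Ratio between the two: roughly the "about 10 times" in the report
ratio = chatgpt_query_wh / google_search_wh
print(f"ChatGPT query ≈ {ratio:.1f}x the energy of a search")  # ≈ 9.7x

# Hypothetical scale-up: one million AI queries per day for a year
daily_queries = 1_000_000
annual_kwh = chatgpt_query_wh * daily_queries * 365 / 1000  # Wh -> kWh
print(f"≈ {annual_kwh:,.0f} kWh per year at that volume")   # ≈ 1,058,500 kWh
```

Even at that modest hypothetical volume, the difference between 0.3 Wh and 2.9 Wh per query compounds into gigawatt-hour-scale annual consumption, which is the dynamic behind EPRI's 2030 projection.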
