
Artificial intelligence (AI) is poised to drive the next industrial revolution. Large language models alone attract millions of users weekly and process billions of prompts each day. Yet despite AI's proven utility across virtually every industry, the data centers powering these systems consume enormous amounts of electricity, placing considerable pressure on the electric grid.

As AI adoption accelerates, data center power consumption is projected to rise significantly. Meeting that demand by building new data centers is costly, both financially and in navigating regulatory requirements. At the same time, the rapid growth of AI computing is creating new challenges for energy providers trying to maintain grid stability.

Researchers at Carnegie Mellon University (CMU) are partnering with Bosch Research (Pittsburgh, PA) to explore how AI data centers can operate more efficiently by coordinating computing workloads with energy availability. Their project focuses on jointly optimizing AI job scheduling and energy use to reduce grid strain and increase renewable energy utilization.

“The world is going through a major AI revolution in regard to large models that are changing our daily lives,” says Guannan Qu, assistant professor of electrical and computer engineering at CMU. “These large models must run in data centers, which consume substantial energy. This, in turn, only feeds the interest to develop even more data centers to support additional models.”

Large AI workloads can place significant strain on electrical infrastructure. When major AI training jobs start or end, abrupt swings in energy demand can cause grid fluctuations that ripple out to nearby residents.

“AI labs run large training workloads, which causes frequency fluctuations on the grid because those workloads will use a lot of energy but then suddenly will use much less energy,” says Gauri Joshi, associate professor of electrical and computer engineering at CMU. “Energy providers occasionally need to bring in additional sources of energy, such as spinning up generators, to support increased energy demand. This can affect people living near data centers, raising their electricity bills. Reports of these cases are appearing in many states that have data centers, including Pennsylvania.”

To address these challenges, the research team is investigating how AI workloads can be scheduled more intelligently to work around peak energy demands. Instead of running computing tasks whenever servers are free, data centers could align workloads with periods when renewable energy is abundant.


“If you think about a data center that has a renewable energy element in it, such as a battery system, you want to schedule the AI jobs to match your renewable generation, which can be highly variable,” says Qu. “Ideally, you want to schedule more of your AI jobs to run when more renewable energy is available and defer non-urgent AI jobs for a later time when renewable energy is scarce.”

AI workloads vary widely in their size, duration, and urgency, presenting opportunities for more flexible scheduling strategies.

“AI training jobs often use hundreds of servers for a long period of time—sometimes for several days or even weeks at a time,” says Joshi. “Although these jobs are large-scale and consume large amounts of energy, they are somewhat flexible: you can pause the task and then resume it later. Conversely, AI inference jobs—such as sending a prompt to ChatGPT and waiting for a reply—consume less energy but are more time-sensitive. We’re taking time constraints into account when scheduling such inference jobs.”
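The distinction Joshi draws suggests a simple dispatch rule: serve time-sensitive inference jobs immediately, and admit flexible training jobs only while renewable supply covers their draw. A minimal sketch of that idea, with hypothetical job names and power figures (the researchers' actual scheduler and models are not described in this detail):

```python
from dataclasses import dataclass

@dataclass
class Job:
    name: str
    power_kw: float   # power draw while running (illustrative numbers)
    deferrable: bool  # True for training jobs, False for inference

def schedule(jobs, renewable_kw):
    """Greedy renewable-aware dispatch: inference jobs always run now;
    deferrable training jobs are admitted only while the renewable
    power budget covers them, smallest first; the rest are deferred."""
    run, defer = [], []
    budget = renewable_kw
    # Inference jobs are time-sensitive: run them immediately.
    for job in jobs:
        if not job.deferrable:
            run.append(job.name)
            budget -= job.power_kw
    # Training jobs are flexible: pause/defer any that exceed the surplus.
    for job in sorted((j for j in jobs if j.deferrable),
                      key=lambda j: j.power_kw):
        if job.power_kw <= budget:
            run.append(job.name)
            budget -= job.power_kw
        else:
            defer.append(job.name)
    return run, defer

jobs = [
    Job("chat-inference", 5, False),
    Job("train-small", 20, True),
    Job("train-large", 200, True),
]
running, deferred = schedule(jobs, renewable_kw=60)
# With 60 kW of renewable supply, the inference job and the small
# training job run; the large training job is deferred.
```

A production scheduler would also weigh job deadlines, checkpointing costs, and battery state, but the core trade-off, flexibility versus urgency, is the one above.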

To manage these complex scheduling decisions, the researchers are exploring machine learning techniques to predict energy availability and optimize the distribution of computing workloads.

Joshi explains: “From the energy system management perspective, if we want to incorporate renewables into the energy mix that is used to run data centers, we need a good prediction of how much energy is going to come in from the renewable sources. To do that prediction, we could use AI concepts like reinforcement learning.”
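Even before bringing in reinforcement learning, the prediction problem Joshi describes can be illustrated with a trivial baseline forecaster. The moving-average sketch below is a hypothetical stand-in for the learned predictors the team envisions; real forecasts would draw on weather data and far richer models:

```python
def forecast_renewables(history_kw, window=3):
    """Predict next-hour renewable output (kW) as the mean of the
    last `window` observations. A naive baseline standing in for
    the learned predictors (e.g. reinforcement learning) that the
    researchers describe."""
    recent = history_kw[-window:]
    return sum(recent) / len(recent)

# Hypothetical hourly solar readings over a morning ramp-up.
solar_kw = [0, 40, 120, 180, 160]
next_hour = forecast_renewables(solar_kw)  # mean of the last 3 hours
```

The scheduler then uses such a forecast as its renewable power budget for the upcoming interval; the better the forecast, the less the data center must fall back on grid power or on-site batteries.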

The CMU team is currently working with Bosch Research to evaluate the approach by running AI training and inference workloads on a computing cluster. If successful, the project could enable data centers to rely more heavily on renewable energy while continuing to meet the rapidly growing demand for AI.

“The Bosch team has readily been willing to share their time with our students,” says Joshi. “Bosch researchers regularly schedule meetings with our students, which has greatly aided the project’s development. They are also willing to host some of our students as interns, which has made the project a fantastic two-way collaboration.”

The CMU–Bosch team anticipates the project will generate valuable insights for designing future AI infrastructure, particularly as states like Pennsylvania attract new data center development.


“Pennsylvania is set to become a national leader in data centers,” says Qu. “I hope the output of this research project will inform the design of the future generation of data centers, which can tremendously improve the sustainability and lower the grid-integration barrier of these data centers. That's a win for Pennsylvania's economy.”

Joshi adds that the work reflects a growing intersection between two rapidly evolving fields.

“We are excited about the convergence of AI and energy,” says Joshi. “AI requires so much energy, and it's growing exponentially. We’re excited to see if this project can lead to more sustainable, stable growth of AI in the long term without adversely affecting the energy system that's supporting it.”