Predicting future energy sources
Carnegie Mellon researchers have developed a brand new model to help us plan for the future of energy.
For many of us, electricity is akin to a magic trick. We walk into a room, flip a switch, and suddenly, poof. Light. Power. But peel back the curtain, and the reality is much different. The mechanisms that allow us to perform this magic are deep and complicated, involving an unbelievable number of variables and strategies, all to ensure that when that switch is flicked, the light in fact goes on.
There are fuel sources, power plants, and transmission lines; investors and capital interests; environmental considerations. Not to mention all the different kinds of batteries for energy storage. It's a lot for anyone to consider, which is why scientists traditionally use algorithmic models to plan how to expand our power generation capacity: whether we use coal, renewables, or other sources to meet load demand, while satisfying technical and economic requirements. As the energy sector changes, however, our existing models encounter new and worsening problems.
To address these problems, Cristiana Lara and Ignacio Grossmann, in collaboration with ExxonMobil and the National Energy Technology Laboratory (NETL), have developed a new optimization model, a multiperiod mixed-integer linear program (MILP), that will help us plan for the future of energy.
“We are conducting research to address the planning of capacity expansion in the electricity generation infrastructure,” says Lara, a Ph.D. student in Chemical Engineering, “assuming an increasing share of power generation from renewables, and the possibility of including baseline fossil power generation and energy storage systems.”
In a paper published in the European Journal of Operational Research, the team shows why new large-scale optimization models are needed to predict and plan where our power will come from over the next 20 to 30 years.
Renewable power generation, such as solar and wind, comes with its own set of problems. The sun is not always shining; the wind is not always blowing. Because renewable generation is intermittent, planners must account for possible fluctuations in power on an hour-by-hour basis in order to ensure both the reliability and flexibility of the system. A number of sustainable fossil energy systems already exist that can generate a baseline of power to mitigate this intermittency, and further development of energy storage systems can help as well.
“Advanced optimization modeling techniques have been developed for the planning of cost-effective, long-term sources for power generation in a given region,” says Grossmann, a professor in Chemical Engineering. “These choices include fossil fuels such as coal and gas, nuclear, and renewables such as wind, solar, and hydro. Together, the combination of these sources will be able to satisfy forecast power demands and variability in renewable generation, while anticipating the development of new power storage technologies.”
The model is applied to a given region, such as a state or an independent system operator (ISO), consisting of existing and potential generators along with potential energy storage units. From there, the objective is to determine the location, year, type, and number of generators and storage units to install; when to retire generators or extend their lifetimes; the approximate power flows between locations; and the approximate operating schedule needed to meet projected demand at minimum cost. Because the corresponding mixed-integer programming model involves millions of variables and constraints, even advanced commercial optimization software cannot solve it directly. Lara and Grossmann have therefore developed a specialized decomposition algorithm that lets them optimize these problems effectively, typically in just a few hours.
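To give a flavor of what such a capacity-expansion model looks like, here is a minimal toy sketch in Python using the PuLP library. It is not the authors' model: the technologies, unit sizes, costs, and demand figures are invented for illustration, and it collapses the real problem's hourly operation, transmission, retirements, and storage dynamics into a single peak-demand constraint per planning year.

```python
# Toy capacity-expansion MILP sketched with PuLP; all data are illustrative assumptions.
import pulp

techs = ["coal", "gas", "wind", "solar", "storage"]
years = [2025, 2030, 2035]

unit_mw    = {"coal": 500, "gas": 400, "wind": 100, "solar": 80, "storage": 50}          # size of one unit (MW)
capex_kw   = {"coal": 3500, "gas": 1000, "wind": 1500, "solar": 1000, "storage": 1500}   # $/kW, rough guesses
opex_mwh   = {"coal": 30, "gas": 45, "wind": 5, "solar": 4, "storage": 8}                # $/MWh, rough guesses
cap_factor = {"coal": 0.85, "gas": 0.85, "wind": 0.35, "solar": 0.25, "storage": 0.90}   # derating factor
peak_mw    = {2025: 70000, 2030: 78000, 2035: 86000}                                     # projected peak demand

prob = pulp.LpProblem("toy_capacity_expansion", pulp.LpMinimize)

# Integer decisions: number of new units of each technology added in each planning year.
build = pulp.LpVariable.dicts("build", (techs, years), lowBound=0, cat="Integer")

# Objective: capital cost of new units, plus a crude undiscounted operating-cost proxy
# for every planning year in which a unit is available.
prob += (
    pulp.lpSum(capex_kw[t] * unit_mw[t] * 1000 * build[t][y] for t in techs for y in years)
    + pulp.lpSum(
        opex_mwh[t] * cap_factor[t] * unit_mw[t] * 8760 * build[t][yb]
        for t in techs for y in years for yb in years if yb <= y
    )
)

# Reliability constraint: derated capacity of everything built so far must cover peak demand.
for y in years:
    prob += (
        pulp.lpSum(cap_factor[t] * unit_mw[t] * build[t][yb]
                   for t in techs for yb in years if yb <= y)
        >= peak_mw[y]
    )

prob.solve(pulp.PULP_CBC_CMD(msg=False))
for t in techs:
    for y in years:
        n = build[t][y].value()
        if n and n > 0:
            print(f"build {int(round(n))} x {t} unit(s) in {y}")
```

In the full-scale model described above, that single yearly constraint is replaced by detailed hourly operating schedules, power flows between locations, retirement and lifetime-extension decisions, and storage behavior, which is what drives the problem to millions of variables and makes a specialized decomposition algorithm necessary.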
To test their model and decomposition algorithm, the team applied them in a case study of the Texas Interconnection, which is managed by the Electric Reliability Council of Texas (ERCOT). The results show that their framework provides substantial speed-ups and allows larger instances to be solved. The improved solution times make it practical to run a range of sensitivity analyses and to better understand the drivers behind a variety of scenarios.
With the help of this new optimization model, future energy infrastructure will be not only more productive and cost-effective, but also more environmentally friendly, bringing the full benefit of next-generation energy technologies to the world.