When the power goes out, customers aren’t the only ones left in the dark. An outage can come as a surprise to electricity providers too, which helps explain why utilities are moving toward more transparent reporting systems. Taking a page from supercomputing, scientists at GE Global Research are developing new software that uses the power of parallel processing to make the electrical grid more efficient.
Naresh Acharya, a senior engineer at GE’s global research labs in upstate New York, explains that to keep systems safe, grid operators have to determine the maximum amount of power that can safely flow through their networks under worst-case conditions. The limits are refined every few months, but once they are set, there’s not much flexibility in the system.
“These analytical tools have been in place for decades and they are very rigid,” Acharya says. “The worst-case scenario may apply to just a few days during a heat wave or a winter storm. This type of thinking is leading us to overdesign and overbuild the grid. With real-time knowledge, we could be getting much more out of our assets without building out a new grid.”
GE Global Research and GE Energy Consulting are collaborating with the Pacific Northwest National Laboratory and Southern California Edison on a real-time software system to allow the grid to expand and shrink in tandem with the peaks and valleys of actual demand.
Most grid tools available today are still stuck in the era of the single-core computer, which means they cannot take advantage of modern multicore machines.
Acharya explains that while utility companies can monitor the health of the grid, they cannot run the complex models that would enable them to decide the best course of action.
The team is developing grid analytics tools that use parallel processing on more powerful multicore machines. The program screens data coming over the Industrial Internet: snapshots generated by sensors, generators and other equipment distributed along the hundreds of miles of high-voltage wires that make up the grid. From this mass of data, the software determines the few dozen key signals that have the biggest impact on the stability of the grid.
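The shape of that screening step can be sketched in a few lines of Python: score a large number of sensor signals across multiple cores at once, then keep only the handful with the highest scores. The sensor names, the fake telemetry, and the scoring rule (plain standard deviation as a stand-in for a real stability-impact metric) are all invented for illustration; GE's actual analytics are far more sophisticated.

```python
# Illustrative sketch, not GE's software: rank many grid sensor
# signals in parallel and keep the few with the highest "impact".
import random
from concurrent.futures import ProcessPoolExecutor

def impact_score(item):
    """Score one signal by its standard deviation -- a toy stand-in
    for a real stability-impact metric."""
    name, samples = item
    mean = sum(samples) / len(samples)
    var = sum((x - mean) ** 2 for x in samples) / len(samples)
    return name, var ** 0.5

def screen_signals(signals, top_k=3):
    """Score every signal on a pool of worker processes (one per
    core) and return the names of the top_k highest scorers."""
    with ProcessPoolExecutor() as pool:
        scored = list(pool.map(impact_score, signals.items()))
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return [name for name, _ in scored[:top_k]]

if __name__ == "__main__":
    random.seed(0)
    # Fake telemetry: 1,000 sensors, 200 samples each, with
    # deliberately varied spreads so some signals stand out.
    signals = {
        f"sensor_{i:04d}": [random.gauss(0, 1 + (i % 7)) for _ in range(200)]
        for i in range(1000)
    }
    print(screen_signals(signals))
```

Because each signal is scored independently, the work splits cleanly across cores, which is exactly the kind of task single-core-era tools leave on the table.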
“It tells us where we might have a weak spot,” Acharya says.
The system will provide guidance on which generators should be ramped up or down, and what the optimal electricity load is for a given interval.
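A much-simplified version of that guidance is a merit-order dispatch: meet a forecast load for one interval by running the cheapest generators first. The fleet, its capacities, and its costs below are invented for illustration, and a real dispatch model would also account for network constraints, ramp rates, and reserves.

```python
# Illustrative merit-order dispatch, not GE's method: assign output
# to generators, cheapest first, until a target load is met.

def dispatch(generators, target_load):
    """Each generator is (name, capacity_mw, cost_per_mwh).
    Returns {name: mw} covering target_load, or raises if the
    fleet's total capacity falls short."""
    plan = {}
    remaining = target_load
    for name, capacity_mw, _cost in sorted(generators, key=lambda g: g[2]):
        mw = min(capacity_mw, remaining)
        plan[name] = mw
        remaining -= mw
        if remaining == 0:
            break
    if remaining > 0:
        raise ValueError(f"short {remaining} MW of capacity")
    return plan

fleet = [
    ("hydro", 300, 5),      # (name, capacity in MW, $/MWh)
    ("wind", 150, 2),
    ("gas_peaker", 400, 60),
    ("coal", 500, 30),
]
print(dispatch(fleet, 700))
# Wind and hydro run flat out; coal covers the remaining 250 MW,
# and the expensive gas peaker stays off.
```

Re-running this kind of calculation every interval, instead of against a worst-case limit fixed months in advance, is the core of the "real-time knowledge" Acharya describes.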
The main goal is to help utilities maximize the use of their grid, which will have the added benefit of optimizing the share of power from renewable sources.