Clouds Set to Make Smart Grids Smarter

By Nicole Hemsoth

March 21, 2011

The United Nations estimates that in 2009, the world’s population was officially evenly split between rural and urban areas. That figure is not a static one, however; the organization contends that by 2050, almost 70 percent of the projected 9.1 billion people on the planet will be living in cities.

Leaving aside the many issues tied to population growth in general, if the urban migration figures are correct, cities are going to have to find far more sustainable ways to power themselves, a challenge that can be addressed by streamlining, and in some cases reconstructing, IT infrastructure.

A recent Microsoft report, “The Central Role of Cloud Computing in Making Cities Energy-Smart,” examined the pending urban population push with a focus on refining city energy infrastructure. The authors claim that in order for cities to thrive in the wake of population explosions, they will “need to radically evolve their infrastructures.” This includes water, waste and other crucial systems, but as the report notes, it is “the evolution of energy infrastructure, in particular, that presents an important opportunity to make major reductions in GHG emissions, support economic development and maintain high quality of life in our cities.”

While at one time energy infrastructure might have meant only power stations and one-way delivery of electricity, the tide is changing. Now, instead of buildings (and a growing range of other structures and devices) simply consuming electricity, they are also producing it and sending it back to the grid.

This give-and-take relationship in the energy grid goes beyond buildings sending energy as well as consuming it. Newer modes of power generation and consumption, such as electric vehicles that draw power but can also use their battery stores to send power back to the grid, are on the horizon, as are a number of other renewable systems that both send and receive energy.

The problem now is to architect a system that takes on the logistical challenges of the smart grid and manages the complexity efficiently, in a manner that actually enhances power generation and distribution. In other words, the time has come to turn those massive, power-chugging cloud data centers into sources of more streamlined, efficient energy use and distribution, pushing them into their own sort of “give and take” relationship.

Maximizing the Power of Smart Grids

Today IBM and British telecom provider Cable & Wireless announced that they are collaborating on a cloud computing system to monitor power use in more than 50 million homes via smart meters. The project, called the “UK Smart Energy Cloud,” will “gather data many times a day from smart meters around the country and store it in a cloud hosted within the country.” That data will then be sent to power utilities for analysis to aid in better planning for peak loads.

Smart meters have emerged to let utility companies gauge demand more accurately, allowing them to plan ahead to prevent outages. While there are other potential benefits to smart meters, including the ability for customers to see when their electricity is cheapest, the goal of making the most efficient use of energy is paramount for both consumers and power companies.
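
To make the planning step concrete, below is a minimal sketch of the kind of aggregation such a system might perform over interval readings, assuming half-hourly meter data; the record format, field names and sample values are illustrative rather than drawn from the IBM/Cable & Wireless design.

```python
from collections import defaultdict
from datetime import datetime

# Illustrative smart-meter interval readings: (meter_id, timestamp, kWh consumed).
# The schema is hypothetical; real deployments define their own formats.
readings = [
    ("meter-001", "2011-03-21T17:30", 0.42),
    ("meter-002", "2011-03-21T17:30", 0.55),
    ("meter-001", "2011-03-21T18:00", 0.61),
    ("meter-002", "2011-03-21T18:00", 0.70),
]

def peak_windows(readings, top_n=3):
    """Sum consumption per half-hour window and return the heaviest windows."""
    totals = defaultdict(float)
    for _meter_id, timestamp, kwh in readings:
        window = datetime.strptime(timestamp, "%Y-%m-%dT%H:%M")
        totals[window] += kwh
    # Utilities planning for peak load care most about the highest-demand windows.
    return sorted(totals.items(), key=lambda item: item[1], reverse=True)[:top_n]

for window, total_kwh in peak_windows(readings):
    print(f"{window:%Y-%m-%d %H:%M}  {total_kwh:.2f} kWh aggregate demand")
```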

Truly revolutionizing energy infrastructure, however, will take a great deal more IT horsepower, since smart metering alone cannot provide the efficiency required for the big urban boom. A more integrated approach, with information accessible anywhere via cloud-based platforms, is needed. This is especially important because the nature of power operations is changing, with far more “give back” via wind and other generation mechanisms that allow consumers to shift into the role of providers.

This mission to build sustainable IT infrastructures is no easy task, according to Chris Johnston from the Network-Ready device unit at AT&T. He states that “Ultimately, the grid may be transformed into a wirelessly-controlled, digital network capable of handling complex, multi-directional flows of power. Communications and computing must be both cost effective and easy to install. These criteria greatly favor wireless communications and cloud computing. Companies capable of delivering the necessary wireless communications and cloud computing functionality must also be armed with outstanding service level agreements and disaster recovery capabilities.”

The Microsoft report also gave a hat tip to this complexity, noting that “the coordination of all of those supply sources is complex and requires significant understanding of where, when and how much power is available to satisfy the energy demands of growing cities. This is of course why the integration of sensors, monitoring equipment, advanced control systems and information technologies to collect this supply-side data and turn it into useful, actionable information is important.”

A Data Market for Energy Efficiency

As renewable energy sources become more pervasive, bringing with them a need to interact with the grid (rather than simply consume its resources), there will need to be a way to merge these elements into the infrastructure. As the Microsoft report on this issue noted, “optimizing energy efficiency across these interconnected systems at the city level requires the secure and reliable collection of massive amounts of data from sensors, meters and controls embedded within these complex systems.”

In Microsoft’s view, cloud computing could be the key to providing a flexible platform that brings these disparate sensors, meters and measurements together so that energy efficiency can be streamlined into one system. The report notes that “developers will be able to deliver new solutions, such as weather forecasts, energy pricing and traffic conditions.” Other data becomes particularly useful when pulled into this big-picture cloud platform as well, including building occupancy, energy performance, manufacturing and other activity, and even shipping or distribution schedules.
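
As a rough illustration of what bringing disparate sensors and meters into one system might involve, here is a minimal sketch that normalizes readings from two hypothetical sources into a common form; the source names, payload schemas and sign convention are assumptions for illustration only.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Reading:
    """Common form for measurements arriving from disparate city systems."""
    source: str        # e.g. "building-hvac", "ev-charger", "wind-turbine"
    timestamp: datetime
    kilowatts: float   # positive = consumption, negative = generation fed back

def from_building_sensor(payload: dict) -> Reading:
    # Hypothetical building-management payload reporting watts consumed.
    return Reading("building-hvac", datetime.fromisoformat(payload["time"]),
                   payload["watts"] / 1000.0)

def from_wind_turbine(payload: dict) -> Reading:
    # Hypothetical turbine payload reporting kilowatts generated; negated to mark
    # it as power flowing back to the grid.
    return Reading("wind-turbine", datetime.fromisoformat(payload["time"]),
                   -payload["kw_out"])

batch = [
    from_building_sensor({"time": "2011-03-21T18:00", "watts": 4200}),
    from_wind_turbine({"time": "2011-03-21T18:00", "kw_out": 1.5}),
]
net_kw = sum(r.kilowatts for r in batch)
print(f"Net demand across sources: {net_kw:.2f} kW")
```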

To help developers focused on building smarter city infrastructures, Microsoft has opened its Windows Azure DataMarket, which the company claims will enable “the discovery, exploration and consumption of data from trusted public domains and commercial data sources such as demographics, health, location-based services, real estate, science, transportation, navigation, weather, finance, etc.”  This market for data also provides visualizations and analytics to help developers “see” the data and its wider implications.

In essence, by providing a complex “mashup” of this data, developers will be able to create integrated, energy-efficient systems by drawing on the vast amount of data pertaining to urban elements that might, at first anyway, appear to have little to do with the grid. In short, “applications and services that leverage such a diverse portfolio of disparate data sets will enable new insights for citizens, governments and utilities on how to manage energy infrastructure in real time.”

The company notes that developers can make use of this data on any platform and can incorporate it via a common API into mobile, desktop and web-based applications. This centralization of information for energy-efficient planning (and beyond; commercial applications are limited only by imagination and will likely be the primary use) will give developers unprecedented ease in creating specific local mashups to aid effective energy use and distribution.
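
DataMarket exposed its datasets as OData feeds over HTTP, which is what made a “common API” across platforms possible. The sketch below shows roughly what a consuming application might have looked like in Python; the provider path, dataset name, query and account key are placeholders, and the authentication convention and response shape are assumptions rather than a definitive reference.

```python
import base64
import json
import urllib.request

# Azure DataMarket exposed datasets as OData feeds over HTTPS. The URL and
# account key below are placeholders for illustration, not a real subscription.
FEED_URL = ("https://api.datamarket.azure.com/ExampleProvider/ExampleWeatherData/"
            "Observations?$top=10&$format=json")
ACCOUNT_KEY = "YOUR-ACCOUNT-KEY"

def fetch_feed(url: str, key: str) -> dict:
    """Issue an authenticated OData request and return the decoded JSON payload."""
    request = urllib.request.Request(url)
    # DataMarket used HTTP Basic auth with the account key as the password;
    # treat this convention as an assumption for the sketch.
    token = base64.b64encode(f":{key}".encode()).decode()
    request.add_header("Authorization", f"Basic {token}")
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read().decode())

# Uncomment with a real feed and key; the "d"/"results" wrapper below assumes
# the OData verbose JSON format of the era.
# payload = fetch_feed(FEED_URL, ACCOUNT_KEY)
# for row in payload.get("d", {}).get("results", []):
#     print(row)
```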

The report out of Redmond, “IT for Energy Smart Cities,” notes that this movement to cull data from a range of sources to aid more efficient energy distribution and use is already taking shape. ISVs and system integrators are already “taking advantage of high performance and cloud computing platforms to deliver solutions and services that address the needs of this evolving energy infrastructure.” According to Microsoft, many of these ISVs and SIs are already players in the power and grid space and are pulling together pieces of information related to everything from building design and management to transportation systems.

Microsoft claims that the “work to evolve energy smart cities is focusing on these major intersecting infrastructures—power generation and grid, buildings, transportation systems and the security, privacy, reliability and accessibility of the general information backbone that connects them.”

Bringing it all Together

Chris Johnston’s assessment that there are big challenges ahead on a number of fronts shouldn’t be taken lightly. Before disparate technologies and sources of energy production and consumption can be merged, some core refinements and modernization measures are needed.

Some groups are addressing such challenges, including a team of researchers out of the University of Pittsburgh’s Swanson School of Engineering. The group recently announced that they have embarked on a long-term mission to integrate more efficient power delivery systems into the expanding American power grid. As a release this week noted, “by employing the same simulation technology used to design and engineer electricity grids, the researchers will model an expanded power grid that delivers electricity from the power plant to our homes and businesses with less infrastructure and a more reliable and efficient flow of electricity.”

This attempt to reconstruct infrastructure would allow for better conservation and also make it simpler to draw on renewable sources of energy generated in remote locations. As one of the project’s lead researchers, Gregory Reed, noted, the issue with power delivery in the U.S. is one of consistency. He explains:

“Electricity in the United States is generated, transported and delivered by alternating current (AC). But modern devices—from renewable power resources and electric vehicles to high-definition televisions, data centers, computers and other devices—take a direct current (DC) input, hence the AC/DC converters.” In short, systems that were once adequate are now strained due to a lack of IT infrastructure to support the added complexity.
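
The strain Reed describes can be made concrete with a back-of-the-envelope calculation of how conversion losses compound between an AC grid and a DC load; the stage efficiencies below are illustrative assumptions, not measured figures.

```python
# Illustrative (assumed) efficiencies for each conversion stage between an AC
# grid and a DC load such as a data center power supply or an EV charger.
stages = {
    "AC transmission and distribution": 0.94,
    "AC-to-DC rectification": 0.95,
    "DC-to-DC regulation at the device": 0.92,
}

delivered = 1.0
for name, efficiency in stages.items():
    delivered *= efficiency
    print(f"after {name}: {delivered:.3f} of generated power remains")

print(f"Roughly {100 * (1 - delivered):.0f}% lost across the chain "
      "(with these assumed figures)")
```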

Others are looking at the roles that smart meters can play within the paradoxical issue of cloud computing data centers themselves. In other words, they are examining how the data centers that provide the cloud power to create better efficiency can themselves become more central to overall local efficiency.

The cloud can provide a robust, scalable system to manage fluctuating demand and input/output of information as well as serve as a springboard for creative teams of developers. However, it’s a long road ahead. In the meantime, the urban migration continues to build, making the work to support energy efficiency via cloud computing (and other IT innovations) an urgent matter.
