How Cities Use HPC at the Edge to Get Smarter

By Doug Black

November 17, 2017

Cities are sensoring up, collecting vast troves of data that they’re running through predictive models and using the insights to solve problems that, in some cases, city managers didn’t even know existed.

Speaking at SC17 in Denver this week, a panel of smart city practitioners shared the strategies, techniques and technologies they use to understand their cities better and to improve the lives of their residents. With data coming in from all over the urban landscape and worked over by machine learning algorithms, cities are pairing operational needs with advanced research. Debra Lam, managing director for smart cities and inclusive innovation at Georgia Tech, who works on strategies for Atlanta and the surrounding area, put it this way: “We’ve embedded research and development into city operations; we’ve formed a matchmaking exercise between the needs of the city and the most advanced research techniques.”

Panel moderator Charlie Catlett, director of the Urban Center for Computation and Data at Argonne National Laboratory, who works on smart city strategies for Chicago, said that the scale of data involved in complex, long-term modeling will require nothing less than the most powerful supercomputers, including the next generation of exascale systems under development within the Department of Energy. The vision for exascale, he said, is to build “a framework for different computation models to be coupled together in multiple scales to look at long-range forecasting for cities.”

“Let’s say the city is thinking about taking 100 acres, spending a few hundred million dollars to build some new things, rezoning and maybe augmenting public transit,” Catlett said. “How do you know that that plan will actually do what you think it’s going to do? You won’t until 10 to 20 years later. But if you forecast using computational models you can at least eliminate some of the approaches that would be strictly bad.”

With both Amazon and Microsoft in its metropolitan area, it’s not surprising that Seattle is doing impressive smart city work. Michael Mattmiller, CTO of Seattle, said good planning is necessary for a city expected to grow by 32 percent. Mattmiller said 75 percent of the new residents moving to Seattle are coming for jobs in the technology sector, and they will tend to have high expectations for how their city uses technology.

Some of Seattle’s smart city tactics are relatively straightforward but invaluable ways for city government to open lines of communication with residents and to respond to problems faster. For example, the city developed an app called “Find It, Fix It” in which residents who encounter broken or malfunctioning city equipment (broken street lights, potholes, etc.) are encouraged to take a cell phone picture and send the city a message with a description of the problem and its location.

Of a more strategic nature is Seattle’s goal of becoming carbon neutral by 2050. The key challenges are brought on by the 100,000 people who come to the downtown areas each day for their jobs. The city’s Office of Sustainability collects data on energy consumption from sensors placed on HVAC and lighting systems in office buildings and retail outlets and has developed benchmarks for comparing energy consumption on a per-building basis, notifying building owners if they are above or below their peer group.
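
The benchmarking Mattmiller describes is, at bottom, a peer-group comparison. Here is a minimal sketch of the idea in Python; the column names, figures and notification threshold are illustrative placeholders, not Seattle's actual schema:

```python
import pandas as pd

# Hypothetical sensor export: one row per building, with a peer group
# (e.g., office vs. retail) and an energy-intensity figure.
readings = pd.DataFrame({
    "building_id": ["A", "B", "C", "D"],
    "peer_group": ["office", "office", "retail", "retail"],
    "kwh_per_sqft": [14.2, 19.8, 9.1, 12.6],
})

# Benchmark each building against the median of its peer group.
readings["peer_median"] = readings.groupby("peer_group")["kwh_per_sqft"].transform("median")
readings["pct_vs_peers"] = 100 * (readings["kwh_per_sqft"] / readings["peer_median"] - 1)

# Notify owners whose consumption runs well above their peers.
for row in readings.itertuples():
    if row.pct_vs_peers > 10:  # illustrative threshold
        print(f"Building {row.building_id}: {row.pct_vs_peers:.0f}% above peer median")
```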

Mattmiller said Amazon and Microsoft helped build analytics algorithms that run on the Microsoft Azure public cloud. The program is delivering results: Mattmiller said energy consumption is down, with a reduction of 27 million tons of carbon.

Seattle also analyzed weather data and rainfall amounts, discovering that the city has distinct microclimates, with some sections of the city getting as much as eight inches more rain per year than others (roughly the total annual rainfall of Phoenix). This has led the city to issue weather alerts to the areas more likely to have rain events and to send repair and maintenance trucks to higher-risk areas.

Transportation, of course, is a major source of pollution, carbon and frustration (30 percent of urban driving is spent looking for parking spaces). Seattle trolled residents for ideas and held a hackathon that produced 14 prototype solutions, including one from a team of Microsoft employees who bike to work: they developed a machine learning program that predicts the availability of space on the bike racks attached to city buses, “an incredibly clever solution,” Mattmiller said.
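
Details of the prototype weren't given, but a prediction like this can start very simply. A hedged sketch, assuming a logistic regression over invented features such as route, hour and weather; the data, features and library choice are placeholders, not the hackathon team's actual approach:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical history: [route_id, hour_of_day, is_raining],
# labeled 1 if a rack slot was free when the bus arrived.
X = np.array([[44, 8, 1], [44, 17, 0], [8, 9, 0],
              [8, 18, 1], [44, 7, 0], [8, 12, 1]])
y = np.array([0, 1, 1, 0, 0, 1])

model = LogisticRegression().fit(X, y)

# Estimate the chance of a free rack on route 44 at 8am in the rain.
prob_free = model.predict_proba([[44, 8, 1]])[0, 1]
print(f"Estimated probability of a free rack: {prob_free:.2f}")
```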

In Chicago, Pete Beckman, co-director of the Northwestern-Argonne Institute of Science and Engineering at Argonne National Laboratory, helped develop the sensors placed throughout the city in its Array of Things project. While most sensors used by cities are big, expensive and sparsely deployed, Beckman said the project managers wanted to “blanket the city with sensors” that would collect a broad variety of data and also carry significant computational power – a “programmable sensor” that doesn’t just report data but can run programs written for the device. They also wanted it to be attractive, so students at the Art Institute of Chicago were recruited to help design the enclosure.

“This becomes a high performance computing problem,” Beckman said. “Why do you need to run programs at the edge? Why run parallel computing out there? Because the amount of data we want to analyze would swamp any network. The ability to have 4K cameras, to have hyperspectral imaging, to have audio – all that data can’t be sent back to the data center for processing; it has to be processed right there in a small, parallel supercomputer. Whether it’s OpenCV (the Open Source Computer Vision Library), Caffe or another deep learning framework like TensorFlow, we have to run the computation out at the edge.”
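
The pattern Beckman describes is to run the vision model next to the camera and ship only small results. Here is a minimal sketch using OpenCV's bundled HOG pedestrian detector as a lightweight stand-in for the heavier Caffe or TensorFlow models a real node would run; the camera index and detection parameters are illustrative:

```python
import cv2  # OpenCV, one of the frameworks Beckman names

# Classic HOG + SVM pedestrian detector that ships with OpenCV.
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

cap = cv2.VideoCapture(0)  # camera index is illustrative
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # All of the heavy pixel processing happens here, on the node itself.
    rects, _ = hog.detectMultiScale(frame, winStride=(8, 8))
    # Only this tiny summary would cross the network, never the 4K stream.
    print(f"pedestrians_in_frame={len(rects)}")
cap.release()
```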

One scenario outlined was a sensor detecting an out-of-control vehicle approaching a busy intersection: the sensor picks up on the impending danger, delays the pedestrian “WALK” sign and turns all the traffic lights in the intersection red. These are calculations that require HPC-class computing at the street corner.
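
Computationally, that scenario is a hard real-time decision loop running on the node, since a round trip to a data center would be too slow. A hypothetical sketch: the braking model, thresholds and signal-controller interface below are all invented for illustration, not a real deployment:

```python
SAFE_DECEL_MPS2 = 4.0  # assumed maximum braking rate for a typical vehicle

def cannot_stop_in_time(speed_mps: float, distance_m: float) -> bool:
    """Physics check: stopping distance v^2 / (2a) exceeds distance to the line."""
    return speed_mps ** 2 / (2 * SAFE_DECEL_MPS2) > distance_m

class SignalController:
    """Stub standing in for whatever interface a real controller exposes."""
    def hold_walk_sign(self): print("WALK sign delayed")
    def set_all_red(self): print("all approaches set to red")

def on_vehicle_update(speed_mps, distance_m, signals):
    # Called per tracked vehicle, per camera frame, on the edge node itself.
    if cannot_stop_in_time(speed_mps, distance_m):
        signals.hold_walk_sign()
        signals.set_all_red()

# A car at 20 m/s, 25 m out, needs 50 m to stop: trigger the safety response.
on_vehicle_update(speed_mps=20.0, distance_m=25.0, signals=SignalController())
```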

Chicago is also using its Array of Things sensors in other critical roles, such as monitoring floods in real time, tracking pedestrian, bicycle, car and truck traffic, and predictively modeling accidents.

“The questions for us in the parallel computing world,” Beckman said, “are how do we take that structure on our supercomputers and scale it in a way so we have a virtuous loop: doing training on large-scale data on the supercomputer and creating inference models that are quick and fast and can be pushed out to hardware-accelerated parallel devices at the edge? The Array of Things project is working on that now.”
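
In TensorFlow terms (one of the frameworks Beckman names), that loop might pair data-center training with a compact, quantized export for edge accelerators. A minimal sketch, with a toy model and shapes standing in for the real training job:

```python
import tensorflow as tf

# Train-at-scale side: a placeholder Keras model; the real job would call
# model.fit(...) on the full dataset, on the supercomputer.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(32,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# Push-to-edge side: convert to a quantized TensorFlow Lite model that the
# node's accelerated inference hardware can run quickly.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # post-training quantization
with open("edge_model.tflite", "wb") as f:
    f.write(converter.convert())
```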
