Delivering Geospatial Information Systems to the Mainstream

By Jim Skurzynski

January 11, 2011

2011 will be the year Geospatial Information Systems (GIS) in the cloud goes mainstream. 

Despite the recession – or arguably because of the recession – the cloud is changing the way GIS services are developed and delivered. This year promises major and fast-paced changes: a lower-cost infrastructure for developing spatial applications; more robust, creative and sophisticated business, government and consumer uses; and a rapidly-expanding pool of spatial technology users.

At geospatial technology conferences and workshops in 2010, the conversation in the GIS world centered on one question: “Should we move to the cloud?” While some major industries and companies continue to struggle with this issue, for most the question has shifted to “How do we move to the cloud?”

A result of this shift was the launch of Directions Magazine’s Location Intelligence for Geospatial Cloud Computing Executive Symposium in 2009. In 2010 the conversation at the symposium focused on the opportunities SaaS offers to developers, the ROI of cloud architecture, and how new Internet-based solutions may change the face of geospatial technology delivery. 

What Trends Have an Impact on Movement of GIS to the Cloud?

What has caused the shift? Several important trends, some financial and some technological, are coming together to accelerate interest in cloud-based GIS:

1. Mapping / GIS and cloud computing are converging. In late 2010, Pitney Bowes Business Insight and Microsoft announced the integration of their respective desktop GIS and mapping platforms, calling it “one more example of how divergent solutions are coming together to provide greater insight to analysts and organizations.” New cloud-based applications, fostered by wider adoption of spatial technology like Google Maps™, Bing Maps™ and Microsoft’s SQL Server, are pushing desktop GIS developers to integrate new cloud-type features. Even the most traditional desktop technologies, like those offered by Esri, are moving into the cloud, a clear indication that the cross-pollination will continue.

2. The recession has spurred adoption of the cloud. Rather than slowing the movement of GIS into the cloud, the recession has accelerated it. Providing GIS services in the cloud is more cost-effective than providing desktop services. While many businesses weren’t quite ready to commit to a cloud-based future, the recession put such a crimp on many budgets that lower-cost cloud services suddenly became the only option. In other cases, companies supplemented their overtaxed IT departments with some cloud-based functions during the recession. Either way, those companies are now often satisfied with the SaaS applications they’re using and less likely to backtrack to a desktop system when the economy takes off again. They also found during the recession that letting someone else handle the technical part of GIS allows them to focus on their core business.

3. Consumer demand for location technology is changing the marketplace. For many consumers, the rapid rise of mapping platforms such as Google Maps and Bing Maps was what first revealed the value of spatially-enabled applications. More than a billion people have used Google Maps, increasingly from a mobile phone, to get directions or find local businesses. In 2010, Twitter added a geo API and Google improved its Maps API to support spatial search and search feeds, changes that further help developers bring location intelligence to any application.

As consumers see the power of location, without even knowing what GIS means, they are increasingly expecting business and government to offer them spatial tools. In response, more and more businesses are now relying on GIS to automate decision-making. For example, Computerworld reported that General Motors and other automakers used GIS tools to help figure out which auto dealerships should be closed. Government agencies are finding new uses for GIS analytics such as monitoring properties at high risk for foreclosures.
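The kind of “find local businesses” spatial search described above reduces, at its core, to a radius query over point data. Here is a minimal sketch in Python; the business names and coordinates are hypothetical, and a production service would use a spatial index rather than a linear scan:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometers."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearby(businesses, lat, lon, radius_km):
    """Return business names within radius_km of (lat, lon), nearest first."""
    hits = [(haversine_km(lat, lon, b["lat"], b["lon"]), b["name"])
            for b in businesses]
    return [name for dist, name in sorted(hits) if dist <= radius_km]

# Hypothetical points of interest near downtown Los Angeles
pois = [
    {"name": "Coffee Shop", "lat": 34.0522, "lon": -118.2437},
    {"name": "Hardware Store", "lat": 34.0407, "lon": -118.2468},
    {"name": "Airport Diner", "lat": 33.9416, "lon": -118.4085},
]
print(nearby(pois, 34.0500, -118.2440, 5))  # businesses within 5 km
```

A cloud provider runs exactly this kind of query behind an API call, so the mobile developer never touches the distance math at all.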

4. The market has responded to demand by investing in cloud-based location technology companies in 2010. On Dec. 31, 2010, Jennifer Van Grove suggested on Mashable that Web and mobile technologies are bringing rapid changes to how businesses engage with people, places and things: “The speed at which this evolution takes place will only continue to accelerate in 2011 with the help of fledgling startups who will push the boundaries around geolocation, mobile photos, entertainment services, community and physical-to-digital connections.” The financial news in the geospatial world is dominated by examples of services like SimpleGeo and GeoAPI, which offer businesses of any size easy entry into the spatial market to meet consumer demand for location-based applications. The arrival of Microsoft’s Windows Azure Marketplace DataMarket in late 2010, with its easy access to data in the cloud, including location data, will further accelerate the entry of new location services.

5. Cloud-based GIS is a viable business.  While many companies are still experimenting with the business models that accompany their cloud-based GIS technology, some have now proven that a pure cloud-based business can actually be quite profitable.  Combine that with the demonstrated financial success of other more conventional SaaS businesses such as Salesforce.com and it’s safe to say that this phenomenon is no longer hypothetical.  In fact, at Digital Map Products our products have always been cloud native and we’ve been profitable for years.  Further, we continue to see strongly increasing demand for our SaaS GIS software and spatial development platforms – this new business model is clearly here to stay. 
 
6. The data market is rapidly expanding. APIs serving geo-data from the cloud exploded in 2010, and more and more “data as a service” companies are offering data in the cloud. For example, Education.com allows users to get data on any school system in the United States from the cloud, rather than having to collect the data, make it fit with other data sets and keep it updated. Many industries have discovered the value of property data, which is increasingly easy to find in the cloud. Once only the real estate, utility and developer industries could afford to use property data; now it is available and becoming essential to many industries.

This new data is coming from new sources as well. Crowdsourcing, data provided by everyday citizens, has both expanded the amount of data available and spurred philosophical debates about who should create and control geospatial data.

Major disasters such as Hurricane Katrina and the earthquake in Haiti showed consumers the valuable role they can play by contributing data to the cloud. OpenStreetMap, a collaborative project to create a free, editable map of the world, allowed volunteers to map roads, buildings and refugee camps in Haiti just days after the disaster there. Even traditional GIS authorities recognize the economic value of non-expert contributions to the cloud of geographic knowledge. As the data sets available in the cloud expand, the push to standardize GIS data will continue. One example is the international Open Geospatial Consortium, made up of businesses, government agencies and educational institutions, which develops open geo-processing specifications.
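Among the OGC’s best-known specifications is the Web Map Service (WMS) interface, which standardizes how a client asks any compliant server for a map image. As a rough illustration of what that standardization buys developers, the sketch below assembles a WMS 1.1.1 GetMap request URL in Python; the server address and layer name are hypothetical:

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layer, bbox, width=512, height=512):
    """Build an OGC WMS 1.1.1 GetMap request URL for a map image.

    bbox is (min_lon, min_lat, max_lon, max_lat) in EPSG:4326 (lat/lon).
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "STYLES": "",               # default styling
        "SRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    return base_url + "?" + urlencode(params)

# Hypothetical server and layer, covering part of the Los Angeles basin
url = wms_getmap_url("https://example.com/wms", "parcels",
                     (-118.5, 33.9, -118.1, 34.2))
print(url)
```

Because the parameters are standardized, the same request logic works against any vendor’s WMS endpoint; only the base URL and layer names change.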

Why is the Cloud Suitable for Those Who Need GIS?

As the trends above suggest, the cloud solves many of the traditional problems with GIS: its expense in both money and time; the highly specialized knowledge required to implement and use it; the difficulty of maintaining it; the expense of acquiring and updating data; and the complexity of integrating data sets.

In addition, in the past GIS was the domain of experts, not of users across the organization or consumers. One of the major reasons the cloud advances GIS use by businesses is that they no longer need to be GIS experts to get the data they need. With “data as a service” companies multiplying, businesses can focus on their core competencies and “outsource” their GIS needs to a cloud-based provider. Now the gathering, integrating and maintaining of data takes place in the cloud. Time-to-market is compressed from years into weeks. Cloud applications are also flexible and scalable. Developers can add features and data easily and scale up and down as needed. All of these advantages reduce the concern for businesses that GIS technology will change faster than the system they’ve invested in.

In the past, most companies that decided to begin utilizing spatial technology did so slowly and carefully. They laboriously researched various options, because buying and learning a desktop system meant making sure it would hold up for a decade or more. Choosing a desktop system meant, basically, building a GIS “box,” gathering the data to put in the box, integrating the data so it could talk to other data, and maintaining both the software and the data.

When businesses contemplate the move to GIS today, they face ever-increasing pressure to choose and get to market as quickly as possible. This pressure gives an advantage to cloud-based technology: there is no upfront capital expenditure, and the onus of updating and maintaining functionality now falls to the cloud service provider. Equally valuable is the fact that the company no longer needs to collect or buy data, update it or figure out how to make it all fit together. Many businesses once locked out of the GIS market due to high startup costs are now able to get into the game.

Additional Benefits of the Cloud for GIS

The cloud leads to cost savings and reduces time-to-market, but the benefits of cloud GIS go further. The rapidly-expanding market for cloud-based applications is fueling the more rapid development of spatial applications. Every new application expands the market even further, as new users see the value of map-based information and analysis.

Another change is the greater simplicity of cloud GIS services, which means businesses don’t need as much expertise or as many technical experts as they did in the past. Along the same line, it’s now clear that the further the technology is moved toward the ultimate beneficiary, the greater the return on productivity. Think about the similar evolution of word processing and computer-aided design (CAD). There was a time when these technologies were managed as an isolated resource, such as the “word processing pool.” Now the technology is completely distributed: the word processing pool has been replaced by a support competency in the IT department, and any professional who is not proficient in its use is significantly less employable.

SaaS products are designed to be user-friendly. These services tend to have higher adoption rates by users and a shorter learning curve.  In governments and large businesses, that means more and more non-experts are using GIS at their desktops for the first time, and they are collaborating and sharing data across departments. It’s important to note that this collaboration and sharing increases business efficiency and productivity without placing an added burden on stretched IT departments.

Cloud data providers recognize the competitiveness of the market, and they invest considerable resources to ensure best of class uptime and computing power. The move to cloud-based services is also improving the quality of data that is available.

For developers, the cloud removes the considerable challenges of making GIS work and frees them to focus on creative, simple-to-use applications. Cloud development platforms provide standardized, easy-to-implement back ends for developers who need a robust, reliable, high-performance foundation. At the same time, these new platforms offer developers flexibility on the front end, with the ability to customize data, user interfaces, and integrations into workflows. Using the cloud, developers can create spatial infrastructures and tools that fit the unique needs of their users.

Why GIS will continue to move to the cloud in 2011…

The economy will continue to be an important factor in 2011. As spatial technology buyers weigh “desktop versus cloud” decisions, the simplicity and shortened development time of cloud technology and the need to do more with less will continue to tilt the scales in favor of the SaaS applications. It may be that the recession drives buyers to try the cloud, but when the recession lifts, a whole new group of users will have experienced the advantages of cloud-based GIS. So while it may have been the economy that sped up adoption of cloud-based services, their wide-spread use predicts cloud-based GIS is here to stay.

At the same time, younger, more tech-savvy leaders see the possibilities of location-based technology and expect their developers to deliver increasingly sophisticated applications. The cloud will give them the opportunity to realize those visions.

Some of the trends that will continue in 2011 include:

1. Developing spatial applications will become more affordable. Robust cloud-based spatial development platforms are in place. They are reducing the cost of developing applications precipitously and collapsing development timeframes. In many instances, developers will even be able to “try out” cloud-based services as part of their agile development process, further lowering risk.

2. It will become easier for developers to create robust mapping applications. As 2011 begins, developing spatial applications is still difficult, but quickly becoming less difficult. The cloud is moving GIS out of the backroom and into the mainstream of business and consumer applications. The emerging cloud-based development platforms will accelerate the adoption of GIS and also expand the number of GIS applications in the marketplace. Platforms and tools for developers to create location-aware applications will increase. Microsoft and Google will continue to play key roles by providing the base layers and making it easy to get started.

3. Developers will find innovative uses for GIS. Now that the technology has gotten simpler, developers will have more free time to focus on new uses and ways to integrate GIS. Because the technical “learning curve” has been erased, expect to see an explosion of new uses for GIS this year. But developers will need to recognize that, in order for GIS to be successful on a massive scale, it needs to be embedded into workflows or hidden from users. Users want innovation, but they don’t want to have to think about GIS.

4. Applications will incorporate more advanced functionality. As GIS in the cloud becomes more sophisticated, it will move beyond data and points on a map. The future is in combining data sets and leveraging them for analysis and business intelligence. APIs that enable heat mapping, thematic mapping, spatial queries and advanced data visualization will see more widespread use. The standardization of infrastructure, data and back-end spatial functions will continue.

5. The pool of spatial technology users will expand. New industries are rapidly embracing Cloud GIS, including business intelligence, local government and real estate. Large organizations will continue to make the move to GIS in the cloud because it makes smart business sense and keeps them competitive.

6. Cloud-based GIS platforms will move from components to solutions. As more industries adopt GIS in the cloud, the demand for complete spatial enablement solutions tailored to each industry’s specific needs will overtake the demand for individual GIS components such as data, query, or storage web services. Expect these turnkey solutions to include data bundled with spatial display and analytical features. The movement toward complete cloud-based GIS solutions is critical if non-geodevelopers are to fully utilize this technology and rapidly integrate mapping into traditional business and consumer applications.
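The thematic mapping mentioned in trend 4 typically starts with classifying attribute values into a small number of map classes, each rendered with its own shade. A minimal Python sketch of equal-interval classification, one common scheme among several; the ZIP codes and home values below are hypothetical:

```python
def equal_interval_breaks(values, classes):
    """Compute upper class breaks for an equal-interval thematic map."""
    lo, hi = min(values), max(values)
    step = (hi - lo) / classes
    return [lo + step * i for i in range(1, classes + 1)]

def classify(value, breaks):
    """Return the 0-based class index for a value, given upper breaks."""
    for i, upper in enumerate(breaks):
        if value <= upper:
            return i
    return len(breaks) - 1  # guard against floating-point edge cases

# Hypothetical median home values by ZIP code, mapped to 4 shade classes
home_values = {"90001": 310_000, "90210": 1_850_000,
               "90401": 920_000, "91101": 540_000}
breaks = equal_interval_breaks(list(home_values.values()), 4)
shades = {zip_code: classify(v, breaks) for zip_code, v in home_values.items()}
print(shades)
```

A cloud API exposing this kind of classification lets an application color ZIP-code polygons without the developer ever implementing map-rendering logic.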

2011 will be a pivotal year for GIS. It will be the year spatial technology goes mainstream. Instead of making incremental improvements to existing uses, new companies and industries will embrace GIS and incorporate it into their basic business operations. GIS will move from being a tool for specialists in a handful of industries to a common platform for business and consumer analysis, allowing exponentially more people to incorporate spatial data into their everyday decision making. The cloud has clearly transformed the nature of location technologies and will continue to propel the GIS industry forward in 2011.

About the Author

Jim Skurzynski is the President and CEO of Digital Map Products and an expert in spatial technology, computing in the cloud and on-line real estate and government technology.

Skurzynski is a founder of Digital Map Products, a leading innovator of web-enabled spatial solutions that has brought the power of spatial technology to mainstream business, government and consumer applications for more than 10 years. He has spent the majority of his career designing and managing the deployment of technology solutions in a variety of public and private sector environments. Over the past twenty years, he has held executive management positions in spatial technology companies in the USA, Canada, and Mexico.

 
