2012 Cloud Forecast

By Oleg Komissarov, Vice President, Enterprise Solutions

December 15, 2011

Not surprisingly, cloud computing will remain a hot topic in 2012. Leading Platform-as-a-Service providers will keep investing billions in huge datacenters with megawatts of power capacity. Software-as-a-Service platform development will be a main goal for software and Internet industry leaders, and enterprises will weigh in on the best approaches for adopting, or adapting to, cloud strategies. 2011 may have been cloud’s fifth birthday, but in 2012, the six-year-old will start making a truly distinctive name for itself.

On the PaaS and IaaS fronts, datacenter players continue to extend their global footholds. Microsoft will invest an additional $150 million to expand its new datacenter in Southern Virginia even as it is still completing the $499 million first phase of the project; overall, the company will invest $900 million in datacenters in the US and Ireland in 2012 alone. Meanwhile, Google is expected to establish a $600 million datacenter in Oklahoma, a $100 million one in Ireland, and additional datacenters in Singapore, Taiwan and Hong Kong. IBM has proposed a 620,000 sq ft datacenter in Langfang, China, which will provide infrastructure for China’s exploding business growth and host numerous e-government services for food and drug safety systems, electronic medical records and other government projects.

Microsoft, Google, Yahoo, Amazon and Facebook are building these datacenters to support the expanding range of their Internet services and the ever-growing demand for file sharing (images, video and documents). More importantly, though, they are betting on the growing popularity of their Platform-as-a-Service and Software-as-a-Service offerings. The higher number of projects we completed for our customers on PaaS and SaaS platforms in 2011 compared with 2010 confirms this trend. We expect the number of such projects to triple in 2012 and are making preparations on our side to support that demand. Because of the shortage of cloud specialists on the market, we are increasing our investment in specialist training and certification.

Software-as-a-Service platform development is another ubiquitous venture that has legacy stakeholders and impressive novices buzzing. So far the Salesforce platform remains the most mature SaaS platform, especially after the Heroku acquisition, but the picture will change as new players step in. The major shift to watch will be Facebook actively moving into the ranks of Software-as-a-Service solution providers. The company is building its own $450 million datacenter in North Carolina and has opened a new development center in New York, where it wants to hire “as many developers as possible.”

“Our future looks bright; we want the next Facebook to start here, in New York City,” said Sheryl Sandberg, Facebook chief operating officer. Impressed by the success of Salesforce CRM, Facebook will also probably start building business applications and APIs for its highly scalable social networking platform, and will actively push its “Social Enterprise” ideology globally.

Drawn by the performance, scalability and quality promised by SaaS software, and also by the “cloud buzz,” enterprise leaders will start building strategies for adopting cloud technologies. But SaaS platforms are still young and do not yet provide the software components required for line-of-business applications. Adopters will also face other challenges, such as licensing issues, the impact of regulatory restrictions, cost justification, and interoperability and integration with legacy solutions.

Even for IaaS platforms, there are many limitations. In the financial industry, for example, broker-dealers have been regulated by the SEC (Securities and Exchange Commission) since 1934. Rule 301 specifies requirements including stress testing, security reviews, oversight procedures, disaster recovery plans, annual audits, and the reporting of outages and system changes via periodic form amendments. But the regulations do not specify how these responsibilities are divided between cloud service providers and their clients. The only way for the regulator to reach into the cloud is a workaround: requiring broker-dealers to put specific provisions into their outsourcing agreements with cloud service providers. So instead of active enterprise cloud adoption in 2012 (what adoption there is will mostly be migration to private clouds), it will still be a year of early-stage cloud evaluation and new cloud business models. Enterprises and the young cloud platforms have a long learning curve to go through before they become effective for each other.

Cloud technologies are capable of processing billions of transactions and storing petabytes of data, making this environment very attractive for industries where such capabilities are critical to business success. Energy and manufacturing are good examples, and both will be actively utilizing the cloud in 2012. For years, energy utilities have been trying to build Smart Grid solutions containing hundreds of thousands of smart meters, promising consumers savings on their electrical bills. Overall the idea failed because of the high cost of installation and the high cost and low scalability of the hosting infrastructure and device network. Combining inexpensive new wireless technologies like ZigBee with the advantages of the cloud and emerging innovations from companies like ThinkEco and Artemis Automation will enable energy management solutions that let consumers control energy consumption from mobile devices, conceivably delivering a 15 percent reduction in energy costs. Giants like IBM and GE have announced Smart Energy Cloud solutions for energy retailers and consumers. These solutions gather data in near real time from millions of sensors, transfer it into cloud databases, and organize highly effective, scalable and elastic utility management systems with two-way participation by consumers and providers.
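
The pattern underneath these solutions is simple: field devices batch their readings and push them to a cloud ingestion endpoint, which fans them out to elastic storage and analytics. As a minimal sketch of that pattern (the endpoint URL, JSON schema and function names here are hypothetical illustrations, not any vendor's actual API), a meter gateway in Python might look like this:

    # meter_uplink.py - hypothetical sketch of a smart-meter gateway
    # pushing batched readings to a cloud ingestion endpoint over HTTP.
    import json
    import time
    import urllib.request

    INGEST_URL = "https://ingest.example-energy-cloud.com/v1/readings"  # placeholder

    def read_meter(meter_id):
        """Stand-in for a real ZigBee read; returns one reading."""
        return {"meter_id": meter_id,
                "timestamp": int(time.time()),
                "kwh": 0.42}  # dummy value

    def push_batch(readings):
        """POST a batch of readings as JSON; batching keeps the
        per-device network overhead low at millions of sensors."""
        body = json.dumps({"readings": readings}).encode("utf-8")
        req = urllib.request.Request(
            INGEST_URL, data=body,
            headers={"Content-Type": "application/json"})
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status == 200

    if __name__ == "__main__":
        push_batch([read_meter("meter-%d" % i) for i in range(100)])

The elasticity the vendors advertise lives on the other side of that endpoint: ingestion and storage nodes can be added as meter counts grow, while the devices themselves stay cheap and simple.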

In the UK, the Smart Metering Implementation Programme will support up to 50 million devices. Manufacturers will follow this trend because they have similar requirements. In 2011 we completed several projects in which customers’ electronic devices were wirelessly connected to the cloud, and we see growing demand in this area for 2012. To meet that demand, we continue to actively develop our embedded practice alongside our cloud technologies.

Product companies with industry-specific solutions will start porting them to the cloud, together with new Web interfaces. They will develop Facebook-compatible applications connected through its APIs, provide software components for SaaS platforms, and register applications on the Salesforce AppExchange and Google Apps Marketplace. They will start to seriously consider social networking platforms both as application platforms for their products and as new marketing opportunities.
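
At its simplest, a “Facebook-compatible” application is one that talks to Facebook's Graph API over plain HTTP. A minimal sketch of that call pattern (the access token is a placeholder that a real application would obtain through Facebook's OAuth login flow):

    # graph_profile.py - minimal sketch of reading a user's profile
    # through Facebook's Graph API. ACCESS_TOKEN is a placeholder
    # obtained via the OAuth login flow in a real application.
    import json
    import urllib.request

    ACCESS_TOKEN = "..."  # placeholder

    def get_profile():
        url = ("https://graph.facebook.com/me?access_token="
               + ACCESS_TOKEN)
        with urllib.request.urlopen(url) as resp:
            return json.loads(resp.read().decode("utf-8"))

    if __name__ == "__main__":
        print(get_profile().get("name"))

The same pattern extends to writing to a user's feed or reading a page's data, which is what makes the platform attractive as an application substrate rather than just a marketing channel.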

Lastly, 2012 will open up a new era of big data analytics for enterprises. Microsoft, IBM and Oracle have rushed into this area and announced big data product releases. Oracle will release its Hadoop-based Big Data Appliance in January. Microsoft will ship Hadoop integration for SQL Server in 2012 and should have its beta Hadoop service on Azure by the end of 2011. So, starting in 2012, big data and big data analytics will be demystified and brought to the masses. They will change the way decisions are made. The financial industry will use them to monitor social media to predict customer buying behavior, incorporate analytics into trading decisions, detect complex patterns, and filter misinformation. Governments will use them to predict cyber-attacks and prevent crimes. There is a wide range of use cases for real-time big data analytics in healthcare, hospitality, telecommunications and other industries. Organizations will suffer from a sharp shortage of specialists in this area. So, for those who tire of the cloud computing noise, big data analytics in the cloud promises to stimulate some interesting discussion.
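
All of these products build on Hadoop's MapReduce model, which Hadoop Streaming makes easy to show in a few lines. Here is a sketch (the input path and the ticker-mention use case are illustrative assumptions) of counting $TICKER mentions across a large social media feed. First the mapper, which Hadoop runs in many parallel copies across the data:

    # mapper.py - emit (ticker, 1) for each $TICKER-style mention
    # found in a line of feed text.
    import re
    import sys

    for line in sys.stdin:
        for ticker in re.findall(r"\$[A-Z]{1,5}\b", line):
            print("%s\t1" % ticker)

Then the reducer, which sums the counts; Hadoop Streaming delivers mapper output sorted by key, so equal keys arrive adjacent:

    # reducer.py - sum counts per ticker from sorted mapper output.
    import sys

    current, count = None, 0
    for line in sys.stdin:
        key, value = line.rstrip("\n").split("\t", 1)
        if key != current:
            if current is not None:
                print("%s\t%d" % (current, count))
            current, count = key, 0
        count += int(value)
    if current is not None:
        print("%s\t%d" % (current, count))

The pair runs via the streaming jar that ships with Hadoop, for example: hadoop jar hadoop-streaming.jar -input feeds/ -output counts/ -mapper mapper.py -reducer reducer.py. The point is less the counting than the shape: the same two-function pattern scales from one machine to thousands, and that is what the vendors are packaging for the enterprise.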

About the Author

Oleg Komissarov is a veteran of the IT industry with more than 15 years of experience in custom software development and enterprise systems architecture. He joined DataArt’s St. Petersburg office in 2006 as a senior software developer and advanced to software architect in 2009. During that time he was responsible for implementing enterprise solutions for key financial clients in the United States and Europe. In 2010 he relocated to the New York headquarters and was appointed Vice President of Enterprise Solutions.

Prior to DataArt, Oleg worked as a Senior Industrial Software Architect at Magnitogorsk Iron & Steel Works (MMK:LI). He holds an MS in Electronic Engineering from Magnitogorsk State University and an MS in System Engineering from Yekaterinburg State University (Russia).
