2012 Cloud Forecast

By Oleg Komissarov, Vice President, Enterprise Solutions

December 15, 2011

Not surprisingly, cloud computing will remain a hot topic in 2012. Leading Platform-as-a-Service providers will keep investing billions in huge datacenters with megawatts of power capacity. Software-as-a-Service platform development will be a main goal for software and Internet industry leaders, and enterprises will weigh in on the best approaches for adopting, and adapting to, cloud strategies. 2011 may have been the cloud’s fifth birthday, but in 2012 the six-year-old will start making a truly distinctive name for itself.

On the PaaS and IaaS fronts, datacenter players continue to expand their global footprint. Microsoft will invest an additional $150 million to expand its new datacenters in Southern Virginia, even as it is still completing the $499 million first phase of the project. Overall, the company will invest $900 million in datacenters in the US and Ireland in 2012 alone. Meanwhile, Google is expected to establish a $600 million datacenter in Oklahoma, invest $100 million in Ireland, and build datacenters worldwide in Singapore, Taiwan and Hong Kong. IBM has proposed a 620,000 sq ft datacenter in Langfang, China, which will provide infrastructure for China’s exploding business growth and host numerous e-government services, including food and drug safety systems, electronic medical records and other government projects. Microsoft, Google, Yahoo, Amazon and Facebook are building datacenters to support the expanding range of their Internet services and the ever-growing demand for file sharing (images, video and documents). More importantly, they are betting on the growing popularity of their Platform-as-a-Service and Software-as-a-Service offerings. The higher number of projects we completed for our customers on PaaS and SaaS platforms in 2011 compared to 2010 confirms this trend. We expect the number of such projects to triple in 2012, and we are making all necessary preparations on our side to support this demand. Because of the shortage of cloud specialists on the market, we are increasing our investment in specialist training and certification.

Software-as-a-Service platform development is another ubiquitous venture that has legacy stakeholders and impressive novices buzzing. So far the Salesforce platform remains the most mature SaaS platform, especially after the Heroku acquisition, but the picture will change and new players will step in. The major shift to watch will be Facebook actively moving into the ranks of Software-as-a-Service solution providers. The company is currently building its own $450 million datacenter in North Carolina and has opened a new development center in New York, where it wants to hire “as many developers as possible.”

“Our future looks bright; we want the next Facebook to start here, in New York City,” said Sheryl Sandberg, Facebook chief operating officer. Inspired by the success of Salesforce CRM, Facebook will also probably start building business applications and APIs for its highly scalable social networking platform and will actively push its “Social Enterprise” ideology globally.

Aiming to improve the performance, scalability and quality of SaaS software, and spurred on by the “cloud buzz,” enterprise leaders will start building cloud adoption strategies. But SaaS platforms are still young and do not yet provide the software components required for line-of-business applications. Adopters will also face other challenges, such as licensing issues, the impact of regulatory restrictions, cost justification, and interoperability and integration with legacy solutions.

Even for IaaS platforms, there are many limitations. In the financial industry, for example, broker-dealers have been regulated by the SEC (Securities and Exchange Commission) since 1934. Rule 301 specifies requirements including stress testing, security reviews, oversight procedures, disaster recovery plans, annual audits, and the reporting of outages and changes via periodic Form amendments. But the regulations do not specify how these responsibilities are divided between cloud service providers and their clients. The only way for regulators to reach the cloud is through a workaround: requiring broker-dealers to put specific provisions in their outsourcing agreements with cloud service providers. So, instead of active cloud adoption by enterprises in 2012 (which will mostly be migration to private clouds), it will still be a year of early-stage cloud evaluation and new cloud business models. Enterprises and young cloud platforms have a long learning curve ahead before they become effective for each other.

Cloud technologies are capable of processing billions of transactions and storing petabytes of data, making this environment very attractive for industries where such capabilities are critical to business success. Energy and manufacturing are good examples, and both will actively utilize the cloud in 2012. For years, energy utilities tried to build Smart Grid solutions comprising hundreds of thousands of smart meters, promising consumers savings on their electric bills. Overall, the idea failed because of the high cost of installation and the high cost and low scalability of the hosting infrastructure and device network. Combining new inexpensive wireless technologies like ZigBee with the advantages of the cloud, along with emerging innovative technologies from companies like ThinkEco and Artemis Automation, will enable energy management solutions that let consumers control energy consumption from mobile devices, conceivably delivering a 15 percent reduction in energy costs. Giants like IBM and GE have announced Smart Energy Cloud solutions for energy retailers and consumers. These solutions gather near-real-time data from millions of sensors, transfer it into cloud databases, and organize highly effective, scalable and elastic utility management systems that support two-way participation between consumers and providers.

In the UK, the Smart Meters Implementation Programme will support up to 50 million devices. Manufacturers will follow this trend because they have similar requirements. In 2011 we completed several projects for our customers in which electronic devices were wirelessly connected to the cloud, and we see growing demand in this area for 2012. To meet it, we continue to actively develop our embedded practice alongside cloud technologies.
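The pattern described above, meters emitting readings that a cloud back end ingests and aggregates, can be sketched in a few lines. This is a minimal illustration in Python, not any vendor's actual API; the message format and meter IDs are hypothetical.

```python
import json
from collections import defaultdict

def make_reading(meter_id, timestamp, kwh):
    """Device side: package one meter reading as a JSON message for the cloud."""
    return json.dumps({"meter": meter_id, "ts": timestamp, "kwh": kwh})

def aggregate(messages):
    """Cloud side: sum consumption per meter from a batch of incoming messages."""
    totals = defaultdict(float)
    for msg in messages:
        reading = json.loads(msg)
        totals[reading["meter"]] += reading["kwh"]
    return dict(totals)

# A batch of 15-minute interval readings from two hypothetical meters.
batch = [
    make_reading("meter-001", "2012-01-01T00:00:00Z", 0.42),
    make_reading("meter-001", "2012-01-01T00:15:00Z", 0.38),
    make_reading("meter-002", "2012-01-01T00:00:00Z", 1.10),
]
print(aggregate(batch))
```

In a real deployment the transport would be a wireless mesh (e.g. ZigBee) feeding a cloud message queue, and the aggregation would run elastically across many nodes; the per-meter roll-up shown here is what makes consumer-facing dashboards and two-way participation possible.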

Product companies with industry-specific solutions will start porting them to the cloud along with new Web interfaces. They will develop Facebook-compatible applications connected through APIs, provide software components for SaaS platforms, and register applications on the Salesforce AppExchange and Google Apps Marketplace. They will start to seriously consider social networking platforms both as application platforms for their products and as new marketing opportunities.

Lastly, 2012 will open up a new era of big data analytics for enterprises. Microsoft, IBM and Oracle have rushed into this area and announced big data product releases. Oracle will release its Hadoop-based Big Data Appliance in January. Microsoft will release a Hadoop-based solution for SQL Server in 2012 and should have its beta Hadoop service on Azure by the end of 2011. So, starting in 2012, big data and big data analytics will be demystified and brought to the masses, changing the way decisions are made. The financial industry will use it to monitor social media to predict customer buying behavior, incorporate analytics into trading decisions, detect complex patterns, and filter misinformation. Governments will use it to predict cyber-attacks and prevent crimes. There is a wide range of use cases for real-time big data analytics in healthcare, hospitality, telecommunications, and other industries. Organizations will suffer from a sharp shortage of specialists in this area. So, for those who tire of the cloud computing noise, big data analytics in the cloud promises to stimulate some interesting discussion.
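At the heart of the Hadoop-based products mentioned above is the MapReduce programming model: a map phase emits key-value pairs, the framework groups them by key, and a reduce phase combines each group. The canonical word-count example can be sketched in plain Python (no Hadoop cluster required) to show the model itself; on a real cluster, the map and reduce phases run in parallel across many machines.

```python
from itertools import groupby
from operator import itemgetter

def map_phase(records):
    """Map: emit a (word, 1) pair for every word in every input record."""
    for record in records:
        for word in record.lower().split():
            yield (word, 1)

def reduce_phase(pairs):
    """Shuffle/sort by key, then reduce: sum the counts for each word."""
    counts = {}
    for key, group in groupby(sorted(pairs, key=itemgetter(0)), key=itemgetter(0)):
        counts[key] = sum(count for _, count in group)
    return counts

docs = ["big data analytics", "big data in the cloud"]
print(reduce_phase(map_phase(docs)))
```

The same map/group/reduce shape underlies the social-media monitoring and pattern-detection use cases the article mentions; only the map and reduce functions change.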

About the Author

Oleg Komissarov is a veteran of the IT industry with more than 15 years of experience in custom software development and enterprise systems architecture. Oleg joined DataArt’s St. Petersburg office in 2006 as a senior software developer and advanced to software architect in 2009. During that time he was responsible for implementing enterprise solutions for key financial clients in the United States and Europe. In 2010 he relocated to the New York headquarters and was appointed Vice President of Enterprise Solutions.

Prior to DataArt, Oleg worked as a Senior Industrial Software Architect at Magnitogorsk Iron & Steel Works (MMK:LI). He holds an MS in Electronic Engineering from Magnitogorsk State University and an MS in System Engineering from Yekaterinburg State University (Russia).
