IBM’s Cloud Strategy: A Little Bit of Everything

By Dennis Barker

October 14, 2008

IBM launched a mini-campaign last week to promote its cloud computing strategy, and part of that strategy involves not using cloud computing at all.

In his official statement on the matter, Willy Chiu, vice president of the High Performance on Demand Solutions division, said IBM is moving clients and itself to “a mixture of data and applications that live in the datacenter and in the cloud.” A mix of on-premise and in-cloud operations, as another IBM exec put it. So, those concerned the world is going silly with cloudmania need not worry.

And if the critics are correct and it really is silly to stake a claim in cloud computing, IBM doesn’t seem to care. The company has made cloud announcements almost monthly during the past year, and last week outlined four ways it plans to “capture the cloud computing opportunity”: (1) offering its own set of cloud products and technologies; (2) helping other companies develop cloud services; (3) helping customers integrate cloud services into their businesses; and (4) building cloud environments for other companies.

“We see an enormous opportunity to help companies use cloud services and to integrate SaaS into their business,” said Dave Mitchell, director of strategy for IBM Developer Relations, during a discussion following IBM’s official announcement. Now that the “emotional” concerns about software as a service have settled, and security practices have proven effective, “the real challenges are bubbling to the top,” Mitchell said. The biggest one is integration. “You’re going to run some stuff on-premise, and some over the cloud. So how do you integrate and administer it all in the most effective way? That’s what we’re solving.”
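Neither Mitchell nor IBM published code for this, but the integration problem he describes can be pictured with a minimal sketch: a facade that routes the same kind of request to either an on-premise system or a cloud endpoint, giving administrators one place to manage both. Everything below (the HybridRouter class, the service names, the URLs) is hypothetical and is not IBM middleware.

```python
# Minimal sketch of a hybrid-integration facade: the same client call is
# routed to an on-premise service or a cloud (SaaS) endpoint based on
# registered configuration, so both can be administered through one
# interface. All names here are hypothetical; this is not IBM middleware.

from dataclasses import dataclass


@dataclass
class Endpoint:
    name: str
    location: str  # "on-premise" or "cloud"
    url: str


class HybridRouter:
    def __init__(self) -> None:
        self._routes: dict[str, Endpoint] = {}

    def register(self, service: str, endpoint: Endpoint) -> None:
        self._routes[service] = endpoint

    def dispatch(self, service: str, payload: dict) -> str:
        ep = self._routes[service]
        # A real integration layer would handle auth, retries, and
        # data-format translation here; this toy just reports the routing.
        return f"{service} -> {ep.location} endpoint {ep.url} with {payload}"


router = HybridRouter()
router.register("crm", Endpoint("crm", "cloud", "https://saas.example.com/crm"))
router.register("payroll", Endpoint("payroll", "on-premise", "https://intranet.local/payroll"))

print(router.dispatch("crm", {"action": "lookup", "id": 42}))
print(router.dispatch("payroll", {"action": "run"}))
```

The point of the indirection is that a client calling dispatch() never needs to know, or change, when a service moves between the datacenter and the cloud.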

IBM’s strategy for solving the integration issue: money and effort. Last November, it announced the Blue Cloud initiative, a program aimed at helping companies run large-scale applications and handle massive amounts of data across a distributed, globally accessible fabric of resources — or to move enterprise applications to a cloud. IBM supplies the know-how (based on its own cloud construction experience), the hardware (typically racks of iDataPlex servers), and the middleware (mainly Tivoli for service management).

The company says it has dedicated 200 “Internet-scale” scientists to cloud research and is sponsoring programs at universities to train the cloud scientists of tomorrow. In recent months, it has opened cloud computing research centers around the world where customers can develop and test cloud technologies with Big Blue’s assistance; the newest are in Brazil, Vietnam, India, and South Korea, for a total of 13. The company says it is spending nearly $400 million to expand cloud computing capabilities at its research parks in North Carolina and Tokyo.

All this research has paid off, Mitchell said, with technologies that enable instant provisioning of resources across many servers, dynamic workload management, utility-based usage and accounting, and improved security. “These technologies underlie everything we’re doing to make the cloud viable and affordable. Clients have the ability to allocate servers and other resources on demand. Advances at the component level include the promise of storage devices with extreme data access speeds.” A related program is the New Enterprise Data Center, focused on making IT more efficient. A major catalyst for the program is “the fact that the cost of running a datacenter has gotten extraordinarily expensive,” Mitchell said.
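IBM did not detail how these technologies work, but two of the ideas Mitchell names, on-demand allocation and utility-based usage and accounting, can be illustrated with a toy model: a shared pool that leases servers when asked and bills only for the time they were held. The ResourcePool class and its rate are invented for illustration; this is not Tivoli or any real IBM API.

```python
# Toy illustration of on-demand provisioning with utility-based accounting:
# servers are allocated from a shared pool when requested and billed by the
# hour on release. Purely illustrative; not Tivoli or any real IBM API.

import time


class ResourcePool:
    def __init__(self, capacity: int, rate_per_hour: float) -> None:
        self.capacity = capacity
        self.rate = rate_per_hour
        self.leases: dict[str, tuple[int, float]] = {}  # client -> (servers, start time)

    def provision(self, client: str, servers: int) -> None:
        if servers > self.capacity:
            raise RuntimeError("pool exhausted; a real system would queue or scale out")
        self.capacity -= servers
        self.leases[client] = (servers, time.time())

    def release(self, client: str) -> float:
        servers, start = self.leases.pop(client)
        self.capacity += servers
        hours = (time.time() - start) / 3600
        return servers * hours * self.rate  # pay only for what was actually used


pool = ResourcePool(capacity=100, rate_per_hour=0.50)
pool.provision("acme", servers=8)
# ... in a real run, hours of workload would elapse here ...
bill = pool.release("acme")
print(f"acme owes ${bill:.4f}")
```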

Anyone interested in a public example of an IBM cloud application can check out the beta site for Bluehouse, which demonstrates a place where people from different organizations can work together by sharing all kinds of files, holding online meetings, and so on. (IBM marketing says it combines social networking and online collaboration tools.)

IBM also has set up 40 “innovation centers,” where independent developers can “do porting, technical enablement, and have all the equipment they need to perform scalability testing and benchmarking,” Mitchell said. “We invested heavily in our technical blueprint for building a SaaS solution on IBM middleware, a multitenant solution. ISVs can take this information and bring it to one of our innovation centers to develop their own projects.”

IBM has been “quite aggressive in establishing datacenters and cloud computing centers internationally to lay the groundwork for a shift in how people view large-scale computing,” said Vishwanath Venugopalan, analyst with The 451 Group. “Not many cloud infrastructure providers have been as open with their own infrastructural plans.”

Delivering Solutions, Not Clouds

One vendor leveraging IBM’s tools to deliver its applications is iEnterprises. The company runs two datacenters to provide its CRM applications to desktop and wireless clients in the legal and pharmaceutical industries, among others. “We offer CRM solutions that are cloud-enabled, but we’re not selling cloud computing,” said John Carini, chief software architect. “The term doesn’t appeal to many of our customers. They just want a solution they know they can download to their BlackBerrys or Windows Mobile devices on demand.” Carini said iEnterprises is in the IBM partner program because “they’re providing a platform that companies like us can use to quickly build applications on and then serve them up to many different clients. It’s all relatively easy to use; everything is SOA-enabled.” 

“IBM has this initiative to go to market with cloud computing, but it’s also really good to partner with them because they’re not going to compete directly with us,” Carini said. “That sets them apart from some other software companies. They’re more of a middleware company.” (Mitchell notes a similar agenda: “We’ve made a very concerted effort to move away from commoditized hosting to higher-value middleware services.”)

One thing research has not answered, however, is which applications should run locally and which can run in the cloud. “There’s not a black-and-white answer, there’s not a clear line that says ‘Thou shalt run that in the cloud,’” Mitchell says. In areas like CRM, HR and collaboration, he adds, there is a definite trend toward cloud-based delivery models. Anything that involves a lot of collaboration, requires access from disparate locations, or is highly standards-based can be a good fit for a SaaS model. Mitchell says IBM is seeing applications being developed in finance, accounting and supply chain management, with relatively few areas deemed off-limits to the kind of SaaS platform it is marketing.

“But there clearly are going to be companies that say ‘We’re never going to have our HR app running outside our four walls,’” he acknowledges. “What’s OK for one company is not OK for another.”

With all this talk about cloud computing and SaaS, it might be tough to tell where one ends and the other begins. “We clearly make a distinction between cloud computing and SaaS,” Mitchell says. “Most people would agree that SaaS is an example of cloud computing, a subset, but clearly there’s more to the cloud than SaaS. Software as a service is the most visible, the most mature example of cloud computing out there. Bluehouse is a cloud solution. But within the cloud framework there are many other activities, such as infrastructure as a service, platform as a service. [T]here are also private clouds, with enterprise customers taking the cloud computing model and deploying it in-house to drive efficiencies. The key is that customers have an efficient and effective solution, whether it’s running in their own datacenter or someone else’s, or both.” (In conversation, Mitchell referred to SaaS far more often than he used the cloud word. The SaaS-to-cloud ratio was about 10:1.)

OK, So What Is Cloud Computing?

“Cloud computing as a concept is going through some definitional structuring,” Mitchell says. “People clearly have different opinions. Some take a broad view: anything delivered over the network. But such a definition is so broad it’s useless. We’ll have to wait a while for a definition that’s widely accepted, but cloud services should have some specific characteristics.”

He believes there needs to be on-demand, request-driven provisioning; scalability and dynamic workload management; the ability to offer a subscription or utility model; open standards-based applications; and foolproof security. “And the feedback we’re getting from customers,” he adds, “is that services and applications have to be available from any location on any device.”

While the IT community continues to work out that definition, “IBM continues to lay out its offerings for various customer constituencies, including ISVs and end users,” The 451 Group’s Venugopalan says. “This latest set of announcements is not as infrastructurally focused as IBM’s prior announcements and seems to broaden the applicability of cloud computing to capitalize upon greater awareness of the term.”

Whatever the definition of cloud computing actually is, most pundits will tell you the paradigm will catch on first with developers and SMBs. Thus, there might be some concern over Venugopalan’s notion that Big Blue currently is limiting its target audience. “One area in which cloud computing shows promise is how end users can provision and begin using infrastructure with little more than a credit card, thanks to self-provisioning and wide-ranging automation,” he says. “To date, IBM has undoubtedly invested heavily in technologies frequently found in cloud infrastructure today. However, to the extent that IBM has revealed its cloud strategy, it is designed to best appeal to prospective customers of IBM’s extensive portfolio of products and services offerings, rather than to small-time developers.”

Not that IBM is without a black book full of customers, both paying and potential, to tap.
