William Fellows on the Year Ahead in Grid Computing

By Derrick Harris, Editor

January 23, 2006

It’s almost a month into 2006, and the Grid world has been relatively quiet. However, if The 451 Group’s William Fellows is correct, we should see the action ramp up as the year progresses, as he sees 2006 as the year “Grid technologies start to cross the chasm into the enterprise.” In this interview, Fellows discusses, among other topics, what this year will bring in terms of standards, software licensing and enterprise adoption of Grid computing.

GRIDtoday: Before we get into what to expect in 2006, can you give a recap of 2005? What were some of the key happenings and trends in the Grid space, and how might they affect what we see in the year to come?

WILLIAM FELLOWS: Grid computing is no longer only about high-performance computing or implementing commodity servers. In 2005, we saw Grid computing become firmly established as a means to support service-oriented architecture (SOA) application infrastructure, utility computing delivery infrastructure and the automated datacenter.

In 2005, The 451 Group also identified the key challenges shared by early adopters across several different vertical markets that have been barriers to the further adoption of Grid computing — including software licensing and data management issues. With integrators and ISVs coming to Grid computing, and Grid-based SOA and utility IT models coming into view, the expectation is that 2006 will be the year Grid technologies start to cross the chasm into the enterprise. But, we expect that early adopters will still be plagued by these barriers — and vendors will begin to address them. Grid is a means to an end. The destination, not the journey, should be the focus.

Gt: On to the challenges the Grid market will face in 2006. You mentioned software licensing. What is on the horizon for licensing this year? Will we finally see vertical market-specific ISVs adopting Grid-friendly licensing models?

FELLOWS: Software licensing is definitely the key concern for an increasing number of early adopters. We believe that as early adopters evolve into using grids as a more mainstream technology, the restrictions of current software licensing will become an ever-greater obstacle. Many of the Grid vendors talk about supporting the ability to proactively manage the use of software licenses based on business objectives. But early adopters have their doubts about this and are asking for real and tangible changes. They see metered usage as a potential way to manage software licenses, and in 2006, we could see users collectively exerting pressure, within vertical markets, on suppliers for change. We are already seeing some changes under way in the electronic design automation (EDA) sector.

Gt: What can we expect in the field of standards throughout the year? Will the oft-maligned standards community finally get its act together and develop some widely adoptable standards?

FELLOWS: The world of standards has contributed very little to commercial, enterprise Grid computing. The proliferation of Grid industry bodies only means more confusion, not less, and has throttled momentum. The standards that do exist today are not relevant, and there is no evidence that standards are being used in implementations or that any product is being built with them. If grids can find a place in one of the open source stacks, such as LAMP, it would undoubtedly help further adoption.

We believe there needs to be a convergence of standards efforts. GGF announced that it would try to find synergies with other standards groups in order to avoid duplicating efforts. GGF and EGA have already mentioned that they will try to combine their organizations. This is a start. The hard part will be how to reconcile short-term enterprise goals with longer-term global ambitions.

Gt: What about the suggested merge of the GGF and the EGA? How would this affect the standards community, and the Grid community as a whole?

FELLOWS: In 2005, driven by the companies that have membership with both organizations, GGF and EGA said they would look at ways of combining their organizations. Clearly, the landscape needs to change, but the question is how the longer-term “global Grid” ambitions of the science and research-oriented GGF can be reconciled with the shorter-term commercial goals of the EGA. Money, membership and addressing a lack of user interest will be key.

Separately, GGF cannot and will not be able to get its arms around all parts of the distributed computing “value chain.” But, if it wants to become a place where this action happens, it needs to create a leadership position, with new processes, relationships and activities. Collaboration with other groups is a start, though talking about how to regulate standards isn’t going to excite early adopters much until something concrete emerges, which is not even on the agenda at this point. Nevertheless, ensuring that grids can be a full participant within other standards activities brings Grid computing right to the table.

Gt: Where do end-users play into the standards debate? Will we begin to see a strong interoperability push from major end-users?

FELLOWS: Looking at actual implementations, early adopters don’t appear — by and large — to care much about standards, except those de facto ones that reign in particular markets, such as Oracle and Linux. But, ask early adopters what they want, and they typically want standardization of the stack and APIs like data input/output. They want one set of standards and one stack, not multiple. They say the growing number of Grid industry bodies suggests confusion and growing complexity in approach, not standardization.

The danger is that the momentum that Grid computing has gathered will be checked by the inability of organizations to converge. We believe that vendors should not allow this to happen. Again, we think some groundswell awareness might be generated if grids can find a place in one of the open source stacks.

In some sectors, there are de facto standards, but there is no Grid stack yet. In other sectors, users are pressuring suppliers to integrate. Major banks have asked Platform Computing and DataSynapse to create a plan to allow their respective middleware to share resources. Presently, the two require dedicated resources.

Gt: Moving on to another area, I’m wondering what you foresee in terms of adoption of Grid-related technologies such as SOA or utility computing? It seems like SOA is everywhere, and the major vendors have been ramping up the hype around utility options.

FELLOWS: With integrators and ISVs coming to Grid computing, and Grid-based SOA and utility IT models coming into view, we think this will be the year Grid technologies start to penetrate into the enterprise. For many organizations, especially in financial services, grids are the basis of SOA. Grids allow them to run Web services better, faster and cheaper.

We see Grid technologies becoming more invisible as they are pushed down into systems and systems/network management software stacks. Changes in the way companies buy software and IT services — such as pay as you go, subscription and outsourcing — suggest there is a long-term and cumulative disruption under way in the technology market.

Vendors hope users will migrate to grids and utility computing for more than just peak loads or high-performance requirements. But utility computing in enterprise organizations is not just around the corner, despite what vendors may suggest. Despite all the hype, it will take a lot of time and effort to turn everything into a service. At this point, nothing suggests that utility models will be an “all or nothing” play for users. The use of grids to support utility models and SOAs within enterprises will be the key market initially, with companies testing outsourced grid services for additional capacity and loads. Grids can underpin SOA, the 21st century data center and utility computing models, but the extent to which they can be integrated with legacy event-driven services, messaging, database systems and networking systems will be crucial to their success. IT as a shared utility underpinned by Grid technologies, and the business models this supports, will become increasingly important as a destination.

Gt: Which leads to the inevitable question of what 2006 will bring in terms of enterprise adoption. Can we expect to see widespread adoption take off this year? If so, what will be the main drivers?

FELLOWS: In 2005, we saw the major IT vendors spend heavily to market the concept of Grid computing. Expectations have never been so high, and in 2006 we will see whether it was money well spent. The expectation is that 2006 will be the year Grid technologies start to cross the chasm into the enterprise. Vendors agree that there is increasing customer awareness of grids. Introductions of Grid-based “utility” services from Sun, HP, AT&T, IBM, CSC, EDS and other integrators should move from press releases to production use cases in 2006.

Grids are already well established doing the grunt work in vertical markets, but we believe that grids will start to find a broader enterprise role. Systems integrators have begun to form Grid practices. They can sense a commercial opportunity building and will start helping customers deploy grids across the enterprise. Grid-enablement of enterprise applications will also gather momentum as ISVs Grid-enable their programs. Software licensing and data management will remain the likely challenges. We will also look to see if grids start to appear in database and transactional environments.

Gt: It seems to me that one of the image problems Grid faces is that potential users see it as primarily an HPC solution, best suited for compute- and time-intensive tasks. When will vendors start to push Grid’s capabilities in data management, flexibility and disaster recovery, to name a few?

FELLOWS: Vendors have made a gamble that grids will be used more widely in the enterprise, to support a range of activities including SOA, data center automation and utility delivery, down to development, disaster recovery, ERP and supply chain — not just for HPC applications or consolidation purposes. The limited availability of commercial applications for grids, as well as utilities and design points, presents a real and present barrier to this happening. To encourage the use of grids downstream from HPC, vendors need applications that can take advantage of the adoption of grids.

Existing applications will be migrated to grids, developed as new for grids or supplied as software-as-a-service models. HP’s planned Application Provisioning Service is a good example of the latter. Sun and IBM both have application ISV partner programs. Middleware providers — generally known for job scheduling and reporting — are also eager for customers to move grids downstream of HPC. Platform, DataSynapse, United Devices and Univa hope to create sustainable, scalable businesses by facilitating and managing “downstream” enterprise grids.

In 2006, we expect the likes of EDS, CSC, Accenture and IBM to move from proof-of-concept grid engagements to production references. Second-tier systems integrators and offshore companies, including Satyam, Tata Consultancy Services, Infosys Technologies and Wipro Technologies, are also establishing enterprise Grid practices. Collectively, SIs can leverage huge communities of partners to build momentum around the use of grids in the enterprise.

Early adopter interest in deploying grids across the enterprise hinges entirely on business need, with applications as the essential driver. Where it makes sense, vertical-market ISVs have already embraced grids and more are coming every quarter. We think what happens in the market will depend on what the major vendors do. Companies to watch in 2006 will be SAP, Oracle and business intelligence vendors.

Gt: Finally, if you had to make a bottom-line assessment for the year to come, what would it be? When it’s all said and done, will the Grid market grow, shrink or become stagnant?

FELLOWS: In 2006, we expect to see applications move “downstream” from HPC tasks to other enterprise uses. To date, only a few applications have been written specifically for grids, and only a small number of today’s applications that appear suitable have been deployed on grids. Nevertheless, the Grid industry — IT vendors, Grid middleware providers, systems integrators and ISVs — is laying foundations that will see the Grid-enablement of applications accelerate significantly through 2006.

Gt: Is there anything else you’d like to add? Are there any topics not discussed that you would like to cover?

FELLOWS: The 451 Group’s Grid Adoption Research Service (GARS) has focused on understanding the needs of enterprise early adopters of Grid computing. We have interviewed more than 200 such users and have an extensive database tracking adoption among users. From our research, it is clear that improved performance — not just doing things more quickly, but being able to do different things — is the key driver. Close behind is the notion of saving money, although once grids are deployed, the performance aspects tend to be ever more important. It became clear that there are distinct differences among vertical markets in the adoption of grids, and these differences are likely to get ever more pronounced.

In 2006, The 451 Group will revisit the Grid computing opportunity within the financial services market. Also, look for The 451 Group’s report on Grid enablement and another that compares usage/experiences across all vertical segments.
