IBM’S DAVE GELARDI TALKS ABOUT THE SIGNIFICANCE OF LINUX

February 2, 2001

by Christopher Rogala, Assistant Editor, HPCwire

As part of HPCwire’s expanded Linux coverage during the LinuxWorld Conference and Expo in New York City, we spoke Wednesday with Dave Gelardi, IBM’s Director of Deep Computing, about the role of Linux in the changing world of high performance computing. The following is a transcription of that interview:

HPCwire: The popularity of Linux systems seems to have taken off relatively suddenly. Why Linux and why now?

GELARDI: I think, you know, in some sense it’s been going on for a while. I think what’s happened is that we’re seeing it sort of move into the mainstream. I do a little test with CIOs in briefings all the time where I sort of ask them, “Do you have any Linux deployments?” and they generally say no. But then their lieutenants in the room typically, you know, sort of kick each other under the table and look at their shoe-tops. And the next thing you find out is that there have been a number of sort of pilot deployments going on in the enterprise. So I think what’s happened is that the experimentation and testing has been going on over the last several years, and now all of a sudden, as we, IBM, and others in the industry are being more public about customers that are actually doing it, they’re saying, you know what, it’s okay to do Linux, it’s a good thing, it’s got, you know, the kind of capability that we need for certain classes of applications. So I think what’s happening is we’re seeing a crossover from this early adopter phase to this early majority phase, and I think it’s fueled, in part, by the promise of a globally available operating system, lots of underlying technologies, software vendors are stepping up, services companies are stepping up — so I think it’s a whole lot of factors that are all coming together at the same time.

HPCwire: To what previous development in computing can you compare the development of Linux, in terms of significance?

GELARDI: I cannot think of a technology that had all the same characteristics, although I would point to the early emergence of UNIX, at least at the operating system level, as being quite similar from the vantage point of the promise of broad applicability and broad availability. Of course, what’s fundamentally different about Linux is that it’s a community-based development activity. And I’ve heard it said a number of different ways, but, you know, the way to think about it is that the best and brightest in the entire industry are working in the open to develop an operating system that’s not owned by any given company, and the term that I’ve heard, which I really like, is that it’s a “brutal meritocracy” in terms of the way in which code is examined.

I think the most analogous computing technology was really the emergence of e-business, insofar as it was a disruptive technology to what customers had previously been doing. You know, you might say that client-server computing was a similar trend, although none of them looks quite the same as what we see with Linux.

HPCwire: What, if anything, makes Linux especially suitable for supercomputing?

GELARDI: Probably the thing that makes it most suitable, and why supercomputing in particular, is that the characteristic of the user in a supercomputing environment is that he has ownership of his own code, and he takes his code and runs it on a variety of different machines. The advantage of Linux is that since Linux is available on Intel, and it will be available when IA-64 delivers, and it’s available on Alpha, and it’s available on RS/6000 or Power, and, you know, on and on, as a researcher I can take my code, optimize it to the Linux operating environment, and all the advantages that come to my application are sort of immediately available to me, and I don’t have to port code from operating system to operating system. So it gives me more flexibility.

You know, if you think about the way researchers run codes, they look for sort of project sponsorship at nationally funded supercomputing centers. And they move their research around to the place that can give them the best scalability or the most data or the largest problem sizes or the fastest turnaround. So that’s why I think you’re seeing Linux sort of take over in the supercomputing market, at least as a new technology going forward.
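To make the portability point concrete, here is a minimal illustrative sketch (an editor’s example, not from the interview): code written against the standard Linux/POSIX interfaces builds unchanged on any of the architectures Gelardi names; only the compiler’s target differs. This small C program uses the POSIX uname() call to report which platform it happens to be running on.

    /* portable.c -- a minimal sketch of the portability described above.
     * The same source compiles unchanged on Linux for x86, Alpha, Power, etc.:
     *     gcc -O2 portable.c -o portable
     * Only the compiler's target architecture changes; the code does not.
     */
    #include <stdio.h>
    #include <sys/utsname.h>

    int main(void) {
        struct utsname u;

        /* uname() fills in the kernel name, release, and hardware type */
        if (uname(&u) != 0) {
            perror("uname");
            return 1;
        }
        /* e.g. "Linux 2.4.1 on i686", "Linux 2.4.1 on alpha", ... */
        printf("%s %s on %s\n", u.sysname, u.release, u.machine);
        return 0;
    }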

HPCwire: Are we witnessing the beginning of the end for traditional operating systems?

GELARDI: No, I think what we’re witnessing is the emergence of a new technology that, at least in the short term, and I’m going to declare in the medium term, and even the long term, is going to be a complementary technology. The best example that I could give is that, while Linux is being deployed rather broadly, I know that it doesn’t have the capability to run a large database server behind a transaction processing system. I’m talking about thousands of requesters, whether they be users or applications, and large twenty-four-way symmetric multiprocessors. You know, so you can sort of think 24×7 reliability and availability of a single engine called the database server. Linux isn’t there yet, and I would argue that that will continue to be the domain of AIX on RS/6000, or z/OS on zSeries or System/390. So I think what you’ll see is a computing hybrid: in a customer deployment, you’re going to see application serving, and web serving, and file and print, and other application environments being married with traditional computing environments, as opposed to completely replacing traditional computing environments.

HPCwire: What other long-term effect do you foresee Linux having on the high performance computing industry?

GELARDI: You know, I think the longer-term effect is that it will, you know, clearly, as I said before, become quite generic, if you will, across environments. It will force many research institutions to re-examine investments in more proprietary technologies, and that’s probably in the longer term. It’ll probably allow the high performance computing industry to take advantage of very high performance systems, because the vendors, like IBM, will deliver Intel-based technologies and Power-based technologies that can be coupled together for supercomputing, but we will not see as many, if you will, customized high performance computing systems as we saw in the past. So the days of Cray, of Thinking Machines, of KSR, and some of these more exotic technologies are really gone. So the vendors who can deliver the best, most robust, highest-performing implementations of Intel and Power and the other surviving microprocessors will be the ones that are dominant in this industry.

Obviously, from our perspective as IBM, where we’re already heavily, you know, represented, let’s say, in the Top 500 supercomputing list, what we expect to see happen over the next several years is that we’ll continue to be dominant, and we’ll see mixtures of traditional RS/6000 SP and clusters of RS/6000 with the emergence of Intel-based Linux supercomputing technologies.

HPCwire: How would you describe IBM’s overall Linux strategy?

GELARDI: I think our overall Linux strategy, if I sort of step above it, is to leverage the momentum of Linux to help us beat very specific competitors in very specific markets and allow us to take advantage of the community of development, which we’re really quite delighted with. And you’re going to see us make two types of investments. We’re going to invest in the enterprise-ization of Linux on the one hand, and on the other hand you’re going to see us deploy Linux across our entire product line — all of our servers, which we have today, and we’ll have all of our middleware available — so we’re going to make a huge play that says, hey customer, if you adopt Linux in your enterprise, we, IBM, are going to be able to give you the full power of what we can deliver to the market: hardware, software, storage, services, applications, relationships, support. So we’re really going to use Linux as the disruptive influence that it already is to further our goals and objectives in the industry at large, while playing within the rules of this new community and being a good partner and a good player in the open source community.

HPCwire: How necessary is it for a software or hardware company to have a Linux strategy right now?

GELARDI: My personal belief is that unless you are a very purpose-built environment... you know, if your company, as a vendor of computing technologies, is in any sense a general-purpose supplier of technology and capability and you don’t have a Linux strategy, you’re going to be in serious trouble in the short term.
