CAN “OPEN SOURCE” BRIDGE THE SOFTWARE GAP?

September 1, 2000

FEATURES & COMMENTARY

Washington, D.C. — Steve Lohr reports: In a report to President Clinton last year, a group of leading computer scientists warned that the nation faced a troubling “software gap.”

The group, made up of corporate executives and university researchers, said that programmers simply could not keep pace with exploding demand for high-quality software – the computer code needed for everything from Internet commerce to nuclear weapons design. To bridge the gap, the group said, the nation must not only train more skilled programmers but also explore fresh, even radical, approaches to developing and maintaining software.

In a new report, the group, known as the President’s Information Technology Advisory Committee, will recommend that the federal government back “open source software as an alternate path for software development,” according to a draft copy of the report, which will be sent to the White House and published in a matter of weeks.

“Open source software” seems a radical approach indeed. The term stands for both an iconoclastic philosophy and a software development model: software is distributed free and its “source code,” or underlying instructions, is published openly so that other programmers can study, share and modify the author’s work.
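For readers who have never seen it, source code is simply human-readable text. A minimal, hypothetical fragment (written here in the C language that programs like Linux and Apache are built in, and invented for illustration rather than taken from any real project) looks like this:

    /* greeting.c -- a made-up example of "source code": the human-readable
     * instructions a compiler turns into a running program. Publishing text
     * like this openly is what lets other programmers study, share and
     * modify an author's work. */
    #include <stdio.h>

    int main(void)
    {
        printf("Hello from a hypothetical open-source program.\n");
        return 0;
    }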

The open-source model represents a sharp break with the practices of the commercial software business, which considers source code a company’s private property – usually guarded jealously and shared only rarely, under strict licensing terms.

But open source, once viewed as an ideological movement at the fringes of computing, is moving into the mainstream – largely because the spread of the Internet and personal computers makes it easy for programmers to collaborate in far-flung, voluntary teams.

Open-source software has already made real inroads with a pair of showcase success stories: the Linux operating system for computers used alone or in networks, and the Apache software system for so-called server computers used for Web sites.

Linux and Apache have attracted the support of established companies like I.B.M. and Hewlett-Packard, and startups like Red Hat and VA Linux. Both open-source programs have done best in the market for Web server software. Apache is used on 63 percent of Web servers, according to Netcraft, a research firm, while Linux is the operating system on 36 percent of Web servers.

The movement’s fans say that open-source development, if widely adopted, has the potential to help fill the software gap by more efficiently delivering high-quality software. And open source, they add, is forcing the software industry to rethink its traditional development practices and its business strategies.

The point, they say, is not to destroy the profit motive that has helped make software a $175 billion-a-year business worldwide. The goal instead is to bring software development into the Internet era by sharing knowledge widely, allowing programmers to build on each other’s work and accelerate the pace of software debugging and improvement.

Companies in the open-source economy would make money mainly by tailoring programs for customers, and with service and support. Software, open-source advocates say, would increasingly become a service business – compared with the traditional model of shipping proprietary code, shrink-wrapped, as if it were a manufactured good.

“I am increasingly coming to the conclusion that the Internet and open-source initiatives are the free marketplace way of dealing with the extremely complex software issues we are facing,” said Irving Wladawsky-Berger, an I.B.M. executive and a member of the presidential advisory committee.

Even Microsoft is taking open source seriously, if warily. “This issue of open source cuts to the core of the software business,” said Jim Gray, a Microsoft researcher and a member of the presidential advisory group. “It is a real challenge, masked by a great deal of hype.”

The new report by the technology advisory panel is another sign that open source is an emerging force. The recommendations focus mainly on developing software in a rarefied niche of computing – the federally financed supercomputing centers devoted to over-the-horizon research in fields like modeling the atmosphere and simulating nuclear explosions.

Yet the report also notes that open-source development could have “a profound economic and social” impact, and it clearly regards the supercomputing centers as technology incubators for the marketplace. (It was a team from the National Center for Supercomputing Applications at the University of Illinois, after all, that helped spawn the Web revolution by creating the Mosaic browser, Netscape’s precursor, in 1993, and that built the server software that became the foundation of the Apache project.)

“High-end computing should be a good test bed if you want to find out if open source can produce high-quality, complex software,” said Larry Smarr, co-chairman of the open-source panel of the presidential advisory committee, who was head of the Illinois supercomputing center in the Mosaic days.

For all the momentum of the open-source model, uncertainties abound. Making the transition from a movement to the mainstream, industry analysts say, will mean overcoming some daunting cultural, business and legal obstacles.

The origins of open-source development go back to the 1970’s and the academic traditions of freely publishing research, subjecting work to peer review and sharing one another’s discoveries. The early idealists, led by Richard M. Stallman, a revered programmer, believed deeply that all software should be free and that commercial software was all but immoral.

The leading figures in the open-source movement today, however, have a more pragmatic bent. They believe in open-source development mainly, they say, because it produces better software and service – not because it is morally superior to the traditional commercial model. They are all for capitalism, and they welcome investment.

“If open source is to succeed, it has to have a business justification,” said Brian Behlendorf, one of the creators of Apache, and the founder and chief technology officer of Collab.Net, a San Francisco start-up company that helps companies design and run projects using open-source techniques.

“A lot of people are in the open-source community because they think it is the right thing to do,” he added. “But charity only goes so far. You’ve got to make it sustainable.”

Of the estimated five million software programmers worldwide, Mr. Behlendorf figures that fewer than 50,000 participate in open-source projects. “The goal is to bring what works from open source into this other 99 percent of the programming community,” he said.

Collab.Net, founded last year, has about two dozen clients, including Sun Microsystems, Oracle and Hewlett-Packard. Some of its customers are experimenting with distributing source code freely. But others are proceeding more cautiously, perhaps sharing some of their proprietary code with business partners and customers as a way to improve the quality and speed of their software development projects.

“To me,” said Bill Portelli, president of Collab.Net, “this is not about software being free. It’s about the open-source practices and principles – a methodology for fast, effective collaboration.”

Collab.Net charges fees for consulting, education and coordinating software projects. Indeed, the business model for open-source development is based on the premise that software is a service and that the companies seeking to profit from it are service suppliers.

The commercial software business, open-source advocates say, is locked in an industrial-era “factory model” of shipping programs out the door as “finished goods.” Instead, such advocates say, software should be seen as more like a living organism that is best fed and cared for in the open-source environment. Even Microsoft, they note, has said recently that its software effort will evolve into a service business.

So far open-source development has been more of an Internet-age breakthrough in engineering management than a progenitor of new technologies.

The two marketplace triumphs of open source, after all, are derivative rather than truly innovative. Linux is a version of the Unix operating system, and the Apache Web server was derived from software developed at the Illinois supercomputing center.

For this reason, some experts say the promise of open source is being overstated. William N. Joy, a founder and chief scientist of Sun Microsystems, created and distributed an open-source version of Unix two decades ago at the University of California at Berkeley. The Internet, he now says, has changed the context of things, making collaboration easier. But real innovation, he says, remains the work of a few.

“The truth is, great software comes from great programmers, not from a large number of people slaving away,” Mr. Joy said. “Open source can be useful. It can speed things up. But it’s not new, and it’s not holy water.”

Open-source enthusiasts reply that Mr. Joy is correct when it comes to the conceptual insight needed to forge a new direction in software. Truly creative insights, they concede, will remain the bailiwick of the “lone craftsman” programmer. But they hasten to add that breakthrough innovation, while important, is only a small part of the software economy. More than 75 percent of the time and cost of a software project is typically consumed by what the open-source approach is meant to do best: debugging, maintaining and tweaking the code.

Such work may seem mundane, but it represents a huge problem-solving challenge that determines the speed of development as well as the reliability of software.

So when Linus Torvalds, then a student at the University of Helsinki, released an experimental Unix-like operating system onto the Internet in 1991, it may not have been breakthrough software. As the Web took off, though, he soon found himself the steward of a growing operating system project, dubbed Linux, spurred by the voluntary contributions of programmers around the world. In the book “The Cathedral and the Bazaar,” Eric S. Raymond, an open-source evangelist, observed that Mr. Torvalds was “the first person who learned how to play by the new rules that pervasive Internet access made possible.”

The new rules can be disorienting. Netscape Communications learned some hard lessons after it opened the source code on its Internet browsing software in early 1998. By then, Netscape, following Microsoft’s lead, had been forced to give its browser away. But browser market share was still important to Netscape because its software server sales and service contracts depended on the popularity of its technology.

By making its browser an open-source project, Netscape hoped to enlist widespread support from open-source programmers eager to help the company in its battle against Microsoft. The open-source project was called mozilla.org – a playful combination of Mosaic, the original browser, and Godzilla. But the effort drew only limited support.

Developers typically join open-source projects for the challenge of solving interesting software problems and the resulting recognition of their peers. That works best when a software program is designed as many pieces, or modules, so “there is a lot of frontier to homestead,” in open-source parlance.

But Netscape’s browser was a big batch of programming code that was not designed in modules. “That made it hard for even strong hackers to come in, stake a claim and do interesting work,” said Brendan Eich, a leading programmer and one of the six founders of mozilla.
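The difference modular design makes can be sketched in a few lines. If a browser exposed one small, self-contained interface for a single job (the names below are hypothetical, not Netscape’s actual code), an outside programmer could claim that piece and rewrite everything behind it without touching the rest:

    /* render.h -- hypothetical module boundary. The rest of the program
     * depends only on these declarations, so a newcomer can "stake a claim"
     * by reimplementing just the code behind them. */
    #ifndef RENDER_H
    #define RENDER_H

    struct page;                                /* opaque parsed-page handle */

    struct page *page_parse(const char *html);  /* build a page from markup  */
    void page_draw(const struct page *p);       /* paint the page on screen  */
    void page_free(struct page *p);             /* release the page's memory */

    #endif /* RENDER_H */

A monolithic code base offers no such seams, which is why, in homesteading terms, Netscape’s original browser had so little open frontier.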

Corporate attitudes slowed things down as well. At first, for example, the Netscape programmers were reluctant to post their comments in the online area accessible to outsiders, preferring to post them instead on in-house lists for Netscape engineers. “There was a whole mindset that had to change,” Mr. Eich observed.

The mozilla open-source project was weakened further by staff defections after Netscape agreed to be acquired by America Online in November 1998.

Yet the project was also overhauled in the fall of 1998 in ways that laid the groundwork for a turnaround. The mozilla leaders adopted a modular code base called Gecko – named after a small, fast lizard – for rendering Web pages.

Gradually, more outside developers joined the open-source browser effort. Mr. Eich, the lone founding member of mozilla still with Netscape, rattles off the names of several outside programmers who have made significant contributions. Bugs are fixed quickly. “More eyes looking at the code really helps,” he said.

Earlier this year, America Online released a preview version of its next-generation browser, Netscape 6.0, long delayed but praised in the trade press for its streamlined design and novel features.

By now, Microsoft seems to be entrenched as the leader in browsers for personal computers. Still, the Netscape browser is at least a counterweight to Microsoft on the desktop. Beyond that, Netscape 6.0 is designed for post-PC computing – to run easily on everything from cell phones to hand-held Internet appliances.

“We’ve rounded the corner at mozilla,” Mr. Eich said.

The mozilla effort is also being watched as a model for licensing – a crucial issue for open source. There are many kinds of open-source licenses; most require contributors who modify an open-source program to make those improvements available to all members of the project. The most restrictive licenses require any code that touches an open-source program to be made available freely. “To some companies, that is a terrifying thought,” said Lawrence Rosen, executive director of the Open Source Initiative, an education and advocacy group.

The mozilla license is an effort to let open source and proprietary software coexist peacefully. Its license applies to each of the software files – building blocks of code – in the open-source browser. If someone modifies one of those files, the improvement must be given back to the project as open-source code. But if a company mingles some of its own proprietary software files with the browser, those separate files can remain its private property.
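In practice, that file-by-file boundary is visible in the code itself: each covered file carries a notice. The comment below is a paraphrased, hypothetical sketch of such a notice, not the verbatim text of the mozilla license:

    /* layout.c -- part of a hypothetical open-source browser.
     *
     * Covered-file notice (paraphrase for illustration): anyone who modifies
     * THIS FILE must publish the modified source back to the project. A
     * company's separate, proprietary files that are merely combined with
     * the browser may remain its private property. */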

That kind of accommodation, industry analysts say, is a step toward making open source acceptable to corporate America. “Open source can’t go mainstream unless it finds ways to work with companies that have very different business models and believe that intellectual property can be owned,” said James F. Moore, president of Geopartners Research, a consulting firm. “That’s a big challenge.”
