CAN “OPEN SOURCE” BRIDGE THE SOFTWARE GAP?

September 1, 2000

FEATURES & COMMENTARY

Washington, D.C. — Steve Lohr reports: In a report to President Clinton last year, a group of leading computer scientists warned that the nation faced a troubling “software gap.”

The group, made up of corporate executives and university researchers, said that programmers simply could not keep pace with exploding demand for high-quality software – the computer code needed for everything from Internet commerce to nuclear weapons design. To bridge the gap, the group said, the nation must not only train more skilled programmers but also explore fresh, even radical, approaches to developing and maintaining software.

In a new report, the group, known as the President’s Information Technology Advisory Committee, will recommend that the federal government back “open source software as an alternate path for software development,” according to a draft copy of the report, which will be sent to the White House and published in a matter of weeks.

“Open source software” seems a radical approach indeed. The term stands for both an iconoclastic philosophy and a software development model: software is distributed free and its “source code,” or underlying instructions, is published openly so that other programmers can study, share and modify the author’s work.

The open-source model represents a sharp break with the practices of the commercial software business, which considers source code a company’s private property – usually guarded jealously and shared only rarely, under strict licensing terms.

But open source, once viewed as an ideological movement at the fringes of computing, is moving into the mainstream – largely because the spread of the Internet and personal computers makes it easy for programmers to collaborate in far-flung, voluntary teams.

Open-source software has already made real inroads with a pair of showcase success stories: the Linux operating system for computers used alone or in networks, and the Apache software system for so-called server computers used for Web sites.

Linux and Apache have attracted the support of established companies like I.B.M. and Hewlett-Packard, and startups like Red Hat and VA Linux. Both open-source programs have done best in the market for Web server software. Apache is used on 63 percent of Web servers, according to Netcraft, a research firm, while Linux is the operating system on 36 percent of Web servers.

The movement’s fans say that open-source development, if widely adopted, has the potential to help fill the software gap by more efficiently delivering high-quality software. And open source, they add, is forcing the software industry to rethink its traditional development practices and its business strategies.

The point, they say, is not to destroy the profit motive that has helped make software a $175 billion-a-year business worldwide. The goal instead is to bring software development into the Internet era by sharing knowledge widely, allowing programmers to build on each other’s work and accelerate the pace of software debugging and improvement.

Companies in the open-source economy would make money mainly by tailoring programs for customers, and with service and support. Software, open-source advocates say, would increasingly become a service business – compared with the traditional model of shipping proprietary code, shrink-wrapped, as if it were a manufactured good.

“I am increasingly coming to the conclusion that the Internet and open-source initiatives are the free marketplace way of dealing with the extremely complex software issues we are facing,” said Irving Wladawsky-Berger, an I.B.M. executive and a member of the presidential advisory committee.

Even Microsoft is taking open source seriously, if warily. “This issue of open source cuts to the core of the software business,” said Jim Gray, a Microsoft researcher and a member of the presidential advisory group. “It is a real challenge, masked by a great deal of hype.”

The new report by the technology advisory panel is another sign that open source is an emerging force. The recommendations focus mainly on developing software in a rarefied niche of computing – the federally financed supercomputing centers devoted to over-the-horizon research in fields like modeling the atmosphere and simulating nuclear explosions.

Yet the report also notes that open-source development could have “a profound economic and social” impact, and it clearly regards the supercomputing centers as technology incubators for the marketplace. (It was a team from the National Center for Supercomputing Applications at the University of Illinois, after all, that helped spawn the Web revolution by creating the Mosaic browser, Netscape’s precursor, in 1993, and that built the server software that became the foundation of the Apache project.)

“High-end computing should be a good test bed if you want to find out if open source can produce high-quality, complex software,” said Larry Smarr, co-chairman of the open-source panel of the presidential advisory committee, who was head of the Illinois supercomputing center in the Mosaic days.

For all the momentum of the open-source model, uncertainties abound. Making the transition from a movement to the mainstream, industry analysts say, will mean overcoming some daunting cultural, business and legal obstacles.

The origins of open-source development go back to the 1970’s and the academic traditions of freely publishing research, subjecting work to peer review and sharing one another’s discoveries. The early idealists, led by Richard M. Stallman, a revered programmer, believed deeply that all software should be free and that commercial software was all but immoral.

The leading figures in the open-source movement today, however, have a more pragmatic bent. They believe in open-source development mainly, they say, because it produces better software and service – not because it is morally superior to the traditional commercial model. They are all for capitalism, and they welcome investment.

“If open source is to succeed, it has to have a business justification,” said Brian Behlendorf, one of the creators of Apache, and the founder and chief technology officer of Collab.Net, a San Francisco start-up company that helps companies design and run projects using open-source techniques.

“A lot of people are in the open-source community because they think it is the right thing to do,” he added. “But charity only goes so far. You’ve got to make it sustainable.”

Of the estimated five million software programmers worldwide, Mr. Behlendorf figures that fewer than 50,000 participate in open-source projects. “The goal is to bring what works from open source into this other 99 percent of the programming community,” he said.

Collab.Net, founded last year, has about two dozen clients, including Sun Microsystems, Oracle and Hewlett-Packard. Some of its customers are experimenting with distributing source code freely. But others are proceeding more cautiously, perhaps sharing some of their proprietary code with business partners and customers as a way to improve the quality and speed of their software development projects.

“To me,” said Bill Portelli, president of Collab.Net, “this is not about software being free. It’s about the open-source practices and principles – a methodology for fast, effective collaboration.”

Collab.Net charges fees for consulting, education and coordinating software projects. Indeed, the business model for open-source development is based on the premise that software is a service and that the companies seeking to profit from it are service suppliers.

The commercial software business, open-source advocates say, is locked in an industrial-era “factory model” of shipping programs out the door as “finished goods.” Instead, such advocates say, software should be seen as more like a living organism that is best fed and cared for in the open-source environment. Even Microsoft, they note, has said recently that its software effort will evolve into a service business.

So far open-source development has been more of an Internet-age breakthrough in engineering management than a progenitor of new technologies.

The two marketplace triumphs of open source, after all, are derivative rather than truly innovative. Linux is a version of the Unix operating system, and the Apache Web server was derived from software developed at the Illinois supercomputing center.

For this reason, some experts say the promise of open source is being overstated. William N. Joy, a founder and chief scientist of Sun Microsystems, created and distributed an open-source version of Unix two decades ago at the University of California at Berkeley. The Internet, he now says, has changed the context of things, making collaboration easier. But real innovation, he says, remains the work of a few.

“The truth is, great software comes from great programmers, not from a large number of people slaving away,” Mr. Joy said. “Open source can be useful. It can speed things up. But it’s not new, and it’s not holy water.”

Open-source enthusiasts reply that Mr. Joy is correct when it comes to the conceptual insight needed to forge a new direction in software. Truly creative insights, they concede, will remain the bailiwick of the “lone craftsman” programmer. But they hasten to add that breakthrough innovation, while important, is only a small part of the software economy. More than 75 percent of the time and cost of a software project is typically consumed by what the open-source approach is meant to do best: debugging, maintaining and tweaking the code.

Such work may seem mundane, but it represents a huge problem-solving challenge that determines the speed of development as well as the reliability of software.

So when Linus Torvalds, then a student at the University of Helsinki, released an experimental Unix-like operating system onto the Internet in 1991, it may not have been breakthrough software. As the Web took off, though, he soon found himself the steward of a developing operating system project, dubbed Linux, spurred by the voluntary contributions of programmers around the world. In the book “The Cathedral and the Bazaar,” Eric S. Raymond, an open-source evangelist, observed that Mr. Torvalds was “the first person who learned how to play by the new rules that pervasive Internet access made possible.”

The new rules can be disorienting. Netscape Communications learned some hard lessons after it opened the source code on its Internet browsing software in early 1998. By then, Netscape, following Microsoft’s lead, had been forced to give its browser away. But browser market share was still important to Netscape because its server software sales and service contracts depended on the popularity of its technology.

By making its browser an open-source project, Netscape hoped to enlist widespread support from open-source programmers eager to help the company in its battle against Microsoft. The open-source project was called mozilla.org – a playful combination of Mosaic, the original browser, and Godzilla. But the effort drew only limited support.

Developers typically join open-source projects for the challenge of solving interesting software problems and the resulting recognition of their peers. That works best when a software program is designed as many pieces, or modules, so “there is a lot of frontier to homestead,” in open-source parlance.

But Netscape’s browser was a big batch of programming code that was not designed in modules. “That made it hard for even strong hackers to come in, stake a claim and do interesting work,” said Brendan Eich, a leading programmer and one of the six founders of mozilla.

Corporate attitudes slowed things down as well. At first, for example, the Netscape programmers were reluctant to post their comments in the online area accessible to outsiders, preferring to post them instead on in-house lists for Netscape engineers. “There was a whole mindset that had to change,” Mr. Eich observed.

The mozilla open-source project was weakened further by staff defections after Netscape agreed to be acquired by America Online in November 1998.

Yet the project was also overhauled in the fall of 1998 in ways that laid the groundwork for a turnaround. The mozilla leaders adopted a modular code base called Gecko – named after a small, fast lizard – for rendering Web pages.

Gradually, more outside developers joined the open-source browser effort. Mr. Eich, the lone founding member of mozilla still with Netscape, rattles off the names of several outside programmers who have made significant contributions. Bugs are fixed quickly. “More eyes looking at the code really helps,” he said.

Earlier this year, America Online released a preview version of its next-generation browser, Netscape 6.0, long delayed but praised in the trade press for its streamlined design and novel features.

By now, Microsoft seems to be entrenched as the leader in browsers for personal computers. Still, the Netscape browser is at least a counterweight to Microsoft on the desktop. Beyond that, Netscape 6.0 is designed for post-PC computing – to run easily on everything from cell phones to hand-held Internet appliances.

“We’ve rounded the corner at mozilla,” Mr. Eich said.

The mozilla effort is also being watched as a model for licensing – a crucial issue for open source. There are many kinds of open-source licenses, and most require contributors who modify an open-source program to make those improvements available to all members of the project. The most restrictive licenses require any code that touches an open-source program to be made available freely. “To some companies, that is a terrifying thought,” said Lawrence Rosen, executive director of the Open Source Initiative, an education and advocacy group.

The mozilla license is an effort to let open source and proprietary software coexist peacefully. Its license applies to each of the software files – building blocks of code – in the open-source browser. If someone modifies one of those files, the improvement must be given back to the project as open-source code. But if a company mingles some of its own proprietary software files with the browser, those separate files can remain its private property.

That kind of accommodation, industry analysts say, is a step toward making open source acceptable to corporate America. “Open source can’t go mainstream unless it finds ways to work with companies that have very different business models and believe that intellectual property can be owned,” said James F. Moore, president of Geopartners Research, a consulting firm. “That’s a big challenge.”
