CAN “OPEN SOURCE” BRIDGE THE SOFTWARE GAP?

September 1, 2000

FEATURES & COMMENTARY

Washington, D.C. — In a report to President Clinton last year, Steve Lohr reports, a group of leading computer scientists warned that the nation faced a troubling “software gap.”

The group, made up of corporate executives and university researchers, said that programmers simply could not keep pace with exploding demand for high-quality software – the computer code needed for everything from Internet commerce to nuclear weapons design. To bridge the gap, the group said, the nation must not only train more skilled programmers but also explore fresh, even radical, approaches to developing and maintaining software.

In a new report, the group, known as the President’s Information Technology Advisory Committee, will recommend that the federal government back “open source software as an alternate path for software development,” according to a draft copy of the report, which will be sent to the White House and published in a matter of weeks.

“Open source software” seems a radical approach indeed. The term stands for both an iconoclastic philosophy and a software development model: software is distributed free, and its “source code,” or underlying instructions, is published openly so that other programmers can study, share and modify the author’s work.

The open-source model represents a sharp break with the practices of the commercial software business, which considers source code a company’s private property – usually guarded jealously and shared only rarely, under strict licensing terms.

But open source, once viewed as an ideological movement at the fringes of computing, is moving into the mainstream – largely because the spread of the Internet and personal computers makes it easy for programmers to collaborate in far-flung, voluntary teams.

Open-source software has already made real inroads with a pair of showcase success stories: the Linux operating system for computers used alone or in networks, and the Apache software system for so-called server computers used for Web sites.

Linux and Apache have attracted the support of established companies like I.B.M. and Hewlett-Packard, and start-ups like Red Hat and VA Linux. Both open-source programs have done best in the market for Web server software. Apache is used on 63 percent of Web servers, according to Netcraft, a research firm, while Linux is the operating system on 36 percent of Web servers.

The movement’s fans say that open-source development, if widely adopted, has the potential to help fill the software gap by more efficiently delivering high-quality software. And open source, they add, is forcing the software industry to rethink its traditional development practices and its business strategies.

The point, they say, is not to destroy the profit motive that has helped make software a $175 billion-a-year business worldwide. The goal instead is to bring software development into the Internet era by sharing knowledge widely, allowing programmers to build on each other’s work and accelerate the pace of software debugging and improvement.

Companies in the open-source economy would make money mainly by tailoring programs for customers, and with service and support. Software, open-source advocates say, would increasingly become a service business – compared with the traditional model of shipping proprietary code, shrink-wrapped, as if it were a manufactured good.

“I am increasingly coming to the conclusion that the Internet and open-source initiatives are the free marketplace way of dealing with the extremely complex software issues we are facing,” said Irving Wladawsky-Berger, an I.B.M. executive and a member of the presidential advisory committee.

Even Microsoft is taking open source seriously, if warily. “This issue of open source cuts to the core of the software business,” said Jim Gray, a Microsoft researcher and a member of the presidential advisory group. “It is a real challenge, masked by a great deal of hype.”

The new report by the technology advisory panel is another sign that open source is an emerging force. The recommendations focus mainly on developing software in a rarefied niche of computing – the federally financed supercomputing centers devoted to over-the-horizon research in fields like modeling the atmosphere and simulating nuclear explosions.

Yet the report also notes that open-source development could have “a profound economic and social” impact, and it clearly regards the supercomputing centers as technology incubators for the marketplace. (It was a team from the National Center for Supercomputing Applications at the University of Illinois, after all, that helped spawn the Web revolution by creating the Mosaic browser, Netscape’s precursor, in 1993, and that built the server software that became the foundation of the Apache project.)

“High-end computing should be a good test bed if you want to find out if open source can produce high-quality, complex software,” said Larry Smarr, co-chairman of the open-source panel of the presidential advisory committee, who was head of the Illinois supercomputing center in the Mosaic days.

For all the momentum of the open-source model, uncertainties abound. Making the transition from a movement to the mainstream, industry analysts say, will mean overcoming some daunting cultural, business and legal obstacles.

The origins of open-source development go back to the 1970’s and the academic traditions of freely publishing research, subjecting work to peer review and sharing one another’s discoveries. The early idealists, led by Richard M. Stallman, a revered programmer, believed deeply that all software should be free and that commercial software was all but immoral.

But the leading figures in the open-source movement today have a more pragmatic bent. They believe in open-source development mainly, they say, because it produces better software and service – not because it is morally superior to the traditional commercial model. They are all for capitalism, and they welcome investment.

“If open source is to succeed, it has to have a business justification,” said Brian Behlendorf, one of the creators of Apache, and the founder and chief technology officer of Collab.Net, a San Francisco start-up company that helps companies design and run projects using open-source techniques.

“A lot of people are in the open-source community because they think it is the right thing to do,” he added. “But charity only goes so far. You’ve got to make it sustainable.”

Of the estimated five million software programmers worldwide, Mr. Behlendorf figures that fewer than 50,000 participate in open-source projects. “The goal is to bring what works from open source into this other 99 percent of the programming community,” he said.

Collab.Net, founded last year, has about two dozen clients, including Sun Microsystems, Oracle and Hewlett-Packard. Some of its customers are experimenting with distributing source code freely. But others are proceeding more cautiously, perhaps sharing some of their proprietary code with business partners and customers as a way to improve the quality and speed of their software development projects.

“To me,” said Bill Portelli, president of Collab.Net, “this is not about software being free. It’s about the open-source practices and principles – a methodology for fast, effective collaboration.”

Collab.Net charges fees for consulting, education and coordinating software projects. Indeed, the business model for open-source development is based on the premise that software is a service and that the companies seeking to profit from it are service suppliers.

The commercial software business, open-source advocates say, is locked in an industrial-era “factory model” of shipping programs out the door as “finished goods.” Instead, such advocates say, software should be seen as more like a living organism that is best fed and cared for in the open-source environment. Even Microsoft, they note, has said recently that its software effort will evolve into a service business.

So far open-source development has been more of an Internet-age breakthrough in engineering management than a progenitor of new technologies.

The two marketplace triumphs of open source, after all, are derivative rather than truly innovative. Linux is a version of the Unix operating system, and the Apache Web server was derived from software developed at the Illinois supercomputing center.

For this reason, some experts say the promise of open source is being overstated. William N. Joy, a founder and chief scientist of Sun Microsystems, created and distributed an open-source version of Unix two decades ago at the University of California at Berkeley. The Internet, he now says, has changed the context of things, making collaboration easier. But real innovation, he says, remains the work of a few.

“The truth is, great software comes from great programmers, not from a large number of people slaving away,” Mr. Joy said. “Open source can be useful. It can speed things up. But it’s not new, and it’s not holy water.”

Open-source enthusiasts reply that Mr. Joy is correct when it comes to the conceptual insight needed to forge a new direction in software. Truly creative insights, they concede, will remain the bailiwick of the “lone craftsman” programmer. But they hasten to add that breakthrough innovation, while important, is only a small part of the software economy. More than 75 percent of the time and cost of a software project is typically consumed by what the open-source approach is meant to do best: debugging, maintaining and tweaking the code.

Such work may seem mundane, but it represents a huge problem-solving challenge that determines the speed of development as well as the reliability of software.

So when Linus Torvalds released an experimental Unix-like operating system kernel onto the Internet as a student at the University of Helsinki in 1991, it may not have been breakthrough software. As the Web took off, though, he soon found himself the steward of a growing operating system project, dubbed Linux, spurred by the voluntary contributions of programmers around the world. In the book “The Cathedral and the Bazaar,” Eric S. Raymond, an open-source evangelist, observed that Mr. Torvalds was “the first person who learned how to play by the new rules that pervasive Internet access made possible.”

The new rules can be disorienting. Netscape Communications learned some hard lessons after it opened the source code on its Internet browsing software in early 1998. By then, Netscape, following Microsoft’s lead, had been forced to give its browser away. But browser market share was still important to Netscape because its server software sales and service contracts depended on the popularity of its technology.

By making its browser an open-source project, Netscape hoped to enlist widespread support from open-source programmers eager to help the company in its battle against Microsoft. The open-source project was called mozilla.org – a playful combination of Mosaic, the original browser, and Godzilla. But the effort drew only limited support.

Developers typically join open-source projects for the challenge of solving interesting software problems and the peer recognition that follows. That works best when a software program is designed as many pieces, or modules, so “there is a lot of frontier to homestead,” in open-source parlance.
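What a modular boundary looks like in code is simple enough. The sketch below, in C, is hypothetical – it is not drawn from Netscape’s code or any real project – but it shows the pattern open-source projects favor: callers see only an opaque handle and a few functions, so an outside contributor can rewrite a module’s internals without disturbing the rest of the program.

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    /* Callers would normally see only this opaque declaration, in a small
       header file; the struct's layout stays private to the module. */
    typedef struct Renderer Renderer;

    /* The module's internals -- free to change without breaking callers. */
    struct Renderer {
        char title[64];
    };

    Renderer *renderer_create(const char *title) {
        Renderer *r = malloc(sizeof *r);
        if (r != NULL) {
            strncpy(r->title, title, sizeof r->title - 1);
            r->title[sizeof r->title - 1] = '\0';
        }
        return r;
    }

    void renderer_draw(const Renderer *r) {
        printf("rendering: %s\n", r->title);
    }

    void renderer_destroy(Renderer *r) {
        free(r);
    }

    int main(void) {
        Renderer *r = renderer_create("index.html");
        if (r == NULL)
            return 1;
        renderer_draw(r);    /* prints "rendering: index.html" */
        renderer_destroy(r);
        return 0;
    }

A contributor who wants to “homestead” the renderer can replace everything below the typedef without touching any other file – which is exactly the property a monolithic code base like Netscape’s original browser lacked.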

But Netscape’s browser was a big batch of programming code that was not designed in modules. “That made it hard for even strong hackers to come in, stake a claim and do interesting work,” said Brendan Eich, a leading programmer and one of the six founders of mozilla.

Corporate attitudes slowed things down as well. At first, for example, the Netscape programmers were reluctant to post their comments in the online area accessible to outsiders, preferring to post them instead on in-house lists for Netscape engineers. “There was a whole mindset that had to change,” Mr. Eich observed.

The mozilla open-source project was weakened further by staff defections after Netscape agreed to be acquired by America Online in November 1998.

Yet the project was also overhauled in the fall of 1998 in ways that laid the groundwork for a turnaround. The mozilla leaders adopted a modular code base called Gecko – named after a small, fast lizard – for rendering Web pages.

Gradually, more outside developers joined the open-source browser effort. Mr. Eich, the lone founding member of mozilla still with Netscape, rattles off the names of several outside programmers who have made significant contributions. Bugs are fixed quickly. “More eyes looking at the code really helps,” he said.

Earlier this year, America Online released a preview version of its next-generation browser, Netscape 6.0, long delayed but praised in the trade press for its streamlined design and novel features.

By now, Microsoft seems to be entrenched as the leader in browsers for personal computers. Still, the Netscape browser is at least a counterweight to Microsoft on the desktop. Beyond that, Netscape 6.0 is designed for post-PC computing – to run easily on everything from cell phones to hand-held Internet appliances.

“We’ve rounded the corner at mozilla,” Mr. Eich said.

The mozilla effort is also being watched as a model for licensing – a crucial issue for open source. There are many kinds of open-source licenses, and many require contributors who modify an open-source program to make those improvements available to everyone. The most restrictive licenses require any code that touches an open-source program to be made available freely. “To some companies, that is a terrifying thought,” said Lawrence Rosen, executive director of the Open Source Initiative, an education and advocacy group.

The mozilla license is an effort to let open source and proprietary software coexist peacefully. Its license applies to each of the software files – building blocks of code – in the open-source browser. If someone modifies one of those files, the improvement must be given back to the project as open-source code. But if a company mingles some of its own proprietary software files with the browser, those separate files can remain its private property.
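In practice, file-level licensing shows up as a notice at the top of each source file. The two fragments below are illustrative only – the file names are invented and the notice is a loose paraphrase, not the actual text of the Mozilla Public License:

    /* render_page.c -- part of the open-source browser.
     * Paraphrased, MPL-style notice (illustrative, not the real license
     * text): this file is covered by the project's open-source license,
     * and anyone who modifies it must publish the changes under the
     * same terms. */
    void render_page(const char *url);

    /* acme_addon.c -- a hypothetical company's separate file.
     * Because the license applies file by file, this file can ship
     * alongside the browser while remaining the company's private
     * property. */
    void acme_addon_init(void);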

That kind of accommodation, industry analysts say, is a step toward making open source acceptable to corporate America. “Open source can’t go mainstream unless it finds ways to work with companies that have very different business models and believe that intellectual property can be owned,” said James F. Moore, president of Geopartners Research, a consulting firm. “That’s a big challenge.”
