W3C’s WORLDWIDE POWER

September 17, 1999

FEATURES & COMMENTARY

San Diego, CA — As Gary H. Anthes reported for IDG, Internet pioneer Vinton Cerf and his wife, while dining on roast veal one evening, were discussing cattle drives of the mid-19th century. They wondered what had happened to legendary routes like the Chisholm Trail.

Cerf’s house is equipped with a radio LAN, so he was able to connect to the Internet via a wireless laptop computer at the dinner table. “Within a few queries, we got a wonderful dissertation with pictures and maps and histories of the trails,” he said. “I thought, ‘My God, the world’s knowledge is just sitting there.’ What’s our culture going to be like when you can find out literally anything in a few seconds, when the brilliance of every human being is suddenly available to you?”

Cerf’s vision isn’t so far-fetched. And if that vision is to be realized, there’s little doubt that the World Wide Web Consortium (W3C) will be a major driving force. Technical specifications developed by the W3C, most notably Extensible Markup Language (XML), are morphing the Web into a second-generation architecture, one likely to eclipse even the phenomenal success of the Web in the 1990s.

But despite nearly universal praise for its work, the W3C draws some criticism for its methods and concentration of power at the top, namely Tim Berners-Lee, creator of the Web and founder and director of the W3C.

Nevertheless, even those who don’t fancy the W3C’s operating philosophy acknowledge that its agility is rare in a standards group and that its specifications bear a “moral majesty.”

Semantic Web

The W3C is mapping out technology to support a “semantic Web,” in which all the world’s knowledge becomes computer-accessible. “Querying a database is not exciting,” Berners-Lee says. “But querying a database that gets linked so as to query the whole planet is very exciting.”

The W3C, based at MIT and research centers in France and Japan, last year took a giant step toward that goal by publishing XML. It can describe Web pages with far more power than Hypertext Markup Language (HTML), the markup language Berners-Lee developed in 1990 at CERN, the European Laboratory for Particle Physics in Switzerland. Unlike HTML, whose fixed tags describe the structure of a page, XML lets developers make up their own tags, or metadata, to describe the information content on the page.
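The contrast is easier to see in markup itself. Below is a minimal sketch, with hypothetical tags and data inspired by Cerf’s cattle-trail query; neither the element names nor the Python code come from any W3C specification.

```python
# A minimal sketch of self-describing XML: the tags name the data itself,
# so software can pull out fields by meaning instead of scraping presentation
# markup. Tags and values here are hypothetical.
import xml.etree.ElementTree as ET

record = """
<trail>
  <name>Chisholm Trail</name>
  <origin>Texas</origin>
  <terminus>Abilene, Kansas</terminus>
  <activeYears>1867-1884</activeYears>
</trail>
"""

trail = ET.fromstring(record)
print(trail.find("name").text)      # Chisholm Trail
print(trail.find("terminus").text)  # Abilene, Kansas
```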

The W3C is also working on the Resource Description Framework (RDF), a language built on XML that lets application and content developers in different domains share metadata vocabularies, allowing them to link diverse databases.

XML and RDF promise to make the Web much more powerful by enabling search engines to “understand” the meaning of information. No more getting a million hits on topics you don’t care about, while missing information you want because it doesn’t happen to include the keywords you specified.
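One way to picture the combination is as a pool of machine-readable facts. The sketch below is a hypothetical illustration in Python, not actual RDF syntax: two sources that share a vocabulary can be merged and queried by meaning, so a search succeeds even when the keyword never appears in the documents themselves.

```python
# Hypothetical illustration of the RDF idea: facts as (subject, property, value)
# triples drawn from a shared vocabulary, so independent sources can be merged
# and queried by meaning rather than by keyword matching.

history_site = [
    ("chisholm_trail", "type", "cattle_trail"),
    ("chisholm_trail", "terminus", "Abilene, Kansas"),
]

map_archive = [
    ("chisholm_trail", "hasMap", "https://example.org/maps/chisholm.png"),
    ("goodnight_loving_trail", "type", "cattle_trail"),
]

# Because both sources use the same property names, merging them is trivial.
graph = history_site + map_archive

def query(graph, prop, value):
    """Return every subject carrying the given property/value pair."""
    return [s for (s, p, v) in graph if p == prop and v == value]

# Ask for every cattle trail -- the word "cattle" lives in the metadata,
# not necessarily in the page text a keyword search engine would scan.
print(query(graph, "type", "cattle_trail"))
# ['chisholm_trail', 'goodnight_loving_trail']
```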

Closing the Gap

The Web has propelled the Internet to 60 million nodes in the past five years, according to Scott Bradner, a senior technical consultant at Harvard University and an area director at the Internet Engineering Task Force (IETF). “The Web filled a hole we didn’t know we had; we geeks were doing just fine,” he said. “The W3C deserves an awful lot of the credit for that.”

The W3C also fills a hole left by the older IETF standards body. Bradner says the groups jointly decided several years ago that the IETF would handle low-level topics such as the Web protocol HTTP, while the W3C takes on issues closer to the application. “They are upper-middleware, and we are lower-middleware and below, or, as someone put it, underware,” Bradner said.

But the groups differ in other ways as well. In fact, the W3C doesn’t consider itself a standards body at all, preferring to think of itself as a research and development organization, a kind of techno-think tank. It develops open-source software for demonstration purposes or when it feels the marketplace isn’t meeting a critical need.

For example, the W3C developed Amaya, a combination Web browser and editor, and Jigsaw, a flexible, extensible Java-based Web server. Anyone, even nonmembers, wishing to use, improve or build a product around Amaya or Jigsaw can download the source code at the group’s Web site.

Out of the Loop

The IETF, which doesn’t develop software, is a loosely structured, grassroots-like group from which standards bubble up after being shaped and critiqued by anyone who cares to participate. The W3C is a more structured, less open coalition of 339 software vendors, large user companies and others who pay $5,000 or $50,000 (depending on size) in annual dues and sign an agreement vesting final decision-making authority in Berners-Lee.

But some people aren’t entirely happy with that arrangement. MCI WorldCom Inc. withdrew from the W3C after two years when it concluded that membership wasn’t worth $50,000 per year. “The structure of the W3C didn’t lend itself to quite the degree of freedom to contribute that the IETF does,” says Cerf, MCI’s senior vice president for Internet architecture. “We found it difficult to get points across and to influence what was happening.”

“We were never completely comfortable with W3C acting as a standards body, with its decision model based ultimately on the personal preferences of the director,” says John C. Klensin, distinguished engineering fellow at MCI. “We’ve tended to prefer Internet standards work to be done in bodies that more clearly use an open consensus process rather than in limited-membership consortia of any sort, including W3C.”

But Klensin praises the W3C’s ability to move quickly. “The W3C approach is probably optimal for the design and development of sample advanced technology, especially when it addresses problems two or three years ahead of current products, while the IETF approach is far better for actual standardization,” he says.

Power of Standards

Indeed, Berners-Lee says one of his goals in setting up the W3C in 1994 was to make it more nimble than the IETF. “Always, standards processes have been too slow,” he said. “The IETF has a particular set of processes, and in some circumstances they work very well and under other circumstances they don’t.”

As for his power to affect millions of users, he’s unapologetic. “Members give a mandate to the consortium to do things, and the director has an executive responsibility to get them done,” he said. “But there are a whole set of checks and balances, and there is even a process to review the process.”

The Boeing Co., a W3C member, has an enormously complex and geographically dispersed computing environment, but it’s unified by an intranet with 175,000 users, an extranet with 26,000 users and more than 2,000 Web servers. “The Web is of immense importance to us,” said Ann Bassetti, Web products manager at Boeing.

Boeing joined the W3C to get early information on Web developments and to influence them as a user, Bassetti said. “For us, interoperability is crucial. Without the W3C, we would not have standardization of protocols, and individual vendors would dominate with their proprietary formats.”

Whether one calls them standards or, as the W3C prefers, recommendations, HTML, XML and other technical specifications from the consortium have virtually the force of law. “They have a moral majesty behind them, a moral hegemony,” said Carl Cargill, director of standards at Sun Microsystems Inc. and a member of the W3C Advisory Board. “And they have a great deal of acceptance among users.”

Cargill said the W3C’s members, including Sun, are, in effect, a giant “advisory committee” to Berners-Lee, whom he calls an “honest and reasonably open individual” and an “impartial adjudicator.”

But Cargill acknowledges that the director has tremendous power in an arena increasingly vital to the IT community. “Tim’s in a position where, if he shakes his head yes, somebody can lose $1 billion, but if he shakes his head no, somebody else loses $1 billion,” Cargill says.

“If you have a strong opinion on X, and Tim doesn’t share it, well, it may be a little tricky to get a standard out,” Bradner said. “(But) Tim is the key to ensuring a consistent architecture, to keeping things from fragmenting.”

Asked if anything worries him about the future of the Web, Berners-Lee said, “Fragmentation. If TVs end up with one version of HTML, and normal Web browsers end up with a different version, that would be a mess. That could come about commercially if a monopoly player decides it’s going to try to tweak the standards so everyone has to follow a little behind. That’s a constant threat, because there’s a huge commercial incentive to try to carve out a piece.”

The answer, he said, involves the W3C doing its job and buyers doing theirs, informed by the press.

World Wide Works-in-Progress

The W3C has 57 full-time employees and more than 600 people from member organizations around the world assigned to 50 groups working in four domains: architecture; user interface; technology and society; and the Web Accessibility Initiative. Projects the organization is working on include the following:

— HTTP-Next Generation, to offer greater flexibility and performance for distributed applications.

— Ubiquitous, device-independent access to the Web via television, mobile phones, pagers and the like.

— Metadata and a semantic Web, using better ways to describe and catalog information to enable smarter search and retrieval than is currently available.

— Platform for Privacy Preferences Project (P3P), which helps Web users learn about and influence the privacy policies of Web sites.

— Web Content Accessibility Guidelines, to help Web developers make content accessible to people with disabilities.

— XML Signature (a joint project with the IETF), a way to cryptographically sign XML documents.
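For illustration only, the sketch below shows the general idea behind signing a document so tampering can be detected. It uses an ordinary keyed hash in Python and hypothetical data; the actual XML Signature work defines its own canonicalization rules, signature elements and key handling.

```python
# Conceptual sketch only -- not the XML Signature format. The idea: compute a
# digest over the document bytes and sign it, so any later change is detectable.
import hashlib
import hmac

document = b"<memo><to>Web team</to><body>Ship the draft spec.</body></memo>"
secret_key = b"shared-secret"  # placeholder; real signatures typically use public-key crypto

signature = hmac.new(secret_key, document, hashlib.sha256).hexdigest()

def verify(doc: bytes, sig: str) -> bool:
    """Recompute the digest over the received bytes and compare signatures."""
    expected = hmac.new(secret_key, doc, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, sig)

print(verify(document, signature))                              # True: untouched
print(verify(document.replace(b"draft", b"final"), signature))  # False: tampered
```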
