DATABASE SOFTWARE – A FUTURE COMMODITY?

July 14, 2000

COMMERCIAL NEWS

New York, N.Y. — Tiernan Ray reports that when the great book of technology is closed on the chapter about the World Wide Web, among the blather about “new paradigms” there will be one transformation that is truly profound. The Web is a massive database, of sorts, and it has changed the way we all use information.

The Web is all about making connections, some mundane, others fascinating. If you surfed to this page in your Web browser from the “front” of SmartMoney.com, you followed a connection that’s perhaps no more interesting than flipping the pages of a magazine. Last week we glimpsed a more intriguing set of connections. On the Web site of the Human Genome Project, the not-for-profit, internationally sponsored version of the massive DNA-sequencing effort, you can actually search through a catalog of human genes and see the different patterns in the bits that make up life.

The Web-as-database has been very good for the world of commercial database products. Dataquest estimates that sales of database software jumped 18% last year to $8 billion. The Web has been especially good for Oracle, now the No. 1 vendor of databases for the Internet. Oracle has come to dominate databases that run on large Unix-server computers – just the kind of computers used for running Web sites, in other words. The company owns 30% of the market for what are known as relational databases, having reduced rivals Sybase and Informix to mere bit players in that business.

But I think things will soon change dramatically for the industry in general and for Oracle in particular, if the company isn’t careful. A new crop of databases on the Internet threatens to reduce Oracle’s core database program to commodity status while integrating more sophisticated and valuable data analysis than Oracle or any other pure database program offers right now. I’m talking about a new breed of database products that leverage the connective tissue of the Internet to combine traditional database functionality – the storage and retrieval of discrete types of information like bank account numbers and names – with powerful data-analysis and data-mining techniques.

Data mining and data analysis, pioneered by such companies as Red Brick, Arbor and MicroStrategy, have been around for some time. Essentially, they are a series of computer science techniques that allow you to search for hidden and potentially revealing patterns in data that could become the basis for profitable business decisions. But the Internet puts an interesting spin on data mining. Instead of searching for patterns within a narrow company-specific database, the new generation of data engines is using the Web itself to search for relevant and meaningful connections. (Red Brick is now part of Informix and Arbor is part of Hyperion Solutions.)

My favorite practitioner of the art right now is a privately held search engine company called Google. Google sells its search engine capabilities for a fee to major Web sites, most recently scoring a major coup by replacing rival Inktomi as the search engine powering Yahoo!. Google has come up with very good results on many Web searches that in the past disappointed. The main trick it uses is peer review: if you search on the word “Intel” at Google’s site, you get a list of links that has the home page of Intel at the top – exactly what most sane users of the Web would expect. That’s because Google decides to show you the Web page that most other pages link to for the keyword “intel,” namely http://www.intel.com.
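The “peer review” idea above can be sketched in a few lines: pages that more other pages link to rank higher. This is a deliberately simplified illustration of the principle, not Google’s actual algorithm, and the page names and link data are invented for the example.

```python
# Rank pages by how many other pages link to them ("peer review").
from collections import Counter

def rank_by_inbound_links(links):
    """links: iterable of (from_page, to_page) pairs.
    Returns pages sorted by inbound-link count, highest first."""
    counts = Counter(to_page for _, to_page in links)
    return [page for page, _ in counts.most_common()]

# Hypothetical link graph for the keyword "intel":
links = [
    ("fan-site.example", "www.intel.com"),
    ("news.example", "www.intel.com"),
    ("blog.example", "www.intel.com"),
    ("news.example", "intel-rumors.example"),
]
print(rank_by_inbound_links(links)[0])  # prints "www.intel.com"
```

Google’s real system weighs links recursively rather than merely counting them, but even this crude tally puts the consensus page on top.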

Google performs other kinds of analysis. It collects extensive logs about how users interact with the Google.com site. The company crunches this data to deduce patterns or connections, good and bad, in the result sets in order to perfect the program. For example, if a search for Mickey Mantle turns up only results on eBay for baseball cards, Google will try to figure out why the findings are lopsided. The result is that Google improves over time the kinds of connections it can draw: search on Intel and you will not only get the home page, but also some relevant press clippings, a connection that Google figures out by looking over the text contained in press releases.
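One way to flag a “lopsided” result set of the kind described above is to check whether a single site dominates it. The sketch below is a guess at the flavor of such log analysis; the data, threshold, and logic are hypothetical, since Google’s actual process is not public.

```python
# Flag a result set as lopsided when one host dominates it.
from collections import Counter
from urllib.parse import urlparse

def is_lopsided(result_urls, threshold=0.7):
    """True if a single host accounts for more than `threshold`
    (a fraction) of the results."""
    hosts = Counter(urlparse(url).netloc for url in result_urls)
    top_count = hosts.most_common(1)[0][1]
    return top_count / len(result_urls) > threshold

# Hypothetical Mickey Mantle result set: 4 of 5 hits from one auction site.
results = [
    "http://www.ebay.com/item/1", "http://www.ebay.com/item/2",
    "http://www.ebay.com/item/3", "http://www.ebay.com/item/4",
    "http://baseball-almanac.example/mantle",
]
print(is_lopsided(results))  # prints "True" (0.8 share exceeds 0.7)
```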

Other examples are emerging. Quiq of San Mateo, Calif., is building databases for Web sites as a fee-based service. An early example can be seen at AskJeeves.com, the search engine. If Ask Jeeves can’t find your answer, you can post the question on another page, called Answer Point, and the Quiq technology will slice and dice all questions, as well as answers posted by other surfers, matching up different strains of thought that share keywords, or that have very high page counts and therefore seem popular. Quiq is a bit like Internet newsgroups, where people go for advice, with the difference that, by using a database, connections and patterns can be established between separate conversations whose participants might not even have been aware of one another.
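The keyword-matching idea can be illustrated with a toy sketch: surface archived questions that share enough keywords with a new one. The tokenizer, stop-word list, and threshold here are stand-ins, since Quiq’s actual technology is proprietary.

```python
# Match a new question against an archive by shared keywords.
STOP_WORDS = {"how", "do", "i", "a", "the", "to", "in", "my", "what", "is"}

def keywords(text):
    """Lowercase the text and drop common filler words."""
    return {word for word in text.lower().split() if word not in STOP_WORDS}

def related_questions(question, archive, min_shared=2):
    """Return archived questions sharing at least `min_shared`
    keywords with the new question."""
    kw = keywords(question)
    return [q for q in archive if len(kw & keywords(q)) >= min_shared]

archive = [
    "How do I upgrade laptop memory",
    "What is the best laptop battery",
    "How to plant tomatoes",
]
print(related_questions("Upgrade memory in my laptop", archive))
# prints "['How do I upgrade laptop memory']"
```

Even this naive overlap test connects two separately posted questions about the same topic, which is the kind of cross-conversation link the column describes.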

Quiq and Google have some important things in common. For one, they are developing database technology that goes beyond what Oracle and others sell. Google actually built its own database from scratch, and it is a wholly different type of software, called a “flat file” database, according to Craig Silverstein, Google’s director of technology. Raghu Ramakrishnan, chairman and chief technology officer of Quiq, and a professor of database technology at the University of Wisconsin-Madison, says that while Quiq’s program uses an Oracle database, the company has applied for about four or five different patents on specific technology enhancements it has made.

I think Quiq and Google are both models for the future of the database market. The basic task of capturing and storing discrete pieces of information is a done deal. Oracle has largely won; now it’s time for more sophisticated operations. I expect that the new database business will be built along the lines of a services business, like the kind Quiq and Google are running. Both companies enjoy leverage that would be impossible to achieve in the kinds of data analysis that Red Brick and Arbor advertised. There is leverage across talent and technology. Google’s founders, Larry Page and Sergey Brin, two graduate students in database technology from Stanford University, head a team of human editors who review both search results and the kinds of questions that get answered. Their work can be broadly applied because, unlike the kind of precise analysis that Red Brick and Arbor encouraged, Quiq and Google are looking for fuzzy answers and general improvements in their respective approaches to their clients’ needs. It’s still early for both, however: Quiq has announced only AskJeeves, but expects to make more client announcements soon. Google is selling its search services to only a handful of customers besides Yahoo.

Certainly, traditional relational databases like the kind Oracle sells will continue to be a strong market as long as transaction volume grows on the Web. But as with the corporate market a few years back, Oracle’s Web sales will become a mature market at some point, and I believe growth will then come from the new services like Quiq that are helping to make new connections, rather than simply storing data.

Oracle has always tried to keep ahead of slowing sales by moving into other lines of business. Last year Oracle’s Chief Executive Larry Ellison bought the data mining software of Thinking Machines, a supercomputing pioneer. And Oracle rolled out technology for data analysis. Lately, Oracle has been integrating technology into its database that has nothing to do with traditional relational-database functions, such as the ability to cache, or store a copy of, data for faster access. The just-announced Internet Application Server, or iAS, allows companies to build code to run a Web site and then run that code out of the database. In the orthodox world of transaction-oriented relational databases, that kind of mixing of applications and data has been considered a cardinal sin. Larry is breaking the rules, in other words.

And maybe with good reason. As Internet databases, not corporate databases, become the hot market, more and more of the underlying technology, the database engine, becomes a commodity. When companies like Quiq are building so much new code, it almost doesn’t matter whose database it runs on. It may even be time to take another look at Informix and Sybase. If the brand of software underlying each Web site matters less, there may be a chance for these also-rans to compete in earnest, if they adjust their business models to reflect the commodity aspect of the business. With Informix trading at around $4.50 these days, and Sybase at about $22 a share, the multiples for these stocks are quite attractive: 5.5 and 23, respectively. It’s not often these days you see a stock with any kind of franchise trading at a price-to-earnings growth rate multiple at or below 1. They’re both companies in need of mending, but at least as acquisition candidates, they’re worth a second look.
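The arithmetic behind that closing observation is the PEG (price/earnings-to-growth) ratio: the P/E multiple divided by the expected earnings growth rate in percent. The P/E figures of 5.5 and 23 are from the column; the growth rates below are illustrative assumptions, not the author’s estimates.

```python
# PEG ratio: P/E multiple divided by expected earnings growth (percent).
def peg_ratio(pe_multiple, growth_rate_pct):
    return pe_multiple / growth_rate_pct

# Informix at a P/E of 5.5 with an assumed 20% expected growth:
print(round(peg_ratio(5.5, 20), 2))  # prints "0.28" -- well below 1
# Sybase at a P/E of 23 with an assumed 23% expected growth:
print(round(peg_ratio(23, 23), 2))   # prints "1.0" -- right at the line
```

A PEG at or below 1 means the market is paying no premium over the company’s growth rate, which is the “franchise trading cheap” situation the column points to.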

The views in this article are those of its author and not necessarily those of the publisher or staff of HPCwire.
