DATABASE SOFTWARE – A FUTURE COMMODITY?

July 14, 2000

COMMERCIAL NEWS

New York, N.Y. — Tiernan Ray reports that when the great book of technology is closed on the chapter about the World Wide Web, among the blather about “new paradigms” there will be one transformation that is truly profound. The Web is a massive database, of sorts, and it has changed the way we all use information.

The Web is all about making connections, some mundane, others fascinating. If you surfed to this page in your Web browser from the “front” of SmartMoney.com, you followed a connection that’s perhaps no more interesting than flipping the pages of a magazine. Last week we glimpsed a more intriguing set of connections. On the Web site of the Human Genome Project, the not-for-profit, internationally sponsored version of the massive DNA-sequencing effort, you can actually search through a catalog of human genes and see the different patterns in the bits that make up life.

The Web-as-database has been very good for the world of commercial database products. Dataquest estimates that sales of database software jumped 18% last year to $8 billion. The Web has been especially good for Oracle, now the No. 1 vendor of databases for the Internet. Oracle has come to dominate databases that run on large Unix-server computers – just the kind of computers used for running Web sites, in other words. The company owns 30% of the market for what are known as relational databases, having reduced rivals Sybase and Informix to mere bit players in that business.

But I think things will soon change dramatically for the industry in general and for Oracle in particular, if the company isn’t careful. A new crop of databases on the Internet threatens to reduce Oracle’s core database program to commodity status while integrating more sophisticated and valuable data analysis than Oracle or any other pure database program offers right now. I’m talking about a new breed of database products that leverage the connective tissue of the Internet to combine traditional database functionality – the storage and retrieval of discrete types of information like bank account numbers and names – with powerful data-analysis and data-mining techniques.

Data mining and data analysis, pioneered by such companies as Red Brick, Arbor and MicroStrategy, have been around for some time. Essentially, they are computer-science techniques that allow you to search for hidden and potentially revealing patterns in data – patterns that could become the basis for profitable business decisions. But the Internet puts an interesting spin on data mining. Instead of searching for patterns within a narrow company-specific database, the new generation of data engines is using the Web itself to search for relevant and meaningful connections. (Red Brick is now part of Informix and Arbor is part of Hyperion Solutions.)

My favorite practitioner of the art right now is a privately held search engine company called Google. Google sells its search engine capabilities for a fee to major Web sites, most recently scoring a major coup by replacing rival Inktomi as the search engine powering Yahoo!. Google has come up with very good results on many Web searches that in the past disappointed. The main trick it uses is peer review: if you search on the word “Intel” at Google’s site, you get a list of links with the home page of Intel at the top – exactly what most sane users of the Web would expect. That’s because Google decides to show you the Web page that most other pages link to for the keyword “intel,” namely http://www.intel.com.
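
The mechanics are easier to grasp in miniature. Here is a minimal sketch, in Python, of link-popularity ranking – an illustration of the general idea only, not Google’s actual algorithm, and the pages and URLs below are invented for the example:

    from collections import Counter

    # Invented toy corpus: each source page lists the URLs it points to.
    # A real search engine would crawl millions of pages to build this map.
    links = {
        "pageA.example.com": ["http://www.intel.com", "http://chips.example.com"],
        "pageB.example.com": ["http://www.intel.com"],
        "pageC.example.com": ["http://www.intel.com"],
    }

    def rank_by_inbound_links(link_map):
        """Score each target URL by how many distinct pages link to it."""
        counts = Counter()
        for targets in link_map.values():
            for target in set(targets):  # count each linking page once
                counts[target] += 1
        return counts.most_common()

    for url, score in rank_by_inbound_links(links):
        print(score, url)
    # http://www.intel.com lands on top: it is the page most others point to.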

Google performs other kinds of analysis as well. It collects extensive logs about how users interact with the Google.com site, and the company crunches this data to deduce patterns or connections, good and bad, in the result sets in order to perfect the program. For example, if a search for Mickey Mantle turns up only eBay listings for baseball cards, Google will try to figure out why the findings are so lopsided. The result is that, over time, Google improves the kinds of connections it can draw: search on Intel and you will get not only the home page but also some relevant press clippings, a connection that Google figures out by looking over the text contained in press releases.
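
A crude version of that log analysis can be written in a few lines. The sketch below is an illustration under simple assumptions, not Google’s code: it flags queries whose result sets are dominated by a single site – the lopsided case described above – and the log entries are invented:

    from collections import Counter
    from urllib.parse import urlparse

    # Invented query log: (query, list of result URLs shown to the user).
    # Real logs are far richer: clicks, dwell times, query reformulations.
    log = [
        ("mickey mantle", [
            "http://cards.example.com/card1",
            "http://cards.example.com/card2",
            "http://cards.example.com/card3",
            "http://bio.example.org/mantle",
        ]),
    ]

    def lopsided_queries(entries, threshold=0.7):
        """Flag queries where one host supplies most of the results."""
        flagged = []
        for query, urls in entries:
            hosts = Counter(urlparse(u).netloc for u in urls)
            host, count = hosts.most_common(1)[0]
            if count / len(urls) >= threshold:
                flagged.append((query, host))
        return flagged

    print(lopsided_queries(log))  # [('mickey mantle', 'cards.example.com')]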

Other examples are emerging. Quiq of San Mateo, Calif., is building databases for Web sites as a fee-based service. An early example can be seen at Ask Jeeves, the search engine. If Ask Jeeves can’t find your answer, you can post the question on another page, called Answer Point, and the Quiq technology will slice and dice all the questions, as well as the answers posted by other surfers, matching up different strands of thought that share keywords, or that have very high page counts and therefore seem popular. Quiq is a bit like Internet newsgroups, where people go for advice, with the difference that a database can establish connections and patterns between separate conversations that the participants might not even have been aware of.
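
To make the idea concrete, here is a minimal sketch of matching discussion threads by shared keywords. It is an illustration under simple assumptions – Quiq’s actual, patent-pending technology is surely more sophisticated – and the sample questions and stopword list are invented:

    import re

    # Tiny invented stopword list; a production system would use a real one.
    STOPWORDS = {"the", "a", "an", "is", "how", "do", "i", "to", "my", "what", "for"}

    def keywords(text):
        """Lowercase the text and keep the non-stopword terms."""
        return set(re.findall(r"[a-z]+", text.lower())) - STOPWORDS

    def related(question, others, min_overlap=2):
        """Return posted questions sharing at least min_overlap keywords."""
        base = keywords(question)
        return [q for q in others if len(base & keywords(q)) >= min_overlap]

    posted = [
        "How do I tune an Oracle database for web traffic?",
        "What is the best web server for high traffic?",
        "How do I bake sourdough bread?",
    ]
    print(related("Tune database performance for high web traffic", posted))
    # Matches the first two questions; the sourdough thread shares nothing.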

Quiq and Google have some important things in common. For one, they are developing database technology that goes beyond what Oracle and others sell. Google actually built its own database from scratch, and it is a wholly different type of software, called a “flat file” database, according to Craig Silverstein, Google’s director of technology. Raghu Ramakrishnan, chairman and chief technology officer of Quiq, and a professor of database technology at the University of Wisconsin-Madison, says that while Quiq’s program uses an Oracle database, the company has applied for about four or five different patents on specific technology enhancements it has made.

I think Quiq and Google are both models for the future of the database market. The basic task of capturing and storing discrete pieces of information is a done deal. Oracle has largely won; now it’s time for more sophisticated operations. I expect that the new database business will be built along the lines of a services business, like the kind Quiq and Google are running. Both companies enjoy leverage – across talent and technology – that would be impossible to achieve in the kinds of data analysis that Red Brick and Arbor advertised. Google’s founders, Larry Page and Sergey Brin, two graduate students in database technology from Stanford University, head a team of human editors who review both search results and the kinds of questions that get answered. Their work can be broadly applied because, unlike the kind of precise analysis that Red Brick and Arbor encouraged, Quiq and Google are looking for fuzzy answers and general improvements in their respective approaches to their clients’ needs. It’s still early for both, however: Quiq has announced only Ask Jeeves, though it expects to make more client announcements soon, and Google is selling its search services to only a handful of customers besides Yahoo.

Certainly, traditional relational databases like the kind Oracle sells will remain a strong market as long as transaction volume grows on the Web. But as with the corporate market a few years back, Oracle’s Web sales will mature at some point, and I believe growth will then come from new services like Quiq’s that help make new connections, rather than simply storing data.

Oracle has always tried to keep ahead of slowing sales by moving into other lines of business. Last year Oracle’s Chief Executive Larry Ellison bought the data mining software of Thinking Machines, a supercomputing pioneer, and Oracle rolled out technology for data analysis. Lately, you’re seeing Oracle integrate technology into its database that has nothing to do with traditional relational-database functions – features like the ability to cache, or store a copy of, data for faster access. The just-announced Internet Application Server, or iAS, allows companies to build code to run a Web site and then run that code out of the database. In the orthodox world of transaction-oriented relational databases, that kind of mixing of applications and data has been considered a cardinal sin. Larry is breaking the rules, in other words.

And maybe with good reason. As Internet databases, not corporate databases, become the hot market, more and more of the underlying technology – the database engine – becomes a commodity. When companies like Quiq are building so much new code, it almost doesn’t matter whose database it runs on. It may even be time to take another look at Informix and Sybase. If the brand of software underlying each Web site matters less, there may be a chance for these also-rans to compete in earnest, if they adjust their business models to reflect the commodity aspect of the business. With Informix trading at around $4.50 these days, and Sybase at about $22 a share, the multiples for these stocks are quite attractive: 5.5 and 23, respectively. It’s not often these days you see a stock with any kind of franchise trading at a price/earnings-to-growth (PEG) multiple at or below 1. They’re both companies in need of mending, but at least as acquisition candidates, they’re worth a second look.
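
For readers unfamiliar with the yardstick: the PEG multiple divides a stock’s price-to-earnings ratio by its expected annual earnings-growth rate, so a PEG at or below 1 means the market is paying no premium for growth. A quick worked example, taking the multiples quoted above as price-to-earnings ratios and using hypothetical growth rates chosen purely for illustration:

    def peg(pe_multiple, growth_pct):
        """Price/earnings-to-growth: P/E divided by expected growth (in %)."""
        return pe_multiple / growth_pct

    # The P/E figures come from the column; the growth rates below are
    # hypothetical placeholders that would put each stock's PEG exactly at 1.
    print(peg(5.5, 5.5))    # Informix: PEG 1.0
    print(peg(23.0, 23.0))  # Sybase:   PEG 1.0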

The views in this article are those of its author and not necessarily those of the publisher or staff of HPCwire.
