Back to the Future: SGI Returns to Visualization

By John E. West

April 11, 2008

In the 1980s and early 1990s, if you were doing anything serious in computer graphics you were doing it with SGI gear. Then a series of strategic missteps and the emergence of incredibly powerful, cheap graphics cards for PCs made the company’s graphics lines irrelevant, and for nearly a decade SGI has survived as a server company. The SGI Virtu line announced this week steers SGI back into graphics for what it says is the long haul. But is there a market left to capture?

I spoke to Tom Reed, SGI’s Director of Visualization, about the new offering. Tom’s take on the Virtu announcement is that the HPC market is finally ready to once again make serious investments in visualization. “For a long time there has been a lot of focus [in HPC] on getting clusters right. Now we’ve reached a spot where we’re on good footing with clusters and we can start addressing other issues in the high performance ecosystem.” He added, “Visualization is the long pole in the performance computing tent now.”

SGI has wobbled in and out of the graphics market since the end of its market dominance around the start of this decade, but those efforts have never borne fruit. Soon after Bo Ewald returned to the CEO post at SGI he started making public comments about SGI’s return to high end graphics. On September 27 last year he remarked to a packed room at IDC’s HPC User Forum, “We will be back in the visual supercomputing business,” and then he added this quotable quote, “It was really stupid for the company to stop doing visualization types of things.”

This week the company confirmed that Bo wasn’t just talking off script at public events. Although the company’s press release on Tuesday focused on the Virtu VN200 graphics server, the offering also includes a graphics workstation — the Virtu VS series — and the Wide Area Visual Environment, or WAVE, to support the remote visualization needs of many customer sites.

First, the VN200. Built around two quad-core Xeons, the VN200 node is the server side of the SGI visual supercomputing equation. These nodes can run SLES, Red Hat, or Windows (as can the VS), and up to five VN200 nodes can be integrated in a single 4U enclosure. Multiple enclosures can be racked together, and the whole thing can be clustered right into your Altix big iron. Graphics are provided by NVIDIA Quadro FX graphics cards, one to a node.

This integration of compute and visualization gear is a key driver in SGI’s Virtu strategy. Again according to Reed, “We are ready to re-attach visualization back to computing… to bring visualization back as an integral part of the computing experience.” Despite the goal, this is still version 1.0 of the offering. The Virtu nodes don’t have NUMAlink capability, so they don’t share memory with each other or with the processors in the Altix side of the system. The VN200 is really a clustered graphics solution, all the way down to the node, and data for any cooperative rendering that’s done beyond the cores available in a node has to be managed explicitly using distributed memory semantics. Reed did indicate that they are looking to extend the shared memory model to Virtu nodes in the future.
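To make the distributed-memory point concrete, here is a minimal sketch (in C with MPI, and not SGI’s actual software stack) of what “managing data explicitly” looks like when several nodes cooperate on a single frame: each rank renders its own strip of the image in node-local memory, and the strips then have to be moved over the interconnect and assembled, rather than simply read out of a shared address space. The render_strip() routine is a hypothetical placeholder for whatever local rendering each node performs.

/* Minimal sketch, assuming a sort-first decomposition of one frame across
 * nodes that do NOT share memory.  render_strip() is a hypothetical stand-in
 * for the local rendering work done on each VN-style node. */
#include <mpi.h>
#include <stdlib.h>

#define WIDTH  1024
#define HEIGHT 1024

/* Hypothetical local renderer: fills this rank's strip of RGB pixels. */
static void render_strip(unsigned char *strip, int rows, int first_row)
{
    for (int i = 0; i < rows * WIDTH * 3; i++)
        strip[i] = (unsigned char)((first_row + i) % 256);   /* dummy pixels */
}

int main(int argc, char **argv)
{
    int rank, size;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    int rows = HEIGHT / size;                 /* assume HEIGHT divisible by size */
    unsigned char *strip = malloc((size_t)rows * WIDTH * 3);
    unsigned char *image = NULL;
    if (rank == 0)
        image = malloc((size_t)HEIGHT * WIDTH * 3);

    render_strip(strip, rows, rank * rows);   /* rendered into node-private memory */

    /* Without shared memory between nodes, the partial images must be moved
     * explicitly over the interconnect before the frame can be assembled. */
    MPI_Gather(strip, rows * WIDTH * 3, MPI_UNSIGNED_CHAR,
               image, rows * WIDTH * 3, MPI_UNSIGNED_CHAR,
               0, MPI_COMM_WORLD);

    free(strip);
    free(image);
    MPI_Finalize();
    return 0;
}

On a NUMAlink-connected system that final assembly step could be little more than a memory copy out of a shared address space, which is the convenience Reed says SGI hopes to extend to Virtu nodes in a future release.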

This will be an important differentiator in a space where SGI is competing for graphics cluster business with companies like GraphStream, Verari, HP, and others, all of whom are essentially constructing solutions out of the same parts. Right now SGI says that a big part of their value add in the VN is that they have created a fully integrated solution, where drivers for the graphics cards and the IB ports don’t interfere with each other and everything “just works,” and VN200 nodes can be integrated with the HPC nodes generating the data.

While SGI hopes its customers start buying VN nodes with all of their Altix gear, the VN customer doesn’t have to integrate it with an Altix setup, or even have an Altix at all. VN nodes work just fine as the standalone centerpiece of an enterprise visualization solution.

Another area where improvements will be critical for product differentiation is in the rendering pipeline itself. According to Reed, SGI is going to add in some of VizServer’s collaboration technologies, strengthening the remote and collaborative aspects of the offering.

Though not mentioned in the information released for the launch, if you look at the Virtu offering you’ll notice that SGI has added a visual workstation to its lineup, the Virtu VS line of machines. Reed describes the VS line as important only for a few niche customers in some fairly specialized situations, and not a product they expect to focus on for most customers. According to him, “SGI is really not back in the [general purpose] workstation business. These systems are used almost exclusively as platforms for building visualization solutions for customers who need four or more graphics pipes in a single system (driving 4K projection systems, collaborative team rooms, etc).”

I was interested to know whether SGI is actually making this new Virtu gear. According to Reed, both the VS and the VN are manufactured outside of SGI, though he declined to disclose who those partners were. A source close to the HPC industry in Austin, Texas, identified one of the VS manufacturers as BOXX Technologies, an Austin-based company that specializes in making, according to the company’s web site, “high-performance computing platforms for Visual Effects (VFX) professionals.”

According to the press release, VN nodes start at $10,575; pricing for the VS units wasn’t available for this story. SGI doesn’t yet have customers ready to talk with the press, but Reed reports that at least three VN200 systems are being beta tested by customers.

So, nearly a decade after SGI lost or sold much of its critical IP (google for the NVIDIA patent bargain and the Microsoft IP sale), the company is trying to get back into what it calls the visual supercomputing business. Right now the offering appears to be incremental — commodity-based graphics capabilities in clusters, and commodity-based graphics in workstations, with some software to glue it all together. Because of who they are, they are likely to attract some customer interest on the basis of their heritage. In fact, the press release mentions SGI’s graphics legacy numerous times.

If they want to recapture the visual supercomputing business this is a reasonable first step. But it’s not enough to do more than get people’s attention.

The hard problem is that visual supercomputing really doesn’t exist as a distinct market anymore, thanks largely to the success of the commodity video card business. In order to precipitate this market back out of the commodity graphics solution it has dissolved into, the company needs to focus hard on leveraging the features that make its computational gear unique while layering on strong, value added visualization-specific features for petascale datasets. They need to create a graphics offering that is unique, and gives users something they cannot get by simply stitching together free software and gear they can buy anywhere.

SGI is still positioned to do this. There is a lot of valuable IP that has remained with the company, in products like VizServer and others, that they can bring to the challenges we are facing today in the analysis of large data. Tom Reed is passionate when he talks about SGI’s commitment to long-term leadership in visualization, with this first step creating a platform for SGI to innovate on with new advances in data management and fundamental approaches to data visualization.

This is SGI, and this is an industry that they once dominated. I believe that if anyone can do it, they can.
