Spectra Tape Plays to Cloud, Big Data, HPC Communities

By Tiffany Trader

November 4, 2011

In the past decade, the prevailing wisdom would have you believe that tape storage was a dead or dying breed, soon to be usurped by the sexier, speedier disk. Now that particular hype cycle has run its course and logic and common sense have returned to the storage conversation, prodded no doubt by the latest buzzwords du jour, big data and cloud computing. At any rate, there’s no doubt that tape storage is as relevant as ever, and perhaps more relevant than ever. Indeed, this was the prevailing theme circulated by a group of prominent storage analysts at a recent Spectra Logic event in Boulder, Colo.

In 2009 Spectra celebrated its 30-year anniversary, and like its storage capacity, the company is also expanding. Its fiscal 2011 figures reflected 30 percent year-over-year growth overall, and Spectra’s enterprise tape libraries posted revenue growth of 49 percent in the same period. Growth like that explains why the company recently outgrew its office, lab and manufacturing space for the second time in three years. But it won’t be cramped for long; Spectra just purchased a 55,000-square-foot building in downtown Boulder next door to the 80,712-square-foot building it acquired in 2009.

It’s no surprise that data growth is exploding. In 1999, humankind had amassed 11 exabytes of digital information. By 2010 the figure had jumped to over 1 zettabyte. In light of this exponential growth, it’s imperative that data-intensive markets adopt storage solutions that demonstrate reliability, density, scalability and energy efficiency. They will also need tools to extract value from all this created and stored data.
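
For a sense of scale, here is a quick back-of-the-envelope calculation of the growth rate those two figures imply (a rough sketch only: the 11-exabyte and 1-zettabyte numbers are estimates, and 1 zettabyte is taken as 1,000 exabytes):

```python
# Implied growth rate from ~11 exabytes (1999) to ~1 zettabyte (2010).
# Illustrative arithmetic only; the underlying estimates are rough.

start_eb = 11            # exabytes of digital information in 1999
end_eb = 1_000           # 1 zettabyte, expressed in exabytes, by 2010
years = 2010 - 1999

cagr = (end_eb / start_eb) ** (1 / years) - 1
print(f"Implied compound annual growth rate: {cagr:.0%}")
# -> roughly 50 percent per year, i.e. the world's data roughly
#    doubled about every 20 months over that period.
```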

On the occasion of a big Spectra announcement earlier this week, Addison Snell, CEO of Intersect360 Research, explained that “the ‘Big Data’ trend is driving technology requirements in applications ranging from enterprise datacenters to academic supercomputing labs.” He went on to say that “the next generation of exascale data will require capabilities that not only provide sufficient capacity, but also deliver the speed, reliability, and data integrity features to match the needs of these environments.”

Consider these statistics (source: Making IT Matter – Chalfant/Toigo, 2009):

  • Up to 70 percent of the capacity of every disk drive installed today is misused.
  • 40 percent of data is inert.
  • 15 percent is allocated but unused.
  • 10 percent is orphan data.
  • 5 percent is contraband data.

Currently, disk storage makes up between 33 and 70 cents of every dollar spent on IT hardware, and this trend is accelerating.

Before delving into tape’s role in the cloud space, it’s helpful to first review the major storage categories:

  • Backup: enables restoration of the most recent clean copy of the primary dataset available.
  • Archive: a repository designed to store, retain, and preserve data over time, regardless of its relation to the primary dataset.
  • Active archive: an archive that manages both production and archive data in an online-accessible environment across multiple storage mediums, including disk and tape.

It may seem obvious, but it’s important to point out that the value of data changes over time. With backup data, the information is more valuable today than it will be a month from now. Consider files stored on a computer that you will use to deliver a presentation five minutes from now. What if the data was lost or corrupted? With archive data, on the other hand, the value is measured over time. As our data habit grows, so does the need to prioritize it.

At the Spectra Logic event, which took place at the end of October, Chief Marketing Officer Molly Rector delivered a presentation on the state of the tape market, in which she shared the 2011 INSIC Tape Applications report. It looks at the ways tape is evolving to keep pace with the business realities of the 21st century:

“Tape has been shifting from its historical role of serving as a medium dedicated primarily to short-term backup, to a medium that addresses a much broader set of data storage goals, including:

  • Active archive (the most promising segment of market growth).
  • Regulatory compliance (approximately 20% – 25% of all business data created must be retained to meet compliance requirements for a specified and often lengthy period).
  • Disaster recovery, which continues in its traditional requirements as a significant use of tape.”

Rector explained that for practical reasons, a copy of all or most data stored in the cloud on disk is typically migrated to tape, if only for the extra reliability that tape provides. Tape is also used in cloud storage environments for the cost savings it confers.

Spectra slide - disk versus tape

Many people naturally associate cloud storage with online storage on spinning disks, but in reality most data created in or by the cloud will end up on tape. In a video interview, Mark Peters, senior analyst with Enterprise Strategy Group, examines how tape’s role in the cloud parallels its importance in traditional IT. He details the following benefits:

  • Low CAPEX and OPEX cost per GB.
  • Reliable long-term storage with great media longevity and support for multiple media types.
  • It’s verifiable, searchable and scalable (with an active archive solution).

In Peters’ words, “[tape] remains the ultimate backup, and increasingly archival, device.” He goes on to note that tape is especially relevant for cloud in the areas of tiering and secure multi-tenancy, that is, sharing resources but not data. Keeping data secure is particularly important since privacy is cited as one of the major concerns with cloud computing. Portability is yet another plus for organizations looking to seed an offsite location or execute bulk DR restores, and it even serves as an exit strategy for returning data to a customer.

Active archiving is a major part of Spectra’s strategy to fulfill the requirements of an increasingly cloud-centered computing infrastructure. In an active archive, all production data, no matter how old or infrequently accessed, can still be retrieved online. According to company literature, Spectra’s Active Archive platform, announced in April 2010, offers cost-effective, online, file-based solutions that enable user access to all created data. With it, a petabyte of data can be presented as if it were a C drive. Additionally, management is simplified; business users simply set SLAs with their customers on the time to data, which may be 90-120 seconds coming from tape, or 2-3 seconds with a disk array.
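
Conceptually, an active archive presents one namespace across tiers with very different retrieval times. The sketch below is only a minimal illustration of that idea using the time-to-data figures quoted above; the class, paths and latencies are hypothetical and do not represent Spectra’s BlueScale software or any real interface.

```python
# Minimal conceptual sketch of an active archive: one namespace, multiple
# tiers, and an SLA expressed as "time to data". Names and numbers are
# hypothetical; the latencies mirror the 2-3 s and 90-120 s figures above.

TIME_TO_DATA_SECONDS = {"disk": 2.5, "tape": 105.0}

class ActiveArchive:
    """Presents all files through a single namespace, wherever they live."""

    def __init__(self):
        self._tier = {}                     # file path -> "disk" or "tape"

    def store(self, path, tier="disk"):
        self._tier[path] = tier

    def expected_wait(self, path):
        """Seconds a user should expect before the file is readable."""
        return TIME_TO_DATA_SECONDS[self._tier[path]]

archive = ActiveArchive()
archive.store("/archive/2010/seismic_survey.dat", tier="tape")
archive.store("/archive/2011/latest_report.pdf", tier="disk")

for path in ("/archive/2010/seismic_survey.dat", "/archive/2011/latest_report.pdf"):
    print(f"{path}: ~{archive.expected_wait(path):.0f} s to data")
```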

Spectra slide - reduce overall data volumes
Spectra slide - before archiving
Spectra slide - after archiving

Archiving also reduces overall data volumes. It can turn 2.6 petabytes of data into 760 MB of managed data (as explained further in the accompanying set of slides), and when you look at acquisition and energy costs, tape comes out ahead (as illustrated in the first slide). There are also financial and personnel costs involved with migration. Best practice calls for migrating data to new media in the same library every 7 ½ years with tape, or moving data to a new disk array every 3-4 years. As for portability, the quickest way to transport a petabyte of data is to send it through the air in a jumbo jet.
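
To put those migration cadences side by side, here is a rough worked example over a hypothetical 15-year retention horizon (the horizon is an assumption; the refresh intervals come from the best-practice figures above):

```python
import math

# Migration counts implied by the refresh intervals above, over an
# assumed 15-year retention horizon.

horizon_years = 15
tape_refresh_years = 7.5     # migrate tape media every 7 1/2 years
disk_refresh_years = 3.5     # midpoint of the 3-4 year disk array refresh

tape_migrations = math.floor(horizon_years / tape_refresh_years)
disk_migrations = math.floor(horizon_years / disk_refresh_years)

print(f"Tape: {tape_migrations} media migrations over {horizon_years} years")
print(f"Disk: {disk_migrations} array migrations over {horizon_years} years")
# -> 2 migrations on tape versus 4 on disk, and each migration carries
#    its own hardware, staff-time and energy costs.
```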

It certainly seems that Spectra is positioning itself as a big data storage provider, which it also refers to as exabyte storage. To that end, Spectra just announced a high-capacity T-Finity enterprise tape library capable of storing more than 3.6 exabytes of data. This represents the world’s largest data storage system, providing the highest-capacity single library and the highest-capacity library complex. A single T-Finity library will now expand to 40 frames for a capacity of up to 50,000 tape cartridges, and in a library complex, up to eight libraries can be clustered together for a capacity of up to 400,000 tape cartridges.
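
Working backward from those announced figures gives a quick sanity check on the scale involved (decimal units; the per-cartridge capacity below is implied by the article’s numbers, not a stated Spectra specification):

```python
# Sanity-check arithmetic on the T-Finity announcement figures.
# Per-cartridge capacity is implied, not an official specification.

complex_capacity_eb = 3.6                  # exabytes across a full complex
libraries_per_complex = 8
cartridges_per_library = 50_000
cartridges_per_complex = libraries_per_complex * cartridges_per_library   # 400,000

per_library_pb = complex_capacity_eb * 1_000 / libraries_per_complex
per_cartridge_tb = complex_capacity_eb * 1_000_000 / cartridges_per_complex

print(f"Per library:   ~{per_library_pb:.0f} PB")     # ~450 PB
print(f"Per cartridge: ~{per_cartridge_tb:.0f} TB")   # ~9 TB implied
```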

The jumbo storage solution will run the company’s recently upgraded BlueScale management software, the latest version of which, BlueScale 12, delivers 35 to 60 percent faster robotic performance across Spectra’s enterprise and mid-range tape libraries, plus 15 to 20 times quicker library power-on times, as well as improved barcode scanning times. Another new advance, the Redundant Arrays of Independent Tapes (RAIT) architecture, developed with Spectra partner HPSS, “significantly improves data reliability in high performance, big data environments,” according to Jason Alt, senior software engineer at the National Center for Supercomputing Applications.
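
RAIT applies the familiar RAID idea of striping data across devices with parity, but to tapes. The toy sketch below shows only the textbook single-parity (XOR) concept, assumed here for illustration; it is not the HPSS/Spectra implementation, whose details the article does not cover.

```python
from functools import reduce

# Toy illustration of the general RAIT idea: stripe data across several
# tapes and keep an XOR parity stripe on another, so the contents of any
# single lost tape can be reconstructed. Textbook single parity only;
# not the HPSS/Spectra RAIT implementation.

def make_parity(stripes):
    """XOR equal-length data stripes byte-for-byte into a parity stripe."""
    return bytes(reduce(lambda a, b: a ^ b, column) for column in zip(*stripes))

def rebuild(surviving_stripes, parity):
    """Recover one missing data stripe from the survivors plus parity."""
    return make_parity(surviving_stripes + [parity])

data_stripes = [b"block-on-tape-1", b"block-on-tape-2", b"block-on-tape-3"]
parity = make_parity(data_stripes)

# Simulate losing tape 2 and rebuilding its block:
recovered = rebuild([data_stripes[0], data_stripes[2]], parity)
assert recovered == data_stripes[1]
print("Rebuilt missing stripe:", recovered.decode())
```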

Chairman, CEO and founder Nathan C. Thompson originally designed Spectra’s storage systems for high density rather than speed, but customers were not willing to make that tradeoff, explains Rector. Active archiving strikes a balance between keeping data visible in online disk arrays and moving it to more cost-effective near-line or offline tape.

So who needs all that storage? Spectra sees this solution as a natural fit for a wide range of communities, from medicine and genomics to nuclear physics and petroleum exploration, in addition to supercomputing, media and entertainment, Internet storage, surveillance and, of course, cloud computing. As an interesting side note, when it comes to archiving data, Spectra notes that the general enterprise can archive about 80 percent of its data, but with HPC this figure is closer to 90-95 percent.

Spectra has well over 150 petascale-class customers, including Argonne National Laboratory, NCSA, NASA and Entertainment Tonight, as well as many that do not want to be named. Whether an organization is storing petabytes or exabytes, value and density are paramount, and meeting its specific storage needs means finding the right balance of speed, density and cost. Then there’s the long-term outlook: scalability. Tape libraries can stay active for 10 years, 15 years, or longer, incorporating new technologies as they become available.

Of course, tape is not an all-or-nothing proposition; in most instances, a tiered storage approach provides an optimal cost/benefit profile, a concept summarized by Chris Marsh, Spectra’s IT market and development manager:

Cloud storage providers often implement a tiered storage approach that provides upfront performance to customers via performance disk, while relying on tape storage on the back end. This is an effective way of meeting their customers’ requirements and driving more profitability for their organization. So, regardless of whether it’s a business or technology decision, tape adds value to cloud storage and should be considered when reviewing any specific cloud storage service provider.
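
As a concrete, hypothetical illustration of the kind of tiering policy Marsh describes, a provider might keep recently accessed objects on performance disk and migrate colder ones to the tape back end. The 30-day threshold and object records below are assumptions, not any provider’s actual policy:

```python
from datetime import datetime, timedelta

# Hypothetical age-based tiering policy: recent objects stay on the
# performance disk tier; colder objects are queued for the tape back end.

DISK_TO_TAPE_AGE = timedelta(days=30)      # assumed threshold

objects = [
    {"key": "customer-a/vm-image.qcow2",     "last_access": datetime(2011, 10, 28), "tier": "disk"},
    {"key": "customer-b/backup-2011-06.tar", "last_access": datetime(2011, 6, 30),  "tier": "disk"},
]

def apply_tiering(objects, now):
    for obj in objects:
        if obj["tier"] == "disk" and now - obj["last_access"] > DISK_TO_TAPE_AGE:
            obj["tier"] = "tape"           # queue for migration to tape
    return objects

for obj in apply_tiering(objects, now=datetime(2011, 11, 4)):
    print(f"{obj['key']}: {obj['tier']}")
```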

Marsh wrote an in-depth, three-part feature on the value of tape for cloud storage providers. In it, he lists the key characteristics of a cloud service, including multi-tenancy, security, data integrity verification, retrieval expectations, and exit strategies.

For anyone wondering why this shift is happening now and not sooner, the simple answer is that there wasn’t this much data before. Data is widely believed to be growing faster than storage capacity, so a shift is needed, one toward greater efficiency: giving users access to the data they need, when they need it, in a reliable, cost-effective, and energy-efficient manner.
