Spectra Tape Plays to Cloud, Big Data, HPC Communities

By Tiffany Trader

November 4, 2011

In the past decade, the prevailing wisdom would have you believe that tape storage was a dead or dying breed, soon to be usurped by the sexier, speedier disk. Now that particular hype cycle has run its course, and logic and common sense have returned to the storage conversation, prodded no doubt by the latest buzzwords du jour: big data and cloud computing. At any rate, there’s no doubt that tape storage is as relevant as ever, and perhaps more relevant than ever. Indeed, this was the prevailing theme circulated by a group of prominent storage analysts at a recent Spectra Logic event in Boulder, Colo.

Spectra’s expanded campus

In 2009, Spectra celebrated its 30th anniversary, and like its storage capacity, the company is still expanding. Its fiscal 2011 figures reflected 30 percent year-over-year growth overall, and its enterprise tape libraries posted revenue growth of 49 percent in the same period. Growth like that explains why the company recently outgrew its office, lab and manufacturing space for the second time in three years. But it won’t be cramped for long; it just purchased a 55,000-square-foot building in downtown Boulder, next door to the 80,712-square-foot building it acquired in 2009.

It’s no surprise that data growth is exploding. In 1999, humankind had amassed 11 exabytes of digital information. By 2010 the figure had jumped to over 1 zettabyte. In light of this exponential growth, it’s imperative that data-intensive markets adopt storage solutions that demonstrate reliability, density, scalability and energy efficiency. They will also need tools to extract value from all this created and stored data.
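
To put that growth in perspective, here is a quick back-of-the-envelope sketch of the compound annual growth rate implied by those two figures; the only assumption beyond the article’s numbers is the decimal convention of 1,000 exabytes per zettabyte.

    # Back-of-the-envelope growth rate implied by the figures above:
    # 11 exabytes in 1999 growing to roughly 1 zettabyte (1,000 EB, decimal) by 2010.
    start_eb, end_eb = 11, 1_000
    years = 2010 - 1999

    cagr = (end_eb / start_eb) ** (1 / years) - 1
    print(f"Implied compound annual growth rate: {cagr:.1%}")  # roughly 50% per year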

Spectra Logic T-Finity

On the occasion of a big Spectra announcement earlier this week, Addison Snell, CEO of Intersect360 Research, explained that “the ‘Big Data’ trend is driving technology requirements in applications ranging from enterprise datacenters to academic supercomputing labs.” He went on to say that “the next generation of exascale data will require capabilities that not only provide sufficient capacity, but also deliver the speed, reliability, and data integrity features to match the needs of these environments.”

Consider these statistics (source: Making IT Matter – Chalfant/Toigo, 2009):

  • Up to 70 percent of the capacity of every disk drive installed today is misused.
  • 40 percent of data is inert.
  • 15 percent is allocated but unused.
  • 10 percent is orphan data.
  • 5 percent is contraband data.

Currently, disk storage makes up between 33 and 70 cents of every dollar spent on IT hardware, and this trend is accelerating.
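
The breakdown above accounts for the headline 70 percent figure; the small sketch below tallies the categories and, purely as illustrative arithmetic, applies the cited disk-spend range to show how much hardware spend could be tied up in misused disk.

    # The breakdown cited above: inert, allocated-but-unused, orphan and contraband data.
    misuse = {"inert": 0.40, "allocated_unused": 0.15, "orphan": 0.10, "contraband": 0.05}
    total_misused = sum(misuse.values())
    print(f"Misused capacity: {total_misused:.0%}")  # 70%, matching the 'up to 70 percent' figure

    # Illustrative only: if disk absorbs 33 to 70 cents of each IT hardware dollar,
    # the share tied up in misused capacity works out to roughly:
    for disk_share in (0.33, 0.70):
        print(f"{disk_share:.0%} disk spend -> ~{disk_share * total_misused:.0%} of hardware dollars on misused disk")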

Before delving into tape’s role in the cloud space, it’s helpful to first review the major storage categories:

  • Backup: enables restoration of the most recent clean copy of the primary dataset available.

  • Archive: a repository designed to store, retain, and preserve data over time, regardless of its relation to the primary dataset.

  • Active Archive: an archive that manages both production and archive data in an online-accessible environment across multiple storage media, including disk and tape.

It may seem obvious, but it’s important to point out that the value of data changes over time. With backup data, the information is more valuable today than it will be a month from now. Consider files stored on a computer that you will use to deliver a presentation five minutes from now. What if that data were lost or corrupted? With archive data, on the other hand, the value is measured over time. As our data habit grows, so does the need to prioritize it.

At the Spectra Logic event, which took place at the end of October, Chief Marketing Officer Molly Rector delivered a presentation on the state of the tape market, in which she shared the 2011 INSIC Tape Applications report. It looks at the ways tape is evolving to keep pace with the business realities of the 21st century:

“Tape has been shifting from its historical role of serving as a medium dedicated primarily to short-term backup, to a medium that addresses a much broader set of data storage goals, including:

  • Active archive (the most promising segment of market growth).

  • Regulatory compliance (approximately 20% – 25% of all business data created must be retained to meet compliance requirements for a specified and often lengthy period).

  • Disaster recovery, which continues in its traditional requirements as a significant use of tape.”

Rector explained that, for practical reasons, a copy of all or most data stored on disk in the cloud is typically migrated to tape, if only for the extra reliability that tape provides. Tape is also used in cloud storage environments for the cost savings it confers.

Spectra slide - disk versus tape

Many people naturally associate cloud storage with online storage on spinning disk, but in reality most of the data created in or by the cloud will end up on tape. In this video, Mark Peters, Senior Analyst for Enterprise Strategy Group, examines how tape’s role in the cloud parallels its importance in traditional IT. He details the following benefits:

  • Low CAPEX and OPEX cost per GB.
  • Reliable long-term storage with great media longevity and support for multiple media types.
  • Verifiable, searchable and scalable storage (with an active archive solution).

In Peters’ words, “[tape] remains the ultimate backup, and increasingly archival, device.” He goes on to note that tape is especially relevant for cloud in the areas of tiering and secure multi-tenancy, that is, the sharing of resources but not data. Keeping data secure is particularly important since privacy is cited as one of the major concerns with cloud computing. Portability is yet another plus for organizations looking to seed an offsite location or execute bulk DR restores, and it even serves as an exit strategy for returning data to a customer.
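
As a rough illustration of what secure multi-tenancy means in practice, here is a minimal, hypothetical sketch (not Spectra’s or any vendor’s implementation): tenants share one physical archive pool but can only address their own namespace with their own credential; a production system would layer per-tenant encryption and media segregation on top.

    import secrets

    # Hypothetical sketch of secure multi-tenancy (not Spectra's implementation):
    # tenants share one physical pool of archive objects, but each tenant can only
    # address its own namespace and must present its own credential.
    class SharedArchivePool:
        def __init__(self):
            self._objects = {}   # (tenant_id, object_name) -> bytes, the shared pool
            self._tokens = {}    # tenant_id -> access token

        def register_tenant(self, tenant_id):
            self._tokens[tenant_id] = secrets.token_hex(16)
            return self._tokens[tenant_id]

        def put(self, tenant_id, token, name, data):
            self._authorize(tenant_id, token)
            self._objects[(tenant_id, name)] = data

        def get(self, tenant_id, token, name):
            self._authorize(tenant_id, token)
            return self._objects[(tenant_id, name)]  # other tenants' keys are unreachable

        def _authorize(self, tenant_id, token):
            if self._tokens.get(tenant_id) != token:
                raise PermissionError("invalid tenant credential")

    pool = SharedArchivePool()
    token = pool.register_tenant("tenant-a")
    pool.put("tenant-a", token, "backup-2011-11.tar", b"...")
    print(pool.get("tenant-a", token, "backup-2011-11.tar"))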

Active archiving is a major part of Spectra’s strategy to fulfill the requirements of an increasingly cloud-centered computing infrastructure. In an active archive, all production data, no matter how old or infrequently accessed, can still be retrieved online. According to company literature, Spectra’s Active Archive platform, announced in April 2010, offers cost-effective, online, file-based solutions that enable user access to all created data. With it, a petabyte of data can be presented as if it were a C drive. Additionally, management is simplified; business users simply set SLAs with their customers on the time to data, which may be 90-120 seconds coming from tape, or 2-3 seconds with a disk array.
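
Below is a minimal sketch of the kind of placement policy an active archive implies, using the time-to-data figures quoted above; the 90-day inactivity threshold is an assumption for illustration, and this is not BlueScale’s actual logic.

    import time

    # Minimal sketch of an active-archive placement policy (hypothetical; not
    # BlueScale's actual logic). Everything stays visible in a single namespace,
    # but objects untouched for a while live on tape with a longer time-to-data SLA.
    DISK_SLA_SECONDS = 3                    # ~2-3 seconds from a disk array (per the article)
    TAPE_SLA_SECONDS = 120                  # ~90-120 seconds from tape (per the article)
    INACTIVITY_THRESHOLD = 90 * 24 * 3600   # assumed: demote after 90 days without access

    def choose_tier(last_access_ts, now=None):
        """Return the tier an object should live on and its time-to-data SLA."""
        now = now if now is not None else time.time()
        if now - last_access_ts > INACTIVITY_THRESHOLD:
            return "tape", TAPE_SLA_SECONDS
        return "disk", DISK_SLA_SECONDS

    tier, sla = choose_tier(last_access_ts=time.time() - 200 * 24 * 3600)
    print(f"Place on {tier}; expected time to data <= {sla} seconds")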

Spectra slide - reduce overall data volumes
Spectra slide - before archiving
Spectra slide - after archiving

Archiving also reduces overall data volumes. It can turn 2.6 petabytes of data into 760 MB of managed data (as explained further in the accompanying set of slides), and when you look at acquisition and energy costs, tape comes out ahead (as illustrated in the first slide). There are also financial and personnel costs involved with migration. Following best practices means migrating data to new media in the same library every 7.5 years with tape, versus moving data to a new disk array every 3-4 years. As for portability, the quickest way to transport a petabyte of data is to send it through the air in a jumbo jet.
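
The jumbo-jet quip holds up to simple arithmetic; here is a quick sketch comparing network transfer times for a petabyte (the link speeds are assumptions chosen for illustration).

    # Checking the jumbo-jet remark with simple arithmetic: moving one petabyte
    # over a network link (link speeds below are assumptions for illustration).
    PETABYTE_BITS = 1e15 * 8

    for name, bits_per_second in [("1 Gbps", 1e9), ("10 Gbps", 1e10)]:
        days = PETABYTE_BITS / bits_per_second / 86_400
        print(f"{name}: roughly {days:.0f} days to move 1 PB")  # ~93 and ~9 days
    # A pallet of tape cartridges on an airplane crosses a continent in hours,
    # which is why bulk seeding and DR restores often ship media instead.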

It certainly seems that Spectra is positioning itself as a big data storage provider, which it also refers to as exabyte storage. To that end, Spectra just announced a high-capacity T-Finity enterprise tape library capable of storing more than 3.6 exabytes of data. This represents the world’s largest data storage system, providing both the highest-capacity single library and the highest-capacity library complex. A single T-Finity library will now expand to 40 frames for a capacity of up to 50,000 tape cartridges, and in a library complex, up to eight libraries can be clustered together for a capacity of up to 400,000 tape cartridges.
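
Working backward from the announced figures gives a feel for the per-cartridge capacity being assumed; the short sketch below does that arithmetic (the roughly 9 TB result would correspond to a compressed-capacity rating for enterprise media of the era, which the announcement does not spell out here).

    # Working backward from the announced figures: 3.6+ exabytes across a complex
    # of up to 400,000 cartridges implies roughly 9 TB per cartridge (a compressed
    # figure for enterprise media of the era; the media generation is an assumption).
    complex_capacity_eb = 3.6
    complex_cartridges = 400_000
    library_cartridges = 50_000

    tb_per_cartridge = complex_capacity_eb * 1_000_000 / complex_cartridges  # 1 EB = 1,000,000 TB (decimal)
    single_library_eb = library_cartridges * tb_per_cartridge / 1_000_000
    print(f"~{tb_per_cartridge:.0f} TB per cartridge")              # ~9 TB
    print(f"Single 40-frame library: ~{single_library_eb:.2f} EB")  # ~0.45 EB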

The jumbo storage solution will run the company’s recently upgraded BlueScale management software. The latest version, BlueScale 12, delivers 35 to 60 percent faster robotic performance across Spectra’s enterprise and mid-range tape libraries, 15 to 20 times quicker library ‘power on’ times, and improved barcode scanning times. Another new advance, the Redundant Arrays of Independent Tapes (RAIT) architecture, developed with Spectra partner HPSS, “significantly improves data reliability in high performance, big data environments,” according to Jason Alt, senior software engineer at the National Center for Supercomputing Applications.
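
RAIT is broadly analogous to RAID applied to tape: data is striped across several cartridges with parity written alongside, so losing a single cartridge does not lose the data. The following is an illustrative single-parity sketch only, not the HPSS/Spectra implementation.

    from functools import reduce

    # Illustrative single-parity striping in the spirit of RAIT (RAID-style parity
    # across tapes); a sketch only, not the HPSS/Spectra implementation.
    def stripe_with_parity(block: bytes, n_data_tapes: int):
        """Split a block across n data tapes and compute one XOR parity stripe."""
        chunk = -(-len(block) // n_data_tapes)  # ceiling division
        stripes = [block[i * chunk:(i + 1) * chunk].ljust(chunk, b"\0")
                   for i in range(n_data_tapes)]
        parity = reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), stripes)
        return stripes, parity

    def rebuild(stripes, parity, lost_index):
        """Recover a lost stripe by XOR-ing the parity with the surviving stripes."""
        survivors = [s for i, s in enumerate(stripes) if i != lost_index] + [parity]
        return reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), survivors)

    stripes, parity = stripe_with_parity(b"exabyte-scale archive data", n_data_tapes=4)
    assert rebuild(stripes, parity, lost_index=2) == stripes[2]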

Chairman, CEO and founder Nathan C. Thompson originally designed Spectra’s storage systems for high density rather than speed, but customers were not willing to make that tradeoff, Rector explained. Active archiving strikes a balance between keeping data visible in online disk arrays and moving it to more cost-effective near-line or offline tape.

So who needs all that storage? Spectra sees this solution as a natural fit for a wide range of communities, from medicine and genomics to nuclear physics and petroleum exploration, in addition to supercomputing, media and entertainment, Internet storage, surveillance and, of course, cloud computing. As an interesting side note, when it comes to archiving data, Spectra notes that the general enterprise can archive about 80 percent of its data, but with HPC this figure is closer to 90-95 percent.

Spectra has well over 150 petascale-class customers, including Argonne National Laboratory, NCSA, NASA and Entertainment Tonight, as well as many that do not want to be named. Whether storing petabytes or exabytes, value and density are paramount. Organizations looking to meet their specific storage needs ought to be concerned with finding the right balance of speed, density and cost. Then there’s the long-term outlook: scalability. Tape libraries can stay active for 10 years, 15 years, or longer, incorporating new technologies as they become available.

Of course, tape is not an all-or-nothing proposition; in most instances, a tiered storage approach provides an optimal cost/benefit profile, a concept summarized by Chris Marsh, Spectra’s IT market and development manager:

Cloud storage providers often implement a tiered storage approach that provides upfront performance to customers via performance disk, while relying on tape storage on the back end. This is an effective way of meeting their customers’ requirements and driving more profitability for their organization. So, regardless of whether it’s a business or technology decision, tape adds value to cloud storage and should be considered when reviewing any specific cloud storage service provider.

Marsh wrote an in-depth 3-part feature on the value of tape for cloud storage providers. In it, he lists the key characteristics for a cloud service, including multi-tenancy, security, data integrity verification, retrieval expectations, and exit strategies.
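
Data integrity verification, one of the characteristics Marsh lists, typically boils down to fingerprinting objects on ingest and re-checking them on recall or audit; here is a minimal, vendor-neutral sketch of that pattern.

    import hashlib

    # Vendor-neutral sketch of data integrity verification: fingerprint each object
    # when it enters the archive, keep the digest in a catalog, and re-verify the
    # payload whenever it is recalled or audited.
    def ingest(catalog: dict, name: str, payload: bytes) -> None:
        catalog[name] = hashlib.sha256(payload).hexdigest()
        # ... write payload to the archive tier here ...

    def verify_on_recall(catalog: dict, name: str, payload: bytes) -> bool:
        return hashlib.sha256(payload).hexdigest() == catalog[name]

    catalog = {}
    ingest(catalog, "project-alpha/results.tar", b"raw instrument data")
    assert verify_on_recall(catalog, "project-alpha/results.tar", b"raw instrument data")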

For anyone wondering why this shift is happening now and not sooner, the simple answer is that there wasn’t this much data before. With data widely believed to be growing faster than storage capacity, a shift is needed: one toward greater efficiency that gives users access to the data they need, when they need it, in a reliable, cost-effective, and energy-efficient manner.
