December 15, 2006
Hitachi Data Systems Corporation has announced major moves that it claims will enable its rapid expansion into the global high-performance computing storage market. These moves, backed by an investment in BlueArc Corporation, a network storage provider, include a five-year worldwide OEM agreement and the immediate availability of the Hitachi High-performance NAS Platform.
"Today's announcement signifies a strategic expansion of our addressable market, enabling us, together with our channel partners across the globe, to bring powerful, reliable, and proven file-intensive storage technology to customers in the high-performance computing market and beyond," said Dave Roberson, president and CEO, Hitachi Data Systems. "Channel partners can use our new HPC storage offerings as the foundation to which they can add vertical market expertise, to provide custom solutions for their clients. Today's announcement is another impressive result of Hitachi Data Systems' increased investment and focus on emerging markets."
"This is another step in what I refer to as the 'new' Hitachi Data Systems. Less than two years ago, I used to say they were a great SAN storage company," said Tony Asaro, senior analyst, Enterprise Strategy Group. "But now it is clear that they are focused on being a great storage networking company with a more comprehensive portfolio of products that range from SAN, storage management software, digital archiving, VTL and NAS. It is really important for customers to have other leading vendors provide NAS solutions so that they have more options open to them."
"BlueArc's leading file-based virtualization technology is a perfect complement to Hitachi's industry-leading block-based virtualization solutions," said Mike Gustafson, president and CEO, BlueArc. "This partnership will help expand BlueArc's presence on a global basis and will extend Titan's reach beyond the key vertical markets we serve today to major enterprises and horizontal applications across the world, giving customers the better alternative for high-performance network storage they have been seeking."
Many network attached storage solutions today are designed to store and archive data that is rarely or never accessed. For Internet services applications, electronic discovery applications, life sciences, oil and gas exploration, and entertainment applications, conventional NAS storage systems often do not have the performance and scalability needed to manage the intensive file processing requirements that are critical to these vertical industries.
Headquartered in Columbus, Indiana (USA), Cummins Inc. designs, manufactures, distributes and services engines and related technologies, including fuel systems, controls, air handling, filtration, emission solutions and electrical power generation systems.
"As a design and manufacturing company, it is critical to have high-performance solutions that can scale as our business grows," said Curt Brown, Storage Technology Director, Cummins Inc. "We were reaching our limit with our current NAS solution. The Hitachi High-performance NAS Platform provides the necessary scalability and performance we require. The Hitachi platform exceeds our requirements today and offers capacity and connectivity to scale as we grow."
"With this announcement, we are introducing the most advanced file and block virtualization system today," said John Mansfield, vice president, Product Management, Hitachi Data Systems. "There is tremendous opportunity for the Hitachi High-performance NAS Platform with our installed base. Our customers have been anxious for us to do for their file storage systems what we have successfully done for their block storage systems with our Universal Storage Platform. We can now help them consolidate and virtualize a tiered storage block AND file-based environment."
Hitachi claims the new High-performance NAS Platform offers more performance, capacity, file system size, and snapshot replicas than any comparable file-based storage offering. Delivering up to 6 times the real-world performance (600K IOPS), over 4 times the capacity (512TB), 16 times the file system size (256TB), and 4 times more snapshots per file system (1,024), along with features such as data classification, hierarchical storage management, and Transparent Data Migration, the High-performance NAS Platform, Hitachi says, effectively eclipses EMC's Celerra/NSX and NetApp's FAS and V series products.
"There is increasing demand from a growing number of industry-leading, business application users for the delivery of information to decision makers in real or near-real time," said John Webster, Principal IT Advisor of Illuminata, Inc. "Through its partnership with BlueArc, Hitachi is further expanding its leadership in storage virtualization and further establishing a growth position in both traditional and emerging high-performance computing opportunities."
According to Hitachi, its High-performance NAS Platform's capacity allows for fewer nodes and lower maintenance costs. The platform's single logical storage pool of up to 512 terabytes eliminates the need to break up large data sets and its file virtualization capabilities enable automatic growth of file systems. When research projects require collaboration, the platform helps avoid duplicate efforts by promoting information sharing among researchers via fast, secure access to a central pool of files and databases that can scale up to 4 million files per directory. The Hitachi High-performance NAS Platform also delivers a cluster name space, which provides a single unified name space to users for both CIFS and NFS, concurrently, giving administrators a single mount point for users.
"We believe advanced file virtualization capabilities that permit automatic growth of file systems will gain the most traction," states Carl Greiner, senior vice president and analyst, Ovum. "However, the vendor that can deliver a single unified cluster name space for both CIFS and NFS will provide the most robust and functionally rich implementation. Successful IT organisations will begin to put unified storage infrastructure virtualisation on their short list of things required to ensure flexibility and agility to meet today's ever challenging business requirements."
"Certain applications within high-performance computing, entertainment, and life sciences, have unique characteristics requiring uniquely-optimized solutions," said John McArthur, group vice president and general manager, Information Infrastructure and Enabling Technologies at IDC. "BlueArc has been addressing those requirements with the Titan product line. The partnership with BlueArc expands HDS' portfolio of storage solutions and gives the company improved access to these higher-growth market opportunities."