December 11, 2008
SDSC's Berman: Will your data be there when you need it?
Dec. 10 -- The world has gone digital in just about everything we do. Almost every iota of information we access these days is stored in some kind of digital form and accessed electronically -- text, charts, images, video, music, you name it. The key questions are: Will your data be there when you need it? And who's going to preserve it?
In the December 2008 edition of Communications of the ACM, the monthly magazine of the Association for Computing Machinery, Dr. Fran Berman, director of the San Diego Supercomputer Center (SDSC) at the University of California, San Diego, provides a guide for surviving what has become known as the "data deluge."
Managing this deluge and preserving what's important is what Berman refers to as one of the "grand challenges" of the Information Age. The amount of digital data is immense: A 2008 report by the International Data Corporation (IDC), a global provider of information technology intelligence based in Framingham, Mass., predicts that by 2011, our "digital universe" will be 10 times the size it was in 2006 -- and almost half of this universe will not have a permanent home as the amount of digital information outstrips storage space.
"As a society, we have only begun to address this challenge at a scale concomitant with the deluge of data available to us and its importance in the modern world," writes Berman, a longtime pioneer in cyberinfrastructure -- an open but organized aggregate of information technologies including computers, data archives, networks, software, digital instruments, and other scientific endeavors that support 21st century life and work.
Berman is a strong advocate of cyberinfrastructure that supports the management and preservation of digital data in the Information Age -- data cyberinfrastructure: "Just like the physical infrastructures all around us -- roads, bridges, water and electricity -- we need a data cyberinfrastructure that is stable, predictable, and cost-effective."
In her article, Berman explores key trends and issues associated with preserving digital data, and what's required to keep it manageable, accessible, available, and secure. However, she warns that there is no "one-size-fits-all" solution for data stewardship and preservation.
"The 'free rider' solution of 'Let someone else do it'-- whether that someone else is the government, a library, a museum, an archive, Google, Microsoft, the data creator, or the data user -- is unrealistic and pushes responsibility to a single company, institution, or sector. What is needed are cross-sector economic partnerships," says Berman. She adds that the solution is to "take a comprehensive and coordinated approach to data cyberinfrastructure and treat the problem holistically, creating strategies that make sense from a technical, policy, regulatory, economic, security, and community perspective."
Berman's ACM article closes with a set of "Top 10" guidelines for data stewardship (a brief illustrative sketch follows the list):
1. Make a plan. Create an explicit strategy for stewardship and preservation for your data, from its inception to the end of its lifetime; explicitly consider what that lifetime may be.
2. Be aware of data costs and include them in your overall IT budget. Ensure that all costs are factored in, including hardware, software, expert support, and time. Determine whether it is more cost-effective to regenerate some of your information rather than preserve it over a long period.
3. Associate metadata with your data. Metadata is needed to be able to find and use your data immediately and for years to come. Identify relevant standards for data/metadata content and format, following them to ensure the data can be used by others.
4. Make multiple copies of valuable data. Store some of them off-site and in different systems.
5. Plan for the transition of digital data to new storage media ahead of time. Include budgetary planning for new storage and software technologies, file format migrations, and time. Migration must be an ongoing process. Migrate data to new technologies before your storage media become obsolete.
6. Plan for transitions in data stewardship. If the data will eventually be turned over to a formal repository, institution, or other custodial environment, ensure it meets the requirements of the new environment and that the new steward indeed agrees to take it on.
7. Determine the level of "trust" required when choosing how to archive data. Are the resources of the U.S. National Archives and Records Administration necessary or will Google do?
8. Tailor plans for preservation and access to the expected use. Gene-sequence data used daily by hundreds of thousands of researchers worldwide may need a different preservation and access infrastructure from, for example, digital photos viewed occasionally by family members.
9. Pay attention to security. Be aware of what you must do to maintain the integrity of your data.
10. Know the regulations. Know whether copyright, the Health Insurance Portability and Accountability Act of 1996, the Sarbanes-Oxley Act of 2002, the U.S. National Institutes of Health publishing expectations, or other policies and/or regulations are relevant to your data, ensuring your approach to stewardship and publication is compliant.
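To make a few of these guidelines concrete, the following minimal Python sketch illustrates guidelines 3, 4, and 9: it writes a small metadata sidecar file next to a data file, records a SHA-256 checksum so integrity can be verified later, and keeps a second copy in a separate location. The file names, metadata fields, and paths are illustrative assumptions only and are not prescribed by Berman's article.

```python
# Minimal sketch of guidelines 3 (metadata), 4 (multiple copies), and 9 (integrity).
# All names, fields, and paths here are hypothetical examples.
import hashlib
import json
import shutil
from pathlib import Path


def sha256_of(path: Path) -> str:
    """Return the SHA-256 digest of a file, read in 1 MB chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()


def preserve(data_file: Path, offsite_dir: Path) -> None:
    # Guideline 3: record descriptive metadata alongside the data in a sidecar file.
    metadata = {
        "filename": data_file.name,
        "creator": "example-lab",            # hypothetical value
        "created": "2008-12-10",             # hypothetical value
        "format": data_file.suffix.lstrip("."),
        "sha256": sha256_of(data_file),      # Guideline 9: integrity checksum
    }
    sidecar = data_file.parent / (data_file.name + ".metadata.json")
    sidecar.write_text(json.dumps(metadata, indent=2))

    # Guideline 4: keep at least one additional copy on a different system or site.
    offsite_dir.mkdir(parents=True, exist_ok=True)
    shutil.copy2(data_file, offsite_dir / data_file.name)
    shutil.copy2(sidecar, offsite_dir / sidecar.name)


if __name__ == "__main__":
    preserve(Path("results.csv"), Path("/mnt/offsite_archive"))
```

Verifying integrity later is then a matter of recomputing the checksum and comparing it with the value stored in the sidecar.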
Berman is a national leader in this area and also co-chairs the Blue Ribbon Task Force on Sustainable Digital Preservation and Access with OCLC economist Brian Lavoie. The task force was formed late last year to explore and ultimately present a range of economic models, components, and actionable recommendations for sustainable preservation of and access to digital data in the public interest. Commissioned for two years, the task force will publish an interim report on its website later this month outlining economic issues and systemic challenges associated with digital preservation.
For Berman's full Communications of the ACM article, see: http://www.sdsc.edu/about/director/pubs/communications200812-DataDeluge.pdf .
Source: San Diego Supercomputer Center