Sick of big data? Not so fast. The age of the data-centric system has just begun. Tackling this subject in a recent blog is Tilak Agerwala, Vice President of Data Centric Systems at IBM Research. Agerwala observes: “Since the 1950s, the models studied by HPC systems have increased both in scale and in detail with Read more…
The San Diego Supercomputer Center (SDSC) at the University of California, San Diego, has created a new “center of excellence” focused on helping researchers more effectively manage research data. Called the Workflows for Data Science Center of Excellence, or WorDS Center for short, its goal is to provide a common resource for developing and validating Read more…
Corral, the DataDirect Networks storage system installed at the Texas Advanced Computing Center (TACC), recently crossed the one petabyte mark in total data stored, and it now hosts over 100 unique data collections.
Cancer statistics. Medicare fraud. Peanut recalls. Federal government agencies collect data on all types of phenomena, and just about every piece of data is useful to somebody, somewhere, at some time. More of that data will find its way into the hands of the public and entrepreneurs thanks to the government’s new data sharing mandate, and this month, the government issued new guidelines telling federal agencies how to comply with the mandate.
Cloud computing has complicated the exchange of information between financial companies and the European community.
Big data is all the rage these days. It is the subject of a recent Presidential Initiative, has its own news portal, and, in the guise of Watson, is a game show celebrity. Big data has also caused concern in some circles that it might sap interest and funding from the exascale computing initiative. So, is big data distinct from HPC – or is it just a new aspect of our evolving world of high-performance computing?
There is a tech trend that may boost US job numbers and bring more investment to cloud providers.
HPC and grid/cloud expert Wolfgang Gentzsch conducts an interview with Paolo Balboni, scientific director of the European Privacy Association and founding partner at ICT Legal Consulting in Milan. Dr. Balboni will be a featured speaker at ISC Cloud in Mannheim, Germany, Sept. 26-27, 2012, where he will be discussing the legal aspects of cloud computing.
Is there any way to guarantee the security of cloud-based data? A security provider attempts to reduce vulnerabilities and increase confidence with auditing and compliance services.
By enabling extremely fast and scalable data access even under large and growing workloads, in-memory data grids (IMDGs) have proven their value in storing fast-changing application data, such as financial trading data, shopping data, and much more. As organizations work to efficiently access their critical business data across multiple sites or scale their processing into the cloud, the need to migrate data quickly and seamlessly to where it is needed has grown rapidly. The use of IMDGs creates an exciting opportunity for organizations to employ powerful global strategies for data sharing.
Federating IMDGs across multiple sites enables seamless, transparent access to data from any site and provides an ideal solution to the challenge of global data integration.
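To make the federation idea concrete, here is a minimal, purely illustrative sketch of the lookup pattern: reads try the local site's grid first, fall back to remote sites, and cache remotely found values locally. All class and method names are hypothetical assumptions for illustration; real IMDG products expose their own APIs and handle replication, consistency, and networking that this toy omits.

```python
class SiteGrid:
    """A plain in-memory key/value store standing in for one site's IMDG.
    (Hypothetical name; not any vendor's actual API.)"""

    def __init__(self, name):
        self.name = name
        self._store = {}

    def put(self, key, value):
        self._store[key] = value

    def get(self, key):
        return self._store.get(key)


class FederatedGrid:
    """Federates several site grids: reads check the local site first,
    then fall back to remote sites, caching any remote hit locally so
    later reads are served at local-memory speed."""

    def __init__(self, local, remotes):
        self.local = local
        self.remotes = remotes

    def get(self, key):
        value = self.local.get(key)
        if value is not None:
            return value
        for remote in self.remotes:
            value = remote.get(key)
            if value is not None:
                # Cache the remote value locally for future reads.
                self.local.put(key, value)
                return value
        return None


# Example: a trade record written at the London site is transparently
# readable from New York, and is cached there after the first access.
london = SiteGrid("london")
new_york = SiteGrid("new_york")
london.put("trade:42", {"symbol": "IBM", "qty": 100})

fed = FederatedGrid(local=new_york, remotes=[london])
print(fed.get("trade:42"))       # found remotely, now cached locally
print(new_york.get("trade:42"))  # subsequent read is a local hit
```

The read-through-with-local-caching pattern shown here is one common design choice for federated grids; production systems must additionally decide how cached copies are invalidated or synchronized when the origin site updates a value.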