July 24, 2012
According to a new report from Saugatuck Technology, cloud-based business analytics is poised for huge growth over the next two years. Read more…
June 19, 2012
A growing trend in enterprise environments is the deployment of predictive analytics. Cloud services are well matched to the requirements of these applications. Read more…
April 19, 2011
In early April the SAS Institute (SAS) announced it had integrated its most advanced analytics software into database appliances from EMC Greenplum and Teradata Corporation. The new offerings marry high performance computing to "big data" and are designed to enable users to perform deep analysis on huge datasets hosted on purpose-built, parallel computing platforms. Read more…
August 4, 2010
R language booster Revolution Analytics is going after the predictive analytics crowd with its latest Revolution R Enterprise software platform. The company announced this week it will be introducing a package called RevoScaleR to bring the R language into the world of "Big Data," enabling analytics applications to turbo-charge their performance and scale terabyte-sized mountains of data. Read more…
June 30, 2010
Business intelligence moves off the desktop. Read more…
March 3, 2010
REvolution Computing may do for R what Red Hat did for Linux. Read more…
February 25, 2010
In what IBM is characterizing as a "breakthrough," researchers have developed an algorithm that cuts the computational costs of assessing data quality by two orders of magnitude. The new algorithm has potentially far-reaching applicability, extending to nearly all types of analytics applications as well as scientific modeling and simulation. Read more…
February 11, 2010
IBM's 'Watson' supercomputer is being readied for a public challenge. Read more…
Five Recommendations to Optimize Data Pipelines
When building AI systems at scale, managing the flow of data can make or break a business. The various stages of the AI data pipeline pose unique challenges that can disrupt or misdirect the flow of data, ultimately impacting the effectiveness of AI storage and systems.
With so many applications and diverse requirements for data types, management systems, workloads, and compliance regulations, these challenges are only amplified. Without a clear, continuous flow of data throughout the AI data lifecycle, AI models can perform poorly or even behave in dangerous ways.
To ensure your AI systems are optimized, follow these five essential steps to eliminate bottlenecks and maximize efficiency.
Karlsruhe Institute of Technology (KIT) is an elite public research university located in Karlsruhe, Germany, engaged in a broad range of disciplines across the natural sciences, engineering, economics, humanities, and social sciences. For institutions like KIT, HPC has become indispensable to cutting-edge research in these areas.
KIT’s HoreKa supercomputer supports hundreds of research initiatives, including a project aimed at predicting when the Earth’s ozone layer will be fully healed. With HoreKa, projects like these can process larger amounts of data, enabling researchers to deepen their understanding of highly complex natural processes.
Read this case study to learn how KIT implemented their supercomputer powered by Lenovo ThinkSystem servers, featuring Lenovo Neptune™ liquid cooling technology, to attain higher performance while reducing power consumption.