October 11, 2012
LAS VEGAS, Oct. 11 — SAS High-Performance Analytics Server now supports more analytics, including text mining and optimization. The predictive modeling capabilities of SAS High-Performance Analytics Server can now also use the Hadoop Distributed File System (HDFS), the popular open-source big data infrastructure.
In-memory software accelerates insights and solves complex problems involving huge volumes of structured and unstructured data; simply put, SAS High-Performance Analytics Server helps turn big data into gold.
“SAS has ramped up R&D on our high-performance analytics family,” said Jim Davis, SAS Senior Vice President and Chief Marketing Officer. “Our customers say lightning-fast analytic insights with SAS High-Performance Analytics Server provide powerful competitive advantage. These latest enhancements, including additional support for Hadoop and HDFS, make it easier to achieve value from big data.”
The updates bolster SAS’ growing high-performance product set, which includes the in-memory big data visualization of SAS Visual Analytics and a growing number of industry-specific and horizontal analytic solutions. The latest, SAS High-Performance Marketing Optimization, was also announced.
Early users of SAS High-Performance Analytics Server report analytic computing time shrinking from days and hours to minutes, even seconds. They can analyze more data, perform more model iterations and run more sophisticated algorithms.
For example, by evaluating numerous scenarios simultaneously, a bank can spot opportunities, detect emerging issues and respond immediately to market conditions. A retailer can personalize offers on the spot, based on structured and unstructured data from sales and social media, boosting sales.
New capabilities for current users
For customers running SAS High-Performance Analytics Server on database appliances from Teradata or EMC Greenplum, the expanded functionality includes support for text data. SAS Enterprise Miner™ users can build predictive and descriptive models even faster on large data volumes.
Organizations use SAS analytics to detect fraud, minimize risk, increase response rates for marketing campaigns and curb customer attrition. SAS High-Performance Analytics means finding fraud before claims are paid, evaluating risk more frequently, improving marketing campaigns and reaching valuable customers before they defect.
Now enabled for SAS High-Performance Analytics Server, SAS Text Miner unlocks information from extremely large document collections, including social media. Leaders can act on new opportunities more quickly, more precisely and with less risk.
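For a concrete sense of what this kind of text mining involves, the sketch below runs a small TF-IDF term-weighting pass over a handful of made-up documents using scikit-learn; it illustrates the general technique, not SAS Text Miner’s API, and the sample documents are hypothetical.

```python
# Illustrative only: a minimal TF-IDF pass over a tiny document collection,
# using scikit-learn rather than SAS Text Miner.
from sklearn.feature_extraction.text import TfidfVectorizer

# Hypothetical documents standing in for a large collection
# (e.g., social media posts or claim notes).
docs = [
    "customer reports delayed shipment and requests refund",
    "positive review praises fast shipment and helpful support",
    "refund requested after damaged item arrived late",
]

vectorizer = TfidfVectorizer(stop_words="english")
tfidf = vectorizer.fit_transform(docs)  # sparse document-term matrix

# The top-weighted terms per document hint at the themes a text-mining
# step would surface before they feed a predictive model.
terms = vectorizer.get_feature_names_out()
for i, row in enumerate(tfidf.toarray()):
    top = sorted(zip(terms, row), key=lambda t: -t[1])[:3]
    print(f"doc {i}: {[word for word, _ in top]}")
```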
The updates also enhance large-scale optimization with select high-performance capabilities in existing SAS/OR® procedures. Only SAS High-Performance Analytics Server combines true mathematical optimization (versus simple rules-based analysis and user-defined constraint modeling) with big data to make the best possible decisions.
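As a rough illustration of that distinction, the sketch below solves a tiny marketing-allocation problem as a true linear program using SciPy rather than SAS/OR; the response rates, budget, and contact limits are all hypothetical.

```python
# Illustrative only: a small linear program solved with SciPy (not SAS/OR).
# Unlike a fixed rule such as "spend everything on the cheapest channel,"
# the solver weighs all constraints at once and returns the allocation
# with the highest expected response.
from scipy.optimize import linprog

# Maximize expected responses: 0.04 * mail + 0.07 * email
# (linprog minimizes, so negate the objective).
c = [-0.04, -0.07]

# Constraints: total contacts <= 100,000 and
# cost 0.50*mail + 0.05*email <= 20,000 budget.
A_ub = [[1, 1], [0.50, 0.05]]
b_ub = [100_000, 20_000]

# Email list is capped at 60,000 addresses; mail is unbounded above.
bounds = [(0, None), (0, 60_000)]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
mail, email = res.x
print(f"mail: {mail:.0f}, email: {email:.0f}, expected responses: {-res.fun:.0f}")
```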
The latest SAS High-Performance Analytics Server supports HDFS in deployments on industry-standard hardware platforms.
SAS/ACCESS® Interface to Hadoop, introduced in February, is among nearly 30 packaged solutions for data connectivity and integration between SAS and third-party data sources, including data warehouse appliances, enterprise applications, non-relational mainframe data sources and popular relational databases.
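To give a sense of the plumbing such connectivity engines abstract away, the sketch below reads a file over HDFS’s standard WebHDFS REST interface directly from Python; the namenode address, file path, and user name are hypothetical, and this is not how SAS/ACCESS itself is invoked.

```python
# Illustrative only: reading a file through HDFS's WebHDFS REST interface
# with Python's requests library. Host, port, path, and user are hypothetical.
import requests

NAMENODE = "http://namenode.example.com:50070"   # hypothetical namenode
PATH = "/data/transactions/part-00000.csv"        # hypothetical HDFS file

# WebHDFS OPEN: the namenode redirects the request to a datanode,
# which streams the file's bytes back to the client.
resp = requests.get(
    f"{NAMENODE}/webhdfs/v1{PATH}",
    params={"op": "OPEN", "user.name": "analyst"},
    allow_redirects=True,
    timeout=30,
)
resp.raise_for_status()
print(resp.text[:200])  # first bytes of the file
```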
The new SAS/ACCESS Interface to Hadoop software is eagerly welcomed by companies rolling out Hadoop pilots or production deployments.
“Hadoop is the big data platform of choice for Macys.com’s analytics team, and SAS is our analytics engine that drives insights. We connect the two environments to create an analytics solution that drives business value,” said Kerem Tomak, Vice President of Marketing Analytics at Macys.com. “I’ve used SAS/ACCESS engines for other data sources in the past and, based on that, I believe SAS/ACCESS Interface to Hadoop will provide tangible benefits.”
Meanwhile, SAS graphical tools such as SAS Data Integration Server enable users to access, process and manage Hadoop data and processes from within the familiar SAS environment, alleviating problems of skills shortages and complexity associated with Hadoop. SAS augments Hadoop with world-class analytics, plus metadata, security and lineage capabilities, ensuring that Hadoop will be enterprise-ready.
SAS Data Management also mitigates the lack of mature tools for developing and managing Hadoop deployments, and its integration with SAS Information Management helps organizations gain value from Hadoop more quickly with fewer resources.
“EMC Greenplum has long realized Hadoop’s value and has made it a key component in the Greenplum Unified Analytics Platform (UAP), which combines the co-processing of structured and unstructured data,” said Josh Klahr, Vice President of Products at Greenplum, a division of EMC. “The enhanced SAS High-Performance Analytics Server with Hadoop support running on Greenplum delivers a superior analytics environment, allowing customers to extract insights from vast amounts of data. Greenplum will continue working with SAS to create optimized big data platforms that deliver timely results and excellent customer value.”
Rick Lower, Partnership Marketing Director at Teradata, said, “In this time of increasingly complex business scenarios and the emergence of new opportunities from a universe of big data, SAS and Teradata analytic solutions are producing valuable insight for more companies than ever. Our new Teradata 700 appliance for SAS offers pre-installed SAS in-memory analytics with advanced capabilities.”