March 08, 2012
Earlier this week, IBM announced an agreement with Citigroup to explore possible banking applications of Watson, Big Blue’s deep analytics supercomputing system. The technology debuted last February as a nearly unbeatable Jeopardy contestant and has since moved on to healthcare analytics at WellPoint. Along with its quiz show prowess and medical talents, the system has also been used to help determine the rightful owners of pharmaceutical patents.
In Citi’s case, the bank appears to be focused on using Watson’s content analysis and learning capabilities to improve customer interaction and streamline the banking experience. Don Callahan, Citi’s chief administrative officer and chief operations and technology officer, said in the press release: “We will collaborate with IBM to explore how we can use the Watson technology to provide our customers with new, secure services designed around their increasingly digital and mobile lives.”
Although the initial use of Watson for the banking behemoth appears to focus on end-user experience, Mike Rhodin, Senior VP of IBM Software Solutions, hinted that the bank is also looking at making the Jeopardy champ a cyber-financial analyst. According to him, IBM and the bank will “explore how applying Watson in the consumer financial market could help empower financial professionals to make better business decisions.” That line of thinking is reflected in a post by Manoj Saxena, GM of IBM Watson Solutions, who sums up Watson’s potential for financial institutions:
“In banking, Watson can take in a huge amount of data–everything from information about customers’ profiles and banking activities to corporate quarterly reports, analyst reports, regulations, credit ratings, government securities filings and the institution’s own rules. In addition, it can gather intelligence from online news reports, blogs, Twitter feeds and transcripts of a company’s earnings call with analysts. This vast cache of information could provide a 360-degree view of a customer and a contextual view of many decisions that a bank has to make.”
For fairly obvious reasons, Citi is not talking up this angle. And if it ever uses the technology to do the sort of cutting-edge financial analysis IBM is suggesting, it will probably keep the application to itself.
Watson’s general-purpose language recognition and data crunching abilities are enabling the system to migrate into new verticals, which was IBM’s plan from the get-go. Its entry into the financial industry may begin as a research project to enhance customer experience, but the work could morph into something much more revolutionary: turning Watson into an expert system for lending, trading and risk analysis. Other banks will most likely keep a close eye on Citi’s progress with Big Blue.