September 24, 2008
NEW YORK CITY -- Can hardware acceleration save Wall Street? Well, not as quickly as a multibillion-dollar bailout might, but there was plenty of discussion at this week's HPC on Wall Street conference about the advantages specialized hardware can bring to market analysts and traders. Sellers of these products were all over the place, their booths were busy, and several sessions on the subject were standing-room only.
The idea of separate appliances to speed up data processing has started catching on in financial services, said Geno Valente, VP of sales and marketing at XtremeData, Inc. "A few years ago, people were like, 'Who needs that?'" The expectation was that faster and faster CPUs would yield the processing and throughput speeds needed to respond to market changes. Plus, the FPGAs that are the brains of most accelerator boards were exotic to organizations outside the scientific community and required specialized parallel-programming skills to be used effectively.
But that's all changed, said Valente. Accelerators are now built into appliances from companies like Solace Systems and Exegy that "anyone can plug in and start using," he said. "Libraries are being developed that make acceleration technology available to people who don't know parallel-programming." As a result, "Public exchanges are using accelerators. Wall Street is now taking advantage of them. And there are a lot more companies using them that we can't talk about," he said.
Obviously performance -- faster processing of more data and more data streams -- is the primary advantage vendors mention when talking about their products. Their specialized processors handle the floating-point operations that crunch the algorithms that analysts and traders live and die by. "Accelerator hardware enables us to do things we couldn't do otherwise," said Henry Young, founder of TS-Associates, an IT services company specializing in financial middleware. "We can handle a 10-gigabit data stream through a hardware accelerator to speed up processing. You can't compress a 10-gig stream like that with current CPU technology."
But it's not just speed. One reason Wall Street is getting on board is the need to score an edge over competitors and get a better handle on volatile markets. "Accelerators can differentiate a cluster," Valente said. "Otherwise you have exactly what the other guy has. It's accelerators and other customizable things you can add to a Linux box that give you an advantage."
Reliability and data precision were brought up as crucial benefits of acceleration hardware during a panel discussion. Simon McIntosh-Smith, VP of applications at ClearSpeed Technology, raised the specter of "soft errors" resulting from cosmic emissions flipping bits. ClearSpeed makes an ASIC-based accelerator board that can plug into a server or be ganged up in a rack and connected to the network. What if an alpha emission hit a processor and altered the data from "sell 10,000 shares" to "buy 10,000"? In that case, you'd better hope you have hardware that supports error detection as well as error correction, McIntosh-Smith said. "The point is to make sure you have a system that supports high reliability. Some hardware accelerators support it, and some don't. Without reliability built in, you can have soft errors and not even realize it until it's too late."
While cosmic rays might be too sci-fi for other manufacturers, they still emphasize the high reliability features of their devices. Exegy uses reconfigurable hardware "so people can add logic to detect flipped bits or to add in more data protection as necessary," said Scott Parsons, the company's chief architect. And Solace Systems VP of architecture Shawn McAllister pointed out that Solace appliances can also have reliability features added. "We have firmware that monitors things and can take over in the event of a problem," he said.
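The distinction the panelists draw between detecting a flipped bit and correcting it is the same one that separates simple parity from full ECC. A minimal sketch of the idea, using a textbook Hamming(7,4) code (illustrative only -- real accelerator and memory hardware uses wider SECDED codes over whole data words):

```python
# Single-error correction with a Hamming(7,4) code: four data bits are
# protected by three parity bits, so any one flipped bit can be located
# and repaired. Codeword layout: [p1, p2, d1, p3, d2, d3, d4].

def hamming74_encode(d):
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    """Return (corrected data bits, error position or 0 if clean)."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]   # checks bit positions 1,3,5,7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]   # checks bit positions 2,3,6,7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]   # checks bit positions 4,5,6,7
    syndrome = s1 + 2 * s2 + 4 * s3  # binary position of the flipped bit
    if syndrome:
        c[syndrome - 1] ^= 1         # correct the single-bit error
    return [c[2], c[4], c[5], c[6]], syndrome

word = hamming74_encode([1, 0, 1, 1])
word[4] ^= 1                          # a stray particle flips one bit
data, pos = hamming74_decode(word)
print(data, pos)                      # -> [1, 0, 1, 1] 5
```

A parity-only scheme would report that *something* flipped but not where; the syndrome here pinpoints the bit, which is the difference between halting a trade and silently repairing it.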
Consolidation is another benefit, these hardware makers say. Exegy's technology is incorporated in a ticker based on reconfigurable hardware that "can handle all the North American market data feeds in one box rather than a dozen," Parsons said, with that stream including NYSE/SIAC, NASDAQ, OPRA, and ARCA. (See it in action at marketdatapeaks.com, he noted.)
"Datacenter consolidation is very important to our customers," McAllister said. "Some of them are in a situation where they just can't add another server until they take one out. Accelerators can help with that."
With appliances replacing big servers, in theory at least financial firms will save on utility bills. "We have seen lower power consumption at the system level with our FPGAs," Parsons noted.
That would be an advantage, right?
"I thought by now we'd be hearing more customers saying they want to reduce power consumption," McIntosh-Smith said. "But we're not seeing that. It's still all about speed and performance."
And as market data grows and grows, there will probably be no end of that need. Accelerator designers admit they're just part of the solution. Advances in middleware, algorithms, the OS stack, and software development tools will all be needed to give financial services the ability to acquire, process and interpret data faster. "We don't see general-purpose CPUs getting that much faster," said Parsons. "So you have to look at non-traditional approaches to all these problems."
Meanwhile, manufacturers of other types of processors are designing their chips to meet the demands of financial applications. In June, NVIDIA, whose graphics processing units are at the heart of upscale gaming and multimedia systems (pretty demanding themselves), introduced its 240-core Tesla 10-series. The company says the chip's teraflop of processing power can be applied to mission-critical workloads such as financial analysis. Likewise, AMD is aiming its FireStream 9250 processor at the same kind of HPC number crunching. AMD has said that developers are reporting up to a 55 times performance increase when running financial analysis code using a FireStream 9250 GPU-based accelerator versus a standalone CPU.
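The speedups GPU vendors cite come from workloads where thousands of independent calculations run at once. A minimal NumPy sketch of one such workload -- Monte Carlo pricing of a European call option under standard Black-Scholes assumptions (the parameter values are illustrative, not from any vendor benchmark) -- shows the structure: every simulated path is independent, so on a GPU each path can map to its own thread.

```python
import numpy as np

# Monte Carlo pricing of a European call option: an embarrassingly
# parallel workload of the kind GPU accelerators target. Each path
# draws one random sample, evolves the price, and computes a payoff,
# with no dependence on any other path.

def mc_european_call(s0, strike, rate, vol, maturity, n_paths, seed=0):
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n_paths)                        # one draw per path
    drift = (rate - 0.5 * vol**2) * maturity
    s_t = s0 * np.exp(drift + vol * np.sqrt(maturity) * z)  # terminal prices
    payoff = np.maximum(s_t - strike, 0.0)                  # call payoff
    return np.exp(-rate * maturity) * payoff.mean()         # discounted mean

price = mc_european_call(100.0, 100.0, 0.05, 0.2, 1.0, 1_000_000)
print(round(price, 2))   # close to the Black-Scholes value of about 10.45
```

On a CPU this loop over a million paths is memory- and FLOP-bound; the same computation rewritten as a GPU kernel scales across hundreds of cores, which is where headline figures like a 55x speedup come from.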
Regardless of one's take on hardware accelerators, they have to be weighed like any other advance.
"If you want to play in the fast markets, you can't ignore new technology," said Peter Lankford of the Securities Technology Analysis Center, which evaluates and benchmarks IT for trading systems and other financial applications. "Technology budgets are going to be more constrained now, so financial companies need to pay even more attention. Things are just going to get tighter and tighter."