Big Data Engenders New Opportunities and Challenges on Wall Street

By Nicole Hemsoth

September 27, 2012

One of technology’s most pervasive buzzwords echoed in the ears of attendees at this year’s one-day HPC on Wall Street conference in New York City, as panel after panel addressed the challenges and opportunities that big data presents. From the opening remarks on Wall Street’s traditional concern, low latency, delivered by Cisco CTO Paul Perez, to the open-ended discussions in the concurrent panels, the “big data” problem surfaced again and again.

For this industry, however, the concerns around what the overall technology ecosystem is touting as big data are quite different. The exploding volume of data that other industries are dealing with is compounded in the financial space by regulations mandating massive, long-term storage.

But the industry itself is finding value in the ability to tap those datasets in both real-time and historical context. What this means is that Wall Street is looking for snappy new ways to keep the meaningful data at the fore, while maintaining a monster archive of historical transactions and other data for more leisurely access and analysis.

During a panel on the exploding demands for storage, analytics, risk management and ultra-low latency (not to mention the compute horsepower required), Emile Werr, VP and Head of Enterprise Architecture at NYSE Euronext, described the system-wide challenges that massive, fast-moving data poses across the exchange’s HPC infrastructure. He noted that, for his team, the challenges went far beyond the “three Vs” of big data: volume, variety and velocity. Their entire approach and methodologies had to shift.

The volume and complexity challenges were keenly felt amid the volatility of changing systems, new markets, and even new businesses his firm is exploring. Note that NYSE Technologies is the spin-out company from the exchange of the same name, and it offers the financial industry an increasingly large buffet of software and services, from custom middleware packages to hosted exchange analysis.

The company has had to keep pace with an evolving exchange market for its customers, necessitating new approaches to its system environments on both the hardware and software sides. According to Werr, these tweaks and new services have allowed NYSE Technologies to expand its traditional market business significantly.

Werr, who proudly notes that he’s the “big data guy” at NYSE, says one requirement that isn’t obvious is that the data fed into their systems is not user-friendly and certainly doesn’t arrive ready-made for BI platforms. This means a whole, often invisible layer of complex data enrichment is required.

But when you’re talking about billions of transactions per day, building systems that can take this unfriendly data and turn it into regulation-friendly, analysis-ready information is a key, ongoing struggle. Still, they think they may have solved some pieces of that system-wide puzzle, and they’re marketing their architecture as a solver for the industry’s big data and HPC problems.

As mentioned earlier, another aspect of NYSE’s “macro data architecture strategy” that Werr described is the regulatory-plus-storage problem. “We are obligated to maintain data for seven years,” he said, not without some exasperation. “There’s not one system out there that could actually store that data and have it online. Besides, it wouldn’t be practical. It’s old, old data, it’s just used for regulatory needs and then maybe trending over time details.”

But if the big data hype that insists all bytes are a potential goldmine rings with any validity, NYSE Euronext has a solution that could lend some credence to that ideal. The company has developed a clever system whereby data is scattered across distributed resources in such a way that it can be provisioned on the fly. Using an on-demand approach they’ve refined, the system can serve an array of applications, everything from a historical audit to an analyst’s real-time query.
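To make the idea concrete, here is a minimal sketch of what such on-demand provisioning might look like, assuming a catalog that maps data partitions to their locations in a distributed archive. All names here (Partition, DataCatalog, provision) are hypothetical illustrations for this article, not details of NYSE’s actual system.

```python
"""A minimal sketch of on-demand provisioning over scattered data.
Hypothetical names and layout; not NYSE's actual system."""
from dataclasses import dataclass
from datetime import date


@dataclass
class Partition:
    symbol: str
    trade_date: date
    location: str  # e.g., a node/path somewhere in the distributed archive


class DataCatalog:
    """Maps (symbol, date) partitions to wherever they live in the archive."""

    def __init__(self):
        self._index = {}

    def register(self, part: Partition) -> None:
        self._index[(part.symbol, part.trade_date)] = part

    def locate(self, symbol: str, trade_date: date) -> Partition:
        return self._index[(symbol, trade_date)]


def provision(catalog: DataCatalog, symbol: str, days: list) -> list:
    """Resolve only the requested slice; a historical audit and an
    analyst's real-time query both go through the same lookup path."""
    return [catalog.locate(symbol, d).location for d in days]
```

The point of the sketch is that nothing is copied until an application asks for a specific slice, which is what makes serving both archival and real-time consumers from one pool plausible.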

NYSE Technologies is commercializing its reported success with this inventive macro data architecture, which Werr says has been rolling along nicely in production for four years. While skipping over the specifics, he noted that the system works in harmony with messaging systems and feed handlers designed to capture certain transactions at very low latency.

Those files are generated in small mini-batches and then fired off to the firm’s “transformation-archive farm,” which offloads much of the ETL processing across a commodity cluster. The data then moves into the enrichment phase, where relational models can be constructed and dropped into distributed storage for the rapid, on-demand access capabilities he hinted at earlier. At the prettier end of the process is a services layer that allows rapid provisioning and access for all applications, along with APIs for systems and schedulers, leaving the data in a form ready to be analyzed for any other business purpose.
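As a rough illustration of that flow, the sketch below parses mini-batch files, fans the ETL work out across worker processes (standing in for a commodity cluster), and runs a simple enrichment pass. The file format, field names, and function names are assumptions made for the example.

```python
"""A hedged sketch of the mini-batch flow described above: capture writes
small batch files, ETL transforms them in parallel, enrichment shapes
them for relational storage. Illustrative only."""
from concurrent.futures import ProcessPoolExecutor
from pathlib import Path


def transform(batch_file: Path) -> list:
    """ETL step: parse one raw mini-batch into normalized records."""
    records = []
    for line in batch_file.read_text().splitlines():
        ts, symbol, price, size = line.split(",")
        records.append({"ts": ts, "symbol": symbol,
                        "price": float(price), "size": int(size)})
    return records


def enrich(records: list) -> list:
    """Enrichment step: add derived fields needed by downstream models."""
    for r in records:
        r["notional"] = r["price"] * r["size"]
    return records


def run_pipeline(batch_dir: Path) -> list:
    """Fan the per-file ETL out across a process pool (a stand-in for
    the commodity cluster), then enrich the combined result before it
    lands in distributed storage."""
    files = sorted(batch_dir.glob("*.csv"))
    with ProcessPoolExecutor() as pool:
        batches = pool.map(transform, files)
    enriched = []
    for batch in batches:
        enriched.extend(enrich(batch))
    return enriched
```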

A well-oiled machine, no? Werr says it took a lot of determination to climb out of their old paradigm of being a big database shop built on the standard Oracle, Sybase and similar tools. At the heart of that shift is the need for ever-faster ingestion of data. They’re at the point now where they can load around 20 terabytes per hour into their federated server farm. Since they keep only a short window of genuine production data, they are then able to quickly provision that data into sandboxes, allowing more refined operation on specific subsets or the use of narrowly defined tools and integration approaches.
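For scale, 20 terabytes per hour works out to roughly 5.6 GB per second of sustained ingest. The snippet below does that arithmetic and sketches the sandbox idea: carving a narrow, analysis-specific subset out of the hot production window. The function and field names are hypothetical.

```python
"""Back-of-envelope ingest rate plus a toy sandbox-provisioning helper.
A sketch of the idea described above, not NYSE code."""
from datetime import datetime, timedelta

INGEST_TB_PER_HOUR = 20
gb_per_sec = INGEST_TB_PER_HOUR * 1e12 / 3600 / 1e9
print(f"~{gb_per_sec:.1f} GB/s sustained ingest")  # ~5.6 GB/s


def carve_sandbox(hot_window, symbols, since):
    """Provision a sandbox: only the rows a specific analysis needs."""
    return [row for row in hot_window
            if row["symbol"] in symbols and row["ts"] >= since]


# Example: the last hour of activity in two symbols.
now = datetime.now()
hot = [{"symbol": "ABC", "ts": now - timedelta(minutes=30), "price": 10.0}]
sandbox = carve_sandbox(hot, {"ABC", "XYZ"}, now - timedelta(hours=1))
print(len(sandbox))  # 1
```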

Whether we treat the big data craze as a mere concept or a hype bubble, the fact remains that the vendors on every conference panel throughout the day found some element of value in the topic. As panelists laid out the opportunities and challenges across all the hardware and software this trend touches, attendees were left with the impression that the financial industry is in for some major retooling.


