The Opportunity for Predictive Analytics in Finance

By Sue Korn

April 21, 2011

It is often said that managing enterprise risk and micro risk is about finding the needle in the haystack. Predictive analytics uses powerful computers with large memory and storage to eliminate 90 percent of the hay: the “easy” decisions a computer can handle effortlessly. The modeling systems then score the remaining 10 percent, prioritizing the work of the human analysts and investigators so they can do what they do best, which is to make the optimal decision.

That entails such things as finding the best risk/reward trade-offs for new customers, avoiding fraudulent insurance claims, identifying fraud or abuse in government programs, stopping questionable transactions, and optimally pricing assets against the degree of risk.
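
To make that triage concrete, here is a minimal sketch in Python of how a scoring model might clear the bulk of cases automatically and queue only the highest-scoring ones for analysts. The data, model choice, and threshold are all hypothetical, invented purely for illustration.

```python
# A hypothetical triage flow: a fitted model scores incoming cases; low
# scores are auto-cleared (the "hay"), high scores go to human analysts.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X_train = rng.normal(size=(1000, 5))                 # stand-in case features
y_train = (X_train[:, 0] + rng.normal(size=1000) > 2).astype(int)  # rare "bad" flag

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

X_new = rng.normal(size=(200, 5))                    # today's incoming cases
scores = model.predict_proba(X_new)[:, 1]            # probability of "bad"

REVIEW_THRESHOLD = 0.5                               # hypothetical cutoff
auto_cleared = np.sum(scores < REVIEW_THRESHOLD)     # handled by the machine
analyst_queue = np.argsort(scores)[::-1][:20]        # worst cases first
print(f"auto-cleared {auto_cleared} of {len(X_new)} cases; "
      f"top of analyst queue: {analyst_queue[:5]}")
```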

Predictive analytics is the discipline that uses computational techniques to search for ways to optimize business decisions. Applications in financial services include front-end customer acquisition analytics, offer selection, relationship management, pricing optimization, risk management, fraud management, and actuarial analysis for insurance.

High Performance Business Computing in Financial Markets

Financial services is the second-largest commercial high performance computing (HPC) vertical market, behind only manufacturing. It is also one of the fastest growing and, as a result, is a critical part of our High Performance Business Computing (HPBC) methodology. Within financial services, high-frequency trading is the best-known application, but HPC is in use in several other areas as well.

Intersect360 Research tracks a number of broad application areas as part of the financial services vertical. These include trading, both high-frequency trading and algorithmic trading; risk management, at the enterprise, portfolio or customer level, as well as actuarial analysis for insurance; pricing and valuation of individual securities, derivatives, and compound derivatives; and business and economic analytics, including modeling, simulation, and decision support.

Financial services companies take many forms, from large, multinational, multiline organizations to regional boutiques. An individual company might run all or none of these application types. (You cannot guarantee that a given bank runs HPC risk management applications any more than you can guarantee that any manufacturer runs HPC computer-aided engineering simulations.) But among these application types, analytics, particularly predictive analytics, is important for its potential to be leveraged in multiple ways.

Predictive analytics techniques span several levels of sophistication. At the simplest level are traditional techniques such as regression, linear modeling, rules-based algorithms, and decision trees. More complex techniques such as neural networks and machine learning occupy the next level. Newer techniques include text analysis (where, for example, notes entered by a service representative after a customer calls in can be mined, or sentiment can be coaxed out of tweets) and social network analysis (looking for patterns in the relationship between a customer and provider, in the context of all other customers and providers).
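
As a minimal illustration of the simplest level, the sketch below fits a linear model and a decision tree to the same synthetic data; the features and labels are invented, not drawn from any real portfolio.

```python
# Two "simplest level" techniques, fit to the same synthetic data:
# a linear model (logistic regression) and a rules-style decision tree.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 3))     # e.g., balance, utilization, tenure (invented)
y = (0.8 * X[:, 0] - 0.5 * X[:, 1] + rng.normal(size=500) > 0).astype(int)

linear = LogisticRegression().fit(X, y)               # regression / linear modeling
tree = DecisionTreeClassifier(max_depth=3).fit(X, y)  # shallow, rule-like tree

print("linear coefficients:", linear.coef_.round(2))  # interpretable weights
print("tree accuracy on training data:", round(tree.score(X, y), 3))
```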

The individual techniques described above can be combined into compound engines such as net lift (or uplift) modeling, where two or more scenarios are analyzed simultaneously to trace all possible outcomes and choose the right treatment (or lack of treatment) for a particular situation. There is also ensemble modeling, in which a suite of models is run and the final response comes from a weighting of the individual models’ results, where the model weighting itself can also be refined based on the situation.
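
The following sketch shows the ensemble idea in its simplest form; the component models and the weights are placeholders rather than a real scoring engine.

```python
# A minimal sketch of ensemble modeling: several models score the same case
# and the final response is a weighting of their results. The weights are
# hypothetical; in practice they would be refined based on the situation
# and each model's track record.
import numpy as np

def model_a(case): return 0.72   # placeholders for fitted models, each
def model_b(case): return 0.55   # returning a risk-style score in [0, 1]
def model_c(case): return 0.80

weights = np.array([0.5, 0.3, 0.2])       # hypothetical weights, summing to 1
case = {"customer_id": 123}               # illustrative input record
scores = np.array([model_a(case), model_b(case), model_c(case)])

final_score = float(weights @ scores)     # weighted ensemble response
print(f"ensemble score: {final_score:.3f}")  # 0.5*0.72 + 0.3*0.55 + 0.2*0.80 = 0.685
```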

We expect that analytics will be a growing market for HPBC, particularly in financial services and related disciplines. There are three legs to the stool supporting this belief. First, there is an explosion of data becoming available to organizations, both internal and external. Second, the methodologies to analyze and make sense of this vast amount of data are being developed and improved every day. The third leg of the stool is the availability of cost-effective and accessible systems (in terms of computational speed, data storage, and memory) that can do something useful with the data. Put these three legs together and you get a large potential opportunity for HPBC.

The systems required to perform predictive analytics range from Excel working with a SAS dataset on a laptop, to custom-designed, self-tuning engines running on large clusters or in-database, and everything in between. At one extreme, predictive analytics is clearly using high performance computing; at the other, it clearly is not. Where to draw that line right now is less important than the conclusion that more and more companies are moving toward these sophisticated techniques.

Industry leaders have their own internal teams, and this capability provides a differentiating competitive advantage. Those who have not yet made the switch will evaluate these techniques and systems with growing interest as more and more success stories are written by those using predictive analytics.

Companies moving to predictive analytics will get there in one of two ways: either by building teams internally or by hiring third-party providers to develop their systems for them. These third parties can use the principal company’s systems, or can run the analytics on behalf of the principals, sending back scores and metrics to be loaded into the principal company’s internal database.

Why Predictive Analytics

Financial institutions do not sell widgets, taking in revenue on those sales and paying the cost of making the goods they sold. While manufacturing companies can build a better product (better quality at a better price) using digital manufacturing, financial institutions’ assets are monetary in nature. In contrast to a manufacturing organization, financial institutions make their money on the spread, or difference, between what they earn on their financial assets and what they pay for their liabilities. This spread also has to be enough to cover their operating expenses, which generally include credit losses, fraud losses, and fraud management.

Assets, in this sense, can be insurance policies that provide premium income. They can be loans that provide origination fees, finance charges, and service fees. They can be investment portfolios that provide management fees or trading revenue. Liabilities can be deposits or debt on which the institution pays a rate of interest for the use of the depositor’s or investor’s money.
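
A worked example with invented round numbers makes the spread arithmetic concrete:

```python
# A sketch of the spread calculation described above, using hypothetical
# figures: earnings on assets minus the cost of liabilities, which must
# also cover operating expenses (including credit and fraud losses).
assets = 1_000_000_000        # e.g., a loan portfolio (invented figure)
asset_yield = 0.065           # finance charges and fees earned on assets
liabilities = 900_000_000     # e.g., deposits funding those loans
funding_cost = 0.025          # interest paid for the depositors' money

spread_income = assets * asset_yield - liabilities * funding_cost
operating_expenses = 25_000_000   # includes credit losses and fraud losses
profit = spread_income - operating_expenses
print(f"spread income: ${spread_income:,.0f}; profit: ${profit:,.0f}")
# spread income: $42,500,000; profit: $17,500,000
```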

A financial institution maximizes this profit calculation through two mechanisms: risk management and pricing optimization. Risk management encompasses the institution’s initial decision to originate a loan or insurance policy, its ongoing behavior analysis (e.g., fraud, delinquency, late payment, increased claims), and its exposure management, such as not renewing a policy or implementing line reductions. On the other side is pricing optimization, which includes the initial pricing decision, whether to make special offers or provide discounts to entice profitable customers to stay or deepen their relationship, and the implementation of penalty pricing (e.g., if the customer goes over their limit or pays late).
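
As a sketch of what the pricing-optimization side can look like in code, the rules below apply a hypothetical penalty rate and retention discount; every threshold and rate is invented for illustration.

```python
# Hypothetical pricing-optimization rules of the kind described above:
# penalty pricing for risky behavior, a discount to retain profitable
# customers. All thresholds and rates are illustrative.
BASE_APR = 0.1499

def reprice(customer: dict) -> float:
    apr = BASE_APR
    if customer["late_payments_12m"] >= 2 or customer["over_limit"]:
        apr += 0.10   # penalty pricing: pays late or goes over the limit
    elif customer["profitable"] and customer["tenure_years"] > 5:
        apr -= 0.02   # retention pricing for a valued, long-tenured customer
    return apr

print(reprice({"late_payments_12m": 3, "over_limit": False,
               "profitable": True, "tenure_years": 7}))   # -> 0.2499
```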

The analytically elite companies have these types of analytics as part of their DNA. They are constantly loading new transaction or behavior data, evaluating assumptions, calibrating models, rebalancing among methodologies, and reweighting results in ensemble infrastructures. Not long ago, “constantly” meant monthly. Increasingly it means weekly, daily, or even as transactions are initiated.

Predictive Analytics Beyond Banking

Although financial services institutions are among the most advanced users, the potential benefits are available to many business areas, and predictive analytics is already making a difference in non-financial markets. In the government arena, for example, it is being used to reduce waste, identify fraud in government programs, and uncover tax fraud. In health care, it is being employed for cost management, detecting system fraud, and making more accurate or quicker diagnoses. In telecom, predictive analytics is being used to minimize churn in the customer base.

On that last point, virtually any company has groups of customers it would like to manage better, both in terms of customer relationship management (CRM) issues like customer acquisition and turnover, and in tailoring product portfolios and pricing to different categories of customers.

Because of its broad potential applicability, predictive analytics should continue to be a significant growth driver for HPBC markets. The vast amount of data being collected by companies virtually guarantees that there are some valuable nuggets of information waiting to be brought to light that can have a material impact on profitability. Finding these needles in the haystack is a challenge, but predictive analytics provides a way for companies to take advantage of them.

About the Author

Sue Korn is a senior analyst at Intersect360 Research specializing in High Performance Business Computing (HPBC) applications, and a 20-year veteran of the financial services industry. In her role at Intersect360 Research, Korn spearheads the company’s analysis of the drivers and barriers of HPC adoption in business environments and the growing role of HPBC applications.
