The Opportunity for Predictive Analytics in Finance

By Sue Korn

April 21, 2011

It is often said that managing enterprise risk and micro risk is about finding the needle in the haystack. Predictive analytics uses powerful computers with large memory and storage to eliminate 90 percent of the hay, those “easy” decisions that a computer can handle effortlessly. The modeling systems then score the remaining 10 percent, prioritizing the work of human analysts and investigators so they can do what they do best: make the optimal decision.
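
To make this workflow concrete, here is a minimal sketch of score-based triage in Python, assuming a hypothetical trained classifier with a scikit-learn-style predict_proba interface; the 10 percent review fraction mirrors the figure above, and all names are illustrative, not drawn from any specific production system.

```python
# A minimal sketch of score-based triage: auto-clear the "hay" and rank
# the riskiest cases for human review. The model, data layout, and
# threshold are illustrative assumptions.
import numpy as np

def triage(cases, model, review_fraction=0.10):
    """Return (auto_cleared, review_queue), with the queue ranked by risk."""
    scores = model.predict_proba(cases)[:, 1]        # estimated risk per case
    cutoff = np.quantile(scores, 1.0 - review_fraction)
    flagged = scores >= cutoff                       # roughly the top 10 percent
    review_queue = cases[flagged][np.argsort(-scores[flagged])]
    return cases[~flagged], review_queue
```

Analysts then work the queue from the top down, while the bulk of cases clears automatically.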

That entails such things as finding the best risk/reward trade-offs for new customers, avoiding fraudulent insurance claims, identifying fraud or abuse in government programs, stopping questionable transactions, and optimally pricing assets against the degree of risk.

Predictive analytics is the discipline that uses computational techniques to search for ways to optimize business decisions. Applications in financial services include front-end customer acquisition analytics, offer selection, relationship management, pricing optimization, risk management, fraud management, and actuarial analysis for insurance.

High Performance Business Computing in Financial Markets

Financial services is the second-largest commercial high performance computing (HPC) vertical market, behind only manufacturing. It is also one of the fastest growing and, as a result, is a critical part of our High Performance Business Computing (HPBC) methodology. Within financial services, high-frequency trading is the best-known application, but HPC is in use in several other areas as well.

Intersect360 Research tracks a number of broad application areas as part of the financial services vertical. These include trading, both high-frequency trading and algorithmic trading; risk management, at the enterprise, portfolio or customer level, as well as actuarial analysis for insurance; pricing and valuation of individual securities, derivatives, and compound derivatives; and business and economic analytics, including modeling, simulation, and decision support.

Financial services companies take many forms, from large, multinational, multiline organizations to regional boutiques. An individual company might run all or none of these application types. (You cannot guarantee that a given bank runs HPC risk management applications any more than you can guarantee that any manufacturer runs HPC computer-aided engineering simulations.) But among these application types, analytics, particularly predictive analytics, is important for its potential to be leveraged in multiple ways.

Predictive analytics techniques span several levels of sophistication. At the simplest level are traditional techniques such as regression, linear modeling, rules-based algorithms, and decision trees. More complex techniques such as neural networks and machine learning occupy the next level. Newer techniques include text analysis (where, for example, notes entered by a service representative after a customer calls in can be mined, or sentiment can be coaxed out of tweets) and social network analysis (looking for patterns in the relationship between a customer and provider, in the context of all other customers and providers).
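
As a hedged illustration of the simplest level, the sketch below fits a regression-style risk scorecard with scikit-learn on synthetic data; the features (think utilization, tenure, late payments) and the data itself are invented for the example.

```python
# A minimal sketch of the simplest level: a logistic-regression risk
# scorecard trained on synthetic, made-up data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))                     # stand-in customer features
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(size=1000) > 1.0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
scorecard = LogisticRegression().fit(X_train, y_train)
risk = scorecard.predict_proba(X_test)[:, 1]       # probability of a bad outcome
print(risk[:5])
```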

These individual techniques can be combined into compound engines such as net lift (or uplift) modeling, in which two or more scenarios are analyzed simultaneously to trace all possible outcomes and choose the right treatment (or lack of treatment) for a particular situation. There is also ensemble modeling, in which a suite of models is run and the final response comes from a weighting of the individual models’ results, and in which the model weighting can itself be refined based on the situation.
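
As a rough illustration of the ensemble idea, the sketch below blends the scores of several stand-in models with a fixed weighting; in practice the components would be real fitted models, and the weights could themselves be tuned to the situation. Everything here is a hypothetical placeholder.

```python
# A minimal sketch of ensemble modeling: each component model scores the
# case, and a weighted blend produces the final response. The "models"
# are stand-in functions returning made-up scores.
import numpy as np

def ensemble_score(case, models, weights):
    """Weighted average of individual model scores for one case."""
    scores = np.array([m(case) for m in models])
    w = np.asarray(weights, dtype=float)
    return float(scores @ w / w.sum())

models = [lambda c: 0.80, lambda c: 0.55, lambda c: 0.70]
print(ensemble_score(None, models, weights=[0.5, 0.3, 0.2]))  # -> 0.705
```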

We expect that analytics will be a growing market for what we call High Performance Business Computing (HPBC), particularly in financial services and related disciplines. There are three legs to the stool supporting this belief. First, there is an explosion of data becoming available to organizations, both internal and external. Second, methodologies to analyze and make sense of this vast amount of data are being developed and improved every day. The third leg of the stool is the availability of cost-effective and accessible systems (in terms of computational speed, data storage, and memory) capable of doing something useful with it. Put these three legs together and you get a large potential opportunity for HPBC.

The systems required to perform predictive analytics range from Excel with a SAS dataset on a laptop computer all the way to custom-designed, self-tuning engines running on large clusters or in-database. At one extreme, predictive analytics is clearly using high performance computing; at the other, it clearly is not. Where to draw that line right now is less important than the conclusion that more and more companies are moving toward these sophisticated techniques.

Industry leaders have their own internal teams, and this capability provides a differentiating competitive advantage. Those who have not made the switch will be evaluating these techniques and systems with more interest as more and more success stories are written by those using predictive analytics.

Companies moving to predictive analytics will get there in one of two ways: by building teams internally or by hiring third-party providers to develop their systems for them. These third parties can use the principal company’s systems, or they can run the analytics on behalf of the principals, sending back scores and metrics to be loaded into the principal company’s internal database.

Why Predictive Analytics

Financial institutions do not sell widgets, taking in revenue on those sales and paying the cost of making the goods they sold. While manufacturing companies can build a better product (better quality at a better price) using digital manufacturing, financial institutions’ assets are monetary in nature. Instead, financial institutions make their money on the spread, or difference, between what they earn on their financial assets and what they pay for their liabilities. This spread also has to be enough to cover their operating expenses, which generally include credit losses, fraud losses and fraud management.
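
A back-of-envelope example with invented numbers makes the spread concrete:

```python
# Illustrative spread arithmetic with made-up figures: earn 6 percent on
# assets, pay 2 percent on liabilities, and cover operating expenses
# (including credit and fraud losses) out of the difference.
assets, asset_yield = 1_000_000, 0.06        # e.g., a loan portfolio
liabilities, funding_cost = 900_000, 0.02    # e.g., customer deposits
operating_expenses = 25_000                  # incl. credit and fraud losses

spread_income = assets * asset_yield - liabilities * funding_cost
profit = spread_income - operating_expenses
print(spread_income, profit)                 # 42000.0 17000.0
```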

Assets, in this sense, can be insurance policies that provide premium income. They can be loans that provide origination fees, finance charges and service fees. They can be investment portfolios that provide management fees or trading revenue. Liabilities could be deposits or debt on which the institution pays a rate of interest for the use of the depositor’s or investor’s money.

A financial institution maximizes this profit calculation through two mechanisms: risk management and pricing optimization. Risk management encompasses the institution’s initial decision to originate a loan or insurance policy, its ongoing behavior analysis (e.g., fraud, delinquency, late payment, increased claims), and exposure management, like not renewing a policy or implementing line reductions. On the other side is pricing optimization, which includes the initial pricing decision, whether to make special offers or provide discounts to entice profitable customers to stay or deepen their relationship, and the implementation of penalty pricing (e.g., if the customer goes over their limit or pays late).
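
As one illustration of where risk management and pricing optimization meet, the sketch below prices a loan so that the rate covers funding cost, expected credit loss, and a target margin. This is a common textbook formulation under stated assumptions, not the author’s specific method, and all inputs are invented.

```python
# A minimal sketch of risk-based pricing: set the rate to cover funding
# cost, expected credit loss, and a target margin. All inputs are
# illustrative assumptions.
def risk_based_rate(funding_cost, prob_default, loss_given_default, target_margin):
    expected_loss = prob_default * loss_given_default
    return funding_cost + expected_loss + target_margin

# A borrower scored at 3% default probability, losing 60% of the balance
# in the event of default:
print(risk_based_rate(0.02, 0.03, 0.60, 0.025))  # 0.063 -> price at ~6.3%
```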

The analytically elite companies have these types of analytics as part of their DNA. They are constantly loading new transaction or behavior data, evaluating assumptions, calibrating models, rebalancing among methodologies, and reweighting results in ensemble infrastructures. Not long ago, “constantly” meant monthly. Increasingly it means weekly, daily, or even as transactions are initiated.

Predictive Analytics Beyond Banking

Although financial services institutions are among the most advanced users, the potential benefits are available to many business areas, and predictive analytics is already making a difference in non-financial markets. In the government arena, for example, it is being used to reduce waste, identify fraud in government programs, and uncover tax fraud. In health care, it is being employed for cost management, detecting system fraud, and making more accurate or quicker diagnoses. And in telecom, predictive analytics is being used to minimize customer churn.

On that last point, virtually any company has groups of customers it would like to manage better, both in terms of customer relationship management (CRM) issues like customer acquisition and turnover, and in terms of tailoring product portfolios and pricing to different categories of customers.

Because of its broad potential applicability, predictive analytics should continue to be a significant growth driver for HPBC markets. The vast amount of data being collected by companies virtually guarantees that there are some valuable nuggets of information waiting to be brought to light that can have a material impact on profitability. Finding these needles in the haystack is a challenge, but predictive analytics provides a way for companies to take advantage of them.

About the Author

Sue Korn is a senior analyst at Intersect360 Research specializing in High Performance Business Computing (HPBC) applications, and a 20-year veteran of the financial services industry. In her role at Intersect360 Research, Korn spearheads the company’s analysis of the drivers and barriers of HPC adoption in business environments and the growing role of HPBC applications.
