Banks Boost Infrastructure to Tackle GDPR

By Jeff Hong

June 13, 2018

As banks become more digital and data-driven, their IT managers are challenged by fast-growing data volumes and lines of business’ (LoBs’) seemingly limitless appetite for analytics. Becoming data-driven requires easy yet managed access to data wherever it resides, and timely analysis of structured and unstructured data with economies of scale.

At the same time, IT managers are implementing multiple costly and complex regulations, such as the General Data Protection Regulation¹ (GDPR) and the Fundamental Review of the Trading Book (FRTB). FRTB alone is expected to increase capital reserve requirements by roughly 40%, leaving less capital available for other uses. Regulations can even contradict one another in key areas, as GDPR and Know Your Customer (KYC) rules do on the topic of data retention.

Firms delaying compliance risk significant fines. A 2017 global survey¹ of 500 firms identified the financial services industry as one likely to face the greatest scrutiny for GDPR compliance. Penalties under GDPR can reach 4% of a bank’s annual global revenue or €20 million, whichever is greater; for a bank with, say, €10 billion in annual revenue, that is a potential fine of €400 million. To start, many banks are:

  • Adopting a more standardized, documented approach to data management
  • Hiring a Chief Data Officer to set data policies: where data resides, how it can be used, who can access it, and for how long
  • Replacing their aging servers and storage devices to provide greater capacity more efficiently

However, continually adding more on-premises hardware is an increasingly unworkable approach to capacity management given many banks’ limited IT space, budget, and staff. Public cloud is an option, but IT managers are still proceeding down that path with caution for proprietary, regulatory, and competitive reasons. Select applications and data will remain behind the datacenter wall, or co-located at exchanges, for the foreseeable future.

IT managers are instead adopting more agile, open technologies that scale out analytics by making better use of existing IT resources and that work across hybrid cloud environments. These include:

  • A software-defined infrastructure with disaggregated compute and storage to provide greater agility and efficiency for hybrid cloud and/or Hadoop-based environments
  • Virtualization, high-performance multi-tenant grid computing, and secure cloud bursting to deliver scalability and cost savings for compute-heavy risk simulations
  • Multi-tier storage systems, data lakes, high-performance file/object managers, data-aware job schedulers, and life-cycle management tools to optimize data movement (minimizing latency impact), access, and storage choice
  • Open frameworks and pre-integrated systems, such as Apache Spark and IBM Systems’ PowerAI platform respectively, to accelerate innovation in IoT and machine learning analytics with less operational risk (see the sketch below)
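
For illustration, here is a minimal PySpark sketch of the kind of machine learning analytics such open frameworks enable, as referenced in the last bullet above. The input path, the column names (amount, merchant_risk_score, txn_hour, is_fraud), and the fraud-scoring use case are hypothetical assumptions for the example, not details of IBM's platforms or of any bank's actual workload.

    # Minimal sketch: train a simple fraud classifier on transaction records
    # with Spark ML. All data, paths, and column names are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.ml.feature import VectorAssembler
    from pyspark.ml.classification import LogisticRegression

    spark = SparkSession.builder.appName("bank-analytics-sketch").getOrCreate()

    # Structured transaction data; in practice this would typically sit in a
    # data lake (HDFS or object storage) rather than a local file.
    df = spark.read.parquet("transactions.parquet")

    # Spark ML expects numeric features assembled into one vector column.
    assembler = VectorAssembler(
        inputCols=["amount", "merchant_risk_score", "txn_hour"],
        outputCol="features",
    )
    train = assembler.transform(df).select("features", "is_fraud")

    # Fit a logistic regression model against the hypothetical fraud flag.
    model = LogisticRegression(labelCol="is_fraud").fit(train)
    print("Training AUC:", model.summary.areaUnderROC)

    spark.stop()

The same job can run unchanged on an on-premises cluster or burst to cloud capacity, which is part of what makes frameworks like Spark attractive in the hybrid environments described above.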

To learn how IBM’s software-defined approach can help your stakeholders access and manage data more efficiently, click here.

To learn how IBM simplifies and accelerates AI and data analytics, click here.

  1. Beginning in May 2018, most financial firms must comply with the General Data Protection Regulation (GDPR), as well as plan for the Fundamental Review of the Trading Book (FRTB).
  • In a 2017 survey of 500 IT decision makers on GDPR, 41% of the financial services firms surveyed expected greater costs and complexity. Satisfying the right to erasure, maintaining records of processing activities, and managing security of processing pose the top challenges to compliance. Source: One Year Out: Views on GDPR, VansonBourne
  • For FRTB, the new Expected Shortfall (ES) calculation rule requires a more standardized approach to modeling and calculation using more comprehensive data sets. Banks will need more capacity, throughput, and scalability, as well as faster I/O, for simulations that may involve hundreds of thousands of files and millions of jobs (a rough, single-position illustration of ES appears in the sketch below).
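
As an illustration of the ES calculation referenced above, the following Python snippet estimates a one-day Expected Shortfall for a single position by Monte Carlo simulation. The normal-returns assumption, the parameter values, and the single-position scope are simplifications for the example; the FRTB methodology additionally prescribes liquidity-horizon adjustments and stressed calibration across many risk factors.

    # Minimal Monte Carlo sketch of Expected Shortfall (ES) for one position.
    # All parameter values are hypothetical; FRTB ES uses a 97.5% level.
    import numpy as np

    rng = np.random.default_rng(seed=42)

    position_value = 1_000_000.0   # hypothetical position value in EUR
    daily_vol = 0.02               # assumed daily volatility of returns
    n_scenarios = 1_000_000        # number of simulated one-day returns
    alpha = 0.975                  # 97.5% confidence level

    # Simulate one-day losses under a simple normal-returns assumption.
    returns = rng.normal(loc=0.0, scale=daily_vol, size=n_scenarios)
    losses = -position_value * returns

    # VaR is the alpha-quantile of the loss distribution;
    # ES averages the losses in the tail beyond VaR.
    var = np.quantile(losses, alpha)
    es = losses[losses >= var].mean()

    print(f"VaR(97.5%): {var:,.0f} EUR")
    print(f"ES(97.5%):  {es:,.0f} EUR")

Scaling this kind of simulation from one position to a full trading book, across many risk factors and stressed periods, is what drives the capacity, throughput, and I/O demands noted above.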