SaaS: Making IT a ‘Strategic Weapon’ for Banks

By Nicole Hemsoth

February 11, 2008

3Tera Chairman and CEO Barry X Lynn discusses why it’s only a matter of time before all financial institutions — at least those that want to be competitive — will be utilizing utility computing to enable software-as-a-service applications. Says Lynn: “[W]e are seeing financial institutions starting to think about it as a strategic weapon again, and using it that way. Those who do not will either play catch-up or fade away.”

— 

GRIDtoday: Generally speaking, what kind of market are you seeing for the delivery of financial applications as a service? What kinds of applications are companies looking to get from a SaaS model?

BARRY X LYNN: The market is huge and already somewhat tapped. Look at banks; they all have online banking now. Look at brokerages; they all have online trading. Many banks offer corporate treasurers Web-based cash management services. Insurance companies allow subscribers to submit claims online, et cetera. The big change is this: These Web-based services have become predominant in financial institutions, and, in many cases, dominant. So, a financial institution no longer adds value simply by paying higher interest rates, charging lower commissions, keeping premiums down, et cetera. It adds more value as a distributor and processor of financial information. Effecting this distribution and processing on the Web through SaaS applications is, I’d estimate, 50-fold less expensive than doing the same transactions in person in a branch office.

Gt: What factors are driving this, and how is this demand not being snuffed out by concerns over security, reliability, availability, etc.?

LYNN: The main driving factors are progress, the ability to reduce costs, and being able to provide more services to more customers (e.g., online banking, payroll and tax payments now available equally to large corporations and to SMBs) with far greater convenience, all of which become competitive advantages for institutions that offer such services. Given the available bandwidth, processor speed and the cost effectiveness of it all, the expense incurred doing SaaS-y things is so much less than it is for human interaction. The Web also provides financial institutions with unlimited geographical reach.

It is not being snuffed out by concerns over security, reliability, availability, et cetera, but, because of these concerns, uptake is slower than it could be. Many of these concerns are valid, but many are the result of resistance to change and the remaining Luddites in the large corporate datacenters. However, so much investment has been, and is being, made in information technology security, privacy, scalability, et cetera, that the technologies addressing these issues are state-of-the-art. Platforms that enable Web-based services and SaaS, and use these technologies, such as 3Tera’s AppLogic, have become extremely secure. Running the SaaS in a “cloud” makes it far more difficult to be targeted by hackers. Uptime is several nines out of the box and several more nines with relatively inexpensive customized configuration. Our product scales linearly. We have found that running a SaaS on our platform — be it four processors, 40 processors or 400 processors — incurs very little overhead, which remains flat in all of these cases. And this is not theory — we have done it!

Gt: How does 3Tera play into the concept of delivering financial applications as a service? What essential elements of a SaaS strategy does 3Tera bring to the table, and on what level(s) of the SaaS model does it operate?

LYNN: If a financial institution uses 3Tera as its SaaS platform, it can choose from several models. If they are offering a completely standalone service, including its own data, they can set up that application as a fully encapsulated element, either in their own datacenter or at a hosted datacenter, including all of its virtual infrastructure, totally isolated from whatever else they are running in their datacenters. Because SaaS volumes fluctuate unpredictably, without 3Tera, they would have to provision themselves for peak volume, something they may hit a small percentage of the time or, perhaps, not at all. With 3Tera, they can provision for their average needs. When those needs are exceeded, they can dynamically burst into additional resources, provisioning on the fly and releasing those resources when they are no longer needed. Thus, they only incur the cost of what they consume, rather than the cost of what they might, maybe, perhaps, but maybe not, someday need.
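A minimal sketch of the provision-for-average, burst-on-demand policy Lynn describes might look like the following. The grid interface (grid.allocated, grid.allocate, grid.release) and the node counts are illustrative assumptions, not AppLogic’s actual API.

```python
# Hypothetical sketch: keep paid-for capacity near average demand and
# burst toward a ceiling only while load requires it.

BASELINE_NODES = 4      # capacity sized for average load (assumed)
MAX_BURST_NODES = 40    # ceiling for on-demand bursting (assumed)

def rebalance(grid, current_load, capacity_per_node):
    """Grow or shrink the allocation so capacity tracks demand."""
    needed = -(-current_load // capacity_per_node)   # ceiling division
    target = min(max(needed, BASELINE_NODES), MAX_BURST_NODES)
    if target > grid.allocated():
        grid.allocate(target - grid.allocated())     # burst into extra nodes
    elif target < grid.allocated():
        grid.release(grid.allocated() - target)      # stop paying for idle nodes
```

Run periodically, a policy like this charges the operator only for resources actually consumed, rather than for a permanently provisioned peak.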

Also, if financial institutions want their SaaS applications to have access to their corporate database information or systems of record, they can do so with 3Tera, as we enable secure virtual gateways, rather than channel connections, so that those databases and SORs remain physically isolated from the SaaS. I have already discussed the security, reliability and scalability that 3Tera affords. To do this, we operate at the lowest level of the SaaS environment. Our platform runs on top of a computer grid, so it is, effectively, an operating system for the grid that enables all the other elements of the SaaS infrastructure. The native operating system or systems (yes, 3Tera enables the mixing and matching of multiple OSs in a single application!) run on top of our AppLogic, and the SaaS’s required middleware, application software, infrastructure, data, et cetera, run on top of the native OS or OSs.
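A rough sketch of the virtual-gateway pattern Lynn mentions, in which the SaaS invokes named, pre-approved operations rather than holding a direct connection to the system of record, might look like this. The operation whitelist and connection interface are assumptions for illustration, not 3Tera’s actual API.

```python
# Hypothetical sketch: the SaaS never touches the system of record (SOR)
# directly; a gateway forwards only whitelisted, parameterized queries.

ALLOWED_OPERATIONS = {
    "get_balance":   "SELECT balance FROM accounts WHERE id = %s",
    "list_payments": "SELECT * FROM payments WHERE account_id = %s",
}

def gateway_call(sor_connection, operation, params):
    """Forward a whitelisted operation to the SOR; reject anything else."""
    if operation not in ALLOWED_OPERATIONS:
        raise PermissionError(f"operation {operation!r} is not permitted")
    with sor_connection.cursor() as cur:      # DB-API style cursor (assumed)
        cur.execute(ALLOWED_OPERATIONS[operation], params)
        return cur.fetchall()
```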

Gt: How important is a flexible utility computing platform for hosting/powering SaaS applications? Is enabling these flexible, scalable platforms 3Tera’s sweet spot in this market?

LYNN: The economics of delivering SaaS make it much more affordable, cost-effective and reasonable to use a utility computing model, where companies can grow their infrastructure and scale on demand, choose the best geographic location to serve their applications, and avoid long-term commitments and overprovisioning. Most 3Tera customers are ready to start small with a new application, test the hosted platform, and then move more applications to the utility. There are companies that have already invested in building their own datacenters and are considering bringing the utility computing model in-house to serve various internal departments. 3Tera offers both options: hosted grids through our growing network of hosting partners, and direct enterprise licenses for building internal utilities. The platform also provides metering services, so that datacenters can easily provide charge-backs without adding expensive management systems. But, most importantly, 3Tera allows those who want to provide a new application the convenience of not having to spend huge amounts of time, money and talent on building and maintaining the infrastructure, thus allowing them to focus on the particular application and functionality they want to enable.
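The metering Lynn describes amounts to accumulating per-department resource usage and pricing it at an internal rate. A minimal sketch, assuming an hourly CPU rate and a usage-recording hook that are purely illustrative:

```python
# Hypothetical sketch of per-department metering for charge-backs.
from collections import defaultdict

CPU_HOUR_RATE = 0.12   # assumed internal rate, dollars per CPU-hour

class Meter:
    def __init__(self):
        self.cpu_hours = defaultdict(float)

    def record(self, department, cpus, hours):
        """Accumulate usage sampled from the grid for one department."""
        self.cpu_hours[department] += cpus * hours

    def charge_back(self):
        """Produce the bill per department for the metering period."""
        return {dept: round(hours * CPU_HOUR_RATE, 2)
                for dept, hours in self.cpu_hours.items()}
```

With records like meter.record("treasury", cpus=8, hours=720), a datacenter can issue charge-backs without a separate management system, which is the point Lynn makes above.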

Gt: Where is 3Tera seeing the most demand for its product in regard to SaaS — from companies with internal “clouds” or from hosting providers supplying the “clouds?”

LYNN: As the number of companies trying to deliver their applications as services grows, we see tremendous interest in both hosted solutions and in-house clouds. While a lot of customers are interested in the licensed model, some prefer to start with a small hosted grid as a proof-of-concept for their applications. As the applications are deployed and start running, and as demand grows and they can easily add more resources, confidence in the reliability of the hosted environment grows, not to mention the cost-effectiveness of the solution. As a result, we see many more companies going with the hosted model than with the enterprise license.

Gt: Has the company been doing a lot with SaaS thus far? How much of this has been in the financial sector?

LYNN: Naturally, our early adopters were mostly in the Web 2.0 and online content and application areas, where the immediate advantages and enablement could be realized. As the product has matured, we are seeing more and more SaaS adoption, and those customers are shocked at the ease with which AppLogic solves problems they had considered hard to crack. AppLogic has also enabled SaaS customers to shorten their deployment times (proofs-of-concept in three days, full production deployments in less than a month), make upgrades more reliable and secure, and offer more differentiated services. Now that our customers have experienced firsthand that our platform is secure and reliable, we are working with several large financial companies who are currently in various stages of evaluation.

Gt: From your perspective, what are the major barriers to SaaS in the financial sector? Are they mostly technological or cultural?

LYNN: I would say that it was a fairly balanced mix just a couple of years ago, but the technology barriers have been broken by the hardware providers enabling the necessary processor speed and bandwidth — and by 3Tera providing a platform on which this power can be harnessed. The cultural resistance varies from institution to institution. There are still a lot of people out there who resist change. With platforms such as 3Tera, empires shrink. And there are still those out there who only do what behemoth, nimbleness-challenged vendors tell them to do. Traditionally, large financial institutions have seen information technology as an operational necessity, but, more and more, we are seeing financial institutions starting to think about it as a strategic weapon again, and using it that way. Those who do not will either play catch-up or fade away.

In 1969, a man walked on the moon. I’d say any financial institution that claims, almost 40 years later, that any of their creative ways of debiting and crediting accounts cannot be offered as a Web-based service is not being truthful.

Gt: How, and when, will these concerns be overcome?

LYNN: They are in the process of being overcome. As the early adopters gain higher profit margins while offering lower prices, thus increasing market share, these concerns will mystically disappear, for the most part.

Gt: Ultimately, how important do you see SaaS and computing as a service becoming in the financial sector? Will the brunt of access be internal or external, and what platforms for powering these applications will emerge as the big winners?

LYNN: Obviously, I see this as ultra-important. Technology and the Web enable financial institutions to offer an infinitely diverse set of products, geographically unlimited, at huge profit margins. As a result, I can actually see the day when financial services firms think about dropping their charters to become distributors and processors of financial information and transactions, without regulation. Think about it. If you are, for example, JPMorgan Chase and someone comes to your SaaS to buy a home loan, you might be better off selling them a Wells Fargo home loan, as long as they came to your SaaS to buy it. You could divest yourself of all the “manufacturing” and servicing costs — and credit risk — and you would pocket pure profit.

I think the uptake will result in significant internal access, as well. First, when done right, these SaaS Web-based applications afford the ultimate in efficiency. So, why have separate front-ends, applications and databases for internal use when you have these? Simply deploy them on your intranets. Furthermore, this technology enables datacenter managers to turn their datacenters into metered utilities to offer to their end users. My completely unbiased opinion is that 3Tera will be the platform of choice for SaaS and utility computing.

Gt: What kind of timeframe are we looking at until SaaS becomes a normal practice for financial applications?

LYNN: SaaS is already, and has been for quite some time, normal practice. What it is not yet is pervasive. This will be a follow-the-leader game. In financial services, as in most industries, there are early adopters, followers and also-rans. The early adopters will kick butt and the followers will scurry to replicate what they are doing. When a critical mass of followers is on board, we will have pervasiveness. Unless your head is in the sand, the coming hockey stick is apparent. Predicting when that hockey stick comes is difficult, though. It could be six months. It could be a few years. But, if it’s any longer than that, the early adopters will rule the world of financial services and the others will have to live on their scraps.

Gt: Is there anything else you’d like to add?

LYNN: I grew fond of being able to say that utility computing, cloud computing and SaaS are the future of information technology — but I no longer say that. Information technology is maintaining, processing and moving data. My epiphany was in realizing that information technology is a subset of utility computing, rather than the other way around. Utility computing is the future. The future of information technology is that it will become a commodity offered by the utility.

—

Barry X Lynn will be discussing this subject Monday, Feb. 11, at the Web Services/SOA on Wall Street conference in New York. More information can be found at www.webservicesonwallstreet.com/.
