Cloud Sparking Rapid Evolution of Life Sciences R&D

By Bruce Maches

April 6, 2011

The increasing adoption of cloud computing in its various forms is having a dramatic impact on the way life science CIOs provision the applications their organizations need to support the R&D process.

Life science research has always been a complex and time-consuming endeavor requiring a broad and diverse set of systems and computer applications. The complexity and resource requirements of these specialized applications have grown so much that many life science companies are struggling to afford to build, implement, and support the required systems and infrastructure internally.

Before the cloud, life science companies would simply throw more resources, whether storage, compute, or people, at the problem. That is no longer possible in today’s economic climate, and cloud technologies can provide a viable alternative.

While I have written in more detail about many of these topics as part of my extended series of pharma R&D-focused entries over the last year, I wanted to provide some high-level information on the methodology of life science R&D and how IT supports that process. Where appropriate, I will refer to the specific blog article where you can find additional information.

Although the R&D process differs by type of life science company (for example, drug research versus internally used devices such as heart valves versus external devices such as a knee brace), the overall steps are basically the same:

– Research/Discovery: identifying the disease mechanism, compound, target, genome or device needs

– Development: developing and refining the compound, therapeutic, or device

– Phase 1, 2, and 3 Clinical Trials: performing the requisite safety and effectiveness testing of the compound/device

– Regulatory Approval: seeking FDA consent to market the drug/device

– Post Approval Monitoring: tracking the use of the new product, outcomes, and any patient adverse events which must be reported to the FDA

All of these steps involve a multitude of activities and can take years of intensive effort to complete. Each area requires a variety of specialized systems to support the process and to capture and manage the data being generated. The number of systems and processes to support can be quite large depending on the type and complexity of the research being performed.

The life science industry is facing many other challenges besides increased technology needs and complexity. The entire industry is under intense revenue pressure as insurance companies and policy makers try to rein in ever-increasing health costs by demanding discounts or simply reducing reimbursement levels. Developing a major new drug costs close to $1 billion and takes around 10 years to complete, and most potential drug research projects are abandoned at some point during development due to unexpected side effects or insufficient efficacy. Industry averages show that of 1,000 potential compounds identified in the discovery phase as worth pursuing, only one will make it through the process, be granted market approval, and actually be sold.

On top of this dismal success rate, the FDA has increased its scrutiny of new drugs, asking for more in-depth safety studies and trials before granting approval. The exclusivity period or patent life only lasts for a set number of years, and the clock starts ticking long before approval is granted. On average, a newly approved drug has about 10 years on the market before competitors can start marketing their own versions.

Many large pharmaceutical companies are dealing with impending major revenue shortfalls as popular drugs come off patent and are opened to generic competition. In 2011 alone, there are over a dozen name-brand drugs representing nearly $13 billion in revenue coming off patent protection. By far the largest of these is Pfizer’s cholesterol drug Lipitor, which generates over $6 billion in revenue.

All of these factors have created an increase in risk, depressing investment in new drug startups while increasing the pressure on existing companies to bring their new therapeutics to market as quickly as possible. These economic realities have forced life science companies to find ways to reduce costs while at the same time increasing productivity and reducing time to market for new products.

The life science CIO faces multiple challenges on a daily basis. Not only are these CIOs being asked to do more with less, they must also deal with issues such as:

– An aging portfolio of legacy systems that must be kept in service as they contain critical data; the FDA requires that any data related to a drug be kept for at least two years after it was last sold (think of the challenge of dealing with something like aspirin)

– Ensuring continued regulatory compliance for system-related issues per FDA and HIPAA requirements, along with applicable foreign regulatory guidelines

– Pressure to reduce budgets while meeting increasing needs from the business for responsiveness and agility

– The need to reduce time-to-market for new therapies as every day of market exclusivity can potentially mean millions of dollars in revenue

– Continual vendor and technology changes in the marketplace

– Increasingly complex and resource-intensive applications, along with the explosion of data in R&D; the amount of data managed by life science companies nearly doubles every 3 months

The combined IT spend for life science companies in the US is over $700 billion per year. Overall budgets have remained flat for the last two years, meaning that life science CIOs, like many of their counterparts in other industries, must do more with less while increasing flexibility and responsiveness to meet business needs.

Regulatory Compliance

One of the more complex and time-consuming issues that life science CIOs have to deal with is ensuring that their systems and applications are in compliance with regulatory agency guidelines. In the US, the FDA not only provides guidance on how drugs are developed, manufactured, and marketed, but also on how the supporting systems must be tested and validated in order to ensure that the data (i.e., results) they contain can be trusted and is accurate. In a nutshell, this requires that any system containing product data or clinical testing information, or used to create submissions for regulatory approval, must be validated as described in 21 CFR Part 11 of the Code of Federal Regulations.
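
To make that a bit more concrete, one of the controls Part 11 calls for is a secure, computer-generated, time-stamped audit trail on electronic records. The sketch below is purely illustrative and not a compliant implementation; the record fields are hypothetical, and chaining each entry to the previous one by hash is just one way to make after-the-fact tampering detectable.

```python
# Illustrative only: a minimal, hash-chained audit trail for record changes.
import hashlib
import json
from datetime import datetime, timezone

audit_trail = []  # in practice this would live in protected, durable storage

def record_change(user, record_id, field, old, new):
    """Append a time-stamped, tamper-evident entry describing one change."""
    prev_hash = audit_trail[-1]["hash"] if audit_trail else "0" * 64
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "record_id": record_id,
        "field": field,
        "old": old,
        "new": new,
        "prev_hash": prev_hash,  # links this entry to the one before it
    }
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    audit_trail.append(entry)

# Hypothetical example: an analyst corrects an assay result.
record_change("jsmith", "SAMPLE-0042", "assay_result", "12.1", "12.3")
```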

How to deal with compliance is a major concern whenever new applicable systems, or major modifications to existing validated ones, are being planned. Project plans for new systems must incorporate a significant amount of time to build compliance in, while upgrades to existing systems can require partial or complete re-validation. This can be very expensive, which means that many systems are not retired but are kept in service for much longer than would normally be expected. You can read more about this aspect of life science IT in this more thorough exploration of the topic.

Given the critical nature of regulatory compliance, life science CIOs must include it as a key piece of their overall cloud strategy, or they will face a much more difficult road as they move to the cloud. Even worse, they may follow a path that impairs their ability to ensure compliance, leaving them open to issues being raised during FDA audits.

Impact of Cloud Computing

The first portion of this article was meant to give the reader an overall understanding of the current state of IT in the life sciences: the process, the issues, and some of the challenges. While the impact of cloud computing is similar across a number of industries, I will now address the specific effect it is having on life science companies.

Costs

The cost considerations regarding the delivery of information technology services are certainly not unique to the life sciences environment. All CIOs are continually dealing with budget constraints while having to rationalize and justify the expense it takes to design, build, provision, and support the systems and applications their users need. In the life sciences, IT can consume an inordinate amount of the total operational budget compared to other industries. This is not surprising given that the R&D process is so data-driven. While I was at Pfizer Global R&D, the IT budget consumed over 15% of total R&D expenditures and about 8% of the organization's total headcount. This high level of resource consumption, while perhaps necessary, does take away from the organization's core mission: the science of drug discovery and development.

So, how can cloud computing help life science CIOs bring their costs down? There are a number of ways, and I will describe a few of them briefly below.

IaaS is a key component for reducing direct IT costs and the overall TCO of applications. Provisioning new hardware, along with the data center space, power, and support personnel it requires, is a major component of the CIO's budget. Life science CIOs should have a clear understanding of their application and project portfolio so they can leverage the technology to reduce infrastructure costs. Using public cloud infrastructure for non-validated applications, especially ones that are very 'bursty' in their resource needs, can save thousands in hardware and support costs. A private cloud can be leveraged for internal applications that require a validated or controlled environment. With a defined application deployment strategy, life science CIOs can significantly reduce hardware and supporting infrastructure costs.
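
As a rough illustration of the 'bursty' pattern, here is a minimal sketch using boto3, the AWS SDK for Python; the AMI ID, region, instance type, and tag values are hypothetical placeholders, and it assumes AWS credentials are already configured. The same idea applies to any IaaS provider: rent a pool of nodes for the duration of a non-validated analysis run, then release them so the meter stops.

```python
# Sketch: provision a short-lived pool of compute nodes for a burst workload.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # region is a placeholder

resp = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # hypothetical image with the analysis stack
    InstanceType="c5.4xlarge",         # hypothetical instance type
    MinCount=8,
    MaxCount=8,
    TagSpecifications=[{
        "ResourceType": "instance",
        "Tags": [{"Key": "project", "Value": "compound-screening-burst"}],
    }],
)
instance_ids = [i["InstanceId"] for i in resp["Instances"]]

# ... submit the workload to the nodes and wait for it to finish ...

# Tear everything down once the run is complete.
ec2.terminate_instances(InstanceIds=instance_ids)
```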

SaaS (described further below) can also be a major cost saver for the life sciences. Many vendors now offer specialized R&D applications as validated SaaS systems. A client of mine is a medium-sized biotech running clinical trials on their new drug. One of the major tasks in getting FDA approval is collecting, storing, and collating all of the data and documents that will be part of the NDA (New Drug Application). Normally a company like this would purchase a document management and publication system along with the supporting hardware and administrative resources. My client has chosen (as many others are) to use a SaaS-based document management tool to store their documents and a similarly provisioned publishing tool to pull the documents together and create the files that will be electronically submitted to the FDA. By doing this, not only were they able to bring this functionality online almost immediately, they also saved tens of thousands of dollars over doing it in-house.

Disaster recovery and cloud backup are also areas where significant cost savings can be realized. Virtual images of critical applications can be built, allowing these systems to run temporarily in the cloud in case of a disaster. In addition, the cost of one of the major components of data management, backup and restore, can be reduced by using cloud backups as part of the overall offsite backup strategy. This saves not only on manpower and hardware but also on backup tapes and offsite storage.
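
As one small example of the backup piece, the sketch below copies a nightly database dump to cloud object storage using boto3; the bucket name, file paths, and encryption setting are assumptions, not a prescription for any particular backup product.

```python
# Sketch: push last night's database dump to object storage as the offsite copy,
# replacing the tape-and-courier step in the backup rotation.
import datetime
import boto3

s3 = boto3.client("s3")
today = datetime.date.today().isoformat()

s3.upload_file(
    Filename="/backups/clinical_db_nightly.dump.gz",   # hypothetical local dump
    Bucket="example-offsite-backups",                   # hypothetical bucket
    Key=f"clinical_db/{today}/nightly.dump.gz",
    ExtraArgs={"ServerSideEncryption": "AES256"},       # encrypt at rest
)
```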
 
Cloud computing not only reduces costs, it also allows the IT group to be much more agile and responsive to the organization and to deploy the systems and applications that support R&D more quickly. The ultimate goal is to get new therapies out the door faster, not only saving money but also increasing revenue by getting drugs to market sooner and extending the effective exclusivity period.

Regulatory Compliance

A significant amount of the expense and effort in any life science IT shop goes into ensuring that the systems and applications deployed comply with the appropriate regulatory guidelines. While there are a number of ways that cloud computing can assist with compliance, there are two major areas where cloud technologies can have the biggest impact.

Validating and supporting the operating infrastructure of a system is a major component of the compliance effort. Building a validated private cloud environment allows compliance costs to be leveraged across multiple systems. This reduces hardware costs, data center footprint, and support requirements. Life science CIOs should examine their application portfolios to see which systems can be moved to virtual environments, and should deploy new systems only in virtual mode.
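
One simple way to leverage a qualified base image across many deployments is to verify, before a new VM is released for use, that its disk still matches the checksum recorded when the environment was validated. This is only a sketch of that idea; the manifest format, image names, and paths are hypothetical.

```python
# Sketch: confirm a VM image still matches its qualified baseline checksum.
import hashlib

# Hypothetical manifest of images approved during qualification.
APPROVED_MANIFEST = {
    "validated-base-2011q2.qcow2": "<sha256 recorded at qualification time>",
}

def image_matches_baseline(path: str, name: str) -> bool:
    """Return True if the image file still matches its recorded checksum."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):  # read in 1 MB chunks
            digest.update(chunk)
    return digest.hexdigest() == APPROVED_MANIFEST[name]

if __name__ == "__main__":
    ok = image_matches_baseline("/images/validated-base-2011q2.qcow2",
                                "validated-base-2011q2.qcow2")
    print("baseline intact" if ok else "image drifted from qualified baseline")
```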

Legacy Systems

The management and maintenance of existing legacy systems is a huge headache for the life science CIO. In large IT shops, a majority of the personnel and funding goes to supporting systems that have been in service for 5, 10, even 15 years. It is not unusual to walk into a big pharma data center and see nameplates from companies past; Wang, DEC, and Compaq systems are just a few that I have seen recently. The primary cause of this is the time and expense that went into validating these systems when they were first brought online. There is usually budget for new systems but not budget to re-validate upgraded systems or to provide a validated method for transferring data from a system being retired to a new application. Instead, quite often, new systems are deployed and layered on top of existing applications, which must be kept alive as they contain critical information that must be available on demand.

While cloud computing is no miracle cure for this problem, a potential solution is to create a validated private cloud environment and build the appropriate VM images to move these legacy applications into a virtual state. This would allow the life science CIO to retire the old hardware and free up both support resources and data center space, as I touched on in this entry.
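
For the virtualization step itself, the sketch below shows what "moving a legacy application into a virtual state" might look like on a KVM/libvirt-based private cloud, assuming the physical server's disk has already been converted to an image with a P2V tool. The domain name, image path, and resource sizes are hypothetical.

```python
# Sketch: register and start a P2V-converted legacy system on a libvirt host.
import libvirt

DOMAIN_XML = """
<domain type='kvm'>
  <name>legacy-lims-01</name>
  <memory unit='GiB'>4</memory>
  <vcpu>2</vcpu>
  <os><type arch='x86_64'>hvm</type></os>
  <devices>
    <disk type='file' device='disk'>
      <driver name='qemu' type='qcow2'/>
      <source file='/var/lib/libvirt/images/legacy-lims-01.qcow2'/>
      <target dev='vda' bus='virtio'/>
    </disk>
  </devices>
</domain>
"""

conn = libvirt.open("qemu:///system")   # connect to the local hypervisor
dom = conn.defineXML(DOMAIN_XML)        # register the VM definition persistently
dom.create()                            # power on the migrated legacy system
print("Started %s (id %d)" % (dom.name(), dom.ID()))
conn.close()
```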

SaaS

Many firms that sell specialized applications into the life science space are now provisioning their applications via the SaaS model. While SaaS has been around for quite a while in a variety of forms, it is the ability to quickly provide users with state-of-the-art applications that appeals to a life science CIO. Beyond the normal advantages of SaaS, life science CIOs can access application environments that are either pre-validated or in a semi-validated state, significantly reducing the resources and time required to provision a new application. Over the last year I’ve weighed in quite a bit on the role of SaaS solutions in the industry.

Impediments

So what are some of the factors impeding the adoption of cloud computing in the life sciences? Not surprisingly, they are the usual suspects: questions around security, protection of intellectual property, vendor lock-in, latency, and so on. The biggest factor is how to deal with validation in an environment that is not under your control. Providing the necessary validation in a public cloud can be difficult at best, although some IaaS vendors are contemplating private cloud offerings that would include validation of the physical environment, and there are companies offering pre-validated software images that can be loaded on demand. This type of pre-validated hosted environment would be extremely appealing, as it greatly reduces the cost and effort of deploying new R&D applications, a topic I have covered in more detail here.

Startups & Small Biotechs

Some of you may get the impression that only large companies can appropriately leverage cloud computing. While it is true that larger IT shops may gain more from it, small companies can also benefit from cloud technologies. Many smaller companies have made it a core strategy to fulfill as much of their IT needs as possible from the cloud first and internally second. In a way, smaller companies have an advantage, as they do not have an inventory of legacy applications or entrenched people and processes to deal with.

I have two small biotech clients using this strategy. One has taken it to an extreme: if you walked into their offices, all you would find are several wireless access points and printers. There are no servers, no desktops, no phones, and no need for IT administrative support. Everybody brings in their own laptop and cell phone, and all normal IT services are provided via IaaS or SaaS vendors. Even their phone system is a SaaS-provisioned VoIP PBX connected to their cell phones. My other client has different needs, but a major portion of both their infrastructure and applications is still accessed via the internet, a matter I discussed a while back.

In Closing

So, how is cloud computing impacting the life sciences IT organization? Certainly the changes being wrought by cloud computing are not unique to the life sciences, but cloud is changing how life science CIOs provide the systems their users need. Utilizing cloud as an integral part of their overall IT strategy gives life science CIOs a major tool for reducing costs, responding quickly to user needs, easing the burden of regulatory compliance, and supporting the complex process of life science R&D.

Many forward-thinking CIOs are already incorporating cloud into their current application portfolios and long-term strategic plans. Those that have a clear and direct strategy for utilizing cloud, and are aggressively looking at cloud technologies to help them with their problems, will be much more successful than those who are flying by the seat of their pants.

Now, what does the future hold? It would be great to be able to look five years down the road to see how cloud has been adopted and how it is being utilized in the life sciences. Certainly cloud will be an integral part of the CIO's portfolio, and a much larger portion of the budget will be allocated to cloud computing technologies than we are seeing today. There can be no doubt that the life science IT shop of 2016 will look much different than today's, not only in technology and infrastructure but also in staffing and required skill sets.

One thing is clear: cloud is a game changer and a technology that all life science companies need to embrace to remain competitive. Those that do not adapt to the cloud will find themselves unable to keep up with those that do.

About the Author

Bruce Maches is a former Director of Information Technology for Pfizer’s R&D division, current CIO for BRMaches & Associates, and a contributing editor for HPC in the Cloud.

He has written extensively about the role of cloud computing and related technologies in the life sciences in his series in our Behind the Cloud section.
