Cloud Sparking Rapid Evolution of Life Sciences R&D

By Bruce Maches

April 6, 2011

The increasing adoption of cloud computing in its various forms is having a dramatic impact on the way life science CIOs provision the applications their organizations need to support the R&D process.

Life science research has always been a complex and time-consuming endeavor requiring a broad and diverse set of systems and computer applications. The complexity and resource requirements of these specialized applications have grown so dramatically that many life science companies are struggling to afford to build, implement, and support the required systems and infrastructure internally.

Before the cloud, life science companies would simply throw more resources (storage, compute, and people) at the problem. That is no longer possible in today's economic climate, and cloud technologies can provide a viable alternative.

While I have written in more detail about many of the topics herein as part of my extended series of pharma R&D-focused entries over the last year, I wanted to provide some high-level information on the methodology of life science R&D and how IT supports that process. Where appropriate, I will refer to the specific blog article where you can find additional information.

Although the R&D process differs by type of life science company (for example, drug research versus internally used devices such as heart valves versus external devices such as a knee brace), the overall steps are basically the same:

– Research/Discovery: identifying the disease mechanism, compound, target, genome or device needs

– Development: developing and refining the compound, therapeutic, or device

– Phase 1, 2, & 3 Clinical Trials: performing the requisite safety and effectiveness testing of the compound/device

– Regulatory Approval: seeking FDA consent to market the drug/device

– Post Approval Monitoring: tracking the use of the new product, outcomes, and any patient adverse events which must be reported to the FDA

All of these steps involve a multitude of activities and can take years of intensive effort to complete. Each area requires a variety of specialized systems to support the process and to capture and manage the data being generated. The number of systems and processes to support can be quite large depending on the type and complexity of the research being performed.

The life science industry is facing many other challenges besides increased technology needs and complexity. The entire industry is under intense revenue pressure as insurance companies and policy makers try to rein in ever-increasing health costs by demanding discounts or simply reducing reimbursement levels. Developing a major new drug costs close to $1 billion and takes around 10 years, and most potential drug research projects are abandoned at some point during development due to unexpected side effects or insufficient efficacy. Industry averages show that of 1,000 potential compounds identified in the discovery phase as worth pursuing, only one will make it through the process, be granted market approval, and actually be sold.

On top of this dismal success rate, the FDA has increased its scrutiny of new drugs, asking for more in-depth safety studies and trials before granting approval. The exclusivity period, or patent life, lasts only a set number of years, and the clock starts ticking long before approval is granted. On average, a newly approved drug has about 10 years on the market before competitors can begin marketing their own versions.

Many large pharmaceutical companies are dealing with impending major revenue shortfalls as popular drugs come off patent and are opened to generic competition. In 2011 alone, more than a dozen name-brand drugs representing nearly $13 billion in revenue are coming off of patent protection. By far the largest of these is Pfizer's cholesterol drug Lipitor, which generates over $6 billion in revenue.

All of these factors have increased risk, depressing investment in new drug startups while raising the pressure on existing companies to bring new therapeutics to market as quickly as possible. These economic realities have forced life science companies to find ways to reduce costs while increasing productivity and shortening time to market for new products.

The life science CIO faces multiple challenges on a daily basis. Not only are these CIOs being asked to do more with less, they must also deal with issues such as:

– An aging portfolio of legacy systems that must be kept in service because they contain critical data; the FDA requires that any data related to a drug be kept at least two years after it was last sold (think of the challenge of dealing with something like aspirin)

– Ensuring continued regulatory compliance for system-related issues per FDA and HIPAA requirements, along with applicable foreign regulatory guidelines

– Pressure to reduce budgets while meeting increasing needs from the business for responsiveness and agility

– The need to reduce time-to-market for new therapies as every day of market exclusivity can potentially mean millions of dollars in revenue

– Continual vendor and technology changes in the marketplace

– Increasingly complex and resource-intensive applications, along with the explosion of data in R&D; the amount of data managed by life science companies nearly doubles every three months

The combined IT spend for life science companies in the US is over $700 billion per year. Overall budgets have remained flat for the last two years, meaning that life science CIOs, like many of their counterparts in other industries, must do more with less while increasing flexibility and responsiveness to meet business needs.

Regulatory Compliance

One of the more complex and time-consuming issues that life science CIOs have to deal with is ensuring that their systems and applications comply with regulatory agency guidelines. In the US, the FDA provides guidance not only on how drugs are developed, manufactured, and marketed, but also on how the supporting systems must be tested and validated to ensure that the data (i.e., results) contained in them is accurate and can be trusted. In a nutshell, any system containing product data or clinical testing information, or used to create submissions for regulatory approval, must be validated as described in Title 21 of the Code of Federal Regulations (CFR), Part 11.

How to deal with compliance is a major concern whenever new applicable systems, or major modifications to existing validated ones, are being planned. Project plans for new systems must incorporate a significant amount of time to build compliance in, while upgrades to existing systems can require partial or complete re-validation. This can be very expensive, which means that many systems are not retired but kept in service far longer than would normally be expected. You can read more about this aspect of life science IT in this more thorough exploration of the topic.

Given the critical nature of regulatory compliance, life science CIOs must include it as a key piece of their overall cloud strategy or they will face a much more difficult road as they move to the cloud. Even worse, they may follow a path that undermines their ability to ensure compliance, leaving them open to issues being raised during FDA audits.

Impact of Cloud Computing

The first portion of this article was meant to give the reader an overall understanding of the current state of IT in the life sciences: the process, the issues, and some of the challenges. While the impact of cloud computing is similar across a number of industries, I will now address the effect it is having on life science companies specifically.

Costs

The cost considerations regarding the delivery of information technology services are certainly not unique to the life sciences. All CIOs continually deal with budget constraints while having to rationalize and justify the expense of designing, building, provisioning, and supporting the systems and applications their users need. In the life sciences, IT can consume an inordinate share of the total operational budget compared to other industries, which is not surprising given how data-driven the R&D process is. While I was at Pfizer Global R&D, the IT budget consumed over 15% of total R&D expenditures and about 8% of the organization's total headcount. This high level of resource consumption, while perhaps necessary, does take away from the organization's core mission: the science of drug discovery and development.

So, how can cloud computing help life science CIOs bring their costs down? There are a number of ways, and I will briefly describe a few of them below.

IaaS is a key component for reducing direct IT costs and the overall TCO of applications. Provisioning new hardware, along with the data center space, power, and support personnel that go with it, is a major component of the CIO's budget. Life science CIOs should have a clear understanding of their application and project portfolio so they can leverage the technology to reduce infrastructure costs. Using public cloud infrastructure for non-validated applications, especially ones that are very 'bursty' in their resource needs, will save thousands in hardware and support costs, while a private cloud can handle those applications requiring a validated or controlled environment. By having a defined application deployment strategy, life science CIOs can significantly reduce hardware and supporting infrastructure costs.
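To make the point concrete, here is a minimal, purely illustrative sketch of what a portfolio-driven deployment strategy might look like in practice. The applications, fields, and routing rules below are hypothetical examples, not a prescribed methodology.

```python
# Illustrative sketch: routing an application portfolio to deployment targets.
# The applications, fields, and rules are hypothetical, for illustration only.

portfolio = [
    {"name": "compound-screening-analytics", "validated": False, "bursty": True},
    {"name": "clinical-trial-data-capture",  "validated": True,  "bursty": False},
    {"name": "document-management",          "validated": True,  "bursty": False},
    {"name": "molecular-modeling-batch",     "validated": False, "bursty": True},
]

def deployment_target(app):
    """Pick a target based on validation needs and workload profile."""
    if app["validated"]:
        # GxP-validated systems stay in a controlled, qualified environment.
        return "validated private cloud"
    if app["bursty"]:
        # Non-validated, spiky workloads are the easiest public-cloud wins.
        return "public IaaS (on-demand)"
    return "private cloud / internal virtualization"

for app in portfolio:
    print(f"{app['name']:32s} -> {deployment_target(app)}")
```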

SaaS (as described below) can also be a major cost saver for the life sciences, and many vendors now offer specialized R&D applications as validated SaaS systems. A client of mine is a medium-sized biotech running clinical trials on its new drug. One of the major tasks in getting FDA approval is collecting, storing, and collating all of the data and documents that will be part of the NDA (New Drug Application). Normally a company like this would purchase a document management and publication system along with the supporting hardware and administrative resources. My client has chosen (as many others are) to use a SaaS-based document management tool to store their documents and a similarly provisioned publishing tool to pull the documents together and create the files that will be submitted electronically to the FDA. By doing this they were not only able to bring the functionality online almost immediately, they also saved tens of thousands of dollars over doing it in-house.
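As a rough illustration of why the SaaS route is so quick to stand up, the sketch below pushes a trial document into a hosted document repository over a plain HTTPS API. The endpoint, token, and metadata fields are invented for illustration; any real validated eDMS will define its own interface and access controls.

```python
# Hypothetical sketch of pushing a trial document into a SaaS document
# repository over HTTPS. The endpoint, token, and metadata fields are
# invented for illustration; a real validated eDMS defines its own API.
import requests

API_BASE = "https://edms.example-saas-vendor.com/api/v1"   # hypothetical
API_TOKEN = "REPLACE_WITH_ISSUED_TOKEN"

def upload_document(path, study_id, doc_type):
    with open(path, "rb") as fh:
        resp = requests.post(
            f"{API_BASE}/documents",
            headers={"Authorization": f"Bearer {API_TOKEN}"},
            data={"study_id": study_id, "doc_type": doc_type},
            files={"file": fh},
            timeout=60,
        )
    resp.raise_for_status()
    return resp.json()["document_id"]   # assumes the service returns an id

doc_id = upload_document("protocol_v3.pdf", study_id="ABC-123", doc_type="protocol")
print("Stored as", doc_id)
```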

Disaster recovery and cloud backup are also areas where significant cost savings can be realized. Virtual images of critical applications can be built, allowing these systems to be run temporarily in the cloud in case of a disaster. In addition, one of the major burdens of data management, backup and restore, can be reduced by using cloud backups as part of the overall offsite backup strategy. This saves not only on manpower and hardware but also on backup tapes and offsite storage.
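Here is a minimal sketch of the cloud-backup idea, assuming an AWS S3 bucket and pre-configured boto3 credentials; the bucket name and paths are placeholders.

```python
# Minimal sketch of a nightly offsite backup to cloud object storage,
# assuming an S3 bucket and boto3 credentials are already configured.
# Bucket name and paths are placeholders.
import tarfile
from datetime import date

import boto3

SOURCE_DIR = "/data/lims"                      # placeholder data directory
BUCKET = "example-offsite-backups"             # hypothetical bucket
archive = f"lims-backup-{date.today():%Y%m%d}.tar.gz"

# Package the data directory into a compressed archive.
with tarfile.open(archive, "w:gz") as tar:
    tar.add(SOURCE_DIR, arcname="lims")

# Ship the archive offsite; server-side encryption protects it at rest.
boto3.client("s3").upload_file(
    archive, BUCKET, f"nightly/{archive}",
    ExtraArgs={"ServerSideEncryption": "AES256"},
)
print("uploaded", archive)
```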
 
Cloud computing not only reduces costs, it also allows the IT group to be much more agile and responsive to the organization and to deploy the systems and applications that support R&D more quickly. The ultimate goal is to get new therapies out the door sooner, saving money while increasing revenue by getting drugs to market faster and extending the effective exclusivity period.

Regulatory Compliance

A significant amount of the expense and effort in any life science IT shop goes into ensuring that the systems and applications deployed comply with the appropriate regulatory guidelines. While there are a number of ways that cloud computing can assist with compliance, there are two major areas where cloud technologies can have the biggest impact.

Validating and supporting the operating infrastructure of a system is a major component of the compliance effort. Building a validated private cloud environment allows the compliance costs to be spread across multiple systems, reducing hardware costs, data center footprint, and support requirements. Life science CIOs should examine their application portfolio to see which systems can be moved to virtual environments, and deploy new systems only in virtual mode.
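One way to picture how a validated private cloud spreads compliance effort: every new system is provisioned from a pre-qualified "golden" template, and provisioning includes an automated check that the deployed image matches the qualified baseline. The sketch below is illustrative only and is not a substitute for a formal qualification procedure; the checksum and record fields are hypothetical.

```python
# Illustrative sketch (not a validation procedure): confirming that a VM was
# provisioned from a qualified "golden" template by comparing checksums, and
# logging the result as a simple installation-qualification record.
import hashlib
import json
from datetime import datetime, timezone

QUALIFIED_TEMPLATE_SHA256 = "replace-with-checksum-recorded-at-qualification"

def sha256_of(path, chunk=1024 * 1024):
    h = hashlib.sha256()
    with open(path, "rb") as fh:
        for block in iter(lambda: fh.read(chunk), b""):
            h.update(block)
    return h.hexdigest()

def iq_check(image_path, system_name):
    actual = sha256_of(image_path)
    record = {
        "system": system_name,
        "image": image_path,
        "expected_sha256": QUALIFIED_TEMPLATE_SHA256,
        "actual_sha256": actual,
        "result": "PASS" if actual == QUALIFIED_TEMPLATE_SHA256 else "FAIL",
        "checked_at": datetime.now(timezone.utc).isoformat(),
    }
    print(json.dumps(record, indent=2))
    return record

iq_check("/var/lib/images/lims-node.qcow2", "LIMS application node")
```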

Legacy Systems

The management and maintenance of existing legacy systems is a huge headache for the life science CIO. In large IT shops, a majority of the personnel and funding goes to supporting systems that have been in service for 5, 10, even 15 years. It is not unusual to walk into a big pharma data center and see nameplates from companies past; Wang, DEC, and Compaq systems are just a few that I have seen recently. The primary cause is the time and expense that went into validating these systems when they were first brought online. There is usually budget for new systems, but not budget to re-validate upgraded systems or to provide a validated method for transferring data from a system being retired to a new application. Instead, quite often, new systems are deployed and layered on top of existing applications, which must be kept alive because they contain critical information that must be available on demand.

While cloud computing is no miracle cure for this problem, a potential solution would be to create a validated private cloud environment and build the appropriate VM flavors to move these legacy applications into a virtual state. This would allow the life science CIO to retire the old hardware and free up both support resources and data center space, as I touched on in this entry.
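For CIOs wondering where to start, a simple prioritization exercise can help. The hypothetical sketch below ranks legacy systems for physical-to-virtual (P2V) migration by hardware age, annual support cost, and validation status; the systems, criteria, and weights are invented for illustration only.

```python
# Hypothetical sketch of ranking legacy systems for P2V migration into a
# validated private cloud. Systems, scoring criteria, and weights are
# invented for illustration only.

legacy_systems = [
    {"name": "legacy LIMS (DEC Alpha)", "hw_age_yrs": 14, "support_cost": 90_000, "validated": True},
    {"name": "archived stability data", "hw_age_yrs": 9,  "support_cost": 35_000, "validated": True},
    {"name": "old ELN file server",     "hw_age_yrs": 7,  "support_cost": 20_000, "validated": False},
]

def migration_priority(system):
    # Older hardware and higher support cost push a system up the list;
    # validated systems score a bit lower because re-validation adds effort.
    score = system["hw_age_yrs"] * 10 + system["support_cost"] / 1_000
    if system["validated"]:
        score -= 15
    return score

for system in sorted(legacy_systems, key=migration_priority, reverse=True):
    print(f"{migration_priority(system):6.1f}  {system['name']}")
```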

SaaS

Many firms that sell specialized applications into the life science space are now provisioning their applications via the SaaS model. While SaaS has been around for quite a while in a variety of forms, it is the ability to quickly provide users with state-of-the-art applications that appeals to a life science CIO. Beyond the normal advantages of SaaS, life science CIOs can access application environments that are either pre-validated or in a semi-validated state, significantly reducing the resources and time required to provision a new application. Over the last year I've weighed in quite a bit on the role of SaaS solutions in the industry.
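When evaluating these offerings, a simple capability screen against Part 11-style requirements (audit trails, electronic signatures, a vendor validation package, data portability) can quickly separate candidates from non-starters. The sketch below is a hypothetical example of such a screen; the vendors and answers are made up.

```python
# Illustrative sketch: screening SaaS vendors against Part 11-style
# capabilities. Vendors and answers are hypothetical.

REQUIRED = ["audit_trail", "e_signatures", "validation_package", "data_export"]

vendors = {
    "VendorA eTMF":      {"audit_trail": True, "e_signatures": True,  "validation_package": True,  "data_export": True},
    "VendorB analytics": {"audit_trail": True, "e_signatures": False, "validation_package": False, "data_export": True},
}

for name, caps in vendors.items():
    missing = [c for c in REQUIRED if not caps.get(c)]
    status = "candidate" if not missing else f"gaps: {', '.join(missing)}"
    print(f"{name:20s} {status}")
```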

Impediments

So what are some of the factors impeding the adoption of cloud computing in the life sciences? Not surprisingly, they are the usual suspects: questions around security, protection of intellectual property, vendor lock-in, latency, and so on. The biggest factor is how to deal with validation in an environment that is not under your control. Providing the necessary validation in a public cloud is difficult at best, although some IaaS vendors are contemplating private cloud offerings that would include validation of the physical environment, and there are companies offering pre-validated software images that can be loaded on demand. This type of pre-validated hosted environment would be extremely appealing as it greatly reduces the cost and effort of deploying new R&D applications, which I cover in more detail here.

Startups & Small Biotechs

Some of you may get the impression that only large companies can appropriately leverage cloud computing. While it is true that larger IT shops may gain more from it, small companies can also benefit from cloud technologies. Many smaller companies have a core strategy of fulfilling as much of their IT needs as possible from the cloud first and internally second. In a way, smaller companies have an advantage: they do not have an inventory of legacy applications or entrenched people and processes to deal with.

I have two small biotech clients using this strategy. One has taken it to an extreme: if you walked into their offices, all you would find are several wireless access points and printers. There are no servers, no desktops, no phones, and no need for IT administrative support. Everybody brings in a laptop and cell phone, and all normal IT services are provided via IaaS or SaaS vendors. Even their phone system is a SaaS-provisioned VoIP PBX connected to their cell phones. My other client has different needs, but a major portion of both their infrastructure and applications is still accessed via the internet, a matter I discussed a while back.

In Closing

So, how is cloud computing impacting the life sciences IT organization? Certainly the changes being wrought by the cloud are not unique to the life sciences, but cloud is changing how life science CIOs provide the systems their users need. Using cloud as an integral part of their overall IT strategy gives life science CIOs a major tool for reducing costs, responding quickly to user needs, easing the burden of regulatory compliance, and supporting the complex process of life science R&D.

Many forward-thinking CIOs are already incorporating cloud into their current application portfolios and long-term strategic plans. Those who have a clear and direct strategy for utilizing cloud, and who are aggressively looking at cloud technologies to help solve their problems, will be much more successful than those who are flying by the seat of their pants.

Now, what does the future hold? It would be great to be able to look five years down the road to see how cloud has been adopted and how it is being utilized in the life sciences. Certainly cloud will be an integral part of the CIO's portfolio, and a much larger portion of the budget will be allocated to cloud computing technologies than we see today. There can be no doubt that the life science IT shop of 2016 will be much different from today's, not only in technology and infrastructure but also in staffing and required skill sets.

One thing is clear: cloud is a game changer and a technology that all life science companies need to embrace to remain competitive. Those that do not adapt to the cloud will find themselves unable to keep up with those that do.

About the Author

Bruce Maches is a former Director of Information Technology for Pfizer's R&D division, current CIO for BRMaches & Associates, and a contributing editor for HPC in the Cloud.

He has written extensively about the role of cloud computing and related technologies in the life sciences in his series in our Behind the Cloud section.
