Cloud Sparking Rapid Evolution of Life Sciences R&D

By Bruce Maches

April 6, 2011

The increasing adoption of cloud computing in its various forms is having a dramatic impact on the way life science CIOs provision the applications their organizations need to support the R&D process.

Life science research has always been a complex and time-consuming endeavor requiring a broad and diverse set of systems and applications. The complexity and resource requirements of these specialized applications have grown so dramatically that many life science companies are struggling to afford to build, implement, and support the required systems and infrastructure internally.

Before the cloud, life science companies would simply throw more resources (storage, compute capacity, and people) at the problem. That is no longer possible in today's economic climate, and cloud technologies can provide a viable alternative.

While I have written in more detail about many of the topics herein as part of my extended series of pharma R&D-focused entries over the last year, I wanted to provide some high-level information on the methodology of life science R&D and how IT supports that process. Where appropriate I will refer to the specific blog article where you can find additional information.

Although the R&D process differs by type of life science company (for example, drug research versus internal devices such as heart valves versus external devices such as a knee brace), the overall steps are basically the same:

– Research/Discovery: identifying the disease mechanism, compound, target, genome or device needs

– Development: developing and refining the compound, therapeutic, or device

– Phase 1, 2, & 3 Clinical Trials: performing the requisite safety and effectiveness testing of the compound/device

– Regulatory Approval: seeking FDA consent to market the drug/device

– Post Approval Monitoring: tracking the use of the new product, outcomes, and any patient adverse events which must be reported to the FDA

All of these steps involve a multitude of activities and can take years of intensive effort to complete. Each area requires a variety of specialized systems to support the process and to capture and manage the data being generated. The number of systems and processes to support can be quite large depending on the type and complexity of the research being performed.

The life science industry is facing many other challenges besides increased technology needs and complexity. The entire industry is under intense revenue pressure as insurance companies and policy makers try to rein in ever-increasing health care costs by demanding discounts or simply reducing reimbursement levels. Developing a major new drug costs close to $1 billion and takes around 10 years, and most potential drug research projects are abandoned at some point during development due to unexpected side effects or insufficient efficacy. Industry averages show that of 1,000 potential compounds identified in the discovery phase as worth pursuing, only one will make it through the process, be granted market approval, and actually be sold.

On top of this dismal success rate, the FDA has increased its scrutiny of new drugs, asking for more in-depth safety studies and trials before granting approval. The exclusivity period, or patent life, lasts only a set number of years, and the clock starts ticking long before approval is granted. On average, a newly approved drug has about 10 years on the market before competitors can start marketing their own versions.

Many large pharmaceutical companies are dealing with impending major revenue shortfalls as popular drugs come off patent and are opened to generic competition. In 2011 alone, over a dozen name-brand drugs representing nearly $13 billion in revenue are coming off patent protection. The largest by far is Pfizer's cholesterol drug Lipitor, which generates over $6 billion in revenue.

All of these factors have increased risk, depressing investment in new drug startups while increasing the pressure on existing companies to bring their new therapeutics to market as quickly as possible. These economic realities have forced life science companies to find ways to reduce costs while increasing productivity and reducing time to market for new products.

Life science CIOs face multiple challenges on a daily basis. Not only are they being asked to do more with less, they must also deal with issues such as:

– An aging portfolio of legacy systems that must be kept in service because they contain critical data; the FDA requires that any data related to a drug be kept for at least two years after it was last sold (think of the challenge of dealing with something like aspirin)

– Ensuring continued regulatory compliance for system-related issues per FDA and HIPAA requirements, along with applicable foreign regulatory guidelines

– Pressure to reduce budgets while meeting increasing needs from the business for responsiveness and agility

– The need to reduce time-to-market for new therapies as every day of market exclusivity can potentially mean millions of dollars in revenue

– Continual vendor and technology changes in the marketplace

– Increasingly complex and resource-intensive applications, along with the explosion of data in R&D; the amount of data managed by life science companies nearly doubles every three months

The combined IT spend for life science companies in the US is over $700 billion per year. Overall budgets have remained flat for the last two years, meaning that life science CIOs, like many of their counterparts in other industries, must do more with less while increasing flexibility and responsiveness to meet business needs.

Regulatory Compliance

One of the more complex and time-consuming issues that life science CIOs have to deal with is ensuring that their systems and applications comply with regulatory agency guidelines. In the US, the FDA not only provides guidance on how drugs are developed, manufactured, and marketed, but also on how the supporting systems must be tested and validated to ensure that the data (i.e., results) they contain can be trusted and is accurate. In a nutshell, this requires that any system containing product data or clinical testing information, or used to create submissions for regulatory approval, must be validated as described in 21 CFR Part 11 of the Code of Federal Regulations.

How to deal with compliance is a major concern whenever new applicable systems, or major modifications to existing validated ones, are being planned. Project plans for new systems must incorporate a significant amount of time to build compliance in, while upgrades to existing systems can require partial or complete re-validation. This can be very expensive, which means that many systems are not retired but are kept in service far longer than would normally be expected. You can read more about this aspect of life science IT in my earlier, more thorough exploration of the topic.

Given the critical nature of regulatory compliance, life science CIOs must include it as a key piece of their overall cloud strategy or they will face a much more difficult road as they move to the cloud. Even worse, they may follow a path that impacts their ability to ensure compliance, leaving them open to issues being raised during FDA audits.

Impact of Cloud Computing

The first portion of this article was meant to give the reader an overall understanding of the current state of IT in the life sciences: the process, the issues, and some of the challenges. While the impact of cloud computing is similar across a number of industries, I will now address the effect it is having on life science companies specifically.

Costs

The cost considerations around delivering information technology services are certainly not unique to the life sciences. All CIOs continually deal with budget constraints while having to rationalize and justify the expense it takes to design, build, provision, and support the systems and applications their users need. In the life sciences, IT can consume an inordinate amount of the total operational budget compared to other industries. This is not surprising given that the R&D process is so data driven. While I was at Pfizer Global R&D, the IT budget consumed over 15% of total R&D expenditures and about 8% of the organization's total headcount. This high level of resource consumption, while perhaps necessary, does take away from the organization's core mission, which is the science of drug discovery and development.

So, how can cloud computing help life science CIOs bring their costs down? There are a number of ways, and I will describe a few of them briefly below.

IaaS is a key component for reducing direct IT costs and the overall TCO of applications. Provisioning new hardware, along with the data center space, power, and support personnel it requires, is a major component of the CIO's budget. Life science CIOs should have a clear understanding of their application and project portfolio so they can leverage the technology to reduce infrastructure costs. Using public cloud infrastructure for non-validated applications, especially ones that are very 'bursty' in their resource needs, will save thousands in hardware and support costs. A private cloud can be leveraged for those internal applications requiring a validated or controlled environment. By having a defined application deployment strategy, life science CIOs can significantly reduce costs for hardware and supporting infrastructure.
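To make that deployment strategy concrete, below is a minimal sketch in Python of the kind of portfolio triage it implies: validated workloads land in a controlled private cloud, non-validated bursty workloads go to public IaaS, and everything else stays on standard virtualized infrastructure. The application names, attributes, and target categories are illustrative assumptions on my part, not a prescribed framework.

    # Minimal sketch of an application-triage helper for a cloud deployment
    # strategy. Categories and rules are illustrative assumptions; real
    # decisions would also weigh security, data residency, cost, and more.

    from dataclasses import dataclass

    @dataclass
    class App:
        name: str
        requires_validation: bool   # subject to 21 CFR Part 11 controls?
        bursty: bool                # highly variable compute/storage demand?

    def deployment_target(app: App) -> str:
        """Suggest a landing zone for an application in the portfolio."""
        if app.requires_validation:
            # Validated workloads stay in a controlled, validated private cloud.
            return "validated private cloud"
        if app.bursty:
            # Non-validated, bursty workloads fit public IaaS capacity on demand.
            return "public IaaS"
        return "standard virtualized infrastructure"

    # Hypothetical portfolio entries for illustration only.
    portfolio = [
        App("molecular modeling burst cluster", requires_validation=False, bursty=True),
        App("clinical document management", requires_validation=True, bursty=False),
        App("departmental reporting", requires_validation=False, bursty=False),
    ]

    for app in portfolio:
        print(f"{app.name}: {deployment_target(app)}")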

SaaS (as described below) can also be a major cost saver for the life sciences. Many vendors are offering specialized R&D applications as validated SaaS systems. A client of mine is a medium-sized biotech running clinical trials on its new drug. One of the major tasks in getting FDA approval is collecting, storing, and collating all of the data and documents that will be part of the NDA (New Drug Application). Normally a company like this would purchase a document management and publication system along with the supporting hardware and administrative resources. My client has chosen (as many others are) to use a SaaS-based document management tool to store their documents and a similarly provisioned publishing tool to pull the documents together and create the files that will be electronically submitted to the FDA. By doing this they were not only able to bring the functionality online almost immediately, they also saved tens of thousands of dollars over doing it in-house.

Disaster recovery and cloud backup are also areas where significant cost savings can be realized. Virtual images of critical applications can be built, allowing these systems to run temporarily in the cloud in the event of a disaster. Also, one of the major burdens of data management, backup and restore, can be reduced by using cloud backups as part of the overall offsite backup strategy. This saves not only on manpower and hardware but also on backup tapes and offsite storage.
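As an illustration, the offsite-backup piece can be as simple as the following Python sketch, which pushes nightly backup archives to cloud object storage. The bucket name and directory path are hypothetical, and it assumes the boto3 library is installed and AWS credentials are configured; a real deployment would add key management, retention policies, and whatever validation documentation the data requires.

    # Minimal sketch of pushing nightly backup archives to cloud object
    # storage as part of an offsite backup strategy. Bucket name and paths
    # are hypothetical examples.

    import boto3
    from pathlib import Path

    s3 = boto3.client("s3")
    BUCKET = "example-offsite-backups"   # hypothetical bucket name

    def upload_backups(backup_dir: str, prefix: str = "nightly/") -> None:
        """Upload every backup archive in backup_dir to S3 under the given prefix."""
        for archive in sorted(Path(backup_dir).glob("*.tar.gz")):
            key = prefix + archive.name
            # Request server-side encryption so archives are encrypted at rest.
            s3.upload_file(
                str(archive), BUCKET, key,
                ExtraArgs={"ServerSideEncryption": "AES256"},
            )
            print(f"uploaded {archive.name} -> s3://{BUCKET}/{key}")

    # Example usage (hypothetical path):
    # upload_backups("/var/backups/lims")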
 
Cloud computing can not only reduce costs, it also allows the IT group to be much more agile and responsive to the organization and to deploy the systems and applications that support R&D more quickly. The ultimate goal is to get new therapies out the door faster, not only saving money but also increasing revenue by getting drugs to market sooner and extending the effective exclusivity period.

Regulatory Compliance

A significant amount of the expense and effort in any life science IT shop goes into ensuring that the systems and applications deployed comply with the appropriate regulatory guidelines. While there are a number of ways that cloud computing can assist with compliance, there are two major areas where cloud technologies can have the biggest impact.

Validating and supporting a system's operating infrastructure is a major component of the compliance effort. Building a validated private cloud environment allows the compliance costs to be leveraged across multiple systems, reducing hardware costs, data center footprint, and support requirements. Life science CIOs should examine their portfolio of applications to see which ones can be moved to virtual environments, and should deploy new systems only in virtualized form.

Legacy Systems

The management and maintenance of existing legacy systems is a huge headache for the life science CIO. In large IT shops, a majority of the personnel and funding goes to supporting systems that have been in service for 5, 10, even 15 years. It is not unusual to walk into a big pharma data center and see nameplates from companies past; Wang, DEC, and Compaq systems are just a few that I have seen recently. The primary cause of this is the time and expense that went into validating these systems when they were first brought online. There is usually budget for new systems but not budget to re-validate upgraded systems or to provide a validated method for transferring data from a system being retired to a new application. Instead, quite often, new systems are deployed and layered on top of existing applications, which must be kept alive because they contain critical information that must be available on demand.

While cloud computing is no miracle cure for this problem, a potential solution is to create a validated private cloud environment and build the appropriate VM flavors to move these legacy applications into a virtual state. This would allow the life science CIO to retire the old hardware and free up both the support resources and space in the data center, as I touched on in an earlier entry.
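As a rough illustration of the virtualization step, the sketch below (Python wrapping the qemu-img command-line tool) converts a raw disk image captured from legacy hardware into a qcow2 image that can run under a private-cloud hypervisor. The paths are hypothetical, qemu-img must be installed, and re-qualifying the virtualized system for compliance remains a separate effort.

    # Minimal sketch of a physical-to-virtual (P2V) conversion step for a
    # legacy system: turn a captured raw disk image into qcow2 format for a
    # hypervisor in a validated private cloud. Paths are hypothetical.

    import subprocess

    def convert_to_qcow2(raw_image: str, qcow2_image: str) -> None:
        """Convert a raw disk image to qcow2 using the qemu-img CLI."""
        subprocess.run(
            ["qemu-img", "convert", "-f", "raw", "-O", "qcow2", raw_image, qcow2_image],
            check=True,
        )

    # Example usage (hypothetical paths):
    # convert_to_qcow2("/images/legacy-lims.raw", "/images/legacy-lims.qcow2")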

SaaS

Many firms that sell specialized applications into the life science space are now provisioning those applications via the SaaS model. While SaaS has been around for quite a while in a variety of forms, it is the ability to quickly provide users with state-of-the-art applications that appeals to the life science CIO. Beyond the normal advantages of SaaS, life science CIOs can access application environments that are either pre-validated or in a semi-validated state, significantly reducing the resources and time required to provision a new application. Over the last year I have weighed in quite a bit on the role of SaaS solutions in the industry.

Impediments

So what are some of the factors impeding the adoption of cloud computing in the life sciences? Not surprisingly, they are the usual suspects: questions around security, protection of intellectual property, vendor lock-in, latency, etc. The biggest factor is how to deal with validation in an environment that is not under your control. Public clouds make it difficult at best to provide the necessary validation, although some IaaS vendors are contemplating private cloud offerings that would include validation of the physical environment, and there are companies offering pre-validated software images that can be loaded on demand. This type of pre-validated hosted environment would be extremely appealing, as it greatly reduces the cost and effort of deploying new R&D applications, a topic I have covered in more detail previously.

Startups & Small Biotechs

Some of you may get the impression that only large companies can appropriately leverage cloud computing. While it is true that larger IT shops may gain more from cloud computing, small companies can also benefit from cloud technologies. Many smaller companies have made it a core strategy to fulfill as much of their IT needs as possible from the cloud first and internally second. In a way, smaller companies have an advantage, as they do not have an inventory of legacy applications or entrenched people and processes to deal with.

I have two small biotech clients using this strategy. One has taken it to an extreme: if you went into their offices, all you would find are several wireless access points and printers. There are no servers, no desktops, no phones, and no need for IT administrative support. Everybody brings in their laptop and cell phone, and all normal IT services are provided via IaaS or SaaS vendors. Even their phone system is a SaaS-provisioned VoIP PBX connected to their cell phones. My other client has different needs, but a major portion of both their infrastructure and applications is still accessed via the internet, a matter I discussed a while back.

In Closing

So, how is cloud computing impacting the life sciences IT organization? Certainly the changes being wrought by cloud computing are not unique to the life sciences, but cloud is changing how life science CIOs provide the systems their users need. Utilizing cloud as an integral part of their overall IT strategy gives life science CIOs a major tool for reducing costs, responding quickly to user needs, easing the burden of regulatory compliance, and supporting the complex process of life science R&D.

Many forward-thinking CIOs are already incorporating cloud into their current application portfolios and long-term strategic plans. Those who have a clear and direct strategy for utilizing cloud and are aggressively looking at cloud technologies to help solve their problems will be much more successful than those who are flying by the seat of their pants.

Now what does the future hold? It would be great to be able to look five years down the road to see how cloud has been adopted and how it is being utilized in the life sciences. Certainly cloud will be an integral part of the CIO's portfolio, and a much larger portion of the budget will be allocated to cloud computing technologies than we are seeing today. There can be no doubt that the life science IT shop of 2016 will be much different from today's, not only in technology and infrastructure but also in staffing and required skill sets.

One thing is clear: cloud is a game changer and a technology that all life science companies need to embrace to remain competitive. Those that do not adapt to the cloud will find themselves unable to keep up with those that do.

About the Author

Bruce Maches is a former Director of Information Technology for Pfizer’s R&D division, current CIO for BRMaches & Associates and a contributing editor for HPC in the Cloud.

He has written extensively about the role of cloud computing and related technologies in the life sciences in his series in our Behind the Cloud section.
