Cloud Sparking Rapid Evolution of Life Sciences R&D

By Bruce Maches

April 6, 2011

The increasing adoption of cloud computing in its various forms is having a dramatic impact on the way life science CIOs provision the applications their organizations need to support the R&D process.

Life science research has always been a very complex and time-consuming endeavor requiring a broad and diverse set of systems and computer applications. The complexity and resource requirements of these specialized applications have grown so tremendously that many life science companies are struggling to afford to build, implement, and support much of the required systems and infrastructure internally.

Before the cloud, life science companies would simply throw more resources (storage, compute power, and people) at the problem. That is no longer possible in today's economic climate, and cloud technologies can provide a viable alternative.

While I have written in more detail about many of the topics herein as part of my extended series of pharma R&D-focused entries over the last year, I wanted to provide some high-level information on the methodology of life science R&D and how IT supports that process. Where appropriate, I will refer to the specific blog article where you can find additional information.

Although the R&D process differs by type of life science company (for example, drug research versus internally used devices such as heart valves versus external devices such as a knee brace), the overall steps are basically the same:

– Research/Discovery: identifying the disease mechanism, compound, target, genome or device needs

– Development: developing and refining the compound, therapeutic, or device

– Phase 1, 2, and 3 Clinical Trials: performing the requisite safety and effectiveness testing of the compound/device

– Regulatory Approval: seeking FDA consent to market the drug/device

– Post Approval Monitoring: tracking the use of the new product, outcomes, and any patient adverse events which must be reported to the FDA

All of these steps involve a multitude of activities and can take years of intensive effort to complete. Each area requires a variety of specialized systems to support the process and to capture and manage the data being generated. The number of systems and processes to support can be quite large depending on the type and complexity of the research being performed.

The life science industry is facing many other challenges besides increased technology needs and complexity. The entire industry is under intense revenue pressure as insurance companies and policy makers try to rein in ever-increasing health care costs by demanding discounts or simply reducing reimbursement levels. Developing a major new drug costs close to $1 billion and takes around 10 years, and most drug research projects are abandoned at some point during development due to unexpected side effects or insufficient efficacy. Industry averages show that of 1,000 potential compounds identified in the discovery phase as worth pursuing, only one will make it through the process, be granted market approval, and actually be sold.

On top of this dismal success rate, the FDA has increased its scrutiny of new drugs, asking for more in-depth safety studies and trials before granting approval. The exclusivity period or patent life lasts only a set number of years, and the clock starts ticking long before approval is granted. On average, a newly approved drug has about 10 years on the market before competitors can start marketing their own versions.

Many large pharmaceutical companies are dealing with impending major revenue shortfalls as popular drugs come off patent and are opened to generic competition. In 2011 alone, over a dozen name-brand drugs representing nearly $13 billion in revenue are coming off patent protection. The largest of these by far is Pfizer’s cholesterol drug Lipitor, which generates over $6 billion in revenue.

All of these factors have increased risk, depressing investment in new drug startups while raising the pressure on existing companies to bring new therapeutics to market as quickly as possible. These economic realities have forced life science companies to find ways to reduce costs while at the same time increasing productivity and reducing time to market for new products.

As for the life science CIO, there are multiple challenges to be dealt with on a daily basis. Not only are these CIOs being asked to do more with less, they also must deal with issues such as:

– An aging portfolio of legacy systems that must be kept in service because they contain critical data; the FDA requires that any data related to a drug be kept for at least two years after it was last sold (think of the challenge of dealing with something like aspirin)

– Ensuring continued regulatory compliance for system related issues per FDA & HIPAA along with applicable foreign regulatory guidelines

– Pressure to reduce budgets while meeting increasing needs from the business for responsiveness and agility

– The need to reduce time-to-market for new therapies as every day of market exclusivity can potentially mean millions of dollars in revenue

– Continual vendor and technology changes in the marketplace

– Increasingly complex and resource-intensive applications, along with the explosion of data in R&D; the amount of data managed by life science companies nearly doubles every three months

The combined IT spend for life science companies in the US is over $700 billion per year. Overall budgets have remained flat the last two years, meaning that life science CIOs, like many of their counterparts in other industries, must do more with less while increasing flexibility and responsiveness to meet business needs.

Regulatory Compliance

One of the more complex and time-consuming issues that life science CIOs have to deal with is ensuring that their systems and applications comply with regulatory agency guidelines. In the US, the FDA not only provides guidance on how drugs are developed, manufactured, and marketed but also on how the supporting systems must be tested and validated to ensure that the data (i.e., results) they contain is accurate and can be trusted. In a nutshell, any system containing product data or clinical testing information, or used to create submissions for regulatory approval, must be validated as described in Title 21 of the Code of Federal Regulations (CFR), Part 11.
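
Part 11 does not prescribe any particular implementation, but one of the controls it calls for is a secure, time-stamped audit trail of operator actions on electronic records. Purely as a hypothetical illustration (the field names and hash-chaining scheme below are my own invention, not anything mandated by the regulation), recording such an entry might look like this:

```python
import hashlib
import json
from datetime import datetime, timezone

def append_audit_entry(log_path, user, action, record_id, previous_hash):
    """Append a time-stamped, hash-chained audit-trail entry.

    Hash-chaining is one hypothetical way to make after-the-fact tampering
    detectable; Part 11 requires secure, time-stamped audit trails but does
    not mandate a specific mechanism.
    """
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "action": action,          # e.g. "create", "modify", "sign"
        "record_id": record_id,
        "previous_hash": previous_hash,
    }
    entry["entry_hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    with open(log_path, "a") as log:
        log.write(json.dumps(entry) + "\n")
    return entry["entry_hash"]
```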

How to deal with compliance is a major concern whenever new applicable systems, or major modifications to existing validated ones, are being planned. Project plans for new systems must incorporate a significant amount of time to build compliance in, while upgrades to existing systems can require partial or complete re-validation. This can be very expensive, which means that many systems are not retired but kept in service much longer than would normally be expected. You can read more about this aspect of life science IT in this more thorough exploration of the topic.

Given the critical nature of regulatory compliance, life science CIOs must include it as a key piece of their overall cloud strategy or face a much more difficult road as they move to the cloud. Even worse, they may follow a path that impairs their ability to ensure compliance, leaving them open to issues being raised during FDA audits.

Impact of Cloud Computing

The first portion of this article was meant to give the reader an overall understanding of the current state of IT in the life sciences: the process, the issues, and some of the challenges. While the impact of cloud computing is similar across a number of industries, I will now address the effect it is having on life science companies specifically.

Costs

The cost considerations around delivering information technology services are certainly not unique to the life sciences. All CIOs continually deal with budget constraints while having to rationalize and justify the expense of designing, building, provisioning, and supporting the systems and applications their users need. In the life sciences, IT can consume an inordinate amount of the total operational budget compared to other industries. This is not surprising given that the R&D process is so data-driven. While I was at Pfizer Global R&D, the IT budget consumed over 15% of total R&D expenditures and about 8% of the organization's total headcount. This high level of resource consumption, while perhaps necessary, does take away from the organization's core mission: the science of drug discovery and development.

So, how can cloud computing help life science CIOs bring their costs down? There are a number of ways, and I will describe a few of them briefly below.

IaaS is a key component for reducing direct IT costs and the overall TCO of applications. Provisioning new hardware, along with the data center space, power, and support personnel it requires, is a major component of the CIO's budget. Life science CIOs should have a clear understanding of their application and project portfolio so they can leverage the technology to reduce infrastructure costs. Using public cloud infrastructure for non-validated applications, especially ones that are very 'bursty' in their resource needs, can save thousands in hardware and support costs, as sketched below. A private cloud can be leveraged for those internal applications requiring a validated or controlled environment. By having a defined application deployment strategy, life science CIOs can significantly reduce costs for hardware and supporting infrastructure.
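
To make the 'bursty' case concrete, here is a minimal sketch of renting public-cloud capacity for a short-lived, non-validated analysis run and releasing it as soon as the work is done. It uses the boto library against Amazon EC2 purely as an example provider; the AMI ID, key pair, region, and instance count are placeholders, not recommendations.

```python
# Minimal sketch: rent public-cloud capacity for a short-lived, non-validated
# analysis burst, then release it. Assumes the boto library, AWS credentials
# configured in the environment, and placeholder AMI/key-pair values.
import time
import boto.ec2

conn = boto.ec2.connect_to_region("us-east-1")

# Launch four worker instances only for the duration of the burst.
reservation = conn.run_instances(
    "ami-00000000",              # placeholder machine image with the analysis tools
    min_count=4, max_count=4,
    instance_type="c1.xlarge",
    key_name="rd-burst-key",     # placeholder key pair
)
workers = reservation.instances

try:
    # ... dispatch the analysis jobs to the workers here ...
    time.sleep(60)               # stand-in for the real workload
finally:
    # Terminate the instances so costs stop the moment the burst is done.
    conn.terminate_instances(instance_ids=[i.id for i in workers])
```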

SaaS (described further below) can also be a major cost saver for the life sciences. Many vendors now offer specialized R&D applications as validated SaaS systems. A client of mine is a medium-sized biotech running clinical trials on its new drug. One of the major tasks in getting FDA approval is collecting, storing, and collating all of the data and documents that will be part of the NDA (New Drug Application). Normally, a company like this would purchase a document management and publication system along with the supporting hardware and administrative resources. My client has chosen (as many others are) to use a SaaS-based document management tool to store their documents and a similarly provisioned publishing tool to pull the documents together and create the files that will be submitted electronically to the FDA. By doing this they were not only able to bring the functionality online almost immediately, they also saved tens of thousands of dollars over doing it in-house.
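
As a rough illustration of how such a SaaS document repository gets used in practice, the sketch below pushes a study report to a hypothetical vendor REST API. The endpoint, authentication scheme, metadata fields, and response shape are invented for illustration; any real vendor's API will differ.

```python
# Hypothetical sketch: pushing a study report into a SaaS document-management
# system over its REST API. The endpoint, token, and fields are invented.
import requests

API_BASE = "https://docs.example-saas.com/api/v1"   # placeholder vendor URL
API_TOKEN = "..."                                    # issued by the vendor

def upload_submission_document(path, study_id, doc_type):
    with open(path, "rb") as f:
        response = requests.post(
            API_BASE + "/documents",
            headers={"Authorization": "Bearer " + API_TOKEN},
            files={"file": f},
            data={"study_id": study_id, "doc_type": doc_type},
        )
    response.raise_for_status()
    return response.json()["document_id"]   # assumed response shape

# Example: register a clinical study report destined for the NDA dossier.
doc_id = upload_submission_document(
    "csr_study_001.pdf", "STUDY-001", "clinical-study-report"
)
```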

Disaster recovery and cloud backup are also areas where significant cost savings can be realized. Virtual images of critical applications can be built, allowing those systems to run temporarily in the cloud in case of a disaster. In addition, one of the major costs of data management, backup and restore, can be reduced by using cloud backups as part of the overall offsite backup strategy. This saves not only on manpower and hardware but also on backup tapes and offsite storage.
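
A minimal sketch of the cloud-backup piece, again using the boto library against Amazon S3 as one possible storage provider, might look like the following; the bucket name and archive path are placeholders.

```python
# Minimal sketch: ship nightly backup archives to cloud object storage as the
# offsite copy. Uses boto against Amazon S3 as one example provider; the
# bucket name and archive path are placeholders.
import os
from datetime import date

import boto
from boto.s3.key import Key

conn = boto.connect_s3()                             # credentials from the environment
bucket = conn.get_bucket("example-offsite-backups")  # placeholder, pre-created bucket

def ship_backup(archive_path):
    key = Key(bucket)
    key.key = "nightly/%s/%s" % (date.today().isoformat(),
                                 os.path.basename(archive_path))
    key.set_contents_from_filename(archive_path)     # stream the archive to object storage
    return key.key

ship_backup("/backups/lims_db_full.tar.gz")          # e.g. last night's database dump
```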
 
Cloud computing not only reduces costs, it also allows the IT group to be much more agile and responsive to the organization and to deploy the systems and applications supporting R&D more quickly. The ultimate goal is to get new therapies out the door sooner, not only saving money but also increasing revenue by bringing drugs to market faster and extending the effective exclusivity period.

Regulatory Compliance

A significant amount of the expense and effort in any life science IT shop goes into ensuring that the systems and applications deployed comply with the appropriate regulatory guidelines. While there are a number of ways that cloud computing can assist with compliance, there are two major areas where cloud technologies can have the biggest impact.

Validating and supporting the operating infrastructure of a system is a major component of the compliance effort. Building a validated private cloud environment allows the compliance costs of that environment to be leveraged across multiple systems. This reduces hardware costs, data center footprint, and support requirements. Life science CIOs should examine their application portfolio to see which systems can be moved to virtual environments, and should deploy new systems only in virtual mode, along the lines of the triage sketched below.
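
As a starting point, that portfolio triage might look something like the following sketch. The attributes and rules are invented purely for illustration; a real assessment would weigh many more factors (data sensitivity, interfaces, vendor support, re-validation cost, and so on).

```python
# Hypothetical sketch: a first-pass triage of an application portfolio into
# deployment targets. The attributes and rules are illustrative only.
def deployment_target(app):
    if app["gxp_validated"]:
        # Validated systems stay inside the validated private cloud so the
        # qualification effort on that environment is shared across them.
        return "validated private cloud"
    if app["bursty"] and not app["contains_ip"]:
        return "public IaaS"
    return "standard private cloud / virtualized on premises"

portfolio = [
    {"name": "clinical data management", "gxp_validated": True,  "bursty": False, "contains_ip": True},
    {"name": "molecular docking runs",   "gxp_validated": False, "bursty": True,  "contains_ip": False},
    {"name": "ELN archive",              "gxp_validated": False, "bursty": False, "contains_ip": True},
]

for app in portfolio:
    print(app["name"], "->", deployment_target(app))
```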

Legacy Systems

The management and maintenance of existing legacy systems is a huge headache for the life science CIO. In large IT shops, a majority of the personnel and funding goes to supporting systems that have been in service for 5, 10, even 15 years. It is not unusual to walk into a big pharma data center and see nameplates from companies past; Wang, DEC, and Compaq systems are just a few that I have seen recently. The primary cause of this is the time and expense that went into validating these systems when they were first brought online. There is usually budget for new systems, but not to re-validate upgraded systems or to provide a validated method for transferring data from a system being retired to a new application. Instead, quite often, new systems are deployed and layered on top of existing applications, which must be kept alive because they contain critical information that must be available on demand.

While cloud computing is no miracle cure for this problem, a potential solution is to create a validated private cloud environment and build the appropriate VM images to move these legacy applications into a virtual state. This would allow the life science CIO to retire the old hardware and free up both support resources and data center space, as I touched on in this entry.
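
As a concrete (and simplified) illustration, the sketch below uses the libvirt Python bindings to register a converted legacy system as a virtual machine on a KVM host in such a private cloud. The domain definition is a bare-bones placeholder, and the physical-to-virtual conversion of the old server's disk, along with the validation documentation around it, is assumed to have been done already.

```python
# Hypothetical sketch: register a converted legacy system as a VM on a
# KVM-based private-cloud host using the libvirt Python bindings. Assumes the
# converted disk image already sits at the placeholder path below.
import libvirt

DOMAIN_XML = """
<domain type='kvm'>
  <name>legacy-lims-01</name>
  <memory unit='GiB'>4</memory>
  <vcpu>2</vcpu>
  <os><type arch='x86_64'>hvm</type></os>
  <devices>
    <disk type='file' device='disk'>
      <source file='/var/lib/libvirt/images/legacy-lims-01.qcow2'/>
      <target dev='vda' bus='virtio'/>
    </disk>
    <interface type='network'><source network='default'/></interface>
  </devices>
</domain>
"""

conn = libvirt.open("qemu:///system")   # connect to the local hypervisor
domain = conn.defineXML(DOMAIN_XML)     # register the VM definition
domain.create()                         # boot the virtualized legacy system
```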

SAAS

Many firms that sell specialized applications into the life science space now deliver their applications via the SaaS model. While SaaS has been around for quite a while in a variety of forms, it is the ability to quickly provide users with state-of-the-art applications that appeals to a life science CIO. Beyond the normal advantages of SaaS, life science CIOs can access application environments that are either pre-validated or in a semi-validated state, significantly reducing the resources and time required to provision a new application. Over the last year I’ve weighed in quite a bit on the role of SaaS solutions in the industry.

Impediments

So what are some of the factors impeding the adoption of cloud computing in the life sciences? Not surprisingly, they are the usual suspects: questions around security, protection of intellectual property, vendor lock-in, latency, and so on. The biggest factor is how to deal with validation in an environment that is not under your control. Achieving the necessary validation in a public cloud is difficult at best, although some IaaS vendors are contemplating private cloud offerings that would include validation of the physical environment, and there are companies offering pre-validated software images that can be loaded on demand. This type of pre-validated hosted environment would be extremely appealing, as it greatly reduces the cost and effort of deploying new R&D applications, a topic I have covered in more detail here.

Start Ups & Small Bio-techs

Some of you may have the impression that only large companies can effectively leverage cloud computing. While it is true that larger IT shops may gain more, small companies can also benefit from cloud technologies. Many smaller companies have made it a core strategy to fulfill as much of their IT needs as possible from the cloud first and internally second. In a way, smaller companies have an advantage, as they do not have an inventory of legacy applications or entrenched people and processes to deal with.

I have two small biotech clients using this strategy. One has taken it to an extreme: if you went into their offices, all you would find are several wireless access points and printers. There are no servers, no desktops, no phones, and no need for IT administrative support. Everybody brings in their own laptop and cell phone, and all normal IT services are provided via IaaS or SaaS vendors. Even their phone system is a SaaS-provisioned VoIP PBX connected to their cell phones. My other client has different needs, but still accesses a major portion of both their infrastructure and applications via the internet, a matter I discussed a while back.

In Closing

So, how is cloud computing impacting the life sciences IT organization? Certainly the changes being wrought by cloud computing are not unique to the life sciences, but cloud is changing how life science CIOs provide the systems their users need. Utilizing cloud as an integral part of their overall IT strategy gives life science CIOs a major tool for reducing costs, responding quickly to user needs, easing the burden of regulatory compliance, and supporting the complex process of life science R&D.

Many forward-thinking CIOs are already incorporating cloud into their current application portfolios and long-term strategic plans. Those that have a clear and direct strategy for utilizing cloud and are aggressively looking at cloud technologies to help solve their problems will be much more successful than those who are flying by the seat of their pants.

Now, what does the future hold? It would be great to be able to look five years down the road and see how cloud has been adopted and how it is being utilized in the life sciences. Certainly cloud will be an integral part of the CIO's portfolio, and a much larger portion of the budget will be allocated to cloud computing technologies than we see today. There can be no doubt that the life science IT shop of 2016 will be much different from today's, not only in technology and infrastructure but also in staffing and required skill sets.

One thing is clear: cloud is a game changer and a technology that all life science companies need to embrace to remain competitive. Those that do not adapt to the cloud will find themselves unable to keep up with those that do.

About the Author

Bruce Maches is a former Director of Information Technology for Pfizer’s R&D division, current CIO for BRMaches & Associates and a contributing editor for HPC in the Cloud.

He has written extensively about the role of cloud computing and related technologies in the life sciences in his series in our Behind the Cloud section.
