October 13, 2006
Statistically, every person in Germany spends about 2700 Euros per year keeping himself or herself alive and well, which amounts to about 11 percent of GNP, or around 230 billion Euros. Although the majority of illnesses can be treated fairly well nowadays, doctors still encounter problems where a medicine is ineffective due to a patient's specific predisposition, or where they must choose among several alternatives, e.g., surgery, to find the optimal treatment.
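The quoted figures are roughly mutually consistent, which a quick back-of-the-envelope check confirms. The population figure of about 82 million is an assumption on our part (Germany's population around 2006), not a number stated in the article:

```python
# Sanity check of the article's health-spending figures.
# Assumption: German population of roughly 82 million (not stated in the text).
population = 82e6
per_person = 2700  # Euros spent per person per year

total = population * per_person
print(f"{total / 1e9:.0f} billion Euros")  # prints "221 billion Euros"
```

At 221 billion Euros, the product is in line with the "around 230 billion Euros" cited above.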
Thanks to modern tools such as radiography or NMR, looking inside a patient is no longer a problem. The deciphering of the human genome and the associated research on the human proteome have enhanced our understanding of processes at the molecular and cellular level. This huge body of information, however, is of little use when dynamic or biomechanical effects are involved. Examples are the flow of blood in arteries, air in the lungs, plastic surgery involving bone tissue, or patient-specific major implants. To find the best possible therapy for a patient, a trial-and-error approach is out of the question. However, a new class of simulation tools is starting to appear that makes such an approach feasible for doctors: a realistic, patient-specific 3-D numerical model of a body or body part, in which different approaches can be tested and optimised without direct patient involvement.
In the public eye, eHealth is still mainly associated with applications such as databases, networking of data, hospital administration and data classification. The new applications now emerging play a more active role and promise to optimise patients' treatment rather than merely their care. Through massive compute power, they promise to distil a new level of insight and provide a tool for patient-specific diagnosis and therapy support.
In a recent briefing at NEC's Computers & Communications Research lab (CCRLE) in St. Augustin, Germany, Dr. Guy Lonsdale, the Centre's manager, and Professor Daniel Ruefenacht of the University Hospital of Geneva gave an overview of the trends and goals of the European eHealth program and of a specific implementation concerning the treatment of ballooning arteries, so-called aneurysms, especially in the human brain. Estimates are that 2 to 4 percent of the population harbours at least one aneurysm.
Since 2000, the CCRLE has been involved in the European eHealth program. With SimBio, it developed a toolset that builds a numerical model of the required body parts from actual patient scan data, as a basis for further simulation to support diagnosis. In 2002, a second toolset, GEMSS (Grid-enabled medical simulation services), followed, providing a doctor's interface, a secure workflow and service-allocation capabilities for a future set of service-based eHealth applications.
This know-how was brought into the new EU project @neurist (http://www.aneurist.org/), which started this year and is one of the flagship projects in eHealth, bringing together about 30 partners. CCRLE is involved in key parts of the Grid infrastructure and the simulation tool chain.
Operating on the brain is very risky, so one would like a reliable means of deciding whether surgery is necessary at all. Simulation comes to the rescue. Understanding aneurysms involves biomechanics (both haemodynamics -- the flow of the blood -- and structural mechanics -- the deformation of the vessel wall), biology (the properties and behaviour of the substances and materials) and micromechanics (the interaction between structures in the blood and the vessel wall). This highly complex environment requires a large amount of compute power, both to make reliable predictions about the stability of the aneurysm and to determine the effects on flow patterns and pressure distributions caused by inserting supports (stents) into the artery. The idea is to have the blood stagnate within the aneurysm so that it has time to clot and fill the void, making a rupture of the artery impossible. In terms of computational fluid dynamics, this is a complex fluid-structure interaction with chemical/biological processes on a variety of time-scales.
Using the NEC SX-8 of Stuttgart's High Performance Computing Centre, the CCRLE, in collaboration with the University of Sheffield, succeeded in modelling the flow in such a complicated environment while also allowing for blood-clotting effects. The modelling was based on the Lattice-Boltzmann method (see, e.g., http://www.science.uva.nl/research/scs/projects/lbm_web/lbm.html). In addition, the first results on the effectiveness of different stent designs in redirecting the blood flow were achieved by research coordinated by the University Hospital of Geneva. These results will be taken further in another EU research project NEC is involved in, called COAST, where the effects of such devices for treating coronary artery diseases, including blood clotting and drug transport, will be simulated based on a hierarchical aggregation of coupled cellular automata.
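To give a flavour of the Lattice-Boltzmann method referenced above: particle distribution functions on a regular lattice are relaxed toward a local equilibrium (collision) and then shifted along discrete velocity directions (streaming). The production haemodynamics codes in @neurist and COAST are of course far more elaborate; the following is only a minimal illustrative D2Q9 BGK sketch on a periodic grid, with parameters (grid size, relaxation time, initial shear wave) chosen by us for the example:

```python
import numpy as np

# D2Q9 lattice: 9 discrete velocities and their quadrature weights
c = np.array([(0, 0), (1, 0), (0, 1), (-1, 0), (0, -1),
              (1, 1), (-1, 1), (-1, -1), (1, -1)])
w = np.array([4/9] + [1/9]*4 + [1/36]*4)

def equilibrium(rho, ux, uy):
    """Second-order BGK equilibrium distribution for D2Q9."""
    cu = 3.0 * (c[:, 0, None, None]*ux + c[:, 1, None, None]*uy)
    usq = 1.5 * (ux**2 + uy**2)
    return w[:, None, None] * rho * (1.0 + cu + 0.5*cu**2 - usq)

def lbm_step(f, tau):
    """One collide-and-stream update; returns the new distributions."""
    rho = f.sum(axis=0)                               # local density
    ux = (f * c[:, 0, None, None]).sum(axis=0) / rho  # local velocity
    uy = (f * c[:, 1, None, None]).sum(axis=0) / rho
    # Collision: relax toward local equilibrium with relaxation time tau
    f = f + (equilibrium(rho, ux, uy) - f) / tau
    # Streaming: shift each population along its lattice velocity (periodic)
    for i in range(9):
        f[i] = np.roll(np.roll(f[i], c[i, 0], axis=0), c[i, 1], axis=1)
    return f

# Example: a decaying shear wave on a 64x64 periodic grid
nx = ny = 64
ux0 = 0.05 * np.sin(2*np.pi*np.arange(nx)/nx)[:, None] * np.ones((nx, ny))
f = equilibrium(np.ones((nx, ny)), ux0, np.zeros((nx, ny)))
for _ in range(200):
    f = lbm_step(f, tau=0.8)
print("total mass:", f.sum())  # mass is conserved: stays at nx*ny = 4096
```

Both collision and streaming conserve mass exactly, which is one reason the method suits long-time-scale processes such as clotting; real aneurysm simulations add complex vessel geometry, fluid-structure coupling and biological source terms on top of this kernel.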
With the availability of high performance computers within a Grid environment, the medical sector is now able to add simulation to visualisation -- a step that the general industry had already taken decades ago.
With 25 years of experience in IT, Herbert Wenk (firstname.lastname@example.org) is working as a consultant and technical journalist in Germany.