Research tackles the big questions, delving into uncharted territory in pursuit of knowledge that could change the world. Today’s research simulations generate more data than ever before, a trend that shows no signs of slowing. Access to this ever-increasing volume of data opens up the possibility of blending AI with traditional computational methods for research, unlocking new levels of performance and insight. Indeed, these vast volumes of data are friend rather than foe in the quest to advance research.
The University of Birmingham, a major research destination in the UK, is very familiar with these challenges. Established by Queen Victoria in 1900, the University of Birmingham is one of the largest universities in the UK, serving approximately 34,000 undergraduate and graduate students. The university’s Computer Centre is the centerpiece of the Birmingham Environment for Academic Research (BEAR), a collection of IT resources available without cost to the University of Birmingham community and qualified external researchers.
“We support research in a wide range of areas including applying and developing techniques to use AI and deep learning,” explains Simon Thompson, Research Computing Infrastructure Architect at the University of Birmingham. “For example, we’re collaborating with the University of Nottingham on the Centre of Membrane Proteins and Receptors [COMPARE] project. By analyzing the super high-resolution images produced by the latest generations of microscopes, the project will shed light on how cardiovascular disease, respiratory disorders and cancer can be better prevented and treated. We’ve also joined Health Data Research UK, which focuses on developing and applying cutting-edge data science approaches to enable more efficient and accurate diagnostics.
“A research team is using our facilities to tackle an area where HPC [high-performance computing] hasn’t typically been applied: linguistics. They’re using textual analysis to understand how the most translated text of all time – the Bible – has changed over the centuries, and what this can teach us about language and culture. The university recently became part of The Alan Turing Institute, the UK’s national institute for data science and AI, which aims to bring together researchers with different skill sets.”
Today more than ever, data is the cornerstone of research and development. Modern research calls for advanced techniques that blend AI methods with traditional HPC modeling and simulation – with data intrinsic to the accuracy and fidelity of the results. These advanced workloads rely on growing volumes of data from a diverse range of sources to train AI models, as well as on large data sets produced by simulation and modeling workloads of ever-increasing fidelity. Dealing with these large data volumes can quickly turn into a battle between the storage infrastructure and the demands of researchers – with high stakes. Computing efficiency and throughput can suffer, so a carefully defined data management strategy is crucial for organizations on the cutting edge of research and development.
Formulating the strategy
As with any good battle plan, there are a number of key considerations when formulating a data strategy for a modern computing environment. These include:
- Throughput – high-performance storage to meet the demands of modern accelerated servers, which can rapidly process large volumes of data.
- Data availability – data must be accessible across a large, potentially geographically dispersed user base.
- Archival – reliable long-term archival capabilities are crucial both for compliance and for retrieving past bodies of knowledge essential to future research.
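In IBM Spectrum Scale, considerations such as archival tiering can be expressed in the file system’s SQL-like ILM policy language. The sketch below is illustrative only – the pool names (‘system’, ‘nearline’) and thresholds are assumptions, not the university’s actual configuration:

```
/* Illustrative ILM policy sketch (not the university's actual rules).   */
/* When the 'system' pool exceeds 85% occupancy, migrate the least       */
/* recently accessed files to a 'nearline' pool until it drops to 75%.   */
RULE 'migrate_cold' MIGRATE
  FROM POOL 'system'
  THRESHOLD(85, 75)
  WEIGHT(CURRENT_TIMESTAMP - ACCESS_TIME)
  TO POOL 'nearline'
```

A policy along these lines would typically be installed with the `mmchpolicy` command; the exact rules depend on the storage pools defined at the site.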
Faced with these same considerations, the University of Birmingham worked with IBM Business Partner OCF PLC to design and implement a software-defined storage strategy to meet the growing demands of advanced computing for the 5,000 researchers on campus. A premier research institution, the University of Birmingham achieved its goal of a central data provision that combines performance and security with ease of management for the Research Computing team.
IBM Spectrum Scale provides the data backbone of the Research Computing infrastructure at the University of Birmingham. The high performance delivered by Spectrum Scale ensures that data is readily available across all computing clusters automatically, so that researchers can focus on the quest for knowledge. “Breakthroughs in a range of fields are happening all the time at the university. Underpinning all of this pioneering innovation, IBM Spectrum Storage solutions make sure that the data is there, whenever our researchers need it,” says Thompson. With IBM Spectrum Scale Data Management Edition, research output can be encrypted at rest to help protect intellectual property and to meet more stringent data protection requirements.
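Encryption at rest in Spectrum Scale Data Management Edition is likewise driven by policy rules. A minimal sketch follows, assuming a remote key manager has already been configured – the key name (‘KEY1’) and RKM back-end ID (‘RKM_1’) are hypothetical placeholders, not the university’s setup:

```
/* Illustrative encryption policy sketch; key and RKM names are assumed. */
RULE 'encSpec' ENCRYPTION 'E1' IS
  ALGO 'DEFAULTNISTSP800131A'
  KEYS('KEY1:RKM_1')

/* Apply the encryption specification 'E1' to all newly created files. */
RULE 'encryptAll' SET ENCRYPTION 'E1' WHERE NAME LIKE '%'
```

With rules like these in place, file data is encrypted transparently on disk while remaining readable to authorized users through the file system.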
Read the full University of Birmingham case study here.