GRDI2020 Envisions New Science Paradigms

By Nicole Hemsoth

September 22, 2011

This week at the EGI Technical Forum in Lyon, France, leaders from around Europe gathered to discuss the creation of sustainable, broad-reaching research objectives that would address existing user communities, encourage new ones to develop, and find ways to leverage the influx of data to increase scientific discovery and European competitiveness.

Among the many elements on the agenda for the week, the GRDI2020 Vision was the topic of a great deal of conversation. This effort proposes a ten-year plan to create global research infrastructures that can address the needs of data-intensive scientific projects while remaining sustainable. While this was announced last year, the EGI Technical Forum was the locale for a more comprehensive roadmap as Europe looks ahead to 2020.

The roadmap that was presented this week laid out Europe’s vision for a global research data infrastructure that can be the “enabler of an open, extensible and evolvable digital science ecosystem.” This ecosystem will be created and maintained through advances in grid and cloud computing via the use of science gateways (community-specific sets of tools, applications and data collections that can be accessed via a portal or application suite) and virtual research environments (VREs). VREs will be the technological framework behind virtual working environments and communities that collaborate via cloud- or grid-based portals.

By creating environments that draw on distributed hardware, software and expertise, GRDI2020 hopes to create an “interoperable science ecosystem” that will reduce data fragmentation and speed access to and use of data stores.

This, of course, involves a great deal of coordination across every layer of Europe’s technical and scientific community, from those providing the tools and support to create and maintain the grid- or cloud-based ecosystem, to those governing it, and all of the researchers in between.

The group sees a large number of challenges, however. International collaboration amid conflicting laws and policies certainly tops the list, but other problems are present as well, including the technical challenges of creating interoperable tools, authentication layers, data movement, and more general aspects of distributed computing.

Despite the practical concerns that could limit progress, GRDI2020 sees cloud computing as one of the leading forces in the march toward its goals. The group claims that it “envision[s] that the future Digital Data Libraries (Science Data Centers) will be based on cloud philosophy and technology,” with each community having its own cloud. This is where federation (an issue that EGI lead Steven Newhouse discussed at length this week) comes into play. By federating these clouds, GRDI2020 can pursue its vision of increased collaboration in support of multidisciplinary research.
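To make the federation idea concrete, here is a minimal sketch in Python. All class names, disciplines and records are hypothetical and not drawn from any GRDI2020 specification; it simply shows the pattern of per-community clouds joined under a federation layer that fans a single query out across all members.

```python
# Hedged sketch: each research community operates its own "cloud" of records,
# and a federation layer lets one multidisciplinary query span all of them.
class CommunityCloud:
    def __init__(self, discipline, records):
        self.discipline = discipline
        self._records = records

    def query(self, keyword):
        # Return every record in this community's store matching the keyword.
        return [r for r in self._records if keyword in r]


class Federation:
    """Aggregates member clouds so that one query reaches every community."""
    def __init__(self):
        self.members = []

    def join(self, cloud):
        self.members.append(cloud)

    def query(self, keyword):
        # Collect matches from every member, keyed by its discipline.
        return {c.discipline: c.query(keyword) for c in self.members}


fed = Federation()
fed.join(CommunityCloud("oceanography", ["plankton survey 2011", "salinity grid"]))
fed.join(CommunityCloud("climatology", ["sea-surface temperature series"]))
print(fed.query("sea"))
```

The point of the design is that each community keeps full control of its own store; the federation only routes queries, which mirrors the loose coupling the roadmap describes.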

At the heart of this anticipation of the era of data-intensive scientific computing are a number of concerns that the organization hopes to address, including the need to build realistic, scalable research infrastructures to support data-intensive research. This is a program that the European Commission is backing, and that many attendees at the EGI Technical Forum seemed to feel was a critical first step in advancing European research.

As part of these far-reaching efforts, GRDI2020 seeks to spend the next decade creating “a framework for obtaining technological, organizational and policy recommendations guiding the development of ecosystems of global research data infrastructures.” This means leveraging existing user communities, experts, leaders behind large projects and policy makers to help lead to this vision of sustainable global research systems.

The experts behind this initiative say that Europe has entered the “new science paradigm,” in which many areas of research face a hundredfold, if not a thousandfold, increase in the amount of data they contend with compared to just ten years ago. This is due to an explosion in the number of sensors and scientific instruments, not to mention the fact that storing the data they gather is now far more affordable.

Officials from the GRDI2020 project say “this data deluge can revolutionize the way research is carried out and lead to the emergence of a new fourth paradigm of science based on data-intensive computing.” They say that this new era will lead to a “data-centric” way of thinking about research and solving problems—but that there is a severe lack of infrastructure available to support these opportunities.

GRDI2020’s vision of research data infrastructures requires a great deal of cross-disciplinary collaboration. They define this new way of thinking about infrastructure in the following categories:

•    Tools and services that support the whole research cycle
•    The movement of scientific data across scientific disciplines (which was an issue that was addressed in detail by the representatives who spoke about GlobusEUROPE and GlobusOnline)
•    The creation of open linked data spaces by connecting data sets from diverse disciplines
•    The management of scientific workflows
•    The interoperation between scientific data and literature
•    The development of an integrated science policy framework.
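The “open linked data spaces” category above can be illustrated with a small sketch. All identifiers and data values here are invented for illustration; the idea, borrowed from the linked-data style of RDF triples, is that records from diverse disciplines become traversable once they share common identifiers.

```python
# Hypothetical triples (subject, predicate, object) from two disciplines.
# The shared identifier "region:north_sea" is what links them.
fisheries = [
    ("species:gadus_morhua", "observedIn", "region:north_sea"),
    ("species:gadus_morhua", "stockLevel", "declining"),
]
climate = [
    ("region:north_sea", "meanTempAnomalyC", "+1.2"),
]

def linked_space(*datasets):
    """Merge triple sets from multiple disciplines into one graph."""
    graph = []
    for ds in datasets:
        graph.extend(ds)
    return graph

def about(graph, subject):
    """Return every triple whose subject matches, for cross-discipline traversal."""
    return [t for t in graph if t[0] == subject]

graph = linked_space(fisheries, climate)
# Start from a fisheries record, hop to its region, then reach climate data.
region = next(o for s, p, o in about(graph, "species:gadus_morhua")
              if p == "observedIn")
climate_facts = about(graph, region)
```

Nothing here requires the two datasets to be co-located or to share a schema beyond the common identifiers, which is the essence of connecting data sets across disciplines.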

This week nearly every session addressed one or more of these issues, but the actual GRDI statement on the vision for 2020 broke down these generalities and shed light on progress toward goals.

The efforts from GRDI2020 might serve as ample motivation for other regions to create similar policies. For instance, this type of federated vision could easily extend to multiple universities sharing individual disciplinary clouds that are gathered under one roof. However, if there was one thing that became clear this week, it’s that Europe has its act together, organizationally speaking. By creating research infrastructures that filter down from a policy hierarchy and extend into hundreds of sub-branches, there is cohesion, an essential element of any distributed computing or resource-sharing effort.

The GRDI2020 program is funded by the European Commission under the 7th Framework Programme, which is designed to boost Europe’s competitiveness through key technology and research investments. It brings all of the research-driven initiatives in the EU together under one organization, splitting the focus among four areas: cooperation, ideas, people and capacities. As one can imagine, this creates a rather complex set of hierarchies under each classification, leading to a wide range of activities that receive funding and support from the program.

To put the work of GRDI2020 in context and see how it is enabling researchers to gain better access to needed tools and infrastructure, a good example is below. While data from fisheries might not be every scientist’s cup of tea, the challenges are similar to those in nearly every discipline: contending with massive data sets in a way that promotes quick access, collaboration and a thorough host of tools to manage and solve problems.
