Paving the Way for Accelerated Data Sharing: An Interview with Francine Berman

February 27, 2015

How can we create more effective treatments for Alzheimer’s? Can we increase food security across the globe? Is there a way to more accurately predict natural disasters? Solutions to these and other critical challenges are being advanced through the sharing and exchange of research data, and the efforts of the Research Data Alliance (RDA). In this Q&A, Dr. Francine Berman, chair of the Research Data Alliance/United States, comments on the organization’s past, present and future as a prelude to its 5th Plenary Meeting, taking place next week in San Diego, California.

Briefly tell us about the Research Data Alliance. Why was it created?

Dr. Francine Berman: The Research Data Alliance (RDA) was founded in 2013 as an international organization whose purpose is to facilitate the development, coordination and use of infrastructure that supports data sharing and exchange. This is tremendously important for the research community, where technical and social infrastructure such as interoperability frameworks, registries, policies, practices and community standards is needed to utilize and coordinate today’s wealth of data to address complex scientific and societal challenges.

How do RDA members collaborate?

RDA’s members collaborate through working and interest groups. Interest groups are open-ended and focus on issues around the development and use of data infrastructure. Working groups are short-term (12-18 months) and come together to develop and implement specific tools, policies, practices and products that are adopted and used by projects, organizations, and communities. Interest groups often spawn one or more working groups when specific pieces of infrastructure that need to be developed are identified.

To stay in sync, RDA members meet face to face at Plenaries – three-day meetings held twice a year in various locations worldwide. These are both working meetings, where members advance the efforts of interest and working groups, and community meetings, with invited speakers and updates from funding agencies and policy makers.

With the Research Data Alliance celebrating its second anniversary as an international organization, has the organization’s mission changed or remained the same?

RDA’s mission remains the same and the RDA community is even more interested in making an impact through the development of infrastructure. Last fall, RDA’s first working groups produced the organization’s first set of RDA deliverables; more are set to be delivered this year. Our first deliverables are giving the RDA community an opportunity to assess what is needed for adoption beyond the users embedded in the working groups. We are really looking forward to our first “Adoption Day” for the RDA community hosted by SDSC on March 8.

What type of growth has the RDA experienced since 2013?

RDA had its first “pre-meeting” in Washington, D.C. in the fall of 2012; about 100 people showed up. Today, RDA has more than 2,600 individual members from over 90 countries and across all sectors, and a number of organizations have become organizational members as well. Many RDA members meet both at the international plenaries and as RDA “regional” communities. Regional groups in Europe (RDA/EU), Australia (RDA/AU) and the U.S. (RDA/US) are all active, and new regional groups are coming on board. As RDA celebrates its second birthday, the organization is working closely with a broad set of countries, communities, and agencies to expand both the RDA community and its organizational infrastructure to include new participants and partners in Japan, Brazil, Canada, South Africa, and other countries.

What is the U.S. RDA region doing?

In the last year, RDA/US has been focusing on three pilot initiatives: outreach, adoption, and student and early career professional engagement. In the outreach area, RDA/US members have been working with organizations in a variety of domains to bring their issues to existing RDA working and interest groups or to create new ones around their infrastructure needs. In particular, the outreach effort is focusing on helping other groups utilize RDA as a vehicle to develop data sharing infrastructure within the U.S.

In the adoption area, RDA/US has developed the Adoption Day program and assisted specific groups in enhancing their own infrastructure through the adoption of RDA deliverables. In the student and early career professional area, RDA/US has funded students to work with RDA interest and working groups, broadening their own technical knowledge and professional networks during the process.

All of these initiatives have been funded by the National Science Foundation and have been important pilots for the community. The student and early career effort was just funded by the Sloan Foundation as a larger program, and Beth Plale, Inna Kouper and Kathy Fontaine are now heading up that activity. Under the leadership of Larry Lannom, RDA/US has also co-sponsored workshops and developed partnerships with CENDI, the National Data Service, the Sustaining Digital Repositories group, and others. These collaborations are creating important linkages within the U.S. data community that can advance the development of needed infrastructure and help address U.S. data challenges.

Can you describe some of RDA’s deliverables and users?

Absolutely. RDA’s working groups were conceptualized as “tiger teams” that combine members who can build infrastructure with users who need the infrastructure to share data and get their work done. The purpose of RDA infrastructure deliverables is to enable impact.

One of RDA’s first working groups was the Data Type Registries group. Formed at the first RDA Plenary in early 2013, this group’s objective was to make it easier to create machine-readable, researcher-accessible registries of data types that support the accurate use of data, since ambiguous typing can leave data open to misinterpretation and limit its usefulness. For more than a year, this working group collaborated to develop its model and an implementation. The group’s infrastructure products are being adopted by the European Data Infrastructure (EUDAT), the National Institute of Standards and Technology in the U.S., and additional groups that are applying them to their own research activities.
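To make the idea concrete, here is a minimal, purely illustrative sketch in Python of what a data type registry lookup might do: a persistent type identifier resolves to a machine-readable description that tells a consumer how to interpret values carrying that type. The identifier, record fields and values below are invented for illustration and are not taken from the actual RDA Data Type Registry specification or the EUDAT/NIST implementations.

```python
# Hypothetical data type registry sketch (illustrative only; not the RDA
# Data Type Registry schema). A registry maps a persistent type identifier
# to a machine-readable description, so that any dataset tagged with that
# identifier can be interpreted unambiguously.
REGISTRY = {
    "dtr:temperature-celsius-v1": {          # invented identifier
        "name": "Surface temperature",
        "value_type": "float",
        "unit": "degrees Celsius",
        "valid_range": (-90.0, 60.0),
    },
}

def describe(type_id: str) -> dict:
    """Return the registered description for a data type identifier."""
    try:
        return REGISTRY[type_id]
    except KeyError:
        raise KeyError(f"Unknown data type identifier: {type_id}")

if __name__ == "__main__":
    record = describe("dtr:temperature-celsius-v1")
    print(record["name"], "is measured in", record["unit"])
```

In a real deployment the registry would be a shared, persistent service rather than an in-process dictionary, but the core value is the same: producers and consumers of data agree on type identifiers instead of guessing at each other’s conventions.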

In contrast, RDA’s Wheat Data Interoperability working group is still in process. This group’s objective is to build an integrated wheat information system for the international wheat community of researchers, growers, breeders, etc. Such a system is critical to advance and sustain wheat data information sharing, reusability and interoperability. The working group includes members from the French National Institute for Agricultural Research, the International Maize and Wheat Improvement Center, and other agriculture-related organizations.

What can we expect at RDA’s Plenary in San Diego?

RDA/US will be hosting the fifth RDA Plenary in San Diego on March 9-11, with Adoption Day on March 8. We’ll welcome a worldwide community of members as well as organizational partners, funders, students, and local colleagues from San Diego. We will have a welcome from Jim Kurose, the new Assistant Director of NSF’s Computer and Information Science and Engineering Directorate, and three keynotes that span the data landscape: Margaret Leinen, Director of the Scripps Institution of Oceanography, will speak about ocean data; Stephen Friend, head of Sage Bionetworks, will talk about open data commons and patient engagement; and Nao Tsunematsu will talk about data policy in Japan.

The meeting will also be a working meeting, with much of the time spent in open working groups and interest groups. This is a great time for new members to experience the RDA discussions, join a group, talk to RDA members about starting a new group focused on their own issues of interest, or explore organizational membership. We’ll also have panels with funders and plenary sessions on the digital humanities and other topics. It’s a full schedule, but we’ve still left time for a beach party.

What’s next for RDA?

The importance of data sharing to support innovation is increasing, and RDA will continue to focus on the development, coordination and use of the infrastructure that enables it. RDA has also emerged as a neutral “town square” in which organizations can come together to develop common agendas – each meeting seems to attract other meetings that benefit from co-location with the RDA community. At Plenary 2 in Washington, the data citation community came together; at Plenary 5 in San Diego, the Preservation and Archiving Special Interest Group will convene to discuss trends and issues in digital preservation.

Most important is that RDA continues to provide a vehicle for getting things done and accelerating data sharing. The first two years have created a great culture for doing this. Our hope is that the next years accelerate and improve RDA’s usefulness and impact.

About Francine Berman

Francine Berman is Chair of Research Data Alliance / United States and co-Chair of the RDA Council. She is the Edward P. Hamilton Distinguished Professor of Computer Science at Rensselaer Polytechnic Institute (RPI).
