Paving the Way for Accelerated Data Sharing: An Interview with Francine Berman

February 27, 2015

How can we create more effective treatments for Alzheimer’s? Can we increase food security across the globe? Is there a way to more accurately predict natural disasters? Solutions to these and other critical challenges are being advanced through the sharing and exchange of research data, and the efforts of the Research Data Alliance (RDA). In this Q&A, Dr. Francine Berman, chair of the Research Data Alliance/United States, comments on the organization’s past, present and future as a prelude to its 5th Plenary Meeting, taking place next week in San Diego, California.

Briefly tell us about the Research Data Alliance. Why was it created?

Dr. Francine Berman: The Research Data Alliance (RDA) was founded in 2013 as an international organization whose purpose is to facilitate the development, coordination and use of infrastructure that supports data sharing and exchange. This is tremendously important for the research community, where technical and social infrastructure such as interoperability frameworks, registries, policies, practices and community standards is needed to utilize and coordinate today's wealth of data to address complex scientific and societal challenges.

How do RDA members collaborate?

RDA’s members collaborate through working and interest groups. Interest groups are open-ended and focus on issues around the development and use of data infrastructure. Working groups are short-term (12-18 months) and come together to develop and implement specific tools, policies, practices and products that are adopted and used by projects, organizations, and communities. Interest groups often spawn one or more working groups when specific pieces of infrastructure that need to be developed are identified.

To stay in sync, RDA members meet face to face at Plenaries – three-day meetings held twice a year in locations around the world. These are both working meetings, where members advance the efforts of interest and working groups, and community meetings, with speakers of interest and updates from funding agencies and policy makers.

With the Research Data Alliance celebrating its second anniversary as an international organization, has the organization’s mission changed or remained the same?

RDA’s mission remains the same and the RDA community is even more interested in making an impact through the development of infrastructure. Last fall, RDA’s first working groups produced the organization’s first set of RDA deliverables; more are set to be delivered this year. Our first deliverables are giving the RDA community an opportunity to assess what is needed for adoption beyond the users embedded in the working groups. We are really looking forward to our first “Adoption Day” for the RDA community hosted by SDSC on March 8.

What type of growth has the RDA experienced since 2013?

RDA had its first “pre-meeting” in Washington, D.C. in the fall of 2012. About 100 people showed up. Today, RDA has more than 2,600 individual members from over 90 countries and in all sectors. A number of organizations have become organizational members as well. Many RDA members meet both at the international plenaries and as RDA “regional” communities. Regional groups in Europe (RDA/EU), Australia (RDA/AU) and the U.S. (RDA/US) are all active and new regional groups are also coming on board. As RDA celebrates its second birthday, the organization is working closely with a broad set of countries, communities, and agencies to expand both the RDA community and organizational infrastructure to include new participants and partners in Japan, Brazil, Canada, South Africa, and other countries.

What is the U.S. RDA region doing?

In the last year, RDA/US has been focusing on three pilot initiatives: outreach, adoption, and student and early career professional engagement. In the outreach area, RDA/US members have been working with organizations in a variety of domains to bring their issues to existing RDA working and interest groups or to create new ones around their infrastructure needs. In particular, the outreach effort is focusing on helping other groups utilize RDA as a vehicle to develop data sharing infrastructure within the U.S.

In the adoption area, RDA/US has developed the Adoption Day program and assisted specific groups in enhancing their own infrastructure through the adoption of RDA deliverables. In the student and early career professional area, RDA/US has funded students to work with RDA interest and working groups, broadening their own technical knowledge and professional networks during the process.

All of these initiatives have been funded by the National Science Foundation and have been important pilots for the community. The student and early career effort was just funded by the Sloan Foundation as a larger program; Beth Plale, Inna Kouper and Kathy Fontaine are now heading up that activity. Under the leadership of Larry Lannom, RDA/US has also co-sponsored workshops and developed partnerships with CENDI, the National Data Service, the Sustaining Digital Repositories group, and others. These collaborations are creating important linkages within the U.S. data community that can advance the development of needed infrastructure and help us address U.S. data challenges.

Can you describe some of RDA’s deliverables and users?

Absolutely. RDA’s working groups were conceptualized as “tiger teams” that combine members who can build infrastructure with users who need the infrastructure to share data and get their work done. The purpose of RDA infrastructure deliverables is to enable impact.

One of RDA’s first working groups was the Data Type Registries group. Formed at the first RDA Plenary in early 2013, this group’s objective was to make it easier to create machine-readable, researcher-accessible registries of data types that support the accurate use of data, since unclear typing leaves data open to misinterpretation and limits its usefulness. For more than a year, this working group collaborated to develop its model and an implementation. The group’s infrastructure products are being adopted by the European Data Infrastructure (EUDAT), the National Institute of Standards and Technology in the U.S., and additional groups that are applying them to their own research activities.
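To make the idea concrete, here is a minimal sketch of what a data type registry lookup might look like: a persistent identifier resolves to a machine-readable description of a data type, so that consumers can interpret a dataset's fields correctly. The identifiers, field names, and schema below are illustrative assumptions only and are not taken from the RDA working group's actual outputs.

```python
# Hypothetical sketch of a data type registry: identifiers map to
# machine-readable descriptions of data types. Illustrative only.
from dataclasses import dataclass, field


@dataclass
class DataTypeRecord:
    """Machine-readable description of a registered data type."""
    identifier: str            # persistent identifier for the type
    name: str                  # human-readable name
    description: str           # what the type represents
    properties: dict = field(default_factory=dict)  # e.g. unit, value type


# In-memory stand-in for a registry service (a real registry would be
# a networked service with persistent, resolvable identifiers).
REGISTRY = {
    "dtr:example/temperature-celsius": DataTypeRecord(
        identifier="dtr:example/temperature-celsius",
        name="Temperature (Celsius)",
        description="Air temperature expressed in degrees Celsius.",
        properties={"unit": "degC", "value_type": "float"},
    ),
}


def resolve(type_id: str) -> DataTypeRecord:
    """Look up a data type by its identifier, mimicking a registry query."""
    try:
        return REGISTRY[type_id]
    except KeyError:
        raise LookupError(f"Unknown data type: {type_id}") from None


if __name__ == "__main__":
    record = resolve("dtr:example/temperature-celsius")
    print(record.name, record.properties)
```

The point of such a registry is that a dataset can reference the type identifier rather than re-describing its fields, and any tool that resolves the identifier gets the same unambiguous definition.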

In contrast, RDA’s Wheat Data Interoperability working group is still in progress. This group’s objective is to build an integrated wheat information system for the international community of wheat researchers, growers, breeders, and others. Such a system is critical to advancing and sustaining wheat data sharing, reusability and interoperability. The working group includes members from the French National Institute for Agricultural Research, the International Maize and Wheat Improvement Center, and other agriculture-related organizations.

What can we expect at RDA’s Plenary in San Diego?

RDA/US will be hosting the fifth RDA Plenary in San Diego on March 9-11, with Adoption Day on March 8. We’ll welcome a worldwide community of members as well as organizational partners, funders, students, and local colleagues from San Diego. We will have a welcome by Jim Kurose, the new Assistant Director of NSF’s Computer and Information Science and Engineering Directorate, and three keynotes that span the data landscape: Margaret Leinen, Director of the Scripps Institution of Oceanography, will speak about ocean data; Stephen Friend, head of Sage Bionetworks, will talk about open data commons and patient engagement; and Nao Tsunematsu will talk about data policy in Japan.

The meeting will also be a working meeting with much time spent in open working groups and interest groups. This is a great time for new members to experience the RDA discussions, join a group, talk to RDA members about starting a new group to focus on their own issues of interest, or explore organizational membership. We’ll also have panels with funders and plenary sessions around the digital humanities and other topics. It’s a full schedule but we still left time for a beach party.

What’s next for RDA?

The importance of data sharing to support innovation is increasing, and RDA will continue to focus on the development, coordination and use of the infrastructure that makes sharing possible. RDA has also emerged as a neutral “town square” in which organizations can come together to develop common agendas – each meeting seems to attract other meetings that can benefit from co-location with the RDA community. At Plenary 2 in Washington, the data citation community came together. At Plenary 5 in San Diego, the Preservation and Archiving Special Interest Group will convene to discuss trends and issues in digital preservation.

Most important is that RDA continues to provide a vehicle for getting things done and accelerating data sharing. The first two years have created a great culture for doing this. Our hope is that the next years accelerate and improve RDA’s usefulness and impact.

About Francine Berman

Francine Berman is Chair of the Research Data Alliance/United States (RDA/US) and co-Chair of the RDA Council. She is the Edward P. Hamilton Distinguished Professor of Computer Science at Rensselaer Polytechnic Institute (RPI).
