Paving the Way for Accelerated Data Sharing: An Interview with Francine Berman

February 27, 2015

How can we create more effective treatments for Alzheimer’s? Can we increase food security across the globe? Is there a way to more accurately predict natural disasters? Solutions to these and other critical challenges are being advanced through the sharing and exchange of research data, and the efforts of the Research Data Alliance (RDA). In this Q&A, Dr. Francine Berman, chair of the Research Data Alliance/United States, comments on the organization’s past, present and future as a prelude to its 5th Plenary Meeting, taking place next week in San Diego, California.

Briefly tell us about the Research Data Alliance. Why was it created?

Dr. Francine Berman: The Research Data Alliance (RDA) was founded in 2013 as an international organization whose purpose is to facilitate the development, coordination and use of infrastructure that supports data sharing and exchange. This is tremendously important for the research community, where technical and social infrastructure such as interoperability frameworks, registries, policies, practices and community standards is needed to utilize and coordinate today’s wealth of data to address complex scientific and societal challenges.

How do RDA members collaborate?

RDA’s members collaborate through working groups and interest groups. Interest groups are open-ended and focus on issues around the development and use of data infrastructure. Working groups are short-term (12-18 months) and come together to develop and implement specific tools, policies, practices and products that are adopted and used by projects, organizations and communities. Interest groups often spin off one or more working groups when individual pieces of infrastructure that need to be developed are identified.

To stay in sync, RDA members meet face to face at Plenaries – three-day meetings held twice a year in various locations worldwide. These are both working meetings, where members gather to advance the efforts of interest and working groups, and community meetings, with speakers of interest and updates from funding agencies and policy makers.

With the Research Data Alliance celebrating its second anniversary as an international organization, has the organization’s mission changed or remained the same?

RDA’s mission remains the same and the RDA community is even more interested in making an impact through the development of infrastructure. Last fall, RDA’s first working groups produced the organization’s first set of RDA deliverables; more are set to be delivered this year. Our first deliverables are giving the RDA community an opportunity to assess what is needed for adoption beyond the users embedded in the working groups. We are really looking forward to our first “Adoption Day” for the RDA community hosted by SDSC on March 8.

What type of growth has the RDA experienced since 2013?

RDA had its first “pre-meeting” in Washington, D.C. in the fall of 2012. About 100 people showed up. Today, RDA has more than 2,600 individual members from over 90 countries and in all sectors. A number of organizations have become organizational members as well. Many RDA members meet both at the international plenaries and as RDA “regional” communities. Regional groups in Europe (RDA/EU), Australia (RDA/AU) and the U.S. (RDA/US) are all active and new regional groups are also coming on board. As RDA celebrates its second birthday, the organization is working closely with a broad set of countries, communities, and agencies to expand both the RDA community and organizational infrastructure to include new participants and partners in Japan, Brazil, Canada, South Africa, and other countries.

What is the U.S. RDA region doing?

In the last year, RDA/US has been focusing on three pilot initiatives: outreach, adoption, and student and early career professional engagement. In the outreach area, RDA/US members have been working with organizations in a variety of domains to bring their issues to existing RDA working and interest groups or to create new ones around their infrastructure needs. In particular, the outreach effort is focusing on helping other groups utilize RDA as a vehicle to develop data sharing infrastructure within the U.S.

In the adoption area, RDA/US has developed the Adoption Day program and assisted specific groups in enhancing their own infrastructure through the adoption of RDA deliverables. In the student and early career professional area, RDA/US has funded students to work with RDA interest and working groups, broadening their own technical knowledge and professional networks during the process.

All of these initiatives have been funded by the National Science Foundation and have been important pilots for the community. The student and early career effort was just funded by the Sloan Foundation as a larger program; Beth Plale, Inna Kouper and Kathy Fontaine are now heading up that activity. Under the leadership of Larry Lannom, RDA/US has also co-sponsored workshops and developed partnerships with CENDI, the National Data Service, the Sustaining Digital Repositories group, and others. These collaborations are creating important linkages within the U.S. data community that can advance the development of needed infrastructure and help us address U.S. data challenges.

Can you describe some of RDA’s deliverables and users?

Absolutely. RDA’s working groups were conceptualized as “tiger teams” that combine members who can build infrastructure with users who need the infrastructure to share data and get their work done. The purpose of RDA infrastructure deliverables is to enable impact.

One of RDA’s first working groups was the Data Type Registries group. Formed at the first RDA Plenary in early 2013, its objective was to make it easier to create machine-readable, researcher-accessible registries of data types that support the accurate use of data, since unclear typing can leave data open to misinterpretation and limit its usefulness. For more than a year, the working group collaborated to develop its model and an implementation. The group’s infrastructure products are being adopted by the European Data Infrastructure (EUDAT), the National Institute of Standards and Technology in the U.S., and additional groups that are applying them to their own research activities.

In contrast, RDA’s Wheat Data Interoperability working group is still in progress. This group’s objective is to build an integrated wheat information system for the international wheat community of researchers, growers, breeders, etc. Such a system is critical to advancing and sustaining wheat data sharing, reusability and interoperability. The working group includes members from the French National Institute for Agricultural Research, the International Maize and Wheat Improvement Center, and other agriculture-related organizations.

What can we expect at RDA’s Plenary in San Diego?

RDA/US will be hosting the fifth RDA Plenary in San Diego on March 9-11, with Adoption Day on March 8. We’ll welcome a worldwide community of members as well as organizational partners, funders, students, and local colleagues from San Diego. We will have a welcome by Jim Kurose, the new Assistant Director of NSF’s Computer and Information Science and Engineering Directorate, and three keynotes that span the data landscape: Margaret Leinen, Director of the Scripps Institution of Oceanography, will speak about ocean data; Stephen Friend, head of Sage Bionetworks, will talk about open data commons and patient engagement; and Nao Tsunematsu will talk about data policy in Japan.

The meeting will also be a working meeting, with much of the time spent in open working groups and interest groups. This is a great time for new members to experience the RDA discussions, join a group, talk to RDA members about starting a new group to focus on their own issues of interest, or explore organizational membership. It’s a full schedule, but we’ve still left time for a beach party.

What’s next for RDA?

The importance of data sharing to support innovation is increasing and RDA will continue to focus on the development, coordination and use of this infrastructure. RDA has also emerged as a neutral “town square” in which organizations can come together to develop common agendas – each meeting seems to attract other meetings that can benefit from co-location with the RDA community. In Washington at Plenary 2, the data citation community came together. During the San Diego meeting at Plenary 5, the Preservation and Archiving Special Interest Group will convene to discuss trends and issues related to digital preservation.

Most important is that RDA continues to provide a vehicle for getting things done and accelerating data sharing. The first two years have created a great culture for doing this. Our hope is that the next years accelerate and improve RDA’s usefulness and impact.

About Francine Berman

Francine Berman is Chair of Research Data Alliance / United States and co-Chair of the RDA Council. She is the Edward P. Hamilton Distinguished Professor of Computer Science at Rensselaer Polytechnic Institute (RPI).
