CHICAGO, Oct. 7, 2020 — Globus, a leading research data management service, has reached a major milestone by breaking the exabyte barrier. While it took over 2,000 days for the service to transfer its first 200 petabytes (PB) of data, the last 200 PB were moved in just 247 days. This rapidly accelerating growth is reflected in the more than 150,000 registered users who have now transferred over 120 billion files using Globus.
IDC predicts that 59 zettabytes—which equals 59,000 exabytes of data—will be created, captured, copied, and consumed in 2020. Over the past decade, many new technological developments have contributed to this data explosion: artificial intelligence, the Internet of Things (IoT), the proliferation of powerful HPC systems with faster processors and better networking, and new scientific instruments such as genome sequencers. Just a few years ago, a genome sequencer generated 15 GB of data. Now, scientists leveraging next-generation sequencers can generate over 6 TB of raw data per run—and even more when the necessary downstream analysis is factored in.
“It is such an exciting time for the research community. The massive data volumes being collected, analyzed, and shared today will enable us to address some of the world’s most challenging problems, from discovering new vaccines to tackling climate change and uncovering some of the mysteries of the Universe,” said Ian Foster, Globus co-founder.
So exactly how big is one exabyte? It is a billion billion bytes. That is 10¹⁸, or a 1 with 18 zeros! It would take a person on a continuous video call 237,823 years to transfer 1EB, or a Netflix subscriber 3.5 billion years of watching movies at their current rate to surpass the 1EB mark.
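The scale of these figures can be sanity-checked with a few lines of Python. The ~1 Mbps continuous video-call bitrate below is an illustrative assumption, not a figure from Globus; at that rate the result lands on the same order of magnitude as the 237,823-year estimate above (which implies a slightly higher bitrate):

```python
# Back-of-envelope check: how long would a continuous video call take
# to move 1 EB? The 1 Mbps bitrate is an assumption for illustration.

EXABYTE_BYTES = 10**18            # 1 EB = a billion billion bytes (10^18)
VIDEO_CALL_BPS = 1_000_000        # assumed ~1 Mbps sustained video call
SECONDS_PER_YEAR = 365.25 * 24 * 3600

bits_to_move = EXABYTE_BYTES * 8
years = bits_to_move / VIDEO_CALL_BPS / SECONDS_PER_YEAR
print(f"{years:,.0f} years")      # roughly a quarter-million years at 1 Mbps
```

Varying the assumed bitrate shifts the answer proportionally, but any realistic consumer connection leaves the result in the hundreds of thousands of years.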
This unprecedented growth in data creates exceptional opportunities for modern scientific research. But progress requires that the data be effectively managed. Last year, for example, a team led by Argonne National Laboratory ran three of the largest cosmological simulations performed to date, generating a total of eight petabytes of data. And at the Advanced Photon Source, ultra-bright, high-energy x-ray beams routinely generate more than 30 TB per week as scientists collect data in unprecedented detail and in remarkably short time frames. Securely and reliably moving and processing tens of terabytes a day is the “new normal” for many research computing facilities in this exascale era.
“Our mission at Globus is to make the management of research data as frictionless as possible so researchers can get on with their important work,” said Rachana Ananthakrishnan, Globus executive director. “We are gratified to see the service becoming a critical part of cyberinfrastructure at thousands of leading research institutions around the world.”
Globus can be used by non-profit research institutions to move data at no cost; more advanced features are available via a paid subscription. All that’s required is a standard Internet connection to initiate data sharing or transfers of any size, from anywhere to anywhere, using any web browser. Globus provides instant access to tens of thousands of storage endpoints, including all XSEDE systems as well as systems at most national laboratories and hundreds of leading research universities. Many commercial organizations also use Globus to address their data management needs and facilitate collaboration with their partners.
About Globus
Globus is software-as-a-service for research data management, used by hundreds of research institutions and high-performance computing (HPC) facilities worldwide. The service enables secure, reliable file transfer, sharing, and other services for managing data throughout the research lifecycle. Globus is an initiative of the University of Chicago, and is supported in part by funding from the Department of Energy, the National Science Foundation, the National Institutes of Health, and the Sloan Foundation. Visit us at www.globus.org.
Source: Globus