Nautilus Harnessed for Humanities Research, Future Prediction

By Nicole Hemsoth

September 9, 2011

The observer influences the events he observes by the mere act of observing them or by being there to observe them.

        –Isaac Asimov, Foundation’s Edge

Elements of science fiction have helped us venture guesses about what the future might look like, at least in terms of the technologies some suspect might be pervasive one day. Flying cars, robot housekeepers and, of course, supercomputers that can predict the future and answer humanity’s most pressing questions are all staples of the genre.

This week, news emerged that might bring the all-knowing “supercomputer as fortuneteller” trope closer to reality, or, failing anything quite so dramatic, help us better understand the connections between the news and its tone in geographical context.

A recent project called “Culturomics 2.0: Forecasting Large-Scale Human Behavior Using Global News Media Tone in Time and Space” set out to use tone and geographic analysis methods to yield new insights about global society. If the lead researcher behind the project is correct, the approach could not only open opportunities for societal research at global scale, but could also act as a warning bell before crises occur.

Kalev H. Leetaru, Assistant Director for Text and Digital Media Analytics at the Institute for Computing in the Humanities, Arts and Social Science at the University of Illinois and Center Affiliate at NCSA, spearheaded the Culturomics 2.0 project. He claims that his analytics experiment has already allowed him to successfully forecast the recent revolutions in Tunisia, Egypt, and Libya. Leetaru also says he was able to foresee stability in Saudi Arabia (at least through May 2011) and to retroactively estimate Osama Bin Laden’s likely hiding place within a 200-kilometer radius.

Whereas the initial Culturomics (1.0) studies focused on the frequency of particular sets of words in digitized books, Leetaru says that mere frequency isn’t enough to yield real-time, immediately useful information that reflects the modern world.

Shedding the word frequency element that defined version 1.0 of Culturomics, Leetaru set out to take deep analytics to a new level, moving past frequency altogether and instead sharpening the focus on tone, geography, and the associations these two factors produce.

The project received funding from the National Science Foundation and was managed in part by the University of Tennessee’s Remote Data Analysis and Visualization Center (RDAV) and the National Institute for Computational Sciences (NICS). Leetaru was granted time on the large shared memory supercomputer Nautilus as part of the Extreme Science and Engineering Discovery Environment (XSEDE) program.

Leetaru says using a large shared memory system like Nautilus was the key to achieving his research goals. The 1,024-core Intel Nehalem, 8.2-teraflop system, with 4 terabytes of memory available for big data workloads, was manufactured by SGI as part of its UltraViolet product line. A system like this gives researchers more flexibility as they seek to apply vast computing power to analyzing “big data” in innovative ways.

Leetaru’s project is a textbook example of a data-intensive research problem. To arrive at his results, he needed to gather 100 million news articles stretching back half a century. From there, the process required a staged approach, which began with a data mining algorithm that extracted important terms (people, places and events) to create a base network of 10 billion nodes across the history of the news.
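The extraction stage can be sketched in miniature. The snippet below is a crude, illustrative stand-in: real systems use full named-entity recognition, and the regex and sample sentence here are invented for demonstration only.

```python
import re

# Crude proxy for term extraction: pull capitalized tokens from article
# text as stand-ins for people, places and events. A real pipeline would
# use proper named-entity recognition over 100 million articles.
def extract_terms(text):
    return set(re.findall(r"\b[A-Z][a-z]+\b", text))

print(sorted(extract_terms("Protests spread from Tunisia to Egypt in January")))
# ['Egypt', 'January', 'Protests', 'Tunisia']
```

Each extracted term becomes one node in the base network; at full corpus scale, this is where the 10 billion nodes come from.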

With a mere 10 billion elements left following extraction, Leetaru next set about finding the relationships that connected those nodes, building a second network on top of the first. When this was complete, he was left with a total of 100 trillion relationships, yielding a network roughly 2.4 petabytes in size.
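In miniature, building such a relationship network amounts to counting which terms appear together. The toy articles below are invented for illustration; the real corpus held 100 million of them.

```python
from itertools import combinations
from collections import Counter

# Hypothetical mini-corpus: each "article" reduced to its extracted terms,
# standing in for the output of the extraction stage.
articles = [
    {"cairo", "protest", "mubarak"},
    {"cairo", "protest", "tahrir"},
    {"tunis", "protest", "ben ali"},
]

# Count how often each pair of terms co-occurs in the same article;
# each pair becomes one edge ("relationship") in the second network.
edges = Counter()
for terms in articles:
    for a, b in combinations(sorted(terms), 2):
        edges[(a, b)] += 1

print(edges[("cairo", "protest")])  # 2
```

At 10 billion nodes, this pairwise counting is what explodes into 100 trillion relationships and petabytes of network data.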

Few machines have that kind of disk space, let alone memory, so to process the data Leetaru had to break the project into pieces. He would look carefully at key pieces, generating that portion of the network on the fly in shared memory to begin the process of refinement, a task he says wouldn’t have been possible without Nautilus or another large shared memory system.

With the connections established, Leetaru ran tools that sought out patterns, hunting for interesting differences in tone across countries and regions. Using 1,500 dimensions of analysis that fall under the banner of “tone mining,” which assigns a positive or negative score to words drawn from existing sentiment dictionaries, Leetaru was able to build a profile of deeper connections.
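A minimal sketch of dictionary-based tone scoring follows; the five-word lexicon and its weights are invented here, whereas real tone-mining systems draw on large sentiment dictionaries and many scoring dimensions.

```python
# Toy tone lexicon (illustrative weights only).
TONE = {"crisis": -2, "violence": -3, "hope": 2, "peace": 3, "growth": 1}

def tone_score(text):
    """Average tone of the words in the text that appear in the lexicon."""
    hits = [TONE[w] for w in text.lower().split() if w in TONE]
    return sum(hits) / len(hits) if hits else 0.0

print(tone_score("Hope for peace after crisis"))  # 1.0
```

Scoring every article this way, then aggregating by source and region, is what lets shifts in collective tone become visible over time.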

These variances in the tone of global news were matched with geographic mining efforts, which place the nodes and tones via an algorithm that seeks to determine which places the news sources are talking about. Leetaru explains that this is not a simple algorithm, since there are many cities called “Cairo” in the world. The algorithm must mine contextual references to nearby places or landmarks to assign the correct coordinates.
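The disambiguation idea can be illustrated with a toy gazetteer; the candidate entries, coordinates and context words below are hypothetical.

```python
# Hypothetical gazetteer: two candidate "Cairo"s, each with coordinates
# and nearby reference terms that can disambiguate a mention in context.
GAZETTEER = {
    "Cairo": [
        {"coords": (30.04, 31.24), "context": {"egypt", "giza", "nile"}},
        {"coords": (37.01, -89.18), "context": {"illinois", "mississippi"}},
    ],
}

def resolve(place, article_words):
    """Pick the candidate whose contextual neighbors best match the article."""
    words = {w.lower() for w in article_words}
    best = max(GAZETTEER[place], key=lambda c: len(c["context"] & words))
    return best["coords"]

print(resolve("Cairo", ["protests", "along", "the", "Nile", "in", "Egypt"]))
# (30.04, 31.24)
```

An article mentioning the Mississippi and Illinois would resolve the same name to the other candidate, which is the essence of the context-matching step Leetaru describes.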

The final element is the network analysis, or modularity finding, step. Leetaru takes his network and looks for groups of nodes that are more tightly connected to each other than to the rest of the network to find out how nations are related, an analysis that yields a well-defined set of seven “civilizations” on Earth. Building this kind of network requires taking every city and every article that has ever referenced it; each city then becomes a node with its own complex network of tones, meanings and potential for new findings.
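A simplified stand-in for this step is shown below: grouping nodes whose mutual ties exceed a threshold, using union-find over invented edge weights. Real modularity finding (e.g., the Louvain method) optimizes a global score over the full network; this toy version only illustrates the intuition of "more tightly connected to each other than to the rest."

```python
# Toy edge-weight map between country nodes (weights invented).
edges = {
    ("usa", "uk"): 9, ("uk", "france"): 8,
    ("china", "japan"): 9, ("usa", "china"): 2,
}

def communities(edges, threshold=5):
    """Union nodes joined by strong edges; weak cross-links stay cut."""
    parent = {}
    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            x = parent[x]
        return x
    for (a, b), w in edges.items():
        find(a); find(b)          # ensure both nodes are registered
        if w >= threshold:
            parent[find(a)] = find(b)
    groups = {}
    for node in parent:
        groups.setdefault(find(node), set()).add(node)
    return sorted(map(sorted, groups.values()))

print(communities(edges))
# [['china', 'japan'], ['france', 'uk', 'usa']]
```

The weak "usa"–"china" link is below the threshold, so the two clusters stay separate, the toy analogue of the boundaries between Leetaru's seven civilizations.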

With all of these stages in place, Leetaru says the possibilities are endless. One can watch change over time and create reproducible models, or go back to past events to see how closely a model would have predicted the actual outcome. In the full paper, Leetaru details some of his successes, showing how major crises have played out in recognizable patterns and offering researchers a chance to predict the future.

Leetaru pointed to the benefits of using the shared memory system Nautilus with the example that has generated a lot of buzz this week—that his methods led to a retroactive map that pinpointed Bin Laden’s location within 200 km.

“One of the beauties of using a large shared memory machine is that for example I could see an interesting pattern (like the Bin Laden portion where I was assuming there was enough information to pinpoint where he was hiding) and then begin exploring different techniques, including writing quick little Perl scripts that would wrap a small network on the fly actually and process that material and basically make a quick chart or table or map.”

He went on to note:

“With a large shared memory machine, you don’t have to worry about memory—I never had to worry about writing MPI code to distribute memory across nodes; it’s like it was infinite–with a quick script I could grab all locations that mentioned “Bin Laden” since he first started to appear in the news around 10 years ago, and map it over time or in different ways. It boiled down to writing easy Perl scripts, running in a matter of minutes—if I didn’t have all that memory it would have taken weeks or months with each iteration so one benefit is that leveraging that much hardware allows you to do simple things.”
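A rough Python analogue of the kind of quick script described above, with invented sample records (the real data came from the extracted, geocoded news archive):

```python
from collections import Counter

# Hypothetical records: (year, entity, (lat, lon)) tuples as might be
# pulled from the news network, grouped here by year for one entity.
mentions = [
    (2001, "Bin Laden", (34.5, 69.2)),
    (2001, "Bin Laden", (33.7, 73.1)),
    (2011, "Bin Laden", (34.2, 73.2)),
    (2011, "Mubarak", (30.0, 31.2)),
]

by_year = Counter(year for year, who, _ in mentions if who == "Bin Laden")
print(sorted(by_year.items()))  # [(2001, 2), (2011, 1)]
```

With the whole dataset resident in shared memory, each such ad hoc question is a few lines and a few minutes, which is the workflow advantage Leetaru credits to Nautilus.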

Leetaru says that even as an undergraduate at NCSA, working with some of the first iterations of web-scale web mining, he was fascinated by the possibilities of deep analytics. While his goal with the Culturomics 2.0 project was to forecast large-scale human behavior using global news media tone in time and space, along the way he stumbled upon a few unexpected findings, including that the news is indeed becoming “more negative” in general tone, and that the United States tends to favor itself in its own news coverage.

In this era of deep analytics harnessing real-time news and sentiment, Isaac Asimov’s Foundation series is never far from mind. For those who haven’t read the books: in a very small nutshell, mathematical formulas allow civilization to predict the future course of history…and madness ensues.

All arguments about potential for chaos or leaps forward for civilization aside, advances in analytics and high-performance computing like those produced on the Nautilus supercomputer have brought this series of classic science fiction tales into the realm of possibility.
