Michio Kaku Sketches Technological Wonderland of the Future at SC12

By Ian Armas Foster

November 16, 2012

Imagine a world where a computer chip costs just a penny. Chips that cheap could be embedded anywhere and everywhere, including the wallpaper of your house. Instead of sitting home alone on a Friday night drinking yourself into a stupor, you could simply walk over to your wall and look up other people staring at their own walls, in search of a companion for the evening.

Dr. Michio Kaku, the celebrity physicist behind the New York Times bestsellers Physics of the Impossible and Physics of the Future, explored the implications of this smart wall and much more in his much-anticipated keynote address at Supercomputing 2012 (SC12) this week in Salt Lake City, where he discussed the huge role that high performance computing will play between now and the year 2100.

Since the 18th century, science and technology have been key to attaining wealth in this world, Kaku observed. When physicists figured out the laws of thermodynamics and were thus able to calculate the amount of energy and power one could derive from manipulating steam, the Industrial Revolution ensued. The steel mills and railroads that followed generated tremendous revenue, but after too much of that wealth was invested in railroads on the London Stock Exchange, the system ground to a halt in 1850.

Incidentally, in 1850 the Industrial Revolution was just getting underway in the United States. While part of that had to do with the relative youth of the country, an amusing part (in a historical sense, anyway) had to do with Britain’s flat refusal to let so much as a blueprint leave the country. It wasn’t until Francis Cabot Lowell returned to America with the technical specifications in his photographic memory that the revolution took off in the US.

Either way, by the time Maxwell’s equations and Faraday’s lines of force began paving the way for physicists to harness electricity and magnetism, the United States had clearly made up the deficit from its delayed Industrial Revolution. But once again, an unsustainable portion of the ensuing wealth was poured into one thing, in this case the utilities. As a result, the New York Stock Exchange crash of 1929 plunged the US into the Great Depression, Kaku noted.

As Kaku continued setting the historical scene, physicists then further manipulated the laws of electricity and magnetism to create machines that could add large numbers together simply by flipping little magnets. These machines were called computers. This led to a third expansion of wealth, a third misallocation of investment (this time in the housing market), and a third economic collapse.

This is an intriguing and relevant history for one paramount reason: the people in the audience listening to Dr. Kaku talk about the results of the first three technological revolutions will be the people responsible for the fourth. Kaku calls the upcoming 80 years an “era of high technology.” Some may call it the Information Revolution. Whatever the new era happens to be called, advances in supercomputing will drive it.

The benefits, as Dr. Kaku predicts them, are vast and can best be described in terms of the vocabulary that will become obsolete. Cars will be able to drive themselves, essentially eliminating the 30,000 auto accident deaths a year in the United States. As Kaku puts it, the term “car accident” will become passé. In fifty years, the word “traffic” may refer more to the 1960s musical group than to a bottleneck of automobiles.

Like the word “polio,” the word “tumor” could be relegated to a reminder of unpleasant times past, as smart toilets equipped with computer chips hooked into a supercomputing network analyze DNA for signs of cancerous cells. Destroying those cancerous cells individually through nanotechnology, rather than through brute-force chemotherapy, could become possible. Perhaps most impressively, MRIs could be conducted from a Star Trek-like tricorder, as chips extend magnetic fields from supercomputers so that they envelop a person the way a conventional MRI machine does.

Further, just as society accepts running water and electricity as facts of life that need not be mentioned, computers are likely to become a similar fact of life. As computer chips are imprinted onto almost everything, from walls to paper, to clothing, to contact lenses, the entire world becomes, in essence, one large, networked computer.

How will this all happen? Through a system of mass-producing computer chips in which each chip costs about a penny. While Kaku left it somewhat unclear how exactly that will happen (he is a string theorist, after all), it is clear that the path is not through silicon. Moore’s Law, the observation that the number of transistors on a chip doubles roughly every 18 months, is slowing down.
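
To get a feel for why that exponential matters, here is a minimal sketch of the compounding arithmetic behind Moore’s Law; the starting transistor count and the 18-month doubling period are illustrative assumptions, not figures from the keynote:

```python
# Illustrative Moore's Law arithmetic. The starting count and the
# doubling period are assumptions chosen for the example, not
# numbers from Kaku's talk.

def projected_transistors(start_count: float, years: float,
                          doubling_months: float = 18.0) -> float:
    """Project a transistor count forward under steady doubling."""
    doublings = (years * 12.0) / doubling_months
    return start_count * (2.0 ** doublings)

if __name__ == "__main__":
    start = 1e9  # roughly a high-end chip of the early 2010s
    for years in (5, 10, 20):
        print(f"after {years:2d} years: "
              f"{projected_transistors(start, years):.2e} transistors")
```

A decade of uninterrupted doubling multiplies the count by about a hundred, which is exactly the kind of growth that penny chips would require, and exactly what a slowing Moore’s Law puts in doubt.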

That notion led to possibly the most harrowing possibility Kaku brought up: Silicon Valley becoming something of a rust belt within the next 20 to 25 years. However, this should not be news to those in the know. As with previous technological shifts, businesses will have to adapt or be left by the wayside.

Maybe carbon nanotubes will take silicon’s place. Maybe that job falls to quantum computing. Either way, according to Kaku, the cheapening of these computing resources will lead to far greater automation of society’s needs.

Of course, with increased automation comes an anxiety that the machines will replace humans. To a certain extent they will, says Kaku, but not to the extent that many may fear. It is important to remember that computers at their core are highly intricate adding machines. So only those with jobs that are highly iterative and repetitive, accountants for example, may need to worry, he argues.
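
As a toy illustration of that “adding machine” point (my sketch, not an example from the talk), even multiplication reduces to nothing more than addition, doubling, and bit tests, which is how hardware shift-and-add multipliers work:

```python
def multiply_by_adding(a: int, b: int) -> int:
    """Multiply two non-negative integers using only addition,
    doubling, and bit tests (a software shift-and-add multiplier)."""
    result = 0
    while b > 0:
        if b & 1:      # low bit of b is set: include this copy of a
            result += a
        a += a         # doubling is itself just an addition
        b >>= 1        # step to the next bit of b
    return result

assert multiply_by_adding(1234, 5678) == 1234 * 5678
```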

The marketplace, as Kaku sees it, is shifting from a commodity-based system to one based on intelligence and creativity. Computer hardware, for example, can be mass-produced without much human intervention. Software cannot: producing it requires common sense, intuition, and creativity. Jobs that require those skills will persist, and for the most part they will demand a fair amount of higher education. Jobs that don’t, the most boring of desk jobs, will cease to exist, according to Kaku.

An audience member brought up an interesting point during the Q&A session: if we know that this upcoming information revolution will come to a head in 80 years or so, how do we avoid the bubble bursting once again? According to Kaku, the answer lies in changing investment rules to rein in reckless speculation.

Interestingly, the nature of the oncoming information revolution might itself help prevent such unsustainable growth. Today’s predictive analytics are far superior to those of four years ago and might have been able to warn investors that markets were overheating.
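
As a minimal sketch of what such a warning might look like (my illustration under assumed inputs, not a system described in the keynote), one could flag a price series whose latest return sits several standard deviations above its recent history:

```python
import statistics

def looks_overheated(prices: list[float], window: int = 12,
                     threshold: float = 3.0) -> bool:
    """Flag the latest period-over-period return if it lies more than
    `threshold` standard deviations above the prior `window` returns."""
    returns = [(b - a) / a for a, b in zip(prices, prices[1:])]
    if len(returns) <= window:
        return False  # not enough history to judge
    history, latest = returns[-window - 1:-1], returns[-1]
    mu = statistics.mean(history)
    sigma = statistics.stdev(history)
    return sigma > 0 and (latest - mu) / sigma > threshold
```

Real systems are of course far more sophisticated, but the principle is the one Kaku gestures at: cheap, pervasive computation makes it feasible to watch every market all the time.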

As SC12 wraps up, it is important to remember how key the HPC industry will be in advancing society throughout the next 80 years. Dr. Kaku was preaching to the choir here in his keynote speech, but those songs resonate with scientific and societal reality.
