December 09, 2009
Computer games and TV account for bulk of information consumed in 2008
SAN DIEGO, Dec. 9 -- U.S. households consumed approximately 3.6 zettabytes of information in 2008, according to the "How Much Information? 2009 Report on American Consumers," released today by the University of California, San Diego. One zettabyte is 1,000,000,000 trillion bytes, and total bytes consumed last year were the equivalent of the information in thick paperback novels stacked seven feet high over the entire United States, including Alaska.
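For a rough sense of scale, the headline figure can be translated into per-household and per-person terms. The sketch below is a back-of-the-envelope check, not a calculation from the report; the household and population counts are assumptions chosen for illustration only.

```python
# Back-of-envelope breakdown of the 3.6 zettabyte headline figure.
# Household and population counts are assumptions for illustration;
# the report's own methodology may use different bases.
ZETTABYTE = 10**21                    # 1 ZB = 1,000,000,000 trillion bytes
total_bytes = 3.6 * ZETTABYTE         # reported 2008 consumption
households = 114e6                    # assumed number of U.S. households
population = 305e6                    # assumed 2008 U.S. population

per_household_day = total_bytes / households / 365
per_person_day = total_bytes / population / 365
print(f"~{per_household_day / 1e9:.0f} GB per household per day")  # ~87 GB
print(f"~{per_person_day / 1e9:.0f} GB per person per day")        # ~32 GB
```

The per-person result of roughly 32 gigabytes a day is consistent with the 34-gigabyte daily figure the report cites.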
The How Much Information? project is creating a census of the world's information in 2008. The study measured information consumed by U.S. consumers in and outside the home for non-work-related reasons, and covered the gamut of information sources: going to the movies, listening to the radio, talking on the cell phone, playing video games, surfing the Internet, and reading the newspaper, among other things.
"This report is a snapshot of what the information revolution means to the average American on an average day, who consumes 34 gigabytes and 100,000 words of information," said report author Roger Bohn, director of the Global Information Industry Center at UC San Diego's School of International Relations and Pacific Studies. "The total volume of 3.6 zettabytes consumed last year is much larger than previous studies have reported, partly because they measured different views of information, such as information creation rather than consumption. Also, nobody had looked at the role of computer games, which generate a staggering number of bytes."
The new report estimates that between 1980 and 2008, bytes consumed increased 350 percent, for an average annual growth rate of 5.4 percent. According to the report, the average American's information consumption of 34 gigabytes a day is the equivalent of about one fifth of a notebook computer's hard drive, depending on the model.
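The cited increase and growth rate are related by simple compound-growth arithmetic; the quick check below is illustrative rather than taken from the report.

```python
# Compound-growth check: a 350 percent increase over 1980-2008 corresponds
# roughly to the reported 5.4 percent average annual growth rate.
increase = 3.50                        # 350% increase => a 4.5x overall factor
years = 2008 - 1980                    # 28 years
annual_rate = (1 + increase) ** (1 / years) - 1
print(f"~{annual_rate:.1%} per year")  # ~5.5%, in line with the cited 5.4%
```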
Hourly statistics confirm that a large chunk of the average American's day is spent watching television. The new report estimates that on average 41 percent of information time is watching TV (including DVDs, recorded TV and real-time watching). American consumers watched 36 million hours of television on mobile devices each month -- a number that, while expected to grow, is a fraction of the hours spent watching television at home.
Based on bytes alone, however, computer games are the biggest information source, totaling 18.5 gigabytes per day for the average American consumer, or roughly 55 percent of all bytes consumed (18.5 of 34 gigabytes). Approximately 80 percent of the population plays some kind of computer game, including casual games such as Bookworm, Tetris and social networking games.
"Games are almost universal, but most of the gaming bytes come from graphically intensive games on high-powered computers and consoles, which have the equivalent of special-purpose supercomputers from five years ago," said Bohn. "Games today generate their bytes inside the home, rather than having to transmit them over cables into the house, but gaming is increasingly moving online."
Americans spent 16 percent of their information hours using the Internet (second only to TV's 41 percent). With the proliferation of email, instant messaging and social networking, the Internet today dominates two-way communications, accounting for more than 79 percent of those bytes each day.
Despite rapid growth, consumption of new media such as YouTube videos, text messaging and smartphone games is still outpaced by consumption of traditional media.
"There are several hundred million TV sets in the U.S., and depending on whom you ask, about 50 million smartphones," explained report co-author James Short, research director of the Global Information Industry Center. "And new media devices are increasingly personal devices -- mobile phones, Kindles and handheld gaming devices -- with small screens and relatively low resolution, limiting the number of bytes consumed."
Looking to the future, the report's authors point to current patterns of information consumption that will change the information landscape by 2015. In addition to the expected widespread use of HDTV, mobile television and video over the Internet have the potential to revolutionize where American consumers receive their visual information.
According to the study, the 3.6 zettabytes of total information used by Americans in their homes far exceeds storage or transmission capacity. For example, the total is roughly 20 times more than what can be stored at one time on all the hard drives in the world. Less than two percent of the total information was transmitted over the Internet.
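Taken at face value, the "roughly 20 times" comparison implies a figure for worldwide hard-drive capacity; the arithmetic below is illustrative and is not a number stated in this release.

```python
# Implied worldwide hard-drive capacity from the "roughly 20x" comparison
# (illustrative arithmetic only).
ZETTABYTE = 10**21
total_bytes = 3.6 * ZETTABYTE
implied_world_hdd = total_bytes / 20          # ~0.18 zettabytes
print(f"~{implied_world_hdd / 1e18:.0f} exabytes of worldwide disk capacity")
```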
"What is clear is that we consume orders of magnitude more information than can be stored on hard drives or transmitted over today's Internet," said Internet pioneer Larry Smarr, director of the California Institute for Telecommunications and Information Technology, a partnership of UC San Diego and UC Irvine. "Even small changes in how Americans consume information would have serious implications for network planners and require large-scale investments."
To allow comparisons with earlier studies, the UC San Diego report's authors devised mathematical formulas to convert all information statistics into words, bytes and the number of hours spent consuming information.
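This release does not reproduce those formulas, but the general approach -- multiplying time spent on an activity by an assumed data rate -- can be sketched as follows. The activity names and bitrates are assumptions chosen for illustration, not the report's actual conversion factors.

```python
# Minimal sketch of converting hours of an activity into bytes via an
# assumed data rate. Bitrates are illustrative assumptions, not the
# report's conversion factors.
ASSUMED_BITRATES_MBPS = {
    "standard_def_tv": 4.0,     # assumed SD broadcast stream
    "hd_tv": 12.0,              # assumed HD broadcast stream
    "web_browsing": 0.3,        # assumed average sustained rate
}

def hours_to_gigabytes(activity: str, hours: float) -> float:
    """Convert hours of an activity into gigabytes, given an assumed bitrate."""
    bits = ASSUMED_BITRATES_MBPS[activity] * 1e6 * hours * 3600
    return bits / 8 / 1e9

print(f"{hours_to_gigabytes('standard_def_tv', 5):.1f} GB for 5 hours of SD TV")
```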
This initial report focuses on the U.S. consumer sector (both inside and outside the home, including use of cell phones and movie-watching). Future reports will focus on information in the U.S. workplace, in other regions, and different types of information (such as machine-to-machine data that is analyzed automatically and may never be seen by human eyes).
The report, "How Much Information? 2009 Report on American Consumers", is available online and can be downloaded in PDF format at http://hmi.ucsd.edu/howmuchinfo.php. A quote sheet is available on request to firstname.lastname@example.org.
About How Much Information?
The HMI? project is a research consortium led by the University of California, San Diego. Funding for the project comes from industry sponsors AT&T, Cisco Systems, IBM, Intel, LSI, Oracle and Seagate Technology, with early support from the Alfred P. Sloan Foundation.
Source: UC San Diego