October 13, 2011
Billions of tweets, Facebook updates, location-enabled applications, and web searches are generating an unprecedented amount of data “byproduct” that a growing number of businesses are mining in search of new insights, trends and sentiments. While predictive and real-time analytics hold enormous value for business, governments, as one might imagine, also see an opportunity to understand citizens far better than ever before.
According to a report this week by John Markoff in the New York Times, this past summer, an obscure government intelligence agency solicited ideas from the academic community about how it might be able to automatically “scan the Internet in 21 Latin American countries.”
This three-year experiment, which is slated to begin in April, would devise an automated data collection system that looks for patterns of “communication, consumption and movement of populations.” Rather vague, yes?
This “data eye in the sky” will use publicly available data to take the digital pulse of an entire region. In their view, this includes everything from IP traffic and web searches to more “easily” available sources like blogs and social media streams.
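To make the idea of taking a region’s “digital pulse” from public data a little more concrete, here is a minimal sketch of lexicon-based sentiment aggregation over a stream of posts. The word lists, sample posts, and function names are illustrative assumptions for this sketch, not details of the agency’s actual system, which has not been disclosed.

```python
# Minimal sketch: average daily sentiment over (date, text) posts
# using a tiny hand-built lexicon. Purely illustrative.
from collections import defaultdict
from statistics import mean

POSITIVE = {"good", "great", "happy", "improving"}
NEGATIVE = {"bad", "angry", "shortage", "protest"}

def score(text: str) -> int:
    """Return (#positive - #negative) word hits for one post."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def daily_pulse(posts):
    """Average sentiment per day for an iterable of (date, text) pairs."""
    by_day = defaultdict(list)
    for day, text in posts:
        by_day[day].append(score(text))
    return {day: mean(scores) for day, scores in by_day.items()}

posts = [
    ("2011-10-01", "great harvest this year prices improving"),
    ("2011-10-01", "fuel shortage sparks angry protest downtown"),
    ("2011-10-02", "good news on jobs people happy"),
]
print(daily_pulse(posts))
```

A real system of the kind described would of course operate at vastly larger scale and with far more sophisticated language models, but the basic shape — score individual messages, then aggregate over time and geography — is the same.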
This type of research has been in the news quite a bit over the last year. Stories have emerged about everything from mining Twitter for brand sentiment to using supercomputing resources to predict the future. What is different here is that this is no longer a branding-driven or academically oriented initiative; it is a project backed with public funds on behalf of a small agency that is refusing to comment on the scope of the analytics endeavor.
The group behind the effort, the Intelligence Advanced Research Projects Activity, is part of the Office of the Director of National Intelligence in the United States. As the NYT report claimed, the agency’s research would “not be limited to political and economic events, but would also explore the ability to predict pandemics and other types of widespread contagion, something that has been pursued independently by civilian researchers and by companies like Google.”
As the article’s author noted, there are potential privacy and more general logistical concerns involved. Markoff writes that “the ease of acquiring and manipulating huge data sets charting Internet behavior causes many researchers to warn that the data mining technologies may be quickly outrunning the ability of scientists to think through questions of privacy and ethics.”
Full story at New York Times