April 16, 2012
Alan Turing was famous for believing computers could act like humans. He devised what is now known as the Turing test, whereby a computer would supply responses to human questioning. If the questioner could not distinguish the answers from those of a real live person, the computer would have passed the test.
His premise was that since human thinking is logical and computers are based on logic, behavior should be computable. Nice idea. But it hasn’t quite panned out -- at least not yet. Even with recently developed AI-type technology such as automated customer service assistants, most people are aware when they are talking to a computer.
A recent article in Wired summed up the problem with Turing’s thinking:
That simplistic idea proved ill-founded. Cognition is far more complicated than mid-20th century computer scientists or psychologists had imagined, and logic was woefully insufficient in describing our thoughts. Appearing human turned out to be an insurmountably difficult task, drawing on previously unappreciated human abilities to integrate disparate pieces of information in a fast-changing environment.
But as the Wired piece suggests, the technology might finally be catching up to Turing. Recent advances in AI-type machines, like Google’s search engine and IBM’s Watson, point to how that might come about.
Current AI relies on connection and probability algorithms. These algorithms drive the language recognition found in both Google searches and IBM Watson’s DeepQA technology. The systems understand, to a degree, what a human is requesting and produce search results or the most likely answer to a Jeopardy question.
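To make the probability idea concrete, here is a minimal sketch in Python. It is not Watson’s DeepQA or Google’s ranking code; the clue and candidate pool are hypothetical, and the toy score simply measures how much of the question each candidate’s supporting evidence covers before picking the most likely answer.

```python
# Toy illustration of probabilistic answer ranking (not DeepQA or Google code):
# score each candidate answer by how strongly its supporting evidence overlaps
# with the words in the question, then return the most probable candidate.

from collections import Counter

def tokenize(text):
    """Lowercase the text and split it into simple word tokens."""
    return [w.strip(".,?!\"'") for w in text.lower().split()]

def score(question, evidence):
    """Fraction of question words that appear in the evidence passage."""
    q_words = set(tokenize(question))
    e_counts = Counter(tokenize(evidence))
    if not q_words:
        return 0.0
    return sum(1 for w in q_words if e_counts[w] > 0) / len(q_words)

def best_answer(question, candidates):
    """candidates: list of (answer, evidence_passage) pairs.
    Return the answer whose evidence best matches the question."""
    return max(candidates, key=lambda c: score(question, c[1]))[0]

# Hypothetical Jeopardy-style clue and candidate pool, for illustration only.
clue = "This mathematician proposed an imitation game to test machine intelligence"
candidates = [
    ("Alan Turing", "Alan Turing was a mathematician who proposed the imitation game, "
                    "later called the Turing test, to test machine intelligence"),
    ("Charles Babbage", "Charles Babbage designed the Analytical Engine, "
                        "an early mechanical computer"),
]
print(best_answer(clue, candidates))  # -> Alan Turing
```

Watson’s DeepQA follows the same basic structure, generating candidate answers and scoring them against many evidence features before ranking them by confidence; the word-overlap score above simply stands in for that machinery.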
However, those systems have to rely on their limited datasets, which confines their answers to particular domains. Neither Google nor Watson can supply ad lib responses like a human. But, as the Wired article points out, the ability to ingest large datasets and correlate the information seems to be the approach that could scale up.
Robert French, a cognitive scientist at the French National Center for Scientific Research, theorizes that a massive dataset could be the final key. This dataset would contain every memory, including olfactory, audio, visual and sensory data, from millions of people. Says French: “These data and the capacity to analyze them appropriately could allow a machine to answer heretofore computer-unanswerable questions.”
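As a rough illustration of what “analyzing them appropriately” might involve, the sketch below assumes a hypothetical corpus of recorded experiences stored as plain text and answers a question by retrieving the memory that correlates most strongly with it, using a simple cosine similarity over word counts. Any real system of the kind French imagines would be vastly richer than this.

```python
# Minimal retrieval sketch over a hypothetical corpus of text "memories":
# answer a question by recalling the stored memory that correlates most
# strongly with it, measured with cosine similarity over word counts.

import math
from collections import Counter

def vectorize(text):
    """Represent a piece of text as a bag-of-words count vector."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two word-count vectors."""
    common = set(a) & set(b)
    dot = sum(a[w] * b[w] for w in common)
    norm = (math.sqrt(sum(v * v for v in a.values())) *
            math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def recall(question, memories):
    """Return the stored memory most similar to the question."""
    q = vectorize(question)
    return max(memories, key=lambda m: cosine(q, vectorize(m)))

# Hypothetical fragments of recorded experience, for illustration only.
memories = [
    "the smell of fresh coffee in the kitchen on a cold morning",
    "waiting at the station while the evening train ran late",
    "the crowd cheering when the home team scored in the final minute",
]
print(recall("what does the kitchen smell like in the morning", memories))
```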
Recent advancements in big data, analytics, and language recognition are enabling the creation of much more intelligent machines than were possible even a few years ago. At the right scale, these technologies may indeed lead to systems that could pass the Turing test. While such machines would only be able to imitate human behavior, they would impact nearly every industry of the modern era.
Full story at Wired