September 27, 2012
CAMBRIDGE, MA, Sept. 27 — Knome Inc. announced today that it is taking orders for the knoSYS 100, the first plug-and-play, fully integrated hardware and software system designed to help researchers in medical and academic institutions interpret human whole genomes. The knoSYS 100 was developed to help geneticists discover relevant genetic variation, investigate diseases of unknown cause, and create next-generation in silico gene tests. Units will begin shipping in Q4 2012.
Starting at $125,000, the knoSYS 100 is based on Knome’s big data informatics technology. The system will accept next generation sequence data from leading sequencers, including those sold by Illumina, Life Technologies, and Complete Genomics.
Breaking the genome interpretation bottleneck
The difficulty and cost associated with human genome sequencing have largely been addressed, with the cost of sequencing a whole genome expected to decline to under $1,000 in 2013. But it still takes a team of researchers weeks to months to annotate, compare, and interpret genome data. This slow pace and the lack of robust tools have significantly limited researchers’ ability to scale the process of interpreting human genomes.
With an average throughput of one genome per day, the knoSYS 100 eliminates the current informatics bottleneck in whole genome interpretation—matching the speed of today’s fastest sequencers.
“In the first half of this year, we saw the demand for genome interpretation surge as researchers in many of the world’s leading medical institutions started preparing for the broad utilization of whole genome interpretation for patient care,” said Martin Tolar, MD, PhD, Chief Executive Officer of Knome. “All of these institutions face the same issue—how to industrialize genome interpretation so that it is not only accurate, but fast.”
More than a dozen of the world’s top medical institutions have joined an early access program to pilot Knome’s genome interpretation technology, including: ARUP Laboratories, Cedars-Sinai Medical Center, Cincinnati Children’s Hospital, The Hospital for Sick Children (SickKids) in Toronto, Hyundai Cancer Institute at CHOC Children’s, University of Liverpool, and University of Verona.
An in silico genetic testing “lab in a box”
In addition to providing geneticists with query and visualization applications for conducting in-depth research into sets of whole genomes, the knoSYS 100 ships with tools and libraries that allow developers to create in silico gene tests that can be run at the push of a button.
“The advent of fast and affordable whole genome interpretation will fundamentally change the genetic testing landscape,” said George Church, PhD, professor of genetics at Harvard University and co-founder of Knome. “The genetic testing lab of the future is a software platform where gene tests are apps. This will shift genetic testing from a fixed, lengthy process to a rapid and highly dynamic one that makes full use of the data contained in the entire genome.”
Developers can use the tools and libraries included with the knoSYS 100 to replicate existing single-gene tests in software. They can also go further, creating next-generation superpanels that examine thousands of genes and incorporate artificial intelligence algorithms, deep reference data on protein interaction and expression, statistical functions, and the power of kindred, population, and tumor/non-tumor comparison.
“In silico superpanels allow hundreds of conditions to be tested simultaneously and open the door to the development of a new class of molecular diagnostics for complex, multi-gene disorders,” said Dr. Church. “Moving from a world of assays to apps will expand the definition of what a gene ‘test’ actually is, raising important questions but also presenting tremendous opportunities to help improve human well-being.”
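The release does not describe the knoSYS 100’s developer libraries in any detail, so the following is only a rough sketch of what a “gene test as an app” might look like: a minimal Python script that scans a whole-genome variant file (VCF) for variants falling inside a small gene panel. The panel coordinates, file name, and interface are illustrative assumptions, not Knome’s actual API.

```python
# Hypothetical illustration only: the knoSYS 100's real developer APIs are not
# documented in this release. This sketch shows the general shape of an
# "in silico gene test": scan whole-genome variant calls (VCF format) for
# variants that fall inside a small, illustrative gene panel.

# Illustrative panel: gene -> (chromosome, start, end). Coordinates are
# placeholders, not authoritative gene boundaries.
PANEL = {
    "BRCA1": ("17", 41_196_312, 41_277_500),
    "BRCA2": ("13", 32_889_611, 32_973_805),
    "TP53":  ("17", 7_571_720, 7_590_868),
}

def panel_hits(vcf_path):
    """Yield (gene, chrom, pos, ref, alt) for each variant inside a panel gene."""
    with open(vcf_path) as vcf:
        for line in vcf:
            if line.startswith("#"):        # skip VCF header and metadata lines
                continue
            fields = line.rstrip("\n").split("\t")
            chrom, pos = fields[0], int(fields[1])
            ref, alt = fields[3], fields[4]
            for gene, (g_chrom, start, end) in PANEL.items():
                if chrom == g_chrom and start <= pos <= end:
                    yield gene, chrom, pos, ref, alt

if __name__ == "__main__":
    # "genome.vcf" is a placeholder path to one whole genome's variant calls.
    for gene, chrom, pos, ref, alt in panel_hits("genome.vcf"):
        print(f"{gene}\t{chrom}:{pos}\t{ref}>{alt}")
```

A superpanel would extend the same filtering pattern to thousands of genes and add the layers this sketch omits: annotation against reference data, inheritance and population comparisons, and statistical scoring.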
As a demonstration of capability, the knoSYS 100 will include several superpanels for research into cancer, epilepsy, heart disorders, and other conditions.
The knoSYS 100 is an integrated hardware and software system constructed around Knome’s core big data informatics technology, which has been used since 2008 to interpret thousands of whole human genomes and exomes for medical, pharmaceutical, and academic research projects.
Because the knoSYS 100 is installed on-premises and maintained behind the client’s firewall, it is well suited to institutions that do not wish to send genome data to third parties or the cloud due to privacy, consent, or confidentiality concerns.
“There are close to 2,000 next-gen sequencers in labs around the world generating enormous amounts of data,” said Tolar. “Every one of those sequencers should have a knoSYS 100 right next to it. To further facilitate the application of genomics in patient care, we are investing over $50 million in R&D over the next several years. This is where we intend to make a lasting contribution to molecular-based, precision medicine.”
Knome Inc. is a leading provider of human genome interpretation systems and services. We help clients in two dozen countries identify the genetic basis of disease, tumor growth, and drug response. Designed to accelerate and industrialize the process of interpreting whole genomes, Knome’s big data technologies are helping to pave the way for the healthcare industry’s transition to molecular-based, precision medicine.