October 04, 2010
Mid-range and entry-level professional graphics solutions deliver superior price-performance and computational visualization capabilities to next-generation applications
SANTA CLARA, Calif., Oct. 4 -- NVIDIA announced today the expansion of its award-winning line of NVIDIA Quadro professional graphics solutions based on the NVIDIA Fermi architecture. The mid-range Quadro 2000, with 192 NVIDIA CUDA cores, and the entry-level Quadro 600, with 96 CUDA cores, now bring the computational and visualization benefits of the breakthrough Fermi architecture to all segments of the market.
The Quadro 2000 delivers 1.5 times the geometry performance of the previous Quadro graphics processing unit (GPU) mid-range solution and utilizes the new NVIDIA Scalable Geometry Engine technology to deliver dramatically higher performance across leading CAD and DCC applications such as SolidWorks and Autodesk 3ds Max.
"We believe the technology that NVIDIA has built into their new Quadro professional graphics, namely the new NVIDIA Fermi architecture, will provide an exceptional solution for SolidWorks users worldwide," said Nick Iwaskow, manager, Alliances, Dassault Systèmes SolidWorks Corp. "With our expected certification of the Quadro 2000, SolidWorks anticipates the newest Quadro solutions will empower designers and engineers with the finely detailed geometry, real-time simulation and analysis, and high visual fidelity they demand."
The new entry-level Quadro 600 is a flexible half-height solution that features the industry's best performance per watt for applications such as Autodesk AutoCAD 2011, and empowers professional designers to interact with models twice the size and complexity of those handled by previous entry-level solutions.
"At last month's GPU Technology Conference, the world learned all about the power of our Fermi architecture and how NVIDIA Quadro GPUs are being used to solve some of the world's most complex problems," said Jeff Brown, general manager, Professional Solutions Group, NVIDIA. "With these new Quadro solutions, we're making this computational horsepower available for all users of professional CAD and content creation software applications."
Both the Quadro 2000 and Quadro 600 feature 1GB of graphics memory and are compatible with the new NVIDIA 3D Vision Pro active shutter-glasses solution, providing powerful visualization and analysis in an immersive, high-quality stereoscopic 3D experience.
Designed, Built and Engineered by NVIDIA to the Highest Standards of Quality
Quadro professional graphics cards are designed and built by NVIDIA to provide industry-leading performance, reliability, compatibility and stability when running professional applications. Software companies such as Adobe, Autodesk, Dassault Systèmes and SolidWorks consistently certify Quadro professional graphics solutions for their users whose livelihoods depend on them.
The NVIDIA Quadro 2000 and Quadro 600 are built on industry standards, including OpenGL 4.1, DirectX 11, Shader Model 5.0, DirectCompute and OpenCL. They also leverage the NVIDIA CUDA parallel computing architecture, which enables dramatic increases in computing performance. Featuring 30-bit color fidelity (10 bits per color channel), these Quadro solutions enable the display of over a billion color variations for rich, vivid image quality with the broadest dynamic range. Both the Quadro 2000 and Quadro 600 are PCI Express 2.0 compliant and feature acoustics tailored for an ultra-quiet desktop environment.
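The "billions of color variations" figure follows directly from the bit depth: 10 bits per channel gives 1,024 levels each of red, green and blue, and every combination of the three is a distinct color. A quick check of the arithmetic (illustrative only, not NVIDIA code):

```python
# 30-bit color: 10 bits for each of the R, G and B channels.
bits_per_channel = 10
levels_per_channel = 2 ** bits_per_channel      # 1,024 shades per channel

# Every R/G/B combination is a distinct displayable color.
total_colors = levels_per_channel ** 3

print(f"{total_colors:,}")  # 1,073,741,824 -- just over a billion colors
```

By comparison, conventional 24-bit color (8 bits per channel) yields 256³, or about 16.8 million colors, which is why the jump to 30 bits is described as reaching into the billions.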
The newest line of Quadro GPUs leverages the CUDA parallel processing architecture and NVIDIA Application Acceleration Engines to enable the world's fastest performance across a broad range of applications. Additionally, these new solutions feature NVIDIA Mosaic Technology, which enables any application to utilize one or more Quadro professional graphics solutions to scale across up to eight high-resolution displays. Whether the application is CATIA, 3ds Max, PowerPoint or Google Earth, users simply hit the Maximize button and the application seamlessly spans all connected displays.
"As the world's leading workstation brand, Dell Precision workstations are designed from the ground up specifically for professional users who demand the ultimate in performance of their systems, graphics and ISV application integration," said Greg Weir, senior manager, Dell Precision Workstations Product and ISV Marketing. "NVIDIA Quadro GPUs, combined with our Dell Precision workstations, deliver on those expectations with exceptional value, superior performance, and broad application support for all segments of the market."
Availability and Pricing
The Quadro 2000 ($599 MSRP, USD) and Quadro 600 ($199 MSRP, USD) are available from leading global workstation manufacturers, including Dell, HP and Lenovo, as well as authorized distribution partners including: PNY Technologies in North America and Europe, ELSA in Japan, and Leadtek in Asia Pacific.
To learn more, visit www.nvidia.com/quadro.
NVIDIA (NASDAQ: NVDA) awakened the world to the power of computer graphics when it invented the GPU in 1999. Since then, it has consistently set new standards in visual computing with breathtaking, interactive graphics available on devices ranging from tablets and portable media players to notebooks and workstations. NVIDIA's expertise in programmable GPUs has led to breakthroughs in parallel processing which make supercomputing inexpensive and widely accessible. The company holds more than 1,100 US patents, including ones covering designs and insights which are fundamental to modern computing. For more information, see www.nvidia.com.
Source: NVIDIA Corp.