September 30, 2010
IPv6 Plus updated PDAC and .NET Framework support also featured
AUSTIN, Texas, Sept. 30 -- Pervasive Software Inc., a global leader in embeddable data management and data integration software and in enabling revolutionary next-generation analytics, today announced the release of Pervasive PSQL v11 MC, with several new capabilities including multicore enablement and Internet Protocol version 6 (IPv6) support.
Pervasive PSQL v11 is optimized for multicore to allow for increased performance and scalability. With its ability to make more efficient use of commodity 2-, 4- and 8-core servers, Pervasive PSQL v11 provides throughput gains of up to four times that of earlier versions. Its performance benefits the full range of vendor applications that leverage the new version, whether or not those applications are multicore-optimized. The company also released updated versions of its Pervasive PSQL ecosystem products: Pervasive DataExchange v4, Pervasive Backup Agent v3 and Pervasive AuditMaster v7, all compatible with Pervasive PSQL v11.
"PSQL v11 is truly a parallel implementation that makes use of multicore processors and will accelerate most multiuser applications dramatically," said Dan Woods, CTO and Editor, CITO Research, a research and analysis firm focused on the needs of CIOs and CTOs.
"The massive performance increase from Pervasive PSQL v10 to Pervasive PSQL v11 exhibits a version-to-version increase in transactions per second that I cannot remember ever seeing in the history of database benchmarking," said Robin Bloor, chief research analyst and president, The Bloor Group and Founder, Bloor Research.
In addition, Pervasive PSQL v11 supports IPv6, which uses 128-bit addresses to exponentially increase the number of devices that can connect to the Internet. The new standard, which succeeds the 32-bit addressing of today's IPv4, is being phased in across several key markets, most notably Japan.
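The scale of the jump from 32-bit to 128-bit addressing can be illustrated with a short sketch using Python's standard-library ipaddress and socket modules (the example addresses are documentation addresses, not tied to any Pervasive product):

```python
import ipaddress
import socket

# IPv4 addresses are 32 bits; IPv6 addresses are 128 bits.
v4 = ipaddress.ip_address("192.0.2.1")      # RFC 5737 documentation address
v6 = ipaddress.ip_address("2001:db8::1")    # RFC 3849 documentation address

print(v4.max_prefixlen)  # 32  -> about 4.3 billion possible addresses
print(v6.max_prefixlen)  # 128 -> about 3.4 x 10^38 possible addresses

# Whether the local platform was built with IPv6 support at all:
print(socket.has_ipv6)
```

An application that is IPv6-ready, as Pervasive PSQL v11 is claimed to be, can accept connections from clients on either protocol during the transition period.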
The new version features expanded Pervasive Direct Access Components (PDAC) to help developers create Pervasive PSQL-based applications with ease using Embarcadero RAD Studio 2010 for Delphi and C++ developers. Pervasive also extended its ADO.NET data provider to embrace new features in .NET Framework 3.5 Service Pack 1 including Entity Framework version 1.0 support, to allow for seamless integration for developers who rely on the .NET Framework. Pervasive PSQL v11 now provides 64-bit ODBC connectivity on Windows platforms.
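As a rough sketch of what the 64-bit ODBC connectivity mentioned above looks like from client code, the fragment below builds a DSN-less ODBC connection string. The driver name "Pervasive ODBC Client Interface" and the database name "DEMODATA" are assumptions for illustration; verify the names actually registered on your machine:

```python
def pervasive_conn_str(server: str, dbname: str) -> str:
    """Build a DSN-less ODBC connection string.

    Driver and database names here are illustrative assumptions,
    not guaranteed to match a given Pervasive PSQL installation.
    """
    return (
        "Driver={Pervasive ODBC Client Interface};"
        f"ServerName={server};"
        f"DBQ={dbname};"
    )

conn_str = pervasive_conn_str("localhost", "DEMODATA")
print(conn_str)

# With the third-party pyodbc package installed, the string would be
# used roughly like this (not executed here):
#   import pyodbc
#   with pyodbc.connect(conn_str) as conn:
#       for row in conn.execute("SELECT @@VERSION"):
#           print(row)
```

A 64-bit client process must load a 64-bit ODBC driver, which is why the addition of 64-bit ODBC connectivity on Windows matters to vendors shipping 64-bit applications.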
"Pervasive PSQL v11 provides both comprehensive backwards-compatibility and future-proofing technology for our customers and partners," said Gilbert Van Cutsem, general manager of database products at Pervasive. "Pervasive database products anticipate our customers' needs and provide them with continuity of existing platforms, operating systems and applications, plus support for new breakthrough hardware and software, so they can evolve and adopt technologies at their own pace. Our new multicore-optimized performance should benefit all our partners as multicores proliferate, with particular value for ISVs who want to provide their applications in SaaS or hosted delivery models."
"As commodity multicore chips proliferate, ISVs who are not prepared may learn from customers that their application performance actually degrades when they move from a single-core to a 2-, 4- or 8-core machine," said Richard Shalowitz, senior vice president and general manager, Real Estate and Property Management at SS&C Technologies, Inc. "SS&C's SKYLINE customer base is one of the most sophisticated property management groups in the country. Technically, they are particularly savvy about getting the performance they need, whether on dual-, quad- or even eight-way processors. Pervasive PSQL v11 will be very popular with our user base, delivering enhanced performance as they inevitably adopt multicore hardware platforms."
The new Pervasive PSQL v11 capabilities include multicore-optimized performance, IPv6 support, expanded PDAC support for Embarcadero RAD Studio 2010, an updated ADO.NET data provider for .NET Framework 3.5 Service Pack 1 with Entity Framework v1.0 support, and 64-bit ODBC connectivity on Windows.
About Pervasive Software
Pervasive Software (NASDAQ: PVSW) helps companies get the most out of their data investments through agile and embeddable software and cloud-based services for data management, data integration, B2B exchange and analytics. The embeddable Pervasive PSQL database engine provides robust database reliability in a near-zero database administration environment for packaged business applications. Pervasive's multi-purpose data integration platform, available on-premises and in the cloud, accelerates the sharing of information between multiple data stores, applications, and hosted business systems and allows customers to re-use the same software for diverse integration scenarios. Pervasive DataRush is an embeddable parallel dataflow platform enabling data-intensive applications such as claims processing, risk analysis, fraud detection, data mining, predictive analytics, sales optimization and marketing analytics. For more than two decades, Pervasive products have delivered value to tens of thousands of customers in more than 150 countries with a compelling combination of performance, flexibility, reliability and low total cost of ownership. Through Pervasive Innovation Labs, the company also invests in exploring and creating cutting edge solutions for the toughest data analysis and data delivery challenges. Robin Bloor, chief research analyst and president, The Bloor Group and Founder, Bloor Research, recently cited Pervasive as one of the "10 IT Companies to Watch in 2010." For additional information, go to www.pervasive.com.
Source: Pervasive Software Inc.