Intel’s drive to solidify its stranglehold on the datacenter via its Optane DC persistent memory architecture was on full display this week during the chipmaker’s “data-centric” event.
Navin Shenoy, general manager of Intel’s Data Center Group, touted the growing ecosystem emerging around Optane, including more than 50 OEMs, ISVs and cloud service providers. Among them is Google, the recipient of the first production Optane persistent memory modules last year. Since then, Google Cloud has been beta testing the technology.
Bart Sano (shown), Google Cloud’s vice president of platforms, said the cloud provider is applying Optane to big data workloads such as in-memory database applications—as is software giant SAP, whose flagship HANA database management platform is among the early adopters.
Among Google’s early users of Optane persistent memory in the cloud are non-relational database vendor Aerospike Inc. and Redis Labs. (Startup MemVerge emerged from stealth mode this week with a persistent-memory data storage product also based on Optane DC.)
Sano also said Google Cloud customers including consumer products giant Colgate-Palmolive and ATB Financial are currently testing Optane deployments.
For platforms such as SAP HANA, persistent memory is being promoted for its ability to hold large data sets in memory and to recover data quickly following upgrades or reboots. Intel unveiled its Optane DC persistent memory last August, promising a three-fold increase in system memory capacity over its previous-generation Xeon Scalable processor.
“Optane provides a credible alternative to DRAM, and I think that now customers don’t have to balance against operational efficiency versus cost,” Sano said Tuesday (April 2). He cited Optane’s “sheer capacity” to have large data sets in memory, at memory speeds, and the module’s data persistence to quickly recover from reboots or upgrades. “We cannot wait to deploy this wider in our global infrastructure,” he added.
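The recovery property Sano describes comes from applications accessing persistent memory directly through memory mappings rather than block I/O. A minimal sketch of that usage pattern follows; the file name and record layout are hypothetical, and an ordinary file stands in for persistent memory so the sketch runs anywhere. On a real Optane DC system the file would sit on a DAX-mounted filesystem, and a library such as Intel's PMDK would handle cache flushing to the media.

```python
import mmap
import os

# Hypothetical file standing in for a persistent-memory region.
PATH = "demo.pmem"
REGION_SIZE = 4096  # fixed-size mapped region for this sketch


def write_record(text: bytes) -> None:
    """Store data via direct loads/stores into a memory-mapped region."""
    with open(PATH, "wb") as f:
        f.truncate(REGION_SIZE)          # reserve the region
    with open(PATH, "r+b") as f:
        with mmap.mmap(f.fileno(), REGION_SIZE) as m:
            m[: len(text)] = text        # write in place, no block I/O path
            m.flush()                    # push the data toward the medium


def read_record(length: int) -> bytes:
    """Simulate recovery after a reboot: remap the region and read back."""
    with open(PATH, "r+b") as f:
        with mmap.mmap(f.fileno(), REGION_SIZE) as m:
            return bytes(m[:length])


write_record(b"in-memory row")
recovered = read_record(13)
assert recovered == b"in-memory row"
os.remove(PATH)
```

Because the data lives in the mapped region itself, a database restarting after an upgrade remaps the region and resumes, instead of reloading terabytes from storage—the fast-recovery behavior cited for Optane DC.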
Intel and Google rolled out the cloud implementation of the chipmaker’s Skylake processor in 2017. Sano said that deployment demonstrated that a “cloud platform is basically the fastest platform that you can introduce this technology.”
Early access to Cascade Lake and the Optane DIMMs followed last October.
This week, Google Cloud released two new virtual machine instances running on the new Intel second-gen Xeon Scalable (Cascade Lake) processors, one compute-optimized and the other memory-optimized. Sano said the former offers a more than 40 percent performance improvement over earlier generations of Google Cloud’s VM instances.
Meanwhile, the memory optimized VM is touted as providing the highest memory configuration—up to 12 terabytes of memory capacity and 416 virtual CPUs. “These [M2 instances] will be able to run virtually any scale of application,” Sano claimed.
Intel said this week its second-generation Xeon Scalable processor supporting Optane persistent memory also includes AI acceleration via DL Boost technology. These and other moves underscore the chipmaker’s sharpened focus on cloud and edge computing, along with “high-growth” workloads such as AI and the 5G wireless networks expected to transport huge amounts of machine-generated data.