Harnessing hyperscale is no longer the exclusive domain of digital giants such as Google® and Amazon®. Innovative utilities are capturing every moment in the lives of millions of digital prosumers to find new sources of revenue and deliver personalized services. Meanwhile, in the pursuit of zero waste, consumer products companies that sell perishable and temperature-sensitive goods are remotely monitoring complex production lines, transportation fleets, and cooling units at thousands of locations in real time, so they can promptly detect performance anomalies and act before food is contaminated or spoiled.

Enabling these business transformations requires intelligent applications and analytic solutions that capture, process, and analyze enormous volumes of diverse data coming from inside and outside a company’s walls. Data is at the core of all of these innovations and must be processed instantly to support everything from anomaly detection and outcome prediction to in-process decision making and intelligent automation.


The In-Memory Difference

In-memory computing has been widely recognized as essential to powering this new breed of solutions. By keeping data close to the processor instead of locking it into traditional disk-based storage, in-memory data management eliminates I/O bottlenecks and reduces processing latency by several orders of magnitude. It also allows fast execution of complex operations, such as scoring machine learning models or combining advanced analytics with transactions.
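The effect can be illustrated even at a small scale. The following Python sketch is only a rough illustration, not an SAP benchmark, and the exact gap depends on hardware and file-system caching: it aggregates a column of values that already sits in memory, then repeats the same aggregation after reloading the data from disk.

    import os
    import time
    import numpy as np

    # Rough illustration (not an SAP benchmark): aggregate a column that is
    # already resident in memory, then re-read it from disk before aggregating,
    # which adds the I/O and deserialization cost that in-memory systems avoid.
    values = np.random.rand(10_000_000)            # ~80 MB "column" of measurements
    np.save("column.npy", values)                  # disk-resident copy

    t0 = time.perf_counter()
    in_memory_total = values.sum()                 # operate on RAM-resident data
    t1 = time.perf_counter()
    from_disk_total = np.load("column.npy").sum()  # reload from disk, then operate
    t2 = time.perf_counter()

    print(f"in-memory aggregate:   {t1 - t0:.4f} s")
    print(f"reload-then-aggregate: {t2 - t1:.4f} s")
    os.remove("column.npy")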

Today, however, some companies cannot find hardware configurations with enough memory capacity to store all of their data. For many, budget constraints make it difficult to justify in-memory data management for warm data, that is, data that is accessed less often but is still needed for historical context, such as trend analysis. The current memory technology, DRAM, provides very fast data access but offers limited capacity and costs more than other storage technologies, such as solid-state drives (SSDs). DRAM is also volatile: it loses its data when the power is turned off. As a result, businesses often have no choice but to opt for multi-tier data storage solutions, sacrificing data processing speed for lower costs.


Ushering in the Next Generation of In-Memory Computing

Hardware innovations such as Intel® Optane™ DC persistent memory are going to transform the current data storage hierarchy and allow more data to be processed at in-memory speed. Persistent memory is non-volatile, so it preserves its content even when the power goes off, and it is almost as fast as DRAM while storing more data at a lower TCO. That translates into more than 1,000 times lower latency for delivering data to each CPU core’s registers than a classic SSD.
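From an application’s point of view, persistent memory in App Direct mode is typically exposed as a DAX-mounted file system whose files can be memory-mapped and accessed with ordinary loads and stores, bypassing the block I/O stack. The short Python sketch below illustrates the idea; the mount point /mnt/pmem0 and the file name are assumptions for illustration only.

    import mmap
    import os

    # Minimal sketch, assuming persistent memory is exposed in App Direct mode
    # as a DAX-mounted file system (the /mnt/pmem0 mount point is hypothetical).
    # Memory-mapping a file on such a mount gives the application direct
    # load/store access to the persistent media, without block I/O.
    PMEM_FILE = "/mnt/pmem0/example.dat"   # hypothetical path on a DAX mount
    SIZE = 4096

    fd = os.open(PMEM_FILE, os.O_CREAT | os.O_RDWR)
    os.ftruncate(fd, SIZE)

    with mmap.mmap(fd, SIZE) as pmem:
        pmem[0:5] = b"hello"   # ordinary stores into the mapping...
        pmem.flush()           # ...made durable with an explicit flush (msync)

    os.close(fd)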

With SAP HANA, SAP has been at the forefront of the in-memory data management revolution for many years. With more than 23,000 customers, SAP HANA is powering core business processes and transformational innovations across all major industries. The latest release, SAP HANA 2.0 SPS 03, adds native support for persistent memory, and the benefits are significant. Not only can customers store more data in memory for faster processing, they also benefit from shorter start-up times, since data is already available in persistent memory and does not have to be reloaded from I/O-constrained storage.
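As a hedged sketch of what enabling this can look like in practice, persistent memory usage can be switched on per column table over SQL, shown here with SAP’s hdbcli Python driver. The host, credentials, and the SALES_HISTORY table are placeholders, and the exact SQL clause and the related global.ini settings should be verified against the SAP HANA 2.0 SPS 03 documentation for your system.

    # Hedged sketch: ask SAP HANA to place one column table's data in
    # persistent memory. Connection details and table name are placeholders.
    from hdbcli import dbapi

    conn = dbapi.connect(address="hana-host", port=39015,
                         user="SYSTEM", password="********")
    cur = conn.cursor()

    # Turn persistent memory on for this table; verify the clause against the
    # SAP HANA 2.0 SPS 03 SQL reference before using it in production.
    cur.execute("ALTER TABLE SALES_HISTORY PERSISTENT MEMORY ON IMMEDIATE CASCADE")

    cur.close()
    conn.close()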

By the numbers, the gains are also impressive. For example, an internal benchmark of SAP HANA using Intel® Optane™ DC persistent memory, compared with a traditional DRAM-and-SSD persistent disk configuration holding 6 terabytes of data, showed a 12.5-times improvement in SAP HANA start-up time. Intel® Optane™ DC persistent memory will also expand total memory capacity to more than 3 terabytes per CPU socket, enabling SAP HANA to better optimize workloads by keeping larger amounts of data closer to the CPU.

With higher capacity and improved start-up time, SAP and Intel® are removing the barriers to fast, always-on data processing[1]. As Lisa Davis, Intel Data Center VP & GM of Enterprise & Government, put it: “Intel and SAP are proud to launch a revolutionary change for SAP HANA. SAP HANA 2.0 SPS03 & Intel® Optane™ DC persistent memory will help deliver a lower TCO through larger in-memory capacity, faster start times & simplified data tiering while moving more data closer to the processor for faster time to insights. After a multi-year collaboration, we are excited to see how this significant technology advancement will accelerate SAP’s customers’ journey to the intelligent enterprise.”

The combination of persistent memory and in-memory data management is poised to transform the price/performance ratio of next-generation compute and storage environments, paving the way for faster, cheaper, and bigger hyperscale systems to process data. With data processing capacity radically improved and the ability to store and retrieve large volumes of data as rapidly as needed, business innovation can thrive.

Further References


[1] Intel® Optane™ DC persistent memory is available today for early-adoption testing, with production shipping to select customers later this year and broad availability in 2019.



Source: https://blogs.saphana.com/2018/06/12/harnessing-hyperscale-processing-more-data-at-speed-with-persistent-memory/
