Hewlett Packard Enterprise (HPE) has just taken the wraps off what it claims is the world’s largest single-memory computer – a prototype from ‘The Machine’ research project.
This line of research is all about HPE’s memory-driven computing, an architecture specifically built to crunch through the ever-increasing quantities of big data floating around these days.
Of course, big data is a key factor across a range of industries, and HPE believes that when it comes to analytics, throwing more and more processors at vast sets of data is not the answer.
Rather, a different approach to computing is needed, built around memory rather than the CPU – thus eliminating the relatively clunky and inefficient manner in which the processor, memory and storage interact in today’s machines.
HPE’s prototype computer boasts some 160TB of memory spread across 40 physical nodes, and uses an optimised Linux-based operating system running on Cavium’s ThunderX2, a dual-socket capable ARMv8-A workload-optimised processor.
HPE says it’s capable of simultaneously working with all the data held in every single book in the Library of Congress – five times over. In other words, 160 million books – a truly immense dataset that can be held in this single-memory system.
Or it can hold 6,400 Blu-ray discs, for the less bookworm-ish.
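Those comparisons are easy to sanity-check with a quick back-of-envelope calculation. The Blu-ray and per-book sizes below are assumptions (roughly 25GB for a single-layer disc, and the implied average book size falls out of the arithmetic); only the 160TB and 160 million book figures come from HPE’s claims.

```python
# Rough sanity check on HPE's capacity comparisons.
# Assumptions: decimal units, ~25GB per single-layer Blu-ray disc.
TB = 10**12  # one terabyte
GB = 10**9   # one gigabyte

memory_bytes = 160 * TB      # the prototype's 160TB of memory
bluray_bytes = 25 * GB       # assumed single-layer Blu-ray capacity
books = 160_000_000          # five Libraries of Congress, per HPE

print(memory_bytes // bluray_bytes)  # Blu-ray discs that fit: 6400
print(memory_bytes // books)         # implied bytes per book: 1000000 (~1MB)
```

So the 6,400-disc figure checks out, and it implies an average of about 1MB of text per book.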
Scaling mountains of data
The company believes this architecture could easily scale from terabytes to exabytes, and indeed beyond that to a system with ‘near-limitless’ quantities of memory to draw on: 4,096 yottabytes is the figure HPE quotes as the sort of scale to expect in the future. A single yottabyte is a trillion terabytes, to give you some perspective there.
Mark Potter, CTO at HPE and Director, Hewlett Packard Labs, commented: “We believe Memory-Driven Computing is the solution to move the technology industry forward in a way that can enable advancements across all aspects of society. The architecture we have unveiled can be applied to every computing category – from intelligent edge devices to supercomputers.”
By Darren Allan
from Blogger http://ift.tt/2qsccw9