In 1965, Gordon E. Moore published the now-famous prediction that the number of transistors on a chip would double every year, a pace he later revised to once every two years. And despite widespread doubts about how long exponential growth could continue, Moore’s Law still holds true, roughly half a century later.

Now, though, industry leaders are beginning to fret about a challenge of a much larger order: improving processor speed, storage capacity and networking throughput all at once, fast enough to keep pace with the towering growth of data coursing across global networks.

Few understand just how big “big data” has gotten. Many of us probably still think a terabyte is a lot of data. Today, our digital universe holds about four zettabytes. To put that in perspective, while one terabyte can store roughly a million minutes (about two years) of compressed music, one zettabyte, a billion terabytes, can store roughly two billion years of it. By the end of the decade, we’ll be starting to use a unit that few people have ever heard of: the brontobyte — a billion exabytes — or about two quadrillion years of music.
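For readers who like to check the arithmetic, here is a quick back-of-the-envelope calculation in Python. The figure of roughly 1 MB per minute of compressed music is an assumption made for illustration, not an HP number.

```python
# Back-of-the-envelope check of the music comparisons above, assuming
# roughly 1 MB per minute of compressed audio (an illustrative figure).

MB_PER_MINUTE = 1
MINUTES_PER_YEAR = 60 * 24 * 365

TERABYTE = 10**12      # bytes
ZETTABYTE = 10**21
BRONTOBYTE = 10**27

def years_of_music(num_bytes):
    minutes = num_bytes / (MB_PER_MINUTE * 10**6)
    return minutes / MINUTES_PER_YEAR

print(f"{years_of_music(TERABYTE):,.1f} years per terabyte")      # about 2 years
print(f"{years_of_music(ZETTABYTE):,.2e} years per zettabyte")    # about 2 billion years
print(f"{years_of_music(BRONTOBYTE):,.2e} years per brontobyte")  # about 2 quadrillion years
```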

At an enterprise level, IT planners looking a decade or so ahead already recognize that the current mix of technologies will have trouble keeping pace with the exponential growth of big data.

For Hewlett-Packard (HP), the solution to this challenge has a deceptively simple name: “The Machine.” The name belies the ambitious scope of a technology development effort with few antecedents in company or industry history.

The Machine, a program announced in June 2014, aims to solve this problem by coordinating and advancing four emerging technologies in parallel, so that the swelling flow of data does not overwhelm conventional legacy technologies.

HP’s recipe:

  • System on a chip. It starts with replacing general-purpose processors with special-purpose cores integrated with memory and networking into a single chip package. Building on the foundations laid by HP’s revolutionary Moonshot microservers, this promises to slash the energy that conventional microprocessors require and chomp through huge amounts of data much more rapidly. It’s like having a toolbox full of specialized tools rather than a Swiss army knife — the right tools are faster and more efficient.
  • Memristors. Today, all our devices — from phone to supercomputer — constantly shuttle information between three layers of memory: what’s needed this instant (SRAM), what will be needed very soon (DRAM) and what may be needed later (storage). Memristors will be fast, dense and cheap enough to play both the “soon” and “later” roles at once, speeding up throughput by eliminating most of that back-and-forth. Critically, memristor memory is also “nonvolatile” — meaning that no electricity is needed to maintain the data. This massively reduces the energy required to store data and makes systems virtually immune to power cuts.
  • Photonics. Today, we use fiber optics to move data over long distances. To boost the throughput of information flowing between processor cores within data centers, HP is pushing ahead with optical links that relay bits via photons rather than electrons. This eliminates copper wires as the conduit and, with them, a big source of energy and space inefficiency. High-speed photonic fabrics — a term for the web of connections between processor cores — will allow unprecedented storage and computational resources to be marshaled under a radically simplified programming model, moving data between hundreds of thousands of optimized computing cores and exabytes of memristor storage.
  • OS. Hardware alone won’t solve this problem. The fourth piece of HP’s vision is a brand-new operating system that orchestrates the flow of data across these hardware advances. Virtually all software in existence is written to cope with the limitations of conventional architectures; with an operating system built for this new one, coders will be able to create applications that can manipulate and extract meaning from vastly larger data sets than is possible today, as sketched below.
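To make that programming-model shift concrete, here is a minimal sketch in Python (not HP’s software or API) of what byte-addressable, nonvolatile memory could mean for application code: state simply lives in a persistent, memory-mapped region and survives restarts, with no separate “save to storage” step. The file name and data layout are illustrative assumptions, and an ordinary memory-mapped file stands in for memristor memory.

```python
# A minimal sketch of application code on byte-addressable, nonvolatile memory.
# An ordinary memory-mapped file stands in for memristor storage here; this is
# an illustration, not HP's API.

import mmap
import os
import struct

REGION_PATH = "counter.bin"   # illustrative stand-in for a persistent memory region
REGION_SIZE = 8               # one 64-bit counter

# Create the backing region once. On a real nonvolatile-memory system this
# would simply be an allocation that survives power loss.
if not os.path.exists(REGION_PATH):
    with open(REGION_PATH, "wb") as f:
        f.write(b"\x00" * REGION_SIZE)

with open(REGION_PATH, "r+b") as f:
    region = mmap.mmap(f.fileno(), REGION_SIZE)
    try:
        # Read and update the counter with ordinary memory operations:
        # no file parsing, no serialization layer, no database.
        (runs,) = struct.unpack("<Q", region[:REGION_SIZE])
        region[:REGION_SIZE] = struct.pack("<Q", runs + 1)
        region.flush()        # on true nonvolatile memory this shrinks to a cache flush
        print(f"this program has now run {runs + 1} time(s)")
    finally:
        region.close()
```

The point is not the snippet itself but what it omits: there is no load phase, no save phase and no translation between in-memory and on-disk formats, which is the kind of simplification HP is betting its new architecture and operating system can deliver at scale.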

So what does this all mean?

If HP can deliver on these technologies — its timeline to do so reaches out through 2020 — the benefits will be enormous, with quantum leaps in performance and energy efficiency. The problem of having to build, and find electricity for, thousands of new data centers will effectively disappear. Computational tasks that today require government levels of funding and legions of data scientists will be within the reach of almost anyone.

As with all innovations that come with great promise, The Machine will face its ultimate test once it’s in the field. When it gets there, HP hopes it will address not only the critical business needs of today but, more importantly, the ones we will face tomorrow.