When I started my career in I.T., one of the first computers I encountered was a Xerox Sigma mainframe that filled a room and played the “Stars and Stripes” tune when it was in idle mode. This large-scale computer of its day had a massive 8K of memory.
In the last three decades, we have moved from Kilobytes (KB) to Megabytes (MB) to Terabytes (TB) of memory. This is hard to fathom, since it seems like only yesterday that a Terabyte of disk space was huge, and now we are talking about Terabytes of memory. Well, maybe not all of us, but certainly the big BI vendors are talking about it a lot.
At this year’s Sapphire event, SAP made a lot of noise about its new High Performance Analytical Appliance (HANA) and the in-memory database it uses that enables real-time business intelligence. While many potential uses were discussed, one really struck me as both poignant and cool at the same time. It was an energy company in the U.K. that was using HANA to analyze electrical usage patterns for its residential and commercial customers. The company had recently moved to electronic meter readings recorded every four hours, generating millions of data readings that could be compared against previous readings over many days. As a result of this analysis, one customer was seen to be using 30% more electricity over a 12-hour period than in any other 12-hour period in the previous month. This triggered an alert, and when the company called, the customer turned out to be a bakery that had accidentally left an oven on overnight!
This energy company is also talking about providing customer-facing business intelligence that would allow its residential customers to compare their energy usage with that of neighbors in the same street, or of comparable houses, so they can see whether they may be using too much.
Real-time business intelligence is not practical for every situation, but this example shows that the ability to house large amounts of data in memory with high-speed appliances like HANA can enable valuable analysis in time frames not previously possible.
So as we move into this brave new world of Terabyte memory appliances, what’s next? Well, a quick Google search showed me there is the Petabyte (1024 Terabytes), followed by the Exabyte (1024 Petabytes), followed by the Zettabyte (1024 Exabytes), and then the Yottabyte (1024 Zettabytes, or 1,208,925,819,614,629,174,706,176 bytes). Just to put this in perspective, Apple cites that the entire contents of all U.S. academic libraries could be stored on just 2 Petabytes.
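If you want to sanity-check that ladder of units yourself, a few lines of Python will do it. This is just an illustrative sketch using the 1024-based convention described above (the unit list and function name are my own, not from any standard library):

```python
# Each step up the ladder is a factor of 1024 (2^10),
# following the 1024-based convention used in this post.
UNITS = ["Kilobyte", "Megabyte", "Gigabyte", "Terabyte",
         "Petabyte", "Exabyte", "Zettabyte", "Yottabyte"]

def bytes_in(unit):
    """Return the number of bytes in the named unit."""
    return 1024 ** (UNITS.index(unit) + 1)

for u in UNITS:
    print(f"1 {u} = {bytes_in(u):,} bytes")

# The last line printed confirms the figure above:
# 1 Yottabyte = 1,208,925,819,614,629,174,706,176 bytes
```

Since Python integers have arbitrary precision, even the 25-digit Yottabyte count comes out exact.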
Will any of us be around to see those Yottabyte memory chips? Maybe.