Is deep analytics finally a reality for the masses?
by Ben Rossi, InformationAge
The combination of in-memory technology with complex event processing reduces the need for pre-processing and for expensive connections to mainframe storage.
Until fairly recently, only financial-sector businesses with deep pockets could make use of truly huge amounts of diverse data in real time. Deploying streaming analytics and event-processing technology, banks and other institutions use complex algorithms and low-latency messaging to take split-second decisions as they trade in the markets.
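To make the idea concrete, here is a deliberately tiny sketch of the kind of in-memory event processing the article describes: a sliding window of recent prices held entirely in RAM, with a decision rule evaluated on every incoming tick. The window sizes and the crossover rule are illustrative assumptions, not details taken from the article.

```python
# Hypothetical sketch of in-memory event processing: keep a sliding window of
# recent prices in RAM and evaluate a decision rule on every new tick.
# Window sizes and the crossover rule are illustrative assumptions.
from collections import deque

class MovingAverageCrossover:
    """Emit 'buy'/'sell' signals when a short moving average crosses a long one."""

    def __init__(self, short_window=5, long_window=20):
        self.short = deque(maxlen=short_window)   # most recent prices, held in memory
        self.long = deque(maxlen=long_window)
        self.prev_diff = None

    def on_event(self, price):
        self.short.append(price)
        self.long.append(price)
        if len(self.long) < self.long.maxlen:
            return None                            # not enough history yet
        diff = sum(self.short) / len(self.short) - sum(self.long) / len(self.long)
        signal = None
        if self.prev_diff is not None:
            if self.prev_diff <= 0 < diff:
                signal = "buy"                     # short average crossed above long
            elif self.prev_diff >= 0 > diff:
                signal = "sell"                    # short average crossed below long
        self.prev_diff = diff
        return signal

# Feed a stream of ticks; decisions are taken tick by tick, entirely in memory.
strategy = MovingAverageCrossover(short_window=2, long_window=4)
for tick in [100.0, 101.0, 99.0, 98.0, 97.0, 99.0, 102.0, 104.0]:
    decision = strategy.on_event(tick)
    if decision:
        print(decision, tick)                      # prints: buy 102.0
```

Real trading systems are far more elaborate, but the pattern is the same: the working data never leaves memory, so each decision avoids a round trip to disk or a mainframe.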
Now, however, a confluence of factors means the same technology is being adapted for use in very different sectors, such as telecoms, manufacturing or retail, where individual businesses generate masses of data.
One of the key dynamics has been the falling price of parallel processing chips, which has coincided with a huge increase in their computational power, in line with Moore's law, which observes that the number of transistors on an integrated circuit doubles roughly every two years.
Similarly, the cost of memory has fallen markedly, so that it is no longer prohibitive for a business to place several terabytes of its data in RAM.
In simple terms, this means medium-sized businesses can now take on tasks that were previously out of reach and required big farms of servers.
These new opportunities have led to a rapid change in attitudes towards in-memory computing, since it now takes a few hours to load and index huge amounts of data that would previously have taken days. … Read the article
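The article does not show what such a load-and-index step looks like, so here is a rough sketch under assumed field names, with a small embedded CSV standing in for what would in practice be many gigabytes or terabytes: read the records once, hold them in RAM, and build a hash index so later queries avoid repeated scans of slower storage.

```python
# Illustrative sketch only: load records into RAM once, then index them so that
# subsequent queries are dictionary lookups rather than rescans of slow storage.
# The field names and the tiny embedded dataset are assumptions for the example.
import csv
from collections import defaultdict
from io import StringIO

RAW = """customer_id,region,amount
c1,uk,120.50
c2,de,75.00
c1,uk,33.10
c3,fr,210.00
"""

def load_and_index(csv_text):
    rows = list(csv.DictReader(StringIO(csv_text)))   # every row held in memory
    by_customer = defaultdict(list)                    # hash index: customer_id -> rows
    for row in rows:
        by_customer[row["customer_id"]].append(row)
    return rows, by_customer

rows, by_customer = load_and_index(RAW)

# Once indexed in RAM, per-customer aggregation is a lookup, not a file scan.
total_for_c1 = sum(float(r["amount"]) for r in by_customer["c1"])
print(total_for_c1)                                    # 153.6
```

The same pattern, scaled up across hundreds of gigabytes of RAM and many cores, is what turns a days-long batch job into one that finishes in hours.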
DCL: This is an intriguing article. The question is whether you believe it or not. Or how much of it you believe!