“I Wanna Go Fast” – Ricky Bobby

Some of you may remember that line from Will Ferrell's 'Talladega Nights'. For a NASCAR driver, I'm sure an inclination for high speeds is part of the job description. I'll admit that, given the chance, I'd want to take a few laps around a NASCAR track.

So what does this have to do with Business Intelligence (BI)? Well, pretty much everything these days. Can't you hear it? If you can't, then you haven't been listening to your users. In every hallway, cubicle, and executive office there is the chatter of "I want to analyze my data, and I want to be able to get to it fast!" No one wants to wait for results to come up on the screen. I don't care if it is a report, a data dump, or a dashboard (and by that I mean a true dashboard – you know, the kind Mico Yuk would be proud of – not a report posing as one); our users are looking for results that appear in under a few seconds. To make matters more complicated, users want to be able to sift through their data, lots of data, without the burden of knowing how to formulate a query. Did I mention that they also want to change data filters, analyze the data the way they think, and change their analytical quests on the fly? All of this, and don't forget their mantra: "I wanna go fast!"

Cue the arrival of in-memory analytics. Is this the panacea to the BI conundrum, served up on a silver appliance? The concept is not necessarily new; we all know that reading data from memory is much faster than searching a physical disk, and in the world of BI that is where the money is – figuratively and literally.
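To put a rough number on that memory-versus-disk gap, here is a minimal, purely illustrative Python sketch (the file name and row count are arbitrary choices for the demo, not from any benchmark): it times summing a million values already sitting in RAM against re-reading the same values from a file on disk.

```python
import os
import time

# Purely illustrative: compare aggregating values already held in RAM
# against re-reading the same values from a file on disk.
# Note: the OS file cache can shrink the gap on repeated runs.
ROWS = 1_000_000
values = list(range(ROWS))  # data cached in memory

with open("demo_values.txt", "w") as f:
    f.write("\n".join(str(v) for v in values))

start = time.perf_counter()
in_memory_total = sum(values)  # no I/O: operates on RAM-resident data
mem_elapsed = time.perf_counter() - start

start = time.perf_counter()
with open("demo_values.txt") as f:
    disk_total = sum(int(line) for line in f)  # pays disk-read + parse cost
disk_elapsed = time.perf_counter() - start

print(f"in-memory: {mem_elapsed:.4f}s, from disk: {disk_elapsed:.4f}s")
os.remove("demo_values.txt")
```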

So why is it taking so long for us to get there? One problem is older systems: 32-bit operating systems can only provide up to 4 gigabytes (GB) of addressable memory, a pittance in the analytical world. Now, here we are over a full decade into the 21st century, and 64-bit OSes are making their way into our data centers, replacing old 32-bit servers. With the ability to provide up to 1 terabyte (TB) of addressable memory, it is now possible to cache large volumes of data in RAM. Can I get an "AMEN"? What's that? Oh, right, many of us still do not have 64-bit OSes on our desktops; we will take a look at how that affects this solution later.
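For the curious, that 4 GB ceiling is just pointer arithmetic: a 32-bit address can name at most 2^32 distinct bytes. A quick back-of-the-envelope check in Python:

```python
# The address-space arithmetic behind the ceilings mentioned above.
# A 32-bit pointer can distinguish 2**32 byte addresses; a 64-bit pointer
# could in theory reach 2**64, though OSes cap the practical limit far
# lower (the 1 TB figure above is a practical cap, not a 64-bit limit).
GiB = 2**30

addressable_32bit = 2**32       # bytes
print(addressable_32bit / GiB)  # 4.0 -> the 4 GB ceiling

one_terabyte = 2**40            # bytes of addressable RAM
print(one_terabyte / GiB)       # 1024.0 GiB
```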

But I digress. It is a fact that incredibly fast query times bring back the data faster, reducing the time a user has to wait for their report or dashboard. As the cost of RAM drops, in-memory analytics becomes more of a reality for even the most frugal of businesses. BI and data pundits have touted that in-memory analytics can reduce or eliminate the need for data indexing and for pre-aggregating data in cubes or tables (tell that to a DBA).
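As a rough illustration of that "no more pre-built aggregates" claim, here is a toy Python sketch (the sales rows and field names are made up for the example): a cube-style pre-aggregation can only answer the question it was built for, while RAM-resident rows can be grouped and filtered ad hoc.

```python
from collections import defaultdict

# Hypothetical mini data set; fields and values are invented for the demo.
sales = [
    {"region": "EMEA", "product": "A", "year": 2011, "amount": 120.0},
    {"region": "EMEA", "product": "B", "year": 2011, "amount": 75.0},
    {"region": "APJ",  "product": "A", "year": 2010, "amount": 200.0},
    {"region": "APJ",  "product": "B", "year": 2011, "amount": 50.0},
]

# Cube-style: one fixed pre-aggregation, computed ahead of time.
cube_by_region = defaultdict(float)
for row in sales:
    cube_by_region[row["region"]] += row["amount"]

# In-memory style: any ad-hoc question, answered at query time.
def ad_hoc(rows, group_by, predicate=lambda r: True):
    totals = defaultdict(float)
    for r in rows:
        if predicate(r):
            totals[r[group_by]] += r["amount"]
    return dict(totals)

print(dict(cube_by_region))                                   # the canned answer
print(ad_hoc(sales, "product", lambda r: r["year"] == 2011))  # a brand-new question
```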

Sounds great, right? I mean, if using in-memory solutions reduces IT costs and allows for faster implementations of BI and analytic applications, all while providing users the speed they desire, then it should be a no-brainer. After all, we hear the users (and ourselves) saying that getting results faster by shortening query times supports faster business decisions. That's the theory, anyway, and I will delve more into it in the next part of this blog, when we look at the in-memory solutions and take a peek behind the curtain – now where is that wizard…?
