
What are your ideas for new business apps based on in-memory computing?

oliver_mainka
Active Participant
0 Kudos

When we look at the current set of SAP applications using in-memory computing we see a few themes: aggregating vast amounts of data in BWA (think: summing up a billion records in less than a second), parsing complex data structures of structured and unstructured (here: text) content in Enhanced Material Search, or quickly divvying up tens of millions of customer records in the CRM High Volume Customer Segmentation to find the most appropriate target group for a marketing campaign, all with drag and drop in an attractive UI.
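The aggregation theme above can be sketched in a few lines: with a column-oriented layout, summing one attribute scans a single contiguous array instead of touching every field of every record. The data and field names below are invented for illustration:

```python
# Illustrative sketch: column-oriented storage makes a full-column
# aggregate a tight scan over one array, instead of touching every
# field of every row. Data and field names are made up.

# Row layout: one dict per record.
rows = [
    {"customer": i, "region": i % 5, "revenue": float(i % 100)}
    for i in range(100_000)
]

# Column layout: one flat list per attribute.
revenue_col = [r["revenue"] for r in rows]

# Aggregating the column touches only the values it needs.
total = sum(revenue_col)
print(total)  # 4950000.0
```

In a real column store this scan runs over compressed, cache-friendly memory and is parallelized across cores, which is where the sub-second billion-row sums come from.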

Where the power of in-memory computing really makes a difference to the business user is in the ability to do many iterations of a search or a planning/simulation exercise in a short amount of time, thus being able to put the results into the business context of a meeting or a customer call (instead of having to push it to some background batch, which the business person will look at some time later). And this combined with the ability to process huge amounts of data.

Look at the [SAP video|http://www.sdn.sap.com/irj/scn/elearn?rid=/library/uuid/702a1f5f-c6cb-2c10-f29a-845465ad0b60] on "Real Real-Time Computing" to get a feeling for this.

In my job I am thinking about new kinds of SAP business applications which are made possible with in-memory computing technologies. Crawling the web for millions of opinions of a company's products (and their competitors')? Reading Smart Meter data and making decisions on energy management? Genome comparisons?

Dream with me: if you think of unlimited computing power and unlimited amounts of data which could be processed, what applications would invoke a radical shift in your or your colleague's business lives?

Or if you want to be closer to the ground: where in your job are you currently hampered by too slow data processing or too many limitations on accessing the data you need? And if you would get results in seconds, how would that alter the way you do business?

Accepted Solutions (0)

Answers (5)


former_member609706
Discoverer
0 Kudos

As of now HANA still has a lot of evolving to do, but based on the road map and information from various blogs, these are the new apps expected from SAP based on in-memory computing.

Apps and processes where HANA could be in demand:

• Enterprise performance management

• Predictive analysis

• Manufacturing and supply chain optimisation and scheduling

• Dock door scheduling

• Workforce scheduling

• Financial close and profitability analysis

• Enterprise data warehouse (EDW) and data mart extraction, transformation, and loading

• Cross-selling and up-selling offers

• Dynamic pricing

• Fact-based decisions

• Continuous payroll

• Continuous billing

• Real-time adjudication (healthcare)

• Same-day or real-time automated clearing house (banking)

• Batch run length optimisation (chemicals, oil & gas, discrete manufacturing)

• Demand response and "smart grid" applications

• Pricing and available-to-promise for complex engineer-to-order industries

Martin_Lauer
Product and Topic Expert
0 Kudos

Hello Oliver,

there are already some applications that run in memory.

For example: Workday

- all data in memory

- 3 tables

- object oriented approach

http://www.monash.com/uploads/Workday-August-2010.ppt

Kind regards, Martin

carlos_weffer
Participant
0 Kudos

Hi Oliver

Although I'm really tempted to join the discussion about in-memory databases and ER data models, I am going to restrict myself to the thread's subject: new business apps.

A few years ago I worked on a BW project for IS-U (utilities). It is amazing how time-consuming the batch processes in utilities OLTP systems are. I reckon all those processes, like billing and simulations, would have to be revisited and rethought based on the speed provided by in-memory computing.

As a customer of utility companies, I would love an app that provides my current consumption balance on the fly. I mean an app that shows the current amount to be paid if I don't turn off my LCD TV, and lets me see how that amount grows in real time while I consume energy. How nice it would be to see the direct impact on your bill and carbon emissions of turning a light off, rather than just waiting until the end of the month to find out how much your consumption (bill) is. I reckon a lot of people would be more conscious of their energy consumption patterns with such an application hanging on the front of their home fridge. Not to mention companies.
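At its core, this live-bill idea is a running sum over streaming meter readings. A toy sketch, with an invented flat tariff and made-up readings:

```python
# Toy sketch of a live bill: accumulate cost from streaming meter
# readings. Tariff and readings are invented numbers.

TARIFF_PER_KWH = 0.25  # assumed flat rate per kWh

def running_bill(readings_kwh):
    """Yield the cumulative amount owed after each meter reading."""
    total = 0.0
    for kwh in readings_kwh:
        total += kwh * TARIFF_PER_KWH
        yield total

# Four readings, e.g. one per 15-minute interval.
bills = list(running_bill([1.0, 2.0, 0.5, 4.0]))
print(bills)  # [0.25, 0.75, 0.875, 1.875]
```

A real implementation would of course apply time-of-use tariffs and taxes, which is exactly where an in-memory engine over the full rating rules would shine.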

Cheers,

Carlos Weffer

Former Member
0 Kudos

If I got this straight, in-memory will allow a reduction of the underlying ER model from tens of thousands of tables to only 2 (yes, two). If this is not a revolution in IT then I don't know what a revolution is... and if it's feasible then I'm voting for it with both hands.

Former Member
0 Kudos

I don't think it will simplify the underlying model; consider it a supplementary database to your main database. The sole purpose of this DB would be speed, for things like real-time transaction systems, warehouse reporting, and so on.

Since memory would be a constraint, this DB would usually hold only the relevant data: for example, only the past two years in a reporting system, or just the daily transactions in a real-time transaction system.

Compression is the key: the more you can compress the data, the more can fit in memory. The compression ratio largely depends on the underlying technology, but most in-memory databases, usually column-based, tend to achieve a fair deal of compression.
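As a rough illustration of that compression point (purely a sketch, not the actual algorithm of any product), run-length encoding collapses a sorted, low-cardinality column to a handful of runs:

```python
# Illustrative sketch only: run-length-encode a column into
# [value, run_length] pairs, as a columnar store might for
# sorted, low-cardinality data.

def rle_encode(values):
    """Compress a sequence into [value, run_length] pairs."""
    runs = []
    for v in values:
        if runs and runs[-1][0] == v:
            runs[-1][1] += 1      # extend the current run
        else:
            runs.append([v, 1])   # start a new run
    return runs

# A sorted "currency" column: few distinct values, long runs.
column = ["EUR"] * 6000 + ["USD"] * 3000 + ["JPY"] * 1000
encoded = rle_encode(column)

print(len(column), "values ->", len(encoded), "runs")  # 10000 values -> 3 runs
```

Ten thousand values shrink to three runs here; how close real data gets to that depends entirely on sort order and cardinality.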

oliver_mainka
Active Participant
0 Kudos

Hey Greg,

as Sam wrote: this technology is not about reducing the data model, it is about speeding up applications. I think that most often an app really benefits from in-memory computing when it is refactored to make use of the technical capabilities. As a simple example: the app "customer segmentation", which creates a marketing campaign target list, existed before, but each step to create that list took some time. Now we can slice and dice the customer list by customer attributes and get a response time of, say, 0.1 seconds, pretty much regardless of how many customers I have. We just showed this at Sapphire with a live example of 10 million customers. Now, with that fast response time, as an application developer I can think of creating a new drag-and-drop UI, which would not have been possible even with response times of one second. I will refactor the app to get to that new UI. By the same token I may also re-examine my data model or how the code is written. One option, for example, is to take pieces of the app code and let them run in the DB, without involving the app layer. You could do this, for example, for revaluations of financial postings in different currencies.
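A toy sketch of what such attribute-based slicing might look like: each drag-and-drop filter becomes a predicate applied over in-memory attribute columns. Column names and example data are invented; a real engine would evaluate this over compressed columns in parallel.

```python
# Hypothetical sketch of "slice and dice" segmentation over an
# in-memory, column-oriented customer table. Data is made up.

customers = {
    "id":      list(range(10)),
    "country": ["DE", "DE", "US", "US", "FR", "DE", "US", "FR", "DE", "US"],
    "revenue": [100, 2500, 300, 4000, 150, 900, 5000, 50, 3000, 700],
}

def segment(cols, *predicates):
    """Return ids of rows satisfying every predicate (a conjunctive filter)."""
    n = len(cols["id"])
    return [cols["id"][i] for i in range(n)
            if all(p(cols, i) for p in predicates)]

# Target group: German customers with revenue above 1000.
target = segment(
    customers,
    lambda c, i: c["country"][i] == "DE",
    lambda c, i: c["revenue"][i] > 1000,
)
print(target)  # [1, 8]
```

Because each filter is just another predicate, refining the target group interactively means re-running a scan, which is exactly the operation in-memory column stores make near-instant.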

Best, Oliver

oliver_mainka
Active Participant
0 Kudos

Hi Sam,

I am not sure whether memory is really a serious constraint anymore. Consider this:

a) the price of memory today is pretty much at the price point of hard disks in the year 2000 (at least according to a study I saw). In 2000 we happily ran business systems without worrying about the cost of hard disks... and that is where we are today with RAM. Sure, disks also became much cheaper, as did flash memory, but the speed benefits of RAM are so superior that it is worth investing in it.

b) 1 TB RAM blades are already available, and I think there is now a 2 TB RAM blade on the market (I heard about Samsung having one, and others may have one too)

c) we indeed routinely achieve compression factors of 10-20x. It all depends on how full the table field is and how homogeneous the content is. A "Currency" field compresses great; a UUID (Universally Unique Identifier) field does not.

d) customers typically only want to access this and last year's business data (and keep the rest just for reference)

So I think that it is quite possible to have the active business data of even the largest SAP customers completely in memory, at a reasonable price point. Sure, it is more expensive than disk... but who cares about disk anymore (apart from backup)?
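Point (c) above can be illustrated with a small dictionary-encoding sketch: a low-cardinality "Currency" field compresses well because its dictionary stays tiny, while a UUID field's dictionary grows as large as the column itself. This is a sketch of the general technique, not any product's actual implementation.

```python
# Sketch of why cardinality drives compression: dictionary-encode
# two columns and compare dictionary sizes. Data is synthetic.

import uuid

def dictionary_encode(values):
    """Replace each value with an integer code into a shared dictionary."""
    dictionary, codes = {}, []
    for v in values:
        codes.append(dictionary.setdefault(v, len(dictionary)))
    return dictionary, codes

currencies = ["EUR", "USD", "EUR", "JPY", "USD"] * 2000   # 10,000 values
uuids = [str(uuid.uuid4()) for _ in range(10_000)]        # 10,000 unique values

cur_dict, _ = dictionary_encode(currencies)
uid_dict, _ = dictionary_encode(uuids)

print(len(cur_dict))   # 3 dictionary entries: codes fit in 2 bits each
print(len(uid_dict))   # 10,000 entries: the dictionary is as big as the data
```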

Kind regards, Oliver

Former Member
0 Kudos

Oliver,

I think reducing the number of relations is the biggest promise of in-memory. Why maintain normalization if all you need is just one record described with hundreds of attributes that are now scattered throughout the ER model? Many of today's key relationships have been created to stage data in order to reduce the response time to the end user. With all the records in memory there is no need for staging anymore, is there? Again, as it's evangelized at HPI, everything can be handled by memory and code alone: no more ERP, CRM, SRM, SCM, PP, MM, etc. This may not be just around the corner, but rather a long-term goal of enterprise computing and reporting. Pie in the sky?

Former Member
0 Kudos

The advantage would be that you no longer need primary keys, indices, ...

The database schema will be simplified in the sense that you may no longer need to store the same data ten different times to achieve performance, and things like that.

But you still need the ER model that defines the business logic...

Former Member
0 Kudos

Oliver,

Memory is not a constraint as long as the operating cost is not.

Most mid/small-sized companies would be hesitant to invest in expensive hardware.

So if the application can compress data to a good extent, or utilize memory in a better way, similar performance can be achieved at a much lower cost. Check out the compression which InfoBright achieves, and you may also want to check MonetDB.

Former Member
0 Kudos

Hi Oliver,

It's a nice video, but what is the solution: is it BIA, BO Explorer, or something new that SAP is going to come out with?

Let me share my experience.

In the BI space I first saw this approach with QlikView; quite fascinated, I started exploring some alternatives. Most of the solutions tend to come at a cost for hardware plus extra licensing for the software.

Curious to implement it, I did a pilot implementation for a small-sized company. I created an Adobe Flex report over a BlazeDS/Tomcat engine on top of an in-memory database. I was astonished by the performance on hundreds of millions of records. But again, the memory available put a limitation on the size of the in-memory database.

Regards,