
SAP Predictive Analytics


Oxford Dictionaries defines smackdown as “A bitter contest or confrontation”.  I didn’t realize that the word “smackdown” originated from the world of “entertainment wrestling” and isn’t even 30 years old.  But this is the word that comes to mind whenever I talk to someone with a data science background about the topic of SAP Predictive Analytics’ automated machine learning algorithms.

 

Data scientists have a healthy amount of skepticism whenever we say we have a technology that can automate something that typically requires so much training, practice, and experience.  Can you blame them?

 

Automatic Vs Manual Transmissions

 

I’m not a data scientist, and statistically speaking, there’s a pretty good chance you aren’t one either.  There is an analogy that is quite appropriate here (if not completely boring) that most of us can associate with: automobile transmissions.

 

A manual transmission car relies on the human driver to engage the clutch properly and shift to the correct gear at the correct time.  This co-ordination requires practice, and some are not comfortable with it even after days or weeks of training.   There are others (like myself) that prefer a manual transmission even though it requires more work because it provides better feedback (ability to adapt), more control (flexibility), and in most cases better fuel economy (more efficient operation).

 

An automatic transmission uses a computer to measure various metrics (speed, RPM, throttle) and operates the clutch and gearbox on behalf of the driver.  Some prefer this because it works automatically without their intervention, training, or experience.

 

Can you mess up with an automatic transmission? Yup, although it is much harder to stall the vehicle or “bunny hop” the car by being in the wrong gear.  You can spot a person who knows how to drive a manual transmission by the extra special pain they feel when they hear someone "grind the gears" in a car.

 

Data Science – The Manual Transmission of Predictive

 

Ask a data scientist about predictive analysis and many times you will get either an extremely simple explanation (they are dumbing it down for you) or a highly theoretical one (they want to make sure you know it is complex).  Now that I’ve ticked off all the data scientists, let me cover myself by saying “both answers are right”.

 

Predictive modelling is pretty complex – the “secret sauce” is not so much which algorithm is used, but the intelligence the data scientist puts into the modelling process.  As humans, we have a semantic understanding of the data that can improve the predictive model and ultimately its predictive power.

 

For example, if we take all viewers in a movie theater, you likely would want to group people based on something like age or the type of relationship of the others in their party (i.e. parent, sibling, spouse), and then apply a different analysis for each group based on their common traits.  You wouldn't want to apply the same heuristics to siblings watching the same movie as you would for a couple on a date would you?

 

A good predictive model can take days, weeks, or even longer to build.  The kicker is that in the end what matters is the effectiveness of the model, not how long it took to create.  A data scientist will do a lot of analysis and iterate through a number of predictive algorithms, models, and variables before settling on the final model.

 

The “data science profession” is probably one of the most subjective occupations in the world – how do you know if you have a great data scientist that is extremely creative or a lazy one that follows a very formulaic process that could be taught to anyone?  Unfortunately it is not immediately obvious, but neither is a person who reached their destination by driving in second gear the whole time.  You'll eventually figure it out when the car gets to the destination but stinks of burning oil.

 

Automated Analytics – The Automatic Transmission of Predictive

 

Automatic transmission cars are very easy to drive because you simply need to understand the concepts of “Drive” and “Reverse” gears and away you go.  The automated predictive algorithms in SAP Predictive Analytics are definitely more complicated than that, but aim to provide the same level of ease – you need to understand the concepts of clustering and time series but do not actually have to know how they work.   This is what makes Automated Analytics so approachable to people without a data science background.

 

Data scientists typically scoff at these automated capabilities because they do not have the same level of visibility and control they are used to.  We all tend to be a bit suspicious of “magic black boxes” because we usually don’t have any way of determining how effective they are.  In a car, the RPM gauge and the sound of the car are the only indicators that gear shifting is working correctly.   For the majority of drivers, this is enough to operate the car and get to their destination.

 

Automated Analytics generates reams and reams of analysis to help data scientists understand the performance of the algorithms on a specific dataset, much like an uber-set of gauges.  However in keeping with the nature of “automatic”, there are some limitations on how much a data scientist can configure parameters.  Just like an automatic transmission car, you either like it, hate it, or tolerate it because it gives what you want in the end.

 

Expert Analytics – The Semi-Automatic Transmission of Predictive

 

SAP Predictive Analytics also includes Expert Analytics which is designed for data scientists to take advantage of any predictive technology they wish – including our automated predictive algorithms, the open source predictive language R, the SAP Predictive Analytics Library (PAL), and the SAP Automated Predictive Library (APL).   In Expert Analytics, the user is not tied to any one predictive technology or algorithm and in fact can create multiple algorithm chains in parallel and use the new Model Comparison feature in PA 2.2 to enable the system to advise which is the best predictive model to use.

 

A question I get a lot about Expert Analytics is how we position it against other predictive analysis tools from competitors.  That is a topic out of scope for this post, but consider what the purpose of a semi-automatic transmission is – give the driver the control and fun of gear shifting when they want while eliminating the less desirable requirements of a manual transmission such as using the clutch at the right time.   Expert Analytics is about getting you to your destination in the most efficient way, no matter whether you are letting the system do the shifting or if you want to step in and be more prescriptive about what happens when.

 

Smackdown Winner: You

 

I was with a customer last week who brought two data scientists and three data analysts to an all-day analytics workshop.   These meetings are usually a challenge because we have data analysts who want to do more predictive analysis but we also have data scientists who tend to be perfectionists around process (since this is the best way to control the quality of analysis).   Presenting Automated Analytics to the data analysts is always well received because it brings a new capability to them that does not require a PhD in Math or some intensely technical statistical training.    This is usually the point where the data scientists say that what they do cannot be automated and lots of arms get folded.

 

However in this case the data scientists quickly understood the value of others in the organization doing their own analysis for some of (what they consider to be) the simpler tasks so they could focus on the higher value projects where complex modelling is required.   One of them said the coolest (and in my opinion the most humble) thing I’ve heard a data scientist say:

 

“The business user knows more about the semantics of the data than I ever will.  They can sometimes better understand how the data should be used because they are solving a specific business question. So while I can create complicated predictive models, they may not be as efficient as simpler models that have more business meaning in them”.

 

The strategy of including auto-nodes in Expert Analytics is to provide data scientists with yet another tool in their spectrum of technologies they can use. So, (some) data scientists will recognize the value of using automated algorithms alongside their traditional techniques. They likely also will want to encourage the data analysts to use Automated Analytics because they can better solve their own problems and free up the data scientist to focus on more hardcore predictive problems that require them to hand-craft their models. 

 

Take SAP Predictive Analytics For a Test Drive

 

SAP Predictive Analytics includes both Automated Analytics and Expert Analytics in a single package so regardless of whether you are a business user or a data scientist, there’s something in there for you.  You can download a free trial of SAP Predictive Analytics here: SAP Predictive Analytics Trial Download

 

For more information, make sure you check the SAP Predictive Analytics community regularly.


On June 12th we formally released SAP Predictive Analytics (PA) 2.2 and it is on the SAP Service Marketplace (SMP) now!  For those of you who are not already using it, you can download a 30-day trial here.

 

What’s The Big Deal With PA 2.2? 

 

This is another big release for us with improvements and new features across the entire SAP Predictive Analytics portfolio. 

 

Instead of just listing the new features/functions, let’s take a look at how SAP PA 2.2 moves us forward in pursuing some of our core goals (note some features address multiple goals, but I’m keeping it simple here):

 

AA = Automated Analytics

EA = Expert Analytics

HANA = Native on SAP HANA

 

A better, smoother experience for data scientists AND business users:


  • (AA) Support for very wide datasets (up to 15K columns): Automatically handle very wide datasets to improve both the efficiency and effectiveness of your predictive models
  • (EA) Ability to share custom R and PAL components:  Enable other users to use your algorithms with ease.

 

Making data scientists more agile and efficient:

 

  • (EA) New Model Performance Comparison: Compare the performance of two or more algorithms and get a recommendation and detailed explanation for which one is the best to use.
  • (EA) New Model Statistics: Calculate performance statistics on datasets generated by classification and regression algorithms.
  • (EA) Support for R 3.1.2:  To make it possible to use the latest libraries
  • (EA) Support for multiple charts: Use more than one chart in your offline custom R components

 

Enabling customers to better leverage their existing data and investments:

 

  • (AA) Support for SAP HANA Views: Connect directly to SAP HANA Analytic and Calculation Views
  • (AA) Support for SAP BW on HANA: Use BW on HANA systems as a data source
  • (EA) Improved BW acquisition: easier and faster variable selection and handling of hierarchies
  • (HANA) Updated Automated Predictive Library (APL): Now includes automated recommendation

 

These are only the biggies - check out the What's New Guide for SAP Predictive Analytics 2.2 for these and more.

 

My colleagues Antoine CHABERT and Didier MAZOUE have created a comprehensive post (Frequently Asked Questions - Downloading, Installing and Activating) that is very well worth reading as well!

 

 

Flashback to a Key Feature of PA 2.1: Lumira Co-Existence!

 

The new advances in SAP Predictive Analytics 2.x this year have generated so much excitement that it has kept our teams extremely busy – so busy I did not have a chance to publicize one of the most important features of SAP PA 2.1: Lumira co-existence.

 

YES! You can install SAP PA 2.1 (or later) on the same machine as SAP Lumira 1.25 (or later).  This was a heavily requested feature and now you can even have Automated Analytics, Expert Analytics, and Lumira running all at the same time.  Note that you will need to uninstall previous versions of SAP Predictive Analytics before installing the new versions that include co-existence.

 

You can find out more in this article: Lumira + Predictive Co-Existence: Good News Never Comes Single Handed!

 

How To Get Started?

 

  1. Download the trial!
  2. Check out the online materials and tutorials:
  3. Participate in the SCN Community: SAP Predictive Analytics
    • Learn, ask questions, get answers!

 

So, What’s Next For Predictive Analytics?

 


Our development team is really hard at work on the next version already and we are nailing down 2.3, 2.4 and even further down the line, so keep the feedback coming!  The Legal People don’t let me give too many things away, but here are just a few things that we are working on (** Note: these items are in a state of planning.  These can change at any time and are not statements of commitment or of future features).

 

  • Bringing our automated machine learning predictive services to SAP HANA Cloud Platform (HCP)
  • Continuing our innovation on Hadoop and Spark to supercharge our unique Big Data capabilities
  • Even better integration with other SAP systems and landscapes, including SAP HANA, SAP BW, and SAP BI
  • Continued UX progression as we bring Expert and Automated Analytics experiences closer together.

 

Four months ago, I announced PA 2.0 (Introducing SAP Predictive Analytics 2.0!) and “predicted” 2015 would be a BIG year for SAP Predictive Analytics, and so far it has been a pretty wild ride.  I would encourage you to set up alerts for the SCN Predictive Analytics Community so that you are always up to date. Simply go to SAP Predictive Analytics and select "Start email notifications" in the right-hand "actions" menu.

 

Now, download SAP Predictive Analytics 2.2 and go predict something!

 

Ashish

SCN: Ashish Morzaria

Twitter: Ashish C. Morzaria (@AshishMorzaria) | Twitter

Many questions users ask on our community are related to the download, installation and activation of SAP Predictive Analytics.


The intention of this document is to provide answers to these questions. The FAQ will evolve over time and take into account new questions & feedback from our user community.


We hope this will prove useful for you. Enjoy SAP Predictive Analytics!


Antoine Chabert and Didier Mazoue


PS: Heartfelt thanks to all our contributors and reviewers!

 

 

General

What is SAP Predictive Analytics?

SAP Predictive Analytics is SAP’s powerful predictive analytics software and the successor of SAP InfiniteInsight and SAP Predictive Analysis.

 

SAP Predictive Analytics combines SAP InfiniteInsight and SAP Predictive Analysis into a single product.

 

In SAP Predictive Analytics, SAP InfiniteInsight is renamed to Automated Analytics and SAP Predictive Analysis is renamed to Expert Analytics.

 

Regarding the release version numbers:

  • The last released version of SAP InfiniteInsight is 7.0.1.
  • The last released version of SAP Predictive Analysis is 1.21.
  • The first released version of SAP Predictive Analytics is 2.0, the version that was just released is SAP Predictive Analytics 2.2.

 

What are the current names being used for the products? How do they map to the old names?

Former name → Current name:

  • SAP InfiniteInsight → Automated Analytics*
  • SAP Predictive Analysis → Expert Analytics*
  • Explorer → Data Manager
  • Factory → Model Manager
  • InfiniteInsight Authenticated Server → Automated Analytics Server

*Dedicated user interface in SAP Predictive Analytics.

 

Product Download

Does SAP offer a trial version of SAP Predictive Analytics?

Yes, we do.

 

Go to this page: http://scn.sap.com/community/predictive-analytics/blog/2015/03/20/sap-predictive-analytics-20-30-day-trial-now-available, click on the Download button, and start downloading a 30-day trial of SAP Predictive Analytics.

 

The releases proposed for trial are the Windows 64-bit and Windows 32-bit releases of SAP Predictive Analytics, desktop version.

 

The 30-day trial starts right after the software is installed.

 

Does the trial version correspond to the latest release?

Don’t worry, as soon as we release a new version, we update the trial release as well.

At the time of writing, the trial version offered is SAP Predictive Analytics 2.2.

 

As a SAP Predictive Analytics customer, I want to download the latest releases I am licensed for. Where should I go?

That’s easy.

 

First go to the SAP Service Market Place: https://support.sap.com/software/installations/a-z-index.html, click on the letter P (like Predictive), select the entry SAP Predictive Analytics.


You will see the different products:

 

  • Your first interest might go to SAP Predictive Analytics. SAP Predictive Analytics comes in two deployment modes, a client/server one and a desktop one. The client/server mode has two download packages codenamed PRED ANALYTICS CLIENT 2 and PRED ANALYTICS SERVER 2. The desktop mode is the download package code named: PRED ANALYTICS DESKTOP 2.


  • An important thing to note is that our Expert Analytics user interface is only part of the desktop deployment. It is not part of the client installer.

 

  • Once your first models are applied for business production purposes, you will need Model Manager to monitor the evolution of models over time and schedule model refreshes. Model Manager has the code name PRED ANALYTICS MODEL MGR 2 and was previously known as InfiniteInsight Factory. Note that Model Manager only connects to SAP Predictive Analytics Server.

 

 

For more details, please check our PDF guide: https://websmp110.sap-ag.de/~sapidb/012002523100009341912015E/pa22_architecture_spec_en.pdf

 

You can also refer to our help portal: http://help.sap.com/pa

 

I downloaded and installed the product. I noticed the product is already activated. Do I still need to apply for a license?

The installation is provided with a 30-day key that is set to expire.

You must apply for your license as soon as possible.

 

As a SAP OEM Partner, I would like to embed predictive capabilities into applications. What is available to me?

If you are a SAP OEM Partner and you want to embed the functionalities of SAP Predictive Analytics 2.2 (Automated Analytics) in another application you are building, we have a dedicated product edition for you.

 

Go to the SAP Service Market Place: https://support.sap.com/software/installations/a-z-index.html, click on the letter P (like Predictive), select the entry SAP Predictive Analytics OEM.


 

Product Installation

How many installations of SAP Predictive Analytics are supported on a single machine?

Only one installation of SAP Predictive Analytics desktop is supported on a single machine.

 

Is there a document describing the installation of SAP Predictive Analytics Desktop 2.2?

For a summarized view on the installation process, please refer to this useful SCN post: http://scn.sap.com/docs/DOC-64662.

The detailed installation guide is found here: http://service.sap.com/~sapidb/012002523100009343372015E/pa22_install_en.pdf

 

Is there a document describing the installation of SAP Predictive Analytics Client 2.2?

Please refer to the installation guide here: http://service.sap.com/~sapidb/012002523100009341942015E/pa22_client_install_en.pdf

 

Is there a document describing the installation of SAP Predictive Analytics Server 2.2?

Please refer to the installation guide here: http://service.sap.com/~sapidb/012002523100009342702015E/pa22_inst_server_win_en.pdf

 

Is there a document describing the installation of Model Manager 2.2?

Please refer to the installation guide here: http://service.sap.com/~sapidb/012002523100009343402015E/pa22_model_mgr_install_en.pdf

This blog post nicely complements the documentation: Installing and connecting to SAP Predictive Analytics Model Manager

 

Is there a document describing the installation of SAP HANA APL 2.2?

Please refer to the guide here: https://websmp104.sap-ag.de/~sapidb/012002523100009346632015E/pa22_hana_apl_user_en.pdf

A nice blog post is also here: How to Install the Automated Predictive Library in SAP HANA

 

I am a former SAP Predictive Analysis user. What is the installation process to move from SAP Predictive Analysis to SAP Predictive Analytics?

You need to uninstall SAP Predictive Analysis and then install SAP Predictive Analytics – pick the desktop version so that Expert Analytics is installed as well.

 

Please note that your current SAP Predictive Analysis key code will only enable the Expert Analytics mode of SAP Predictive Analytics, which is what you are looking for. If you want to access the Automated Analytics capabilities, you need to be licensed for them. Please contact your SAP Account Executive.

 

I am a former SAP InfiniteInsight user. What is the installation process to move from SAP InfiniteInsight to SAP Predictive Analytics?

You don’t need to uninstall SAP InfiniteInsight and you can install SAP Predictive Analytics on the same machine. You will need to get a specific key code to activate the Automated Analytics capabilities.

 

I am a former SAP Predictive Analytics user. What is the installation process to move from SAP Predictive Analytics 2.0 or 2.1 to SAP Predictive Analytics 2.2?

You need to uninstall your previous release of SAP Predictive Analytics and then install SAP Predictive Analytics 2.2.

 

I love SAP Lumira and I love SAP Predictive Analytics. I cannot choose and I want to install both on my machine. Is it possible?

Yes, you can! This is possible since the 2.1 release of SAP Predictive Analytics and continues to be possible with SAP Predictive Analytics 2.2.

You have to install in the following order though: SAP Predictive Analytics should be installed first and SAP Lumira should be installed second.

It means that if SAP Lumira is already installed, you need to uninstall it first, then install SAP Predictive Analytics, then install SAP Lumira.

Installing SAP Predictive Analytics side by side with SAP Lumira, without uninstalling SAP Lumira first, should normally be possible. In practice it is not, due to a problem detected too late in the SAP Predictive Analytics 2.2 release cycle. Apologies for that; we will do a better job on this in future releases.

 

I would like to use R algorithms in Expert Analytics. What should I do?

Detailed instructions are provided in the Expert Analytics User Guide here: http://help.sap.com/businessobject/product_guides/pa22/en/pa22_expert_user_en.pdf. Please refer to pages 12-13.

 

I have heard that I can integrate R code into SAP HANA. Where do I find more information?

Please refer to the guide here: http://help.sap.com/hana/SAP_HANA_R_Integration_Guide_en.pdf

 

Where can I find the product availability matrix?

The document that is applicable to the latest release can be found here: https://support.sap.com/content/dam/library/ssp/infopages/pam-essentials/Pred_Ana_20.pdf

 

Are there any restrictions associated with the release of SAP Predictive Analytics 2.2?

Please refer to the central SAP note http://service.sap.com/sap/support/notes/2165858. Our help portal is always available here: http://help.sap.com/pa.

 

Product Activation

I do not really know if my company is licensed to SAP Predictive Analytics. What should I do?

Please get in touch with your SAP Account Executive.

 

I would like to get access to my key codes. Where should I go?

You should go here: https://support.sap.com/licensekey.


  • You can request keys and monitor your key requests (in “What would you like to do today”)
  • You have access to a detailed how to – How to Request License keys - Analytics solutions from SAP
  • If you are facing issues with the license request or creation, you can ask for support using the component XX-SER-LIKEY-BOJ.

 

I have a license for SAP Predictive Analysis. Does it translate into a license enabling SAP Predictive Analytics – Expert Analytics mode?

Yes.

 

I have a license for KXEN or SAP InfiniteInsight. Does it translate into a license enabling SAP Predictive Analytics – Automated Analytics mode?

Yes. Note that you will have to request and install a new key code.


Where should I input the Automated Analytics key code?

SAP Support usually supplies a key code.

 

It is then left to the end user to add it to the proper file so that it is taken into account by Automated Analytics.

The usual location for the Automated Analytics license file is the following: C:\Program Files\SAP Predictive Analytics\Desktop 2.2 (in the case of a SAP Predictive Analytics Desktop 2.2 installation).

 

The file that contains the key code is named License.cfg.

 

When a new key code is to be added, a new line should be added to the file content using the format:

# KeyCode<TAB><TheReceivedKeycode>


The file supports multiple key entries.

 

As an example, imagine I am licensed for the Modeler component of Automated Analytics and later on I get licensed for the Engine components as well: I can get an additional key to activate this module, and I will need to add this key as a new line in the file.
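For illustration only, a License.cfg holding two key entries would then contain two such lines in the format above (the key strings below are placeholders, not real codes, and the separator is a tab character):

# KeyCode	XXXXX-XXXXX-XXXXX-MODELER
# KeyCode	YYYYY-YYYYY-YYYYY-ENGINE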

 

Where should I input the Expert Analytics key code?

Open Expert Analytics, go to the Help menu, and select the entry Key Code. Click on the button Enter Keycode and input (or paste) the key code that was given to you, then click OK.


 

How many key codes do I need to get Automated Analytics and Expert Analytics working?

First of all, this question is only applicable to the desktop installation of SAP Predictive Analytics.

 

If you are licensed to both Automated Analytics and Expert Analytics, the key code you will receive has to be used in two ways:

  • Adding this code to the Automated Analytics license file (see Where should I input the Automated Analytics key code?)
  • Entering this code using the Expert Analytics user interface (see Where should I input the Expert Analytics key code?)

 

On my installation, I see the entry to trigger Expert Analytics being greyed and I cannot start the product. What can I do?


First of all check that your installation is a desktop one.

 

If it is, the workaround consists of starting the executable directly.

The executable is usually located here: C:\Program Files\SAP Predictive Analytics\Desktop 2.2\Expert\Desktop and is called SAPPredictiveAnalysis.exe.
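For example, from a command prompt (the path is quoted because it contains spaces):

"C:\Program Files\SAP Predictive Analytics\Desktop 2.2\Expert\Desktop\SAPPredictiveAnalysis.exe"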

 

You should contact SAP Support about the fact that the Expert Analytics link is greyed out. SAP Support will investigate the problem with you. At the time of writing, we do not yet understand what is causing this behavior.

 

If you have installed a client/server deployment, it is expected that the Expert Analytics link is greyed as the Expert Analytics user interface is not installed by the client/server deployment.

 

On my installation, I see some of the entries to trigger Automated Analytics being greyed and I cannot start the product. What can I do?


This is probably because your license code does not enable all the Automated Analytics product capabilities. Please refer to the section of this document related to Product Activation.

Back in April of this year, we started a new series of webinars, the Advanced Analytics Virtual Classroom.

Our goal was to offer free monthly webinars where experts would be discussing the topic of advanced analytics in a short format (30 minutes). The series is aimed at anyone interested in the topic of predictive analytics, so you don't need to be a data scientist to attend. Each webinar addresses a particular topic and at least 50% of the webinar is spent on showcasing (i.e. demo) our solution, SAP Predictive Analytics and its capabilities.

 

If you are new to the series, you can go back and binge-watch all previous sessions (listed below). We've also put together a handy timeline for each should you decide to go and watch a particular demo or topic.

 

Video Session #1: An Introduction to Advanced Analytics

00:64 SAP Predictive Analytics is about being Fast, Simple, and Everywhere

06:04 SAP Predictive Analytics Overview

08:17 Popular Predictive Use Cases

12:10 2 Modes: Automated Analytics and Expert Analytics

14:38 Social Network Analysis Demo: Building a Social graph

20:28 Social Network Analysis Demo: The Resulting Graph

23:57 Example of Recommendation for Customer - On-the-fly or Batch


 

 

Video Session #2: Data Preparation Made Easy

01:27 New Data Challenges, New Opportunities

03:17 The Advanced Analytics Process: Prepare, Model, Deploy, and Manage

04:37 Successful Data Preparation

09:05 Data Preparation Demo

11:49 Retrieving Additional Data Attributes

14:13 Deriving New Attributes from Data (dates; number of days)

16:36 Computing Aggregates from Transactional Data

23:36 Looking at Generated SQL

26:32 Creating a Classification Model

28:00 Overview of Generated Model

28:38 Contributions by Variables

 

 

 

Video Session #3: Putting Insight to Work

02:02 New Data Challenges, New Opportunities

03:42 The Advanced Analytics Process: Prepare, Model, Deploy, and Manage

04:50 Automated Mode - aka Automated Analytics

05:30 Demo: Selecting a Data Source

07:25 Demo: Selecting your Variables

08:07 Demo: Overview of Generated Model

08:36 Demo: Applying Your Model (output example: Excel file)

10:34 Demo: Applying Your Model (output example: database)

12:59 Demo: Generating the Source Code (example: Java code)

14:25 Demo: Saving the Model (example: text file)

15:00 Demo: Saving the Model (example: in-database; SAP HANA)

17:18 Expert Mode - aka Expert Analytics

18:18 Demo: Predict Room

20:03 Demo: Visualize Room

21:34 Demo: Writing Back to Database

24:08 Demo: Scoring on-the-fly (Stored Procedure)

27:05 Real-World Integration Example: SAP hybris

29:25 Real-World Integration Example: SAP Fraud Management

33:49 Demo: Model Management

41:26 Advanced Analytics Portfolio for SAP HANA

 

 


Video Session #4: When to Choose the Expert Mode


Catch it live on June 18. Click here to Register


For future sessions and events, please go to our SAP Predictive Analytics event page on SCN!


Yesterday SAP’s Charles Gadalla and ASUG volunteer Joyce Butler launched the ASUG SAP Predictive Analytics Council.  The focus of this council is “exploratory analytics”, which SAP Analytics on Twitter describes as next-generation “analytics powered by math but for human consumption”.


Source: SAP

 

Above is Charles explaining the “next generation analytics architecture” at BI 2015 today.

 

ASUG is looking for volunteers to be the customer chair


Figure 1: Source: ASUG

 

Joyce explained what ASUG influence councils are, the leadership, and the participants


Figure 2: Source: ASUG

 

Participants attend council meetings and give feedback to SAP


Figure 3: Source: ASUG/SAP

 

Figure 3 covers the participant profile that SAP/ASUG is looking for: the business analyst/designer, and the IT people who keep the data in the right shape.  You do not need a math degree; the plan for this product is “just a few clicks”.


Figure 4: Source: SAP

 

The legend for Figure 4 is as follows:

  • Green: SAP embedded apps
  • Blue: on the cloud
  • Red: on premise
  • Grey: Automated Predictive Library

 

There is a plan to make BW available in the automated mode (it is already available in Expert mode).

 

The planned innovations are mostly completed (2.2, my guess).

 

Exploratory analytics will use Predictive Cloud Services


Figure 5: Source: SAP

 

Figure 5 provides more details, or “an idea of where things are going”.

 

Predictive Analytics 2.2 is already generally available.

 

Future direction on the right of Figure 5 shows they are “going for ultra wide datasets – unique data manipulation, interrogation”

 

Simplify includes unifying the UI, more automation and streamlining, and detecting faint signals (e.g. failing equipment).

 

Blue means cloud: they completed HEC integration, are working on HCP now, and will soon have the full set of predictive capabilities on HCP for exploratory analytics. This is the piece where SAP wants feedback.


Figure 6: Source: ASUG

 

Figure 6 covers the council’s charter.


Figure 7: Source: ASUG

 

Figure 7 shows the tight timeframe.


Figure 8: Source: ASUG

 

Please note you need to be a member of ASUG and your company will need to sign a feedback agreement with SAP to participate.

 

According to ASUG: "The recording and slide deck from ASUG SAP Predictive Analytics Influence Council Launch Call has been posted to the ASUG Influence general discussions space (ASUG logon required). If you are interested in joining the council, please complete the council’s participation survey."

 

Reference

Upcoming ASUG Business Intelligence Community W... | ASUG

On November 13th and 14th 2014, artist Sven Sachsalber spent two days searching (and finally finding) a needle in a haystack at the Palais de Tokyo modern art museum in Paris (http://www.palaisdetokyo.com/en/events/sven-sachsalber).

 

When we are in front of a new dataset we sometimes feel like we are in front of a haystack: will we ever find the information we hope for? How long will it take to find it? And is there really the information we are expecting in it? Could the dataset hide some unexpected and even more precious information?

Sven Sachsalber used his hands and eyes for two days to find the needle, just as we could use classic analytic tools to manually find information.

But today we can make use of Exploratory Analytics technologies to point us quickly and automatically to the most useful information hidden in a dataset.

What are exploratory analytics and how do they relate to classic and advanced analytics solutions?

 

Classic Analytics, Advanced Analytics and Exploratory Analytics

 

Classic analytics (aka, classic business intelligence)

For the past 30 years business intelligence has been about workflows where the analyst would take a dataset and then try to find information in it with a trial and error approach. We call this Classic Analytics.

 

Hypotheses are made and then tested. Analysts run queries to retrieve data, then they filter, drill, pivot, create sections, etc. until useful information surfaces. Data visualization techniques greatly help in this manual search for information.

Usually the output of the analysis is presented in reports or dashboards where analysts want to share the findings with their community. The information is not usually directly actionable but is rather used to support or influence decisions.

More recently, with the advent of big data, analysts are facing datasets which become longer (more records) and wider (more columns, more attributes). Bigger datasets make it very difficult for a human being to grasp the meaning of the data in its entirety, and they also require a longer processing time.

As an example, while it is possible for a human being to fully comprehend the meaning of a 10-column dataset, it is virtually impossible to do the same with a set containing thousands of columns (e.g. generated by sensor readings).

In Classic Analytics, the human being drives the analysis and the software is just used to speed up calculations and provide the requested visualizations. The limits of Classic Analytics are hence the limits of the human brain in dealing with its representation of the information.

With the new kind of business data, wide and long, those limits have been reached and human beings need new facilities to deal with the information.

 

Advanced Analytics

On the other end of the spectrum of data analysis, Advanced Analytics solutions make use of mathematical algorithms and computer automation to find patterns in data and use them to support decision making in operational environments.

In Advanced Analytics, a data scientist or a skilled analyst would set up algorithms of regression, classification, clustering, association, time series analysis, outlier detection etc. to understand the internal structure of an existing dataset and extrapolate rules which can be applied to new data with a precise goal in mind.

Those rules, or models, can be embedded into applications and decision-making processes to support operational users in their daily tasks.

Advanced analytics, by looking at existing data, answer forward looking questions and suggest best actions: “what customers should I target in my marketing campaign?”, “what is the likelihood that this customer will purchase this product?”, “what product should I recommend to this customer? “, “how many items should I have in stock next Tuesday?”.

Advanced Analytics thus provides the infrastructure of mathematical algorithms, computer programs, and data mining concepts that can help human beings overcome their limitations when looking for information in datasets.

 

Exploratory analytics

Exploratory Analytics are the application of the methods and technologies of Advanced Analytics to the task of helping the business analyst find useful insights in a dataset, as is done manually in Classic Analytics. Exploratory Analytics provide an extension of classic descriptive and diagnostic analytics by automatically exposing interesting information available in a dataset.

A detailed explanation of the concept of exploratory data analysis, and mainly on the use of visual representations is provided in NIST/SEMATECH e-Handbook of Statistical Methods (http://www.itl.nist.gov/div898/handbook/).

Long datasets take a long time to analyze, wide datasets are difficult to interpret, and analysts don’t know in advance what parts are important and what parts are not for a given problem (and importance is always a relative concept). Moreover, analysts might not even know if the available datasets contain useful information for their problem.  In classic analytics, analysts faced with the challenge of working with a large dataset (long and/or wide) might just give up any attempt at analysis.

In Exploratory Analytics a business analyst would have a dataset automatically analyzed with data mining algorithms to find information about key influencers of measures, outliers, anomalies, points of interest, hidden structures (such as associations between values),  groups of records showing similarities, bands of values having a common business meaning. Visualizations proposed by the solution would help the analyst to better grasp the meaning of the data in the dataset, its pertinence and value to the business problem.

 

Figure 1 shows how Exploratory Analytics is enabling analysts to go beyond classic analysis by means of automated, advanced algorithms and visualization techniques.


Figure 1: Exploratory Analytics at the intersection of human driven and algorithmic aided analysis

 

With an Exploratory Analytics approach, the application used to analyze the dataset would automatically highlight the most important findings and suggest the best way to visualize the information. The end user receives enough information to understand the content of the dataset and is able to judge its business value. Starting with this initial set of pre-analyzed information, the analyst can concentrate on the important parts. Exploratory Analytics reduce the noise and provide a smaller but more insightful space of data.

 

Exploratory Analytics at SAP

Exploratory Analytics are at the core of various activities at SAP. One of the goals of the SAP Analytics team is to make data analysis as easy as possible for any user.

SAP Predictive Analytics is the key solution which enables Exploratory Analytics workflows for all users. In SAP Predictive Analytics it is possible, in a few steps, to analyze a dataset to find outliers, influencers, patterns in data, data quality issues, correlations between variables, etc. With the solution it is possible to automatically answer questions such as ‘does this dataset contain useful information for my business problem?’, and ‘what parts of this dataset are actually influencing the answer I am looking for?’.  SAP Predictive Analytics finds the needle (or the golden ring) in the data haystack.

Moreover, by applying the underlying technology of SAP Predictive Analytics to business analyst tools such as SAP Lumira or to vertical applications such as SAP hybris, we also enable other users to automatically get the insight they need into their business problem.

SAP historically has a deep knowledge of the business analyst needs and dreams across many industries and lines of business and has all the technology needed to satisfy them. With the existing foundation of SAP Predictive Analytics, SAP HANA predictive libraries and the SAP cloud infrastructure, our users will experience more Exploratory Analytics workflows in their preferred applications and environments.

 

Conclusion

Imagine if Sven Sachsalber had a tool which would find the needle in the haystack in a few seconds and, possibly, would have told him that in the haystack there was also a lost golden ring.

Well, that probably wouldn’t have made the headlines in newspapers all over the world but it might have interested any person who’s looking after lost treasures.

Exploratory Analytics, on the other hand, can actually make a good headline, showing how business can be improved by better understanding data of any length and any width, and by accepting that humans can be helped by automation and mathematical algorithms: let them do the initial part of the job, and then let real brains work on the higher value problems.

In this blog, I share the experience I gained while creating a forecast procedure using the following features of SAP HANA:

  • APL (Automated Predictive Library) Forecast Function
  • Application Function Modeler (AFM)


I haven't seen any convincing blog promoting this feature of SAP HANA. Personally, I believe it is a really good feature for people with beginner-level or no SQL scripting skills, since the AFM is a graphical tool; the end result is nevertheless a procedure which can be reused later for different purposes. I created this forecast procedure based on the use case provided by Forefront Analytics, which instructed me to use the APL forecast function; the input dataset was also provided by Forefront Analytics. I started my research on APL with a main focus on the time series function, i.e. "Forecast", and learnt about the APL installation and the different APL functions as I progressed. I discovered AFM while watching SAP Technology videos as part of my APL research.

 

I have tried to document my work in the following three sections:


Prerequisites

The following prerequisites must be met before we start developing:

  • SAP HANA (SPS 09)
  • SAP AFL (this should be installed as part of SAP HANA)
  • SAP APL (you will need to install it separately)
  • unixODBC 64 bit (No version was mentioned in guide so I used 2.3.2)
  • SAP HANA Script Server should be enabled



You can refer to this blog if you need to install APL.
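If you are unsure whether APL and the script server are already in place, a quick sanity check from a SQL console is sketched below; this is a hedged example assuming the standard SYS monitoring views and the APL_AREA function area name used later in this blog:

-- The script server must be running for AFL/APL functions to execute
SELECT * FROM "SYS"."M_SERVICES" WHERE SERVICE_NAME = 'scriptserver';
-- If APL is installed, its function area should be registered
SELECT * FROM "SYS"."AFL_AREAS" WHERE AREA_NAME = 'APL_AREA';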


What is APL Forecast Function?

The forecast operation consists of building and training a time series model, and then applying the model in order to obtain the forecast data. The following describes the input and output tables required for the forecast function to work, together with the expected structure of each table.

                       

Each entry below gives the parameter direction (IN or OUT), its type, and its description, followed by the expected table structure.

IN FUNC_HEADER (optional: the table must be provided but it can be empty)
The function header defines the operating parameters of an APL function call. For instance, a function header can be used to set the logging level or to provide an operation id. A function header is made of a collection of string pairs {name, value}. It is usually the first input parameter of a function. It can be empty.
Expected table structure:
  • KEY ((N)VARCHAR, (N)CLOB): the parameter key
  • VALUE ((N)VARCHAR, (N)CLOB): the parameter value
Supported parameter names:
  • OID (optional; supported values: any string; default value: none): an operation ID. This optional ID can be provided by the APL caller to tag all the inserted rows in the output tables.
  • ModelFormat (optional; default value: bin): the requested output format for the model. This impacts the table structure of the model output table.
  • LogLevel (optional; min value: 0 (disabled), max value: 10, default: 8): this impacts the amount of progress messages and logging information produced by APL and the Automated Analytics engine. These messages can then be retrieved from the log operation table.

IN VARIABLE_DESCS (optional: the table must be provided but it can be empty)
The variable descriptions for the input dataset, as expected by the Automated Analytics engine.
Expected table structure:
  • 1st column (INTEGER): variable rank
  • 2nd column ((N)VARCHAR, (N)CLOB): variable name
  • 3rd column ((N)VARCHAR, (N)CLOB): storage. Possible values: number (the variable contains only "computable" numbers; be careful, a telephone number or an account number should not be considered a number), integer, string (the variable contains character strings), date (the variable contains dates), datetime (the variable contains date and time stamps), angle
  • 4th column ((N)VARCHAR, (N)CLOB): value type of the variable. Possible values: nominal (categorical variable, the only possible value type for a string), ordinal (discrete numeric variable where the relative order is important), continuous (a numeric variable from which mean, variance, etc. can be computed), textual (textual variable containing phrases, sentences or complete texts)
  • 5th column (INTEGER): key level
  • 6th column (INTEGER): order level
  • 7th column ((N)VARCHAR, (N)CLOB): missing string value, i.e. the string used in the data description file to represent missing values (for example, "999" or "#Empty", without the quotes)
  • 8th column ((N)VARCHAR, (N)CLOB): group name
  • 9th column ((N)VARCHAR, (N)CLOB): variable description, an additional description label for the variable
  • 10th column (OID): the operation ID, if set; otherwise a new one is generated. This column is optional.

IN OPERATION_CONFIG (mandatory)
The configuration of the training operation. For this function, you must declare the type of Automated Analytics model, and you can declare the optional cutting strategy, in the OPERATION_CONFIG table as follows:
  • APL/TimePointColumnName (mandatory): name of the column in the dataset that contains the time points of the time series
  • APL/Horizon (mandatory): number of forecasted time points
  • APL/LastTrainingTimePoint (optional): value in the time point column which represents the last point in time for the training dataset
  • APL/CuttingStrategy (optional): the cutting strategy defines how a training set is cut into three subsets (estimation, validation and test sets) when needed. For time series: 'sequential with no test', 'sequential' (default value)
Expected table structure:
  • KEY ((N)VARCHAR, (N)CLOB): the parameter alias name
  • VALUE ((N)VARCHAR, (N)CLOB): the parameter value

IN VARIABLE_ROLES (optional: the table must be provided but it can be empty)
The roles of the variables for this training. When training a model, the roles of the variables can be specified. These variable roles are provided as string pairs {variable name, variable role}. In data modeling, variables may have four roles:
  • Target variables (also known as output variables): a target variable is the variable that you seek to explain, or for which you want to predict the values in an application dataset. It corresponds to your domain-specific business issue. In some businesses, target variables may also be known as variables to be explained or dependent variables.
  • Explanatory variables (also known as input variables): an input variable describes your data and serves to explain a target variable. Explanatory variables may also be known as causal variables or independent variables.
  • Weight variables: a weight variable allows one to assign a relative weight to each of the observations it describes, and actively orient the training process.
  • Skipped variables: these variables are ignored during the training process.
Expected table structure:
  • NAME ((N)VARCHAR, (N)CLOB): variable name
  • ROLE ((N)VARCHAR, (N)CLOB): variable role. Supported roles: input, skip, target, weight

IN DATASET (mandatory)
The name of your input (training) dataset. Datasets are used for training models and applying models. Datasets used for training or applying models can contain any number of columns (within HANA and AFL limits). The supported SQL datatypes for the dataset columns are all the datatypes supported by AFL: INTEGER, BIGINT, DOUBLE, CLOB & NCLOB, VARCHAR & NVARCHAR, DATE, TIME, TIMESTAMP, SECONDDATE.

OUT DATASET
The resulting forecast.

OUT LOG
The training (operation) log. When performing an APL operation, especially training or applying a model, the Automated Analytics engine produces status/warning/error messages. These messages are returned from an APL function through an output database table.
Expected table structure:
  • OID ((N)VARCHAR, (N)CLOB): operation ID (OID)
  • TIMESTAMP (TIMESTAMP): message timestamp
  • LEVEL (INTEGER): message level
  • ORIGIN ((N)VARCHAR, (N)CLOB): message origin
  • MESSAGE (NCLOB): the actual message

OUT SUMMARY
The training summary. When training a model, debriefing information related to the training operation is produced. This is known as the training summary. This information is a set of indicators, provided as string pairs {KEY, VALUE}.
Expected table structure:
  • OID ((N)VARCHAR, (N)CLOB): operation ID (OID)
  • KEY ((N)VARCHAR, (N)CLOB): indicator name
  • VALUE ((N)VARCHAR, (N)CLOB): indicator value

OUT INDICATORS
The variable statistics and indicators. When training, testing or querying a model, it is possible to retrieve variable indicators (i.e. variable statistics). For each variable, a collection of indicators may be retrieved. These indicators are described using the following attributes: {variable name, indicator name, indicator value, indicator detail (when applicable)}. Indicators are returned from an APL function through an output database table. The output table contains estimator indicators for regression models, to help plotting the regression curve.
Expected table structure:
  • OID ((N)VARCHAR, (N)CLOB): operation ID (OID)
  • TARGET ((N)VARCHAR, (N)CLOB): name of the target, when the indicator is based on it (for example, predictive power or predictive confidence)
  • VARIABLE ((N)VARCHAR, (N)CLOB): variable name
  • KEY ((N)VARCHAR, (N)CLOB): indicator name
  • VALUE ((N)VARCHAR, (N)CLOB): indicator value
  • DETAIL ((N)VARCHAR, (N)CLOB): indicator detail




What is Application Function Modeler?


The SAP HANA Application Function Modeler (AFM) is the default editor for flowgraphs. A flowgraph is a development object. It is stored in a project and has extension .hdbflowgraph. By default, the activation of a flowgraph generates a procedure in the catalog. A flowgraph models a data flow that can contain

  • tables, views, and procedures from the catalog
  • relational operators such as projection, filter, union, and join
  • functions from Application Function Libraries (AFL) installed on your system
  • attribute view and calculation view development objects
In addition the AFM provides support for some optional, additional components of the SAP HANA Platform such as
  • the Business Function Library
  • the Predictive Analysis Library
  • R Scripts
  • Data Provisioning operators
  • the generation of task plans
We will use the Application Function Modeler to create a flowgraph consuming the APL forecast function. This will generate a procedure in our defined schema. We will follow the approach below to create the forecast flowgraph:
  • Create the catalog objects: schema, source tables, function tables, output tables
  • Create the flowgraph

Assumptions:

  • An XS project has already been created. In this example we have used the following XS project: APP1

Create Catalog Objects


Schema

  • First of all, create a schema definition file named APP1.hdbschema using the following method:
  • Right-click the project name and select New > Other > SAP HANA > Database Development > Schema
  • Select the folder where you want to save this file and give it the name “APP1”. Please make sure there is no template selected
  • Once the file is created, open it and enter the following:
              schema_name = "APP1";
  • Right-click the file and activate it. This will create the schema
  • Run the following SQL statement to grant privileges to _SYS_REPO:
          grant alter on schema "APP1" to "_SYS_REPO";

Tables


Please create the DDL source files (core data services) using the following method.
  • Right-click the project name and select New > Other > SAP HANA > Database Development > DDL source file
  • Select the folder where you want to save the file and give it a name. Please make sure you have selected an empty structure. Perform this three times to create the three files func_data.hdbdd, output_data.hdbdd, and source_data.hdbdd.
  • Once the files are created, open them and enter the following:
     

func_data.hdbdd

namespace APP1.db;
@Schema: 'APP1'
context func_data {
@Catalog.tableType : #COLUMN
@nokey
    entity FUNCTION_HEADER {
        KEY : hana.VARCHAR(50);
        VALUE : hana.VARCHAR(50);
          
    };
@Catalog.tableType : #COLUMN
@nokey
       entity OPERATION_CONFIG {
       KEY : hana.VARCHAR(1000);
        VALUE : hana.VARCHAR(50);
          
    };
@Catalog.tableType : #COLUMN
@nokey
    entity VARIABLES_ROLES {
       NAME : hana.VARCHAR(50);
        ROLE : hana.VARCHAR(10);
          
    };
@Catalog.tableType : #COLUMN
@nokey
    entity VARIABLES_DESC {
     RANK : Integer;
     NAME : hana.VARCHAR(50);
     STORAGE : hana.VARCHAR(10);
     VALUETYPE : hana.VARCHAR(10);
     KEYLEVEL : Integer;
     ORDERLEVEL : Integer;
     MISSINGSTRING : hana.VARCHAR(50);
     GROUPNAME : hana.VARCHAR(50);
     DESCRIPTION : hana.VARCHAR(100);
          
    };
};

source_data

namespace APP1.db;
@Schema: 'APP1'
context source_data {
    @Catalog.tableType : #COLUMN
       Entity DATA {
              key Date: LocalDate;
                     Produced: Double;
    };
 
};

output_data

namespace APP1.db;
@Schema: 'APP1'
context output_data {
@Catalog.tableType : #COLUMN
@nokey       
    entity FORECAST_OUTPUT {
        Date : LocalDate;
            Produced: Double;
            kts_1: Double;
    };
@Catalog.tableType : #COLUMN
@nokey
       entity FORECAST_LOG {
               OID: hana.VARCHAR(50);
       TIMESTAMP: UTCTimestamp;
       LEVEL: Integer;
       ORIGIN: hana.VARCHAR(50);
       MESSAGE: hana.CLOB;
    };
@Catalog.tableType : #COLUMN
@nokey  
    entity FORECAST_SUMM {
       OID: hana.VARCHAR(50);
              KEY: hana.VARCHAR(100);
       VALUE: hana.VARCHAR(100);
   
    };
@Catalog.tableType : #COLUMN
@nokey  
    entity FORECAST_INDI {
    OID: hana.VARCHAR(50);
    VARIABLE: hana.VARCHAR(100);
    TARGET: hana.VARCHAR(100);
    KEY: hana.VARCHAR(100);
    VALUE: hana.VARCHAR(100);
    DETAIL: hana.CLOB;
      
       };
};



  • Save the files, right-click them and activate. This will create the required tables in schema “APP1”; you will see the new tables in the catalog.



Creating Flowgraph

Assumption:

The source data table is populated with data (a sample INSERT is shown below)
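
As an illustration only, the DATA table could be filled with a few rows like the following; in practice you would load the full history of the time series:

INSERT INTO "APP1"."APP1.db::source_data.DATA" VALUES ('2014-01-01', 1250.5);
INSERT INTO "APP1"."APP1.db::source_data.DATA" VALUES ('2014-02-01', 1318.2);
INSERT INTO "APP1"."APP1.db::source_data.DATA" VALUES ('2014-03-01', 1297.0);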

Please create the FORECAST_FG.hdbflowgraph file as follows:
  • Right-click the project name and select New > Other > SAP HANA > Database Development > Flowgraph Model
  • Select the folder where you want to save the file and give it a name.

  • Once the file is created, open it and right-click on an empty area to open the properties of the flowgraph model. Change the schema from _SYS_BIC to APP1


  • Add the forecast function (drag and drop) from the General list in the node palette



  • Go to the function properties and choose APL_AREA as the area; once the functions are populated, choose the FORECAST function. Your screen should look like the one below:
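
If APL_AREA does not appear in the list of areas, you can check the APL installation directly from SQL. This is a minimal sketch using the standard AFL system views:

-- APL_AREA should be listed once APL is installed
SELECT * FROM "SYS"."AFL_AREAS" WHERE AREA_NAME = 'APL_AREA';

-- The available APL functions, which should include FORECAST
SELECT AREA_NAME, FUNCTION_NAME FROM "SYS"."AFL_FUNCTIONS" WHERE AREA_NAME = 'APL_AREA';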





  • Next, we will add an input table and map it to the corresponding function node. Select FUNCTION_HEADER from the list of tables in the APP1 schema and drag and drop it onto an empty area of the flowgraph model. Select “Data Source” when prompted. Now create a connection from this table to the FUNC_HEADER node of the function. Note that you can rename each node as needed.


  • Repeat this for all the input tables; your screen should look like the one below:



  • We will now add the output tables and map them to the corresponding function nodes. Select FORECAST_OUTPUT from the list of tables in the APP1 schema and drag and drop it onto an empty area of the flowgraph model. Select “Data Sink” when prompted.
  • Before we create a connection between the node and the table, we need to define the columns of the function’s output node. Use the same structure as the output table defined above in schema APP1. Go to the node properties > Signature > add/change the columns and their data types
  • Now create a connection from the “Output_Data” node to the FORECAST_OUTPUT table. Again, you can rename each node as needed.



  • If you want the table truncated every time the procedure runs, select the following option in the node properties.



  • Repeat this for all the output tables; your screen should look like the one below:

  • Note the green tick on each node, which means there are no errors
  • Now activate the model
  • Once it is activated, you will see that new procedures and table types have been created in schema APP1




  • To call the procedure, use the following SQL statement:


CALL "APP1"."APP1.fg::FORECAST_FG"();




References


SCN, 2015. How to Install the Automated Predictive Library in SAP HANA. [Online] Available at: http://scn.sap.com/community/hana-in-memory/blog/2015/02/19/how-to-install-the-automated-predictive-library-in-sap-hana [Accessed 11 March 2015].

SCN, 2015. What is the SAP Automated Predictive Library (APL) for SAP HANA? [Online] Available at: http://scn.sap.com/community/predictive-analysis/blog/2015/03/02/what-is-the-sap-automated-predictive-library-apl-for-sap-hana [Accessed 11 March 2015].

SAP Help, 2015. SAP Automated Predictive Library Reference Guide. [Online] Available at: https://websmp203.sap-ag.de/~sapidb/012002523100002180172015E/apl11_apl_user_guide_en.pdf [Viewed 20 March 2015].

 

Attending BI2015 in Nice, France from June 16-18? Predictive Analytics is part of the Analytics track, but you will find the topic in many sessions across different tracks.

 

Here are 7 sessions you should include in your agenda:

 

 

1- Demo: Exploratory predictive analytics with HCP (HANA Cloud Platform)

Tuesday – June 16; 13:00 - 13:30

Exploratory predictive analytics brings automated analytics capabilities and platform-agnostic scenarios to the SAP HANA Cloud Platform (HCP), targeted at developers as a set of easy-to-use services.


2- Demo: Predictive maintenance with SAP Predictive Analytics

Wednesday – June 17; 10:00 - 10:30

With SAP Predictive Analytics, you can analyze large volumes of operational data and apply predictive insights in real time, harnessing the power of the Internet of Things to prevent asset failures before they happen. This demo session will showcase a predictive maintenance scenario with SAP Predictive Analytics.


3- Session: Creating business forecasts using SAP predictive analytics technologies

Wednesday – June 17; 16:45 - 18:00

Through use-case examples and live demonstration, this session outlines functionality and algorithms for sales forecasting using R programming language vis-à-vis Predictive Analysis Library (PAL) functions.


4- Session: A technical guide to leveraging advanced analytics capabilities from SAP

Thursday – June 18; 08:30 - 09:45

With an emphasis on use-case examples, this session examines how to exploit SAP’s advanced analytics solutions for big data and their associated algorithms to drive business impact.


5- Discussion Forum: Predictive automation – What, why, and how

Thursday – June 18; 10:00 - 10:30

Competing in today’s demanding marketplace means going beyond “what happened,” and predictive analytics plays a major role in moving beyond sense-and-act. A skills shortage and the proliferation of the Internet of Things (IoT) and Big Data demand predictive automation.


6- Session: SAP Predictive Analytics: Latest updates, future roadmap, and live demo

Thursday – June 18; 10:30 - 11:45

Attend this session to explore the fundamentals of SAP Predictive Analytics, and find out how data scientists, business analysts, and business users can leverage it for greater visibility into trends, risks, and opportunities.


7- Session: How to detect and investigate fraud for improved financial processes

Thursday – June 18; 14:45 - 16:00

With the amount of data growing exponentially every day, this session will explore how SAP Fraud Management can help you keep up with changing behaviours and patterns in fraud and misuse of financial data and assets to ensure that your operations run efficiently.

 

Want more? View all the sessions where predictive is featured

Hi Guys,

Attached is a short summary regarding Predictive Analytics.

Important for Sales.

 

Link to the summary (I'm not allowed to upload a PDF document):

http://heger.top-it.at/download/Webinar_PredictiveAnalytics_20150429.pdf

 

 

Regards, Manfred

Continuing from our earlier blog, where I talked about how to access the HANA cloud image on AWS for Predictive Analytics, here is a short 20-minute video that explains the complete process.

 

  • What is all the buzz about Predictive Analytics in the Amazon cloud
  • Why the cloud
  • Where can I find it
  • How you go about the Amazon Web Services
  • How does it work with SAP Cloud Appliance library
  • Business scope with RDS
  • Brief history of the RDS
  • Current use case scenarios pre-built in the HANA cloud image
  • How is it exactly pre-configured
  • What's more to do
  • A high level demo of the use cases pre-built on the cloud image
  • How does RDS provide a logical path to value

 

 

This was probably already addressed in another blog post, but unfortunately I couldn't find it, so if you find it’s a duplicate, I’ll withdraw mine.

 

One of my colleagues at SAP asked me this question, and I found that the answer is not so obvious if you haven't played with the new “Auto” algorithm feature for long.

I also think this could apply to any HANA online scenario.

 

Usually, when you use “Expert Analytics” (formerly Predictive Analysis) with text files, you can add your initial data set, create an “Auto-Classification”, save the model, then add a new data set and finally apply the saved model.

But when working with HANA online, “Expert Analytics” only lets you work with one data set. In other words, you cannot add a new data set in the “Prepare” room.

 

So how do I apply a model built with my training data to a new data set?

 

The answer is not far from what the “Automated Analytics” module does in a production environment: you first export the model, then apply it to your new dataset.

The next question is: how do I export my model?

 

Let's start from the beginning: after launching Expert Analytics, create a new “Document” and select "Connect to SAP HANA One".

 

image002.png

 

Next, enter your HANA connection details, then hit “Connect”:

 

image003.png

 

Then, select your dataset

 

image005.png

 

The dataset will be analyzed, and the column names and types will be retrieved.


image007.png

 

Now you can select the “Predict” room, add the “HANA Auto-Classification” algorithm, configure it, and then run it.

 

image009.png

 

Once the run phase is completed, you can then right-click on the “HANA Auto-Classification” node and use the “Save as Model” item.


image010.png

 

Once saved, the model will appear on the right side bar under “Models”. Select the saved model, then click on “Export Model”

 

image011.png

 

From the following popup, use the “.spar” file format.

 

image012.png

 

Save it on your desktop for example.

 

image013.png

 

Now that your model has been exported, create a new “Document”, select "Connect to SAP HANA One", and select the dataset to be scored. Then, on the right side bar, click the “+” sign and use the “Import Model” item.

 

image014.png

 

 

Select the “.spar” file previously created

 

image015.png

 

Select the model to import from the list

 

image016.png

 

You will now be able to use it in the "Predict" room to score your new dataset.

 

image017.png

 

Hope this saves you some time.

 

This article was built using SAP Predictive Analytics 2.1 with a HANA instance where the Automated Predictive Library (APL) was previously set up.

 

Any comments, feedback or opinions are very welcome!

Attending SAPPHIRE NOW / ASUG Annual Conference, to be held in Orlando, Florida from May 5-7? SAPPHIRE NOW is an enormous show, and to get the most out of your 3 days we have recommended 12 predictive sessions you should include in your schedule. But attending lectures, panels, and microforums is not enough. To see our predictive analytics solutions in action, here are 5 demo sessions not to be missed:

 

PT20324

Make More Meaningful Business Decisions by Using All Your Big Data

Enable better decisions, improve business results, and support your Big Data initiatives in a simpler manner while minimizing costs by providing complete, accurate, and accessible information from all data sources. Manage and integrate large volumes of data and get real-time response rates regardless of data volume, location, or type.

05/06/2015

2PM

PT20325

Transform Your Business with the Internet of Things

Discover how the Internet of Things will transform your existing business operations. See how SAP HANA Cloud Platform supports the IoT by enabling enterprises to process large volumes of data generated by thousands of sensors to allow real-time and predictive decision making that can revolutionize your products and services.

05/05/2015

1:30PM

PT20330

Personalize Customer Relations with Predictive Analytics

Discover how embedded predictive applications have the ability to anticipate, learn continuously, and provide the right context and content at the right time. Watch and learn how you can enhance customer relations with SAP Predictive Analytics software.

05/05/2015

2PM

PT20332

Combat Fraud with Predictive Analytics

Find ways to uncover and prevent fraud with a powerful analytic application. Learn how, when combined with predictive analytics, SAP Fraud Management goes beyond detection to actually prevent fraud by spotting the events and conditions preceding fraud and averting its occurrence.

05/05/2015

3:30PM

PT20330

Build a Predictive Application in 10 Minutes

Discover how predictive applications have the ability to anticipate, learn continuously, and provide the right context and right content at the right time. Watch as a predictive application is built using SAP Predictive Analytics software in the cloud in just 10 minutes.

05/05/2015

2PM

 

Hope to see you there.

Pierre Leroux, @pileroux

Attending SAPPHIRE NOW / ASUG Annual Conference, to be held in Orlando, Florida from May 5-7? Predictive analytics should be on your list of hot topics to check (read why) during these three days. Here are 12 predictive sessions you should include in your schedule:

 

Tuesday May 5

1. Microforum PT20340: Discover Why Your Business Needs Predictive Analytics Now 11:00-11:45 a.m.
Go beyond “what happened.” Discover the major role that predictive analytics plays in letting your organization respond to change before it happens.


2. Demo Theater GS21990: Take the Guesswork Out of Trade Promotions with Predictive Analytics 1:30-1:50 p.m.
Focus on promotions that are most likely to achieve objectives by making data-driven decisions based on predictive analytics. Understand price and time sensitivity, true net lift, and volume drivers. Simulate promotions, regular business, and customer plans.

 

3. Lecture BI747: Improving manufacturing processes with SAP HANA Predictive Maintenance at Mohawk Industries 3:00-4:00 p.m.
Learn how Mohawk, the world’s leading flooring manufacturer, uses SAP HANA and predictive analytics in its predictive maintenance program to increase product quality by improving its manufacturing process parameters.

 

Wednesday May 6

4. Demo Theater PS20796: Determine Propensity to Buy Using Advanced Visualization and Analytics – 12:00-12:20 p.m.
Searching for better ways to learn customer spending habits? Use predictive analytics to identify customer buying behavior and forecast for better sales and planning.

 

5. Demo Theater IN20709: Benefit from the Convergence of IT and Operations in the Internet of Things 1:30-1:50 p.m.
Take advantage of the Internet of Things and of the new analytical and predictive capabilities that result from the integration of smart data from various devices.

 

6. Lecture BI121: Deploying Predictive Analytics Solutions – How Lockheed Martin Space Systems Forecast Supply Chain Management Performance 2:30-3:30 p.m.
Learn how Lockheed Martin Space Systems and SAP teamed up to deliver a predictive analytics capability to forecast supplier scheduling performance and better manage and engage with suppliers.

 

7.  Microforum PT20341: Experience the Power of Predictive Personalization 4:00-4:45 p.m.
This discussion centers on how customers today demand a personalized experience in every interaction, through every channel, every time. Transform the customer experience with advanced personalization approaches enabled by predictive analytics.

 

8. Lecture BI109: Predictive Analytics 2.0 Roadmap 4:15-5:15 p.m.
This session will appeal to anyone interested in predictive analytics from SAP and how this space is evolving. A roadmap will be shown as well as use cases and success stories.

 

9. Theater Presentation PT21256: Enlighten Customer Services with Real-Time Analytics 5:00-5:20 p.m.
See why rapid global expansion, while good for business, puts pressure on IT to provide easy access to critical data, identify market trends, and react faster to customers’ needs.

 

Thursday May 7

10.Panel Discussion PT25133: Simplify, Innovate, and Transform Your Business with Analytics and Mobile 9:00-9:30 a.m.
See how companies have harnessed the power of SAP technology to transform the way they serve their customers and employees to maximize performance and innovate business models.

 

11. Demo Theater PT20332: Combat Fraud with Predictive Analytics 1:00-1:20  p.m.
Learn how, when combined with predictive analytics, SAP Fraud Management goes beyond detection to actually prevent fraud by spotting the events and conditions preceding fraud and averting its occurrence.

 

12. Demo Theater PT20330: Build a Predictive Application in 10 Minutes 1:30-1:50 p.m
Discover how predictive applications have the ability to anticipate, learn continuously, and provide the right context and right content at the right time. Watch as a predictive application is built using SAP Predictive Analytics in just 10 minutes.

 

Looking for more? We have over 141 sessions and demos showcasing predictive. View all SAPPHIRE predictive sessions here (check predictive analytics under platform and technology).

 

Finally, if you arrive early, you can catch the ASUG pre-conference seminar: Hands-On Predictive Modeling and Application Development Using SAP HANA Predictive Analysis Library (PAL) and R

 

Looking for predictive demos? View my list of 5 demos not to miss at SAPPHIRE Now: 5 #Predictive Demos Not To Miss at #SAPPHIRENOW

 

Hope to see you there.

Pierre Leroux, @pileroux

 

Note: This blog post was originally published on SAP Analytics Blog

On a vibrant online community, there is so much activity that you may have missed some pieces your peers voted up. Here are the most viewed posts published in Q1 2015; if you haven’t had time to read them yet, now is the time!

 

#1 Introducing SAP Predictive Analytics 2.0! - by Ashish Morzaria
This blog post introduces our new product SAP Predictive Analytics (a combination of SAP Predictive Analysis and SAP InfiniteInsight and their advanced predictive capabilities). It is full of tips and covers the essentials: why, what it brings you, what’s new, how to get started, and what’s next!

 

#2 14 Examples on How to Use Predictive Analytics Solutions - by P Leroux
14 short and fun videos covering a large spectrum of key predictive possibilities on common business cases that can also give you fresh ideas on how to address critical questions whatever your company’s industry!

 

#3 Basket Analysis with SAP Predictive Analysis and SAP HANA - Part 1 - by Ian Henry
Market Basket Analysis (MBA) is a common predictive use case. It lets you find relationships between purchases when many products are involved. This three-part article explains how you can do MBA with SAP Predictive Analytics and SAP HANA.

Basket Analysis with SAP Predictive Analysis and SAP HANA - Part 2: Visualisation of Results
Enhancing Market Basket Analysis with PA 2.0 and SAP HANA

 

#4 What is the SAP Automated Predictive Library (APL) for SAP HANA? - by Ashish Morzaria
A detailed, very complete article about APL – a major milestone in our efforts to integrate and embed our advanced analytics services everywhere and into everything!

 

#5 3 Easy Steps to Run Predictive Analytics on Hadoop - by Victor Lu
Automated Analytics (formerly SAP InfiniteInsight) made ‘run predictive analytics on Hadoop’ easy and available to non-Data Scientists. Learn more!

 

#6 Try SAP Predictive Analytics (a HANA Cloud Image) on AWS - by Venkata Raghu Banda
Rapid Deployment Solutions is a key part of our community. The first sentence of this article will urge you to learn more: “The wait is over! (…) Now SAP Predictive Analytics scenarios can be simulated and run on the AWS cloud in the context of HANA.”

 

#7 7 #Predictive Sessions You Should Attend @ #BI2015 in Las Vegas - by P Leroux
Yes #BI2015 Las Vegas is over but stay tuned: #BI2015 Nice is coming (June 16-18)! More about the predictive sessions there to come soon!

 

#8 2015 College Basketball Predictive Analysis - by Charles Gadalla
A predictive and visualization challenge around the unifying sport that is basketball? Yes indeed! The 2015 #VizTheMadness Challenge, powered by SAP Lumira & SAP Predictive Analytics.

 

#9 Predictive Analytics 2.0 – What is New - ASUG Webcast - by Tammy Powlas
In case you missed the ASUG webcast “What’s New in SAP Predictive Analytics 2.0?”, Tammy put together the key takeaways for you. Here!

 

#10 SAP Predictive Analytics 2.0 30-Day Trial Now Available! - by Ashish Morzaria
The much-anticipated SAP Predictive Analytics 2.0 trial is available; try it now!

 

There are many predictive resources available on SCN and sap.com/predictive! Here are 3 ways to get engaged:
- Follow the SAP Predictive Analytics community to be informed as soon as there is something new posted or discussed here
- Check the ‘Content’ tab to make discoveries here
- Follow your favorite authors to be informed when they publish a new piece
And don’t forget the tutorials page that is updated on a regular basis and where you find tons of crucial tips!


 

In 3 weeks the ASUG Annual Conference begins, co-located with SAPPHIRE NOW. Last year's ASUG SAP Predictive sessions were highlighted by Peter Tanner, who used to work for the Obama campaign. The campaign used SAP predictive tools as a way to analyze and mine social media and send targeted e-mails. I am thinking of this today as the 2016 US Presidential campaign kicks into high gear.

 

 

Please consider adding the following ASUG Predictive Analytics sessions, selected by ASUG volunteers, to your agenda:

 

BI109 – Predictive Analytics 2.0: Roadmap, what's new and what's next
This session will appeal to anyone interested in Predictive Analytics from SAP and how this space is evolving. A roadmap will be shown, as well as use cases and success stories. At the end of this session, you should have a good idea of how to make your business smarter with SAP.

BI1521 – Deploying Predictive Analytics Solutions – How Lockheed Martin Space Systems Uses Predictive Analytics to Forecast Supply Chain Management Performance
Learn how Lockheed Martin Space Systems and SAP Data Science teamed up to deliver a predictive analytics capability to forecast supplier scheduling performance using R, SAP HANA, SAP Data Services, SAP Enterprise Resource Planning and SAP ABAP Acceleration to help Lockheed Martin better manage and engage with its suppliers.

BI747 – Improving manufacturing processes with SAP HANA Predictive Maintenance at Mohawk Industries
Learn how Mohawk, the world's leading flooring manufacturer, uses SAP HANA and predictive analytics in the "Predictive Maintenance" program to increase product quality by improving its manufacturing process parameters, including many millions of sensor data readings. Also learn about SAP's related customer co-innovation program.

 

 

The Lockheed session above is not to be missed; they share their experiences with the SAP Data Science team. 

 

But there's more to come. Starting very soon, ASUG will have a call for speakers for the SAP Analytics & BusinessObjects Conference in Austin, Texas, August 31-September 2. Next week, on April 20th, the call for speakers for ASUG sessions at SAP TechEd Las Vegas (October 19-23) starts.

 

If you can't make the face-to-face events, ASUG has the following webcasts planned related to Predictive Analytics:

 

 

So take advantage of your ASUG membership to help you predict the future.
