Following the blog post The TechEd season is back! by Antoine CHABERT, you may have already noticed some details about the sessions for which I was the speaker along with my colleagues Narasimha Rao Addanki, Orla Cullen, Flavia Moser and Abdel DADOUCHE.


We, the SAP BusinessObjects Predictive Analytics Product Management team, were deeply engaged on the show floor: demoing Predictive Analytics, running many hands-on and expert networking sessions, and meeting many enthusiastic Predictive Analytics customers. It was a great experience for all of us.


I would like to highlight the three 2-hour hands-on sessions that we each delivered twice during TechEd week.


  • ANP260: Massively Automate Predictive Models with Predictive Factory and SAP HANA
  • ANP270: SAP BusinessObjects Planning and Consolidation with Predictive Analytics
  • ANP160: SAP BusinessObjects Predictive Analytics 101: Discovery Session


There was a lot of interest in all of these Predictive Analytics hands-on sessions. They were almost all full houses, and it was great to see how many questions the participants asked.


Let me summarize at a high level what was covered in these three sessions.


ANP260: Massively Automate Predictive Models with Predictive Factory and SAP HANA (2-hour hands-on)


In this session, we introduced attendees to the next-generation SAP BusinessObjects Predictive Analytics 3.0 product, with a brand new SAP Fiori user experience. Attendees created a model, imported it into the Predictive Factory, and automated the various tasks in the life cycle of a predictive model: applying it to fresh data, testing for deviations, and retraining the model using data from SAP HANA. Attendees also got hands-on experience with segmented time-series model management, which enables massive model automation.


ANP270: SAP BusinessObjects Planning and Consolidation with Predictive Analytics (2-hour hands-on)


In this session, my colleagues Narasimha Rao Addanki, Flavia Moser, Uwe Fischer and I explained how to enable business users to make data-driven decisions using machine-learning-based time-series forecasting embedded in their dashboard, and then use the prediction as the basis of their forecast within their BPC application. There was a great deal of interest in the PA-BPC scenario from various customers.


ANP160: SAP BusinessObjects Predictive Analytics 101: Discovery Session (2-hour hands-on)


This was a discovery session for the beginners with SAP BusinessObjects Predictive Analytics, but also for "Predictive" newbies.

The attendees ran a "real" customer predictive analytics scenario end to end, with simulated / re-arranged data.

For more details on this session, you can refer to the blog My TechEd season forecast written by my colleague Abdel DADOUCHE.


Expert Networking Sessions

Several customers showed up to our networking sessions on Predictive Analytics with R and HANA integration, the PA-BPC scenario, and Fraud Management, asking great questions.

Overall, it was a very busy but great TechEd 2016 in Las Vegas.

If you are interested in attending any of these sessions at TechEd Bangalore (Oct 5-7) or Barcelona (Nov 8-10), please bookmark the session numbers in your agenda.


Enjoy the TechEd season!!

ASUG is hosting a six-part webinar series which will cover how SAP BusinessObjects Predictive Analytics supports the end-to-end processes of a Predictive Analytics project using the example of a use case in a financial organization.

More details  on the series can be found here: Kicking off six-part ASUG webcast series: Impro... |ASUG


In this specific webcast session, "Prepare Your Data for Predictive Models Using Data Manager", I talked about data preparation, which is one of the trickiest aspects of a predictive analytics project and typically consumes the majority of the time. In this webinar, I demonstrated how SAP BusinessObjects Predictive Analytics Data Manager enables business users and data scientists to quickly prepare an analytical dataset by connecting to various databases, create thousands of derived variables in a few clicks, and make the results of the predictive model much more accurate. You can also learn how to create complex aggregates and conditions in an analytical dataset by watching this webinar.

Timestamp population is an important feature of an analytical dataset. In this webinar, you can also review the process of creating various snapshots of the input dataset using the timestamp population object, and how to define targets for the predictive model.
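To illustrate the snapshot idea in plain Python: given a transaction log and a reference date, you can aggregate each customer's recent activity into derived variables. This is only a sketch of the concept, not Data Manager's actual implementation; the data and function names are invented.

```python
from datetime import date

# Hypothetical transaction log (illustrative data): (customer_id, purchase_date, amount)
transactions = [
    ("C1", date(2016, 6, 10), 120.0),
    ("C1", date(2016, 8, 1), 80.0),
    ("C2", date(2016, 7, 15), 200.0),
]

def snapshot_features(rows, snapshot, days):
    """Aggregate each customer's purchases in the `days` window before `snapshot`."""
    feats = {}
    for cid, day, amount in rows:
        age = (snapshot - day).days
        if 0 <= age <= days:
            count, total = feats.get(cid, (0, 0.0))
            feats[cid] = (count + 1, total + amount)
    return feats

# Different snapshot dates over the same log yield different training rows.
print(snapshot_features(transactions, date(2016, 9, 1), days=60))
# {'C1': (1, 80.0), 'C2': (1, 200.0)}
```

Moving the snapshot date (or widening the window) produces new derived variables from the same raw log, which is what makes timestamp-based snapshots useful for building training sets.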

The recording can be viewed via this on demand link.


Happy watching!! And please don't forget to provide your feedback.



Q3 2016 SCN Recap

Posted by Antoine CHABERT Sep 23, 2016

TechEd is a major event for SAP and for the SAP Predictive teams!

Please come meet us in Las Vegas, Bangalore and Barcelona!



Las Vegas lectures were recorded for your listening pleasure - hear from our top-notch experts!


Multiple webinars happened in Q3.


There was an askSAP webcast at the end of August, nicely summarized by Tammy Powlas.


A new series of webcasts is being delivered to ASUG members - this is in progress and awaits your registration if you haven't signed up yet!


For more webinars, please refer to our Events page.

Top picks for now:

  • Replay the recording for our Data Manager webinar delivered by Debraj Roy
  • Register for our Sept 29 webinar on supervised vs unsupervised clustering by Bertrand LAMY


In August, a trial version of the HCP, predictive services was launched.

Read more from Thierry BRUNET


Thierry BRUNET announced that some database platforms will no longer be supported in our planned Q4 release (spoiler alert: this future release is nicknamed 3.1).


The future of Predictive?

Orla Cullen tells it all, with no crystal ball!

Introducing the future of Predictive Analytics at SAP


Surya Kunju uses it to predict Box Office Success!


Are you bored by Deezer or Spotify recommendations? Try SAP Predictive's, thanks to Andreas Forster!


Engage with us and stay tuned for a rocking Q4 on SCN Predictive!

Machine learning, sentient artificial intelligence, humanoid robotics—all of a sudden these terms don’t feel as strictly ‘sci-fi’ as they once did. Films like Her and Ex Machina offered visions of a digital future that felt almost close enough to touch, in the sense that the very same technology could feasibly be in our own hands soon.




Machine learning in particular has seen strong progress in recent years, with the likes of Google, Amazon and SAP breaking new ground in creating algorithms that learn from data. In the spirit of Toronto International Film Festival (TIFF) and SAP’s Our Digital Future film series, why don’t we get down and dirty with a little machine learning to help us predict the success of a soon-to-be-released movie?

Let’s use La La Land, a comedy drama in which a jazz pianist falls for an aspiring actress in Los Angeles, as our test case. Ahead of the film’s Canadian premiere at TIFF ’16 and its full release in December, how can we determine the biggest factors in how it will perform at the box office?




The answer is in using predictive analytics, an aspect of machine learning that depends greatly on historical data. In today’s world, we can pull historical data about movies from various sources. Some of the key data points for our test include the starring cast, genre, the film’s MPAA rating (in this case PG-13), production budget, country of origin and runtime.


Another factor is the film's critical reception, both from the media and from movie database users. There are also technical data points such as sound mix, aspect ratio, camera, laboratory, negative format, cinematographic process and printed film format. The target variable is movie revenue. Using the above data, I created a classification model using SAP Predictive Analytics. Here is one of the most crucial outputs of the model, the contributing variables:
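As an illustration of what such a contributing-variables chart captures, here is a small plain-Python sketch that ranks toy movie features by their absolute correlation with revenue. This is a rough stand-in for the idea, not the algorithm SAP Predictive Analytics actually uses, and all numbers are invented.

```python
# Toy historical movie data; all values are invented for illustration.
data = {
    "budget":       [10, 40, 25, 60, 35],
    "runtime":      [95, 130, 110, 140, 120],
    "critic_score": [60, 85, 70, 90, 80],
}
revenue = [30, 200, 90, 320, 150]

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Rank variables by |correlation| with revenue: a rough stand-in
# for a model's "contributing variables" output.
contrib = sorted(data, key=lambda k: -abs(pearson(data[k], revenue)))
print(contrib)  # ['budget', 'runtime', 'critic_score']
```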




La La Land is written and directed by Damien Chazelle of Whiplash fame; it is of the same genre, has a similar MPAA rating and shares some of the same technical data points. With star power being the most important variable, however, we are left with the burning question: Has the director put together the right cast, and was he right to pair Emma Stone with Ryan Gosling?



To find out, I used a technique called social network analysis. I began by scraping data using publicly available Twitter APIs for mentions of #EmmaStone. I then filtered the data to show only male lead actors as part of the hashtag. Below is the graph I created to show the strongest recommendations to play Emma Stone’s love interest.
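The co-mention counting behind such a graph can be sketched very simply: count which hashtags appear alongside #EmmaStone and treat the counts as edge widths. The tweets below are invented placeholders for the scraped Twitter data.

```python
from collections import Counter

# Invented tweets standing in for data scraped via the Twitter API.
tweets = [
    "#EmmaStone and #RyanGosling would be magic together again",
    "pair #EmmaStone with #RyanGosling please",
    "#EmmaStone #MilesTeller could work too",
]

def co_mentions(tweets, anchor="#EmmaStone"):
    """Count hashtags co-occurring with `anchor`; edge width ~ count."""
    counts = Counter()
    for tweet in tweets:
        tags = [w for w in tweet.split() if w.startswith("#")]
        if anchor in tags:
            counts.update(t for t in tags if t != anchor)
    return counts

edges = co_mentions(tweets)
print(edges.most_common(1))  # [('#RyanGosling', 2)]
```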





The width of the line from Emma Stone to Ryan Gosling doesn't lie: it's a perfect match. Is it simply that the casting director is a genius, or has he been making use of machine learning himself? Either way, our little glimpse into the world of machine learning has resulted in the prediction of success for La La Land. Now we just have to wait for the film's release to test this theory out. This article first appeared on the SAP BusinessObjects Analytics blog.

Analytics has gained huge importance in the past few years and has turned out to be very useful for gaining insights from data and helping many companies improve their business performance. It mainly helps companies answer three important questions: "What has happened?", "What is happening?" and "What will happen?". Analytics not only helps in deriving insights from historical data but also helps in predicting the future and optimizing the business.


Analytics is of three main types:

1. Descriptive analytics

2. Predictive analytics

3. Prescriptive analytics

The main role of descriptive analytics is to analyze historical data and gain insights from the behavior of that data. It helps answer the question "What has happened?" and is used by most organisations in their day-to-day functioning.



Predictive Analytics

Predictive analytics is the technique of deriving insights and patterns from historical data and using them to predict the future, helping organisations take important decisions that improve their business.

Imagine how helpful it would be to receive advertisements only for products you are interested in. How helpful would it be to predict a person's disease just by checking their medical history and current symptoms? All of this can be done using predictive analytics.

Many organisations have started using predictive analytics to improve their business. Companies are able to target the needs of a customer and communicate only relevant product information. For example, if a customer buys a laptop from an eCommerce website, they are most likely to be interested in laptop accessories immediately, and the chances of them buying accessories for a competitor's laptop are low.



Following are the steps to build a predictive model:

  • First, data is collected from various data sources; it can be of any type.
  • Next, the data is cleaned and transformed into structured data, depending upon the business hypothesis.
  • Once the datasets are ready for use, predictive modeling techniques and algorithms can be applied to them to gain the insights that can be used to predict the future and thus improve the business of the organisation.

A predictive model builds on the descriptive model to anticipate the future.
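The steps above can be sketched end to end in a few lines of plain Python, with a closed-form least-squares fit standing in for the modeling step; the data is invented for illustration.

```python
# 1. Collect: raw records from some source, possibly messy.
raw = [("10", "52"), ("20", "95"), ("bad", "??"), ("30", "151")]

# 2. Clean & transform: drop rows that cannot be parsed as numbers.
clean = []
for x, y in raw:
    try:
        clean.append((float(x), float(y)))
    except ValueError:
        pass

# 3. Model: closed-form least squares for y = a*x + b.
n = len(clean)
sx = sum(x for x, _ in clean); sy = sum(y for _, y in clean)
sxx = sum(x * x for x, _ in clean); sxy = sum(x * y for x, y in clean)
a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
b = (sy - a * sx) / n

print(round(a, 2), round(b, 2))  # 4.95 0.33
```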


Prescriptive analytics uses optimization to find a set of possible options and give a better solution or action for a given situation. It helps answer "What should be done?" by recommending the optimal action, depending upon the results of the descriptive and predictive models. It uses complex algorithms to compare the outcomes of different actions and chooses the best of all. Prescriptive models are the most complex models and are hence used by fewer organisations.

Improving the accuracy of a predictive model is difficult, and people get stuck at times, but this is where the real story begins. A predictive model can be built in many ways; there is no single rule, but following the practices shared below will increase the accuracy of your predictive model.



1. Add more data.

More data generally results in a more accurate model and allows the data to speak for itself rather than relying on assumptions. We can always ask for more data in order to improve the accuracy.


2. Treat missing values.

Missing values can create a problem while building a predictive model, because we cannot analyze the relationships with other variables correctly. Missing values can be treated in different ways, such as replacing the missing value with the mean or median of the data; missing string data can be replaced by the mode. Outliers can be treated by removing them from the data, but it is not always a good idea to remove data.
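A minimal sketch of these treatments in plain Python, using the standard statistics module (the values are invented):

```python
from statistics import mean, median, mode

ages = [25, 30, None, 45, None, 30]       # numeric column with gaps
colors = ["red", None, "blue", "red"]     # string column with a gap

known = [a for a in ages if a is not None]
ages_mean = [a if a is not None else round(mean(known), 1) for a in ages]
ages_median = [a if a is not None else median(known) for a in ages]

known_s = [c for c in colors if c is not None]
colors_mode = [c if c is not None else mode(known_s) for c in colors]

print(ages_mean)    # [25, 30, 32.5, 45, 32.5, 30]
print(colors_mode)  # ['red', 'red', 'blue', 'red']
```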



3. Normalize the data.

Changing the scale of a variable to a scale between 0 and 1 is data normalization. Some algorithms work really well when data is normally distributed, so we should remove the skewness of a variable. We should also normalize the whole dataset onto the same scale so as to improve the accuracy of the predictive model.
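A minimal min-max normalization sketch, bringing two differently scaled variables onto the same 0-to-1 range:

```python
def min_max(xs):
    """Rescale values to the [0, 1] range so features share one scale."""
    lo, hi = min(xs), max(xs)
    return [(x - lo) / (hi - lo) for x in xs]

incomes = [20_000, 50_000, 80_000]   # large scale
ages = [20, 35, 50]                  # small scale

print(min_max(incomes))  # [0.0, 0.5, 1.0]
print(min_max(ages))     # [0.0, 0.5, 1.0]
```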


4. Feature Selection

Feature selection is the process of finding the best subset of attributes of the given data, one which better explains the relationship of the independent variables with the target variable. You can select a subset of attributes based on different criteria such as domain knowledge, data visualization, etc. It will surely help in making a better predictive model.
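One simple filter-style sketch of feature selection: drop features that barely vary, since a near-constant column can explain little about the target. This is only one possible criterion (domain knowledge and visualization should complement it), and the feature table is invented.

```python
from statistics import pvariance

# Toy feature table; "region_code" barely varies, so it carries little signal.
features = {
    "income":      [20, 55, 80, 40, 65],
    "age":         [25, 40, 60, 30, 50],
    "region_code": [1, 1, 1, 1, 2],
}

def select_by_variance(feats, min_var=1.0):
    """Keep only features whose population variance exceeds a threshold."""
    return [name for name, vals in feats.items() if pvariance(vals) > min_var]

print(select_by_variance(features))  # ['income', 'age']
```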


5. Use Different and Multiple Algorithms

Always try different algorithms and check which one gives a better result for your data; you can also combine multiple algorithms to get a more accurate model. Hitting the right algorithm is a very important task, so always check the performance of the model across different algorithms and pick the best one.
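The idea of trying several algorithms and keeping the best can be sketched with a holdout comparison; the two toy "algorithms" and the data below are invented stand-ins for real candidates.

```python
# Toy data: train on four points, compare candidates on a holdout split.
train = [(1, 2.1), (2, 3.9), (3, 6.2), (4, 8.0)]
holdout = [(5, 10.1), (6, 11.8)]

def fit_mean(data):
    """Baseline "algorithm": always predict the training mean."""
    avg = sum(y for _, y in data) / len(data)
    return lambda x: avg

def fit_linear(data):
    """Second "algorithm": least-squares slope through the origin, y ~ a*x."""
    a = sum(x * y for x, y in data) / sum(x * x for x, _ in data)
    return lambda x: a * x

def mse(model, data):
    """Mean squared error of a model on a dataset."""
    return sum((model(x) - y) ** 2 for x, y in data) / len(data)

candidates = {"mean": fit_mean(train), "linear": fit_linear(train)}
best = min(candidates, key=lambda name: mse(candidates[name], holdout))
print(best)  # linear
```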

Building a predictive model is difficult, but these steps can help in making a better one.

Data always speaks, all you need is to listen.

This was an SAP webcast held last week.

The usual legal disclaimer applies.


Source: SAP


Source: SAP

SAP's Ashish Morzaria started with predictive use cases and how the digital economy has changed


The internet has moved from a push to a peer-to-peer network

Enterprises have invested to connect suppliers and systems; every transaction is recorded and digitized.


This information is shared with people in the supply chain.


Source: SAP


The new economy encourages enterprises to turn digitized assets to their advantage


Inject predictive processes to help improve processes and decisions


How do you create new products, services and models? Use algorithms to analyze the data you have to develop a competitive edge.


Source: SAP


Companies embracing these technologies are winning


Companies who have invested in algorithms have more revenue and are more profitable and more competitive


Source: SAP


Predictive Analytics accelerates process using automated techniques


Faster, without a single line of code, and models can be reused by analysts in the organization


You can apply a segmentation to them to create derivatives of models


Workflows are repeatable, with guided algorithms


The data scientists can use guided workflow and do additional configuration and parameter tuning to understand results


Supports HANA and Hadoop and can push calculations down to HANA and Hadoop to reduce data transfer required


Source: SAP


HANA can run R scripts (R is an open-source statistical language and library ecosystem)


The issue with R is that it runs outside of HANA and this is inefficient


To solve this, SAP created PAL. PAL follows the 80/20 rule: you can create an algorithm using R, but common ones run natively in HANA to enable in-memory execution


APL (automated predictive library) for relational sources, now in HANA


HANA can use native algorithms, APL, and automated capabilities to run


Predictive Analytics uses all of this with one interface


Source: SAP


On the left you have traditional analytics

Bring into warehouse, analytical source, each step has a data extraction


Using HANA, with Predictive Analytics, federate queries/calculations to HANA layer


Flexibility of using R algorithm, mix and match between APL, PAL, R


Source: SAP


Smart data streaming includes the capability to apply scoring algorithm to streams, apply scoring "on the fly"



Source: SAP


Traditional data solutions start by sampling, reducing efficiency and accuracy


When using Hadoop with a single source, limited by system with analytic engine


Hadoop + Spark allows doing things across a large number of nodes


Native spark modeling federates calculation to the model itself; have Predictive Analytics software orchestrate in Spark


Source: SAP


This is a deeper diagram showing how it works


The Spark engine is running underneath


Traditional analytics was using code


New predictive analytics has a wizard approach; workflow is similar to using a relational source


Source: SAP


Tool that works for both the data scientist and citizen data scientist


Source: SAP:


Predictive analytics is part of an analytical environment


Traditional analytics answers the questions on the left; these are very descriptive types of questions


What promotions should I run (on the right)


With predictive, understand pattern of past to apply to the future




Source: SAP


Embedding in a workflow


Start with Predictive, create model, export model, and apply directly to a database (HANA, Oracle, etc.)


Once exported, any application using SQL will get a recordset out


Now any tool that can access database can report on it.


Every BI user can access score in this example


Source: SAP


Batch scoring: creating a retention campaign to address customers who are likely to leave


Take data that represents customers


Apply predictive model, take the score


Your BI user filters on those likely to churn
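The batch-scoring flow above can be sketched in plain Python: apply an exported scoring function to each customer record, then filter on the score as a BI tool would. The scoring rule and threshold here are invented; a real exported model would supply the actual equation.

```python
# Invented customer records standing in for the data warehouse extract.
customers = [
    {"id": "C1", "months_inactive": 6, "complaints": 2},
    {"id": "C2", "months_inactive": 0, "complaints": 0},
    {"id": "C3", "months_inactive": 4, "complaints": 1},
]

def churn_score(c):
    """Toy linear scoring function standing in for the exported model equation."""
    return 0.1 * c["months_inactive"] + 0.15 * c["complaints"]

# Score every record, then filter likely churners (threshold is arbitrary).
scored = [{**c, "score": churn_score(c)} for c in customers]
at_risk = [c["id"] for c in scored if c["score"] > 0.5]
print(at_risk)  # ['C1', 'C3']
```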


Q: Are there any plans to make R scripts first-class citizens by embedding them in HANA?
A: There are plans to bring them to HANA; due to legal issues, R will run on HANA but still as a separate process. There are no plans to make it an in-memory engine inside HANA itself.
Q: Is BusinessObjects Predictive Analytics a separate license?
A: It is a separate product; licenses are included in BusinessObjects Enterprise Premium, announced at SAPPHIRE NOW. Predictive capabilities also exist in BusinessObjects Cloud, more as an embedded case.
Q: Are there plans to embed GraphX in HANA?
A: You are able to do link analysis in HANA using existing tools


Poll: Source: SAP

In the Digital Enterprise age, competitors are very aggressive. They are fighting for market share that is no longer loyal to a brand: the new consumers are loyal to the company that provides innovation, makes their decisions easy and thereby brings value to the offer. If you cannot provide this, they will "google it" and find another provider in a matter of seconds.

A couple of years ago the #BigData wave hit the retail industry; nowadays almost all the key retailers have successfully implemented a BigData architecture that is elastic and analytic and brings data democratisation to their organisations.

But that is the past. Now retailers face a new challenge: how to retain customers, how to make them come back to your store (physical or virtual) and prefer you over the competitors. This is a recurring block of questions that any retailer has in mind.

The solution lies in a revolutionary personalised marketing strategy, not the typical one. This one will be based on technology, specifically on Machine Learning together with predictive algorithms, and that's just the backend. Once your personal offer has been selected by those two elements, you need to put it in the customer's hands in a non-intrusive manner that looks so natural the consumer sees it as an obvious option. Mobility is the key here, but mobility in the wide sense that includes new TV apps (like AppleTV, Google ChromeTV, etc.), wearables (Apple Watch, etc.) and enhancements in native mobile behaviour.

Let me explain each component with a quick example:

Machine Learning: Machine learning is a subfield of computer science that evolved from the study of pattern recognition and computational learning theory in artificial intelligence. It can be exploited in the retail industry to analyse individual trends: what does a customer like, what not, how did they react to previous offers, what attracts them to our store. With your BigData and the right infrastructure (like SAP HANA, etc.) you can get this very easily.

Predictive Algorithm: Once you have machine learning processing the data and learning your customers' preferences and trends, you need to apply algorithms that identify when the customer has their next buying intention and, if possible, which articles are on their wishlist; then make an offer and wait for the catch. Successful retailers like Amazon, Alibaba, etc. already use this with amazing results.

New TV Apps: Yes, wake up: traditional channels are obsolete. If you want a personalised offer to work, you need to be sure the customer sees it, and when they expect to, so you need to link your publicity to the new television, which is on demand, via apps on platforms like AppleTV (tvOS) or Android.

Wearables: Smartwatches and virtual reality are here; people get a new device every day, and new retailers already have VR stores. What about you? What about notifications on the smartwatch?

Native Mobile Behaviours: A coupon that goes directly to the customer's digital wallet, using PayPal or Bitcoin; a virtual loyalty card with some money on it. You need to re-design your mobile apps to make them more attractive and use the latest technology, for example iBeacons to detect a customer in your store's proximity and offer him or her something they really want, based on the machine learning and predictive algorithms in your backend.

This is the 2017 challenge. Are you working on it?

The next release, SAP BusinessObjects Predictive Analytics 3.1, is currently planned for Q4 2016. SAP plans to remove Greenplum, MySQL and Ingres database support in this release. This decision was made based on a careful evaluation of the databases used in our customer install base, and also by taking into account the fact that the database versions supported on these platforms were legacy versions no longer supported by the database provider. This will speed up product innovation deliveries by allowing us to focus on the database platforms currently in production use.


For customers who wish to continue using SAP BusinessObjects Predictive Analytics on these databases, the end of mainstream maintenance dates are the following:

  • SAP InfiniteInsight 7.x: December 31st, 2018
  • SAP Predictive Analytics 2.x: February 10th, 2017
  • SAP Predictive Analytics 3.x: June 30th, 2018


To take advantage of the new features of SAP BusinessObjects Predictive Analytics 3.1, we recommend that you contact customer support and adopt one of the supported databases listed in the PAM of SAP BusinessObjects Predictive Analytics 3.1.


For more information about this communication, feel free to contact me directly via email (see my SCN profile).

Following Antoine CHABERT's blog post The TechEd season is back!, you will find here some of my personal thoughts about why TechEd is really a great event, as well as additional details about the sessions I'm in charge of:

  • ANP160: SAP BusinessObjects Predictive Analytics 101: Discovery Session
  • ANP360: Integration and Scripting with SAP BusinessObjects Predictive Analytics
  • ANP600: Start Developing with SAP HANA Cloud Platform Predictive Services


This year will be my third TechEd season. During the first one, I was there supporting my team as they delivered their sessions. Last year, I had the chance to get one of my own sessions selected, which I delivered in Vegas and Barcelona. But this year, I'm super proud to "reach" a new level with 3 sessions selected and the chance to deliver all of them at the 3 TechEd locations.


Yes, all 3 TechEd locations!! This will be my first time in India as well.


In numbers, this means that over a 2-month period I will be flying around 35,000 km, or 64 hours of flight time.


Despite the fact that I will most likely lose my hotel key card at least 5 times, not remember where my room is at least twice, desperately try to open someone else's room, or simply forget something at home at least once, I feel really lucky!


The reason is simple: I'll meet you guys, our user community!


Being part of a "Go-To-Market" team implies being close to our user base: meeting them as much as possible to understand what products or solutions they are looking for and the issues they are trying to solve, and explaining what our products and solutions can do for them and of course how. And all that with no "marketing" slide decks (avoiding them is one of my personal goals), but just with 3 simple things: listen, explain and demo!


So I really look forward to meeting all you guys there!


Now, let's go back to the session list. Let me tease you a little bit so you will add them to your agenda right away after you register!


ANP160: SAP BusinessObjects Predictive Analytics 101: Discovery Session (2 hour hands on)

This is a discovery session for beginners with SAP BusinessObjects Predictive Analytics, but also for "Predictive" newbies (don't be shy, everyone is welcome anyway with me!).


Following one of the leading methodologies used to run a "Data Mining / Predictive" project, we will be running a "real" customer scenario from end to end but with simulated / re-arranged data of course (we are still part of SAP and we are really keen on data privacy).


All in all, you will prepare a data set, build models, deploy and monitor those models using SAP BusinessObjects Predictive Analytics. But that's not all, you will also have some quizzes to answer and some room for discussion about how you would do things (and this is where I'd like to hear from you).


ANP360: Integration and Scripting with SAP BusinessObjects Predictive Analytics (2 hour hands on)

This session is tailored for "intermediate" or "expert" users of SAP BusinessObjects Predictive Analytics.


SAP BusinessObjects Predictive Analytics provides many ways to integrate into your ecosystem and also to extend its core capabilities.


During this session we will look at both Automated Analytics and Expert Analytics and their respective capabilities.


Some of the topics we will cover during this session are:

    • Automated Analytics:
      • Generate your scoring equation
      • Introduction to KxShell scripting
      • Build a program using the Java API
    • Expert Analytics
      • Extend your predictive library with open source R
      • Export your model chain to SAP HANA


Based on recent polls (What is your favorite SAP Predictive Analytics API or scripting language capabilities? & What is your favorite SAP HANA Predictive Analytics Library?), we will also have a dedicated section about the SAP HANA Automated Predictive Library (APL).

Just like during any of my sessions, you are more than welcome to provide your feedback about things you liked or not, things you'd like to see in the future session or in the product itself.

ANP600: Start Developing with SAP HANA Cloud Platform Predictive Services (1 hour Code Jam mini edition)

This Code Jam will be focused on SAP HANA Cloud Platform (HCP), predictive services (HCPps), and this will be the first time we have a TechEd session on this fresh new product area.


The idea during this session is to let you configure your HCP trial account to enable HCPps, and then either build a quick SAPUI5 application that uses the predictive services or use a RESTful client to interact with them.


This will be the first Code Jam I'll be delivering, so I hope I got the format right and that you will enjoy it!



To conclude this blog post, I'd like to say that I know picking sessions at TechEd is always a tricky exercise (maybe we should have a session at TechEd for this as well!). We all know that:

  • There is a limit on the number of sessions you can register for
  • There is a limit on the number of seats per session
  • There might be conflicting schedules across different topics (we managed not to have any conflicting session schedule across Predictive in Vegas, and it was challenging!)


But I'm sure you will do your best to attend these sessions.


So, I look forward to seeing you @ TechEd Vegas, Bangalore and Barcelona!


By the way, if you are into Predictive and attending TechEd in Barcelona, please check the following blog post about our pre-conference session: SAP TechEd Barcelona Predictive Pre-Conference

PS: For those attending TechEd in Las Vegas, the 21st of September will be my birthday! Expect a lot of fun, like being hugged at the food court by my friend and colleague Ashish Morzaria after I decided to wear this sticker on my TechEd shirt!


As I described in my post The TechEd season is back!, we have plenty of cool predictive sessions that we propose across the 3 TechEd locations.


In this post I would like to focus on our half-day exclusive pre-conference activity taking place in Barcelona on the morning of November 7.

For detailed schedule & registration information, please refer here: Pre-Conference Activities.


This pre-conference seminar has been designed for our customers & partners who want to learn how to extend and leverage the capabilities of SAP BusinessObjects Predictive Analytics, from our best experts Abdel DADOUCHE, Jean-Baptiste GAUTRON, Jayanta Roy and Paul Pallath.


This seminar will be delivered only during TechEd Barcelona.


Extend means creating R scripts to tackle specific use cases. These R scripts can be packaged as extensions and advertised by our partners in the SAP Analytics Extensions directory. This part of the seminar assumes some basic understanding of the R language from the participants.


Leverage means taking advantage of our HCP, predictive services, which make it possible to create forecasts, understand the key influencers of an output variable and more, all using simple interfaces. This is all about user-friendly services; participants need to have some understanding of how to use a REST web service.
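As a taste of what calling such a REST service might look like, here is a hedged Python sketch that only builds the request URL and JSON body; the endpoint path, host name and field names are assumptions for illustration, not the documented HCPps contract, so check the official documentation before calling the real service.

```python
import json

# Hypothetical base URL; the real one depends on your HCP trial account.
BASE = "https://aac4paservices-myaccount.hanatrial.ondemand.com"

def forecast_request(dataset_id, target, horizon):
    """Build the URL and JSON body for a hypothetical forecast call."""
    url = f"{BASE}/api/analytics/forecast"
    body = {
        "datasetID": dataset_id,       # assumed field name
        "targetColumn": target,        # assumed field name
        "numberOfForecasts": horizon,  # assumed field name
    }
    return url, json.dumps(body)

url, body = forecast_request(42, "sales", 6)
print(url)
print(body)
```

A RESTful client (or an SAPUI5 application) would then POST this body with the appropriate authentication headers.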

You might notice that we have also scheduled a 1-hour CodeJAM, ANP600, on HCP, predictive services.

This pre-conference seminar goes into deeper detail on what you can do with the HCP predictive services and how you can do it.

Feel free to post any questions you might have about this session in the comments, and don't wait: register today by clicking the link below!

Add this Seminar to My Current Registration


This post lists the steps to deploy and configure predictive services on your HCP trial account. It assumes you already have a trial account; otherwise, visit this site and request a free account.


Step 1: Enable Predictive Services

From the “Services” page, click on the “Predictive Services” tile.


Click on the blue “Enable” button.


After a few seconds, the predictive services will be enabled. You will then have to deploy them.


Step 2: Deploy Predictive Services

From the previous page, there is a link to the online help, which describes the services and how to configure them.


For the moment, click on the “Go to Services” link, which brings you to the services cockpit.


Click on the tile. You arrive at a page pre-filled with your HCP trial account and user name. Just enter your HCP password, click the “Deploy” button, and click “Yes” on the confirmation dialog.


The deployment takes a few seconds, after which you get a URL to the dashboard of the predictive services.


In this URL you can recognize:

  • Your HCP trial account, and
  • The name of the Java application of the predictive services: aac4paservices
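As a small illustration, the URL follows the usual HCP trial naming scheme, with the application name prefixed to the account name; the account value in this sketch is a hypothetical placeholder.

```python
# Compose the host the way HCP trial builds application URLs:
# <application><account>.hanatrial.ondemand.com
application = "aac4paservices"  # Java application of the predictive services
account = "p1234567trial"       # your HCP trial account (placeholder)

dashboard_url = f"https://{application}{account}.hanatrial.ondemand.com"
print(dashboard_url)
```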


There is also a note advising you to grant yourself a specific role to access this dashboard. The remark about APL does not apply to the HCP Trial, meaning you don’t have to install APL on your HANA database schema. You will just have to create a binding between the predictive services and your HANA database schema; this point will be detailed later.


So go back to the “Predictive Services – Overview” page and click on the “Configure Predictive Services” link.


Click on “Roles” and assign the role “C4PA-USER” to your HCP user name to be able to use the predictive services. If you also want to administer them, grant yourself the “AA-Admin” role.


Step 3: Start Predictive Services

Go to the dashboard, or click on “Applications/Java Applications” and select “aac4paservices”.


Predictive Services are stopped:


Click on the “Start” button; after a short while, they will be running.


Step 4: Create a HANA MDC database

Such a binding must be done on a HANA MDC database, so go to “Persistence/Databases & Schemas” and click on the “New” button. Fill in the dialog like this (note that you can give another database ID). Remember the password you give for the SYSTEM user, because it will be used to launch the “SAP HANA Web-based Development Workbench”.

I presume you have the roles needed to use this workbench; otherwise, visit the help of the SAP HANA Web-Based Development Workbench.


Once this database is created, it appears on the Overview page.


Step 5: Create a technical user

Click on “SAP HANA Web-based Development Workbench” and then on the tile “Security”.


Select “Users”, right-click, and select “New User”. Complete the page like this and click on the “Save” icon.


Important note: before continuing, you have to connect to the database with this new user at least once, because at the first connection the system requests a change of the initial password.


Step 6: Create a schema for your data

The data you will use with the SAP HCP, predictive services must be in the SAP HANA database, either in a table or a view. To guarantee data privacy, this table or view should be in a protected schema; this is why we create the schema PS_DATA here. It is also necessary to grant the SELECT privilege on the tables of schema PS_DATA to PS_USER.


From the “SAP HANA Web-based Development Workbench”, click on the “Catalog” tile. Right-click on “Catalog”, select “New Schema”, give it a name, and click “OK”.


The schema is created. Go to the “Security” tile, grant the SELECT privilege on the schema PS_DATA to PS_USER, and save.


After this, you can create tables or views in the schema and import data.
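If you prefer the SQL console of the workbench over the UI clicks above, the same schema and privilege setup can be sketched as the two statements below. They are only assembled and printed here, not executed; the names match the ones used in this step.

```python
# SQL equivalents of the UI steps above, to paste into the SQL console of
# the SAP HANA Web-based Development Workbench. Nothing is executed here;
# the statements are only built as strings and printed.
schema = "PS_DATA"
user = "PS_USER"

statements = [
    f'CREATE SCHEMA "{schema}"',
    f'GRANT SELECT ON SCHEMA "{schema}" TO {user}',
]

for stmt in statements:
    print(stmt + ";")
```

Granting SELECT at the schema level covers the tables you will import into it afterwards.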


Step 7: Create a HANA database binding

Now that the database technical user is created and has access to the schema PS_DATA, which contains the tables that can be used as datasets, the last step to finish the configuration is to establish the link between the predictive services and the schema PS_DATA.


To do this, you will bind the application of the predictive services to your SAP HANA instance with the technical user PS_USER created before, which has the authorization to access your data. It is through this connection that the predictive services will analyze your data and provide you with insights.


From HCP go to the Java applications and select the Java application of the predictive services.


Click on “Data Source Bindings” and then on “New Binding”. Fill in the page like this and click on “Save”.


Once created, you get this:


To check that the predictive services are correctly bound, go to the cockpit menu Applications/Java Applications and choose “aac4paservices”. Click on the application URL and then on the “Administration” tile.


You see the status is OK.


A click on the “Binding” tile displays the binding just created, ready to work.


You are now ready to use the HCP predictive services in an HTML/JavaScript application or in a Java application.
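A typical first call from such an application registers a dataset with the services before asking for forecasts or key influencers. The sketch below only prepares the HTTP request with Python’s standard library and prints it; the endpoint path, the body field, the host, and the credentials are assumptions to be checked against the online help.

```python
import base64
import json
import urllib.request

# Hypothetical host and credentials -- replace with your own trial values.
base = "https://aac4paservicesp1234567trial.hanatrial.ondemand.com/com.sap.aa.c4pa.services"
auth = base64.b64encode(b"P1234567:secret").decode()

# Register a table of the PS_DATA schema as a dataset (field name assumed).
body = json.dumps({"hanaURL": "PS_DATA/CUSTOMERS"}).encode()
req = urllib.request.Request(
    url=base + "/api/analytics/dataset/sync",
    data=body,
    headers={"Authorization": "Basic " + auth,
             "Content-Type": "application/json"},
    method="POST",
)
print(req.get_method(), req.full_url)
# To actually send it: response = urllib.request.urlopen(req)
```

Once the dataset is registered, the analysis services can be called against it in the same request/response style.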


To get more information about developing HCP applications using the HCP predictive services, visit the online technical help or watch these videos.


Pierre Leroux

A Recap of 2016 So Far

Posted by Pierre Leroux Aug 15, 2016

It’s been six months since our first Predictive Thursdays blog and during that time, 11 different authors have shared their thoughts about predictive analytics on 27 occasions. Looking back at these blogs, I’ve noticed four distinct categories, and today, I want to revisit the best blogs for each category.


1) Why Predictive Analytics

Yes, predictive is going mainstream in 2016, but many enterprises don’t feel ready to take the plunge yet.

2) Predictive Use Cases

Today, predictive analytics is everywhere—marketing, health care, sports, even the production of sandwiches! Here are some of the popular use cases that our staff see when working with customers:

3) Machine Learning and Algorithms


4) News About Predictive Analytics

We had an extremely busy first half of 2016 at SAP, and we were excited to share the latest news as it happened. The highlight so far has been the release of SAP BusinessObjects Predictive Analytics 3.0.




[This post originally appeared on the SAP BusinessObjects Analytics blog]



The ever-increasing interconnectedness of people, business, and ‘things’ in the digital realm is radically disrupting existing business models. This digital shakeup is changing the way companies create value, interact with customers and business partners, and compete. The good news is that with new digital technologies, companies can reimagine business models, rise to disruptive market entrants, and squeeze more productivity from fewer resources.

It’s now recognized that the use of predictive analytics can serve as a catalyst for digital transformation by surfacing powerful conclusions from disparate data sources. Companies embracing digital transformation by investing in advanced analytics are winning. They are increasing their revenue, market valuations, and profitability.

Coming up on August 31, the latest #askSAP Analytics Community Call, “Reimagine Predictive Analytics for the Digital Enterprise,” will give participants a chance to look at how SAP BusinessObjects Predictive Analytics can be part of their digital transformation today.


A game changer in the predictive space, SAP BusinessObjects Predictive Analytics helps companies create, deploy, and maintain thousands of predictive models that anticipate future outcomes and guide better, more profitable decision-making across your digital enterprise.

On the call will be SAP experts Richard Mooney, Lead Product Manager for Advanced Analytics, and Ashish Morzaria, Global Go-To-Market Director, Advanced Analytics, as well as SAP Mentor Greg Myers of EVtechnologies.


The speakers will provide details about the 3.0 release of SAP BusinessObjects Predictive Analytics, and discuss how the solution can be used to solve real-world predictive problems, such as Customer Churn and Product Recommendations.

During the call, attendees will hear about how the release can work with SAP HANA to provide optimized, on-the-fly data processing directly on an in-memory platform. They’ll also learn how to scale the usage of machine learning across an entire enterprise and manage thousands of models throughout their lifecycle across different IT landscapes with the predictive factory.

You won’t want to miss this informative, interactive event. You can connect during the community call and submit your questions via Twitter with #askSAP. You can also interact now with our #askSAP experts—Richard Mooney, Ashish Morzaria, and Greg Myers.


Details of the #askSAP Analytics Innovations Community Call  “Reimagine Predictive Analytics for the Digital Enterprise”

  • Wednesday, August 31
  • 8AM PST/11AM EST/5PM CET (90mins)
  • Register Now


[This post originally appeared on the SAP BusinessObjects Analytics blog]

SAP HANA Smart Data Access (SDA) allows the SAP HANA Studio user to define a virtual table pointing to a table located in a remote data system like Hadoop, Sybase IQ, Teradata, to name a few. Thanks to SDA, with a single HANA ODBC connection, the Data Manager user in SAP Predictive Analytics 3.0 can manipulate tables coming from multiple databases.

In order to predict customer churn, a business analyst can define variables in Data Manager using customer information stored in the SAP HANA system as well as detailed call center history stored, for example, in Hadoop. If you want to see how it works, watch the tutorial videos from the SAP HANA Academy:
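Under the hood, SDA boils down to two SQL statements: one creating the remote source and one creating the virtual table that points at it. The sketch below only assembles illustrative DDL; the source name, DSN, credentials, and table names are hypothetical, and "hiveodbc" is the adapter typically used for Hadoop/Hive.

```python
# Illustrative SDA DDL, built as strings only (run such statements from
# SAP HANA Studio or a SQL console, not from Python). All identifiers,
# the DSN, and the credentials below are placeholders.
remote_source = "MY_HADOOP"

ddl = [
    f'CREATE REMOTE SOURCE {remote_source} ADAPTER "hiveodbc" '
    "CONFIGURATION 'DSN=HIVE_DSN' "
    "WITH CREDENTIAL TYPE 'PASSWORD' USING 'user=hive;password=secret'",
    # The virtual table lives in a local schema but reads remote data:
    'CREATE VIRTUAL TABLE "PS_DATA"."CALL_LOGS_VT" '
    f'AT "{remote_source}"."<NULL>"."default"."call_logs"',
]

for stmt in ddl:
    print(stmt + ";")
```

Once the virtual table exists, Data Manager can join it with local HANA tables through the single HANA ODBC connection mentioned above.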

