This is part 2 of my blog series, "Building a start-up in the SAP HANA space".

Over the past 15 years, I have developed a variety of applications: web, mobile, some in the startup realm, some just side projects, customer projects, partner applications, and the list goes on. If there is one common denominator across all of them, it is that I generally started writing code before I knew *exactly* what I was going to build. Looking back, I realize why I did this: it was the enthusiasm I had for wanting to make something work, to bring something to reality, and to show that it *is* possible.


metric² was no different. Of course I had learned a few things over the years, but since I was working with a newer technology, I was keen to get my hands dirty and see what was possible. Before I got started, I put together a list of core features, a technology stack, frameworks to be used, and how it should all be integrated. I thought I was doing things "differently" this time, by starting out with a surefire plan to develop something that was going to win. But I missed one of the most important topics: my "Value Proposition".


I think your value prop should be something that gets "sketched" before you write a single line of code, before you decide what tech stack your app or product is going to be governed by, and before you tell your best friend about your big idea. More specifically, it will be *what* you tell your best friend your big idea is. More often than not, and certainly in my case, the ability to write code without a goal in mind lets your code define your value proposition, which is the wrong way around! It was also extremely difficult for me to blurt out what my application actually was (just ask Mark Finnern and Phil Loewen), because I wasn't able to clearly define it for someone. So even though I had written a ton of code, was "pitching" my application, and felt like my beta was finished, I went back to the drawing board to try to clearly define my value proposition and what it meant to metric².




So what exactly is a "Value Proposition"?


To me, a value proposition defines what benefit you provide, for whom, how, and what makes your service unique compared to the competition.


Nate Gold once told this story:

The story is told of an unannounced visit by John F. Kennedy to the space center at Cape Canaveral in the mid-1960s. Kennedy toured the complex and met a man in overalls. "What do you do here?" he asked. The man replied, "I'm earning a living." Kennedy nodded and moved on. He met another man in overalls and asked him the same question. "I clean away all the rubbish," the man said. Kennedy smiled and strode on until he met another man in overalls and put the same question again. This time a big smile came across the face of the man who replied, "Mr President, I'm helping to put a man on the moon."


Nate suggested this as a great example of a value prop; I like to think of it as the sum of the parts.


What did I do to create the metric² value prop? I sat down and went through the "hows" of what we do (which is what Nate suggested), tried to define what the application or product was trying to accomplish and through what means, and came up with the statement below.






What's the difference between a "Value Proposition" and an "Elevator Pitch"?

An elevator pitch is better suited to what you would say to someone you have just met who asks what you do. You say it with passion and pride, and it's probably more focused on why you are doing what you do. It may also include your value proposition and add to it, or elaborate on a specific topic.


I created this elevator pitch for m²:


metric² is developing an easy-to-use, web-based platform that makes it quick and simple to build predictive, streaming, and real-time dashboards, helping enterprises succeed through data-driven decisions.




Why are these two statements important?


Once I decided what they were going to be, I put them up on the wall in front of my desk, and every time I develop or ideate something for m², I consider whether it falls into the category of helping to make these two statements a reality.


While it might seem like a no-brainer, I really believe these are the "small" things that count toward keeping you on track when building a product (or on any project where you get to define the scope). I also believe that these two statements now drive my development on the project, rather than the other way around.




What's next?

In my next post I plan to chat a little about what a startup is, what the difference is between a startup and a side project, and, more importantly, why it matters. We will also look at some of the stages of building a startup.




As a technology advisor to startups, I have the opportunity to work with innovative startups from all kinds of areas. One area that has been especially popular this year is IoT. Many startups have asked me about processing streaming data in real time, so I'd like to introduce the new smart data streaming capability in SAP HANA SPS09 with some sample use cases.


Smart data streaming provides the ability to process incoming event streams in real time, as fast as the data arrives: capturing the desired data in the SAP HANA database, monitoring the data, and generating alerts and notifications as desired. Data flows into a streaming project running on the streaming cluster through input adapters, which provide the connectivity to data sources. Continuous queries contained in the streaming project turn the raw input into the desired output. Data is then pushed out to destinations through output adapters, which are connected to the streams or windows in the project.


Go to the Landscape view of the HANA Studio Administration Console to check the services and make sure the streaming server is running.



Install HANA Streaming Plug-in in HANA Studio:

You can get the installer from your instructor. Then go to Help->Install New Software, click Add, select the path where you put the installer, and follow the on-screen instructions. After the installation, restart your HANA Studio; you should now see two additional perspectives, SAP HANA Streaming Development and SAP HANA Streaming Run-Test.


Add Streaming Server and Data Services

Switch to the SAP HANA Streaming Run-Test perspective. In the Server View panel, click to add a new server URL, esps://<Your HOST>:30026, and log in with a HANA user name and password (you can use the SYSTEM user). You can also create new workspaces in which to put your streaming projects later; here we have created a workspace “sfp".


Now go to SAP HANA Streaming Development to add the data services. Right-click the streaming server and click “Add ODBC Service”, choose the 64-bit version of the driver, enter the HANA user and password, check “enable as HANA reference service”, and specify the ODBC DSN. If you do not know the ODBC DSN, please ask your instructor; the ODBC DSN must be configured on the Linux server of your HANA system.


Here is an example of the ODBC DSN configuration in Linux: a .odbc.ini file must exist in the home directory of the hdbadm user of the HANA system, by default at /usr/sap/HDB/home. Below is the DSN definition in our example.
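As an illustration, a DSN entry in .odbc.ini for HANA typically looks like the following sketch. The DSN name, host, and port are assumptions, and the driver path depends on where the HANA client is installed on your server:

```ini
# Hypothetical DSN entry -- adjust the name, host, port, and driver path
# to your own landscape.
[HANA_DSN]
driver=/usr/sap/hdbclient/libodbcHDB.so
servernode=myhanahost:30015
```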




Now save the data service and right-click to discover the server; you should see that the schemas, tables, and table columns from the HANA system are visible.


Go to the Preferences and change the default compile server (localhost) to your streaming server. In the following screen, select the Change button and choose your server in the prompt.


Now all the prerequisites have been fulfilled and we can start to create the streaming project to simulate a real IoT use case.

The Streaming Use Case

Scenario: a simple IoT scenario where we are collecting and monitoring sensor data from refrigeration units. The coolers are equipped with sensors for temperature, power, and door open/close events. The data structure of the messages generated by the refrigerators looks like this:

We will create streaming projects that assume the incoming event data looks like the following table, which represents events related to door, temperature, and power.









[Sample MACHINEDATA events at 2015-01-20 12:00:00.000, 12:00:01.000, 12:00:02.000, 12:00:03.000, and 12:00:06.000; the remaining column values are not shown here.]
These are some common use cases:

1. Filter the data to capture all the “door events” – i.e. open/close events – in a HANA table that represents the activity for the coolers (Objects: Filter window object, HANA Output Object)

2. Create a moving average of the temperature for each cooler and monitor that the temperature of each cooler is in the desired range; if not, add an alert to an output window (Objects: Join window object, Average window object, Derived window object)

Exercise 1 - Capture the Door Events

Let's start with the simplest use case: filtering the events and capturing certain types of events into HANA. In this case, we only want the “door events”, using a filter on a stream/window and the HANA Output object in smart data streaming.

Create the streaming project: go to the SAP HANA Streaming Development perspective of HANA Studio, right-click in the Project Explorer panel, click New->Other, select SAP HANA smart data streaming->New Streaming Project, give it the project name “door_event”, and click Finish to create the project.


Rename the default stream to MACHINEDATA, add the columns MACHINEID, EVENT_TIME, EVENT_NAME, EVENT_DESCRIPTION, and EVENT_VALUE to the stream, and create the filter to ensure that EVENT_NAME is equal to ‘DOOR’. Create a HANA Output from Output Adapters and configure the adapter properties as below. The target database table here will be “” (the table must already exist).



Connect the stream object with the Output and save the project. Select the project name, right-click, and go to SAP HANA smart data streaming->Compile Streaming Project. If no error message appears, your project has been compiled and is ready to run.



There is a corresponding .ccl file in your project, which is the generated code of the project. CCL stands for Continuous Computation Language and is the primary event processing language of SAP HANA smart data streaming. CCL is based on Structured Query Language (SQL), adapted for stream processing. The key distinguishing feature of CCL is its ability to continuously process dynamic data. A SQL query typically executes only once each time it is submitted to a database server and must be resubmitted every time a user or an application needs to execute it. By contrast, a CCL query is continuous: once it is defined in the project, it is registered for continuous execution and stays active indefinitely. We will cover more in the other examples.
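To make this concrete, the door_event project compiles to CCL along these lines. This is only a sketch; the column types are assumptions, and the HANA Output adapter (configured in the visual editor above) is omitted:

```ccl
// Sketch of the door_event project in CCL -- column types are assumptions.
CREATE INPUT STREAM MACHINEDATA
SCHEMA (
    MACHINEID string ,
    EVENT_TIME msdate ,
    EVENT_NAME string ,
    EVENT_DESCRIPTION string ,
    EVENT_VALUE string );

// Continuous filter: only door events pass through.
CREATE OUTPUT STREAM DOOR_EVENTS
AS SELECT * FROM MACHINEDATA
WHERE MACHINEDATA.EVENT_NAME = 'DOOR';
```

The HANA Output adapter is then attached to the filtered stream to write the surviving events into the target table.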




Now select the project and right-click SAP HANA smart data streaming->Run->Run Streaming Project in Workspace <Your Workspace>. HANA Studio will switch to the SAP HANA Streaming Run-Test perspective automatically, where you can see the running project and the input stream object MACHINEDATA; double-click it to show the stream view of the object.




In the real world, devices/machines would connect to the streaming server through various adapters for all kinds of source systems. Here we simply simulate this, using the Manual Input tool to represent the incoming events. I am going to send these two events; as you can imagine, the first event will be filtered out, while the second one is a door event and will be captured by SAP HANA.









[Manual Input events at 2015-01-20 12:00:00.000 and 2015-01-20 12:00:01.000; the second is the DOOR event.]




As you can see, two messages have been sent to the streaming server, but only the door event shows up in the stream view and in the HANA table "".



Exercise 2 - Monitor Moving Average of Temperature

We will create a moving average of the temperature for each cooler and monitor that the temperature of each cooler is in the desired range; if not, we add an alert to an output window. We will use the Join window object, the Average window object, and the Derived window object in this project.

Go to the SAP HANA Streaming Development perspective of HANA Studio, right-click in the Project Explorer panel, click New->Other, select SAP HANA smart data streaming->New Streaming Project, give it the project name “avg_temperature”, and click Finish. You will see the .ccl, .ccr, and .cclnotation files of the project.

As in the previous project, rename the default stream object to MACHINEDATA and add the same columns with the appropriate types.

On the right side of the canvas, go to Streams and Windows and select the Reference, which refers to an existing table in HANA. Here we will use the table “”, which stores the detailed information for each refrigeration unit, e.g. the location, the name, and the max and min temperatures of each unit.


Join the Reference with the MACHINEDATA stream object on the MACHINEID column. The MACHINEDATA stream object represents the real-time streaming data, and MACHINE_DETAILS_REFERENCE is the window object representing the static data of the machine specifications; needing to join these kinds of data is very common in real-world stream processing.



Here is the join expression prompt:




As we are going to monitor the average temperature, we next need to create the Aggregate object: calculate the average temperature for every 1000 records with the formula highlighted below, specify the Group By condition, and filter out the events that are not related to temperature.



Finally, create a Derived Window object for the alarm: when the moving average temperature is higher than the max temperature of the machine, send out the alarm.



The entire project will look like the below; compile the project and fix any errors you see in the console.



Below is the CCL code of the project. The code largely explains the workflow itself and should be easy to follow.

// Input stream carrying the raw sensor events
CREATE INPUT STREAM MACHINEDATA
SCHEMA (
  MACHINEID string ,
  EVENT_TIME msdate ,
  EVENT_NAME string ,
  EVENT_DESCRIPTION string ,
  EVENT_VALUE string );

// Reference to the machine-details table in HANA; the source table name
// and several column names did not survive in this listing
CREATE REFERENCE MACHINE_DETAILS_REFERENCE
SCHEMA (
  MACHINEID string ,
  ... string ,
  ... string ,
  ... integer ,
  ... integer ,
  ... string ,
  ... string )
PRIMARY KEY ( MACHINEID )
PROPERTIES service = 'hana' ,
  sourceSchema = 'SFP' ,
  source = '' ;

// Aggregate window computing the moving average temperature
... avg ( to_integer ( EVENTS.EVENT_VALUE ) ) AVG_TEMP , ...

// Derived window raising the alarm when the average exceeds the machine's maximum
... 'Machine not maintaining temperature' ALARM_DESC FROM AVG_TEMP WHERE ...


Now select the project and right-click SAP HANA smart data streaming->Run->Run Streaming Project in Workspace <Your Workspace>. HANA Studio will switch to the SAP HANA Streaming Run-Test perspective automatically, where you can see the running project and the input stream/window objects such as MACHINEDATA; double-click any of them to monitor the data.

Again, we will use the Manual Input tool to simulate the incoming events; the event data is shown in the table below.







MACHINEID | EVENT_TIME              | EVENT_NAME | EVENT_DESCRIPTION | EVENT_VALUE
          | 2015-01-20 12:00:00.000 |            |                   |
          | 2015-01-20 12:00:02.000 |            |                   |
2DDDBW3TA | 2015-01-20 12:00:03.000 | TEMP       | InsideTemp        | 66
2DDDBW3TA | 2015-01-20 12:00:04.000 | TEMP       | InsideTemp        | 60


Here you can see the list of stream/window objects for the current project. From the console, you can see that four messages have been sent to the streaming server. For the EVENTS object, which is the JOIN of MACHINEDATA and MACHINE_DETAILS_REFERENCE, you can see the name, location, and other machine information that was stored in an existing HANA table.




For the AVG_TEMP window, you can see that the moving average value has been calculated.




For the ALARM_TEMP window, you can see the alert below. This represents the output of the stream processing; in the real world, you might feed this result into a monitoring dashboard, send an alert to the onsite technician to fix the problem, or take any other necessary action.





Monitor Streaming with SAP HANA Cockpit:

SAP HANA cockpit allows you to manage the nodes, workspaces, adapters and projects involved in streaming, and to monitor streaming alerts and memory usage. There are seven tiles in the Streaming catalog, which can be added to a new or existing cockpit group.

You will need to assign these two roles in order to see the Streaming tiles in HANA Cockpit.


CALL _SYS_REPO.GRANT_ACTIVATED_ROLE('sap.hana.admin.roles::Monitoring','SYSTEM');


CALL _SYS_REPO.GRANT_ACTIVATED_ROLE('sap.hana.streaming.monitoring.roles::Monitoring','SYSTEM');
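To verify that the grants took effect, you can query the GRANTED_ROLES system view; this sketch checks the SYSTEM user, so substitute your own grantee as needed:

```sql
-- Check which monitoring roles are assigned to SYSTEM
SELECT ROLE_NAME
  FROM "PUBLIC"."GRANTED_ROLES"
 WHERE GRANTEE = 'SYSTEM'
   AND ROLE_NAME LIKE '%Monitoring%';
```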

Select the system, right-click, and choose Configuration and Monitoring->Open SAP HANA Cockpit to launch it:



My knowledge is relatively focused on experience in the United States, but it would be great to be able to answer questions about the greatest career needs in a particular region of the world relative to the number of skilled resources available in that region. Going a step further, a user could then use the analytics, filtered by their own personal criteria, to determine which pursuits would provide the greatest contribution to their community.

For example, an issue we have seen in recent years in parts of the US is a delayed response to the need for trained pharmacists to fill employer demand. A number of universities then created new programs to meet those needs, and now that field is somewhat saturated.

There may be opportunities for similar skills in other parts of the world that would allow for an equilibrium of supply and demand for those skills while meeting the needs of certain individuals to live in a different part of the world.

This information, combined with predictive analytics over additional data such as changes to demographics, changes to the general health of the population, and retirement rates, would also give the academic industry insight into what changes universities should make to provide a stronger economic impact to the world as a whole, while providing more satisfying career choices to individuals who have better opportunities to make an impact in their circles of influence.

Run powerful real-time monitoring that supports your real-time in-memory investment.

Register Today!


March 24th 2015 @ 10:00 PT / 1:00 ET


Join us as we welcome guest speaker Dan Lahl, vice president of product marketing at SAP, who will discuss how IT organizations run transactional and analytical applications on a single in-memory platform, delivering real-time actionable insights while simplifying their IT landscape. During this one-hour webcast, Bradmark’s Edward Stangler, R&D director, HANA products, will showcase key essentials for effectively monitoring SAP HANA, including:

  • Tracking Key SAP HANA Features
    • Top Column Store Tables.
    • Status / progress on delta merges and full / partial loads.
    • Memory breakdown.
  • Reviewing Overall Health
    • CPU usage, volume I/O, memory breakdown, and instance information.
  • Familiar Metrics for Experienced DBAs
    • Statements.
    • Memory usage.
    • Space Usage.
    • Operations and transactions.
    • Connections. 
    • Network I/O, app tier, login, current SQL and SQL plan, etc. 
  • Alerting on HANA Resources
    • Space usage in volume data / log, long-running statements / transactions / SQL, delta growing too large (not merged fast enough) for column store tables, and more.
  • Flashback on Recent HANA Problems
    • Viewing historical data through real-time UI.


Register Today... to join us for this informative event.


And learn how Bradmark's Surveillance for SAP HANA satisfies an organization’s system management requirements across the SAP HANA computing platform, so you can maintain a production-ready environment and your peace of mind.


We look forward to seeing you online!

I have some fond memories of a television series from the BBC called “A Car is Born”, a 15-episode show in which the presenter, Mark Evans, painstakingly builds an AC Cobra replica. The show highlighted his experiences and the trials and tribulations of building something from the ground up.


At TechEd 2014 in Las Vegas I gave a presentation on being part of the SAP HANA startup program, developing a product, and trying to make an impact in the world of SAP HANA. While it's been an interesting ride, I felt that my 45 minutes at the podium were not nearly enough to convey the past two years of highs, lows, successes, and failures. So I thought I would get back to blogging about my experiences along the road, hoping that, just like the BBC show, I can inspire, dissuade, and educate others through my quest. If you are a budding entrepreneur, a HANA guru, or just want to gain some life lessons (at someone else's cost), I encourage you to read on and share your experiences from your own journeys.


“You should never underestimate enthusiasm”


Over the past 15-odd years I have developed a myriad of applications, some that succeeded and others that failed miserably. But in every case, taking the product from inception to reality was done with such optimism and enthusiasm that even if the idea was mediocre, in my eyes it was a clear winner. Sometimes this “fog” can get the better of you, but in most cases it's the drive that encourages you to work late and over the weekends with the intention of building something that is going to be a winner.


Working on metric² was no different. I recall spending many hours at TechEd/Sapphire hearing about HANA and its opportunity to change the world, wondering what the true benefit of HANA really was; it just was not evident to me. That was until a few things changed my perception. I was working at a customer site where 23 different systems were being consolidated into a single data mart for a single report which was run daily. People worked tirelessly to ensure 23 different ETL jobs were processed timely, correctly, and accurately to produce one measly report, and it struck me that this would be a great use case for HANA. I started to understand and realize more and more of the benefits: technologies like AFL, PAL, the XS engine, the columnar store, in-memory processing, etc. are all clear winners for simplifying IT at the foundational level. Once I understood the opportunities, the enthusiasm kicked in and drove me to work tirelessly on metric² through challenges, time constraints, and personal issues to deliver something which I *knew* was going to be a success. Yes, by this stage the fog had set in.


“Getting started"


Since I had started work on the metric² product on .NET/MS SQL (and had one customer live), it made a lot of sense for me to switch the infrastructure and rebuild it on HANA. There was a bit of a learning curve, but since I had developed multiple web applications in the past, it was simple and straightforward to get up and running with XS. I also went through the OpenSAP course (from Thomas Jung), which gave me a great fundamental understanding of the core technologies and some opportunities to take advantage of. With my newfound understanding and development skills I started down the road of developing metric². Sketched napkins, rough architecture drawings, and random emails littered my desk describing how metric² should work, but an important question I never quite answered was: what and who was metric² really for? What was my target audience, and who would ultimately be my end users? Unfortunately, looking back, this was one of my biggest failures in the project, and it is still a challenge today.


In my next post I will chat about the one new requirement everyone should have before they write a single line of code: defining your value proposition. PS: this is relevant for any form of project (internal, external, customer, etc.).


I posted an idea that I hope can be implemented, since we have learned that systems need to be flexible: Object Oriented Database for rapid changing business processes or requirements : View Idea


What I wrote in a quick paragraph is this:

Plain and simple: business requirements change over time. With the power of HANA, users should be able to override or overload settings in terms of the data used. For example, a computer vendor may be selling a machine whose next iteration has more drive bays than the previous one; instead of creating a redundant table just to store the drive-bay information of each machine, it should be possible to store it as one tuple with a variable number of entities.

Use SAML to enable SSO for your XS App on SAP HANA rev 92 or later

This blog post will give you step-by-step instructions to enable your XS app to authenticate existing users from your SAP BI, NW, BW or your non-SAP apps.


I now have the pleasure of rewriting my previous blog on this topic, which consisted of two documents. This blog post may be a little longer due to the explanations and screenshots, but the process is simpler and much faster to implement thanks to the many enhancements in HANA SPS08 and SPS09.


Here is how I enabled SAML authentication for my XS app using SAP HANA rev 92. Special thanks to Markus Strehle for his many contributions to this blog.


You must be using SAP HANA rev 92 or later. This guide will NOT document how to set up an Identity Provider (IDP) for SAML or teach you how to develop an XS application. It assumes that you already have access to an IDP and have access to an administrator of the IDP. Chances are that you already have a SAML IDP set up in your company. If not, you can use the SAP BI Platform, the NetWeaver SSO product, or SAP’s own cloud-based ID Service (SAP IDS) as your IDP (see the Further Reading section for more on those products).


Fortunately, you will no longer need access to the HANA Linux environment, nor will you require the Linux admin user ID for your HANA instance. You also no longer need to download and install crypto libraries, since they are now installed with SAP HANA.


You will need a HANA user ID that has been assigned the following roles to administer the SAML configuration tool:

  • sap.hana.xs.admin.roles::SAMLAdministrator
  • sap.hana.xs.admin.roles::TrustStoreAdministrator
  • sap.hana.xs.wdisp.admin::WebDispatcherAdmin
  • sap.hana.xs.admin.roles::RuntimeConfAdministrator

Plan your configuration

You should be familiar with SAML concepts and may wish to read the following sections of the SAP HANA SPS09 Administration Guide:

  • 5.4 Managing Trust Relationships
  • 5.5 Maintaining SAML Providers
  • 5.10 Maintaining Single Sign-On for SAP HANA XS Applications


Before you begin, discuss your plans with the security administrator of your SAML IDP and your XS development team.

Task Overview

  1. Step 1: Enable SSL Encryption (May be optional or required by your IDP)
    1. Step 1a: Create the Certificate Request
    2. Step 1b: Send the Certificate Request to a Certificate Authority to be signed
    3. Step 1c: Confirm HTTPS and SSL are Working
  2. Step 2: Set Up the SAML IDP and Trust Relationship
    1. Step 2a: Get your IDP certificate information
    2. Step 2b: Add Your IDP
    3. Step 2c: Add Service Provider
  3. Step 3: Configure your IDP and Application
    1. Step 3a: Register your App with your IDP
    2. Step 3b: Configure App
  4. Step 4: Modify your XS Application Code
    1. Step 4a: Using Named Users
    2. Step 4b: Set Default Role for Dynamically Generated Users
    3. Step 4c: Implement Logout Code
    4. Step 4d: Retrieving User Information from the IDP (Optional)

Step 1: Enable SSL Encryption (May be optional or required by your IDP)

Important Note: Using SSL may be optional depending on your company’s security policy and SAML configuration.


SAP HANA uses the Extended Application Services (XS) engine as a lightweight web application server. HANA leverages SAP’s existing Web Dispatcher to act as a proxy, relaying communication between front-end HTTP requests and HANA’s back-end XS engine.


At the time this document was written, the SAP HANA hardware vendors did not deliver the HANA appliance with SSL/HTTPS enabled for the XS engine/Web Dispatcher. These next steps will enable secure HTTP communication with the XS engine using SAP’s CommonCryptoLib libraries and an SAP CA evaluation certificate, for use in development or test environments.


These steps are a supplement to the HANA security guide. If you are interested in securing HANA communication with OpenSSL please see this document. Note that OpenSSL is only supported for SQL connectivity to SAP HANA. The Web Dispatcher in SAP HANA does not support OpenSSL.

Symptoms to Resolve

The HTTP protocol should already be working for communication with SAP HANA’s XS engine when we navigate our browser to http://<host_name>:<xs_port>.



However, the HTTPS protocol may or may not be working. In general (with rev 92 or later), a self-signed certificate should be generated automatically and SSL should work out of the box. Note: if you delete your SAPSSLS.pse file and restart your Web Dispatcher, the new PSE file should contain a newly generated self-signed certificate. See SAP Note 2014996 for more information on this new feature.


To test HTTPS, point your browser to https://<host_name>:<xs_ssl_port>. Note that we use https as the protocol and the SSL port of your XS engine, which should be 43<instance_nbr> (e.g. 4300 for HANA instance 00).


You may see a warning message from your browser, such as the following screen. You can work around this warning message by clicking “Advanced” (in Google Chrome) or “Continue to this website” (in Internet Explorer).



If you do not have an SSL certificate properly installed in your SAPSSLS.pse file you may see a different error when you go to https://<host_name>:<xs_ssl_port>.


If you receive an error, you may view the details in your SAP HANA Web Dispatcher trace file. This file can be viewed from the HANA Studio: just double-click on your system, select the Diagnosis Files tab, and open the latest file with the “webdispatcher” prefix and the “webdisp” suffix.



Important Note: If your page loaded properly but you wish to avoid the warning, you must install an evaluation or signed certificate. If, for now, you are okay with the warning, you can skip ahead to Step 2 to configure your SAML authentication.

Step 1a: Create the Certificate Request

Here we will create a request for an evaluation certificate from a CA (certificate authority) to use for our SSL encryption. We will use the Web Dispatcher Administration tool. These next steps should eliminate the warning message shown when accessing our site via SSL.


Roles Required:

  • sap.hana.xs.admin.roles::TrustStoreAdministrator
  • sap.hana.xs.wdisp.admin::WebDispatcherAdmin


Open the URL below with a browser.



Click on “PSE Management” from the navigation pane.


Keep the selection of SAPSSLS.pse and click the “Create CA Request” button.


Note: for more information regarding the various PSE files, see SAP Support Note 2009878.


Select and copy all of the text in the first text area. You will provide this information to your CA.


Step 1b: Send the Certificate Request to a Certificate Authority to be signed

This guide will generate a free evaluation certificate from SAP’s website.


Browse to



Click “SSL Test Server Certificates”.


Then click the Test it Now! button that appears in the main canvas.



Paste the request text from the previous step into the Order SSL Server Test Certificate page shown here, choose the server type “PKCS#7 certificate chain”, and click the Continue button.


SAP returns the signed certificate as text. Copy this text to your clipboard. You may choose to save the copied text into a local file using your favorite text editor for later reference.


Switch back to your Web Dispatcher Administration screen, paste the text into the “Import CA Response…” text area, and click “Import”.


Note: If you previously closed this page, click on “PSE Management” in the navigation bar. Make sure SAPSSLS.pse is selected at the top of the screen. Then click on the “Import CA Response” button. Paste the certificate text into the text area and click import.


You should see a message stating that the CA-Response was imported into the SAPSSLS.pse and you should see certificate details in the “PSE Attributes” section of the page.


Step 1c: Confirm HTTPS and SSL are Working

If everything is working as expected, then SSL should now be enabled.


You can now call the XS engine using its SSL port. This should be 43<instance_nbr> (e.g. 4300 for HANA instance 00). Type the URL into your browser.



You may see a warning if your browser does not have the root certificate for the CA in its certificate store. You can click “Advanced” (in Chrome) or “Continue to this website” (in Internet Explorer) to proceed.


If you used the SAP evaluation certificate, you can save and import the SAP Server root certificate into your browser’s trust store. Just save the certificate file from here: and then double click on the file from Windows Explorer and follow the import wizard.


If everything went well you are now looking at the XS engine screen again, but now using your evaluation certificate and SSL!


If it does not work, you can check the trace file for the web dispatcher by clicking on the “Trace” link in the navigation bar of the Web Dispatcher Administration tool as shown below. Or, as shown earlier, the trace file can also be viewed from the HANA Studio.



Important Notes: 

  • If you are using HANA One on AWS, you will need to make sure you open your target (SSL) port as needed.
  • As needed, from HANA Studio, you may change various web dispatcher settings in the [profile] section of the webdispatcher.ini file. A restart of the web dispatcher service is no longer required for parameter changes to take effect.
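For orientation only, here is a hypothetical sketch of what a [profile] entry might look like. The `icm/server_port_<n>` parameter is standard ICM/Web Dispatcher profile syntax, but verify the exact parameter names and values against the documentation for your HANA revision before changing anything:

```ini
# webdispatcher.ini -> [profile] section (illustrative values only)
[profile]
# bind the HTTPS port for instance 00 (43<instance_nbr>)
icm/server_port_1 = PROT=HTTPS,PORT=4300
```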

Step 2: Set Up the SAML IDP and Trust Relationship

The next steps will walk you through gathering your IDP information, then adding that information to your SAP HANA configuration. This will tell HANA which IDP to trust and use when authenticating your application users.


Note: for this section you will need the following roles assigned to your HANA user account.

  • sap.hana.xs.admin.roles::SAMLAdministrator
  • sap.hana.xs.admin.roles::TrustStoreAdministrator

Note: See section 5.10 [Maintaining Single Sign-On for SAP HANA XS Applications] of the SAP HANA Administration Guide for further support and reference on how to configure SAML settings for XS.

Step 2a: Get your IDP certificate information

Here you must gather the IDP metadata from your IDP service. In this example I will use SAP’s cloud-based ID service (SAP IDS) as my SAML IDP. Again, you must choose your own IDP, which may be internal to your company. It may even be the SAP IDS, or an IDP offered by SAP software such as the SAP BI Platform or the NetWeaver SSO product (see the Further Reading section for more on those products).

You will need the metadata URL for your IDP. For example, the metadata URL for the production SAP IDS service is:

Note: Of course you should always start with a development or QA system, NOT a production system.

Once you know the appropriate metadata URL for your desired IDP, open the URL using your favorite browser. You will be copying some XML content – so right-click and select “View page source”.


Now select ALL of the text and copy it into your clipboard. You will use this content in the next step.

Step 2b: Add Your IDP

Open the URL below with a browser. This time use the https:// protocol and port number (43<instance_nbr>)


Note 1: If you will not use SSL for your site, that is okay, as long as your SAML IDP supports the HTTP protocol.

Note 2: In my screenshots below I did not use the SSL protocol and port. That is okay too, but before I copy my metadata (in a later step) I will have had to open the tool using SSL. I will point out where this occurs.

Log in using the SYSTEM user ID or a user ID that has the following role assigned.

Role required: sap.hana.xs.admin.roles::SAMLAdministrator

Click on the menu icon and then click on “SAML Identity Provider”


You will then see the following screen.


Click the Plus "+" icon to add a SAML IDP.


You will see a form to enter the IDP metadata. Now paste in the content that you copied in the previous step. When you click in any of the fields in the “General Data” section, the XML content that you pasted should be reformatted and the “General Data” fields should automatically be populated as shown below.


Click to activate the checkbox for “Dynamic User Creation”


Note: Checking the box to enable dynamic user creation will automatically add a database user ID for each user when they first log in to your XS application. This is highly recommended, unless you plan to use a single technical user ID for all users or will provision all HANA user IDs by some other method.

Click “Save”

You should see a fleeting message stating that the IDP was successfully saved and then you should see the IDP listed in the “SAML Identity Provider List”.


So what just happened? Quite a bit actually.

Pressing the “Save” button added our IDP as well as established a trust relationship with the IDP. The certificate from the IDP was contained in the metadata that you pasted. This certificate was imported into the “sapsrv.pse” file.

To view the certificate and ensure it was loaded, click on “Trust Manager” from the navigation bar and select SAML. Then click on “Certificate list”; you should see your IDP’s certificate listed.


Note: You can also view the certificates stored in the “sapsrv.pse” from the Web Dispatcher Administration tool that we used in Step 1. This is completely optional. To see the certificate using this approach, you need the following role:

  • sap.hana.xs.wdisp.admin::WebDispatcherAdmin

If you have this role assigned to your user account, you can go to the following URL.


Click PSE Management and select “sapsrv.pse”. If the certificate was imported properly you should see the certificate in the “Trusted Certificates” section of the screen.

Step 2c: Add Service Provider

Now from the XS Administration Tool, click the menu icon and click “SAML Service Provider”



Click “Edit”

Fill in information about your organization and name your service provider.

Click “Save”.

You should see a fleeting “success” message.

Step 3: Configure your IDP and Application

Step 3a: Register your App with your IDP

If you will use SSL for your site, and if you did not use SSL when opening the XS Admin tool, you must do so now. Point your browser to:


Click “SAML Identity Provider” from the navigation bar.


Copy all of the text in the Metadata text box shown on the screen.

Important Note: Notice that when you view this page with HTTPS in your URL the metadata uses the HTTPS protocol and port number. This is important to properly set up the IDP configuration for our XS URL (if you will be using SSL for your site).

Save this text in a local file named as you wish, but with an XML file extension (e.g. myXsAppMetadata.xml).

Note 1: If you prefer, or are required, to instead send an endpoint URL that provides the metadata, you can use the following link, based on your host name and port. Again, use your SSL port if you will use SSL:

https://<hostname>:<ssl_port>/sap/hana/xs/saml/info.xscfunc

Note 2: If you wish to return user attributes back to your application other than default attributes configured by your IDP, you must modify the XML file that you created. The SAP IDS returns the user’s email address, company, first and last names by default so that they may be accessed by your application.

But if there were a special field, let’s say “nickname”, I would have to insert XML into my XML file to request that the field be returned. I will not go into details in this blog, but this may be documented by your IDP provider.

Now you need to submit this file (or the endpoint URL) to the administrator of your IDP so that they can create an entry to recognize your Service Provider that you just configured. You may need to email the file or submit the contents via a web form. It depends on your company policy.

Note: While you wait for the activation confirmation from your IDP team, you may wish to continue with the next steps. Of course nothing will work until they complete the necessary set up on their end.

Step 3b: Configure App

Here we will configure our XS application to use SAML authentication.

Role required:  sap.hana.xs.admin.roles::RuntimeConfAdministrator

From your XS Admin tool, select “XS Artifact Administration” from the navigation bar.


Select the arrow next to your package. In my case I will select the package mycorp.myapp. You may choose to just select a parent package depending on your application’s package layout.


After you have navigated to your package, click “Edit”.


Activate the “Force SSL” checkbox if you want to enforce SSL.

Activate the “SAML” checkbox and select your IDP from the dropdown.

Uncheck all other authentication options such as “Form based” and “Basic”.

Click “Save”


You should see a fleeting success message such as the one above.

Note: These authentication settings override the authentication settings that you may have in your .xsaccess file for your application and its individual packages.
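For context, here is a minimal sketch of what an .xsaccess file with such settings might look like. The key names shown are standard .xsaccess options, but treat this as an illustration; the runtime configuration you just saved takes precedence over it:

```json
{
    "exposed": true,
    "force_ssl": true,
    "authentication": { "method": "Form" }
}
```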

I am attaching actual code in the form of a delivery unit that you may wish to use to test your configuration. This sample code accesses the fields returned by SAP IDS and displays them, along with providing a logout button. Just place the files in your mycorp.myapp package (minus the ".txt" extension). Also add an empty ".xsapp" file to the folder. To test this sample app, point your browser to https://<host>:<ssl_xs_port>/mycorp/myapp/saml.html

At this point, you can test your SAML authentication once you have received confirmation from your IDP team that they have configured your application in the IDP system. However, please continue on to step 4 to learn more about how your XS code should work with SAML.

Step 4: Modify your XS Application Code

Package Layout and Other Development Considerations

Your XS application needs a clear security model. Will you have different levels of security for each package? Will you allow database connectivity for named user IDs or maybe just one technical user ID? Answers to these questions fall outside the scope of this blog.

However, you must consider your authorization requirements early in your development and may choose to organize your package hierarchy accordingly. For example you may want to organize your application into 3 primary packages: a public package, a user package and an admin package. Each of these packages can be granted different authentication and authorization policies.

In this blog I will provide the steps to use named users and secure the entire app (the root package of the app) with one authentication policy using SAML. We will also demonstrate how to assign a default role to new users logging into the system.

Step 4a: Using Named Users

Because we set the option to dynamically generate database users in Step 2b above, our code to get the database connection does not need to take the user ID as a parameter. See sample code below.

var conn = $.hdb.getConnection();

Each user will access the database with his/her own user ID that is automatically generated for us. Not too bad!

For more information see: $.hdb.html

Step 4b: Set Default Role for Dynamically Generated Users

Double click on your system in the Systems tab of HANA Studio. Click on the Configuration tab and add the parameter defaultrole under indexserver.ini->saml. Set the value to be the name of a role that you have defined.

Now new users that login to your app will be assigned your specified role by default.
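As a sketch, the resulting entry would look roughly like this (MY_APP_USER_ROLE is a hypothetical run-time role name; the parameter is set via the Configuration tab rather than by editing the file directly):

```ini
# indexserver.ini -> [saml] section (illustrative)
[saml]
defaultrole = MY_APP_USER_ROLE
```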


Note: I could not get it to work using the syntax of a design-time role, so I created a run-time role and assigned the design-time role to it.

Step 4c: Implement Logout Code

To provide logout logic you must call the following url: /sap/hana/xs/formLogin/logout.xscfunc

You can invoke the call to the URL when a user clicks on a logout button or link in your application.

Logout button logic

<div id="logoutButton">
    <form action="/sap/hana/xs/formLogin/logout.xscfunc" method="post">
        <input type="hidden" name="X-CSRF-Token" value="">
        <input type="hidden" name="x-sap-origin-location" value="/sap/hana/xs/formLogin/">
        <input type="submit" value="Logout">
    </form>
</div>



Step 4d: Retrieving User Information from the IDP (Optional)

SAML user attributes can be accessed from your XSJS code with the following code. Notice that there are two syntax options to choose from here.

var value = $.session.samlUserInfo["<name>"];

var value = $.session.samlUserInfo.<name>;

A valid example could be

var response_string=$.session.samlUserInfo["first_name"] + " " + $.session.samlUserInfo["last_name"] + " (<a href=mailto:" + $.session.samlUserInfo["mail"] + ">" + $.session.samlUserInfo["mail"] + "</a>). You logged on via SAML from company " + $.session.samlUserInfo["company"];

Create your Own Registration Experience (Overlay)

The SAP IDS SAML identity provider and other IDP’s offer the ability to customize your login and registration experience. Review your IDP documentation for details.


Assertion did not contain a valid MessageID.

If you receive the above error message when logging in to your app then your XS app is considering the response from your IDP as having timed out. You can change the timeout setting as needed by following these steps.

Double click on your system in the Systems tab of HANA Studio. Click on the Configuration tab and add the parameter assertion_timeout under indexserver.ini->saml. Set the value to be the number of seconds before a timeout takes place. The default value is 10.


Try Deleting the PSE Files and Restarting the XS Engine

If you have issues that you ultimately cannot resolve, you may choose to delete the two PSE files that we modified and then restart the XS engine to have them recreated in their original form. Note: generating new certificates means you will need to reconfigure your IDP to recognize and trust your application.

From HANA Studio, double-click on your system name in the “Systems” tab. From the Landscape tab, right-click on the “xsengine” row and select “Kill”.

Alternatively, you can type HDB stop and then HDB start from a Linux prompt to restart the whole HANA server.

If you need to open a support ticket with SAP, you can assign your ticket to one of the following two components: HAN-DB-SEC or BC-SEC.

Further Reading






In our previous post we introduced XS Data Services (XSDS), a native client for Core Data Services and query builder for XS JavaScript.  In this blog post we will show how we can use XSDS to work with CDS entity instances as plain JavaScript objects.


Working with Managed CDS Entity Instances


Once our entity imports are in place we can use the resulting constructor functions to work with our entities in our application. For this, XSDS provides two different modes to interact with the database: managed mode and unmanaged mode.  Managed mode works with entity instances, i.e., singleton objects with “personality” which correspond one-to-one to database records. Unmanaged mode, on the other hand, works with plain values that are simple nested JSON-like structures.


In this post we’ll be looking at managed mode.  To retrieve an existing entity instance in managed mode, we may query it by its key:


var post = Post.$get({ pid: 14 });


This will retrieve the post with pid = 14 if it exists, or null otherwise.  The $get() method requires that all key properties be supplied.


The $find() and $findAll() methods are more general in that they return one entity or all entities, respectively, which satisfy a given set of property values:


var users = User.$findAll({ Name: "Bob" });  // return all users named Bob


The instance filter expression may be built using the following basic expression language that is valid for all JavaScript and SQL types:


<expr>  ::=  { <cond>, <cond>, … }

<cond>  ::= prop: value | prop: { <op>: value, … }

<op>    ::= $eq | $ne | $lt | $le | $gt | $ge | $like | $unlike | $null


All conditions in an expression are joined by logical-AND. The expression { prop: value } is a shorthand for { prop: { $eq: value } }.
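To make these semantics concrete, here is a simplified in-memory evaluator for this expression language written in plain JavaScript. This is an illustration only, not the XSDS implementation, and it omits the $like/$unlike pattern operators:

```javascript
// Simplified in-memory evaluator for the managed filter expression language.
// Illustration only -- NOT the XSDS implementation; $like/$unlike omitted.
var OPS = {
  $eq: function (a, b) { return a === b; },
  $ne: function (a, b) { return a !== b; },
  $lt: function (a, b) { return a < b; },
  $le: function (a, b) { return a <= b; },
  $gt: function (a, b) { return a > b; },
  $ge: function (a, b) { return a >= b; },
  $null: function (a, b) { return (a === null) === b; }
};

function matches(obj, expr) {
  // all property conditions in an expression are joined by logical-AND
  return Object.keys(expr).every(function (prop) {
    var cond = expr[prop];
    if (typeof cond !== "object" || cond === null) {
      cond = { $eq: cond };  // { prop: value } is shorthand for { prop: { $eq: value } }
    }
    return Object.keys(cond).every(function (op) {
      return OPS[op](obj[prop], cond[op]);
    });
  });
}

// behaves like the $findAll filters above, applied to a single object
var hit = matches({ p: "Bob", q: "Smith" }, { p: "Bob", q: { $ne: "Jones" } });  // true
```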


The comparison operators $eq, $ne, $lt, $le, $gt, $ge apply to all suitable types if the value is supplied in appropriate format:


$findAll({ p: { $lt: 100 } });                // returns all instances where p < 100

$findAll({ p: { $ge: 100, $le: 200 } });      // … p between 100 and 200

$findAll({ p: "Bob", q: { $ne: "Jones" } });  // … p is Bob but q is not Jones


The $like and $unlike operators can be used for SQL-like pattern matching:


User.$findAll({ Name: { $like: "B%" } });     // returns all users whose name starts with B

User.$findAll({ Name: { $unlike: "J.." } });  // returns Bill but not Jim


The $null operator checks if a given property is NULL in the database:


Post.$findAll({ Rating: { $null: true } });    // returns posts with unknown rating

Post.$findAll({ Author: { $null: false } });   // returns posts with known authors


Note again that in the second example, $null checks for NULL, not the $none value of the Author association in Post!  In JavaScript terms this is the difference between Post.Author === undefined and Post.Author === null.


Expressions are evaluated by the database but also by JavaScript when checking the entity cache for matching instances (see section on Entity Management below).  This can yield unexpected results due to JavaScript’s peculiar semantics, especially for dates:


$findAll({ myDate: { $eq: new Date(2014, 0, 1) } }); // always fails (in JavaScript)

$findAll({ myDate: { $lt: new Date(2014, 0, 1) } }); // OK
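The underlying JavaScript behavior can be demonstrated standalone: === compares Date objects by identity, while the relational operators coerce them to numbers via valueOf():

```javascript
// Date objects compare by identity under ===, but relational operators
// coerce them to timestamps via valueOf() -- hence $eq fails where $lt works.
var a = new Date(2014, 0, 1);
var b = new Date(2014, 0, 1);

var eqIdentity = (a === b);                     // false: two distinct objects
var eqByValue = (a.getTime() === b.getTime());  // true: compare the timestamps
var ltWorks = (a < new Date(2014, 0, 2));       // true: < coerces to numbers
```

Comparing with getTime() (or valueOf()) is the reliable way to test date equality in JavaScript.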


Managed queries are inherently limited in their expressiveness, as the XSDS runtime needs to check its instance cache against the filter condition provided to $find. Applications requiring advanced query capabilities should use unmanaged queries described in the next section. Of course, both managed and unmanaged queries can be used side by side.


Updating Entities


Entity instances are regular JavaScript objects and can be used and modified as one would expect:



post.Author = users[0];

post.Author.Name = "Newt";


All changes are made in memory only.  If we want to persist our changes for an instance we invoke its $save() method:


post.$save();   // update post and its associated entity instances


Calling $save() will flush our in-memory changes of that instance to the database, following all associations reachable from the root instance.  Only entities actually changed will be written to the database.


Key properties must not be changed; any attempts to persist an entity with changed key values will raise a runtime exception:


var post = Post.$get({ pid: 14 });

post.pid = 15;      // bad idea!

post.$save();       // throws runtime exception


There is an additional batch persist for persisting multiple instances in one operation:


Post.$saveAll([ post1, post2 ]);   // persist post1 and post2


Note that modified instances in memory always take precedence over their unmodified versions on the database.


Creating new Entities


The entity constructor is used to create new entity instances:


var user = User.$find({ Name: "Alice" });

var post = new Post({ pid: 101, Title: "New Post", Text: "Hello BBoard!",

                     Author: user, Rating: 1, Created: new Date() });



Key properties for which an HDB sequence has been supplied may be omitted from the constructor call.


New entities will be put under entity management right away, i.e., the $find() and $findAll() methods will return them if their properties match.  They will not be written to the database, however, until explicitly persisted with the $save() method.


For entities with associations there is an alternative method to create new instances:


var post = Post.$build({ pid: 102, Title: "Another Post",

                         Text: "New post, same author",

                        Author: { uid: 1, Name: "Alice" } });

var post = Post.$build({ pid: 103, Title: "Another Post",

                         Text: "New post, new author",

                        Author: { uid: 2, Name: "Newt" } });


The $build method will automatically create or retrieve associated target instances matching the properties supplied, whereas new requires that all targets be supplied as instances:


var post = new Post({ pid: 104, Title: "Fail Post",

                      Text: "This will not work",

                      Author: { uid: 2, Name: "Newt" } });

post.$save();  // fails, post.Author is not an entity instance


By passing true as optional second argument to $build XSDS will not try to retrieve already existing instances.


Discarding Entities


Retrieved entities are stored in the entity manager cache and subject to general JavaScript garbage collection.  To permanently delete an entity instance from the database the $discard() method is used:


post = Post.$get({ pid: 99 });

post.$discard();   // permanently delete post from the database



Note that after calling $discard() on an instance, the actual JavaScript object remains in memory as an unmanaged entity instance, i.e., $find() will no longer return references to it.  It is up to the application not to use any remaining references to the object that may still be stored in some JavaScript variables.


The $discardAll() method for batch discards works analogous to the $saveAll() method:


Post.$discardAll([ post1, post2 ]);   // discards post1 and post2


For the special use case of deleting instances on the database without instantiating them in memory first XSDS provides the $delete() operation for unmanaged deletes:


Post.$delete(<cond>);  // deletes all posts matching <cond>, BYPASSING CACHE!


An unmanaged delete will delete all matching records on the database, ignoring the state of the entity cache.  Thus, the set of affected entity instances may differ from that of the sequence


var posts = Post.$findAll(<cond>);

Post.$discardAll(posts);



Mixing managed and unmanaged data operations is dangerous, so please use $delete() with care.


Lazy Navigation


By default, all associations are eagerly resolved, i.e., association properties store a reference to their associated entity instance. For heavily connected data this may lead to very large data structures in memory.


To control which associations are being followed the association may be declared $lazy:


var LazyPost = XSDS.$importEntity("sap.hana.democontent.xsds", "", {

    Parent: { $association: { $lazy: true } }

});



A lazy association will delay the retrieval of the associated entity or entities until the property is actually accessed:


var post = LazyPost.$get({ pid: 102 });  // retrieves single Post and Author from database


if (post.Author.Name === "Lassie") {

    var parent = post.Parent;   // retrieve parent from database, if it exists

    if (parent) {
        // … work with the parent post …
    }
}



post.$save();   // persist any changes made


The first time the lazy association is accessed the associated entity is queried from the entity cache or the database.  Once a lazy association has been resolved it is a normal property of its parent entity instance.
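One way to picture this behavior in plain JavaScript (a sketch of the mechanism, not XSDS internals) is a configurable getter that loads the target on first access and then replaces itself with a normal property:

```javascript
// Sketch of the lazy-loading mechanism (NOT XSDS internals): a configurable
// getter fetches the target on first access, then becomes a normal property.
function makeLazy(obj, prop, load) {
  Object.defineProperty(obj, prop, {
    configurable: true,
    get: function () {
      var value = load();  // e.g. query the database
      Object.defineProperty(obj, prop, { value: value, writable: true });
      return value;        // from now on a plain data property
    }
  });
}

var loads = 0;
var post = { pid: 102 };
makeLazy(post, "Parent", function () { loads += 1; return { pid: 7 }; });

var first = post.Parent;   // triggers the (single) load
var second = post.Parent;  // served from the resolved property
```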


Lazy associations may be chained and updated transparently:


var post = LazyPost.$get({ pid: 103 });  // retrieve single post

post.Parent = new LazyPost( … );         // sets new parent post, old parent is never loaded



Note that updates to an entity instance will not update associated lazy instances if they haven’t been followed yet!


A lot may happen between the retrieval of some entity instance and the navigation to any of its lazy association targets.  It is left to the application to ensure the consistency of data.


Entity Management and Consistency


Entities retrieved from the database are stored in the entity manager cache.  Any subsequent query for that entity will be served from the cache instead of the database.


It is important to realize that if we modify an entity instance in memory, then all XSDS queries for that entity instance through $get(), $find(), and $findAll() will return the modified, in-memory version of the entity, even if it hasn’t been persisted to the database yet.


// assume post #1 and post #2 share same author

var post1 = Post.$get({ pid: 1 });

post1.Author.Email = "";

var post2 = Post.$get({ pid: 2 });


In the above example, the value of post2.Author.Email equals the newly assigned value, even though post1 has not been $save()d yet.


An unmanaged query, on the other hand, will ignore unpersisted changes to our data and return the database view instead, so continuing the example from above,


var post2_value = Post.$query().$matching({ pid: 2 }).$execute();


will yield the original email address from the database for post2_value.Author.Email.


You may use a transaction rollback to revert to the last committed state of your data, irrespective of whether data was persisted or not (see below).
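The cache behavior described above can be sketched in a few lines of plain JavaScript (a simplification, not the actual XSDS entity manager):

```javascript
// Sketch of an identity cache (a simplification of the XSDS entity manager):
// repeated lookups for the same key return the same in-memory object, so
// unpersisted modifications are visible to every subsequent managed query.
var cache = {};
function getPost(pid, loadFromDb) {
  if (!(pid in cache)) {
    cache[pid] = loadFromDb(pid);  // only the first call hits the "database"
  }
  return cache[pid];
}

function dbLoad(pid) { return { pid: pid, Rating: 1 }; }

var post1 = getPost(2, dbLoad);
post1.Rating = 5;                  // modified in memory only
var post2 = getPost(2, dbLoad);    // same object: sees Rating === 5
```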




There are some additional subtleties about consistency when working with CDS-based associations that impose certain restrictions on using backlinks and entity links in managed mode.


For backlink associations, the associated instances are included in the parent entity instance, be it eagerly or lazily:


var post = Post.$get({ pid: 69 });

var count = post.Comments.length;   // post.Comments contains all associated Comment instances


In JavaScript parlance the type of post.Comments is called “array-like”: while the object is not an instanceof Array, it supports all non-mutating Array.prototype functions and may be passed to functions that expect an array:


post.Comments.forEach(function(comment) { … });  // array-like

var comments = post.Comments.slice();            // convert to real but detached array
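The “array-like” pattern is plain JavaScript and easy to reproduce: an object with indexed properties and a length can be handed to the generic Array.prototype methods:

```javascript
// An "array-like" object: indexed properties plus a length, but not an Array.
var comments = { 0: "first", 1: "second", length: 2 };

var isRealArray = Array.isArray(comments);            // false
// non-mutating Array.prototype methods still work via call():
var upper = Array.prototype.map.call(comments, function (c) {
  return c.toUpperCase();
});
var detached = Array.prototype.slice.call(comments);  // real, detached Array
```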


Another important property of post.Comments is that the array is read-only, so you cannot reassign individual array elements or mutate the array:


post.Comments.pop();                    // error: read-only

post.Comments = [];                     // error: read-only

post.Comments.length = 0;               // error: read-only

post.Comments[0] = new Comment(…);      // error: read-only

post.Comments[0].Author.Name = "Newt";  // OK: array elements are mutable


Consequently, the association relation has to be updated through the members’ backlinks.  Whenever a backlink changes, the corresponding association arrays are updated automatically:


var c0 = post.Comments[0];

post.Comments.splice(0, 1);             // error: array is read-only

post.Comments[0].Parent = null;         // correct: change backlink

var index = post.Comments.indexOf(c0);  // index == -1 -> c0 immediately removed from array


Supporting association updates through the association array would impose an unduly amount of runtime overhead on the application, which is why backlink association arrays are read-only.
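The read-only behavior itself is a standard JavaScript capability; Object.freeze on a plain array produces the same effect (a sketch of the mechanism, not XSDS code):

```javascript
// Object.freeze gives a plain array the same read-only behavior described
// above for backlink association arrays (mechanism sketch, not XSDS code).
var frozen = Object.freeze(["c0", "c1"]);

var popFailed = false;
try {
  frozen.pop();          // mutating methods throw on a frozen array
} catch (e) {
  popFailed = true;
}

try {
  frozen[0] = "other";   // throws in strict mode, silently ignored otherwise
} catch (e) { /* rejected */ }
// either way, the array is unchanged
```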


For many-to-many associations, on the other hand, the association array is a native JavaScript array that may be manipulated in any way that JavaScript supports:


var tag = Tag.$find({ Text: "cool" });  // find certain tag

var post = Post.$get({ pid: 69 });      // get particular post

post.Tags.push(tag);                    // attach tag to post

post.$save();                           // update database


This example attaches the tag “cool” to the post with pid === 69 without modifying any already existing tags.


Note that the direct manipulation of linking entities is discouraged, as this may lead to inconsistencies between the views on the association arrays of the entities with many-to-many associations and their link tables:


post.Tags = [ tag1 ];          // post has one tag

var link = new TagsByPost({ lid: 69, Tag: tag2, Post: post }); // don’t do this!

link.$save();                  // post now has two tags on database!

var count = post.Tags.length;  // but count === 1 not updated!


For unmanaged associations, the resulting property is again a read-only array-like object:


var thread = PostWithTopReplies.$get({ pid: 101 });

for (var i = 0; i < thread.TopReplies.length; ++i) {
    // … process thread.TopReplies[i] …
}



But unlike managed associations such as those defined by backlinks, an unmanaged association is static and ignores the entity cache containing unsaved updates.  To re-sync the association array with the current database state the $reread method must be called explicitly:


var reply = thread.TopReplies[0];

reply.Rating = 0;

reply.$save();                              // persist the change



// reply still contained in thread.TopReplies even though Rating >= 3 no longer holds

var i1 = thread.TopReplies.indexOf(reply);  // === 0



thread.TopReplies.$reread();                // re-sync with the database

// now reply is no longer contained in thread.TopReplies

var i2 = thread.TopReplies.indexOf(reply);  // === -1


Note, however, that invoking $reread will still not reflect unsaved changes contained in the entity cache only!  In other words, if we removed reply.$save() from our example above, variable i2 would still contain a value of 0.  If your application relies on a fully consistent view on all data at all times unmanaged associations should be used cautiously.


Discarding an instance will delete the root instance, but not associated entities by default, even for one-to-one associations.  If you do want to delete associated entities as well, you can add a $cascadeDiscard property to your association definition:


var PostCD = XSDS.$importEntity("…", "…", {

    Comments: {

        $association: {

            $entity: "Comment",

            $viaBacklink: "Parent",

            $cascadeDiscard: true

        }

    }

});





XSDS supports explicit cascading for deletion only at the moment. All other operations such as creation, tracking, and updates are always cascaded to all associated entities automatically.


Note that XSDS currently doesn’t support orphan removal. It is left to the application to maintain integrity of associations and references.  You can let HANA help you there by defining key constraints for your associations.


Converting between managed and unmanaged values


XSDS offers applications both managed and unmanaged views of their data.  In general, these views should not be mixed to avoid confusion and potential data corruption, but for some use cases a careful conversion of managed and unmanaged data is required.


To convert managed entity instances into unmanaged values, the $clone method yields a detached plain value without modifying the instance:


var value = instance.$clone();


$clone() creates an independent deep copy of instance where all references have been resolved into separate plain JavaScript objects.


To create managed instances from unmanaged values, the situation is slightly more complex.  In simple cases without associations, the new operator may be used:


var instance = new Entity(value);


Note, however, that the instance created by new will only be valid if value supplies actual instances for all associated targets of Entity.  If value is the result of an unmanaged query this is unlikely to be the case.


For the general case the $build method will take an arbitrary unmanaged value and construct a valid instance from that value by a combination of new and $get operations.  The result will be a valid instance, including associations, that reflects the data updates originating from the unmanaged value supplied:


var post = Post.$get({ pid: 101 });

--> post === {

     "pid": 101,

     "Title": "First Post!",

     "Author": { "uid": 1, "Name": "Alice", "Email": "alice@sap.corp", … },

     "Rating": 1,

     "Created": "2014-08-14T12:22:50.000Z"  }

var updatedPost = Post.$build({ pid: 101, Rating: 3, Author: { uid: 1, Name: "Newt" } });

--> post === updatedPost === {

     "pid": 101,

     "Title": "First Post!",

     "Author": { "uid": 1, "Name": "Newt", "Email": "alice@sap.corp", … },

     "Rating": 3,

     "Created": "2014-08-14T12:22:50.000Z"  }


If $build cannot get an existing instance for the key values supplied, it will try to create a new instance using the new operator. When constructing all new instances, you can greatly improve performance by passing true as the second argument to $build.  XSDS will then skip the lookup of existing instances and invoke new for all instances right away.
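Conceptually, this get-or-create behavior can be sketched in plain JavaScript (illustrative only; buildSketch and the pid-keyed cache are assumptions standing in for the XSDS entity cache, not the real internals):

```javascript
// Hypothetical sketch of $build's get-or-create logic
function buildSketch(cache, value, allNew) {
    // allNew === true skips the lookup of existing instances entirely
    if (!allNew && cache[value.pid] !== undefined) {
        var instance = cache[value.pid];
        // merge the updates from the unmanaged value into the existing instance
        Object.keys(value).forEach(function (k) { instance[k] = value[k]; });
        return instance;
    }
    cache[value.pid] = value;   // otherwise construct a "new" instance
    return value;
}

var cache = { 101: { pid: 101, Title: "First Post!", Rating: 1 } };
var updated = buildSketch(cache, { pid: 101, Rating: 3 }, false);
// updated is the cached instance: Rating is now 3, Title is preserved
```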


Converting between managed and unmanaged values may be a costly operation, especially when using $build.  We thus recommend designing your application carefully in advance to minimize the number of conversions required.




This concludes our introduction to XSDS managed mode for now. In a follow-up post we’ll turn to unmanaged mode and its powerful query interface.

XS Data Services (XSDS) is a JavaScript library for the XS engine to consume SAP HANA artifacts that have been defined using Core Data Services (CDS), the central data modeling concept of HANA.  The XSDS library supports the import of CDS entities and their associations and offers managed and unmanaged manipulation of such instances. With additional features such as transaction handling, lazy navigation, support for custom types and views, and incremental query building, XSDS offers a level of convenience that makes writing native HANA applications with XS a breeze.


What are XS Data Services?


Core Data Services (CDS) are a cross-platform solution to define semantically rich data models for SAP HANA applications.  In essence, a CDS model defines an application’s relevant set of business entities and their relations while abstracting from the underlying technical storage.


The XS Data Services (XSDS) client library for the XS Engine allows JavaScript applications to consume these CDS artifacts as native JavaScript objects.  Instead of writing low-level SQL queries involving database tables and columns, application developers can use XSDS to reason meaningfully about high-level business entities and their relations.  XSDS understands the CDS metadata model and offers methods for typical instance management operations.  Advanced queries are constructed using a fluent query API that follows the CDS Query Language specification as closely as possible.


This mini-series of posts expands upon the introductory coverage of Thomas Jung’s blog article on XSDS for SPS9.  Our articles cover the import of CDS entities into native HANA applications, working with managed entity instances in XS JavaScript, and advanced unmanaged queries that will unlock all of HANA’s capabilities inside native JavaScript.  XSDS comes pre-installed with SAP HANA SPS9 or later, so you don’t have to install or import anything at all.


Importing CDS Entities


Let’s assume that we’re building a bulletin board application that allows users to post short text messages and to reply to other users’ posts.  Our CDS data model thus defines two entities, user and post:


namespace sap.hana.democontent.xsds;


context bboard {
    entity user {
        key uid: Integer not null;
        Name: String(63) not null;
        Email: String(63);
        Created: UTCDateTime;
    };

    type text {
        Text: String(255);
        Lang: String(2);
    };

    entity post {
        key pid: Integer not null;
        Title: String(63) not null;
        Text: text;
        Author: association [1] to user;
        Parent: association [0..1] to post;
        Rating: Integer;
        Created: UTCDateTime;
    };
};


To use the User and Post entities in our JavaScript application we first have to import them:


$.import("sap.hana.xs.libs.dbutils", "xsds");

var XSDS = $.sap.hana.xs.libs.dbutils.xsds;

var Post = XSDS.$importEntity("sap.hana.democontent.xsds", "");

var User = XSDS.$importEntity("sap.hana.democontent.xsds", "bboard.user");


The first two lines import the XSDS library itself, and the last two lines define the User and Post entities, respectively.  $importEntity returns a constructor function that is used as a handle for retrieving and creating entity instances.  For most data models this is all that is required to define an entity.


Advanced options allow you to override certain properties of the columns to import.  In the simplest case, you could rename a table column to be used as object property or override the primary key definitions:


var Post = XSDS.$importEntity("sap.hana.democontent.xsds", "", {
    UserID: { $column: "uid" },
    Name: { $key: true }
});


You may also provide an existing HDB sequence to generate keys automatically for you.  By using a sequence you will not have to populate the key column for newly created instances yourself.


var User = XSDS.$importEntity("sap.hana.democontent.xsds", "bboard.user", {
    uid: { $key: "\"SAP_HANA_DEMO\".\"sap.hana.democontent.xsds::bb_keyid\"" }
});


Note that XSDS does not apply any additional logic to key generation.  It is left to the application and the sequence to generate valid, unique keys for your entities.




One-to-One Associations


One of the key benefits of CDS is the explicit definition of associations between entities.  XSDS will read association information automatically and integrate associations as references into the entity definition.


To introduce one-to-one associations manually, e.g., for legacy models not converted to CDS yet, you may provide an $association property that specifies the associated entity type and the foreign key that links your parent entity to the associated entity:


var Post = XSDS.$importEntity("sap.hana.democontent.xsds", "", {
    Parent: {
        $association: {
            $entity: ""
        },
        pid: {
            $column: "Reference",
            $none: 0
        }
    }
});


The entity provided by $entity may be the name of the entity or the actual entity constructor itself.  Entities referenced by name are loaded automatically with default options if no matching $importEntity() call can be found.


If the association is a one-to-zero-or-one association, the foreign key should define a specific “no association” value with the $none property.  The $none value will be used to populate the table columns if the parent entity is persisted without an associated entity.  In the example above, a value of 0 in the Reference column indicates that the post is not in response to another post.


Note that the “no association” value is different from NULL!  A value of $none indicates no association, whereas a value of NULL indicates that the status of the association is unknown.  If no $none value is specified, however, XSDS will assume that $none: undefined in order to be consistent with the current de-facto behavior of CDS.
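The distinction can be illustrated with a small sketch (plain JavaScript; the parentState function is hypothetical, using the Reference column and $none: 0 from the example above):

```javascript
// Interpret the foreign-key column of a one-to-zero-or-one association
function parentState(referenceColumn) {
    if (referenceColumn === null) { return "unknown"; }   // NULL: association state unknown
    if (referenceColumn === 0)    { return "none"; }      // the $none value: explicitly no parent
    return "parent pid " + referenceColumn;               // an actual reference to another post
}
```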


One-to-Many Associations


CDS defines one-to-many associations by using backlinks from the targets to the source.  Assume for our example that each post may have any number of unique comments, where each comment contains a reference to its parent post:


entity comment {
    key cid: Integer not null;
    Text: text;
    Author: association [1] to user;
    Parent: association [1] to post; // as backlink
    Created: UTCDateTime;
};


Since HANA SPS9 does not support the creation of CDS backlinks yet we will have to add the backlink definition manually to our import of post:


var Post = XSDS.$importEntity("sap.hana.democontent.xsds", "", {
    Comments: {
        $association: {
            $entity: "sap.hana.democontent.xsds::bboard.comment",
            $viaBacklink: "Parent"
        }
    }
});


This is equivalent to importing the following, still unsupported CDS definition:


entity post {
    key pid: Integer not null;
    …
    Comments: association [0..*] to comment via backlink Parent;
};


Comments will appear as a native read-only array-like JavaScript object in our instances of the Post entity:


var post = Post.$get({ pid: 69 });
for (var i = 0; i < post.Comments.length; ++i) {
    if (post.Comments[i].Author.Name === "Alice") { … }
}


We’ll see how to work with these array-like objects further down below.


Many-to-Many Associations


CDS defines many-to-many associations by using linking entities that connect source and target instances.  Assume for our example that any number of tags may be attached to our posts:


entity tag {
    key tid: Integer not null;
    Name: String(63);
};

The actual linkage data is stored in a separate table that corresponds to a (potentially keyless) entity:


@nokey entity tags_by_post {
    lid: Integer;
    Tag: association to tag;
    Post: association to post;
};


Again, as HANA SPS9 does not support linking entities, we will have to enrich our Post data model in our import manually:


var Post = XSDS.$importEntity("sap.hana.democontent.xsds", "", {
    Tags: {
        $association: {
            $entity: "sap.hana.democontent.xsds::bboard.tag",
            $viaEntity: "sap.hana.democontent.xsds::bboard.tags_by_post",
            $source: "post",
            $target: "tag"
        }
    }
});


The $source and $target properties indicate the direction of the association.  This import statement is equivalent to importing the following unsupported CDS definition:


entity post {
    key pid: Integer not null;
    …
    Tags: association [0..*] to tag via entity tags_by_post;
};


The tags will appear as a native JavaScript array in our instances of the Post entity:


var post = Post.$get({ pid: 69 });
var tag = Tag.$find({ Name: "cool" });
if (post.Tags.indexOf(tag) < 0) { … }



While the representation of tags is similar to that of our comments before, their semantics are quite different.  In particular, attaching a new tag to a post would not remove it from other posts this tag may be attached to.
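The difference can be sketched in plain JavaScript (illustrative only, not XSDS internals; attachTag and the simplified link rows are assumptions): in a many-to-many association each attachment is a separate row in the link entity, so the same tag stays attached to every post it is linked to.

```javascript
// Rows of the tags_by_post link entity (simplified to the two association columns)
var links = [];
function attachTag(pid, tid) { links.push({ Post: pid, Tag: tid }); }

attachTag(69, 1);   // attach tag 1 to post 69
attachTag(70, 1);   // attaching it to post 70 does not detach it from post 69

var postsWithTag1 = links.filter(function (l) { return l.Tag === 1; }).length;  // 2
```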


Unmanaged Associations


The most general form of associations supported by CDS are unmanaged associations that define an arbitrary JOIN ON condition:


var PostWithTopReplies = XSDS.$importEntity("sap.hana.democontent.xsds", "", {
    TopReplies: {
        $association: {
            $entity: "",
            $on: "$SOURCE.\"TopReplies\".\"Parent\".\"pid\" = $SOURCE.\"pid\" AND " +
                 "$SOURCE.\"TopReplies\".\"Rating\" >= 3"
        }
    }
});

Make sure to quote all field names in the $on expression.  The syntax of the $on condition is preliminary and likely to change for the next release.


Associated instances are included in a read-only array-like object. Due to the unmanaged nature of these associations, the contents of the array cannot reflect unsaved changes to the relation and must be refreshed manually.




The XSDS library imports existing CDS entities, but it can also transform entity definitions and derive new entities.  If we want to add a new association not included in the CDS data model, we can derive a new entity in XSDS:


var PostWithComments = Post.$deriveEntity({
    Comments: {
        $association: {
            $entity: "sap.hana.democontent.xsds::bboard.user",
            $viaBacklink: "Post"
        }
    }
});


This extensibility also allows us to work with legacy SQL data models, where we enrich the data model to make implicit associations explicit:


var SqlPost = XSDS.$defineEntity("SqlPost", "\"SOME_TABLE\"", {
    Author: {
        $association: { $entity: User },
        uid: { $column: "Author" }
    }
});


Note that we’re using $defineEntity instead of $importEntity here -- $defineEntity is another method for defining entities in your application that is specialized in creating data models for plain SQL-based tables without CDS metadata.  XSDS will import every table column as an entity property, but as plain SQL lacks the metadata about associations, we need to supply this information ourselves.




With our entity imports in place we can now use the constructor functions to work with individual instances or to build advanced queries. Stay tuned for the next installment of our XSDS mini-series!

I've recently posted an idea in the SAP Idea Incubator.


The basic concept is to develop an iPhone app that makes life easier for both developers and end users.


More details can be found at the link below:


Mobile SAP ECC : View Idea


I encourage you to discuss the feasibility of this idea and share your suggestions.


Many Thanks,


Hi everyone, in the previous blog post Creating news tiles with SAP HANA UI Integration Services (UIS), we learned how to create news tiles in the UIS catalog and show them on the Fiori Launchpad. As I showed in that post, besides news tiles, another new feature in UIS is custom tiles, or formally the "Custom App Launcher". So, in this post I will continue sharing with you how to create custom tiles and put them on the Fiori Launchpad with UIS. If you are new to UIS, I highly recommend first having a look at the section "What's UIS? & Some useful materials" in Creating news tiles with SAP HANA UI Integration Services (UIS).



As you can see, in SAP HANA SPS08 we can only select the following tile templates with UIS.




It was difficult for us to create the following Fiori Launchpad (which contains a news tile, a bullet chart tile, a comparison chart tile, etc.) with UIS in SAP HANA SPS08, since there were no templates for news tiles or custom tiles at that time.




Image source


As of SAP HANA SPS09, news tiles and custom tiles are newly introduced in UIS. With these new features, we are now able to create the above Fiori Launchpad using UIS. We've already discussed news tiles in Creating news tiles with SAP HANA UI Integration Services (UIS), so in this post we'll focus on the "Custom App Launcher", i.e., custom tiles.




Creating SAPUI5 views

First of all, let's have a look at what we can configure in custom tiles. As you can see in the following screenshot, there are three parts: "General", "Configuration" and "Navigation". The "General" and "Navigation" parts are the same as for the "Static App Launcher", but "Configuration" is particular to the "Custom App Launcher" and is the place where we can configure our own custom tiles.




OK. What we want to do is place two custom tiles on the Fiori Launchpad: one a bullet chart tile, the other a comparison chart tile. Based on the XS project in Creating news tiles with SAP HANA UI Integration Services (UIS), we can first create these two SAPUI5 views. To save time, I used the code from SAPUI5 Explored and made some modifications.





<GenericTile header="Cumulative Totals">
  <TileContent footer="Actual and Target" unit="EUR" size="M">
    <BulletChart size="M" scale="M" targetValue="75" targetValueLabel="75c" minValue="0" maxValue="150">
      <BulletChartData value="125" color="Error"/>
      <BulletChartData value="35" color="Critical"/>
      <BulletChartData value="115" color="Error"/>
    </BulletChart>
  </TileContent>
</GenericTile>


<GenericTile header="Comparative Annual Totals" subheader="By Region">
  <TileContent footer="Compare across regions" unit="EUR" size="M">
    <ComparisonChart size="M" scale="M">
      <ComparisonData title="Americas" value="234" color="Good"/>
      <ComparisonData title="EMEA" value="97" color="Error"/>
      <ComparisonData title="APAC" value="197" color="Critical"/>
    </ComparisonChart>
  </TileContent>
</GenericTile>

Creating "Custom App Launcher"




Bullet Chart Tile



Comparison Chart Tile



Placing custom tiles to Fiori Launchpad




Save and activate everything. Now let's have a look at the Fiori Launchpad. Here you go.




Hope you enjoyed reading my blog and can create your custom tiles successfully!


Hi everyone, in this post I want to share with you some experiences about how to create news tiles with SAP HANA UI Integration Services (UIS). I installed SAP HANA SPS09 three months ago and have already tried lots of cool new features. Today I wanted to test something with UIS in SAP HANA Studio, and when I created the "UIS Catalog" artifact and added a tile, I found there's also something new with UIS in SAP HANA SPS09.




As of SAP HANA SPS08, there was no news tile template in UIS. But as you can see in the following screenshot, in SAP HANA SPS09 we now have the news tile template, which means we are able to place news tiles on the Fiori Launchpad in just a few seconds. That's cool! The "Custom App Launcher" is also newly introduced in SAP HANA SPS09, but I'm not going to focus on that in this blog post; maybe I will show you something in another blog post.




What's UIS? & Some useful materials

As usual, first of all I want to show you some basics and useful materials about this topic, since some of you may be hearing about SAP HANA UI Integration Services (UIS) for the first time, and I don't plan to introduce everything in this post. So, what's UIS? I think the following slide explains it clearly. As of SAP HANA SPS09, I think the killer feature is the capability of creating a Fiori Launchpad inside SAP HANA.






If you are new to UIS, you may have a look at the following materials first.


  1. HANA UI Integration Services? What's that? You can learn some basics of UIS, but the examples are very old since standard application sites are deprecated as of SAP HANA SPS09.
  2. A slide deck from openSAP, also get some basics and ignore the old examples.
  3. Using UI Integration Services - SAP HANA Developer Guide for SAP HANA Studio - SAP Library
  4. News Tile - SAP HANA Developer Guide for SAP HANA Studio - SAP Library
  5. SAP HANA UI Integration Services (New and Changed) - What's New in the SAP HANA Platform (Release Notes) - SAP Library
  6. SAP Hana UI Integration Services - YouTube Be careful. Some videos are very old, since widget and SAPUI5 application tile are deprecated as of SAP HANA SPS09.


Creating our first news tile

As its name implies, we can use the news tile to display news feeds, i.e., RSS. In this section, let's create our first news tile and show it on our Fiori Launchpad step by step. I'm using SAP HANA SPS09 Rev. 93.


Step 1: Create the XS project. As of SAP HANA SPS09, you can create .xsapp and .xsaccess together with the project.




Step 2: Create "UIS Catalog"


Select a wizard



Create new catalog, for simplicity I did not generate privileges.





Step 3: Add and configure the news tile


Add the news tile



Step 4: Add article feeds


Find a feed URL, e.g., we can choose one from



Add the feed URL and save it, see all configurations from News Tile - SAP HANA Developer Guide for SAP HANA Studio - SAP Library



Step 5: Create "UIS Application Site" - Fiori Launchpad


Select the wizard



Create new application site





Step 6: Grant the role

When launchpad.xsappsite is opened, you will see the following error.



How can we solve this? By granting the corresponding role. Since I wanted to show the error, I only granted the role at this step; you'd better do it at the beginning.



Now the error is gone.


Step 7: Create group and add tile


Create group



Add tile



Save and activate it



Step 8: Run Fiori Launchpad


Click "Runtime Version"



Log in, and you'll find the news tile doesn't work, saying "No articles to display". You can use the Chrome developer tools to find the issue shown in the red box.



What happened? This is because of the same-origin policy: the response header of the requested resource did not include "Access-Control-Allow-Origin". Is there a solution? Yes, one solution is called CORS. You can have a look at Cross-origin resource sharing - Wikipedia, the free encyclopedia and Cross-Origin Resource Sharing. That's why the following note is stated in News Tile - SAP HANA Developer Guide for SAP HANA Studio - SAP Library:


If the URL references an external feed, the feed must be CORS-compliant. If the URL references an internal feed, the feed must originate from the same server and port as the launchpad.


Although there's no "Access-Control-Allow-Origin" in the response header, we can find a workaround in Fiori News Tile - no articles displayed. So, I installed Allow-Control-Allow-Origin: * - Chrome Web Store and enabled CORS. It worked, and now we can get RSS feeds from all content of SAP HANA Developer Center.
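The same-origin rule behind this error can be sketched as follows (a simplified plain-JavaScript illustration of what the browser enforces, with hypothetical URLs; real browsers compare the full scheme/host/port triple):

```javascript
// Two URLs share an origin only if scheme, host and port all match
function sameOrigin(a, b) {
    function parse(u) {
        var m = /^(https?):\/\/([^\/:]+)(?::(\d+))?/.exec(u);
        return { scheme: m[1], host: m[2], port: m[3] || (m[1] === "https" ? "443" : "80") };
    }
    var pa = parse(a), pb = parse(b);
    return pa.scheme === pb.scheme && pa.host === pb.host && pa.port === pb.port;
}

// The launchpad on our HANA server and an external feed are different origins,
// so the feed must either be CORS-compliant or come from the same server and port.
sameOrigin("http://hanaserver:8000/launchpad", "http://scn.sap.com/rss");   // false
```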




Make more configurations

As you can see in News Tile - SAP HANA Developer Guide for SAP HANA Studio - SAP Library, there are some other configurations we can make in order to customize our RSS feeds. Since I spend some time every day on SAP HANA Developer Center and SAP HANA and In-Memory Computing, and I love to participate in discussions about SAP HANA, I'd like to make my own RSS feeds.




As you can see in the above screenshot, I made three configurations.


  1. Add my default tile image and set "Always Use Default Image" to true. I borrowed the image from Hana Maui | Escape to Hana, Maui
  2. Add two article feeds. The first one is the discussion feed from SAP HANA Developer Center and the second one is the discussion feed from SAP HANA and In-Memory Computing. You can find the links from and
  3. Focus on the discussions whose titles include "HANA"




That's it! No coding, only configuration, and you get a Fiori Launchpad with your customized news tile. Want more RSS feeds from SCN? Have a look at Everything I know about... SCN RSS Feeds and configure your own.


Failed to start the app...

However, there seems to be an error when clicking the news tile. I found that SAP Notes 1968559 and 2065811 are related to this issue, but I don't know yet how to solve it inside SAP HANA.




Hope you enjoyed reading my blog and can create your own news tile successfully!

User CRUD operations are very common during the HANA XS application development process.

Let's experiment with concurrency control and the design of a lock mechanism.


1. Assumed use case

In the assumed scenario:

The database table has four fields: DATE, PERSON_ID, NAME and PERSON_INFO.

DATE and PERSON_ID make up the primary key.


There are two pages in the application:

The first page shows the content exposed from the HANA database.

You can click the "Edit Content" button in the head of the table.



The designed scenario is: "Multiple users cannot edit the same record during the same time period".

For example, while user1 is editing the content of PersonID 1001 on DATE '2015-02-09', user2 cannot edit PersonID 1001 on '2015-02-09', but user2 is allowed to edit PersonID 1001 on '2015-02-10' or PersonID 1002 on '2015-02-09'.


If no conflict happens, the user is directed to this page, where he/she can save or cancel.




If someone else is editing, the page gives an alert:





2. Business requirement analysis


Here I have drawn a very simple business requirement process flow diagram for better understanding.





There are some points to note:

(1)     When the user clicks the "Edit" button, the application should decide whether to direct the user to the next page or show an alert that another user is editing the same record.

(2)     For situations where the user forgets to terminate the session, the system should have a default time-out mechanism that releases the lock.


3. Technical design


I came up with the idea of recording all the users' operations and maintaining the operation log in a transaction table.

When the user clicks the "Edit" button, the service checks the transaction log first and then decides whether the user may enter the next page.


Lock Table design:



This table has four fields. First, it contains the same primary key as the database table.

The LOCK_TIMESTAMP is used for tracking the time when the click event session starts.

The SESSION_USER records the operator of the session.

Because this table is used in an OLTP manner and is always fetched or looked up by record, row store is the better choice.


Here is the enhanced process flow:




Whenever the user clicks the "Edit" button, the service first looks up the transaction table.

Three situations are covered:

(1)     If no log exists for this record, add a new log for it, return yes to the user and let him/her view the next page.

(2)     If a log exists, check the timestamp difference:

          (a)     If the timestamp difference is larger than the value we set, overwrite the log with the current timestamp and return yes.

          (b)     If the timestamp difference is smaller than the value we set, keep the log unchanged, return no together with the session_user value, then alert a message telling the user that someone is editing the record.
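The decision above can be sketched in plain JavaScript (illustrative only; canEdit and its parameters are assumptions, not the actual XSJS service):

```javascript
// One lock decision for a (DATE, PERSON_ID) record; log is null when no log row exists
var TIMEOUT_SECONDS = 1200;

function canEdit(log, nowSeconds, currentUser) {
    if (log === null) { return true; }                     // (1) no log: acquire the lock
    var age = nowSeconds - log.lockSeconds;                // seconds since the lock was taken
    if (age > TIMEOUT_SECONDS) { return true; }            // (2a) stale lock: overwrite it
    return log.user === currentUser;                       // (2b) fresh lock: only the holder may edit
}
```

Allowing the holder through in case (2b) mirrors the design choice below of letting the same user open multiple sessions, so users cannot lock themselves out.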


4. Implementation and code


Let's have a look at the code. I used an XSJS service to implement this:


function EditRecord() {
    var PersonID = $.request.parameters.get('PersonID');  // Get the PersonID which the user inputs
    var dateParam = $.request.parameters.get('Date');     // Get the Date which the user inputs
                                                          // (renamed from Date to avoid shadowing the built-in Date constructor)
    var diffTime;        // Difference between the current timestamp and the timestamp value in the transaction table
    var username;        // Lock user name stored in the transaction table
    var conn = $.db.getConnection();
    var pstmt;
    var query = "select seconds_between(time_lock,CURRENT_TIMESTAMP),DATE,SESSION_USER from \"XXX\".\"LOG_FOR_LOCK\" where PERSON_ID= ?";

    pstmt = conn.prepareStatement(query);
    pstmt.setString(1, PersonID);
    var rs = pstmt.executeQuery();

    while (rs.next()) {
        diffTime = rs.getInteger(1);
        var LockTableTime = rs.getDate(2);
        username = rs.getString(3);

        if (new Date(dateParam).getTime() - LockTableTime.getTime() === 0 && diffTime < 1200 && username !== $.session.getUsername()) {
            $.response.setBody(-1);
            return -1;
        }
    }

    // We check whether the DATE values are the same by using the getTime() function, and the time-out value is set to 1200 seconds.
    // Also, we allow the same user to open multiple sessions; otherwise the user could be locked by himself.

    // If the function has not returned yet, either there is no log for this record or the log is outdated,
    // so we insert or update the log information with the current session.
    var query1 = "upsert \"XXX\".\"LOG_FOR_LOCK\" values('" + dateParam + "','" + PersonID +
        "',now(),'" + $.session.getUsername() + "') where DATE= '" + dateParam + "' and PERSON_ID= " + PersonID;
    pstmt = conn.prepareStatement(query1);
    pstmt.executeUpdate();
    conn.commit();
    conn.close();

    $.response.setBody(1);
    return 1;
}

// When the application page calls this service, it can decide whether to jump to the next page according to the return value.
// I wrote this code just for testing; no guarantee that it is 100% secure or accurate.



5. Summary & FAQ

Of course, there is no best solution for every scenario.

This lock mechanism is designed for only certain situations and may contain some defects.

I implemented it only for a test use case; the code may contain flaws and this is not an officially recommended mechanism.

Different designs call for different methods.




1.     What about the orphan lock situation?

The service checks the difference between the current timestamp and the timestamp stored in the LOG_FOR_LOCK table; an orphaned lock expires once the time-out value is exceeded.


2.   What is the most significant difference between this application lock and database concurrency control?

In my understanding, this application lock is meant to detect conflicts when the user clicks the "Edit" button.

Database concurrency control, on the other hand, is a lock mechanism that takes effect when the commit action happens.


For other methods, please refer to below links for detailed information:



2. Optimistic lock:

Just Google it and you will find many good results.


3. HANA MVCC control:

SAP HANA Concurrency Control


Special thanks for the help from Thomas, Neil, Yuan and Antonie.


Please feel free to point out any other flaws in this design and discuss them with me.

Thanks for reading.

Have a nice day.



