I just pushed a little showcase application to GitHub that demonstrates the use of a SAPUI5 frontend application with a Grails backend system. The reasons I wrote this frontend in the RDE are:

  • I love SAPUI5 as frontend technology.
  • I love Grails as backend technology because it's so groovy.
  • I needed a simple showcase for authentication in RESTful application scenarios.
  • I want a colleague to work on the frontend part of another bigger Grails application with minimum effort on local system setup.

So I decided to put the frontend part of the application into a GitHub repository. I then cloned this in the RDE, which comes as a beta version with the HCP trial edition. After that, I could develop my frontend application in the RDE and push my changes to this repository.


To connect this frontend to my Grails backend application and to overcome the same-origin policy (SOP) problem, I had to expose the latter via the SAP Cloud Connector to my HCP account.



With this in place, I could create a destination in the HCP cockpit.


After the destination was available, I had to refer to it in the neo-app.json file of my SAPUI5 application.
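For illustration, a minimal route in neo-app.json forwarding all calls under "api" to such a destination could look roughly like this (the destination name grails-backend is made up here; use whatever name you gave the destination in the HCP cockpit):

```json
{
  "routes": [
    {
      "path": "/api",
      "target": {
        "type": "destination",
        "name": "grails-backend"
      },
      "description": "Grails backend exposed via the Cloud Connector"
    }
  ]
}
```

With this route in place, any request the SAPUI5 application sends to a path starting with "api" is dispatched to the configured destination, which is what makes the same-host illusion below work.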


Now I could simply call my API functions from the SAPUI5 application as if they were running on the same host.

     oModel.loadData("api/logout", undefined, true, "POST", false, false, header);

If you are interested in the REST authentication details, please look at the source code on GitHub and the readme file over there.

Between June 12th and July 31st 2014 we ran the openSAP course "Next Steps in SAP HANA Cloud Platform" to provide interested developers, customers and partners with more details around the usage of the SAP HANA Cloud Platform.


With this blog I want to thank all of the students for their participation and their feedback and want to provide some additional information around the course.


Direct access to the videos


First of all, I'd like to encourage you to use the materials at the openSAP page for this course, as it not only provides the videos and the slides, but also all the discussion threads with a lot of questions and answers that popped up during the course.


The following list is meant for those of you who want to have a quick check of the videos to remember how to accomplish a specific task with the platform. You can find the videos related to the units of the course under the column "Additional Assets".


Again: don't forget that you get all the additional goodies like slides and Q&As around the course on the official Course: Next Steps in SAP HANA Cloud Platform!


Unit Videos
Week 1: SAP HANA Native Development  (related blog post)
1: Basics
  • The various SAP HANA Cloud Platform offerings
  • The specifics of the SAP HANA Cloud Platform trial landscape
  • How to set your development environment and connect to your SAP HANA instance
2: SAP HANA Applications
  • Using SAP HANA on-premise and on the SAP HANA Cloud Platform
  • How to import a sample SAP HANA application
  • Running the SHINE application on SAP HANA Cloud Platform
3: SAP HANA Web-based Development Workbench
  • How to use the SAP HANA Web-based Development Workbench to quickly develop, modify, and test your SAP HANA application.
  • How to launch the SAP HANA Web-based Development Workbench directly from the SAP HANA Cloud Platform cockpit
  • How to modify the SHINE application on SAP HANA Cloud Platform directly, using the SAP HANA Web-based Development Workbench.
4: SAP HANA Predictive Analysis Library
  • How to use PAL on the SAP HANA Cloud Platform
  • PAL on the free SAP HANA Cloud Platform trial landscape
  • How to use ABC analysis PAL function to build an SAPUI5 graphical visualization
5: Extend SAP HANA Applications with HCP Services
  • Additional services and extension capabilities that SAP HANA Cloud Platform provides on top of SAP HANA native capabilities
  • How to configure and work with SAP HANA Cloud Platform feedback service
  • How to enhance a sample SHINE application with SAP HANA Cloud Platform feedback service
Week 2: Git and HTML5 Apps - Part 1 (related blog post)
1: Introduction to HTML5 Applications and Git
  • HTML5 applications on SAP HANA Cloud Platform
  • The development Infrastructure
  • What is Git?
2: Creating a Hello World HTML5 Application
  • How to create a simple HTML5 application
  • How to clone a repository
  • How to commit and push
  • How to test an HTML5 application
3: Git Basics
  • Where does Git store versions?
  • What is a working directory?
  • What is a commit and how can you create one?
  • What is a branch?
  • How to get a copy of a repository with clone?
  • How to transfer back your changes with push?
  • Where does Git store the configuration settings?
4: Using SAPUI5 in Your HTML5 Application
  • How to use SAPUI5?
  • What is an SAPUI5 model?
  • What is an SAPUI5 view?
  • What is an SAPUI5 controller?
5: Using a REST Service in Your HTML5 Application
  • How to use a REST Service in an HTML5 application?
  • What is the application descriptor?
  • How to configure back-end routing?
  • How to create a destination?
Week 3: Git and HTML5 Apps - Part 2 (related blog post)
1: Releasing a Version of Your HTML5 Application
  • Know the difference between commit, version, and active version
  • How to create a version using Git.
  • How to create a version using the cockpit.
  • How to activate an application.
  • How to fetch in Eclipse.
2: Adding a Chart to Your HTML5 Application
  • Recap of the development and test lifecycle for HTML5 applications
  • How to use a chart in SAPUI5
3: Working with Multiple Branches
  • How to work with local branches.
  • Why local branches are useful.
  • How to rebase local branches.
4: Resolving Merge Conflicts
  • Merge conflicts created by Git
  • How to resolve conflicts
5: Git History
  • How to filter the history
  • How to search in the history
  • How to find out when and why a line was changed
  • How to revert a commit
  • How to reset a branch
Week 4: Advanced Identity Management (related blog post)
1: Working with User Profile Attributes
  • Different classes of user account information
  • Configuring attributes with the local IdP and in the Cloud Cockpit
  • Accessing user attributes in Java-based apps
2: Group Management
  • Using groups in SAP HANA Cloud Platform
  • Assigning users to groups
3: Federated Authorization with Groups
  • Defining mapping rules
4: Custom Roles
  • Defining and using custom roles
5: Working with Multiple Identity Providers
  • Using multiple identity providers
Week 5: Securing Web APIs (related blog post)
1: Protecting Web APIs
  • What are Web APIs?
  • Where to use SAML 2.0 and OAuth?
  • What are the benefits of OAuth?
2: OAuth 2.0 Fundamentals
  • How OAuth enables secure authentication and authorization for non-browser-based clients such as native mobile apps
  • Comparison: OAuth vs. password authentication
3: Protecting the Cloud Application
  • How to configure the OAuth Filter
  • How to protect APIs programmatically
4: OAuth Configuration
  • How to register OAuth clients
  • How to configure scopes for your cloud application
5: Working with Multiple Identity Providers
  • How to integrate an OAuth Client with the SAP HANA Cloud Platform OAuth Authorization Server
  • How to implement a callback handler for the authorization code flow in a desktop client
Week 6: Advanced Features (related blog post)
1: SAP HANA Cloud Portal for Developers
  • What does SAP HANA Cloud Portal offer to developers
  • How to administrate SAP HANA Cloud Portal
  • The SAP HANA Cloud Portal marketplace concept
  • How to expose your custom apps as widgets in SAP HANA Cloud Portal
  • How to manage site pages and widgets to create engaging sites
  • How to preview the site, publish and revert changes made for the site
2: Developing Applications for Use in SAP HANA Cloud Portal Sites
  • Understand the SAP HANA Cloud Portal development process
  • Develop widgets for use in SAP HANA Cloud Portal sites
  • Develop an SAP HANA Cloud Portal solution with OpenSocial
  • How to use OpenSocial features available in SAP HANA Cloud Portal
  • SAP HANA Cloud Portal as a central UI framework
  • Building mobile-ready SAP HANA Cloud Portal sites
3: Design and Customize Cloud Portal Sites
  • How to design the site layout and select a theme for your site
  • How to customize the default SAP HANA Cloud Portal theme
  • How to apply an out-of-the-box theme to your site
  • The SAP HANA Cloud Portal page templates concept
  • Site navigation menu customization options
4: SAP HANA Cloud Integration
  • How to use the Catalog to view all prepackaged integration flows on the SAP HCI landing page
  • Configuring and using the Web UI
5: Wrap-Up and Outlook
  • Wrap-up of the course
  • Outlook to the platform and to other openSAP courses around SAP HANA Cloud Platform






True... it's been a while since the last chapter, but patience is one of those virtues that come with age. Hence, let's hope that Granny doesn't mind too much and get on with it. In the last chapter we talked about proper user interface design, from both an outside-in and an inside-out approach. I still believe that the latter is the enabler for a good user experience (UX). It's a classic principle of software development: proper layering, separation of concerns, and componentization help to fine-tune individual aspects, as one does not need to worry about breaking other things.


Judging by the feedback I received, it seems people believe that the time when applications needed to cater for user-agents that do not support JavaScript (er, ECMAScript) is behind us, a relic of the past: "C'mon, we live in the 21st century!"


I'd say it depends on the usage scenario, and I believe there are still use cases where people will have to implement such fallbacks (e.g. in the public sector). As always, it's a case-by-case situation, and the additional effort required to maintain a solid fallback mechanism for limited user-agents certainly needs to be considered and planned for. Yet, given that the whole idea behind this blog series is to talk about what it takes to implement an enterprise-ready solution, I at least wanted to point out how it's done! Going forward we'll certainly focus on state-of-the-art techniques to maximize the user experience and only maintain a rudimentary fallback solution (after all, convenience goes a long way, and users may be encouraged to update to a modern browser if they feel they are missing out!)


Having said all this, it's time to have a look at current trends in user interface technologies. Doing so, we quickly notice that there's a tendency to let the client do the heavy lifting. Technologies such as HTML5, JavaScript (see above), and CSS3 have progressed tremendously over the last years, and it's amazing to see what capable developers can build with these web standards! On the mobile side we see both web and native apps deliver great user experiences, and containers like Cordova (aka PhoneGap) completely blur the lines between web and native apps. In both scenarios the client handles the user interaction and only communicates with the server/backend via web services (in the broader sense of the word!) Typically, this is done using lightweight communication protocols and standards such as REST (e.g. JSON via HTTP) or - as popular at SAP and Microsoft - OData. Consequently, the server is responsible for providing an API that can be used by clients.


There's an API for that!


I truly believe that in the context of cloud computing and the Internet of Things (IoT) the famous slogan "There's an app for that!" (copyright by Apple Inc.) will change into "There's an API for that!" - APIs are really the foundation of the magic of today's inter-connected world.



Growth in Web APIs Since 2005 - Source: ProgrammableWeb


Given the importance, I opted to give this topic some extra room and actually write a mini-series about APIs... yet before we dig deeper, I'd like to point out that the term API does not necessarily indicate that the exposed services/functionality are consumed from the outside. In fact, I strongly promote clearly establishing an internal API comprising the business functionality of any application. Software is never finished! As applications grow over time, it clearly helps to have a defined internal API layer that is used between the individual components or modules across the application. Exposing this set of services (or parts of it) in such a way that external clients can consume them is a completely different story - and one that comes with its own challenges!


I already wrote extensively about Enterprise APIs, why they matter, their primary principles, and how to develop them in my respective blog post series called 'The Rise of Enterprise APIs':



As I hate to repeat myself, I'll just refer you to the respective posts and only briefly point out things I deem important to make a point. From an implementation perspective, Granny's Addressbook uses the same building blocks as the sample application I developed for part 3 of the series: Apache CXF (a great implementation of the JAX-RS standard), Spring, etc.


API = A set of self-contained services


The most important aspect of a business service that shall be included within an API is that it needs to be self-contained. For the clients/consumers it has to act as a black box, and ultimately a client should not need to know anything about the internal workings of the service. Consequently, the service may not take anything for granted (e.g. that the incoming data is in the proper format or that it has been validated for type-safety or plausibility). As such, the service needs to properly check any incoming data for validity and properly report back any issues there may be. The same applies to security aspects, etc. All of these aspects need to be ensured regardless of whether an internal or an external client consumes the service. That's the reason why I usually promote separating the underlying service from the actual API endpoint (which is protocol- and format-specific!)


This mindset is reflected in the architecture of the Granny's Addressbook application. The com.sap.hana.cloud.samples.granny.srv package contains the actual service implementations, which are just regular Java classes. They use the standard data model objects of the application and declare to throw a dedicated ServiceException. The RESTful API, however, is located in a different package called com.sap.hana.cloud.samples.granny.api. If you have a closer look at the so-called facades within this package, you will see that they contain certain meta information (annotations) specific to RESTful service communication (and part of the JAX-RS API.) The same is true for the Response objects returned by the individual service methods. This differentiation also takes effect with regard to error handling. The service needs to take care of all business-related concerns, while the (REST) facade needs to make sure it properly handles issues that may arise during the process of marshalling/unmarshalling the data model objects into JSON (or any other format). Let's keep that in mind as we get back to that topic in a bit...
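To make that separation concrete, here is a deliberately simplified sketch of the pattern. The class names mirror the layout described above, but the method bodies and signatures are illustrative inventions, not the actual Granny's Addressbook source; the JAX-RS annotations (@Path, @POST, etc.) that the real facades carry are only hinted at in comments so the snippet stays dependency-free:

```java
// Illustrative sketch: business service vs. REST facade separation.
// In the real application the facade carries JAX-RS annotations such as
// @Path("/contacts") and @POST, and returns javax.ws.rs.core.Response objects.

/** Business-level exception declared and thrown by the service layer. */
class ServiceException extends Exception {
    ServiceException(String message) { super(message); }
}

/** Plain Java service: validates its own input and knows nothing about HTTP. */
class ContactService {
    String createContact(String name) throws ServiceException {
        if (name == null || name.trim().isEmpty()) {
            throw new ServiceException("Contact name must not be empty");
        }
        return "created: " + name; // stand-in for the actual persistence logic
    }
}

/** REST facade: translates between HTTP concerns and the business service. */
class ContactFacade {
    private final ContactService service = new ContactService();

    /** Returns an HTTP-like status code, mimicking what a JAX-RS Response carries. */
    int create(String name) {
        try {
            service.createContact(name);
            return 201; // Created
        } catch (ServiceException e) {
            return 400; // Bad Request - business validation failed
        }
    }
}
```

The point of the split: the service stays reusable from any internal caller, while everything protocol- and format-specific lives exclusively in the facade.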



The three commandments of a good API


One could write a whole book about what makes a good API, yet I'll leave that for others and simply state three fundamental characteristics here:


  • easy to consume
  • well documented
  • good error handling


That's really it! If an API fails to adequately address these things, it will have a hard time seeing adoption. And - quite frankly - if your API is part of a service targeting adoption, you'd better nail those three things or...


Easy to consume


This factor has been the main reason why RESTful services have taken the IT world by storm in recent years. The fact that RESTful services simply sit on top of the mature HTTP protocol makes them very easy to consume, especially compared to prior standards such as Web Services (I'm referring to the kind based on SOAP and WS-* standards here...). Developers can test-drive RESTful APIs with their browsers or lightweight REST clients with ease. (Personally, I prefer a tool called Postman.)


Well documented


This is a make-or-break topic - PERIOD. Your documentation needs to be easy to find and easy to understand. Developers are an impatient bunch and never have enough time to deliver within tight project deadlines. If they struggle with how to consume your API, they may just move on to a competitor - simple as that. Now, the hard part about good documentation is actually maintaining it!


Way too often the documentation gets outdated, especially if it is written by a different team (which is quite common!) In general, information experts are undervalued, and consequently service/API providers are well advised to invest in a great documentation team that has both writing and technical skills. Ideally, you regularly validate the ease of consumption of your documentation via user tests!


Ultimately, you are well advised to consider the documentation an integral part of your product/service/solution and tightly weave it into your development & delivery processes! In fact, one of the next chapters of this series will focus on exactly that topic - so stay tuned!


Good error handling


Last, but not least: error handling (everyone's favorite topic, right?) From experience I can say that there's nothing more frustrating than struggling with an API because of poor error messages. If a developer can't figure out what (s)he has done wrong in consuming your API then you have a problem!


Here's where the dots connect in regards to what I said earlier about self-contained services. Given the importance of the topic I propose we have a closer look at it now, shall we?


Catch me if you can!


As always, we want to deal with error handling in a central place. Yet, in reference to what we said earlier, we have two different flavors of errors to deal with: semantic errors (e.g. constraint violations such as an exceeded maximum length of a given attribute) and technical errors (e.g. invalid payload data or formatting issues.) The former need to be taken care of by the business services, while the latter need to be handled by the layer that handles the incoming/outgoing communication. In the case of Granny's Addressbook, the in- and outbound communication is handled by CXF, and the JSON marshalling/unmarshalling is taken care of by Jackson. Fortunately, this combination allows us to nicely implement our requirements (and that's what makes a good framework after all!)


Let's see how this works by looking at the code (= the single source of truth!). Below is an extract of the Spring configuration file:


 <jaxrs:server id="api" address="/v1">
     <jaxrs:properties>
         <entry key="org.apache.cxf.propagate.exception" value="false" />
     </jaxrs:properties>
     <jaxrs:serviceBeans>
         <ref bean="contactFacade" />
     </jaxrs:serviceBeans>
     <jaxrs:providers>
         <ref bean="jacksonProvider" />
         <ref bean="parserExceptionMapper" />
         <ref bean="jsonMappingExceptionMapper" />
         <ref bean="serviceExceptionMapper" />
     </jaxrs:providers>
 </jaxrs:server>

 <bean id="objectMapper" class="com.sap.hana.cloud.samples.granny.util.CustomObjectMapper" />

 <bean id="jacksonProvider" class="org.codehaus.jackson.jaxrs.JacksonJaxbJsonProvider">
     <property name="mapper" ref="objectMapper"/>
 </bean>

 <!-- Exception Mappers -->
 <bean id="parserExceptionMapper" class="com.sap.hana.cloud.samples.granny.web.util.ParserExceptionMapper" />
 <bean id="jsonMappingExceptionMapper" class="com.sap.hana.cloud.samples.granny.web.util.JsonMappingExceptionMapper" />
 <bean id="serviceExceptionMapper" class="com.sap.hana.cloud.samples.granny.web.util.ServiceExceptionMapper" />


In the providers section of the server definition I register so-called ExceptionMappers, which are defined as beans further below. We have two ExceptionMappers taking care of the most common problem areas during the marshalling/unmarshalling process (namely parsing and mapping) and one ExceptionMapper taking care of ServiceExceptions. Just as it should be!


I recommend having a look at the ParserExceptionMapper and JsonMappingExceptionMapper respectively. They should be pretty self-explanatory for the most part, yet I'd like to explicitly mention one aspect I deem important: we have to strike a good balance between level of detail and usefulness, plus not expose too many details about the underlying technology stack (for security reasons!) Therefore, I have used a bit of regex magic to conceal certain internal aspects.
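To illustrate the idea (the actual pattern used in the ParserExceptionMapper may well differ - this is a hypothetical sketch, not the project's code), one way to conceal internals is to strip fully-qualified class names from the raw Jackson message before returning it to the client:

```java
import java.util.regex.Pattern;

/** Illustrative helper: hides internal class names in error messages. */
class ErrorMessageSanitizer {
    // Matches fully-qualified Java class names such as
    // com.sap.hana.cloud.samples.granny.model.Contact
    private static final Pattern CLASS_NAME =
        Pattern.compile("\\b([a-z][a-z0-9_]*\\.)+[A-Z][A-Za-z0-9_]*\\b");

    /** Replaces internal class names so the stack is not exposed to clients. */
    static String sanitize(String rawMessage) {
        return CLASS_NAME.matcher(rawMessage).replaceAll("<internal>");
    }
}
```

So a raw message like "Can not construct instance of com.sap.granny.model.Contact" would reach the client with the package-qualified class name masked out, while messages without internal references pass through untouched.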


Note: This approach certainly has its trade-offs! Given that we (more or less) pass through the error messages provided by Jackson, we are not in control of their content. Nor can we apply any kind of I18N and so forth. This certainly is an edge case, and in a real-world application we certainly have to thoroughly test this piece of functionality with a lot of test cases to make sure the effect is as desired. Or, we would need to map the Jackson error messages to our own and then return our own error messages. Whatever you decide to do, it should be a conscious decision!


1st-level data validation


So, now that parsing and mapping have been dealt with - what about 1st-level data validation (e.g. type-safety, mandatory fields, min/max lengths, etc.)? Given what I said above about self-contained services, this certainly is something that should be dealt with by the service layer, right? The answer is a clear "YES, but..."


... does it make sense to pass unvalidated data through to the next layer, or shouldn't this be checked early on? That's a tricky question! We could check the data here as well, by just adding the same @Valid annotation we use in the service layer. This would help make the app more scalable, as we would reject invalid data at the earliest point in the request processing. Given that we have clearly separated the validation logic, we can at least ensure a consistent handling of data validation. On the other hand, the first rule still stands, and so the individual services would still need to verify the incoming data, consequently resulting in a redundant check (= performance overhead).


Internal vs external APIs


As a matter of fact, that brings us to another topic. Assume we have a complex application with lots of services, some of which are just business facades chaining several calls to other business services. In such an application we would see repetitive calls to the data validation functionality! This is where you have to set boundaries and differentiate between internal services and external services. For the former you'd assume that the data is semantically correct and would not need to be verified, while for the latter you'd always want to ensure proper data validation.


Note: I recall having developed such a complex application, and we ended up developing our own set of annotations that allowed us to set up trust relationships between individual services in order to avoid redundant checks. In a nutshell, the solution used ThreadLocals to store information about which model objects had been validated (and by whom). Every subsequent service then checked whether the incoming model object had been validated and whether or not it had been validated by a trusted service. Now, obviously certain attributes could be changed along the processing chain, and consequently data that once was valid may not be valid anymore when it is passed to another service. This is where checksums may need to come into play. Sooner or later you end up with a complex meta-validation framework in its own right and well... that may do more harm than good!!!


So, the layer on which to perform your data validation needs to be decided on a case-by-case basis. For Granny's Addressbook (which is a very simple application) it may be sufficient to simply do the data validation on the service layer, especially since our RESTful facades are rather thin. However, for the sake of illustration, I'll demonstrate how to check data on the RESTful API layer AND on the service layer. That way, you have a blueprint for both, and you can decide on your own which way to go for your project.


Apache CXF and JAX-RS 2.0


Recently, CXF version 3.0 was released, and it marks a huge milestone as it also brought support for JAX-RS 2.0. JAX-RS 2.0 is a major step, and it brings a lot of new features we eagerly waited for, such as hypermedia support (for the die-hard HATEOAS fans out there!) and... surprise, surprise... bean validation.


Here's the official documentation on how to integrate bean validation into CXF: Apache CXF -- ValidationFeature


We pretty much stick to these instructions, yet there is one shortcoming in how CXF handles this aspect: the default ValidationExceptionMapper is rather dumb, as it only logs the validation errors and returns an HTTP status code 400 (Bad Request). Ultimately, we want to provide that information back to the client to give the user a hint about what is wrong. (Note: For the SAP/ABAP guys out there, we are looking for something like a BAPIRET2 structure here!) Jersey (another popular JAX-RS implementation) is a bit smarter here: Chapter 17 - Bean Validation Support


Here's our own ValidationError object:


 import java.io.Serializable;
 import java.util.Map;
 import javax.xml.bind.annotation.XmlRootElement;

 /**
  * Object used to report information about validation errors.
  */
 @XmlRootElement(name = "error")
 public class ValidationError implements Serializable
 {
     /** The <code>serialVersionUID</code> of the class. */
     private static final long serialVersionUID = 1L;

     String messageKey = null;
     String message = null;
     String messageTemplate = null;
     String path = null;
     String invalidValue = null;
     Map<String, String> messageParameter = null;
 }


Code Review


So, to wrap things up, let's quickly go through the major components, see how it all fits together, and highlight a few coding segments. Let's start with the final Spring configuration:


<jaxrs:server id="api" address="/v1">
    <jaxrs:inInterceptors>
        <ref bean="validationInInterceptor" />
    </jaxrs:inInterceptors>
    <jaxrs:outInterceptors>
        <ref bean="validationOutInterceptor" />
    </jaxrs:outInterceptors>
    <jaxrs:properties>
        <entry key="org.apache.cxf.propagate.exception" value="false" />
    </jaxrs:properties>
    <jaxrs:serviceBeans>
        <ref bean="contactFacade" />
    </jaxrs:serviceBeans>
    <jaxrs:providers>
        <ref bean="jacksonProvider" />
        <ref bean="parserExceptionMapper" />
        <ref bean="jsonMappingExceptionMapper" />
        <ref bean="serviceExceptionMapper" />
        <ref bean="validationExceptionMapper" />
    </jaxrs:providers>
</jaxrs:server>

First, we register both an inbound and an outbound interceptor (well, for our data validation purposes we only need the inbound one!) Also, note that we registered an additional ValidationExceptionMapper as a provider.


The respective definitions are as follows:


<bean id="validationExceptionMapper" class="com.sap.hana.cloud.samples.granny.web.util.ValidationExceptionMapper" parent="constraintViolationMapper" />

<!-- Validation -->
<bean id="validationProvider" class="org.apache.cxf.validation.BeanValidationProvider">
    <constructor-arg><ref bean="validationConfiguration"/></constructor-arg>
</bean>

<bean id="validationConfiguration" class="org.apache.cxf.validation.ValidationConfiguration">
    <property name="messageInterpolator" ref="resourceBundleMessageInterpolator"/>
    <property name="parameterNameProvider" ref="jaxRSParameterNameProvider" />
</bean>

<bean id="resourceBundleMessageInterpolator" class="org.hibernate.validator.messageinterpolation.ResourceBundleMessageInterpolator">
    <constructor-arg index="0">
        <bean class="org.springframework.validation.beanvalidation.MessageSourceResourceBundleLocator">
            <constructor-arg index="0" ref="messageSource"/>
        </bean>
    </constructor-arg>
</bean>

<bean id="jaxRSParameterNameProvider" class="com.sap.hana.cloud.samples.granny.web.util.CustomJAXRSParameterNameProvider" />

<bean id="validationInInterceptor" class="com.sap.hana.cloud.samples.granny.web.util.CustomJAXRSBeanValidationInInterceptor">
    <property name="provider" ref="validationProvider" />
</bean>

<bean id="validationOutInterceptor" class="org.apache.cxf.jaxrs.validation.JAXRSBeanValidationOutInterceptor">
    <property name="provider" ref="validationProvider" />
</bean>

First, we define the ValidationExceptionMapper, which encapsulates the JAX-RS-specific interface ExceptionMapper<Throwable>. Most of the heavy lifting of converting ConstraintViolations to ValidationErrors is actually taken care of by the parent class: ConstraintViolationMapper. The primary reason for splitting it up into two classes is clean separation between a cross-cutting concern (data validation) and the (RESTful) API layer.
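A minimal, dependency-free sketch of that two-class split (the signatures here are hypothetical simplifications; the real classes work with the JAX-RS ExceptionMapper<Throwable> interface and javax.validation's ConstraintViolation, neither of which is reproduced here):

```java
import java.util.ArrayList;
import java.util.List;

/** Simplified stand-in for a single bean-validation constraint violation. */
class Violation {
    final String path, message;
    Violation(String path, String message) { this.path = path; this.message = message; }
}

/** Cross-cutting concern: converts violations into transport-neutral errors. */
class ConstraintViolationMapper {
    List<String> toValidationErrors(List<Violation> violations) {
        List<String> errors = new ArrayList<>();
        for (Violation v : violations) {
            errors.add(v.path + ": " + v.message);
        }
        return errors;
    }
}

/** API-layer concern: in the real code this class implements the JAX-RS
    ExceptionMapper interface and wraps the errors into a Response. */
class ValidationExceptionMapper extends ConstraintViolationMapper {
    int statusCodeFor(List<Violation> violations) {
        return violations.isEmpty() ? 200 : 400; // 400 = Bad Request
    }
}
```

The parent class stays reusable from any layer of the application, while only the subclass knows about HTTP status codes and the RESTful transport.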


Next, we define the configuration for the validation framework. Here, we register our own JAXRSParameterNameProvider and a MessageInterpolator. Both are required to massage the information reported back as ValidationErrors, e.g. looking up the right resource bundle containing our validation messages and providing proper information about the invalid attribute.


Finally, we register the inbound validation interceptor. Please note that this is yet another custom implementation (based on the standard one, of course), which comes with a slightly different approach to validating the incoming data. The main reason for the custom implementation was to ensure similar handling of data validation regardless of whether it happens on the API or the service layer.


Last words


Now, as mentioned several times already, we still need to ensure that the service layer also validates the incoming data. This is done via a corresponding aspect: DataValidationAspect.


With that, we conclude this chapter by looking at the result:




Hope you liked our first post about the "Road to API-ness" and that you tune back in next time when we talk about API documentation. So long...

Course Overview

ENROLL IN THE COURSE HERE (in case you haven't yet): Course: Next Steps in SAP HANA Cloud Platform


You can find a list of the course guides for each week of this course in the corresponding parent project of this blog post.


Currently there is not much to add with regard to additional information. But once more questions pop up in the forums, I'll add FAQs here.


Please use the SAP HANA Cloud Platform Developer Center or the corresponding openSAP forum for week 3 of this course to post your questions regarding the openSAP course.



Week 6: Advanced Features


The last week of the course will be around advanced features.



Unit 1 -  SAP HANA Cloud Portal for Developers

Based on what you've learned in the last week of the course Introduction to SAP HANA Cloud Platform, this unit explains how developers can leverage the SAP HANA Cloud Portal.


Important/additional information



Unit 2 - Developing Applications for Use in SAP HANA Cloud Portal Sites

This unit explains the development process on the SAP HANA Cloud Portal and how you can develop widgets for SAP HANA Cloud Portal.



Unit 3 - Design and Customize Cloud Portal Sites

In the last unit around the SAP HANA Cloud Portal you learn how to design and customize a site on SAP HANA Cloud Portal.



Unit 4 - SAP HANA Cloud Integration

This unit is about the SAP HANA Cloud Integration and explains some fundamentals around it.


Unit 5 - Wrap-Up & Outlook

The last unit of this course wraps up what you learned and provides an outlook on what will come next around the platform and around new openSAP courses for the SAP HANA Cloud Platform.


ENROLL IN THE COURSE HERE (in case you haven't yet): Course: Introduction to SAP HANA Cloud Platform (repeat)


You can find a list of the course guides for each week of this course in the corresponding parent project of this blog post.


Course Guide Week 6 - Advanced Features


Hi everyone,

we're coming to the end of the course Introduction to SAP HANA Cloud Platform.


This last week will provide you with some insights into additional features like the SAP HANA Cloud Portal, Gateway as a Service, and SAP Mobile Platform - cloud version.


Find below some additional information related to this week of our course.


Table of Contents



Unit 1 - SAP HANA Cloud Portal

Additional Information

Promo video of SAP HANA Cloud Portal




Unit 2 - Gateway As A Service

Common issues

You can't access Gateway as a Service

If your account was created at a time when Gateway as a Service was not yet available on the trial landscape, you are not able to use Gateway as a Service. The only way to fix this is to create a new account on the trial landscape.


You get redirected to http://localhost:8080/saml2/localidp/sso

Most probably you didn't reset the trust settings from the units in week 5.

To fix it, go to your cockpit and click Trust -> Local Service Provider -> Default.





Unit 3 - SAP Mobile Platform - enterprise edition - Cloud version



Unit 4 - Wrap-Up And Outlook

We hope you have already heard about our new web-based SAP River Rapid Development Environment (RDE) and had a chance to try it out. RDE is much more than a typical Integrated Development Environment (IDE): it is built with a much larger vision, which is to provide an extensible framework capable of hosting any number of independent tools to achieve seamless "end-to-end development".


Many articles have already been written to help developers set up the environment, create SAPUI5 applications from existing templates, consume a service, and so on. So, to maintain the spirit of end-to-end development, in this article we shall focus on the "service provisioning" aspects, i.e., the steps to actually create the service endpoint which is then consumed by the application.


SAP River RDE is still in beta, but we have already taken the initial steps to embed the service provisioning experience into the overall development lifecycle. This is motivated by the fact that a productive service is a prerequisite for a productive SAPUI5 application, so it is better to bring the two development experiences closer together. Of course, the creation of the actual data source itself (e.g., RFC, HANA view...) is mostly done via the respective platform-specific tools (e.g., Service Builder in SAP Gateway, HANA Studio...), but provisioning an OData service from a variety of data sources, across different platforms, in a consistent manner is important, and that is the focus here.


So, coming back to the RDE: what if you wanted to create a SAPUI5 application without a service? The service is yet to be built and, let's say, you don't want to wait until it has been fully implemented. Good news here: you probably already know about the mock service functionality of RDE. You simply need a metadata XML file, and a couple of clicks later the "Run with Mockdata" feature ensures that as an application developer you don't miss the absence of a service that badly anymore. Well, all is good so far, but what if you don't have the metadata either? Simple: SAP already provides a graphical OData modeling tool. You can even manually write a bunch of XML in your favorite text editor to create metadata.


Of course. But we think "simple" is not good enough when things can be "simpler", which, by the way, is SAP's bold vision for the future of businesses, as articulated by our CEO Bill McDermott as recently as last month at the SAPPHIRE NOW event in Orlando, Florida.


How would you like it if you did not have to leave the context, or leave the RDE, to create metadata? With automatic code completion? And with intellisense? Plus schema-based validation on the fly? And SAP annotations support? Sounds good? Okay, here we go... but just to be on the same page: the "simplicity" referred to above is about enabling an end-to-end development experience. Otherwise the idea of "simple" is always subjective and different for different developers.


Open SAP River RDE and notice the option to create a new "EDMX File" under "File -> New": that's the starting point for creating our Entity Data Model XML. Select a project or a folder where this file needs to reside and follow the screen grab below.




You are now presented with a dialog to type in the desired file name and select the OData Version. At present we support standard OData Version 2.0 with or without SAP specific annotations.




Once you are done, a (not-so) empty file is created with skeleton code, so the developer can focus on the actual scenario-specific metadata definition and leave the basic housekeeping to us.


At this point you cannot help but notice the two red squares: these errors are there to remind us to enter the mandatory parameters in the skeleton code which cannot be auto-generated. In this case these parameters are the "Namespace" and "Name" fields, and as soon as we give them appropriate values we are good again.




Next, we would like to create an entity type. All we need to do at this point is invoke the intellisense feature by pressing "Ctrl + Space", and we are presented with the different possibilities as a drop-down list. Much less error prone! Let us select "EntityType" and move on.




You will notice that all the necessary code around the tag "EntityType" is auto-generated. Again, a few errors are silently demanding your attention; they are mostly the user-defined values for mandatory properties.




Let us quickly define these missing elements. All the "Name" fields are obviously arbitrary user-defined values, but you will particularly notice the "Type" field. If you are someone like me, chances are you will not remember the list of primitive data types under sub-heading 6 of section 1, "Overview", of the OData Version 2.0 specification. Well, before we really try to "Google" how exactly they represent the "date" type in OData, let us give our intellisense a chance. Here are the results:
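For reference, here is roughly what a finished hand-written OData V2 metadata (EDMX) document of this kind looks like. All the names ("demo", "Order", and so on) are invented for illustration; note the Edm.DateTime type, which is how OData V2 represents a date:

```xml
<edmx:Edmx Version="1.0"
    xmlns:edmx="http://schemas.microsoft.com/ado/2007/06/edmx"
    xmlns:m="http://schemas.microsoft.com/ado/2007/08/dataservices/metadata">
  <edmx:DataServices m:DataServiceVersion="2.0">
    <Schema Namespace="demo" xmlns="http://schemas.microsoft.com/ado/2008/09/edm">
      <EntityType Name="Order">
        <Key>
          <PropertyRef Name="OrderID"/>
        </Key>
        <Property Name="OrderID" Type="Edm.String" Nullable="false"/>
        <Property Name="CreatedAt" Type="Edm.DateTime"/>
      </EntityType>
      <EntityContainer Name="demoContainer" m:IsDefaultEntityContainer="true">
        <EntitySet Name="Orders" EntityType="demo.Order"/>
      </EntityContainer>
    </Schema>
  </edmx:DataServices>
</edmx:Edmx>
```

A file like this is exactly what the "Run with Mockdata" feature mentioned above consumes.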




Now we can really convince ourselves that it is a context-sensitive feature. Anyway, I hope you get the drift: this is a fairly advanced OData model editor which you can trust to quickly get your models crafted in an assisted manner. Once you are done, simply export the model using "File -> Export Project or Folder" and a zip file is dropped from the cloud to your desktop. In this beta version RDE only supports exporting an entire project or folder, so please make sure you select the appropriate folder before attempting to export.


As you can imagine, this is just the beginning of our journey. As mentioned earlier in this blog, our larger motivations for doing this are a) simplicity and b) enabling end-to-end development. So, as we work towards embedding this experience into the larger development lifecycle and further enriching the product, we invite you to try it out and let us know your feedback. For more in-depth technical information, feel free to refer to the product documentation of this tool.


If you have questions, suggestions, feedback etc., feel free to add them as a comment here and we will quickly get back to you. If you prefer to discuss individually, an email is equally good.


Signing off for now~


CC: UI Development Toolkit for HTML5 Developer Center

Been meaning to do this post for ages, and with a few free seconds (and because it seems I've almost forgotten how to relax) I thought I'd throw something together.


A few months ago I was given the challenge of facilitating an open discussion on Fiori, UI5 and the future of mobile in the SAP HR enterprise at the Australian "Mastering SAP HR, Payroll and SuccessFactors" conference.  It certainly was a challenge and I'm not sure one I'd put my hand up for again, but as Robbo says, you only learn by trying.


The Plan

To make it a bit more fun, I thought I'd build a mobile UI5 app to go with the session. To make it even more fun, I gave myself 16 hours the week before the session started to do the work. Oh and Cloud. Because Cloud.



The "plan" was to have a web site (an HTML "app") that connected the device to the cloud (SAP HANA Cloud Platform) via a simple RESTful interface, sending the results of votes from all the attendees to a chart which would update in real time as people voted. The chart would update using WebSockets, so that any info sent to HCP by the apps would automatically get pushed out. Oh, and to make it fun, in order to retrieve the questions, the users would need to shake their mobile device.


16 hours, I was mad!


OpenUI5 vs SAPUI5.

One of the biggest differences between SAPUI5 and OpenUI5 (other than Open tending to be a release ahead) is the lack of charting libs in OpenUI5. Since this wasn't going to be an SAP product using any SAP tools (other than the HCP), I thought I'd better use OpenUI5. So that left me hunting (for approximately a minute) for an alternative way to draw graphs. After typing "open source javascript chart websockets" into Google, the first result was a paper from a scientific journal and the next two had to do with D3.js. So I had a look at that.



Having played a bit with D3 now, I'm impressed with what it can do, but even more impressed with the amount of documentation and examples out there. And for me this was the clincher: the more examples I could copy, the less work I needed to do, and those 16 hours were looking mighty short. Plus it seemed that people had got D3 and WebSockets to play nicely together before, so surely I could do the same.


HCP for persistence and somewhere to run a WebSocket endpoint

By now HCP is my choice of development platform, I'm really getting the hang of JPA (especially with Spring) and it's just easy. And - because cloud.



Hmm, I was sure I'd seen a blog on SCN... oh yes, there it was: http://scn.sap.com/community/developer-center/cloud-platform/blog/2013/12/19/websocket-on-sap-hana-cloud-platform. Seems simple enough... (oh, how foolish I am). The big thing to note is the JSR 356 definition and support. In simple terms this means that a WebSocket endpoint can be defined on the SAP HANA Cloud Platform just by annotating a class. Annotation is awesome, XML sucks; I've learnt this through my learnings with Spring. So I was very happy to continue with WebSockets.


Shaking detection

About a month before, I'd seen a cool feature that John Astill had demo'd on some of the SAP internal tools, where shaking the device entered a "feedback mode" for sending error reports and other feedback about the app. He had told me that it was just a simple bit of code. So I went looking. I ended up finding a lib called shake.js by a chap called Alex Gibson, who lives in the UK: http://alxgbsn.co.uk/. Very nice of him to share it, and especially to make the license clear. I'm not sure my usage of it is 100% right, as I seem to occasionally have to restart my phone to make it pick up new shake events (I think there is a limited number of listeners available for orientation events in the browser, and sometimes, especially when debugging, I don't clean up the ones I have used. Bad me.)


Putting it all together

So I had all the bits - what now? Well it was time to start coding.

All the code is available on GitHub - https://github.com/wombling/mobilequiz

I'll go through some of the more interesting bits (in my view) and also the results:


Playing with WebSockets


I'll include the entire code for the WebSockets class, as I found very few full examples of how to build such a service. I was disappointed that I had to use a static method to send the updates out, but WebSocket support only arrived with Spring 4 and I haven't had much experience with that yet. Not to mention it doesn't look nearly as simple as the JSR-356 standard I've used below. So integration with the Spring dependency injection autowiring/services is something that will have to wait. NB: the question service does have an interface and uses @Autowired when referenced, so I didn't completely give up on the idea. I know I should have implemented an interface and had tests for this too, and perhaps handled the exceptions, but well, 16 hours dudes!


package com.wombling.mobilequiz.admin;
import java.io.IOException;
import java.util.Collections;
import java.util.HashSet;
import java.util.Set;
import javax.websocket.OnClose;
import javax.websocket.OnMessage;
import javax.websocket.OnOpen;
import javax.websocket.Session;
import javax.websocket.server.ServerEndpoint;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import com.google.gson.Gson;
import com.google.gson.GsonBuilder;
import com.wombling.mobilequiz.api.ApiValues;
import com.wombling.mobilequiz.pojo.QuestionList;
import com.wombling.mobilequiz.user.QuestionService;
@ServerEndpoint("/questionWebSocket")
public class QuestionsWebSocket {

        Logger logger = LoggerFactory.getLogger(QuestionsWebSocket.class);

        private static Set<Session> clients = Collections
                        .synchronizedSet(new HashSet<Session>());

        private static void sendCurrentValues(Session session, QuestionService qs)
                        throws IOException {
                GsonBuilder builder = new GsonBuilder();
                Gson gson = builder.create();
                QuestionList questionList = qs.getAllQuestions("");
                String questionsJSON = gson.toJson(questionList);
                session.getBasicRemote().sendText(questionsJSON);
        }

        public static void sendUpdate(QuestionService qs) {
                for (Session client : clients) {
                        try {
                                sendCurrentValues(client, qs);
                        } catch (IOException e) {
                                // ignore for now
                        }
                }
        }

        @OnClose
        public void onClose(Session session) {
                // Remove session from the connected sessions set
                clients.remove(session);
        }

        @OnOpen
        public void onOpen(Session session) {
                // Add session to the connected sessions set
                clients.add(session);
                try {
                        QuestionService qs = WebSocketSupportBean.getInstance().getQs();
                        if (qs != null) {
                                sendCurrentValues(session, qs);
                        }
                } catch (IOException e) {
                        // ignore for now
                }
        }

        @OnMessage
        public void processGreeting(String message, Session session) {
                // Incoming messages are only logged - this endpoint just broadcasts
                logger.debug("Received: {}", message);
        }
}

Worth noting the use of Google Gson, which is a most awesome lib for converting POJOs to JSON and JSON to POJOs. It makes my life so much easier and is highly recommended. Also, who'd want to send data in any other format? (Unless of course you're feeling particularly enterprisey, in which case there's a cure for that and it's called Apache Olingo.)



With the broadcasting over WebSockets underway and tested (I also used Dark WebSocket Terminal to test), there needed to be something to subscribe and react to my messages.

This was actually relatively easy: the MV* architecture of UI5 lends itself nicely to just updating the model from anything (in this case a WebSockets update) and letting the framework take care of the rest.

Here's a code snippet taken from the onInit method of my admin view.

                var thisView = this;
                // Build a ws:// or wss:// URL matching the page's own protocol.
                // (Fix over the original: skip the port when location.port is
                // empty, which is what the browser reports on default ports.)
                function url(s) {
                        var l = window.location;
                        return ((l.protocol === "https:") ? "wss://" : "ws://") + l.hostname
                                        + (((l.port !== "") && (l.port != 80) && (l.port != 443)) ? ":" + l.port : "")
                                        + "/mobilequiz/" + s;
                }
                var socket = new WebSocket(url("questionWebSocket"));
                socket.onopen = function() {
                        console.log('WebSocket connection is established');
                };
                socket.onmessage = function(messageEvent) {
                        thisView._questionData = JSON.parse(messageEvent.data);
                };


Being able to change the protocol and port, depending on whether I was running locally over http/ws or on the HCP over https/wss, was an important consideration. It is actually trivially simple code; in fact, in many ways easier than making an AJAX call.
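Pulled out as a plain function for clarity (the function name and signature here are mine, not from the repo), the scheme/port selection logic can be unit-tested outside the browser:

```javascript
// Build the WebSocket URL for a given page protocol/host/port.
// Mirrors the controller snippet: wss on https pages, ws otherwise,
// and the port is only appended when it is a non-default one.
function wsUrl(protocol, hostname, port, s) {
        var scheme = (protocol === "https:") ? "wss://" : "ws://";
        // window.location.port is "" on default ports, so skip it then too
        var portPart = (port !== "" && port != 80 && port != 443) ? ":" + port : "";
        return scheme + hostname + portPart + "/mobilequiz/" + s;
}

// wsUrl("https:", "example.hana.ondemand.com", "", "questionWebSocket")
//   -> "wss://example.hana.ondemand.com/mobilequiz/questionWebSocket"
// wsUrl("http:", "localhost", "8080", "questionWebSocket")
//   -> "ws://localhost:8080/mobilequiz/questionWebSocket"
```

The loose `!=` comparisons are deliberate, since `location.port` is a string.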


Shake that phone!

The other code snippet I'd like to discuss is the logic to capture a shake.

onInit : function() {
        var config = {
                "showLoadingThingy" : true,
                "shakeSupported" : false,
                "shakeNotSupported" : false,
                "showQuestion" : false
        };
        var configModel = new sap.ui.model.json.JSONModel(config);
        this.getView().setModel(configModel, "cfg");
        var _e = null;
        var _i = null;
        var _c = 0;
        var updateOrientation = function(e) {
                _e = e;
                window.removeEventListener("deviceorientation", updateOrientation, false);
        };
        var thisView = this;
        window.addEventListener("deviceorientation", updateOrientation, false);
        _i = window.setInterval(function() {
                if (_e !== null && _e.alpha !== null) {
                        // Orientation data arrived - shake detection will work
                        // Clear interval
                        window.clearInterval(_i);
                        thisView.setConfig(true);
                } else {
                        _c++;
                        if (_c === 10) {
                                // No orientation data after 10 polls - no shake support
                                // Clear interval
                                window.clearInterval(_i);
                                thisView.setConfig(false);
                        }
                }
        }, 200);
},
setConfig : function(shakeAllowed) {
        var thisView = this;
        if (shakeAllowed) {
                window.addEventListener("shake", function shakeEventOccured() {
                        // Device was shaken - fetch the next question
                        // (handler name assumed here)
                        thisView.getNextQuestion();
                }, false);
        } else {
                sap.m.MessageToast.show("Your browser does not support shake detection");
        }
        var configModel = this.getView().getModel("cfg");
        configModel.setProperty("/shakeSupported", shakeAllowed);
        configModel.setProperty("/shakeNotSupported", !shakeAllowed);
        configModel.setProperty("/showLoadingThingy", false);
}

Here I've had to deal with phones/browsers that don't actually support shake detection but do implement the deviceorientation event subscription (honestly, what were they thinking?! Blinking iPad rubbish). I poll 10 times for a value in the orientation and check to see if the data is changing. If it is, hey hey, we have shake support! If not, either you ought to go into professional poker, since you can hold that phone so steady, or you have it on the desk, in which case shaking is probably not a good idea. I then update my config model with the results. This means I can then decide whether to show you a button to press for new questions, or make you shake it like a Polaroid picture. If shake is allowed, then I call the function to get the next question.


Putting buttons in a toolbar on the bottom of the screen is a Fiori induced anti-pattern and I don't like it.


Seriously! Look at any modern UI: where are the buttons? At the TOP of the screen, or right next to the thing you are working on! Check out the browser you are using right now, heck, even MS Office! Yes, there are items in the toolbar at the bottom, but user actions are placed where users can see them, not at the bottom of the screen in an unresponsive toolbar. New and better UI/UX is supposed to be about making things better for users. Consistent UI is fine, but consistently unintuitive UI does not become intuitive just because we get used to it. That just makes it obviously an SAP application.



So in my UI for the app, I kept things simple, because simple is best!


and even in my admin interface:


Drop shadows make everything look sexy, so I used them to highlight the questions. I probably should have rounded off the corners too; then it would have been extra cool. A little bit of CSS on top of UI5 makes all the difference, I am finding. Whilst the number on the right-hand side of the question probably means nothing to you, once you can see it counting down it becomes pretty obvious that it's a countdown.


Simple! And then click on the graph icon and get some of that websockets real time data vibe going:



The number of yes/no votes updates in real time and also updates the graph. Pretty cool, especially when you have a few people voting.

I did some fun stuff with cookies to ensure that people don't vote multiple times (unless they decide to clear their cookies between votes, and in that case they deserve it for being clever clogs). The system keeps track of which cookie ids voted for what... (very Big Brother; ha, and you thought this was anonymous - haven't you learnt ANYTHING in the last few years about the internet?!)
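The idea can be sketched in a couple of lines. The cookie name format here (voted_&lt;questionId&gt;) is invented for illustration; the real scheme lives in the GitHub repo:

```javascript
// Hypothetical sketch of the duplicate-vote check: has this browser's
// cookie string already recorded a vote for the given question?
// document.cookie looks like "sid=abc; voted_7=true"
function hasVoted(cookieString, questionId) {
        return cookieString.split("; ").indexOf("voted_" + questionId + "=true") !== -1;
}

// hasVoted("sid=abc; voted_7=true", 7) -> true
// hasVoted("sid=abc", 7)               -> false
```

In the browser you would call it as `hasVoted(document.cookie, questionId)` before accepting a vote, and set the cookie after a successful one.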

Though to be fair, it's going to take me a bit of work to figure out what this means: (not as if I stored much in the way of identifying data.)



I was having so much fun building this app, that I even got my family into the act and my daughter decided that she would help me by making a how to use the app video.

After the good, the bad stuff

Not everything was so awesome. OpenUI5 is currently only hosted out of Walldorf and takes a blinking age to download onto an Australian mobile phone. SAP has limited desire/resources to push the use of a CDN to make this faster; even SAPUI5 does not benefit from a CDN. The more community _ahem_ encouragement they get, the sooner this should be fixed. Please do throw your comments about this on the Stack Overflow post about it, or even here; the more volume this problem gets, the faster a solution.


I tried hosting the OpenUI5 code on my own website, which is based in Australia, and it certainly improved things. But then I was having problems because of the mismatch of protocols, with my website being served over HTTP but the AJAX comms to the HCP going via HTTPS (using CORS). It shouldn't have caused an issue, but when we tried it with lots of people it did. I'm not sure why.
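For anyone wanting to try the same, self-hosting mostly comes down to pointing the UI5 bootstrap script tag at your own copy of the library. The host below is a placeholder, and sap_bluecrystal was the theme of the day:

```html
<!-- Hypothetical bootstrap pointing at a self-hosted OpenUI5 copy -->
<script id="sap-ui-bootstrap"
        src="https://my-au-host.example/openui5/resources/sap-ui-core.js"
        data-sap-ui-libs="sap.m"
        data-sap-ui-theme="sap_bluecrystal">
</script>
```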


Still, even with an AU website hosting UI5, it took too long to download onto a phone. I'll have to check out custom builds of UI5 that only contain the bits I need; I hear that speeds things up. I'd also like to try hosting/using the AU HCP site as an experiment and see how that works. (Unfortunately the AU HCP data centre wasn't up and running at the point I wanted to use my app!)



In the end I probably did spend a little more than 16 hours, more like 32 hours in total, building the application. But most of that was spent trying out and learning new stuff, like D3.js and WebSockets, so the late nights were worth it.


Please have a look at my GitHub repository, fork it if you like or just borrow bits of code as you like.


I'd love to hear your comments. I think this combo of mobile, WebSockets and cloud is just about to take off: real real-time dashboards based on what people are doing right now. Combine this with HANA and I guess we could also get real real-time analytics of that data. Exciting new world!

Course Overview

ENROLL TO THE COURSE HERE (in case you haven't, yet): Course: Next Steps in SAP HANA Cloud Platform


You can find a list of the course guides for each week of this course in the corresponding parent project of this blog post.


Currently there is not much to add in terms of additional information, but once more questions pop up in the forums I'll add FAQs here.


Please use the SAP HANA Cloud Platform Developer Center or the corresponding openSAP forum for week 5 of this course to post your questions regarding the openSAP course.



Week 5: Securing Web APIs


This course week is all about securing web APIs on the SAP HANA Cloud Platform.



Unit 1 - Protecting Web APIs

In this unit you learn what web APIs are, when to use SAML 2.0 and OAuth, and what the benefits of OAuth are.


Important/additional information


Unit 2 - OAuth 2.0 Fundamentals

This unit explains the fundamentals around OAuth 2.0.



Unit 3 - Protecting the Cloud Application

In the third unit of this week you learn how to protect APIs programmatically and how to configure the OAuth filter.


Important/additional information



Unit 4 - OAuth Configuration

This unit shows you how to register OAuth clients and how to configure scopes for your cloud application.


Important/additional information

In this unit you might notice that the video, from minute 2:48 till 3:12, shows me entering a wrong URL. It should be http://localhost:8000/oauthcallback, but in the video I enter http://localhost:8000/ouathcallback. Please enter the correct link, http://localhost:8000/oauthcallback.


Unit 5 - Developing an OAuth Client

Finally in unit 5 we develop an OAuth client. You learn how to integrate an OAuth Client with the SAP HANA Cloud Platform OAuth Authorization Server and how to implement a callback handler for the authorization code flow in a desktop client.


ENROLL TO THE COURSE HERE (in case you haven't, yet): Course: Introduction to SAP HANA Cloud Platform (repeat)


You can find a list of the course guides for each week of this course in the corresponding parent project of this blog post.


Course Guide Week 5 - Connectivity


Hi everyone,

we are getting closer to the end of this course with week 5, which is about the exciting topic of the Connectivity Service.


We'll look into the Connectivity Service and how to use destinations and the Cloud Connector.


Compared to the initial course Introduction to SAP HANA Cloud Platform, provided at the end of 2013, you'll notice that this course week has been completely re-recorded due to some major improvements around the Connectivity Service.



Table of Contents


Unit 1 - Introduction To The Connectivity Service



Unit 2 - HelloWorld Connectivity Application



Unit 3 - Setting Up SAP HANA Cloud Connector

Common issues

Your OS is not supported

Please read the documentation of the SAP HANA Cloud Connector at SAP Development Tools for Eclipse, because the SCC is not available for all operating systems.



Unit 4 - Configuring And Using Destinations



Unit 5 - Other Scenarios



Cloud Extension Scenario


Integration with SuccessFactors Applications




Course Overview

ENROLL TO THE COURSE HERE (in case you haven't, yet): Course: Next Steps in SAP HANA Cloud Platform


You can find a list of the course guides for each week of this course in the corresponding parent project of this blog post.


Currently there is not much to add in terms of additional information, but once more questions pop up in the forums I'll add FAQs here.


Please use the SAP HANA Cloud Platform Developer Center or the corresponding openSAP forum for week 4 of this course to post your questions regarding the openSAP course.



Week 4: Advanced Identity Management


This course week is all about advanced identity management and extends the know-how around security and security management you've already built up during the course Introduction to SAP HANA Cloud Platform.


To get deeper into the topics you can look into some additional material provided to you by Martin Raepple via SCN:



Unit 1 - Working with User Profile Attributes

This unit is about the different classes of user account information, how you configure user attributes with the local IdP and in the Cloud Cockpit, and how one can access the user attributes in Java-based apps.


Important/additional information


Unit 2 - Group Management

In unit 2 of this week you learn how to use groups in the SAP HANA Cloud Platform and how to assign users to groups.



Unit 3 - Federated Authorization with Groups

In this unit you learn how to define mapping rules to groups.


Important/additional information


Unit 4 - Custom Roles

Learn how to define and use custom roles at runtime.


Unit 5 - Working with Multiple Identity Providers

In this week's last unit you learn how to set up multiple identity providers per account in the SAP HANA Cloud Platform Cockpit.

Be Careful. You Might Succeed.

TLDR: Bill McDermott's keynote at Sapphire didn't talk about SAP's restructuring or corporate shifts; notably, the focus was on 'simple', which may not have resonated with everyone who was there looking for answers. But to EnterpriseJungle it was the conclusion to, and delivery on, a promise they made us: that as development partners we could build last-mile solutions atop their platform and be given a direct channel to their customers. Simple.

Start-ups talking about the commercialization or revenue path of their product often talk about selling to enterprises. It’s almost a de facto response and whilst sometimes they achieve this goal, generally it’s deluded. The pipeline to the precious enterprise customer is a road littered with obstacles. Getting those few lighthouse customers – real enterprises – is nigh on impossible.

I have spent the better part of the last decade in the collaboration space, first as a user and then in corporate and advisory positions. My core focus has been on two simple but key premises. Firstly, "The Answer Is Always In the Room": a statement of fact that whatever you need, there is always someone in the room you are in who can solve the need, or who knows someone who can solve it for you. The megaphone approach, shouting the question out in the room you are in, is rarely an option. Add to that, employees have a right to privacy: a right not to have to let the entire company know they need help. So creating a confidential platform in place of a megaphone has been on the agenda. Secondly, I look to Donald Rumsfeld's "unknown unknowns": there are things you don't know you don't know. That being the case, what is the best way to bring to light knowledge or data that the user needs but doesn't know they need?

Read The Full Article On LinkedIn


ENROLL TO THE COURSE HERE (in case you haven't, yet): Course: Introduction to SAP HANA Cloud Platform (repeat)


You can find a list of the course guides for each week of this course in the corresponding parent project of this blog post.


Course Guide Week 4 - User Authentication and Security


Hi everyone,

in this week you'll get an introduction to user authentication and security in the SAP HANA Cloud Platform.


We'll learn how the platform helps you with the delegation of authentication and how to secure your applications on the platform.


At the end of this week you'll know how to accomplish this task efficiently and how to tackle even more sophisticated challenges around testing the security mechanisms of your application.


Find below some additional information related to this week of our course. There is also an overview document about all the blog posts I wrote during this course.


Table of Contents



Unit 1 - Security Development


Unit 2 - Local Security Testing




Unit 3 - Security Testing In The Cloud




Unit 4 - Security Configuration In The Cloud



Unit 5 - Security Troubleshooting


Troubleshooting Issues when implementing SAML SSO in HANA XS Engine


When implementing SAML SSO in the HANA XS engine, I searched for a standard SAP installation guide. It seemed to me that no official SAML installation guide was available at the time of writing. However, I had access to this beautiful SCN blog http://scn.sap.com/docs/DOC-50418 and the installation guide "How to configure SAP Crypto libraries and SSL for HANA XS Engine.pdf", which walk you through the SAML installation steps. During the implementation we hit various challenges that prompted me to write this troubleshooting blog. Highlights of a few of the issues are given below:


Issue Enabling SSL in HANA XS Engine:

Caution to readers: in an actual development landscape, you should NEVER use the SSL Evaluation Certificate from the SAP Service Marketplace. This certificate is only to be used for demo purposes. You should always request that the customer provide the necessary certificate for signature and import (for example, SAPSSL.cer).

If the certificate provided by the customer has a root certificate and an intermediate certificate associated with it, then those also need to be added to the PSE (for example, SAPSSL.pse).

Also, copying the signed certificate from Windows to the Linux file system can be an issue due to different interpretations of carriage return and line feed. It is best to paste the content of the certificate directly in the Linux system using the vi editor. You may have to try this step a few times before the certificate is actually created successfully.


Google Chrome Issue: This webpage has a redirect loop.

This issue occurs when the Service Provider (HANA) redirects the request to ADFS and ADFS is unable to determine where to redirect the response back, ending in an infinite redirect loop in the browser. The fix is to create the correct SP entry in ADFS so that it redirects the response back to the Service Provider (HANA). Be careful with the entries you make: they must match the URL of the SP exactly, otherwise you will run into the next issue.


Unable to verify XML signature (StatusCode:, StatusMessage)

There are various reasons for getting this error message:

On ADFS Side:

  • The Name ID format "Unspecified" parameter is missing in the Issuance Transformation Rule.
  • SHA-1 is not configured as the hash algorithm in the relying party properties (the default is SHA-256).
  • The Relying Party identifier has multiple entries or an incorrect entry that does not match the entityID of the SP metadata. Make sure that every single character matches the SP metadata.


On HANA Side: 

  • In the HANA XS admin, when creating the identity provider, you may get an error at the Subject field stating that it is invalid due to a special character (the hyphen). This is caused by a bug in the older XS engine release 7.x, which is fixed in the newer release. If an upgrade is not an option, you can also create the IdP entry from the SQL editor as follows:




insert into _SYS_XS.HTTP_DESTINATIONS values('sap.hana.xs.samlProviders', 'ADFS', 'desc', '<YOUR ADFS HOST NAME>', 443, '', 0, '', 0, 0, 1, -1, '', '');

insert into _SYS_XS.SAML_PROVIDER_CONFIG values('ADFS', 0, 0, 'sap.hana.xs.samlProviders', 'ADFS', '/adfs/ls');

insert into _SYS_XS.SAML_PROVIDER_CONFIG values('ADFS', 0, 1, 'sap.hana.xs.samlProviders', 'ADFS', '/adfs/ls');

insert into _SYS_XS.SAML_PROVIDER_CONFIG values('ADFS', 1, 0, 'sap.hana.xs.samlProviders', 'ADFS', '/adfs/ls');

insert into _SYS_XS.SAML_PROVIDER_CONFIG values('ADFS', 1, 1, 'sap.hana.xs.samlProviders', 'ADFS', '/adfs/ls');

  • Another cause can be a mismatch between the IdP Subject and the certificate. In my case, I uploaded a certificate for SSO whose Subject contained SP=xxxxx, whereas the Subject line in the XS admin IdP metadata contained ST=xxxxx. Since these did not match either, we got this error.


SAML Logout Issue:

If you have implemented the SAML logout code as mentioned in the blog with logout.xscfunc and are still unable to log off, run an HTTP trace to check whether the logout request reaches the ADFS system. If the request reaches ADFS and you still cannot log off, the endpoint is probably not configured correctly in ADFS. Check the Logout URL in the SAML Logout Endpoint, and also check the Single Logout Redirect and Post settings in the HANA XS Admin IdP setup.


Error in Google Chrome: Assertion did not contain a valid MessageID.

As described in the blog, if you face this issue, create the SAML parameter assertion_timeout and set its value to 30.
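For reference, such SAML parameters are typically maintained in the xsengine.ini configuration (via the HANA Administration Console rather than by editing the file directly). A sketch, assuming the parameter lives in the saml section; the exact section name may differ between revisions, so verify it in your system's configuration UI:

```ini
# xsengine.ini (maintain via the configuration UI, not by hand)
[saml]
assertion_timeout = 30
```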

Course Overview

ENROLL IN THE COURSE HERE (in case you haven't yet): Course: Next Steps in SAP HANA Cloud Platform


You can find a list of the course guides for each week of this course in the corresponding parent project of this blog post.


Currently there is not much to add in terms of additional information, but as more questions pop up in the forums I'll add FAQs here.


Please use the SAP HANA Cloud Platform Developer Center or the corresponding openSAP forum for week 3 of this course to post your questions regarding the openSAP course.



Week 2: Git and HTML5 Apps - Part 2

In this week of the course you learn how to leverage the SAP HANA Cloud Platform to deploy HTML5 applications to the cloud, all with a built-in Git repository for each of your HTML5 applications. In case you haven't used Git yet, you will get a good introduction to it.


Unit 1 - Releasing a Version of Your HTML5 Application

In this unit we release a first version of our HTML5 application on the SAP HANA Cloud Platform.


Important/additional information



Unit 2 - Adding a Chart to Your Application

This unit shows how to add a chart to our SAPUI5 application and demonstrates again how the versioning of commits is used on the SAP HANA Cloud Platform.


Important/additional information


Unit 3 - Working with Multiple Branches

In this unit you learn how to work with multiple branches in Git. This helps teams work on various features in parallel.


Important/additional information


Unit 4 - Resolving Merge Conflicts

This unit provides you with a basic understanding of how to resolve merge conflicts in the day-to-day work with Git.



Unit 5 - Git History

The last unit explains the Git History functionality and how it helps you keep track of the changes you made.


ENROLL IN THE COURSE HERE (in case you haven't yet): Course: Introduction to SAP HANA Cloud Platform (repeat)


You can find a list of the course guides for each week of this course in the corresponding parent project of this blog post.


Course Guide Week 3 - Advanced Persistency Features


Hi everyone,

In the third week of the course Introduction to SAP HANA Cloud Platform we will introduce some features of the Persistence Service, and we'll also look into the Document Service, which stores unstructured data.


Find below additional information that may be useful to you during this course week. While monitoring the forums, I'll also add sections about common mistakes and how to fix them to the corresponding units.


Table of Contents



Unit 1 - Local Development


Issues With This Unit


When going through this unit you'll notice a difference compared with the video. Read the blog post from Eli, which provides some help to get around that issue.


Additional Links


Unit 2 - Using HANA Modeler


Proxy settings

To be able to establish a database tunnel through the console client, you need to have your proxy settings set up correctly. Just follow these instructions: https://help.hana.ondemand.com/help/frameset.htm?7613dee4711e1014839a8273b0e91070.html
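As a sketch, on Linux or macOS the console client honors the standard proxy environment variables; the proxy host and port below are placeholders for your corporate proxy, and the open-db-tunnel options shown are illustrative (check the neo client help for your version):

```shell
# Placeholder proxy address -- replace host and port with your corporate proxy.
export HTTP_PROXY=http://proxy.example.com:8080
export HTTPS_PROXY=http://proxy.example.com:8080
echo "$HTTPS_PROXY"
# With the proxy in place, the database tunnel can then be opened, e.g.:
#   neo open-db-tunnel -h hanatrial.ondemand.com -a <account> -u <user> -i <schema id>
```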



Unit 3 - Introduction To The Document Service


Common issues


com.sap.ecm.api.ServiceException: Could not create repository. Document Service responded with http 400

One possible cause of this error message is that the password for your key is less than 10 characters long. Make the password at least 10 characters long and publish the code changes, which should fix the issue.



Unit 4 - Consuming Document Service with External Tools



Unit 5 - Document Service Metadata & Queries


