As of this week, SAP HANA Cloud Platform enables Web-based HANA XS development for free trial developer accounts!



What do you need to try out Web-based SAP HANA XS Development?


The only thing you really need is a Firefox or Chrome web browser.


Then you can launch the new Online Tutorial (use Firefox or Chrome!) which guides you step-by-step through a Web-based HANA XS Development example.


How does the online tutorial work?


1. First the tutorial guides you through some prerequisite steps (e.g. making sure that you have a free developer trial account, or creating one if you don't).


2. After finishing the prerequisite steps, you can start the tutorial's step-by-step exercises.


3. Follow the short instructions step by step, using the pre-configured code snippets adapted to your settings.


4. Develop the HANA XS PersonsList application step by step on your SAP HANA system using the SAP HANA Web IDE.


Detailed Documentation for the Web-based PersonsList Development Scenario?



In his SCN blog, Bertram Ganz has already provided a detailed PDF document for the PersonsList HANA XS development example, but it targets the on-premise SAP HANA Platform.


Development on the on-premise SAP HANA Platform works partly differently than on a HANA trial instance (e.g. on a HANA trial instance you have to develop within a dedicated schema, whereas with the on-premise SAP HANA Platform you can define your own development schema).

Therefore Bertram is working on a corresponding detailed PDF document which will take the trial-instance-specific parts into account.




Questions and Feedback?

I really appreciate your questions or feedback.

Please add your comments below.

In this tutorial you'll learn how to set up an SAP NetWeaver Application Server ABAP 7.4 Trial Edition on Amazon Web Services (AWS) using the SAP Cloud Appliance Library (CAL).



You need an active Amazon Web Services (AWS) account for the Elastic Compute Cloud (EC2) service!


Registering for a CAL account


  • Navigate to and log in using your free SAP HANA Cloud Platform Developer Edition account.




  • Create a CAL account by clicking on the Create Account link in the top-right corner.
  • On the first wizard page (Define General Properties) provide a Name for your account (e.g. "SAP HCP Backend"). Click on Next.
  • Now, we need to select our Cloud Provider. At the time of writing, the only available option is Amazon Web Services. Please also provide your AWS Access Key and Secret Key. Click on Next.
  • On the third wizard page (Select Account Users) you can maintain (additional) users for this account. Your user should be registered by default. Click on Next.
  • The next (optional) step is to provide information for the Cost Forecast. The region us-east-1 is enabled by default. Click on Next.

Setting up a CAL-based solution




  • Switch to the Solutions tab and scroll down to the bottom of the list. Activate the entry SAP NetWeaver Application Server ABAP 7.4 on SAP MaxDB - Trial Edition by clicking on the corresponding Activate button.
  • Now, you can create an instance by clicking on the Create Instance button of the selected solution and provision it to AWS EC2.


Creating an instance of a CAL-based solution




  • On the first page of the Create Instance wizard you are asked to define the general properties of the instance to be created. Provide a Name for your instance (e.g. "NW App Server 7.4 Public"). Your Account information should already be filled in, as well as the default Region to be used. The most important setting is the Access From value, which can either be Public or Corporate Network (VPN). Within the scope of this tutorial we'll stick to the default (public) access level. Proceed by clicking Next.
  • On this wizard page you can configure the virtual machine settings. Leave the Virtual Machine Size as is as well as the other settings. (We will go back to this page later on and make some required changes, but let's take it step-by-step). Click on Next.
  • On the third wizard page we have to define our master password. Maintain one and then proceed to step 4 by clicking on Next.
  • Here you can define the scheduling configuration. While it generally sounds like a good idea to setup a schedule to (re-)start and stop your instance, we opt for manual (de-)activation and proceed by clicking Next.
  • We step through the remaining wizard pages without making any changes and finally confirm the creation process by clicking on Finish. (When you do this for the first time it may take up to 35 min to set up the solution.)
  • You'll be prompted to store and/or download the generated license key (PEM file).


Note: Once you confirm the Create Instance wizard dialog, the selected solution will be provisioned to your AWS EC2 account, and from that moment on you are generating costs!
  • Once the instance is up and running it will be listed in the Instances tab. If you click on the link in the Instance Name column you can open up a pop-up window to take a look at (and edit) the individual settings. One of the most important pieces of information displayed is the IP address of your instance.


Note: In the following command line scripts we have used the IP address only as an example. You need to replace it with the real IP address of your AWS system! Similarly, we refer to the key certificate issued to connect to the system as NW AppServer 7.4.pem. If you have chosen a different name, please substitute it accordingly in the command line scripts below.
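To keep these substitutions manageable, one possible convention (not part of the original guide) is to store the address and key file name in shell variables once and reuse them in every later command; 203.0.113.10 is a documentation example address, not a real instance:

```shell
# Hypothetical convenience variables -- both values are placeholders that you
# must replace with your real instance IP and PEM file name.
AWS_IP="203.0.113.10"
PEM_FILE="NW AppServer 7.4.pem"

# Print the ssh command that the variables expand to:
echo ssh -i "$PEM_FILE" root@"$AWS_IP"
```

Each later ssh/scp call can then reference $PEM_FILE and $AWS_IP instead of literal values.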


Setup the SAP Cloud Connector via SSH

  • Connect to your instance running on AWS via ssh:
ssh -i "NW AppServer 7.4.pem" root@
Note: Your command line/shell may complain about the PEM certificate, stating that its permissions are too open. In such a case, make the file permissions more restrictive (e.g. set them to 400).
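For illustration, here is what that looks like with a placeholder key file (demo-key.pem stands in for your real PEM file):

```shell
# Create a placeholder key file and restrict it to owner-read-only (mode 400),
# which satisfies ssh's permission check on private keys.
touch demo-key.pem
chmod 400 demo-key.pem
stat -c '%a' demo-key.pem   # prints: 400
```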


  • Download the (productive) Linux version of the SAP Cloud Connector from the tools page. Now we need to copy this file to our instance, then unzip and install it (via the rpm package manager):
scp -i "NW AppServer 7.4.pem" root@ 
rpm -i
  • Once the cloud connector is up and running it will start a lightweight web server listening on port 8443. However, this port is not yet exposed to the outside world, hence we need to add it as a 'custom port' to the virtual machine of our instance.
  • Navigate to the CAL console again and click on the name of your instance in the Instance tab. This will open up a pop-up window.
  • Switch to the VIRTUAL MACHINE tab and click on the Edit button in the lower right corner.
  • Now add a custom Access Point mapped to Service 'HTTP' and Port '8443'. Click on Add.
  • Confirm the changes by clicking on Save. The settings are instantly applied.
  • Connect to the cloud connector by opening the respective URL. The default username/password combination is Administrator/manage.


Note: Make sure to connect to the administration console of the cloud connector as soon as you have exposed the port to the public, and change the default password to a more secure one right away!


For further information about how to configure the cloud connector for your respective scenario, please consult the official documentation:
SAP HANA Cloud Platform - Initial Configuration of Cloud Connector



Connecting to our ABAP system via SAP Gui (for Java)


You may want to access your ABAP instance via SAP Gui. If not (or if you are already familiar with using the SAP Gui) then you can safely skip this last section.


Note: In the scope of this tutorial we'll explain how to install SAP Gui for Java as it can be used on non-Windows operating systems as well. (Windows users may want to use the native Windows version.)


The simplest way to get access to the SAP Gui is by copying it from the AWS instance:

scp -i "NW App Server 7.4 Public.pem" \
root@


Simply extract the ZIP file and run the installation script as explained in section 4.1.1 of the following guide: Getting Started with SAP NetWeaver Application Server ABAP 7.4 SPS02 on SAP MaxDB - Trial


Once installed, the last remaining step is to set up a corresponding connection configuration. On a Mac the respective configuration is located in the following file: /Users/<username>/Library/Preferences/SAP/connections.


The configuration file may look as follows:

# file    : /Users/neo/Library/Preferences/SAP/connections
# created : 11.04.2014 16:15:52 CEST
# encoding: UTF-8

NPL:conn=/H/ # Amazon
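For orientation, the conn value uses SAProuter-style route notation: /H/ introduces a host, optionally followed by /S/ and a service port (3200 plus the instance number). A hypothetical complete entry (203.0.113.10 is a documentation example address; instance 00 is assumed) might look like:

```
NPL:conn=/H/203.0.113.10/S/3200 # Amazon
```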



Although SAP offers the trial editions for free, you will still have to cover the costs for running them on AWS!


Next steps


SFlight sample application showing how to extend an on-premise ABAP system using JCo/RFC


Related Information


As there has been a lot of fuss around the recently announced Heartbleed vulnerability in OpenSSL we would like to inform you that HCP is not vulnerable to Heartbleed.


The internet connections can also be tested with openly available Heartbleed test tools.


The OpenSSL version available on the OS images is not vulnerable. Therefore, even if your application uses OpenSSL it is safe if it uses the libraries provided by the platform.




Did you know that you don't have to deploy and restart the whole application over and over again just to see your latest changes while developing on SAP HANA Cloud Platform? You can now speed up your development by applying and activating changes on the already running application.

Just use the hot-update command (beta) in the console client.




With hot-update, you redeploy and update the binaries of a running application faster than with a normal deploy and restart.

Use the --strategy parameter to choose among three approaches to hot-update:

  • replace-binaries - redeploys and updates the application binaries
  • restart-runtime - redeploys and updates the application binaries and restarts the application process
  • reprovision-runtime - cleans up the file system, reprovisions the runtime and redeploys and updates the application binaries


The different strategies help you update your application depending on its behavior.

For example, to update a web resource like an image, you can use the first strategy. To support applications that need to retrigger their initialization logic, use the restart-runtime strategy, while apps that want a clean file system can make use of the reprovision-runtime strategy.
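That mapping can be sketched as a tiny shell helper (purely illustrative; pick_strategy is not part of the neo client):

```shell
# Map the kind of change you made to the hot-update strategy described above.
pick_strategy() {
  case "$1" in
    static-resource) echo "replace-binaries"    ;;  # e.g. an updated image
    init-logic)      echo "restart-runtime"     ;;  # re-trigger initialization logic
    clean-fs)        echo "reprovision-runtime" ;;  # start from a clean file system
  esac
}

pick_strategy static-resource   # prints: replace-binaries
```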




neo hot-update --host --account myacc --application myapp --source samples/deploy_war/example.war  --user --strategy replace-binaries





The hot update is still in beta so do not use it to update productive applications. Use it only for testing purposes during development.


The beta state still has the following limitations:


  • Works only if there is a single running process of the application.
  • Removing WAR files is not supported.
  • You cannot change the deploy parameters and context path of the application.

Try it out!


It will save you a lot of time.
Give us your feedback and we will make it even better.

I seem to be in a blogging mood recently. Unfortunately, I missed the chance to give a shout about a cool new feature which was released a couple of weeks ago (on March 13, precisely speaking) but apparently has gone unnoticed by many folks and friends of SAP HANA Cloud Platform. So - and as a wise man once said - I thought "better late than never" and decided to try and fix this lapse.


What am I talking about? Well, until March 13, if you wanted to connect to an SAP HANA instance on SAP HCP, you had to open a DB tunnel in the Console Client by executing the open-db-tunnel command, copy the connection details it returned to you (including the instance number, DB user and password), go to your HANA Studio (and yes, it had to be a separate standalone IDE from your other Eclipse IDE which you were using for Java development on HCP and/or any other development you do!) and paste those connection details into the corresponding wizard pages. Oh, and did I mention that you had to keep the DB tunnel in the Console Client open while you were working on the remote HANA instance? Sounds fun? Guess not.


Well, forget about the above. We've put an end to all those tedious, awkward and error-prone steps across so many different tools. Now you can do everything in a single tool (the Eclipse IDE) with just a couple of clicks. The prerequisite is that you have installed an Eclipse Kepler IDE with the SAP HANA Cloud Platform Tools on top, as described in Setting Up the Tools. (And yes, you got it right - all necessary SAP HANA Tools can now be installed from the SAP Development Tools for Eclipse - Kepler software repository as well, so you no longer need a separate HANA Studio installation.)


Now, if you already have a trial or productive SAP HANA instance in your account (see e.g. Creating a Trial SAP HANA Instance), you can just go to the SAP HANA Development perspective in your Eclipse IDE, right click in the Systems view and select Add Cloud System... which should open up this nice wizard:


Enter your account info and click Next. This will lead you to the second page where you have to select an instance from the list:


Click Finish and in a few seconds the system will be added to the Systems view. If you are working with a productive instance, you also need to provide your own DB user and password. See the official documentation for how to create one. In this case, as a last step the wizard might ask you to change your DB password if it is still the initial one.


That's it! Now you can go on and import some sample apps, play with them, or develop your own XS app - it's up to you. Enjoy! And don't hesitate to share your feedback and comments - we are always looking for further improvements and simplifications.



No, it's not an April Fools' Day joke. We've just set the date for our second SAP CodeJam Sofia around SAP HANA Cloud Platform! It will be on May 15 at the SAP Labs Bulgaria office.


Now, I should also tell you that this time it'll be a special one. And that's because it's organized together with University Alliances and designed especially for students from Bulgarian universities who are eager to get to know this cool and innovative cloud platform from SAP and make their first steps with it!


As a participant you will:

  • get an introduction to SAP HANA Cloud Platform
  • get access to the technology, tools, experts and more
  • get your hands dirty developing on-demand applications using modern Java, Web UI and SAP HANA technologies
  • start small from “Hello World” and grow bigger step-by-step leveraging persistence, document management, identity management and other platform services to build an application for your desired use-case
  • all this in a fun and casual environment with your classmates and friends (and while enjoying some great food and beverages, btw)


So, why not put yourself to the test and sign up for SAP CodeJam now? The event is free, but space is limited.


Meet you there!

As today's post title suggests, I've been having some fun with Apple PassKit passes and I wanted to share my findings with you.


Disclaimer: As fellow developers know, we have our own understanding of 'having fun', and in this particular case it involves messing around with certificates, digital signatures and SHA1 hashes, well-formed JSON syntax, and firing up a precise chain of console commands...


Having spent a lot of time on the road over the last couple of years, I have come to appreciate Apple's PassKit technology and the convenience it brings. Fortunately my favorite airline has supported it from early on, and so I enjoyed the ease of using a digital boarding pass and getting all relevant information auto-magically displayed on the lock screen of my mobile phone as soon as I got close to the airport. Ever since, I've been interested in creating my own passes in order to extend this convenience factor to other scenarios.


During my research I stumbled upon a great tutorial that explains the entire process of creating a Passkit pass by hand in great detail. Instead of repeating it all here I'd recommend everyone interested to read the following blog posts:



Once you've gone through it step by step you'll understand that it surely is no rocket science, yet the whole process of creating SHA1 hashes, obtaining certificates and signing everything is ... well, tedious to say the least.
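To get a feel for the manual effort, here is the hashing step in isolation: every file in a .pkpass bundle is listed in manifest.json with its SHA1 digest (the pass.json content below is just a stub for illustration):

```shell
# Write a stub pass.json and compute the SHA1 digest that would go into
# manifest.json for this file.
printf '{"description":"demo pass"}' > pass.json
sha1sum pass.json | cut -d' ' -f1   # a 40-character hex digest
```

Doing this by hand for every file in the bundle, and then signing the manifest, is exactly the tedium the libraries below take away.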


Being the lazy developer that I am, I was of course quick to search for libraries that would ease the pain, and I didn't have to search for too long until I stumbled upon the jPasskit project on GitHub. It's a small library, but given that it gets the trick done, that's a great thing!


By looking at the provided JUnit tests I was able to get the hang of it quite fast, and in no time I had created a utility class that wraps the signing and packaging aspects into a single method - voilà.

I've compiled a little sample project on our GitHub page, so if you're interested in giving it a try - please head over there:


Oh, and before I call it a day... here's the result:


You can download the pass here


In part 2 I'll dig a bit deeper and look into advanced topics such as relevance information (e.g. geo-fencing) and updating of passes. Until then... cheers & have fun coding!

Have you ever wondered whether it is possible to develop your next cloud application end to end directly in the browser? What language should you use? What other tools do you need for database management, authorization definitions, testing, life-cycle management, monitoring…?

What if you can write the whole application in JavaScript? You like Ruby? What about Groovy as well? And still in the same environment? 

What if you can run Apache Camel routes directly as simplified integration services and scheduled jobs?

What if you can browse your database schema or catalog and execute your query and update scripts again in the same environment?

What if you can have a jump start with your user interface by using predefined site templates based on OpenUI5? You are already familiar with other AJAX frameworks? Let me guess - jQuery? Bootstrap? AngularJS?

What if you can create your next SalesOrder entity from the table, through the CRUD service, to the user interface only via wizards, without a need to touch the code?

You want to share the project with the other members of your team? What about Git? What if you can collaborate also worldwide via the open place for exchange of samples?

Too many questions? We are just starting…

Dirigible is Open-Sourced


It's a great pleasure to announce that the internal TGiF project Dirigible has been open sourced under the Apache License and is available on GitHub, joining the growing family of SAP HANA Cloud Labs projects.

Is it the answer to your questions? Could be…

Try it Now

The easiest way to taste the IDEaaS is to open the location with your preferred browser.





Deploy on Your HCP Account

Since the trial instance is shared, you have limited functionality there (publishing to the registry, SQL update statements, etc. are forbidden), so it is better to run Dirigible on your own SAP HANA Cloud Platform account. How to do that?


Just download the latest release from GitHub:


Go to the release section.


Download both WAR files.

Get HCP SDK 1.x from and unzip it on your local file system.

Go to folder.

Deploy with the following command:


neo deploy --account <your_account> --application <application_name> --user <your_user> --host <target_landscape_host> --source <downloaded_wars_directory>


Start with the following command:


neo start --account <your_account> --application <application_name> --user <your_user> --host <target_landscape_host> -y


Congratulations! You have a Dirigible in your account!


In the Cockpit, go to the Authorizations section:




Add both roles (Developer and Operator) to your user to have full access to all the features.


Now open the IDE link from the Cockpit:



Great, you have set up your own environment.

Need a Sample?


Would you like to quickly create a sample project?


Choose File->Import->Sample menu:




Choose one from the list:




Click Finish and you are ready. You can explore the project structure.



Want to see it running? Select the project node (catalog) and choose the Activate action from the context menu.


It is done! Open the Web Viewer and select the catalog/WebContent/user/index.html from the Workspace Explorer.


You can Publish the project and show your buddies your own Pizza Catalog.

You do not have a pizza restaurant? It is high time you established one, isn't it?


The project site:

The source code is available at GitHub -






Google Group:


By: Petra Lazarova, Dobrinka Stefanova, Boris Angelov

Disclaimer: This document relates to functionality available on the productive SAP HANA Cloud Platform landscape.

Just imagine you developed a web shop XS application on a productive HANA system, and you’d like your customers to be notified by email about changes of their sales orders.

The solution you need is sending an e-mail from the HANA XS application.


You send a mail to your personal email account by implementing an XS Service (XSJS) that consumes an HTTPS REST mail provider. (The MailGun email service provider will be used for the example solution.)


  1. Get a free MailGun account at Log in to your MailGun account and copy your API key. You will need it for Basic authentication in step 5. Copy your MailGun subdomain. You will use it in your destination path prefix and as your mail domain and mail sender domain in step 1 and step 2.
  2. For more details on Sending Messages using Mailgun HTTP API, see
  3. Purchase a productive HANA account, obtaining the host, account and schema names.
  4. Set up the SAP HANA Tools feature group and SAP HANA Cloud Platform Tools for connecting to SAP HANA systems, following the official documentation.
  5. Connect to your productive SAP HANA instance following the documentation.
  6. Create, activate and test a simple XS application. (For this example, let's assume you created a package “mypackage” with an XS application “myapplication” as a subpackage.)



Step 1: Create XS HTTP destination file “mailgun.xshttpdest” into your application root folder and add the code:

host = "";

port = 443;

pathPrefix = "/v2/<Your MailGun subdomain>";

useProxy = true;

proxyHost = "proxy";

proxyPort = 8080;

authType = none;

useSSL = true;

timeout = 30000;

Note: Enter your MailGun subdomain as part of pathPrefix, and check the XS Destinations documentation to determine and set the correct proxyHost for your environment.

Activate your XS HTTP destination file.

Step 2: Create an XS Service “mailgun.xsjs” file into your application root folder, and add the following code:


var destination_package = "mypackage.myapplication";

var destination_name = "mailgun";




try {

  var dest = $.net.http.readDestination(destination_package, destination_name);

  var client = new $.net.http.Client();


  var req = new $.web.WebRequest($.net.http.POST, "/messages");

  req.headers.set('Content-Type', encodeURIComponent("application/x-www-form-urlencoded"));


  req.parameters.set("domain","<Your MailGun subdomain>");

  req.parameters.set("from","me@<Your MailGun subdomain>");

  req.parameters.set("to","<Your email address>");

  req.parameters.set("subject","Test subject");

  req.parameters.set("text","Test text");


  client.request(req, dest);

  var response = client.getResponse();



  $.response.contentType = "text/html";


  $.response.status = $.net.http.OK;

  $.response.setBody(response.body.asString());

} catch (e) {

  $.response.contentType = "text/plain";

  $.response.setBody("Error: " + e.message);

}


Note: Enter your MailGun subdomain as the "domain" parameter and as part of the "from" mail address, and enter your own mail address in the "to" parameter.


Activate your XS Service file.


Step 3: Open the MailGun API URL “” in your browser and export the site certificate as a Base-64 encoded X.509 file locally on your PC.

Note: Refer to your browser documentation on this step.

Step 4: Open “SAP HANA XS Administration” tool from “HANA XS Applications” node of SAP HANA Cloud Platform cockpit and create a trust store “MailGun” via TRUST MANAGER. Browse and import the MailGun certificate from your local PC:




Step 5: Browse the XS APPLICATIONS tree and edit your MailGun destination: assign the MailGun trust store you created, set the user “api” and your account API key from prerequisite 1, and save your destination.


Step 6: Launch your application from SAP HANA Cloud Cockpit adding “mailgun.xsjs” as a suffix:


You will see output like this:

{ "message": "Queued. Thank you.", "id": "<20140115151444.9049.23372@<Your MailGun subdomain>>" }

Step 7: Open your mailbox and see the mail.



Recently, SAP HANA Cloud Platform Tools released a new feature for the Eclipse IDE (Kepler) that allows you to import sample SAP HANA XS applications into your account on the SAP HANA Cloud Platform trial landscape. Importing the sample applications is quite easy - a few mouse clicks in a standard Eclipse import wizard, and you are ready to use them. This gives you the opportunity to play around with fully featured SAP HANA XS applications in no time. The provided sample applications show different features of SAP HANA XS.

Here are the sample applications that you may try:

  • EPM Sample - a sample application that shows you how to use SAP HANA's analytic power via modeled views and XSJS scripts, and expose it to the end user via an SAPUI5 user interface. You may find info on how to build this application from scratch in the blog 8 Easy Steps to Develop an XS application on the SAP HANA Cloud Platform.
  • SHINE - SAP HANA Interactive Education SPS6, is a demo application that makes it easy to learn how to build native SAP HANA applications. It is complete with sample data and design-time developer objects for the application's database tables, data views, stored procedures, OData, and user interface.





1. Importing a sample XS application

  • In Eclipse IDE open menu File -> Import...
  • [Steps 1,2] Select SAP HANA Content -> Sample Applications, and press Next button
  • [Step 3] Select the sample application for import, and press Next button
  • [Steps 4,5] Select the system for import, and press Next button
  • [Steps 6,7] Select the package for import, and press Finish button
  • [Step 8] Observe the import job status until it states Completed Successful


After refreshing your package you should see the imported application.



2. Using sample XS applications

The "Sample Applications for HANA Cloud Platform" plugin provides a few sample XS applications. The examples below use a sample account named p123456trial and a sample trial instance named sample. With this setup, <your_package> is replaced with p123456trial.sample.


2.1. EPM Sample

Here are the steps to try it:

  • Import EPM SAMPLE application (see point 1)
  • Grant <your_package>.epm_sample.roles::model_access role to the SCN user who will access EPM SAMPLE. Using SQL console execute:

CALL "HCP"."HCP_GRANT_ROLE_TO_USER"('<your_package>.epm_sample.roles::model_access', '<SCN user name>')



CALL "HCP"."HCP_GRANT_ROLE_TO_USER"('p123456trial.sample.epm_sample.roles::model_access', 'p789012')
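The fully qualified role name in the call above is simply your package prefix plus the role path. Composing it can be sketched like this (an illustrative shell snippet; the values match the sample setup described earlier):

```shell
# Build the EPM Sample role name from the account + instance package prefix.
YOUR_PACKAGE="p123456trial.sample"   # replace with your own account and instance
ROLE="${YOUR_PACKAGE}.epm_sample.roles::model_access"
echo "$ROLE"   # prints: p123456trial.sample.epm_sample.roles::model_access
```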


Here is how the EPM SAMPLE application looks:




2.2. SHINE

Here are the steps to try it:

  • Import SHINE application (see point 1)
  • Grant the <your_package> role to the SCN user who will access SHINE. Using SQL console execute:

CALL "HCP"."HCP_GRANT_ROLE_TO_USER"('<your_package>', '<SCN user name>')



CALL "HCP"."HCP_GRANT_ROLE_TO_USER"('', 'p789012')


Here is how the SHINE application's entry screen looks:


The SHINE sample application also has an admin part which allows you to generate data, synonyms, etc. (for more information see: SAP HANA Interactive Education (SHINE) SPS7). If you want to use the currency conversion in SHINE, you have to generate synonyms. Here are the steps:

  • Grant admin role to the SCN user who will access the admin part (e.g. your SCN user). Using SQL console execute:

    CALL "HCP"."HCP_GRANT_ROLE_TO_USER"('<your_package>', '<SCN user name>')



    CALL "HCP"."HCP_GRANT_ROLE_TO_USER"('', 'p789012')

  • [Step 1] Open the admin part of the SHINE application

<shine URL from cockpit>/shine/admin/ui/WebContent/admin.html


  • [Step 2,3] Check the "Create Synonyms" box, and press the Execute button


Now you may enjoy the SHINE report graphics.



Congratulations! Having done the steps above, you now have at least one fully functional XS application in your developer account on the SAP HANA Cloud Platform trial landscape.



Dobrinka Stefanova



Blog 8 Easy Steps to Develop an XS application on the SAP HANA Cloud Platform


By Martina Galabova, Valeri Sedevchev, Petra Lazarova and Daniel Vladinov

Here’s the deal:
You have a great application in the cloud that makes it possible to showcase your products or services to thousands of users on their mobile devices (put any other scenario you might have in mind here). Yet, the all-important data that feeds this cloud application (e.g. your public product catalog) is still maintained in your on-premise ERP backend system. You want to replicate only a subset of the data and, once such secure replication is set up initially, have it running automatically and in real time. (This way, all your users get fresh information about your latest and greatest products.)

So now what?

Easy-peasy: The solution is to replicate the necessary backend data to your productive SAP HANA instance on SAP HANA Cloud Platform and use the data in your application.

As you may have already guessed, in this blog you will learn how to get the data you need, when you need it, and put it where you need it! Or, in simple words (for developers only): how to replicate your data from an on-premise system to your productive SAP HANA instance in the cloud via SLT (i.e. SAP Landscape Transformation Replication Server).

The following diagram illustrates the process of replicating data on SAP HANA Cloud Platform:


The situation:

What you have:

  • On-premise ERP system that holds your data - let’s call it Source;
  • Productive SAP HANA instance - let’s call it Target system, available in your SAP HANA Cloud Platform account;
  • SAP HANA Cloud Platform SDK: it brings the needed DB tunnel tool;
  • SLT system installed as on-premise to replicate the data from Source to Target system via the DB tunnel;
  • SAP HANA Studio to configure and manage the Target schema, users, roles and privileges.


What you want:

  • SAP HANA XS application that consumes and processes the backend ERP data
  • Near real-time data availability


What to do:

  1. Set the Users.
  2. Configure the Replication.
  3. Go and Replicate.
  4. Enjoy!




You need to have:

Let’s start!

I. Set the Users

Desired Result: Getting a replication user with all required permissions that you will need for the actual replication.


The whole procedure of setting up the users involves three types of users:

  • Your SCN user, which you need to open the DB connectivity tunnel (for example, p1234567)
  • Once the tunnel is open, it returns an SAP HANA DB user that grants you access to the dedicated SAP HANA instance on SAP HANA Cloud Platform via your SAP HANA Studio (it has the same name as your SCN user; in our example, p1234567)
  • In the SAP HANA Studio, you create a third user (<DR_ADM_USER>) that you need for the replication configuration itself.



  1. Open a DB tunnel for a secure connection from your local machine to SAP HANA Cloud Platform. To do this, use your SCN user and enter the command neo open-db-tunnel. Then enter the required information, such as:
    • Landscape host
    • Your account
    • Your landscape account User
    • Target HANA database name – you will get this name once you order your productive HANA in your landscape.
      neo open-db-tunnel -i <target_hanadb_name> -h -a <my_account> -u <my_SCN_user>
      For more information, see the Documentation.

      RESULT: The DB tunnel makes your productive HANA system locally accessible via the properties displayed in the console output: host name (usually localhost), JDBC URL, instance number, the newly created database user, and its initial password.
  2. Open your SAP HANA Studio and add a new system, using the credentials provided by the tunnel. When prompted, change the initial password to a permanent one.  Hint: this is the automatically created HANA database user with a name equal to your SCN user.
    NOTE: Remember the specified permanent password – you’ll need it later to configure the Target credentials for replication.
  3. In the SAP HANA Studio, create a new user that will be used as SLT administrator. This automatically creates a new schema with the same name. Provide the name and initial password for it. In our example, we will use the name <DR_ADM_USER>. Note that the initial password will be changed in just a moment (step 5 below)!
  4. Configure the following settings for  <DR_ADM_USER> :
    1. In the Granted Roles tab, add HCP_SYSTEM role.
    2. In the Object privilege tab, add REPOSITORY_REST SQL object with EXECUTE privilege.
    3. If SYS_REPL schema already exists, the EXECUTE, SELECT, INSERT, UPDATE and DELETE privileges should be grantable to others. SYS_REPL is a system schema used by SLT to store replication configuration data. Skip this step if SYS_REPL schema is not present!
    4. Save the <DR_ADM_USER> user settings.
  5. In the SAP HANA Studio, right-click your system (in the Systems tab) and select the “Add Additional User…” option to log on with <DR_ADM_USER>. SAP HANA Studio will prompt you in a pop-up window to change the initial password.
    A second entry will appear in the Systems tab, so that you can log off from the p1234567 login session.
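For orientation, the SAP HANA Studio steps 3 and 4 above roughly correspond to the following SQL console statements. This is only a sketch with a placeholder password; depending on your HANA revision, repository roles may need to be granted via GRANT_ACTIVATED_ROLE instead of a plain GRANT.

```sql
-- Sketch only: SQL equivalent of the SAP HANA Studio clicks above.
CREATE USER DR_ADM_USER PASSWORD "Initial1Password";   -- step 3: user + schema of the same name
GRANT HCP_SYSTEM TO DR_ADM_USER;                       -- step 4.1: Granted Roles tab
GRANT EXECUTE ON SYS.REPOSITORY_REST TO DR_ADM_USER;   -- step 4.2: Object privilege tab
```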


II. Configure the Replication

Desired Result: A secure connection established from your SLT system to the dedicated SAP HANA instance on SAP HANA Cloud Platform, and the replication configured, so that everything is set and ready to go.


To establish a secure connection from your SLT system to your dedicated SAP HANA instance on SAP HANA Cloud Platform, you have to open a DB tunnel from your on-premise SLT machine. The replication configuration you create afterwards establishes the connection between your on-premise ERP system and your dedicated HANA instance through this tunnel.


  1. Log in to your SLT machine.
  2. Download and install the latest version of the SAP HANA Cloud Platform SDK (Java Web) as described on the Installing the SDK page.
  3. Open a DB tunnel with your SCN user to your target schema. The tunnel allows SQL statements only for the owner of that schema to pass through it. To keep the tunnel permanently alive, you should add the additional parameter --timeout 0
    neo open-db-tunnel -h <landscape_host> -a <my_account> -u <my_SCN_user> -i <target_hanadb_name> --timeout 0
    For more information, see the Documentation.
  4. Create Replication Configuration
    This document applies to SP05 of the DMIS add-on. In previous versions, all the data is populated on one screen.
    1. Log in to your SLT system and open transaction LTR.
    2. In the opened browser window, log on and open the New Configuration wizard.
    3. Enter a replication name. The wizard will create a new schema with that name in the Target HANA box in your SAP HANA Cloud Platform account; this schema holds your replication data.
    4. Choose Next.
    5. Specify Source System: Choose your RFC destination connecting to the Source (ERP) system.
      NOTE: By default, only one replication is allowed per source system.  If you want to create multiple replications (that is, replication to multiple SAP HANA systems) from your source system, select the “Allow multiple usage” checkbox.
    6. Specify Target System: In the System Data form, enter the following:
      • Administration User Name:<DR_ADM_USER>
      • Password:  the password for <DR_ADM_USER>
      • Host Name: localhost – the tunnel enables connectivity to the cloud via requests to localhost
      • Instance Number: the instance number returned by the DB tunnel you opened from your SLT machine
    7. Choose Next.
    8. Continue the configuration as described in the Documentation.

III. Go and Replicate

Desired Result: Sitting back while the data replicates seamlessly and securely, and enjoying your app anytime and from anywhere.



  1. Open your SAP HANA Studio and switch to the Modeler Perspective.
  2. Open Quick Launch with your <DR_ADM_USER> user.
    Your system and username are listed there. You can switch the system by choosing the Select System… button.
  3. Open the Data Provisioning… editor from the Data pane in the bottom-middle part of the screen.
  4. Wait for your system tables (DD0xx) to be replicated. You can press the refresh button to see the latest status of the replication tables. The tables will go through a few different actions and statuses. When they reach status “In Process” with action “Replicate” – an indication that they are already (and will stay) replicated – you are ready to proceed with the replication of the tables needed for your cloud application.
  5. Choose the Replicate button and select the tables you want to replicate. For more details you can check this HANA Academy video.
  6. Enjoy!



So What Did You Just Do?

You can now quickly and easily replicate your on-premise data in SAP HANA Cloud Platform, where you can develop apps using this data.

Basically, you have conquered your own small data cloud in the big SAP HANA Cloud Platform family!



This blog offers a comprehensive and easy-to-use tutorial for developing a hybrid application on SAP HANA Cloud Platform that integrates with an on-premise ABAP system using JCo and RFC. It is a simple example of how SAP HANA Cloud Platform can be used as an extension platform for existing SAP systems running on the customer side.


The tutorial below guides you through an application scenario that covers:

  • how to develop a simple cloud application using JCo to call an on-premise ABAP system,
  • how to use Apache CXF and the Spring Framework to define RESTful services,
  • how to configure and test connectivity of the application on the SAP HANA Cloud Platform local runtime,
  • how to set up and configure the SAP HANA Cloud Connector,
  • how to configure and test connectivity of the application on SAP HANA Cloud Platform.


List of technologies used:


  • JCo API and destinations
  • SAP HANA Cloud Connector
  • Apache CXF and Spring Framework
  • SAP UI5
  • SAP HANA Cloud Platform local runtime for testing


1. Development Scenario Overview


The scenario implements a simple flight booking application based on the well-known SFLIGHT model, available as default sample content in ABAP systems [1]. The overall landscape of the sample application is shown in the following picture:




The client consists of an HTML UI, based on SAP UI5, and implements the UI following the Model-View-Controller paradigm.


The client interacts with the Web application on SAP HANA Cloud Platform via a set of REST services, to request a list of flights, to get flight details and to book a flight.


The Web application provides these REST services using Apache CXF and the Spring Framework.


The REST services in the application delegate the actual calls to the ABAP system to a JCoFlightProvider class which in turn uses the Java Connector API to call an on-premise ABAP system directly using RFC.


For the on-demand to on-premise connectivity, an SAP HANA Cloud Connector is running in the on-premise landscape. The ABAP system and the BAPIs needed by the SFlight application are configured in it as accessible resources.


In the ABAP system, the sample data of the SFLIGHT model has been created, and the authentication to the SFLIGHT BAPIs is done with a service user in this example.


If you would like to see the application running, start it in a browser (Chrome, Firefox, Safari) with the following URL:


2. Setting up the Environment


The sources of the SFlight sample application are available on GitHub, as Extensions-005 of the personslist git repository, which holds a collection of similar HCP sample applications: [0].


In order to run the application in your landscape, an important prerequisite is an ABAP system, version 4.6C or newer, with the SFLIGHT model and licensed named ABAP users available. How to set up the SFLIGHT model in an ABAP system (e.g. how to generate the flight sample data) is described here [1].


2.1 Setting up Local HCP Development Environment


Once this prerequisite is fulfilled, you can set up and run the application in your environment. In this tutorial, only the connectivity-specific steps are described in detail; the general steps for setting up the SAP HANA Cloud Platform environment and importing the SFlight sample project into your Eclipse workspace are only briefly summarized here. A more detailed step-by-step guide for these steps is provided in the ReadMe.txt file that you can find in the root directory of the Extensions-005 branch in the GitHub project [0].


  1. Register for a free developer account on HCP
  2. Install Java 7 JDK
  3. Install Eclipse for Java EE Developers
  4. Install the HCP Eclipse tools
  5. Download the HCP SDK
  6. Set up the HCP runtime environment in Eclipse
  7. Install Egit
  8. Synchronize the cloud-personslist-scenario git repository from [0], and create a local branch for the Extensions-005 branch. This branch contains the sflight-web Maven project.
  9. Import the sflight-web Maven project into your Eclipse workspace.
  10. Run a Maven "clean install" build of the project in Eclipse and make sure the project builds without errors.


You should now have a local HCP development environment set up, and the sflight-web project imported and built successfully in your Eclipse workspace.


2.2 Installing the Cloud Connector and Establishing an SSL Tunnel to Your HCP Account


To make RFC calls from the cloud over SSL, you need to install the SAP HANA Cloud Connector in your on-premise landscape. You can download the cloud connector and install it as described here [2].


Once you have installed it, you can connect it to your HCP account by following the steps described here [3]. After this is done, you have established a persistent SSL tunnel from your environment to your HCP account – but don't worry, none of your internal systems is yet accessible from cloud applications. You have full control over the systems and resources you want to expose: you need to configure, in the cloud connector, each system and resource that should be accessible to applications of your HCP account. We will do this in a later step of this tutorial.


3. SFlight User Interface


The SFlight application uses SAP UI5 as the JavaScript library for its user interface. Following the Model-View-Controller paradigm of SAP UI5, the UI is built from the two files sflight.view.js and sflight.controller.js, which you can find under /src/main/webapp/sflight-web in the Eclipse project.


A screenshot of the UI is shown below. On the right-hand side, you find the function names within the sflight.view.js file that implement the respective panel.


The single panels interact with the sflight.controller.js file in the following ways:

  • createFlightSearchPanel(...) function: calls the searchFlights(...) function of the controller to retrieve a list of flights from specified departure and arrival airports.
  • createFlightListPanel(...) function: calls the getFlightDetails(...) function of the controller to retrieve the details of the selected flight.
  • createFlightDetailsPanel(...) function:  calls the bookFlight(...) function of the controller to book the selected flight. It also calls the searchFlights(...) function of the controller again to retrieve an updated flight list.


The UI controller interacts with the SFlight Web application running on the server using REST services. Each of these REST services replies with a JSON response, so the controller and view use sap.ui.model.json.JSONModel to bind the UI to the JSON data. The REST services called in the sflight.controller.js file are the following:

  • GET: /rest/v1/flight.svc/cities
    Returns a JSON array of cities in which airports are located.

  • GET: /rest/v1/flight.svc/flights/{cityFrom}/{cityTo}
    Returns a JSON array of flights with departure airport {cityFrom} and arrival airport {cityTo}.

  • GET: /rest/v1/flight.svc/flight/{carrier}/{connNumber}/{dateOfFlight}
    Returns a JSON array with the details of the specified flight.

  • POST: /rest/v1/flight.svc/booking
    Books the flight specified in the request body and returns a JSON object with the booking ID.
    XMLHttpRequest is used directly to trigger the POST request to the server.

4. Flight REST Services in the Web Application


The Web application uses Apache CXF and the Spring Framework to provide the necessary REST services. In order to use these libraries, the following dependencies are defined in the pom.xml of the application:


<!-- Apache CXF -->

<!-- Spring framework -->

How to use CXF in combination with Spring is described in more detail here [4]. In short, it provides a simple framework for defining REST services in POJOs, taking care of all the boilerplate code for receiving and sending HTTP requests for you. It thus allows you to focus on the business logic and makes development of REST services an easy task.


To understand how the REST services of the sample application are implemented, you need to look into the springrest-context.xml file located under the /src/main/webapp/WEB-INF folder in the Eclipse project. There, a Spring bean is defined with the name flightService. This bean is implemented by the Java class FlightService. Using the CXF and Spring annotations, the FlightService class is a simple POJO that provides the GET and POST service endpoints listed above in section 3. A small code fragment showing how the REST service is defined is shown below:




@Path("/flight.svc")
@Produces({ "application/json" })
public class FlightService {

    @GET
    @Path("/flights/{cityFrom}/{cityTo}")
    public String getFlightList(@Context HttpServletRequest req, @PathParam("cityFrom") String cityFrom,
            @PathParam("cityTo") String cityTo) {
        // delegates the actual backend call to the FlightProvider, see section 5
        ...
    }
}

5. Fetching Flight Data and Triggering Bookings in the ABAP System


The FlightService class delegates all calls to a FlightProvider object, which in turn does the actual call to the backend system. For this, an interface is used that defines the Java methods to be performed against the SFlight backend service.


Right now, there is just one implementation of the FlightProvider interface: the JCoFlightProvider class.
We might, for example, add an ODataFlightProvider to this sample application in the future.
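The delegation pattern can be pictured as a minimal interface/implementation pair. The method names and the stub class below are illustrative, not the project's actual signatures:

```java
// A provider interface decouples the REST layer from the backend transport.
interface FlightProvider {
    String getCities();
    String getFlightList(String cityFrom, String cityTo);
    String bookFlight(String bookingJson);
}

// Stand-in implementation; the real JCoFlightProvider calls the ABAP system via RFC.
class StubFlightProvider implements FlightProvider {
    public String getCities() { return "[\"Frankfurt\",\"New York\"]"; }
    public String getFlightList(String cityFrom, String cityTo) { return "[]"; }
    public String bookFlight(String bookingJson) { return "{\"bookingId\":\"1\"}"; }
}

public class ProviderDemo {
    public static void main(String[] args) {
        // The REST layer only sees the interface, so the transport can be swapped.
        FlightProvider provider = new StubFlightProvider();
        System.out.println(provider.getCities());
    }
}
```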


The JCoFlightProvider uses the Java Connector API to make RFC calls directly against an ABAP system. Of course, all the communication is encrypted using the cloud connector and its SSL tunnel. You can use JCo in exactly the same way as you might know it from SAP NetWeaver Java. A tutorial on how to work with JCo can be found here [5]. JCo uses RFC destinations, which encapsulate the configuration of the ABAP system and allow the actual system configuration to be separated from the application code. The JCoFlightProvider makes use of an RFC destination called dest_sflight. This destination will be configured in section 6.


Note that the JCoFlightProvider class not only fetches data from the ABAP system, but also writes a flight booking back to the ABAP system. The BAPIs called by the application on the ABAP system are:



To provide the result set of the JCo calls in JSON format, a utility class is used to transform JCo tables and structures into JSON objects. This eases the consumption of the response in the SAP UI5 client.
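The utility class itself is not shown in this tutorial, but the transformation it performs can be sketched with plain Java collections standing in for JCo tables. The class and method names here are illustrative, not the project's actual API; a real implementation would iterate a JCoTable instead of a List of Maps:

```java
import java.util.List;
import java.util.Map;

// Sketch: turn tabular rows (as a JCo table exposes them) into a JSON array string.
public class JsonUtil {

    public static String toJson(List<Map<String, Object>> rows) {
        StringBuilder sb = new StringBuilder("[");
        for (int i = 0; i < rows.size(); i++) {
            if (i > 0) sb.append(",");
            sb.append("{");
            int j = 0;
            for (Map.Entry<String, Object> e : rows.get(i).entrySet()) {
                if (j++ > 0) sb.append(",");
                sb.append("\"").append(e.getKey()).append("\":");
                Object v = e.getValue();
                if (v instanceof Number) {
                    sb.append(v);                           // numbers stay unquoted
                } else {
                    sb.append("\"")                          // everything else as string
                      .append(String.valueOf(v).replace("\"", "\\\""))
                      .append("\"");
                }
            }
            sb.append("}");
        }
        return sb.append("]").toString();
    }
}
```

Client-side, such a JSON array can then be fed directly into a sap.ui.model.json.JSONModel for UI binding.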


6. Configuring Destinations


You should now have a good understanding of how the application is structured and what the single layers do. The next step is to test the application, first on the local test server, and then in the cloud. In order to do this, the RFC destination used by the application needs to be configured.


6.1 Configure Destination in Local Test Server


As a first step, deploy the SFlight application on your local SAP HANA Cloud Platform test server:

  1. In Eclipse Project Explorer view select project node sflight-web.
  2. On the selected project node open context menu and choose Run As > Run on Server.
  3. The Run On Server window opens. Make sure that the Manually define a new server option is selected.
  4. Select SAP > SAP HANA Cloud local runtime as server type
  5. Choose Finish.


You now need to configure a test user for your local server: double-click SAP > SAP HANA Cloud local runtime in the Servers view and navigate to the Users tab. Add a test user here and save the changes. This user can be used to log on to the local application.


Afterwards, the RFC destination used by the application needs to be configured:


  1. In the Eclipse Project Explorer, copy the destination dest_sflight.jcoDestination from the folder sflight-web/src/main/webapp/destinations into the root directory of your local SAP HANA Cloud Platform test server, which you can find in the Eclipse Project Explorer tree under Servers > SAP HANA Cloud Platform local runtime-config.
  2. Adapt the configuration: open the copied destination dest_sflight.jcoDestination in Eclipse with a text editor and adapt the destination configuration to point to the locally accessible ABAP system you want to use for the sample application. Note that you need licensed named ABAP users according to the SAP licensing model for your scenario.
    The JCo properties are explained here:
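For orientation, a destination file for a directly reachable ABAP system typically contains properties along these lines. All values below are placeholders taken as assumptions; check them against your own landscape:

```properties
# dest_sflight.jcoDestination (sketch with placeholder values)
jco.client.ashost=<host name of your ABAP system>
jco.client.sysnr=00
jco.client.client=800
jco.client.user=<licensed named ABAP user>
jco.client.passwd=<password>
jco.client.lang=EN
```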


Now, you should be able to test the SFlight application in your browser using the URL http://localhost:<port_of_local_server>/sflight-web. The application should successfully call the ABAP system and display data when you select cities for which flights exist in the ABAP system (e.g. choose Departure = Frankfurt, Arrival = New York).


6.2 Configure Destination in the Cloud


First, publish the SFlight application to your free Developer Account in the cloud:

  1. In Eclipse Project Explorer select sflight-web project node.
  2. Open context menu Run As > Run on Server.
  3. Make sure that 'Manually define a new server' option is selected.
  4. Change default Server's host name '' to ''
  5. Choose Next
  6. Enter an application name, e.g. sflight (only lowercase letters (a-z) and numbers are allowed)
  7. Then enter your SAP HANA Cloud Platform developer account name (<p-user>trial), your (SCN) user name (<p-user>) and your password.
  8. Choose Finish.


After about a minute, the SFlight application, running on SAP HANA Cloud Platform, is launched in the external Web browser you configured before.


Now, configure the RFC destination for the SFlight application in the cloud:

  1. In the Eclipse Servers view, open the server UI by double-clicking the server.
  2. Navigate into the Connectivity tab.
  3. Click on the Import existing Destination button and browse to the destination located in the sflight-web/src/main/webapp/destinations folder of the Eclipse project.
  4. Adapt the destination configuration to point to your on-premise ABAP system. Note that you need licensed named ABAP users according to the SAP licensing model for your scenario.
    The JCo properties are explained in detail here:
    The properties file is pre-filled with following values:
    These values are relevant for the configuration of the ABAP system in the SAP HANA Cloud Connector, as described in section 7 below. It's recommended to leave these two properties unchanged, so that they fit the configuration used in the next section.
  5. Save the application. This will deploy the configuration into the cloud.


To run the application in the cloud, one more step is missing: the ABAP system and the BAPIs to be made accessible to the SFlight application need to be configured in the Cloud Connector.


7. Configure ABAP System and BAPIs in the Cloud Connector


At this stage, you should already have installed a Cloud Connector and established a connection with your free developer account (see section 2.2 above).

Now you need to configure the ABAP system and BAPIs used by the SFlight application in the Cloud Connector. For this, go into the Cloud Connector administrator UI and execute the steps below:

  1. Navigate into the Access Control view of the Cloud Connector. Under Mapping Virtual to Internal System click the Add... button.
  2. Enter and save the following values in the Add System Mapping dialog:
          Virtual Host =
          Virtual Port = sapmsFLI
          Internal Host = <host name of the ABAP system>
          Internal Port = <port of the ABAP system>
          Protocol = RFC
          Backend Type = ABAP System
  3. Select the newly created entry in the Mapping Virtual to Internal System table.
  4. In the Resources... table at the bottom, add the BAPIs the SFlight application shall get access to.
    Enter the following 3 entries using the Add... button:
    1. Function Name: BAPI_SBOOK 
      Naming Policy: Prefix
    2. Function Name: BAPI_SFLIGHT
      Naming Policy: Prefix
      Naming Policy: Exact Name


Now you should be able to run the SFlight application in the cloud. The application should fetch the data from your on-premise ABAP system and display the data as in the local test case.


8. Summary


The SFlight sample application and this tutorial demonstrate how you can write an application on SAP HANA Cloud Platform that integrates directly with on-premise ABAP systems by using the Cloud Connector and JCo/RFC. Typical use cases are scenarios where existing, potentially quite old, on-premise ABAP systems should be extended by modern cloud applications, without the need to touch or upgrade the ABAP systems themselves and without additional layers on top of the ABAP systems to make their BAPIs externally available. As the data is not stored in the cloud but fetched by the cloud application, data consistency is not undermined by the cloud scenario. By using the capabilities of the Cloud Connector, it is possible to restrict access to only those parts of the ABAP system that are actually needed by the cloud application. Thus, you can consider this approach a very secure way to extend an on-premise system into the cloud.


9. References


[0] SAP/cloud-personslist-scenario · GitHub

[1] Flight Data Application: Overview - Demo Example for Integration Technology

[2] SAP HANA Cloud Platform - Installing Cloud Connector 2.x

[3] SAP HANA Cloud Platform - Initial Configuration of Cloud Connector

[4] Apache CXF -- JAXRS Services Configuration

[5] SAP Java Connector

[6] SAP HANA Cloud Code Samples - Extensions of PersonsList Web Application

After reading Matthias Vachs' excellent blog post Using an open source based development infrastructure with SAP HANA Cloud Platform, I decided to play around with the tools mentioned in his post. The result is an SAP UI5 application available on GitHub that you can use as a starting point for your ventures into Continuous Integration.


In part one of this blog series, we will focus on getting the project up and running on our local HANA Cloud server; subsequent parts will deal with setting up an environment for Continuous Delivery.


The Project


To get started with continuous delivery on the SAP HANA Cloud Platform, we will start by cloning and building a simple SAP UI5 project: SAPUI5 for Maven, or UI5MVN. The project is meant to be used as a starting point for HANA Cloud development, and the focus has been on streamlining the development process with the help of best-of-breed open source tools.

  • Source versioning is done with GitHub
  • Maven is used as the dependency and build tool
  • The project is editor-agnostic, but it has been tested using Eclipse






  1. Make sure you have all prerequisites installed and set up.
  2. Clone the project from the git repository to a directory of your choice.
  3. git clone

  4. Change into the directory SAPUI5-for-maven
  5. Rename or copy the following files:
    • pom.xml.local to pom.xml
    • to
    • src/main/destinations/Gateway.local to src/main/destinations/Gateway
  6. Edit the property <> in pom.xml to point to your local SAP Hana Cloud Platform SDK
  7. Edit the following properties in
    • sdk.dir - to point to your local SAP Hana Cloud Platform SDK
    • account - your Hana Cloud Platform account
    • user - your Hana Cloud Platform user
    • password - Your Hana Cloud Platform password
  8. Edit your username and password in src/main/destinations/Gateway
  9. Build and install using maven

    mvn clean install

  10. Deploy to your local Hana Cloud Server

    mvn nwcloud:deploy-local

  11. Start your local Hana Cloud Server

    mvn nwcloud:start-local

  12. Browse to the start page



After completing these steps, you should have a simple SAP UI5 application up and running. The application connects to the SAP Gateway demo system and retrieves a list of products; by clicking on a product you get some more details about it.


In part two we will focus on setting up the development environment, using Eclipse and the Eclipse file sync plugin.


Consider a scenario where an SAP UI5 application has multiple SAP UI5 views, each view having independent functionality, and each view needs to be exposed as a separate widget for use in SAP HANA Cloud Portal. In such a scenario, multiple spec files can be included in the project, along with multiple start-point HTML files.


Below are two sample spec files:


<?xml version='1.0' encoding='UTF-8'?>
<Module>
  <ModulePrefs title="Trial Widgets" height="250" description='A widget for validating' thumbnail='<--Widget icon image path-->'>
    <Require feature="sap-xhrwrapper" />
  </ModulePrefs>
  <Content view="authoring, consumption, mobile, preview" href="./index1.html"/>
</Module>


<?xml version='1.0' encoding='UTF-8'?>
<Module>
  <ModulePrefs title="Trial Widgets new splitter" height="250" description='A widget for validating' thumbnail='<--Widget icon image path-->'>
    <Require feature="sap-xhrwrapper" />
  </ModulePrefs>
  <Content view="authoring, consumption, mobile, preview" href="./index2.html"/>
</Module>

Note that in both XML files we have a separate "href" path in the Content tag.

Both specification files will be deployed as two different widgets in SAP HANA Cloud Portal; the 'href' attribute separates them by pointing to a different HTML file for each of them.

Each HTML file can be configured to load only the desired view. This makes each widget load only its particular HTML file with its own initialized view.


Here are sample index.html files:



<html>
<head>
  <meta http-equiv="X-UA-Compatible" content="IE=edge">
  <!-- add sap.ui.table,sap.ui.ux3 and/or other libraries to 'data-sap-ui-libs' if required -->
  <script src="resources/sap-ui-core.js" id="sap-ui-bootstrap" data-sap-ui-libs="sap.ui.commons"></script>
  <script>var view = sap.ui.view({id:"idwidgetView1", viewName:"trialwidgets.widgetView", type:sap.ui.core.mvc.ViewType.JS}); view.placeAt("content");</script>
</head>
<body class="sapUiBody" role="application">
  <div id="content"></div>
</body>
</html>


<html>
<head>
  <meta http-equiv="X-UA-Compatible" content="IE=edge">
  <!-- add sap.ui.table,sap.ui.ux3 and/or other libraries to 'data-sap-ui-libs' if required -->
  <script src="resources/sap-ui-core.js" id="sap-ui-bootstrap" data-sap-ui-libs="sap.ui.commons"></script>
  <script>var view = sap.ui.view({id:"idrteWidgetView", viewName:"trialwidgets.rteViewWidget", type:sap.ui.core.mvc.ViewType.JS}); view.placeAt("content");</script>
</head>
<body class="sapUiBody" role="application">
  <div id="content"></div>
</body>
</html>

Both index files refer to their own independent views in their <script> tags. This makes each index file load only the mentioned SAP UI5 view.

These index files are later referenced in the 'href' attribute of the specification XML files, making them separate widgets within the same SAP UI5 application.


This blog elaborates on issues related to the web domain under which cloud applications are visible on the internet. Unlike the problem discussed in part 1, this one is more configuration-related than a programming technique, which is why it reads more like a configuration discussion than a coding one – hopefully without being boring. The topic is quite important for cloud application security and deserves attention, I believe.

Using your own web domain, instead of the common one provided by the cloud (infrastructure or platform) service provider, has multiple benefits. Of course, the branding and marketing aspects are important for B2C applications, and search engine indexing does matter. Here, however, I will focus primarily on the security implications of having your own domain versus sharing a web domain with other cloud tenants.

The key is that web domains offer isolation, which is a major pillar in the web security policies of various web technologies, protocols, standards, and products. These policies originate from the old-school “perimeter security” approach, relying simply on the facts that a corporation or organization:

  • hosts its data, and the software processing it, on its own premises
  • has its own web domain
  • has personnel responsible for protecting this trusted area from the outside world (the internet)

The introduction of cloud and on-demand software services has broken most of the perimeter security assumptions in IT, while the security policies related to web domains are here to stay and are very difficult to evolve due to compatibility risks.

More concretely, the challenge related to web domains is that application providers sharing the same cloud platform or infrastructure (cloud tenants) usually do not trust each other, while sharing a web domain implies trust. If cloud tenants share a web domain, the trust boundaries are broken. The resulting risk can hardly be assessed, as it is hard to enumerate all technologies, protocols, standards, and products whose security policies rely on web domains as an isolation mechanism.


Let's look at some examples of how web security policies and standards are affected by domain sharing.

To make the examples more concrete, I will illustrate them with HANA Cloud Platform (HCP), where by default applications are exposed on the internet under a common domain.

Server certificate authentication

Recent browser versions provide convenient and user-friendly authentication of the server certificate during HTTPS connection establishment, which is a basic level of protection for the end user. In short, the browser will notify the user when the identity of the server to which the connection is established is not the one the user requested.

However, when it comes to a cloud application, the question that matters is what identity the server certificate carries – does it identify the exact application or the whole cloud domain? Technically, if the server certificate is issued to a generic (wildcard) entity, then during SSL connection establishment the client identifies only the complete domain, instead of a particular server. That is, the server authentication done by the browser will still succeed even if the user's request is redirected to a different application in the same domain.


Same-origin policy

The same-origin policy (SOP) uses the host name, which is strong enough to distinguish and isolate applications from each other. However, in order to enable some technologies and programming techniques, SOP has relaxations. It allows a script in a web page to set the document.domain parameter to a super domain, i.e. a suffix of the host name. If two pages explicitly and mutually set their respective document.domain parameters to the same value, access is granted (for details see [1], [2], [3], and [4]).

Suppose that application genuine1 wants to grant application genuine2 access and avoid the SOP restrictions (e.g. in order to enable cross-application UI mash-ups or the like). To achieve this, the document.domain property is changed in both applications to their shared parent domain. The problem now is that a malicious application can do the same, thereby circumventing SOP and executing attacks like XSS or CSRF against genuine1 and genuine2.


CORS (Cross Origin Resource Sharing)

CORS is a much better approach to relaxing SOP. It enables applications to define trust across origins (web domains) by allowing browsers to make cross-origin requests (for details see [5], [6]).

The allowed-origins configuration, however, should be done very cautiously, with maximum restriction to only the trusted entities. For example, trusting a whole common domain with a wildcard will be vulnerable; instead, the exact application origin should be used.
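The "exact origin instead of wildcard" rule can be sketched as a server-side check that compares the incoming Origin header against an explicit whitelist. The class name and origin values below are illustrative, not taken from any particular product:

```java
import java.util.Set;

// Sketch: answer CORS requests only for explicitly trusted origins.
// Never answer with a wildcard like "*" or trust a whole shared domain.
public class CorsPolicy {

    private static final Set<String> TRUSTED_ORIGINS = Set.of(
            "https://genuine1.example.com",   // illustrative exact origins
            "https://genuine2.example.com");

    // Returns the value to send as Access-Control-Allow-Origin, or null to deny
    // (in which case the header is simply omitted from the response).
    public static String allowOrigin(String requestOrigin) {
        return TRUSTED_ORIGINS.contains(requestOrigin) ? requestOrigin : null;
    }
}
```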


Cookies manipulation via cross-site cooking

Cross-site cooking is an exploit allowing an attacker to set or change cookies for an application in another domain (see [3], [7], [8], and [9]). The cookie management policy allows any subdomain of a given domain to set and receive cookies for that domain. For example, a malicious application can set cookies for the shared parent domain, which forces the user’s browser to send them with all subsequent requests to any subdomain of that domain. With this technique the malicious application can set or change sensitive cookies of other applications sharing the same domain.
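The domain-matching rule behind cross-site cooking can be sketched as follows (simplified from the RFC 6265 cookie rules; the helper and the domain names are illustrative):

```java
// Simplified RFC 6265 domain matching: a cookie with Domain=sharedcloud.example
// is sent to sharedcloud.example and to any subdomain such as tenant1.sharedcloud.example.
public class CookieMatch {

    public static boolean domainMatches(String host, String cookieDomain) {
        // exact match, or host is a subdomain of the cookie domain
        if (host.equals(cookieDomain)) return true;
        return host.endsWith("." + cookieDomain);
    }

    public static void main(String[] args) {
        // A cookie set for the shared parent domain reaches every tenant's subdomain:
        System.out.println(domainMatches("genuine.sharedcloud.example", "sharedcloud.example")); // true
        System.out.println(domainMatches("other.example.org", "sharedcloud.example"));           // false
    }
}
```

This is exactly why one malicious tenant under a shared domain can plant cookies that every sibling application will receive.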


Adobe cross-domain policy

Adobe also has a security policy based on web domains. The Adobe cross-domain trust configuration grants a web client, such as Adobe Flash Player or Adobe Reader, permission to handle data across different domains (see [3], [10]). If the trust is granted to a complete shared domain, one can hardly separate the trusted and untrusted entities served from it.


As you see, there are various ways trust arises between applications sharing the same web domain, which implies a variety of threats from a potentially malicious application with which you share that domain. As the list of examples here cannot be considered complete, it is very hard to assess the security risk of running an application in a shared domain with other untrusted applications. Therefore, the safe solution is to always run critical productive applications in their own web domain.













