The existential question that seems to offer only two equally feasible options.

 

The first time we face such a question in a distributed environment (typical in the context of cloud applications), we'll have a hard time finding the "right" answer... hmm, the second time as well? The actual problem is that, by definition, we cannot give a universal solution, but have to go through it case by case each time. Such scenarios could be transferring data between two legacy or third-party systems, between a legacy system and our custom extension, or even between the different extension applications that we could have. There could be different boundary conditions, depending mainly on the source and target systems' capabilities for external communication. Sometimes, an additional intermediate component could be required to cope with the integration between the systems. Hence, we need to investigate deeply the pros and cons of the given scenario, personas, and components, and to take concrete, conscious decisions on the architectural patterns to be used. We also have to consider aspects like the performance of direct synchronous calls between remote systems, the scalability of the source system itself, the lack of tailored interfaces for our brand-new use cases, preserving the security model (e.g. identity propagation), the need for a preliminary cache, and other non-functional requirements. We often reach the well-known situation (after taking the red pill) when "the choice has already been made, and now you have to understand it".

 

The perfect solution, no doubt, would be to avoid distributed landscapes altogether, hence to extend the existing business systems in place. Unfortunately, not many technology platforms provide a full-fledged development environment for "in-system programming" like Dirigible does. Not many have Dirigible built in or bundled, either... the world is not perfect... yet. If you have already eliminated all the easy-win options for your scenario, and the only one remaining is replication, the natural question is – let me guess – "How can Dirigible help me in this kind of situation as well?" Right? Well, let's see...

 

A simplified view of the replication scenario consists of two parties that have to transfer data to one another. Let's first answer the question: is it possible for the source system to be the active party, i.e. to use a "push" communication pattern? If the push pattern is possible, you would probably prefer it; so let's see what needs to be done on the cloud application side.

 

     I. Push

 

To implement the push case, we’ll do the following:

  1. Create an Entity Service in JavaScript, which will consume the message – parse the incoming data and store it in the target database table(s), according to the database schema layout of the cloud application.

 

Scenario Details:

 

1. Create a blank project

 

 


 

               2. Create Data Structures – for the Employees


 

sfrepl_employees.table

 

{
    "tableName":"SFREPL_EMPLOYEES",
    "columns":
    [
        {
            "name":"EMPLOYEE_ID",
            "type":"INTEGER",
            "length":"",
            "notNull":"true",
            "primaryKey":"true",
            "defaultValue":""
        },
        {
            "name":"EXTERNAL_ID",
            "type":"VARCHAR",
            "length":"128",
            "notNull":"true",
            "primaryKey":"false",
            "defaultValue":""
        },
        {
            "name":"FIRST_NAME",
            "type":"VARCHAR",
            "length":"40",
            "notNull":"false",
            "primaryKey":"false",
            "defaultValue":""
        },
        {
            "name":"LAST_NAME",
            "type":"VARCHAR",
            "length":"40",
            "notNull":"false",
            "primaryKey":"false",
            "defaultValue":""
        },
        {
            "name":"REPLICATION_ID",
            "type":"INTEGER",
            "length":"0",
            "notNull":"false",
            "primaryKey":"false",
            "defaultValue":""
        }
    ]
}

 

     3. Create an Entity Service
     4. Select "Entity Service on Table" and...
     5. ... select the "SFREPL_EMPLOYEES" table
     6. Give it a name, e.g. "employees_entity_service.js"
     7. Activate and Publish the project
     8. Find and select the service in the Workspace Explorer
     9. Take the "public" URL from the Preview

 


     10. Use the REST client of your choice to send a test POST request (a sample payload is shown after this list)

 

 


     11. You can check the actual insert via the SQL Console in the Database Perspective (e.g. SELECT * FROM SFREPL_EMPLOYEES)
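Regarding step 10: the exact payload format depends on the entity service that Dirigible generates. Assuming a one-to-one, lowercase mapping of the SFREPL_EMPLOYEES columns (the property names below are an assumption – check the generated employees_entity_service.js source for the actual ones), a test body could look like this:

{
    "employee_id": 1,
    "external_id": "EXT-0001",
    "first_name": "John",
    "last_name": "Doe",
    "replication_id": 0
}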

 

     II. Pull

 

If we cannot use the push communication pattern above, we have to make the cloud application the active party. This can be achieved by scheduling a background job, which triggers a service that "pulls" the needed data from the external/backend system at a given interval. Dirigible has a micro ESB built in, based on Apache Camel. It supports a limited number of use cases and EIPs compared to the ones supported by Camel in general. Fortunately, this use case is among the supported ones.

      1. Create an Integration Service (route), which will schedule a job by "cron" expression using Quartz underneath.
      2. Create Scripting Services, which will call an external OData endpoint using a Destination Configuration.
      3. The same as above – an Entity Service to store the incoming data in the target database table.
      4. Optionally, create an Entity Service for storing a replication sessions log, which will be useful in this case.

 

 

Scenario Details:

     1. Create a blank project and a table for the employees, following the steps above

     2. Create Data Structures – for the Employees and for the Replication Sessions

sfrepl_sessions.table

 

{
    "tableName":"SFREPL_SESSIONS",
    "columns":
    [
        {
            "name":"SESSION_ID",
            "type":"INTEGER",
            "length":"0",
            "notNull":"true",
            "primaryKey":"true",
            "defaultValue":""
        },
        {
            "name":"SESSION_TYPE",
            "type":"VARCHAR",
            "length":"64",
            "notNull":"true",
            "primaryKey":"false",
            "defaultValue":""
        },
        {
            "name":"SESSION_STARTED",
            "type":"TIMESTAMP",
            "length":"0",
            "notNull":"true",
            "primaryKey":"false",
            "defaultValue":""
        },
        {
            "name":"SESSION_FINISHED",
            "type":"TIMESTAMP",
            "length":"0",
            "notNull":"false",
            "primaryKey":"false",
            "defaultValue":""
        },
        {
            "name":"SESSION_STATUS",
            "type":"INTEGER",
            "length":"0",
            "notNull":"true",
            "primaryKey":"false",
            "defaultValue":""
        },
        {
            "name":"SESSION_DESC",
            "type":"VARCHAR",
            "length":"128",
            "notNull":"false",
            "primaryKey":"false",
            "defaultValue":""
        }
    ]
}

 

 

 

          3. Activate the project

          4. Create an Integration Service


The generated service should look like this:

 

<routes xmlns="http://camel.apache.org/schema/spring">
    <route id="sfrepl_users">
        <from uri="timer://sfrepl_users?period=10000&amp;repeatCount=10&amp;fixedRate=true" />
        <to uri="sfrepl_users"/>
    </route>
</routes>

 

It uses the standard timer functionality. For our case we prefer a Quartz job leveraging a cron expression. Hence, replace the source code with the following:

 

<routes xmlns="http://camel.apache.org/schema/spring">
    <route id="sfrepl_users">
        <from uri="quartz://sfrepl/users?cron=0+0+0+*+*+?" />
        <setHeader headerName="serviceName"><constant>/sfrepl/sfrepl_users.js</constant></setHeader>
        <to uri="bean:js?method=process"/>
    </route>
</routes>

 

Where:

* the expression "0+0+0+*+*+?" means the job will be triggered every day at midnight

* /sfrepl/sfrepl_users.js is the path to the service to be executed
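For reference, the "+" characters encode spaces in the URI, and the fields follow the Quartz order: seconds, minutes, hours, day of month, month, day of week. A few more examples:

0+0/30+*+*+*+?   – every 30 minutes
0+0+*/2+*+*+?    – every 2 hours
0+0+2+?+*+MON    – every Monday at 02:00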

 

          5. Let’s create the actual service now


 

The source code is as follows:

 

var systemLib = require('system');
var odata_dest = require('sfrepl/odata_dest');
var odata_call = require('sfrepl/odata_call');
var repl_manager = require('sfrepl/replication_manager');
var employees = require('sfrepl/replication_employees');

try {
    var replId = repl_manager.startReplication("Users");

    // counters declared up front, so the final log message works even if an error occurs early
    var inserted = 0;
    var updated = 0;
    var failed = 0;

    if (replId > 0) {
        try {
            var destinationProperties = odata_dest.getODataDest();

            var url = destinationProperties.get("URL");
            var user = destinationProperties.get("User");
            var password = destinationProperties.get("Password");

            var data = odata_call.callOData(url + "User?$select=userId,username,firstName,lastName", user, password);

            try {
                for (var i in data.d.results) {
                    if (data.d.results[i].userId !== null) {
                        var id;
                        if (employees.find_employeesEntity(data.d.results[i].userId) === null) {
                            id = employees.create_employees(
                                data.d.results[i].userId,
                                data.d.results[i].firstName,
                                data.d.results[i].lastName,
                                replId);
                            if (id !== null) {
                                inserted++;
                            } else {
                                failed++;
                                systemLib.println("Error on replicating entry: " + JSON.stringify(data.d.results[i]));
                            }
                        } else {
                            id = employees.update_employees(
                                data.d.results[i].userId,
                                data.d.results[i].firstName,
                                data.d.results[i].lastName,
                                replId);
                            if (id !== null) {
                                updated++;
                            } else {
                                failed++;
                                systemLib.println("Error on replicating entry: " + JSON.stringify(data.d.results[i]));
                            }
                        }
                    }
                }
            } catch (e) {
                systemLib.println("Error on replicating: " + e.message);
                repl_manager.failReplication(replId, e.message);
            } finally {
                // remove entries that were not touched by this replication run
                employees.purgeDeleted_employees(replId);
            }

        } catch (e) {
            systemLib.println("Error on getting destination parameters: " + e.message);
            repl_manager.failReplication(replId, e.message);
        } finally {
            var msg = "Replicated: " + (inserted + updated + failed) + ". Inserted: " + inserted + ". Updated: " + updated + ". Failed: " + failed + ".";
            systemLib.println("Replication ID: " + replId + " " + msg);
            repl_manager.finishReplication(replId, msg);
        }
    }

} catch (e) {
    systemLib.println("Error on checking active replication session: " + e.message);
}
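The sfrepl/replication_manager and sfrepl/replication_employees library modules referenced above are not shown in this post; replication_employees simply wraps the entity operations (find/create/update/purge) on SFREPL_EMPLOYEES. To make the flow easier to follow, here is a minimal sketch of the contract the main service expects from replication_manager – the function bodies below are placeholders only, a real implementation would persist the session data in the SFREPL_SESSIONS table:

var systemLib = require('system');

// Placeholder only: a real implementation would generate the ID from a sequence
// and guard against concurrently running sessions.
var nextId = 1;

exports.startReplication = function(sessionType) {
    // Insert a row into SFREPL_SESSIONS (SESSION_TYPE, SESSION_STARTED, SESSION_STATUS = running)
    // and return its SESSION_ID, or a value <= 0 if another session is still active.
    systemLib.println("Starting replication session of type: " + sessionType);
    return nextId++;
};

exports.finishReplication = function(replId, message) {
    // Update the row: set SESSION_FINISHED, SESSION_STATUS = finished, SESSION_DESC = message.
    systemLib.println("Finished replication " + replId + ": " + message);
};

exports.failReplication = function(replId, message) {
    // Update the row: set SESSION_FINISHED, SESSION_STATUS = failed, SESSION_DESC = message.
    systemLib.println("Failed replication " + replId + ": " + message);
};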

 

 

 

There are several important details on which we have to focus:

  • Getting the destination for the SuccessFactors OData API


The library module is:

odata_dest.jslib

 

exports.getODataDest = function() {
    var ctx = new javax.naming.InitialContext();
    var configuration = ctx.lookup("java:comp/env/connectivityConfiguration");
    var destinationConfiguration = configuration.getConfiguration("sfodata");
    var destinationProperties = destinationConfiguration.getAllProperties();
    return destinationProperties;
};

 

Where the name of the destination is sfodata. More info on how to create the destination via the HANA Cloud Platform Cockpit can be found here: https://help.hana.ondemand.com/help/frameset.htm?60735ad11d8a488c83537cdcfb257135.html
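For orientation, a minimal definition of such a destination could look like the following properties – the URL is just a placeholder for your SuccessFactors OData endpoint, and the property keys match what getODataDest() reads:

Name=sfodata
Type=HTTP
URL=https://api.successfactors.example.com/odata/v2/
Authentication=BasicAuthentication
User=<api user>
Password=<api password>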

 

Execute the HTTP request against the SuccessFactors endpoint:

 

odata_call.jslib

 

var systemLib = require('system');
var ioLib = require('io');
// Note: the 'http' client object used below is provided by Dirigible's API;
// its require statement was omitted in the original post.

exports.callOData = function(url, user, password) {
    var getRequest = http.createGet(url);
    var httpClient = http.createHttpClient(true);
    var credentials = http.createUsernamePasswordCredentials(user, password);
    var scheme = http.createBasicScheme();
    var authorizationHeader = scheme.authenticate(credentials, getRequest);
    getRequest.addHeader(http.createBasicHeader("Accept", "application/json"));
    getRequest.addHeader(authorizationHeader);

    var httpResponse = httpClient.execute(getRequest);

    var entity = httpResponse.getEntity();
    var content = entity.getContent();

    var input = ioLib.read(content);
    http.consume(entity);

    var data = JSON.parse(input);
    return data;
};

 

 

Do not forget – once replicated, the ownership of and responsibility for the data technically become yours! Alignment with the security model of the source system is a must.

 

Enjoy!

 

References:

The project site: http://www.dirigible.io

The source code is available at GitHub - http://github.com/SAP/cloud-dirigible

Forum: http://forum.dirigible.io

Twitter: https://twitter.com/dirigible_io

Youtube: https://www.youtube.com/channel/UCYnsiVQ0M9iQLqP5DXCLMBA/

Help: http://help.dirigible.io

Samples: http://samples.dirigible.io

Google Group: https://plus.google.com/111171361109860650442/posts

Blog: http://dirigible-logbook.blogspot.com/

Dirigible on SAP HANA Cloud Platform

Dirigible - Extensions vs Configurations

Dirigible is the fast track to your HCP HANA DB instance

 


Let's do something fun with WhatsApp and SAP HANA Cloud Platform. We will build an app with which you can take a picture from your smartphone and send it via WhatsApp, save it to the HANA database, and finally view the picture from the browser. I named it "Whana" = WhatsApp + HANA.

 


Prerequisites

 

 

Diagram

 

[Diagram: WhatsApp → Yowsup → local folder → Java folder monitor → HANA DB → browser]

 

Yowsup Installation & Configuration

 

After you have downloaded the Yowsup library, copy the folder yowsup-master to the Python folder (in my case C:\Python34). You also need to download python-dateutil and copy the dateutil folder to C:\Python34\Lib.

 

Now go to https://coderus.openrepos.net/whitesoft/whatsapp_sms to get the WhatsApp code. Enter your dedicated mobile phone number and select SMS.


If there is no error, you will receive an SMS message; note down the code:

WhatsApp code 192-828

 

Under C:\Python34\yowsup-master\src, copy config.example to yowsup-cli.config, and indicate the following:

cc=country code; for example: 65

phone=your phone number; for example: 651234567 (without +)

Leave the id and password blank for the moment.

 

Execute the following command to register with WhatsApp:

python C:\Python34\yowsup-master\src\yowsup-cli --register 192-828 --config C:\Python34\yowsup-master\src\yowsup-cli.config

 

If there is no error, you will get the following message:

status: ok

kind: free

pw: S1nBGCvZhb6TBQrbm2sQCfSLkXM=

price: 0,89

price_expiration: 1362803446

currency: SGD

cost: 0.89

expiration: 1391344106

login: 651234567

type: new

 

Copy the password S1nBGCvZhb6TBQrbm2sQCfSLkXM= and paste into pw field in yowsup-cli.config. In the end your yowsup-cli.config will look like this:

 

cc=65

phone=651234567

id=

password=S1nBGCvZhb6TBQrbm2sQCfSLkXM=

 

Now let's test by sending a message to the phone number 6597312234. Execute the following command:

C:\Python34>python C:\Python34\yowsup-master\src\yowsup-cli --send 6597312234 "Test message" --config C:\Python34\yowsup-master\src\yowsup-cli.config

If there is no error, you will see the following result and the message will be sent to 6597312234:

Authed 651234567

Sent message

 

I have added functions to save the image received from WhatsApp to the local folder C:\java\imageWA. Just overwrite CmdClient.py in the Example folder and downloader.py in the Media folder.

 

 

 

Java Folder Monitoring


In this part, we will create the Java program to monitor the folder where Yowsup stores the WhatsApp images (i.e., C:\java\imageWA) and check whether there is any image file. If there is, the program will read the file and insert it into the HANA database.

Below is the snippet of the java code:

 

static final String IMGFolder = "C:\\java\\imageWA";
String INSERT_PICTURE = "INSERT INTO \"NEO_CG2SX3P5XHHQEO58DKM7BWU0V\".\"p1940803061trial.fd2.data::mytable\" VALUES(?,?)";

public synchronized void insert(String fileName) throws Exception, IOException, SQLException {
    FileInputStream fis = null;
    PreparedStatement ps = null;
    // use the current epoch seconds as the stored file name
    Date currentDate = new Date();
    String s = Long.toString(currentDate.getTime() / 1000);
    System.out.println(s);
    try {
        System.out.println("filename: " + fileName);
        File file = new File(IMGFolder + "\\" + fileName);
        fis = new FileInputStream(file);
        // conn is the JDBC connection to HANA, obtained via the tunnel (setup not shown)
        ps = conn.prepareStatement(INSERT_PICTURE);
        ps.setString(1, s);
        ps.setBinaryStream(2, fis, (int) file.length());
        int rowsInserted = ps.executeUpdate();
        conn.commit();
        if (rowsInserted > 0) {
            System.out.println("A new record was inserted successfully!");
        }
        // remove the file once it has been stored in the database
        if (file.delete()) {
            System.out.println(file.getName() + " is deleted!");
        } else {
            System.out.println("Delete operation failed.");
        }
    } finally {
        // guard against NullPointerException if opening the file or statement failed
        if (ps != null) ps.close();
        if (fis != null) fis.close();
    }
}
 

HANA Cloud Platform Setup

 

I will go through the key files. You can find the complete source code on GitHub: https://github.com/ferrygun/Whana



We need to create a table, mytable.hdbtable, in the SAP HANA database to store the images from WhatsApp.

 

mytable.hdbtable

table.schemaName = "NEO_CG2SX3P5XHHQEO58DKM7BWU0V";
table.tableType = COLUMNSTORE;
table.description = "My Table";
table.columns = [
    {name = "File_Name"; sqlType = NVARCHAR; nullable = true; length = 20;},
    {name = "File_Content"; sqlType = BLOB; nullable = true;}
];

 


 

GetImage.xsjs


Create the GetImage.xsjs to retrieve the image binary stream from the HANA database:

  var id = $.request.parameters.get('id');
    var conn = $.db.getConnection();
    try {
        // bind parameter instead of string concatenation to avoid SQL injection
        var query = "SELECT \"File_Content\" FROM \"NEO_CG2SX3P5XHHQEO58DKM7BWU0V\".\"p1940803061trial.fd2.data::mytable\" WHERE \"File_Name\" = ?";
        var pstmt = conn.prepareStatement(query);
        pstmt.setString(1, id);
        var rs = pstmt.executeQuery();
        rs.next();
        $.response.headers.set("Content-Disposition", "attachment; filename=filename.jpg");
        $.response.contentType = 'image/jpg';
        $.response.setBody(rs.getBlob(1));
        $.response.status = $.net.http.OK;
    } catch (e) {
        $.response.setBody("Error while downloading : " + e);
        $.response.status = 500;
    } finally {
        conn.close();
    }
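A browser request like /fd2/image/GetImage.xsjs?id=1407123456 (the id value is illustrative – it is the epoch-seconds file name stored by the folder monitor) will then stream back the stored JPEG.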
 

GetFileName.xsjs

Create the GetFileName.xsjs to list all the image file names so the user can select one from the SAPUI5 table:

var query = "Select \"File_Name\" From \"NEO_CG2SX3P5XHHQEO58DKM7BWU0V\".\"p1940803061trial.fd2.data::mytable\" ";
function close(closables) {
    var closable;
    var i;
    for (i = 0; i < closables.length; i++) {
        closable = closables[i];
        if (closable) {
            closable.close();
        }
    }
}
function getFileName() {
    var FNameList = [];
    var connection = $.db.getConnection();
    var statement = null;
    var resultSet = null;
    try {
        statement = connection.prepareStatement(query);
        resultSet = statement.executeQuery();
        while (resultSet.next()) {
            var fname = {};
            fname.file_name = resultSet.getString(1);
            FNameList.push(fname);
        }
    } finally {
        close([resultSet, statement, connection]);
    }
    return FNameList;
}
function doGetFileName() {
    try {
        $.response.contentType = "application/json";
        $.response.setBody(JSON.stringify(getFileName()));
    } catch (err) {
        $.response.contentType = "text/plain";
        $.response.setBody("Error while executing query: [" + err.message + "]");
        $.response.returnCode = 200;
    }
}
doGetFileName();
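For reference, with two stored images the JSON returned by doGetFileName() would look like this (values illustrative):

[
    { "file_name": "1407123456" },
    { "file_name": "1407123789" }
]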
Embed an Image in HTML Code


In order to view the image, we will call GetImage.xsjs and embed the picture into the HTML code with base64 encoding, using the following tag:

<img src="data:{mime};base64,{data}" alt="My picture"/>
 


where {mime} is in 'image/imagetype' format (e.g. 'image/jpg' or 'image/png') and {data} is the base64-encoded image content.

 

For example, when viewing the page source in the Chrome browser, you can see the img tag populated with the base64-encoded content of the "WhatsApp" picture.

 

The actSearch function in the view controller calls GetImage.xsjs with the given image ID (file name) and encodes the content in base64 format.

 

actSearch: function(fname) {
        var xmlHTTP = new XMLHttpRequest();
        xmlHTTP.open('GET', '../fd2/image/GetImage.xsjs?id=' + fname, true);
        xmlHTTP.responseType = 'arraybuffer';
        xmlHTTP.onload = function(e) {
            var arr = new Uint8Array(this.response);
            var raw = String.fromCharCode.apply(null, arr);
            var b64 = btoa(raw);
            var dataURL = "data:image/jpeg;base64," + b64;
            document.getElementById("image").src = dataURL;
        };
        xmlHTTP.send();
    }
Running the App


First, you need to open the tunnel to the SAP HANA Cloud Platform. Under the folder neo-javaee6-wp-sdk, execute the following command:

neo open-db-tunnel -h "hanatrial.ondemand.com" -u "p1940803061" -a "p1940803061trial" --id "fd2"

 

Replace "p1940803061" with your HANA trial account ID and "fd2" with your HANA instance.

 



If successful, you will get the password (Df2sxp5HH8H6BGv) and user (DEV_5GKYRR9GKKT9NTV5RD7U0DGMG). Update the pwd and user values in config.properties:

user=DEV_5GKYRR9GKKT9NTV5RD7U0DGMG

pwd=Df2sxp5HH8H6BGv


Run Yowsup by executing the command below:

python C:\Python34\yowsup-master\src\yowsup-cli --interactive <phone_number> --wait --autoack --keepalive --config C:\Python34\yowsup-master\src\yowsup-cli.config




And finally run the Java Folder Monitoring app:

java FolderMonitor



 

Demo Video

 

 

Summary

 

In another case, I have also built a CCTV home security app that uploads the video clips to Dropbox (or HANA). Here is the video:

 

Hope you enjoy "Whana" and thank you for reading my blog. Please let me know if you have any questions/comments.

I was inspired by this blog to create a real-time dashboard that monitors my Windows 8.1 laptop's CPU usage on the SAP HANA Cloud Platform. We will create a simple app on the HANA Cloud Platform with WebSocket, plus some Python and VBScript scripts, to get the CPU usage and display it in a simple gauge.

 

Prerequisites

 

The finished app displays a simple gauge showing the current CPU usage.

 

Create a Web Project in Eclipse

 

  • Let's start by creating a project in Eclipse. Select New > Dynamic Web Project.
  • Give the project the name rtd ("rtd" is an abbreviation of real-time dashboard). The target runtime is SAP HANA Cloud.
  • Create a Java package: rtd
  • Create a Java class: RTD
  • Copy the following code into RTD.java:
    package rtd;

    import java.io.IOException;
    import java.util.Collections;
    import java.util.HashSet;
    import java.util.Set;
    import javax.websocket.OnClose;
    import javax.websocket.OnMessage;
    import javax.websocket.OnOpen;
    import javax.websocket.Session;
    import javax.websocket.server.ServerEndpoint;

    @ServerEndpoint("/rtd")
    public class RTD {
        // all currently connected clients
        private static final Set<Session> peers = Collections
                .synchronizedSet(new HashSet<Session>());

        @OnMessage
        public void onMessage(final String message) {
            // broadcast every incoming message to all connected clients
            for (final Session peer : peers) {
                try {
                    peer.getBasicRemote().sendText(message);
                } catch (final IOException e) {
                    e.printStackTrace();
                }
            }
        }

        @OnOpen
        public void onOpen(final Session peer) {
            peers.add(peer);
        }

        @OnClose
        public void onClose(final Session peer) {
            peers.remove(peer);
        }
    }
  • Also create index.html under the folder WebContent, and copy the d3 and gauges folders to the same folder. The complete source code is stored on GitHub: https://github.com/ferrygun/RTD

    Below is a snippet of the JavaScript in index.html:
    <script>
        //var wsUri = "ws://localhost:8080/rtd/rtd";
        var wsUri = "wss://rtdp057134trial.hanatrial.ondemand.com/rtd/rtd";

        function init() {
            output = document.getElementById("output");
        }

        function send_message() {
            websocket = new WebSocket(wsUri);
            websocket.onopen = function(evt) { onOpen(evt) };
            websocket.onmessage = function(evt) { onMessage(evt) };
            websocket.onerror = function(evt) { onError(evt) };
        }

        function onOpen(evt) {
            console.log("Connected to rtd!");
            doSend(0);
        }

        function onMessage(evt) {
            console.log(evt.data);
            updateGauges(evt.data);
        }

        function onError(evt) {
            console.log('Error: ' + evt.data);
        }

        function doSend(message) {
            websocket.send(message);
            //websocket.close();
        }

        window.addEventListener("load", init, false);

        var gauges = [];

        function createGauge(name, label, min, max) {
            var config = {
                size: 300,
                label: label,
                min: undefined != min ? min : 0,
                max: undefined != max ? max : 100,
                minorTicks: 5
            };
            var range = config.max - config.min;
            config.greenZones = [{ from: config.min, to: config.min + range * 0.75 }];
            config.yellowZones = [{ from: config.min + range * 0.75, to: config.min + range * 0.9 }];
            config.redZones = [{ from: config.min + range * 0.9, to: config.max }];
            gauges[name] = new Gauge(name + "GaugeContainer", config);
            gauges[name].render();
        }

        function createGauges() {
            createGauge("cpu", "CPU Usage Monitoring");
        }

        function updateGauges(value) {
            for (var key in gauges) {
                console.log(value);
                gauges[key].redraw(value);
            }
        }

        // initialize() is wired up as the page's onload handler (e.g. <body onload="initialize()">)
        function initialize() {
            createGauges();
            send_message();
        }
    </script>
  • The final folder structure contains the rtd package with RTD.java, and WebContent with index.html plus the d3 and gauges folders.

 

Deploy to HANA Cloud Platform

 

After you have created the web project in Eclipse, we will deploy the app to the HANA Cloud Platform.

 

  • Export the WAR file to the location where you have installed the Java EE 6 Web Profile SDK: <Java_6_EE_Web_Profile>/tools. Name it rtd.war.
  • Open your command prompt and cd to the location of <Java_6_EE_Web_Profile>/tools. Execute the following command:
    neo deploy -s rtd.war -b rtd -h hanatrial.ondemand.com --java-version 7 -a <accountname> -u <username>

    Where accountname is your HANA account name (for example p057134trial) and username is your account name without 'trial' (i.e., p057134).
    We specify java-version 7 because the WebSocket implementation requires JRE 7. Please refer to this.
  • Once you have successfully deployed, go to the HANA Cloud Platform Cockpit and check the status of the application. Start the app if it is not started. You will also see the Java EE 6 Web Profile under the Runtime.
  • Click the link under the Application URLs: https://rtdp057134trial.hanatrial.ondemand.com/rtd/
    You will see the gauge. At the moment we are not sending any data to the WebSocket, so you will not see any movement on the gauge.

 

Send Data to WebSocket

 

We are going to use a VBScript to get the CPU usage and send the information to the WebSocket using Python. Since we are deploying the app on the HANA Cloud Platform, the WebSocket URI is:


wss://rtdp057134trial.hanatrial.ondemand.com/rtd/rtd

 

If you would like to test in the HANA local runtime, change the URI to:

 

ws://localhost:8080/rtd/rtd

 

Below is a snippet of the Python program that sends the information to the WebSocket:

 

#!/usr/bin/python
import sys
from websocket import create_connection

# connect to the deployed WebSocket endpoint (use the localhost URI for local testing)
ws = create_connection("wss://rtdp057134trial.hanatrial.ondemand.com/rtd/rtd")
#ws = create_connection("ws://localhost:8080/rtd/rtd")

# the CPU usage value is passed as the first command line argument
arg1 = str(sys.argv[1])
print(arg1)
ws.send(arg1)
result = ws.recv()
ws.close()
 

Open your command prompt and type the following: cscript //nologo getCPUUsage.vbs

 


If there is no error, you will see the movement of the pointer in the gauge.


You've heard about the SAP HANA Cloud Platform and are wondering what it is about and how it works?

 

At the ASUG pre-conference day you can participate in the half-day seminar "Fast Track to Development on SAP HANA Cloud Platform".

 

 

We'll start the seminar with an introduction to the platform and all its main services and capabilities, followed by a demo around native HANA development with the SAP HANA Cloud Platform.

This will be followed by three additional blocks where each of the blocks has a theoretical part and a demo.

 

We'll cover the following topics:


  • Introduction to SAP HANA Cloud Platform
  • Connect sensors to the SAP HANA Cloud Platform
  • Hybrid cloud applications with the SAP HANA Cloud Platform
  • Manage lightweight and responsive HTML5 apps on the SAP HANA Cloud Platform

 

Connecting sensors to SAP HANA Cloud Platform

In this block I'll show you how you can use devices like a Raspberry Pi to connect sensors, and how you can store the sensor data in your SAP HANA Cloud Platform account. This demo shows you how you can exploit sensor data for real-time insights into your products and services (Internet of Things).

 

 

Hybrid cloud applications with SAP HANA Cloud Platform

Sometimes you want to leverage the benefits of an application running in the cloud but want to leave your data in your on-premise systems. This block will show you how to build such hybrid cloud applications and how you can leverage the data you already have in your on-premise systems to create additional value for your end users.

 

 

Lightweight HTML5 apps on SAP HANA Cloud Platform

With technologies like SAP Gateway, OData services in general, or JSON, you can easily connect user interfaces built with SAPUI5 to your data. In the demo I'll show you how easy it is with the SAP HANA Cloud Platform to deploy such HTML5 applications to the cloud. At the same time you'll see how the SAP HANA Cloud Platform helps you to tackle common issues like the same-origin policy in browsers with very little effort and without the loss of data confidentiality or integrity.

 

 

Interested? Add the seminar to your d-code registration!

 

Interested? You can register for the seminar while going through the SAP TechEd && d-code event registration process. Already registered? Add the seminar "Fast Track to Development on SAP HANA Cloud Platform" to your registration.


 

Best,

Rui

I attended the MOOC course Next Steps in SAP HANA Cloud Platform from openSAP. In case you did not participate, the course is going to be offered again next month.

 

The course topics include native HANA development, HTML5, Git, SAML, OAuth, and Cloud Portal. As always, attending the course is not really hard, as you can access the material whenever you want during the week and only have to submit the weekly assignment by its deadline. Rui is not a bad presenter on topics around HANA and Cloud Portal, and while most of the content can be consumed passively, it is a good idea to have Eclipse started up to go through some of the exercises. Nice side effect: you get a local repository of projects you can use for future reference.

 

What about the course content? When I saw that there was a new HCP course, I was happy to be able to learn more about HCP. Git and HTML5 looked nice, as did web APIs – things I really wanted to see more about.

 

Git

Short: too much Git. Two weeks of commit, push, tag, branches. I understand that SAP is proud of having Git support in HCP, and publishing an HTML5 site to HCP is now really easy. You do not even have to take care of setting up a Git repository; HCP does that for you. Going into detail on Git commands is nice for real developers working in real teams. But in case you are more on the architecture side, or your team is composed mostly of just you, two weeks is simply too much. Besides, I'd have liked to see more information on Gerrit in this context.

 

HTML5

The HTML5 app shown uses SAPUI5, but the SAPUI5 part was not elaborated in the detail it deserves, leaving you with a SAPUI5 application without knowing how the page actually works. You substitute a small JavaScript fragment, but what exactly it does is left for you to find out. There will be an openSAP course on Fiori coming up which should solve that problem.

 

Web API

The content presented was about SAML, OAuth, how to protect the API, and writing a (native) mobile client. To be honest, I thought that this section would be about Apache Olingo and Jersey for creating REST interfaces. Exposing JPA following REST/JSON/OData guidelines is what I expected to see.

 

What else

On my wish list for the next HCP course is:

  • Foundation: PaaS and IaaS topics to explain how HCP is working and what standards are used (CloudFoundry, OpenStack)
  • PaaS: explaining how to write a HCP application that runs on other PaaS solutions like Google. Or how to migrate an AWS Beanstalk application to HCP.
  • Continuous integration: Developing an application is nice, but you have to test it, automate the deployment, and gather statistics and KPIs. Learning from SAP how to use Maven together with HCP in a CI environment like Jenkins is high on my list. The currently shown developer model may be what you see in reality, but in the context of a course a (new) development paradigm should be taught.
  • Olingo and Jersey: what are the options to expose JPA data in a RESTful way with HCP?
  • Maven
  • Testing: Unit testing, specially on how to use Arquillian to test JPA apps on HCP, performance testing, etc.
  • A full end-to-end solution. Hana native development to expose data via OData, having a Java application for storing and retrieving other data, integrate an on-premise system via HCC, write a SAPUI5 app and access it via web and a mobile device.

I just pushed a little showcase application to GitHub which demonstrates the use of a SAPUI5 frontend application with a Grails backend system. The reasons I wrote this frontend in the RDE are:

  • I love SAPUI5 as frontend technology.
  • I love Grails as backend technology because it's so groovy.
  • I needed a simple showcase for authentication in RESTful application scenarios.
  • I want a colleague to work on the frontend part of another, bigger Grails application with minimum effort on local system setup.


So I decided to put the frontend part of the application into a GitHub repository. I then cloned it in the RDE that comes as a beta version with the HCP trial edition. After that I could develop my frontend application in the RDE and push my changes into this repository.


To connect this frontend to my Grails backend application, and to overcome the SOP (same-origin policy) problem, I had to expose the latter via the SAP Cloud Connector to my HCP account.


 

With this I could create a destination in the HCP cockpit.


After the destination was available I had to refer to it in the file neo-app.json of my SAPUI5 application.

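For illustration, such a route in neo-app.json could look roughly like this – the path and the destination name (grailsbackend) are assumptions, not the actual values from my project:

{
    "routes": [
        {
            "path": "/api",
            "target": {
                "type": "destination",
                "name": "grailsbackend"
            },
            "description": "Grails backend exposed via SAP Cloud Connector"
        }
    ]
}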

Now I could simply call my API functions from the SAPUI5 application as if they were running on the same host.

     oModel.loadData("api/logout", undefined, true, "POST", false, false, header); // (sURL, oParameters, bAsync, sType, bMerge, bCache, mHeaders)


If you are interested in the REST authentication stuff, please look into the source code at GitHub and the readme file over there.



Between June 12th and July 31st 2014 we ran the openSAP course "Next Steps in SAP HANA Cloud Platform" to provide interested developers, customers and partners with more details around the usage of the SAP HANA Cloud Platform.

 

With this blog I want to thank all of the students for their participation and their feedback and want to provide some additional information around the course.

 

Direct access to the videos

 

First of all I'd like to encourage you to use the materials at the openSAP page for this course, as it not only provides the videos and the slides, but also all the discussion threads with a lot of questions and answers that popped up during the course.

 

The following list is meant for those of you who want to have a quick check of the videos to remember how to accomplish a specific task with the platform. You can find the videos related to the units of the course under the column "Additional Assets".

 

Again: don't forget that you get all the additional goodies like slides and Q&As around the course on the official Course: Next Steps in SAP HANA Cloud Platform!

 

Agenda
Week 1: SAP HANA Native Development  (related blog post)
1: Basics
  • The various SAP HANA Cloud Platform offerings
  • The specifics of the SAP HANA Cloud Platform trial landscape
  • How to set your development environment and connect to your SAP HANA instance
2: SAP HANA Applications
  • Using SAP HANA on-premise and on the SAP HANA Cloud Platform
  • How to import a sample SAP HANA application
  • Running the SHINE application on SAP HANA Cloud Platform
3: SAP HANA Web-based  Development Workbench
  • How to use the SAP HANA Web-based Development Workbench to quickly develop, modify, and test your SAP HANA application.
  • How to launch the SAP HANA Web-based Development Workbench directly from the SAP HANA Cloud Platform cockpit
  • How to modify the SHINE application on SAP HANA Cloud Platform directly, using the SAP HANA Web-based Development Workbench.
4: SAP HANA Predictive Analysis Library
  • How to use PAL on the SAP HANA Cloud Platform
  • PAL on the free SAP HANA Cloud Platform trial landscape
  • How to use ABC analysis PAL function to build an SAPUI5 graphical visualization
5: Extend SAP HANA Applications with HCP Services
  • Additional services and extension capabilities that SAP HANA Cloud Platform provides on top of SAP HANA native capabilities
  • How to configure and work with SAP HANA Cloud Platform feedback service
  • How to enhance a sample SHINE application with SAP HANA Cloud Platform feedback service
Week 2: Git and HTML5 Apps - Part 1 (related blog post)
1: Introduction to HTML5 Applications and Git
  • HTML5 applications on SAP HANA Cloud Platform
  • The development Infrastructure
  • What is Git?
2: Creating a Hello World HTML5 Application
  • How to create a simple HTML5 application
  • How to clone a repository
  • How to commit and push
  • How to test an HTML5 application
3: Git Basics
  • Where does Git store versions?
  • What is a working directory?
  • What is a commit and how can you create one?
  • What is a branch?
  • How to get a copy of a repository with clone?
  • How to transfer back your changes with push?
  • Where does Git store the  configuration settings?
4: Using SAPUI5 in Your HTML5 Application
  • How to use SAPUI5?
  • What is an SAPUI5 model?
  • What is an SAPUI5 view?
  • What is an SAPUI5 controller?
5: Using a REST Service in Your HTML5 Application
  • How to use a REST Service in an HTML5 application?
  • What is the application descriptor?
  • How to configure back-end routing?
  • How to create a destination?
Week 3: Git and HTML5 Apps - Part 2 (related blog post)
1: Releasing a Version of Your HTML5 Application
  • Know the difference between commit, version, and active version
  • How to create a version using Git.
  • How to create a version using the cockpit.
  • How to activate an application.
  • How to fetch in Eclipse.
2: Adding a Chart to Your HTML5 Application
  • Recap of the development and test lifecycle for HTML5 applications
  • How to use a chart in SAPUI5
3: Working with Multiple Branches
  • How to work with local branches.
  • Why local branches are useful.
  • How to rebase local branches.
4: Resolving Merge Conflicts
  • Merge conflicts created by git
  • How to resolve conflicts
5: Git History
  • How to filter the history
  • How to search in the history
  • How to find out when and why a line was changed
  • How to revert a commit
  • How to reset a branch
Week 4: Advanced Identity Management (related blog post)
1: Working with User Profile Attributes
  • Different classes of user account information
  • Configuring attributes with the local IdP and in the Cloud Cockpit
  • Accessing user attributes in java based apps
2: Group Management
  • Using groups in SAP HANA Cloud Platform
  • Assigning users to groups
3: Federated Authorization with Groups
  • Defining mapping rules
4: Custom Roles
  • Defining and using custom roles
5: Working with Multiple Identity Providers
  • Using multiple identity providers
Week 5: Securing Web APIs (related blog post)
1: Protecting Web APIs
  • What are Web APIs?
  • Where to use SAML 2.0 and OAuth?
  • What are the benefits of OAuth?
2: OAuth 2.0 Fundamentals
  • How OAuth enables secure authentication and authorization for non-browser- based clients such as native mobile apps
  • Comparison OAuth vs. password authentication
3: Protecting the Cloud Application
  • How to configure the OAuth Filter
  • How to protect APIs programmatically
4: OAuth Configuration
  • How to register OAuth clients
  • How to configure scopes for your cloud application
5: Working with Multiple Identity Providers
  • How to integrate an OAuth Client with the SAP HANA Cloud Platform OAuth Authorization Server
  • How to implement a callback handler for the authorization code flow in a desktop client
Week 6: Advanced Features (related blog post)
1: SAP HANA Cloud Portal for Developers
  • What does SAP HANA Cloud Portal offer to developers
  • How to administrate SAP HANA Cloud Portal
  • The SAP HANA Cloud Portal marketplace concept
  • How to expose your custom apps as widgets in SAP HANA Cloud Portal
  • How to manage site pages and widgets to create engaging sites
  • How to preview the site, publish and revert changes made for the site
2: Developing Applications for Use in SAP HANA Cloud Portal Sites
  • Understand the SAP HANA Cloud Portal development process
  • Develop widgets for use in SAP HANA Cloud Portal sites
  • Develop an SAP HANA Cloud Portal solution with OpenSocial
  • How to use OpenSocial features available in SAP HANA Cloud Portal
  • SAP HANA Cloud Portal as a central UI framework
  • Building mobile-ready SAP HANA Cloud Portal sites
3: Design and Customize Cloud Portal Sites
  • How to design the site layout and select a theme for your site
  • How to customize the default SAP HANA Cloud Portal theme
  • How to apply an out-of-the-box theme to your site
  • The SAP HANA Cloud Portal page templates concept
  • Site navigation menu customization options
4: SAP HANA Cloud Integration
  • How to use the Catalog to view all prepackaged integration flows on the SAP HCI landing page
  • Configuring and using the Web UI
5: Wrap-Up and Outlook
  • Wrap-up of the course
  • Outlook to the platform and to other openSAP courses around SAP HANA Cloud Platform

 

Best,

Rui

Prologue

 

True... it's been a while since the last chapter, but patience is one of those virtues that come with age. Hence, let's hope that Granny doesn't mind too much, and get on with it. In the last chapter we talked about proper user interface design, from both an outside-in and an inside-out approach. I still believe that the latter is the enabler for a good user experience (UX). It's a classic principle of software development: proper layering, separation of concerns, and componentization help to fine-tune individual aspects, as one does not need to worry about breaking other things.

 

Judging by the feedback I received, it seems that people believe the time when applications needed to cater for browsers/user-agents that do not support JavaScript, aeh, ECMAScript is behind us, a relic of the past: "Come on, we live in the 21st century!"

 

I'd say it depends on the usage scenario, and I believe there are still use cases where people will have to implement such fallbacks (e.g. in the public sector etc.). As always, it's a case-by-case situation, and the additional efforts required to maintain a solid fallback mechanism for limited user-agents certainly need to be considered and planned for. Yet, given that the whole idea behind this blog series is to talk about what it takes to implement an enterprise-ready solution, I at least wanted to point out how it's done! Going forward we'll certainly focus on state-of-the-art techniques to maximize the user experience and only maintain a rudimentary fallback solution (after all, convenience goes a long way, and users may be encouraged to update to a modern browser if they feel they are missing out!)

 

Having said all this, it's time to have a look at current trends in user interface technologies. Doing so, we quickly notice that there's a tendency to let the client do the heavy lifting. Technologies such as HTML5, JavaScript (see above) and CSS3 have progressed tremendously over the last years, and it's amazing to see what capable developers can build with these web standards! On the mobile side we see both web and native apps deliver great user experiences, and containers like Cordova (aka PhoneGap) completely blur the lines between web and native apps. In both scenarios the client handles the user interaction and only communicates with the server/backend via web services (in the broader sense of the word!) Typically, this is done using lightweight communication protocols and standards such as REST (e.g. JSON via HTTP) or – as popular at SAP and Microsoft – OData. Consequently, the server is responsible for providing an API that can be used by clients.

 

There's an API for that!

 

I truly believe that in the context of cloud computing and the Internet of Things (IoT) the famous slogan "There's an app for that!" (copyright by Apple Inc.) will change into "There's an API for that!" - APIs are really the foundation of the magic of today's inter-connected world.

 

Growth in Web APIs Since 2005 - Source: ProgrammableWeb

 

Given the importance, I opted to give this topic some extra room and actually write a mini-series about APIs... yet before we dig deeper I'd like to point out that the term API does not necessarily indicate that the exposed services/functionality are consumed from the outside. In fact, I strongly promote clearly establishing an internal API comprising the business functionality of any application. Software is never finished! As applications grow over time, it clearly helps to have defined an internal API layer that is used between the individual components or modules across the application. Exposing this set of services (or parts of it) in such a way that external clients can consume them is a completely different story – and one that comes with its own challenges!

 

I already wrote extensively about Enterprise APIs, why they matter, their primary principles and how-to develop them in my respective blog post series called 'The Rise of Enterprise APIs':

 

 

As I hate to repeat myself, I'll just refer you to the respective posts and only briefly point out things I deem important to make a point. From an implementation perspective, Granny's Addressbook uses the same building blocks as the sample application I developed for part 3 of the series: Apache CXF (a great implementation of the JAX-RS standard), Spring, etc.

 

API = A set of self-contained services

 

The most important aspect of a business service that shall be included within an API is that it needs to be self-contained. For the clients/consumers it has to act as a black box, and ultimately a client should not need to know anything about the internal workings of the service. Consequently, the service may not take anything for granted (e.g. that the incoming data is in the proper format, or that it has been validated for type-safety or plausibility). As such, the service needs to properly check any incoming data for validity and properly report back any issues there may be. The same applies to security aspects etc. All of these aspects need to be ensured regardless of whether it is an internal or an external client consuming the service. That's the reason why I usually promote separating the underlying service from the actual API endpoint (which is protocol- and format-specific!)

 

This mindset is reflected in the architecture of the Granny's Addressbook application. The com.sap.hana.cloud.samples.granny.srv package contains the actual service implementations, which are just regular Java classes. They use the standard data model objects of the application and declare to throw a dedicated ServiceException. The RESTful API, however, is located in a different package called com.sap.hana.cloud.samples.granny.api. If you have a closer look at the so-called facades within this package, you will see that they contain certain meta information (annotations) that is specific to RESTful service communication (and part of the JAX-RS API.) The same is true for the Response objects returned by the individual service methods. This differentiation also takes effect with regard to error handling. The service needs to take care of all business-related concerns, while the (REST) facade needs to make sure it properly handles issues that may arise during the process of marshalling/unmarshalling the data model objects into JSON (or any other format). Let's keep that in mind, as we'll get back to that topic in a few...

 

 

The three commandments of a good API

 

One could write a whole book about what makes a good API, yet I'll leave that to others and simply state three fundamental characteristics here:

 

  • easy to consume
  • well documented
  • good error handling

 

That's really it! If an API fails to adequately address these things, it will have a hard time seeing adoption. And – quite frankly – if your API is part of a service targeting adoption, you'd better nail those three things or...

 

Easy to consume

 

This factor has been the main reason why RESTful services have taken the IT world by storm in recent years. The fact that RESTful services simply sit on top of the mature HTTP protocol makes them very easy to consume (especially compared to prior standards such as Web Services – I'm referring to the kind based on SOAP and WS-* standards here...). Developers can test-drive RESTful APIs with their browsers or lightweight REST clients with ease. (Personally, I prefer a tool called POSTman.)

 

Well documented

 

This is a make-or-break topic – PERIOD. Your documentation needs to be easy to find and easy to understand. Developers are an impatient bunch and never have enough time to deliver within tight project deadlines. If they are struggling with how to consume your API, they may just move on to a competitor – simple as that. Now, the hard part about having good documentation is actually maintaining it!

 

Way too often the documentation gets outdated, especially if it is written by a different team (which is quite common!) In general, information experts are undervalued, and consequently service/API providers are well advised to invest in a great documentation team that has both writing and technical skills. Ideally, you regularly validate the ease of consumption of your documentation via user tests!

 

Ultimately, you are well advised to consider the documentation an integral part of your product/service/solution and tightly weave it into your development & delivery processes! In fact, one of the next chapters of this series will focus on exactly that topic - so stay tuned!

 

Good error handling

 

Last, but not least: error handling (everyone's favorite topic, right?) From experience I can say that there's nothing more frustrating than struggling with an API because of poor error messages. If a developer can't figure out what (s)he has done wrong in consuming your API then you have a problem!

 

Here's where the dots connect in regards to what I said earlier about self-contained services. Given the importance of the topic I propose we have a closer look at it now, shall we?

 

Catch me if you can!

 

As always, we want to deal with error handling in a central place. Yet, in reference to what we said earlier, we have two different flavors of errors to deal with: semantic errors (e.g. constraint violations such as an exceeded maximum length of a given attribute) and technical errors (e.g. invalid payload data or formatting issues.) The former need to be taken care of by the business services, while the latter need to be taken care of by the layer that handles the incoming/outgoing communication. In the case of Granny's Addressbook, the in- and outbound communication is handled by CXF, and the JSON marshalling/unmarshalling is taken care of by Jackson. Fortunately, this combination allows us to nicely implement our requirements (and that's what makes a good framework after all!)

 

Let's see how this works by looking at the code (= the single source of truth!). Below is an extract of the Spring configuration file:

 

<jaxrs:server id="api" address="/v1">
    <jaxrs:properties>
        <entry key="org.apache.cxf.propagate.exception" value="false" />
    </jaxrs:properties>
    <jaxrs:serviceBeans>
        <ref bean="contactFacade" />
    </jaxrs:serviceBeans>
    <jaxrs:providers>
        <ref bean="jacksonProvider" />
        <ref bean="parserExceptionMapper" />
        <ref bean="jsonMappingExceptionMapper" />
        <ref bean="serviceExceptionMapper" />
    </jaxrs:providers>
</jaxrs:server>

<bean id="objectMapper" class="com.sap.hana.cloud.samples.granny.util.CustomObjectMapper" />
<bean id="jacksonProvider" class="org.codehaus.jackson.jaxrs.JacksonJaxbJsonProvider">
    <property name="mapper" ref="objectMapper"/>
</bean>

<!-- Exception Mappers -->
<bean id="parserExceptionMapper" class="com.sap.hana.cloud.samples.granny.web.util.ParserExceptionMapper" />
<bean id="jsonMappingExceptionMapper" class="com.sap.hana.cloud.samples.granny.web.util.JsonMappingExceptionMapper" />
<bean id="serviceExceptionMapper" class="com.sap.hana.cloud.samples.granny.web.util.ServiceExceptionMapper" />

 

In the jaxrs:providers section I register the so-called ExceptionMappers, which are defined as beans at the end of the extract. We have two ExceptionMappers taking care of the most common problem areas during the marshalling/unmarshalling process (namely parsing and mapping) and one ExceptionMapper taking care of ServiceExceptions. Just as it should be!

 

I recommend having a look at the ParserExceptionMapper and JsonMappingExceptionMapper respectively. They should be pretty self-explanatory for the most part, yet I'd like to explicitly mention one aspect I deem important: we have to strike a good balance between level of detail and usefulness, and not expose too many details about the underlying technology stack (for security reasons!) Therefore, I have used a bit of regex magic to conceal certain internal aspects.

 

Note: This approach certainly has its trade-offs! Given that we (more or less) pass through the error messages provided by Jackson, we are not in control of their content, nor can we apply any kind of I18N and so forth. This certainly is an edge case, and in a real-world application we would have to thoroughly test this piece of functionality with a lot of test cases to make sure the effect is as desired. Or we would need to map the Jackson error messages to our own and then return those instead. Whatever you decide to do, it should be a conscious decision!

 

1st-level data validation

 

So, now that parsing and mapping have been dealt with - what about 1st-level data validation (e.g. type-safety, mandatory fields, min/max lengths, etc.)? Given what I said above about self-contained services, this certainly is something that should be dealt with by the service layer, right? The answer is a clear "YES, but..."

 

... does it make sense to pass unvalidated data through to the next layer, or shouldn't this be checked early on? That's a tricky question! We could check the data here as well, by just adding the same @Valid annotation we use in the service layer (see the sketch below). This would help to make the app more scalable, as we would reject invalid data at the earliest possible point in the request processing. Given that we have clearly separated the validation logic, we can at least ensure consistent handling of data validation. On the other hand, the first rule still stands, and so the individual services would still need to verify the incoming data, consequently resulting in a redundant check (= performance overhead).
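For illustration, checking at the API layer could look like this. This is a hypothetical facade method; the nested Contact class merely stands in for the sample's real model class:

import javax.validation.Valid;
import javax.validation.constraints.NotNull;
import javax.ws.rs.Consumes;
import javax.ws.rs.POST;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.Response;

public class SampleContactFacade
{
    // hypothetical model class, standing in for the sample's Contact
    public static class Contact
    {
        @NotNull
        String firstName;
    }

    @POST
    @Consumes(MediaType.APPLICATION_JSON)
    public Response createContact(@Valid Contact contact)
    {
        // at this point the payload has already passed 1st-level validation
        return Response.ok().build();
    }
}

With JAX-RS 2.0 bean validation in place (more on that below), a constraint violation on the incoming payload is rejected before the facade - and hence the service layer - is ever invoked.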

 

Internal vs external APIs

 

As a matter of fact, that brings us to another topic. Assume we have a complex application with lots of services, of which some are just business facades chaining several calls to other business services. In such an application we would see repetitive calls to the data validation functionality! This is where you have to set boundaries and differentiate between internal services and external services. For the former you'd have to assume that the data is semantically correct and does not need to be verified, while for the latter you'd always want to ensure proper data validation.

 

Note: I recall having developed such a complex application, and we ended up developing our own set of annotations that would allow us to set up trust relationships between individual services in order to avoid redundant checks. In a nutshell, the solution used ThreadLocals to store information about which model objects had been validated (and by whom). Every subsequent service then checked whether the incoming model object had been validated and whether or not it had been validated by a trusted service. Now, obviously certain attributes could be changed along the processing chain, and consequently data that once was valid may not be valid anymore when it is passed to another service. This is where checksums may need to come into place. Sooner or later you end up with a complex meta-validation framework in its own right and well... that may do more harm than good!!!
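Just to illustrate the basic mechanics of that bookkeeping idea (this is a from-memory sketch, not the actual framework; all names are made up):

import java.util.IdentityHashMap;
import java.util.Map;

public final class ValidationContext
{
    // tracks, per thread, which model objects were validated and by which service
    private static final ThreadLocal<Map<Object, Class<?>>> VALIDATED =
        new ThreadLocal<Map<Object, Class<?>>>()
        {
            @Override
            protected Map<Object, Class<?>> initialValue()
            {
                return new IdentityHashMap<Object, Class<?>>();
            }
        };

    private ValidationContext() {}

    public static void markValidated(Object model, Class<?> validatingService)
    {
        VALIDATED.get().put(model, validatingService);
    }

    public static boolean wasValidatedBy(Object model, Class<?> trustedService)
    {
        return trustedService.equals(VALIDATED.get().get(model));
    }
}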

 

So, the layer on which to perform your data validation needs to be decided on a case-by-case basis. For Granny's Addressbook (which is a very simple application) it may be sufficient to simply do the data validation on the service layer, especially since our RESTful facades are rather thin. However, for illustration purposes I'll demonstrate how to check data on the RESTful API layer AND on the service layer. That way, you have a blueprint for both and can decide on your own which way to go for your project.

 

Apache CXF and JAX-RS 2.0

 

Recently, CXF version 3.0 was released, and it marks a huge milestone as it also brought support for JAX-RS 2.0. JAX-RS 2.0 is a major step and brings a lot of new features we eagerly waited for, such as hypermedia support (for the die-hard HATEOAS fans out there!) and ... surprise, surprise ... bean validation.

 

Here's the official documentation on how to integrate bean validation into CXF: Apache CXF -- ValidationFeature

 

We pretty much stick to these instructions, yet there is one shortcoming in how CXF handles this aspect: the default ValidationExceptionMapper is rather dumb, as it only logs the validation errors and returns an HTTP status code 400 (Bad Request). Ultimately we would want to provide that information back to the client to give the user a hint about what is wrong. (Note: For the SAP/ABAP guys out there, we are looking for something like a BAPIRET2 structure here!) Jersey (another popular JAX-RS implementation) is a bit smarter here: Chapter 17 - Bean Validation Support

 

Here's our own ValidationError object:

 

/**
 * Object used to report information about validation errors.
 */
@XmlRootElement(name = "error")
@XmlAccessorType(XmlAccessType.FIELD)
public class ValidationError implements Serializable
{
    /**
     * The <code>serialVersionUID</code> of the class.
     */
    private static final long serialVersionUID = 1L;

    String messageKey = null;
    String message = null;
    String messageTemplate = null;
    String path = null;
    String invalidValue = null;
    Map<String, String> messageParameter = null;
    ...
}

 

Code Review

 

So, to wrap things up, let's quickly go through the major components, see how it all fits together, and highlight a few code segments. Let's start with the final Spring configuration:

 

<jarxs:server id="api" address="/v1">
    <jaxrs:inInterceptors>
        <ref bean="validationInInterceptor" />
    </jaxrs:inInterceptors>
    <jaxrs:outInterceptors>
        <ref bean="validationOutInterceptor" />
    </jaxrs:outInterceptors>
    <jaxrs:properties>
        <entry key="org.apache.cxf.propagate.exception" value="false" />
    </jaxrs:properties>
    <jaxrs:serviceBeans>
        <ref bean="contactFacade" />
    </jaxrs:serviceBeans>
    <jaxrs:providers>
        <ref bean="jacksonProvider" />
        <ref bean="parserExceptionMapper" />
        <ref bean="jsonMappingExceptionMapper" />
        <ref bean="serviceExceptionMapper" />
        <ref bean="validationExceptionMapper" />
    </jaxrs:providers>
</jaxrs:server>

First (lines 2-7), we register both an inbound and an outbound interceptor (well, for our data validation purposes we only need the inbound one!) Also, note that we registered an additional ValidationExceptionMapper as a provider (line 19).

 

The respective definitions are as follows:

 

<bean id="validationExceptionMapper" class="com.sap.hana.cloud.samples.granny.web.util.ValidationExceptionMapper" parent="constraintViolationMapper" />
<!-- Validation -->
<bean id="validationProvider" class="org.apache.cxf.validation.BeanValidationProvider">
  <constructor-arg><ref bean="validationConfiguration"/></constructor-arg>
</bean>
<bean id="validationConfiguration" class="org.apache.cxf.validation.ValidationConfiguration">
  <property name="messageInterpolator" ref="resourceBundleMessageInterpolator"/>
  <property name="parameterNameProvider" ref="jaxRSParameterNameProvider" />
</bean>
<bean id="resourceBundleMessageInterpolator" class="org.hibernate.validator.messageinterpolation.ResourceBundleMessageInterpolator">
  <constructor-arg index="0">
      <bean class="org.springframework.validation.beanvalidation.MessageSourceResourceBundleLocator">
          <constructor-arg index="0" ref="messageSource"/>
      </bean>
  </constructor-arg>
</bean>
<bean id="jaxRSParameterNameProvider" class="com.sap.hana.cloud.samples.granny.web.util.CustomJAXRSParameterNameProvider" />
<bean id="validationInInterceptor" class="com.sap.hana.cloud.samples.granny.web.util.CustomJAXRSBeanValidationInInterceptor">
    <property name="provider" ref="validationProvider" />
</bean>
<bean id="validationOutInterceptor" class="org.apache.cxf.jaxrs.validation.JAXRSBeanValidationOutInterceptor">
    <property name="provider" ref="validationProvider" />
</bean>

On line 1 we define the ValidationExceptionMapper, which encapsulates the JAX-RS specific interface ExceptionMapper<Throwable>. Most of the heavy lifting of converting ConstraintViolations to ValidationErrors is actually taken care of by the parent class: ConstraintViolationMapper. The primary reason for splitting this up into two classes is a clean separation between a cross-cutting concern (data validation) and the (RESTful) API layer.
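Conceptually, that conversion boils down to copying the relevant bits of each ConstraintViolation into our transport object. A simplified sketch (assuming the converter lives in the same package as ValidationError, so it can use the package-visible fields shown above):

import javax.validation.ConstraintViolation;

class ConstraintViolationConverter
{
    static ValidationError toValidationError(ConstraintViolation<?> violation)
    {
        ValidationError error = new ValidationError();
        error.message = violation.getMessage();
        error.messageTemplate = violation.getMessageTemplate();
        error.path = violation.getPropertyPath().toString();
        error.invalidValue = String.valueOf(violation.getInvalidValue());
        return error;
    }
}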

 

Lines 6-9 define the configuration for the validation framework. Here, we register our own JAXRSParameterNameProvider and a MessageInterpolator. Both are required to massage the information reported back as ValidationErrors, e.g. looking up the right resource bundle containing our validation messages and providing proper information about the invalid attribute.

 

In lines 18-20 we register the inbound validation interceptor. Please note that this is yet another custom implementation (based on the standard one, of course), which comes with a slightly different approach to validating the incoming data. The main reason for the custom implementation was to ensure similar handling of data validation regardless of whether it happens on the API or the service layer.

 

Last words

 

Now, as mentioned several times already, we still need to ensure that the service layer also validates the incoming data. This is done via a corresponding Aspect: DataValidationAspect.
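In case you are curious, such an aspect could look roughly like this. This is a condensed sketch only: the real DataValidationAspect in the sample project converts the violations into ValidationErrors and wraps them into a ServiceException, and the pointcut expression below is an assumption about the package layout:

import java.util.HashSet;
import java.util.Set;

import javax.validation.ConstraintViolation;
import javax.validation.ConstraintViolationException;
import javax.validation.Validator;

import org.aspectj.lang.JoinPoint;
import org.aspectj.lang.annotation.Aspect;
import org.aspectj.lang.annotation.Before;
import org.springframework.beans.factory.annotation.Autowired;

@Aspect
public class SampleDataValidationAspect
{
    @Autowired
    Validator validator;

    // validate every argument passed into a service-layer method
    @Before("execution(* com.sap.hana.cloud.samples.granny.srv.*.*(..))")
    public void validate(JoinPoint joinPoint)
    {
        for (Object arg : joinPoint.getArgs())
        {
            if (arg == null)
            {
                continue;
            }
            Set<ConstraintViolation<Object>> violations = validator.validate(arg);
            if (!violations.isEmpty())
            {
                throw new ConstraintViolationException(
                        new HashSet<ConstraintViolation<?>>(violations));
            }
        }
    }
}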

 

With that, we conclude this chapter by looking at the result:

 

granny_validation_postman.jpg

 

Hope you liked our first post about the "Road to API-ness" and that you tune back in next time when we talk about API documentation. So long...

Course Overview

ENROLL TO THE COURSE HERE (in case you haven't, yet): Course: Next Steps in SAP HANA Cloud Platform

 

You can find a list of the course guides for each week of this course in the corresponding parent project of this blog post.

 

Currently there is not much to add with regards to additional information. But once more questions pop up in the forums, I'll add FAQs here.

 

Please use the SAP HANA Cloud Platform Developer Center or the corresponding openSAP forum for week 3 of this course to post your questions regarding the openSAP course.

 

 

Week 6: Advanced Features

 

The last week of the course will be around advanced features.

 

 

Unit 1 - SAP HANA Cloud Portal for Developers

Based on what you've learned in the last week of the course Introduction to SAP HANA Cloud Platform, this unit explains how developers can leverage the SAP HANA Cloud Portal.

 

Important/additional information

 

 

Unit 2 - Developing Applications for Use in SAP HANA Cloud Portal Sites

This unit explains the development process on the SAP HANA Cloud Portal and how you can develop widgets for SAP HANA Cloud Portal.

 

 

Unit 3 - Design and Customize Cloud Portal Sites

In the last unit around the SAP HANA Cloud Portal you learn how to design and customize a site on SAP HANA Cloud Portal.

 

 

Unit 4 - SAP HANA Cloud Integration

This unit is about SAP HANA Cloud Integration and explains some fundamentals around it.

 

Unit 5 - Wrap-Up & Outlook

The last unit of this course wraps up what you learned in this course and provides you with an outlook on what will come next around the platform and around new openSAP courses for the SAP HANA Cloud Platform.

Overview

ENROLL TO THE COURSE HERE (in case you haven't, yet): Course: Introduction to SAP HANA Cloud Platform (repeat)

 

You can find a list of the course guides for each week of this course in the corresponding parent project of this blog post.

 

Course Guide Week 6 - Advanced Features

 

Hi everyone,

we're coming to the end of the course Introduction to SAP HANA Cloud Platform.

 

This last week will provide you with some insights into additional features like the SAP HANA Cloud Portal, Gateway-As-A-Service and SAP Mobile Platform - cloud version.

 

Find below some additional information related to this week of our course.

 

Table of Contents

 

 

Unit 1 - SAP HANA Cloud Portal

Additional Information

Promo video of SAP HANA Cloud Portal

 

 

 

Unit 2 - Gateway As A Service

Common issues

You can't access Gateway as a Service

In case your account was created at a time when Gateway-As-A-Service was not yet available on the trial landscape, you are not able to use Gateway-As-A-Service. The only way to fix this is to create a new account on the trial landscape.

 

You get a "redirected to http://localhost:8080/saml2/localidp/sso"

Most probably you didn't reset the trust settings from the units in week 5.

To fix it, go to your cockpit and click Trust -> Local Service Provider -> Default

 

 

 

 

Unit 3 - SAP Mobile Platform - enterprise edition - Cloud version

 

 

Unit 4 - Wrap-Up And Outlook

Hope you have already heard about our new web-based SAP River Rapid Development Environment (RDE) and have had a chance to try it out. RDE is really much more than any typical Integrated Development Environment (IDE) – it is built with a much larger vision, which is to provide an extensible framework capable of hosting any number of independent tools to achieve seamless "end-to-end development".

 

Many articles have already been written to help a developer set up the environment, create SAPUI5 applications from existing templates, consume a service, etc. So, to maintain the spirit of end-to-end development, in this article we shall focus on the "service provisioning" aspects – i.e., the steps to actually create the service endpoint which is then consumed by the application.

 

SAP River RDE is still in beta, but we have already taken the initial steps to embed the service provisioning experience into the overall development lifecycle. This is motivated by the fact that a productive service is a prerequisite for a productive SAPUI5 application, so it is better to bring the two development experiences closer together. Of course, the creation of the actual data source itself (e.g., RFC, HANA View…) is mostly done via the respective platform-specific tools (e.g., Service Builder in SAP Gateway, HANA Studio…), but the provisioning of an OData service, from a variety of data sources, across different platforms, in a consistent manner, is important, and that is the focus here.

 

So coming back to the RDE, what if you wanted to create a SAPUI5 application without a service? The service is yet to be built, and let's say you don't want to wait until it has been fully implemented. Good news here – you probably know about the mock service functionality of RDE already. You simply need a metadata XML file, and a couple of clicks later the "Run with Mockdata" feature ensures that, as an application developer, you don't miss the absence of a service that badly anymore. Well, all is good so far, but what if you don't have the metadata either? Simple – SAP already provides a graphical OData modeling tool. You can even manually write a bunch of XML in your favorite text editor to create the metadata.

 

Of course. But we think "simple" is not good enough when things can be "simpler" – which, by the way, is SAP's bold vision for the future of businesses as articulated by our CEO Mr. Bill McDermott as recently as last month at the SAPPHIRE NOW event in Orlando, Florida.

 

How would you like it if you did not have to leave the context, or leave the RDE, to create metadata? With automatic code completion? And with intellisense? Plus schema-based validation on-the-fly? And SAP annotations support? Sounds good? Okay, here we go… but just to be on the same page – the context of "simplicity" referred to above is about enabling an end-to-end development experience. Otherwise the idea of "simple" is always subjective and different for different developers.

 

Open SAP River RDE and notice the option to create a new "EDMX File" under "File -> New" – that's the starting point for creating our Entity Data Model XML. Select a project or a folder where this file needs to reside and follow the screen grab below.

 

1.png

 

You are now presented with a dialog to type in the desired file name and select the OData version. At present we support standard OData Version 2.0, with or without SAP-specific annotations.

 

2.png

 

Once you are done, a (not-so-)empty file is created with skeleton code, so the developer can focus on the actual scenario-specific metadata definition and leave the basic housekeeping to us.

 

At this point you cannot help but notice the two red squares – these errors are there to remind us to enter the mandatory parameters in the skeleton code which cannot be auto-generated. In this case these parameters are the "Namespace" and "Name" fields, and as soon as we give them appropriate values we are good again.

 

3.png

 

Next, we would like to create an entity set. All I need to do at this point is invoke the intellisense feature by pressing "Ctrl + Space", and I am presented with the different possibilities as a drop-down list. Much less error-prone! Let us select "EntityType" and move on.

 

4.png

 

You will notice that all the necessary code around the tag "EntityType" is auto-generated. Again, a few errors are silently demanding your attention – they are mostly the user-defined values for mandatory properties.

 

5.png

 

Let us quickly define these missing elements – all the "Name" fields are obviously arbitrary user-defined values, but you will particularly notice the "Type" field. If you are someone like me, chances are that you will not remember the list of primitive data types under sub-heading 6 of Section 1 "Overview" of the OData Version 2.0 specification. Well, before we really try to "Google" how exactly they represent the "date" type in OData, let us give our intellisense a chance. Here are the results –

 

6.png

 

Now we can really convince ourselves that it is a context-sensitive feature. Anyway, I hope you get the drift – this is a fairly advanced OData model editor which you can trust to quickly get your models crafted in an assisted manner. Once you are done, simply export the model using "File -> Export Project or Folder" and a zip file is dropped from the cloud to your desktop. In this beta version RDE only supports exporting an entire project or folder, so please make sure you select the appropriate folder before attempting to export.

 

As you can imagine, this is just the beginning of our journey. As mentioned earlier in this blog, our larger motivations for doing this are a) simplicity and b) enabling end-to-end development. So as we work towards embedding this experience into the larger development lifecycle and further enriching the product, we invite you to try this out and let us know your feedback. For more in-depth technical information, feel free to refer to the product documentation of this tool.

 

If you have questions, suggestions, feedback etc., feel free to add them as a comment here and we will quickly get back to you. If you prefer to discuss things individually, an email is equally good.

 

Signing off for now~

 

CC: UI Development Toolkit for HTML5 Developer Center

Been meaning to do this post for ages, and with a few free seconds (and because it seems I've almost forgotten how to relax) I thought I'd throw something together.

 

A few months ago I was given the challenge of facilitating an open discussion on Fiori, UI5 and the future of mobile in the SAP HR enterprise at the Australian "Mastering SAP HR, Payroll and SuccessFactors" conference. It certainly was a challenge, and I'm not sure one I'd put my hand up for again, but as Robbo says, you only learn by trying.

 

The Plan

To make it a bit more fun, I thought I'd build a mobile UI5 app to go with the session. To make it even more fun, I gave myself 16 hours the week before the session started to do the work. Oh and Cloud. Because Cloud.

plan.jpg

 

The "plan" was to have a web site (an HTML "app") that connected the device to the cloud (SAP HANA Cloud Platform) via simple RESTful interface to send results of votes of all the attendees to a chart which would update in realtime as people voted. The chart would update using WebSockets so that any info sent to HCP by the apps would automatically get updated. Oh, and to make it fun, in order to retrieve the questions, the users would need to shake their mobile device.

 

16 hours, I was mad!

 

OpenUI5 vs SAPUI5.

One of the biggest differences between SAPUI5 and OpenUI5 (other than Open tending to be a release ahead) is the lack of charting libs in OpenUI5. Since this wasn't going to be an SAP product using any SAP tools (other than the HCP), I thought I'd better use OpenUI5. So that left me hunting (for approximately a minute) for an alternative way to draw graphs. After typing "open source javascript chart websockets" into Google, the first result was a paper from a scientific journal; the next two had to do with D3.js. So I had a look at that.

 

D3.js

Having played a bit with D3 now, I'm so impressed with what it can do, but even more impressed with the amount of documentation and examples out there. And for me this was the clincher. The more examples I could copy, the less work I needed to do, and those 16 hours were looking mighty short. Plus it seemed that people had got D3 and WebSockets to play nicely together previously, so surely I could do the same.

 

HCP for persistence and somewhere to run a WebSocket endpoint

By now HCP is my development platform of choice; I'm really getting the hang of JPA (especially with Spring) and it's just easy. And - because cloud.

 

WebSockets

Hmm, I was sure I'd seen a blog on SCN... oh yes - http://scn.sap.com/community/developer-center/cloud-platform/blog/2013/12/19/websocket-on-sap-hana-cloud-platform there it was. Seems simple enough... (oh how foolish I am). The big thing to note is the JSR 356 definition and support. This means, in simple terms, that a WebSocket endpoint can be defined on the SAP HANA Cloud Platform just by using some very simple annotations on a class. Annotations are awesome, XML sucks - I've learnt this through my learnings with Spring. So I was very happy to continue with WebSockets.
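For reference, the smallest possible JSR 356 endpoint really is just an annotated class (a toy example, not part of the quiz app):

import javax.websocket.OnMessage;
import javax.websocket.server.ServerEndpoint;

@ServerEndpoint("/echo")
public class EchoEndpoint {

    // whatever text a client sends is simply echoed back to it
    @OnMessage
    public String onMessage(String message) {
        return message;
    }
}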

 

Shaking detection

About a month before, I'd seen a cool feature that John Astill had demo'd on some of the SAP internal tools, where shaking the device entered a "feedback mode" for sending error reports and other feedback about the app. He had told me that it was just a simple bit of code. So I went looking. I ended up finding a lib called shake.js by a chap called Alex Gibson who lives in the UK: http://alxgbsn.co.uk/ Very nice of him to share, and especially to make it clear what license it's under. I'm not sure my usage of it is 100% right, as I seem to occasionally have to restart my phone to make it pick up new shake events (I think there are a limited number of listeners available for orientation events in the browser, and sometimes, especially when debugging, I don't clean up the ones I have used. Bad me.)

 

Putting it all together

So I had all the bits - what now? Well it was time to start coding.

All the code is available on GitHub - https://github.com/wombling/mobilequiz

I'll go through some of the more interesting bits (in my view) and also the results:

 

Playing with WebSockets

 

I'll include the entire code for the WebSockets class, as I found very few full examples of how to build such a service. I was disappointed that I had to use a static method to send the updates out, but WebSocket support only arrived with Spring 4 and I haven't had much experience with that yet. Not to mention it doesn't look nearly as simple as the JSR 356 standard I've used below. So integration with the Spring dependency injection autowiring/services is something that will have to wait. NB the question service does have an interface and uses @Autowired when referenced, so I didn't completely give up on the idea. I know I should have implemented an interface and had tests for this too, and perhaps handled the exceptions, but well, 16 hours dudes!

 

package com.wombling.mobilequiz.admin;

import java.io.IOException;
import java.util.Collections;
import java.util.HashSet;
import java.util.Set;

import javax.websocket.OnClose;
import javax.websocket.OnMessage;
import javax.websocket.OnOpen;
import javax.websocket.Session;
import javax.websocket.server.ServerEndpoint;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import com.google.gson.Gson;
import com.google.gson.GsonBuilder;
import com.wombling.mobilequiz.api.ApiValues;
import com.wombling.mobilequiz.pojo.QuestionList;
import com.wombling.mobilequiz.user.QuestionService;

@ServerEndpoint(ApiValues.QUESTIONS_WEBSOCKET)
public class QuestionsWebSocket {

    Logger logger = LoggerFactory.getLogger(QuestionsWebSocket.class);

    // all currently connected sessions, shared across endpoint instances
    private static Set<Session> clients = Collections
            .synchronizedSet(new HashSet<Session>());

    // serialize the current question list to JSON and push it to one session
    private static void sendCurrentValues(Session session, QuestionService qs)
            throws IOException {
        GsonBuilder builder = new GsonBuilder();
        Gson gson = builder.create();
        QuestionList questionList = qs.getAllQuestions("");
        String questionsJSON = gson.toJson(questionList);
        session.getBasicRemote().sendText(questionsJSON);
    }

    // broadcast the current state to every connected client
    public static void sendUpdate(QuestionService qs) {
        for (Session client : clients) {
            try {
                sendCurrentValues(client, qs);
            } catch (IOException e) {
                // ignore for now
            }
        }
    }

    @OnClose
    public void onClose(Session session) {
        // Remove session from the connected sessions set
        clients.remove(session);
    }

    @OnOpen
    public void onOpen(Session session) {
        // Add session to the connected sessions set
        clients.add(session);
        try {
            QuestionService qs = WebSocketSupportBean.getInstance().getQs();
            if (qs != null) {
                sendCurrentValues(session, qs);
            }
        } catch (IOException e) {
            // ignore for now
        }
    }

    @OnMessage
    public void processGreeting(String message, Session session) {
        logger.debug(message);
    }
}

Worth noting the use of Google Gson, which is the most awesome lib for converting POJOs to JSON and JSON to POJOs. It makes my life so much easier and is highly recommended. Also, who'd want to send data in any other format? (Unless of course you're feeling particularly enterprisey, in which case there's a cure for that and it's called Apache Olingo.)
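If you haven't used it before: a Gson round-trip really is a one-liner in each direction. The Question pojo here is just a stand-in for illustration, not the one from the quiz app:

import com.google.gson.Gson;
import com.google.gson.GsonBuilder;

public class GsonDemo {

    // hypothetical pojo, purely for illustration
    static class Question {
        String text;
    }

    public static void main(String[] args) {
        Gson gson = new GsonBuilder().create();

        Question q = new Question();
        q.text = "Do you like WebSockets?";

        String json = gson.toJson(q); // {"text":"Do you like WebSockets?"}
        Question back = gson.fromJson(json, Question.class);
        System.out.println(json + " / " + back.text);
    }
}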

 

 

With the broadcasting of WebSocket messages underway and tested (I also used Dark WebSocket Terminal to test), there needed to be something to subscribe and react to my messages.

This was actually relatively easy - the MV* pattern of UI5 lends itself nicely to just updating the model from anything (in this case a WebSockets update) and letting the framework take care of the rest.

Here's a code snippet taken from the onInit method of my admin view.

var thisView = this;

// build a ws:// or wss:// URL matching the protocol the page was loaded with
function url(s) {
    var l = window.location;
    return ((l.protocol === "https:") ? "wss://" : "ws://") + l.hostname
            + (((l.port != 80) && (l.port != 443)) ? ":" + l.port : "")
            + "/mobilequiz/" + s;
}

var socket = new WebSocket(url("questionWebSocket"));
socket.onopen = function() {
    console.log('WebSocket connection is established');
};
socket.onmessage = function(messageEvent) {
    // update the model; the UI5 bindings take care of re-rendering
    thisView._questionData = JSON.parse(messageEvent.data);
    thisView.fnLoadQuestionList(thisView._questionData);
    thisView.fnRenderGraph();
};

 

Being able to change the protocol and port depending on whether I was running locally over HTTP and ws, rather than HTTPS and wss when running on the HCP, was an important consideration. It is actually trivially simple code - in fact, in many ways easier than making an AJAX call.

 

Shake that phone!

The other code snippet I'd like to discuss is the logic to capture a shake.

onInit : function() {
    var config = {
        "showLoadingThingy" : true,
        "shakeSupported" : false,
        "shakeNotSupported" : false,
        "showQuestion" : false
    };
    var configModel = new sap.ui.model.json.JSONModel(config);
    this.getView().setModel(configModel, "cfg");
    this.getView().bindElement("cfg>/");

    var _e = null; // last deviceorientation event seen
    var _i = null; // polling interval handle
    var _c = 0;    // poll counter

    var updateOrientation = function(e) {
        _e = e;
        window.removeEventListener("deviceorientation", updateOrientation, false);
    };
    var thisView = this;
    window.addEventListener("deviceorientation", updateOrientation, false);

    // poll up to 10 times (every 200ms) for real orientation data
    _i = window.setInterval(function() {
        if (_e !== null && _e.alpha !== null) {
            // got real data: shake detection is supported
            clearInterval(_i);
            thisView.setConfig(true);
        } else {
            _c++;
            if (_c === 10) {
                // give up: no usable orientation events
                clearInterval(_i);
                thisView.setConfig(false);
            }
        }
    }, 200);
},
setConfig : function(shakeAllowed) {
    var thisView = this;
    if (shakeAllowed) {
        // shake.js raises a custom "shake" event on window
        window.addEventListener("shake", function shakeEventOccured() {
            thisView.fnGetNextQuestion();
        }, false);
    } else {
        sap.m.MessageToast.show("Your browser does not support shake detection");
    }
    var configModel = this.getView().getModel("cfg");
    configModel.setProperty("/shakeSupported", shakeAllowed);
    configModel.setProperty("/shakeNotSupported", !shakeAllowed);
    configModel.setProperty("/showLoadingThingy", false);
    this.getView().byId("busyInd").destroy();
},

Here I've had to deal with phones/browsers that don't actually support shake detection, but do implement the deviceorientation event subscription (honestly - what were they thinking?! Blinking iPad rubbish). I poll 10 times for a value in the orientation and check to see if the data is changing. If it is, hey hey!, we have shake support. If not, either you ought to go into professional poker, you can hold that phone so steady, or you have it on the desk, in which case shaking is probably not a good idea. I then update my config model with the results. This means I can then decide whether to show you a button to press for new questions, or make you shake it like a Polaroid picture. If shake is allowed, then I call the function to get the next question.

 

Putting buttons in a toolbar on the bottom of the screen is a Fiori induced anti-pattern and I don't like it.

<rant>

Seriously! Look at any modern UI, where are the buttons? At the TOP of the screen or right next to the thing you are working on! Check out the browser you are using right now, heck even MS office! Yes, there are items in the toolbar at the bottom, but user actions are placed where users can see them, not at the bottom of the screen in an unresponsive toolbar. New and better UI/UX is supposed to be about making things better for users. Consistent UI is fine, but consistently unintuitive UI does not make it intuitive because we get used to it. That just makes it obviously a SAP application.

</rant>

 

So in my UI for the app, I kept things simple, because simple is best!

phone1.png phone2.png

and even in my admin interface:

admin.png

Drop shadows make everything look sexy, so I used them to highlight the questions. I probably should have rounded off the corners too, then it would have been extra cool. A little bit of CSS on top of UI5 makes all the difference, I'm finding. Whilst the number on the right-hand side of the question probably means nothing to you, when you can see it counting down, it becomes pretty obvious that it's a countdown.

 

Simple! And then click on the graph icon and get some of that websockets real time data vibe going:

admin2.png

 

The number of yes/no votes updates in real time and also updates the graph. Pretty cool, especially when you have a few people voting.

I did some fun stuff with cookies to ensure that people don't vote multiple times (unless they decide to clear their cookies between votes, and in that case, they deserve it for being clever clogs). The system keeps track of which cookie IDs voted for what... (very Big Brother - ha, and you thought this was anonymous; haven't you learnt ANYTHING in the last few years about the internet?!)
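The cookie bookkeeping itself is nothing fancy; the gist of it is something like this (a sketch with made-up names - the actual code is in the GitHub repo):

import java.util.UUID;

import javax.servlet.http.Cookie;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class VoterIdUtil {

    static final String COOKIE_NAME = "quizVoterId";

    // return the browser's existing voter id, or mint and set a new one
    public static String getOrCreateVoterId(HttpServletRequest request,
            HttpServletResponse response) {
        if (request.getCookies() != null) {
            for (Cookie cookie : request.getCookies()) {
                if (COOKIE_NAME.equals(cookie.getName())) {
                    return cookie.getValue();
                }
            }
        }
        String id = UUID.randomUUID().toString();
        Cookie cookie = new Cookie(COOKIE_NAME, id);
        cookie.setMaxAge(60 * 60 * 24); // keep it for a day
        response.addCookie(cookie);
        return id;
    }
}

Votes are then recorded against that id, so a second vote from the same browser can be rejected.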

Though to be fair, it's going to take me a bit of work to figure out what this means: (not as if I stored much in the way of identifying data.)

data.png

 

I was having so much fun building this app that I even got my family in on the act, and my daughter decided that she would help me by making a how-to-use-the-app video.

After the good, the bad stuff

Not everything was so awesome. OpenUI5 is currently only hosted out of Walldorf and takes a blinking age to download onto an Australian mobile phone. SAP has limited desire/resources to push the use of a CDN to make this faster. Even SAPUI5 does not benefit from a CDN. The more community _ahem_ encouragement they get, the more likely this is to be fixed. Please do throw your comments about this on the Stack Overflow post about it. Or even here - the more volume this problem gets, the faster a solution will come.

 

I tried hosting the OpenUI5 code on my own website, which is based in Australia, and it certainly improved things. But then I was having problems because of the mismatch of protocols - with my website being HTTP but the AJAX comms to the HCP being via HTTPS (using CORS). It shouldn't have caused an issue, but when we tried it with lots of people it did. I'm not sure why.

 

Still, even with an AU website hosting the UI5 libs, it still took too long to download onto a phone. I'll have to check out customised versions of UI5 that only contain the bits I need - I hear that speeds things up. I'll have to try it out. I'd also like to try hosting/using the AU HCP site as an experiment and see how that works. (Unfortunately the AU HCP data centre wasn't up and running at the point I wanted to use my app!)

 

Summary

In the end I probably did spend a little more than 16 hours, probably more like 32 hours in total, building the application, but most of that was spent trying out and learning new stuff - like D3.js and WebSockets - so the late nights were worth it.

 

Please have a look at my GitHub repository, fork it if you like or just borrow bits of code as you like.

 

I'd love to hear your comments. I think this combo of mobile, WebSockets and cloud is just about to take off - real real-time dashboards based on what people are doing right now. Combine this with HANA and I guess we could also get real real-time analytics of that data. Exciting new world!

Course Overview

ENROLL TO THE COURSE HERE (in case you haven't, yet): Course: Next Steps in SAP HANA Cloud Platform

 

You can find a list of the course guides for each week of this course in the corresponding parent project of this blog post.

 

Currently there is not much to add with regards to additional information. But once more questions pop up in the forums, I'll add FAQs here.

 

Please use the SAP HANA Cloud Platform Developer Center or the corresponding openSAP forum for week 3 of this course to post your questions regarding the openSAP course.

 

 

Week 5: Securing Web APIs

 

This course week is all about securing web APIs on the SAP HANA Cloud Platform.

 

 

Unit 1 - Protecting Web APIs

In this unit you learn what Web APIs are, when to use SAML 2.0 and OAuth, and what the benefits of OAuth are.

 

Important/additional information

 

Unit 2 - OAuth 2.0 Fundamentals

This unit explains the fundamentals around OAuth 2.0.

 

 

Unit 3 - Protecting the Cloud Application

In the third unit of this week you learn how to protect APIs programmatically and how to configure the OAuth filter.

 

Important/additional information

 

 

Unit 4 - OAuth Configuration

This unit shows you how to register OAuth clients and how to configure scopes for your cloud application.

 

Important/additional information

In this unit you might notice that the video from minute 2:48 till 3:12 shows how I enter a wrong URL. It should be http://localhost:8000/oauthcallback, but in the video I enter http://localhost:8000/ouathcallback. Please enter the correct link http://localhost:8000/oauthcallback .

 

Unit 5 - Working with Multiple Identity Providers

Finally, in unit 5 we develop an OAuth client. You learn how to integrate an OAuth client with the SAP HANA Cloud Platform OAuth Authorization Server and how to implement a callback handler for the authorization code flow in a desktop client.

Overview

ENROLL TO THE COURSE HERE (in case you haven't, yet): Course: Introduction to SAP HANA Cloud Platform (repeat)

 

You can find a list of the course guides for each week of this course in the corresponding parent project of this blog post.

 

Course Guide Week 5 - Connectivity

 

Hi everyone,

we are getting closer to the end of this course, but week 5 is about the exciting topic of the Connectivity Service.

 

We'll look into the Connectivity Service and how to use destinations and the Cloud Connector.

 

Compared to the initial course Introduction to SAP HANA Cloud Platform that was provided to you at the end of 2013, you'll notice that this course week has been completely re-recorded due to some major improvements around the Connectivity Service.

 

 

Table of Contents

 

Unit 1 - Introduction To The Connectivity Service

 

 

Unit 2 - HelloWorld Connectivity Application

 

 

Unit 3 - Setting Up SAP HANA Cloud Connector

Common issues

Your OS is not supported

Please read the documentation of the SAP HANA Cloud Connector at SAP Development Tools for Eclipse, because SCC is not available for all operating systems.

 

 

Unit 4 - Configuring And Using Destinations

 

 

Unit 5 - Other Scenarios

 

 

Cloud Extension Scenario

 

Integration with SuccessFactors Applications

 

 

 

Course Overview

ENROLL TO THE COURSE HERE (in case you haven't, yet): Course: Next Steps in SAP HANA Cloud Platform

 

You can find a list of the course guides for each week of this course in the corresponding parent project of this blog post.

 

Currently there is not much to add with regards to additional information. But once more questions pop up in the forums, I'll add FAQs here.

 

Please use the SAP HANA Cloud Platform Developer Center or the corresponding openSAP forum for week 3 of this course to post your questions regarding the openSAP course.

 

 

Week 4: Advanced Identity Management

 

This course week is all about advanced identity management and extends the know-how around security management and security you've already built up during the course Introduction to SAP HANA Cloud Platform.

 

To get deeper into the topics you can look into some additional material provided to you by Martin Raepple via SCN:

 

 

Unit 1 - Working with User Profile Attributes

This unit is about the different classes of user account information, how you configure user attributes with the local IdP and in the Cloud Cockpit, and how one can access the user attributes in Java-based apps.

 

Important/additional information

 

Unit 2 - Group Management

In unit 2 of this week you learn how to use groups in the SAP HANA Cloud Platform and how to assign users to groups.

 

 

Unit 3 - Federated Authorization with Groups

In this unit you learn how to define mapping rules to groups.

 

Important/additional information

 

Unit 4 - Custom Roles

Learning how to use and define custom roles at runtime.

 

Unit 5 - Working with Multiple Identity Providers

In this week's last unit you learn how to set up multiple identity providers per account in the SAP HANA Cloud Platform Cockpit.
