
SMP Developer Center


The latest release, SMP 3.0 SP05, supports the OData Query operation for a service provisioned from a REST service. In this blog post I want to show how you can develop and deploy an OData service on SMP3 that relies on a REST service. The Integration Gateway component in SMP3 converts the data coming from the REST service into OData on the fly.

Tools Used: Eclipse Kepler


1. SAP Mobile Platform Tools plugins added to Eclipse Kepler (make sure the version is 1.1.3; check this guide for more info)

2. SAP Mobile Platform 3.0 SP05 Runtime

I am going to use a publicly available REST service (http://www.thomas-bayer.com/sqlrest/CUSTOMER/1) and convert its data into OData.



1. Create a new project in Eclipse Kepler: File>New>SAP Mobile Platform OData Implementation Project

    • Make sure you select Target Runtime server as SP05


    • Give any model name and Finish




2. Create an Entity and add properties as below:




3. Right click .odatasrv > Select Data source as REST Service

    • Request URL : /sqlrest/CUSTOMER/1
    • Response Format: XML (You can choose other option JSON as well)




4. Expand .odatasrv > Right click Query > Define Custom code




5. Replace the generated code with the code below (also attached as a file).



function getUriInfo(message) {
  var uriInfo = message.getHeaders().get(ODataExchangeHeaderProperty.UriInfo.toString());
  return uriInfo;
}

/*
  Function processRequestData will be called before the request data is
  handed over to the REST service. User can manipulate the request data here.
*/
function processRequestData(message) {
//    System.setProperty("http.proxyHost","proxy");
//    System.setProperty("http.proxyPort","8080");
//    System.setProperty("http.proxySet","true");
    var context = message.getHeaders().get("odatacontext");
    var companyId = context.getRequestHeaders().get("companyId").get(0);
    var username = context.getRequestHeaders().get("username").get(0);
    var password = context.getRequestHeaders().get("password").get(0);
    var parentMap = new LinkedHashMap();
    var childMap = new LinkedHashMap();
    childMap.put("key:companyId", companyId);
    childMap.put("key:username", username);
    childMap.put("key:password", password);
    parentMap.put("key:credential", childMap);
    var uriInfo = message.getHeaders().get(ODataExchangeHeaderProperty.UriInfo.toString());
    var entitysetUriInfo = uriInfo;
    var relativeUri = message.getHeaders().get("RelativeUri");
    var url = new StringBuffer(relativeUri);
    if (entitysetUriInfo.getFilter() != null) {
        var serviceUrl = url.toString();
        serviceUrl = serviceUrl.replaceAll(" ", "%20"); // URL-encode spaces in the filter expression
        message.setHeader("RelativeUri", serviceUrl);
        return message;
    }
    return message;
}

/*
  Function processResponseData will be called after the response data is received
  from the REST service. User can manipulate the response data here.
*/
function processResponseData(message) {
    var payload = message.getBody();
    var inputStringBuilder = new StringBuilder();
    var bufferedReader = new BufferedReader(new InputStreamReader(payload, "UTF-8"));
    var line = bufferedReader.readLine();
    while (line != null) {
        inputStringBuilder.append(line).append("\n");
        line = bufferedReader.readLine();
    }
    payload = inputStringBuilder.toString();
    log.logErrors(LogMessage.TechnicalError, "\n +++++++++++++++++++++++++++++++++++ TRACE OUTPUT +++++++++++++++++++++++++++++++++++\n");
    log.logErrors(LogMessage.TechnicalError, "\n Original Payload is:\n" + payload);
    // Remove newline chars
    payload = payload.replace("\n", "");
    // Convert the REST XML response into the expected format <EntitySet><Entity>...</Entity></EntitySet>
    payload = payload.replaceFirst("<\\?xml .*?\\?>", ""); // remove the line <?xml version="1.0"?>
    payload = "<CUSTOMERSet>" + payload + "</CUSTOMERSet>";
    log.logErrors(LogMessage.TechnicalError, "\n Response Payload is:\n" + payload);
    log.logErrors(LogMessage.TechnicalError, "\n +++++++++++++++++++++++++++++++ END OF TRACE OUTPUT ++++++++++++++++++++++++++++++++\n");
    message.setBody(payload); // write the transformed payload back to the message body
    message.setHeader("ODataMethod", ODataMethod.GET_ENTRY);
    return message;
}
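The payload rewrite in processResponseData boils down to three string operations. Here is a minimal, self-contained Java sketch of the same transformation (the sample payload is invented for illustration):

```java
public class PayloadRewrite {
    public static void main(String[] args) {
        String payload = "<?xml version=\"1.0\"?><CUSTOMER><ID>1</ID></CUSTOMER>";
        payload = payload.replace("\n", "");                    // remove newline chars
        payload = payload.replaceFirst("<\\?xml .*?\\?>", "");  // drop the XML declaration
        payload = "<CUSTOMERSet>" + payload + "</CUSTOMERSet>"; // wrap in an entity-set element
        System.out.println(payload);
        // → <CUSTOMERSet><CUSTOMER><ID>1</ID></CUSTOMER></CUSTOMERSet>
    }
}
```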

6. Right click project > Generate and Deploy Integration Content




Creating a destination in Gateway cockpit:


7. Open Gateway cockpit (https://smpserver:adminport/gateway/cockpit) > Navigate to "Destinations" tab


    Destination URL : http://www.thomas-bayer.com



8. Select the newly created destination and check if you are able to ping it (Test connection).





9. Navigate to SERVICES > select the deployed OData service (ThomasREST) > Add destination




10. Open service document







For more info, check the official guide: Enabling REST Services as OData Services


Note: To support the OData $filter option (e.g. $filter=ID eq 2) for generic filtering (similar to a read operation), this project can be extended further by modifying the processRequestData method. I'll keep you posted.
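A possible direction for that change, purely illustrative (the helper name and the rewrite rule below are my own assumptions, not part of the generated project): since the sqlrest backend addresses a single record by path (e.g. /CUSTOMER/2), a filter such as $filter=ID eq 2 could be rewritten into the RelativeUri before the call.

```java
public class FilterRewrite {
    // Hypothetical helper: map an OData filter "ID eq <n>" onto the
    // path-style lookup the sqlrest service expects (/CUSTOMER/<n>).
    static String rewrite(String relativeUri, String filter) {
        java.util.regex.Matcher m =
                java.util.regex.Pattern.compile("ID eq (\\d+)").matcher(filter);
        if (m.find()) {
            return relativeUri + "/" + m.group(1);
        }
        return relativeUri; // no recognized filter: leave the URI untouched
    }

    public static void main(String[] args) {
        System.out.println(rewrite("/sqlrest/CUSTOMER", "ID eq 2"));
        // → /sqlrest/CUSTOMER/2
    }
}
```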

Thanks Mustafa Saglam and Bjoern Woppmann for your continuous support.


I’ve been working on a project to bring the major elements of SAP mobility together in a single example. This example will explore the details of creating a delta-enabled OData web service, something that I mentioned in my last blog post.


At the end of the project we'll have this:


  • A web service that supports OData Delta Token processing
  • A Hybrid Kapsel application created using Web IDE that uses that service
  • Support for disconnected operation in the mobile app using the SMP SDK’s Offline capabilities
  • Support for adding and editing records in the mobile app
  • Both the web service and the mobile application running atop the HANA Cloud Platform


The end result will nicely tie UI5, Web IDE, and HCP Mobile Services together into an offline-ready hybrid mobile application.


Walking through the steps will require a bit of effort.  I plan to break it up into several articles, each covering one segment.  My goal is to have each article small enough to complete in an hour or less.


So, where to start?


Part 1 - A Delta-Token-Enabled OData Web Service


HANA Cloud Platform Trial Account – with Mobile Services

If you don’t already have a HANA Cloud Platform trial account, you can register for an account using this link.

HANA Cloud Platform Mobile Services, abbreviated HCPms, is now available as part of your HANA trial account, too.  You can review the instructions in Martin Grasshoff’s recent blog post for information on how to add HCPms to your HCP trial account.


You will need to sign up for both to complete this series of articles.


Build an OData Web Service

I’ve been working extensively in the past months with Apache Olingo, an open source OData web services framework.  I have come to like Olingo quite a lot.  I find it well organized and reliable, so I’m going to use it in this example to generate our OData web service.


The Data Model

The techniques that I’ll apply in this article can be used with almost any JDBC-compatible backing data storage.  For this project, though, I have chosen to start with a variant of SAP’s Enterprise Sales and Procurement Model (ESPM).  ESPM is a business data model widely used in examples in many current SAP programming systems.


You can find a number of articles on SCN describing different variants of ESPM.  Some use HANA as a back-end, some use other databases.  This particular variant is one that I created using the Java Persistence API (JPA) to define the data model.  Olingo provides several ways to define an OData web service -- it has a particularly easy set of APIs for doing so if your starting point is JPA, which is why I'm using it here.


The diagram below depicts the entity relationship diagram for this version of the ESPM data model.







Configure Your Java Development Environment

The complete source code to our web service is available on Github.  Here are a few simple steps that you can follow to import, build, and run the service on your local machine.  We'll deploy it to your local machine first.  Later, we'll deploy a copy of the same service to a HANA Cloud Platform Java application server.

  1. If you have not already, install the Eclipse JEE Edition of Eclipse Kepler or Luna and the HCP Development plugins for it.  The installation guide I recently used can be found here – that document contains more information than we really care about for our purposes – limit the instructions you follow to steps 1.1 through 1.6 [NOTE: these steps install JDK, Eclipse, & Maven on your local machine; you can safely stop at completion of the step in 1.6 labelled "Check Maven User Settings"].
  2. Import the ESPM Olingo git repository as a Maven project

    (2.a) Start Eclipse
    (2.b) Import the project from Github -
    Click "File>Import", Select "Maven>Check out Maven projects from SCM", Click "Next"

    (2.c) Import (continued) -
    Select "git" as the SCM system of choice; enter "https://github.com/SAP/sap_mobile_platform_espm_olingo_services.git" as the SCM URL; click "Finish";  The import will take a moment.  At the end of the import, your Project Explorer should contain an ESPM_V1 project:


  3. Build the ESPM web services application
    -- Right-click on the ESPM_V1 project in the Project Explorer window, select "Run As>Maven Build...". You should see a dialog that looks like this one below.  Enter "clean install" in the Goals field.  Press "Run".


    Under normal conditions the Eclipse Console window will indicate the build succeeded with a message similar to this:


  4. Download a copy of the Tomcat 7 server binary distribution .ZIP file -- I used 7.0.57, but almost any Tomcat 7 version will do.
  5. Create a new Tomcat server instance in your Eclipse workspace by following these instructions from the Eclipse documentation set.
  6. Run the project on your newly created Tomcat server -  Right-click on the ESPM_V1 project and select "Run As...>Run on Server".  You will see something like the dialog shown below.  Select the "tomcat 7.0 server at localhost" and click "Finish".


  7. Verify the application runs correctly by visiting this link in your web browser: http://localhost:8080/ESPM_v1/api/Products


If the application was deployed correctly, you will see an XML representation of about 110 products.  The web service is programmed to install a collection of test data for Products, Stock, Suppliers, and Customers.  Each time the application is restarted, it will drop and reinitialize those tables in the database (you can change the behavior to preserve the state of the database between invocations by changing the eclipselink.ddl-generation property in src/main/resources/WEB-INF/persistence.xml from "drop-and-create-tables" to "create-tables").
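For reference, the property mentioned above sits in the properties section of persistence.xml; changing its value is what switches the behavior (path as stated above):

```xml
<!-- src/main/resources/WEB-INF/persistence.xml -->
<property name="eclipselink.ddl-generation" value="drop-and-create-tables"/>
<!-- change value to "create-tables" to preserve data between restarts -->
```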


If you scroll to the bottom of the Products response, you will see something like this:




As you can see, there is an UpdatedTimestamp field associated with each record.  Because our code supplied a delta token generator, there is a "delta" link URL at the bottom of the response.



OData Delta Tokens

The OData protocol is essentially a superset of a standard RESTful web service.  One of the really cool extensions to REST supplied by OData is Delta Token handling. Using Delta tokens affords more efficient communications between each client and the source web service (OASIS design document reference).


SAP’s Mobile Platform products – both cloud and on-premise versions – will leverage a delta token enabled web service to optimize data synchronization.  I recently wrote an article describing the mechanics of this process -- it might be worth your time to review it (link).  The point here is that we want the most scalable web service possible, so we’ll take the extra steps required to enable delta token processing for most of this service’s OData entity collections.
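Conceptually, a delta request is just "give me everything that changed since the token I last received". A toy sketch of that server-side filtering, with plain longs standing in for timestamps and invented record names:

```java
import java.util.*;
import java.util.stream.*;

public class DeltaDemo {
    public static void main(String[] args) {
        // Each record carries an "updated" timestamp (plain longs for brevity).
        Map<String, Long> updated = new LinkedHashMap<>();
        updated.put("Notebook", 100L);
        updated.put("Monitor", 200L);
        updated.put("Mouse", 300L);

        long deltaToken = 150L; // the token the client received with its last response

        // A delta request returns only the rows changed at or after the token.
        List<String> delta = updated.entrySet().stream()
                .filter(e -> e.getValue() >= deltaToken)
                .map(Map.Entry::getKey)
                .collect(Collectors.toList());
        System.out.println(delta); // → [Monitor, Mouse]
    }
}
```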


Apache Olingo provides a couple of framework-based options for adding delta token support to JPA-based projects.  I used the first of the two options documented in the Olingo on-line documentation.  To reconfigure the project to use that approach, I first added an “updated timestamp” column to each table where I wanted to add delta token support.  I then added or modified the JPA @PrePersist and @PreUpdate methods to automatically update this updatedTimestamp field with the current server time for both of these operations.  Finally, I created the ESPMJPADeltaListener class based on ODataJPATombstoneEntityListener, as described in the Olingo documentation.


Adding the "updated timestamp" column follows a common pattern of leveraging server timestamps to track what changes between requests.  In the implementation of the ESPMJPADeltaListener class, I set up the delta token generator to return a server timestamp.  This timestamp will be returned as part of the request contents.


public class ESPMJPADeltaListener extends ODataJPATombstoneEntityListener {

  static final String DATEFORMAT = "yyyy-MM-dd HH:mm:ss";

  /**
   * Generate a web-service-specific delta token to be passed back to the
   * client with a response.  In this case, we simply generate an ISO-8601-style
   * string reflecting the current UTC date/time.
   * @see org.apache.olingo.odata2.jpa.processor.api.ODataJPATombstoneEntityListener#generateDeltaToken(java.util.List, javax.persistence.Query)
   */
  @Override
  public String generateDeltaToken(List<Object> deltas, Query query) {
      final SimpleDateFormat sdf = new SimpleDateFormat(DATEFORMAT);
      final String utcTime = sdf.format(new Date());
      return utcTime;
  }
}

The getQuery() method is a bit more complex, but the core functionality is found below.  Basically, if the client supplies a delta token as part of the GET request, the function will modify the query so the data fetched is limited to rows that changed after the timestamp supplied in the delta token.  The client will not have a delta token to supply on the initial request -- and some applications may not want to leverage delta tokens at all.  In both of those cases the code responds to the missing delta token by returning a complete set of data results.



String deltaToken = ODataJPATombstoneContext.getDeltaToken();
Query query = null;
if (deltaToken != null) {
     String statement = jpqlStatement.toString();
     String[] statementParts = statement.split(JPQLStatement.KEYWORD.WHERE);
     String deltaCondition = jpqlContext.getJPAEntityAlias() + ".updatedTimestamp >= {ts '" + deltaToken + "'}";
     if (statementParts.length > 1) {
          statement = statementParts[0] + JPQLStatement.DELIMITER.SPACE + JPQLStatement.KEYWORD.WHERE
                    + JPQLStatement.DELIMITER.SPACE + deltaCondition + JPQLStatement.DELIMITER.SPACE
                    + JPQLStatement.Operator.AND + statementParts[1];
     } else {
          statement = statementParts[0] + JPQLStatement.DELIMITER.SPACE + JPQLStatement.KEYWORD.WHERE
                    + JPQLStatement.DELIMITER.SPACE + deltaCondition;
     }
     query = em.createQuery(statement);
} else {
     query = em.createQuery(jpqlStatement.toString());
}
return query;
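The WHERE-clause splicing can be exercised in isolation with plain strings; in the sketch below, literal keywords stand in for the JPQLStatement constants, and the queries are invented examples:

```java
public class DeltaWhere {
    // Append (or merge) the delta condition into a JPQL statement.
    static String addDeltaCondition(String jpql, String alias, String token) {
        String condition = alias + ".updatedTimestamp >= {ts '" + token + "'}";
        String[] parts = jpql.split("WHERE");
        if (parts.length > 1) {
            // Existing WHERE clause: AND the delta condition in front of it.
            return parts[0] + "WHERE " + condition + " AND" + parts[1];
        }
        // No WHERE clause yet: add one.
        return parts[0] + " WHERE " + condition;
    }

    public static void main(String[] args) {
        System.out.println(addDeltaCondition(
                "SELECT p FROM Product p", "p", "2015-01-07 12:00:00"));
        System.out.println(addDeltaCondition(
                "SELECT p FROM Product p WHERE p.price > 10", "p", "2015-01-07 12:00:00"));
    }
}
```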

You can see delta tokens in action by copying the entire contents of the delta token's "href" attribute from step 7 and pasting that into your browser URL box. You will see a different response this time -- one that does not include any records, since nothing has been changed since the original request was issued.


Deploy the Web Service to your HCP Trial Account

For simplicity’s sake, I chose to store my ESPM data in an embedded Java-based SQL engine, Hypersonic. You could do the same thing in HANA, but the extra steps involved to do that would not really be relevant to our example.  To wrap up this exercise, we'll deploy the ESPM web service onto a HANA Cloud Platform app server using your trial account.


Your HANA Trial includes the capability to deploy one small sized Java application at no charge (this is a 'micro' instance in AWS terms).  Unless you have been working with Java in HCP, you probably have not taken advantage of this resource yet.  If it happens that you have, you will need to stop any Java applications you are running to deploy our ESPM application.


  1. Deploy the application using Eclipse -- A deployment to the HANA Cloud Platform Java server is performed in much the same way that you deployed to your local Tomcat instance.  Start by right-clicking on the ESPM_V1 project and select "Run As>Run on Server...".


    Select "Manually Create a new Server" and choose the Server Type as "SAP>SAP HANA Cloud Platform".

    Enter "hanatrial.ondemand.com" as the Landscape host.

    Click "Next".


    Enter "espm" as the Application name (for some reason, HCP insists that this name be all lower case with no symbol characters).

    For Account Name, enter your SAP S/C/I/D-Number followed by "trial" (no spaces or other punctuation).

    For User Name, enter your SAP S/C/I/D-Number.

    Enter your password.

    I found that I had to uncheck "Save Password" to get this to work correctly on OS X.  You may start with it "checked" if you like and then uncheck it if the operation fails (the only consequence I experienced was having to re-enter my password a few extra times).

    Click 'Finish'.

    The deployment will take several minutes to complete.  You can move on to Step 2 to monitor the status of the deployment and to prepare to test the deployed service.

  2. Check the deployment in the HANA Cloud Platform Cockpit -- In your browser, open the HANA Cloud Platform Cockpit.

    Click "Java Applications" on the left hand bar.

    You will see something like the screen below, although most of the Actions (Start, Stop, Switch, and Update) will be disabled (greyed out) until the deployment completes.  You can refresh the page to monitor the deployment status.

    Once all the Action buttons are enabled, we're ready to move on to Step 3.

    Click on "espm" in the Name column to continue.

  3. Test the ESPM web service -- The details page on the "espm" application will include a link to the "Application URL" (below)


    Click the Application URL link.  You will see a details form looking something like this:


    Append "api/Products" to the URL and press "Go" or Enter. 

    The end result should be the same response you saw with your local Tomcat service.



This ends the first section of our four part project.  If all went according to plan, you have successfully deployed your Olingo-based web service into the HANA Cloud Platform.  You also now have your machine configured to expand your exploration of Java applications that are deployable to the HCP landscape.


In the next article in this series, we'll explore how to use HCP's Web IDE and other elements of the SAP Mobile Platform to build a Hybrid mobile application that accesses this service.

The SODataOfflineStore contains two methods for synchronizing the local UltraLite database with an OData backend via MobiLink: flush: and refresh:.


Flush:  submits all pending changes in the local database to the backend

Refresh:  pulls all changes from the backend and commits them in the local database


The two will commonly be used in sequence:  (1) flush the local changes, then (2) pull the remote changes.


Both are implemented as sets of delegate methods:  <SODataOfflineStoreRefreshDelegate, SODataOfflineStoreFlushDelegate>, but it is simple to wrap them into completion blocks, and then chain them in a single method.

Here is an example of how to convert the SODataOfflineStoreRefreshDelegate to a single block-style interface, named refresh:(void(^)(BOOL success))completion.  When the refresh is complete, the completion block will be called, passing success == YES on success, and success == NO on failure.

#pragma mark - OfflineStoreRefresh block wrapper


- (void) refresh:(void(^)(BOOL success))completion
{
    NSString *refreshFinishedNotification = [NSString stringWithFormat:@"%@.%@", kRefreshDelegateFinished, self.description];
    NSString *refreshFailedNotification = [NSString stringWithFormat:@"%@.%@", kRefreshDelegateFailed, self.description];

    [[NSNotificationCenter defaultCenter] addObserverForName:refreshFinishedNotification object:nil queue:nil usingBlock:^(NSNotification *note) {
        completion(YES);  // refresh succeeded
    }];

    [[NSNotificationCenter defaultCenter] addObserverForName:refreshFailedNotification object:nil queue:nil usingBlock:^(NSNotification *note) {
        completion(NO);   // refresh failed
    }];

    [self scheduleRefreshWithDelegate:self];
}

#pragma mark - OfflineStore Refresh Delegate methods


- (void) offlineStoreRefreshSucceeded:(SODataOfflineStore *)store {
    NSString *refreshFinishedNotification = [NSString stringWithFormat:@"%@.%@", kRefreshDelegateFinished, self.description];
    [[NSNotificationCenter defaultCenter] postNotificationName:refreshFinishedNotification object:nil];
}

- (void) offlineStoreRefreshFailed:(SODataOfflineStore *)store error:(NSError *)error {
    NSString *refreshFailedNotification = [NSString stringWithFormat:@"%@.%@", kRefreshDelegateFailed, self.description];
    [[NSNotificationCenter defaultCenter] postNotificationName:refreshFailedNotification object:error];
}

- (void) offlineStoreRefreshFinished:(SODataOfflineStore *)store {
    // no-op: success/failure is signaled by the callbacks above
}

- (void) offlineStoreRefreshStarted:(SODataOfflineStore *)store {
    // no-op
}

You'll probably recognize this pattern from the post Using Blocks with the SAP Mobile SDK 3.0 SP05+ on using blocks with the scheduleRequest: API.  First, subscribe to notifications that will be fired when the delegate callbacks are invoked.  Second, prepare to call your completion block with the correct parameters when the notifications are observed.  Last, call the original method ( i.e.: [self scheduleRefreshWithDelegate:self]), to start the process.


I put this code in my OfflineStore<SODataOfflineStore> implementation.  By following the same pattern for the SODataOfflineStoreFlush delegate methods, we can then chain the flush: and refresh: in one method, named flushAndRefresh(void(^)(BOOL))completion.


#pragma mark - FlushAndRefresh block wrapper


- (void) flushAndRefresh:(void (^)(BOOL))completion
{
    [self flush:^(BOOL success) {
        if (success) {
            [self refresh:^(BOOL success) {
                if (success) {
                    completion(YES);
                } else {
                    completion(NO);  // refresh failed
                }
            }];
        } else {
            completion(NO);  // flush failed; skip the refresh
        }
    }];
}

This flushAndRefresh: method could be added to the interface of your SODataOfflineStore implementation.  I chose instead to put it in my own protocol, "ODataStore", which already contained my openStoreWithCompletion: block that is implemented by both my OnlineStore and OfflineStore classes.
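The flush-then-refresh chaining itself is language-agnostic. As a cross-check, here is the same sequencing sketched in Java with CompletableFuture; flush and refresh below are stand-in stubs, not SDK calls:

```java
import java.util.concurrent.CompletableFuture;

public class FlushAndRefreshSketch {
    // Stand-ins for the SDK calls; each completes with a success flag.
    static CompletableFuture<Boolean> flush()   { return CompletableFuture.completedFuture(true); }
    static CompletableFuture<Boolean> refresh() { return CompletableFuture.completedFuture(true); }

    // Flush local changes first; only if that succeeds, pull remote changes.
    static CompletableFuture<Boolean> flushAndRefresh() {
        return flush().thenCompose(ok ->
                ok ? refresh() : CompletableFuture.completedFuture(false));
    }

    public static void main(String[] args) {
        System.out.println(flushAndRefresh().join()); // → true
    }
}
```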


#import <Foundation/Foundation.h>


@protocol ODataStore <NSObject>

- (void) openStoreWithCompletion:(void(^)(BOOL success))completion;

- (void) flushAndRefresh:(void(^)(BOOL success))completion;

@end

The flushAndRefresh: is only relevant for the SODataOfflineStore; the SODataOnlineStore does not use the local Ultralite database, so all requests are immediately tried over the network.

Now that the flush: and refresh: delegate methods are wrapped into a single completion block, we can connect the method to a UITableViewController's UIRefreshControl!  The following snippet can be added to any UITableViewController, to sync the local database when the end user drags the table view down.


@implementation MyUITableViewController


- (void)viewDidLoad {
    [super viewDidLoad];

    self.refreshControl = [[UIRefreshControl alloc] init];
    [self.refreshControl addTarget:self action:@selector(refreshStore) forControlEvents:UIControlEventValueChanged];
}

#pragma mark UIRefreshControl

- (void)refreshStore
{
    [[[DataController shared] localStore] flushAndRefresh:^(BOOL success) {
        if (success) [self.tableView reloadData];
        [self.refreshControl endRefreshing];
    }];
}

@end

Resources for this project are located here:  SAP/STSOData · GitHub.


For extra credit, customize the look-and-feel of your UIRefreshControl, using some of the examples here:  http://www.appcoda.com/pull-to-refresh-uitableview-empty/, to keep your end user informed of the progress of the flush and refresh.


// happy coding!

User Propagation


Use Cases:




  • When accessing the service document directly from the Gateway Management Cockpit, the security profile name must be the same as the Service Namespace of the service. For example, if the Service Namespace is SAP_SSO2, the security profile name must be SAP_SSO2.
  • For onboarded applications (SAP Mobile Platform applications that have an endpoint using the Integration Gateway service as back-end URL, with the internal option enabled), the security profile name and the Service Namespace of the service need not be the same.


In this blog we will show how the credentials a user has configured in the SMP server can be propagated to Integration Gateway (IGW), which in turn propagates them to the back-end service for authentication.


We will cover Basic authentication and MYSAPSSO2-based user propagation.


This blog will guide the admin/developer through the configuration needed in the SMP Admin cockpit, in IGW, and in the custom script to enable user propagation.



Use Case - 1

Basic Authentication

The admin has to configure an HTTP/HTTPS security profile in the SMP Admin cockpit with the URL of the back-end system from which the user will get the data with Basic authentication, as shown below.

With this configuration in place, SMP will propagate the user credentials as a header of the request object when a call is made to the business-oriented URL.

The request object is retrieved in the script processor, and the Authorization header is added to the HTTP headers used when calling the back-end web service.

NOTE: The credentials should be the authorization details of the back-end system.
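For reference, the propagated Authorization header follows the standard HTTP Basic scheme: the word "Basic" followed by the Base64 encoding of "user:password". A quick Java illustration (the credentials are made up):

```java
import java.util.Base64;

public class BasicAuthHeader {
    public static void main(String[] args) {
        String user = "demo";        // illustrative credentials only
        String password = "secret";
        String header = "Basic " + Base64.getEncoder()
                .encodeToString((user + ":" + password).getBytes());
        System.out.println(header); // → Basic ZGVtbzpzZWNyZXQ=
    }
}
```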

Steps to Configure Basic Authentication Propagation

SMP ADMIN Cockpit Steps

1) Log on to the SMP Admin cockpit (https://localhost:8083/Admin) and go to the Settings > Security Profile tab.

2) Create a security profile with authentication type HTTP/HTTPS Authentication and provide the URL of the back-end system hosting the data/web service. In my case I am using the URL of a SAP ABAP system where my web service is hosted with Basic authentication enabled. Click Save, as shown below.

   The security profile created above can be assigned to the application created in the SMP cockpit.

IGW Steps

1) Create a content bundle from the Design Time Eclipse editor using the web service generated from the back end (the one used to create the security profile in the previous step), and add the code below to either the first or second method of the script processor.

The code fetches the OData context from the exchange headers, retrieves the HTTP request object from the context, reads the Authorization header from the request, and finally adds it to the message header, which is passed to the web service call as an HTTP client header.

function processRequestData(message) {

    var headers = message.getHeaders();
    var context = headers.get("ODataContext");
    var request = context.getParameter("~httpRequestObject");
    var auth = request.getHeader("Authorization");
    // Forward the incoming Authorization header to the back-end call
    message.setHeader("Authorization", auth);

    return message;
}


2) Once the bundle is created with the above code in the custom script, deploy the bundle from Design Time with the Service Namespace set to the security profile name created in the SMP Admin cockpit steps.


3) Now open the service document and fire a call to the entity set via the business-oriented URL, i.e. http://localhost:8080/gateway/odata/SAP_BASIC/<ServiceName>;v=1/<entitySet>



    An authentication challenge will be thrown by the browser, as shown below.


    Enter the user name and password; the credentials should be the same as those of the back-end system. The credentials entered will be propagated from SMP to the OData service and in turn to the back-end system for authorization.


MYSAPSSO2 Authentication

The admin has to configure an HTTP/HTTPS security profile in the SMP Admin cockpit with the URL of the back-end system from which the user will get the data with a MYSAPSSO2 cookie, as shown below.

Note: For MYSAPSSO2 authentication the admin has to add the SMP certificate to the back-end system and, vice versa, the back-end system's certificate to the SMP server for the mutual handshake.

Steps to add the back-end certificate to the SMP server are given at the end of this blog.

With this configuration in place, SMP will propagate the user credentials, i.e. the corresponding cookie, in the request object as an attribute.

The request object is retrieved in the script processor, and the cookie is added to the HTTP headers used when calling the back-end web service.

Steps to Create MYSAPSSO2 Scenario

SMP ADMIN Cockpit Steps

1) Log on to the SMP Admin cockpit (https://localhost:8083/Admin) and go to the Settings > Security Profile tab.

2) Create a security profile with HTTP/HTTPS Authentication and provide the URL of the back-end system hosting the data/web service. In my case I am using the URL of a SAP ABAP system where my web service is hosted with the MYSAPSSO2 cookie enabled. Click Save, as shown below.




IGW Steps

1) Create a content bundle from the Design Time Eclipse editor using the web service generated from the back end (the one used to create the security profile in the previous step), and add the code below to either the first or second method.

The code fetches the OData context from the exchange headers, retrieves the HTTP request object from the context, reads the MYSAPSSO2 attribute from the request, and finally adds it to the message header, which is passed to CXF as an HTTP client header.

function processRequestData(message) {

    var headers = message.getHeaders();
    var context = headers.get("ODataContext");
    var request = context.getParameter("~httpRequestObject");
    var sso2 = request.getAttribute("MYSAPSSO2");
    // Forward the SSO2 cookie to the back-end call
    message.setHeader("MYSAPSSO2", sso2);

    return message;
}


3) Once the bundle is created with the above code in the custom script, deploy the bundle from Design Time with the Service Namespace set to the security profile name created in the previous steps.


   Note: Here the private key alias is the alias name of the back-end system's certificate uploaded to the SMP server's keystore. The private key alias is mandatory for SSO2 cookie retrieval.

4) Now open the service document and fire a call to the entity set via the business-oriented URL, i.e.




    An authentication challenge will be thrown by the browser, as shown below.


    Enter the user name and password; the credentials should be the same as those of the back-end system. The SSO2 cookie generated for the user will then be propagated from SMP to the OData service and in turn to the back-end system for authorization.


Use Case - 2

1) Create an application in the SMP cockpit and assign/configure the IGW service URL we created in Use Case 1 as the back end of the application, as shown below.




2) Create a new security profile with the back-end URL on which the web service is hosted by going to the Authentication tab as shown below, or assign an existing security profile like SAP_BASIC or SAP_SSO2 which we created in Use Case 1, and click Save to save the application.


3) Access the application from a mobile client to test the user propagation.

Steps to Add the Back End Certificate to SMP Server

   1) In the Portecle tool, open File > Open Keystore File and navigate to the location of the smp_keystore.jks file (SMP server > Configurations), then click OK to open the keystore. The password for the keystore is "changeit".


   2) Go to Tools > Import Trusted Certificate, select the back-end system's certificate from your file system, and click Import.


   3) Click the OK and Yes buttons as shown below.


  4) Enter an alias name for the certificate being imported. This alias name is used later when deploying the bundle from Design Time in Eclipse.

5) Save the keystore in Portecle once the certificate is imported; otherwise the imported certificate will not be reflected in the SMP server's configuration.


Last year I had written a few blogs on Cross Platform Development using Xamarin

C# Cross Platform Mobile Application using Xamarin Forms

Using this approach I could develop an SAP Purchasing app and an SAP Order to Cash application:

Order to Cash Mobile Application using Xamarin Forms & C#

The basic idea in these applications was to take SAP transactions, write C# code using Xamarin, and generate native apps for mobile phones. Using this approach we can develop mobile apps for most SAP business processes.


I wanted to extend the functionality of my mobile apps by providing an additional feature that improves the security and auditability of documents: asking the user to put in their signature with each document they modify. This is very helpful in many situations and required in some.


Luckily, Xamarin offers a component called Signature Pad which provides the required functions on the device side. Xamarin provides a lot of documentation on their web site, and there are excellent sample projects on GitHub that you can study to implement it in your Xamarin apps. I have developed a cross-platform mobile application using Xamarin Forms which allows you to store a signature with a document number in SAP; it runs on iPhone and Android.


In my earlier blogs I covered aspects of cross-platform projects using Xamarin Forms; you can also visit the Xamarin web site or GitHub for samples, or YouTube for Xamarin videos. Once you create a project, you have to include the Signature Pad component in the iOS and Android projects. In this app I give the user the ability to store a signature for a Notification (PM transaction IW52) and a Service Order (IW32). The REST calls are made to SAP and the process is the same as described in my earlier blogs. Once the app is developed, I have the following screen on my mobile device:



Once you click the appropriate button, a record is created in SAP with the info. On the Xamarin side we invoke a REST service with the data. The Signature Pad component produces a string with the X,Y coordinates of each point; this can be obtained using a method of the SignaturePad component provided by Xamarin.

So we need to pass this large array, along with the document number and comments, to SAP. Due to the limitations of SAP field sizes on strings, we have to design the tables so that we can store a large amount of coordinate information. To achieve this, I designed the tables so that coordinate information is stored in one table and the text is stored in a different table.



So for each document we can store as much coordinate information as required.
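Since the coordinate payload can exceed a single string field, it has to be split across multiple rows of the coordinate table. A minimal, hypothetical sketch of this chunking (the 132-character field size and the class/method names are illustrative assumptions, not taken from the actual tables or app code):

```java
import java.util.ArrayList;
import java.util.List;

public class SignatureChunker {
    // Hypothetical maximum length of the SAP string field per row
    static final int FIELD_SIZE = 132;

    // Split a long coordinate string into field-sized chunks,
    // one chunk per row of the coordinate table
    static List<String> chunk(String coordinates) {
        List<String> rows = new ArrayList<>();
        for (int i = 0; i < coordinates.length(); i += FIELD_SIZE) {
            rows.add(coordinates.substring(i,
                    Math.min(i + FIELD_SIZE, coordinates.length())));
        }
        return rows;
    }

    public static void main(String[] args) {
        String sig = "10,20;11,22;13,25;".repeat(30); // 540 characters
        List<String> rows = chunk(sig);
        System.out.println(rows.size());                        // 5 rows needed
        System.out.println(String.join("", rows).equals(sig));  // true: lossless
    }
}
```

Concatenating the rows back in document order restores the original coordinate string when the signature is retrieved.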


When the button is clicked, a REST call is invoked passing the data. The following is the C# code that invokes SAP:


ZPS_SIGNATURE inrec = new ZPS_SIGNATURE();
inrec.AEDAT = System.DateTime.Now.ToString("yyyyMMdd");
inrec.AUFNR = NotificationNumber;
inrec.OBJECT = "NOTI";
inrec.NOTES = NotiText;
inrec.SIGDATA = JsonConvert.DeserializeObject<List<ZPS_SIGNATUREDATA>>(SigArray);

string instring = Newtonsoft.Json.JsonConvert.SerializeObject(inrec);
List<ZTRETRUN> ret = await dataManager.updatesignature(instring);
ReturnMessage = new ObservableCollection<ZTRETRUN>(ret);



On the SAP side the structure ZPS_SIGNATURE is as follows:


The REST call will convert the JSON to this format, and the SAP call will populate the tables so that the signature is stored in SAP.


When we want to retrieve the signature, the user enters the document number on the device, at which time the REST service fetches all records for the document (there could be multiple records on different dates with different comments). We get the header information without the signature data to minimize the payload; when the user clicks an item in the list, we make another REST call to get the signature information.

On the header list page, the following C# code fetches the header record list:

public async void OnHistoryClicked(object sender, EventArgs e)
{
    if (string.IsNullOrEmpty(viewModelHistory.DocumentNo))
        return;

    await viewModelHistory.ExecutesetHistoryHeaderCommand();
}



When the user clicks a record to see the signature, the following code makes a REST call to SAP to get the signature data:


public async void OnItemSelected(object sender, ItemTappedEventArgs e)
{
    if (e.Item == null)
        return;

    ZPS_SIGNATURE L = e.Item as ZPS_SIGNATURE;
    viewModelHistory.DocumentNo = L.AUFNR;
    viewModelHistory.DocumentDate = L.AEDAT;
    await viewModelHistory.ExecutesetHistoryDetailCommand();
    L = viewModelHistory.HistoryDataList.FirstOrDefault<ZPS_SIGNATURE>();
    await Navigation.PushAsync(new DocumentHistoryDetailPage(L));
}



The following are the device screens for the same




I have tested on Android and iPhone 6 Plus; it should also work on Windows Phone, thanks to the simplicity of development using Xamarin Forms and the free Signature Pad component provided by Xamarin.

Skip recursive paths

I had a terrible time with the Xcode 6 beta 2 Swift compiler's auto-complete when using Objective-C headers and libs in .swift files, and I hear from a colleague that issues remain even with GA versions of Xcode 6.


For me, the application compiled fine at build time, but after the build, Xcode would revert to popping an `unknown Type` error for the Objective-C classes.


I found that switching from a higher-level header search path with recursive search to multiple specific paths without recursive search resolved the issue.


Specifically, I could resolve the issue by switching:










A very happy new year to all of you from the SMP product management team. We have an exciting year in store for you, as reflected in our roadmap here. This blog recaps some of the updates that were added to SDK SP06 (released in December 2014). The next release of the SDK will be in Q1 2015.


The core features/updates in SDK SP06 are:


Native SDK Updates

  • Mobile Place integration into the Windows SDK
  • Upgrade of all SDK libraries to the common supportability framework (for client logs and E2E Trace)
  • Logon core enhancements to support push registration in the Windows SDK
  • Usage collection API in the Windows SDK (HCP only)
  • Support for iOS 8 and Swift (bug fixes for issues discovered after SDK SP05)


Kapsel Updates

  • Support for Cordova 3.6
  • Support for Android 4.4.4
  • Application usage reporting plugin (Android, iOS)
  • Print plugin (iOS)
  • Calendar plugin (Android, iOS)
  • Custom HTTP headers plugin (Android, iOS, Windows 8.1)
  • Data vault security enhancements
  • WebView enhancement (Android)


Agentry Updates

  • Open UI interface extension to support custom HTTP-based authentication
  • Agentry test environment debug enhancements

I was excited to hear the announcement about the HANA Cloud Platform mobile services trial availability. This is a great opportunity for the developer community to try the solution for free. HCPms is the cloud version of SAP Mobile Platform. SAP had another cloud version of SMP, called SAP Mobile Platform, enterprise edition, cloud version, which was deprecated.


Even though there are a few differences between SMP on-premise (SMP 3) and HCPms, the mobile SDK is common to both, i.e. an app written for SMP 3 runs against HCPms too without any code change; that caught my attention. Here I am going to run one of my existing hybrid mobile apps developed for SMP 3 against HCPms.


Activate HCPms

Apache Ant should be installed and added to the PATH

Android SDK (it also requires Java)


Cordova, version 3.6.3-0.2.13



Configure Application in HCPms

From the HCPms admin cockpit (https://hcpmsadmin-<your HANA account user name>.dispatcher.hanatrial.ondemand.com/sap/mobile/admin/ui/home/index.html) click on Applications> Click on Add icon on the bottom bar and provide below details:

Application ID : com.kapsel.logon

Name: com.kapsel.logon

Type: Hybrid

Security Configuration : Basic

Optionally provide Description and Vendor.




Click on Save.

Next, click on Backend and provide the below details.

Backend URL : http://services.odata.org/V2/OData/OData.svc/

Authentication Type: No Authentication



Save the configuration.

Develop Hybrid Mobile App

Follow this blog to develop a Kapsel logon-based hybrid app. Replace index.html with the one provided. The only change needed to run the app against HCPms is providing the HCPms host and port.

Run cordova command:

cordova prepare


Connect the phone to the PC using USB and execute the command cordova run android to run the app on the device.



Offline Demo

Adding offline functionality to a hybrid app is quite easy using Kapsel plugins. Following the blog Getting Started with Kapsel Offline OData, I added the offline plugin to the app for a small demo.

  • Download the datajs file and place it in the www folder
  • Replace index.html with the one provided
  • Execute the below commands from the project path to add the offline plugins:

          cordova -d plugin add com.sap.mp.cordova.plugins.odata

          cordova plugin add org.apache.cordova.network-information




CC: SAP HANA Cloud Platform Developer Center


Happy Coding !

Midhun VP


Hi there,


we have finally made the SAP HANA Cloud Platform mobile services trial available for all hanatrial accounts. Activating the mobile services trial is unfortunately not as easy as just clicking "Enable". The following describes how to fully enable the mobile services trial.


Prerequisite: You are already subscribed to the HANA Cloud Platform trial.



  • Open your browser and navigate to https://hanatrial.ondemand.com/cockpit
  • Click on "Services" in the Content pane on the left.
  • You see a list of Services. Look out for SAP HANA Cloud Platform Mobile Services and click on "Enable".
  • After a couple of seconds it should look like this:


  • Now we need to subscribe the mobile services Admin Cockpit application to your trial account
    • Click on "Subscriptions" in the Content pane
    • Click "New Subscriptions"
    • Select Provider Account "sapmobile" and Application Name "hcpmsadmin"
    • Confirm by clicking "Create"
    • The screen should now look like this:


    • The important entry is the first row
  • Now click on the link "hcpmsadmin" and select "Roles" in the Content pane
  • Click "New Role"
  • Type in the role name "HanaMobileAdmin" and confirm the dialog.
  • Click on "Assign..." in the lower part of the screen and assign your user to the freshly created role by providing your SCN ID in the dialog. Make sure the HanaMobileAdmin role is selected in the role list. If all is done correctly it should look like this:


  • In order to allow communication between the Admin Cockpit and the mobile services core you need to setup two destinations manually.
    • Navigate back to the start screen by clicking on your account name in the upper left corner (the link is labeled S00XXXXXXtrial).
    • Select "Destinations" in the Content pane
    • Create the following two destinations, each with:

      Proxy Type: Internet
      Cloud Connector Version: 2


  • It should look like this now:


  • The last thing we have to do is to assign another Administrator role to the service.
    • Click on "Services" in the Content pane
    • In the row of HANA Cloud Platform mobile services, click on the right icon showing a little person. The tooltips says "Configure roles".
    • Now select the role "Administrator" row in the list of roles.
    • Click "Assign..." in the lower part of the screen. Provide your S-User ID in the dialog and confirm by clicking "Assign".
    • It should look like this:


  • Now navigate back to the "Services" view using the Content pane and click "Go to Service". You should be redirected to the HANA Cloud Platform mobile services Admin Cockpit.


You can now start playing around with SAP HANA Cloud Platform mobile services.



In order to connect your mobile Application you want to point it to:


once you have a valid app configuration.


In another blog I will explain how to configure your first Application. Stay tuned.



Have Fun,



a couple of weeks ago I was very happy to announce SAP HANA Cloud Platform mobile services in my blog post SAP HANA Cloud Platform mobile services released.

While these new services were only available to customers and partners entering the RampUp process (a kind of beta), individual developers like you and me didn't have the opportunity to get their hands dirty with the mobile services. Luckily this uncomfortable situation will end soon, and I am, again, happy to announce that we are preparing a public trial of the SAP HANA Cloud Platform mobile services.

If all goes well, you will have access to the mobile services within your HANA Cloud Platform trial account on hanatrial.ondemand.com beginning this week.


This is your mobile Christmas present - just for you.


But wait, what about the SMP Trial that is available on hanatrial.ondemand.com?


Well, your current SMP cloud version trial subscription will remain available until the end of January. Please be aware that any configuration you have created will then be deleted, so take manual backups of your configuration data and log files if necessary. We will remove the subscriptions to SAP Mobile Platform, enterprise edition, cloud version permanently.


Have Fun,

Martin Grasshoff


1. Update:

Also watch my CodeTalk with Ian Thain about the announcement of the Trial: https://www.youtube.com/watch?v=DMCP0_h-55w


2. Update:

Trial is already available: How to enable HANA Cloud Platform Mobile Services Trial

As we know, during SMP3 installation we provide a keystore password to protect the SMP3 keystore and truststore. This keystore password must be the same as the private key passwords of all aliases in the keystore.


All the keystore and truststore related information is in a single file: smp_keystore.jks (E:\SAP\MobilePlatform3\Server\configuration)


Keystore: The location where encryption keys, digital certificates and other credentials are stored (in encrypted or unencrypted keystore file types) for SAP Mobile Platform runtime components.

Truststore: The location where Certificate Authority (CA) signing certificates are stored.
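For reference, a JKS keystore can also be inspected programmatically with the JDK's own java.security.KeyStore API. A small self-contained sketch (it creates a throwaway keystore rather than touching smp_keystore.jks; class and method names are illustrative):

```java
import java.io.*;
import java.security.KeyStore;
import java.util.Collections;

public class KeystoreDemo {
    // Create a new, empty JKS keystore file protected by the given password
    static File createDemoKeystore(char[] password) throws Exception {
        KeyStore ks = KeyStore.getInstance("JKS");
        ks.load(null, password); // initialize an empty keystore
        File file = File.createTempFile("demo_keystore", ".jks");
        try (OutputStream out = new FileOutputStream(file)) {
            ks.store(out, password);
        }
        return file;
    }

    // Open a keystore file with its password and count the stored aliases,
    // much like Portecle or KeyStore Explorer do interactively
    static int countAliases(File file, char[] password) throws Exception {
        KeyStore ks = KeyStore.getInstance("JKS");
        try (InputStream in = new FileInputStream(file)) {
            ks.load(in, password); // throws IOException if the password is wrong
        }
        return Collections.list(ks.aliases()).size();
    }

    public static void main(String[] args) throws Exception {
        File f = createDemoKeystore("changeit".toCharArray());
        System.out.println(countAliases(f, "changeit".toCharArray())); // 0 entries so far
    }
}
```

Loading with a wrong password fails with an exception, which is exactly the check the verification steps at the end of this post perform with KeyStore Explorer.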


Pre-requisite: Make sure to back-up the same file (C:\SAP\MobilePlatform3\Server\configuration\smp_keystore.jks)





1. First change the Keystore password by running the below command


E:\SAP\MobilePlatform3\Server\configuration>keytool -storepasswd -new s4pAdmin -keystore smp_keystore.jks

(Where s4pAdmin is the 'new password')

  • At prompt, enter the current password. (for me, it's s3pAdmin)





2. For changing the each of the passwords for all private keys in the Keystore, we need to change it one by one. By default, there are 2 private key alias entries in the SMP Keystore file. i.e. smp_crt and tomcat





2.1 To change the password for alias entry smp_crt, run the below command:



E:\SAP\MobilePlatform3\Server\configuration>keytool -keypasswd -alias smp_crt -new s4pAdmin -keystore smp_keystore.jks


     Keystore password:                        s4pAdmin (new keystore password as per step #1)

     Enter key password for <smp_crt> : s3pAdmin (current password)





2.2 To change the password for alias entry tomcat, run the below command:


     E:\SAP\MobilePlatform3\Server\configuration>keytool -keypasswd -alias tomcat -new s4pAdmin -keystore smp_keystore.jks


     Keystore password:                      s4pAdmin (new keystore password as per step #1)

     Enter key password for <tomcat> : s3pAdmin (current password)






3. Now, we need to configure the SMP to recognize the new password:


3.1 We have to encrypt the new password using the secret key obtained from the -DsecretKey property (E:\SAP\MobilePlatform3\Server\props.ini)





3.2 Run the below command:


               java -jar tools\cipher\CLIEncrypter.jar <secretKey> <newPassword>


E:\SAP\MobilePlatform3\Server>java -jar tools\cipher\CLIEncrypter.jar Vv4bm3LniE s4pAdmin
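The exact cipher CLIEncrypter uses internally is not documented here; conceptually, it derives a key from the secret and symmetrically encrypts the password. The following is a generic stand-in sketch using AES, purely for illustration (the key derivation and cipher choice are assumptions, not SMP's actual scheme):

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.util.Base64;
import javax.crypto.Cipher;
import javax.crypto.spec.SecretKeySpec;

public class PasswordCipherDemo {
    // Derive a 128-bit AES key from the secret string (illustrative KDF:
    // first 16 bytes of SHA-256; real tools use proper KDFs such as PBKDF2)
    static SecretKeySpec keyFrom(String secret) throws Exception {
        byte[] digest = MessageDigest.getInstance("SHA-256")
                .digest(secret.getBytes(StandardCharsets.UTF_8));
        byte[] key = new byte[16];
        System.arraycopy(digest, 0, key, 0, 16);
        return new SecretKeySpec(key, "AES");
    }

    static String encrypt(String secret, String password) throws Exception {
        Cipher c = Cipher.getInstance("AES/ECB/PKCS5Padding");
        c.init(Cipher.ENCRYPT_MODE, keyFrom(secret));
        return Base64.getEncoder().encodeToString(
                c.doFinal(password.getBytes(StandardCharsets.UTF_8)));
    }

    static String decrypt(String secret, String encrypted) throws Exception {
        Cipher c = Cipher.getInstance("AES/ECB/PKCS5Padding");
        c.init(Cipher.DECRYPT_MODE, keyFrom(secret));
        return new String(c.doFinal(Base64.getDecoder().decode(encrypted)),
                StandardCharsets.UTF_8);
    }

    public static void main(String[] args) throws Exception {
        String enc = encrypt("Vv4bm3LniE", "s4pAdmin");
        System.out.println("{enc}" + enc); // shape of the value written to the properties file
        System.out.println(decrypt("Vv4bm3LniE", enc)); // round-trips back to s4pAdmin
    }
}
```

Whatever the actual algorithm, the important point is that only the encrypted form (with the {enc} prefix) is stored in the configuration file, and the server decrypts it at startup using the secret key from props.ini.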




3.3 Open the com.sap.mobile.platform.server.foundation.config.encryption.properties file available at E:\SAP\MobilePlatform3\Server\config_master\com.sap.mobile.platform.server.foundation.config.encryption


  • Here we need to update privateKeystorePass, replacing the existing password with the new encrypted password and keeping {enc} as the prefix.



  • Save the changes.
  • Restart the server for the changes to take effect.


To verify that the above changes have taken effect, you can use a tool such as KeyStore Explorer to open the keystore file.


(A) . To verify Keystore password:



(B) To verify the password of alias smp_crt and tomcat

  • Open KeyStore Explorer, right-click smp_crt > View Details > Private Key Details > enter the new password


  • If password is wrong, you would see an error message like below:


I hope it helps.




The SMP 3.0 OData SDK SP05 introduced the concept of a store, which is an abstraction for services that can be consumed via the OData protocol. There are two types of stores: online and offline. The methods for creating, updating, deleting and querying data are the same for both stores; however, there are some differences. Let's get you started with the offline store.



Your Android project must include the following libraries under the libs folder:

  • AfariaSLL.jar
  • ClientHubSLL.jar
  • ClientLog.jar
  • Common.jar
  • Connectivity.jar
  • CoreServices.jar
  • DataVaultLib.jar
  • E2ETrace.jar
  • HttpConvAuthFlows.jar
  • HttpConversation.jar
  • maflogger.jar
  • maflogoncore.jar
  • maflogonui.jar
  • mafuicomponents.jar
  • mafsettingscreen.jar
  • MobilePlace.jar
  • ODataAPI.jar
  • odataoffline.jar
  • ODataOnline.jar
  • perflib.jar
  • Request.jar
  • sap-e2etrace.jar
  • SupportabilityFacade.jar
  • XscriptParser.jar


The following resources should be imported under the libs/armeabi folder:

  • libmlcrsa16.so
  • libodataofflinejni.so


You can find the .jar and .so files in your OData SDK installation folder:

<Client SDK dir>\NativeSDK\ODataFramework\Android\libraries

<Client SDK dir>\NativeSDK\MAFReuse\Android\libraries

<Client SDK dir>\NativeSDK\ODataFramework\Android\libraries\armeabi



The offline store requires, among other information, the collections (also called defining requests) that will be accessible offline. When the client app requests the initialization of the offline store, this is what happens under the covers:

  1. The mobile services (either SMP 3.0 SP04 on premise or HCPms) send a GET request to the OData producer to get the metadata (the OData model) and use it to create the UltraLite database schema.
  2. For each defining request, the mobile services pull the data from the OData producer and populate the database. The mobile services check if there's a delta token:
    1. If there is a delta token, they cache it and use it in the following refresh.
    2. If there is no delta token, they cache the keys populated in the database.
  3. The mobile services notify the client app that the database is ready.
  4. Using UltraLite functionality, the client app downloads the database. At this point the database can be used offline.



Code Snippet – How to open an offline store

//Initialize the native UDB libraries which are located in the
//libodataofflinejni.so file
ODataOfflineStore.globalInit();

//Get the application endpoint URL
LogonCoreContext lgCtx = LogonCore.getInstance().getLogonContext();
String endPointURL = lgCtx.getAppEndPointUrl();

// Define the offline store options:
// connection parameters, credentials and
// the application connection id we got at registration
ODataOfflineStoreOptions options = new ODataOfflineStoreOptions();
options.serviceRoot = endPointURL;

//The logon configurator uses the information obtained at registration
//(i.e. endpoint URL, login, etc.) to configure the conversation manager.
//It assumes you used the MAF Logon component to on-board the user
IManagerConfigurator configurator = LogonUIFacade.getInstance().getLogonConfigurator(context);
HttpConversationManager manager = new HttpConversationManager(context);
configurator.configure(manager);
options.conversationManager = manager;

options.storeName = "flight";

//This defines the OData collections which will be stored in the offline store
options.definingRequests.put("defreq1", "TravelAgencies_DQ");

//Open the offline store synchronously
ODataOfflineStore offlineStore = new ODataOfflineStore(context);
offlineStore.openStoreSync(options);

//A way to verify that the store opened successfully
Log.d("OfflineStore", "openOfflineStore: library version " + ODataOfflineStore.libraryVersion());

Once the offline store is open, you can create, update, delete and query data offline. As we mentioned before, the methods for creating, updating, deleting and querying data are the same for both stores. Note that all offline store requests are sent to the local database.


Code Snippet – How to query data with an offline store

//Define the resource path
String resourcePath = "TravelAgencies_DQ";

//Build a read request for that resource path
ODataRequestParamSingle request = new ODataRequestParamSingleDefaultImpl();
request.setMode(ODataRequestParamSingle.Mode.Read);
request.setResourcePath(resourcePath);

//Send a request to read the travel agencies from the local database
ODataResponseSingle response = (ODataResponseSingle) offlineStore.executeRequest(request);

//Check if the response is an error

//Check if the response is an error
if (response.getPayloadType() == ODataPayload.Type.Error) {
    ODataErrorDefaultImpl error = (ODataErrorDefaultImpl) response.getPayload();
    //TODO show the error

//Check if the response contains an EntitySet
} else if (response.getPayloadType() == ODataPayload.Type.EntitySet) {
    ODataEntitySet feed = (ODataEntitySet) response.getPayload();
    List<ODataEntity> entities = feed.getEntities();

    //Retrieve the data from the response
    ODataProperty property;
    ODataPropMap properties;
    String agencyID, agencyName;
    for (ODataEntity entity : entities) {
        properties = entity.getProperties();
        property = properties.get("agencynum");
        agencyID = (String) property.getValue();
        property = properties.get("NAME");
        agencyName = (String) property.getValue();
        . . .
    }
}





When connectivity is available, the client app must send all the local changes; this process is called a flush. When the client app requests a flush, this is what happens under the covers:

  1. The offline store communicates with the mobile services.
  2. For each queued request, the mobile services attempt to execute the request against the OData producer.
  3. The mobile services send the responses (errors and successes) to the client app, and the errors are stored in the ErrorArchive collection of the offline store.

Code Snippet - Flush


After the flush, the client app must receive all the changes from the OData producer that have occurred since the last refresh. When the client app requests a refresh, this is what happens under the covers:

  1. If delta tokens are enabled, for each request the mobile services request data with the delta token.
  2. Otherwise, the mobile services retrieve all the data from the OData producer, retrieve the keys from the cache, and compute the delta, reducing the traffic from the mobile services to the client app.
  3. The mobile services transform all the changes to the relational MobiLink protocol and send them back to the client app.
  4. The client app's UltraLite database applies all the instructions.


Code Snippet - Refresh



The code snippets shown in this blog use the synchronous methods for simplicity. Please note there are asynchronous methods available.

This blog assumes:

  • You have configured an application in the mobile services with the back-end connection

http://<sap gateway host>:<port>/sap/opu/odata/IWFND/RMTSAMPLEFLIGHT/

  • A user has been on-boarded with the mobile services using the MAF Logon component.


For more information on how to create an application configuration, visit Deploying Applications

If you prefer hands-on exercises, check these guides out

How To... Enable user On-boarding using MAF Logon with Template Project (Android)

How To...Consume OData Services in Offline Mode (Android)

How to... Handle Synchronization Errors (Android)


Hope you find this information useful,


I frequently program directly against a NW Gateway when I'm starting or prototyping, then add SMP to the landscape once the application is fleshed out and I want to add offline functionality. In the past, this has tended to be when I add MAF Logon to the app.


With the simple bootstrapping with CocoaPods and the reusable STSOData framework, I get MAF Logon for free from the start. Great, I like it, and I especially like using the new Discovery Service on-boarding to auto-import my connection settings from the cloud. Using the STSOData framework's LogonHandler, I set the application's applicationID in the AppDelegate -applicationDidBecomeActive: method.


[[LogonHandler shared].logonManager setApplicationId:@"stan.flight.https"];

The applicationID should match the applicationID in the SMP Admin console.


But today, my SMP system is being re-installed by QA, and I can't afford the down-time.  How do I switch back to connecting directly-to-NW Gateway?


I don't want to change anything in the application.  I know that I won't be able to use the offline features without SMP, but I can toggle that off in the STSOData DataController.  What should I do?


The easiest way to switch back to directly-to-NW Gateway, after working with SMP, is by changing the applicationID value set above.


The Solution

MAF Logon constructs the connection URL from the protocol/host/port parameters, then appends the applicationID to the url path.  So, these settings:


MAF Logon settings:

host:                     smpqa12-01.sybase.com

port:                      443

protocol:              https


Programmed in app:

applicationID:      stan.flight.https


are concatenated into this URL:  https://smpqaXX-XX.sybase.com:443/stan.flight.https/.  This is the base URL that the SODataStores use for querying $metadata, Collections, FunctionImports, etc. when connecting via my SMP server.


My NW Gateway system has this URL:  http://usxxxx21.xxx.sap.corp:8000/sap/opu/odata/IWFND/RMTSAMPLEFLIGHT/.  I can keep the same MAF Logon inputs and simply swap the OData application path components in for the SMP applicationID to produce that same URL.


I accomplish this by changing the value when I set the applicationID on the LogonHandler, as above:


//[[LogonHandler shared].logonManager setApplicationId:@"stan.flight.https"];

[[LogonHandler shared].logonManager setApplicationId:@"sap/opu/odata/IWFND/RMTSAMPLEFLIGHT"];

Do not append a forward slash "/" to the applicationID value when substituting the OData application path components; the application already adds the slash for the regular SMP applicationID value.
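The URL construction described above can be sketched as a plain concatenation (an illustration of the behavior, not the SDK's actual code):

```java
public class ConnectionUrlDemo {
    // Mimics how MAF Logon builds the base URL:
    // protocol://host:port/ plus the applicationID plus a trailing slash
    static String baseUrl(String protocol, String host, int port, String applicationId) {
        return protocol + "://" + host + ":" + port + "/" + applicationId + "/";
    }

    public static void main(String[] args) {
        // Regular SMP applicationID
        System.out.println(baseUrl("https", "smpqa12-01.sybase.com", 443,
                "stan.flight.https"));
        // OData path components substituted for the applicationID;
        // note there is no trailing slash in the value, the builder adds it
        System.out.println(baseUrl("http", "usxxxx21.xxx.sap.corp", 8000,
                "sap/opu/odata/IWFND/RMTSAMPLEFLIGHT"));
    }
}
```

Because the slash is appended by the framework, a trailing slash in the substituted value would produce a double slash and break the service root URL.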

Hi everyone!


I have great news...

Loads of How-To Guides (H2Gs) have been published. I hope these documents will help you to learn what has been discussed in the previous blogs even more quickly and efficiently.

They cover essential topics for both online and offline stores. In addition, they cover the following topics:

- User Onboarding without MAF UI

- Push Notification

- Batch Request with Online Store

- Log & Trace

Together with the H2Gs, the associated Xcode projects are all ready for you. Each comes as a set: an Exercise (the one you can go through with the H2G's step-by-step instructions to complete) and a Solution (the completed one; it runs after adding the required SDK libs). In the GitHub UI, "master" holds the Exercise and "solution" the Solution Xcode projects.


Perhaps I should address a few tips, which are demonstrated in the example Xcode projects.

Reloading TableView & Showing Alert in the Main Thread

After you fetch the data, the next thing you would do is render it via a table view or popup. Have you encountered a strange situation where the data fetch is fast, but after the data retrieval it takes a while for the table view to render?

You have to make sure you're calling it on the main thread. You can google the detailed general discussion, but in a nutshell, here's a tableView example:

[tableView reloadData]; // normal way

// calling it on the main thread
[tableView performSelectorOnMainThread:@selector(reloadData)
                            withObject:nil
                         waitUntilDone:NO];

// alternative way to call it on the main thread
dispatch_async(dispatch_get_main_queue(), ^{
    [tableView reloadData];
});

By calling it on the main thread, you can confirm that the runtime speed improves. The same story goes for other UI elements such as alerts.

[alert performSelectorOnMainThread:@selector(show)
                        withObject:nil
                     waitUntilDone:NO];

Why do we have to do this? This is not really an OData SDK remark but a general iOS tip. Apple's API reference says:

"Note: For the most part, use UIKit classes only from your app’s main thread. This is particularly true for classes derived from UIResponder or that involve manipulating your app’s user interface in any way."

The OData SDK's HttpConversationManager does not call back on the main thread, so the SODataStore also calls back on a background thread. It's the task of the app developer to take care of proper UI calls in the way explained above. (NSURLSession also calls back on a background thread.)

Conclusion - Always call UI in the main thread! ;-)

OData Format in either JSON or XML


By default the online store handles OData in XML. Here's how you switch it to JSON format. As JSON is far more lightweight than XML, you will likely want to go with JSON, but you might prefer XML during development, as it is easier to debug if something goes wrong.

// Use options to configure the store to send the request payload in JSON format
SODataOnlineStoreOptions *storeOptions = [[SODataOnlineStoreOptions alloc] init];
storeOptions.requestFormat = SODataDataFormatJSON;
onlineStore = [[SODataOnlineStore alloc] initWithURL:[NSURL URLWithString:endpointUrl]
                             httpConversationManager:httpConvManager
                                             options:storeOptions];

The offline store only sends modification requests in JSON format. The server component can perform refreshes in either XML or JSON; the default is JSON. Just about every OData producer supports JSON nowadays.


MAF UI Redirects to Afaria Client App

If you deploy the app to an iOS device, you will notice that the MAF UI redirects to the Afaria client app every time you onboard. This context switch happens because the MAF UI checks whether Afaria provides a configuration for the particular application. This can be annoying if you haven't configured Afaria - here's how to turn off the default context-switch behavior.

1. Find the "MAFLogonManagerOptions.plist" in the bundles folder of OData SDK libs in the Xcode project.


2. Switch the "keyMAFUseAfaria" value to NO.


3. Make sure there's "MAFLogonManagerNG.bundle" in Copy Bundle Resources in Build Phases tab in Xcode project. If not - add it.
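In the plist source, the change from step 2 boils down to a single boolean entry. A sketch of the relevant excerpt (the surrounding keys in MAFLogonManagerOptions.plist vary with your SDK version):

```xml
<!-- MAFLogonManagerOptions.plist (excerpt) -->
<!-- "NO" in the Xcode plist editor corresponds to <false/> in the source -->
<key>keyMAFUseAfaria</key>
<false/>
```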


That's all, happy learning with H2G :-)

See you in the next blog,



In some cases, extending the Agentry product JARs (like the SAPWM-x.x.x.x.jar) in an object-oriented way is not an easy task, and it can be tiresome if you want to add some generic functionality, such as additional (trace) logging, error handling, or monitoring, to "every" StepHandler/BAPI/etc. class. If you do not want to touch the actual SAPWM-x.x.x.x.jar, you might want to consider using AspectJ load-time weaving. I will not explain the basic concepts of AspectJ here, as there are plenty of tutorials and examples on the net. If you are new to aspect-oriented programming, I strongly recommend you get your feet wet with a standalone Java application first. In the following, I just want to explain how to set up AspectJ for the Agentry Java backend of SMP 3.0.


First, you need some tools and libraries:

  • AspectJ Development Tools for Eclipse
  • From the aspectj-x.x.x.jar:
    • lib/aspectjrt.jar
    • lib/aspectjweaver.jar


Basic configuration of the SMP for AspectJ:

  • Put the aspectjrt.jar into the Agentry Application Java folder (where the Agentry-v5.jar is located)
    • You need to modify the META-INF/MANIFEST.MF in the aspectjrt.jar:
      • Add the following line to make the JAR OSGi-compatible

                        Export-Package: org.aspectj.lang,org.aspectj.runtime

  • Add ;.\Java\aspectjrt.jar to your Agentry.ini classpath property.
  • Put the aspectjweaver.jar in the SMP's Server folder (not in the Server\lib folder)
  • Add the following lines to the SMP's Server/props.ini file in the jvm section (the -D options can be removed or set to false once you are confident everything is set up properly)
    • -javaagent:.\aspectjweaver.jar
    • -Dorg.aspectj.weaver.showWeaveInfo=true
    • -Daj.weaving.verbose=true


Write your AspectJ code (or use the attached code), compile and JAR it (e.g. as aopdemo.jar) and put it into the Agentry Application Java folder (where the Agentry-v5.jar is located). Don't forget to add your JAR to the Agentry.ini classpath property (e.g. ;./Java/aopdemo.jar).
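Your JAR also needs an AspectJ load-time weaving descriptor. A minimal META-INF/aop.xml for the aopdemo.jar could look like the following sketch (the aspect name matches the aopdemo.MyAspect registered in the startup log; the include pattern is an assumption you should adapt to the classes you want to weave):

```xml
<!-- META-INF/aop.xml inside aopdemo.jar (sketch; adapt aspect name and include pattern) -->
<aspectj>
  <aspects>
    <aspect name="aopdemo.MyAspect"/>
  </aspects>
  <!-- verbose/showWeaveInfo duplicate the -D options from props.ini; remove once stable -->
  <weaver options="-verbose -showWeaveInfo">
    <include within="com.syclo.sap..*"/>
  </weaver>
</aspectj>
```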

  • Now, upon SMP startup, you should be able to see some AspectJ initialization logging in the <SERVER>-smp-server.log (assuming you have aj.weaving.verbose set to true). Look for the following lines:


AppClassLoader@142e6767 info AspectJ Weaver Version 1.8.2 built on Thursday Aug 14, 2014 at 21:45:02 GMT

AppClassLoader@142e6767 info register classloader sun.misc.Launcher$AppClassLoader@142e6767

AppClassLoader@142e6767 info using configuration file:/.../aopdemo.jar!/META-INF/aop.xml     

AppClassLoader@142e6767 info register aspect aopdemo.MyAspect

  • As soon as some aspect code is woven in, you should be able to see something like this (you might have to synchronize or perform the proper actions on the Agentry client, depending on your pointcut definitions; for the aopdemo.MyAspect this is not required):


AgentryApplicationClassLoader@7cc49e01 weaveinfo Join point 'method-execution(java.util.ArrayList com.syclo.sap.bapi.GetUserProfileDataBAPI.processResults())' in Type 'com.syclo.sap.bapi.GetUserProfileDataBAPI' (GetUserProfileDataBAPI.java:68) advised by around advice from 'aopdemo.MyAspect' (MyAspect.aj:42)

  • For the aopdemo.MyAspect, there should be a lot of console output like this:


[AOP] (BAPIFactory.java:34)                    boolean com.syclo.sap.BAPIFactory.validateClass(String, String) returns java.lang.Boolean: true
[AOP] (BAPIFactory.java:34)                    boolean com.syclo.sap.BAPIFactory.validateClass(String, String)
[AOP] (BAPIFactory.java:34)                      arg java.lang.String: WorkorderTransferBAPI
[AOP] (BAPIFactory.java:34)                      arg java.lang.String: com.syclo.sap.component.workorder.bapi.WorkorderTransferBAPI
[AOP] (BAPIFactory.java:82)                      void com.syclo.sap.BAPIFactory.register(String, String)
[AOP] (BAPIFactory.java:82)                        arg java.lang.String: WorkorderTransferBAPI
[AOP] (BAPIFactory.java:82)                        arg java.lang.String: com.syclo.sap.component.workorder.bapi.WorkorderTransferBAPI
[AOP] (BAPIFactory.java:82)                      void com.syclo.sap.BAPIFactory.register(String, String) returns <null>
[AOP] (BAPIFactory.java:34)                    boolean com.syclo.sap.BAPIFactory.validateClass(String, String) returns java.lang.Boolean: true


For the aopdemo.MyAspect, the client sync should be extremely slow due to the amount of logging data. Your next step should be to change the AspectJ code to reduce the number of join points by adjusting the pointcut definition. This should lead to less logging and better performance.
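As an illustration of such a narrowed pointcut, here is a hypothetical aspect in code-style AspectJ syntax (this is not the attached aopdemo code; the pointcut below only advises the processResults() executions seen in the weave-info log above, instead of every method):

```aspectj
// Hypothetical MyAspect.aj: instead of advising every method in
// com.syclo.sap, only BAPI processResults() executions are woven.
package aopdemo;

public aspect MyAspect {

    pointcut bapiProcessing() :
        execution(* com.syclo.sap..*BAPI.processResults(..));

    // Around advice: time each matched execution and log it.
    Object around() : bapiProcessing() {
        long start = System.nanoTime();
        Object result = proceed();
        System.out.println("[AOP] " + thisJoinPoint.getSignature()
                + " took " + ((System.nanoTime() - start) / 1000000) + " ms");
        return result;
    }
}
```

Recompile, re-JAR and restart the SMP; the weave-info log should now show far fewer join points, and the client sync time should return to normal.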


If you have been able to reproduce the above steps for your SMP installation, you have done it. From here on, it's up to you to identify those extensions that are a pain with the object-oriented approach and that can be done nicely using aspect orientation.

I would be interested to hear about your ideas on where AspectJ can be beneficial. Feel free to post them here...

