
I am a newbie who recently started exploring SAP HANA and Sybase Event Stream Processor. I was trying to figure out how to load data from a text file into the SAP HANA server in real time, where the file is not located on the HANA server itself. Assume there is a text file, such as a log file or any file to which we continuously append records, and we need to load those records into the HANA server in real time.

Then I found Sybase Event Stream Processor and installed the free trial version. Sybase ESP has an SAP HANA Output Adapter that uses an ODBC connection to load information from Event Stream Processor into the SAP HANA server.

For this example, I came up with a scenario in which a log file holds transaction records. Each transaction is a line in the text file, formatted as:

Transaction ID|Transaction Type|Transaction Status|Transaction Date|Details

So, I created a simple Java project to generate random transaction data and write this data to a log file called transactionLog.csv.
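The generator itself isn't included in the post; below is a minimal sketch of what it might look like. The transaction types, batch size, sleep interval, and date format are my own assumptions; only the pipe-delimited column layout and the status codes (0/1/2) come from the example.

import java.io.FileWriter;
import java.io.IOException;
import java.io.PrintWriter;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.Random;

public class TransactionLogGenerator {

    // Transaction types are made up for illustration
    private static final String[] TYPES = {"PAYMENT", "REFUND", "TRANSFER"};

    public static void main(String[] args) throws IOException, InterruptedException {
        Random random = new Random();
        SimpleDateFormat dateFormat = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
        int transactionId = 0;

        while (true) {
            // Open in append mode so new records are added to the end of the file
            try (PrintWriter out = new PrintWriter(new FileWriter("transactionLog.csv", true))) {
                for (int i = 0; i < 100; i++) {
                    transactionId++;
                    String type = TYPES[random.nextInt(TYPES.length)];
                    int status = random.nextInt(3); // 0: Error, 1: Warning, 2: Success
                    // Column order matches: ID|Type|Status|Date|Details
                    out.println(transactionId + "|" + type + "|" + status + "|"
                            + dateFormat.format(new Date())
                            + "|details for transaction " + transactionId);
                }
            }
            Thread.sleep(1000); // keep appending so there is always new data to stream
        }
    }
}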


To load data into the HANA server from ESP, you first need to configure a HANA ODBC data source. Open the ODBC Data Source Administrator and add a data source that uses the HANA ODBC driver.

Figure 1: Creating a new HANA ODBC data source

Figure 2: Adding and testing the data source


After you have configured the ODBC data source successfully, open %ESP_HOME%\bin\service.xml and add the following entry to the file.


<Service Name="HANAODBCService" Type="DB">
  <Parameter Name="DriverLibrary">esp_db_odbc_lib</Parameter>
  <Parameter Name="DSN">HDB</Parameter>
  <Parameter Name="User">***user***</Parameter>
  <Parameter Name="Password">***password***</Parameter>
</Service>


Then I created an ESP project and added a .ccl file. I used three tools from the palette:

  1. File/Hadoop CSV Input Adapter
  2. Input Window
  3. SAP HANA Output Adapter

Figure 3: Event Processing


1. The File/Hadoop CSV Input Adapter is added to read the transactionLog.csv file.

TransactionAdapter Configuration:

    • Directory: directory path of the data file at runtime.
    • File: the file you want the adapter to read (a regex pattern can also be specified).
    • Dynamic Loading Mode: the adapter supports three modes: static, dynamicFile and dynamicPath. You need to use either dynamicFile or dynamicPath if you want the adapter to keep polling the file for newly appended content. I set this parameter to dynamicFile (see the sketch after this list).
    • Column Delimiter: | for this example.
    • Has Header: False (the text file that I generated doesn't contain a header row with field names).
    • Poll Period (seconds): how often the adapter polls the specified file; set to 5 seconds in this example.
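Conceptually, the dynamicFile mode behaves like the polling loop below. This is only an illustrative Java analogue of the adapter's behavior, not its actual implementation; the file name and the 5-second poll period match the example configuration.

import java.io.RandomAccessFile;
import java.util.Arrays;

public class PollingReaderSketch {
    public static void main(String[] args) throws Exception {
        long offset = 0; // position in the file up to which we have already read
        try (RandomAccessFile file = new RandomAccessFile("transactionLog.csv", "r")) {
            while (true) {
                file.seek(offset); // continue from where the previous poll stopped
                String line;
                while ((line = file.readLine()) != null) {
                    // Split the pipe-delimited columns into an "event"
                    String[] columns = line.split("\\|");
                    System.out.println("event: " + Arrays.toString(columns));
                }
                offset = file.getFilePointer(); // remember position for the next poll
                Thread.sleep(5000);             // poll period: 5 seconds
            }
        }
    }
}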


2. The Input Window has a schema that defines the columns of the events. In this example, each transaction is an event. TransactionInputWindow's schema has the columns transactionId, transactionType, transactionDate, status and description.
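As a rough Java analogue, each event in the window carries fields like the following. The post doesn't show the CCL column types, so the types here are assumptions for illustration only.

// Hypothetical Java analogue of the TransactionInputWindow schema;
// the actual window is defined in CCL with ESP column types.
public class TransactionEvent {
    long transactionId;
    String transactionType;
    String transactionDate;
    int status;          // 0: Error, 1: Warning, 2: Success
    String description;
}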


3. The SAP HANA Output Adapter is used to load data rapidly from Event Stream Processor into the SAP HANA database table.

TransactionHANAOutputAdapter Configuration:

    • Database Service Name: HANAODBCService (the service name defined in %ESP_HOME%\bin\service.xml).
    • Target Database Schema: the schema in the HANA database that contains the target table.
    • Target Database Table Name: the table into which the data is loaded.


Finally, I created a corresponding HANA database table into which the output adapter loads the transaction data. Then I ran my log generator and the Event Stream Processor project, and the transaction data was loaded into the table successfully.
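The DDL for that table isn't shown in the post; here is a sketch of what it might look like, created over JDBC. The schema name, table name, and column types are assumptions derived from the input window schema, the connection details are placeholders, and the SAP HANA JDBC driver (ngdbc.jar) is assumed to be on the classpath.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class CreateTransactionTable {
    public static void main(String[] args) throws Exception {
        // Host, port and credentials are placeholders; use your own HANA connection details
        String url = "jdbc:sap://hanahost:30015/";
        try (Connection conn = DriverManager.getConnection(url, "***user***", "***password***");
             Statement stmt = conn.createStatement()) {
            // Column layout mirrors the input window schema; names and types are assumptions
            stmt.execute("CREATE COLUMN TABLE EXPERIMENT.TRANSACTION_LOG ("
                    + "TRANSACTION_ID BIGINT, "
                    + "TRANSACTION_TYPE VARCHAR(50), "
                    + "TRANSACTION_DATE VARCHAR(20), "
                    + "STATUS INTEGER, "
                    + "DESCRIPTION VARCHAR(200))");
        }
    }
}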


Before running the log generator, the log file is empty, no events are streamed into ESP, and the HANA database table is empty, as shown in the figures below.

Figure 4: Transaction Input Window before generating data (0 rows)

Figure 5: Result of select query on HANA table before generating data (No rows retrieved)


After running the log generator, the transaction data written to the log file is streamed into ESP via the input adapter and loaded into HANA via the HANA output adapter. At this point, 37,770 transaction records have been added to the table.

Figure 6: Streaming transaction events after running log generator


Figure 7: Number of rows in HANA table after running log generator


Keeping the log generator running, newly appended data continues to be loaded into the HANA table; the number of transactions has increased to 44,733, as seen in the figure.

Figure 8: Number of rows in HANA table after running log generator


After making sure that I was able to load the data into HANA, I created an attribute view and a calculation view.

Figure 9: Attribute View


The attribute view is created and calculated columns are added to format the transaction date and status information.

Transaction Status:

    • 0: Error
    • 1: Warning
    • 2: Success

The Case() function under Misc Functions is used to format the status information.

Figure 10: Case function

Figure 11: Calculation View


A calculation view is created in which the transaction data is grouped by transaction status.


After creating the views, I created OData services to expose them.


transactionView.xsodata

service namespace "experiment.services" {
       "experiment.model::VW_TRANSACTION"
       as "Transactions"
       keys generate local "ID";
}

status.xsodata

service namespace "experiment.services" {
       "experiment.model::CV_STATUS"
       as "TransactionStatus"
       keys generate local "ID";
}
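Once deployed, these services can be queried over plain HTTP using standard OData query options. Below is a sketch of fetching the first few transactions as JSON from Java; the host, port, package path, and credentials are placeholders you would replace with your own XS engine details.

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.Base64;

public class ODataClientSketch {
    public static void main(String[] args) throws Exception {
        // The URL is a placeholder; adjust the host, port and package path to your installation
        URL url = new URL("http://hanahost:8000/experiment/services/"
                + "transactionView.xsodata/Transactions?$format=json&$top=5");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();

        // Basic authentication with placeholder credentials
        String auth = Base64.getEncoder()
                .encodeToString("***user***:***password***".getBytes("UTF-8"));
        conn.setRequestProperty("Authorization", "Basic " + auth);

        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), "UTF-8"))) {
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line); // raw JSON payload returned by the OData service
            }
        }
    }
}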


Now that the data is exposed, let's consume it. I created a transaction view in my SAPUI5 project and added a table and a viz chart to display the transaction data.

Figure 12: transaction.view.js

Figure 13: transaction.html


Below is the final UI for this example.

Figure 14: Transactions page
