Technology Blogs by SAP
Learn how to extend and personalize SAP applications. Follow the SAP technology blog for insights into SAP BTP, ABAP, SAP Analytics Cloud, SAP HANA, and more.
ericdu
Associate

Note: This blog is not up to date, as new functionality has been introduced in later HANA revisions.


As a technology advisor to startups, I have the opportunity to work with innovative startups from many different areas. One area that has been particularly popular this year is IoT. Real-time streaming data processing is a topic many startups have asked me to share, so I'd like to introduce the new Smart Data Streaming capability in SPS09 of HANA with some sample use cases.

Smart data streaming provides the ability to process incoming event streams in real time, as fast as the data arrives: capturing the desired data in the SAP HANA database, as well as monitoring the data and generating alerts and notifications as desired. Data flows into a streaming project running on the streaming cluster through input adapters, which provide the connectivity to data sources. Continuous queries contained in the streaming project turn the raw input into the desired output. Data is then pushed out to destinations through output adapters, which are connected to the streams or windows in the project.
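As a minimal illustration of this pipeline, a streaming project defines input streams fed by adapters and continuous queries that derive the output. The sketch below uses invented names (SensorIn, HighReadings) purely for illustration; it is not part of the exercises that follow:

```sql
-- Raw events arrive on an input stream (normally via an input adapter).
CREATE INPUT STREAM SensorIn SCHEMA (
  SENSOR_ID string,
  READING float );

-- This continuous query runs indefinitely: every event arriving on
-- SensorIn that passes the filter is immediately emitted on HighReadings,
-- where an output adapter could persist it or raise a notification.
CREATE OUTPUT STREAM HighReadings AS
  SELECT SensorIn.SENSOR_ID, SensorIn.READING
  FROM SensorIn
  WHERE SensorIn.READING > 100;
```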



Go to the Landscape view of HANA Studio Administration Console to check the services and make sure the streaming server is running.

Install HANA Streaming Plug-in in HANA Studio:


You can get the installer from your instructor. Go to Help->Install New Software, click Add and select the path where you put the installer, then follow the on-screen instructions. After the installation, restart your HANA Studio; you should now see two additional perspectives, SAP HANA Streaming Development and SAP HANA Streaming Run-Test.

Add Streaming Server and Data Services


Switch to the SAP HANA Streaming Run-Test perspective. In the Server View panel, click to add a new server URL, esps://<Your HOST>:30026, and log in with a HANA user name and password; you can use the SYSTEM user. You can also create new workspaces in which to put your streaming projects later; here we have created a workspace "sfp".



Now go to the SAP HANA Streaming Development perspective to add the data services. Right click the streaming server and click "Add ODBC Service", choose the 64-bit version of the driver, enter the HANA user and password, check "enable as HANA reference service", and specify the ODBC DSN. If you do not know the ODBC DSN, please ask your instructor; the ODBC DSN must be configured on the Linux server of your HANA system.



Here is an example of an ODBC DSN configuration in Linux: a .odbc.ini file must exist in the home directory of the hdbadm user of the HANA system (by default /usr/sap/HDB/home). Below is the DSN definition in our example.

[hanaservice]
Driver=/hana/shared/HDB/hdbclient/libodbcHDB.so
ServerNode=10.248.135.107:30015
CHAR_AS_UTF8=1


Now save the data service and right click to discover the server; you can see that the Schemas, Tables, and Table Columns from the HANA system are visible.


Go to the Preferences and change the default compile server (localhost) to your streaming server: in the following screen, select the Change button and choose your server in the prompt.


Now all the prerequisites have been fulfilled, and we can start creating the streaming project to simulate a real IoT use case.


The Streaming Use Case


Scenario: a simple IoT scenario where we collect and monitor sensor data from refrigeration units. The coolers are equipped with sensors for temperature, power, and door open/close events. The data structure of the messages generated by the refrigerators looks like this:


We will create streaming projects that assume the incoming event data looks like the following table, which represents events related to Door, Temperature, and Power.


MACHINEID | EVENT_TIME | EVENT_TYPE | EVENT_NAME | EVENT_VALUE
2DDDBW3TP | 2015-01-20 12:00:00.000 | TEMP | InsideTemp | 45
2DDDBW3TA | 2015-01-20 12:00:01.000 | DOOR | DoorClose | DoorClose
2DDDBW3TP | 2015-01-20 12:00:02.000 | TEMP | InsideTemp | 25
2DDDBW3TB | 2015-01-20 12:00:03.000 | DOOR | DoorOpen | DoorOpen
2DDDBW3TC | 2015-01-20 12:00:06.000 | POWER | PowerOn | PowerOn


These are some common use cases:


1. Filter the data to capture all the "door events" (open/close events) in a HANA table that represents the activity of the coolers (Objects: Filter window object, HANA Output object)

2. Create a moving average of the temperature for each cooler and monitor that the temperature of each cooler stays in the desired range; if not, add an alert to an output window (Objects: Join window object, Average window object, Derived window object)


Exercise 1 - Capture the Door Events


Let's start with the simplest use case: filtering out some events and capturing only certain types of events into HANA. In this case, we only want the "door events", using a Filter on a Stream/Window and the HANA Output object in Smart Data Streaming.


Create the Streaming Project: Go to the SAP HANA Streaming Development perspective of HANA Studio, right click in the Project Explorer panel, click New->Others, select SAP HANA smart data streaming->New Streaming Project, give a project name “door_event” and click Finish to create the project.


Rename the default stream to MACHINEDATA, add the columns MACHINEID, EVENT_TIME, EVENT_NAME, EVENT_DESCRIPTION and EVENT_VALUE to the stream, and create the filter to ensure that EVENT_NAME is equal to 'DOOR'. Create a HANA Output from the Output Adapters and configure the adapter properties as shown below. The target database table here will be "sfp.iot.streaming.data::door_events" (the table must already have been created).

Connect the stream object with the Output and save the project. Select the project name, right click, and go to SAP HANA smart data streaming->Compile Streaming Project. If no error message shows up, your project has been compiled and is ready to run.

There is a corresponding .ccl file in your project, which contains the generated code of the project. CCL stands for Continuous Computation Language and is the primary event processing language of SAP HANA smart data streaming. CCL is based on Structured Query Language (SQL), adapted for stream processing. The key distinguishing feature of CCL is its ability to continuously process dynamic data. A SQL query typically executes only once each time it is submitted to a database server and must be resubmitted every time a user or an application needs to run it. By contrast, a CCL query is continuous: once it is defined in the project, it is registered for continuous execution and stays active indefinitely. We will cover more of this in the other examples.
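For reference, the CCL generated for this door-event project should look roughly like the sketch below. The input stream and filter follow the columns defined above; the adapter name, type, and property values are illustrative assumptions, since the exact adapter configuration depends on your system:

```sql
-- Incoming sensor events (columns as defined in the stream above)
CREATE INPUT STREAM MACHINEDATA SCHEMA (
  MACHINEID string,
  EVENT_TIME msdate,
  EVENT_NAME string,
  EVENT_DESCRIPTION string,
  EVENT_VALUE string );

-- Continuous query: keep only the door events
CREATE OUTPUT STREAM DOOR_EVENTS AS
  SELECT * FROM MACHINEDATA
  WHERE MACHINEDATA.EVENT_NAME = 'DOOR';

-- HANA Output adapter persisting the filtered stream
-- (adapter type and property names here are approximate)
ATTACH OUTPUT ADAPTER HANA_Output1 TYPE hana_out TO DOOR_EVENTS
  PROPERTIES
    service = 'hanaservice',
    table = 'sfp.iot.streaming.data::door_events';
```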

Now select the project and right click SAP HANA smart data streaming->Run->Run Streaming Project in Workspace <Your Workspace>. Studio will switch to the SAP HANA Streaming Run-Test perspective automatically, where you can see the running project and the input stream object MACHINEDATA; double click it to show the stream view of the object.

In the real world, the devices/machines connect to the streaming server through various adapters for all kinds of source systems. Here we just simulate this using the Manual Input to represent the incoming events. I am going to send these two events; as you can imagine, the first event will be filtered out, while the second one is a door event and will be captured by SAP HANA.

MACHINEID | EVENT_TIME | EVENT_TYPE | EVENT_NAME | EVENT_VALUE
2DDDBW3TP | 2015-01-20 12:00:00.000 | TEMP | InsideTemp | 45
2DDDBW3TA | 2015-01-20 12:00:01.000 | DOOR | DoorClose | DoorClose

As you can see here, two messages have been sent to the streaming server, but only the door event shows up in the stream view and in the HANA table "sfp.iot.streaming.data::door_events".
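If you want to double-check on the database side, a simple query against the target table should return just the captured door event. Note that the "SFP" schema name is an assumption based on this system's setup; adjust it to wherever the table was created:

```sql
-- Only the DoorClose event should have been persisted
SELECT MACHINEID, EVENT_TIME, EVENT_NAME, EVENT_VALUE
FROM "SFP"."sfp.iot.streaming.data::door_events";
```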

Exercise 2 - Monitor Moving Average of Temperature


We will create a moving average of the temperature for each cooler and monitor that the temperature of each cooler is in the desired range; if not, we add an alert to an output window. We will use the Join window object, the Average window object and the Derived window object in this project.


Go to the SAP HANA Streaming Development perspective of HANA Studio, right click in the Project Explorer panel, click New->Others, select SAP HANA smart data streaming->New Streaming Project, give a project name "avg_temperature" and click Finish to create the project. You will see the .ccl, .ccr and .cclnotation files of the project.


Like in the previous project, rename the default stream object to MACHINEDATA and add the same columns with the appropriate types.


On the right side of the canvas, go to Streams and Windows and select the Reference, which refers to existing tables in HANA. Here we will use the table "sfp.iot.streaming.data::machine", which stores the detail information of each refrigeration unit, e.g. the location, the name, and the Max and Min temperature of each unit.


Join the Reference with the MACHINEDATA stream object based on the MACHINEID column. The MACHINEDATA stream object represents the real-time streaming data, while MACHINE_DETAILS_REFERENCE is the Reference object representing the static data of the machine specifications; it is very common in the real world to join these kinds of data in stream processing.

Here is the join expression prompt:

As we are going to monitor the average temperature, we then need to create the Aggregate object to calculate the average temperature over every 1000 records with the formula highlighted below; also specify the Group By condition and filter out those events that are not related to temperature.

Finally, create a Derived Window object for the alarm: when the moving average temperature is higher than the Max temperature of the machine, send out the alarm.

The entire project will look like the screenshot below. Compile the project and fix any errors you see in the console.

Below is the CCL code of the project. The code itself explains the workflow and should be easy to understand.

CREATE INPUT STREAM MACHINEDATA SCHEMA (
  MACHINEID string,
  EVENT_TIME msdate,
  EVENT_NAME string,
  EVENT_DESCRIPTION string,
  EVENT_VALUE string );

CREATE REFERENCE MACHINE_DETAILS_REFERENCE SCHEMA (
  MACHINEID string,
  MACHINE_TYPE string,
  MACHINE_NAME string,
  MAX_TEMP integer,
  MIN_TEMP integer,
  LOCATION string,
  TEMP_UNIT string )
  PRIMARY KEY ( MACHINEID )
  PROPERTIES service = 'hana',
    sourceSchema = 'SFP',
    source = 'sfp.iot.streaming.data::machine';

/**@SIMPLEQUERY=JOIN*/
CREATE OUTPUT STREAM EVENTS AS
  SELECT MACHINEDATA.MACHINEID MACHINEID,
    MACHINEDATA.EVENT_TIME EVENT_TIME,
    MACHINEDATA.EVENT_NAME EVENT_NAME,
    MACHINEDATA.EVENT_DESCRIPTION EVENT_DESCRIPTION,
    MACHINEDATA.EVENT_VALUE EVENT_VALUE,
    MACHINE_DETAILS_REFERENCE.MACHINE_TYPE MACHINE_TYPE,
    MACHINE_DETAILS_REFERENCE.MACHINE_NAME MACHINE_NAME,
    MACHINE_DETAILS_REFERENCE.MAX_TEMP MAX_TEMP,
    MACHINE_DETAILS_REFERENCE.MIN_TEMP MIN_TEMP,
    MACHINE_DETAILS_REFERENCE.LOCATION LOCATION,
    MACHINE_DETAILS_REFERENCE.TEMP_UNIT TEMP_UNIT
  FROM MACHINEDATA INNER JOIN MACHINE_DETAILS_REFERENCE
    ON MACHINEDATA.MACHINEID = MACHINE_DETAILS_REFERENCE.MACHINEID;

/**@SIMPLEQUERY=AGGREGATE*/
CREATE OUTPUT WINDOW AVG_TEMP PRIMARY KEY DEDUCED
  KEEP EVERY 1000 ROWS AS
  SELECT EVENTS.MACHINEID MACHINEID,
    EVENTS.EVENT_TIME EVENT_TIME,
    EVENTS.EVENT_NAME EVENT_NAME,
    EVENTS.EVENT_DESCRIPTION EVENT_DESCRIPTION,
    avg ( to_integer ( EVENTS.EVENT_VALUE ) ) AVG_TEMP,
    EVENTS.MACHINE_TYPE MACHINE_TYPE,
    EVENTS.MACHINE_NAME MACHINE_NAME,
    EVENTS.MAX_TEMP MAX_TEMP,
    EVENTS.MIN_TEMP MIN_TEMP,
    EVENTS.LOCATION LOCATION,
    EVENTS.TEMP_UNIT TEMP_UNIT
  FROM EVENTS
  GROUP FILTER EVENTS.EVENT_NAME = 'TEMP'
  GROUP BY EVENTS.MACHINEID;

CREATE OUTPUT WINDOW ALARM_TEMP PRIMARY KEY DEDUCED AS
  SELECT AVG_TEMP.MACHINEID MACHINEID,
    AVG_TEMP.EVENT_TIME EVENT_TIME,
    AVG_TEMP.LOCATION LOCATION,
    AVG_TEMP.AVG_TEMP AVG_TEMP,
    AVG_TEMP.MAX_TEMP MAX_TEMP,
    'TEMP' ALARM_TYPE,
    'Machine not maintaining temperature' ALARM_DESC
  FROM AVG_TEMP
  WHERE AVG_TEMP.AVG_TEMP > AVG_TEMP.MAX_TEMP;


Now select the project and right click SAP HANA smart data streaming->Run->Run Streaming Project in Workspace <Your Workspace>. Studio will switch to the SAP HANA Streaming Run-Test perspective automatically, where you can see the running project and its stream/window objects, including MACHINEDATA; double click any of them to monitor the data.

Again, we will use the Manual Input to simulate the incoming events; the event data is shown in the table below.

MACHINEID | EVENT_TIME | EVENT_TYPE | EVENT_NAME | EVENT_VALUE
2DDDBW3TA | 2015-01-20 12:00:00.000 | TEMP | InsideTemp | 80
2DDDBW3TA | 2015-01-20 12:00:02.000 | TEMP | InsideTemp | 55
2DDDBW3TA | 2015-01-20 12:00:03.000 | TEMP | InsideTemp | 66
2DDDBW3TA | 2015-01-20 12:00:04.000 | TEMP | InsideTemp | 60

Here you can see the list of Stream/Window objects for the current project. From the console, you can see that four messages have been sent to the streaming server. For the EVENTS object, which is the JOIN of MACHINEDATA and MACHINE_DETAILS_REFERENCE, you can see the name, location and other information of the machines that is stored in the existing HANA table.

For the AVG_TEMP window, you can see that the moving average value has been calculated; with the four events above, the average for machine 2DDDBW3TA works out to (80 + 55 + 66 + 60) / 4 = 65.25.

For the ALARM_TEMP window, you can see the alert like below. This represents the output of the stream processing; in the real world, you may feed this result into your monitoring dashboard, send an alert to the onsite technician to fix the problem, or take any other necessary action.

Appendix:


Monitor Streaming with SAP HANA Cockpit:


SAP HANA cockpit allows you to manage the nodes, workspaces, adapters and projects involved in streaming, and to monitor streaming alerts and memory usage. There are seven tiles in the Streaming catalog, which can be added to a new or existing cockpit group.

You will need to assign these two roles in order to see the Streaming tiles in HANA Cockpit.


CALL _SYS_REPO.GRANT_ACTIVATED_ROLE('sap.hana.admin.roles::Monitoring','SYSTEM');

CALL _SYS_REPO.GRANT_ACTIVATED_ROLE('sap.hana.streaming.monitoring.roles::Monitoring','SYSTEM');


Select the system, right click and choose Configuration and Monitoring->Open SAP HANA Cockpit to launch it:
