As with any application that handles big data, archiving plays an important role: data that is rarely accessed needs to be archived periodically. Memory is more expensive than disk, so hot data (frequently accessed) should reside in SAP HANA and cold data (rarely accessed) on disk.
This article explains how data related to a business scenario in SAP Operational Process Intelligence can be archived (cold data moved to disk) using SAP HANA Dynamic Tiering and SAP HANA Data Lifecycle Manager (part of SAP HANA Data Warehousing Foundation).
We also explain how archived data (cold data) can be brought back to SAP HANA.
For more details, see the product documentation for:
- SAP Operational Process Intelligence
- SAP HANA Dynamic Tiering
- SAP HANA Data Lifecycle Manager
Prerequisites
- SAP Operational Process Intelligence 1.12 or higher
- SAP HANA Dynamic Tiering installed and configured
- Data Lifecycle Manager installed and configured
Data that will be archived
- Completed or abruptly ended instances of a business scenario
- Completed tasks associated with a business scenario
- Completed checklists associated with a business scenario
Required Authorizations
Procedure to perform archiving
The following procedure archives data for a single scenario. To archive data for other scenarios, create separate profiles for each of them.
Step 1: Launch Data Lifecycle Manager
To start the Data Lifecycle Manager tool, depending on whether HTTP or HTTPS port has been configured, enter one of the following URLs in your browser:
- http://<<Host>>:80<<SAP HANA instance number>>/sap/hdm/dlm
- https://<<Host>>:43<<SAP HANA instance number>>/sap/hdm/dlm
For example, for host myhost and instance number 00, the HTTP URL would be http://myhost:8000/sap/hdm/dlm.
Step 2: Create Storage Destination
Create a storage destination with the type “HANA Dynamic Tiering Local”. After creating and saving the storage destination, activate it.
Refer to the section “Managing Storage Destinations” in the Data Lifecycle Manager guide for details on creating a storage destination.
Note:
- This step is to be performed only for the first scenario that you archive, as the same storage destination can be used across scenarios.
Step 3: Create Table Groups
Create two table groups, one for data related to the business scenario and the other for tasks.
Refer to the section “Managing Modeled Persistence Objects” in the Data Lifecycle Manager guide to create table groups.
Step 4: Create Lifecycle Profiles
Three lifecycle profiles need to be created.
Refer to the section “Managing Lifecycle Profiles” in the Data Lifecycle Manager guide for details on creating lifecycle profiles.
Business Scenario Profile
- In the Source Persistence tab:
- Source persistence type - “SAP HANA table Group”
- Table Group Name – Business scenario tables group
- Trigger Type - “Scheduled”
- In the Destination Attributes tab ensure the following:
- Relocation Direction - “Hot to Cold”
- Clash strategy Hot to Cold - “Overwrite”
- Go to the Rules Editor tab and add the query "BusinessScenario_query" attached to this article. Replace the placeholder <<<EVT>>> with the fully qualified name of the scenario EVT table. Adjust the parameter of "ADD_DAYS"; for example, use -90 to archive all business scenario instances that completed more than 90 days ago. Click “Validate Syntax” to verify the query.
- Save and Activate the profile.
- The profile should be activated without any errors.
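To illustrate the shape of such a rule (the attached BusinessScenario_query is authoritative; the column names SCENARIO_INSTANCE_ID, STATUS, END_TIME and the status values below are assumptions for illustration only), a DLM rule predicate might look like:

```sql
-- Illustrative sketch only: column names and status values are assumed;
-- use the attached BusinessScenario_query as the authoritative rule.
-- Selects instances that completed (or ended abruptly) more than 90 days ago.
"SCENARIO_INSTANCE_ID" IN (
    SELECT "SCENARIO_INSTANCE_ID"
    FROM <<<EVT>>>  -- replace with the fully qualified scenario EVT table name
    WHERE "STATUS" IN ('COMPLETED', 'ABORTED')
      AND "END_TIME" < ADD_DAYS(CURRENT_DATE, -90)
)
```

Making the ADD_DAYS offset more negative keeps more data hot; -90 keeps the last 90 days in SAP HANA.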
Task Profile
- The Relocation Direction, Clash Strategy, Source Persistence Type, and Trigger Type should be the same as for the business scenario profile.
- Set the Table Group Name to the task tables group.
- Go to the Rules Editor tab and add the query "Task_query" attached to this article. Replace the placeholder <<<EVT>>> with the fully qualified name of the scenario EVT table and <<<scenario_def_id>>> with the scenario definition ID.
- Note: The scenario definition ID can be found in the event log table of the scenario. Use the value in the “SCENARIO_DEF_ID” column as the value for <<<scenario_def_id>>> in the query.
- Change the parameter of "ADD_DAYS" to the same value as in the business scenario profile. Click Validate Syntax to verify the query.
- Save and Activate the profile.
- The profile should be activated without any errors.
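Again for illustration only (the attached Task_query is authoritative; the column names and status value here are assumptions), a task rule predicate could take roughly this shape:

```sql
-- Illustrative sketch only: column names and status value are assumed;
-- use the attached Task_query as the authoritative rule.
-- Selects completed tasks whose scenario instances ended more than 90 days ago.
"STATUS" = 'COMPLETED'
AND "SCENARIO_INSTANCE_ID" IN (
    SELECT "SCENARIO_INSTANCE_ID"
    FROM <<<EVT>>>  -- fully qualified scenario EVT table
    WHERE "SCENARIO_DEF_ID" = '<<<scenario_def_id>>>'
      AND "END_TIME" < ADD_DAYS(CURRENT_DATE, -90)
)
```

The ADD_DAYS offset must match the business scenario profile so that tasks and their instances are archived together.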
Checklist Profile
- In the Source Persistence tab:
- Source persistence type - “SAP HANA Table”
- Schema - "SYS_PROCESS_VISIBILITY"
- Table - "sap.opi.pv.insight2action::CHECKLIST_REFERENCE"
- The Relocation Direction, Clash Strategy, and Trigger Type should be the same as for the business scenario profile.
- Go to the Rules Editor tab and add the query "Checklist_query" attached to this article. Replace the placeholder <<<EVT>>> with the fully qualified name of the scenario EVT table. Click Validate Syntax to verify the query.
- Save and Activate the profile.
- The profile should be activated without any errors.
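As with the other two profiles, the attached Checklist_query is authoritative; purely as an illustration (column names and the status filter are assumptions), a checklist rule predicate might look like:

```sql
-- Illustrative sketch only: column names and status values are assumed;
-- use the attached Checklist_query as the authoritative rule.
-- Selects checklist references belonging to completed scenario instances.
"SCENARIO_INSTANCE_ID" IN (
    SELECT "SCENARIO_INSTANCE_ID"
    FROM <<<EVT>>>  -- fully qualified scenario EVT table
    WHERE "STATUS" IN ('COMPLETED', 'ABORTED')
)
```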
Step 5: Run the lifecycle profiles
The lifecycle profiles created above have to be scheduled to run at specific intervals so that cold data is moved to SAP HANA Dynamic Tiering; archived data can no longer be viewed in space.me.
- Click "Run" -> "Schedule" to run archiving at the required frequency.
- Check the logs in the Data Lifecycle Manager tool to verify the results.
Procedure to restore archived data
If you want to view the archived data in space.me, you need to move the archived business scenario data (cold data) back to SAP HANA by following the procedure below.
Step 1: Edit the profiles
Business Scenario Profile
- In the Destination Attributes tab ensure the following:
- Relocation Direction - “Cold to Hotter”
- Clash strategy Cold to Hot - “Skip”
- In the Rules Editor tab, change the query to "SCENARIO_INSTANCE_ID" IN (-999).
- Note: This is a dummy ID so that all data from dynamic tiering tables is moved back to SAP Operational Process Intelligence tables.
- Save and Activate the profile.
- The profile should be activated without any errors.
Task Profile
- Ensure the Relocation Direction and Clash Strategy Cold to Hot are set as described above.
- In the Rules Editor tab, change the query to "TASK_ID" IN (-999).
- Note: This is a dummy ID so that all data from dynamic tiering tables is moved back to SAP Operational Process Intelligence tables.
- Save and Activate the profile.
- The profile should be activated without any errors.
Checklist Profile
- Ensure the Relocation Direction and Clash Strategy Cold to Hot are set as described above.
- In the Rules Editor tab, change the query to "CHECKLIST_ID" IN (-999).
- Note: This is a dummy ID so that all data from dynamic tiering tables is moved back to SAP Operational Process Intelligence tables.
- Save and Activate the profile.
- The profile should be activated without any errors.
Step 2: Run the lifecycle profiles
Run the lifecycle profiles so that the required business scenario data is moved back to SAP HANA; you will then be able to view the data in space.me.
- Click "Run" -> "Schedule" to run the relocation at the required frequency.
- Check the logs in the Data Lifecycle Manager tool to verify the results.