tomas-krojzl
Active Contributor

During SAP Sapphire in Orlando I got the opportunity to attend strategy sessions related to SAP HANA technology. I must say that I really appreciate that SAP is interested in getting feedback on its products.

Therefore I decided to try a new kind of experiment: to write a “brainstorming blog”. Below you can see some ideas on how SLT replication could be improved. Feel free to criticize them if you disagree, or to append your own ideas. Maybe the SLT team will find these ideas inspiring and we will influence their direction. Let’s see.

All suggestions are divided into two main areas based on the associated technology.

SAP HANA Studio

1.) SLT heartbeat detection

As you might know, SAP HANA Studio plays only a passive role in SLT replication. All information you can find in the Data Provisioning screen is taken from local SAP HANA tables.

For example, the list of replicated systems is stored in table RS_REPLICATION_COMPONENTS in schema SYS_REPL, and the replication status for each table is stored in table RS_STATUS in the corresponding schema. Likewise, actions performed by users are not executed directly but only recorded in table RS_ORDER (or table RS_ORDER_EXT).
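To make this concrete, the queries below show roughly what SAP HANA Studio reads. Take this as a sketch only: the table names are the ones mentioned above, but the exact column layout can differ between SLT/DMIS versions.

     -- list of connected replication configurations
     SELECT * FROM "SYS_REPL"."RS_REPLICATION_COMPONENTS";

     -- per-table replication status in the target schema (fields ACTION and STATUS)
     SELECT * FROM "<replication_schema>"."RS_STATUS";

     -- queued user actions waiting to be picked up by SLT
     SELECT * FROM "<replication_schema>"."RS_ORDER";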

The SLT system monitors these command tables and in turn updates the current activity in the status tables.

This passive approach creates quite a lot of room for error. If SLT is not working properly, or not at all, there is no way to see this from the Data Provisioning screen, where everything appears to be fine.

The potential solution is quite simple. SLT could update a timestamp in a dedicated SAP HANA table at regular intervals, and the Data Provisioning cockpit could interpret this value. If the timestamp has not been updated for a certain period, there is a very high chance that SLT is in trouble.
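A minimal sketch of what I have in mind (the heartbeat table and its columns are hypothetical, not an existing SLT object):

     -- SLT side: refresh the heartbeat at regular intervals (e.g. every minute)
     UPDATE "SYS_REPL"."RS_HEARTBEAT"
        SET LAST_ALIVE = CURRENT_UTCTIMESTAMP
      WHERE MT_ID = '001';

     -- cockpit side: flag configurations whose heartbeat is older than 5 minutes
     SELECT MT_ID
       FROM "SYS_REPL"."RS_HEARTBEAT"
      WHERE SECONDS_BETWEEN(LAST_ALIVE, CURRENT_UTCTIMESTAMP) > 300;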

2.) Easy resolution of replication errors

When table replication is in error status, there is nothing you can do to resolve this state from SAP HANA Studio. The only way is to use the advanced monitoring workbench in the SLT system, which requires very specific knowledge.

Customers should be able to run some kind of “auto-repair” function directly from the Data Provisioning screen. This function would attempt a consistency check and clean-up and, if required and after user confirmation, also re-provision the given table. No advanced knowledge should be required.

The customer should also be offered a simple explanation of the nature of the error, whom to call, and where to continue the investigation.

3.) Overall replication progress bar, table load progress

When you start replication of multiple tables, all you can do is passively wait. You might guess the overall progress based on the updated status of the tables, but no progress information is displayed.

One solution would be to add a simple progress bar to the Data Provisioning screen and to display the overall replication status as text (for example, similar to the way R3load status is presented during a migration: “Replication status: running 3, waiting 8, completed 13, failed 1, total 25”).
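The numbers for such a status line could be derived directly from the status table the screen already reads, for example (sketch only, the actual status codes may differ):

     SELECT STATUS, COUNT(*) AS TABLES
       FROM "<replication_schema>"."RS_STATUS"
      GROUP BY STATUS;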

Another inconvenience is the missing replication progress for each table, especially while the initial load is running. For big tables the initial load (or a load operation in general) can take dozens of minutes or even a few hours.

You can still manually check the number of rows in the source table (either by querying database statistics or by querying ABAP statistics from table DBSTATTORA). Then you can check the number of rows already loaded into the SAP HANA database (using the Show Definition function). Comparing these two values gives you a hint about the progress of the load operation.

However, this is a tedious manual process that could easily be automated.
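For illustration, the manual check boils down to two row counts; table MSEG is used purely as an example, and taking the source figure from the database statistics (e.g. DBSTATTORA) would avoid the expensive count on the source side:

     -- source system: row count (or take the value from database/ABAP statistics)
     SELECT COUNT(*) FROM MSEG;

     -- SAP HANA: rows loaded so far by the initial load
     SELECT COUNT(*) FROM "<replication_schema>"."MSEG";

     -- progress (%) is then roughly 100 * loaded_rows / source_rows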

4.) Initial load estimation / Re-provisioning estimation

When you are asked to provision a table, the question that usually follows is “How long will that take?” Currently there is no way to predict this, especially when you are doing the first replication on new hardware. You might have a very rough estimate based on the size of the tables, but this can be very inaccurate.

Again, the solution can be relatively simple. All that is required is for SLT to collect various statistics, which (if allowed by the customer) can then be sent to SAP for analysis.

The following information should be collected:

  • hardware configuration where SLT is running, which can be used to calculate the first variable, representing the “power” of the machine (HW_POWER)
  • table name (and corresponding structure), which can be used to estimate the complexity of the table or to directly assign a complexity to well-known SAP tables (TABLE_COMPLEXITY)
  • number of records in the table and size of the table, representing the size factor of the replication (TABLE_SIZE)
  • replication duration, i.e. how much time the initial load took (REPLICATION_TIME)

These values can then be used in the following formula to find the proper generic variables:

     REPLICATION_TIME = TABLE_SIZE * TABLE_COMPLEXITY / HW_POWER

Of course, historical values collected by SLT can then be reused if the table needs to be provisioned again.
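As a sketch (the statistics table REPL_LOAD_STATS is hypothetical; it would simply hold the four values above for every completed load), the machine constant could be derived from history and then plugged back into the formula for a new table:

     -- derive HW_POWER from loads already completed on this hardware
     SELECT AVG(TABLE_SIZE * TABLE_COMPLEXITY / REPLICATION_TIME) AS HW_POWER
       FROM REPL_LOAD_STATS;

     -- estimate for a new table:
     --    REPLICATION_TIME = TABLE_SIZE * TABLE_COMPLEXITY / HW_POWER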

The Data Provisioning screen in SAP HANA Studio should show details about the table or selected tables to be provisioned, including the time estimate.

SLT system

1.) Consistency check and Clean-up functions

I really love SLT replication; it is my favourite way of replicating into SAP HANA. However, I must say that things are not always working as they should. Although the replication principle is very simple, the implementation is so abstract that there is huge room for error. And errors happen more often than can be considered normal.

I have no constructive ideas on preventing errors. However, I do have some ideas on troubleshooting them.

A definitely useful function would be the possibility to run a consistency check for given objects. It has happened to me multiple times that the status in SAP HANA (table RS_STATUS, fields ACTION and STATUS) was different from the status in SLT (table IUUC_RS_STATUS, fields ACTION and STATUS). The error is quite obvious, yet there is no way to fix it without running an update query at the database level in the SLT system, in SAP HANA, or in both.
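Conceptually the check is a simple comparison. The sketch below assumes both status tables are visible from one place (for example via a remote source) and that both identify the table in a TABLE_NAME column; the real column names and access path may differ:

     SELECT h.TABLE_NAME,
            h.ACTION AS HANA_ACTION, h.STATUS AS HANA_STATUS,
            s.ACTION AS SLT_ACTION,  s.STATUS AS SLT_STATUS
       FROM "<replication_schema>"."RS_STATUS" AS h
       JOIN "<slt_schema>"."IUUC_RS_STATUS"    AS s
         ON s.TABLE_NAME = h.TABLE_NAME
      WHERE h.ACTION <> s.ACTION
         OR h.STATUS <> s.STATUS;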

A similar problem can be observed with the RS_ORDER tables. Sometimes these tables have multiple “last” entries for the same replicated table. It can also happen that when a table is de-provisioned, it is removed from the table list in transaction IUUC_SYNC_MON but still exists in the Mass Transfer definition, and there is no way to get rid of it.

A fantastic function would be a consistency check in which all these objects are validated against each other and all inconsistencies are removed. In case of an unclear state, the user could be asked for a decision.

“Orphaned” entries should also be automatically identified and removed during SLT start-up to keep the system clean and tidy.

2.) Purge functions

This function should come with the following variants:

  • Purge of the whole SLT system
  • Purge of a specific Mass Transfer ID
  • Purge of a specific table

Another nice function would be to purge the configuration: to remove EVERYTHING from SLT regarding a specific table, as if it had never been replicated by SLT for this particular Mass Transfer. This function would remove all entries related to the given table in the given Mass Transfer, including possible inconsistencies, without impacting other tables replicated by SLT. The table could then be safely provisioned again without risking collisions with obsolete entries.

The same function should be available at the Mass Transfer level (to clean up everything in a given Mass Transfer definition) and also at the level of the whole SLT system (to bring it back to a post-installation state, including removal of all obsolete Mass Transfer IDs).

Of course, corresponding purge actions should also be executed in the source systems.

3.) Replication Statistics

Detailed statistics about the replication process should be available:

  • how many records were replicated during the last period (for example, on an hourly basis)
  • how much time was spent on replication activities
  • how much time was spent reading from the source system and how much writing to SAP HANA (to determine where the replication time goes)
  • the minimum, average and maximum utilization of background jobs, indicating whether more background jobs should be allocated

All these statistics would provide additional insight into the replication process, offering the possibility to understand if and how the SLT system should be adjusted.

4.) Visualization of replication process

Every activity in SLT is composed of a series of steps. For example, the replication process consists of the initial load followed by the ongoing replication. The initial load can be broken down further into activities such as table deletion in the source system, table creation in the source system, table creation in the SLT system, creation of the logging table, generation of runtime objects, calculation of the access plan, trigger creation in the source system, and so on.

It is not very clear which activities are performed and, in case of an issue, where exactly the replication was interrupted. It would help to have details for each table, such as a tree of steps with semaphore lights, making it possible to watch gray lights turn green, or in case of trouble turn red, pinpointing the step where the error occurred.

Such a feature would allow everyone to better understand the steps being performed and would also enable more effective problem determination.

5.) Troubleshooting wizard

Once an error in the replication process is discovered (either by the consistency check or from the visualization of the replication process), a troubleshooting wizard should be started, leading the user through problem determination and guiding them to the problem area.

A nice example of such a wizard can be seen when resolving data load problems in BW (in transaction RSA1).

6.) Dialog for replication adjustment (currently possible only through ABAP development)

SLT offers the possibility to adjust the replication process. Features like row filtering based on defined criteria, removing columns, adding new calculated columns, or changing a column’s data type are possible with SLT.

However, you need to develop new objects in ABAP and register them in SLT tables. SLT then automatically calls these objects to run the conversions mentioned above.

I believe that SAP should currently focus on stabilizing the product rather than adding new features; however, the possibility to adjust data types should be leveraged. A very simple dialog doing the code generation and registration, designed only for changing the data type of a particular table, would do the job. The justification for this need is explained in the next point.

7.) Data-type consistency with BO Data Services

This is a very important point. I must admit that I did not test with the latest versions; however, I would be surprised if this had changed.

There is a big inconsistency in data types between the BO Data Services and SLT replication technologies. SLT replicates data types in the same format as ABAP, which is often a serialized string representing the value. The best example here is a date field, which is stored as a YYYYMMDD-formatted string in ABAP and is replicated in the same way by SLT.

Everything is fine as long as you do not need to use multiple replication technologies.

The problem arises when you start using BusinessObjects Data Services. BO Data Services is designed to translate data between various systems. To allow this, BO Data Services always interprets source data into an internal format and then translates it into the format used by the target system. In other words, a date field stored as a serialized string in ABAP will be interpreted as a date value and then stored as data type “Date”.

Again, everything is fine as long as you are using only BO Data Services as the replication technology.

The core of the trouble is that you cannot easily join tables storing the date as a serialized string with tables storing the date as a value. You can achieve the functionality only by using formulas, but this approach leads to serious performance problems and long query execution times.
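A simple example of the pain (the schema and table names on the Data Services side are made up for illustration; ERDAT is the usual ABAP-style YYYYMMDD string after SLT replication):

     -- the SLT-replicated table stores the date as a 'YYYYMMDD' string,
     -- the BO Data Services target stores a real DATE column, so every row
     -- has to be converted during the join:
     SELECT *
       FROM "<slt_schema>"."VBAK"     AS slt
       JOIN "<bods_schema>"."ORDERS"  AS ds
         ON TO_DATE(slt.ERDAT, 'YYYYMMDD') = ds.ORDER_DATE;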

If you need to combine these two technologies, you have to make adjustments in one of the replication tools: either change BO Data Services to use the data types of SLT replication, or adjust SLT to convert to the data types used by BO Data Services.

The ideal situation would be if this adjustment could be done with the click of a button: some kind of “compatibility mode” that can be easily activated in BO Data Services and/or in SLT.

8.) Documentation

Last but not least, SLT needs documentation. SLT is currently designed as a black box where the admin does not need to know the internal mechanics. This is fine as long as SLT is working as expected. However, this is not the daily reality: SLT can run into problems, and then the admin is left without any guidance on how to solve the situation...
