Advice for unique client copy requirement for HANA data

abe_letkeman
Product and Topic Expert

With so many SAP products offering overlapping functionality, I know there are many ways to achieve what we need, but I'd like to start a discussion to help decide on the best solution. I don't want to miss any key considerations, including product roadmaps.

Background

  • We are developing an application that uses SAP HANA as its database.
  • As in most SAP products, multi-tenancy is defined at the application level: tables holding data for multiple clients/tenants carry a tenant/client field.
    • So far, the tenant key field is not called MANDT or CLIENT and is not in the first position of each table
  • The data dictionary is maintained outside of HANA. Core Data Services is not used.
  • This client does not have any NetWeaver instances.

Requirement

  • We need to be able to do a client/tenant copy. That is, copy all of the data from hundreds of tables where tenant_key = <tenant_code> to the respective tables in a different database.
  • Ideally, we would also do transforms on fields with sensitive data along the way if we're copying from production.
  • This needs to be configurable and repeatable between different databases (a rough sketch of what I have in mind follows this list).
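To make the requirement concrete, here is a minimal sketch of the copy-and-scramble loop I have in mind, written against SAP's hdbcli Python driver. The schema, table names, tenant column, and scramble rule are placeholders for illustration only; a real tool would also need to handle foreign keys, batching, and restartability.

```python
# Rough sketch (not a finished tool): copy one tenant's rows from a source
# HANA database to a target HANA database, scrambling sensitive columns on
# the way. All names below are placeholders.
import hashlib
from hdbcli import dbapi  # SAP's Python driver for HANA

# Hypothetical configuration: tables to copy and columns to scramble
TABLES = {
    "MYSCHEMA.CUSTOMERS": ["EMAIL", "PHONE"],
    "MYSCHEMA.ORDERS": [],
}
TENANT_COLUMN = "TENANT_KEY"

def scramble(value):
    """Replace a sensitive value with a repeatable but anonymized token."""
    if value is None:
        return None
    return hashlib.sha256(str(value).encode()).hexdigest()[:16]

def copy_tenant(src_conn, tgt_conn, tenant_code):
    for table, sensitive_cols in TABLES.items():
        src = src_conn.cursor()
        src.execute(
            f"SELECT * FROM {table} WHERE {TENANT_COLUMN} = ?", (tenant_code,)
        )
        col_names = [d[0] for d in src.description]
        placeholders = ", ".join("?" for _ in col_names)
        # Assumes the target table has the same column order as the source
        insert_sql = f"INSERT INTO {table} VALUES ({placeholders})"
        tgt = tgt_conn.cursor()
        for row in src.fetchall():
            row = list(row)
            for col in sensitive_cols:
                idx = col_names.index(col)
                row[idx] = scramble(row[idx])
            tgt.execute(insert_sql, row)
        tgt_conn.commit()

if __name__ == "__main__":
    # Connection details are placeholders
    source = dbapi.connect(address="prod-hana", port=30015, user="COPY_USER", password="...")
    target = dbapi.connect(address="test-hana", port=30015, user="COPY_USER", password="...")
    copy_tenant(source, target, tenant_code="TENANT_042")
```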

Possible solutions

  • SAP HANA Enterprise Information Management (EIM)
  • SAP Landscape Transformation
  • SAP BusinessObjects Data Services
  • SAP Test Data Migration Server (TDMS)

So far, Data Services looks like the best option because it offers better flexibility and management, specifically in choosing the source and destination. HANA EIM doesn't have a central management repository the way Data Services does.

I think TDMS relies on RFC, so I'm pretty sure that's not even a contender for that reason alone.

Is SLT flexible enough?

Or should the application-defined multi-tenancy approach be dropped entirely in favour of a HANA database container for each client? This generally seems to be the newer strategy and is well suited to hosted/cloud situations.
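For reference, if we went the container-per-client route, provisioning a new client would look roughly like the sketch below. This assumes a HANA multitenant database container (MDC) system; host, port, names, and password are placeholders, and exact syntax may vary by HANA revision.

```python
# Sketch only: create a dedicated tenant database per client on an MDC system.
# Connection details and names are placeholders.
from hdbcli import dbapi

# Connect to the system database (SQL port 3<instance>13 on MDC, e.g. 30013)
systemdb = dbapi.connect(address="hana-host", port=30013, user="SYSTEM", password="...")
cursor = systemdb.cursor()

# One database container per client instead of a tenant_key column in every table
cursor.execute("CREATE DATABASE CLIENT_042 SYSTEM USER PASSWORD Initial1234")
```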

I really appreciate your thoughts.  Thank you.

Accepted Solutions (0)

Answers (1)

Former Member

Hi Abe,

I am answering your questions based on the working knowledge and understanding of SLO tools that I have.

TDMS: TDMS should not be used to build productive systems; the whole purpose of the product is to build test systems, and different selection criteria are available for building them. It also offers predefined content for certain SAP business areas such as HR and Manufacturing. Yes, it primarily uses RFCs to read and write data, and the landscape needs to be defined with Sender, Receiver, and Central systems. Usually the TDMS software is installed on an SAP Solution Manager (the Central system). The tool also has a built-in scrambling function that can be used to scramble the data.

SAP BODS: This ETL tool can connect to any database over a database connection and perform the extraction, transformation, and loading. For loading to SAP systems you might need to use one of the SAP loading methods such as ALE/IDocs, BAPI, DI, or a function module.

For loading to any other database it is more like table-to-table inserts/updates using SQL commands (see the sketch below). The tool also has many built-in packages, for example address cleansing.
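When both source and target schemas are reachable from the same connection, the per-table work such a dataflow performs essentially boils down to SQL like the following. This is only a sketch with hypothetical table and column names; in BODS you would model this graphically rather than write the SQL by hand.

```python
# Sketch of the per-table insert an ETL dataflow effectively performs,
# including an in-flight transform on a sensitive column.
# Table and column names are hypothetical.
from hdbcli import dbapi

conn = dbapi.connect(address="hana-host", port=30015, user="ETL_USER", password="...")
cursor = conn.cursor()
cursor.execute(
    """
    INSERT INTO TARGET_SCHEMA.CUSTOMERS (CUSTOMER_ID, TENANT_KEY, EMAIL, NAME)
    SELECT CUSTOMER_ID,
           TENANT_KEY,
           'masked@example.com' AS EMAIL,  -- transform applied in flight
           NAME
      FROM SOURCE_SCHEMA.CUSTOMERS
     WHERE TENANT_KEY = ?
    """,
    ("TENANT_042",),
)
conn.commit()
```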

SAP SLT: It has the benefit of data replication from SAP to SAP, SAP to non-SAP systems (databases), and non-SAP (database) to SAP systems.

SLT uses ADBC (ABAP Database Connectivity) to talk to different databases. Performance is not an issue here, as the tool can handle large volumes, and parallelization is an option for data extraction, transformation, and load.

The SLT tool can be used for data replication to different databases using database trigger technology (you may need to check which databases are supported here).

SLT can also be used for data migration from a legacy database to SAP using the built-in MWB framework. SLT is installed on an ABAP stack; refer to this link for installing SLT: http://scn.sap.com/thread/3670842

Because SLT uses the SLO MWB framework, which has many built-in functions for monitoring and controlling data transfers, reprocessing failed jobs, etc., the development effort with SLT is comparatively low: the framework takes care of the majority of the work, and coding is required only for complex transformations.

Right tool in your case: BODS.

Reasons:

1) The client doesn't have NetWeaver instances, and as far as I know TDMS and SLT require a NetWeaver instance since they sit on an ABAP stack.

2) You do not require any predefined content by business area.

3) I'm not sure whether you want to utilize the database trigger feature of SLT.

Coming to your question about the client/tenant copy: ideally, one client per tenant may be a good design to segregate the data and also to enforce security measures for data access.

I don't have much knowledge of HANA database containers, so I can't answer that part.

Hope this info helps you.

Thanks,

Rajeev