First part of this blog can be found here:

SAP HANA - Modeling Content Migration - part 1: Preparations, Security Requirements, Export

This second part covers the import procedure and the required post-migration activities.

3.3 Transfer and adjust export of data content

Transfer the export files to the work directory of the target HANA system.
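
If both hosts can reach each other over the network, a plain copy over SSH is usually enough. A minimal sketch, assuming the export sits in the work directory of the source host (the source host name, SID and instance number are placeholders; the target host and path are the ones used in the examples below):

# copy the whole export tree to the target work directory
scp -r <source-host>:/usr/sap/<SID>/HDB<nr>/work/index hanapoc:/usr/sap/HD1/HDB07/work/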

Note: SAP most likely does not support editing export files. Do this at your own risk.

To change the schema location of the exported tables, perform the following steps:

3.3.1 Locate schema names in export files

hanapoc:/usr/sap/HD1/HDB07/work> grep -irl IDES

./index/IDES/ZC/ZCOPAACTUAL2/RuntimeData

./index/IDES/ZC/ZCOPAACTUAL2/create.sql

./index/IDES/ZC/ZCOPAFORECAST2/RuntimeData

./index/IDES/ZC/ZCOPAFORECAST2/create.sql

hanapoc:/usr/sap/HD1/HDB07/work> grep -irl TECHED2011

./index/TECHED2011/KN/KNA1/RuntimeData

./index/TECHED2011/KN/KNA1/create.sql

./index/TECHED2011/MA/MARA/RuntimeData

./index/TECHED2011/MA/MARA/create.sql

./index/TECHED2011/SC/SCAL1/RuntimeData

./index/TECHED2011/SC/SCAL1/create.sql

./index/TECHED2011/T0/T001W/RuntimeData

./index/TECHED2011/T0/T001W/create.sql

3.3.2 Adjust export files to use new schema names

find . -name RuntimeData -exec sed -i 's/indexid: IDES:/indexid: DATA_SLT_IDD800:/g' {} \;

find . -name create.sql -exec sed -i 's/"IDES"[.]/"DATA_SLT_IDD800"./g' {} \;

find . -name RuntimeData -exec sed -i 's/indexid: TECHED2011:/indexid: DATA_MAN_TECHEDCOPA:/g' {} \;

find . -name create.sql -exec sed -i 's/"TECHED2011"[.]/"DATA_MAN_TECHEDCOPA"./g' {} \;

Note: Use these substitutions with caution to avoid unwanted replacements.
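
If you want to see the effect before touching anything, you can run the same substitution without the -i flag on a single file and inspect the output. A minimal sketch using one of the create.sql files found in step 3.3.1:

# preview only: prints the rewritten statement, the file itself stays unchanged
sed 's/"IDES"[.]/"DATA_SLT_IDD800"./g' ./index/IDES/ZC/ZCOPAACTUAL2/create.sql | grep DATA_SLT_IDD800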

3.3.3 Rename directories to new schema names

mv index/IDES index/DATA_SLT_IDD800

mv index/TECHED2011 index/DATA_MAN_TECHEDCOPA
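
As a sanity check after the replacements and renames, the searches from step 3.3.1 should no longer return any files:

# both commands should now print nothing
grep -irl IDES .

grep -irl TECHED2011 .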

3.4 Target: Import of data content

In the Information Modeler perspective, select the Import option in the Content section.

Select the Tables option in the SAP HANA Studio folder and click Next.

On the next screen, choose the correct connection to the target database.

Confirm the location of the export files.

Provide the list of tables to import in the format <schema>.<table name>, separated by the colon character. Add them to the list and, for each table, select the format that was used during export (BINARY).

Note: The Next and Finish buttons will remain inactive until you define a format for every table in the list.

Click Finish and wait for the import to complete.

3.4.1 Target: Import of data content using SQL interface

Alternatively, you can use the SQL command IMPORT with the following syntax:

IMPORT "<schema1>"."<table1>"[,"<schema2>"."<table2>",...] AS BINARY FROM '<source directory>'

[WITH REPLACE | WITH  (REPLACE) CATALOG ONLY | WITH  (REPLACE) DATA ONLY]

In our case we will use this command:

IMPORT "DATA_MAN_TECHEDCOPA"."KNA1","DATA_MAN_TECHEDCOPA"."MARA",

"DATA_MAN_TECHEDCOPA"."SCAL1","DATA_MAN_TECHEDCOPA"."T001W",

"DATA_SLT_IDD800"."ZCOPAACTUAL2","DATA_SLT_IDD800"."ZCOPAFORECAST2"

AS BINARY FROM '/usr/sap/HD1/HDB07/work'
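
After the import, you can quickly compare record counts against the source system. A minimal sketch using hdbsql on the target host (the instance number 07 is taken from the work directory path; user and password are placeholders):

hdbsql -i 07 -u SYSTEM -p <password> "SELECT SCHEMA_NAME, TABLE_NAME, RECORD_COUNT FROM M_TABLES WHERE SCHEMA_NAME IN ('DATA_SLT_IDD800','DATA_MAN_TECHEDCOPA')"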

3.5 Target: Creation of schema mapping

Schema mapping is used during execution of information models to automatically redirect table access from the source (authoring) schema to the target (physical) schema.

Note: Schema mapping does not physically correct the definition of the information models. It is used only during execution to remap schema names. If you remove or adjust a schema mapping, the information models might stop working. If a permanent adjustment is required, you need to adjust the export files instead (see section 3.6 below).

If you decided to use schema mapping, go to the Information Modeler perspective and switch the connection to the correct database using the Manage Connections button. Then select the Schema Mapping option in the Setup section.

Click the Add button and fill in the mapping relation. Repeat the procedure for each migrated schema. Confirm the dialog.
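
If you prefer to double-check the result outside the dialog, the mappings are typically stored in the _SYS_BI.M_SCHEMA_MAPPING table (table and column names as observed on HANA 1.0 revisions; verify them on your release). A minimal sketch using hdbsql with placeholder credentials:

hdbsql -i 07 -u SYSTEM -p <password> "SELECT AUTHORING_SCHEMA, PHYSICAL_SCHEMA FROM \"_SYS_BI\".\"M_SCHEMA_MAPPING\""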

3.6 Adjust export of information models

You need to adjust all files to change the schema names (in case schema mapping was not used) and the package name to the target ones.

To avoid a lot of manual work, you can use, for example, the **** utility:

http://sourceforge.net/projects/****-it/files

First, check all occurrences using these commands:

****.exe -rpC * "techedcopa\x2F"

****.exe -rpC * "IDES"

****.exe -rpC * "TECHED2011"

If you agree with the findings, replace the package name and schema names:

****.exe -rC * "techedcopa\x2F" "prj-techedcopa\x2F"

****.exe -rC * "IDES" "DATA_SLT_IDD800"

****.exe -rC * "TECHED2011" "DATA_MAN_TECHEDCOPA"

As the last step, rename the directory techedcopa to prj-techedcopa.
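
Before importing, it is worth confirming that no references to the old schema names remain in the adjusted files. Assuming they are reachable from a grep-capable shell (on Windows, findstr /s /m gives a similar file list), both commands should return nothing:

grep -rl IDES .

grep -rl TECHED2011 .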

3.7 Target: Import of information models

In the Information Modeler perspective, choose the Import option in the Content section.

Select the Information Models option in the Information Modeler folder and click Next.

On the next screen, choose the correct connection.

Select the file location using the Browse button. Select all objects you want to import, add them to the list, and then click Finish.

Wait for all jobs to end successfully.

4 Post-migration activities

4.1 Activation of information models

All imported information models will be inactive. Open the package in the Navigator window and start activating the objects.

Select all attribute views you wish to activate and choose the Activate option from the context menu.

The HANA system will check all dependencies to ensure that the objects can be safely activated.

Wait for all jobs to end successfully.

Repeat the procedure for analytic views and then for calculation views.

Note: You should activate information models from the bottom up (attribute views first, then analytic views, then calculation views) rather than all at once. If you try to activate dependent objects in one run, you might encounter false activation errors. In such a case, select a smaller subset of models and reactivate them.

4.2 Test the migration

Use the data preview function on the information models to ensure that the migration was successful.

4.3 Connect external applications

Adjust all applications (such as BusinessObjects Data Services) to connect to the new HANA database and to use objects from the new schemas and packages.

4.4 Re-provision replicated objects

If some of the objects were replicated from the source system, you should remove them and provision them again using the corresponding replication technique (SLT, BusinessObjects Data Services, or Sybase replication).
