I am trying to export a few tables' data from our production system to the quality system.
Reason: To synchronize the table data between Production and Quality system
System info: SAP ECC 6.0 on Oracle 10.2 ; OS: Solaris
There are a few standard SAP tables that hold the data of our client's various institutions. Our requirement is to export the master data relevant to a particular institution from the production system to the quality system. We intend to use the Oracle Export and Import utilities (suggestions on which tool to use are welcome).
Let's say there is a table NXXX in the production system. This table contains the data for the institutions AA, BB, and CC. I intend to export the data relevant to institution AA from table NXXX and import it into table NXXX in the quality system.
Now, I understand that the schemas of these tables are different: in the production system the schema would be SAP<Prod SID>.NXXX, while in the quality system it is SAP<Qas SID>.NXXX. I need your suggestions to overcome this constraint.
I do not think you can export just a subset of the data in your scenario using the Oracle imp/exp tools; it's all or nothing...
As for the schema naming between QA/PROD, you just need to use:

imp <dump file> fromuser=<PROD schema> touser=<QA schema>

The only thing you have to look out for is the MANDT (client) field, as the import will not change it.
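Spelled out as a full export/import pair, it might look like the sketch below. The schema names SAPPRD/SAPQAS, the file names, and the password placeholders are examples only; adjust them to your actual SIDs:

```
# On the production host: export the table from the production schema
exp system/<password> file=nxxx_prod.dmp log=nxxx_exp.log tables=SAPPRD.NXXX

# On the quality host: import into the QA schema, remapping the owner
imp system/<password> file=nxxx_prod.dmp log=nxxx_imp.log \
    fromuser=SAPPRD touser=SAPQAS ignore=y
```

ignore=y makes imp skip the "table already exists" error and load the rows into the existing table instead of aborting.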
I had something similar recently, where I had to copy 14+ million rows from Prod to a Dev client without impacting other clients' data for this table in Dev.
So, for data consistency:
1) I restored a copy of prod on a new LPAR and exported the table.
2) Created a dummy prod client using client '000' as a base.
3) Imported the table using the syntax above, enabling the change of schema.
We now had 14+ million rows in the dummy prod client on the new LPAR.
Now, on the Dev server:
4) Created new client on dev as per point 2
5) SCC9 Remote client copy from the new LPAR using SAP_ALL
6) As I had 9+ million rows in the target client to get rid of, I used the delete_commit procedure from Metalink note 3777.1. This enabled me to delete rows quickly without hammering the rollback segments.
7) Using R3trans and a command file from the command line, copied the single table into the target client.
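For step 7, the R3trans command files looked roughly like the sketch below. I am writing this from memory, so treat the client numbers, paths, and table name as placeholders and check the R3trans documentation for the exact option syntax:

```
# export.ctl -- run as: R3trans -w export.log export.ctl
export
client = 100
file = '/usr/sap/trans/tmp/nxxx.dat'
select * from nxxx

# import.ctl -- run on the target as: R3trans -w import.log import.ctl
import
client = 200
file = '/usr/sap/trans/tmp/nxxx.dat'
```

The point of R3trans here is that it is client-aware: it exports only the rows for the given MANDT and writes them into the target client without touching the other clients' rows.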
I do not know if this is useful for you, because your scenario is slightly different, but this is what I did for a whole table for a particular MANDT.
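On the batched delete in step 6: I don't have the note text to hand, but the idea behind delete_commit is simply to delete in chunks and commit after each one, so a single huge transaction does not blow out the rollback segments. A generic PL/SQL sketch (schema, table, and client value are examples, not the note's exact code):

```
BEGIN
  LOOP
    -- delete at most 10,000 rows per pass for the client being cleared
    DELETE FROM sapqas.nxxx
     WHERE mandt = '200'
       AND ROWNUM <= 10000;
    EXIT WHEN SQL%ROWCOUNT = 0;  -- stop when nothing is left to delete
    COMMIT;                      -- release undo after every batch
  END LOOP;
  COMMIT;
END;
/
```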
Edited by: Mark Norman on Oct 22, 2009 11:16 AM
> Reason: To synchronize the table data between Production and Quality system
There is a special product from SAP for this: TDMS.
In almost all cases with standard SAP tables you need to preserve the internal business logic. If you copy, e.g., table MARA, you also need to copy all dependent tables (MARC, MARD, MAKT) and make sure your change document tables (CDHDR, CDCLS, etc.) are up to date and reflect the data. Otherwise you may get strange errors.
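To make the dependency point concrete: the dependent tables share the key fields of the header table, so after a selective copy you can check for orphans with queries along these lines (schema and client are examples; MAKT is keyed by MANDT, MATNR, and language):

```
-- Materials copied into MARA that are missing their descriptions in MAKT
SELECT m.matnr
  FROM sapqas.mara m
 WHERE m.mandt = '200'
   AND NOT EXISTS (SELECT 1
                     FROM sapqas.makt t
                    WHERE t.mandt = m.mandt
                      AND t.matnr = m.matnr);
```

A row coming back from a check like this is exactly the kind of inconsistency that produces the "strange errors" mentioned above.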
If you have to do this regularly, I suggest you consider getting TDMS, with which you can frequently and consistently update the QA system with "fresh" data from production.
TDMS is an add-on (or rather, three add-ons) that is installed on all affected systems using SAINT.
You basically have three instances involved:
- source system (which is your production)
- target system (the QA system)
- TDMS server
You can create a package in TDMS and say, e.g., "I want all data from the last three months". The TDMS run will read the data from production and transfer it to the QA system. You can also say "I want all stock data from the last 4 weeks".
TDMS, however, must be licensed separately.