Hello gurus, I have two questions. We have master data that loads once a week, but we always get an error that reads "duplicate data records", and the load filters out new records with the same key. Also, whenever the system filters, it doesn't load all the duplicates but overwrites them; we want to be able to see all data, including the duplicated data. Has anybody had this kind of error before? Another issue: we need to add a table to one of our cubes, and one of its fields has duplicate data. If I load that data into the InfoCube, will it give the same duplicate-records error, and is there a way to avoid deleting all records and reloading? Hope this is clear...
The property of master data is overwrite: if there are multiple records with the same key field, the later record overwrites the earlier one.
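Conceptually, master data overwrite behaves like a keyed dictionary. A minimal Python sketch (illustrative only, not SAP code; the field names are made up):

```python
# Simplified sketch of master-data "overwrite" semantics:
# records with the same key replace each other, so only the last one survives.
records = [
    {"CUSTOMER": "0070000641", "CITY": "Berlin"},
    {"CUSTOMER": "0070000641", "CITY": "Hamburg"},  # same key -> overwrites
]

master_data = {}
for rec in records:
    master_data[rec["CUSTOMER"]] = rec  # last record per key wins

print(master_data["0070000641"]["CITY"])  # Hamburg - the earlier value is lost
```

This is why you cannot "see all data including duplicated data" in a master data InfoObject: by design, only one record per key survives.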
Where exactly are you getting the error? In the DTP or in the InfoPackage (while loading data)?
What do you mean by adding a table to the InfoCube? We cannot add a table directly to an InfoCube; we can only create a new dimension and add objects (characteristics) to it.
Even if you have duplicate values, the system will allow you to load them into the cube (an InfoCube is always additive), but make sure this is what the business requires; otherwise you will get wrong values in the InfoCube and in any reports that read from this InfoProvider.
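To see why duplicates are dangerous in a cube, here is a minimal sketch of additive behavior (illustrative Python only, with invented values): duplicate rows are not rejected, and their key figures are summed at query time.

```python
# Simplified sketch of an InfoCube's additive property:
# duplicate fact rows are accepted, and their key figures sum up in reports.
from collections import defaultdict

fact_rows = [
    ("0070000641", 100.0),
    ("0070000641", 100.0),  # duplicate row is accepted, not overwritten
]

report = defaultdict(float)
for customer, amount in fact_rows:
    report[customer] += amount  # additive aggregation doubles the value

print(report["0070000641"])  # 200.0 - wrong if the duplicate was unintended
```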
I think you are getting the error at the DTP level; having duplicate records at the master data level is not correct, I believe.
Please confirm with the business whether having duplicate master data records is okay. As long as you have duplicates with respect to the primary keys defined at the transformation level, you will get this error whenever you try to load the data into a master data InfoObject.
Can you post the exact error message you are getting while loading master data?
The main property of master data is overwrite.
You get the duplicate record error while loading master data when data has been loaded but not yet activated (the data is in the modified state); in that case there is a chance of the duplicate error.
Until and unless you run the change run or the master data activation step, the data remains in the modified state and is not available for reporting.
If you are getting this at the InfoPackage level:
Schedule again with the "without duplicate data" option in the InfoPackage for the master data upload.
If it is at the DTP level, then select the "Handle Duplicate Keys" option in the Update tab.
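Conceptually, the "Handle Duplicate Keys" option tells the load to keep the last record per key instead of raising an error. A hedged Python sketch of that idea (not SAP's implementation; function and field names are invented):

```python
# Sketch of what a "handle duplicate keys" setting conceptually does:
# with it off, a repeated key aborts the load; with it on, the later record wins.
def load_package(package, handle_duplicates=False):
    seen = {}
    for rec in package:
        key = rec["CUSTOMER"]
        if key in seen and not handle_duplicates:
            # analogous to the "Duplicate data record" error (RSDMD 191)
            raise ValueError(f"Duplicate data record for key {key}")
        seen[key] = rec  # with the option on, the last record per key is kept
    return list(seen.values())

package = [{"CUSTOMER": "0070000641", "CITY": "Berlin"},
           {"CUSTOMER": "0070000641", "CITY": "Hamburg"}]

# load_package(package)  # would raise: duplicate key 0070000641
result = load_package(package, handle_duplicates=True)
print(result)  # one record per key, the last one kept
```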
What do you mean by "add a table in one of our cube and one of the field have duplicate data" ?
If you are performing a full load into the InfoCube, try removing the overlapping (old) request and then load; otherwise there is a chance of duplicate entries in your report.
Messages have been created for 0 data records. of these data records have not been processed further because of serious errors in the data in these records.
The overall status of the request has been set to red in order to prevent reporting on the valid data records.
Correct the data records that have been filtered out in the error stack of the DTP and use an error DTP to update these data records.
Filter Out New Records with the Same Key: 5500-5500 Data Records
ZCUSTxxx : Data record 6300 ('0070000641 ') : Duplicate data record RSDMD 191
ZCUSTxxx : Data record 6301 ('0070000641 ') : Duplicate data record RSDMD 191
Message no. RSBK257
How do we get the load to allow duplicates and stop showing this error?
Follow the steps below:
1) Create a new session and go to the Administrator Workbench (RSA1).
2) From the menu, select ‘Tools’ -> ‘Apply Hierarchy/Attribute Change’, or use transaction RSATTR.
3) Click the ‘InfoObject List’ button; one of the listed InfoObjects should be the master data object that belongs to the current failed load with duplicate records.
4) Go back to the monitor screen and set the ‘Total’ status of the failed request to red (even if it is red already).
5) Go to RSA1 and, on the ‘InfoObjects’ screen, find the failed InfoObject.
6) Right-click the InfoObject and select ‘Activate master data’ from the context menu, or use the attribute change run (ACR) to activate the master data.
7) Once the master data activation step is complete, go back to the load monitor screen of the failed load. In the ‘Details’ tab, check whether the failed data packet was loaded successfully into the PSA.
--> If YES, and only a couple of data packets have failed with this error: right-click that data package and select ‘Manual Update’.
8) Once this step is successful, change the ‘Total’ status that we set to red earlier back to green.
If there is no PSA step, or more data packets have failed, then repeat the failed load.
These steps will complete the master data load successfully. ‘Activation’ of the data will be carried out by the ‘Attribute Change Run’ in a subsequent step.
Hope these steps help.
I already selected the "Handle Duplicate Keys" option, and the error is still occurring.
I mean we have a Z-table that we need in BW, and it contains many duplicates, and we need to append it to an already existing cube. Won't the same error occur? And is there a way to avoid having to delete the load and reload after appending?
In the Update tab of the DTP, under error handling, there are four options; select one of them according to your requirement. Then create an error DTP and push your erroneous records from the error stack to your InfoCube.
We cannot directly append a table to an InfoCube. If you want data from a particular table in ECC in your BI system, create a generic DataSource by the table method, replicate that DataSource into your BI system, and then load the data to your InfoObject. You can then include a dimension containing this InfoObject in your InfoCube. Since an InfoCube is additive, it will add all the records (including duplicates). Either drop the data from your InfoCube before each load, or stage the data through a DSO (which overwrites by key) before loading it into the InfoCube.
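The DSO-before-cube suggestion can be sketched like this (a hypothetical Python illustration only; the helper and field names are invented): the DSO's overwrite semantics collapse duplicate keys, so the additive cube then receives exactly one row per key.

```python
# Hypothetical sketch: staging through a DSO before the additive cube load.
# The DSO overwrites by key, so duplicates in the Z-table collapse to one record.
def dso_activate(rows, key="CUSTOMER"):
    active = {}
    for rec in rows:
        active[rec[key]] = rec  # overwrite: the last record per key survives
    return list(active.values())

ztable_rows = [
    {"CUSTOMER": "0070000641", "AMOUNT": 100.0},
    {"CUSTOMER": "0070000641", "AMOUNT": 150.0},  # duplicate key in the Z-table
]

cube = []
cube.extend(dso_activate(ztable_rows))  # additive load, but now deduplicated
print(len(cube), cube[0]["AMOUNT"])  # 1 150.0
```

Loading the raw Z-table rows directly would instead leave both rows in the cube and double-count the key figure, which is the problem described above.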