
Handling Duplicated Records in DTP

Former Member

Dear Experts,

I am trying to load master data for 0PLANT using DataSource 0BBP_PLANT_LOCMAP_ATTR (Location/Plant Mapping) via a DTP.

This standard datasource is neither language- nor time-dependent. Also, in the source system, it is not marked to handle duplicate records.

I have also referred to OSS Note 1147563 - Consulting note: Indicator "Duplicate records". One of the key highlights is: "If you use a DTP to transfer master data or texts, you can set the indicator 'Duplicate records' in the DTP tab page 'Data transfer'." I take this to mean the "Handle Duplicated Record Keys" option under the "Update" tab of the respective DTP.

In this OSS Note, it was also mentioned that

"You must not set the indicator if the following prerequisites apply:

  • The indicator 'DataSource delivers duplicate records' is not set in the DataSource."

>> which is currently the case for DataSource 0BBP_PLANT_LOCMAP_ATTR.

Checked in SAP Help (http://help.sap.com/saphelp_nw04s/helpdata/en/42/fbd598481e1a61e10000000a422035/frameset.htm):

You can specify how duplicate data records within a request are handled, independently of whether the setting that allows DataSources to deliver potentially duplicate data records has been made. This is useful if the setting was not made to the DataSource, but the system knows from other sources that duplicate data records are transferred (for example, when flat files are loaded).

My question is: I can't load the master data, mainly because of these duplicate record key errors, and when I checked the indicator to handle duplicated record keys, I got the error message "Enter a valid value". Thereafter, I couldn't do anything at all - activate the DTP, click on other tabs - the screen was simply stuck at this point, and I could only exit the transaction.

Can anyone advise if I have basically missed anything?

Thank you in advance.

Regards,

Adelynn

Accepted Solutions (0)

Answers (11)

Vijay4
Participant

Hi Adelynn,

I got a similar problem when I checked "Handle Duplicated Record Keys" in the DTP.

Can you guide me on how to resolve this issue?

I am extracting data from DataSource 0REQUI_ATTR to InfoObject 0REQUI.

I was stuck at this screen.

Can you give me the steps to resolve this problem?

Regards

Vijay

Former Member

Hi Vijay,

Have you checked in your system if "OSS Note 1171462 - Error while maintaining DTP for Master Data Update" has been implemented? My issue was resolved by implementing this note.

Regards,

Adelynn

Vijay4
Participant

Hi Adelynn,

Thanks for the response. By source system, do you mean the E-Recruitment system? I have raised a ticket to the Basis team regarding this. Apart from that, what should I do on the BI side as a BI consultant?

I will proceed as per your guidance.

Regards

Vijay

Former Member

Hi Vijay,

I believe the OSS Note is to be implemented in your BI system, not the source system.

Regards,

Adelynn

Vijay4
Participant

Hi Adelynn,

OK, got it. I will raise a request to the Basis team.

Thanks for your response.

Regards

Vijay

Former Member

Dear All,

Sorry for not getting back to you sooner. As also mentioned by Bolun, SAP recommended implementing OSS Note 1171462 - Error while maintaining DTP for Master Data Update, in response to the SAP message I raised earlier. This should resolve the issue.

Cheers.

Former Member

Just got a reply from SAP Support:

Try Note 1171462.

Former Member

Hi Adelynn,

Were you eventually able to flag the 'Handle Duplicated Record Keys' indicator in the DTP? I am facing a similar situation: I always get prompted with "Enter a valid value" whenever I try to set the indicator, and I get stuck on that screen.

Regards

Abu

Former Member

Hi,

Are you getting an ABAP dump for this? Please try deleting the old records and then loading again.

Hope this helps

Thanks

Sushil

Former Member

Hi Sushil,

No, I am not getting any ABAP dump. I am just not able to set the 'Handle Duplicated Record Keys' indicator in the DTP; it simply doesn't allow me to do it. I am using BI 7.0 SP18.

Thanks

Regards

Abu

Former Member

I'm facing the same problem.

Former Member

Hi,

See the screenshots (not shown here) for each step.

DTP Errors

Below are the steps to perform to handle data records with errors:

• The DTP has a failed status in the DTP process monitor because of an invalid character in the records.

• By clicking on the error stack, we can check the error records: a total of 3 records with errors in the source data.

• Correct the erroneous records in the error stack by clicking the edit button at the top left.

• Create an error DTP from the Update tab of the standard DTP. Once the error DTP is created, the status of the standard DTP changes from create to display, and the error DTP can be found under the object for which the standard DTP was created.

• Schedule the error DTP from the Execute tab.

• The error DTP process monitor shows the 3 records that we corrected in the error stack in the earlier steps.

• The status of the standard DTP is now also green (without errors).

• You can also check the updated record status of the standard and error DTPs in the Manage tab of the data target.
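As a rough illustration of this flow, here is a minimal Python sketch (not SAP code; the "plant" field and the allowed-character list are assumptions for illustration only) of how a standard DTP routes records with invalid characters to the error stack, and how an error DTP reloads them once corrected:

```python
# Hypothetical sketch: models a standard DTP load, the error stack,
# and an error DTP reload, as described in the steps above.
ALLOWED = set("ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789 _-")  # stand-in allowed-character list

def run_standard_dtp(source_records):
    """Load valid records to the target; route invalid ones to the error stack."""
    target, error_stack = [], []
    for rec in source_records:
        if all(ch in ALLOWED for ch in rec["plant"]):
            target.append(rec)
        else:
            error_stack.append(rec)      # failed record lands in the error stack
    return target, error_stack

def run_error_dtp(error_stack, target):
    """Reload the records that were corrected in the error stack."""
    target.extend(error_stack)
    error_stack.clear()

# Usage: one record fails on an invalid character, is corrected, then reloaded.
records = [{"plant": "1000"}, {"plant": "20#0"}, {"plant": "3000"}]
target, stack = run_standard_dtp(records)
stack[0]["plant"] = "2000"               # edit the erroneous record in the error stack
run_error_dtp(stack, target)             # "schedule" the error DTP
print(len(target))                       # all 3 records are now in the target
```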

Former Member

Hi

Did you get to the bottom of this issue?

Yann

Former Member
0 Kudos

Hi, thanks for your quick reply, but I am still stuck at the same point.

The problem is that I can't flag the 'Handle Duplicated Record Keys' option in my DTP, and I do not know why.

Anyone else have other inputs?

Thank you in advance.

Former Member

Hi,

Handling Duplicate Data Records

Use

DataSources for texts or attributes can transfer data records with the same key into BI in one request. Whether the DataSource transfers multiple data records with the same key in one request is a property of the DataSource. There may be cases in which you want to transfer data records with the same key (referred to as duplicate data records below) to BI more than once within a request; this is not always an error. BI provides functions to handle duplicate data records so that you can accommodate this.

Features

In a dataflow that is modeled using a transformation, you can work with duplicate data records for time-dependent and time-independent attributes and texts.

If you are updating attributes or texts from a DataSource to an InfoObject using a data transfer process (DTP), you can specify the number of data records with the same record key within a request that the system can process. In DTP maintenance on the Update tab page, you set the Handle Duplicate Record Keys indicator to specify the number of data records.

This indicator is not set by default.

If you set the indicator, duplicate data records (multiple records with identical key values) are handled as follows:

● Time-independent data:

If data records have the same key, the last data record in the data package is interpreted as being valid and is updated to the target.

● Time-dependent data:

If data records have the same key, the system calculates new time intervals for the data record values. The system calculates new time intervals on the basis of the intersecting time intervals and the sequence of the data records.

Data record 1 is valid from 01.01.2006 to 31.12.2006

Data record 2 has the same key but is valid from 01.07.2006 to 31.12.2007

The system corrects the time interval for data record 1 to 01.01.2006 to 30.06.2006. As of 01.07.2006, the next data record in the data package (data record 2) is valid.
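To make the two rules concrete, here is a minimal Python sketch (plain Python, not SAP code; the record layout with key, datefrom, and dateto fields is an assumption for illustration) of last-record-wins handling for time-independent data and interval truncation for time-dependent data, reproducing the example above:

```python
from datetime import date, timedelta

def dedupe_time_independent(package):
    """For identical keys, the last record in the data package wins."""
    latest = {}
    for rec in package:                  # later records overwrite earlier ones
        latest[rec["key"]] = rec
    return list(latest.values())

def dedupe_time_dependent(package):
    """Truncate a record's validity to end the day before a later record
    with the same key becomes valid, mirroring the recalculation above."""
    result = []
    for i, rec in enumerate(package):
        rec = dict(rec)
        for nxt in package[i + 1:]:
            if (nxt["key"] == rec["key"]
                    and rec["datefrom"] < nxt["datefrom"] <= rec["dateto"]):
                rec["dateto"] = nxt["datefrom"] - timedelta(days=1)
        result.append(rec)
    return result

# Time-independent: only the last record with key "A" survives.
p = [{"key": "A", "val": 1}, {"key": "A", "val": 2}]
print(dedupe_time_independent(p))        # [{'key': 'A', 'val': 2}]

# Time-dependent: the example from the text; record 1 is cut back to 30.06.2006.
r1 = {"key": "A", "datefrom": date(2006, 1, 1), "dateto": date(2006, 12, 31)}
r2 = {"key": "A", "datefrom": date(2006, 7, 1), "dateto": date(2007, 12, 31)}
for rec in dedupe_time_dependent([r1, r2]):
    print(rec["datefrom"], "-", rec["dateto"])
# 2006-01-01 - 2006-06-30
# 2006-07-01 - 2007-12-31
```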

If you set the indicator for time-dependent data, note the following:

You cannot include the DataSource field that contains the DATETO information in the semantic key of the DTP. Doing so may cause duplicate data records to be sorted incorrectly and time intervals to be calculated incorrectly.

The semantic key specifies the structure of the data packages that are read from the source.

Example

You have two data records with the same key within one data package.

In the first case (graphic omitted), DATETO is not an element of the key:

In the data package, the data records are in sequence DS2, DS1. In this case, the time interval for data record 1 is corrected:

Data record 1 is valid from 1.1.2002 to 31.12.2006.

Data record 2 is valid from 1.1.2000 to 31.12.2001.

In the second case (graphic omitted), DATETO is an element of the key:

If DATETO is an element of the key, the records are sorted by DATETO. In this case, the data record with the earliest date is put before the data record with the most recent date. In the data package, the data records are in sequence DS2, DS1. In this case, the time interval for data record 2 is corrected:

Data record 2 is valid from 1.1.2000 to 31.12.2000.

Data record 1 is valid from 1.1.2001 to 31.12.2006.
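Continuing the earlier sketch (and reusing its dedupe_time_dependent helper), the effect of having DATETO in the semantic key can be imitated by sorting the package by DATETO before the interval recalculation; this reproduces the corrected intervals described for the second case:

```python
from datetime import date

# DS1 and DS2 as in the example above; both share the same key.
ds1 = {"key": "A", "datefrom": date(2001, 1, 1), "dateto": date(2006, 12, 31)}
ds2 = {"key": "A", "datefrom": date(2000, 1, 1), "dateto": date(2001, 12, 31)}

# With DATETO in the key, the records are sorted by DATETO ascending,
# so DS2 (the earliest end date) comes first in the package.
package = sorted([ds1, ds2], key=lambda r: r["dateto"])
for rec in dedupe_time_dependent(package):
    print(rec["datefrom"], "-", rec["dateto"])
# 2000-01-01 - 2000-12-31   (DS2, corrected)
# 2001-01-01 - 2006-12-31   (DS1, unchanged)
```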

If you do not set this indicator, data records that have the same key are written to the error stack of the DTP.

Note

You can specify how duplicate data records within a request are handled, independently of whether the setting that allows DataSources to deliver potentially duplicate data records has been made. This is useful if the setting was not made to the DataSource, but the system knows from other sources that duplicate data records are transferred (for example, when flat files are loaded).

Former Member

Hi

If it's only an InfoObject, then under the Processing tab select 'Only PSA' and check the first option. After this, rerun the InfoPackage. Go to the monitor and see whether you got the same number of records, including duplicates. Then follow the steps below:

1. Go to transaction RSRV, expand Master Data and double-click on the second option.

2. Click on the option on the right side of the panel. A pop-up window will appear; enter the name of the InfoObject that failed. Then select that particular InfoObject and click on the Delete button; if necessary, save and click Execute - do this only if necessary. Usually, deleting resolves the problem. After clicking on the Delete button, go back to the monitor screen and process the data packet manually by clicking the wheel button.

3. Select the request and click on the read manually button (the wheel button).

Normally, load up to the PSA and then load manually to the targets.

thanks