Duplicate records when writing data to DataStore object DSO

Former Member

Hi Gurus,

We have a weekly job in the system that updates a DSO from a flat file. We are not able to track how the file is being updated on the application server. The error details are:

"Duplicate records when writing data to DataStore object DSO

Processing was canceled"

This weekly job is being cancelled with the above error.

Kindly suggest a solution.

Thanks and Regards.

Basistechie.

Edited by: BASISTECHIE on May 10, 2011 1:38 PM



Answers (4)


Former Member

Hi,

To run the data load without failure, you can increase the size of the error stack in the DTP settings, under the Extraction tab.

After that, analyze and correct the corrupt data in the error stack and run the error DTP.

Regards,

Akanksha

Former Member

Hi,

If the data is available in the PSA, modify the data in the error stack and run the error DTP.

Regards,

Sai.

Former Member

To track the file, go to the InfoPackage and you should see where it pulls the file from. Then go to the scheduler in the InfoPackage and give the job a unique name in the job box, so you can track it in SM37.

For the DSO: is it 7.0 or 3.5? In 7.0 you can set up the DTP to handle duplicate records; otherwise, follow the other replies.
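To see what the 7.0 DTP duplicate-handling option effectively does, here is a minimal sketch (plain Python, not SAP code, with a hypothetical key field `DOC_NO` and amount column): rows from the flat file are collapsed by semantic key, with the last occurrence winning, which mirrors overwrite behavior into a DSO.

```python
import csv
from io import StringIO

def dedupe_by_key(rows, key_fields):
    """Keep only the last row per key; later rows overwrite earlier ones."""
    seen = {}
    for row in rows:
        key = tuple(row[f] for f in key_fields)
        seen[key] = row  # last occurrence wins, like an overwrite load
    return list(seen.values())

# Hypothetical flat-file content with a duplicated key (DOC_NO = 1000)
flat_file = StringIO("DOC_NO,AMOUNT\n1000,50\n1001,75\n1000,60\n")
rows = list(csv.DictReader(flat_file))
clean = dedupe_by_key(rows, ["DOC_NO"])
print(len(clean))  # 2 rows remain; key 1000 keeps AMOUNT 60
```

If the weekly flat file can legitimately contain repeated keys, pre-cleaning it like this (or enabling the DTP option) avoids the "Duplicate records" dump; if duplicates are unexpected, they point at a problem in how the file is generated.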

Former Member

Hi ,

Kindly check what type of DSO you are using here:

If you are using a write-optimized DSO, it will not allow duplicate records unless you select the checkbox "Do Not Check Uniqueness of Data". But in that case you will have duplicate entries in the write-optimized DSO.

Or, if you are using a standard DSO, I suspect you have selected the "Uniqueness of Data" checkbox in the settings of the DSO. This will not allow duplicate records to enter.

Hope the above reply is helpful.

Kind Regards,

Ashutosh Singh