on 10-12-2015 10:42 PM
Hi there,
I have a job that performs both a full load and a delta load of general ledger data from ERP to SQL Server via SAP Data Services. The delta load works without problems, but the full load fails when I trigger it, with this error:
"<[Microsoft][SQL Server Native Client 10.0][SQL Server]Violation of PRIMARY KEY constraint 'PK__SAP_GL_L__9F83A9C50323BE7C'. Cannot insert duplicate key in object dbo.xxx". The duplicate key value is '200,200001......'.
Any hint on this issue? Thanks for any help!
Cindy
Please check whether you have a duplicate key in your source. If you cannot find a duplicate key, the key must be created in the DF.
Do you use any logic in the DF, or is it a 1:1 transport of the data?
Please check which table the error comes from. Sometimes you get this error because of a key violation in one of the repository AL_... tables.
Regards
Severin
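Severin's first check (duplicate keys in the source) can be done with a grouping query against the source table. A minimal sketch, assuming hypothetical table and key column names (replace them with your actual GL key fields):

```sql
-- Find key combinations that occur more than once in the source.
-- SOURCE_GL_TABLE, COMPANY_CODE, and ACCOUNT_NO are placeholders.
SELECT COMPANY_CODE, ACCOUNT_NO, COUNT(*) AS row_count
FROM   SOURCE_GL_TABLE
GROUP  BY COMPANY_CODE, ACCOUNT_NO
HAVING COUNT(*) > 1;
```

Any rows returned are key values that appear more than once and would violate the target table's primary key on insert.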
Hi Severin,
Thanks for your reply. I do use logic in the DF (if load_type = 'DELTA', go to the DELTA load; otherwise go to the FULL load). I checked, and the error comes from the target table, which is used in both the DELTA and the FULL load. Would you please explain more about "a key violation in one of the repository AL_... tables"?
In some cases (typically with BW), it can happen that the system sends a request twice. The problem is that the AL_ table has the request ID as its primary key.
A possible solution (not recommended by SAP) is to extend the PK with the timestamp column.
But I do not think that this is your problem.
I think you create the duplicate keys in your logic. Could you send a screenshot of your DF for a better analysis of its logic?
Try disabling the index on the database table and try it out.
Sometimes an index also causes issues along with the primary key.
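In SQL Server, disabling and later rebuilding a secondary index could be sketched as below (index and table names are placeholders). Note that this only applies to non-key indexes: disabling the clustered primary key index makes the whole table inaccessible, and it would not remove the PK constraint violation anyway.

```sql
-- Disable a hypothetical nonclustered index before the load...
ALTER INDEX IX_SAP_GL_LOAD_DATE ON dbo.SAP_GL_LOAD DISABLE;

-- ...and rebuild it after the load completes.
ALTER INDEX IX_SAP_GL_LOAD_DATE ON dbo.SAP_GL_LOAD REBUILD;
```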
If you run a delta load and then run a full load, the latter will try to load all the records already loaded before, resulting in duplicate key errors.
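One common workaround, assuming the full load is meant to replace the existing data, is to empty the target before a full load, for example via a pre-load SQL command in the target table options (the table name below is a placeholder):

```sql
-- Run before the FULL load only: removes previously loaded rows
-- so the full extract does not collide with them.
TRUNCATE TABLE dbo.SAP_GL_LOAD;
```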
Hi Cindy,
Just remove the primary key in the mapping for the target SQL Server table.
It seems that there is a primary key coming from the source ERP table.
Right-click on the primary key column, uncheck the primary key option, and then try to run the full load.
Regards
Arun Sasi
Dear Cindy,
You could undo the primary key in a transform object; maybe you have one before the table object (your target table).
Also, I would check whether that change produces a violation in the table structure. That is:
- If your table is a regular table, you must make those changes at the database layer (SQL Server Management Studio, Oracle client, etc.) and re-import the table in Data Services to apply them.
- If your table is a template table, just go to the table options and check 'Drop and re-create table'.
Hi Arun,
I found that the problem comes from a couple of primary key values having duplicate data in the source. The rows are not entirely identical; most of the fields are the same, but not all. I changed the table to 'IGNORE_DUP_KEY = ON' on the primary key, and the issue is solved. Since the primary keys are defined from the source table, I would still like to keep them as primary keys in the target table.
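For reference, the change Cindy describes corresponds roughly to recreating the primary key with the IGNORE_DUP_KEY option (constraint, table, and column names below are placeholders, not the actual ones from this job). Be aware that with this option SQL Server silently discards duplicate-key rows and keeps only the first one inserted, so data can be lost when the "duplicates" differ in non-key columns:

```sql
-- Drop the existing PK and recreate it with IGNORE_DUP_KEY = ON.
-- All names here are placeholders; use your actual constraint and key columns.
ALTER TABLE dbo.SAP_GL_LOAD
    DROP CONSTRAINT PK_SAP_GL_LOAD;

ALTER TABLE dbo.SAP_GL_LOAD
    ADD CONSTRAINT PK_SAP_GL_LOAD
    PRIMARY KEY CLUSTERED (COMPANY_CODE, ACCOUNT_NO)
    WITH (IGNORE_DUP_KEY = ON);
```

With this in place, an insert that hits an existing key raises the warning "Duplicate key was ignored." instead of failing the load.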