Application Development Discussions
Join the discussions or start your own on all things application development, including tools and APIs, programming models, and keeping your skills sharp.

ITAB_DUPLICATE_KEY

Former Member
0 Kudos

We are getting an ITAB_DUPLICATE_KEY error in a DataSource enhancement for BW that selects data from table MBEW into a hashed table.  The internal table has the same key fields as the database table.  I don't understand how we could be getting an ITAB_DUPLICATE_KEY error when the WHERE clause of the SELECT statement covers the full key of the table.  Is this a buffering issue?  Here's a subset of the important code:

  TYPES: BEGIN OF TY_MBEW,          "Material Valuation
           MATNR TYPE MBEW-MATNR,   "Material Number
           BWKEY TYPE MBEW-BWKEY,   "Valuation Area
           BWTAR TYPE MBEW-BWTAR,   "Valuation Type
           STPRS TYPE MBEW-STPRS,   "Standard Price
           PEINH TYPE MBEW-PEINH,   "Price Unit
           VERPR TYPE MBEW-VERPR,   "Moving Average Price/Periodic Unit Price
           VPRSV TYPE MBEW-VPRSV,   "Price Control Indicator
         END OF TY_MBEW.

  DATA: IT_MBEW_HSH TYPE HASHED TABLE OF TY_MBEW
          WITH UNIQUE KEY MATNR BWKEY BWTAR.

  SELECT MATNR
         BWKEY
         BWTAR
         STPRS
         PEINH
         VERPR                      "Moving Average Price/Periodic Unit Price
         VPRSV                      "Price Control Indicator
    FROM MBEW
    INTO CORRESPONDING FIELDS OF TABLE it_mbew_hsh
    FOR ALL ENTRIES IN LI_DATA
    WHERE MATNR = LI_DATA-MATNR AND
          BWKEY = LI_DATA-WERKS AND
          BWTAR = ''.


18 Replies

Former Member
0 Kudos

Hi,

When you use FOR ALL ENTRIES in a SELECT statement, you should not use a hashed table in the INTO clause. Use a standard table instead, along with a WHERE clause covering all primary key fields.

Regards,

Vineesh.

edgar_nagasaki
Contributor
0 Kudos

Hi Rob,

It seems the problem lies in the content of LI_DATA. You do indeed have the table key properly stated in your WHERE clause, since it is the full MBEW key, but LI_DATA is surely not following that same key, so at FOR ALL ENTRIES it is retrieving more than one line for the same entry in MBEW. Check its content.

Since LI_DATA seems to be supposed to follow the same MATNR/BWKEY/BWTAR key, what you could do is a SORT plus DELETE ADJACENT DUPLICATES by this key over LI_DATA first.
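As a minimal sketch of that suggestion (assuming LI_DATA has MATNR and WERKS fields, as the original SELECT implies):

```abap
* De-duplicate the driver table before FOR ALL ENTRIES.
SORT li_data BY matnr werks.
DELETE ADJACENT DUPLICATES FROM li_data COMPARING matnr werks.

* Then run the original SELECT unchanged.
SELECT matnr bwkey bwtar stprs peinh verpr vprsv
  FROM mbew
  INTO CORRESPONDING FIELDS OF TABLE it_mbew_hsh
  FOR ALL ENTRIES IN li_data
  WHERE matnr = li_data-matnr
    AND bwkey = li_data-werks
    AND bwtar = ''.
```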

Regards,

Edgar

Former Member
0 Kudos

Hi Rob,

Did you sort and delete adjacent duplicates from LI_DATA?  If not, I would try doing this.  With the FOR ALL ENTRIES it may be selecting the same record twice.

Regards,

Steve

paul_bakker2
Active Contributor
0 Kudos

Hi,

That sounds strange.

- Please confirm: do you get that error during the execution of that SELECT statement?

- Are there duplicate entries in LI_DATA? Does it still happen if you remove them?

- Does it still happen if you use a STANDARD table instead of a HASHED table? (It shouldn't make any difference, but still.)

cheers

Paul


Former Member
0 Kudos

FOR ALL ENTRIES does not work that way.  Any duplicates in LI_DATA are ignored.  From SAP help (http://help.sap.com/saphelp_nw73/helpdata/en/fc/eb3a1f358411d1829f0000e829fbfe/frameset.htm):

  1. SELECT... FOR ALL ENTRIES IN itab WHERE cond...

cond may be formulated as described above. If you specify a field of the internal table itab as an operand in a condition, you address all lines of the internal table. The comparison is then performed for each line of the internal table. For each line, the system selects the lines from the database table that satisfy the condition. The result set of the SELECT statement is the union of the individual selections for each line of the internal table. Duplicate lines are automatically eliminated from the result set.

Also, Vineesh mentioned that I shouldn't use the hashed table in the INTO clause.  I can't see how that is an issue.  We are selecting UNIQUE records from MBEW to put into a hashed table. I could throw them into a standard table first, then get rid of dupes and move them to the hash, but the SELECT cannot return duplicate rows.  If the hashed table has the same key as MBEW, then the ITAB_DUPLICATE_KEY error is breaking the law, so to speak.

Something else must be going on here. Crazy
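For reference, the standard-table detour could be sketched like this; inserting row by row makes any duplicate visible via sy-subrc instead of dumping (a rough sketch using the names from the original post):

```abap
DATA: it_mbew_std TYPE STANDARD TABLE OF ty_mbew,
      ls_mbew     TYPE ty_mbew.

* Same SELECT as before, but into a standard table (cannot dump).
SELECT matnr bwkey bwtar stprs peinh verpr vprsv
  FROM mbew
  INTO CORRESPONDING FIELDS OF TABLE it_mbew_std
  FOR ALL ENTRIES IN li_data
  WHERE matnr = li_data-matnr
    AND bwkey = li_data-werks
    AND bwtar = ''.

LOOP AT it_mbew_std INTO ls_mbew.
  " INSERT into a hashed table sets sy-subrc = 4 on a duplicate key
  " instead of raising ITAB_DUPLICATE_KEY.
  INSERT ls_mbew INTO TABLE it_mbew_hsh.
  IF sy-subrc <> 0.
    " Duplicate found: inspect ls_mbew in the debugger here.
  ENDIF.
ENDLOOP.
```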

Cheers,

Rob

0 Kudos

Not really... if you have duplicate entries in your FOR ALL ENTRIES table it will bring back more than one match from the other table... I do understand your point, but that's the way it works.

Former Member
0 Kudos

hi,

the piece of code you posted is absolutely correct. I think the problem is with the LI_DATA internal table. You should delete the duplicate entries from that internal table with SORT and DELETE ADJACENT DUPLICATES and then try this code again; I think it will work fine.

You can also remove INTO CORRESPONDING FIELDS from this code, because it reduces performance. Removing it won't solve your issue, but try to avoid it while coding.

thanks,

Mathan R.

former_member946717
Contributor
0 Kudos

Hi Rob,

Just a thought. Are there any entries that have blank MATNR or WERKS in the LI_DATA table?

Ideally MATNR shouldn't be blank but if these are dummy entries you never know. Since BWTAR is already blank, maybe there are entries where Material is the same, Plant is blank and BWTAR is blank or Material is blank, Plant is the same and BWTAR is blank. These can also cause duplicate entries.

So you can sort the table LI_DATA by MATNR WERKS

Delete adjacent duplicates comparing MATNR WERKS

Delete from LI_DATA where MATNR = space or WERKS = space.

and then you may try. Hope this helps!
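The cleanup described above might look like this (assuming LI_DATA has MATNR and WERKS fields):

```abap
SORT li_data BY matnr werks.
DELETE ADJACENT DUPLICATES FROM li_data COMPARING matnr werks.
" Drop dummy rows with an initial material or plant.
DELETE li_data WHERE matnr = space OR werks = space.
```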

Former Member
0 Kudos

Hi,

         A hashed table won't allow duplicate keys. Use a standard table, or a sorted table with a non-unique key.

Thanks & Regards,

Ramu Velaga. 

Former Member
0 Kudos

hi,

Better to use SORT and

DELETE ADJACENT DUPLICATES comparing MATNR BWKEY BWTAR

(I mean all three fields), since all these fields are primary key fields of MBEW, so duplicates should not be possible at all.

Or else try changing the hashed table into a standard table; it may work.

This isn't a proper solution for the issue, but give it a try; it may work.

thanks,

Mathan R.

vinoth_aruldass
Contributor
0 Kudos

hi,

if the duplicates are still appearing, use DELETE ADJACENT DUPLICATES FROM itab (after sorting it).

hope it helps,

Vinoth

Former Member
0 Kudos

Hi,

I am facing a similar issue. If you were able to resolve this, please let me know the solution.

Regards,

Haritha

0 Kudos

Hi,

You can also change TYPE HASHED TABLE OF to TYPE STANDARD TABLE OF.

But if you want to know the reason for this error, insert a breakpoint after the SELECT statement and execute the program (after changing the code), then find the duplicate entry and try to view its data in SE11.
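One way to surface the duplicate without a debugger, as a rough sketch: run the same SELECT into a standard table and compare the line count before and after de-duplication on the intended key.

```abap
DATA: it_mbew_std TYPE STANDARD TABLE OF ty_mbew,
      lv_before   TYPE i,
      lv_after    TYPE i.

* Same SELECT as the original, but into a standard table (no dump).
SELECT matnr bwkey bwtar stprs peinh verpr vprsv
  FROM mbew
  INTO CORRESPONDING FIELDS OF TABLE it_mbew_std
  FOR ALL ENTRIES IN li_data
  WHERE matnr = li_data-matnr
    AND bwkey = li_data-werks
    AND bwtar = ''.

DESCRIBE TABLE it_mbew_std LINES lv_before.
SORT it_mbew_std BY matnr bwkey bwtar.
DELETE ADJACENT DUPLICATES FROM it_mbew_std COMPARING matnr bwkey bwtar.
DESCRIBE TABLE it_mbew_std LINES lv_after.
" If lv_before <> lv_after, duplicates exist: set a breakpoint here
" and look up the offending key in SE11.
```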

Regards,

Prasad

P1389307362
Explorer
0 Kudos

Hi,

You can change the code as follows and work around the problem:

DATA: IT_MBEW_HSH TYPE STANDARD TABLE OF TY_MBEW.

But if you want to know the reason for this error, insert a breakpoint after the SELECT statement and execute the program (after changing the code), then find the duplicate entry and try to view its data in SE11.

Regards,

Prasad

0 Kudos

Hi,

I tried changing it to a standard table,

but I get a syntax error saying OBJVERS is not correctly defined.

My internal table is a POSITION table with PERNR, OBJVERS, DATETO as key fields.

Regards,

Haritha

0 Kudos

Hi,

OBJVERS is not a valid data type; use RSOBJVERS instead.

In the table/type declaration, change the code like this: OBJVERS TYPE RSOBJVERS.

Best Regards,

Prasad

Former Member
0 Kudos

Hi,

From a performance point of view, it is also good practice to use SORT + DELETE ADJACENT DUPLICATES COMPARING the key fields when tables hold large amounts of data.

In this case, use SORT LI_DATA BY matnr werks.

DELETE ADJACENT DUPLICATES FROM LI_DATA COMPARING matnr werks.

This will solve your problem.

Regards,

Sumit

former_member184958
Active Participant
0 Kudos

Hi,

Check whether there are any duplicate values. To avoid this, make the unique key fields match the database table's key, i.e. if a field is not a key field in the database table, avoid making it part of the unique key in the internal table as well. Or do as Sumit Sharma said.

Regards,

John.