Parsing error -8 when update-clause used with nextval

thomas_huebner2
Explorer
0 Kudos

Hello MaxDB-Experts,

I'm using MaxDB 7.7 via JDBC.

When using PreparedStatements, the following INSERT works perfectly using sequence.nextval (seq_syncid is a regular sequence):

     insert into s_entity(A,B,C,D,E,sequenceID) values(?,?,?,?,?,seq_syncid.nextval)

BUT when I'm using more or less the same command:

     update s_entity (A,B,C,D,E,sequenceID) values(?,?,?,?,?,seq_syncid.nextval) where (F = ?)

it fails stating:

     SQLException: [-8]: Execution failed, parse again

     SQLState:     I0008

     VendorError:  -8
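For context, this is roughly how I prepare and execute both statements on the Java side. Just a sketch - the bindings for the individual columns are simplified here:

// Sketch only: connection is an open java.sql.Connection, bindings for parameters 1-5 simplified.
PreparedStatement ins = connection.prepareStatement("insert into s_entity (A,B,C,D,E,sequenceID) values (?,?,?,?,?,seq_syncid.nextval)");
// ... bind parameters 1-5 (setString / setBinaryStream as appropriate for the column types) ...
ins.execute();   // works

PreparedStatement upd = connection.prepareStatement("update s_entity (A,B,C,D,E,sequenceID) values (?,?,?,?,?,seq_syncid.nextval) where (F = ?)");
// ... same bindings for parameters 1-5, plus the F value as parameter 6 ...
upd.execute();   // fails with -8 "Execution failed, parse again"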

Any ideas or help from you guys would be highly appreciated.

Chris

Accepted Solutions (0)

Answers (4)


thomas_huebner2
Explorer
0 Kudos

Some details:

pStmt = connection.prepareStatement("update parseagain (SPALTE_A, SPALTE_B, SPALTE_C, SPALTE_D, SPALTE_E) values (?,?,?,?,?) where (SPALTE_A = 'a')");

pStmt.setString(1, "a");

pStmt.setString(2, "b");

pStmt.setString(3, "c");

pStmt.setString(4, "d");

pStmt.setBinaryStream(5, new java.io.ByteArrayInputStream("test3".getBytes()),"test3".getBytes().length);

                           

pStmt.execute();

Works perfectly. As soon as the sequence is added, the statement fails.

Chris

thorsten_zielke
Contributor
0 Kudos

Your MaxDB version 7.7.06 is quite old; there will be no more bug fixes for that release (unless required for liveCache, which for a certain SCM version is still using that build). I would strongly recommend patching to MaxDB 7.7.07.48 or above (7.8.02 / 7.9.08).


Regarding the 'parse again' problem, can you send me your Java code as a 'jar' file? This would make it easier to recreate the problem here. Just use the following link for upload:

https://share.sap.com/a:gns0cg/MyAttachments/R.03f75cb2-4770-482e-b185-474af33ed270/

Thorsten

thomas_huebner2
Explorer
0 Kudos

Hi Thorsten,

sorry for the late reply.

I'm quite busy right now with my other problem (see my other post) and therefore haven't had the time to extract a jar. I will send you one as soon as I have some time.

Anyway, I checked version 7.7.07.48 and see the same behaviour with it.

Chris

thomas_huebner2
Explorer
0 Kudos

Hi Thorsten,

what a shame. That's too far for a coffee. 🙂

I'm running 7.7.06.

It doesn't fail without the sequence! In that case, setBinaryStream and the rest of the statement work fine.

(Right now I'm working around it by doing the UPDATE itself in a first request and updating the pure "sequence column" in a second UPDATE within the same transaction, which isn't the smoothest way.)
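A rough sketch of that workaround (keyValue and payload are placeholders; autocommit is assumed to be switched off so both UPDATEs end up in one transaction):

// Workaround sketch: keyValue/payload are placeholders, autocommit assumed off.
PreparedStatement upd1 = connection.prepareStatement("update s_entity (A,B,C,D,E) values (?,?,?,?,?) where (F = ?)");
// ... bind parameters 1-4 ...
upd1.setBinaryStream(5, new java.io.ByteArrayInputStream(payload), payload.length);
upd1.setObject(6, keyValue);
upd1.execute();          // first UPDATE: everything except the sequence column - this works

PreparedStatement upd2 = connection.prepareStatement("update s_entity (sequenceID) values (seq_syncid.nextval) where (F = ?)");
upd2.setObject(1, keyValue);
upd2.execute();          // second UPDATE: only the sequence column, no stream parameter
connection.commit();     // commit both updates together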

BUT: it seems to be the combination of updating a LONG BYTE via setBinaryStream AND using a sequence in ONE request.

Chris

thomas_huebner2
Explorer
0 Kudos

Hello Thorsten,

guess that's a tough one.

Change your setup to

create table "MONA"."PARSEAGAIN"(

"SPALTE_A" CHAR (1) ASCII,

"SPALTE_B" CHAR (1) ASCII,

"SPALTE_C" CHAR (1) ASCII,

"SPALTE_D" CHAR (1) ASCII,

"SPALTE_E" LONG BYTE,                          <-- ONLY CHANGE THIS!

"SPALTE_F" INTEGER)

Then feed some example data into the table using the INSERT statement:

pStmt = connection.prepareStatement("insert into parseagain (SPALTE_A, SPALTE_B, SPALTE_C, SPALTE_D, SPALTE_E, SPALTE_F) values (?,?,?,?,?,seq_syncid.nextval)");

pStmt.setString(1, "a");

pStmt.setString(2, "b");

pStmt.setString(3, "c");

pStmt.setString(4, "d");

pStmt.setBinaryStream(5, new java.io.ByteArrayInputStream("test".getBytes()),"test".getBytes().length);

pStmt.execute();

-> WORKS FINE!

NOW TRY THE UPDATE LIKE:

pStmt = connection.prepareStatement("update parseagain (SPALTE_A, SPALTE_B, SPALTE_C, SPALTE_D, SPALTE_E, SPALTE_F) values (?,?,?,?,?,seq_syncid.nextval) where (SPALTE_A = 'a')");

pStmt.setString(1, "a");

pStmt.setString(2, "b");

pStmt.setString(3, "c");

pStmt.setString(4, "d");

pStmt.setBinaryStream(5, new java.io.ByteArrayInputStream("test3".getBytes()),"test3".getBytes().length);

                           

pStmt.execute();

-> FAIL!!!!

So it seems to correlate with my LONG BYTE column OR with using setBinaryStream to set a value.

What do you think?

Thanks a lot for your time, I highly appreciate it. Are you located in Walldorf? If so, I'll buy you a coffee once this issue is solved, because I live just in the next village. 🙂

Chris

thorsten_zielke
Contributor
0 Kudos

Chris,

I am located in Berlin 🙂

Can you try to narrow down the error by simplifying the SQL statement?
E.g. does it also fail without using any sequence? If so, what happens for a table of just a single LONG BYTE column updated with setBinaryStream?
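For example, something along these lines (just a sketch, table and column names made up):

// Minimal isolation test: one LONG BYTE column, no sequence involved.
Statement ddl = connection.createStatement();
ddl.execute("create table lobonly (\"ID\" INTEGER, \"DATA\" LONG BYTE)");
ddl.execute("insert into lobonly (ID) values (1)");

byte[] bytes = "test".getBytes();
PreparedStatement upd = connection.prepareStatement("update lobonly (DATA) values (?) where (ID = 1)");
upd.setBinaryStream(1, new java.io.ByteArrayInputStream(bytes), bytes.length);
upd.execute();   // does this alone already return -8, or only once the sequence is added?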

And finally, which MaxDB 7.7 version are you running?

Thorsten

thomas_huebner2
Explorer
0 Kudos

Hello,

So many experts and no ideas for this problem, which is - at least here - quite easy to reproduce?

Any kind of hint is highly appreciated,

Chris

thorsten_zielke
Contributor
0 Kudos

Hello Chris,

I tried to recreate the error with my 7.7.07.48 database, but everything worked as expected.

Here is my setup:

create table "MONA"."PARSEAGAIN"(

"SPALTE_A" CHAR (1) ASCII,

"SPALTE_B" CHAR (1) ASCII,

"SPALTE_C" CHAR (1) ASCII,

"SPALTE_D" CHAR (1) ASCII,

"SPALTE_E" CHAR (1) ASCII,

"SPALTE_F" INTEGER)

CREATE SEQUENCE seq_syncid

insert into mona.parseagain (SPALTE_A, SPALTE_B, SPALTE_C, SPALTE_D, SPALTE_E, SPALTE_F) values (1,2,3,5,6,seq_syncid.nextval)
update mona.parseagain (SPALTE_A, SPALTE_B, SPALTE_C, SPALTE_D, SPALTE_E, SPALTE_F) values (5,4,3,2,1,seq_syncid.nextval) where (SPALTE_F = 1)
-> works fine

insert into mona.parseagain (SPALTE_A, SPALTE_B, SPALTE_C, SPALTE_D, SPALTE_E, SPALTE_F) values (?,?,?,?,?,seq_syncid.nextval)
update mona.parseagain (SPALTE_A, SPALTE_B, SPALTE_C, SPALTE_D, SPALTE_E, SPALTE_F) values (?,?,?,?,?,seq_syncid.nextval) where (SPALTE_F = ?)

-> works fine

Do you see any difference from your test case?

Thorsten

PS:
You are probably aware of it, but please note that in case you just need a simple '+1 counter' for a specific table, using 'DEFAULT SERIAL' as a table attribute would be much easier...
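For example, a sketch based on your PARSEAGAIN table (column list shortened):

// Sketch: SPALTE_F is filled automatically via DEFAULT SERIAL, no sequence and no extra parameter needed.
Statement ddl = connection.createStatement();
ddl.execute("create table parseagain2 (" +
            "\"SPALTE_A\" CHAR (1) ASCII, " +
            "\"SPALTE_E\" LONG BYTE, " +
            "\"SPALTE_F\" INTEGER DEFAULT SERIAL)");
// Inserts can then simply omit SPALTE_F:
PreparedStatement ins = connection.prepareStatement("insert into parseagain2 (SPALTE_A, SPALTE_E) values (?,?)");
ins.setString(1, "a");
byte[] bytes = "test".getBytes();
ins.setBinaryStream(2, new java.io.ByteArrayInputStream(bytes), bytes.length);
ins.execute();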