
Overwrite HANA Table or Delete old datasets

Former Member

Hello,

I have a question about the Event Stream Processor:

I have a data flow that looks like this:

With that data flow I read datasets from HANA and write all of the older datasets into a flat file.

But I would also like to delete from the HANA table all datasets that have been offloaded to that flat file.

Is it possible to overwrite an SAP HANA table (with the HANA output adapter, for example)? Or is there another way to delete the old datasets?

Thanks in advance.

Best Regards

Stefan

Accepted Solutions (1)

JWootton
Advisor

Yes,  you can delete data in HANA from ESP.  There are two ways:

- In ESP you produce "Delete events" - i.e. events with a "delete" opcode, and then use the HANA output adapter to write them to a HANA table.  A couple of notes:  (1) you have to know the keys of the rows you want to delete, (2) keep in mind that deletes on HANA are much slower than inserts.

- You can use the CCL getdata() function to execute SQL statements on HANA.  Despite the name, you aren't limited to using it to retrieve data -  you can run other SQL statements as well.  So you could issue a delete statement, run a stored procedure, etc.

Former Member

Hi Jeff,

thanks for the fast answer. I had already read that I can use the "delete" opcode, but I found nothing on how to use it and also couldn't find a parameter where I can set it. Do I have to set it in the CCL view, or does it also work in the CCL notation in the data flow?

Best Regards

Stefan

JWootton
Advisor

Good question.  So the easy way is:  if the incoming data has opcodes,  then those will be passed along.  So if, for example,  you get delete events arriving as input,  those can be applied to HANA tables.  Also,  the results of certain operations will naturally produce delete events, e.g.

- rows being removed from a CCL window due to retention policies (sketch below)
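As a quick illustration (the window and column names here are made up, not from your project), a window with a count-based retention policy will emit a delete event for each row that ages out of its KEEP limit:

/* illustrative only: once more than 1000 rows have arrived, the oldest
   rows are dropped and the window emits delete events for them, which
   downstream elements and output adapters receive like any other delete */
CREATE INPUT WINDOW RecentReadings
    SCHEMA ( Id INTEGER , Reading INTEGER )
    PRIMARY KEY ( Id )
    KEEP 1000 ROWS ;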

But to manually create a "synthetic" delete event, you need to use a Flex operator, and the syntax is not exactly obvious, so here's an example:

CREATE INPUT STREAM InputStream1
    SCHEMA ( Column1 INTEGER , Column2 INTEGER ) ;

CREATE FLEX FlexOperator1
    IN InputStream1
    OUT OUTPUT STREAM DeleteEvents
        SCHEMA ( Column1 INTEGER , Column2 INTEGER )
        PRIMARY KEY ( Column1 )
BEGIN
    ON InputStream1 {
        /* copy the incoming row and stamp it with the delete opcode;
           the key column goes to the left of the "|" separator */
        output setOpcode( [Column1 = InputStream1.Column1; | Column2 = InputStream1.Column2;], delete );
    } ;
END;

Also, one final thing to note:  when configuring the HANA output adapter, be sure the DataWarehouse mode property is OFF - see this blog post for an explanation: http://scn.sap.com/docs/DOC-40208

Former Member

Hello Jeff,

thanks for the example code.

I tried your example to create a delete opcode after filtering my datasets.

My data flow for this part looks like this:

The code for the delete event looks like this:

But unfortunately I got a lot of errors while trying to compile my project after writing this command.

I also tried a lot of changes, but every time I got some errors.

I have the feeling there are some beginner's syntax mistakes in the code. Is the syntax right?

Best Regards

Stefan

JWootton
Advisor

Looking at your ON method, one thing I see is this:

when you construct an output record, you should only use the "|" separator once (or not at all, for a stream without keys). It separates the column or columns that make up the primary key from the others. The key columns go to the left of the "|" and the non-key columns to the right.
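For example, if your key were made up of two columns (the names below are made up purely for illustration, not taken from your project), the single "|" would come after both key columns:

CREATE INPUT STREAM In1
    SCHEMA ( KeyA INTEGER , KeyB INTEGER , Val INTEGER ) ;

CREATE FLEX Flex1
    IN In1
    OUT OUTPUT STREAM TwoKeyDeletes
        SCHEMA ( KeyA INTEGER , KeyB INTEGER , Val INTEGER )
        PRIMARY KEY ( KeyA , KeyB )
BEGIN
    ON In1 {
        /* both key columns sit left of the single "|"; non-key columns sit to the right */
        output setOpcode( [KeyA = In1.KeyA; KeyB = In1.KeyB; | Val = In1.Val;], delete );
    } ;
END;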

I don't immediately spot any other problems, so try that. If there are still problems, please post the CCL as text (or send it to me) so I can try to compile it myself. With the screenshot I have to re-type it, which risks introducing typos or other changes.

Former Member

Thanks, now I can compile the project without errors.

But unfortunately, after executing the project the filtered datasets were not deleted.

My CCL looks like this:


CREATE INPUT STREAM Sensordata SCHEMA (DATE msdate ,
    MEASURETIME time ,
    SENSORVALUE float );

/**@SIMPLEQUERY=FILTER*/
CREATE OUTPUT STREAM DeleteEventFilter AS SELECT * FROM Sensordata WHERE
    timeToSec ( to_bigdatetime ( Sensordata.DATE ) ) < timeToSec ( now ( ) ) - 86400 ;

/**@SIMPLEQUERY=FILTER*/
CREATE OUTPUT STREAM HistoricDataFilter AS SELECT * FROM Sensordata WHERE
    timeToSec ( to_bigdatetime ( Sensordata.DATE ) ) < timeToSec ( now ( ) ) - 86400 ;

CREATE FLEX DeleteQuery IN DeleteEventFilter OUT OUTPUT WINDOW DeleteQuery SCHEMA (
    DATE msdate ,
    MEASURETIME time ,
    SENSORVALUE float ) PRIMARY KEY ( DATE )
BEGIN
    ON DeleteEventFilter {
        output setOpcode([DATE = DeleteEventFilter.DATE; | MEASURETIME = DeleteEventFilter.MEASURETIME; SENSORVALUE = DeleteEventFilter.SENSORVALUE;] , delete);
    } ;
END;

CREATE OUTPUT WINDOW DeleteEventsWindow SCHEMA (
    DATE msdate ,
    MEASURETIME time ,
    SENSORVALUE float ) PRIMARY KEY DEDUCED AS SELECT * FROM DeleteQuery ;

ATTACH INPUT ADAPTER Generic_DB_Input1 TYPE db_in TO Sensordata PROPERTIES service = 'awshana' ,
    table = 'BRIDGEMODE' ;

ATTACH OUTPUT ADAPTER HANA_Output1 TYPE hana_out TO DeleteEventsWindow PROPERTIES service = 'awshana' ,
    table = 'BRIDGEMODE' ,
    bulkInsertArraySize = 1 ;

ATTACH OUTPUT ADAPTER File_Hadoop_CSV_Output1 TYPE toolkit_file_csv_output TO HistoricDataFilter PROPERTIES dir = 'C:/Sybase' ,
    file = 'test.json' ,
    csvDelimiter = ';' ,
    csvHasHeader = TRUE ;

Could it be that the datasets are not deleted because of the missing keep policy?

Best Regards

Stefan

JWootton
Advisor

Stefan,

Try the CCL below. The problem was the Window after the DeleteQuery flex. You were writing deletes to an empty Window, and Windows validate opcodes: the DeleteEventsWindow would receive a "delete" event for a row that doesn't exist, so it would reject the event. If you look at the project log you will see that. Also, if you use the Event Tracer tool in the run-test perspective you can see that the event isn't altering the Window (though the event tracer won't tell you why).

I just removed the window - it served no purpose that I could tell. I also turned the flex output from a window into a stream - again, you can't apply delete events to a window that doesn't hold the corresponding rows. Windows don't hold "events"; they hold rows. So for a Window to emit a delete event it has to have a row to delete (and that row would have to have been added originally by an insert event).

Also: I removed one of your filters. You had two filters doing exactly the same thing. While that doesn't hurt anything, there's no point in doing it - it's just extra overhead.

CREATE INPUT STREAM Sensordata
    SCHEMA ( DATE msdate ,
    MEASURETIME time ,
    SENSORVALUE float );

/**@SIMPLEQUERY=FILTER*/
CREATE OUTPUT STREAM DeleteEventFilter
AS SELECT *
FROM Sensordata
WHERE timeToSec ( to_bigdatetime ( Sensordata.DATE ) ) < timeToSec ( now ( ) ) - 86400 ;

CREATE FLEX DeleteQuery
    IN DeleteEventFilter
    OUT OUTPUT STREAM DeleteRows
        SCHEMA (
            DATE msdate ,
            MEASURETIME time ,
            SENSORVALUE float )
        PRIMARY KEY ( DATE )
BEGIN
    ON DeleteEventFilter {
        /* emit each filtered row as a delete event; DATE (the primary key) sits left of the "|" */
        output setOpcode([DATE = DeleteEventFilter.DATE; | MEASURETIME = DeleteEventFilter.MEASURETIME; SENSORVALUE = DeleteEventFilter.SENSORVALUE;] , delete);
    } ;
END;
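And since DeleteEventsWindow is gone, you would re-attach your HANA output adapter to the flex output instead - something like this sketch, reusing the service and table names from your original project:

/* sketch only: the same adapter as before, now pointed at the DeleteRows stream */
ATTACH OUTPUT ADAPTER HANA_Output1 TYPE hana_out TO DeleteRows
    PROPERTIES service = 'awshana' ,
    table = 'BRIDGEMODE' ,
    bulkInsertArraySize = 1 ;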

Former Member

Now it works fine, thank you!

I had misunderstood how the window works, but now I understand it. Thanks again.

Best Regards

Stefan

Answers (0)