
Entire flat file logged to Monitor log

Former Member

Hi, all.

I'm investigating a DF that takes a very long time to complete. This DF is part of a larger job with several other DFs.

When I run the DF in the test environment, it runs smoothly and takes only a few seconds!

In production, the only strange thing I see is that the monitor file is logging every record of the problematic DF's flat file.

This is not happening for the remaining DFs in the job, which use other FFs.

The monitor sample rate is OK; this can be verified in other parts of the monitor file.

This DF takes the FF and branches it into two query transforms. One branch calculates max(dateColumn) and is then merged back to filter the records (dateColumn = calculatedMaxDateColumn). The result is then output to a DT (Data Transfer) transform with a database table.
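In pseudocode terms, the DF's logic amounts to a "keep only the rows on the latest date" operation. A Python sketch (not Data Services syntax; the record and field names are made up for illustration):

```python
from datetime import date

def latest_day_rows(rows, date_field="dateColumn"):
    """Keep only the rows whose date equals the maximum date in the set.

    Mirrors the DF: one branch computes max(dateColumn), and the merge/filter
    keeps rows where dateColumn = calculatedMaxDateColumn.
    """
    rows = list(rows)  # materialized: the max must be known before filtering
    max_date = max(r[date_field] for r in rows)
    return [r for r in rows if r[date_field] == max_date]

# In-memory records standing in for the flat file:
records = [
    {"id": 1, "dateColumn": date(2014, 5, 1)},
    {"id": 2, "dateColumn": date(2014, 5, 2)},
    {"id": 3, "dateColumn": date(2014, 5, 2)},
]
latest = latest_day_rows(records)
print([r["id"] for r in latest])  # rows on the latest date only
```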

I have compared the FF settings (at DF level) to other FFs, with no differences found.

Monitor File output (partial, showing that every row is logged):

/VC_CARGA_SOBREGIROS_DF_1/PT_Sobregiros_FF789, PROCEED, 1, 0.016, 5218.785, 0.000, 0.000, 0:5

/VC_CARGA_SOBREGIROS_DF_1/PT_Sobregiros_FF789, PROCEED, 2, 0.016, 5218.785, 0.000, 0.000, 0:5

/VC_CARGA_SOBREGIROS_DF_1/PT_Sobregiros_FF789, PROCEED, 3, 0.016, 5218.785, 0.000, 0.000, 0:5

...

...

/VC_CARGA_SOBREGIROS_DF_1/PT_Sobregiros_FF789, PROCEED, 258622, 2375.061, 7593.830, 0.000, 0.000, 0:5

/VC_CARGA_SOBREGIROS_DF_1/PT_Sobregiros_FF789, PROCEED, 258623, 2375.077, 7593.846, 0.000, 0.000, 0:5

/VC_CARGA_SOBREGIROS_DF_1/PT_Sobregiros_FF789, PROCEED, 258624, 2375.092, 7593.861, 0.000, 0.000, 0:5

The only difference found when comparing to a test run was that in production the following two lines were added to the monitor file:

Thanks a lot.

Accepted Solutions (0)

Answers (1)


Former Member

Hi ,

Please clarify the following points

1. Is your target database the same in both environments?

2. Is the file format used in both environments the same?

3. Just validate the job in the production environment. Are you getting any warnings? If so, please mention and rectify them.

Thanks,

Swapnil

Former Member

Hi, Swapnil!

Yes, they are supposed to be the same. I'm comparing objects to detect any differences.

UPDATE:

I was able to reproduce the delay and the excessive logging to the monitor file just by disabling the DT.

In the production environment it is supposed to run enabled.

When running with the DT disabled, the monitor screen (the Windows application) updates every half second or less, even though I specified a 20-second monitor sample rate when launching the job!

Also, the monitor file is getting really big (20 MB+; the production monitor log file is 30 MB).

And the job is taking a very long time to complete.

I think this may be a bug in Data Services...?

Maybe the way the two queries are linked is generating this problem?

I cannot attach an image, so I will explain it again:

Source: a simple flat file with 4.3 million records.

The flow splits in two:

  Above: a query transform that calculates the maximum date of a datetime column, then merges back into the flow.

  Below: a simple line.

  After that, a query filters the rows whose datetime column matches the maximum date previously calculated.

  Third: the query targets a DT (Data Transfer) to a Sybase IQ table with bulk load.

Thanks a lot.

Charles.

Former Member

Can you do one thing?

First, just fetch the data into a template table using a DF in the production environment.

Then, with that template table, you can go on with whatever your logic is.

Former Member

Thanks, Swapnil!

Yes, I can modify the DF to avoid this problem.

But it would be great to understand what the problem is!

I think the problem lies in the combination of the FF flow, the query design, and the use of the DT transform.

I was wrong: it's not logging the entire FF, it's just logging the output of the second query transform, which targets the DT.

Many FF rows are logged one by one, then just one DT row is logged.

This sequence repeats until the end of the data:

"PT_Sobregiros" is the FF.

"Sobre_ultimo_dia" is the query transform that excludes records not matching the maximum date.

"DT_VC_SOBRE_01" is the DT (disabled).

/VC_CARGA_SOBREGIROS_DF_1/PT_Sobregiros_FF789, PROCEED, 3576, 43.648, 5035.796, 0.000, 0.000, 0:5

/VC_CARGA_SOBREGIROS_DF_1/PT_Sobregiros_FF789, PROCEED, 3577, 43.663, 5035.811, 0.000, 0.000, 0:5

/VC_CARGA_SOBREGIROS_DF_1/PT_Sobregiros_FF789, PROCEED, 3578, 43.679, 5035.827, 0.000, 0.000, 0:5

/VC_CARGA_SOBREGIROS_DF_1/PT_Sobregiros_FF789, PROCEED, 3578, 43.695, 5035.843, 0.000, 0.000, 0:5

/VC_CARGA_SOBREGIROS_DF_1/Sobre_ultimo_dia: 0, PROCEED, 1, 73.287, 5035.858, 0.328, 0.374, 0:5

/VC_CARGA_SOBREGIROS_DF_1/PT_Sobregiros_FF789, PROCEED, 3579, 43.710, 5035.858, 0.000, 0.000, 0:5

/VC_CARGA_SOBREGIROS_DF_1/Sobre_ultimo_dia: 1, PROCEED, 3579, 43.726, 5035.874, 0.328, 0.374, 0:0

/VC_CARGA_SOBREGIROS_DF_1/PT_Sobregiros_FF789, PROCEED, 3580, 43.741, 5035.889, 0.000, 0.000, 0:5

/VC_CARGA_SOBREGIROS_DF_1/DT_VC_SOBRE_01_1_Sobre_ultimo_dia_DT_VC_SOBRE_01, PROCEED, 3500, 43.305, 5035.905, 0.016, 0.374, 0:1000

/VC_CARGA_SOBREGIROS_DF_1/PT_Sobregiros_FF789, PROCEED, 3581, 43.773, 5035.921, 0.000, 0.000, 0:5

/VC_CARGA_SOBREGIROS_DF_1/PT_Sobregiros_FF789, PROCEED, 3582, 43.773, 5035.921, 0.000, 0.000, 0:5

/VC_CARGA_SOBREGIROS_DF_1/PT_Sobregiros_FF789, PROCEED, 3583, 43.788, 5035.936, 0.000, 0.000, 0:5

/VC_CARGA_SOBREGIROS_DF_1/PT_Sobregiros_FF789, PROCEED, 3584, 43.804, 5035.952, 0.000, 0.000, 0:5

/VC_CARGA_SOBREGIROS_DF_1/PT_Sobregiros_FF789, PROCEED, 3585, 43.835, 5035.983, 0.000, 0.000, 0:5

/VC_CARGA_SOBREGIROS_DF_1/PT_Sobregiros_FF789, PROCEED, 3586, 43.835, 5035.983, 0.000, 0.000, 0:5

/VC_CARGA_SOBREGIROS_DF_1/PT_Sobregiros_FF789, PROCEED, 3587, 43.851, 5035.999, 0.000, 0.000, 0:5

/VC_CARGA_SOBREGIROS_DF_1/PT_Sobregiros_FF789, PROCEED, 3588, 43.866, 5036.014, 0.000, 0.000, 0:5

/VC_CARGA_SOBREGIROS_DF_1/PT_Sobregiros_FF789, PROCEED, 3589, 43.882, 5036.030, 0.000, 0.000, 0:5

/VC_CARGA_SOBREGIROS_DF_1/PT_Sobregiros_FF789, PROCEED, 3590, 43.897, 5036.045, 0.000, 0.000, 0:5

.
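For anyone sifting through a monitor file like this, a quick way to see which object is doing the excessive logging is to count lines per object path. A hedged sketch (the column layout — path, state, row count, then timing figures — is inferred from the excerpt above, not from documentation):

```python
from collections import Counter

def rows_logged_per_object(lines):
    """Count monitor-log lines per object path (PROCEED entries only)."""
    counts = Counter()
    for line in lines:
        parts = [p.strip() for p in line.split(",")]
        if len(parts) >= 3 and parts[1] == "PROCEED":
            # Strip a trailing ": N" counter so e.g. "Sobre_ultimo_dia: 0"
            # and "Sobre_ultimo_dia: 1" count as the same object.
            counts[parts[0].split(":")[0].strip()] += 1
    return counts

sample = [
    "/DF_1/PT_Sobregiros_FF789, PROCEED, 3576, 43.648, 5035.796, 0.000, 0.000, 0:5",
    "/DF_1/PT_Sobregiros_FF789, PROCEED, 3577, 43.663, 5035.811, 0.000, 0.000, 0:5",
    "/DF_1/Sobre_ultimo_dia: 0, PROCEED, 1, 73.287, 5035.858, 0.328, 0.374, 0:5",
]
print(rows_logged_per_object(sample))
```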

Former Member

Could you first let me know the warnings generated by BODS in the production environment, if any?

Former Member

Where can I find the warnings?

The error log file is empty.

Thanks.

Former Member

While validating a job you will get error logs as well as warning logs.

So is BODS showing any warning log?

Former Member

Validation shows no warnings or errors. Everything is OK!

I removed the query transform that computes max(date) and the WHERE condition in the next query transform that filters, and the problem disappears.

I think that is the problem. But why it causes this strange behavior... I think it's something for the SAP team to investigate.

Former Member

Why don't you add a Data Transfer after the flat file and then apply the remaining logic?

Regarding your error, I will try to replicate the same scenario at my end and let you know.

Former Member

I am now testing with the target DT removed, and the problem persists!

I definitely think the "compute max, then filter by max" double-query logic is causing the problem!

But how can it be done without intermediate stages? Storing the max date and then filtering by it...
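One shape for the "store the max, then filter by it" idea is two passes over the source: a first scan that only remembers the maximum date, then a streaming filter against that stored value, with no branch/merge inside the flow. A Python sketch of the idea (not Data Services syntax; in BODS terms it would correspond to a first step that stores the value, followed by a DF that filters on it):

```python
from datetime import date

def compute_max_date(rows, date_field="dateColumn"):
    """Pass 1: scan once, keeping only the maximum date (like a stored variable)."""
    return max(r[date_field] for r in rows)

def filter_by_date(rows, target, date_field="dateColumn"):
    """Pass 2: read the source again, keeping only rows on the stored date."""
    return [r for r in rows if r[date_field] == target]

records = [
    {"id": 1, "dateColumn": date(2014, 5, 1)},
    {"id": 2, "dateColumn": date(2014, 5, 2)},
]
max_d = compute_max_date(records)
print([r["id"] for r in filter_by_date(records, max_d)])  # only the latest day
```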

Thanks Swapnil for your time.

Charles.