Job does not work after upgrade

Olj
Participant

Hi,

After upgrading Data Services from 4.1 SP1 to 4.2 SP5 patch 2, we migrated all FIM jobs. But one of the jobs was created directly in Data Services Designer, and it does not work after migration to the new version.


After execution starts, the job freezes at this step:

1452468 DATAFLOW 27.11.2015 9:56:29 Data flow <fcprd2csv_masterdata> using IN MEMORY Cache.

The Job Server event log does not contain any strange entries.

The Job Server is configured correctly, because we have many FIM jobs and they all work fine. But this problematic job was created directly in Data Services Designer, and unfortunately we don't have sufficient knowledge of Data Services jobs.

Can you please help me? Maybe it's a bug?

Job Server event log:

Accepted Solutions (0)

Answers (3)

Olj
Participant

Experts,

What else can I do?

I'm already in despair!

former_member187605
Active Contributor

I cannot understand how a change in code page could have such a dramatic impact on data flow performance. Hence my question: what else has changed?

During installation there's a prompt about reusing the dsconfig.txt file. Have you replied no to that question?

Do you still have the settings of your 4.1 installation? If so, let's start by setting the AL_ENGINE parameter in dsconfig.txt back to its original value. That should remove the transcode operations (assuming you're still running on the same platform).
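
If you still have both installations, a quick way to spot what actually changed is to diff the two dsconfig.txt files. Here is a minimal Python sketch (the paths are hypothetical; point them at your own 4.1 and 4.2 copies):

import difflib

OLD = r"C:\DS41\conf\dsconfig.txt"   # hypothetical path to the 4.1 copy
NEW = r"C:\DS42\conf\dsconfig.txt"   # hypothetical path to the 4.2 copy

with open(OLD, encoding="utf-8", errors="replace") as f:
    old_lines = f.readlines()
with open(NEW, encoding="utf-8", errors="replace") as f:
    new_lines = f.readlines()

# Print only the lines that differ between the two files.
for line in difflib.unified_diff(old_lines, new_lines, fromfile=OLD, tofile=NEW):
    print(line, end="")

Any changed engine or locale parameter will show up in the diff output.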

Olj
Participant

Hi Dirk,

I don't think the change in code page is the problem.

We did a fresh (clean) installation with the default dsconfig on a new server.

But on the old (original) server we also use the default dsconfig (nothing was changed after installation).

After the new installation I imported the FIM jobs via FIM, and everything works fine.

Then I imported this problematic customized job. The job works, but takes far too long.

One of the data flows runs too long, and I don't know why.

I already tried to find a visual difference between the two jobs; they are identical.

What else can I do?

former_member198401
Active Contributor

Hi Daulet,

We also faced this performance issue after the upgrade, but I was able to figure out what was causing the data flow to run so long. I know this will be a nightmare for you, but you will have to analyze what is causing the issue.

Can you send me a screenshot of the data flow and the join being used?

Regards

Arun Sasi

Olj
Participant

Hi Arun,

I attached video (DSVIDEOPR - YouTube).

Please tell me what I can do to help you help us.

former_member187605
Active Contributor

Default means DS takes the default value from the OS. A new server might mean a different codepage than before.
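
To verify that, here is a minimal sketch (standard-library Python, nothing DS-specific) you can run on both the old and the new server to compare their OS-level default encodings:

import locale
import sys

# The encoding Python derives from the OS regional settings.
print("preferred encoding:", locale.getpreferredencoding())
# The encoding used for file system paths.
print("filesystem encoding:", sys.getfilesystemencoding())

If the two servers print different values, DS may pick up different defaults as well.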

And FYI: I personally never build any data flows that don't fit on my screen.

Olj
Participant

Dirk,

The OS regional settings are the same.

Unfortunately we cannot find the consultant who customized this job.

It seems like a basis problem, or maybe a bug.

I already tried installing DS 4.2 SP6 (released last week); it doesn't help.

Olj
Participant

Hi Dirk, Arun,

This weekend I tried to find out from which version of DS the job starts running too long.

And after many installations and uninstallations I found it!

The job runs too long on DS 4.2 SP3 patch 4 and newer.

On DS 4.2 SP3 patch 3 and lower, the job works fine (about 3 minutes).

I also compared the trace logs between patch 3 and patch 4:

You can see that patch 4 tries to transcode some datastores.

How can I disable this transcoding in DS Designer, like in patch 3?

PS: The datastores on patch 3 and patch 4 are the same, and both patches are installed on the same server.

former_member198401
Active Contributor

Daulet,

I found an option in Designer, but with only a partial explanation and workaround, so I cannot confirm whether it is related to your issue. Can you try running the job with only the problematic data flow, start Task Manager (or Process Monitor), and see whether the al_engine process is spawned and consuming lots of memory? If that is the case, then apply the setting below:

Uncheck the option "Automatically calculate column mappings" under Tools > Options > Designer > General in Designer, and then try to run the job.

Regards

Arun Sasi

Olj
Participant

Arun,

I tried to disable that option, but it doesn't help.

former_member198401
Active Contributor

In that case you can go back to 4.2 SP3 patch 3, where everything was working fine. Also, raise a ticket with the SAP Support team about the issue.

Regards

Arun Sasi

former_member187605
Active Contributor

Transcoding is only activated when your character sets are not identical end-to-end.

I do not understand how a patch install can have an effect on this, unless it's a bug. Therefore, contact SAP Support.
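
For intuition only (a toy illustration in Python, not DS internals): forced transcoding means every value gets re-encoded, and over millions of rows that per-value work adds up. A minimal sketch, assuming a Russian single-byte codepage such as cp1251:

import time

# One million two-field rows with Cyrillic content.
rows = [("Москва", "материал %06d" % i) for i in range(1_000_000)]

start = time.perf_counter()
for a, b in rows:
    # Simulate a transcode step: encode and decode each field once.
    a.encode("cp1251").decode("cp1251")
    b.encode("cp1251").decode("cp1251")
print("transcode pass: %.1f s" % (time.perf_counter() - start))

The absolute numbers are meaningless; the point is that an extra per-value operation appears in the pipeline where there was none before.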

former_member187605
Active Contributor

It seems your job hangs, waiting for the source to become available. I assume the record count for the fcprd2csv_masterdata data flow in the monitor file is 0, isn't it?

Can you check connectivity?
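
For a basic reachability test from the job server host, here is a minimal Python sketch (the host and port below are placeholders for your actual source endpoint):

import socket

HOST = "bofc-server.example.com"  # hypothetical BOFC host
PORT = 443                        # hypothetical web service port

try:
    # Open and immediately close a TCP connection to the source.
    with socket.create_connection((HOST, PORT), timeout=5):
        print("TCP connection to %s:%d OK" % (HOST, PORT))
except OSError as exc:
    print("cannot reach %s:%d -> %s" % (HOST, PORT, exc))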

Olj
Participant

Hi Dirk, Arun,

Thanks, but the record count of the fcprd2csv_masterdata data flow is not 0. Connectivity to the source is 100% available.

I waited for the job to finish, and it completed successfully after 3 hours. That is far too long for this job, because on the original (old DS) system it runs in about 3 minutes.

The step "Data flow <fcprd2csv_masterdata> using IN MEMORY Cache" is taking too long. What could be wrong with the data flow?

(I tried disabling the cache as shown by Arun, but it doesn't help.)

PS: I also saw that the new DS Designer shows some log messages about the codepage, etc.

example:

ORIGINAL (OLD DS):

NEW DS:

former_member198401
Active Contributor

Hi Daulet,

What is your source database? Are you trying to fetch data from a database or an ERP application? It looks like you are using a Russian codepage.

Can you try using the UTF-8 codepage in the datastore?

Regards

Arun Sasi

Olj
Participant

Hi Arun,

The source is BOFC 10 web services, but the job also uses a SQL Server database.

In short, the job takes consolidated data from BOFC and, after the transforms, writes the data to a CSV file.

Sorry for my limited knowledge; I'm from the basis team.

How can I change the datastore codepage to UTF-8?

What if I export the job to an ATL file and send it to you? Or would that be useless?

former_member198401
Active Contributor

Hi Daulet,

If you are using a web service, then you might have created an HTTP Adapter datastore. Can you create a new HTTP Adapter datastore and use your BOFC web service URL to import all the functions that are needed? Then you need to modify the function call used in the query transform and point it to the imported functions in the newly created datastore.

1) Go to the Datastore section of the Local Object Library and create a new datastore

2) Import the external metadata by right-clicking the new datastore and clicking Open

Regards

Arun Sasi

Olj
Participant

Arun,

I tried recreating the datastore, but it doesn't help.

I'm also attaching the monitor log (sorted by elapsed time):

former_member198401
Active Contributor

Check the suggestion by Dirk.

Can you please check the join in the query transform? It seems this join is causing the issue.

Regards

Arun Sasi

Olj
Participant

Hi Arun,

I opened the transform; how can I check the join in the query?

Screenshot of transform:

former_member198401
Active Contributor

Can you change the properties of the data flow fcprd2csv_masterdata, set the Cache type to Pageable, and then run the job?

Right-click the data flow and click Properties.

Change the Cache type to Pageable.

Also open the data flow and change the Cache option of the source table to No.
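
For intuition (a toy analogy in Python, not DS internals): an in-memory cache keeps every cached row in RAM, while a pageable cache can spill to disk, trading lookup speed for a bounded memory footprint:

import os
import shelve
import tempfile

# In-memory "cache": fast, but grows with the data set.
mem_cache = {i: "row %d" % i for i in range(1000)}

# Disk-backed "cache": bounded RAM use, slower per lookup.
path = os.path.join(tempfile.mkdtemp(), "cache")
with shelve.open(path) as disk_cache:
    for i in range(1000):
        disk_cache[str(i)] = "row %d" % i  # shelve keys must be strings
    print(disk_cache["42"])

That is why switching to pageable can help when al_engine is exhausting memory.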

Regards

Arun Sasi

Olj
Participant

Hi Arun,

Thanks.

But it doesn't work:

former_member198401
Active Contributor

Hi Daulet,

It seems that you are using a nested schema in the data flow. Are you using a function call to call a web service?

Can you provide more details about the data flow? Also, can you check inside the data flow whether any query transform has an option like 'Run as a separate process' set in the Advanced tab?

Is the BODS Job Server installed on a Linux server?

Regards

Arun Sasi

Olj
Participant

Arun,

I checked all the transforms in the data flow, and there is no special option set in the Advanced tab.

The BODS Job Server is running on Windows Server 2012 R2.

former_member198401
Active Contributor

Do you have any nested schemas in your data flow?

Regards

Arun Sasi

former_member198401
Active Contributor

As Dirk suggested, can you also check whether your source contains a huge amount of data?

Try applying some filters in your query transform (for example, a restrictive condition in the WHERE tab) to check whether data comes through for a few records.

Regards

Arun Sasi