on 11-27-2015 4:25 AM
Hi,
After upgrading Data Services from 4.1 SP1 to 4.2 SP5 Patch 2, we migrated all FIM jobs. One job, however, was created directly in Data Services Designer, and it does not work after migration to the new version.
When executed, the job freezes at this step:
1452 | 468 | DATAFLOW | 27.11.2015 9:56:29 | Data flow <fcprd2csv_masterdata> using IN MEMORY Cache. |
The job server event log does not contain any unusual rows.
The job server is configured correctly, because we have many FIM jobs and they all work fine. But this problem job was created directly in Data Services Designer, and unfortunately we don't have sufficient knowledge about Data Services jobs.
Can you please help me? Maybe it's a bug?
Job server event log:
Experts,
What else can I do?
I'm getting desperate!
I cannot see how a change in code page alone could have such a dramatic impact on data flow performance. Hence my question: what else has changed?
During installation there's a prompt about reusing the dsconfig.txt file. Did you answer no to that question?
Do you still have the settings of your 4.1 installation? If so, let's start by setting the AL_ENGINE parameter in dsconfig.txt back to its original value. That should remove the transcode operations (assuming you're still running on the same platform).
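If you can still open the 4.1 DSConfig.txt, the locale-related entries sit in the [AL_Engine] section. As a sketch only (the key names below are illustrative placeholders and can vary between DS versions, so copy the exact names and values from your 4.1 file rather than from here):

```ini
[AL_Engine]
; Locale the engine assumes for its own processing; a mismatch between
; this and the datastore code pages is what triggers extra transcode steps.
; Key names here are illustrative -- take the exact ones from the 4.1 file.
EngineLocaleLanguage = en
EngineLocaleTerritory = US
EngineLocaleCodepage = utf-8
```

Restoring these to the 4.1 values (and restarting the job server) is what should make the transcode operations disappear, assuming the platform is otherwise unchanged.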
Hi Dirk,
I don't think the code page change is the problem.
We did a fresh (clean) installation with the default dsconfig on a new server.
But on the old (original) server we use the default dsconfig too (nothing was changed after installation).
After the new installation I imported the FIM jobs via FIM, and they all work fine.
Then I imported the problematic customized job. It runs, but takes far too long.
One of its data flows runs far too long, and I don't know why.
I have already tried to find visual differences between the two jobs; they are identical.
What else can I do?
Hi Daulet,
We also faced this performance issue after the upgrade, but I managed to figure out what was causing our data flow to run so long. I know this will be a nightmare for you, but you will have to analyze what is causing the issue in your case.
Can you send me a screenshot of the data flow and the join being used?
Regards
Arun Sasi
Hi Dirk, Arun,
Over the weekend I tried to find out from which DS version the job starts running too long.
After many installations/uninstallations, I found it!
The job runs too long on DS 4.2 SP3 Patch 4 and newer.
On DS 4.2 SP3 Patch 3 and lower the job works fine (about 3 minutes).
I also compared the trace logs between Patch 3 and Patch 4:
You can see that Patch 4 tries to transcode some datastores.
How can I disable this transcoding in DS Designer, like in the Patch 3 version?
PS: The datastores on Patch 3 and Patch 4 are the same, and both patches are installed on the same server.
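If the trace logs are long, the comparison can also be done mechanically. A minimal sketch, assuming plain-text trace logs as written by the job server; the sample lines below are illustrative, not copied from a real trace:

```python
import re

def transcode_lines(trace_text):
    """Return the lines of a DS trace log that mention transcoding.

    Matches 'transcode'/'Transcoding' case-insensitively, so it catches
    the extra code-page conversion messages the Patch 4 log showed.
    """
    return [ln for ln in trace_text.splitlines()
            if re.search(r"transcod", ln, re.IGNORECASE)]

# Illustrative sample, loosely shaped like a DS trace log:
sample = (
    "1452|468|DATAFLOW|27.11.2015 9:56:29|Data flow <fcprd2csv_masterdata> "
    "using IN MEMORY Cache.\n"
    "1452|468|DATAFLOW|27.11.2015 9:56:30|Transcoding datastore <DS_BOFC> "
    "from <cp1251> to <utf-8>.\n"
)
print(transcode_lines(sample))  # only the Transcoding line survives
```

Running this over the Patch 3 and Patch 4 logs and diffing the two result lists shows exactly which datastores pick up transcode steps after the patch.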
Daulet,
I found an option in Designer, but only with a partial explanation and workaround, so I cannot confirm whether it is related to your issue. Can you try running the job with only the problematic data flow, start Task Manager (or Process Monitor), and check whether the al_engine process is spawned and consuming lots of memory? If that is the case, then apply the setting below.
Uncheck the option "Automatically calculate column mappings" under Tools/Options/Designer/General in Designer, then try running the job again.
Regards
Arun Sasi
It seems your job hangs, waiting for the source to become available. I assume the record count for the fcprd2csv_masterdata data flow in the monitor file is 0, isn't it?
Can you check connectivity?
Hi Dirk, Arun,
Thanks, but the record count of the data flow fcprd2csv_masterdata is not 0, and source connectivity is 100% available.
I waited for the job to finish, and it completed successfully after 3 hours. That is far too long, because on the original (old DS) system the job runs in about 3 minutes.
The step "Data flow <fcprd2csv_masterdata> using IN MEMORY Cache" is what takes so long. What could be wrong with the data flow?
(I tried disabling the cache as Arun suggested, but it doesn't help.)
PS: I also noticed that the new DS Designer shows some log messages about code pages, etc.
example:
ORIGINAL (OLD DS):
NEW DS:
Hi Arun,
The source is BOFC 10 web services, but the job also uses a SQL Server database.
In short, the job takes consolidated data from BOFC and, after transforms, writes the data to a CSV file.
Sorry for my limited knowledge; I'm from the Basis team.
How can I change the UTF-8 code page in the datastore?
What if I export the job to an ATL file and send it to you? Or would that be useless?
Hi Daulet,
If you are using a web service, then you have probably created an HTTP Adapter datastore. Can you create a new HTTP Adapter datastore, use your BOFC web service URL to import all the functions that are needed, and then modify the function call used in the query transform to point it at the imported functions in the newly created datastore?
1) Go to the Datastore section of the Local Object Library and create a new datastore.
2) Import the external metadata by right-clicking on the new datastore and clicking Open.
Regards
Arun Sasi
Hi Daulet,
It seems that you are using a nested schema in the data flow. Are you using a function call to call a web service?
Can you provide more details about the data flow? Also, can you check inside the data flow whether any query transform has an option such as 'Run as a separate process' set in the Advanced tab?
Is the BODS job server installed on a Linux server?
Regards
Arun Sasi