on 06-29-2015 6:29 PM
We have a Data Services job that processes data in the following steps:
1) map - load data from oracle to HANA
2) validate - validate data in HANA table using validate function.
3) enrich - map required data to new values using the lookup_ext function.
4) load - load data in SAP BW system.
These jobs were working in the old environment. On the new pre-prod BODS server, the job gets stuck after loading a certain number of records, usually in the validate phase. The source HANA table and the lookup table are in different schemas. We tried using the same schema for both, but the job still gets stuck. Any idea what the reason could be?
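For context, the four steps above can be sketched roughly as follows. This is only an illustrative Python sketch of the flow, not the actual job objects; all function and column names here are hypothetical:

```python
# Illustrative sketch of the map -> validate -> enrich -> load job.
# Column names (LOAD_DATE, ARTICLE_NO) follow the thread; everything
# else is a stand-in for the real Data Services transforms.
from datetime import datetime

def map_stage(oracle_rows):
    # 1) map: copy rows from the Oracle source into the HANA staging structure
    return [dict(row) for row in oracle_rows]

def validate(rows):
    # 2) validate: keep rows whose LOAD_DATE parses as YYYYMMDD
    passed, failed = [], []
    for row in rows:
        try:
            datetime.strptime(row["LOAD_DATE"], "%Y%m%d")
            passed.append(row)
        except (ValueError, TypeError):
            failed.append(row)
    return passed, failed

def enrich(rows, lookup):
    # 3) enrich: map required data to new values via a lookup table
    for row in rows:
        row["ARTICLE_NO_NEW"] = lookup.get(row["ARTICLE_NO"])
    return rows

def load(rows):
    # 4) load: hand the rows to the SAP BW target (stubbed here)
    return len(rows)
```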
Is anybody aware of the general error "10108 Session has been reconnected"? I got this error when running these jobs. Could it be the reason the jobs have been getting stuck for the past few days?
Hi Swati,
What type of cache specification are you using for your lookup tables (PRE_LOAD_CACHE or DEMAND_CACHE)?
DEMAND_CACHE makes sense where the lookup table is large. Try it if you currently have PRE_LOAD_CACHE as the cache spec for all the lookup functions.
Some more information is needed:
1) What is the version of Data Services on the server?
2) Are you using any custom condition in the Validate transform?
3) Did you upgrade Data Services, or was it a fresh install on the pre-prod server?
4) Have you hard-coded the lookup_ext() function in the column mappings, or created a New Function Call in the target schema?
Regards
Arun Sasi
1) Data Services version is 4.2.
2) We are using a custom condition in Validate only for dates. To check whether a given date is a valid date, we are using the condition below:
is_valid_date(Validate_Lookup_Pass.LOAD_DATE, 'YYYYMMDD')=1
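For spot-checking suspect rows outside Data Services, the same YYYYMMDD check can be reproduced in plain Python. This is only a stand-in for the built-in DS is_valid_date() function, useful for testing sample data:

```python
from datetime import datetime

def is_valid_date(value, fmt="%Y%m%d"):
    """Return 1 if value parses with the given format (YYYYMMDD), else 0.

    A Python stand-in mirroring the Data Services is_valid_date()
    check used in the Validate transform; not the DS function itself.
    """
    try:
        datetime.strptime(value, fmt)
        return 1
    except (ValueError, TypeError):
        return 0
```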
3) No. There was no recent installation or upgrade.
4) I did not understand this question clearly. In the Query transform, in the mapping area, we are using the lookup_ext function. I have given an example in my reply to Mangesh.
5) Sometimes the job gets stuck in the VALIDATE part after a certain record count. In VALIDATE, we are using only the Validation transform, so I am not sure the cache specification change will resolve the issue. Still, I will try it.
Thanks.
Hi Swati,
The is_valid_date() function might take time, as it is being used in the Validate transform. Also, could you call the function using Right Click > New Function Call in the target schema instead of using the lookup_ext() syntax in the mapping?
Let us know if there is any performance improvement after changing the cache spec to DEMAND_CACHE.
Regards
Arun Sasi
Do you use lookup functions in the validate part? How many records are we talking about?
Hi Mangesh,
Yes. We are using the lookup_ext function in the validate part. I have given one example below.
The record count is more than 2 million.
lookup_ext(
    [DS_LOOKUP.SCHEMA_LOOKUP.LKP_TABLE, 'PRE_LOAD_CACHE', 'MAX'],
    [ARTICLE_NO],
    [NULL],
    [LEGACY_COMPANY_CODE, '=', STAGE_TABLE.ARTICLE_NO]
) SET (
    "run_as_separate_process" = 'no',
    "output_cols_info" = '<?xml version="1.0" encoding="UTF-8"?><output_cols_info><col index="1" expression="no"/></output_cols_info>'
)
Hi Dirk,
Sorry if I was not clear. My source table has millions of records; the lookup table has thousands. If I understand correctly, PRE_LOAD_CACHE extracts the lookup table into memory, which is moderate in size in our case. The lookup then searches for each source value in the cached lookup table, which should be fast since all the lookup records are in memory. Whereas if I use DEMAND_CACHE, the lookup table is not pre-loaded into memory, and performance will be slower. Please correct me if I am wrong.
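The trade-off between the two cache specs can be illustrated with a small sketch. All names here are hypothetical; query_lookup_db stands in for a round trip to the HANA lookup table:

```python
# Sketch of the two lookup cache strategies (hypothetical helpers).
def query_lookup_db(key, table):
    # Stand-in for a database round trip to the lookup table.
    return table.get(key)

def preload_lookup(lookup_table):
    # PRE_LOAD_CACHE: read the whole lookup table into memory up front;
    # every subsequent probe is an in-memory dict hit.
    return dict(lookup_table)

def demand_cache_lookup(key, cache, lookup_table):
    # DEMAND_CACHE: fetch a key from the database only when it is first
    # requested, then keep it cached for repeat probes.
    if key not in cache:
        cache[key] = query_lookup_db(key, lookup_table)
    return cache[key]
```

With a lookup table of thousands of rows probed millions of times, pre-loading it once (your current setup) is the usual choice; DEMAND_CACHE tends to win only when the lookup table is too large to hold in memory or when few distinct keys are ever probed.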