on 01-29-2015 6:50 AM
Dear Experts,
I'm new to BODS, so please bear with me if this is basic.
My requirement is to load Start_Time, End_Time and ROW_COUNT from the repository view alvw_flow_stat into a local table in the staging schema, along with various other details, to maintain audit data for each job. So I created a data flow to collect these stats based on RUNID and appended this DF at the end of my job. But it is not working, because the underlying repository table does not yet contain a committed record for the job while it is still running.
Then I created a separate job to collect the stats and ran it after my job, and that worked fine. But how do I achieve this in practice when I have 100+ jobs? It makes much more sense to collect the stats at the end of each job instead of completing all the jobs and then running the stats-collection job. So my challenge is how to link my job and the stats job. Is there any way in BODS to group them together, or is there a better way to implement this? Kindly help.
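For reference, the kind of pull the appended data flow attempts can be sketched as a plain SQL insert against the repository view. Column and table names below follow the post (STAGING.JOB_AUDIT is an example name for the local audit table); verify them against your DS repository version before use:

```sql
-- Sketch only: copy flow statistics for one execution from the
-- repository view into a local audit table in the staging schema.
-- This only returns rows once the repository has committed the
-- statistics record, which is why it fails inside the same job.
INSERT INTO STAGING.JOB_AUDIT (RUNID, START_TIME, END_TIME, ROW_COUNT)
SELECT RUNID, START_TIME, END_TIME, ROW_COUNT
FROM   ALVW_FLOW_STAT
WHERE  RUNID = :current_runid;  -- bind the RUNID of the finished job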
Thanks
RK
Instead of using the repository views, you had better create your own control tables to keep track of job-related statistics. This gives you full, euh... control over what you want to achieve.
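One possible shape for such a custom control table, sketched in plain SQL (illustrative names only; extend it with whatever details you want to track):

```sql
-- Hypothetical audit/control table populated by your own jobs,
-- independent of the repository views.
CREATE TABLE STAGING.JOB_AUDIT (
    RUN_ID        INTEGER       NOT NULL,
    JOB_NAME      VARCHAR(128)  NOT NULL,
    START_TIME    TIMESTAMP,
    END_TIME      TIMESTAMP,
    SRC_ROW_COUNT INTEGER,
    TGT_ROW_COUNT INTEGER,
    ERR_ROW_COUNT INTEGER
);
```

Because your own job writes these rows, they are committed and queryable the moment the job ends, avoiding the timing problem with the repository views.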
Thanks much, Dirk, for the instant reply. Indeed I have control tables; I want to fill them from the data already available in the repository. However, isn't it a laborious task to maintain the source record count, target record count and error record count in order to compute the total record count?
In general, how do we link two BODS jobs, the way we connect two work flows or data flows?
Sure, it requires some work. But you'll have to build this only once, and you can reuse it in all your jobs. And you don't have to reinvent the wheel: most DS customers use such a solution in one way or another. If you search this forum, you might come across quite some code samples that would suit your purpose well.
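As a minimal sketch of what such reusable logic looks like: an initialization script at the start of each job can log a row into your own control table with the `sql()` and `job_name()` built-in functions. The datastore name, table name and Oracle-style `SYSDATE` below are assumptions; adapt them to your environment (in DS script, `{ }` inside the SQL text substitutes the quoted value of an expression):

```
# Initialization script (sketch): record the job start in a custom
# control table. 'DS_STG' and JOB_AUDIT are example names only.
sql('DS_STG',
    'INSERT INTO JOB_AUDIT (JOB_NAME, START_TIME) VALUES ({job_name()}, SYSDATE)');
```

A matching script at the end of the job would update the same row with END_TIME and the record counters.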
There are several ways to link DS jobs together. For you, the easiest would be to export the execution command of your 2nd job. You do this in the DS Management Console (Administrator). It will generate a .bat file that you can run to start the job. Execute it with the exec() built-in function in a script at the end of the 1st job.
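The end-of-job script can then be as short as the sketch below. The path and file name are examples only (use wherever the Management Console exported your .bat file), and the third argument of `exec()` is the execution-mode flag; check the Reference Guide for the flag values your version supports:

```
# Script at the end of Job_1 (sketch): launch the exported execution
# command of the stats job. Path is a placeholder, not a real location.
$G_RetVal = exec('C:\\ExportedCommands\\Job_Stats.bat', '', 8);
print('Stats job launched, return value: ' || $G_RetVal);
```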
Search for "schedule" in this forum and you'll find plenty of examples.
Fantastic, Dirk, that's really helpful. I will do some searching for the code snippets.
I have one last question. I did this for the job (Batch Job Configuration -> Export Execution Command) and it said the export was successful, but I couldn't locate the output/batch file. Where does it get stored?