Technology Blogs by Members

Introduction:

This article presents several approaches for scheduling multiple BODS batch jobs sequentially and conditionally. BODS has no built-in mechanism for chaining multiple jobs within one parent job; the standard approach is to chain multiple workflows within a single job. However, a workflow cannot be executed on its own (it always needs a job to run it), and in many scenarios the jobs themselves must be sequenced and run conditionally. The approaches below can be used where chaining workflows inside one job is not enough and the jobs themselves have to be chained/sequenced.

The advantages of using the approaches below are:

1. No third-party scheduling tool is needed. Various features within BODS can be combined to create a parent job that can be scheduled to trigger the other jobs one after the other; the parent job acts as a sequencer.

2. Only the parent jobs need to be scheduled, instead of each and every job.

3. With the web-services approach, global variables can be passed to the jobs in a simplified manner via an XML file.

4. With the web-services approach, the developer only needs access to a folder that the Job Server can read (to place the XML files) and does not require access to the Job Server itself.

5. It avoids loading a single job with too many workflows.

6. Time-based scheduling (for example, scheduling jobs at 10-minute intervals) can be avoided, so there is no overlap if a preceding job takes longer than 10 minutes.

7. The child jobs and the parent job each have their own trace logs, which makes troubleshooting easier in case of any issues.

8. Child jobs can still be run independently in the production environment at any point; this would not be possible if the entire job logic were moved into a workflow.

Scheduling BODS Jobs Sequentially:

If the requirement is simply to sequence the jobs so that they execute one after the other, regardless of whether the preceding job completes successfully or terminates with an error, one of the approaches below can be used. Note that the examples below assume the jobs have no global variables; chaining/sequencing jobs with global variables is covered later in the article.

Sequencing using Script:

Consider two simple jobs, Job1 and Job2, that are to be executed in sequence, where Job2 has no business dependency on Job1. The requirement is to run only one job at a time: Job1 first and then Job2, or the other way round, but never both at once. This restriction could exist for various reasons, such as efficient utilization of the Job Server, or because both jobs use the same temp tables.

Steps to Sequence Jobs using Script:

1. Export the jobs as .bat files (on Windows) using the Export Execution Command option in the Management Console.

2. Check that Job1.bat and Job2.bat are available on the Job Server.

3. Create a new parent job (call it Schedule_Jobs) with just one script object.

4. In the script, call Job1 and Job2 one after the other using the exec function, as given below:

Print('Trigger Job1');

Print(exec('C:\Program Files (x86)\Business Objects\BusinessObjects Data Services\log\Job1.bat','',8));

Print('Trigger Job2');

Print(exec('C:\Program Files (x86)\Business Objects\BusinessObjects Data Services\log\Job2.bat','',8));

When the Schedule_Jobs parent job is run, it triggers Job1 and then, after Job1 completes (successful completion or termination), it triggers Job2. The parent job can now be scheduled in the Management Console to run at a scheduled time, and it will trigger both Job1 and Job2 in sequence as required. Note that if Job1 hangs for some reason, Schedule_Jobs will wait until Job1 comes out of the hung state and returns control. In this way any number of jobs can be sequenced.
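Outside Designer, the same blocking-sequential pattern can be sketched in a few lines of Python. This is only an illustration of the idea behind the script above, with harmless stand-in commands in place of the exported Job1.bat and Job2.bat paths:

```python
import subprocess
import sys

def run_sequentially(commands):
    """Run each command to completion before starting the next,
    mirroring how exec() in the parent job's script blocks until
    the triggered .bat file returns."""
    results = []
    for cmd in commands:
        proc = subprocess.run(cmd, capture_output=True, text=True)
        results.append((proc.returncode, proc.stdout.strip()))
    return results

if __name__ == "__main__":
    # Stand-ins for the exported Job1.bat / Job2.bat execution commands.
    demo = [
        [sys.executable, "-c", "print('Job1 done')"],
        [sys.executable, "-c", "print('Job2 done')"],
    ]
    for rc, out in run_sequentially(demo):
        print(rc, out)
```

Because each command runs to completion before the next starts, no two jobs ever overlap, which is exactly the property the Schedule_Jobs script relies on.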

Sequencing using Webservices:

If the same two jobs (Job1 and Job2) have to be executed in sequence using web services, the approach below can be used.

1. Publish both Job1 and Job2 as web services from the Management Console.

2. Pick up the web-service URL using the View WSDL option; the link will be as given below:

               http://<hostname>:28080/DataServices/servlet/webservices?ver=2.1&wsdlxml

    

3. In Designer, create a new datastore with datastore type WebService and provide the web-service URL fetched from the View WSDL option.

4. Import the published jobs as functions.

5. Create a simple parent job (call it Simple_Schedule) to trigger Job1 and Job2.

6. In the Call_Job1 query object, call Job1 as shown in the diagrams below. As no inputs are required for Job1, the DI_ROW_ID from the Row_Generation transform, or Null, can be passed to Job1.

7. Similarly, call Job2 in the Call_Job2 query object.

When the Simple_Schedule parent job is run, it triggers Job1 and then, after Job1 completes (successful completion or termination), it triggers Job2. The parent job can now be scheduled in the Management Console to run at a scheduled time, and it will trigger both Job1 and Job2 in sequence as required. Note that if Job1 hangs for some reason, the parent job will wait until Job1 comes out of the hung state and returns control. In this way any number of jobs can be sequenced.
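Under the hood, the imported function wraps a plain SOAP call to the published job. As a rough sketch of what such a request looks like, the snippet below builds a SOAP 1.1 envelope; note that the operation name and target namespace used here are placeholders, since the real values must be read from the View WSDL output:

```python
import xml.etree.ElementTree as ET

SOAP_ENV = "http://schemas.xmlsoap.org/soap/envelope/"

def build_soap_request(operation, body_xml, ns):
    """Wrap a job-specific payload in a SOAP 1.1 envelope.
    'operation' and 'ns' are placeholders; the real operation name and
    target namespace must be taken from the published WSDL."""
    return (
        '<?xml version="1.0" encoding="UTF-8"?>'
        f'<soapenv:Envelope xmlns:soapenv="{SOAP_ENV}" xmlns:job="{ns}">'
        "<soapenv:Body>"
        f"<job:{operation}>{body_xml}</job:{operation}>"
        "</soapenv:Body></soapenv:Envelope>"
    )

# Placeholder operation/namespace -- not the real ones from the WSDL.
envelope = build_soap_request("Run_Job1", "<input>1</input>", "urn:placeholder-from-wsdl")
ET.fromstring(envelope)  # verifies the envelope is well-formed XML
```

Posting the envelope to the DataServices servlet endpoint is omitted here; within Designer, the imported function handles all of this transparently.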

Scheduling BODS Jobs Conditionally:

In most cases, jobs depend on other jobs, and some jobs should run only after all the jobs they depend on have completed successfully. In these scenarios jobs should be scheduled to run conditionally.

Conditional Execution using Script:

Let's consider that Job2 should be triggered only after successful completion (not termination) of Job1, and should not be triggered if Job1 fails.

The job status can be obtained from the repository table/view ALVW_HISTORY: check the status of the latest run of Job1 and trigger Job2 based on it. To do this:

1. Create the repository database/schema as a new datastore (call it BODS_REPO).

2. Import the ALVW_HISTORY view from the datastore.

3. Create a new parent job, Conditionally_Schedule_Using_Script, with just one script object.

4. Create two variables, $JobStatus and $MaxTimestamp, in the parent job.

5. Between the exec function calls, place the status check given in the code below:

Print('Trigger Job1');

Print(exec('C:\Program Files (x86)\Business Objects\BusinessObjects Data Services\log\Job1.bat','',8));

#Remain idle for 2 secs so that the job status is stable (status moves from S to D for a successful job, or to E on error)

Sleep(2000);

#Pick up the latest job Start time

$MaxTimestamp =  sql('BODS_REPO', 'SELECT  MAX(START_TIME) FROM DataServices.alvw_history WHERE SERVICE=\'Job1\';');

PRINT($MaxTimestamp);

#Check the latest status of the preceding job

$JobStatus = sql('BODS_REPO', 'SELECT STATUS FROM DataServices.alvw_history WHERE SERVICE=\'Job1\' AND START_TIME=\'[$MaxTimestamp]\';');

PRINT($JobStatus);

if ($JobStatus='E')

begin

PRINT('First Job Failed');

raise_exception('First Job Failed');

end

else

begin

print('First Job Success, Second Job will be Triggered');

end

Print('Trigger Job2');

Print(exec('C:\Program Files (x86)\Business Objects\BusinessObjects Data Services\log\Job2.bat','',8));

Using the above code in the script, when the parent job is run it triggers Job1 and, only if Job1 completed successfully, triggers Job2. If Job1 fails, the parent job is terminated via the raise_exception function. This approach can be used to conditionally schedule any number of jobs.
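The status-check pattern is simply "find the latest START_TIME for the job, then read its STATUS". The sketch below illustrates the same logic in Python against an in-memory SQLite table standing in for the repository's ALVW_HISTORY view ('D' for a successfully completed job, 'E' for an error, as above):

```python
import sqlite3

def latest_job_status(conn, job_name):
    """Return the STATUS of the most recent run of job_name
    (mimics the two sql() lookups in the parent job's script)."""
    cur = conn.execute(
        "SELECT STATUS FROM ALVW_HISTORY WHERE SERVICE = ? "
        "ORDER BY START_TIME DESC LIMIT 1",
        (job_name,),
    )
    row = cur.fetchone()
    return row[0] if row else None

# In-memory stand-in for the repository view.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ALVW_HISTORY (SERVICE TEXT, STATUS TEXT, START_TIME TEXT)")
conn.executemany("INSERT INTO ALVW_HISTORY VALUES (?, ?, ?)", [
    ("Job1", "E", "2015-01-01 10:00:00"),
    ("Job1", "D", "2015-01-01 11:00:00"),  # latest run succeeded
])

status = latest_job_status(conn, "Job1")
if status == "E":
    raise RuntimeError("First Job Failed")  # mirrors raise_exception()
print("First Job Success, Second Job will be Triggered")
```

Note that sorting by START_TIME and taking the top row collapses the two sql() calls of the BODS script into a single query; the decision logic is otherwise the same.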

Conditional Execution using Webservices:

To conditionally execute a job (published as a web service) based on the status of a preceding job (also published as a web service), the same concept used in the script-based conditional execution can be applied: call Job1, check the status of Job1, and trigger Job2 only if Job1 succeeded.

1. Create a parent job with two dataflows and a script between them.

2. Use the first dataflow to call the first job (refer to the section above for details on calling a job as a web service within another job).

3. Use the second dataflow to call the second job.

4. Use the script to check the status of the first job.

The script contains the following code to check the status:

#wait for 2 seconds

sleep(2000); 

#Pick up the latest job start time

$MaxTimestamp =  sql('BODS_REPO', 'SELECT  MAX(START_TIME) FROM DataServices.alvw_history WHERE SERVICE=\'Job1\';');

PRINT($MaxTimestamp);

#Check the latest status of the Preceding job

$JobStatus = sql('BODS_REPO', 'SELECT STATUS FROM DataServices.alvw_history WHERE SERVICE=\'Job1\' AND START_TIME=\'[$MaxTimestamp]\';');

PRINT($JobStatus);

if ($JobStatus='E')

begin

PRINT('First Job Failed');

raise_exception('First Job Failed');

end

else

begin

print('First Job Success, Second Job will be Triggered');

end

Using the above code in the script, when the parent job is run it triggers Job1 and, only if Job1 completed successfully, triggers Job2. This approach can be used to conditionally schedule any number of jobs that are published as web services.

Conditional Execution using Webservices: Jobs with Global Variables

Jobs that have global variables, whose values must be supplied at trigger time, need to be handled differently: when a job is called as a web service, it expects its global variables to be mapped. The idea is to pass either null values (for a scheduled run) or actual values (for a manual trigger) using an XML file as input.

Let's assume the first job has two global variables, $GV1Path and $GV2Filename, the second job has no global variables, and the requirement is to trigger Job2 immediately after successful completion of Job1.

1. As with the previous parent job, create a parent job with two dataflows and a script between them.

2. Use the first dataflow to call the first job (refer to the sections above for details on calling a job as a web service within another job), but use an XML input file as the source instead of a Row_Generation object.

The XSD for the input XML file is given below. If the job has more global variables, elements GV3, GV4, and so on should be added to the schema.

<xs:schema attributeFormDefault="unqualified" elementFormDefault="qualified" xmlns:xs="http://www.w3.org/2001/XMLSchema">

  <xs:element name="FIRSTJOB">

    <xs:complexType>

      <xs:sequence>

        <xs:element name="GLOBALVARIABLES">

          <xs:complexType>

            <xs:sequence>

              <xs:element type="xs:string" name="GV1"/>

              <xs:element type="xs:string" name="GV2"/>

            </xs:sequence>

          </xs:complexType>

        </xs:element>

      </xs:sequence>

    </xs:complexType>

  </xs:element>

</xs:schema>

The input XML file used is as given below:

<FIRSTJOB>

<GLOBALVARIABLES>

<GV1>testpath1</GV1>

<GV2>testfilename</GV2>

</GLOBALVARIABLES>

</FIRSTJOB>

3. In the "WebService function Call" in the call_FirstJob query object, map the global variables as shown below.

4. Use the second dataflow to call the second job. As this job has no global variables, a Row_Generation object is enough (as in the previous section).

5. Use the script object to check the status of the first job.

Using this approach, when the parent job is run it triggers the first job, passing the global variables from the input XML file, and only if the first job completes successfully does it trigger the second job. This approach can be used to conditionally schedule any number of jobs published as web services. An XSD and an XML file should be created for every job that has global variables. Passing global variables from the XML file to the web service appears to work only when the parameters are supplied in the right order, so it is good practice to name the global variables with a convention such as $GV1<name>, $GV2<name>, and so on.
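Because the mapping appears to be order-sensitive, it can be useful to verify which order the global variables come out of the input file. The stdlib sketch below parses the sample file shown above and returns the name/value pairs in document order:

```python
import xml.etree.ElementTree as ET

INPUT_XML = """
<FIRSTJOB>
  <GLOBALVARIABLES>
    <GV1>testpath1</GV1>
    <GV2>testfilename</GV2>
  </GLOBALVARIABLES>
</FIRSTJOB>
"""

def global_variables_in_order(xml_text):
    """Return (name, value) pairs in document order -- the order in
    which they will be mapped to the web-service parameters."""
    root = ET.fromstring(xml_text)
    return [(el.tag, el.text) for el in root.find("GLOBALVARIABLES")]

pairs = global_variables_in_order(INPUT_XML)
print(pairs)  # [('GV1', 'testpath1'), ('GV2', 'testfilename')]
```

A quick check like this makes it obvious when an edit to the XML file has reordered the variables relative to the GV1, GV2, ... convention.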

About me:

Anoop kumar V K, Technology Lead at Infosys Limited.
