
Job scheduling in SAP BODS based on Conditional

Former Member
0 Kudos

Hi All,

I have two batch jobs. If the first job completes successfully, I need to capture its status as success, and then the second job should start running.

If the first job fails, I need to display the job error, and the second job should not start.

I need to schedule this. Kindly suggest a good way, and please share the information in detail.

Thanks

Selva

Accepted Solutions (1)


Former Member
0 Kudos

Hi

  • First, go to the Data Services Management Console --> the repository of "Second Job" --> Batch Job Configuration for "Second Job" --> Export execution command.

  • Import the ALVW_HISTORY view into a datastore whose backend is the repository database.

  • Create a new batch job, "Third Job".

  • Create two global variables, $LatestJobTimestamp and $StatusOfJob.

  • Then create the script as follows:

#Recent job start time
$LatestJobTimestamp = sql('Data Store NAME', 'SELECT MAX(START_TIME) FROM dbo.ALVW_HISTORY WHERE SERVICE = \'First Job Name\';');
print($LatestJobTimestamp);

#Recent status
$StatusOfJob = sql('Data Store NAME', 'SELECT STATUS FROM dbo.ALVW_HISTORY WHERE SERVICE = \'First Job Name\' AND START_TIME = \'[$LatestJobTimestamp]\';');
print($StatusOfJob);

if ($StatusOfJob = 'E')
begin
    print('First job failed');
    print('Re-run the job again');
end
else
begin
    print('First job executed successfully');
    print('Running the second job');
    print(exec('C:\ProgramData\SAP BusinessObjects\Data Services\log\Secondjobfilename.bat', '', 8));
end



Whenever the first job executes successfully, the second job is triggered from the Third Job using the above script.
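For clarity, the Third Job's logic can be sketched in Python. This is a stand-in, not Data Services code: sqlite3 simulates the repository database exposing ALVW_HISTORY, and launch() stands in for exec() on the exported .bat file; the column names SERVICE, START_TIME and STATUS follow the view as used above, and the job names are placeholders.

```python
import sqlite3

def latest_job_status(conn, job_name):
    """Return the STATUS of the most recent run of job_name from
    the repository's ALVW_HISTORY view (simulated here with SQLite)."""
    row = conn.execute(
        "SELECT STATUS FROM ALVW_HISTORY "
        "WHERE SERVICE = ? "
        "ORDER BY START_TIME DESC LIMIT 1",
        (job_name,),
    ).fetchone()
    return row[0] if row else None

def run_second_job_if_first_succeeded(conn, first_job, launch):
    """Mirror the Third Job's script: branch on the latest status and
    only call launch() (the exported .bat of the second job) on success."""
    status = latest_job_status(conn, first_job)
    if status == "E":  # 'E' = error in ALVW_HISTORY
        print("First job failed; re-run it before starting the second job")
        return False
    print("First job succeeded; starting the second job")
    launch()  # e.g. subprocess.run([r"C:\...\Secondjobfilename.bat"])
    return True

# Demo with an in-memory stand-in for the repository database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ALVW_HISTORY (SERVICE TEXT, START_TIME TEXT, STATUS TEXT)")
conn.execute("INSERT INTO ALVW_HISTORY VALUES ('First Job Name', '2016-01-01 10:00', 'E')")
conn.execute("INSERT INTO ALVW_HISTORY VALUES ('First Job Name', '2016-01-02 10:00', 'S')")

triggered = run_second_job_if_first_succeeded(conn, "First Job Name", launch=lambda: None)
```

The key design point is the same as in the DS script: only the most recent run of the first job is inspected, so an old failure does not block the second job once a later run succeeds.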


Hope this helps!


Regards,

Mubashir Hussain

Answers (2)


Former Member
0 Kudos

Hello

There are various possibilities to achieve what you require.  I would not go down the route of looping in a job waiting for a trigger (a file or table entry), or of putting any scheduling logic into the ETL jobs.

Have you considered putting all the logic in the same job?  If you have built your logic using workflows, it would be simple to drop the same workflow(s) into another job.  If that is not an option, the BIP/IPS scheduler can be configured to use triggers, so writing a file at the end of job A could trigger the scheduler to start job B.
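As a rough illustration of the file-trigger idea (plain Python, not the BIP/IPS scheduler itself; the file name and function names are invented for this sketch): job A drops a file as its final step, and a scheduler-side check consumes the file and starts job B.

```python
import os
import tempfile

def finish_job_a(trigger_path):
    """Last step of job A: drop a trigger file to signal success."""
    with open(trigger_path, "w") as f:
        f.write("JOB_A_OK")

def scheduler_check(trigger_path, start_job_b):
    """Stand-in for the scheduler's file-based event: if the trigger
    file exists, consume it and start job B."""
    if os.path.exists(trigger_path):
        os.remove(trigger_path)  # consume the event so it fires only once
        start_job_b()
        return True
    return False

# Demo in a throwaway directory.
trigger = os.path.join(tempfile.mkdtemp(), "job_a.done")
ran = []
before = scheduler_check(trigger, lambda: ran.append("job_b"))  # no file yet
finish_job_a(trigger)
after = scheduler_check(trigger, lambda: ran.append("job_b"))   # file found
```

Deleting the trigger file after it fires is what makes the event one-shot; a real scheduler needs an equivalent cleanup step so job B does not start on every poll.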

Michael

former_member200473
Contributor
0 Kudos

Hi Michael,

Could you please provide more information on the statement "the BIP/IPS scheduler can be configured to use triggers, so writing a file at the end of job A could trigger the scheduler to start job B"?

Shiva Sahu

Former Member
0 Kudos

Hello

This wiki page shows two techniques: http://wiki.scn.sap.com/wiki/pages/viewpage.action?pageId=318669651

Michael

Former Member
0 Kudos

I have not implemented this idea (however am due to soon) but:

Your first job could have a script as the last thing it does, writing to a table (inserting a row or updating a field) to show the job has finished. If the job errors to the point of failure, it will never reach this script (unless, of course, you add a try/catch block).

The second job would contain a conditional workflow looking for the specific value you wrote to the table. It would only run if the value were there.

To make this work, though, you would need to schedule both jobs to start at the same time. The second job would loop and check for the "success" value in the table, say every 5 minutes, by combining a loop with the sleep() function. For example, you could check every 5 minutes for 1 hour, and the second job would only execute its workflows once it finds the entry written to the table by the first job.

As I say, I haven't implemented this yet, but I can't see why it wouldn't work.
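The loop-and-sleep approach above can be sketched as follows (a Python stand-in; in Data Services this would be a while loop calling sleep() inside a script, and check_flag would be a sql() lookup of the status table — both names here are invented for the sketch):

```python
import time

def wait_for_success(check_flag, interval_s, timeout_s,
                     now=time.monotonic, pause=time.sleep):
    """Poll for the 'success' row written by the first job, sleeping
    between checks; give up after timeout_s. Checking every 5 minutes
    for 1 hour would be interval_s=300, timeout_s=3600."""
    deadline = now() + timeout_s
    while now() <= deadline:
        if check_flag():
            return True   # entry found: go on to run the workflows
        pause(interval_s)
    return False          # timed out: do nothing this run

# Demo with tiny intervals so it finishes instantly; the flag becomes
# true on the third check, simulating the first job finishing late.
attempts = []
found = wait_for_success(lambda: attempts.append(1) or len(attempts) >= 3,
                         interval_s=0.001, timeout_s=1.0)
```

The timeout matters: without it, a failed first job would leave the second job polling forever and holding a Job Server slot.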

Former Member
0 Kudos

I should probably clarify that this is a bit of a workaround, as we cannot seem to execute the job as a batch script after exporting the execution command.

Former Member
0 Kudos

Hi

I assume the script must be in a location on the Data Services Job Server? Does it have to be in a specific location to execute successfully?

Thanks

Dan

Former Member
0 Kudos

The job containing the script must be in the same repository as the other two jobs.

This condition is enough.

Regards,

Mubashir Hussain

Former Member
0 Kudos

Where does the actual batch file have to be though?

In your example you had it here:

'C:\ProgramData\SAP BusinessObjects\Data Services\log\Secondjobfilename.bat'

I am assuming this is the C: drive of the Data Services Job Server? Can it go anywhere else?

Former Member
0 Kudos

Once you click the Export button, the batch file is written to the machine on which the Job Server is located.