on 07-22-2014 4:45 PM
Hi,
I am exporting BODS jobs to an executable to be run by an external scheduler, and would like to know if it is possible to pass an input parameter to the job from the command line (so that I can store the passed-in value in a global variable and use it in workflows)?
Thanks,
Zlat
You can pass global variable values using the -GV&lt;GlobalVariableName=value&gt; command-line option.
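As a rough sketch of what that looks like from a shell, assuming a hypothetical exported launch script named New_Job.sh and a hypothetical global variable $G_LOAD_DATE (substitute your own names):

```shell
#!/bin/sh
# Sketch only: "New_Job.sh" and $G_LOAD_DATE are placeholders for your
# exported launch script and your job's global variable.
JOB_SCRIPT="./New_Job.sh"
LOAD_DATE="2014-07-22"

# -GV takes <name>=<value>; the leading $ in the variable name must be
# escaped (or single-quoted) so the shell does not try to expand it.
CMD="$JOB_SCRIPT -GV\"\$G_LOAD_DATE=$LOAD_DATE\""
echo "$CMD"
```

The quoting is the part that usually bites people: without the backslash the shell would expand $G_LOAD_DATE to an empty string before the job engine ever sees it.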
Hi Zlat,
While exporting job expand "Global Variables" and pass the value before clicking on Export.
[Screenshot of the Export Execution Command dialog]

Job: New_Job
File name:
Job Server or Server Group:
Export Job Server:
Enable auditing:
Disable data validation statistics collection:
Enable recovery:
Recover from last failed execution:
Use password file:
Collect statistics for monitoring:
Collect statistics for optimization:
Use collected statistics:
Export Data Quality reports:
Distribution level:

Global Variables:
$START_DATETIME (datetime):
$START_DATE (date):
$START_TIME (time):
$END_DATETIME (datetime):
$END_DATE (date):
$END_TIME (time):
$ROW_COUNT (int):
You can do this using web services Run_Batch_Job, or you can export the execution command and wrap some code around it to populate the substitution and/or global variables.
The variable contents are now encrypted by default; if you want to use plain text, I believe the switch for that is -GI (or use al_encrypt.exe to encrypt the value before populating it). Just be careful with punctuation.
I also dynamically load them into the enterprise scheduler database, for load balancing. It works pretty slick.
Sure is. You can also execute them using a real-time datastore, but that would still require feeding the containing dataflow with your parameters. (which could easily be tabled at that point).
I think it's easier to just use the .bat files with the scheduler. If you go the other way, you have to figure out a good way to get the executing job's status; Run_Batch_Job is fire-and-forget, which doesn't bode well with your scheduler. ;(
Yes, it is possible to supply the parameters on the fly in this way.
Say your job server is on a Unix box. You can edit the exported command file (a .sh file) like any other shell script and add variables inside it to accept input values from the command line. You can then assign these to the global variables via '-GV&lt;GlobalVariableName=value&gt;' (as mentioned by Aasavari Bhave in an earlier post) in the job execution command.
I'm sure this could be done in a Windows .bat file too. The wrapper can then be invoked from any other program that launches the job, be it a Java program, a third-party scheduler, etc.
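To illustrate the wrapper idea above, here is a minimal sketch of a shell wrapper that takes the value as its first argument and forwards it via -GV. The script name New_Job.sh and the variable $G_RUN_DATE are hypothetical placeholders; a real wrapper would execute $CMD rather than echo it.

```shell
#!/bin/sh
# Hypothetical wrapper around an exported BODS launch script.
# First argument comes from the external scheduler; fall back to a
# default here so the sketch is self-contained.
RUN_DATE="${1:-2014-07-22}"

# Forward the value to the job's global variable via -GV; the $ in the
# variable name is escaped so the shell leaves it alone.
CMD="./New_Job.sh -GV\"\$G_RUN_DATE=$RUN_DATE\""
echo "Would run: $CMD"
```

The scheduler then just calls the wrapper with the date (or whatever value) as an argument, and the exported command line stays untouched underneath.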