setting max parallel running jobs
Hi all,
Is there a way to set the maximum number of parallel jobs on the queue?
When I start my transcoding there are 20 parallel jobs. I'd like a max of 5, for example.
Thanks!
Last edited by buddhabas on Wed Dec 18, 2024 12:28 pm, edited 1 time in total.
Re: queue control
It's a slightly confusing question: usually users want to set the maximum number of parallel running jobs, and I guess you have found that setting already.
If you really want to define the max jobs in the queue, you can actually do this, but of course only for watchfolder operations. API and manual jobs will just queue in normally (what else should they do).
In ffastrans.json, find the max_queue value. It is also on the web interface, ffastrans -> Config menu, "Debug" tab.
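As a rough sketch, the relevant entry in ffastrans.json might look like the fragment below; the real file contains many other settings, and 5 is only an example value:

```json
"max_queue": 5
```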
emcodem, wrapping since 2009 you got the rhyme?
Re: queue control
Thanks for the feedback.
emcodem wrote: ↑Mon Dec 16, 2024 10:26 pm
It's a slightly confusing question: usually users want to set the maximum number of parallel running jobs, and I guess you have found that setting already.
If you really want to define the max jobs in the queue, you can actually do this, but of course only for watchfolder operations. API and manual jobs will just queue in normally (what else should they do).
In ffastrans.json, find the max_queue value. It is also on the web interface, ffastrans -> Config menu, "Debug" tab.
I'd like to have a user_variable to set in Populate_variables,
something like "max_parallel_running_jobs".
Re: queue control
Sorry, but it's still not clear what you mean.
You can potentially alter the max_queue setting I mentioned above from a workflow, but I believe this is not what you want.
So if there were a user_variable, as you call it, named max_parallel_running_jobs, what exactly would it do?
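For completeness, altering max_queue from outside the GUI could be sketched with Python's standard library as below. Everything beyond the key name "max_queue" is an assumption: the file location, the surrounding structure of ffastrans.json, and whether FFAStrans picks up the change without a restart are not confirmed here, and emcodem's caveat that this is probably not what you want still applies.

```python
import json
from pathlib import Path

# Hypothetical location; the real ffastrans.json lives in your FFAStrans
# install directory, and its full contents are not shown in this thread.
CONFIG = Path("ffastrans.json")

def set_max_queue(path: Path, limit: int) -> None:
    """Rewrite the max_queue value in an ffastrans.json-style config file."""
    cfg = json.loads(path.read_text(encoding="utf-8"))
    cfg["max_queue"] = limit
    path.write_text(json.dumps(cfg, indent=4), encoding="utf-8")

# Demo against a tiny stand-in config file:
CONFIG.write_text(json.dumps({"max_queue": 20}), encoding="utf-8")
set_max_queue(CONFIG, 5)
print(json.loads(CONFIG.read_text(encoding="utf-8"))["max_queue"])  # prints 5
```

As described later in this thread, you would stop FFAStrans before editing the file and start it again afterwards.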
emcodem, wrapping since 2009 you got the rhyme?
Re: queue control
emcodem wrote: ↑Wed Dec 18, 2024 10:49 am
Sorry, but it's still not clear what you mean.
You can potentially alter the max_queue setting I mentioned above from a workflow, but I believe this is not what you want.
So if there were a user_variable, as you call it, named max_parallel_running_jobs, what exactly would it do?
If I start one of my workflows (I put in 20 files), I can see on the monitor all 20 files running in parallel.
I would like to limit that to 5.
Re: queue control
And why do you want to change this value via a "user_variable"?
emcodem, wrapping since 2009 you got the rhyme?
Re: setting max parallel running jobs
Understood, but does the max_queue setting in ffastrans.json actually do what you want?
emcodem, wrapping since 2009 you got the rhyme?
Re: setting max parallel running jobs
I closed FFAStrans.
In ffastrans.json I set:
"max_queue": 5
Started FFAStrans.
Manually submitted 20+ files to a WF.
I can see that all jobs started, not only 5 as set.
Why?
Re: setting max parallel running jobs
Are the 20 parallel jobs created because you're splitting the workflow into different sections that all execute at the same time, like what you're doing with the audio extractor + H.264 video encoder in the other topic?
If you don't want them to be executed in parallel, you can make them sequential. All you need is access to the original file, so you have two options: either populate s_source with s_original_full and start again from that source file (with a new A/V Decoder, and so on), or save the indexed source by putting a Populate Variables node after the first A/V Decoder to set s_my_source = s_source; then, after you've done your thing and delivered the first file, reference back to it with a Populate Variables node that sets s_source back to s_my_source.
I can send you a workflow as an example if you want, but what I mean is:
watchfolder
Encoding node
delivery node (output 1)
populate node (s_source = s_original_full)
Encoding node
delivery node (output 2)
etc