
Live video processing

Posted: Fri Jul 26, 2019 6:52 am
by Terry21s
Hello,

Is there any chance you will add live video processing in the future?
i.e.
Get video from a Blackmagic board and create an HLS stream for web use.

Regards
Terry

Re: Live video processing

Posted: Fri Jul 26, 2019 11:15 am
by emcodem
Hey Terry,

Good question!
FFAStrans targets automated background workflows with files. How would you imagine the process being started and stopped for live capturing? Something like a special interface with a start and stop button?

Re: Live video processing

Posted: Sat Jul 27, 2019 7:26 am
by Terry21s
Yes, something like that.
I imagine a workflow with different properties:
No sleep timer
Priority always high

Live video would be processed as a file, but it has to be buffered first, and the buffered chunks must be kept in perfect sequence.
In essence, the sleep time would be 0.1 of a second monitoring the buffer, and chunks should be 0.2 to 1 seconds.
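
For illustration, ffmpeg's segment muxer can already cut a live input into small, strictly ordered chunks along those lines; this is only a rough sketch, and the DirectShow webcam name and settings are assumptions, not anything FFAStrans does today:

Code:

ffmpeg -f dshow -i video="USB Camera" -c:v libx264 -g 12 -f segment -segment_time 0.5 -reset_timestamps 1 chunk_%05d.ts

The increasing segment numbers in the file names are what keep the chunks in sequence.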

In terms of processors, I imagine the following to start with (see the sketch further below):
1. Live input: BMD board, webcam, web live stream, etc.
2. Encoding: H.264, H.265, etc. Wrapper: MPEG-2 transport stream, MP4.
3. File delivery: in essence, recording to a file
and/or
4. Stream: with options like type, chunk size, key frame interval, list size, etc.

All of the above are supported by FFmpeg, it's only a matter of putting them together...
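
Just as a sketch of "putting them together": a single ffmpeg command could capture from a DeckLink board, encode H.264, and write both a transport-stream recording and an HLS stream via the tee muxer. The device name, bitrate and segment settings below are assumptions, untested:

Code:

ffmpeg -f decklink -i "DeckLink 8K Pro (1)" -pix_fmt yuv420p -c:v libx264 -b:v 5M -g 25 -c:a aac -map 0:v -map 0:a -f tee "[f=mpegts]record.ts|[f=hls:hls_time=1:hls_list_size=6]stream.m3u8"

That would roughly cover 1 (live input), 2 (encoding and wrapper), 3 (the mpegts file delivery) and 4 (the HLS stream, with chunk size via hls_time and list size via hls_list_size).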

I know this is extremely difficult as it's not really compatible with the way FFAStrans is set up... but wishful thinking never hurt anyone.. lol

In any case, if this can happen, FFAStrans will become the ultimate workflow manager...

I might have everything wrong, as I'm not really a programmer, but it makes sense to me...

Re: Live video processing

Posted: Sat Jul 27, 2019 11:46 am
by emcodem
I believe you are on a good path in understanding what it means to support live from FFAStrans' current perspective; your example about the job with no sleep etc. hits it quite well. Sure, there is some complex stuff to do in addition, but this is one of the basics that would need to be added.
But how would you imagine the process being started and stopped: only by manual interaction from a user, or something different? E.g. the user opens a special UI and checks if the needed resources are there (capture board free and connected, processing resources)...
Could you describe that a little more, please?

You might want to know that, from a programmer's perspective, we need to concentrate on 3 things: input, transformation, output. The transformation part is clear: a running encoding process that captures from live. But what about input and output... E.g. do you imagine a different input source than manual interaction from a user, etc.?

Also, why is it interesting for you to see live and offline file jobs in the same interface; wouldn't it be more useful to have one for live only and another one for files? And is it really desirable that live and file processing share the same resources (processors); shouldn't live have dedicated resources because of the capture card, etc.?

Re: Live video processing

Posted: Mon Aug 05, 2019 1:29 pm
by emcodem
For a start, here is a version of ffmpeg compiled with decklink support, along with a little GUI for playout.
Sure, there is not yet a direct relation to the topic "Live video processing", but it might help getting started with further experiments, and maybe one or the other user is looking for it.

The ffmpeg version is something like 4.1 or 4.2 (N-93406-g4b32f8b3eb)
https://github.com/emcodem/ffplayout/releases/tag/0.1

Common usage examples:

List all decklink devices:

Code:

ffmpeg -f decklink -list_devices 1 -i dummy

Play a file on decklink:

Code:

ffmpeg -i input.mxf -f decklink -vcodec v210 -c:a pcm_s24le -ar 48000 -ac 8 "DeckLink 8K Pro (1)"
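
Capture from decklink to a file (only a sketch; the device name and encoder settings are assumptions and not tested with this build):

Code:

ffmpeg -f decklink -i "DeckLink 8K Pro (1)" -pix_fmt yuv420p -c:v libx264 -c:a aac capture.mp4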
For building ffmpeg, https://github.com/jb-alvarado/media-autobuild_suite was used.