Hey @authorleon
What is "media encoder"?
To be honest, I would personally prefer to work on Quick Sync support first, because not everyone has an Nvidia card built in, but a lot of users have Quick Sync onboard with their Core i CPU.
Anyway, the topic was discussed in depth here:
viewtopic.php?f=5&t=1059
The final conclusion was that the FFAStrans team mostly works in the production sector, where hardware acceleration does not help much; it is mainly useful for delivery/archiving/live.
I have 2 FFAStrans setups running that drive Quick Sync workflows using a commandline processor which starts ffmpeg, but those setups are limited to "always the same source format". I noticed that the nvenc encoder in ffmpeg, in contrast to other encoders like libx264, does not tell ffmpeg to insert automatic filters like colorspace conversion, so depending on the source format the encoder throws different errors about wrong colorspace and such.
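So for hardware encoders you usually have to add the conversion yourself in the filter chain. Just as a minimal sketch of what I mean, assuming an nvenc-capable ffmpeg build; the file names and bitrate are only placeholders:
Code: Select all
ffmpeg -y -i "source.mxf" -vf "format=nv12" -c:v h264_nvenc -b:v 8M "out.mp4"
The "format=nv12" filter forces a conversion to a pixel format the hardware encoder accepts, so for example 10 bit or 4:2:2 sources don't make it error out right away.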
Here is an example that I use in production in a commandline processor to capture from a DeckLink card (note that I have a custom-built ffmpeg.exe with DeckLink support copied to c:\windows\system32):
Code: Select all
%comspec% /C "ffmpeg.exe -y -f decklink -i "%s_decklink_card%" -c:v h264_qsv -b:v 4M -r 25 -s 1920x1080 -vf "yadif" -preset veryslow -segment_time %s_segment_duration_ch3% -f segment -strftime 1 -t %s_duration_ch3% "%s_directory_ch3%\chunk_%Y-%m-%d_%H-%M-%S.mp4" "
In the above command, all I needed to do was force a resolution that the encoder supports, plus deinterlace with "yadif", because h264_qsv failed on interlaced sources.
If you start working on this, I recommend first trying to encode your source files of interest on the commandline using the ffmpeg.exe that comes with FFAStrans, BEFORE you start using it in a custom ffmpeg or commandline processor; see the example below.
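A simple test encode like this is usually enough (the path to the FFAStrans ffmpeg.exe and the file names are just assumptions, adjust them to your installation):
Code: Select all
"C:\FFAStrans\Processors\ffmpeg\ffmpeg.exe" -y -i "C:\media\source.mxf" -c:v h264_qsv -b:v 4M -an "C:\temp\qsv_test.mp4"
If that already fails with colorspace or pixel format errors, you know you have to add filters like "format=nv12" or "yadif" before you put the command into a workflow.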