FFmpeg Hardware (GPU) Encoding (on Linux, with VAAPI)
A not-great guide, because there is no guide elsewhere
I spent over an hour trying to figure out how to do this. Despite what the FFmpeg documentation may imply, it's actually not all that complicated to set up.
Yes, there are dependencies for this
Of course, you'll need a working copy of FFmpeg.
You'll also need a libva that's compatible with the card(s) and drivers you want to use. For example, as an Arch Linux user with an AMD card on the AMDGPU driver, I use Mesa's VAAPI driver (the libva-mesa-driver package).
Unfortunately I can't provide support for every possible
configuration of GPU, drivers, and distribution, so you'll
have to look into your distribution's documentation on your own.
If you're on Arch,
Hardware video acceleration on the wiki
is a good resource for what packages you'll want to install
for your card and drivers.
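To make that concrete, on Arch with an AMD card the install step might look something like this (the package names are my assumption for the AMD/Mesa case; check the wiki page for your own hardware):

```shell
# Install FFmpeg plus Mesa's VAAPI driver and the vainfo diagnostic tool.
# libva-mesa-driver covers AMD cards on the open-source AMDGPU driver;
# other vendors need different driver packages.
sudo pacman -S --needed ffmpeg libva-mesa-driver libva-utils
```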
If you just want a command to copy & paste, here it is:
ffmpeg -vaapi_device /dev/dri/renderD128 -i [path/to/input.mime] -c:v h264_vaapi -vf 'format=nv12,hwupload' [path/to/output.mime]
Despite being syntax-highlighted as fish, this command doesn't use any fish-specific features and should work fine in any shell.
But there's a lot that goes into this command, and you shouldn't just copy shell commands from the internet without understanding what they do, so here's...
Of course, we're calling FFmpeg with ffmpeg. If your FFmpeg installation lives at an unusual path, you should invoke that path instead.
Despite what some people and documentation may claim, you don't have to initialise your hardware accelerator with -init_hw_device, or pass extra flags like -hwaccel. If you're only going to be using one GPU, you just need the -vaapi_device flag followed by, of course, a VAAPI-compatible device.
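If you do have more than one GPU and want to pick between them, that longer-form flag becomes useful after all. A sketch of what that might look like (the 'amd' label and file names are my own placeholders):

```shell
# Create a named VAAPI device from the second render node, and tell
# filters such as hwupload to use it. For a single GPU this is
# equivalent to just passing -vaapi_device with that node.
ffmpeg -init_hw_device vaapi=amd:/dev/dri/renderD129 \
       -filter_hw_device amd \
       -i input.mp4 -c:v h264_vaapi -vf 'format=nv12,hwupload' output.mp4
```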
Now, how do you determine which device to point to? Just run ls /dev/dri/ and look for the files named renderD*, where '*' is a number, usually '128' for your primary GPU, increasing by 1 for every additional GPU. For example, on my system, renderD128 is my primary GPU (an AMD Radeon RX 5500 XT) and renderD129 is my secondary GPU (an AMD Radeon RX 550).
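To double-check that a render node actually supports VAAPI encoding before pointing FFmpeg at it, you can query it with vainfo (from libva-utils); the device path here is just the usual primary-GPU default:

```shell
# List the VAAPI profiles and entrypoints the device exposes.
# Lines ending in VAEntrypointEncSlice indicate encode support.
vainfo --display drm --device /dev/dri/renderD128
```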
-i [path/to/input.mime] specifies your input file.
You should replace
[path/to/input.mime] with the
actual path of the video/picture/whatever you want to process,
without the square brackets, of course.
-c:v h264_vaapi sets the video codec to H.264 via VAAPI. This could instead be hevc_vaapi, or whatever other codec your VAAPI-compatible device may be capable of; H.264 is by far the most common.
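You can ask your FFmpeg build which VAAPI encoders it was compiled with (whether your GPU actually supports them is a separate question, which vainfo answers):

```shell
# Print every encoder whose name mentions VAAPI, e.g. h264_vaapi,
# hevc_vaapi, and possibly vp9_vaapi or av1_vaapi on newer hardware.
ffmpeg -hide_banner -encoders | grep vaapi
```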
For whatever reason, it is strictly necessary that you include video format arguments with -vf. I honestly don't fully understand what 'format=nv12,hwupload' means or does (roughly: format=nv12 converts frames to the NV12 pixel format that VAAPI encoders expect, and hwupload sends them to GPU memory), but that's out of the scope of this guide anyway. I just know that it works, and that it should be fine for you, too. You'll probably know if you want to change the value of this argument.
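One common reason to change it is to do more processing on the GPU itself; for example, scaling can be chained after the upload (the 720p target here is just an illustration):

```shell
# After hwupload, frames live in GPU memory, so GPU-side filters
# like scale_vaapi can run before the encoder.
ffmpeg -vaapi_device /dev/dri/renderD128 -i input.mp4 \
       -vf 'format=nv12,hwupload,scale_vaapi=w=1280:h=720' \
       -c:v h264_vaapi output.mp4
```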
[path/to/output.mime] specifies the output file. Replace this with the actual path where you want the (re-)encoded output to be written.
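Finally, VAAPI encoders default to a fixed quality level rather than matching the input's bitrate, so output sizes can be surprising; you can control quality explicitly. The QP value here is just my own starting-point suggestion:

```shell
# -qp sets constant-QP quality for h264_vaapi: lower means better
# quality and bigger files. Values around 18-28 are typical.
ffmpeg -vaapi_device /dev/dri/renderD128 -i input.mp4 \
       -c:v h264_vaapi -vf 'format=nv12,hwupload' -qp 24 output.mp4
```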