Is there any detriment in encoding everything at level 5.1?

Apollo89x wrote on 9/8/2025, 12:56 PM

I have recently discovered that for higher bit-rates/FPS you need to encode at level 4.2 minimum, or 5.0+ for UHD/4K.

Whilst I only use HD, if I set it as the default to encode everything at level 5.1...

Would there be any "negative impact" from doing this?

Comments

johnebaker wrote on 9/8/2025, 1:18 PM

@Apollo89x

Hi

. . . . Would there be any "negative impact" from doing this? . . . .

The higher levels enable higher-resolution and higher-frame-rate video formats; however, they can take more CPU/GPU processing power to decode and require higher-spec playback devices. A hardware decoder that only supports up to, say, level 4.2 may refuse to play a file flagged as level 5.1, even if the actual content is only HD.
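As a rough illustration of what a level actually constrains, the sketch below picks the lowest level that fits a given resolution and frame rate. The limits come from the H.264 spec's level table (macroblocks per frame and per second); treat the numbers as indicative, since the real spec also caps bitrate, and those caps vary by profile.

```python
import math

# Approximate H.264 level limits (spec Table A-1, bitrate caps omitted):
# level -> (max macroblocks per second, max macroblocks per frame)
LEVELS = {
    "4.0": (245_760, 8_192),
    "4.1": (245_760, 8_192),
    "4.2": (522_240, 8_704),
    "5.0": (589_824, 22_080),
    "5.1": (983_040, 36_864),
    "5.2": (2_073_600, 36_864),
}

def min_level(width, height, fps):
    """Return the lowest listed level whose limits fit the format, or None."""
    mbs = math.ceil(width / 16) * math.ceil(height / 16)  # macroblocks per frame
    for level, (max_mbps, max_fs) in LEVELS.items():
        if mbs <= max_fs and mbs * fps <= max_mbps:
            return level
    return None

print(min_level(1920, 1080, 60))  # -> 4.2
print(min_level(3840, 2160, 30))  # -> 5.1
```

This is why plain 1080p never needs anything above 4.2: flagging it 5.1 doesn't make the stream any better, it only raises the bar for the player.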

As I have previously mentioned, what happens to your upload depends on the website you are uploading to and the viewer's Internet connection speed.

They may create alternate versions of your upload; for example, if you were to upload a 4K UHD (3840x2160) video to YouTube, it will re-encode it into a variety of resolutions, down to as low as 144p (256x144 pixels).

John EB

 

VPX 16, Movie Studio 2025, and earlier versions 2015 and 2016, Music Maker Premium 2024.

PC - running Windows 11 23H2 Professional on Intel i7-8700K 3.2 GHz, 16GB RAM, RTX 2060 6GB 192-bit GDDR6, 1 x 1Tb Sabrent NVME SSD (OS and programs), 2 x 4TB (Data) internal HDD + 1TB internal SSD (Work disc), + 6 ext backup HDDs.

Laptop - Lenovo Legion 5i Phantom - running Windows 11 24H2 on Intel Core i7-10750H, 16GB DDR4-SDRAM, 512GB SSD, 43.9 cm screen Full HD 1920 x 1080, Intel UHD 630 iGPU and NVIDIA GeForce RTX 2060 (6GB GDDR6)

Sony FDR-AX53e Video camera, DJI Osmo Action 3 and Sony HDR-AS30V Sports cams.

ericlnz wrote on 9/8/2025, 7:49 PM

. . . . I have recently discovered that for higher bit-rates/FPS you need to encode at level 4.2 minimum, or 5.0+ for UHD/4K. . . . .

Doesn't your software automatically render at the required level? It's probably best not to play around with the levels manually.