Different exported output between GPU and CPU encoding

kaistcha wrote on 5/1/2019, 1:17 AM

I exported the outputs below from the same source video, once through the GPU (1080 Ti) and once through the CPU (4790K), in Video Pro X10 (16.0.2.332) on Windows 10.

[Encoded by GPU]

  • This is only a 10-bit video, not an HDR render, so my HDR monitor and TV did not recognize it as an HDR video. Does X10 have no way to produce BT.2020 output via GPU acceleration?

[Encoded by CPU]

  • This result is a real HDR video in the BT.2020 color space, but no color grading was done.

The source video is from my Mavic 2 Pro with the settings below:

  • Video Size: 4K 3840x2160 HQ @ 30fps
  • Video Format: MP4
  • White Balance: Cloud
  • Style: None
  • Color: DLOG-M
  • Encoding format: H.265

I chose this setting before exporting from Video Pro X.

Please help me understand why the output differs between GPU and CPU rendering.
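For anyone who wants to compare the two files objectively, one way is to read the colour tags each export actually carries. This is only a minimal sketch, assuming ffprobe (part of the ffmpeg project) is installed and on the PATH; the two file names are placeholders for the GPU and CPU renders.

```python
# Compare the colour metadata of the two exports with ffprobe.
# Assumes ffprobe (part of ffmpeg) is installed and on the PATH;
# the two file names are placeholders for the GPU and CPU renders.
import json
import subprocess

def colour_info(path):
    """Return pixel format and colour tags of the first video stream."""
    out = subprocess.run(
        ["ffprobe", "-v", "quiet", "-select_streams", "v:0",
         "-show_entries",
         "stream=pix_fmt,color_primaries,color_transfer,color_space",
         "-of", "json", path],
        capture_output=True, text=True, check=True,
    ).stdout
    return json.loads(out)["streams"][0]

for name in ["export_gpu.mp4", "export_cpu.mp4"]:
    # A real HDR10 export should show bt2020 primaries, smpte2084 (PQ)
    # transfer and a 10-bit pixel format such as yuv420p10le.
    print(name, colour_info(name))
```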

Comments

Scenestealer wrote on 5/2/2019, 6:16 PM

Hi @kaistcha

I have done a bit of googling on this, and there are references to the fact that Nvidia's NVENC hardware encoder cannot write the required HDR information into the metadata of the encode.
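For what it's worth, a CPU-side encode can write that information. Below is a rough sketch of the kind of libx265 command involved, not a description of what VPX does internally; it assumes an ffmpeg build with 10-bit libx265, uses placeholder file names, and the mastering-display / MaxCLL figures are the commonly quoted P3-D65 example values rather than numbers measured from the footage.

```python
# Rough sketch of a CPU-side HEVC (libx265) encode that writes BT.2020 / PQ
# colour tags and HDR10 mastering metadata, for comparison with the NVENC
# output. Not what VPX does internally. Assumes ffmpeg built with 10-bit
# libx265; "input.mov" is a placeholder master, and the master-display /
# max-cll numbers are textbook example values, not measurements.
import subprocess

x265_params = ":".join([
    "repeat-headers=1",          # repeat VPS/SPS/PPS so players see the info
    "colorprim=bt2020",
    "transfer=smpte2084",        # PQ transfer function
    "colormatrix=bt2020nc",
    # Mastering display colour volume and content light level (HDR10 SEI):
    "master-display=G(13250,34500)B(7500,3000)R(34000,16000)"
    "WP(15635,16450)L(10000000,1)",
    "max-cll=1000,400",
])

subprocess.run(
    ["ffmpeg", "-i", "input.mov",
     "-c:v", "libx265", "-pix_fmt", "yuv420p10le",
     "-color_primaries", "bt2020", "-color_trc", "smpte2084",
     "-colorspace", "bt2020nc",
     "-x265-params", x265_params,
     "-c:a", "copy",
     "hdr10_cpu_test.mp4"],
    check=True,
)
```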

Have you tried using HW acceleration via the Intel GPU if your processor has one?

Peter

System Specs: Intel 6th Gen i7 6700K 4Ghz O.C.4.6GHz, Asus Z170 Pro Gaming MoBo, 16GB DDR4 2133Mhz RAM, Samsung 850 EVO 512GB SSD system disc WD Black 4TB HDD Video Storage, Nvidia GTX1060 OC 6GB, Win10 Pro 2004, MEP2016, 2022 (V21.0.1.92) Premium and prior, VPX7, VPX12 (V18.0.1.85). Microsoft Surface Pro3 i5 4300U 1.9GHz Max 2.6Ghz, HDGraphics 4400, 4GB Ram 128GB SSD + 64GB Strontium Micro SD card, Win 10Pro 2004, MEP2015 Premium.

kaistcha wrote on 5/2/2019, 9:03 PM

Hi Scenestealer,

Oh my! Thank you for letting me know. I've never used the Intel GPU for this; although it is an old 'HD Graphics 4600', I will try it today.

Also, do you know why color clipping and oversaturated colors appear after applying HDR, as in the video below?

Of course, my color grading skills are not great and I didn't use any LUT for BT.2020 (I can't find one anywhere), but I don't understand why this happened, because the colors looked fine in the tool before the HEVC encode for HDR.
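For reference, if a suitable conversion LUT does turn up, it can also be tried outside the editor: ffmpeg's lut3d filter applies a .cube file to the footage. A minimal sketch, assuming ffmpeg is installed; both file names, including the LUT, are placeholders.

```python
# Sketch: apply a .cube conversion LUT to the D-Log-M source with ffmpeg's
# lut3d filter so the converted result can be previewed outside the editor.
# Both file names are placeholders; "dlogm_conversion.cube" stands for
# whichever conversion LUT ends up being used.
import subprocess

subprocess.run(
    ["ffmpeg", "-i", "DJI_source.mp4",
     "-vf", "lut3d=file=dlogm_conversion.cube",
     "-c:v", "libx265", "-pix_fmt", "yuv420p10le",
     "lut_preview.mp4"],
    check=True,
)
```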

Thank you again.

Former user wrote on 5/9/2019, 8:02 PM


QSV on 4th Gen Intel CPUs (Haswell) can't do 10-Bit, or HEVC, at all. Only useful for 8-Bit H.264 Renders (and some other formats from that era, maybe...).

He's stuck with CPU or Nvidia. Probably have to just render using CPU until Nvidia addresses the issue (if ever... Pascal isn't "new hotness" anymore).
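For what it's worth, it is easy to check what a given hardware encoder on a particular machine will accept: ffmpeg lists each encoder's supported pixel formats, and a 10-bit-capable encoder shows a 10-bit format such as p010le. A small sketch, assuming an ffmpeg build that includes the relevant encoders:

```python
# Ask ffmpeg which pixel formats each HEVC encoder accepts; a 10-bit capable
# encoder lists a 10-bit format such as p010le or yuv420p10le.
# Assumes an ffmpeg build that includes these encoders; ones that are
# missing simply report nothing.
import subprocess

for encoder in ["hevc_nvenc", "hevc_qsv", "libx265"]:
    out = subprocess.run(
        ["ffmpeg", "-hide_banner", "-h", f"encoder={encoder}"],
        capture_output=True, text=True,
    ).stdout
    formats = ""
    for line in out.splitlines():
        if "Supported pixel formats" in line:
            formats = line.split(":", 1)[1].strip()
    ten_bit = any(f in formats for f in ("p010", "10le", "10be"))
    print(f"{encoder}: formats = {formats or 'n/a'} | 10-bit: {ten_bit}")
```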

---

@ OP:

Make sure you have a calibrated display for grading footage; otherwise you're never going to get good at it, because you'll never see accurate results on the display you use to monitor the footage. Getting good at color correction and grading requires correcting and grading on a display that shows accurate colors; otherwise you won't know what accurate colors look like.

You can make the image look true to life on a display that isn't calibrated properly, but it won't actually be. It's like color correcting while wearing yellow-tinted glasses: you can make something look white, but when you take the glasses off (that is, view it on a properly calibrated monitor), it won't be.

kaistcha wrote on 5/9/2019, 9:27 PM

Hi @Former user,

Thank you for the kind and detailed information.

Anyway, I'm using an ASUS PA32UC-K 10-bit monitor. I can enjoy real 10-bit HDR content on it, but 10-bit color grading and editing seem to be a different matter from just watching 10-bit content.

So do I need to replace my current CPU with a 6th-gen or newer one to do 10-bit rendering and grading in most video editing tools that support 10-bit color grading? I heard Adobe Premiere CC 2019 needs a 6th-gen or newer i7 CPU.

Or how about encoding HEVC purely on the CPU with the 4790K?

By the way, the monitor was already calibrated by the manufacturer, and I've heard there is no need for further calibration on my side.

Thank you.

Scenestealer wrote on 5/10/2019, 1:44 AM

@Former user

QSV on 4th Gen Intel CPUs (Haswell) can't do 10-Bit, or HEVC, at all. Only useful for 8-Bit H.264 Renders (and some other formats from that era, maybe...).

Thanks - I should have noticed that!

@kaistcha

Coincidentally, I asked Magix support about GPU encoding and BT.2020 just a couple of days ago and have not heard back.

I am a little confused about your tests: you mention that the camera was set to DLOG-M, but did you apply the corresponding LUT in VPX before you did any colour grading?

Also, what were your instruments (Waveform and Vectorscope) telling you prior to export? Did the waveform show clipped highlights or excessive colour saturation?

Do you have "Output to Monitors with High Bit Depth" ticked in the Program Settings > Display tab?
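If it helps to double-check outside the editor, ffmpeg's signalstats filter can report per-frame luma levels of the exported file. A rough sketch, assuming ffmpeg is installed and using a placeholder file name; maxima sitting at the very top of the code range across many frames usually point to clipped highlights.

```python
# Dump per-frame luma statistics with ffmpeg's signalstats filter and look
# at the highest YMAX value seen. Maxima pinned at the top of the code
# range across many frames usually indicate clipped highlights.
# "export.mp4" is a placeholder for the file being checked.
import re
import subprocess

result = subprocess.run(
    ["ffmpeg", "-hide_banner", "-i", "export.mp4",
     "-vf", "signalstats,metadata=mode=print",
     "-an", "-f", "null", "-"],
    capture_output=True, text=True,
)

# The metadata filter logs lines such as "lavfi.signalstats.YMAX=235".
ymax = [int(v) for v in re.findall(r"signalstats\.YMAX=(\d+)", result.stderr)]
if ymax:
    peak = max(ymax)
    print(f"frames analysed: {len(ymax)}")
    print(f"highest YMAX:    {peak}")
    print(f"frames at peak:  {ymax.count(peak)}")
```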

 

System Specs: Intel 6th Gen i7 6700K 4Ghz O.C.4.6GHz, Asus Z170 Pro Gaming MoBo, 16GB DDR4 2133Mhz RAM, Samsung 850 EVO 512GB SSD system disc WD Black 4TB HDD Video Storage, Nvidia GTX1060 OC 6GB, Win10 Pro 2004, MEP2016, 2022 (V21.0.1.92) Premium and prior, VPX7, VPX12 (V18.0.1.85). Microsoft Surface Pro3 i5 4300U 1.9GHz Max 2.6Ghz, HDGraphics 4400, 4GB Ram 128GB SSD + 64GB Strontium Micro SD card, Win 10Pro 2004, MEP2015 Premium.