HDR Report

Questions and answers on how to get the most out of FFAStrans
andrezagato
Posts: 43
Joined: Tue Jun 09, 2020 4:07 pm

HDR Report

Post by andrezagato »

Hey guys!
This is probably a long shot, but I was wondering if it would be possible to create a report for an HDR file? Amazon requires this report with any HDR file.
So far the company I've been working for uses Colorfront Transkoder to create this report; the report is created from a DPX sequence. But to be honest, I simply hate using this software.
I've attached the pdf with the report.

I know it is a long shot, but might as well ask, right?

Thanks!
Attachments
MNES_101_pt-PROD_UHD_HDR_178_2398_20230424_HDR_report_R01.pdf
(266.65 KiB)
FranceBB
Posts: 258
Joined: Sat Jun 25, 2016 3:43 pm

Re: HDR Report

Post by FranceBB »

### Explanation + Rant ###

Well, I'm pretty sure that the reason why they require such an analysis is that the MaxCLL info is MANDATORY in PQ files as per the ITU specs. The reason is pretty historical.
I made a long post on Doom9 several years ago about this, but to summarize it as much as possible: the reason they want MaxCLL is that back when the HDR10 spec was finalized, that info was meant to be passed to the TV, which would read it from the header and use it to correctly display the picture. You see, a movie in PQ can have, let's say, 1200 nits, but there's no guarantee that the TV can reach as many nits as the content. OLED TVs in particular can't physically go above 890 nits (that's the price you pay for those perfect blacks), so the TV would take the info, map the 1200-nit peak brightness of the content to its own 890-nit peak, and scale everything else down accordingly.

This was terribly inefficient, especially with synthetic movies (i.e. movies with special effects created in software like Maya), because they were graded as high as 5000 nits even though no camera has that many stops (think about Blade Runner). The result of 5000 nits being brought down to 890 nits would be pretty poor, especially in normal scenes, which would look too dark because they were shot at around 600-700 nits (and would be brought down to something like 80-90 nits). This led to HDR10+, in which the nits info in the header changes at every scene so that TVs can adjust and compensate better (the static metadata is still there to stay backwards compatible with HDR10).

So... this is the reason why they want you to include that.
Without that info, the TV would have to guess and it would assume the picture to be a static 1000 nits all the time.
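
Just to put rough numbers on that scaling, here's a naive linear-light sketch in Python (real TVs tone map in PQ code values with a roll-off curve, so the actual figures differ, but the proportions are the point):

# Naive static tone mapping: scale every pixel by panel_peak / MaxCLL.
# Illustrative simplification only, not what any real TV actually does.
def naive_scale(scene_nits, max_cll, panel_peak):
    return scene_nits * panel_peak / max_cll

print(naive_scale(1200, 1200, 890))  # 890.0 -> the content peak just fits the panel
print(naive_scale(650, 5000, 890))   # ~116   -> a normal scene in a 5000-nit grade goes dim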

Now, this was true from 2013 to 2015-2016; however, TV manufacturers quickly realized that there was an overwhelming number of files which lacked this info, 'cause, you know, you can be the ITU and impose something, but people just won't care.

This led to new TVs (2017 onwards) being produced with a dynamic pre-processing step which analyses the MaxCLL scene by scene and displays the output according to the data from that analysis.
Some manufacturers (I'm looking at you, Samsung) went even further and decided to totally ignore the metadata in the header and use their own pre-processing analyser instead, EVEN when the metadata is available. The reason is that - especially in the early days of HDR (but even nowadays to some extent) - people really screwed up, and guess what's worse than not including the info? Well, you guessed it: writing it with the wrong value! Since lots of folks didn't really know what they were doing (and I'm not talking about pirates sitting in their chair at home eating chocolate while torrenting, but """professionals""" working for real companies), many of them copy-pasted the following string, which is why you'll find it in an overwhelming number of PQ encodes:

--colorprim bt2020 --transfer smpte2084 --colormatrix bt2020nc --master-display "G(13250,34500)B(7500,3000)R(34000,16000)WP(15635,16450)L(10000000,0.0050)" --max-cll 1000,400


As I said: BT.2020, HDR PQ, a Display P3 mastering display, 1000 nits max, 400 nits average, just like that, as if it were written in the Bible.



Anyway, getting the MaxCLL info right is still important for serious studios (and Amazon is one of them, 'cause I know Ben Waggoner and they are very keen to get things right), so I'm not surprised that they asked you for it. If you were to send a PQ file to Sky I would ask you for it too, and if you sent the parameters above I would analyze the file myself and not trust you xD.
(As a side note, Amazon also uses x26x and Avisynth and they've been contributing for years, just FYI).


### Actual Answer ###

So, this leads us to the real answer, which is: absolutely, you can calculate MaxCLL yourself with MaxCLLFind() in Avisynth, and you can automate it to write a report for each file using a workflow in FFAStrans: http://avisynth.nl/index.php/MaxCLLFind

If you feel more Rust-inclined, you can use PQStats, which is now part of the wider HLG-Tools suite developed by my dear friend William Swartzendruber (you don't have to convert to HLG, you just have to calculate the nits): https://github.com/wswartzendruber/hlg-tools and https://forum.doom9.org/showthread.php?t=182499
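
If you want to sanity-check either of those tools (or just see what they actually measure), here's a minimal hand-rolled sketch in Python. It assumes 16-bit, full-range, PQ-encoded RGB TIFFs and that numpy and imageio are installed; it is NOT the actual MaxCLLFind/PQStats implementation, just the same idea:

# Sketch: compute MaxCLL / MaxFALL from a 16-bit PQ-encoded RGB TIFF sequence.
import glob
import imageio.v3 as iio
import numpy as np

# SMPTE ST 2084 (PQ) EOTF constants
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(code):
    # Decode normalized PQ code values (0..1) to absolute luminance in nits.
    p = np.power(code, 1 / M2)
    return 10000 * np.power(np.maximum(p - C1, 0) / (C2 - C3 * p), 1 / M1)

max_cll = 0.0   # brightest single pixel across the whole sequence
max_fall = 0.0  # brightest frame-average light level

for path in sorted(glob.glob("render/*.tif")):
    rgb = iio.imread(path).astype(np.float64) / 65535.0  # 16-bit full range assumed
    nits = pq_to_nits(rgb).max(axis=-1)                  # per-pixel max(R, G, B)
    max_cll = max(max_cll, nits.max())
    max_fall = max(max_fall, nits.mean())

print(f"MaxCLL:  {max_cll:.0f} nits")
print(f"MaxFALL: {max_fall:.0f} nits")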

After getting the stats, all that's left is to create a PDF with them, but I'm sure you can create a PDF yourself :P
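
If you want to script that last step as well, here's a minimal sketch using reportlab (the library choice, field names and layout are just placeholders; the real Amazon template obviously has a lot more fields):

# Sketch: dump the measured stats into a one-page PDF with reportlab.
# Assumes: pip install reportlab; max_cll / max_fall come from the sketch above.
from reportlab.lib.pagesizes import A4
from reportlab.pdfgen import canvas

max_cll, max_fall = 1187, 243  # placeholder values; use your measured stats

c = canvas.Canvas("HDR_report.pdf", pagesize=A4)
c.setFont("Helvetica-Bold", 14)
c.drawString(50, 800, "HDR QC Report")
c.setFont("Helvetica", 11)
c.drawString(50, 770, f"MaxCLL:  {max_cll} nits")
c.drawString(50, 755, f"MaxFALL: {max_fall} nits")
c.save()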


IMPORTANT NOTE: The calculation needs to be run on a .tiff sequence or, ideally, on a lossless codec. Codecs like MJPEG2000, DNxHQX, ProRes etc., which are generally used as mezzanine intermediates, aren't lossless, they're lossy, which means they introduce compression overshoot. This will screw up the measurements and it's gonna be your responsibility to take out the outliers (see the sketch below). And remember: the more lossy a codec is, the more outliers you're gonna have, so be very careful!
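
On that outlier point, one common trick (hedged, not a standard) is to take a high percentile instead of the absolute per-frame maximum, so a handful of overshooting pixels can't poison MaxCLL; the 99.99 below is an arbitrary example threshold:

# Illustrative only: a UHD frame full of ~600-nit pixels plus a few codec-overshoot spikes.
# A hard max reports the spike; a high percentile ignores it.
import numpy as np

nits = np.full(3840 * 2160, 600.0)
nits[:5] = 1800.0                      # a handful of overshooting pixels
print(nits.max())                      # 1800.0 -> MaxCLL poisoned by outliers
print(np.percentile(nits, 99.99))      # 600.0  -> robust frame peak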