My son and I have been having a bit of a discussion re: HDR (primarily 10-bit/12-bit) and whether it can be applied at 1080p resolution. He tends to believe that when it comes to gaming consoles (Xbox One X and PS4 Pro), you can get a game to run in 1080p with 10-bit/12-bit HDR being effective (i.e., a noticeable difference from standard 1080p).

There are a few misconceptions that I have seen presented in this thread.

First, dynamic range and bit depth are not related. For example, theoretically, you could have a 2-bit image (four tone values) that was "HDR" if the correct gamma curve was applied to it. You would have only four shades to work with, but the image would technically be "high dynamic range". The reason HDR standards are 10-bit or higher is that the additional values are needed to prevent the banding which would occur if 8-bit depth were used with an HDR gamma curve. Currently, all HDR content is 10-bit, with black mapped to code value 64, middle gray to 512, and peak luminance to 940/960.

I have yet to figure out what exactly HDR is.
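The banding argument above can be sketched with a few lines of Python. This is a minimal illustration, assuming the SMPTE ST 2084 (PQ) transfer function as the "HDR gamma curve" and the standard limited-range code values (the 10-bit black/peak of 64/940 mentioned above, and their 8-bit equivalents 16/235); the function names are mine, not from the post.

```python
# Sketch: why 8-bit runs short of steps under an HDR transfer curve.
# Assumes the SMPTE ST 2084 (PQ) EOTF and limited-range code values
# (10-bit: 64..940, as in the post; 8-bit: 16..235).

def pq_eotf(v):
    """Map a normalized signal value in [0, 1] to luminance in cd/m^2 (PQ)."""
    m1 = 2610 / 16384
    m2 = 2523 / 4096 * 128
    c1 = 3424 / 4096
    c2 = 2413 / 4096 * 32
    c3 = 2392 / 4096 * 32
    p = v ** (1 / m2)
    return 10000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

def code_range(bits):
    """Limited-range black/peak codes: 16..235 scaled by 2^(bits - 8)."""
    scale = 2 ** (bits - 8)
    return 16 * scale, 235 * scale

for bits in (8, 10):
    black, peak = code_range(bits)
    steps = peak - black          # distinct tonal steps black..peak
    # Luminance jump between the top two adjacent code values:
    jump = pq_eotf(1.0) - pq_eotf((steps - 1) / steps)
    print(f"{bits}-bit: {steps} steps ({black}..{peak}), "
          f"top step ~ {jump:.1f} cd/m^2")
```

The counts alone make the point: 8-bit gives 219 steps between black and peak, while 10-bit gives 876, so each 8-bit step has to span four times the signal range. Spread over a PQ curve that reaches 10,000 cd/m^2, those coarser 8-bit steps become visible as banding.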