From: Harry Wentland <harry.wentland@amd.com>
To: Pekka Paalanen <ppaalanen@gmail.com>,
	Vitaly Prosyak <vitaly.prosyak@amd.com>
Cc: Deepak.Sharma@amd.com, Krunoslav.Kovac@amd.com,
	mcasas@google.com, Shashank.Sharma@amd.com,
	dri-devel@lists.freedesktop.org, Shirish.S@amd.com,
	Sebastian Wick <sebastian@sebastianwick.net>,
	hersenxs.wu@amd.com, amd-gfx@lists.freedesktop.org,
	laurentiu.palcu@oss.nxp.com, Bhawanpreet.Lakha@amd.com,
	Nicholas.Kazlauskas@amd.com
Subject: Re: [RFC PATCH 0/3] A drm_plane API to support HDR planes
Date: Tue, 18 May 2021 10:19:25 -0400	[thread overview]
Message-ID: <9d4ec9c3-6716-7c80-97d5-dd3c5c50ab51@amd.com> (raw)
In-Reply-To: <20210518105615.212b84e4@eldfell>



On 2021-05-18 3:56 a.m., Pekka Paalanen wrote:
> On Mon, 17 May 2021 15:39:03 -0400
> Vitaly Prosyak <vitaly.prosyak@amd.com> wrote:
> 
>> On 2021-05-17 12:48 p.m., Sebastian Wick wrote:
>>> On 2021-05-17 10:57, Pekka Paalanen wrote:  
>>>> On Fri, 14 May 2021 17:05:11 -0400
>>>> Harry Wentland <harry.wentland@amd.com> wrote:
>>>>  
>>>>> On 2021-04-27 10:50 a.m., Pekka Paalanen wrote:  
>>>>>> On Mon, 26 Apr 2021 13:38:49 -0400
>>>>>> Harry Wentland <harry.wentland@amd.com> wrote:  
>>>>
>>>> ...
>>>>  
>>>>>>> ## Mastering Luminances
>>>>>>>
>>>>>>> Now we are able to use the PQ 2084 EOTF to define the luminance of
>>>>>>> pixels in absolute terms. Unfortunately we're again presented with
>>>>>>> physical limitations of the display technologies on the market today.
>>>>>>> Here are a few examples of luminance ranges of displays.
>>>>>>>
>>>>>>> | Display                  | Luminance range in nits |
>>>>>>> | ------------------------ | ----------------------- |
>>>>>>> | Typical PC display       | 0.3 - 200 |
>>>>>>> | Excellent LCD HDTV       | 0.3 - 400 |
>>>>>>> | HDR LCD w/ local dimming | 0.05 - 1,500 |
>>>>>>>
>>>>>>> Since no display can currently show the full 0.0005 to 10,000 nits
>>>>>>> luminance range, the display will need to tonemap the HDR content,
>>>>>>> i.e. fit the content within the display's capabilities. To assist with
>>>>>>> tonemapping, HDR content is usually accompanied by metadata that
>>>>>>> describes (among other things) the minimum and maximum mastering
>>>>>>> luminance, i.e. the maximum and minimum luminance of the display that
>>>>>>> was used to master the HDR content.
>>>>>>>
>>>>>>> The HDR metadata is currently defined on the drm_connector via the
>>>>>>> hdr_output_metadata blob property.
>>>>>>>
>>>>>>> It might be useful to define per-plane hdr metadata, as different
>>>>>>> planes might have been mastered differently.  
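
For reference, the blob behind that property is the UAPI struct from
include/uapi/drm/drm_mode.h, which userspace attaches to the connector's
HDR_OUTPUT_METADATA property. A per-plane variant would presumably reuse
the same layout, though that reuse is only an assumption at this point:

struct hdr_metadata_infoframe {
	__u8 eotf;                  /* transfer function, e.g. PQ, per CTA-861-G */
	__u8 metadata_type;
	struct {
		__u16 x, y;         /* chromaticity in 0.00002 units */
	} display_primaries[3];
	struct {
		__u16 x, y;
	} white_point;
	__u16 max_display_mastering_luminance;  /* 1 cd/m2 units */
	__u16 min_display_mastering_luminance;  /* 0.0001 cd/m2 units */
	__u16 max_cll;
	__u16 max_fall;
};

struct hdr_output_metadata {
	__u32 metadata_type;
	union {
		struct hdr_metadata_infoframe hdmi_metadata_type1;
	};
};
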
>>>>>>
>>>>>> I don't think this would directly help with the dynamic range blending
>>>>>> problem. You still need to establish the mapping between the optical
>>>>>> values from two different EOTFs and dynamic ranges. Or can you know
>>>>>> which optical values match the mastering display maximum and minimum
>>>>>> luminances for not-PQ?
>>>>>>  
>>>>>
>>>>> My understanding of this is probably best illustrated by this example:
>>>>>
>>>>> Assume HDR was mastered on a display with a maximum white level of
>>>>> 500 nits and played back on a display that supports a max white level
>>>>> of 400 nits. If you know the mastering white level of 500 you know
>>>>> that this is the maximum value you need to compress down to 400 nits,
>>>>> allowing you to use the full extent of the 400 nits panel.
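
If it helps to make that concrete, here is a rough userspace-style sketch,
not kernel code, using a deliberately naive linear scale rather than a real
tone-mapping curve; pq_eotf(), pq_inv_eotf() and map_to_panel() are
illustrative names, not existing helpers:

#include <math.h>

#define PQ_M1 0.1593017578125   /* 2610 / 16384 */
#define PQ_M2 78.84375          /* 2523 / 4096 * 128 */
#define PQ_C1 0.8359375         /* 3424 / 4096 */
#define PQ_C2 18.8515625        /* 2413 / 4096 * 32 */
#define PQ_C3 18.6875           /* 2392 / 4096 * 32 */

/* PQ (ST 2084) EOTF: non-linear code value [0..1] -> luminance in nits. */
static double pq_eotf(double v)
{
	double p = pow(v, 1.0 / PQ_M2);
	double num = fmax(p - PQ_C1, 0.0);
	double den = PQ_C2 - PQ_C3 * p;

	return 10000.0 * pow(num / den, 1.0 / PQ_M1);
}

/* Inverse EOTF: luminance in nits -> PQ code value [0..1]. */
static double pq_inv_eotf(double nits)
{
	double y = pow(nits / 10000.0, PQ_M1);

	return pow((PQ_C1 + PQ_C2 * y) / (1.0 + PQ_C3 * y), PQ_M2);
}

/*
 * Naive compression: scale so the mastering white lands on the panel
 * white, clamp, and re-encode to PQ for the wire.
 */
static double map_to_panel(double v, double mastering_white, double panel_white)
{
	double nits = pq_eotf(v) * panel_white / mastering_white;

	return pq_inv_eotf(fmin(nits, panel_white));
}

With mastering_white = 500 and panel_white = 400 the mastering white lands
exactly on the panel white and everything below scales with it, which is why
knowing the mastering level lets you use the panel's full range.
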
>>>>
>>>> Right, but in the kernel, where do you get these nits values from?
>>>>
>>>> hdr_output_metadata blob is infoframe data to the monitor. I think this
>>>> should be independent of the metadata used for color transformations in
>>>> the display pipeline before the monitor.
>>>>
>>>> EDID may tell us the monitor HDR metadata, but again what is used in
>>>> the color transformations should be independent, because EDIDs lie,
>>>> lighting environments change, and users have different preferences.
>>>>
>>>> What about black levels?
>>>>
>>>> Do you want to do black level adjustment?
>>>>
>>>> How exactly should the compression work?
>>>>
>>>> Where do you map the mid-tones?
>>>>
>>>> What if the end user wants something different?  
>>>
>>> I suspect that this is not about tone mapping at all. The use cases
>>> listed always have the display in PQ mode and just assume that no
>>> content exceeds the PQ limitations. Then you can simply bring all
>>> content to the color space with a matrix multiplication and then map the
>>> linear light content somewhere into the PQ range. Tone mapping is
>>> performed in the display only.
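
To spell out the path Sebastian describes, here is a sketch only; it reuses
pq_inv_eotf() from the sketch further up, and the matrix is the standard
BT.709-to-BT.2020 primaries conversion, picked purely for illustration:

/*
 * Convert linear-light BT.709 RGB (in nits) to BT.2020 primaries with a
 * 3x3 matrix, then encode with the PQ inverse EOTF so the display in
 * HDR10 mode does any remaining tone mapping.
 */
static const double bt709_to_bt2020[3][3] = {
	{ 0.6274, 0.3293, 0.0433 },
	{ 0.0691, 0.9195, 0.0114 },
	{ 0.0164, 0.0880, 0.8956 },
};

static void encode_for_hdr10(const double rgb_nits[3], double pq_out[3])
{
	for (int i = 0; i < 3; i++) {
		double nits = bt709_to_bt2020[i][0] * rgb_nits[0] +
			      bt709_to_bt2020[i][1] * rgb_nits[1] +
			      bt709_to_bt2020[i][2] * rgb_nits[2];

		pq_out[i] = pq_inv_eotf(nits); /* from the earlier sketch */
	}
}

Note that a matrix like this only adapts primaries; it doesn't by itself
answer how the different dynamic ranges get reconciled.
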
> 
> The use cases do use the word "desktop" though. Harry, could you expand
> on this, are you seeking a design that is good for generic desktop
> compositors too, or one that is more tailored to "embedded" video
> player systems taking the most advantage of (potentially
> fixed-function) hardware?
> 

The goal is to enable this on a generic desktop, such as generic Wayland
implementations or ChromeOS. We're not looking for a custom solution for
some embedded systems, though the solution we end up with should obviously
not prevent an implementation on embedded video players.

> What matrix would one choose? Which render intent would it
> correspond to?
> 
> If you need to adapt different dynamic ranges into the blending dynamic
> range, would a simple linear transformation really be enough?
> 
>>> From a generic wayland compositor point of view this is uninteresting.
>>>  
>> It is a compositor's decision whether or not to provide the metadata
>> property to the kernel. The metadata can be available from one or
>> multiple clients, or most likely not available at all.
>>
>> Compositors may put a display in HDR10 mode (where the PQ 2084 inverse
>> EOTF and TM happen in the display), or in NATIVE mode where they attach
>> no metadata to the connector and do TM in the compositor.
>>
>> It is all about user preference or compositor design, or a combination 
>> of both options.
> 
> Indeed. The thing here is that you cannot just add KMS UAPI, you also
> need to have the FOSS userspace to go with it. So you need to have your
> audience defined, userspace patches written and reviewed and agreed
> to be a good idea. I'm afraid this particular UAPI design would be
> difficult to justify with Weston. Maybe Kodi is a better audience?
> 

I'm not sure designing a UAPI for Kodi that's not going to work for
Wayland compositors is the right thing. From a KMS driver maintainer's
standpoint I don't want a separate API for each userspace.

The idea here is to do design and discussion in public so we can eventually
arrive at a UAPI for HDR and CM that works for Wayland and by extension
for every other userspace.

> But then again, one also needs to consider whether it is enough for a
> new UAPI to satisfy only part of the possible audience and then need
> yet another new UAPI to satisfy the rest. Adding new UAPI requires
> defining the interactions with all existing UAPI as well.
> 
> Maybe we do need several different UAPIs for the "same" things if the
> hardware designs are too different to cater with just one.
> 

I feel we should have a section in the RFC that sketches how different HW
deals with this currently. It would be good if we can arrive at a UAPI that
captures at least the common functionality of various HW.

Harry

> 
> Thanks,
> pq
> 

