* [Media Summit] Imaging Sensor functionality
@ 2022-09-06 16:14 Dave Stevenson
  2022-09-06 17:53 ` Laurent Pinchart
  2022-09-10 12:50 ` Hans Verkuil
  0 siblings, 2 replies; 11+ messages in thread
From: Dave Stevenson @ 2022-09-06 16:14 UTC (permalink / raw)
  To: Linux Media Mailing List
  Cc: Sakari Ailus, Kieran Bingham, Nicolas Dufresne,
	Benjamin Gaignard, Hidenori Kobayashi, Paul Kocialkowski,
	Michael Olbrich, Laurent Pinchart, Ricardo Ribalda,
	Maxime Ripard, Daniel Scally, Jernej Škrabec,
	Niklas Söderlund, Michael Tretter, Hans Verkuil,
	Philipp Zabel, Mauro Carvalho Chehab, Benjamin MUGNIER,
	Jacopo Mondi

Hi All.

I realise that I'm in a slightly different position from many mainline
Linux-media developers in that I see multiple use cases for the same
sensor, rather than a driver predominantly serving one
product/platform. I therefore want to look at generic solutions
and fully featured drivers. Users get to decide the use cases, not the
hardware designers.

The issues I've raised are things that I've encountered and would
benefit from a discussion to get views on the direction that is
perceived to be workable. I appreciate that some cannot be solved
immediately, but I want to avoid too much bikeshedding in the first
round of patches.
What's realistic, and what pitfalls/limitations immediately jump out at people?

Slides are at https://drive.google.com/file/d/1vjYJjTNRL1P3j6G4Nx2ZpjFtTBTNdeFG/view?usp=sharing

See you on Monday.

  Dave

^ permalink raw reply	[flat|nested] 11+ messages in thread

* Re: [Media Summit] Imaging Sensor functionality
  2022-09-06 16:14 [Media Summit] Imaging Sensor functionality Dave Stevenson
@ 2022-09-06 17:53 ` Laurent Pinchart
  2022-09-07  0:41   ` Laurent Pinchart
  2022-09-07 12:42   ` Dave Stevenson
  2022-09-10 12:50 ` Hans Verkuil
  1 sibling, 2 replies; 11+ messages in thread
From: Laurent Pinchart @ 2022-09-06 17:53 UTC (permalink / raw)
  To: Dave Stevenson
  Cc: Linux Media Mailing List, Sakari Ailus, Kieran Bingham,
	Nicolas Dufresne, Benjamin Gaignard, Hidenori Kobayashi,
	Paul Kocialkowski, Michael Olbrich, Ricardo Ribalda,
	Maxime Ripard, Daniel Scally, Jernej Škrabec,
	Niklas Söderlund, Michael Tretter, Hans Verkuil,
	Philipp Zabel, Mauro Carvalho Chehab, Benjamin MUGNIER,
	Jacopo Mondi

Hi Dave,

On Tue, Sep 06, 2022 at 05:14:30PM +0100, Dave Stevenson wrote:
> Hi All.
> 
> I realise that I'm in a slightly different position from many mainline
> Linux-media developers in that I see multiple use cases for the same
> sensor, rather than a driver predominantly being for one
> product/platform. I'm therefore wanting to look at generic solutions
> and fully featured drivers. Users get to decide the use cases, not the
> hardware designers.

Could you clarify here what you mean by users and hardware designers?
Users can be understood as

- Users of the camera sensor, i.e. OEMs designing a product
- Users of the hardware platform, i.e. software developers writing
  applications
- Users of the software, i.e. end-users

Hardware designers could then equally mean

- Sensor vendors
- SoC vendors
- Board vendors
- Product vendors

> The issues I've raised are things that I've encountered and would
> benefit from a discussion to get views as to the direction that is
> perceived to be workable. I appreciate that some can not be solved
> immediately, but want to avoid too much bikeshedding in the first
> round of patches.
> What's realistic, and what pitfalls/limitations immediately jump out at people.
> 
> Slides are at https://drive.google.com/file/d/1vjYJjTNRL1P3j6G4Nx2ZpjFtTBTNdeFG/view?usp=sharing

Thank you, I will review that ASAP.

> See you on Monday.

-- 
Regards,

Laurent Pinchart


* Re: [Media Summit] Imaging Sensor functionality
  2022-09-06 17:53 ` Laurent Pinchart
@ 2022-09-07  0:41   ` Laurent Pinchart
  2022-09-07 13:12     ` Dave Stevenson
  2022-09-07 12:42   ` Dave Stevenson
  1 sibling, 1 reply; 11+ messages in thread
From: Laurent Pinchart @ 2022-09-07  0:41 UTC (permalink / raw)
  To: Dave Stevenson
  Cc: Linux Media Mailing List, Sakari Ailus, Kieran Bingham,
	Nicolas Dufresne, Benjamin Gaignard, Hidenori Kobayashi,
	Paul Kocialkowski, Michael Olbrich, Ricardo Ribalda,
	Maxime Ripard, Daniel Scally, Jernej Škrabec,
	Niklas Söderlund, Michael Tretter, Hans Verkuil,
	Philipp Zabel, Mauro Carvalho Chehab, Benjamin MUGNIER,
	Jacopo Mondi

Hi Dave,

On Tue, Sep 06, 2022 at 08:53:41PM +0300, Laurent Pinchart wrote:
> On Tue, Sep 06, 2022 at 05:14:30PM +0100, Dave Stevenson wrote:
> > Hi All.
> > 
> > I realise that I'm in a slightly different position from many mainline
> > Linux-media developers in that I see multiple use cases for the same
> > sensor, rather than a driver predominantly being for one
> > product/platform. I'm therefore wanting to look at generic solutions
> > and fully featured drivers. Users get to decide the use cases, not the
> > hardware designers.
> 
> Could you clarify here what you mean by users and hardware designers ?
> Users can be understood as
> 
> - Users of the camera sensor, i.e. OEMs designing a product
> - Users of the hardware platform , i.e. software developers writing
>   applications
> - Users of the software, i.e. end-users
> 
> Hardware designers could then equally mean
> 
> - Sensor vendors
> - SoC vendors
> - Board vendors
> - Product vendors
> 
> > The issues I've raised are things that I've encountered and would
> > benefit from a discussion to get views as to the direction that is
> > perceived to be workable. I appreciate that some can not be solved
> > immediately, but want to avoid too much bikeshedding in the first
> > round of patches.
> > What's realistic, and what pitfalls/limitations immediately jump out at people.
> > 
> > Slides are at https://drive.google.com/file/d/1vjYJjTNRL1P3j6G4Nx2ZpjFtTBTNdeFG/view?usp=sharing
> 
> Thank you, I will review that ASAP.

A few questions:

- Regarding the sensor synchronization, are you considering the trigger
  signal as signaling the beginning of exposure only, or also use cases
  where it controls the exposure duration?

- For VCM ringing reduction and standardization of parameters, are there
  examples you could share to explain this in more detail, with the
  type of parameters that need to be specified?

And one comment. On slide 20/23, you wrote

  Likely to produce a load of boilerplate in all drivers. Abstract out
  an image sensor pinctrl helper?

I think we need more than that, we need a large helper for camera
sensors (in particular for raw sensors) that will bridge the large gap
between the sensor and the V4L2 subdev API. There's too much boilerplate
code already, and worse, different sensor drivers exposing the same
feature to userspace in different ways.

> > See you on Monday.

-- 
Regards,

Laurent Pinchart


* Re: [Media Summit] Imaging Sensor functionality
  2022-09-06 17:53 ` Laurent Pinchart
  2022-09-07  0:41   ` Laurent Pinchart
@ 2022-09-07 12:42   ` Dave Stevenson
  2022-09-07 13:11     ` Laurent Pinchart
  1 sibling, 1 reply; 11+ messages in thread
From: Dave Stevenson @ 2022-09-07 12:42 UTC (permalink / raw)
  To: Laurent Pinchart
  Cc: Linux Media Mailing List, Sakari Ailus, Kieran Bingham,
	Nicolas Dufresne, Benjamin Gaignard, Hidenori Kobayashi,
	Paul Kocialkowski, Michael Olbrich, Ricardo Ribalda,
	Maxime Ripard, Daniel Scally, Jernej Škrabec,
	Niklas Söderlund, Michael Tretter, Hans Verkuil,
	Philipp Zabel, Mauro Carvalho Chehab, Benjamin MUGNIER,
	Jacopo Mondi

Hi Laurent

On Tue, 6 Sept 2022 at 18:53, Laurent Pinchart
<laurent.pinchart@ideasonboard.com> wrote:
>
> Hi Dave,
>
> On Tue, Sep 06, 2022 at 05:14:30PM +0100, Dave Stevenson wrote:
> > Hi All.
> >
> > I realise that I'm in a slightly different position from many mainline
> > Linux-media developers in that I see multiple use cases for the same
> > sensor, rather than a driver predominantly being for one
> > product/platform. I'm therefore wanting to look at generic solutions
> > and fully featured drivers. Users get to decide the use cases, not the
> > hardware designers.
>
> Could you clarify here what you mean by users and hardware designers ?
> Users can be understood as
>
> - Users of the camera sensor, i.e. OEMs designing a product
> - Users of the hardware platform , i.e. software developers writing
>   applications
> - Users of the software, i.e. end-users

Users of the software.

Particularly on the Pi you have people using libcamera apps or Python
bindings that want to be able to choose modes of operation without
having to make kernel driver modifications.
I generally don't mind if that is through userspace or DT, but the
functionality should be exposed.

As an example, when the strobe signals were exposed for the IMX477 we
had people hooking up various high-intensity strobe devices and other
weird contraptions for synchronised events [1]. Can we replicate that
sort of open-ended functionality in a standardised way within sensor
kernel drivers, so that the drivers do not constrain the use cases?

> Hardware designers could then equally mean
>
> - Sensor vendors
> - SoC vendors
> - Board vendors
> - Product vendors

All of the above.

For those product vendors designing specific products based on an SoC
and imaging sensor, if there is a defined mechanism that end users can
get to, then they can also use it to configure their specific use
case. Both cases therefore win. Hard-coding their product's use case
in a mainline driver limits other use cases.

  Dave

[1] https://forums.raspberrypi.com/viewtopic.php?t=281913

> > The issues I've raised are things that I've encountered and would
> > benefit from a discussion to get views as to the direction that is
> > perceived to be workable. I appreciate that some can not be solved
> > immediately, but want to avoid too much bikeshedding in the first
> > round of patches.
> > What's realistic, and what pitfalls/limitations immediately jump out at people.
> >
> > Slides are at https://drive.google.com/file/d/1vjYJjTNRL1P3j6G4Nx2ZpjFtTBTNdeFG/view?usp=sharing
>
> Thank you, I will review that ASAP.
>
> > See you on Monday.
>
> --
> Regards,
>
> Laurent Pinchart


* Re: [Media Summit] Imaging Sensor functionality
  2022-09-07 12:42   ` Dave Stevenson
@ 2022-09-07 13:11     ` Laurent Pinchart
  0 siblings, 0 replies; 11+ messages in thread
From: Laurent Pinchart @ 2022-09-07 13:11 UTC (permalink / raw)
  To: Dave Stevenson
  Cc: Linux Media Mailing List, Sakari Ailus, Kieran Bingham,
	Nicolas Dufresne, Benjamin Gaignard, Hidenori Kobayashi,
	Paul Kocialkowski, Michael Olbrich, Ricardo Ribalda,
	Maxime Ripard, Daniel Scally, Jernej Škrabec,
	Niklas Söderlund, Michael Tretter, Hans Verkuil,
	Philipp Zabel, Mauro Carvalho Chehab, Benjamin MUGNIER,
	Jacopo Mondi

On Wed, Sep 07, 2022 at 01:42:16PM +0100, Dave Stevenson wrote:
> On Tue, 6 Sept 2022 at 18:53, Laurent Pinchart wrote:
> > On Tue, Sep 06, 2022 at 05:14:30PM +0100, Dave Stevenson wrote:
> > > Hi All.
> > >
> > > I realise that I'm in a slightly different position from many mainline
> > > Linux-media developers in that I see multiple use cases for the same
> > > sensor, rather than a driver predominantly being for one
> > > product/platform. I'm therefore wanting to look at generic solutions
> > > and fully featured drivers. Users get to decide the use cases, not the
> > > hardware designers.
> >
> > Could you clarify here what you mean by users and hardware designers ?
> > Users can be understood as
> >
> > - Users of the camera sensor, i.e. OEMs designing a product
> > - Users of the hardware platform , i.e. software developers writing
> >   applications
> > - Users of the software, i.e. end-users
> 
> Users of the software.
> 
> Particularly on the Pi you have people using libcamera apps or Python
> bindings that want to be able to choose modes of operation without
> having to make kernel driver modifications.
> I generally don't mind if that is through userspace or DT, but the
> functionality should be exposed.
> 
> As an example, when the strobe signals were exposed for IMX477 we had
> people hooking up various high intensity strobe devices and other
> weird contraptions for synchronised events [1]. Can we replicate that
> sort of open-ended functionality in a standardised way within sensor
> kernel drivers so that the drivers are not constraining the use cases?

We have the same goal, so let's see if we can find a way to make it
happen :-)

> > Hardware designers could then equally mean
> >
> > - Sensor vendors
> > - SoC vendors
> > - Board vendors
> > - Product vendors
> 
> All of the above.
> 
> For those Product Vendors designing specific products based on an SoC
> and imaging sensor, if there is a defined mechanism that end users can
> get to, then they can also use it to configure their specific use
> case. Both cases therefore win. Hard coding their product's use case
> in a mainline driver limits other use cases.
> 
>   Dave
> 
> [1] https://forums.raspberrypi.com/viewtopic.php?t=281913
> 
> > > The issues I've raised are things that I've encountered and would
> > > benefit from a discussion to get views as to the direction that is
> > > perceived to be workable. I appreciate that some can not be solved
> > > immediately, but want to avoid too much bikeshedding in the first
> > > round of patches.
> > > What's realistic, and what pitfalls/limitations immediately jump out at people.
> > >
> > > Slides are at https://drive.google.com/file/d/1vjYJjTNRL1P3j6G4Nx2ZpjFtTBTNdeFG/view?usp=sharing
> >
> > Thank you, I will review that ASAP.
> >
> > > See you on Monday.

-- 
Regards,

Laurent Pinchart


* Re: [Media Summit] Imaging Sensor functionality
  2022-09-07  0:41   ` Laurent Pinchart
@ 2022-09-07 13:12     ` Dave Stevenson
  0 siblings, 0 replies; 11+ messages in thread
From: Dave Stevenson @ 2022-09-07 13:12 UTC (permalink / raw)
  To: Laurent Pinchart
  Cc: Linux Media Mailing List, Sakari Ailus, Kieran Bingham,
	Nicolas Dufresne, Benjamin Gaignard, Hidenori Kobayashi,
	Paul Kocialkowski, Michael Olbrich, Ricardo Ribalda,
	Maxime Ripard, Daniel Scally, Jernej Škrabec,
	Niklas Söderlund, Michael Tretter, Hans Verkuil,
	Philipp Zabel, Mauro Carvalho Chehab, Benjamin MUGNIER,
	Jacopo Mondi

Hi Laurent

On Wed, 7 Sept 2022 at 01:42, Laurent Pinchart
<laurent.pinchart@ideasonboard.com> wrote:
>
> Hi Dave,
>
> On Tue, Sep 06, 2022 at 08:53:41PM +0300, Laurent Pinchart wrote:
> > On Tue, Sep 06, 2022 at 05:14:30PM +0100, Dave Stevenson wrote:
> > > Hi All.
> > >
> > > I realise that I'm in a slightly different position from many mainline
> > > Linux-media developers in that I see multiple use cases for the same
> > > sensor, rather than a driver predominantly being for one
> > > product/platform. I'm therefore wanting to look at generic solutions
> > > and fully featured drivers. Users get to decide the use cases, not the
> > > hardware designers.
> >
> > Could you clarify here what you mean by users and hardware designers ?
> > Users can be understood as
> >
> > - Users of the camera sensor, i.e. OEMs designing a product
> > - Users of the hardware platform , i.e. software developers writing
> >   applications
> > - Users of the software, i.e. end-users
> >
> > Hardware designers could then equally mean
> >
> > - Sensor vendors
> > - SoC vendors
> > - Board vendors
> > - Product vendors
> >
> > > The issues I've raised are things that I've encountered and would
> > > benefit from a discussion to get views as to the direction that is
> > > perceived to be workable. I appreciate that some can not be solved
> > > immediately, but want to avoid too much bikeshedding in the first
> > > round of patches.
> > > What's realistic, and what pitfalls/limitations immediately jump out at people.
> > >
> > > Slides are at https://drive.google.com/file/d/1vjYJjTNRL1P3j6G4Nx2ZpjFtTBTNdeFG/view?usp=sharing
> >
> > Thank you, I will review that ASAP.
>
> A few questions:
>
> - Regarding the sensor synchronization, are you considering the trigger
>   signal as signaling the beginning of exposure only, or also use cases
>   where it controls the exposure duration ?

I was predominantly thinking of the modes offered by the sensor
vendors for synchronisation with other identical modules. Exactly
which phase of the capture is synchronised is therefore down to the
sensor vendor.

(AIUI it is typically the start of exposure that is triggered. If the
sensors are programmed for different exposure times, then the start of
readout will come at slightly different points. That means that trying
to synchronise the two streams based on timestamps will fail, but
sequence numbers should match as long as the slave is started before
the master).
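
To make the timestamps-vs-sequence-numbers point concrete, here is a
minimal sketch (an editorial illustration, not code from the thread;
struct frame is a stand-in for the relevant fields of struct
v4l2_buffer) of how the pairing logic for two synchronised streams
would look:

```c
#include <stdint.h>

/*
 * Pair frames from two synchronised sensors. With a triggered start
 * of exposure and differing exposure times, the readout (and hence
 * the buffer timestamp) of each sensor lands at a slightly different
 * point, so timestamps of matching frames differ. The per-stream
 * sequence counters, however, stay in lockstep as long as the slave
 * is started before the master.
 */
struct frame {
	uint32_t sequence;	/* as in struct v4l2_buffer */
	uint64_t ts_ns;		/* monotonic timestamp, nanoseconds */
};

/* Returns 1 if the two frames belong to the same capture event. */
static int frames_match(const struct frame *master,
			const struct frame *slave)
{
	return master->sequence == slave->sequence;
}
```

Matching on timestamps with a tolerance window would break down as
soon as the two exposure times diverge; the sequence comparison does
not.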

> - For VCM ringing reduction and standardization of parameters, are there
>   examples you could share to explain this in more details, with the
>   type of parameters that need to be specified ?

This was one that I hadn't fully thought through as to whether it
could be standardised; a discussion that ended with acceptance that
we need module-specific DT parameters would be equally valid.
A couple of examples:
The Fitipower FP5510E offers a linear slope mode, or a two-step
control mode
http://www.jifangsz.com/FP5510E-Preliminary%200.2-JAN-2018.pdf.
The Dongwoon DW9714A is nearly identical to the FP5510E.
The Dongwoon DW9807 and DW9817 are nearly identical to each other, but
the DW9807 drives 0-100mA, whilst the DW9817 is bi-directional at
+/- 100mA.
"Smart Actuator Control" is derived from an internal oscillator and
takes a prescaler (x1, x2, or x4), a timing value (7 bits in 0.03ms
steps), and a mode (target movement time against a "mechanical
vibration period").
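
Purely to illustrate the kind of parameter conversion a standardised
interface would need to carry (this is an editorial sketch; the
assumption that the effective step is 0.03ms multiplied by the
prescaler is mine, not taken from a datasheet, and this is not a real
register layout):

```c
#include <stdint.h>

/*
 * Hypothetical conversion of a requested SAC period (microseconds)
 * into a prescaler plus 7-bit timing value, assuming an effective
 * step of 30us times the prescaler. Models the parameters described
 * above (prescaler x1/x2/x4, 7-bit timing in 0.03ms steps).
 */
static int sac_timing_from_us(unsigned int period_us,
			      unsigned int *prescaler, uint8_t *timing)
{
	static const unsigned int prescalers[] = { 1, 2, 4 };
	unsigned int i;

	for (i = 0; i < 3; i++) {
		unsigned int step = 30 * prescalers[i];
		unsigned int val = (period_us + step / 2) / step;

		if (val >= 1 && val <= 127) {
			*prescaler = prescalers[i];
			*timing = (uint8_t)val;
			return 0;
		}
	}

	return -1;	/* period out of representable range */
}
```

The point is only that the tunable space (step size, range, mode) is
small and regular enough that a generic control or DT property set
looks feasible.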

> And one comment. On slide 20/23, you wrote
>
>   Likely to produce a load of boilerplate in all drivers. Abstract out
>   an image sensor pinctrl helper?
>
> I think we need more than that, we need a large helper for camera
> sensors (in particular for raw sensors) that will bridge the large gap
> between the sensor and the V4L2 subdev API. There's too much boilerplate
> code already, and worse, different sensor drivers exposing the same
> feature to userspace in different ways.

Absolutely agreed on there already being too much boilerplate, and
variation along with it. If helpers can do all the work then that's
great, and I'm happy to look at doing some of that.
It does need the "correct" implementation to have been defined in the
first place, though...

  Dave

> > > See you on Monday.
>
> --
> Regards,
>
> Laurent Pinchart


* Re: [Media Summit] Imaging Sensor functionality
  2022-09-06 16:14 [Media Summit] Imaging Sensor functionality Dave Stevenson
  2022-09-06 17:53 ` Laurent Pinchart
@ 2022-09-10 12:50 ` Hans Verkuil
  2022-09-10 16:17   ` Laurent Pinchart
  1 sibling, 1 reply; 11+ messages in thread
From: Hans Verkuil @ 2022-09-10 12:50 UTC (permalink / raw)
  To: Dave Stevenson, Linux Media Mailing List
  Cc: Sakari Ailus, Kieran Bingham, Nicolas Dufresne,
	Benjamin Gaignard, Hidenori Kobayashi, Paul Kocialkowski,
	Michael Olbrich, Laurent Pinchart, Ricardo Ribalda,
	Maxime Ripard, Daniel Scally, Jernej Škrabec,
	Niklas Söderlund, Michael Tretter, Philipp Zabel,
	Mauro Carvalho Chehab, Benjamin MUGNIER, Jacopo Mondi

Hi Dave,

On 06/09/2022 18:14, Dave Stevenson wrote:
> Hi All.
> 
> I realise that I'm in a slightly different position from many mainline
> Linux-media developers in that I see multiple use cases for the same
> sensor, rather than a driver predominantly being for one
> product/platform. I'm therefore wanting to look at generic solutions
> and fully featured drivers. Users get to decide the use cases, not the
> hardware designers.
> 
> The issues I've raised are things that I've encountered and would
> benefit from a discussion to get views as to the direction that is
> perceived to be workable. I appreciate that some can not be solved
> immediately, but want to avoid too much bikeshedding in the first
> round of patches.
> What's realistic, and what pitfalls/limitations immediately jump out at people.
> 
> Slides are at https://drive.google.com/file/d/1vjYJjTNRL1P3j6G4Nx2ZpjFtTBTNdeFG/view?usp=sharing
> 
> See you on Monday.
> 
>   Dave

Some comments for the meeting on Monday:

- On-sensor temperature sensing:

If a control is used to read this, but the value is not available yet,
then -EACCES can be returned. That's already defined as a valid return
code in the API; it would just need to be extended for this use-case.
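
On the userspace side, -EACCES would then mean "not ready yet, retry".
A sketch of that policy (editorial illustration: the actual control
read, e.g. via VIDIOC_G_EXT_CTRLS, is abstracted behind a callback so
only the retry logic is shown):

```c
#include <errno.h>

/*
 * 'read_ctrl' returns 0 on success, -EACCES while the value (e.g.
 * the sensor temperature) is not yet available, or another negative
 * errno on a real error.
 */
typedef int (*read_ctrl_fn)(void *priv, int *value);

static int read_ctrl_retry(read_ctrl_fn read_ctrl, void *priv,
			   int *value, int max_tries)
{
	int i, ret = -EACCES;

	for (i = 0; i < max_tries; i++) {
		ret = read_ctrl(priv, value);
		if (ret != -EACCES)
			return ret;
		/* value not ready yet; a real caller would sleep here */
	}

	return ret;
}
```

Any other error code is passed through unchanged, so the "not ready"
case stays distinguishable from a genuine failure.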

- Sync sensors:

Should it be part of the DT? That depends, I think, on whether this is
a pure software mechanism, or whether the wiring dictates which sensor
can be the master and which can be slaves. I assume that at the very
least there has to be a way to group sensors that are/can be connected
to the same master sync signal.

- Lens assemblies:

For what it is worth, Cisco uses motor controlled lenses and irises. We extended the camera
controls with these new controls:

#define V4L2_CID_FOCUS_CURRENT                  (V4L2_CID_CAMERA_CLASS_BASE+36)
#define V4L2_CID_IRIS_CURRENT                   (V4L2_CID_CAMERA_CLASS_BASE+38)
#define V4L2_CID_FOCUS_MOTOR_STATUS             (V4L2_CID_CAMERA_CLASS_BASE+41)
#define V4L2_CID_IRIS_MOTOR_STATUS              (V4L2_CID_CAMERA_CLASS_BASE+43)
enum v4l2_motor_status {
        V4L2_MOTOR_STATUS_IDLE                  = (0),
        V4L2_MOTOR_STATUS_MOVING                = (1 << 0),
        V4L2_MOTOR_STATUS_FAILED                = (1 << 1),
        V4L2_MOTOR_STATUS_NOTCALIBRATED         = (1 << 2),
};
#define V4L2_CID_FOCUS_MOTOR_SPEED              (V4L2_CID_CAMERA_CLASS_BASE+46)
#define V4L2_CID_IRIS_MOTOR_SPEED               (V4L2_CID_CAMERA_CLASS_BASE+48)

This worked well for our use-case, but for us userspace has complete knowledge about
the camera assembly properties.
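
As a hedged usage sketch (these are Cisco-internal control
definitions, not mainline ones, and the interpretation of IDLE as the
all-clear value is an assumption from the enum above), a caller
waiting on the focus motor could decode the status bitmask like this:

```c
/*
 * Mirrors the enum above: MOVING/FAILED/NOTCALIBRATED are bit flags
 * that can be set simultaneously, while IDLE is the value with no
 * flags set.
 */
enum v4l2_motor_status {
	V4L2_MOTOR_STATUS_IDLE			= (0),
	V4L2_MOTOR_STATUS_MOVING		= (1 << 0),
	V4L2_MOTOR_STATUS_FAILED		= (1 << 1),
	V4L2_MOTOR_STATUS_NOTCALIBRATED		= (1 << 2),
};

/* A move is complete only once no flag remains set. */
static int focus_move_done(int status)
{
	return status == V4L2_MOTOR_STATUS_IDLE;
}

/* Whether issuing a new position request is worthwhile at all. */
static int focus_usable(int status)
{
	return !(status & (V4L2_MOTOR_STATUS_FAILED |
			   V4L2_MOTOR_STATUS_NOTCALIBRATED));
}
```

A poll loop would read V4L2_CID_FOCUS_MOTOR_STATUS and wait for
focus_move_done() before trusting V4L2_CID_FOCUS_CURRENT.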

Regards,

	Hans


* Re: [Media Summit] Imaging Sensor functionality
  2022-09-10 12:50 ` Hans Verkuil
@ 2022-09-10 16:17   ` Laurent Pinchart
  2022-09-11  7:12     ` Hans Verkuil
  0 siblings, 1 reply; 11+ messages in thread
From: Laurent Pinchart @ 2022-09-10 16:17 UTC (permalink / raw)
  To: Hans Verkuil
  Cc: Dave Stevenson, Linux Media Mailing List, Sakari Ailus,
	Kieran Bingham, Nicolas Dufresne, Benjamin Gaignard,
	Hidenori Kobayashi, Paul Kocialkowski, Michael Olbrich,
	Ricardo Ribalda, Maxime Ripard, Daniel Scally,
	Jernej Škrabec, Niklas Söderlund, Michael Tretter,
	Philipp Zabel, Mauro Carvalho Chehab, Benjamin MUGNIER,
	Jacopo Mondi

Hi Hans,

On Sat, Sep 10, 2022 at 02:50:10PM +0200, Hans Verkuil wrote:
> On 06/09/2022 18:14, Dave Stevenson wrote:
> > Hi All.
> > 
> > I realise that I'm in a slightly different position from many mainline
> > Linux-media developers in that I see multiple use cases for the same
> > sensor, rather than a driver predominantly being for one
> > product/platform. I'm therefore wanting to look at generic solutions
> > and fully featured drivers. Users get to decide the use cases, not the
> > hardware designers.
> > 
> > The issues I've raised are things that I've encountered and would
> > benefit from a discussion to get views as to the direction that is
> > perceived to be workable. I appreciate that some can not be solved
> > immediately, but want to avoid too much bikeshedding in the first
> > round of patches.
> > What's realistic, and what pitfalls/limitations immediately jump out at people.
> > 
> > Slides are at https://drive.google.com/file/d/1vjYJjTNRL1P3j6G4Nx2ZpjFtTBTNdeFG/view?usp=sharing
> > 
> > See you on Monday.
> > 
> >   Dave
> 
> Some comments for the meeting on Monday:
> 
> - On-sensor temperature sensing:
> 
> If a control is used to read this, but the value is
> not available yet, then -EACCES can be returned. That's already defined as a valid return
> code in the API, it would just need to be extended for this use-case.
> 
> - Sync sensors:
> 
> Should it be part of the DT? That depends, I think, on whether this is a pure sw mechanism,
> or whether the wiring dictates which sensor can be master and which can be slaves. I assume
> that at the very least there has to be a way to group sensors that are/can be connected to
> the same master sync signal.
> 
> - Lens assemblies:
> 
> For what it is worth, Cisco uses motor controlled lenses and irises. We extended the camera
> controls with these new controls:
> 
> #define V4L2_CID_FOCUS_CURRENT                  (V4L2_CID_CAMERA_CLASS_BASE+36)
> #define V4L2_CID_IRIS_CURRENT                   (V4L2_CID_CAMERA_CLASS_BASE+38)
> #define V4L2_CID_FOCUS_MOTOR_STATUS             (V4L2_CID_CAMERA_CLASS_BASE+41)
> #define V4L2_CID_IRIS_MOTOR_STATUS              (V4L2_CID_CAMERA_CLASS_BASE+43)
> enum v4l2_motor_status {
>         V4L2_MOTOR_STATUS_IDLE                  = (0),
>         V4L2_MOTOR_STATUS_MOVING                = (1 << 0),
>         V4L2_MOTOR_STATUS_FAILED                = (1 << 1),
>         V4L2_MOTOR_STATUS_NOTCALIBRATED         = (1 << 2),
> };
> #define V4L2_CID_FOCUS_MOTOR_SPEED              (V4L2_CID_CAMERA_CLASS_BASE+46)
> #define V4L2_CID_IRIS_MOTOR_SPEED               (V4L2_CID_CAMERA_CLASS_BASE+48)
> 
> This worked well for our use-case, but for us userspace has complete knowledge about
> the camera assembly properties.

Where does userspace get that information from ?

-- 
Regards,

Laurent Pinchart


* Re: [Media Summit] Imaging Sensor functionality
  2022-09-10 16:17   ` Laurent Pinchart
@ 2022-09-11  7:12     ` Hans Verkuil
  2022-09-11  9:13       ` Laurent Pinchart
  0 siblings, 1 reply; 11+ messages in thread
From: Hans Verkuil @ 2022-09-11  7:12 UTC (permalink / raw)
  To: Laurent Pinchart
  Cc: Dave Stevenson, Linux Media Mailing List, Sakari Ailus,
	Kieran Bingham, Nicolas Dufresne, Benjamin Gaignard,
	Hidenori Kobayashi, Paul Kocialkowski, Michael Olbrich,
	Ricardo Ribalda, Maxime Ripard, Daniel Scally,
	Jernej Škrabec, Niklas Söderlund, Michael Tretter,
	Philipp Zabel, Mauro Carvalho Chehab, Benjamin MUGNIER,
	Jacopo Mondi

On 10/09/2022 18:17, Laurent Pinchart wrote:
> Hi Hans,
> 
> On Sat, Sep 10, 2022 at 02:50:10PM +0200, Hans Verkuil wrote:
>> On 06/09/2022 18:14, Dave Stevenson wrote:
>>> Hi All.
>>>
>>> I realise that I'm in a slightly different position from many mainline
>>> Linux-media developers in that I see multiple use cases for the same
>>> sensor, rather than a driver predominantly being for one
>>> product/platform. I'm therefore wanting to look at generic solutions
>>> and fully featured drivers. Users get to decide the use cases, not the
>>> hardware designers.
>>>
>>> The issues I've raised are things that I've encountered and would
>>> benefit from a discussion to get views as to the direction that is
>>> perceived to be workable. I appreciate that some can not be solved
>>> immediately, but want to avoid too much bikeshedding in the first
>>> round of patches.
>>> What's realistic, and what pitfalls/limitations immediately jump out at people.
>>>
>>> Slides are at https://drive.google.com/file/d/1vjYJjTNRL1P3j6G4Nx2ZpjFtTBTNdeFG/view?usp=sharing
>>>
>>> See you on Monday.
>>>
>>>   Dave
>>
>> Some comments for the meeting on Monday:
>>
>> - On-sensor temperature sensing:
>>
>> If a control is used to read this, but the value is
>> not available yet, then -EACCES can be returned. That's already defined as a valid return
>> code in the API, it would just need to be extended for this use-case.
>>
>> - Sync sensors:
>>
>> Should it be part of the DT? That depends, I think, on whether this is a pure sw mechanism,
>> or whether the wiring dictates which sensor can be master and which can be slaves. I assume
>> that at the very least there has to be a way to group sensors that are/can be connected to
>> the same master sync signal.
>>
>> - Lens assemblies:
>>
>> For what it is worth, Cisco uses motor controlled lenses and irises. We extended the camera
>> controls with these new controls:
>>
>> #define V4L2_CID_FOCUS_CURRENT                  (V4L2_CID_CAMERA_CLASS_BASE+36)
>> #define V4L2_CID_IRIS_CURRENT                   (V4L2_CID_CAMERA_CLASS_BASE+38)
>> #define V4L2_CID_FOCUS_MOTOR_STATUS             (V4L2_CID_CAMERA_CLASS_BASE+41)
>> #define V4L2_CID_IRIS_MOTOR_STATUS              (V4L2_CID_CAMERA_CLASS_BASE+43)
>> enum v4l2_motor_status {
>>         V4L2_MOTOR_STATUS_IDLE                  = (0),
>>         V4L2_MOTOR_STATUS_MOVING                = (1 << 0),
>>         V4L2_MOTOR_STATUS_FAILED                = (1 << 1),
>>         V4L2_MOTOR_STATUS_NOTCALIBRATED         = (1 << 2),
>> };
>> #define V4L2_CID_FOCUS_MOTOR_SPEED              (V4L2_CID_CAMERA_CLASS_BASE+46)
>> #define V4L2_CID_IRIS_MOTOR_SPEED               (V4L2_CID_CAMERA_CLASS_BASE+48)
>>
>> This worked well for our use-case, but for us userspace has complete knowledge about
>> the camera assembly properties.
> 
> Where does userspace get that information from ?
> 

From the software engineers :-) We designed the cameras, so we know how to operate them.

I'm not entirely sure if that's what you meant, though.

Regards,

	Hans


* Re: [Media Summit] Imaging Sensor functionality
  2022-09-11  7:12     ` Hans Verkuil
@ 2022-09-11  9:13       ` Laurent Pinchart
  2022-09-11 10:17         ` Hans Verkuil
  0 siblings, 1 reply; 11+ messages in thread
From: Laurent Pinchart @ 2022-09-11  9:13 UTC (permalink / raw)
  To: Hans Verkuil
  Cc: Dave Stevenson, Linux Media Mailing List, Sakari Ailus,
	Kieran Bingham, Nicolas Dufresne, Benjamin Gaignard,
	Hidenori Kobayashi, Paul Kocialkowski, Michael Olbrich,
	Ricardo Ribalda, Maxime Ripard, Daniel Scally,
	Jernej Škrabec, Niklas Söderlund, Michael Tretter,
	Philipp Zabel, Mauro Carvalho Chehab, Benjamin MUGNIER,
	Jacopo Mondi

On Sun, Sep 11, 2022 at 09:12:15AM +0200, Hans Verkuil wrote:
> On 10/09/2022 18:17, Laurent Pinchart wrote:
> > On Sat, Sep 10, 2022 at 02:50:10PM +0200, Hans Verkuil wrote:
> >> On 06/09/2022 18:14, Dave Stevenson wrote:
> >>> Hi All.
> >>>
> >>> I realise that I'm in a slightly different position from many mainline
> >>> Linux-media developers in that I see multiple use cases for the same
> >>> sensor, rather than a driver predominantly being for one
> >>> product/platform. I'm therefore wanting to look at generic solutions
> >>> and fully featured drivers. Users get to decide the use cases, not the
> >>> hardware designers.
> >>>
> >>> The issues I've raised are things that I've encountered and would
> >>> benefit from a discussion to get views as to the direction that is
> >>> perceived to be workable. I appreciate that some can not be solved
> >>> immediately, but want to avoid too much bikeshedding in the first
> >>> round of patches.
> >>> What's realistic, and what pitfalls/limitations immediately jump out at people.
> >>>
> >>> Slides are at https://drive.google.com/file/d/1vjYJjTNRL1P3j6G4Nx2ZpjFtTBTNdeFG/view?usp=sharing
> >>>
> >>> See you on Monday.
> >>>
> >>>   Dave
> >>
> >> Some comments for the meeting on Monday:
> >>
> >> - On-sensor temperature sensing:
> >>
> >> If a control is used to read this, but the value is
> >> not available yet, then -EACCES can be returned. That's already defined as a valid return
> >> code in the API, it would just need to be extended for this use-case.
> >>
> >> - Sync sensors:
> >>
> >> Should it be part of the DT? That depends, I think, on whether this is a pure sw mechanism,
> >> or whether the wiring dictates which sensor can be master and which can be slaves. I assume
> >> that at the very least there has to be a way to group sensors that are/can be connected to
> >> the same master sync signal.
> >>
> >> - Lens assemblies:
> >>
> >> For what it is worth, Cisco uses motor controlled lenses and irises. We extended the camera
> >> controls with these new controls:
> >>
> >> #define V4L2_CID_FOCUS_CURRENT                  (V4L2_CID_CAMERA_CLASS_BASE+36)
> >> #define V4L2_CID_IRIS_CURRENT                   (V4L2_CID_CAMERA_CLASS_BASE+38)
> >> #define V4L2_CID_FOCUS_MOTOR_STATUS             (V4L2_CID_CAMERA_CLASS_BASE+41)
> >> #define V4L2_CID_IRIS_MOTOR_STATUS              (V4L2_CID_CAMERA_CLASS_BASE+43)
> >> enum v4l2_motor_status {
> >>         V4L2_MOTOR_STATUS_IDLE                  = (0),
> >>         V4L2_MOTOR_STATUS_MOVING                = (1 << 0),
> >>         V4L2_MOTOR_STATUS_FAILED                = (1 << 1),
> >>         V4L2_MOTOR_STATUS_NOTCALIBRATED         = (1 << 2),
> >> };
> >> #define V4L2_CID_FOCUS_MOTOR_SPEED              (V4L2_CID_CAMERA_CLASS_BASE+46)
> >> #define V4L2_CID_IRIS_MOTOR_SPEED               (V4L2_CID_CAMERA_CLASS_BASE+48)
> >>
> >> This worked well for our use-case, but for us userspace has complete knowledge about
> >> the camera assembly properties.
> > 
> > Where does userspace get that information from ?
> > 
> 
> From the software engineers :-) We designed the cameras, so we know how to operate them.
> 
> I'm not entirely sure if that's what you meant, though.

:-)

I meant to ask if you have userspace software that can work with
different camera modules, in which case it would need to identify the
module and retrieve corresponding parameters. That leads to the question
of camera module identification (i.e. if we have modules with the same
sensor but with different lenses, how do we identify that, as typically
in DT all we have is the sensor model), parameter format (can we
standardize that, in order to have interoperability of different
userspace software across different platforms) and storage (some systems
have an NVM in the camera sensor or in the camera sensor module, can we
meaningfully use that?).

-- 
Regards,

Laurent Pinchart


* Re: [Media Summit] Imaging Sensor functionality
  2022-09-11  9:13       ` Laurent Pinchart
@ 2022-09-11 10:17         ` Hans Verkuil
  0 siblings, 0 replies; 11+ messages in thread
From: Hans Verkuil @ 2022-09-11 10:17 UTC (permalink / raw)
  To: Laurent Pinchart
  Cc: Dave Stevenson, Linux Media Mailing List, Sakari Ailus,
	Kieran Bingham, Nicolas Dufresne, Benjamin Gaignard,
	Hidenori Kobayashi, Paul Kocialkowski, Michael Olbrich,
	Ricardo Ribalda, Maxime Ripard, Daniel Scally,
	Jernej Škrabec, Niklas Söderlund, Michael Tretter,
	Philipp Zabel, Mauro Carvalho Chehab, Benjamin MUGNIER,
	Jacopo Mondi

On 11/09/2022 11:13, Laurent Pinchart wrote:
> On Sun, Sep 11, 2022 at 09:12:15AM +0200, Hans Verkuil wrote:
>> On 10/09/2022 18:17, Laurent Pinchart wrote:
>>> On Sat, Sep 10, 2022 at 02:50:10PM +0200, Hans Verkuil wrote:
>>>> On 06/09/2022 18:14, Dave Stevenson wrote:
>>>>> Hi All.
>>>>>
>>>>> I realise that I'm in a slightly different position from many mainline
>>>>> Linux-media developers in that I see multiple use cases for the same
>>>>> sensor, rather than a driver predominantly being for one
>>>>> product/platform. I'm therefore wanting to look at generic solutions
>>>>> and fully featured drivers. Users get to decide the use cases, not the
>>>>> hardware designers.
>>>>>
>>>>> The issues I've raised are things that I've encountered and would
>>>>> benefit from a discussion to get views as to the direction that is
>>>>> perceived to be workable. I appreciate that some can not be solved
>>>>> immediately, but want to avoid too much bikeshedding in the first
>>>>> round of patches.
>>>>> What's realistic, and what pitfalls/limitations immediately jump out at people.
>>>>>
>>>>> Slides are at https://drive.google.com/file/d/1vjYJjTNRL1P3j6G4Nx2ZpjFtTBTNdeFG/view?usp=sharing
>>>>>
>>>>> See you on Monday.
>>>>>
>>>>>   Dave
>>>>
>>>> Some comments for the meeting on Monday:
>>>>
>>>> - On-sensor temperature sensing:
>>>>
>>>> If a control is used to read this, but the value is
>>>> not available yet, then -EACCES can be returned. That's already defined as a valid return
>>>> code in the API, it would just need to be extended for this use-case.
>>>>
>>>> - Sync sensors:
>>>>
>>>> Should it be part of the DT? That depends, I think, on whether this is a pure sw mechanism,
>>>> or whether the wiring dictates which sensor can be master and which can be slaves. I assume
>>>> that at the very least there has to be a way to group sensors that are/can be connected to
>>>> the same master sync signal.
>>>>
>>>> - Lens assemblies:
>>>>
>>>> For what it is worth, Cisco uses motor controlled lenses and irises. We extended the camera
>>>> controls with these new controls:
>>>>
>>>> #define V4L2_CID_FOCUS_CURRENT                  (V4L2_CID_CAMERA_CLASS_BASE+36)
>>>> #define V4L2_CID_IRIS_CURRENT                   (V4L2_CID_CAMERA_CLASS_BASE+38)
>>>> #define V4L2_CID_FOCUS_MOTOR_STATUS             (V4L2_CID_CAMERA_CLASS_BASE+41)
>>>> #define V4L2_CID_IRIS_MOTOR_STATUS              (V4L2_CID_CAMERA_CLASS_BASE+43)
>>>> enum v4l2_motor_status {
>>>>         V4L2_MOTOR_STATUS_IDLE                  = (0),
>>>>         V4L2_MOTOR_STATUS_MOVING                = (1 << 0),
>>>>         V4L2_MOTOR_STATUS_FAILED                = (1 << 1),
>>>>         V4L2_MOTOR_STATUS_NOTCALIBRATED         = (1 << 2),
>>>> };
>>>> #define V4L2_CID_FOCUS_MOTOR_SPEED              (V4L2_CID_CAMERA_CLASS_BASE+46)
>>>> #define V4L2_CID_IRIS_MOTOR_SPEED               (V4L2_CID_CAMERA_CLASS_BASE+48)
>>>>
>>>> This worked well for our use-case, but for us userspace has complete knowledge about
>>>> the camera assembly properties.
>>>
>>> Where does userspace get that information from ?
>>>
>>
>> From the software engineers :-) We designed the cameras, so we know how to operate them.
>>
>> I'm not entirely sure if that's what you meant, though.
> 
> :-)
> 
> I meant to ask if you have userspace software that can work with
> different camera modules, in which case it would need to identify the
> module and retrieve corresponding parameters. That leads to the question
> of camera module identification (i.e. if we have modules with the same
> sensor but with different lenses, how do we identify that, as typically
> in DT all we have is the sensor model), parameters format (can we
> standardize that, in order to have interoperability of different
> userspace software with different platforms) and storage (some systems
> have an NVM in the camera sensor or in the camera sensor module, can we
> meaningfully use that ?).
> 

Ah, no. These camera assemblies are part of the product itself. E.g. something
like this: https://projectworkplace.cisco.com/products/cisco-webex-dx80

Obviously the software knows the product as a whole, and so knows how the
camera assembly is made. Note that for most of these controls the device tree
provides the ranges (depending on the number of steps of the stepper motor,
etc.).

I'm not up to speed on all the details (motor control is not my area of expertise),
but if there are specific questions I can probably dig them up (unless it is Cisco
proprietary code, of course).

Regards,

	Hans


Thread overview: 11+ messages
2022-09-06 16:14 [Media Summit] Imaging Sensor functionality Dave Stevenson
2022-09-06 17:53 ` Laurent Pinchart
2022-09-07  0:41   ` Laurent Pinchart
2022-09-07 13:12     ` Dave Stevenson
2022-09-07 12:42   ` Dave Stevenson
2022-09-07 13:11     ` Laurent Pinchart
2022-09-10 12:50 ` Hans Verkuil
2022-09-10 16:17   ` Laurent Pinchart
2022-09-11  7:12     ` Hans Verkuil
2022-09-11  9:13       ` Laurent Pinchart
2022-09-11 10:17         ` Hans Verkuil
