* V4L-DVB Summit Day 1
From: Hans Verkuil @ 2009-09-24  5:39 UTC
  To: linux-media

Hi all,

As most of you know, I organized a v4l-dvb summit (well, really a v4l2 summit
as there were no dvb topics to discuss) during the Linux Plumbers Conference
in Portland. The summit runs for all three days of the conference, and I
intend to post a short report at the end of each day.

First of all, I want to thank everyone who attended this first day of the
summit: we had a great turn-out with seven core v4l-dvb developers, three TI
engineers, two Nokia engineers, two Samsung engineers and an Intel engineer.
I know I've forgotten someone; I'll try to fix that tomorrow.

In any case, it meant that the main SoC vendors with complex video hardware
were well represented.

The summit started off with an overview of the proposed media controller and
an overview of the features of several SoCs, to give an idea of what sort of
complexity has to be supported in the future. I'll try to get some of the
presentations up on my site; unfortunately, not all of them can be made
public. The main message that came across, though, is that these complex
devices with big pipelines, scalers, composers, colorspace converters, etc.
require a completely new way of working.

While we did discuss the concepts of the media controller, we did not go into 
much detail: that is scheduled for Thursday.

In the afternoon we discussed the proposed timings API. There was no
opposition to this API. The idea I had to also use it for sensor setup
turned out to be based on a misconception about how S_FMT relates to
sensors. ENUM_FRAMESIZES basically gives you the possible resolutions to
which the scaler hidden inside the bridge can scale the native sensor
resolution. It does not enumerate the various native sensor resolutions,
since there is only one. So S_FMT really sets up the scaler.
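
To make this concrete, here is a minimal userspace sketch of that model:
ENUM_FRAMESIZES lists what the bridge scaler can produce, and S_FMT then
picks one of those sizes. The device node and pixel format are just
placeholders, not a statement about any particular driver.

/* Minimal sketch: list the output sizes the bridge scaler can produce for
 * one pixel format, then pick one with S_FMT. */
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

int main(void)
{
	struct v4l2_frmsizeenum fsz;
	struct v4l2_format fmt;
	int fd = open("/dev/video0", O_RDWR);    /* placeholder device node */

	if (fd < 0)
		return 1;

	memset(&fsz, 0, sizeof(fsz));
	fsz.pixel_format = V4L2_PIX_FMT_YUYV;    /* placeholder format */
	/* These are the sizes the scaler can deliver, not native sensor modes. */
	while (ioctl(fd, VIDIOC_ENUM_FRAMESIZES, &fsz) == 0) {
		if (fsz.type == V4L2_FRMSIZE_TYPE_DISCRETE)
			printf("%ux%u\n", fsz.discrete.width, fsz.discrete.height);
		fsz.index++;
	}

	/* S_FMT configures the scaler for the size we want. */
	memset(&fmt, 0, sizeof(fmt));
	fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
	fmt.fmt.pix.width = 640;
	fmt.fmt.pix.height = 480;
	fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_YUYV;
	fmt.fmt.pix.field = V4L2_FIELD_NONE;
	ioctl(fd, VIDIOC_S_FMT, &fmt);
	return 0;
}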

So we can proceed with the timings RFC and hopefully have this implemented for 
2.6.33.

Next was the event API proposal: this caused some more discussion, in
particular since the original RFC had no provision for (un)subscribing to
events. The idea is that we want to subscribe to events on a per-filehandle
basis. The core framework can keep track of events and distribute them to
filehandles that 'listen' to them. So this RFC will clearly need to go
through at least one more revision.
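
For those who were not there, a rough sketch of how per-filehandle
subscription looks from userspace; the ioctl and structure names below are
the ones the event API ended up with after revision, so treat this as
illustration rather than the RFC text itself.

/* Sketch only: subscribe to an event on this file handle and dequeue one. */
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

int main(void)
{
	struct v4l2_event_subscription sub;
	struct v4l2_event ev;
	int fd = open("/dev/video0", O_RDWR);   /* each fd gets its own event queue */

	if (fd < 0)
		return 1;

	memset(&sub, 0, sizeof(sub));
	sub.type = V4L2_EVENT_VSYNC;            /* subscribe on this filehandle only */
	if (ioctl(fd, VIDIOC_SUBSCRIBE_EVENT, &sub) < 0)
		return 1;

	/* The core queues matching events per filehandle; dequeue one here. */
	if (ioctl(fd, VIDIOC_DQEVENT, &ev) == 0)
		printf("event type %u, sequence %u\n", ev.type, ev.sequence);
	return 0;
}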

That was also a good point to stop for the day and head out to get free beer 
and food :-)

Scheduled for Thursday are a discussion of the proposed memory pool and a
continuation of the media controller discussion.

Regards,

        Hans


* RE: V4L-DVB Summit Day 1
From: Yu, Jinlu @ 2009-09-24 11:52 UTC
  To: linux-media

Regarding S_FMT, the Moorestown camera driver I am working on tells a
different story.

We do use different sensor resolutions, because the high-resolution modes
have a low frame rate and cannot be used for view-finding; e.g. the ov5630
only outputs 15 fps at its 5-megapixel resolution.

Our current solution is to disable the scaler in the ISP and only support
the resolutions that the sensor can provide, so S_FMT sets the frame size
on the sensor.
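
To illustrate the trade-off, something along these lines can be used to
check the frame intervals each resolution supports. The device node, pixel
format and sizes are assumed values for illustration only.

/* Sketch: list the frame intervals a given resolution supports, to show
 * the resolution/fps trade-off. */
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

static void list_intervals(int fd, unsigned int w, unsigned int h)
{
	struct v4l2_frmivalenum fi;

	memset(&fi, 0, sizeof(fi));
	fi.pixel_format = V4L2_PIX_FMT_SGRBG10;   /* assumed raw Bayer format */
	fi.width = w;
	fi.height = h;
	while (ioctl(fd, VIDIOC_ENUM_FRAMEINTERVALS, &fi) == 0) {
		if (fi.type == V4L2_FRMIVAL_TYPE_DISCRETE)
			printf("%ux%u: %u/%u s per frame\n", w, h,
			       fi.discrete.numerator, fi.discrete.denominator);
		fi.index++;
	}
}

int main(void)
{
	int fd = open("/dev/video0", O_RDWR);

	if (fd < 0)
		return 1;
	list_intervals(fd, 2592, 1944);   /* full 5M mode, e.g. ~15 fps */
	list_intervals(fd, 1280, 720);    /* smaller mode usable for view-finding */
	return 0;
}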

Best Regards
Jinlu Yu
UMG UPSG PRC
INET: 8758 1603
TEL:  86 10 8217 1603
FAX:  86 10 8286 1400


* Re: V4L-DVB Summit Day 1
From: Guennadi Liakhovetski @ 2009-09-24 18:07 UTC
  To: Hans Verkuil; +Cc: linux-media

Hi Hans

Thanks for keeping us updated. One comment:

On Wed, 23 Sep 2009, Hans Verkuil wrote:

> In the afternoon we discussed the proposed timings API. There was no
> opposition to this API. The idea I had to also use it for sensor setup
> turned out to be based on a misconception about how S_FMT relates to
> sensors. ENUM_FRAMESIZES basically gives you the possible resolutions to
> which the scaler hidden inside the bridge can scale the native sensor
> resolution. It does not enumerate the various native sensor resolutions,
> since there is only one. So S_FMT really sets up the scaler.

Just as Jinlu Yu noticed in his email, this doesn't reflect the real
situation, I am afraid. You can use binning and skipping on the sensor to
scale the image, and you can also use the bridge to do the scaling, as you
say. Worse than that, there is also a case where there are _several_ ways to
perform scaling on the sensor, among which one can freely choose, and the
host can scale too. And indeed it makes sense to scale at the source to save
bandwidth and thus increase the frame rate. So, what I'm currently doing on
sh-mobile is to try to scale on the client in the best possible way, and
then use bridge scaling to produce the exact result.
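
A sketch of that strategy (the power-of-two binning/skipping factors are
just an assumption here; real sensors differ):

/* Pick the coarsest sensor binning/skipping factor that still leaves at
 * least the target size, then let the bridge scaler do the exact step. */
#include <stdio.h>

struct scale_plan {
	unsigned int sensor_div;          /* binning/skipping divisor */
	unsigned int bridge_w, bridge_h;  /* what the bridge must output */
};

static struct scale_plan plan_scaling(unsigned int native_w, unsigned int native_h,
				      unsigned int target_w, unsigned int target_h)
{
	struct scale_plan p = { 1, target_w, target_h };

	/* Scale as much as possible at the source to save bus bandwidth and
	 * keep the frame rate up, but never go below the target size. */
	while (native_w / (p.sensor_div * 2) >= target_w &&
	       native_h / (p.sensor_div * 2) >= target_h)
		p.sensor_div *= 2;
	return p;
}

int main(void)
{
	/* e.g. a 2592x1944 native sensor, 640x480 requested by the application */
	struct scale_plan p = plan_scaling(2592, 1944, 640, 480);

	printf("sensor /%u -> %ux%u, bridge -> %ux%u\n", p.sensor_div,
	       2592 / p.sensor_div, 1944 / p.sensor_div, p.bridge_w, p.bridge_h);
	return 0;
}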

Thanks
Guennadi
---
Guennadi Liakhovetski, Ph.D.
Freelance Open-Source Software Developer
http://www.open-technology.de/


* Re: V4L-DVB Summit Day 1
From: Dongsoo, Nathaniel Kim @ 2009-09-26  8:40 UTC
  To: Guennadi Liakhovetski; +Cc: Hans Verkuil, linux-media

On Fri, Sep 25, 2009 at 3:07 AM, Guennadi Liakhovetski
<g.liakhovetski@gmx.de> wrote:
> Just as Jinlu Yu noticed in his email, this doesn't reflect the real
> situation, I am afraid. You can use binning and skipping on the sensor to
> scale the image, and you can also use the bridge to do the scaling, as you
> say. Worse than that, there is also a case where there are _several_ ways
> to perform scaling on the sensor, among which one can freely choose, and
> the host can scale too. And indeed it makes sense to scale at the source
> to save bandwidth and thus increase the frame rate. So, what I'm currently
> doing on sh-mobile is to try to scale on the client in the best possible
> way, and then use bridge scaling to produce the exact result.
>

Yes, I agree with you. It is highly necessary to provide a clear method that
indicates which device should do the scaling. When I use application
processors that provide a camera peripheral with a scaler inside and an
external ISP attached, there is no way to use both scalers; I just have to
choose one of them.
Cheers,

Nate

-- 
DongSoo, Nathaniel Kim
Engineer
Mobile S/W Platform Lab.
Digital Media & Communications R&D Centre
Samsung Electronics CO., LTD.
e-mail : dongsoo.kim@gmail.com
          dongsoo45.kim@samsung.com


* Re: V4L-DVB Summit Day 1
From: Guennadi Liakhovetski @ 2009-09-26  9:32 UTC
  To: Dongsoo, Nathaniel Kim; +Cc: Hans Verkuil, Linux Media Mailing List

On Sat, 26 Sep 2009, Dongsoo, Nathaniel Kim wrote:

> Yes, I agree with you. It is highly necessary to provide a clear method
> that indicates which device should do the scaling. When I use application
> processors that provide a camera peripheral with a scaler inside and an
> external ISP attached, there is no way to use both scalers; I just have
> to choose one of them.

Well, I don't necessarily agree; in fact, I do use both scaling engines in
my sh setup. The argument is as mentioned above: bus usage and frame-rate
optimisation. So what I am doing is: I try to scale on the sensor as close
to the target as possible, and then scale further on the host (SoC). This
works well, only the calculations are not entirely trivial. But you only
have to perform them once during setup, so it's not time-critical. It might
be worth implementing such calculations somewhere centrally to reduce the
chance of errors in specific drivers. The same goes for cropping.
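
As an illustration of the kind of shared helper I mean, here is a sketch
that computes a centred crop matching the target aspect ratio, done once at
setup time. This is hypothetical code, not an existing kernel helper.

/* Compute a centred crop on the sensor that matches the target aspect
 * ratio; the result is then scaled to the target size. */
#include <stdio.h>

struct rect { unsigned int left, top, width, height; };

static struct rect centred_crop(unsigned int src_w, unsigned int src_h,
				unsigned int dst_w, unsigned int dst_h)
{
	struct rect r = { 0, 0, src_w, src_h };

	/* Compare aspect ratios without floating point: src_w/src_h vs dst_w/dst_h */
	if ((unsigned long long)src_w * dst_h > (unsigned long long)dst_w * src_h) {
		/* source is wider than the target: crop the sides */
		r.width = (unsigned long long)src_h * dst_w / dst_h;
		r.left = (src_w - r.width) / 2;
	} else {
		/* source is taller than the target: crop top and bottom */
		r.height = (unsigned long long)src_w * dst_h / dst_w;
		r.top = (src_h - r.height) / 2;
	}
	return r;
}

int main(void)
{
	struct rect r = centred_crop(2592, 1944, 1280, 720);

	printf("crop %ux%u@(%u,%u), then scale to 1280x720\n",
	       r.width, r.height, r.left, r.top);
	return 0;
}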

Thanks
Guennadi
---
Guennadi Liakhovetski, Ph.D.
Freelance Open-Source Software Developer
http://www.open-technology.de/


* Re: V4L-DVB Summit Day 1
From: Dongsoo, Nathaniel Kim @ 2009-09-26 13:06 UTC
  To: Guennadi Liakhovetski; +Cc: Hans Verkuil, Linux Media Mailing List

On Sat, Sep 26, 2009 at 6:32 PM, Guennadi Liakhovetski
<g.liakhovetski@gmx.de> wrote:
> Well, I don't necessarily agree; in fact, I do use both scaling engines
> in my sh setup. The argument is as mentioned above: bus usage and
> frame-rate optimisation. So what I am doing is: I try to scale on the
> sensor as close to the target as possible, and then scale further on the
> host (SoC). This works well, only the calculations are not entirely
> trivial. But you only have to perform them once during setup, so it's not
> time-critical. It might be worth implementing such calculations somewhere
> centrally to reduce the chance of errors in specific drivers. The same
> goes for cropping.
>

I think that is a good approach. Considering image quality, though, I should
bypass the scaler when the user requests a resolution that the external
camera ISP supports directly, because some of the scalers embedded in camera
interfaces are very poor in image quality and performance and may reduce the
frame rate as well. So the user can choose "with scaler" or "without scaler".
Cheers,

Nate

-- 
DongSoo, Nathaniel Kim
Engineer
Mobile S/W Platform Lab.
Digital Media & Communications R&D Centre
Samsung Electronics CO., LTD.
e-mail : dongsoo.kim@gmail.com
          dongsoo45.kim@samsung.com


* Re: V4L-DVB Summit Day 1
From: Hans Verkuil @ 2009-09-26 19:31 UTC
  To: Dongsoo, Nathaniel Kim; +Cc: Guennadi Liakhovetski, Linux Media Mailing List


> On Sat, Sep 26, 2009 at 6:32 PM, Guennadi Liakhovetski
> <g.liakhovetski@gmx.de> wrote:
>> Well, I don't necessarily agree; in fact, I do use both scaling engines
>> in my sh setup. The argument is as mentioned above: bus usage and
>> frame-rate optimisation. So what I am doing is: I try to scale on the
>> sensor as close to the target as possible, and then scale further on the
>> host (SoC). This works well, only the calculations are not entirely
>> trivial. But you only have to perform them once during setup, so it's
>> not time-critical. It might be worth implementing such calculations
>> somewhere centrally to reduce the chance of errors in specific drivers.
>> The same goes for cropping.
>
> I think that is a good approach. Considering image quality, though, I
> should bypass the scaler when the user requests a resolution that the
> external camera ISP supports directly, because some of the scalers
> embedded in camera interfaces are very poor in image quality and
> performance and may reduce the frame rate as well. So the user can choose
> "with scaler" or "without scaler".

There are two ways of doing this: one is to have a smart driver that
attempts to do the best thing (soc-camera, uvc, gspca); the other is to give
the application writer full control of the SoC capabilities through the
media controller. Through a media controller you will be able to set up the
sensor scaler and a SoC scaler independently.

For a digital camera, for example, you probably want to be able to control
the hardware from the application in order to get the very best results,
rather than letting the driver do it.
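
To give an idea of the second path, here is a sketch of setting the sensor
and SoC scaler formats independently. The ioctls shown follow the
sub-device format API as it was eventually merged, so this is illustration
only; the device nodes, pad numbers, sizes and bus code are assumptions.

/* Sketch: independent sensor/SoC scaler setup via per-subdev formats. */
#include <fcntl.h>
#include <string.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/media-bus-format.h>
#include <linux/v4l2-subdev.h>

static int set_pad_format(const char *node, unsigned int pad,
			  unsigned int width, unsigned int height)
{
	struct v4l2_subdev_format fmt;
	int fd = open(node, O_RDWR);
	int ret;

	if (fd < 0)
		return -1;
	memset(&fmt, 0, sizeof(fmt));
	fmt.which = V4L2_SUBDEV_FORMAT_ACTIVE;
	fmt.pad = pad;
	fmt.format.width = width;
	fmt.format.height = height;
	fmt.format.code = MEDIA_BUS_FMT_SGRBG10_1X10;   /* assumed bus format */
	ret = ioctl(fd, VIDIOC_SUBDEV_S_FMT, &fmt);
	close(fd);
	return ret;
}

int main(void)
{
	/* Let the sensor bin/skip down to 1296x972 on its source pad ... */
	set_pad_format("/dev/v4l-subdev0", 0, 1296, 972);
	/* ... and have the SoC scaler output exactly 1280x720 on its source pad. */
	set_pad_format("/dev/v4l-subdev1", 1, 1280, 720);
	return 0;
}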

Regards,

         Hans

-- 
Hans Verkuil - video4linux developer - sponsored by TANDBERG Telecom

