* RE: i.mx35 live video
       [not found] ` <alpine.DEB.2.00.1202261207001.17356@axis700.grange>
@ 2012-02-26 13:00   ` Alex Gershgorin
  2012-02-26 14:31     ` Guennadi Liakhovetski
  0 siblings, 1 reply; 7+ messages in thread
From: Alex Gershgorin @ 2012-02-26 13:00 UTC (permalink / raw)
  To: Guennadi Liakhovetski, linux-media


Thanks Guennadi for your quick response.

>Hi Alex
 
> Hi Guennadi,
>
> We would like to use the i.MX35 processor in a new project.
> An important element of the project is to obtain live video from the camera and show it on a display.
> For this purpose, we want to use the mainline Linux kernel, which supports all the drivers needed for this task.
> As I understand it, soc_camera does not currently support the userptr method; in that case, how can I configure
> the video pipeline in user space to get live video on the display without CPU intervention?

>soc-camera does support USERPTR, also the mx3_camera driver claims to
>support it.

I was going by the soc-camera.txt document:

The soc-camera subsystem provides a unified API between camera host drivers and
camera sensor drivers. It implements a V4L2 interface to the user, currently
only the mmap method is supported.

In any case, I'm glad that this is supported :-)

Do you think it is possible to implement video streaming without CPU intervention?

Regards,

Alex Gershgorin

* RE: i.mx35 live video
  2012-02-26 13:00   ` i.mx35 live video Alex Gershgorin
@ 2012-02-26 14:31     ` Guennadi Liakhovetski
  2012-02-26 17:56       ` Alex Gershgorin
  0 siblings, 1 reply; 7+ messages in thread
From: Guennadi Liakhovetski @ 2012-02-26 14:31 UTC (permalink / raw)
  To: Alex Gershgorin; +Cc: linux-media

On Sun, 26 Feb 2012, Alex Gershgorin wrote:

> 
> Thanks Guennadi for your quick response.
> 
> >Hi Alex
>  
> > Hi Guennadi,
> >
> > We would like to use the i.MX35 processor in a new project.
> > An important element of the project is to obtain live video from the camera and show it on a display.
> > For this purpose, we want to use the mainline Linux kernel, which supports all the drivers needed for this task.
> > As I understand it, soc_camera does not currently support the userptr method; in that case, how can I configure
> > the video pipeline in user space to get live video on the display without CPU intervention?
> 
> >soc-camera does support USERPTR, also the mx3_camera driver claims to
> >support it.
> 
> I was going by the soc-camera.txt document:

Yeah, I really have to update it...

> The soc-camera subsystem provides a unified API between camera host drivers and
> camera sensor drivers. It implements a V4L2 interface to the user, currently
> only the mmap method is supported.
> 
> In any case, I'm glad that this is supported :-)
> 
> Do you think it is possible to implement video streaming without CPU
> intervention?

It might be difficult to completely eliminate the CPU; at the very least
you need to queue and dequeue buffers to and from the V4L driver. To avoid
even that, in principle, you could try to use only one buffer, but I don't
think the current version of the mx3_camera driver would be very happy
about that. You could take two buffers and use panning; then you would just
have to queue and dequeue buffers and pan the display. In any case you will
probably have to process buffers, but your most important advantage is that
you won't have to copy data; you only have to move pointers around.

Thanks
Guennadi
---
Guennadi Liakhovetski, Ph.D.
Freelance Open-Source Software Developer
http://www.open-technology.de/


* RE: i.mx35 live video
  2012-02-26 14:31     ` Guennadi Liakhovetski
@ 2012-02-26 17:56       ` Alex Gershgorin
  2012-02-26 20:58         ` Guennadi Liakhovetski
  0 siblings, 1 reply; 7+ messages in thread
From: Alex Gershgorin @ 2012-02-26 17:56 UTC (permalink / raw)
  To: 'Guennadi Liakhovetski'; +Cc: linux-media

> Thanks Guennadi for your quick response.
> 
> >Hi Alex
>  
> > Hi Guennadi,
> >
> > We would like to use the i.MX35 processor in a new project.
> > An important element of the project is to obtain live video from the camera and show it on a display.
> > For this purpose, we want to use the mainline Linux kernel, which supports all the drivers needed for this task.
> > As I understand it, soc_camera does not currently support the userptr method; in that case, how can I configure
> > the video pipeline in user space to get live video on the display without CPU intervention?
> 
> >soc-camera does support USERPTR, also the mx3_camera driver claims to
> >support it.
> 
> I was going by the soc-camera.txt document:

> Yeah, I really have to update it...

> The soc-camera subsystem provides a unified API between camera host drivers and
> camera sensor drivers. It implements a V4L2 interface to the user, currently
> only the mmap method is supported.
> 
> In any case, I'm glad that this is supported :-)
> 
> Do you think it is possible to implement video streaming without CPU
> intervention?

> It might be difficult to completely eliminate the CPU; at the very least
> you need to queue and dequeue buffers to and from the V4L driver. To avoid
> even that, in principle, you could try to use only one buffer, but I don't
> think the current version of the mx3_camera driver would be very happy
> about that. You could take two buffers and use panning; then you would just
> have to queue and dequeue buffers and pan the display. In any case you will
> probably have to process buffers, but your most important advantage is that
> you won't have to copy data; you only have to move pointers around.

The method you describe is exactly what I had in mind.
It would be more correct to say "minimal" CPU intervention rather than no CPU intervention.
As far as I understand, I can mmap the frame buffer device and pass that pointer directly to the mx3_camera driver using the USERPTR method, then queue and dequeue buffers to the mx3_camera driver.
What is not clear is whether I can pass the same frame buffer pointer to mx3_camera if the driver uses two buffers.

Thanks,
Alex Gershgorin

* RE: i.mx35 live video
  2012-02-26 17:56       ` Alex Gershgorin
@ 2012-02-26 20:58         ` Guennadi Liakhovetski
  2012-02-26 21:31           ` Sylwester Nawrocki
  2012-02-27  8:58           ` Alex Gershgorin
  0 siblings, 2 replies; 7+ messages in thread
From: Guennadi Liakhovetski @ 2012-02-26 20:58 UTC (permalink / raw)
  To: Alex Gershgorin; +Cc: linux-media

On Sun, 26 Feb 2012, Alex Gershgorin wrote:

> > Thanks Guennadi for your quick response.
> > 
> > >Hi Alex
> >  
> > > Hi Guennadi,
> > >
> > > We would like to use the i.MX35 processor in a new project.
> > > An important element of the project is to obtain live video from the camera and show it on a display.
> > > For this purpose, we want to use the mainline Linux kernel, which supports all the drivers needed for this task.
> > > As I understand it, soc_camera does not currently support the userptr method; in that case, how can I configure
> > > the video pipeline in user space to get live video on the display without CPU intervention?
> > 
> > >soc-camera does support USERPTR, also the mx3_camera driver claims to
> > >support it.
> > 
> > I was going by the soc-camera.txt document:
> 
> > Yeah, I really have to update it...
> 
> > The soc-camera subsystem provides a unified API between camera host drivers and
> > camera sensor drivers. It implements a V4L2 interface to the user, currently
> > only the mmap method is supported.
> > 
> > In any case, I'm glad that this is supported :-)
> > 
> > Do you think it is possible to implement video streaming without CPU
> > intervention?
> 
> > It might be difficult to completely eliminate the CPU; at the very least
> > you need to queue and dequeue buffers to and from the V4L driver. To avoid
> > even that, in principle, you could try to use only one buffer, but I don't
> > think the current version of the mx3_camera driver would be very happy
> > about that. You could take two buffers and use panning; then you would just
> > have to queue and dequeue buffers and pan the display. In any case you will
> > probably have to process buffers, but your most important advantage is that
> > you won't have to copy data; you only have to move pointers around.
> 
> The method you describe is exactly what I had in mind.
> It would be more correct to say "minimal" CPU intervention rather than no CPU intervention.

> As far as I understand, I can mmap the frame buffer device and pass that
> pointer directly to the mx3_camera driver using the USERPTR method, then
> queue and dequeue buffers to the mx3_camera driver.
> What is not clear is whether I can pass the same frame buffer pointer to
> mx3_camera if the driver uses two buffers.

Sorry, I really don't know for sure. It should work, but I don't think I
tested this myself, nor do I remember anybody reporting having tested this
mode. So you can either search the mailing list archives or just test it.
Begin with a simpler mode: USERPTR with separately allocated buffers,
copying them manually to the framebuffer; then try to switch to just one
buffer in the same mode; then switch to direct framebuffer memory.
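[Editor's note] The first step of this road map, USERPTR with separately allocated buffers, has one common pitfall: USERPTR buffers generally have to be page-aligned, so a plain malloc() pointer may be rejected by the driver. A minimal allocation sketch with illustrative names, not code from the thread:

```c
#include <stdint.h>
#include <stdlib.h>
#include <unistd.h>

/* Allocate page-aligned buffers suitable for V4L2 USERPTR capture;
 * the requested length is rounded up to a whole number of pages. */
static int alloc_capture_bufs(void *bufs[], unsigned count, size_t frame_len)
{
    long page = sysconf(_SC_PAGESIZE);
    size_t len = (frame_len + page - 1) / page * page;
    unsigned i;

    for (i = 0; i < count; i++)
        if (posix_memalign(&bufs[i], (size_t)page, len))
            return -1;
    return 0;
}
```

These buffers cover the "separately allocated" phase; switching to direct framebuffer memory later only changes where the pointers come from, not the queue/dequeue logic.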

Thanks
Guennadi
---
Guennadi Liakhovetski, Ph.D.
Freelance Open-Source Software Developer
http://www.open-technology.de/


* Re: i.mx35 live video
  2012-02-26 20:58         ` Guennadi Liakhovetski
@ 2012-02-26 21:31           ` Sylwester Nawrocki
  2012-02-27 10:44             ` Alex Gershgorin
  2012-02-27  8:58           ` Alex Gershgorin
  1 sibling, 1 reply; 7+ messages in thread
From: Sylwester Nawrocki @ 2012-02-26 21:31 UTC (permalink / raw)
  To: Alex Gershgorin; +Cc: Guennadi Liakhovetski, linux-media

Hi,

On 02/26/2012 09:58 PM, Guennadi Liakhovetski wrote:
>>> It might be difficult to completely eliminate the CPU; at the very least
>>> you need to queue and dequeue buffers to and from the V4L driver. To avoid
>>> even that, in principle, you could try to use only one buffer, but I don't
>>> think the current version of the mx3_camera driver would be very happy
>>> about that. You could take two buffers and use panning; then you would just
>>> have to queue and dequeue buffers and pan the display. In any case you will
>>> probably have to process buffers, but your most important advantage is that
>>> you won't have to copy data; you only have to move pointers around.
>>
>> The method you describe is exactly what I had in mind.
>> It would be more correct to say "minimal" CPU intervention rather than no CPU intervention.
> 
>> As far as I understand, I can mmap the frame buffer device and pass that
>> pointer directly to the mx3_camera driver using the USERPTR method, then
>> queue and dequeue buffers to the mx3_camera driver.
>> What is not clear is whether I can pass the same frame buffer pointer to
>> mx3_camera if the driver uses two buffers.

It should work when you request 2 USERPTR buffers and assign the same
address (the frame buffer start) to both. I've seen setups like this
working with videobuf2-based drivers. However, it's a really poor
configuration: to avoid tearing, you could instead set the framebuffer
virtual window size to contain at least two screen windows and, for the
second buffer, use the framebuffer start address plus a proper offset as
the USERPTR address. Then you can use display panning to show every frame.
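[Editor's note] The double-window scheme described above can be sketched as below: the framebuffer's virtual y-resolution is assumed to be twice the visible one, even frames land in the top half, odd frames in the bottom half, and FBIOPAN_DISPLAY flips between them. An untested sketch with illustrative names, not code from the thread.

```c
#include <stddef.h>
#include <sys/ioctl.h>
#include <linux/fb.h>

/* y-offset of the window that holds frame n: even frames go to the
 * top half of the virtual framebuffer, odd frames to the bottom half. */
static unsigned pan_yoffset(unsigned frame, unsigned yres)
{
    return (frame % 2) * yres;
}

/* Byte offset of the second USERPTR buffer inside the framebuffer:
 * one full visible screen below the start address. */
static size_t second_buf_offset(unsigned yres, unsigned line_length)
{
    return (size_t)yres * line_length;
}

/* After dequeuing frame n, pan the display to the half that was just
 * filled; var must hold the current FBIOGET_VSCREENINFO contents. */
static int pan_to_frame(int fbfd, struct fb_var_screeninfo *var, unsigned frame)
{
    var->yoffset = pan_yoffset(frame, var->yres);
    return ioctl(fbfd, FBIOPAN_DISPLAY, var);
}
```

With this layout the capture driver always writes to the half that is not being displayed, which is what avoids the tearing mentioned above.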

--

Regards,
Sylwester

> Sorry, I really don't know for sure. It should work, but I don't think I
> tested this myself, nor do I remember anybody reporting having tested this
> mode. So you can either search the mailing list archives or just test it.
> Begin with a simpler mode: USERPTR with separately allocated buffers,
> copying them manually to the framebuffer; then try to switch to just one
> buffer in the same mode; then switch to direct framebuffer memory.
> 
> Thanks
> Guennadi
> ---
> Guennadi Liakhovetski, Ph.D.
> Freelance Open-Source Software Developer
> http://www.open-technology.de/

* RE: i.mx35 live video
  2012-02-26 20:58         ` Guennadi Liakhovetski
  2012-02-26 21:31           ` Sylwester Nawrocki
@ 2012-02-27  8:58           ` Alex Gershgorin
  1 sibling, 0 replies; 7+ messages in thread
From: Alex Gershgorin @ 2012-02-27  8:58 UTC (permalink / raw)
  To: 'Guennadi Liakhovetski'; +Cc: linux-media



-----Original Message-----
From: Guennadi Liakhovetski [mailto:g.liakhovetski@gmx.de] 
Sent: Sunday, February 26, 2012 10:58 PM
To: Alex Gershgorin
Cc: linux-media@vger.kernel.org
Subject: RE: i.mx35 live video

On Sun, 26 Feb 2012, Alex Gershgorin wrote:

> > Thanks Guennadi for your quick response.
> > 
> > >Hi Alex
> >  
> > > Hi Guennadi,
> > >
> > > We would like to use the i.MX35 processor in a new project.
> > > An important element of the project is to obtain live video from the camera and show it on a display.
> > > For this purpose, we want to use the mainline Linux kernel, which supports all the drivers needed for this task.
> > > As I understand it, soc_camera does not currently support the userptr method; in that case, how can I configure
> > > the video pipeline in user space to get live video on the display without CPU intervention?
> > 
> > >soc-camera does support USERPTR, also the mx3_camera driver claims to
> > >support it.
> > 
> > I was going by the soc-camera.txt document:
> 
> > Yeah, I really have to update it...
> 
> > The soc-camera subsystem provides a unified API between camera host drivers and
> > camera sensor drivers. It implements a V4L2 interface to the user, currently
> > only the mmap method is supported.
> > 
> > In any case, I'm glad that this is supported :-)
> > 
> > Do you think it is possible to implement video streaming without CPU
> > intervention?
> 
> > It might be difficult to completely eliminate the CPU; at the very least
> > you need to queue and dequeue buffers to and from the V4L driver. To avoid
> > even that, in principle, you could try to use only one buffer, but I don't
> > think the current version of the mx3_camera driver would be very happy
> > about that. You could take two buffers and use panning; then you would just
> > have to queue and dequeue buffers and pan the display. In any case you will
> > probably have to process buffers, but your most important advantage is that
> > you won't have to copy data; you only have to move pointers around.
> 
> The method you describe is exactly what I had in mind.
> It would be more correct to say "minimal" CPU intervention rather than no CPU intervention.

> As far as I understand, I can mmap the frame buffer device and pass that
> pointer directly to the mx3_camera driver using the USERPTR method, then
> queue and dequeue buffers to the mx3_camera driver.
> What is not clear is whether I can pass the same frame buffer pointer to
> mx3_camera if the driver uses two buffers.

> Sorry, I really don't know for sure. It should work, but I don't think I
> tested this myself, nor do I remember anybody reporting having tested this
> mode. So you can either search the mailing list archives or just test it.
> Begin with a simpler mode: USERPTR with separately allocated buffers,
> copying them manually to the framebuffer; then try to switch to just one
> buffer in the same mode; then switch to direct framebuffer memory.

Thanks Guennadi, this is a good road map; I will test it in the near future and get back to you.

Regards,
Alex Gershgorin

* RE: i.mx35 live video
  2012-02-26 21:31           ` Sylwester Nawrocki
@ 2012-02-27 10:44             ` Alex Gershgorin
  0 siblings, 0 replies; 7+ messages in thread
From: Alex Gershgorin @ 2012-02-27 10:44 UTC (permalink / raw)
  To: 'Sylwester Nawrocki'; +Cc: Guennadi Liakhovetski, linux-media

Hi,

On 02/26/2012 09:58 PM, Guennadi Liakhovetski wrote:
>>> It might be difficult to completely eliminate the CPU; at the very least
>>> you need to queue and dequeue buffers to and from the V4L driver. To avoid
>>> even that, in principle, you could try to use only one buffer, but I don't
>>> think the current version of the mx3_camera driver would be very happy
>>> about that. You could take two buffers and use panning; then you would just
>>> have to queue and dequeue buffers and pan the display. In any case you will
>>> probably have to process buffers, but your most important advantage is that
>>> you won't have to copy data; you only have to move pointers around.
>>
>> The method you describe is exactly what I had in mind.
>> It would be more correct to say "minimal" CPU intervention rather than no CPU intervention.
> 
>> As far as I understand, I can mmap the frame buffer device and pass that
>> pointer directly to the mx3_camera driver using the USERPTR method, then
>> queue and dequeue buffers to the mx3_camera driver.
>> What is not clear is whether I can pass the same frame buffer pointer to
>> mx3_camera if the driver uses two buffers.

> It should work when you request 2 USERPTR buffers and assign the same
> address (the frame buffer start) to both. I've seen setups like this
> working with videobuf2-based drivers.

Thanks for the information, this is what I had in mind :-)

> However, it's a really poor configuration: to avoid tearing, you could
> instead set the framebuffer virtual window size to contain at least two
> screen windows and, for the second buffer, use the framebuffer start
> address plus a proper offset as the USERPTR address. Then you can use
> display panning to show every frame.

Looks good, I'll try to implement this method.
Thank you for your advice.

Regards,
Alex Gershgorin

end of thread, other threads:[~2012-02-27 10:44 UTC | newest]

Thread overview: 7+ messages (download: mbox.gz / follow: Atom feed)
-- links below jump to the message on this page --
     [not found] <4875438356E7CA4A8F2145FCD3E61C0B2C8966B289@MEP-EXCH.meprolight.com>
     [not found] ` <alpine.DEB.2.00.1202261207001.17356@axis700.grange>
2012-02-26 13:00   ` i.mx35 live video Alex Gershgorin
2012-02-26 14:31     ` Guennadi Liakhovetski
2012-02-26 17:56       ` Alex Gershgorin
2012-02-26 20:58         ` Guennadi Liakhovetski
2012-02-26 21:31           ` Sylwester Nawrocki
2012-02-27 10:44             ` Alex Gershgorin
2012-02-27  8:58           ` Alex Gershgorin
