* [PATCH/RFC 0/9 v2] Image-bus API and accompanying soc-camera patches
@ 2009-10-30 14:00 Guennadi Liakhovetski
  2009-10-30 14:01 ` [PATCH 1/9] soc-camera: remove no longer needed struct members Guennadi Liakhovetski
                   ` (9 more replies)
  0 siblings, 10 replies; 51+ messages in thread
From: Guennadi Liakhovetski @ 2009-10-30 14:00 UTC (permalink / raw)
  To: Linux Media Mailing List
  Cc: Hans Verkuil, Laurent Pinchart, Sakari Ailus, Muralidharan Karicheri

Hi all

As discussed yesterday, we want to finalise the conversion of soc-camera 
to v4l2-subdev. The presented 9 patches consist of a couple of clean-ups, 
minor additions to existing APIs, and, most importantly, the second 
version of the image-bus API. It hardly changed since v1, only got 
extended with a couple more formats and driver conversions. The last patch 
modifies mt9t031 sensor driver to enable its use outside of soc-camera. 
Muralidharan, hopefully you'd be able to test it. I'll provide more 
comments in the respective mail. A complete current patch-stack is 
available at

http://download.open-technology.de/soc-camera/20091030/

based on 2.6.32-rc5. Patches not included with these mails have either 
already been pushed via hg or posted to the list earlier.

Thanks
Guennadi
---
Guennadi Liakhovetski, Ph.D.
Freelance Open-Source Software Developer
http://www.open-technology.de/

^ permalink raw reply	[flat|nested] 51+ messages in thread

* [PATCH 1/9] soc-camera: remove no longer needed struct members
  2009-10-30 14:00 [PATCH/RFC 0/9 v2] Image-bus API and accompanying soc-camera patches Guennadi Liakhovetski
@ 2009-10-30 14:01 ` Guennadi Liakhovetski
  2009-10-30 14:01 ` [PATCH 2/9] v4l: add new v4l2-subdev sensor operations, use g_skip_top_lines in soc-camera Guennadi Liakhovetski
                   ` (8 subsequent siblings)
  9 siblings, 0 replies; 51+ messages in thread
From: Guennadi Liakhovetski @ 2009-10-30 14:01 UTC (permalink / raw)
  To: Linux Media Mailing List
  Cc: Hans Verkuil, Laurent Pinchart, Sakari Ailus, Muralidharan Karicheri

Signed-off-by: Guennadi Liakhovetski <g.liakhovetski@gmx.de>
---
 include/media/soc_camera.h |    2 --
 1 files changed, 0 insertions(+), 2 deletions(-)

diff --git a/include/media/soc_camera.h b/include/media/soc_camera.h
index 3d74e60..c5afc8c 100644
--- a/include/media/soc_camera.h
+++ b/include/media/soc_camera.h
@@ -24,8 +24,6 @@ struct soc_camera_device {
 	struct device *pdev;		/* Platform device */
 	s32 user_width;
 	s32 user_height;
-	unsigned short width_min;
-	unsigned short height_min;
 	unsigned short y_skip_top;	/* Lines to skip at the top */
 	unsigned char iface;		/* Host number */
 	unsigned char devnum;		/* Device number per host */
-- 
1.6.2.4



* [PATCH 2/9] v4l: add new v4l2-subdev sensor operations, use g_skip_top_lines in soc-camera
  2009-10-30 14:00 [PATCH/RFC 0/9 v2] Image-bus API and accompanying soc-camera patches Guennadi Liakhovetski
  2009-10-30 14:01 ` [PATCH 1/9] soc-camera: remove no longer needed struct members Guennadi Liakhovetski
@ 2009-10-30 14:01 ` Guennadi Liakhovetski
  2009-10-30 14:43   ` Karicheri, Muralidharan
  2009-11-10 12:55   ` Laurent Pinchart
  2009-10-30 14:01 ` [PATCH 3/9] soc-camera: fix multi-line comment coding style Guennadi Liakhovetski
                   ` (7 subsequent siblings)
  9 siblings, 2 replies; 51+ messages in thread
From: Guennadi Liakhovetski @ 2009-10-30 14:01 UTC (permalink / raw)
  To: Linux Media Mailing List
  Cc: Hans Verkuil, Laurent Pinchart, Sakari Ailus, Muralidharan Karicheri

Introduce new v4l2-subdev sensor operations, move .enum_framesizes() and
.enum_frameintervals() methods to it, add a new .g_skip_top_lines() method
and switch soc-camera to use it instead of .y_skip_top soc_camera_device
member, which can now be removed.

Signed-off-by: Guennadi Liakhovetski <g.liakhovetski@gmx.de>
Reviewed-by: Hans Verkuil <hverkuil@xs4all.nl>
Reviewed-by: Sergio Aguirre <saaguirre@ti.com>
---
 drivers/media/video/mt9m001.c             |   30 ++++++++++++++++++++------
 drivers/media/video/mt9m111.c             |    1 -
 drivers/media/video/mt9t031.c             |    8 ++----
 drivers/media/video/mt9v022.c             |   32 ++++++++++++++++++++--------
 drivers/media/video/pxa_camera.c          |    9 ++++++-
 drivers/media/video/soc_camera_platform.c |    1 -
 include/media/soc_camera.h                |    1 -
 include/media/v4l2-subdev.h               |   13 +++++++++++
 8 files changed, 69 insertions(+), 26 deletions(-)

diff --git a/drivers/media/video/mt9m001.c b/drivers/media/video/mt9m001.c
index 45388d2..17be2d4 100644
--- a/drivers/media/video/mt9m001.c
+++ b/drivers/media/video/mt9m001.c
@@ -82,6 +82,7 @@ struct mt9m001 {
 	int model;	/* V4L2_IDENT_MT9M001* codes from v4l2-chip-ident.h */
 	unsigned int gain;
 	unsigned int exposure;
+	unsigned short y_skip_top;	/* Lines to skip at the top */
 	unsigned char autoexposure;
 };
 
@@ -222,7 +223,7 @@ static int mt9m001_s_crop(struct v4l2_subdev *sd, struct v4l2_crop *a)
 	soc_camera_limit_side(&rect.top, &rect.height,
 		     MT9M001_ROW_SKIP, MT9M001_MIN_HEIGHT, MT9M001_MAX_HEIGHT);
 
-	total_h = rect.height + icd->y_skip_top + vblank;
+	total_h = rect.height + mt9m001->y_skip_top + vblank;
 
 	/* Blanking and start values - default... */
 	ret = reg_write(client, MT9M001_HORIZONTAL_BLANKING, hblank);
@@ -239,7 +240,7 @@ static int mt9m001_s_crop(struct v4l2_subdev *sd, struct v4l2_crop *a)
 		ret = reg_write(client, MT9M001_WINDOW_WIDTH, rect.width - 1);
 	if (!ret)
 		ret = reg_write(client, MT9M001_WINDOW_HEIGHT,
-				rect.height + icd->y_skip_top - 1);
+				rect.height + mt9m001->y_skip_top - 1);
 	if (!ret && mt9m001->autoexposure) {
 		ret = reg_write(client, MT9M001_SHUTTER_WIDTH, total_h);
 		if (!ret) {
@@ -327,13 +328,13 @@ static int mt9m001_s_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
 static int mt9m001_try_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
 {
 	struct i2c_client *client = sd->priv;
-	struct soc_camera_device *icd = client->dev.platform_data;
+	struct mt9m001 *mt9m001 = to_mt9m001(client);
 	struct v4l2_pix_format *pix = &f->fmt.pix;
 
 	v4l_bound_align_image(&pix->width, MT9M001_MIN_WIDTH,
 		MT9M001_MAX_WIDTH, 1,
-		&pix->height, MT9M001_MIN_HEIGHT + icd->y_skip_top,
-		MT9M001_MAX_HEIGHT + icd->y_skip_top, 0, 0);
+		&pix->height, MT9M001_MIN_HEIGHT + mt9m001->y_skip_top,
+		MT9M001_MAX_HEIGHT + mt9m001->y_skip_top, 0, 0);
 
 	if (pix->pixelformat == V4L2_PIX_FMT_SBGGR8 ||
 	    pix->pixelformat == V4L2_PIX_FMT_SBGGR16)
@@ -552,7 +553,7 @@ static int mt9m001_s_ctrl(struct v4l2_subdev *sd, struct v4l2_control *ctrl)
 		if (ctrl->value) {
 			const u16 vblank = 25;
 			unsigned int total_h = mt9m001->rect.height +
-				icd->y_skip_top + vblank;
+				mt9m001->y_skip_top + vblank;
 			if (reg_write(client, MT9M001_SHUTTER_WIDTH,
 				      total_h) < 0)
 				return -EIO;
@@ -655,6 +656,16 @@ static void mt9m001_video_remove(struct soc_camera_device *icd)
 		icl->free_bus(icl);
 }
 
+static int mt9m001_g_skip_top_lines(struct v4l2_subdev *sd, u32 *lines)
+{
+	struct i2c_client *client = sd->priv;
+	struct mt9m001 *mt9m001 = to_mt9m001(client);
+
+	*lines = mt9m001->y_skip_top;
+
+	return 0;
+}
+
 static struct v4l2_subdev_core_ops mt9m001_subdev_core_ops = {
 	.g_ctrl		= mt9m001_g_ctrl,
 	.s_ctrl		= mt9m001_s_ctrl,
@@ -675,9 +686,14 @@ static struct v4l2_subdev_video_ops mt9m001_subdev_video_ops = {
 	.cropcap	= mt9m001_cropcap,
 };
 
+static struct v4l2_subdev_sensor_ops mt9m001_subdev_sensor_ops = {
+	.g_skip_top_lines	= mt9m001_g_skip_top_lines,
+};
+
 static struct v4l2_subdev_ops mt9m001_subdev_ops = {
 	.core	= &mt9m001_subdev_core_ops,
 	.video	= &mt9m001_subdev_video_ops,
+	.sensor	= &mt9m001_subdev_sensor_ops,
 };
 
 static int mt9m001_probe(struct i2c_client *client,
@@ -714,8 +730,8 @@ static int mt9m001_probe(struct i2c_client *client,
 
 	/* Second stage probe - when a capture adapter is there */
 	icd->ops		= &mt9m001_ops;
-	icd->y_skip_top		= 0;
 
+	mt9m001->y_skip_top	= 0;
 	mt9m001->rect.left	= MT9M001_COLUMN_SKIP;
 	mt9m001->rect.top	= MT9M001_ROW_SKIP;
 	mt9m001->rect.width	= MT9M001_MAX_WIDTH;
diff --git a/drivers/media/video/mt9m111.c b/drivers/media/video/mt9m111.c
index 90da699..30db625 100644
--- a/drivers/media/video/mt9m111.c
+++ b/drivers/media/video/mt9m111.c
@@ -1019,7 +1019,6 @@ static int mt9m111_probe(struct i2c_client *client,
 
 	/* Second stage probe - when a capture adapter is there */
 	icd->ops		= &mt9m111_ops;
-	icd->y_skip_top		= 0;
 
 	mt9m111->rect.left	= MT9M111_MIN_DARK_COLS;
 	mt9m111->rect.top	= MT9M111_MIN_DARK_ROWS;
diff --git a/drivers/media/video/mt9t031.c b/drivers/media/video/mt9t031.c
index 6966f64..57e04e9 100644
--- a/drivers/media/video/mt9t031.c
+++ b/drivers/media/video/mt9t031.c
@@ -301,9 +301,9 @@ static int mt9t031_set_params(struct soc_camera_device *icd,
 		ret = reg_write(client, MT9T031_WINDOW_WIDTH, rect->width - 1);
 	if (ret >= 0)
 		ret = reg_write(client, MT9T031_WINDOW_HEIGHT,
-				rect->height + icd->y_skip_top - 1);
+				rect->height - 1);
 	if (ret >= 0 && mt9t031->autoexposure) {
-		unsigned int total_h = rect->height + icd->y_skip_top + vblank;
+		unsigned int total_h = rect->height + vblank;
 		ret = set_shutter(client, total_h);
 		if (ret >= 0) {
 			const u32 shutter_max = MT9T031_MAX_HEIGHT + vblank;
@@ -656,8 +656,7 @@ static int mt9t031_s_ctrl(struct v4l2_subdev *sd, struct v4l2_control *ctrl)
 		if (ctrl->value) {
 			const u16 vblank = MT9T031_VERTICAL_BLANK;
 			const u32 shutter_max = MT9T031_MAX_HEIGHT + vblank;
-			unsigned int total_h = mt9t031->rect.height +
-				icd->y_skip_top + vblank;
+			unsigned int total_h = mt9t031->rect.height + vblank;
 
 			if (set_shutter(client, total_h) < 0)
 				return -EIO;
@@ -773,7 +772,6 @@ static int mt9t031_probe(struct i2c_client *client,
 
 	/* Second stage probe - when a capture adapter is there */
 	icd->ops		= &mt9t031_ops;
-	icd->y_skip_top		= 0;
 
 	mt9t031->rect.left	= MT9T031_COLUMN_SKIP;
 	mt9t031->rect.top	= MT9T031_ROW_SKIP;
diff --git a/drivers/media/video/mt9v022.c b/drivers/media/video/mt9v022.c
index 995607f..b71898f 100644
--- a/drivers/media/video/mt9v022.c
+++ b/drivers/media/video/mt9v022.c
@@ -97,6 +97,7 @@ struct mt9v022 {
 	__u32 fourcc;
 	int model;	/* V4L2_IDENT_MT9V022* codes from v4l2-chip-ident.h */
 	u16 chip_control;
+	unsigned short y_skip_top;	/* Lines to skip at the top */
 };
 
 static struct mt9v022 *to_mt9v022(const struct i2c_client *client)
@@ -265,7 +266,6 @@ static int mt9v022_s_crop(struct v4l2_subdev *sd, struct v4l2_crop *a)
 	struct i2c_client *client = sd->priv;
 	struct mt9v022 *mt9v022 = to_mt9v022(client);
 	struct v4l2_rect rect = a->c;
-	struct soc_camera_device *icd = client->dev.platform_data;
 	int ret;
 
 	/* Bayer format - even size lengths */
@@ -287,10 +287,10 @@ static int mt9v022_s_crop(struct v4l2_subdev *sd, struct v4l2_crop *a)
 	if (ret >= 0) {
 		if (ret & 1) /* Autoexposure */
 			ret = reg_write(client, MT9V022_MAX_TOTAL_SHUTTER_WIDTH,
-					rect.height + icd->y_skip_top + 43);
+					rect.height + mt9v022->y_skip_top + 43);
 		else
 			ret = reg_write(client, MT9V022_TOTAL_SHUTTER_WIDTH,
-					rect.height + icd->y_skip_top + 43);
+					rect.height + mt9v022->y_skip_top + 43);
 	}
 	/* Setup frame format: defaults apart from width and height */
 	if (!ret)
@@ -309,7 +309,7 @@ static int mt9v022_s_crop(struct v4l2_subdev *sd, struct v4l2_crop *a)
 		ret = reg_write(client, MT9V022_WINDOW_WIDTH, rect.width);
 	if (!ret)
 		ret = reg_write(client, MT9V022_WINDOW_HEIGHT,
-				rect.height + icd->y_skip_top);
+				rect.height + mt9v022->y_skip_top);
 
 	if (ret < 0)
 		return ret;
@@ -410,15 +410,15 @@ static int mt9v022_s_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
 static int mt9v022_try_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
 {
 	struct i2c_client *client = sd->priv;
-	struct soc_camera_device *icd = client->dev.platform_data;
+	struct mt9v022 *mt9v022 = to_mt9v022(client);
 	struct v4l2_pix_format *pix = &f->fmt.pix;
 	int align = pix->pixelformat == V4L2_PIX_FMT_SBGGR8 ||
 		pix->pixelformat == V4L2_PIX_FMT_SBGGR16;
 
 	v4l_bound_align_image(&pix->width, MT9V022_MIN_WIDTH,
 		MT9V022_MAX_WIDTH, align,
-		&pix->height, MT9V022_MIN_HEIGHT + icd->y_skip_top,
-		MT9V022_MAX_HEIGHT + icd->y_skip_top, align, 0);
+		&pix->height, MT9V022_MIN_HEIGHT + mt9v022->y_skip_top,
+		MT9V022_MAX_HEIGHT + mt9v022->y_skip_top, align, 0);
 
 	return 0;
 }
@@ -787,6 +787,16 @@ static void mt9v022_video_remove(struct soc_camera_device *icd)
 		icl->free_bus(icl);
 }
 
+static int mt9v022_g_skip_top_lines(struct v4l2_subdev *sd, u32 *lines)
+{
+	struct i2c_client *client = sd->priv;
+	struct mt9v022 *mt9v022 = to_mt9v022(client);
+
+	*lines = mt9v022->y_skip_top;
+
+	return 0;
+}
+
 static struct v4l2_subdev_core_ops mt9v022_subdev_core_ops = {
 	.g_ctrl		= mt9v022_g_ctrl,
 	.s_ctrl		= mt9v022_s_ctrl,
@@ -807,9 +817,14 @@ static struct v4l2_subdev_video_ops mt9v022_subdev_video_ops = {
 	.cropcap	= mt9v022_cropcap,
 };
 
+static struct v4l2_subdev_sensor_ops mt9v022_subdev_sensor_ops = {
+	.g_skip_top_lines	= mt9v022_g_skip_top_lines,
+};
+
 static struct v4l2_subdev_ops mt9v022_subdev_ops = {
 	.core	= &mt9v022_subdev_core_ops,
 	.video	= &mt9v022_subdev_video_ops,
+	.sensor	= &mt9v022_subdev_sensor_ops,
 };
 
 static int mt9v022_probe(struct i2c_client *client,
@@ -851,8 +866,7 @@ static int mt9v022_probe(struct i2c_client *client,
 	 * MT9V022 _really_ corrupts the first read out line.
 	 * TODO: verify on i.MX31
 	 */
-	icd->y_skip_top		= 1;
-
+	mt9v022->y_skip_top	= 1;
 	mt9v022->rect.left	= MT9V022_COLUMN_SKIP;
 	mt9v022->rect.top	= MT9V022_ROW_SKIP;
 	mt9v022->rect.width	= MT9V022_MAX_WIDTH;
diff --git a/drivers/media/video/pxa_camera.c b/drivers/media/video/pxa_camera.c
index 51b683c..4df09a6 100644
--- a/drivers/media/video/pxa_camera.c
+++ b/drivers/media/video/pxa_camera.c
@@ -1051,8 +1051,13 @@ static void pxa_camera_setup_cicr(struct soc_camera_device *icd,
 {
 	struct soc_camera_host *ici = to_soc_camera_host(icd->dev.parent);
 	struct pxa_camera_dev *pcdev = ici->priv;
+	struct v4l2_subdev *sd = soc_camera_to_subdev(icd);
 	unsigned long dw, bpp;
-	u32 cicr0, cicr1, cicr2, cicr3, cicr4 = 0;
+	u32 cicr0, cicr1, cicr2, cicr3, cicr4 = 0, y_skip_top;
+	int ret = v4l2_subdev_call(sd, sensor, g_skip_top_lines, &y_skip_top);
+
+	if (ret < 0)
+		y_skip_top = 0;
 
 	/* Datawidth is now guaranteed to be equal to one of the three values.
 	 * We fix bit-per-pixel equal to data-width... */
@@ -1118,7 +1123,7 @@ static void pxa_camera_setup_cicr(struct soc_camera_device *icd,
 
 	cicr2 = 0;
 	cicr3 = CICR3_LPF_VAL(icd->user_height - 1) |
-		CICR3_BFW_VAL(min((unsigned short)255, icd->y_skip_top));
+		CICR3_BFW_VAL(min((u32)255, y_skip_top));
 	cicr4 |= pcdev->mclk_divisor;
 
 	__raw_writel(cicr1, pcdev->base + CICR1);
diff --git a/drivers/media/video/soc_camera_platform.c b/drivers/media/video/soc_camera_platform.c
index b6a575c..c7c9151 100644
--- a/drivers/media/video/soc_camera_platform.c
+++ b/drivers/media/video/soc_camera_platform.c
@@ -128,7 +128,6 @@ static int soc_camera_platform_probe(struct platform_device *pdev)
 	/* Set the control device reference */
 	dev_set_drvdata(&icd->dev, &pdev->dev);
 
-	icd->y_skip_top		= 0;
 	icd->ops		= &soc_camera_platform_ops;
 
 	ici = to_soc_camera_host(icd->dev.parent);
diff --git a/include/media/soc_camera.h b/include/media/soc_camera.h
index c5afc8c..218639f 100644
--- a/include/media/soc_camera.h
+++ b/include/media/soc_camera.h
@@ -24,7 +24,6 @@ struct soc_camera_device {
 	struct device *pdev;		/* Platform device */
 	s32 user_width;
 	s32 user_height;
-	unsigned short y_skip_top;	/* Lines to skip at the top */
 	unsigned char iface;		/* Host number */
 	unsigned char devnum;		/* Device number per host */
 	unsigned char buswidth;		/* See comment in .c */
diff --git a/include/media/v4l2-subdev.h b/include/media/v4l2-subdev.h
index d411345..04193eb 100644
--- a/include/media/v4l2-subdev.h
+++ b/include/media/v4l2-subdev.h
@@ -227,8 +227,20 @@ struct v4l2_subdev_video_ops {
 	int (*s_crop)(struct v4l2_subdev *sd, struct v4l2_crop *crop);
 	int (*g_parm)(struct v4l2_subdev *sd, struct v4l2_streamparm *param);
 	int (*s_parm)(struct v4l2_subdev *sd, struct v4l2_streamparm *param);
+};
+
+/**
+ * struct v4l2_subdev_sensor_ops - v4l2-subdev sensor operations
+ * @enum_framesizes: enumerate supported framesizes
+ * @enum_frameintervals: enumerate supported frame format intervals
+ * @g_skip_top_lines: number of lines at the top of the image to be skipped.
+ *		      This is needed for some sensors, that always corrupt
+ *		      several top lines of the output image.
+ */
+struct v4l2_subdev_sensor_ops {
 	int (*enum_framesizes)(struct v4l2_subdev *sd, struct v4l2_frmsizeenum *fsize);
 	int (*enum_frameintervals)(struct v4l2_subdev *sd, struct v4l2_frmivalenum *fival);
+	int (*g_skip_top_lines)(struct v4l2_subdev *sd, u32 *lines);
 };
 
 struct v4l2_subdev_ops {
@@ -236,6 +248,7 @@ struct v4l2_subdev_ops {
 	const struct v4l2_subdev_tuner_ops *tuner;
 	const struct v4l2_subdev_audio_ops *audio;
 	const struct v4l2_subdev_video_ops *video;
+	const struct v4l2_subdev_sensor_ops *sensor;
 };
 
 #define V4L2_SUBDEV_NAME_SIZE 32
-- 
1.6.2.4



* [PATCH 3/9] soc-camera: fix multi-line comment coding style
  2009-10-30 14:00 [PATCH/RFC 0/9 v2] Image-bus API and accompanying soc-camera patches Guennadi Liakhovetski
  2009-10-30 14:01 ` [PATCH 1/9] soc-camera: remove no longer needed struct members Guennadi Liakhovetski
  2009-10-30 14:01 ` [PATCH 2/9] v4l: add new v4l2-subdev sensor operations, use g_skip_top_lines in soc-camera Guennadi Liakhovetski
@ 2009-10-30 14:01 ` Guennadi Liakhovetski
  2009-10-30 14:01 ` [PATCH 4/9] v4l: Add a 10-bit monochrome and missing 8- and 10-bit Bayer fourcc codes Guennadi Liakhovetski
                   ` (6 subsequent siblings)
  9 siblings, 0 replies; 51+ messages in thread
From: Guennadi Liakhovetski @ 2009-10-30 14:01 UTC (permalink / raw)
  To: Linux Media Mailing List
  Cc: Hans Verkuil, Laurent Pinchart, Sakari Ailus, Muralidharan Karicheri

Signed-off-by: Guennadi Liakhovetski <g.liakhovetski@gmx.de>
---
 drivers/media/video/mt9m001.c              |   36 ++++++++++++------
 drivers/media/video/mt9t031.c              |   24 ++++++++----
 drivers/media/video/mt9v022.c              |   48 ++++++++++++++++--------
 drivers/media/video/mx1_camera.c           |   36 ++++++++++++------
 drivers/media/video/mx3_camera.c           |   18 ++++++---
 drivers/media/video/pxa_camera.c           |   54 ++++++++++++++++++---------
 drivers/media/video/sh_mobile_ceu_camera.c |   18 ++++++---
 drivers/media/video/soc_camera.c           |   24 ++++++++----
 drivers/media/video/tw9910.c               |    8 ++--
 include/media/ov772x.h                     |    3 +-
 10 files changed, 178 insertions(+), 91 deletions(-)

diff --git a/drivers/media/video/mt9m001.c b/drivers/media/video/mt9m001.c
index 17be2d4..cc90660 100644
--- a/drivers/media/video/mt9m001.c
+++ b/drivers/media/video/mt9m001.c
@@ -17,9 +17,11 @@
 #include <media/v4l2-chip-ident.h>
 #include <media/soc_camera.h>
 
-/* mt9m001 i2c address 0x5d
+/*
+ * mt9m001 i2c address 0x5d
  * The platform has to define ctruct i2c_board_info objects and link to them
- * from struct soc_camera_link */
+ * from struct soc_camera_link
+ */
 
 /* mt9m001 selected register addresses */
 #define MT9M001_CHIP_VERSION		0x00
@@ -47,8 +49,10 @@
 #define MT9M001_ROW_SKIP		12
 
 static const struct soc_camera_data_format mt9m001_colour_formats[] = {
-	/* Order important: first natively supported,
-	 * second supported with a GPIO extender */
+	/*
+	 * Order important: first natively supported,
+	 * second supported with a GPIO extender
+	 */
 	{
 		.name		= "Bayer (sRGB) 10 bit",
 		.depth		= 10,
@@ -230,8 +234,10 @@ static int mt9m001_s_crop(struct v4l2_subdev *sd, struct v4l2_crop *a)
 	if (!ret)
 		ret = reg_write(client, MT9M001_VERTICAL_BLANKING, vblank);
 
-	/* The caller provides a supported format, as verified per
-	 * call to icd->try_fmt() */
+	/*
+	 * The caller provides a supported format, as verified per
+	 * call to icd->try_fmt()
+	 */
 	if (!ret)
 		ret = reg_write(client, MT9M001_COLUMN_START, rect.left);
 	if (!ret)
@@ -569,8 +575,10 @@ static int mt9m001_s_ctrl(struct v4l2_subdev *sd, struct v4l2_control *ctrl)
 	return 0;
 }
 
-/* Interface active, can use i2c. If it fails, it can indeed mean, that
- * this wasn't our capture interface, so, we wait for the right one */
+/*
+ * Interface active, can use i2c. If it fails, it can indeed mean, that
+ * this wasn't our capture interface, so, we wait for the right one
+ */
 static int mt9m001_video_probe(struct soc_camera_device *icd,
 			       struct i2c_client *client)
 {
@@ -580,8 +588,10 @@ static int mt9m001_video_probe(struct soc_camera_device *icd,
 	unsigned long flags;
 	int ret;
 
-	/* We must have a parent by now. And it cannot be a wrong one.
-	 * So this entire test is completely redundant. */
+	/*
+	 * We must have a parent by now. And it cannot be a wrong one.
+	 * So this entire test is completely redundant.
+	 */
 	if (!icd->dev.parent ||
 	    to_soc_camera_host(icd->dev.parent)->nr != icd->iface)
 		return -ENODEV;
@@ -737,8 +747,10 @@ static int mt9m001_probe(struct i2c_client *client,
 	mt9m001->rect.width	= MT9M001_MAX_WIDTH;
 	mt9m001->rect.height	= MT9M001_MAX_HEIGHT;
 
-	/* Simulated autoexposure. If enabled, we calculate shutter width
-	 * ourselves in the driver based on vertical blanking and frame width */
+	/*
+	 * Simulated autoexposure. If enabled, we calculate shutter width
+	 * ourselves in the driver based on vertical blanking and frame width
+	 */
 	mt9m001->autoexposure = 1;
 
 	ret = mt9m001_video_probe(icd, client);
diff --git a/drivers/media/video/mt9t031.c b/drivers/media/video/mt9t031.c
index 57e04e9..0d2a8fd 100644
--- a/drivers/media/video/mt9t031.c
+++ b/drivers/media/video/mt9t031.c
@@ -17,9 +17,11 @@
 #include <media/v4l2-chip-ident.h>
 #include <media/soc_camera.h>
 
-/* mt9t031 i2c address 0x5d
+/*
+ * mt9t031 i2c address 0x5d
  * The platform has to define i2c_board_info and link to it from
- * struct soc_camera_link */
+ * struct soc_camera_link
+ */
 
 /* mt9t031 selected register addresses */
 #define MT9T031_CHIP_VERSION		0x00
@@ -291,8 +293,10 @@ static int mt9t031_set_params(struct soc_camera_device *icd,
 	dev_dbg(&client->dev, "new physical left %u, top %u\n",
 		rect->left, rect->top);
 
-	/* The caller provides a supported format, as guaranteed by
-	 * icd->try_fmt_cap(), soc_camera_s_crop() and soc_camera_cropcap() */
+	/*
+	 * The caller provides a supported format, as guaranteed by
+	 * icd->try_fmt_cap(), soc_camera_s_crop() and soc_camera_cropcap()
+	 */
 	if (ret >= 0)
 		ret = reg_write(client, MT9T031_COLUMN_START, rect->left);
 	if (ret >= 0)
@@ -672,8 +676,10 @@ static int mt9t031_s_ctrl(struct v4l2_subdev *sd, struct v4l2_control *ctrl)
 	return 0;
 }
 
-/* Interface active, can use i2c. If it fails, it can indeed mean, that
- * this wasn't our capture interface, so, we wait for the right one */
+/*
+ * Interface active, can use i2c. If it fails, it can indeed mean, that
+ * this wasn't our capture interface, so, we wait for the right one
+ */
 static int mt9t031_video_probe(struct i2c_client *client)
 {
 	struct soc_camera_device *icd = client->dev.platform_data;
@@ -778,8 +784,10 @@ static int mt9t031_probe(struct i2c_client *client,
 	mt9t031->rect.width	= MT9T031_MAX_WIDTH;
 	mt9t031->rect.height	= MT9T031_MAX_HEIGHT;
 
-	/* Simulated autoexposure. If enabled, we calculate shutter width
-	 * ourselves in the driver based on vertical blanking and frame width */
+	/*
+	 * Simulated autoexposure. If enabled, we calculate shutter width
+	 * ourselves in the driver based on vertical blanking and frame width
+	 */
 	mt9t031->autoexposure = 1;
 
 	mt9t031->xskip = 1;
diff --git a/drivers/media/video/mt9v022.c b/drivers/media/video/mt9v022.c
index b71898f..f60a9a1 100644
--- a/drivers/media/video/mt9v022.c
+++ b/drivers/media/video/mt9v022.c
@@ -18,9 +18,11 @@
 #include <media/v4l2-chip-ident.h>
 #include <media/soc_camera.h>
 
-/* mt9v022 i2c address 0x48, 0x4c, 0x58, 0x5c
+/*
+ * mt9v022 i2c address 0x48, 0x4c, 0x58, 0x5c
  * The platform has to define ctruct i2c_board_info objects and link to them
- * from struct soc_camera_link */
+ * from struct soc_camera_link
+ */
 
 static char *sensor_type;
 module_param(sensor_type, charp, S_IRUGO);
@@ -63,8 +65,10 @@ MODULE_PARM_DESC(sensor_type, "Sensor type: \"colour\" or \"monochrome\"");
 #define MT9V022_ROW_SKIP		4
 
 static const struct soc_camera_data_format mt9v022_colour_formats[] = {
-	/* Order important: first natively supported,
-	 * second supported with a GPIO extender */
+	/*
+	 * Order important: first natively supported,
+	 * second supported with a GPIO extender
+	 */
 	{
 		.name		= "Bayer (sRGB) 10 bit",
 		.depth		= 10,
@@ -144,9 +148,11 @@ static int mt9v022_init(struct i2c_client *client)
 	struct mt9v022 *mt9v022 = to_mt9v022(client);
 	int ret;
 
-	/* Almost the default mode: master, parallel, simultaneous, and an
+	/*
+	 * Almost the default mode: master, parallel, simultaneous, and an
 	 * undocumented bit 0x200, which is present in table 7, but not in 8,
-	 * plus snapshot mode to disable scan for now */
+	 * plus snapshot mode to disable scan for now
+	 */
 	mt9v022->chip_control |= 0x10;
 	ret = reg_write(client, MT9V022_CHIP_CONTROL, mt9v022->chip_control);
 	if (!ret)
@@ -298,8 +304,10 @@ static int mt9v022_s_crop(struct v4l2_subdev *sd, struct v4l2_crop *a)
 	if (!ret)
 		ret = reg_write(client, MT9V022_ROW_START, rect.top);
 	if (!ret)
-		/* Default 94, Phytec driver says:
-		 * "width + horizontal blank >= 660" */
+		/*
+		 * Default 94, Phytec driver says:
+		 * "width + horizontal blank >= 660"
+		 */
 		ret = reg_write(client, MT9V022_HORIZONTAL_BLANKING,
 				rect.width > 660 - 43 ? 43 :
 				660 - rect.width);
@@ -376,8 +384,10 @@ static int mt9v022_s_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
 	};
 	int ret;
 
-	/* The caller provides a supported format, as verified per call to
-	 * icd->try_fmt(), datawidth is from our supported format list */
+	/*
+	 * The caller provides a supported format, as verified per call to
+	 * icd->try_fmt(), datawidth is from our supported format list
+	 */
 	switch (pix->pixelformat) {
 	case V4L2_PIX_FMT_GREY:
 	case V4L2_PIX_FMT_Y16:
@@ -635,8 +645,10 @@ static int mt9v022_s_ctrl(struct v4l2_subdev *sd, struct v4l2_control *ctrl)
 					      48 + range / 2) / range + 16;
 			if (gain >= 32)
 				gain &= ~1;
-			/* The user wants to set gain manually, hope, she
-			 * knows, what she's doing... Switch AGC off. */
+			/*
+			 * The user wants to set gain manually, hope, she
+			 * knows, what she's doing... Switch AGC off.
+			 */
 
 			if (reg_clear(client, MT9V022_AEC_AGC_ENABLE, 0x2) < 0)
 				return -EIO;
@@ -655,8 +667,10 @@ static int mt9v022_s_ctrl(struct v4l2_subdev *sd, struct v4l2_control *ctrl)
 			unsigned long range = qctrl->maximum - qctrl->minimum;
 			unsigned long shutter = ((ctrl->value - qctrl->minimum) *
 						 479 + range / 2) / range + 1;
-			/* The user wants to set shutter width manually, hope,
-			 * she knows, what she's doing... Switch AEC off. */
+			/*
+			 * The user wants to set shutter width manually, hope,
+			 * she knows, what she's doing... Switch AEC off.
+			 */
 
 			if (reg_clear(client, MT9V022_AEC_AGC_ENABLE, 0x1) < 0)
 				return -EIO;
@@ -689,8 +703,10 @@ static int mt9v022_s_ctrl(struct v4l2_subdev *sd, struct v4l2_control *ctrl)
 	return 0;
 }
 
-/* Interface active, can use i2c. If it fails, it can indeed mean, that
- * this wasn't our capture interface, so, we wait for the right one */
+/*
+ * Interface active, can use i2c. If it fails, it can indeed mean, that
+ * this wasn't our capture interface, so, we wait for the right one
+ */
 static int mt9v022_video_probe(struct soc_camera_device *icd,
 			       struct i2c_client *client)
 {
diff --git a/drivers/media/video/mx1_camera.c b/drivers/media/video/mx1_camera.c
index 5f37952..659d20a 100644
--- a/drivers/media/video/mx1_camera.c
+++ b/drivers/media/video/mx1_camera.c
@@ -98,9 +98,11 @@ struct mx1_buffer {
 	int inwork;
 };
 
-/* i.MX1/i.MXL is only supposed to handle one camera on its Camera Sensor
+/*
+ * i.MX1/i.MXL is only supposed to handle one camera on its Camera Sensor
  * Interface. If anyone ever builds hardware to enable more than
- * one camera, they will have to modify this driver too */
+ * one camera, they will have to modify this driver too
+ */
 struct mx1_camera_dev {
 	struct soc_camera_host		soc_host;
 	struct soc_camera_device	*icd;
@@ -150,8 +152,10 @@ static void free_buffer(struct videobuf_queue *vq, struct mx1_buffer *buf)
 	dev_dbg(icd->dev.parent, "%s (vb=0x%p) 0x%08lx %d\n", __func__,
 		vb, vb->baddr, vb->bsize);
 
-	/* This waits until this buffer is out of danger, i.e., until it is no
-	 * longer in STATE_QUEUED or STATE_ACTIVE */
+	/*
+	 * This waits until this buffer is out of danger, i.e., until it is no
+	 * longer in STATE_QUEUED or STATE_ACTIVE
+	 */
 	videobuf_waiton(vb, 0, 0);
 	videobuf_dma_contig_free(vq, vb);
 
@@ -173,8 +177,10 @@ static int mx1_videobuf_prepare(struct videobuf_queue *vq,
 
 	BUG_ON(NULL == icd->current_fmt);
 
-	/* I think, in buf_prepare you only have to protect global data,
-	 * the actual buffer is yours */
+	/*
+	 * I think, in buf_prepare you only have to protect global data,
+	 * the actual buffer is yours
+	 */
 	buf->inwork = 1;
 
 	if (buf->fmt	!= icd->current_fmt ||
@@ -380,8 +386,10 @@ static int mclk_get_divisor(struct mx1_camera_dev *pcdev)
 
 	lcdclk = clk_get_rate(pcdev->clk);
 
-	/* We verify platform_mclk_10khz != 0, so if anyone breaks it, here
-	 * they get a nice Oops */
+	/*
+	 * We verify platform_mclk_10khz != 0, so if anyone breaks it, here
+	 * they get a nice Oops
+	 */
 	div = (lcdclk + 2 * mclk - 1) / (2 * mclk) - 1;
 
 	dev_dbg(pcdev->icd->dev.parent,
@@ -419,8 +427,10 @@ static void mx1_camera_deactivate(struct mx1_camera_dev *pcdev)
 	clk_disable(pcdev->clk);
 }
 
-/* The following two functions absolutely depend on the fact, that
- * there can be only one camera on i.MX1/i.MXL camera sensor interface */
+/*
+ * The following two functions absolutely depend on the fact, that
+ * there can be only one camera on i.MX1/i.MXL camera sensor interface
+ */
 static int mx1_camera_add_device(struct soc_camera_device *icd)
 {
 	struct soc_camera_host *ici = to_soc_camera_host(icd->dev.parent);
@@ -577,10 +587,12 @@ static int mx1_camera_reqbufs(struct soc_camera_file *icf,
 {
 	int i;
 
-	/* This is for locking debugging only. I removed spinlocks and now I
+	/*
+	 * This is for locking debugging only. I removed spinlocks and now I
 	 * check whether .prepare is ever called on a linked buffer, or whether
 	 * a dma IRQ can occur for an in-work or unlinked buffer. Until now
-	 * it hadn't triggered */
+	 * it hadn't triggered
+	 */
 	for (i = 0; i < p->count; i++) {
 		struct mx1_buffer *buf = container_of(icf->vb_vidq.bufs[i],
 						      struct mx1_buffer, vb);
diff --git a/drivers/media/video/mx3_camera.c b/drivers/media/video/mx3_camera.c
index dff2e5e..545a430 100644
--- a/drivers/media/video/mx3_camera.c
+++ b/drivers/media/video/mx3_camera.c
@@ -563,8 +563,10 @@ static int test_platform_param(struct mx3_camera_dev *mx3_cam,
 		SOCAM_DATA_ACTIVE_HIGH |
 		SOCAM_DATA_ACTIVE_LOW;
 
-	/* If requested data width is supported by the platform, use it or any
-	 * possible lower value - i.MX31 is smart enough to schift bits */
+	/*
+	 * If requested data width is supported by the platform, use it or any
+	 * possible lower value - i.MX31 is smart enough to schift bits
+	 */
 	switch (buswidth) {
 	case 15:
 		if (!(mx3_cam->platform_flags & MX3_CAMERA_DATAWIDTH_15))
@@ -1026,8 +1028,10 @@ static int mx3_camera_set_bus_param(struct soc_camera_device *icd, __u32 pixfmt)
 			common_flags &= ~SOCAM_PCLK_SAMPLE_FALLING;
 	}
 
-	/* Make the camera work in widest common mode, we'll take care of
-	 * the rest */
+	/*
+	 * Make the camera work in widest common mode, we'll take care of
+	 * the rest
+	 */
 	if (common_flags & SOCAM_DATAWIDTH_15)
 		common_flags = (common_flags & ~SOCAM_DATAWIDTH_MASK) |
 			SOCAM_DATAWIDTH_15;
@@ -1151,8 +1155,10 @@ static int __devinit mx3_camera_probe(struct platform_device *pdev)
 	if (!(mx3_cam->platform_flags & (MX3_CAMERA_DATAWIDTH_4 |
 			MX3_CAMERA_DATAWIDTH_8 | MX3_CAMERA_DATAWIDTH_10 |
 			MX3_CAMERA_DATAWIDTH_15))) {
-		/* Platform hasn't set available data widths. This is bad.
-		 * Warn and use a default. */
+		/*
+		 * Platform hasn't set available data widths. This is bad.
+		 * Warn and use a default.
+		 */
 		dev_warn(&pdev->dev, "WARNING! Platform hasn't set available "
 			 "data widths, using default 8 bit\n");
 		mx3_cam->platform_flags |= MX3_CAMERA_DATAWIDTH_8;
diff --git a/drivers/media/video/pxa_camera.c b/drivers/media/video/pxa_camera.c
index 4df09a6..f063f59 100644
--- a/drivers/media/video/pxa_camera.c
+++ b/drivers/media/video/pxa_camera.c
@@ -197,9 +197,11 @@ struct pxa_buffer {
 
 struct pxa_camera_dev {
 	struct soc_camera_host	soc_host;
-	/* PXA27x is only supposed to handle one camera on its Quick Capture
+	/*
+	 * PXA27x is only supposed to handle one camera on its Quick Capture
 	 * interface. If anyone ever builds hardware to enable more than
-	 * one camera, they will have to modify this driver too */
+	 * one camera, they will have to modify this driver too
+	 */
 	struct soc_camera_device *icd;
 	struct clk		*clk;
 
@@ -267,8 +269,10 @@ static void free_buffer(struct videobuf_queue *vq, struct pxa_buffer *buf)
 	dev_dbg(icd->dev.parent, "%s (vb=0x%p) 0x%08lx %d\n", __func__,
 		&buf->vb, buf->vb.baddr, buf->vb.bsize);
 
-	/* This waits until this buffer is out of danger, i.e., until it is no
-	 * longer in STATE_QUEUED or STATE_ACTIVE */
+	/*
+	 * This waits until this buffer is out of danger, i.e., until it is no
+	 * longer in STATE_QUEUED or STATE_ACTIVE
+	 */
 	videobuf_waiton(&buf->vb, 0, 0);
 	videobuf_dma_unmap(vq, dma);
 	videobuf_dma_free(dma);
@@ -437,15 +441,19 @@ static int pxa_videobuf_prepare(struct videobuf_queue *vq,
 	WARN_ON(!list_empty(&vb->queue));
 
 #ifdef DEBUG
-	/* This can be useful if you want to see if we actually fill
-	 * the buffer with something */
+	/*
+	 * This can be useful if you want to see if we actually fill
+	 * the buffer with something
+	 */
 	memset((void *)vb->baddr, 0xaa, vb->bsize);
 #endif
 
 	BUG_ON(NULL == icd->current_fmt);
 
-	/* I think, in buf_prepare you only have to protect global data,
-	 * the actual buffer is yours */
+	/*
+	 * I think, in buf_prepare you only have to protect global data,
+	 * the actual buffer is yours
+	 */
 	buf->inwork = 1;
 
 	if (buf->fmt	!= icd->current_fmt ||
@@ -834,8 +842,10 @@ static void pxa_camera_init_videobuf(struct videobuf_queue *q,
 	struct soc_camera_host *ici = to_soc_camera_host(icd->dev.parent);
 	struct pxa_camera_dev *pcdev = ici->priv;
 
-	/* We must pass NULL as dev pointer, then all pci_* dma operations
-	 * transform to normal dma_* ones. */
+	/*
+	 * We must pass NULL as dev pointer, then all pci_* dma operations
+	 * transform to normal dma_* ones.
+	 */
 	videobuf_queue_sg_init(q, &pxa_videobuf_ops, NULL, &pcdev->lock,
 				V4L2_BUF_TYPE_VIDEO_CAPTURE, V4L2_FIELD_NONE,
 				sizeof(struct pxa_buffer), icd);
@@ -1059,8 +1069,10 @@ static void pxa_camera_setup_cicr(struct soc_camera_device *icd,
 	if (ret < 0)
 		y_skip_top = 0;
 
-	/* Datawidth is now guaranteed to be equal to one of the three values.
-	 * We fix bit-per-pixel equal to data-width... */
+	/*
+	 * Datawidth is now guaranteed to be equal to one of the three values.
+	 * We fix bit-per-pixel equal to data-width...
+	 */
 	switch (flags & SOCAM_DATAWIDTH_MASK) {
 	case SOCAM_DATAWIDTH_10:
 		dw = 4;
@@ -1071,8 +1083,10 @@ static void pxa_camera_setup_cicr(struct soc_camera_device *icd,
 		bpp = 0x20;
 		break;
 	default:
-		/* Actually it can only be 8 now,
-		 * default is just to silence compiler warnings */
+		/*
+		 * Actually it can only be 8 now,
+		 * default is just to silence compiler warnings
+		 */
 	case SOCAM_DATAWIDTH_8:
 		dw = 2;
 		bpp = 0;
@@ -1524,10 +1538,12 @@ static int pxa_camera_reqbufs(struct soc_camera_file *icf,
 {
 	int i;
 
-	/* This is for locking debugging only. I removed spinlocks and now I
+	/*
+	 * This is for locking debugging only. I removed spinlocks and now I
 	 * check whether .prepare is ever called on a linked buffer, or whether
 	 * a dma IRQ can occur for an in-work or unlinked buffer. Until now
-	 * it hadn't triggered */
+	 * it hadn't triggered
+	 */
 	for (i = 0; i < p->count; i++) {
 		struct pxa_buffer *buf = container_of(icf->vb_vidq.bufs[i],
 						      struct pxa_buffer, vb);
@@ -1662,8 +1678,10 @@ static int __devinit pxa_camera_probe(struct platform_device *pdev)
 	pcdev->platform_flags = pcdev->pdata->flags;
 	if (!(pcdev->platform_flags & (PXA_CAMERA_DATAWIDTH_8 |
 			PXA_CAMERA_DATAWIDTH_9 | PXA_CAMERA_DATAWIDTH_10))) {
-		/* Platform hasn't set available data widths. This is bad.
-		 * Warn and use a default. */
+		/*
+		 * Platform hasn't set available data widths. This is bad.
+		 * Warn and use a default.
+		 */
 		dev_warn(&pdev->dev, "WARNING! Platform hasn't set available "
 			 "data widths, using default 10 bit\n");
 		pcdev->platform_flags |= PXA_CAMERA_DATAWIDTH_10;
diff --git a/drivers/media/video/sh_mobile_ceu_camera.c b/drivers/media/video/sh_mobile_ceu_camera.c
index 2f78b4f..6613606 100644
--- a/drivers/media/video/sh_mobile_ceu_camera.c
+++ b/drivers/media/video/sh_mobile_ceu_camera.c
@@ -209,7 +209,8 @@ static void sh_mobile_ceu_capture(struct sh_mobile_ceu_dev *pcdev)
 	struct soc_camera_device *icd = pcdev->icd;
 	dma_addr_t phys_addr_top, phys_addr_bottom;
 
-	/* The hardware is _very_ picky about this sequence. Especially
+	/*
+	 * The hardware is _very_ picky about this sequence. Especially
 	 * the CEU_CETCR_MAGIC value. It seems like we need to acknowledge
 	 * several not-so-well documented interrupt sources in CETCR.
 	 */
@@ -265,8 +266,10 @@ static int sh_mobile_ceu_videobuf_prepare(struct videobuf_queue *vq,
 	WARN_ON(!list_empty(&vb->queue));
 
 #ifdef DEBUG
-	/* This can be useful if you want to see if we actually fill
-	 * the buffer with something */
+	/*
+	 * This can be useful if you want to see if we actually fill
+	 * the buffer with something
+	 */
 	memset((void *)vb->baddr, 0xaa, vb->bsize);
 #endif
 
@@ -653,7 +656,8 @@ static int sh_mobile_ceu_set_bus_param(struct soc_camera_device *icd,
 
 	ceu_write(pcdev, CFLCR, pcdev->cflcr);
 
-	/* A few words about byte order (observed in Big Endian mode)
+	/*
+	 * A few words about byte order (observed in Big Endian mode)
 	 *
 	 * In data fetch mode bytes are received in chunks of 8 bytes.
 	 * D0, D1, D2, D3, D4, D5, D6, D7 (D0 received first)
@@ -1517,10 +1521,12 @@ static int sh_mobile_ceu_reqbufs(struct soc_camera_file *icf,
 {
 	int i;
 
-	/* This is for locking debugging only. I removed spinlocks and now I
+	/*
+	 * This is for locking debugging only. I removed spinlocks and now I
 	 * check whether .prepare is ever called on a linked buffer, or whether
 	 * a dma IRQ can occur for an in-work or unlinked buffer. Until now
-	 * it hadn't triggered */
+	 * it hadn't triggered
+	 */
 	for (i = 0; i < p->count; i++) {
 		struct sh_mobile_ceu_buffer *buf;
 
diff --git a/drivers/media/video/soc_camera.c b/drivers/media/video/soc_camera.c
index 36e617b..bf77935 100644
--- a/drivers/media/video/soc_camera.c
+++ b/drivers/media/video/soc_camera.c
@@ -621,8 +621,10 @@ static int soc_camera_streamoff(struct file *file, void *priv,
 
 	mutex_lock(&icd->video_lock);
 
-	/* This calls buf_release from host driver's videobuf_queue_ops for all
-	 * remaining buffers. When the last buffer is freed, stop capture */
+	/*
+	 * This calls buf_release from host driver's videobuf_queue_ops for all
+	 * remaining buffers. When the last buffer is freed, stop capture
+	 */
 	videobuf_streamoff(&icf->vb_vidq);
 
 	v4l2_subdev_call(sd, video, s_stream, 0);
@@ -1004,8 +1006,10 @@ epower:
 	return ret;
 }
 
-/* This is called on device_unregister, which only means we have to disconnect
- * from the host, but not remove ourselves from the device list */
+/*
+ * This is called on device_unregister, which only means we have to disconnect
+ * from the host, but not remove ourselves from the device list
+ */
 static int soc_camera_remove(struct device *dev)
 {
 	struct soc_camera_device *icd = to_soc_camera_dev(dev);
@@ -1196,8 +1200,10 @@ static int soc_camera_device_register(struct soc_camera_device *icd)
 	}
 
 	if (num < 0)
-		/* ok, we have 256 cameras on this host...
-		 * man, stay reasonable... */
+		/*
+		 * ok, we have 256 cameras on this host...
+		 * man, stay reasonable...
+		 */
 		return -ENOMEM;
 
 	icd->devnum = num;
@@ -1328,9 +1334,11 @@ escdevreg:
 	return ret;
 }
 
-/* Only called on rmmod for each platform device, since they are not
+/*
+ * Only called on rmmod for each platform device, since they are not
  * hot-pluggable. Now we know, that all our users - hosts and devices have
- * been unloaded already */
+ * been unloaded already
+ */
 static int __devexit soc_camera_pdrv_remove(struct platform_device *pdev)
 {
 	struct soc_camera_device *icd = platform_get_drvdata(pdev);
diff --git a/drivers/media/video/tw9910.c b/drivers/media/video/tw9910.c
index 7bf90a2..3cb9ba6 100644
--- a/drivers/media/video/tw9910.c
+++ b/drivers/media/video/tw9910.c
@@ -180,9 +180,8 @@
 			  */
 
 /* VBICNTL */
-/* RTSEL : control the real time signal
-*          output from the MPOUT pin
-*/
+
+/* RTSEL : control the real time signal output from the MPOUT pin */
 #define RTSEL_MASK  0x07
 #define RTSEL_VLOSS 0x00 /* 0000 = Video loss */
 #define RTSEL_HLOCK 0x01 /* 0001 = H-lock */
@@ -596,7 +595,8 @@ static int tw9910_g_register(struct v4l2_subdev *sd,
 	if (ret < 0)
 		return ret;
 
-	/* ret      = int
+	/*
+	 * ret      = int
 	 * reg->val = __u64
 	 */
 	reg->val = (__u64)ret;
diff --git a/include/media/ov772x.h b/include/media/ov772x.h
index 30d9629..37bcd09 100644
--- a/include/media/ov772x.h
+++ b/include/media/ov772x.h
@@ -1,4 +1,5 @@
-/* ov772x Camera
+/*
+ * ov772x Camera
  *
  * Copyright (C) 2008 Renesas Solutions Corp.
  * Kuninori Morimoto <morimoto.kuninori@renesas.com>
-- 
1.6.2.4



* [PATCH 4/9] v4l: Add a 10-bit monochrome and missing 8- and 10-bit Bayer fourcc codes
  2009-10-30 14:00 [PATCH/RFC 0/9 v2] Image-bus API and accompanying soc-camera patches Guennadi Liakhovetski
                   ` (2 preceding siblings ...)
  2009-10-30 14:01 ` [PATCH 3/9] soc-camera: fix multi-line comment coding style Guennadi Liakhovetski
@ 2009-10-30 14:01 ` Guennadi Liakhovetski
  2009-11-05 14:45   ` Hans Verkuil
  2009-10-30 14:01 ` [PATCH 5/9] soc-camera: add a private field to struct soc_camera_link Guennadi Liakhovetski
                   ` (5 subsequent siblings)
  9 siblings, 1 reply; 51+ messages in thread
From: Guennadi Liakhovetski @ 2009-10-30 14:01 UTC (permalink / raw)
  To: Linux Media Mailing List
  Cc: Hans Verkuil, Laurent Pinchart, Sakari Ailus, Muralidharan Karicheri

The 16-bit monochrome fourcc code has previously been abused for a 10-bit
format; add a new 10-bit code instead. Also add the missing 8- and 10-bit
Bayer fourcc codes for completeness.

Signed-off-by: Guennadi Liakhovetski <g.liakhovetski@gmx.de>
---
 include/linux/videodev2.h |    7 ++++++-
 1 files changed, 6 insertions(+), 1 deletions(-)

diff --git a/include/linux/videodev2.h b/include/linux/videodev2.h
index b59e78c..9b240d5 100644
--- a/include/linux/videodev2.h
+++ b/include/linux/videodev2.h
@@ -294,6 +294,7 @@ struct v4l2_pix_format {
 
 /* Grey formats */
 #define V4L2_PIX_FMT_GREY    v4l2_fourcc('G', 'R', 'E', 'Y') /*  8  Greyscale     */
+#define V4L2_PIX_FMT_Y10     v4l2_fourcc('Y', '1', '0', ' ') /* 10  Greyscale     */
 #define V4L2_PIX_FMT_Y16     v4l2_fourcc('Y', '1', '6', ' ') /* 16  Greyscale     */
 
 /* Palette formats */
@@ -329,7 +330,11 @@ struct v4l2_pix_format {
 #define V4L2_PIX_FMT_SBGGR8  v4l2_fourcc('B', 'A', '8', '1') /*  8  BGBG.. GRGR.. */
 #define V4L2_PIX_FMT_SGBRG8  v4l2_fourcc('G', 'B', 'R', 'G') /*  8  GBGB.. RGRG.. */
 #define V4L2_PIX_FMT_SGRBG8  v4l2_fourcc('G', 'R', 'B', 'G') /*  8  GRGR.. BGBG.. */
-#define V4L2_PIX_FMT_SGRBG10 v4l2_fourcc('B', 'A', '1', '0') /* 10bit raw bayer */
+#define V4L2_PIX_FMT_SRGGB8  v4l2_fourcc('R', 'G', 'G', 'B') /*  8  RGRG.. GBGB.. */
+#define V4L2_PIX_FMT_SBGGR10 v4l2_fourcc('B', 'G', '1', '0') /* 10  BGBG.. GRGR.. */
+#define V4L2_PIX_FMT_SGBRG10 v4l2_fourcc('G', 'B', '1', '0') /* 10  GBGB.. RGRG.. */
+#define V4L2_PIX_FMT_SGRBG10 v4l2_fourcc('B', 'A', '1', '0') /* 10  GRGR.. BGBG.. */
+#define V4L2_PIX_FMT_SRGGB10 v4l2_fourcc('R', 'G', '1', '0') /* 10  RGRG.. GBGB.. */
 	/* 10bit raw bayer DPCM compressed to 8 bits */
 #define V4L2_PIX_FMT_SGRBG10DPCM8 v4l2_fourcc('B', 'D', '1', '0')
 	/*
-- 
1.6.2.4



* [PATCH 5/9] soc-camera: add a private field to struct soc_camera_link
  2009-10-30 14:00 [PATCH/RFC 0/9 v2] Image-bus API and accompanying soc-camera patches Guennadi Liakhovetski
                   ` (3 preceding siblings ...)
  2009-10-30 14:01 ` [PATCH 4/9] v4l: Add a 10-bit monochrome and missing 8- and 10-bit Bayer fourcc codes Guennadi Liakhovetski
@ 2009-10-30 14:01 ` Guennadi Liakhovetski
  2009-10-30 14:01 ` [PATCH 6/9] soc-camera: switch drivers and platforms to use .priv in " Guennadi Liakhovetski
                   ` (4 subsequent siblings)
  9 siblings, 0 replies; 51+ messages in thread
From: Guennadi Liakhovetski @ 2009-10-30 14:01 UTC (permalink / raw)
  To: Linux Media Mailing List
  Cc: Hans Verkuil, Laurent Pinchart, Sakari Ailus, Muralidharan Karicheri

Up to now, if a client driver needed platform data beyond what is contained
in struct soc_camera_link, it had to embed that struct into its own object.
This made using such a driver in configurations other than soc-camera
difficult. This patch adds a private field to struct soc_camera_link, so
that arbitrary driver-specific data can be attached without embedding.

Signed-off-by: Guennadi Liakhovetski <g.liakhovetski@gmx.de>
---
 include/media/soc_camera.h |    2 ++
 1 files changed, 2 insertions(+), 0 deletions(-)

diff --git a/include/media/soc_camera.h b/include/media/soc_camera.h
index 218639f..831efff 100644
--- a/include/media/soc_camera.h
+++ b/include/media/soc_camera.h
@@ -104,6 +104,8 @@ struct soc_camera_link {
 	int i2c_adapter_id;
 	struct i2c_board_info *board_info;
 	const char *module_name;
+	void *priv;
+
 	/*
 	 * For non-I2C devices platform platform has to provide methods to
 	 * add a device to the system and to remove
-- 
1.6.2.4



* [PATCH 6/9] soc-camera: switch drivers and platforms to use .priv in struct soc_camera_link
  2009-10-30 14:00 [PATCH/RFC 0/9 v2] Image-bus API and accompanying soc-camera patches Guennadi Liakhovetski
                   ` (4 preceding siblings ...)
  2009-10-30 14:01 ` [PATCH 5/9] soc-camera: add a private field to struct soc_camera_link Guennadi Liakhovetski
@ 2009-10-30 14:01 ` Guennadi Liakhovetski
  2009-10-30 14:01 ` [PATCH/RFC 7/9 v2] v4l: add an image-bus API for configuring v4l2 subdev pixel and frame formats Guennadi Liakhovetski
                   ` (3 subsequent siblings)
  9 siblings, 0 replies; 51+ messages in thread
From: Guennadi Liakhovetski @ 2009-10-30 14:01 UTC (permalink / raw)
  To: Linux Media Mailing List
  Cc: Hans Verkuil, Laurent Pinchart, Sakari Ailus, Muralidharan Karicheri

After this change, drivers can be further extended not to fail when they
get no platform data, but to fall back to defaults instead.

Signed-off-by: Guennadi Liakhovetski <g.liakhovetski@gmx.de>
---
 arch/sh/boards/board-ap325rxa.c     |   38 +++++++++++++++++++---------------
 arch/sh/boards/mach-migor/setup.c   |   32 ++++++++++++++++------------
 drivers/media/video/ov772x.c        |    4 +-
 drivers/media/video/tw9910.c        |    6 ++--
 include/media/ov772x.h              |    1 -
 include/media/soc_camera_platform.h |    1 -
 include/media/tw9910.h              |    1 -
 7 files changed, 44 insertions(+), 39 deletions(-)

diff --git a/arch/sh/boards/board-ap325rxa.c b/arch/sh/boards/board-ap325rxa.c
index 2d08073..a3afe43 100644
--- a/arch/sh/boards/board-ap325rxa.c
+++ b/arch/sh/boards/board-ap325rxa.c
@@ -325,12 +325,14 @@ static struct soc_camera_platform_info camera_info = {
 	.bus_param = SOCAM_PCLK_SAMPLE_RISING | SOCAM_HSYNC_ACTIVE_HIGH |
 	SOCAM_VSYNC_ACTIVE_HIGH | SOCAM_MASTER | SOCAM_DATAWIDTH_8,
 	.set_capture = camera_set_capture,
-	.link = {
-		.bus_id		= 0,
-		.add_device	= ap325rxa_camera_add,
-		.del_device	= ap325rxa_camera_del,
-		.module_name	= "soc_camera_platform",
-	},
+};
+
+struct soc_camera_link camera_link = {
+	.bus_id		= 0,
+	.add_device	= ap325rxa_camera_add,
+	.del_device	= ap325rxa_camera_del,
+	.module_name	= "soc_camera_platform",
+	.priv		= &camera_info,
 };
 
 static void dummy_release(struct device *dev)
@@ -348,7 +350,7 @@ static struct platform_device camera_device = {
 static int ap325rxa_camera_add(struct soc_camera_link *icl,
 			       struct device *dev)
 {
-	if (icl != &camera_info.link || camera_probe() <= 0)
+	if (icl != &camera_link || camera_probe() <= 0)
 		return -ENODEV;
 
 	camera_info.dev = dev;
@@ -358,7 +360,7 @@ static int ap325rxa_camera_add(struct soc_camera_link *icl,
 
 static void ap325rxa_camera_del(struct soc_camera_link *icl)
 {
-	if (icl != &camera_info.link)
+	if (icl != &camera_link)
 		return;
 
 	platform_device_unregister(&camera_device);
@@ -439,13 +441,15 @@ static struct ov772x_camera_info ov7725_info = {
 	.buswidth	= SOCAM_DATAWIDTH_8,
 	.flags		= OV772X_FLAG_VFLIP | OV772X_FLAG_HFLIP,
 	.edgectrl	= OV772X_AUTO_EDGECTRL(0xf, 0),
-	.link = {
-		.bus_id		= 0,
-		.power		= ov7725_power,
-		.board_info	= &ap325rxa_i2c_camera[0],
-		.i2c_adapter_id	= 0,
-		.module_name	= "ov772x",
-	},
+};
+
+static struct soc_camera_link ov7725_link = {
+	.bus_id		= 0,
+	.power		= ov7725_power,
+	.board_info	= &ap325rxa_i2c_camera[0],
+	.i2c_adapter_id	= 0,
+	.module_name	= "ov772x",
+	.priv		= &ov7725_info,
 };
 
 static struct platform_device ap325rxa_camera[] = {
@@ -453,13 +457,13 @@ static struct platform_device ap325rxa_camera[] = {
 		.name	= "soc-camera-pdrv",
 		.id	= 0,
 		.dev	= {
-			.platform_data = &ov7725_info.link,
+			.platform_data = &ov7725_link,
 		},
 	}, {
 		.name	= "soc-camera-pdrv",
 		.id	= 1,
 		.dev	= {
-			.platform_data = &camera_info.link,
+			.platform_data = &camera_link,
 		},
 	},
 };
diff --git a/arch/sh/boards/mach-migor/setup.c b/arch/sh/boards/mach-migor/setup.c
index 6ed1fd3..6145120 100644
--- a/arch/sh/boards/mach-migor/setup.c
+++ b/arch/sh/boards/mach-migor/setup.c
@@ -425,23 +425,27 @@ static struct i2c_board_info migor_i2c_camera[] = {
 
 static struct ov772x_camera_info ov7725_info = {
 	.buswidth	= SOCAM_DATAWIDTH_8,
-	.link = {
-		.power		= ov7725_power,
-		.board_info	= &migor_i2c_camera[0],
-		.i2c_adapter_id	= 0,
-		.module_name	= "ov772x",
-	},
+};
+
+static struct soc_camera_link ov7725_link = {
+	.power		= ov7725_power,
+	.board_info	= &migor_i2c_camera[0],
+	.i2c_adapter_id	= 0,
+	.module_name	= "ov772x",
+	.priv		= &ov7725_info,
 };
 
 static struct tw9910_video_info tw9910_info = {
 	.buswidth	= SOCAM_DATAWIDTH_8,
 	.mpout		= TW9910_MPO_FIELD,
-	.link = {
-		.power		= tw9910_power,
-		.board_info	= &migor_i2c_camera[1],
-		.i2c_adapter_id	= 0,
-		.module_name	= "tw9910",
-	}
+};
+
+static struct soc_camera_link tw9910_link = {
+	.power		= tw9910_power,
+	.board_info	= &migor_i2c_camera[1],
+	.i2c_adapter_id	= 0,
+	.module_name	= "tw9910",
+	.priv		= &tw9910_info,
 };
 
 static struct platform_device migor_camera[] = {
@@ -449,13 +453,13 @@ static struct platform_device migor_camera[] = {
 		.name	= "soc-camera-pdrv",
 		.id	= 0,
 		.dev	= {
-			.platform_data = &ov7725_info.link,
+			.platform_data = &ov7725_link,
 		},
 	}, {
 		.name	= "soc-camera-pdrv",
 		.id	= 1,
 		.dev	= {
-			.platform_data = &tw9910_info.link,
+			.platform_data = &tw9910_link,
 		},
 	},
 };
diff --git a/drivers/media/video/ov772x.c b/drivers/media/video/ov772x.c
index eccb40a..dbaf508 100644
--- a/drivers/media/video/ov772x.c
+++ b/drivers/media/video/ov772x.c
@@ -1143,10 +1143,10 @@ static int ov772x_probe(struct i2c_client *client,
 	}
 
 	icl = to_soc_camera_link(icd);
-	if (!icl)
+	if (!icl || !icl->priv)
 		return -EINVAL;
 
-	info = container_of(icl, struct ov772x_camera_info, link);
+	info = icl->priv;
 
 	if (!i2c_check_functionality(adapter, I2C_FUNC_SMBUS_BYTE_DATA)) {
 		dev_err(&adapter->dev,
diff --git a/drivers/media/video/tw9910.c b/drivers/media/video/tw9910.c
index 3cb9ba6..35373d8 100644
--- a/drivers/media/video/tw9910.c
+++ b/drivers/media/video/tw9910.c
@@ -955,10 +955,10 @@ static int tw9910_probe(struct i2c_client *client,
 	}
 
 	icl = to_soc_camera_link(icd);
-	if (!icl)
+	if (!icl || !icl->priv)
 		return -EINVAL;
 
-	info = container_of(icl, struct tw9910_video_info, link);
+	info = icl->priv;
 
 	if (!i2c_check_functionality(adapter, I2C_FUNC_SMBUS_BYTE_DATA)) {
 		dev_err(&client->dev,
@@ -976,7 +976,7 @@ static int tw9910_probe(struct i2c_client *client,
 	v4l2_i2c_subdev_init(&priv->subdev, client, &tw9910_subdev_ops);
 
 	icd->ops     = &tw9910_ops;
-	icd->iface   = info->link.bus_id;
+	icd->iface   = icl->bus_id;
 
 	ret = tw9910_video_probe(icd, client);
 	if (ret) {
diff --git a/include/media/ov772x.h b/include/media/ov772x.h
index 37bcd09..14c77ef 100644
--- a/include/media/ov772x.h
+++ b/include/media/ov772x.h
@@ -55,7 +55,6 @@ struct ov772x_edge_ctrl {
 struct ov772x_camera_info {
 	unsigned long          buswidth;
 	unsigned long          flags;
-	struct soc_camera_link link;
 	struct ov772x_edge_ctrl edgectrl;
 };
 
diff --git a/include/media/soc_camera_platform.h b/include/media/soc_camera_platform.h
index bb70401..88b3b57 100644
--- a/include/media/soc_camera_platform.h
+++ b/include/media/soc_camera_platform.h
@@ -23,7 +23,6 @@ struct soc_camera_platform_info {
 	unsigned long bus_param;
 	struct device *dev;
 	int (*set_capture)(struct soc_camera_platform_info *info, int enable);
-	struct soc_camera_link link;
 };
 
 #endif /* __SOC_CAMERA_H__ */
diff --git a/include/media/tw9910.h b/include/media/tw9910.h
index 73231e7..5e2895a 100644
--- a/include/media/tw9910.h
+++ b/include/media/tw9910.h
@@ -32,7 +32,6 @@ enum tw9910_mpout_pin {
 struct tw9910_video_info {
 	unsigned long          buswidth;
 	enum tw9910_mpout_pin  mpout;
-	struct soc_camera_link link;
 };
 
 
-- 
1.6.2.4



* [PATCH/RFC 7/9 v2] v4l: add an image-bus API for configuring v4l2 subdev pixel and frame formats
  2009-10-30 14:00 [PATCH/RFC 0/9 v2] Image-bus API and accompanying soc-camera patches Guennadi Liakhovetski
                   ` (5 preceding siblings ...)
  2009-10-30 14:01 ` [PATCH 6/9] soc-camera: switch drivers and platforms to use .priv in " Guennadi Liakhovetski
@ 2009-10-30 14:01 ` Guennadi Liakhovetski
  2009-11-05 15:41   ` Hans Verkuil
                     ` (2 more replies)
  2009-10-30 14:01 ` [PATCH/RFC 8/9 v2] soc-camera: convert to the new imagebus API Guennadi Liakhovetski
                   ` (2 subsequent siblings)
  9 siblings, 3 replies; 51+ messages in thread
From: Guennadi Liakhovetski @ 2009-10-30 14:01 UTC (permalink / raw)
  To: Linux Media Mailing List
  Cc: Hans Verkuil, Laurent Pinchart, Sakari Ailus, Muralidharan Karicheri

Video subdevices, like cameras and decoders, connect to video bridges over
specialised busses. Data is transferred over these busses in various
formats, which only loosely correspond to the fourcc codes describing how
video data is stored in RAM. Since this is not a one-to-one correspondence,
fourcc codes cannot be used to configure subdevice output data formats. This
patch adds codes for several such on-the-bus formats and an API, similar to
the familiar .s_fmt(), .g_fmt(), .try_fmt(), .enum_fmt() API, for
configuring those codes. After all users of the old API in struct
v4l2_subdev_video_ops have been converted, the old API will be removed.

Signed-off-by: Guennadi Liakhovetski <g.liakhovetski@gmx.de>
---
 drivers/media/video/Makefile        |    2 +-
 drivers/media/video/v4l2-imagebus.c |  218 +++++++++++++++++++++++++++++++++++
 include/media/v4l2-imagebus.h       |   84 ++++++++++++++
 include/media/v4l2-subdev.h         |   10 ++-
 4 files changed, 312 insertions(+), 2 deletions(-)
 create mode 100644 drivers/media/video/v4l2-imagebus.c
 create mode 100644 include/media/v4l2-imagebus.h

diff --git a/drivers/media/video/Makefile b/drivers/media/video/Makefile
index 7a2dcc3..62d8907 100644
--- a/drivers/media/video/Makefile
+++ b/drivers/media/video/Makefile
@@ -10,7 +10,7 @@ stkwebcam-objs	:=	stk-webcam.o stk-sensor.o
 
 omap2cam-objs	:=	omap24xxcam.o omap24xxcam-dma.o
 
-videodev-objs	:=	v4l2-dev.o v4l2-ioctl.o v4l2-device.o
+videodev-objs	:=	v4l2-dev.o v4l2-ioctl.o v4l2-device.o v4l2-imagebus.o
 
 # V4L2 core modules
 
diff --git a/drivers/media/video/v4l2-imagebus.c b/drivers/media/video/v4l2-imagebus.c
new file mode 100644
index 0000000..e0a3a83
--- /dev/null
+++ b/drivers/media/video/v4l2-imagebus.c
@@ -0,0 +1,218 @@
+/*
+ * Image Bus API
+ *
+ * Copyright (C) 2009, Guennadi Liakhovetski <g.liakhovetski@gmx.de>
+ *
+ * This program is free software; you can redistribute it and/or modify
+ * it under the terms of the GNU General Public License version 2 as
+ * published by the Free Software Foundation.
+ */
+
+#include <linux/kernel.h>
+#include <linux/module.h>
+
+#include <media/v4l2-device.h>
+#include <media/v4l2-imagebus.h>
+
+static const struct v4l2_imgbus_pixelfmt imgbus_fmt[] = {
+	[V4L2_IMGBUS_FMT_YUYV] = {
+		.fourcc			= V4L2_PIX_FMT_YUYV,
+		.colorspace		= V4L2_COLORSPACE_JPEG,
+		.name			= "YUYV",
+		.bits_per_sample	= 8,
+		.packing		= V4L2_IMGBUS_PACKING_2X8_PADHI,
+		.order			= V4L2_IMGBUS_ORDER_LE,
+	}, [V4L2_IMGBUS_FMT_YVYU] = {
+		.fourcc			= V4L2_PIX_FMT_YVYU,
+		.colorspace		= V4L2_COLORSPACE_JPEG,
+		.name			= "YVYU",
+		.bits_per_sample	= 8,
+		.packing		= V4L2_IMGBUS_PACKING_2X8_PADHI,
+		.order			= V4L2_IMGBUS_ORDER_LE,
+	}, [V4L2_IMGBUS_FMT_UYVY] = {
+		.fourcc			= V4L2_PIX_FMT_UYVY,
+		.colorspace		= V4L2_COLORSPACE_JPEG,
+		.name			= "UYVY",
+		.bits_per_sample	= 8,
+		.packing		= V4L2_IMGBUS_PACKING_2X8_PADHI,
+		.order			= V4L2_IMGBUS_ORDER_LE,
+	}, [V4L2_IMGBUS_FMT_VYUY] = {
+		.fourcc			= V4L2_PIX_FMT_VYUY,
+		.colorspace		= V4L2_COLORSPACE_JPEG,
+		.name			= "VYUY",
+		.bits_per_sample	= 8,
+		.packing		= V4L2_IMGBUS_PACKING_2X8_PADHI,
+		.order			= V4L2_IMGBUS_ORDER_LE,
+	}, [V4L2_IMGBUS_FMT_VYUY_SMPTE170M_8] = {
+		.fourcc			= V4L2_PIX_FMT_VYUY,
+		.colorspace		= V4L2_COLORSPACE_SMPTE170M,
+		.name			= "VYUY in SMPTE170M",
+		.bits_per_sample	= 8,
+		.packing		= V4L2_IMGBUS_PACKING_2X8_PADHI,
+		.order			= V4L2_IMGBUS_ORDER_LE,
+	}, [V4L2_IMGBUS_FMT_VYUY_SMPTE170M_16] = {
+		.fourcc			= V4L2_PIX_FMT_VYUY,
+		.colorspace		= V4L2_COLORSPACE_SMPTE170M,
+		.name			= "VYUY in SMPTE170M, 16bit",
+		.bits_per_sample	= 16,
+		.packing		= V4L2_IMGBUS_PACKING_NONE,
+		.order			= V4L2_IMGBUS_ORDER_LE,
+	}, [V4L2_IMGBUS_FMT_RGB555] = {
+		.fourcc			= V4L2_PIX_FMT_RGB555,
+		.colorspace		= V4L2_COLORSPACE_SRGB,
+		.name			= "RGB555",
+		.bits_per_sample	= 8,
+		.packing		= V4L2_IMGBUS_PACKING_2X8_PADHI,
+		.order			= V4L2_IMGBUS_ORDER_LE,
+	}, [V4L2_IMGBUS_FMT_RGB555X] = {
+		.fourcc			= V4L2_PIX_FMT_RGB555X,
+		.colorspace		= V4L2_COLORSPACE_SRGB,
+		.name			= "RGB555X",
+		.bits_per_sample	= 8,
+		.packing		= V4L2_IMGBUS_PACKING_2X8_PADHI,
+		.order			= V4L2_IMGBUS_ORDER_LE,
+	}, [V4L2_IMGBUS_FMT_RGB565] = {
+		.fourcc			= V4L2_PIX_FMT_RGB565,
+		.colorspace		= V4L2_COLORSPACE_SRGB,
+		.name			= "RGB565",
+		.bits_per_sample	= 8,
+		.packing		= V4L2_IMGBUS_PACKING_2X8_PADHI,
+		.order			= V4L2_IMGBUS_ORDER_LE,
+	}, [V4L2_IMGBUS_FMT_RGB565X] = {
+		.fourcc			= V4L2_PIX_FMT_RGB565X,
+		.colorspace		= V4L2_COLORSPACE_SRGB,
+		.name			= "RGB565X",
+		.bits_per_sample	= 8,
+		.packing		= V4L2_IMGBUS_PACKING_2X8_PADHI,
+		.order			= V4L2_IMGBUS_ORDER_LE,
+	}, [V4L2_IMGBUS_FMT_SBGGR8] = {
+		.fourcc			= V4L2_PIX_FMT_SBGGR8,
+		.colorspace		= V4L2_COLORSPACE_SRGB,
+		.name			= "Bayer 8 BGGR",
+		.bits_per_sample	= 8,
+		.packing		= V4L2_IMGBUS_PACKING_NONE,
+		.order			= V4L2_IMGBUS_ORDER_LE,
+	}, [V4L2_IMGBUS_FMT_SGBRG8] = {
+		.fourcc			= V4L2_PIX_FMT_SGBRG8,
+		.colorspace		= V4L2_COLORSPACE_SRGB,
+		.name			= "Bayer 8 GBRG",
+		.bits_per_sample	= 8,
+		.packing		= V4L2_IMGBUS_PACKING_NONE,
+		.order			= V4L2_IMGBUS_ORDER_LE,
+	}, [V4L2_IMGBUS_FMT_SGRBG8] = {
+		.fourcc			= V4L2_PIX_FMT_SGRBG8,
+		.colorspace		= V4L2_COLORSPACE_SRGB,
+		.name			= "Bayer 8 GRBG",
+		.bits_per_sample	= 8,
+		.packing		= V4L2_IMGBUS_PACKING_NONE,
+		.order			= V4L2_IMGBUS_ORDER_LE,
+	}, [V4L2_IMGBUS_FMT_SRGGB8] = {
+		.fourcc			= V4L2_PIX_FMT_SRGGB8,
+		.colorspace		= V4L2_COLORSPACE_SRGB,
+		.name			= "Bayer 8 RGGB",
+		.bits_per_sample	= 8,
+		.packing		= V4L2_IMGBUS_PACKING_NONE,
+		.order			= V4L2_IMGBUS_ORDER_LE,
+	}, [V4L2_IMGBUS_FMT_SBGGR10] = {
+		.fourcc			= V4L2_PIX_FMT_SBGGR10,
+		.colorspace		= V4L2_COLORSPACE_SRGB,
+		.name			= "Bayer 10 BGGR",
+		.bits_per_sample	= 10,
+		.packing		= V4L2_IMGBUS_PACKING_EXTEND16,
+		.order			= V4L2_IMGBUS_ORDER_LE,
+	}, [V4L2_IMGBUS_FMT_SGBRG10] = {
+		.fourcc			= V4L2_PIX_FMT_SGBRG10,
+		.colorspace		= V4L2_COLORSPACE_SRGB,
+		.name			= "Bayer 10 GBRG",
+		.bits_per_sample	= 10,
+		.packing		= V4L2_IMGBUS_PACKING_EXTEND16,
+		.order			= V4L2_IMGBUS_ORDER_LE,
+	}, [V4L2_IMGBUS_FMT_SGRBG10] = {
+		.fourcc			= V4L2_PIX_FMT_SGRBG10,
+		.colorspace		= V4L2_COLORSPACE_SRGB,
+		.name			= "Bayer 10 GRBG",
+		.bits_per_sample	= 10,
+		.packing		= V4L2_IMGBUS_PACKING_EXTEND16,
+		.order			= V4L2_IMGBUS_ORDER_LE,
+	}, [V4L2_IMGBUS_FMT_SRGGB10] = {
+		.fourcc			= V4L2_PIX_FMT_SRGGB10,
+		.colorspace		= V4L2_COLORSPACE_SRGB,
+		.name			= "Bayer 10 RGGB",
+		.bits_per_sample	= 10,
+		.packing		= V4L2_IMGBUS_PACKING_EXTEND16,
+		.order			= V4L2_IMGBUS_ORDER_LE,
+	}, [V4L2_IMGBUS_FMT_GREY] = {
+		.fourcc			= V4L2_PIX_FMT_GREY,
+		.colorspace		= V4L2_COLORSPACE_JPEG,
+		.name			= "Grey",
+		.bits_per_sample	= 8,
+		.packing		= V4L2_IMGBUS_PACKING_NONE,
+		.order			= V4L2_IMGBUS_ORDER_LE,
+	}, [V4L2_IMGBUS_FMT_Y16] = {
+		.fourcc			= V4L2_PIX_FMT_Y16,
+		.colorspace		= V4L2_COLORSPACE_JPEG,
+		.name			= "Grey 16bit",
+		.bits_per_sample	= 16,
+		.packing		= V4L2_IMGBUS_PACKING_NONE,
+		.order			= V4L2_IMGBUS_ORDER_LE,
+	}, [V4L2_IMGBUS_FMT_Y10] = {
+		.fourcc			= V4L2_PIX_FMT_Y10,
+		.colorspace		= V4L2_COLORSPACE_JPEG,
+		.name			= "Grey 10bit",
+		.bits_per_sample	= 10,
+		.packing		= V4L2_IMGBUS_PACKING_EXTEND16,
+		.order			= V4L2_IMGBUS_ORDER_LE,
+	}, [V4L2_IMGBUS_FMT_SBGGR10_2X8_PADHI_LE] = {
+		.fourcc			= V4L2_PIX_FMT_SBGGR10,
+		.colorspace		= V4L2_COLORSPACE_SRGB,
+		.name			= "Bayer 10 BGGR",
+		.bits_per_sample	= 8,
+		.packing		= V4L2_IMGBUS_PACKING_2X8_PADHI,
+		.order			= V4L2_IMGBUS_ORDER_LE,
+	}, [V4L2_IMGBUS_FMT_SBGGR10_2X8_PADLO_LE] = {
+		.fourcc			= V4L2_PIX_FMT_SBGGR10,
+		.colorspace		= V4L2_COLORSPACE_SRGB,
+		.name			= "Bayer 10 BGGR",
+		.bits_per_sample	= 8,
+		.packing		= V4L2_IMGBUS_PACKING_2X8_PADLO,
+		.order			= V4L2_IMGBUS_ORDER_LE,
+	}, [V4L2_IMGBUS_FMT_SBGGR10_2X8_PADHI_BE] = {
+		.fourcc			= V4L2_PIX_FMT_SBGGR10,
+		.colorspace		= V4L2_COLORSPACE_SRGB,
+		.name			= "Bayer 10 BGGR",
+		.bits_per_sample	= 8,
+		.packing		= V4L2_IMGBUS_PACKING_2X8_PADHI,
+		.order			= V4L2_IMGBUS_ORDER_BE,
+	}, [V4L2_IMGBUS_FMT_SBGGR10_2X8_PADLO_BE] = {
+		.fourcc			= V4L2_PIX_FMT_SBGGR10,
+		.colorspace		= V4L2_COLORSPACE_SRGB,
+		.name			= "Bayer 10 BGGR",
+		.bits_per_sample	= 8,
+		.packing		= V4L2_IMGBUS_PACKING_2X8_PADLO,
+		.order			= V4L2_IMGBUS_ORDER_BE,
+	},
+};
+
+const struct v4l2_imgbus_pixelfmt *v4l2_imgbus_get_fmtdesc(
+	enum v4l2_imgbus_pixelcode code)
+{
+	if ((unsigned int)code >= ARRAY_SIZE(imgbus_fmt))
+		return NULL;
+	return imgbus_fmt + code;
+}
+EXPORT_SYMBOL(v4l2_imgbus_get_fmtdesc);
+
+s32 v4l2_imgbus_bytes_per_line(u32 width,
+			       const struct v4l2_imgbus_pixelfmt *imgf)
+{
+	switch (imgf->packing) {
+	case V4L2_IMGBUS_PACKING_NONE:
+		return width * imgf->bits_per_sample / 8;
+	case V4L2_IMGBUS_PACKING_2X8_PADHI:
+	case V4L2_IMGBUS_PACKING_2X8_PADLO:
+	case V4L2_IMGBUS_PACKING_EXTEND16:
+		return width * 2;
+	}
+	return -EINVAL;
+}
+EXPORT_SYMBOL(v4l2_imgbus_bytes_per_line);
diff --git a/include/media/v4l2-imagebus.h b/include/media/v4l2-imagebus.h
new file mode 100644
index 0000000..022d044
--- /dev/null
+++ b/include/media/v4l2-imagebus.h
@@ -0,0 +1,84 @@
+/*
+ * Image Bus API header
+ *
+ * Copyright (C) 2009, Guennadi Liakhovetski <g.liakhovetski@gmx.de>
+ *
+ * This program is free software; you can redistribute it and/or modify
+ * it under the terms of the GNU General Public License version 2 as
+ * published by the Free Software Foundation.
+ */
+
+#ifndef V4L2_IMGBUS_H
+#define V4L2_IMGBUS_H
+
+enum v4l2_imgbus_packing {
+	V4L2_IMGBUS_PACKING_NONE,
+	V4L2_IMGBUS_PACKING_2X8_PADHI,
+	V4L2_IMGBUS_PACKING_2X8_PADLO,
+	V4L2_IMGBUS_PACKING_EXTEND16,
+};
+
+enum v4l2_imgbus_order {
+	V4L2_IMGBUS_ORDER_LE,
+	V4L2_IMGBUS_ORDER_BE,
+};
+
+enum v4l2_imgbus_pixelcode {
+	V4L2_IMGBUS_FMT_YUYV,
+	V4L2_IMGBUS_FMT_YVYU,
+	V4L2_IMGBUS_FMT_UYVY,
+	V4L2_IMGBUS_FMT_VYUY,
+	V4L2_IMGBUS_FMT_VYUY_SMPTE170M_8,
+	V4L2_IMGBUS_FMT_VYUY_SMPTE170M_16,
+	V4L2_IMGBUS_FMT_RGB555,
+	V4L2_IMGBUS_FMT_RGB555X,
+	V4L2_IMGBUS_FMT_RGB565,
+	V4L2_IMGBUS_FMT_RGB565X,
+	V4L2_IMGBUS_FMT_SBGGR8,
+	V4L2_IMGBUS_FMT_SGBRG8,
+	V4L2_IMGBUS_FMT_SGRBG8,
+	V4L2_IMGBUS_FMT_SRGGB8,
+	V4L2_IMGBUS_FMT_SBGGR10,
+	V4L2_IMGBUS_FMT_SGBRG10,
+	V4L2_IMGBUS_FMT_SGRBG10,
+	V4L2_IMGBUS_FMT_SRGGB10,
+	V4L2_IMGBUS_FMT_GREY,
+	V4L2_IMGBUS_FMT_Y16,
+	V4L2_IMGBUS_FMT_Y10,
+	V4L2_IMGBUS_FMT_SBGGR10_2X8_PADHI_BE,
+	V4L2_IMGBUS_FMT_SBGGR10_2X8_PADLO_BE,
+	V4L2_IMGBUS_FMT_SBGGR10_2X8_PADHI_LE,
+	V4L2_IMGBUS_FMT_SBGGR10_2X8_PADLO_LE,
+};
+
+/**
+ * struct v4l2_imgbus_pixelfmt - Data format on the image bus
+ * @fourcc:		Fourcc code of the equivalent memory pixel format
+ * @colorspace:		Colorspace of the data
+ * @name:		Human-readable format name
+ * @bits_per_sample:	How many bits the bridge has to sample
+ * @packing:		Type of sample-packing, that has to be used
+ * @order:		Sample order when storing in memory
+ */
+struct v4l2_imgbus_pixelfmt {
+	u32				fourcc;
+	enum v4l2_colorspace		colorspace;
+	const char			*name;
+	enum v4l2_imgbus_packing	packing;
+	enum v4l2_imgbus_order		order;
+	u8				bits_per_sample;
+};
+
+struct v4l2_imgbus_framefmt {
+	__u32				width;
+	__u32				height;
+	enum v4l2_imgbus_pixelcode	code;
+	enum v4l2_field			field;
+};
+
+const struct v4l2_imgbus_pixelfmt *v4l2_imgbus_get_fmtdesc(
+	enum v4l2_imgbus_pixelcode code);
+s32 v4l2_imgbus_bytes_per_line(u32 width,
+			       const struct v4l2_imgbus_pixelfmt *imgf);
+
+#endif
diff --git a/include/media/v4l2-subdev.h b/include/media/v4l2-subdev.h
index 04193eb..1e86f39 100644
--- a/include/media/v4l2-subdev.h
+++ b/include/media/v4l2-subdev.h
@@ -22,6 +22,7 @@
 #define _V4L2_SUBDEV_H
 
 #include <media/v4l2-common.h>
+#include <media/v4l2-imagebus.h>
 
 struct v4l2_device;
 struct v4l2_subdev;
@@ -196,7 +197,7 @@ struct v4l2_subdev_audio_ops {
    s_std_output: set v4l2_std_id for video OUTPUT devices. This is ignored by
 	video input devices.
 
-  s_crystal_freq: sets the frequency of the crystal used to generate the
+   s_crystal_freq: sets the frequency of the crystal used to generate the
 	clocks in Hz. An extra flags field allows device specific configuration
 	regarding clock frequency dividers, etc. If not used, then set flags
 	to 0. If the frequency is not supported, then -EINVAL is returned.
@@ -206,6 +207,8 @@ struct v4l2_subdev_audio_ops {
 
    s_routing: see s_routing in audio_ops, except this version is for video
 	devices.
+
+   enum_imgbus_fmt: enumerate pixel formats provided by a video data source
  */
 struct v4l2_subdev_video_ops {
 	int (*s_routing)(struct v4l2_subdev *sd, u32 input, u32 output, u32 config);
@@ -227,6 +230,11 @@ struct v4l2_subdev_video_ops {
 	int (*s_crop)(struct v4l2_subdev *sd, struct v4l2_crop *crop);
 	int (*g_parm)(struct v4l2_subdev *sd, struct v4l2_streamparm *param);
 	int (*s_parm)(struct v4l2_subdev *sd, struct v4l2_streamparm *param);
+	int (*enum_imgbus_fmt)(struct v4l2_subdev *sd, int index,
+			       enum v4l2_imgbus_pixelcode *code);
+	int (*g_imgbus_fmt)(struct v4l2_subdev *sd, struct v4l2_imgbus_framefmt *fmt);
+	int (*try_imgbus_fmt)(struct v4l2_subdev *sd, struct v4l2_imgbus_framefmt *fmt);
+	int (*s_imgbus_fmt)(struct v4l2_subdev *sd, struct v4l2_imgbus_framefmt *fmt);
 };
 
 /**
-- 
1.6.2.4


^ permalink raw reply related	[flat|nested] 51+ messages in thread

* [PATCH/RFC 8/9 v2] soc-camera: convert to the new imagebus API
  2009-10-30 14:00 [PATCH/RFC 0/9 v2] Image-bus API and accompanying soc-camera patches Guennadi Liakhovetski
                   ` (6 preceding siblings ...)
  2009-10-30 14:01 ` [PATCH/RFC 7/9 v2] v4l: add an image-bus API for configuring v4l2 subdev pixel and frame formats Guennadi Liakhovetski
@ 2009-10-30 14:01 ` Guennadi Liakhovetski
  2009-10-30 18:31   ` [PATCH/RFC 8a/9 " Guennadi Liakhovetski
  2009-10-30 18:34   ` [PATCH/RFC 8b/9 v3] rj54n1cb0c: Add cropping, auto white balance, restrict sizes, add platform data Guennadi Liakhovetski
  2009-10-30 14:01 ` [PATCH/RFC 9/9] mt9t031: make the use of the soc-camera client API optional Guennadi Liakhovetski
  2009-10-30 14:34 ` [PATCH/RFC 0/9 v2] Image-bus API and accompanying soc-camera patches Karicheri, Muralidharan
  9 siblings, 2 replies; 51+ messages in thread
From: Guennadi Liakhovetski @ 2009-10-30 14:01 UTC (permalink / raw)
  To: Linux Media Mailing List
  Cc: Hans Verkuil, Laurent Pinchart, Sakari Ailus, Muralidharan Karicheri

Signed-off-by: Guennadi Liakhovetski <g.liakhovetski@gmx.de>
---

As you can see in the diffstat, the biggest patch is the one for 
rj54n1cb0c.c, which not only converts it to image-bus, but also adds some 
functionality. It might be better to split that off...

 arch/sh/boards/board-ap325rxa.c            |    4 +-
 arch/sh/boards/mach-kfr2r09/setup.c        |   13 +-
 drivers/media/video/mt9m001.c              |  117 +++---
 drivers/media/video/mt9m111.c              |  139 +++---
 drivers/media/video/mt9t031.c              |   67 ++--
 drivers/media/video/mt9v022.c              |  126 +++---
 drivers/media/video/mx1_camera.c           |   77 +++-
 drivers/media/video/mx3_camera.c           |  270 ++++++-----
 drivers/media/video/ov772x.c               |  177 +++----
 drivers/media/video/ov9640.c               |   62 ++--
 drivers/media/video/pxa_camera.c           |  265 ++++++-----
 drivers/media/video/rj54n1cb0c.c           |  726 +++++++++++++++++++---------
 drivers/media/video/sh_mobile_ceu_camera.c |  335 +++++++------
 drivers/media/video/soc_camera.c           |   72 ++--
 drivers/media/video/soc_camera_platform.c  |   37 +-
 drivers/media/video/tw9910.c               |   91 ++--
 include/media/rj54n1cb0c.h                 |   19 +
 include/media/soc_camera.h                 |   24 +-
 include/media/soc_camera_platform.h        |    2 +-
 19 files changed, 1491 insertions(+), 1132 deletions(-)
 create mode 100644 include/media/rj54n1cb0c.h

diff --git a/arch/sh/boards/board-ap325rxa.c b/arch/sh/boards/board-ap325rxa.c
index a3afe43..30fa8b8 100644
--- a/arch/sh/boards/board-ap325rxa.c
+++ b/arch/sh/boards/board-ap325rxa.c
@@ -317,8 +317,8 @@ static struct soc_camera_platform_info camera_info = {
 	.format_name = "UYVY",
 	.format_depth = 16,
 	.format = {
-		.pixelformat = V4L2_PIX_FMT_UYVY,
-		.colorspace = V4L2_COLORSPACE_SMPTE170M,
+		.code = V4L2_IMGBUS_FMT_UYVY,
+		.field = V4L2_FIELD_NONE,
 		.width = 640,
 		.height = 480,
 	},
diff --git a/arch/sh/boards/mach-kfr2r09/setup.c b/arch/sh/boards/mach-kfr2r09/setup.c
index ce01d6a..18df641 100644
--- a/arch/sh/boards/mach-kfr2r09/setup.c
+++ b/arch/sh/boards/mach-kfr2r09/setup.c
@@ -18,6 +18,7 @@
 #include <linux/input.h>
 #include <linux/i2c.h>
 #include <linux/usb/r8a66597.h>
+#include <media/rj54n1cb0c.h>
 #include <media/soc_camera.h>
 #include <media/sh_mobile_ceu.h>
 #include <video/sh_mobile_lcdc.h>
@@ -254,6 +255,9 @@ static struct i2c_board_info kfr2r09_i2c_camera = {
 
 static struct clk *camera_clk;
 
+/* set VIO_CKO clock to 25MHz */
+#define CEU_MCLK_FREQ 25000000
+
 #define DRVCRB 0xA405018C
 static int camera_power(struct device *dev, int mode)
 {
@@ -266,8 +270,7 @@ static int camera_power(struct device *dev, int mode)
 		if (IS_ERR(camera_clk))
 			return PTR_ERR(camera_clk);
 
-		/* set VIO_CKO clock to 25MHz */
-		rate = clk_round_rate(camera_clk, 25000000);
+		rate = clk_round_rate(camera_clk, CEU_MCLK_FREQ);
 		ret = clk_set_rate(camera_clk, rate);
 		if (ret < 0)
 			goto eclkrate;
@@ -317,11 +320,17 @@ eclkrate:
 	return ret;
 }
 
+static struct rj54n1_pdata rj54n1_priv = {
+	.mclk_freq	= CEU_MCLK_FREQ,
+	.ioctl_high	= false,
+};
+
 static struct soc_camera_link rj54n1_link = {
 	.power		= camera_power,
 	.board_info	= &kfr2r09_i2c_camera,
 	.i2c_adapter_id	= 1,
 	.module_name	= "rj54n1cb0c",
+	.priv		= &rj54n1_priv,
 };
 
 static struct platform_device kfr2r09_camera = {
diff --git a/drivers/media/video/mt9m001.c b/drivers/media/video/mt9m001.c
index cc90660..e1c8578 100644
--- a/drivers/media/video/mt9m001.c
+++ b/drivers/media/video/mt9m001.c
@@ -48,41 +48,27 @@
 #define MT9M001_COLUMN_SKIP		20
 #define MT9M001_ROW_SKIP		12
 
-static const struct soc_camera_data_format mt9m001_colour_formats[] = {
+static const enum v4l2_imgbus_pixelcode mt9m001_colour_codes[] = {
 	/*
 	 * Order important: first natively supported,
 	 * second supported with a GPIO extender
 	 */
-	{
-		.name		= "Bayer (sRGB) 10 bit",
-		.depth		= 10,
-		.fourcc		= V4L2_PIX_FMT_SBGGR16,
-		.colorspace	= V4L2_COLORSPACE_SRGB,
-	}, {
-		.name		= "Bayer (sRGB) 8 bit",
-		.depth		= 8,
-		.fourcc		= V4L2_PIX_FMT_SBGGR8,
-		.colorspace	= V4L2_COLORSPACE_SRGB,
-	}
+	V4L2_IMGBUS_FMT_SBGGR10,
+	V4L2_IMGBUS_FMT_SBGGR8,
 };
 
-static const struct soc_camera_data_format mt9m001_monochrome_formats[] = {
+static const enum v4l2_imgbus_pixelcode mt9m001_monochrome_codes[] = {
 	/* Order important - see above */
-	{
-		.name		= "Monochrome 10 bit",
-		.depth		= 10,
-		.fourcc		= V4L2_PIX_FMT_Y16,
-	}, {
-		.name		= "Monochrome 8 bit",
-		.depth		= 8,
-		.fourcc		= V4L2_PIX_FMT_GREY,
-	},
+	V4L2_IMGBUS_FMT_Y10,
+	V4L2_IMGBUS_FMT_GREY,
 };
 
 struct mt9m001 {
 	struct v4l2_subdev subdev;
 	struct v4l2_rect rect;	/* Sensor window */
-	__u32 fourcc;
+	enum v4l2_imgbus_pixelcode code;
+	const enum v4l2_imgbus_pixelcode *codes;
+	int num_codes;
 	int model;	/* V4L2_IDENT_MT9M001* codes from v4l2-chip-ident.h */
 	unsigned int gain;
 	unsigned int exposure;
@@ -209,8 +195,7 @@ static int mt9m001_s_crop(struct v4l2_subdev *sd, struct v4l2_crop *a)
 	const u16 hblank = 9, vblank = 25;
 	unsigned int total_h;
 
-	if (mt9m001->fourcc == V4L2_PIX_FMT_SBGGR8 ||
-	    mt9m001->fourcc == V4L2_PIX_FMT_SBGGR16)
+	if (mt9m001->codes == mt9m001_colour_codes)
 		/*
 		 * Bayer format - even number of rows for simplicity,
 		 * but let the user play with the top row.
@@ -290,32 +275,31 @@ static int mt9m001_cropcap(struct v4l2_subdev *sd, struct v4l2_cropcap *a)
 	return 0;
 }
 
-static int mt9m001_g_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
+static int mt9m001_g_fmt(struct v4l2_subdev *sd,
+			 struct v4l2_imgbus_framefmt *imgf)
 {
 	struct i2c_client *client = sd->priv;
 	struct mt9m001 *mt9m001 = to_mt9m001(client);
-	struct v4l2_pix_format *pix = &f->fmt.pix;
 
-	pix->width		= mt9m001->rect.width;
-	pix->height		= mt9m001->rect.height;
-	pix->pixelformat	= mt9m001->fourcc;
-	pix->field		= V4L2_FIELD_NONE;
-	pix->colorspace		= V4L2_COLORSPACE_SRGB;
+	imgf->width	= mt9m001->rect.width;
+	imgf->height	= mt9m001->rect.height;
+	imgf->code	= mt9m001->code;
+	imgf->field	= V4L2_FIELD_NONE;
 
 	return 0;
 }
 
-static int mt9m001_s_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
+static int mt9m001_s_fmt(struct v4l2_subdev *sd,
+			 struct v4l2_imgbus_framefmt *imgf)
 {
 	struct i2c_client *client = sd->priv;
 	struct mt9m001 *mt9m001 = to_mt9m001(client);
-	struct v4l2_pix_format *pix = &f->fmt.pix;
 	struct v4l2_crop a = {
 		.c = {
 			.left	= mt9m001->rect.left,
 			.top	= mt9m001->rect.top,
-			.width	= pix->width,
-			.height	= pix->height,
+			.width	= imgf->width,
+			.height	= imgf->height,
 		},
 	};
 	int ret;
@@ -323,28 +307,27 @@ static int mt9m001_s_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
 	/* No support for scaling so far, just crop. TODO: use skipping */
 	ret = mt9m001_s_crop(sd, &a);
 	if (!ret) {
-		pix->width = mt9m001->rect.width;
-		pix->height = mt9m001->rect.height;
-		mt9m001->fourcc = pix->pixelformat;
+		imgf->width	= mt9m001->rect.width;
+		imgf->height	= mt9m001->rect.height;
+		mt9m001->code	= imgf->code;
 	}
 
 	return ret;
 }
 
-static int mt9m001_try_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
+static int mt9m001_try_fmt(struct v4l2_subdev *sd,
+			   struct v4l2_imgbus_framefmt *imgf)
 {
 	struct i2c_client *client = sd->priv;
 	struct mt9m001 *mt9m001 = to_mt9m001(client);
-	struct v4l2_pix_format *pix = &f->fmt.pix;
 
-	v4l_bound_align_image(&pix->width, MT9M001_MIN_WIDTH,
+	v4l_bound_align_image(&imgf->width, MT9M001_MIN_WIDTH,
 		MT9M001_MAX_WIDTH, 1,
-		&pix->height, MT9M001_MIN_HEIGHT + mt9m001->y_skip_top,
+		&imgf->height, MT9M001_MIN_HEIGHT + mt9m001->y_skip_top,
 		MT9M001_MAX_HEIGHT + mt9m001->y_skip_top, 0, 0);
 
-	if (pix->pixelformat == V4L2_PIX_FMT_SBGGR8 ||
-	    pix->pixelformat == V4L2_PIX_FMT_SBGGR16)
-		pix->height = ALIGN(pix->height - 1, 2);
+	if (mt9m001->codes == mt9m001_colour_codes)
+		imgf->height = ALIGN(imgf->height - 1, 2);
 
 	return 0;
 }
@@ -608,11 +591,11 @@ static int mt9m001_video_probe(struct soc_camera_device *icd,
 	case 0x8411:
 	case 0x8421:
 		mt9m001->model = V4L2_IDENT_MT9M001C12ST;
-		icd->formats = mt9m001_colour_formats;
+		mt9m001->codes = mt9m001_colour_codes;
 		break;
 	case 0x8431:
 		mt9m001->model = V4L2_IDENT_MT9M001C12STM;
-		icd->formats = mt9m001_monochrome_formats;
+		mt9m001->codes = mt9m001_monochrome_codes;
 		break;
 	default:
 		dev_err(&client->dev,
@@ -620,7 +603,7 @@ static int mt9m001_video_probe(struct soc_camera_device *icd,
 		return -ENODEV;
 	}
 
-	icd->num_formats = 0;
+	mt9m001->num_codes = 0;
 
 	/*
 	 * This is a 10bit sensor, so by default we only allow 10bit.
@@ -633,14 +616,14 @@ static int mt9m001_video_probe(struct soc_camera_device *icd,
 		flags = SOCAM_DATAWIDTH_10;
 
 	if (flags & SOCAM_DATAWIDTH_10)
-		icd->num_formats++;
+		mt9m001->num_codes++;
 	else
-		icd->formats++;
+		mt9m001->codes++;
 
 	if (flags & SOCAM_DATAWIDTH_8)
-		icd->num_formats++;
+		mt9m001->num_codes++;
 
-	mt9m001->fourcc = icd->formats->fourcc;
+	mt9m001->code = mt9m001->codes[0];
 
 	dev_info(&client->dev, "Detected a MT9M001 chip ID %x (%s)\n", data,
 		 data == 0x8431 ? "C12STM" : "C12ST");
@@ -686,14 +669,28 @@ static struct v4l2_subdev_core_ops mt9m001_subdev_core_ops = {
 #endif
 };
 
+static int mt9m001_enum_fmt(struct v4l2_subdev *sd, int index,
+			    enum v4l2_imgbus_pixelcode *code)
+{
+	struct i2c_client *client = sd->priv;
+	struct mt9m001 *mt9m001 = to_mt9m001(client);
+
+	if ((unsigned int)index >= mt9m001->num_codes)
+		return -EINVAL;
+
+	*code = mt9m001->codes[index];
+	return 0;
+}
+
 static struct v4l2_subdev_video_ops mt9m001_subdev_video_ops = {
-	.s_stream	= mt9m001_s_stream,
-	.s_fmt		= mt9m001_s_fmt,
-	.g_fmt		= mt9m001_g_fmt,
-	.try_fmt	= mt9m001_try_fmt,
-	.s_crop		= mt9m001_s_crop,
-	.g_crop		= mt9m001_g_crop,
-	.cropcap	= mt9m001_cropcap,
+	.s_stream		= mt9m001_s_stream,
+	.s_imgbus_fmt		= mt9m001_s_fmt,
+	.g_imgbus_fmt		= mt9m001_g_fmt,
+	.try_imgbus_fmt		= mt9m001_try_fmt,
+	.s_crop			= mt9m001_s_crop,
+	.g_crop			= mt9m001_g_crop,
+	.cropcap		= mt9m001_cropcap,
+	.enum_imgbus_fmt	= mt9m001_enum_fmt,
 };
 
 static struct v4l2_subdev_sensor_ops mt9m001_subdev_sensor_ops = {
diff --git a/drivers/media/video/mt9m111.c b/drivers/media/video/mt9m111.c
index 30db625..b5147e8 100644
--- a/drivers/media/video/mt9m111.c
+++ b/drivers/media/video/mt9m111.c
@@ -131,15 +131,15 @@
 #define JPG_FMT(_name, _depth, _fourcc) \
 	COL_FMT(_name, _depth, _fourcc, V4L2_COLORSPACE_JPEG)
 
-static const struct soc_camera_data_format mt9m111_colour_formats[] = {
-	JPG_FMT("CbYCrY 16 bit", 16, V4L2_PIX_FMT_UYVY),
-	JPG_FMT("CrYCbY 16 bit", 16, V4L2_PIX_FMT_VYUY),
-	JPG_FMT("YCbYCr 16 bit", 16, V4L2_PIX_FMT_YUYV),
-	JPG_FMT("YCrYCb 16 bit", 16, V4L2_PIX_FMT_YVYU),
-	RGB_FMT("RGB 565", 16, V4L2_PIX_FMT_RGB565),
-	RGB_FMT("RGB 555", 16, V4L2_PIX_FMT_RGB555),
-	RGB_FMT("Bayer (sRGB) 10 bit", 10, V4L2_PIX_FMT_SBGGR16),
-	RGB_FMT("Bayer (sRGB) 8 bit", 8, V4L2_PIX_FMT_SBGGR8),
+static const enum v4l2_imgbus_pixelcode mt9m111_colour_codes[] = {
+	V4L2_IMGBUS_FMT_UYVY,
+	V4L2_IMGBUS_FMT_VYUY,
+	V4L2_IMGBUS_FMT_YUYV,
+	V4L2_IMGBUS_FMT_YVYU,
+	V4L2_IMGBUS_FMT_RGB565,
+	V4L2_IMGBUS_FMT_RGB555,
+	V4L2_IMGBUS_FMT_SBGGR10_2X8_PADHI_LE,
+	V4L2_IMGBUS_FMT_SBGGR8,
 };
 
 enum mt9m111_context {
@@ -152,7 +152,7 @@ struct mt9m111 {
 	int model;	/* V4L2_IDENT_MT9M11x* codes from v4l2-chip-ident.h */
 	enum mt9m111_context context;
 	struct v4l2_rect rect;
-	u32 pixfmt;
+	enum v4l2_imgbus_pixelcode code;
 	unsigned int gain;
 	unsigned char autoexposure;
 	unsigned char datawidth;
@@ -258,8 +258,8 @@ static int mt9m111_setup_rect(struct i2c_client *client,
 	int width = rect->width;
 	int height = rect->height;
 
-	if (mt9m111->pixfmt == V4L2_PIX_FMT_SBGGR8 ||
-	    mt9m111->pixfmt == V4L2_PIX_FMT_SBGGR16)
+	if (mt9m111->code == V4L2_IMGBUS_FMT_SBGGR8 ||
+	    mt9m111->code == V4L2_IMGBUS_FMT_SBGGR10_2X8_PADHI_LE)
 		is_raw_format = 1;
 	else
 		is_raw_format = 0;
@@ -307,7 +307,8 @@ static int mt9m111_setup_pixfmt(struct i2c_client *client, u16 outfmt)
 
 static int mt9m111_setfmt_bayer8(struct i2c_client *client)
 {
-	return mt9m111_setup_pixfmt(client, MT9M111_OUTFMT_PROCESSED_BAYER);
+	return mt9m111_setup_pixfmt(client, MT9M111_OUTFMT_PROCESSED_BAYER |
+				    MT9M111_OUTFMT_RGB);
 }
 
 static int mt9m111_setfmt_bayer10(struct i2c_client *client)
@@ -401,8 +402,8 @@ static int mt9m111_make_rect(struct i2c_client *client,
 {
 	struct mt9m111 *mt9m111 = to_mt9m111(client);
 
-	if (mt9m111->pixfmt == V4L2_PIX_FMT_SBGGR8 ||
-	    mt9m111->pixfmt == V4L2_PIX_FMT_SBGGR16) {
+	if (mt9m111->code == V4L2_IMGBUS_FMT_SBGGR8 ||
+	    mt9m111->code == V4L2_IMGBUS_FMT_SBGGR10_2X8_PADHI_LE) {
 		/* Bayer format - even size lengths */
 		rect->width	= ALIGN(rect->width, 2);
 		rect->height	= ALIGN(rect->height, 2);
@@ -460,120 +461,120 @@ static int mt9m111_cropcap(struct v4l2_subdev *sd, struct v4l2_cropcap *a)
 	return 0;
 }
 
-static int mt9m111_g_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
+static int mt9m111_g_fmt(struct v4l2_subdev *sd,
+			 struct v4l2_imgbus_framefmt *imgf)
 {
 	struct i2c_client *client = sd->priv;
 	struct mt9m111 *mt9m111 = to_mt9m111(client);
-	struct v4l2_pix_format *pix = &f->fmt.pix;
 
-	pix->width		= mt9m111->rect.width;
-	pix->height		= mt9m111->rect.height;
-	pix->pixelformat	= mt9m111->pixfmt;
-	pix->field		= V4L2_FIELD_NONE;
-	pix->colorspace		= V4L2_COLORSPACE_SRGB;
+	imgf->width	= mt9m111->rect.width;
+	imgf->height	= mt9m111->rect.height;
+	imgf->code	= mt9m111->code;
+	imgf->field	= V4L2_FIELD_NONE;
 
 	return 0;
 }
 
-static int mt9m111_set_pixfmt(struct i2c_client *client, u32 pixfmt)
+static int mt9m111_set_pixfmt(struct i2c_client *client,
+			      enum v4l2_imgbus_pixelcode code)
 {
 	struct mt9m111 *mt9m111 = to_mt9m111(client);
 	int ret;
 
-	switch (pixfmt) {
-	case V4L2_PIX_FMT_SBGGR8:
+	switch (code) {
+	case V4L2_IMGBUS_FMT_SBGGR8:
 		ret = mt9m111_setfmt_bayer8(client);
 		break;
-	case V4L2_PIX_FMT_SBGGR16:
+	case V4L2_IMGBUS_FMT_SBGGR10_2X8_PADHI_LE:
 		ret = mt9m111_setfmt_bayer10(client);
 		break;
-	case V4L2_PIX_FMT_RGB555:
+	case V4L2_IMGBUS_FMT_RGB555:
 		ret = mt9m111_setfmt_rgb555(client);
 		break;
-	case V4L2_PIX_FMT_RGB565:
+	case V4L2_IMGBUS_FMT_RGB565:
 		ret = mt9m111_setfmt_rgb565(client);
 		break;
-	case V4L2_PIX_FMT_UYVY:
+	case V4L2_IMGBUS_FMT_UYVY:
 		mt9m111->swap_yuv_y_chromas = 0;
 		mt9m111->swap_yuv_cb_cr = 0;
 		ret = mt9m111_setfmt_yuv(client);
 		break;
-	case V4L2_PIX_FMT_VYUY:
+	case V4L2_IMGBUS_FMT_VYUY:
 		mt9m111->swap_yuv_y_chromas = 0;
 		mt9m111->swap_yuv_cb_cr = 1;
 		ret = mt9m111_setfmt_yuv(client);
 		break;
-	case V4L2_PIX_FMT_YUYV:
+	case V4L2_IMGBUS_FMT_YUYV:
 		mt9m111->swap_yuv_y_chromas = 1;
 		mt9m111->swap_yuv_cb_cr = 0;
 		ret = mt9m111_setfmt_yuv(client);
 		break;
-	case V4L2_PIX_FMT_YVYU:
+	case V4L2_IMGBUS_FMT_YVYU:
 		mt9m111->swap_yuv_y_chromas = 1;
 		mt9m111->swap_yuv_cb_cr = 1;
 		ret = mt9m111_setfmt_yuv(client);
 		break;
 	default:
 		dev_err(&client->dev, "Pixel format not handled : %x\n",
-			pixfmt);
+			code);
 		ret = -EINVAL;
 	}
 
 	if (!ret)
-		mt9m111->pixfmt = pixfmt;
+		mt9m111->code = code;
 
 	return ret;
 }
 
-static int mt9m111_s_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
+static int mt9m111_s_fmt(struct v4l2_subdev *sd,
+			 struct v4l2_imgbus_framefmt *imgf)
 {
 	struct i2c_client *client = sd->priv;
 	struct mt9m111 *mt9m111 = to_mt9m111(client);
-	struct v4l2_pix_format *pix = &f->fmt.pix;
 	struct v4l2_rect rect = {
 		.left	= mt9m111->rect.left,
 		.top	= mt9m111->rect.top,
-		.width	= pix->width,
-		.height	= pix->height,
+		.width	= imgf->width,
+		.height	= imgf->height,
 	};
 	int ret;
 
 	dev_dbg(&client->dev,
-		"%s fmt=%x left=%d, top=%d, width=%d, height=%d\n", __func__,
-		pix->pixelformat, rect.left, rect.top, rect.width, rect.height);
+		"%s code=%x left=%d, top=%d, width=%d, height=%d\n", __func__,
+		imgf->code, rect.left, rect.top, rect.width, rect.height);
 
 	ret = mt9m111_make_rect(client, &rect);
 	if (!ret)
-		ret = mt9m111_set_pixfmt(client, pix->pixelformat);
+		ret = mt9m111_set_pixfmt(client, imgf->code);
 	if (!ret)
 		mt9m111->rect = rect;
 	return ret;
 }
 
-static int mt9m111_try_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
+static int mt9m111_try_fmt(struct v4l2_subdev *sd,
+			   struct v4l2_imgbus_framefmt *imgf)
 {
-	struct v4l2_pix_format *pix = &f->fmt.pix;
-	bool bayer = pix->pixelformat == V4L2_PIX_FMT_SBGGR8 ||
-		pix->pixelformat == V4L2_PIX_FMT_SBGGR16;
+	bool bayer = imgf->code == V4L2_IMGBUS_FMT_SBGGR8 ||
+		imgf->code == V4L2_IMGBUS_FMT_SBGGR10_2X8_PADHI_LE;
 
 	/*
 	 * With Bayer format enforce even side lengths, but let the user play
 	 * with the starting pixel
 	 */
 
-	if (pix->height > MT9M111_MAX_HEIGHT)
-		pix->height = MT9M111_MAX_HEIGHT;
-	else if (pix->height < 2)
-		pix->height = 2;
+	if (imgf->height > MT9M111_MAX_HEIGHT)
+		imgf->height = MT9M111_MAX_HEIGHT;
+	else if (imgf->height < 2)
+		imgf->height = 2;
 	else if (bayer)
-		pix->height = ALIGN(pix->height, 2);
+		imgf->height = ALIGN(imgf->height, 2);
 
-	if (pix->width > MT9M111_MAX_WIDTH)
-		pix->width = MT9M111_MAX_WIDTH;
-	else if (pix->width < 2)
-		pix->width = 2;
+	if (imgf->width > MT9M111_MAX_WIDTH)
+		imgf->width = MT9M111_MAX_WIDTH;
+	else if (imgf->width < 2)
+		imgf->width = 2;
 	else if (bayer)
-		pix->width = ALIGN(pix->width, 2);
+		imgf->width = ALIGN(imgf->width, 2);
 
 	return 0;
 }
@@ -863,7 +864,7 @@ static int mt9m111_restore_state(struct i2c_client *client)
 	struct mt9m111 *mt9m111 = to_mt9m111(client);
 
 	mt9m111_set_context(client, mt9m111->context);
-	mt9m111_set_pixfmt(client, mt9m111->pixfmt);
+	mt9m111_set_pixfmt(client, mt9m111->code);
 	mt9m111_setup_rect(client, &mt9m111->rect);
 	mt9m111_set_flip(client, mt9m111->hflip, MT9M111_RMB_MIRROR_COLS);
 	mt9m111_set_flip(client, mt9m111->vflip, MT9M111_RMB_MIRROR_ROWS);
@@ -952,9 +953,6 @@ static int mt9m111_video_probe(struct soc_camera_device *icd,
 		goto ei2c;
 	}
 
-	icd->formats = mt9m111_colour_formats;
-	icd->num_formats = ARRAY_SIZE(mt9m111_colour_formats);
-
 	dev_info(&client->dev, "Detected a MT9M11x chip ID %x\n", data);
 
 ei2c:
@@ -971,13 +969,24 @@ static struct v4l2_subdev_core_ops mt9m111_subdev_core_ops = {
 #endif
 };
 
+static int mt9m111_enum_fmt(struct v4l2_subdev *sd, int index,
+			    enum v4l2_imgbus_pixelcode *code)
+{
+	if ((unsigned int)index >= ARRAY_SIZE(mt9m111_colour_codes))
+		return -EINVAL;
+
+	*code = mt9m111_colour_codes[index];
+	return 0;
+}
+
 static struct v4l2_subdev_video_ops mt9m111_subdev_video_ops = {
-	.s_fmt		= mt9m111_s_fmt,
-	.g_fmt		= mt9m111_g_fmt,
-	.try_fmt	= mt9m111_try_fmt,
-	.s_crop		= mt9m111_s_crop,
-	.g_crop		= mt9m111_g_crop,
-	.cropcap	= mt9m111_cropcap,
+	.s_imgbus_fmt		= mt9m111_s_fmt,
+	.g_imgbus_fmt		= mt9m111_g_fmt,
+	.try_imgbus_fmt		= mt9m111_try_fmt,
+	.s_crop			= mt9m111_s_crop,
+	.g_crop			= mt9m111_g_crop,
+	.cropcap		= mt9m111_cropcap,
+	.enum_imgbus_fmt	= mt9m111_enum_fmt,
 };
 
 static struct v4l2_subdev_ops mt9m111_subdev_ops = {
diff --git a/drivers/media/video/mt9t031.c b/drivers/media/video/mt9t031.c
index 0d2a8fd..c95c277 100644
--- a/drivers/media/video/mt9t031.c
+++ b/drivers/media/video/mt9t031.c
@@ -60,13 +60,8 @@
 	SOCAM_VSYNC_ACTIVE_HIGH | SOCAM_DATA_ACTIVE_HIGH |	\
 	SOCAM_MASTER | SOCAM_DATAWIDTH_10)
 
-static const struct soc_camera_data_format mt9t031_colour_formats[] = {
-	{
-		.name		= "Bayer (sRGB) 10 bit",
-		.depth		= 10,
-		.fourcc		= V4L2_PIX_FMT_SGRBG10,
-		.colorspace	= V4L2_COLORSPACE_SRGB,
-	}
+static const enum v4l2_imgbus_pixelcode mt9t031_code[] = {
+	V4L2_IMGBUS_FMT_SGRBG10,
 };
 
 struct mt9t031 {
@@ -377,27 +372,26 @@ static int mt9t031_cropcap(struct v4l2_subdev *sd, struct v4l2_cropcap *a)
 	return 0;
 }
 
-static int mt9t031_g_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
+static int mt9t031_g_fmt(struct v4l2_subdev *sd,
+			 struct v4l2_imgbus_framefmt *imgf)
 {
 	struct i2c_client *client = sd->priv;
 	struct mt9t031 *mt9t031 = to_mt9t031(client);
-	struct v4l2_pix_format *pix = &f->fmt.pix;
 
-	pix->width		= mt9t031->rect.width / mt9t031->xskip;
-	pix->height		= mt9t031->rect.height / mt9t031->yskip;
-	pix->pixelformat	= V4L2_PIX_FMT_SGRBG10;
-	pix->field		= V4L2_FIELD_NONE;
-	pix->colorspace		= V4L2_COLORSPACE_SRGB;
+	imgf->width	= mt9t031->rect.width / mt9t031->xskip;
+	imgf->height	= mt9t031->rect.height / mt9t031->yskip;
+	imgf->code	= V4L2_IMGBUS_FMT_SGRBG10;
+	imgf->field	= V4L2_FIELD_NONE;
 
 	return 0;
 }
 
-static int mt9t031_s_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
+static int mt9t031_s_fmt(struct v4l2_subdev *sd,
+			 struct v4l2_imgbus_framefmt *imgf)
 {
 	struct i2c_client *client = sd->priv;
 	struct mt9t031 *mt9t031 = to_mt9t031(client);
 	struct soc_camera_device *icd = client->dev.platform_data;
-	struct v4l2_pix_format *pix = &f->fmt.pix;
 	u16 xskip, yskip;
 	struct v4l2_rect rect = mt9t031->rect;
 
@@ -405,8 +399,8 @@ static int mt9t031_s_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
 	 * try_fmt has put width and height within limits.
 	 * S_FMT: use binning and skipping for scaling
 	 */
-	xskip = mt9t031_skip(&rect.width, pix->width, MT9T031_MAX_WIDTH);
-	yskip = mt9t031_skip(&rect.height, pix->height, MT9T031_MAX_HEIGHT);
+	xskip = mt9t031_skip(&rect.width, imgf->width, MT9T031_MAX_WIDTH);
+	yskip = mt9t031_skip(&rect.height, imgf->height, MT9T031_MAX_HEIGHT);
 
 	/* mt9t031_set_params() doesn't change width and height */
 	return mt9t031_set_params(icd, &rect, xskip, yskip);
@@ -416,13 +410,12 @@ static int mt9t031_s_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
  * If a user window larger than sensor window is requested, we'll increase the
  * sensor window.
  */
-static int mt9t031_try_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
+static int mt9t031_try_fmt(struct v4l2_subdev *sd,
+			   struct v4l2_imgbus_framefmt *imgf)
 {
-	struct v4l2_pix_format *pix = &f->fmt.pix;
-
 	v4l_bound_align_image(
-		&pix->width, MT9T031_MIN_WIDTH, MT9T031_MAX_WIDTH, 1,
-		&pix->height, MT9T031_MIN_HEIGHT, MT9T031_MAX_HEIGHT, 1, 0);
+		&imgf->width, MT9T031_MIN_WIDTH, MT9T031_MAX_WIDTH, 1,
+		&imgf->height, MT9T031_MIN_HEIGHT, MT9T031_MAX_HEIGHT, 1, 0);
 
 	return 0;
 }
@@ -682,7 +675,6 @@ static int mt9t031_s_ctrl(struct v4l2_subdev *sd, struct v4l2_control *ctrl)
  */
 static int mt9t031_video_probe(struct i2c_client *client)
 {
-	struct soc_camera_device *icd = client->dev.platform_data;
 	struct mt9t031 *mt9t031 = to_mt9t031(client);
 	s32 data;
 	int ret;
@@ -697,8 +689,6 @@ static int mt9t031_video_probe(struct i2c_client *client)
 	switch (data) {
 	case 0x1621:
 		mt9t031->model = V4L2_IDENT_MT9T031;
-		icd->formats = mt9t031_colour_formats;
-		icd->num_formats = ARRAY_SIZE(mt9t031_colour_formats);
 		break;
 	default:
 		dev_err(&client->dev,
@@ -729,14 +719,25 @@ static struct v4l2_subdev_core_ops mt9t031_subdev_core_ops = {
 #endif
 };
 
+static int mt9t031_enum_fmt(struct v4l2_subdev *sd, int index,
+			    enum v4l2_imgbus_pixelcode *code)
+{
+	if ((unsigned int)index >= ARRAY_SIZE(mt9t031_code))
+		return -EINVAL;
+
+	*code = mt9t031_code[index];
+	return 0;
+}
+
 static struct v4l2_subdev_video_ops mt9t031_subdev_video_ops = {
-	.s_stream	= mt9t031_s_stream,
-	.s_fmt		= mt9t031_s_fmt,
-	.g_fmt		= mt9t031_g_fmt,
-	.try_fmt	= mt9t031_try_fmt,
-	.s_crop		= mt9t031_s_crop,
-	.g_crop		= mt9t031_g_crop,
-	.cropcap	= mt9t031_cropcap,
+	.s_stream		= mt9t031_s_stream,
+	.s_imgbus_fmt		= mt9t031_s_fmt,
+	.g_imgbus_fmt		= mt9t031_g_fmt,
+	.try_imgbus_fmt		= mt9t031_try_fmt,
+	.s_crop			= mt9t031_s_crop,
+	.g_crop			= mt9t031_g_crop,
+	.cropcap		= mt9t031_cropcap,
+	.enum_imgbus_fmt	= mt9t031_enum_fmt,
 };
 
 static struct v4l2_subdev_ops mt9t031_subdev_ops = {
diff --git a/drivers/media/video/mt9v022.c b/drivers/media/video/mt9v022.c
index f60a9a1..9fc32d0 100644
--- a/drivers/media/video/mt9v022.c
+++ b/drivers/media/video/mt9v022.c
@@ -64,41 +64,27 @@ MODULE_PARM_DESC(sensor_type, "Sensor type: \"colour\" or \"monochrome\"");
 #define MT9V022_COLUMN_SKIP		1
 #define MT9V022_ROW_SKIP		4
 
-static const struct soc_camera_data_format mt9v022_colour_formats[] = {
+static const enum v4l2_imgbus_pixelcode mt9v022_colour_codes[] = {
 	/*
 	 * Order important: first natively supported,
 	 * second supported with a GPIO extender
 	 */
-	{
-		.name		= "Bayer (sRGB) 10 bit",
-		.depth		= 10,
-		.fourcc		= V4L2_PIX_FMT_SBGGR16,
-		.colorspace	= V4L2_COLORSPACE_SRGB,
-	}, {
-		.name		= "Bayer (sRGB) 8 bit",
-		.depth		= 8,
-		.fourcc		= V4L2_PIX_FMT_SBGGR8,
-		.colorspace	= V4L2_COLORSPACE_SRGB,
-	}
+	V4L2_IMGBUS_FMT_SBGGR10,
+	V4L2_IMGBUS_FMT_SBGGR8,
 };
 
-static const struct soc_camera_data_format mt9v022_monochrome_formats[] = {
+static const enum v4l2_imgbus_pixelcode mt9v022_monochrome_codes[] = {
 	/* Order important - see above */
-	{
-		.name		= "Monochrome 10 bit",
-		.depth		= 10,
-		.fourcc		= V4L2_PIX_FMT_Y16,
-	}, {
-		.name		= "Monochrome 8 bit",
-		.depth		= 8,
-		.fourcc		= V4L2_PIX_FMT_GREY,
-	},
+	V4L2_IMGBUS_FMT_Y10,
+	V4L2_IMGBUS_FMT_GREY,
 };
 
 struct mt9v022 {
 	struct v4l2_subdev subdev;
 	struct v4l2_rect rect;	/* Sensor window */
-	__u32 fourcc;
+	enum v4l2_imgbus_pixelcode code;
+	const enum v4l2_imgbus_pixelcode *codes;
+	int num_codes;
 	int model;	/* V4L2_IDENT_MT9V022* codes from v4l2-chip-ident.h */
 	u16 chip_control;
 	unsigned short y_skip_top;	/* Lines to skip at the top */
@@ -275,8 +261,7 @@ static int mt9v022_s_crop(struct v4l2_subdev *sd, struct v4l2_crop *a)
 	int ret;
 
 	/* Bayer format - even size lengths */
-	if (mt9v022->fourcc == V4L2_PIX_FMT_SBGGR8 ||
-	    mt9v022->fourcc == V4L2_PIX_FMT_SBGGR16) {
+	if (mt9v022->codes == mt9v022_colour_codes) {
 		rect.width	= ALIGN(rect.width, 2);
 		rect.height	= ALIGN(rect.height, 2);
 		/* Let the user play with the starting pixel */
@@ -354,32 +339,31 @@ static int mt9v022_cropcap(struct v4l2_subdev *sd, struct v4l2_cropcap *a)
 	return 0;
 }
 
-static int mt9v022_g_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
+static int mt9v022_g_fmt(struct v4l2_subdev *sd,
+			 struct v4l2_imgbus_framefmt *imgf)
 {
 	struct i2c_client *client = sd->priv;
 	struct mt9v022 *mt9v022 = to_mt9v022(client);
-	struct v4l2_pix_format *pix = &f->fmt.pix;
 
-	pix->width		= mt9v022->rect.width;
-	pix->height		= mt9v022->rect.height;
-	pix->pixelformat	= mt9v022->fourcc;
-	pix->field		= V4L2_FIELD_NONE;
-	pix->colorspace		= V4L2_COLORSPACE_SRGB;
+	imgf->width	= mt9v022->rect.width;
+	imgf->height	= mt9v022->rect.height;
+	imgf->code	= mt9v022->code;
+	imgf->field	= V4L2_FIELD_NONE;
 
 	return 0;
 }
 
-static int mt9v022_s_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
+static int mt9v022_s_fmt(struct v4l2_subdev *sd,
+			 struct v4l2_imgbus_framefmt *imgf)
 {
 	struct i2c_client *client = sd->priv;
 	struct mt9v022 *mt9v022 = to_mt9v022(client);
-	struct v4l2_pix_format *pix = &f->fmt.pix;
 	struct v4l2_crop a = {
 		.c = {
 			.left	= mt9v022->rect.left,
 			.top	= mt9v022->rect.top,
-			.width	= pix->width,
-			.height	= pix->height,
+			.width	= imgf->width,
+			.height	= imgf->height,
 		},
 	};
 	int ret;
@@ -388,14 +372,14 @@ static int mt9v022_s_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
 	 * The caller provides a supported format, as verified per call to
 	 * icd->try_fmt(), datawidth is from our supported format list
 	 */
-	switch (pix->pixelformat) {
-	case V4L2_PIX_FMT_GREY:
-	case V4L2_PIX_FMT_Y16:
+	switch (imgf->code) {
+	case V4L2_IMGBUS_FMT_GREY:
+	case V4L2_IMGBUS_FMT_Y10:
 		if (mt9v022->model != V4L2_IDENT_MT9V022IX7ATM)
 			return -EINVAL;
 		break;
-	case V4L2_PIX_FMT_SBGGR8:
-	case V4L2_PIX_FMT_SBGGR16:
+	case V4L2_IMGBUS_FMT_SBGGR8:
+	case V4L2_IMGBUS_FMT_SBGGR10:
 		if (mt9v022->model != V4L2_IDENT_MT9V022IX7ATC)
 			return -EINVAL;
 		break;
@@ -409,25 +393,25 @@ static int mt9v022_s_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
 	/* No support for scaling on this camera, just crop. */
 	ret = mt9v022_s_crop(sd, &a);
 	if (!ret) {
-		pix->width = mt9v022->rect.width;
-		pix->height = mt9v022->rect.height;
-		mt9v022->fourcc = pix->pixelformat;
+		imgf->width	= mt9v022->rect.width;
+		imgf->height	= mt9v022->rect.height;
+		mt9v022->code	= imgf->code;
 	}
 
 	return ret;
 }
 
-static int mt9v022_try_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
+static int mt9v022_try_fmt(struct v4l2_subdev *sd,
+			   struct v4l2_imgbus_framefmt *imgf)
 {
 	struct i2c_client *client = sd->priv;
 	struct mt9v022 *mt9v022 = to_mt9v022(client);
-	struct v4l2_pix_format *pix = &f->fmt.pix;
-	int align = pix->pixelformat == V4L2_PIX_FMT_SBGGR8 ||
-		pix->pixelformat == V4L2_PIX_FMT_SBGGR16;
+	int align = imgf->code == V4L2_IMGBUS_FMT_SBGGR8 ||
+		imgf->code == V4L2_IMGBUS_FMT_SBGGR10;
 
-	v4l_bound_align_image(&pix->width, MT9V022_MIN_WIDTH,
+	v4l_bound_align_image(&imgf->width, MT9V022_MIN_WIDTH,
 		MT9V022_MAX_WIDTH, align,
-		&pix->height, MT9V022_MIN_HEIGHT + mt9v022->y_skip_top,
+		&imgf->height, MT9V022_MIN_HEIGHT + mt9v022->y_skip_top,
 		MT9V022_MAX_HEIGHT + mt9v022->y_skip_top, align, 0);
 
 	return 0;
@@ -749,17 +733,17 @@ static int mt9v022_video_probe(struct soc_camera_device *icd,
 			    !strcmp("color", sensor_type))) {
 		ret = reg_write(client, MT9V022_PIXEL_OPERATION_MODE, 4 | 0x11);
 		mt9v022->model = V4L2_IDENT_MT9V022IX7ATC;
-		icd->formats = mt9v022_colour_formats;
+		mt9v022->codes = mt9v022_colour_codes;
 	} else {
 		ret = reg_write(client, MT9V022_PIXEL_OPERATION_MODE, 0x11);
 		mt9v022->model = V4L2_IDENT_MT9V022IX7ATM;
-		icd->formats = mt9v022_monochrome_formats;
+		mt9v022->codes = mt9v022_monochrome_codes;
 	}
 
 	if (ret < 0)
 		goto ei2c;
 
-	icd->num_formats = 0;
+	mt9v022->num_codes = 0;
 
 	/*
 	 * This is a 10bit sensor, so by default we only allow 10bit.
@@ -772,14 +756,14 @@ static int mt9v022_video_probe(struct soc_camera_device *icd,
 		flags = SOCAM_DATAWIDTH_10;
 
 	if (flags & SOCAM_DATAWIDTH_10)
-		icd->num_formats++;
+		mt9v022->num_codes++;
 	else
-		icd->formats++;
+		mt9v022->codes++;
 
 	if (flags & SOCAM_DATAWIDTH_8)
-		icd->num_formats++;
+		mt9v022->num_codes++;
 
-	mt9v022->fourcc = icd->formats->fourcc;
+	mt9v022->code = mt9v022->codes[0];
 
 	dev_info(&client->dev, "Detected a MT9V022 chip ID %x, %s sensor\n",
 		 data, mt9v022->model == V4L2_IDENT_MT9V022IX7ATM ?
@@ -823,14 +807,28 @@ static struct v4l2_subdev_core_ops mt9v022_subdev_core_ops = {
 #endif
 };
 
+static int mt9v022_enum_fmt(struct v4l2_subdev *sd, int index,
+			    enum v4l2_imgbus_pixelcode *code)
+{
+	struct i2c_client *client = sd->priv;
+	struct mt9v022 *mt9v022 = to_mt9v022(client);
+
+	if ((unsigned int)index >= mt9v022->num_codes)
+		return -EINVAL;
+
+	*code = mt9v022->codes[index];
+	return 0;
+}
+
 static struct v4l2_subdev_video_ops mt9v022_subdev_video_ops = {
-	.s_stream	= mt9v022_s_stream,
-	.s_fmt		= mt9v022_s_fmt,
-	.g_fmt		= mt9v022_g_fmt,
-	.try_fmt	= mt9v022_try_fmt,
-	.s_crop		= mt9v022_s_crop,
-	.g_crop		= mt9v022_g_crop,
-	.cropcap	= mt9v022_cropcap,
+	.s_stream		= mt9v022_s_stream,
+	.s_imgbus_fmt		= mt9v022_s_fmt,
+	.g_imgbus_fmt		= mt9v022_g_fmt,
+	.try_imgbus_fmt		= mt9v022_try_fmt,
+	.s_crop			= mt9v022_s_crop,
+	.g_crop			= mt9v022_g_crop,
+	.cropcap		= mt9v022_cropcap,
+	.enum_imgbus_fmt	= mt9v022_enum_fmt,
 };
 
 static struct v4l2_subdev_sensor_ops mt9v022_subdev_sensor_ops = {
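The mt9v022 conversion above keeps the colour/monochrome code arrays ordered (10-bit entry first) so the probe function can expose a sub-window of the array by advancing the base pointer and counting entries, instead of copying. A minimal sketch of that windowing trick, with illustrative names (the real driver uses `SOCAM_DATAWIDTH_*` flags and `enum v4l2_imgbus_pixelcode`):

```c
/*
 * Sketch of the code-list windowing in mt9v022_video_probe(): the
 * ordered array holds the 10-bit entry first, so the usable window
 * is selected by advancing the base pointer and counting entries.
 * Names here are illustrative stand-ins, not the kernel API.
 */
enum pixelcode { CODE_SBGGR10, CODE_SBGGR8 };

#define DATAWIDTH_8	(1 << 0)
#define DATAWIDTH_10	(1 << 1)

static const enum pixelcode colour_codes[] = { CODE_SBGGR10, CODE_SBGGR8 };

/* Pick the usable window of `codes` for the given bus-width flags */
static const enum pixelcode *select_codes(unsigned long flags,
					  const enum pixelcode *codes,
					  int *num_codes)
{
	*num_codes = 0;
	if (flags & DATAWIDTH_10)
		(*num_codes)++;	/* keep the leading 10-bit entry */
	else
		codes++;	/* bus cannot carry 10 bits: skip it */
	if (flags & DATAWIDTH_8)
		(*num_codes)++;
	return codes;
}
```

An 8-bit-only bus then sees exactly one code, `CODE_SBGGR8`, while a full-width bus sees both entries starting at the 10-bit one.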
diff --git a/drivers/media/video/mx1_camera.c b/drivers/media/video/mx1_camera.c
index 659d20a..8e73c77 100644
--- a/drivers/media/video/mx1_camera.c
+++ b/drivers/media/video/mx1_camera.c
@@ -93,9 +93,9 @@
 /* buffer for one video frame */
 struct mx1_buffer {
 	/* common v4l buffer stuff -- must be first */
-	struct videobuf_buffer vb;
-	const struct soc_camera_data_format *fmt;
-	int inwork;
+	struct videobuf_buffer		vb;
+	enum v4l2_imgbus_pixelcode	code;
+	int				inwork;
 };
 
 /*
@@ -127,9 +127,13 @@ static int mx1_videobuf_setup(struct videobuf_queue *vq, unsigned int *count,
 			      unsigned int *size)
 {
 	struct soc_camera_device *icd = vq->priv_data;
+	int bytes_per_line = v4l2_imgbus_bytes_per_line(icd->user_width,
+						icd->current_fmt->host_fmt);
 
-	*size = icd->user_width * icd->user_height *
-		((icd->current_fmt->depth + 7) >> 3);
+	if (bytes_per_line < 0)
+		return bytes_per_line;
+
+	*size = bytes_per_line * icd->user_height;
 
 	if (!*count)
 		*count = 32;
@@ -168,6 +172,11 @@ static int mx1_videobuf_prepare(struct videobuf_queue *vq,
 	struct soc_camera_device *icd = vq->priv_data;
 	struct mx1_buffer *buf = container_of(vb, struct mx1_buffer, vb);
 	int ret;
+	int bytes_per_line = v4l2_imgbus_bytes_per_line(icd->user_width,
+						icd->current_fmt->host_fmt);
+
+	if (bytes_per_line < 0)
+		return bytes_per_line;
 
 	dev_dbg(icd->dev.parent, "%s (vb=0x%p) 0x%08lx %d\n", __func__,
 		vb, vb->baddr, vb->bsize);
@@ -183,18 +192,18 @@ static int mx1_videobuf_prepare(struct videobuf_queue *vq,
 	 */
 	buf->inwork = 1;
 
-	if (buf->fmt	!= icd->current_fmt ||
+	if (buf->code	!= icd->current_fmt->code ||
 	    vb->width	!= icd->user_width ||
 	    vb->height	!= icd->user_height ||
 	    vb->field	!= field) {
-		buf->fmt	= icd->current_fmt;
+		buf->code	= icd->current_fmt->code;
 		vb->width	= icd->user_width;
 		vb->height	= icd->user_height;
 		vb->field	= field;
 		vb->state	= VIDEOBUF_NEEDS_INIT;
 	}
 
-	vb->size = vb->width * vb->height * ((buf->fmt->depth + 7) >> 3);
+	vb->size = bytes_per_line * vb->height;
 	if (0 != vb->baddr && vb->bsize < vb->size) {
 		ret = -EINVAL;
 		goto out;
@@ -496,12 +505,10 @@ static int mx1_camera_set_bus_param(struct soc_camera_device *icd, __u32 pixfmt)
 
 	/* MX1 supports only 8bit buswidth */
 	common_flags = soc_camera_bus_param_compatible(camera_flags,
-							       CSI_BUS_FLAGS);
+						       CSI_BUS_FLAGS);
 	if (!common_flags)
 		return -EINVAL;
 
-	icd->buswidth = 8;
-
 	/* Make choices, based on platform choice */
 	if ((common_flags & SOCAM_VSYNC_ACTIVE_HIGH) &&
 		(common_flags & SOCAM_VSYNC_ACTIVE_LOW)) {
@@ -554,7 +561,8 @@ static int mx1_camera_set_fmt(struct soc_camera_device *icd,
 	struct v4l2_subdev *sd = soc_camera_to_subdev(icd);
 	const struct soc_camera_format_xlate *xlate;
 	struct v4l2_pix_format *pix = &f->fmt.pix;
-	int ret;
+	struct v4l2_imgbus_framefmt imgf;
+	int ret, buswidth;
 
 	xlate = soc_camera_xlate_by_fourcc(icd, pix->pixelformat);
 	if (!xlate) {
@@ -563,12 +571,28 @@ static int mx1_camera_set_fmt(struct soc_camera_device *icd,
 		return -EINVAL;
 	}
 
-	ret = v4l2_subdev_call(sd, video, s_fmt, f);
-	if (!ret) {
-		icd->buswidth = xlate->buswidth;
-		icd->current_fmt = xlate->host_fmt;
+	buswidth = xlate->host_fmt->bits_per_sample;
+	if (buswidth > 8) {
+		dev_warn(icd->dev.parent,
+			 "bits-per-sample %d for format %x unsupported\n",
+			 buswidth, pix->pixelformat);
+		return -EINVAL;
 	}
 
+	imgf.width	= pix->width;
+	imgf.height	= pix->height;
+	imgf.code	= xlate->code;
+	imgf.field	= pix->field;
+
+	ret = v4l2_subdev_call(sd, video, s_imgbus_fmt, &imgf);
+	if (ret < 0)
+		return ret;
+
+	pix->width		= imgf.width;
+	pix->height		= imgf.height;
+	pix->field		= imgf.field;
+	icd->current_fmt	= xlate;
+
 	return ret;
 }
 
@@ -576,10 +600,34 @@ static int mx1_camera_try_fmt(struct soc_camera_device *icd,
 			      struct v4l2_format *f)
 {
 	struct v4l2_subdev *sd = soc_camera_to_subdev(icd);
+	const struct soc_camera_format_xlate *xlate;
+	struct v4l2_pix_format *pix = &f->fmt.pix;
+	struct v4l2_imgbus_framefmt imgf;
+	int ret;
 	/* TODO: limit to mx1 hardware capabilities */
 
+	xlate = soc_camera_xlate_by_fourcc(icd, pix->pixelformat);
+	if (!xlate) {
+		dev_warn(icd->dev.parent, "Format %x not found\n",
+			 pix->pixelformat);
+		return -EINVAL;
+	}
+
+	/* Fill the frame format before handing it to the subdevice */
+	imgf.width	= pix->width;
+	imgf.height	= pix->height;
+	imgf.code	= xlate->code;
+	imgf.field	= pix->field;
+
 	/* limit to sensor capabilities */
-	return v4l2_subdev_call(sd, video, try_fmt, f);
+	ret = v4l2_subdev_call(sd, video, try_imgbus_fmt, &imgf);
+	if (ret < 0)
+		return ret;
+
+	pix->width	= imgf.width;
+	pix->height	= imgf.height;
+	pix->field	= imgf.field;
+
+	return 0;
 }
 
 static int mx1_camera_reqbufs(struct soc_camera_file *icf,
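Both host drivers above replace the old `width * depth / 8` buffer sizing with `v4l2_imgbus_bytes_per_line()`, which derives the in-memory line length from `bits_per_sample` and the packing. A minimal sketch of how such a helper could work, assuming simplified stand-in types (the real helper lives in the new image-bus core and covers more packings):

```c
/*
 * Hypothetical sketch of a v4l2_imgbus_bytes_per_line()-style helper:
 * derive the memory line length from bits_per_sample and packing.
 * The enum values and struct mirror the patch loosely; the arithmetic
 * is an assumption for illustration.
 */
enum imgbus_packing {
	PACKING_NONE,		/* sample fits in one bus cycle, <= 8 bits */
	PACKING_2X8,		/* one sample split over two 8-bit cycles */
	PACKING_EXTEND16,	/* sample padded out to 16 bits in memory */
};

struct imgbus_pixelfmt {
	unsigned int bits_per_sample;
	enum imgbus_packing packing;
};

static int imgbus_bytes_per_line(unsigned int width,
				 const struct imgbus_pixelfmt *fmt)
{
	switch (fmt->packing) {
	case PACKING_NONE:
		return width * fmt->bits_per_sample / 8;
	case PACKING_2X8:
	case PACKING_EXTEND16:
		return width * 2;	/* two bytes per sample in memory */
	}
	return -1;	/* unknown packing: let the caller bail out */
}
```

This is also why the videobuf paths now check for a negative return before computing `*size = bytes_per_line * height`.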
diff --git a/drivers/media/video/mx3_camera.c b/drivers/media/video/mx3_camera.c
index 545a430..ab551f1 100644
--- a/drivers/media/video/mx3_camera.c
+++ b/drivers/media/video/mx3_camera.c
@@ -62,7 +62,7 @@
 struct mx3_camera_buffer {
 	/* common v4l buffer stuff -- must be first */
 	struct videobuf_buffer			vb;
-	const struct soc_camera_data_format	*fmt;
+	enum v4l2_imgbus_pixelcode		code;
 
 	/* One descriptor per scatterlist (per frame) */
 	struct dma_async_tx_descriptor		*txd;
@@ -117,8 +117,6 @@ struct dma_chan_request {
 	enum ipu_channel	id;
 };
 
-static int mx3_camera_set_bus_param(struct soc_camera_device *icd, __u32 pixfmt);
-
 static u32 csi_reg_read(struct mx3_camera_dev *mx3, off_t reg)
 {
 	return __raw_readl(mx3->base + reg);
@@ -210,17 +208,16 @@ static int mx3_videobuf_setup(struct videobuf_queue *vq, unsigned int *count,
 	struct soc_camera_device *icd = vq->priv_data;
 	struct soc_camera_host *ici = to_soc_camera_host(icd->dev.parent);
 	struct mx3_camera_dev *mx3_cam = ici->priv;
-	/*
-	 * bits-per-pixel (depth) as specified in camera's pixel format does
-	 * not necessarily match what the camera interface writes to RAM, but
-	 * it should be good enough for now.
-	 */
-	unsigned int bpp = DIV_ROUND_UP(icd->current_fmt->depth, 8);
+	int bytes_per_line = v4l2_imgbus_bytes_per_line(icd->user_width,
+						icd->current_fmt->host_fmt);
+
+	if (bytes_per_line < 0)
+		return bytes_per_line;
 
 	if (!mx3_cam->idmac_channel[0])
 		return -EINVAL;
 
-	*size = icd->user_width * icd->user_height * bpp;
+	*size = bytes_per_line * icd->user_height;
 
 	if (!*count)
 		*count = 32;
@@ -240,21 +237,26 @@ static int mx3_videobuf_prepare(struct videobuf_queue *vq,
 	struct mx3_camera_dev *mx3_cam = ici->priv;
 	struct mx3_camera_buffer *buf =
 		container_of(vb, struct mx3_camera_buffer, vb);
-	/* current_fmt _must_ always be set */
-	size_t new_size = icd->user_width * icd->user_height *
-		((icd->current_fmt->depth + 7) >> 3);
+	size_t new_size;
 	int ret;
+	int bytes_per_line = v4l2_imgbus_bytes_per_line(icd->user_width,
+						icd->current_fmt->host_fmt);
+
+	if (bytes_per_line < 0)
+		return bytes_per_line;
+
+	new_size = bytes_per_line * icd->user_height;
 
 	/*
 	 * I think, in buf_prepare you only have to protect global data,
 	 * the actual buffer is yours
 	 */
 
-	if (buf->fmt	!= icd->current_fmt ||
+	if (buf->code	!= icd->current_fmt->code ||
 	    vb->width	!= icd->user_width ||
 	    vb->height	!= icd->user_height ||
 	    vb->field	!= field) {
-		buf->fmt	= icd->current_fmt;
+		buf->code	= icd->current_fmt->code;
 		vb->width	= icd->user_width;
 		vb->height	= icd->user_height;
 		vb->field	= field;
@@ -347,13 +349,13 @@ static void mx3_videobuf_queue(struct videobuf_queue *vq,
 	struct dma_async_tx_descriptor *txd = buf->txd;
 	struct idmac_channel *ichan = to_idmac_chan(txd->chan);
 	struct idmac_video_param *video = &ichan->params.video;
-	const struct soc_camera_data_format *data_fmt = icd->current_fmt;
 	dma_cookie_t cookie;
+	u32 fourcc = icd->current_fmt->host_fmt->fourcc;
 
 	BUG_ON(!irqs_disabled());
 
 	/* This is the configuration of one sg-element */
-	video->out_pixel_fmt	= fourcc_to_ipu_pix(data_fmt->fourcc);
+	video->out_pixel_fmt	= fourcc_to_ipu_pix(fourcc);
 	video->out_width	= icd->user_width;
 	video->out_height	= icd->user_height;
 	video->out_stride	= icd->user_width;
@@ -567,28 +569,33 @@ static int test_platform_param(struct mx3_camera_dev *mx3_cam,
 	 * If requested data width is supported by the platform, use it or any
 	 * possible lower value - i.MX31 is smart enough to shift bits
 	 */
+	if (mx3_cam->platform_flags & MX3_CAMERA_DATAWIDTH_15)
+		*flags |= SOCAM_DATAWIDTH_15 | SOCAM_DATAWIDTH_10 |
+			SOCAM_DATAWIDTH_8 | SOCAM_DATAWIDTH_4;
+	else if (mx3_cam->platform_flags & MX3_CAMERA_DATAWIDTH_10)
+		*flags |= SOCAM_DATAWIDTH_10 | SOCAM_DATAWIDTH_8 |
+			SOCAM_DATAWIDTH_4;
+	else if (mx3_cam->platform_flags & MX3_CAMERA_DATAWIDTH_8)
+		*flags |= SOCAM_DATAWIDTH_8 | SOCAM_DATAWIDTH_4;
+	else if (mx3_cam->platform_flags & MX3_CAMERA_DATAWIDTH_4)
+		*flags |= SOCAM_DATAWIDTH_4;
+
 	switch (buswidth) {
 	case 15:
-		if (!(mx3_cam->platform_flags & MX3_CAMERA_DATAWIDTH_15))
+		if (!(*flags & SOCAM_DATAWIDTH_15))
 			return -EINVAL;
-		*flags |= SOCAM_DATAWIDTH_15 | SOCAM_DATAWIDTH_10 |
-			SOCAM_DATAWIDTH_8 | SOCAM_DATAWIDTH_4;
 		break;
 	case 10:
-		if (!(mx3_cam->platform_flags & MX3_CAMERA_DATAWIDTH_10))
+		if (!(*flags & SOCAM_DATAWIDTH_10))
 			return -EINVAL;
-		*flags |= SOCAM_DATAWIDTH_10 | SOCAM_DATAWIDTH_8 |
-			SOCAM_DATAWIDTH_4;
 		break;
 	case 8:
-		if (!(mx3_cam->platform_flags & MX3_CAMERA_DATAWIDTH_8))
+		if (!(*flags & SOCAM_DATAWIDTH_8))
 			return -EINVAL;
-		*flags |= SOCAM_DATAWIDTH_8 | SOCAM_DATAWIDTH_4;
 		break;
 	case 4:
-		if (!(mx3_cam->platform_flags & MX3_CAMERA_DATAWIDTH_4))
+		if (!(*flags & SOCAM_DATAWIDTH_4))
 			return -EINVAL;
-		*flags |= SOCAM_DATAWIDTH_4;
 		break;
 	default:
 		dev_warn(mx3_cam->soc_host.v4l2_dev.dev,
@@ -637,91 +644,95 @@ static bool chan_filter(struct dma_chan *chan, void *arg)
 		pdata->dma_dev == chan->device->dev;
 }
 
-static const struct soc_camera_data_format mx3_camera_formats[] = {
+static const struct v4l2_imgbus_pixelfmt mx3_camera_formats[] = {
 	{
-		.name		= "Bayer (sRGB) 8 bit",
-		.depth		= 8,
-		.fourcc		= V4L2_PIX_FMT_SBGGR8,
-		.colorspace	= V4L2_COLORSPACE_SRGB,
+		.fourcc			= V4L2_PIX_FMT_SBGGR8,
+		.colorspace		= V4L2_COLORSPACE_SRGB,
+		.name			= "Bayer (sRGB) 8 bit",
+		.bits_per_sample	= 8,
+		.packing		= V4L2_IMGBUS_PACKING_NONE,
+		.order			= V4L2_IMGBUS_ORDER_LE,
 	}, {
-		.name		= "Monochrome 8 bit",
-		.depth		= 8,
-		.fourcc		= V4L2_PIX_FMT_GREY,
-		.colorspace	= V4L2_COLORSPACE_JPEG,
+		.fourcc			= V4L2_PIX_FMT_GREY,
+		.colorspace		= V4L2_COLORSPACE_JPEG,
+		.name			= "Monochrome 8 bit",
+		.bits_per_sample	= 8,
+		.packing		= V4L2_IMGBUS_PACKING_NONE,
+		.order			= V4L2_IMGBUS_ORDER_LE,
 	},
 };
 
-static bool buswidth_supported(struct soc_camera_host *ici, int depth)
+/* This will be corrected as we get more formats */
+static bool mx3_camera_packing_supported(const struct v4l2_imgbus_pixelfmt *fmt)
 {
-	struct mx3_camera_dev *mx3_cam = ici->priv;
-
-	switch (depth) {
-	case 4:
-		return !!(mx3_cam->platform_flags & MX3_CAMERA_DATAWIDTH_4);
-	case 8:
-		return !!(mx3_cam->platform_flags & MX3_CAMERA_DATAWIDTH_8);
-	case 10:
-		return !!(mx3_cam->platform_flags & MX3_CAMERA_DATAWIDTH_10);
-	case 15:
-		return !!(mx3_cam->platform_flags & MX3_CAMERA_DATAWIDTH_15);
-	}
-	return false;
+	return	fmt->packing == V4L2_IMGBUS_PACKING_NONE ||
+		(fmt->bits_per_sample == 8 &&
+		 fmt->packing == V4L2_IMGBUS_PACKING_2X8) ||
+		(fmt->bits_per_sample > 8 &&
+		 fmt->packing == V4L2_IMGBUS_PACKING_EXTEND16);
 }
 
 static int mx3_camera_get_formats(struct soc_camera_device *icd, int idx,
 				  struct soc_camera_format_xlate *xlate)
 {
+	struct v4l2_subdev *sd = soc_camera_to_subdev(icd);
+	struct device *dev = icd->dev.parent;
 	struct soc_camera_host *ici = to_soc_camera_host(icd->dev.parent);
-	int formats = 0, buswidth, ret;
+	int formats = 0, ret;
+	enum v4l2_imgbus_pixelcode code;
+	const struct v4l2_imgbus_pixelfmt *fmt;
 
-	buswidth = icd->formats[idx].depth;
+	ret = v4l2_subdev_call(sd, video, enum_imgbus_fmt, idx, &code);
+	if (ret < 0)
+		/* No more formats */
+		return 0;
 
-	if (!buswidth_supported(ici, buswidth))
+	fmt = v4l2_imgbus_get_fmtdesc(code);
+	if (!fmt) {
+		dev_err(icd->dev.parent,
+			"Invalid format code #%d: %d\n", idx, code);
 		return 0;
+	}
 
-	ret = mx3_camera_try_bus_param(icd, buswidth);
+	/* This also checks support for the requested bits-per-sample */
+	ret = mx3_camera_try_bus_param(icd, fmt->bits_per_sample);
 	if (ret < 0)
 		return 0;
 
-	switch (icd->formats[idx].fourcc) {
-	case V4L2_PIX_FMT_SGRBG10:
+	switch (code) {
+	case V4L2_IMGBUS_FMT_SGRBG10:
 		formats++;
 		if (xlate) {
-			xlate->host_fmt = &mx3_camera_formats[0];
-			xlate->cam_fmt = icd->formats + idx;
-			xlate->buswidth = buswidth;
+			xlate->host_fmt	= &mx3_camera_formats[0];
+			xlate->code	= code;
 			xlate++;
-			dev_dbg(icd->dev.parent,
-				"Providing format %s using %s\n",
-				mx3_camera_formats[0].name,
-				icd->formats[idx].name);
+			dev_dbg(dev, "Providing format %s using code %d\n",
+				mx3_camera_formats[0].name, code);
 		}
-		goto passthrough;
-	case V4L2_PIX_FMT_Y16:
+		break;
+	case V4L2_IMGBUS_FMT_Y10:
 		formats++;
 		if (xlate) {
-			xlate->host_fmt = &mx3_camera_formats[1];
-			xlate->cam_fmt = icd->formats + idx;
-			xlate->buswidth = buswidth;
+			xlate->host_fmt	= &mx3_camera_formats[1];
+			xlate->code	= code;
 			xlate++;
-			dev_dbg(icd->dev.parent,
-				"Providing format %s using %s\n",
-				mx3_camera_formats[0].name,
-				icd->formats[idx].name);
+			dev_dbg(dev, "Providing format %s using code %d\n",
+				mx3_camera_formats[1].name, code);
 		}
+		break;
 	default:
-passthrough:
-		/* Generic pass-through */
-		formats++;
-		if (xlate) {
-			xlate->host_fmt = icd->formats + idx;
-			xlate->cam_fmt = icd->formats + idx;
-			xlate->buswidth = buswidth;
-			xlate++;
-			dev_dbg(icd->dev.parent,
-				"Providing format %s in pass-through mode\n",
-				icd->formats[idx].name);
-		}
+		if (!mx3_camera_packing_supported(fmt))
+			return 0;
+	}
+
+	/* Generic pass-through */
+	formats++;
+	if (xlate) {
+		xlate->host_fmt	= fmt;
+		xlate->code	= code;
+		xlate++;
+		dev_dbg(dev, "Providing format %x in pass-through mode\n",
+			fmt->fourcc);
 	}
 
 	return formats;
@@ -805,8 +816,7 @@ static int mx3_camera_set_crop(struct soc_camera_device *icd,
 	struct soc_camera_host *ici = to_soc_camera_host(icd->dev.parent);
 	struct mx3_camera_dev *mx3_cam = ici->priv;
 	struct v4l2_subdev *sd = soc_camera_to_subdev(icd);
-	struct v4l2_format f = {.type = V4L2_BUF_TYPE_VIDEO_CAPTURE};
-	struct v4l2_pix_format *pix = &f.fmt.pix;
+	struct v4l2_imgbus_framefmt imgf;
 	int ret;
 
 	soc_camera_limit_side(&rect->left, &rect->width, 0, 2, 4096);
@@ -817,19 +827,19 @@ static int mx3_camera_set_crop(struct soc_camera_device *icd,
 		return ret;
 
 	/* The capture device might have changed its output  */
-	ret = v4l2_subdev_call(sd, video, g_fmt, &f);
+	ret = v4l2_subdev_call(sd, video, g_imgbus_fmt, &imgf);
 	if (ret < 0)
 		return ret;
 
-	if (pix->width & 7) {
+	if (imgf.width & 7) {
 		/* Ouch! We can only handle 8-byte aligned width... */
-		stride_align(&pix->width);
-		ret = v4l2_subdev_call(sd, video, s_fmt, &f);
+		stride_align(&imgf.width);
+		ret = v4l2_subdev_call(sd, video, s_imgbus_fmt, &imgf);
 		if (ret < 0)
 			return ret;
 	}
 
-	if (pix->width != icd->user_width || pix->height != icd->user_height) {
+	if (imgf.width != icd->user_width || imgf.height != icd->user_height) {
 		/*
 		 * We now know pixel formats and can decide upon DMA-channel(s)
 		 * So far only direct camera-to-memory is supported
@@ -840,14 +850,14 @@ static int mx3_camera_set_crop(struct soc_camera_device *icd,
 				return ret;
 		}
 
-		configure_geometry(mx3_cam, pix->width, pix->height);
+		configure_geometry(mx3_cam, imgf.width, imgf.height);
 	}
 
 	dev_dbg(icd->dev.parent, "Sensor cropped %dx%d\n",
-		pix->width, pix->height);
+		imgf.width, imgf.height);
 
-	icd->user_width = pix->width;
-	icd->user_height = pix->height;
+	icd->user_width		= imgf.width;
+	icd->user_height	= imgf.height;
 
 	return ret;
 }
@@ -860,6 +870,7 @@ static int mx3_camera_set_fmt(struct soc_camera_device *icd,
 	struct v4l2_subdev *sd = soc_camera_to_subdev(icd);
 	const struct soc_camera_format_xlate *xlate;
 	struct v4l2_pix_format *pix = &f->fmt.pix;
+	struct v4l2_imgbus_framefmt imgf;
 	int ret;
 
 	xlate = soc_camera_xlate_by_fourcc(icd, pix->pixelformat);
@@ -884,11 +895,19 @@ static int mx3_camera_set_fmt(struct soc_camera_device *icd,
 
 	configure_geometry(mx3_cam, pix->width, pix->height);
 
-	ret = v4l2_subdev_call(sd, video, s_fmt, f);
-	if (!ret) {
-		icd->buswidth = xlate->buswidth;
-		icd->current_fmt = xlate->host_fmt;
-	}
+	imgf.width	= pix->width;
+	imgf.height	= pix->height;
+	imgf.code	= xlate->code;
+	imgf.field	= pix->field;
+
+	ret = v4l2_subdev_call(sd, video, s_imgbus_fmt, &imgf);
+	if (ret < 0)
+		return ret;
+
+	pix->width		= imgf.width;
+	pix->height		= imgf.height;
+	pix->field		= imgf.field;
+	icd->current_fmt	= xlate;
 
 	dev_dbg(icd->dev.parent, "Sensor set %dx%d\n", pix->width, pix->height);
 
@@ -901,8 +920,8 @@ static int mx3_camera_try_fmt(struct soc_camera_device *icd,
 	struct v4l2_subdev *sd = soc_camera_to_subdev(icd);
 	const struct soc_camera_format_xlate *xlate;
 	struct v4l2_pix_format *pix = &f->fmt.pix;
+	struct v4l2_imgbus_framefmt imgf;
 	__u32 pixfmt = pix->pixelformat;
-	enum v4l2_field field;
 	int ret;
 
 	xlate = soc_camera_xlate_by_fourcc(icd, pixfmt);
@@ -917,23 +936,34 @@ static int mx3_camera_try_fmt(struct soc_camera_device *icd,
 	if (pix->width > 4096)
 		pix->width = 4096;
 
-	pix->bytesperline = pix->width *
-		DIV_ROUND_UP(xlate->host_fmt->depth, 8);
+	ret = v4l2_imgbus_bytes_per_line(pix->width, xlate->host_fmt);
+	if (ret < 0)
+		return ret;
+	pix->bytesperline = ret;
 	pix->sizeimage = pix->height * pix->bytesperline;
 
-	/* camera has to see its format, but the user the original one */
-	pix->pixelformat = xlate->cam_fmt->fourcc;
 	/* limit to sensor capabilities */
-	ret = v4l2_subdev_call(sd, video, try_fmt, f);
-	pix->pixelformat = xlate->host_fmt->fourcc;
+	imgf.width	= pix->width;
+	imgf.height	= pix->height;
+	imgf.field	= pix->field;
+	imgf.code	= xlate->code;
+	ret = v4l2_subdev_call(sd, video, try_imgbus_fmt, &imgf);
+	if (ret < 0)
+		return ret;
 
-	field = pix->field;
+	pix->width	= imgf.width;
+	pix->height	= imgf.height;
 
-	if (field == V4L2_FIELD_ANY) {
+	switch (imgf.field) {
+	case V4L2_FIELD_ANY:
 		pix->field = V4L2_FIELD_NONE;
-	} else if (field != V4L2_FIELD_NONE) {
-		dev_err(icd->dev.parent, "Field type %d unsupported.\n", field);
-		return -EINVAL;
+		break;
+	case V4L2_FIELD_NONE:
+		break;
+	default:
+		dev_err(icd->dev.parent, "Field type %d unsupported.\n",
+			imgf.field);
+		ret = -EINVAL;
 	}
 
 	return ret;
@@ -969,18 +999,26 @@ static int mx3_camera_set_bus_param(struct soc_camera_device *icd, __u32 pixfmt)
 	struct mx3_camera_dev *mx3_cam = ici->priv;
 	unsigned long bus_flags, camera_flags, common_flags;
 	u32 dw, sens_conf;
-	int ret = test_platform_param(mx3_cam, icd->buswidth, &bus_flags);
+	const struct v4l2_imgbus_pixelfmt *fmt;
+	int buswidth;
+	int ret;
 	const struct soc_camera_format_xlate *xlate;
 	struct device *dev = icd->dev.parent;
 
+	fmt = v4l2_imgbus_get_fmtdesc(icd->current_fmt->code);
+	if (!fmt)
+		return -EINVAL;
+
+	buswidth = fmt->bits_per_sample;
+	ret = test_platform_param(mx3_cam, buswidth, &bus_flags);
+
 	xlate = soc_camera_xlate_by_fourcc(icd, pixfmt);
 	if (!xlate) {
 		dev_warn(dev, "Format %x not found\n", pixfmt);
 		return -EINVAL;
 	}
 
-	dev_dbg(dev, "requested bus width %d bit: %d\n",
-		icd->buswidth, ret);
+	dev_dbg(dev, "requested bus width %d bit: %d\n", buswidth, ret);
 
 	if (ret < 0)
 		return ret;
@@ -1081,7 +1119,7 @@ static int mx3_camera_set_bus_param(struct soc_camera_device *icd, __u32 pixfmt)
 		sens_conf |= 1 << CSI_SENS_CONF_DATA_POL_SHIFT;
 
 	/* Just do what we're asked to do */
-	switch (xlate->host_fmt->depth) {
+	switch (xlate->host_fmt->bits_per_sample) {
 	case 4:
 		dw = 0 << CSI_SENS_CONF_DATA_WIDTH_SHIFT;
 		break;
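The reworked `mx3_camera_get_formats()` above shows the negotiation pattern of the new API: the host queries the sensor's codes by index via `enum_imgbus_fmt` until it gets an error, and records one or more host-side formats per code (a converted variant plus generic pass-through). A sketch of that loop with stand-in types and a fake sensor table (the real xlate bookkeeping lives in soc-camera):

```c
/*
 * Sketch of the format-negotiation loop behind get_formats(): iterate
 * sensor codes until the subdev reports no more, counting the host
 * formats offered for each. Types and tables are illustrative.
 */
enum pixelcode { CODE_SGRBG10, CODE_Y10, CODE_YUYV };

/* Fake sensor: what its enum_imgbus_fmt op would report */
static const enum pixelcode sensor_codes[] = { CODE_SGRBG10, CODE_YUYV };

static int sensor_enum_fmt(int index, enum pixelcode *code)
{
	if (index < 0 || index >= 2)
		return -22;	/* stands in for -EINVAL: no more formats */
	*code = sensor_codes[index];
	return 0;
}

/* Count the translations the host can offer across all sensor codes */
static int count_host_formats(void)
{
	enum pixelcode code;
	int idx, formats = 0;

	for (idx = 0; sensor_enum_fmt(idx, &code) == 0; idx++) {
		if (code == CODE_SGRBG10)
			formats++;	/* extra host-converted variant */
		formats++;		/* generic pass-through entry */
	}
	return formats;
}
```

With the fake table above the host would offer three formats: two for the Bayer code, one pass-through for YUYV.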
diff --git a/drivers/media/video/ov772x.c b/drivers/media/video/ov772x.c
index dbaf508..f969011 100644
--- a/drivers/media/video/ov772x.c
+++ b/drivers/media/video/ov772x.c
@@ -382,7 +382,7 @@ struct regval_list {
 };
 
 struct ov772x_color_format {
-	const struct soc_camera_data_format *format;
+	const enum v4l2_imgbus_pixelcode code;
 	u8 dsp3;
 	u8 com3;
 	u8 com7;
@@ -434,93 +434,50 @@ static const struct regval_list ov772x_vga_regs[] = {
 };
 
 /*
- * supported format list
- */
-
-#define SETFOURCC(type) .name = (#type), .fourcc = (V4L2_PIX_FMT_ ## type)
-static const struct soc_camera_data_format ov772x_fmt_lists[] = {
-	{
-		SETFOURCC(YUYV),
-		.depth      = 16,
-		.colorspace = V4L2_COLORSPACE_JPEG,
-	},
-	{
-		SETFOURCC(YVYU),
-		.depth      = 16,
-		.colorspace = V4L2_COLORSPACE_JPEG,
-	},
-	{
-		SETFOURCC(UYVY),
-		.depth      = 16,
-		.colorspace = V4L2_COLORSPACE_JPEG,
-	},
-	{
-		SETFOURCC(RGB555),
-		.depth      = 16,
-		.colorspace = V4L2_COLORSPACE_SRGB,
-	},
-	{
-		SETFOURCC(RGB555X),
-		.depth      = 16,
-		.colorspace = V4L2_COLORSPACE_SRGB,
-	},
-	{
-		SETFOURCC(RGB565),
-		.depth      = 16,
-		.colorspace = V4L2_COLORSPACE_SRGB,
-	},
-	{
-		SETFOURCC(RGB565X),
-		.depth      = 16,
-		.colorspace = V4L2_COLORSPACE_SRGB,
-	},
-};
-
-/*
- * color format list
+ * supported color format list
  */
 static const struct ov772x_color_format ov772x_cfmts[] = {
 	{
-		.format = &ov772x_fmt_lists[0],
-		.dsp3   = 0x0,
-		.com3   = SWAP_YUV,
-		.com7   = OFMT_YUV,
+		.code	= V4L2_IMGBUS_FMT_YUYV,
+		.dsp3	= 0x0,
+		.com3	= SWAP_YUV,
+		.com7	= OFMT_YUV,
 	},
 	{
-		.format = &ov772x_fmt_lists[1],
-		.dsp3   = UV_ON,
-		.com3   = SWAP_YUV,
-		.com7   = OFMT_YUV,
+		.code	= V4L2_IMGBUS_FMT_YVYU,
+		.dsp3	= UV_ON,
+		.com3	= SWAP_YUV,
+		.com7	= OFMT_YUV,
 	},
 	{
-		.format = &ov772x_fmt_lists[2],
-		.dsp3   = 0x0,
-		.com3   = 0x0,
-		.com7   = OFMT_YUV,
+		.code	= V4L2_IMGBUS_FMT_UYVY,
+		.dsp3	= 0x0,
+		.com3	= 0x0,
+		.com7	= OFMT_YUV,
 	},
 	{
-		.format = &ov772x_fmt_lists[3],
-		.dsp3   = 0x0,
-		.com3   = SWAP_RGB,
-		.com7   = FMT_RGB555 | OFMT_RGB,
+		.code	= V4L2_IMGBUS_FMT_RGB555,
+		.dsp3	= 0x0,
+		.com3	= SWAP_RGB,
+		.com7	= FMT_RGB555 | OFMT_RGB,
 	},
 	{
-		.format = &ov772x_fmt_lists[4],
-		.dsp3   = 0x0,
-		.com3   = 0x0,
-		.com7   = FMT_RGB555 | OFMT_RGB,
+		.code	= V4L2_IMGBUS_FMT_RGB555X,
+		.dsp3	= 0x0,
+		.com3	= 0x0,
+		.com7	= FMT_RGB555 | OFMT_RGB,
 	},
 	{
-		.format = &ov772x_fmt_lists[5],
-		.dsp3   = 0x0,
-		.com3   = SWAP_RGB,
-		.com7   = FMT_RGB565 | OFMT_RGB,
+		.code	= V4L2_IMGBUS_FMT_RGB565,
+		.dsp3	= 0x0,
+		.com3	= SWAP_RGB,
+		.com7	= FMT_RGB565 | OFMT_RGB,
 	},
 	{
-		.format = &ov772x_fmt_lists[6],
-		.dsp3   = 0x0,
-		.com3   = 0x0,
-		.com7   = FMT_RGB565 | OFMT_RGB,
+		.code	= V4L2_IMGBUS_FMT_RGB565X,
+		.dsp3	= 0x0,
+		.com3	= 0x0,
+		.com7	= FMT_RGB565 | OFMT_RGB,
 	},
 };
 
@@ -649,8 +606,8 @@ static int ov772x_s_stream(struct v4l2_subdev *sd, int enable)
 
 	ov772x_mask_set(client, COM2, SOFT_SLEEP_MODE, 0);
 
-	dev_dbg(&client->dev, "format %s, win %s\n",
-		priv->fmt->format->name, priv->win->name);
+	dev_dbg(&client->dev, "format %d, win %s\n",
+		priv->fmt->code, priv->win->name);
 
 	return 0;
 }
@@ -806,8 +763,8 @@ static const struct ov772x_win_size *ov772x_select_win(u32 width, u32 height)
 	return win;
 }
 
-static int ov772x_set_params(struct i2c_client *client,
-			     u32 *width, u32 *height, u32 pixfmt)
+static int ov772x_set_params(struct i2c_client *client, u32 *width, u32 *height,
+			     enum v4l2_imgbus_pixelcode code)
 {
 	struct ov772x_priv *priv = to_ov772x(client);
 	int ret = -EINVAL;
@@ -819,7 +776,7 @@ static int ov772x_set_params(struct i2c_client *client,
 	 */
 	priv->fmt = NULL;
 	for (i = 0; i < ARRAY_SIZE(ov772x_cfmts); i++) {
-		if (pixfmt == ov772x_cfmts[i].format->fourcc) {
+		if (code == ov772x_cfmts[i].code) {
 			priv->fmt = ov772x_cfmts + i;
 			break;
 		}
@@ -925,7 +882,7 @@ static int ov772x_set_params(struct i2c_client *client,
 	 */
 	val = priv->win->com7_bit | priv->fmt->com7;
 	ret = ov772x_mask_set(client,
-			      COM7, (SLCT_MASK | FMT_MASK | OFMT_MASK),
+			      COM7, SLCT_MASK | FMT_MASK | OFMT_MASK,
 			      val);
 	if (ret < 0)
 		goto ov772x_set_fmt_error;
@@ -981,54 +938,50 @@ static int ov772x_cropcap(struct v4l2_subdev *sd, struct v4l2_cropcap *a)
 	return 0;
 }
 
-static int ov772x_g_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
+static int ov772x_g_fmt(struct v4l2_subdev *sd,
+			struct v4l2_imgbus_framefmt *imgf)
 {
 	struct i2c_client *client = sd->priv;
 	struct ov772x_priv *priv = to_ov772x(client);
-	struct v4l2_pix_format *pix = &f->fmt.pix;
 
 	if (!priv->win || !priv->fmt) {
 		u32 width = VGA_WIDTH, height = VGA_HEIGHT;
 		int ret = ov772x_set_params(client, &width, &height,
-					    V4L2_PIX_FMT_YUYV);
+					    V4L2_IMGBUS_FMT_YUYV);
 		if (ret < 0)
 			return ret;
 	}
 
-	f->type			= V4L2_BUF_TYPE_VIDEO_CAPTURE;
-
-	pix->width		= priv->win->width;
-	pix->height		= priv->win->height;
-	pix->pixelformat	= priv->fmt->format->fourcc;
-	pix->colorspace		= priv->fmt->format->colorspace;
-	pix->field		= V4L2_FIELD_NONE;
+	imgf->width	= priv->win->width;
+	imgf->height	= priv->win->height;
+	imgf->code	= priv->fmt->code;
+	imgf->field	= V4L2_FIELD_NONE;
 
 	return 0;
 }
 
-static int ov772x_s_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
+static int ov772x_s_fmt(struct v4l2_subdev *sd,
+			struct v4l2_imgbus_framefmt *imgf)
 {
 	struct i2c_client *client = sd->priv;
-	struct v4l2_pix_format *pix = &f->fmt.pix;
 
-	return ov772x_set_params(client, &pix->width, &pix->height,
-				 pix->pixelformat);
+	return ov772x_set_params(client, &imgf->width, &imgf->height,
+				 imgf->code);
 }
 
 static int ov772x_try_fmt(struct v4l2_subdev *sd,
-			  struct v4l2_format *f)
+			  struct v4l2_imgbus_framefmt *imgf)
 {
-	struct v4l2_pix_format *pix = &f->fmt.pix;
 	const struct ov772x_win_size *win;
 
 	/*
 	 * select suitable win
 	 */
-	win = ov772x_select_win(pix->width, pix->height);
+	win = ov772x_select_win(imgf->width, imgf->height);
 
-	pix->width  = win->width;
-	pix->height = win->height;
-	pix->field  = V4L2_FIELD_NONE;
+	imgf->width  = win->width;
+	imgf->height = win->height;
+	imgf->field  = V4L2_FIELD_NONE;
 
 	return 0;
 }
@@ -1057,9 +1010,6 @@ static int ov772x_video_probe(struct soc_camera_device *icd,
 		return -ENODEV;
 	}
 
-	icd->formats     = ov772x_fmt_lists;
-	icd->num_formats = ARRAY_SIZE(ov772x_fmt_lists);
-
 	/*
 	 * check and show product ID and manufacturer ID
 	 */
@@ -1109,13 +1059,24 @@ static struct v4l2_subdev_core_ops ov772x_subdev_core_ops = {
 #endif
 };
 
+static int ov772x_enum_fmt(struct v4l2_subdev *sd, int index,
+			   enum v4l2_imgbus_pixelcode *code)
+{
+	if ((unsigned int)index >= ARRAY_SIZE(ov772x_cfmts))
+		return -EINVAL;
+
+	*code = ov772x_cfmts[index].code;
+	return 0;
+}
+
 static struct v4l2_subdev_video_ops ov772x_subdev_video_ops = {
-	.s_stream	= ov772x_s_stream,
-	.g_fmt		= ov772x_g_fmt,
-	.s_fmt		= ov772x_s_fmt,
-	.try_fmt	= ov772x_try_fmt,
-	.cropcap	= ov772x_cropcap,
-	.g_crop		= ov772x_g_crop,
+	.s_stream		= ov772x_s_stream,
+	.g_imgbus_fmt		= ov772x_g_fmt,
+	.s_imgbus_fmt		= ov772x_s_fmt,
+	.try_imgbus_fmt		= ov772x_try_fmt,
+	.cropcap		= ov772x_cropcap,
+	.g_crop			= ov772x_g_crop,
+	.enum_imgbus_fmt	= ov772x_enum_fmt,
 };
 
 static struct v4l2_subdev_ops ov772x_subdev_ops = {
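For reviewers: the enum_imgbus_fmt conversion above follows the same pattern in every sensor driver in this series, an index-validated lookup into a static code table, with a cast to unsigned rejecting negative indices in one comparison. A minimal user-space sketch of that pattern (the mock_pixelcode enum and fmt_codes table here are stand-ins for the real v4l2_imgbus_pixelcode values, not part of the patch):

```c
#include <assert.h>
#include <stddef.h>

/* Mock stand-ins for the imgbus pixel codes introduced by this series */
enum mock_pixelcode { MOCK_FMT_UYVY, MOCK_FMT_RGB565 };

static const enum mock_pixelcode fmt_codes[] = {
	MOCK_FMT_UYVY,
	MOCK_FMT_RGB565,
};

#define ARRAY_SIZE(a) (sizeof(a) / sizeof((a)[0]))

/*
 * Same shape as ov772x_enum_fmt() / ov9640_enum_fmt(): the cast to
 * unsigned int makes a negative index compare larger than the table
 * size, so one test covers both bounds.
 */
static int enum_fmt(int index, enum mock_pixelcode *code)
{
	if ((unsigned int)index >= ARRAY_SIZE(fmt_codes))
		return -1;	/* -EINVAL in the kernel */

	*code = fmt_codes[index];
	return 0;
}
```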
diff --git a/drivers/media/video/ov9640.c b/drivers/media/video/ov9640.c
index c81ae21..b63d921 100644
--- a/drivers/media/video/ov9640.c
+++ b/drivers/media/video/ov9640.c
@@ -160,13 +160,8 @@ static const struct ov9640_reg ov9640_regs_rgb[] = {
  * this version of the driver. To test and debug these formats add two entries
  * to the below array, see ov722x.c for an example.
  */
-static const struct soc_camera_data_format ov9640_fmt_lists[] = {
-	{
-		.name		= "UYVY",
-		.fourcc		= V4L2_PIX_FMT_UYVY,
-		.depth		= 16,
-		.colorspace	= V4L2_COLORSPACE_JPEG,
-	},
+static const enum v4l2_imgbus_pixelcode ov9640_fmt_codes[] = {
+	V4L2_IMGBUS_FMT_UYVY,
 };
 
 static const struct v4l2_queryctrl ov9640_controls[] = {
@@ -434,20 +429,22 @@ static void ov9640_res_roundup(u32 *width, u32 *height)
 }
 
 /* Prepare necessary register changes depending on color encoding */
-static void ov9640_alter_regs(u32 pixfmt, struct ov9640_reg_alt *alt)
+static void ov9640_alter_regs(enum v4l2_imgbus_pixelcode code,
+			      struct ov9640_reg_alt *alt)
 {
-	switch (pixfmt) {
-	case V4L2_PIX_FMT_UYVY:
+	switch (code) {
+	default:
+	case V4L2_IMGBUS_FMT_UYVY:
 		alt->com12	= OV9640_COM12_YUV_AVG;
 		alt->com13	= OV9640_COM13_Y_DELAY_EN |
 					OV9640_COM13_YUV_DLY(0x01);
 		break;
-	case V4L2_PIX_FMT_RGB555:
+	case V4L2_IMGBUS_FMT_RGB555:
 		alt->com7	= OV9640_COM7_RGB;
 		alt->com13	= OV9640_COM13_RGB_AVG;
 		alt->com15	= OV9640_COM15_RGB_555;
 		break;
-	case V4L2_PIX_FMT_RGB565:
+	case V4L2_IMGBUS_FMT_RGB565:
 		alt->com7	= OV9640_COM7_RGB;
 		alt->com13	= OV9640_COM13_RGB_AVG;
 		alt->com15	= OV9640_COM15_RGB_565;
@@ -456,8 +453,8 @@ static void ov9640_alter_regs(u32 pixfmt, struct ov9640_reg_alt *alt)
 }
 
 /* Setup registers according to resolution and color encoding */
-static int ov9640_write_regs(struct i2c_client *client,
-		u32 width, u32 pixfmt, struct ov9640_reg_alt *alts)
+static int ov9640_write_regs(struct i2c_client *client, u32 width,
+		enum v4l2_imgbus_pixelcode code, struct ov9640_reg_alt *alts)
 {
 	const struct ov9640_reg	*ov9640_regs, *matrix_regs;
 	int			ov9640_regs_len, matrix_regs_len;
@@ -500,7 +497,7 @@ static int ov9640_write_regs(struct i2c_client *client,
 	}
 
 	/* select color matrix configuration for given color encoding */
-	if (pixfmt == V4L2_PIX_FMT_UYVY) {
+	if (code == V4L2_IMGBUS_FMT_UYVY) {
 		matrix_regs	= ov9640_regs_yuv;
 		matrix_regs_len	= ARRAY_SIZE(ov9640_regs_yuv);
 	} else {
@@ -562,15 +559,15 @@ static int ov9640_prog_dflt(struct i2c_client *client)
 }
 
 /* set the format we will capture in */
-static int ov9640_s_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
+static int ov9640_s_fmt(struct v4l2_subdev *sd,
+			struct v4l2_imgbus_framefmt *imgf)
 {
 	struct i2c_client *client = sd->priv;
-	struct v4l2_pix_format *pix = &f->fmt.pix;
 	struct ov9640_reg_alt alts = {0};
 	int ret;
 
-	ov9640_res_roundup(&pix->width, &pix->height);
-	ov9640_alter_regs(pix->pixelformat, &alts);
+	ov9640_res_roundup(&imgf->width, &imgf->height);
+	ov9640_alter_regs(imgf->code, &alts);
 
 	ov9640_reset(client);
 
@@ -578,16 +575,25 @@ static int ov9640_s_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
 	if (ret)
 		return ret;
 
-	return ov9640_write_regs(client, pix->width, pix->pixelformat, &alts);
+	return ov9640_write_regs(client, imgf->width, imgf->code, &alts);
 }
 
-static int ov9640_try_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
+static int ov9640_try_fmt(struct v4l2_subdev *sd,
+			  struct v4l2_imgbus_framefmt *imgf)
 {
-	struct v4l2_pix_format *pix = &f->fmt.pix;
+	ov9640_res_roundup(&imgf->width, &imgf->height);
+	imgf->field  = V4L2_FIELD_NONE;
 
-	ov9640_res_roundup(&pix->width, &pix->height);
-	pix->field  = V4L2_FIELD_NONE;
+	return 0;
+}
 
+static int ov9640_enum_fmt(struct v4l2_subdev *sd, int index,
+			   enum v4l2_imgbus_pixelcode *code)
+{
+	if ((unsigned int)index >= ARRAY_SIZE(ov9640_fmt_codes))
+		return -EINVAL;
+
+	*code = ov9640_fmt_codes[index];
 	return 0;
 }
 
@@ -637,9 +643,6 @@ static int ov9640_video_probe(struct soc_camera_device *icd,
 		goto err;
 	}
 
-	icd->formats		= ov9640_fmt_lists;
-	icd->num_formats	= ARRAY_SIZE(ov9640_fmt_lists);
-
 	/*
 	 * check and show product ID and manufacturer ID
 	 */
@@ -703,8 +706,9 @@ static struct v4l2_subdev_core_ops ov9640_core_ops = {
 
 static struct v4l2_subdev_video_ops ov9640_video_ops = {
 	.s_stream		= ov9640_s_stream,
-	.s_fmt			= ov9640_s_fmt,
-	.try_fmt		= ov9640_try_fmt,
+	.s_imgbus_fmt		= ov9640_s_fmt,
+	.try_imgbus_fmt		= ov9640_try_fmt,
+	.enum_imgbus_fmt	= ov9640_enum_fmt,
 	.cropcap		= ov9640_cropcap,
 	.g_crop			= ov9640_g_crop,
 
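The pxa_camera.c hunks below replace the old depth-based buffer size computation with v4l2_imgbus_bytes_per_line(). The helper itself lives in the imgbus core patch, not in these hunks; assuming it derives the line length from bits_per_sample and packing, its behaviour can be sketched in user space as follows (bytes_per_line and the packing enum here are hypothetical reimplementations, not the kernel code):

```c
#include <assert.h>

/* Assumed packing semantics, mirroring the series' naming */
enum packing { PACKING_NONE, PACKING_2X8, PACKING_EXTEND16 };

/*
 * Hypothetical model of v4l2_imgbus_bytes_per_line(): bytes consumed
 * on the host bus per line of 'width' pixels.
 */
static int bytes_per_line(unsigned int width, unsigned int bits_per_sample,
			  enum packing packing)
{
	switch (packing) {
	case PACKING_NONE:
		/* one sample per pixel, padded up to whole bytes */
		return width * ((bits_per_sample + 7) / 8);
	case PACKING_2X8:
		/* two 8-bit samples per pixel, e.g. YUV422 on an 8-bit bus */
		return width * 2;
	case PACKING_EXTEND16:
		/* each >8-bit sample extended to 16 bits */
		return width * 2;
	}
	return -1;	/* -EINVAL in the kernel */
}
```

With this model a 640-pixel YUV422 line on an 8-bit bus takes 1280 bytes, matching the old `width * ((depth + 7) >> 3)` computation for depth 16.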
diff --git a/drivers/media/video/pxa_camera.c b/drivers/media/video/pxa_camera.c
index f063f59..8dece33 100644
--- a/drivers/media/video/pxa_camera.c
+++ b/drivers/media/video/pxa_camera.c
@@ -183,16 +183,12 @@ struct pxa_cam_dma {
 /* buffer for one video frame */
 struct pxa_buffer {
 	/* common v4l buffer stuff -- must be first */
-	struct videobuf_buffer vb;
-
-	const struct soc_camera_data_format        *fmt;
-
+	struct videobuf_buffer		vb;
+	enum v4l2_imgbus_pixelcode	code;
 	/* our descriptor lists for Y, U and V channels */
-	struct pxa_cam_dma dmas[3];
-
-	int			inwork;
-
-	enum pxa_camera_active_dma active_dma;
+	struct pxa_cam_dma		dmas[3];
+	int				inwork;
+	enum pxa_camera_active_dma	active_dma;
 };
 
 struct pxa_camera_dev {
@@ -243,11 +239,15 @@ static int pxa_videobuf_setup(struct videobuf_queue *vq, unsigned int *count,
 			      unsigned int *size)
 {
 	struct soc_camera_device *icd = vq->priv_data;
+	int bytes_per_line = v4l2_imgbus_bytes_per_line(icd->user_width,
+						icd->current_fmt->host_fmt);
+
+	if (bytes_per_line < 0)
+		return bytes_per_line;
 
 	dev_dbg(icd->dev.parent, "count=%d, size=%d\n", *count, *size);
 
-	*size = roundup(icd->user_width * icd->user_height *
-			((icd->current_fmt->depth + 7) >> 3), 8);
+	*size = bytes_per_line * icd->user_height;
 
 	if (0 == *count)
 		*count = 32;
@@ -433,6 +433,11 @@ static int pxa_videobuf_prepare(struct videobuf_queue *vq,
 	struct pxa_buffer *buf = container_of(vb, struct pxa_buffer, vb);
 	int ret;
 	int size_y, size_u = 0, size_v = 0;
+	int bytes_per_line = v4l2_imgbus_bytes_per_line(icd->user_width,
+						icd->current_fmt->host_fmt);
+
+	if (bytes_per_line < 0)
+		return bytes_per_line;
 
 	dev_dbg(dev, "%s (vb=0x%p) 0x%08lx %d\n", __func__,
 		vb, vb->baddr, vb->bsize);
@@ -456,18 +461,18 @@ static int pxa_videobuf_prepare(struct videobuf_queue *vq,
 	 */
 	buf->inwork = 1;
 
-	if (buf->fmt	!= icd->current_fmt ||
+	if (buf->code	!= icd->current_fmt->code ||
 	    vb->width	!= icd->user_width ||
 	    vb->height	!= icd->user_height ||
 	    vb->field	!= field) {
-		buf->fmt	= icd->current_fmt;
+		buf->code	= icd->current_fmt->code;
 		vb->width	= icd->user_width;
 		vb->height	= icd->user_height;
 		vb->field	= field;
 		vb->state	= VIDEOBUF_NEEDS_INIT;
 	}
 
-	vb->size = vb->width * vb->height * ((buf->fmt->depth + 7) >> 3);
+	vb->size = bytes_per_line * vb->height;
 	if (0 != vb->baddr && vb->bsize < vb->size) {
 		ret = -EINVAL;
 		goto out;
@@ -1157,9 +1162,15 @@ static int pxa_camera_set_bus_param(struct soc_camera_device *icd, __u32 pixfmt)
 	struct soc_camera_host *ici = to_soc_camera_host(icd->dev.parent);
 	struct pxa_camera_dev *pcdev = ici->priv;
 	unsigned long bus_flags, camera_flags, common_flags;
-	int ret = test_platform_param(pcdev, icd->buswidth, &bus_flags);
+	const struct v4l2_imgbus_pixelfmt *fmt;
+	int ret;
 	struct pxa_cam *cam = icd->host_priv;
 
+	fmt = v4l2_imgbus_get_fmtdesc(icd->current_fmt->code);
+	if (!fmt)
+		return -EINVAL;
+
+	ret = test_platform_param(pcdev, fmt->bits_per_sample, &bus_flags);
 	if (ret < 0)
 		return ret;
 
@@ -1223,59 +1234,50 @@ static int pxa_camera_try_bus_param(struct soc_camera_device *icd,
 	return soc_camera_bus_param_compatible(camera_flags, bus_flags) ? 0 : -EINVAL;
 }
 
-static const struct soc_camera_data_format pxa_camera_formats[] = {
+static const struct v4l2_imgbus_pixelfmt pxa_camera_formats[] = {
 	{
-		.name		= "Planar YUV422 16 bit",
-		.depth		= 16,
-		.fourcc		= V4L2_PIX_FMT_YUV422P,
-		.colorspace	= V4L2_COLORSPACE_JPEG,
+		.fourcc			= V4L2_PIX_FMT_YUV422P,
+		.colorspace		= V4L2_COLORSPACE_JPEG,
+		.name			= "Planar YUV422 16 bit",
+		.bits_per_sample	= 8,
+		.packing		= V4L2_IMGBUS_PACKING_2X8,
+		.order			= V4L2_IMGBUS_ORDER_LE,
 	},
 };
 
-static bool buswidth_supported(struct soc_camera_device *icd, int depth)
-{
-	struct soc_camera_host *ici = to_soc_camera_host(icd->dev.parent);
-	struct pxa_camera_dev *pcdev = ici->priv;
-
-	switch (depth) {
-	case 8:
-		return !!(pcdev->platform_flags & PXA_CAMERA_DATAWIDTH_8);
-	case 9:
-		return !!(pcdev->platform_flags & PXA_CAMERA_DATAWIDTH_9);
-	case 10:
-		return !!(pcdev->platform_flags & PXA_CAMERA_DATAWIDTH_10);
-	}
-	return false;
-}
-
-static int required_buswidth(const struct soc_camera_data_format *fmt)
+/* This will be corrected as we get more formats */
+static bool pxa_camera_packing_supported(const struct v4l2_imgbus_pixelfmt *fmt)
 {
-	switch (fmt->fourcc) {
-	case V4L2_PIX_FMT_UYVY:
-	case V4L2_PIX_FMT_VYUY:
-	case V4L2_PIX_FMT_YUYV:
-	case V4L2_PIX_FMT_YVYU:
-	case V4L2_PIX_FMT_RGB565:
-	case V4L2_PIX_FMT_RGB555:
-		return 8;
-	default:
-		return fmt->depth;
-	}
+	return	fmt->packing == V4L2_IMGBUS_PACKING_NONE ||
+		(fmt->bits_per_sample == 8 &&
+		 fmt->packing == V4L2_IMGBUS_PACKING_2X8) ||
+		(fmt->bits_per_sample > 8 &&
+		 fmt->packing == V4L2_IMGBUS_PACKING_EXTEND16);
 }
 
 static int pxa_camera_get_formats(struct soc_camera_device *icd, int idx,
 				  struct soc_camera_format_xlate *xlate)
 {
+	struct v4l2_subdev *sd = soc_camera_to_subdev(icd);
 	struct device *dev = icd->dev.parent;
-	int formats = 0, buswidth, ret;
+	int formats = 0, ret;
 	struct pxa_cam *cam;
+	enum v4l2_imgbus_pixelcode code;
+	const struct v4l2_imgbus_pixelfmt *fmt;
 
-	buswidth = required_buswidth(icd->formats + idx);
+	ret = v4l2_subdev_call(sd, video, enum_imgbus_fmt, idx, &code);
+	if (ret < 0)
+		/* No more formats */
+		return 0;
 
-	if (!buswidth_supported(icd, buswidth))
+	fmt = v4l2_imgbus_get_fmtdesc(code);
+	if (!fmt) {
+		dev_err(dev, "Invalid format code #%d: %d\n", idx, code);
 		return 0;
+	}
 
-	ret = pxa_camera_try_bus_param(icd, buswidth);
+	/* This also checks support for the requested bits-per-sample */
+	ret = pxa_camera_try_bus_param(icd, fmt->bits_per_sample);
 	if (ret < 0)
 		return 0;
 
@@ -1289,45 +1291,40 @@ static int pxa_camera_get_formats(struct soc_camera_device *icd, int idx,
 		cam = icd->host_priv;
 	}
 
-	switch (icd->formats[idx].fourcc) {
-	case V4L2_PIX_FMT_UYVY:
+	switch (code) {
+	case V4L2_IMGBUS_FMT_UYVY:
 		formats++;
 		if (xlate) {
-			xlate->host_fmt = &pxa_camera_formats[0];
-			xlate->cam_fmt = icd->formats + idx;
-			xlate->buswidth = buswidth;
+			xlate->host_fmt	= &pxa_camera_formats[0];
+			xlate->code	= code;
 			xlate++;
-			dev_dbg(dev, "Providing format %s using %s\n",
-				pxa_camera_formats[0].name,
-				icd->formats[idx].name);
+			dev_dbg(dev, "Providing format %s using code %d\n",
+				pxa_camera_formats[0].name, code);
 		}
-	case V4L2_PIX_FMT_VYUY:
-	case V4L2_PIX_FMT_YUYV:
-	case V4L2_PIX_FMT_YVYU:
-	case V4L2_PIX_FMT_RGB565:
-	case V4L2_PIX_FMT_RGB555:
-		formats++;
-		if (xlate) {
-			xlate->host_fmt = icd->formats + idx;
-			xlate->cam_fmt = icd->formats + idx;
-			xlate->buswidth = buswidth;
-			xlate++;
+	case V4L2_IMGBUS_FMT_VYUY:
+	case V4L2_IMGBUS_FMT_YUYV:
+	case V4L2_IMGBUS_FMT_YVYU:
+	case V4L2_IMGBUS_FMT_RGB565:
+	case V4L2_IMGBUS_FMT_RGB555:
+		if (xlate)
 			dev_dbg(dev, "Providing format %s packed\n",
-				icd->formats[idx].name);
-		}
+				fmt->name);
 		break;
 	default:
-		/* Generic pass-through */
-		formats++;
-		if (xlate) {
-			xlate->host_fmt = icd->formats + idx;
-			xlate->cam_fmt = icd->formats + idx;
-			xlate->buswidth = icd->formats[idx].depth;
-			xlate++;
+		if (!pxa_camera_packing_supported(fmt))
+			return 0;
+		if (xlate)
 			dev_dbg(dev,
 				"Providing format %s in pass-through mode\n",
-				icd->formats[idx].name);
-		}
+				fmt->name);
+	}
+
+	/* Generic pass-through */
+	formats++;
+	if (xlate) {
+		xlate->host_fmt	= fmt;
+		xlate->code	= code;
+		xlate++;
 	}
 
 	return formats;
@@ -1339,11 +1336,11 @@ static void pxa_camera_put_formats(struct soc_camera_device *icd)
 	icd->host_priv = NULL;
 }
 
-static int pxa_camera_check_frame(struct v4l2_pix_format *pix)
+static int pxa_camera_check_frame(u32 width, u32 height)
 {
 	/* limit to pxa hardware capabilities */
-	return pix->height < 32 || pix->height > 2048 || pix->width < 48 ||
-		pix->width > 2048 || (pix->width & 0x01);
+	return height < 32 || height > 2048 || width < 48 || width > 2048 ||
+		(width & 0x01);
 }
 
 static int pxa_camera_set_crop(struct soc_camera_device *icd,
@@ -1358,9 +1355,9 @@ static int pxa_camera_set_crop(struct soc_camera_device *icd,
 		.master_clock = pcdev->mclk,
 		.pixel_clock_max = pcdev->ciclk / 4,
 	};
-	struct v4l2_format f;
-	struct v4l2_pix_format *pix = &f.fmt.pix, pix_tmp;
+	struct v4l2_imgbus_framefmt imgf;
 	struct pxa_cam *cam = icd->host_priv;
+	u32 fourcc = icd->current_fmt->host_fmt->fourcc;
 	int ret;
 
 	/* If PCLK is used to latch data from the sensor, check sense */
@@ -1377,27 +1374,23 @@ static int pxa_camera_set_crop(struct soc_camera_device *icd,
 		return ret;
 	}
 
-	f.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
-
-	ret = v4l2_subdev_call(sd, video, g_fmt, &f);
+	ret = v4l2_subdev_call(sd, video, g_imgbus_fmt, &imgf);
 	if (ret < 0)
 		return ret;
 
-	pix_tmp = *pix;
-	if (pxa_camera_check_frame(pix)) {
+	if (pxa_camera_check_frame(imgf.width, imgf.height)) {
 		/*
 		 * Camera cropping produced a frame beyond our capabilities.
 		 * FIXME: just extract a subframe, that we can process.
 		 */
-		v4l_bound_align_image(&pix->width, 48, 2048, 1,
-			&pix->height, 32, 2048, 0,
-			icd->current_fmt->fourcc == V4L2_PIX_FMT_YUV422P ?
-				4 : 0);
-		ret = v4l2_subdev_call(sd, video, s_fmt, &f);
+		v4l_bound_align_image(&imgf.width, 48, 2048, 1,
+			&imgf.height, 32, 2048, 0,
+			fourcc == V4L2_PIX_FMT_YUV422P ? 4 : 0);
+		ret = v4l2_subdev_call(sd, video, s_imgbus_fmt, &imgf);
 		if (ret < 0)
 			return ret;
 
-		if (pxa_camera_check_frame(pix)) {
+		if (pxa_camera_check_frame(imgf.width, imgf.height)) {
 			dev_warn(icd->dev.parent,
 				 "Inconsistent state. Use S_FMT to repair\n");
 			return -EINVAL;
@@ -1414,10 +1407,10 @@ static int pxa_camera_set_crop(struct soc_camera_device *icd,
 		recalculate_fifo_timeout(pcdev, sense.pixel_clock);
 	}
 
-	icd->user_width = pix->width;
-	icd->user_height = pix->height;
+	icd->user_width		= imgf.width;
+	icd->user_height	= imgf.height;
 
-	pxa_camera_setup_cicr(icd, cam->flags, icd->current_fmt->fourcc);
+	pxa_camera_setup_cicr(icd, cam->flags, fourcc);
 
 	return ret;
 }
@@ -1429,14 +1422,13 @@ static int pxa_camera_set_fmt(struct soc_camera_device *icd,
 	struct pxa_camera_dev *pcdev = ici->priv;
 	struct device *dev = icd->dev.parent;
 	struct v4l2_subdev *sd = soc_camera_to_subdev(icd);
-	const struct soc_camera_data_format *cam_fmt = NULL;
 	const struct soc_camera_format_xlate *xlate = NULL;
 	struct soc_camera_sense sense = {
 		.master_clock = pcdev->mclk,
 		.pixel_clock_max = pcdev->ciclk / 4,
 	};
 	struct v4l2_pix_format *pix = &f->fmt.pix;
-	struct v4l2_format cam_f = *f;
+	struct v4l2_imgbus_framefmt imgf;
 	int ret;
 
 	xlate = soc_camera_xlate_by_fourcc(icd, pix->pixelformat);
@@ -1445,26 +1437,27 @@ static int pxa_camera_set_fmt(struct soc_camera_device *icd,
 		return -EINVAL;
 	}
 
-	cam_fmt = xlate->cam_fmt;
-
 	/* If PCLK is used to latch data from the sensor, check sense */
 	if (pcdev->platform_flags & PXA_CAMERA_PCLK_EN)
+		/* The caller holds a mutex. */
 		icd->sense = &sense;
 
-	cam_f.fmt.pix.pixelformat = cam_fmt->fourcc;
-	ret = v4l2_subdev_call(sd, video, s_fmt, &cam_f);
-	cam_f.fmt.pix.pixelformat = pix->pixelformat;
-	*pix = cam_f.fmt.pix;
+	imgf.width	= pix->width;
+	imgf.height	= pix->height;
+	imgf.code	= xlate->code;
+	imgf.field	= pix->field;
+
+	ret = v4l2_subdev_call(sd, video, s_imgbus_fmt, &imgf);
 
 	icd->sense = NULL;
 
 	if (ret < 0) {
 		dev_warn(dev, "Failed to configure for format %x\n",
 			 pix->pixelformat);
-	} else if (pxa_camera_check_frame(pix)) {
+	} else if (pxa_camera_check_frame(imgf.width, imgf.height)) {
 		dev_warn(dev,
 			 "Camera driver produced an unsupported frame %dx%d\n",
-			 pix->width, pix->height);
+			 imgf.width, imgf.height);
 		ret = -EINVAL;
 	} else if (sense.flags & SOCAM_SENSE_PCLK_CHANGED) {
 		if (sense.pixel_clock > sense.pixel_clock_max) {
@@ -1476,10 +1469,13 @@ static int pxa_camera_set_fmt(struct soc_camera_device *icd,
 		recalculate_fifo_timeout(pcdev, sense.pixel_clock);
 	}
 
-	if (!ret) {
-		icd->buswidth = xlate->buswidth;
-		icd->current_fmt = xlate->host_fmt;
-	}
+	if (ret < 0)
+		return ret;
+
+	pix->width		= imgf.width;
+	pix->height		= imgf.height;
+	pix->field		= imgf.field;
+	icd->current_fmt	= xlate;
 
 	return ret;
 }
@@ -1487,17 +1483,16 @@ static int pxa_camera_set_fmt(struct soc_camera_device *icd,
 static int pxa_camera_try_fmt(struct soc_camera_device *icd,
 			      struct v4l2_format *f)
 {
-	struct soc_camera_host *ici = to_soc_camera_host(icd->dev.parent);
 	struct v4l2_subdev *sd = soc_camera_to_subdev(icd);
 	const struct soc_camera_format_xlate *xlate;
 	struct v4l2_pix_format *pix = &f->fmt.pix;
+	struct v4l2_imgbus_framefmt imgf;
 	__u32 pixfmt = pix->pixelformat;
-	enum v4l2_field field;
 	int ret;
 
 	xlate = soc_camera_xlate_by_fourcc(icd, pixfmt);
 	if (!xlate) {
-		dev_warn(ici->v4l2_dev.dev, "Format %x not found\n", pixfmt);
+		dev_warn(icd->dev.parent, "Format %x not found\n", pixfmt);
 		return -EINVAL;
 	}
 
@@ -1511,22 +1506,34 @@ static int pxa_camera_try_fmt(struct soc_camera_device *icd,
 			      &pix->height, 32, 2048, 0,
 			      pixfmt == V4L2_PIX_FMT_YUV422P ? 4 : 0);
 
-	pix->bytesperline = pix->width *
-		DIV_ROUND_UP(xlate->host_fmt->depth, 8);
+	pix->bytesperline = v4l2_imgbus_bytes_per_line(pix->width,
+						       xlate->host_fmt);
+	if (pix->bytesperline < 0)
+		return pix->bytesperline;
 	pix->sizeimage = pix->height * pix->bytesperline;
 
-	/* camera has to see its format, but the user the original one */
-	pix->pixelformat = xlate->cam_fmt->fourcc;
 	/* limit to sensor capabilities */
-	ret = v4l2_subdev_call(sd, video, try_fmt, f);
-	pix->pixelformat = pixfmt;
+	imgf.width	= pix->width;
+	imgf.height	= pix->height;
+	imgf.field	= pix->field;
+	imgf.code	= xlate->code;
 
-	field = pix->field;
+	ret = v4l2_subdev_call(sd, video, try_imgbus_fmt, &imgf);
+	if (ret < 0)
+		return ret;
 
-	if (field == V4L2_FIELD_ANY) {
-		pix->field = V4L2_FIELD_NONE;
-	} else if (field != V4L2_FIELD_NONE) {
-		dev_err(icd->dev.parent, "Field type %d unsupported.\n", field);
+	pix->width	= imgf.width;
+	pix->height	= imgf.height;
+
+	switch (imgf.field) {
+	case V4L2_FIELD_ANY:
+	case V4L2_FIELD_NONE:
+		pix->field	= V4L2_FIELD_NONE;
+		break;
+	default:
+		/* TODO: support interlaced at least in pass-through mode */
+		dev_err(icd->dev.parent, "Field type %d unsupported.\n",
+			imgf.field);
 		return -EINVAL;
 	}
 
diff --git a/drivers/media/video/rj54n1cb0c.c b/drivers/media/video/rj54n1cb0c.c
index 373f2a3..9d9512f 100644
--- a/drivers/media/video/rj54n1cb0c.c
+++ b/drivers/media/video/rj54n1cb0c.c
@@ -16,6 +16,7 @@
 #include <media/v4l2-subdev.h>
 #include <media/v4l2-chip-ident.h>
 #include <media/soc_camera.h>
+#include <media/rj54n1cb0c.h>
 
 #define RJ54N1_DEV_CODE			0x0400
 #define RJ54N1_DEV_CODE2		0x0401
@@ -38,6 +39,7 @@
 #define RJ54N1_H_OBEN_OFS		0x0413
 #define RJ54N1_V_OBEN_OFS		0x0414
 #define RJ54N1_RESIZE_CONTROL		0x0415
+#define RJ54N1_STILL_CONTROL		0x0417
 #define RJ54N1_INC_USE_SEL_H		0x0425
 #define RJ54N1_INC_USE_SEL_L		0x0426
 #define RJ54N1_MIRROR_STILL_MODE	0x0427
@@ -49,10 +51,21 @@
 #define RJ54N1_RA_SEL_UL		0x0530
 #define RJ54N1_BYTE_SWAP		0x0531
 #define RJ54N1_OUT_SIGPO		0x053b
+#define RJ54N1_WB_SEL_WEIGHT_I		0x054e
+#define RJ54N1_BIT8_WB			0x0569
+#define RJ54N1_HCAPS_WB			0x056a
+#define RJ54N1_VCAPS_WB			0x056b
+#define RJ54N1_HCAPE_WB			0x056c
+#define RJ54N1_VCAPE_WB			0x056d
+#define RJ54N1_EXPOSURE_CONTROL		0x058c
 #define RJ54N1_FRAME_LENGTH_S_H		0x0595
 #define RJ54N1_FRAME_LENGTH_S_L		0x0596
 #define RJ54N1_FRAME_LENGTH_P_H		0x0597
 #define RJ54N1_FRAME_LENGTH_P_L		0x0598
+#define RJ54N1_PEAK_H			0x05b7
+#define RJ54N1_PEAK_50			0x05b8
+#define RJ54N1_PEAK_60			0x05b9
+#define RJ54N1_PEAK_DIFF		0x05ba
 #define RJ54N1_IOC			0x05ef
 #define RJ54N1_TG_BYPASS		0x0700
 #define RJ54N1_PLL_L			0x0701
@@ -68,6 +81,7 @@
 #define RJ54N1_OCLK_SEL_EN		0x0713
 #define RJ54N1_CLK_RST			0x0717
 #define RJ54N1_RESET_STANDBY		0x0718
+#define RJ54N1_FWFLG			0x07fe
 
 #define E_EXCLK				(1 << 7)
 #define SOFT_STDBY			(1 << 4)
@@ -78,29 +92,34 @@
 #define RESIZE_HOLD_SEL			(1 << 2)
 #define RESIZE_GO			(1 << 1)
 
+/*
+ * When cropping, the camera automatically centers the cropped region, there
+ * doesn't seem to be a way to specify an explicit location of the rectangle.
+ */
 #define RJ54N1_COLUMN_SKIP		0
 #define RJ54N1_ROW_SKIP			0
 #define RJ54N1_MAX_WIDTH		1600
 #define RJ54N1_MAX_HEIGHT		1200
 
+#define PLL_L				2
+#define PLL_N				0x31
+
 /* I2C addresses: 0x50, 0x51, 0x60, 0x61 */
 
-static const struct soc_camera_data_format rj54n1_colour_formats[] = {
-	{
-		.name		= "YUYV",
-		.depth		= 16,
-		.fourcc		= V4L2_PIX_FMT_YUYV,
-		.colorspace	= V4L2_COLORSPACE_JPEG,
-	}, {
-		.name		= "RGB565",
-		.depth		= 16,
-		.fourcc		= V4L2_PIX_FMT_RGB565,
-		.colorspace	= V4L2_COLORSPACE_SRGB,
-	}
+static const enum v4l2_imgbus_pixelcode rj54n1_colour_codes[] = {
+	V4L2_IMGBUS_FMT_YUYV,
+	V4L2_IMGBUS_FMT_YVYU,
+	V4L2_IMGBUS_FMT_RGB565,
+	V4L2_IMGBUS_FMT_RGB565X,
+	V4L2_IMGBUS_FMT_SBGGR10_2X8_PADHI_BE,
+	V4L2_IMGBUS_FMT_SBGGR10_2X8_PADLO_BE,
+	V4L2_IMGBUS_FMT_SBGGR10_2X8_PADHI_LE,
+	V4L2_IMGBUS_FMT_SBGGR10_2X8_PADLO_LE,
+	V4L2_IMGBUS_FMT_SBGGR10,
 };
 
 struct rj54n1_clock_div {
-	u8 ratio_tg;
+	u8 ratio_tg;	/* can be 0 or an odd number */
 	u8 ratio_t;
 	u8 ratio_r;
 	u8 ratio_op;
@@ -109,12 +128,14 @@ struct rj54n1_clock_div {
 
 struct rj54n1 {
 	struct v4l2_subdev subdev;
+	struct rj54n1_clock_div clk_div;
+	enum v4l2_imgbus_pixelcode code;
 	struct v4l2_rect rect;	/* Sensor window */
+	unsigned int tgclk_mhz;
+	bool auto_wb;
 	unsigned short width;	/* Output window */
 	unsigned short height;
 	unsigned short resize;	/* Sensor * 1024 / resize = Output */
-	struct rj54n1_clock_div clk_div;
-	u32 fourcc;
 	unsigned short scale;
 	u8 bank;
 };
@@ -171,7 +192,7 @@ const static struct rj54n1_reg_val bank_7[] = {
 	{0x714, 0xff},
 	{0x715, 0xff},
 	{0x716, 0x1f},
-	{0x7FE, 0x02},
+	{0x7FE, 2},
 };
 
 const static struct rj54n1_reg_val bank_8[] = {
@@ -359,7 +380,7 @@ const static struct rj54n1_reg_val bank_8[] = {
 	{0x8BB, 0x00},
 	{0x8BC, 0xFF},
 	{0x8BD, 0x00},
-	{0x8FE, 0x02},
+	{0x8FE, 2},
 };
 
 const static struct rj54n1_reg_val bank_10[] = {
@@ -440,12 +461,24 @@ static int reg_write_multiple(struct i2c_client *client,
 	return 0;
 }
 
-static int rj54n1_s_stream(struct v4l2_subdev *sd, int enable)
+static int rj54n1_enum_fmt(struct v4l2_subdev *sd, int index,
+			   enum v4l2_imgbus_pixelcode *code)
 {
-	/* TODO: start / stop streaming */
+	if ((unsigned int)index >= ARRAY_SIZE(rj54n1_colour_codes))
+		return -EINVAL;
+
+	*code = rj54n1_colour_codes[index];
 	return 0;
 }
 
+static int rj54n1_s_stream(struct v4l2_subdev *sd, int enable)
+{
+	struct i2c_client *client = sd->priv;
+
+	/* Switch between preview and still shot modes */
+	return reg_set(client, RJ54N1_STILL_CONTROL, (!enable) << 7, 0x80);
+}
+
 static int rj54n1_set_bus_param(struct soc_camera_device *icd,
 				unsigned long flags)
 {
@@ -502,6 +535,44 @@ static int rj54n1_commit(struct i2c_client *client)
 	return ret;
 }
 
+static int rj54n1_sensor_scale(struct v4l2_subdev *sd, u32 *in_w, u32 *in_h,
+			       u32 *out_w, u32 *out_h);
+
+static int rj54n1_s_crop(struct v4l2_subdev *sd, struct v4l2_crop *a)
+{
+	struct i2c_client *client = sd->priv;
+	struct rj54n1 *rj54n1 = to_rj54n1(client);
+	struct v4l2_rect *rect = &a->c;
+	unsigned int dummy, output_w, output_h,
+		input_w = rect->width, input_h = rect->height;
+	int ret;
+
+	/* arbitrary minimum width and height, edges unimportant */
+	soc_camera_limit_side(&dummy, &input_w,
+		     RJ54N1_COLUMN_SKIP, 8, RJ54N1_MAX_WIDTH);
+
+	soc_camera_limit_side(&dummy, &input_h,
+		     RJ54N1_ROW_SKIP, 8, RJ54N1_MAX_HEIGHT);
+
+	output_w = (input_w * 1024 + rj54n1->resize / 2) / rj54n1->resize;
+	output_h = (input_h * 1024 + rj54n1->resize / 2) / rj54n1->resize;
+
+	dev_dbg(&client->dev, "Scaling for %ux%u : %u = %ux%u\n",
+		input_w, input_h, rj54n1->resize, output_w, output_h);
+
+	ret = rj54n1_sensor_scale(sd, &input_w, &input_h, &output_w, &output_h);
+	if (ret < 0)
+		return ret;
+
+	rj54n1->width		= output_w;
+	rj54n1->height		= output_h;
+	rj54n1->resize		= ret;
+	rj54n1->rect.width	= input_w;
+	rj54n1->rect.height	= input_h;
+
+	return 0;
+}
+
 static int rj54n1_g_crop(struct v4l2_subdev *sd, struct v4l2_crop *a)
 {
 	struct i2c_client *client = sd->priv;
@@ -527,16 +598,16 @@ static int rj54n1_cropcap(struct v4l2_subdev *sd, struct v4l2_cropcap *a)
 	return 0;
 }
 
-static int rj54n1_g_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
+static int rj54n1_g_fmt(struct v4l2_subdev *sd,
+			struct v4l2_imgbus_framefmt *imgf)
 {
 	struct i2c_client *client = sd->priv;
 	struct rj54n1 *rj54n1 = to_rj54n1(client);
-	struct v4l2_pix_format *pix = &f->fmt.pix;
 
-	pix->pixelformat	= rj54n1->fourcc;
-	pix->field		= V4L2_FIELD_NONE;
-	pix->width		= rj54n1->width;
-	pix->height		= rj54n1->height;
+	imgf->code	= rj54n1->code;
+	imgf->field	= V4L2_FIELD_NONE;
+	imgf->width	= rj54n1->width;
+	imgf->height	= rj54n1->height;
 
 	return 0;
 }
@@ -550,11 +621,44 @@ static int rj54n1_sensor_scale(struct v4l2_subdev *sd, u32 *in_w, u32 *in_h,
 			       u32 *out_w, u32 *out_h)
 {
 	struct i2c_client *client = sd->priv;
+	struct rj54n1 *rj54n1 = to_rj54n1(client);
 	unsigned int skip, resize, input_w = *in_w, input_h = *in_h,
 		output_w = *out_w, output_h = *out_h;
-	u16 inc_sel;
+	u16 inc_sel, wb_bit8, wb_left, wb_right, wb_top, wb_bottom;
+	unsigned int peak, peak_50, peak_60;
 	int ret;
 
+	/*
+	 * Cropping is problematic when the input window is larger than 512x384
+	 * and the output window is larger than half of the input one. In that
+	 * case we have to either reduce the input window to at most 512x384 or
+	 * the output window to at most half of the input.
+	 */
+	if (output_w > max(512U, input_w / 2)) {
+		if (2 * output_w > RJ54N1_MAX_WIDTH) {
+			input_w = RJ54N1_MAX_WIDTH;
+			output_w = RJ54N1_MAX_WIDTH / 2;
+		} else {
+			input_w = output_w * 2;
+		}
+
+		dev_dbg(&client->dev, "Adjusted output width: in %u, out %u\n",
+			input_w, output_w);
+	}
+
+	if (output_h > max(384U, input_h / 2)) {
+		if (2 * output_h > RJ54N1_MAX_HEIGHT) {
+			input_h = RJ54N1_MAX_HEIGHT;
+			output_h = RJ54N1_MAX_HEIGHT / 2;
+		} else {
+			input_h = output_h * 2;
+		}
+
+		dev_dbg(&client->dev, "Adjusted output height: in %u, out %u\n",
+			input_h, output_h);
+	}
+
+	/* Idea: use the read mode for snapshots, handle separate geometries */
 	ret = rj54n1_set_rect(client, RJ54N1_X_OUTPUT_SIZE_S_L,
 			      RJ54N1_Y_OUTPUT_SIZE_S_L,
 			      RJ54N1_XY_OUTPUT_SIZE_S_H, output_w, output_h);
@@ -566,17 +670,27 @@ static int rj54n1_sensor_scale(struct v4l2_subdev *sd, u32 *in_w, u32 *in_h,
 	if (ret < 0)
 		return ret;
 
-	if (output_w > input_w || output_h > input_h) {
+	if (output_w > input_w && output_h > input_h) {
 		input_w = output_w;
 		input_h = output_h;
 
 		resize = 1024;
 	} else {
 		unsigned int resize_x, resize_y;
-		resize_x = input_w * 1024 / output_w;
-		resize_y = input_h * 1024 / output_h;
-
-		resize = min(resize_x, resize_y);
+		resize_x = (input_w * 1024 + output_w / 2) / output_w;
+		resize_y = (input_h * 1024 + output_h / 2) / output_h;
+
+		/* We want max(resize_x, resize_y), check if it still fits */
+		if (resize_x > resize_y &&
+		    (output_h * resize_x + 512) / 1024 > RJ54N1_MAX_HEIGHT)
+			resize = (RJ54N1_MAX_HEIGHT * 1024 + output_h / 2) /
+				output_h;
+		else if (resize_y > resize_x &&
+			 (output_w * resize_y + 512) / 1024 > RJ54N1_MAX_WIDTH)
+			resize = (RJ54N1_MAX_WIDTH * 1024 + output_w / 2) /
+				output_w;
+		else
+			resize = max(resize_x, resize_y);
 
 		/* Prohibited value ranges */
 		switch (resize) {
@@ -589,12 +703,9 @@ static int rj54n1_sensor_scale(struct v4l2_subdev *sd, u32 *in_w, u32 *in_h,
 		case 8160 ... 8191:
 			resize = 8159;
 			break;
-		case 16320 ... 16383:
+		case 16320 ... 16384:
 			resize = 16319;
 		}
-
-		input_w = output_w * resize / 1024;
-		input_h = output_h * resize / 1024;
 	}
 
 	/* Set scaling */
@@ -607,13 +718,22 @@ static int rj54n1_sensor_scale(struct v4l2_subdev *sd, u32 *in_w, u32 *in_h,
 
 	/*
 	 * Configure a skipping bitmask. The sensor will select a skipping value
-	 * among set bits automatically.
+	 * among set bits automatically. This is very unclear in the datasheet
+	 * too. I was told, in this register one enables all skipping values,
+	 * that are required for a specific resize, and the camera selects
+	 * automatically, which ones to use. But it is unclear how to identify,
+	 * which cropping values are needed. Secondly, why don't we just set all
+	 * bits and let the camera choose? Would it increase processing time and
+	 * reduce the framerate? Using 0xfffc for INC_USE_SEL doesn't seem to
+	 * improve the image quality or stability for larger frames (see comment
+	 * above), but I didn't check the framerate.
 	 */
 	skip = min(resize / 1024, (unsigned)15);
+
 	inc_sel = 1 << skip;
 
 	if (inc_sel <= 2)
-		inc_sel = 0xc;
+		inc_sel = 0xC;
 	else if (resize & 1023 && skip < 15)
 		inc_sel |= 1 << (skip + 1);
 
@@ -621,6 +741,43 @@ static int rj54n1_sensor_scale(struct v4l2_subdev *sd, u32 *in_w, u32 *in_h,
 	if (!ret)
 		ret = reg_write(client, RJ54N1_INC_USE_SEL_H, inc_sel >> 8);
 
+	if (!rj54n1->auto_wb) {
+		/* Auto white balance window */
+		wb_left	  = output_w / 16;
+		wb_right  = (3 * output_w / 4 - 3) / 4;
+		wb_top	  = output_h / 16;
+		wb_bottom = (3 * output_h / 4 - 3) / 4;
+		wb_bit8	  = ((wb_left >> 2) & 0x40) | ((wb_top >> 4) & 0x10) |
+			((wb_right >> 6) & 4) | ((wb_bottom >> 8) & 1);
+
+		if (!ret)
+			ret = reg_write(client, RJ54N1_BIT8_WB, wb_bit8);
+		if (!ret)
+			ret = reg_write(client, RJ54N1_HCAPS_WB, wb_left);
+		if (!ret)
+			ret = reg_write(client, RJ54N1_VCAPS_WB, wb_top);
+		if (!ret)
+			ret = reg_write(client, RJ54N1_HCAPE_WB, wb_right);
+		if (!ret)
+			ret = reg_write(client, RJ54N1_VCAPE_WB, wb_bottom);
+	}
+
+	/* Antiflicker */
+	peak = 12 * RJ54N1_MAX_WIDTH * (1 << 14) * resize / rj54n1->tgclk_mhz /
+		10000;
+	peak_50 = peak / 6;
+	peak_60 = peak / 5;
+
+	if (!ret)
+		ret = reg_write(client, RJ54N1_PEAK_H,
+				((peak_50 >> 4) & 0xf0) | (peak_60 >> 8));
+	if (!ret)
+		ret = reg_write(client, RJ54N1_PEAK_50, peak_50);
+	if (!ret)
+		ret = reg_write(client, RJ54N1_PEAK_60, peak_60);
+	if (!ret)
+		ret = reg_write(client, RJ54N1_PEAK_DIFF, peak / 150);
+
 	/* Start resizing */
 	if (!ret)
 		ret = reg_write(client, RJ54N1_RESIZE_CONTROL,
@@ -629,8 +786,6 @@ static int rj54n1_sensor_scale(struct v4l2_subdev *sd, u32 *in_w, u32 *in_h,
 	if (ret < 0)
 		return ret;
 
-	dev_dbg(&client->dev, "resize %u, skip %u\n", resize, skip);
-
 	/* Constant taken from manufacturer's example */
 	msleep(230);
 
@@ -638,190 +793,37 @@ static int rj54n1_sensor_scale(struct v4l2_subdev *sd, u32 *in_w, u32 *in_h,
 	if (ret < 0)
 		return ret;
 
-	*in_w = input_w;
-	*in_h = input_h;
+	*in_w = (output_w * resize + 512) / 1024;
+	*in_h = (output_h * resize + 512) / 1024;
 	*out_w = output_w;
 	*out_h = output_h;
 
-	return resize;
-}
-
-static int rj54n1_set_clock(struct i2c_client *client)
-{
-	struct rj54n1 *rj54n1 = to_rj54n1(client);
-	int ret;
-
-	/* Enable external clock */
-	ret = reg_write(client, RJ54N1_RESET_STANDBY, E_EXCLK | SOFT_STDBY);
-	/* Leave stand-by */
-	if (!ret)
-		ret = reg_write(client, RJ54N1_RESET_STANDBY, E_EXCLK);
-
-	if (!ret)
-		ret = reg_write(client, RJ54N1_PLL_L, 2);
-	if (!ret)
-		ret = reg_write(client, RJ54N1_PLL_N, 0x31);
-
-	/* TGCLK dividers */
-	if (!ret)
-		ret = reg_write(client, RJ54N1_RATIO_TG,
-				rj54n1->clk_div.ratio_tg);
-	if (!ret)
-		ret = reg_write(client, RJ54N1_RATIO_T,
-				rj54n1->clk_div.ratio_t);
-	if (!ret)
-		ret = reg_write(client, RJ54N1_RATIO_R,
-				rj54n1->clk_div.ratio_r);
-
-	/* Enable TGCLK & RAMP */
-	if (!ret)
-		ret = reg_write(client, RJ54N1_RAMP_TGCLK_EN, 3);
-
-	/* Disable clock output */
-	if (!ret)
-		ret = reg_write(client, RJ54N1_OCLK_DSP, 0);
-
-	/* Set divisors */
-	if (!ret)
-		ret = reg_write(client, RJ54N1_RATIO_OP,
-				rj54n1->clk_div.ratio_op);
-	if (!ret)
-		ret = reg_write(client, RJ54N1_RATIO_O,
-				rj54n1->clk_div.ratio_o);
-
-	/* Enable OCLK */
-	if (!ret)
-		ret = reg_write(client, RJ54N1_OCLK_SEL_EN, 1);
-
-	/* Use PLL for Timing Generator, write 2 to reserved bits */
-	if (!ret)
-		ret = reg_write(client, RJ54N1_TG_BYPASS, 2);
-
-	/* Take sensor out of reset */
-	if (!ret)
-		ret = reg_write(client, RJ54N1_RESET_STANDBY,
-				E_EXCLK | SEN_RSTX);
-	/* Enable PLL */
-	if (!ret)
-		ret = reg_write(client, RJ54N1_PLL_EN, 1);
-
-	/* Wait for PLL to stabilise */
-	msleep(10);
-
-	/* Enable clock to frequency divider */
-	if (!ret)
-		ret = reg_write(client, RJ54N1_CLK_RST, 1);
-
-	if (!ret)
-		ret = reg_read(client, RJ54N1_CLK_RST);
-	if (ret != 1) {
-		dev_err(&client->dev,
-			"Resetting RJ54N1CB0C clock failed: %d!\n", ret);
-		return -EIO;
-	}
-	/* Start the PLL */
-	ret = reg_set(client, RJ54N1_OCLK_DSP, 1, 1);
-
-	/* Enable OCLK */
-	if (!ret)
-		ret = reg_write(client, RJ54N1_OCLK_SEL_EN, 1);
-
-	return ret;
-}
-
-static int rj54n1_reg_init(struct i2c_client *client)
-{
-	int ret = rj54n1_set_clock(client);
-
-	if (!ret)
-		ret = reg_write_multiple(client, bank_7, ARRAY_SIZE(bank_7));
-	if (!ret)
-		ret = reg_write_multiple(client, bank_10, ARRAY_SIZE(bank_10));
-
-	/* Set binning divisors */
-	if (!ret)
-		ret = reg_write(client, RJ54N1_SCALE_1_2_LEV, 3 | (7 << 4));
-	if (!ret)
-		ret = reg_write(client, RJ54N1_SCALE_4_LEV, 0xf);
-
-	/* Switch to fixed resize mode */
-	if (!ret)
-		ret = reg_write(client, RJ54N1_RESIZE_CONTROL,
-				RESIZE_HOLD_SEL | 1);
-
-	/* Set gain */
-	if (!ret)
-		ret = reg_write(client, RJ54N1_Y_GAIN, 0x84);
-
-	/* Mirror the image back: default is upside down and left-to-right... */
-	if (!ret)
-		ret = reg_set(client, RJ54N1_MIRROR_STILL_MODE, 3, 3);
-
-	if (!ret)
-		ret = reg_write_multiple(client, bank_4, ARRAY_SIZE(bank_4));
-	if (!ret)
-		ret = reg_write_multiple(client, bank_5, ARRAY_SIZE(bank_5));
-	if (!ret)
-		ret = reg_write_multiple(client, bank_8, ARRAY_SIZE(bank_8));
-
-	if (!ret)
-		ret = reg_write(client, RJ54N1_RESET_STANDBY,
-				E_EXCLK | DSP_RSTX | SEN_RSTX);
-
-	/* Commit init */
-	if (!ret)
-		ret = rj54n1_commit(client);
-
-	/* Take DSP, TG, sensor out of reset */
-	if (!ret)
-		ret = reg_write(client, RJ54N1_RESET_STANDBY,
-				E_EXCLK | DSP_RSTX | TG_RSTX | SEN_RSTX);
-
-	if (!ret)
-		ret = reg_write(client, 0x7fe, 2);
-
-	/* Constant taken from manufacturer's example */
-	msleep(700);
+	dev_dbg(&client->dev, "Scaled for %ux%u : %u = %ux%u, skip %u\n",
+		*in_w, *in_h, resize, output_w, output_h, skip);
 
-	return ret;
+	return resize;
 }
 
-/* FIXME: streaming output only up to 800x600 is functional */
-static int rj54n1_try_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
-{
-	struct v4l2_pix_format *pix = &f->fmt.pix;
+static int rj54n1_reg_init(struct i2c_client *client);
+static int rj54n1_try_fmt(struct v4l2_subdev *sd,
+			  struct v4l2_imgbus_framefmt *imgf);
 
-	pix->field = V4L2_FIELD_NONE;
-
-	if (pix->width > 800)
-		pix->width = 800;
-	if (pix->height > 600)
-		pix->height = 600;
-
-	return 0;
-}
-
-static int rj54n1_s_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
+static int rj54n1_s_fmt(struct v4l2_subdev *sd,
+			struct v4l2_imgbus_framefmt *imgf)
 {
 	struct i2c_client *client = sd->priv;
 	struct rj54n1 *rj54n1 = to_rj54n1(client);
-	struct v4l2_pix_format *pix = &f->fmt.pix;
-	unsigned int output_w, output_h,
+	unsigned int output_w, output_h, max_w, max_h,
 		input_w = rj54n1->rect.width, input_h = rj54n1->rect.height;
 	int ret;
 
-	/*
-	 * The host driver can call us without .try_fmt(), so, we have to take
-	 * care ourseleves
-	 */
-	ret = rj54n1_try_fmt(sd, f);
+	rj54n1_try_fmt(sd, imgf);
 
 	/*
 	 * Verify if the sensor has just been powered on. TODO: replace this
 	 * with proper PM, when a suitable API is available.
 	 */
-	if (!ret)
-		ret = reg_read(client, RJ54N1_RESET_STANDBY);
+	ret = reg_read(client, RJ54N1_RESET_STANDBY);
 	if (ret < 0)
 		return ret;
 
@@ -831,50 +833,122 @@ static int rj54n1_s_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
 			return ret;
 	}
 
+	dev_dbg(&client->dev, "%s: code = %d, width = %u, height = %u\n",
+		__func__, imgf->code, imgf->width, imgf->height);
+
 	/* RA_SEL_UL is only relevant for raw modes, ignored otherwise. */
-	switch (pix->pixelformat) {
-	case V4L2_PIX_FMT_YUYV:
+	switch (imgf->code) {
+	case V4L2_IMGBUS_FMT_YUYV:
+		ret = reg_write(client, RJ54N1_OUT_SEL, 0);
+		if (!ret)
+			ret = reg_set(client, RJ54N1_BYTE_SWAP, 8, 8);
+		break;
+	case V4L2_IMGBUS_FMT_YVYU:
 		ret = reg_write(client, RJ54N1_OUT_SEL, 0);
 		if (!ret)
+			ret = reg_set(client, RJ54N1_BYTE_SWAP, 0, 8);
+		break;
+	case V4L2_IMGBUS_FMT_RGB565:
+		ret = reg_write(client, RJ54N1_OUT_SEL, 0x11);
+		if (!ret)
 			ret = reg_set(client, RJ54N1_BYTE_SWAP, 8, 8);
 		break;
-	case V4L2_PIX_FMT_RGB565:
+	case V4L2_IMGBUS_FMT_RGB565X:
 		ret = reg_write(client, RJ54N1_OUT_SEL, 0x11);
 		if (!ret)
+			ret = reg_set(client, RJ54N1_BYTE_SWAP, 0, 8);
+		break;
+	case V4L2_IMGBUS_FMT_SBGGR10_2X8_PADLO_LE:
+		ret = reg_write(client, RJ54N1_OUT_SEL, 4);
+		if (!ret)
 			ret = reg_set(client, RJ54N1_BYTE_SWAP, 8, 8);
+		if (!ret)
+			ret = reg_write(client, RJ54N1_RA_SEL_UL, 0);
+		break;
+	case V4L2_IMGBUS_FMT_SBGGR10_2X8_PADHI_LE:
+		ret = reg_write(client, RJ54N1_OUT_SEL, 4);
+		if (!ret)
+			ret = reg_set(client, RJ54N1_BYTE_SWAP, 8, 8);
+		if (!ret)
+			ret = reg_write(client, RJ54N1_RA_SEL_UL, 8);
+		break;
+	case V4L2_IMGBUS_FMT_SBGGR10_2X8_PADLO_BE:
+		ret = reg_write(client, RJ54N1_OUT_SEL, 4);
+		if (!ret)
+			ret = reg_set(client, RJ54N1_BYTE_SWAP, 0, 8);
+		if (!ret)
+			ret = reg_write(client, RJ54N1_RA_SEL_UL, 0);
+		break;
+	case V4L2_IMGBUS_FMT_SBGGR10_2X8_PADHI_BE:
+		ret = reg_write(client, RJ54N1_OUT_SEL, 4);
+		if (!ret)
+			ret = reg_set(client, RJ54N1_BYTE_SWAP, 0, 8);
+		if (!ret)
+			ret = reg_write(client, RJ54N1_RA_SEL_UL, 8);
+		break;
+	case V4L2_IMGBUS_FMT_SBGGR10:
+		ret = reg_write(client, RJ54N1_OUT_SEL, 5);
 		break;
 	default:
 		ret = -EINVAL;
 	}
 
+	/* Special case: a raw mode with 10 bits of data per clock tick */
+	if (!ret)
+		ret = reg_set(client, RJ54N1_OCLK_SEL_EN,
+			      (imgf->code == V4L2_IMGBUS_FMT_SBGGR10) << 1, 2);
+
 	if (ret < 0)
 		return ret;
 
-	/* Supported scales 1:1 - 1:16 */
-	if (pix->width < input_w / 16)
-		pix->width = input_w / 16;
-	if (pix->height < input_h / 16)
-		pix->height = input_h / 16;
+	/* Supported scales 1:1 >= scale > 1:16 */
+	max_w = imgf->width * (16 * 1024 - 1) / 1024;
+	if (input_w > max_w)
+		input_w = max_w;
+	max_h = imgf->height * (16 * 1024 - 1) / 1024;
+	if (input_h > max_h)
+		input_h = max_h;
 
-	output_w = pix->width;
-	output_h = pix->height;
+	output_w = imgf->width;
+	output_h = imgf->height;
 
 	ret = rj54n1_sensor_scale(sd, &input_w, &input_h, &output_w, &output_h);
 	if (ret < 0)
 		return ret;
 
-	rj54n1->fourcc		= pix->pixelformat;
+	rj54n1->code		= imgf->code;
 	rj54n1->resize		= ret;
 	rj54n1->rect.width	= input_w;
 	rj54n1->rect.height	= input_h;
 	rj54n1->width		= output_w;
 	rj54n1->height		= output_h;
 
-	pix->width		= output_w;
-	pix->height		= output_h;
-	pix->field		= V4L2_FIELD_NONE;
+	imgf->width		= output_w;
+	imgf->height		= output_h;
+	imgf->field		= V4L2_FIELD_NONE;
 
-	return ret;
+	return 0;
+}
+
+static int rj54n1_try_fmt(struct v4l2_subdev *sd,
+			  struct v4l2_imgbus_framefmt *imgf)
+{
+	struct i2c_client *client = sd->priv;
+	int align = imgf->code == V4L2_IMGBUS_FMT_SBGGR10 ||
+		imgf->code == V4L2_IMGBUS_FMT_SBGGR10_2X8_PADHI_BE ||
+		imgf->code == V4L2_IMGBUS_FMT_SBGGR10_2X8_PADLO_BE ||
+		imgf->code == V4L2_IMGBUS_FMT_SBGGR10_2X8_PADHI_LE ||
+		imgf->code == V4L2_IMGBUS_FMT_SBGGR10_2X8_PADLO_LE;
+
+	dev_dbg(&client->dev, "%s: code = %d, width = %u, height = %u\n",
+		__func__, imgf->code, imgf->width, imgf->height);
+
+	imgf->field = V4L2_FIELD_NONE;
+
+	v4l_bound_align_image(&imgf->width, 112, RJ54N1_MAX_WIDTH, align,
+			      &imgf->height, 84, RJ54N1_MAX_HEIGHT, align, 0);
+
+	return 0;
 }
 
 static int rj54n1_g_chip_ident(struct v4l2_subdev *sd,
@@ -963,6 +1037,14 @@ static const struct v4l2_queryctrl rj54n1_controls[] = {
 		.step		= 1,
 		.default_value	= 66,
 		.flags		= V4L2_CTRL_FLAG_SLIDER,
+	}, {
+		.id		= V4L2_CID_AUTO_WHITE_BALANCE,
+		.type		= V4L2_CTRL_TYPE_BOOLEAN,
+		.name		= "Auto white balance",
+		.minimum	= 0,
+		.maximum	= 1,
+		.step		= 1,
+		.default_value	= 1,
 	},
 };
 
@@ -976,6 +1058,7 @@ static struct soc_camera_ops rj54n1_ops = {
 static int rj54n1_g_ctrl(struct v4l2_subdev *sd, struct v4l2_control *ctrl)
 {
 	struct i2c_client *client = sd->priv;
+	struct rj54n1 *rj54n1 = to_rj54n1(client);
 	int data;
 
 	switch (ctrl->id) {
@@ -998,6 +1081,9 @@ static int rj54n1_g_ctrl(struct v4l2_subdev *sd, struct v4l2_control *ctrl)
 
 		ctrl->value = data / 2;
 		break;
+	case V4L2_CID_AUTO_WHITE_BALANCE:
+		ctrl->value = rj54n1->auto_wb;
+		break;
 	}
 
 	return 0;
@@ -1007,6 +1093,7 @@ static int rj54n1_s_ctrl(struct v4l2_subdev *sd, struct v4l2_control *ctrl)
 {
 	int data;
 	struct i2c_client *client = sd->priv;
+	struct rj54n1 *rj54n1 = to_rj54n1(client);
 	const struct v4l2_queryctrl *qctrl;
 
 	qctrl = soc_camera_find_qctrl(&rj54n1_ops, ctrl->id);
@@ -1037,6 +1124,13 @@ static int rj54n1_s_ctrl(struct v4l2_subdev *sd, struct v4l2_control *ctrl)
 		else if (reg_write(client, RJ54N1_Y_GAIN, ctrl->value * 2) < 0)
 			return -EIO;
 		break;
+	case V4L2_CID_AUTO_WHITE_BALANCE:
+		/* Auto WB area - whole image */
+		if (reg_set(client, RJ54N1_WB_SEL_WEIGHT_I, ctrl->value << 7,
+			    0x80) < 0)
+			return -EIO;
+		rj54n1->auto_wb = ctrl->value;
+		break;
 	}
 
 	return 0;
@@ -1053,26 +1147,178 @@ static struct v4l2_subdev_core_ops rj54n1_subdev_core_ops = {
 };
 
 static struct v4l2_subdev_video_ops rj54n1_subdev_video_ops = {
-	.s_stream	= rj54n1_s_stream,
-	.s_fmt		= rj54n1_s_fmt,
-	.g_fmt		= rj54n1_g_fmt,
-	.try_fmt	= rj54n1_try_fmt,
-	.g_crop		= rj54n1_g_crop,
-	.cropcap	= rj54n1_cropcap,
+	.s_stream		= rj54n1_s_stream,
+	.s_imgbus_fmt		= rj54n1_s_fmt,
+	.g_imgbus_fmt		= rj54n1_g_fmt,
+	.try_imgbus_fmt		= rj54n1_try_fmt,
+	.s_crop			= rj54n1_s_crop,
+	.g_crop			= rj54n1_g_crop,
+	.cropcap		= rj54n1_cropcap,
+	.enum_imgbus_fmt	= rj54n1_enum_fmt,
+};
+
+static struct v4l2_subdev_sensor_ops rj54n1_subdev_sensor_ops = {
 };
 
 static struct v4l2_subdev_ops rj54n1_subdev_ops = {
 	.core	= &rj54n1_subdev_core_ops,
 	.video	= &rj54n1_subdev_video_ops,
+	.sensor	= &rj54n1_subdev_sensor_ops,
 };
 
-static int rj54n1_pin_config(struct i2c_client *client)
+static int rj54n1_set_clock(struct i2c_client *client)
+{
+	struct rj54n1 *rj54n1 = to_rj54n1(client);
+	int ret;
+
+	/* Enable external clock */
+	ret = reg_write(client, RJ54N1_RESET_STANDBY, E_EXCLK | SOFT_STDBY);
+	/* Leave stand-by. Note: use this when implementing suspend / resume */
+	if (!ret)
+		ret = reg_write(client, RJ54N1_RESET_STANDBY, E_EXCLK);
+
+	if (!ret)
+		ret = reg_write(client, RJ54N1_PLL_L, PLL_L);
+	if (!ret)
+		ret = reg_write(client, RJ54N1_PLL_N, PLL_N);
+
+	/* TGCLK dividers */
+	if (!ret)
+		ret = reg_write(client, RJ54N1_RATIO_TG,
+				rj54n1->clk_div.ratio_tg);
+	if (!ret)
+		ret = reg_write(client, RJ54N1_RATIO_T,
+				rj54n1->clk_div.ratio_t);
+	if (!ret)
+		ret = reg_write(client, RJ54N1_RATIO_R,
+				rj54n1->clk_div.ratio_r);
+
+	/* Enable TGCLK & RAMP */
+	if (!ret)
+		ret = reg_write(client, RJ54N1_RAMP_TGCLK_EN, 3);
+
+	/* Disable clock output */
+	if (!ret)
+		ret = reg_write(client, RJ54N1_OCLK_DSP, 0);
+
+	/* Set divisors */
+	if (!ret)
+		ret = reg_write(client, RJ54N1_RATIO_OP,
+				rj54n1->clk_div.ratio_op);
+	if (!ret)
+		ret = reg_write(client, RJ54N1_RATIO_O,
+				rj54n1->clk_div.ratio_o);
+
+	/* Enable OCLK */
+	if (!ret)
+		ret = reg_write(client, RJ54N1_OCLK_SEL_EN, 1);
+
+	/* Use PLL for Timing Generator, write 2 to reserved bits */
+	if (!ret)
+		ret = reg_write(client, RJ54N1_TG_BYPASS, 2);
+
+	/* Take sensor out of reset */
+	if (!ret)
+		ret = reg_write(client, RJ54N1_RESET_STANDBY,
+				E_EXCLK | SEN_RSTX);
+	/* Enable PLL */
+	if (!ret)
+		ret = reg_write(client, RJ54N1_PLL_EN, 1);
+
+	/* Wait for PLL to stabilise */
+	msleep(10);
+
+	/* Enable clock to frequency divider */
+	if (!ret)
+		ret = reg_write(client, RJ54N1_CLK_RST, 1);
+
+	if (!ret)
+		ret = reg_read(client, RJ54N1_CLK_RST);
+	if (ret != 1) {
+		dev_err(&client->dev,
+			"Resetting RJ54N1CB0C clock failed: %d!\n", ret);
+		return -EIO;
+	}
+
+	/* Start the PLL */
+	ret = reg_set(client, RJ54N1_OCLK_DSP, 1, 1);
+
+	/* Enable OCLK */
+	if (!ret)
+		ret = reg_write(client, RJ54N1_OCLK_SEL_EN, 1);
+
+	return ret;
+}
+
+static int rj54n1_reg_init(struct i2c_client *client)
 {
+	struct rj54n1 *rj54n1 = to_rj54n1(client);
+	int ret = rj54n1_set_clock(client);
+
+	if (!ret)
+		ret = reg_write_multiple(client, bank_7, ARRAY_SIZE(bank_7));
+	if (!ret)
+		ret = reg_write_multiple(client, bank_10, ARRAY_SIZE(bank_10));
+
+	/* Set binning divisors */
+	if (!ret)
+		ret = reg_write(client, RJ54N1_SCALE_1_2_LEV, 3 | (7 << 4));
+	if (!ret)
+		ret = reg_write(client, RJ54N1_SCALE_4_LEV, 0xf);
+
+	/* Switch to fixed resize mode */
+	if (!ret)
+		ret = reg_write(client, RJ54N1_RESIZE_CONTROL,
+				RESIZE_HOLD_SEL | 1);
+
+	/* Set gain */
+	if (!ret)
+		ret = reg_write(client, RJ54N1_Y_GAIN, 0x84);
+
 	/*
-	 * Experimentally found out IOCTRL wired to 0. TODO: add to platform
-	 * data: 0 or 1 << 7.
+	 * Mirror the image back: default is upside down and left-to-right...
+	 * Set manual preview / still shot switching
 	 */
-	return reg_write(client, RJ54N1_IOC, 0);
+	if (!ret)
+		ret = reg_write(client, RJ54N1_MIRROR_STILL_MODE, 0x27);
+
+	if (!ret)
+		ret = reg_write_multiple(client, bank_4, ARRAY_SIZE(bank_4));
+
+	/* Auto exposure area */
+	if (!ret)
+		ret = reg_write(client, RJ54N1_EXPOSURE_CONTROL, 0x80);
+	/* Check current auto WB config */
+	if (!ret)
+		ret = reg_read(client, RJ54N1_WB_SEL_WEIGHT_I);
+	if (ret >= 0) {
+		rj54n1->auto_wb = ret & 0x80;
+		ret = reg_write_multiple(client, bank_5, ARRAY_SIZE(bank_5));
+	}
+	if (!ret)
+		ret = reg_write_multiple(client, bank_8, ARRAY_SIZE(bank_8));
+
+	if (!ret)
+		ret = reg_write(client, RJ54N1_RESET_STANDBY,
+				E_EXCLK | DSP_RSTX | SEN_RSTX);
+
+	/* Commit init */
+	if (!ret)
+		ret = rj54n1_commit(client);
+
+	/* Take DSP, TG, sensor out of reset */
+	if (!ret)
+		ret = reg_write(client, RJ54N1_RESET_STANDBY,
+				E_EXCLK | DSP_RSTX | TG_RSTX | SEN_RSTX);
+
+	/* Start register update? Same register as 0x?FE in many bank_* sets */
+	if (!ret)
+		ret = reg_write(client, RJ54N1_FWFLG, 2);
+
+	/* Constant taken from manufacturer's example */
+	msleep(700);
+
+	return ret;
 }
 
 /*
@@ -1080,7 +1326,8 @@ static int rj54n1_pin_config(struct i2c_client *client)
  * this wasn't our capture interface, so, we wait for the right one
  */
 static int rj54n1_video_probe(struct soc_camera_device *icd,
-			      struct i2c_client *client)
+			      struct i2c_client *client,
+			      struct rj54n1_pdata *priv)
 {
 	int data1, data2;
 	int ret;
@@ -1101,7 +1348,8 @@ static int rj54n1_video_probe(struct soc_camera_device *icd,
 		goto ei2c;
 	}
 
-	ret = rj54n1_pin_config(client);
+	/* Configure IOCTRL polarity from the platform data: 0 or 1 << 7. */
+	ret = reg_write(client, RJ54N1_IOC, priv->ioctl_high << 7);
 	if (ret < 0)
 		goto ei2c;
 
@@ -1119,6 +1367,7 @@ static int rj54n1_probe(struct i2c_client *client,
 	struct soc_camera_device *icd = client->dev.platform_data;
 	struct i2c_adapter *adapter = to_i2c_adapter(client->dev.parent);
 	struct soc_camera_link *icl;
+	struct rj54n1_pdata *rj54n1_priv;
 	int ret;
 
 	if (!icd) {
@@ -1127,11 +1376,13 @@ static int rj54n1_probe(struct i2c_client *client,
 	}
 
 	icl = to_soc_camera_link(icd);
-	if (!icl) {
+	if (!icl || !icl->priv) {
 		dev_err(&client->dev, "RJ54N1CB0C: missing platform data!\n");
 		return -EINVAL;
 	}
 
+	rj54n1_priv = icl->priv;
+
 	if (!i2c_check_functionality(adapter, I2C_FUNC_SMBUS_BYTE_DATA)) {
 		dev_warn(&adapter->dev,
 			 "I2C-Adapter doesn't support I2C_FUNC_SMBUS_BYTE\n");
@@ -1153,10 +1404,12 @@ static int rj54n1_probe(struct i2c_client *client,
 	rj54n1->rect.height	= RJ54N1_MAX_HEIGHT;
 	rj54n1->width		= RJ54N1_MAX_WIDTH;
 	rj54n1->height		= RJ54N1_MAX_HEIGHT;
-	rj54n1->fourcc		= V4L2_PIX_FMT_YUYV;
+	rj54n1->code		= rj54n1_colour_codes[0];
 	rj54n1->resize		= 1024;
+	rj54n1->tgclk_mhz	= (rj54n1_priv->mclk_freq / PLL_L * PLL_N) /
+		(clk_div.ratio_tg + 1) / (clk_div.ratio_t + 1);
 
-	ret = rj54n1_video_probe(icd, client);
+	ret = rj54n1_video_probe(icd, client, rj54n1_priv);
 	if (ret < 0) {
 		icd->ops = NULL;
 		i2c_set_clientdata(client, NULL);
@@ -1164,9 +1417,6 @@ static int rj54n1_probe(struct i2c_client *client,
 		return ret;
 	}
 
-	icd->formats		= rj54n1_colour_formats;
-	icd->num_formats	= ARRAY_SIZE(rj54n1_colour_formats);
-
 	return ret;
 }
 
diff --git a/drivers/media/video/sh_mobile_ceu_camera.c b/drivers/media/video/sh_mobile_ceu_camera.c
index 0b7e32b..746aed0 100644
--- a/drivers/media/video/sh_mobile_ceu_camera.c
+++ b/drivers/media/video/sh_mobile_ceu_camera.c
@@ -37,6 +37,7 @@
 #include <media/soc_camera.h>
 #include <media/sh_mobile_ceu.h>
 #include <media/videobuf-dma-contig.h>
+#include <media/v4l2-imagebus.h>
 
 /* register offsets for sh7722 / sh7723 */
 
@@ -84,7 +85,7 @@
 /* per video frame buffer */
 struct sh_mobile_ceu_buffer {
 	struct videobuf_buffer vb; /* v4l buffer must be first */
-	const struct soc_camera_data_format *fmt;
+	enum v4l2_imgbus_pixelcode code;
 };
 
 struct sh_mobile_ceu_dev {
@@ -113,8 +114,8 @@ struct sh_mobile_ceu_cam {
 	struct v4l2_rect ceu_rect;
 	unsigned int cam_width;
 	unsigned int cam_height;
-	const struct soc_camera_data_format *extra_fmt;
-	const struct soc_camera_data_format *camera_fmt;
+	const struct v4l2_imgbus_pixelfmt *extra_fmt;
+	enum v4l2_imgbus_pixelcode code;
 };
 
 static unsigned long make_bus_param(struct sh_mobile_ceu_dev *pcdev)
@@ -195,10 +196,13 @@ static int sh_mobile_ceu_videobuf_setup(struct videobuf_queue *vq,
 	struct soc_camera_device *icd = vq->priv_data;
 	struct soc_camera_host *ici = to_soc_camera_host(icd->dev.parent);
 	struct sh_mobile_ceu_dev *pcdev = ici->priv;
-	int bytes_per_pixel = (icd->current_fmt->depth + 7) >> 3;
+	int bytes_per_line = v4l2_imgbus_bytes_per_line(icd->user_width,
+						icd->current_fmt->host_fmt);
 
-	*size = PAGE_ALIGN(icd->user_width * icd->user_height *
-			   bytes_per_pixel);
+	if (bytes_per_line < 0)
+		return bytes_per_line;
+
+	*size = PAGE_ALIGN(bytes_per_line * icd->user_height);
 
 	if (0 == *count)
 		*count = 2;
@@ -282,7 +286,7 @@ static int sh_mobile_ceu_capture(struct sh_mobile_ceu_dev *pcdev)
 		ceu_write(pcdev, CDBYR, phys_addr_bottom);
 	}
 
-	switch (icd->current_fmt->fourcc) {
+	switch (icd->current_fmt->host_fmt->fourcc) {
 	case V4L2_PIX_FMT_NV12:
 	case V4L2_PIX_FMT_NV21:
 	case V4L2_PIX_FMT_NV16:
@@ -309,8 +313,13 @@ static int sh_mobile_ceu_videobuf_prepare(struct videobuf_queue *vq,
 {
 	struct soc_camera_device *icd = vq->priv_data;
 	struct sh_mobile_ceu_buffer *buf;
+	int bytes_per_line = v4l2_imgbus_bytes_per_line(icd->user_width,
+						icd->current_fmt->host_fmt);
 	int ret;
 
+	if (bytes_per_line < 0)
+		return bytes_per_line;
+
 	buf = container_of(vb, struct sh_mobile_ceu_buffer, vb);
 
 	dev_dbg(icd->dev.parent, "%s (vb=0x%p) 0x%08lx %zd\n", __func__,
@@ -329,18 +338,18 @@ static int sh_mobile_ceu_videobuf_prepare(struct videobuf_queue *vq,
 
 	BUG_ON(NULL == icd->current_fmt);
 
-	if (buf->fmt	!= icd->current_fmt ||
+	if (buf->code	!= icd->current_fmt->code ||
 	    vb->width	!= icd->user_width ||
 	    vb->height	!= icd->user_height ||
 	    vb->field	!= field) {
-		buf->fmt	= icd->current_fmt;
+		buf->code	= icd->current_fmt->code;
 		vb->width	= icd->user_width;
 		vb->height	= icd->user_height;
 		vb->field	= field;
 		vb->state	= VIDEOBUF_NEEDS_INIT;
 	}
 
-	vb->size = vb->width * vb->height * ((buf->fmt->depth + 7) >> 3);
+	vb->size = vb->height * bytes_per_line;
 	if (0 != vb->baddr && vb->bsize < vb->size) {
 		ret = -EINVAL;
 		goto out;
@@ -564,7 +573,8 @@ static void sh_mobile_ceu_set_rect(struct soc_camera_device *icd,
 		}
 		width = cdwdr_width = out_width;
 	} else {
-		unsigned int w_factor = (icd->current_fmt->depth + 7) >> 3;
+		unsigned int w_factor = (7 +
+			icd->current_fmt->host_fmt->bits_per_sample) >> 3;
 
 		width = out_width * w_factor / 2;
 
@@ -671,24 +681,24 @@ static int sh_mobile_ceu_set_bus_param(struct soc_camera_device *icd,
 	value = 0x00000010; /* data fetch by default */
 	yuv_lineskip = 0;
 
-	switch (icd->current_fmt->fourcc) {
+	switch (icd->current_fmt->host_fmt->fourcc) {
 	case V4L2_PIX_FMT_NV12:
 	case V4L2_PIX_FMT_NV21:
 		yuv_lineskip = 1; /* skip for NV12/21, no skip for NV16/61 */
 		/* fall-through */
 	case V4L2_PIX_FMT_NV16:
 	case V4L2_PIX_FMT_NV61:
-		switch (cam->camera_fmt->fourcc) {
-		case V4L2_PIX_FMT_UYVY:
+		switch (cam->code) {
+		case V4L2_IMGBUS_FMT_UYVY:
 			value = 0x00000000; /* Cb0, Y0, Cr0, Y1 */
 			break;
-		case V4L2_PIX_FMT_VYUY:
+		case V4L2_IMGBUS_FMT_VYUY:
 			value = 0x00000100; /* Cr0, Y0, Cb0, Y1 */
 			break;
-		case V4L2_PIX_FMT_YUYV:
+		case V4L2_IMGBUS_FMT_YUYV:
 			value = 0x00000200; /* Y0, Cb0, Y1, Cr0 */
 			break;
-		case V4L2_PIX_FMT_YVYU:
+		case V4L2_IMGBUS_FMT_YVYU:
 			value = 0x00000300; /* Y0, Cr0, Y1, Cb0 */
 			break;
 		default:
@@ -696,8 +706,8 @@ static int sh_mobile_ceu_set_bus_param(struct soc_camera_device *icd,
 		}
 	}
 
-	if (icd->current_fmt->fourcc == V4L2_PIX_FMT_NV21 ||
-	    icd->current_fmt->fourcc == V4L2_PIX_FMT_NV61)
+	if (icd->current_fmt->host_fmt->fourcc == V4L2_PIX_FMT_NV21 ||
+	    icd->current_fmt->host_fmt->fourcc == V4L2_PIX_FMT_NV61)
 		value ^= 0x00000100; /* swap U, V to change from NV1x->NVx1 */
 
 	value |= common_flags & SOCAM_VSYNC_ACTIVE_LOW ? 1 << 1 : 0;
@@ -744,7 +754,8 @@ static int sh_mobile_ceu_set_bus_param(struct soc_camera_device *icd,
 	return 0;
 }
 
-static int sh_mobile_ceu_try_bus_param(struct soc_camera_device *icd)
+static int sh_mobile_ceu_try_bus_param(struct soc_camera_device *icd,
+				       unsigned char buswidth)
 {
 	struct soc_camera_host *ici = to_soc_camera_host(icd->dev.parent);
 	struct sh_mobile_ceu_dev *pcdev = ici->priv;
@@ -753,48 +764,79 @@ static int sh_mobile_ceu_try_bus_param(struct soc_camera_device *icd)
 	camera_flags = icd->ops->query_bus_param(icd);
 	common_flags = soc_camera_bus_param_compatible(camera_flags,
 						       make_bus_param(pcdev));
-	if (!common_flags)
+	if (!common_flags || buswidth > 16 ||
+	    (buswidth > 8 && !(common_flags & SOCAM_DATAWIDTH_16)))
 		return -EINVAL;
 
 	return 0;
 }
 
-static const struct soc_camera_data_format sh_mobile_ceu_formats[] = {
-	{
-		.name		= "NV12",
-		.depth		= 12,
-		.fourcc		= V4L2_PIX_FMT_NV12,
-		.colorspace	= V4L2_COLORSPACE_JPEG,
-	},
-	{
-		.name		= "NV21",
-		.depth		= 12,
-		.fourcc		= V4L2_PIX_FMT_NV21,
-		.colorspace	= V4L2_COLORSPACE_JPEG,
-	},
-	{
-		.name		= "NV16",
-		.depth		= 16,
-		.fourcc		= V4L2_PIX_FMT_NV16,
-		.colorspace	= V4L2_COLORSPACE_JPEG,
-	},
+static const struct v4l2_imgbus_pixelfmt sh_mobile_ceu_formats[] = {
 	{
-		.name		= "NV61",
-		.depth		= 16,
-		.fourcc		= V4L2_PIX_FMT_NV61,
-		.colorspace	= V4L2_COLORSPACE_JPEG,
+		.fourcc			= V4L2_PIX_FMT_NV12,
+		.colorspace		= V4L2_COLORSPACE_JPEG,
+		.name			= "NV12",
+		.bits_per_sample	= 12,
+		.packing		= V4L2_IMGBUS_PACKING_NONE,
+		.order			= V4L2_IMGBUS_ORDER_LE,
+	}, {
+		.fourcc			= V4L2_PIX_FMT_NV21,
+		.colorspace		= V4L2_COLORSPACE_JPEG,
+		.name			= "NV21",
+		.bits_per_sample	= 12,
+		.packing		= V4L2_IMGBUS_PACKING_NONE,
+		.order			= V4L2_IMGBUS_ORDER_LE,
+	}, {
+		.fourcc			= V4L2_PIX_FMT_NV16,
+		.colorspace		= V4L2_COLORSPACE_JPEG,
+		.name			= "NV16",
+		.bits_per_sample	= 16,
+		.packing		= V4L2_IMGBUS_PACKING_NONE,
+		.order			= V4L2_IMGBUS_ORDER_LE,
+	}, {
+		.fourcc			= V4L2_PIX_FMT_NV61,
+		.colorspace		= V4L2_COLORSPACE_JPEG,
+		.name			= "NV61",
+		.bits_per_sample	= 16,
+		.packing		= V4L2_IMGBUS_PACKING_NONE,
+		.order			= V4L2_IMGBUS_ORDER_LE,
 	},
 };
 
+/* This will be corrected as we get more formats */
+static bool sh_mobile_ceu_packing_supported(const struct v4l2_imgbus_pixelfmt *fmt)
+{
+	return	fmt->packing == V4L2_IMGBUS_PACKING_NONE ||
+		(fmt->bits_per_sample == 8 &&
+		 fmt->packing == V4L2_IMGBUS_PACKING_2X8_PADHI) ||
+		(fmt->bits_per_sample > 8 &&
+		 fmt->packing == V4L2_IMGBUS_PACKING_EXTEND16);
+}
+
 static int sh_mobile_ceu_get_formats(struct soc_camera_device *icd, int idx,
 				     struct soc_camera_format_xlate *xlate)
 {
+	struct v4l2_subdev *sd = soc_camera_to_subdev(icd);
 	struct device *dev = icd->dev.parent;
 	int ret, k, n;
 	int formats = 0;
 	struct sh_mobile_ceu_cam *cam;
+	enum v4l2_imgbus_pixelcode code;
+	const struct v4l2_imgbus_pixelfmt *fmt;
+
+	ret = v4l2_subdev_call(sd, video, enum_imgbus_fmt, idx, &code);
+	if (ret < 0)
+		/* No more formats */
+		return 0;
 
-	ret = sh_mobile_ceu_try_bus_param(icd);
+	fmt = v4l2_imgbus_get_fmtdesc(code);
+	if (!fmt) {
+		dev_err(icd->dev.parent,
+			"Invalid format code #%d: %d\n", idx, code);
+		return -EINVAL;
+	}
+
+	ret = sh_mobile_ceu_try_bus_param(icd, fmt->bits_per_sample);
 	if (ret < 0)
 		return 0;
 
@@ -812,13 +854,13 @@ static int sh_mobile_ceu_get_formats(struct soc_camera_device *icd, int idx,
 	if (!idx)
 		cam->extra_fmt = NULL;
 
-	switch (icd->formats[idx].fourcc) {
-	case V4L2_PIX_FMT_UYVY:
-	case V4L2_PIX_FMT_VYUY:
-	case V4L2_PIX_FMT_YUYV:
-	case V4L2_PIX_FMT_YVYU:
+	switch (code) {
+	case V4L2_IMGBUS_FMT_UYVY:
+	case V4L2_IMGBUS_FMT_VYUY:
+	case V4L2_IMGBUS_FMT_YUYV:
+	case V4L2_IMGBUS_FMT_YVYU:
 		if (cam->extra_fmt)
-			goto add_single_format;
+			break;
 
 		/*
 		 * Our case is simple so far: for any of the above four camera
@@ -829,32 +871,31 @@ static int sh_mobile_ceu_get_formats(struct soc_camera_device *icd, int idx,
 		 * the host_priv pointer and check whether the format you're
 		 * going to add now is already there.
 		 */
-		cam->extra_fmt = (void *)sh_mobile_ceu_formats;
+		cam->extra_fmt = sh_mobile_ceu_formats;
 
 		n = ARRAY_SIZE(sh_mobile_ceu_formats);
 		formats += n;
 		for (k = 0; xlate && k < n; k++) {
-			xlate->host_fmt = &sh_mobile_ceu_formats[k];
-			xlate->cam_fmt = icd->formats + idx;
-			xlate->buswidth = icd->formats[idx].depth;
+			xlate->host_fmt	= &sh_mobile_ceu_formats[k];
+			xlate->code	= code;
 			xlate++;
-			dev_dbg(dev, "Providing format %s using %s\n",
-				sh_mobile_ceu_formats[k].name,
-				icd->formats[idx].name);
+			dev_dbg(dev, "Providing format %s using code %d\n",
+				sh_mobile_ceu_formats[k].name, code);
 		}
+		break;
 	default:
-add_single_format:
-		/* Generic pass-through */
-		formats++;
-		if (xlate) {
-			xlate->host_fmt = icd->formats + idx;
-			xlate->cam_fmt = icd->formats + idx;
-			xlate->buswidth = icd->formats[idx].depth;
-			xlate++;
-			dev_dbg(dev,
-				"Providing format %s in pass-through mode\n",
-				icd->formats[idx].name);
-		}
+		if (!sh_mobile_ceu_packing_supported(fmt))
+			return 0;
+	}
+
+	/* Generic pass-through */
+	formats++;
+	if (xlate) {
+		xlate->host_fmt	= fmt;
+		xlate->code	= code;
+		xlate++;
+		dev_dbg(dev, "Providing format %s in pass-through mode\n",
+			fmt->name);
 	}
 
 	return formats;
@@ -1034,17 +1075,15 @@ static int client_s_crop(struct v4l2_subdev *sd, struct v4l2_crop *crop,
 static int get_camera_scales(struct v4l2_subdev *sd, struct v4l2_rect *rect,
 			     unsigned int *scale_h, unsigned int *scale_v)
 {
-	struct v4l2_format f;
+	struct v4l2_imgbus_framefmt imgf;
 	int ret;
 
-	f.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
-
-	ret = v4l2_subdev_call(sd, video, g_fmt, &f);
+	ret = v4l2_subdev_call(sd, video, g_imgbus_fmt, &imgf);
 	if (ret < 0)
 		return ret;
 
-	*scale_h = calc_generic_scale(rect->width, f.fmt.pix.width);
-	*scale_v = calc_generic_scale(rect->height, f.fmt.pix.height);
+	*scale_h = calc_generic_scale(rect->width, imgf.width);
+	*scale_v = calc_generic_scale(rect->height, imgf.height);
 
 	return 0;
 }
@@ -1059,32 +1098,29 @@ static int get_camera_subwin(struct soc_camera_device *icd,
 	if (!ceu_rect->width) {
 		struct v4l2_subdev *sd = soc_camera_to_subdev(icd);
 		struct device *dev = icd->dev.parent;
-		struct v4l2_format f;
-		struct v4l2_pix_format *pix = &f.fmt.pix;
+		struct v4l2_imgbus_framefmt imgf;
 		int ret;
 		/* First time */
 
-		f.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
-
-		ret = v4l2_subdev_call(sd, video, g_fmt, &f);
+		ret = v4l2_subdev_call(sd, video, g_imgbus_fmt, &imgf);
 		if (ret < 0)
 			return ret;
 
-		dev_geo(dev, "camera fmt %ux%u\n", pix->width, pix->height);
+		dev_geo(dev, "camera fmt %ux%u\n", imgf.width, imgf.height);
 
-		if (pix->width > 2560) {
+		if (imgf.width > 2560) {
 			ceu_rect->width	 = 2560;
-			ceu_rect->left	 = (pix->width - 2560) / 2;
+			ceu_rect->left	 = (imgf.width - 2560) / 2;
 		} else {
-			ceu_rect->width	 = pix->width;
+			ceu_rect->width	 = imgf.width;
 			ceu_rect->left	 = 0;
 		}
 
-		if (pix->height > 1920) {
+		if (imgf.height > 1920) {
 			ceu_rect->height = 1920;
-			ceu_rect->top	 = (pix->height - 1920) / 2;
+			ceu_rect->top	 = (imgf.height - 1920) / 2;
 		} else {
-			ceu_rect->height = pix->height;
+			ceu_rect->height = imgf.height;
 			ceu_rect->top	 = 0;
 		}
 
@@ -1101,13 +1137,12 @@ static int get_camera_subwin(struct soc_camera_device *icd,
 	return 0;
 }
 
-static int client_s_fmt(struct soc_camera_device *icd, struct v4l2_format *f,
-			bool ceu_can_scale)
+static int client_s_fmt(struct soc_camera_device *icd,
+			struct v4l2_imgbus_framefmt *imgf, bool ceu_can_scale)
 {
 	struct v4l2_subdev *sd = soc_camera_to_subdev(icd);
 	struct device *dev = icd->dev.parent;
-	struct v4l2_pix_format *pix = &f->fmt.pix;
-	unsigned int width = pix->width, height = pix->height, tmp_w, tmp_h;
+	unsigned int width = imgf->width, height = imgf->height, tmp_w, tmp_h;
 	unsigned int max_width, max_height;
 	struct v4l2_cropcap cap;
 	int ret;
@@ -1121,29 +1156,29 @@ static int client_s_fmt(struct soc_camera_device *icd, struct v4l2_format *f,
 	max_width = min(cap.bounds.width, 2560);
 	max_height = min(cap.bounds.height, 1920);
 
-	ret = v4l2_subdev_call(sd, video, s_fmt, f);
+	ret = v4l2_subdev_call(sd, video, s_imgbus_fmt, imgf);
 	if (ret < 0)
 		return ret;
 
-	dev_geo(dev, "camera scaled to %ux%u\n", pix->width, pix->height);
+	dev_geo(dev, "camera scaled to %ux%u\n", imgf->width, imgf->height);
 
-	if ((width == pix->width && height == pix->height) || !ceu_can_scale)
+	if ((width == imgf->width && height == imgf->height) || !ceu_can_scale)
 		return 0;
 
 	/* Camera set a format, but geometry is not precise, try to improve */
-	tmp_w = pix->width;
-	tmp_h = pix->height;
+	tmp_w = imgf->width;
+	tmp_h = imgf->height;
 
 	/* width <= max_width && height <= max_height - guaranteed by try_fmt */
 	while ((width > tmp_w || height > tmp_h) &&
 	       tmp_w < max_width && tmp_h < max_height) {
 		tmp_w = min(2 * tmp_w, max_width);
 		tmp_h = min(2 * tmp_h, max_height);
-		pix->width = tmp_w;
-		pix->height = tmp_h;
-		ret = v4l2_subdev_call(sd, video, s_fmt, f);
+		imgf->width = tmp_w;
+		imgf->height = tmp_h;
+		ret = v4l2_subdev_call(sd, video, s_imgbus_fmt, imgf);
 		dev_geo(dev, "Camera scaled to %ux%u\n",
-			pix->width, pix->height);
+			imgf->width, imgf->height);
 		if (ret < 0) {
 			/* This shouldn't happen */
 			dev_err(dev, "Client failed to set format: %d\n", ret);
@@ -1161,27 +1196,26 @@ static int client_s_fmt(struct soc_camera_device *icd, struct v4l2_format *f,
  */
 static int client_scale(struct soc_camera_device *icd, struct v4l2_rect *rect,
 			struct v4l2_rect *sub_rect, struct v4l2_rect *ceu_rect,
-			struct v4l2_format *f, bool ceu_can_scale)
+			struct v4l2_imgbus_framefmt *imgf, bool ceu_can_scale)
 {
 	struct v4l2_subdev *sd = soc_camera_to_subdev(icd);
 	struct sh_mobile_ceu_cam *cam = icd->host_priv;
 	struct device *dev = icd->dev.parent;
-	struct v4l2_format f_tmp = *f;
-	struct v4l2_pix_format *pix_tmp = &f_tmp.fmt.pix;
+	struct v4l2_imgbus_framefmt imgf_tmp = *imgf;
 	unsigned int scale_h, scale_v;
 	int ret;
 
 	/* 5. Apply iterative camera S_FMT for camera user window. */
-	ret = client_s_fmt(icd, &f_tmp, ceu_can_scale);
+	ret = client_s_fmt(icd, &imgf_tmp, ceu_can_scale);
 	if (ret < 0)
 		return ret;
 
 	dev_geo(dev, "5: camera scaled to %ux%u\n",
-		pix_tmp->width, pix_tmp->height);
+		imgf_tmp.width, imgf_tmp.height);
 
 	/* 6. Retrieve camera output window (g_fmt) */
 
-	/* unneeded - it is already in "f_tmp" */
+	/* unneeded - it is already in "imgf_tmp" */
 
 	/* 7. Calculate new camera scales. */
 	ret = get_camera_scales(sd, rect, &scale_h, &scale_v);
@@ -1190,10 +1224,10 @@ static int client_scale(struct soc_camera_device *icd, struct v4l2_rect *rect,
 
 	dev_geo(dev, "7: camera scales %u:%u\n", scale_h, scale_v);
 
-	cam->cam_width		= pix_tmp->width;
-	cam->cam_height		= pix_tmp->height;
-	f->fmt.pix.width	= pix_tmp->width;
-	f->fmt.pix.height	= pix_tmp->height;
+	cam->cam_width	= imgf_tmp.width;
+	cam->cam_height	= imgf_tmp.height;
+	imgf->width	= imgf_tmp.width;
+	imgf->height	= imgf_tmp.height;
 
 	/*
 	 * 8. Calculate new CEU crop - apply camera scales to previously
@@ -1257,8 +1291,7 @@ static int sh_mobile_ceu_set_crop(struct soc_camera_device *icd,
 	struct v4l2_rect *cam_rect = &cam_crop.c, *ceu_rect = &cam->ceu_rect;
 	struct v4l2_subdev *sd = soc_camera_to_subdev(icd);
 	struct device *dev = icd->dev.parent;
-	struct v4l2_format f;
-	struct v4l2_pix_format *pix = &f.fmt.pix;
+	struct v4l2_imgbus_framefmt imgf;
 	unsigned int scale_comb_h, scale_comb_v, scale_ceu_h, scale_ceu_v,
 		out_width, out_height;
 	u32 capsr, cflcr;
@@ -1307,25 +1340,24 @@ static int sh_mobile_ceu_set_crop(struct soc_camera_device *icd,
 	 * 5. Using actual input window and calculated combined scales calculate
 	 *    camera target output window.
 	 */
-	pix->width		= scale_down(cam_rect->width, scale_comb_h);
-	pix->height		= scale_down(cam_rect->height, scale_comb_v);
+	imgf.width	= scale_down(cam_rect->width, scale_comb_h);
+	imgf.height	= scale_down(cam_rect->height, scale_comb_v);
 
-	dev_geo(dev, "5: camera target %ux%u\n", pix->width, pix->height);
+	dev_geo(dev, "5: camera target %ux%u\n", imgf.width, imgf.height);
 
 	/* 6. - 9. */
-	pix->pixelformat	= cam->camera_fmt->fourcc;
-	pix->colorspace		= cam->camera_fmt->colorspace;
+	imgf.code	= cam->code;
+	imgf.field	= pcdev->is_interlaced ? V4L2_FIELD_INTERLACED :
+						V4L2_FIELD_NONE;
 
 	capsr = capture_save_reset(pcdev);
 	dev_dbg(dev, "CAPSR 0x%x, CFLCR 0x%x\n", capsr, pcdev->cflcr);
 
 	/* Make relative to camera rectangle */
-	rect->left		-= cam_rect->left;
-	rect->top		-= cam_rect->top;
-
-	f.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
+	rect->left	-= cam_rect->left;
+	rect->top	-= cam_rect->top;
 
-	ret = client_scale(icd, cam_rect, rect, ceu_rect, &f,
+	ret = client_scale(icd, cam_rect, rect, ceu_rect, &imgf,
 			   pcdev->image_mode && !pcdev->is_interlaced);
 
 	dev_geo(dev, "6-9: %d\n", ret);
@@ -1373,8 +1405,7 @@ static int sh_mobile_ceu_set_fmt(struct soc_camera_device *icd,
 	struct sh_mobile_ceu_dev *pcdev = ici->priv;
 	struct sh_mobile_ceu_cam *cam = icd->host_priv;
 	struct v4l2_pix_format *pix = &f->fmt.pix;
-	struct v4l2_format cam_f = *f;
-	struct v4l2_pix_format *cam_pix = &cam_f.fmt.pix;
+	struct v4l2_imgbus_framefmt imgf;
 	struct v4l2_subdev *sd = soc_camera_to_subdev(icd);
 	struct device *dev = icd->dev.parent;
 	__u32 pixfmt = pix->pixelformat;
@@ -1443,9 +1474,10 @@ static int sh_mobile_ceu_set_fmt(struct soc_camera_device *icd,
 	 * 4. Calculate camera output window by applying combined scales to real
 	 *    input window.
 	 */
-	cam_pix->width = scale_down(cam_rect->width, scale_h);
-	cam_pix->height = scale_down(cam_rect->height, scale_v);
-	cam_pix->pixelformat = xlate->cam_fmt->fourcc;
+	imgf.width	= scale_down(cam_rect->width, scale_h);
+	imgf.height	= scale_down(cam_rect->height, scale_v);
+	imgf.code	= xlate->code;
+	imgf.field	= pix->field;
 
 	switch (pixfmt) {
 	case V4L2_PIX_FMT_NV12:
@@ -1458,11 +1490,10 @@ static int sh_mobile_ceu_set_fmt(struct soc_camera_device *icd,
 		image_mode = false;
 	}
 
-	dev_geo(dev, "4: camera output %ux%u\n",
-		cam_pix->width, cam_pix->height);
+	dev_geo(dev, "4: camera output %ux%u\n", imgf.width, imgf.height);
 
 	/* 5. - 9. */
-	ret = client_scale(icd, cam_rect, &cam_subrect, &ceu_rect, &cam_f,
+	ret = client_scale(icd, cam_rect, &cam_subrect, &ceu_rect, &imgf,
 			   image_mode && !is_interlaced);
 
 	dev_geo(dev, "5-9: client scale %d\n", ret);
@@ -1470,20 +1501,20 @@ static int sh_mobile_ceu_set_fmt(struct soc_camera_device *icd,
 	/* Done with the camera. Now see if we can improve the result */
 
 	dev_dbg(dev, "Camera %d fmt %ux%u, requested %ux%u\n",
-		ret, cam_pix->width, cam_pix->height, pix->width, pix->height);
+		ret, imgf.width, imgf.height, pix->width, pix->height);
 	if (ret < 0)
 		return ret;
 
 	/* 10. Use CEU scaling to scale to the requested user window. */
 
 	/* We cannot scale up */
-	if (pix->width > cam_pix->width)
-		pix->width = cam_pix->width;
+	if (pix->width > imgf.width)
+		pix->width = imgf.width;
 	if (pix->width > ceu_rect.width)
 		pix->width = ceu_rect.width;
 
-	if (pix->height > cam_pix->height)
-		pix->height = cam_pix->height;
+	if (pix->height > imgf.height)
+		pix->height = imgf.height;
 	if (pix->height > ceu_rect.height)
 		pix->height = ceu_rect.height;
 
@@ -1497,10 +1528,9 @@ static int sh_mobile_ceu_set_fmt(struct soc_camera_device *icd,
 
 	pcdev->cflcr = scale_h | (scale_v << 16);
 
-	icd->buswidth = xlate->buswidth;
-	icd->current_fmt = xlate->host_fmt;
-	cam->camera_fmt = xlate->cam_fmt;
-	cam->ceu_rect = ceu_rect;
+	cam->code		= xlate->code;
+	cam->ceu_rect		= ceu_rect;
+	icd->current_fmt	= xlate;
 
 	pcdev->is_interlaced = is_interlaced;
 	pcdev->image_mode = image_mode;
@@ -1514,6 +1544,7 @@ static int sh_mobile_ceu_try_fmt(struct soc_camera_device *icd,
 	const struct soc_camera_format_xlate *xlate;
 	struct v4l2_pix_format *pix = &f->fmt.pix;
 	struct v4l2_subdev *sd = soc_camera_to_subdev(icd);
+	struct v4l2_imgbus_framefmt imgf;
 	__u32 pixfmt = pix->pixelformat;
 	int width, height;
 	int ret;
@@ -1532,18 +1563,24 @@ static int sh_mobile_ceu_try_fmt(struct soc_camera_device *icd,
 	width = pix->width;
 	height = pix->height;
 
-	pix->bytesperline = pix->width *
-		DIV_ROUND_UP(xlate->host_fmt->depth, 8);
-	pix->sizeimage = pix->height * pix->bytesperline;
-
-	pix->pixelformat = xlate->cam_fmt->fourcc;
+	pix->bytesperline = v4l2_imgbus_bytes_per_line(width, xlate->host_fmt);
+	if (pix->bytesperline < 0)
+		return pix->bytesperline;
+	pix->sizeimage = height * pix->bytesperline;
 
 	/* limit to sensor capabilities */
-	ret = v4l2_subdev_call(sd, video, try_fmt, f);
-	pix->pixelformat = pixfmt;
+	imgf.width	= pix->width;
+	imgf.height	= pix->height;
+	imgf.field	= pix->field;
+	imgf.code	= xlate->code;
+	ret = v4l2_subdev_call(sd, video, try_imgbus_fmt, &imgf);
 	if (ret < 0)
 		return ret;
 
+	pix->width	= imgf.width;
+	pix->height	= imgf.height;
+	pix->field	= imgf.field;
+
 	switch (pixfmt) {
 	case V4L2_PIX_FMT_NV12:
 	case V4L2_PIX_FMT_NV21:
@@ -1555,7 +1592,7 @@ static int sh_mobile_ceu_try_fmt(struct soc_camera_device *icd,
 			int tmp_w = pix->width, tmp_h = pix->height;
 			pix->width = 2560;
 			pix->height = 1920;
-			ret = v4l2_subdev_call(sd, video, try_fmt, f);
+			ret = v4l2_subdev_call(sd, video, try_imgbus_fmt, &imgf);
 			if (ret < 0) {
 				/* Shouldn't actually happen... */
 				dev_err(icd->dev.parent,
@@ -1661,7 +1698,7 @@ static int sh_mobile_ceu_set_ctrl(struct soc_camera_device *icd,
 
 	switch (ctrl->id) {
 	case V4L2_CID_SHARPNESS:
-		switch (icd->current_fmt->fourcc) {
+		switch (icd->current_fmt->host_fmt->fourcc) {
 		case V4L2_PIX_FMT_NV12:
 		case V4L2_PIX_FMT_NV21:
 		case V4L2_PIX_FMT_NV16:
diff --git a/drivers/media/video/soc_camera.c b/drivers/media/video/soc_camera.c
index bf77935..7c624f9 100644
--- a/drivers/media/video/soc_camera.c
+++ b/drivers/media/video/soc_camera.c
@@ -40,18 +40,6 @@ static LIST_HEAD(hosts);
 static LIST_HEAD(devices);
 static DEFINE_MUTEX(list_lock);		/* Protects the list of hosts */
 
-const struct soc_camera_data_format *soc_camera_format_by_fourcc(
-	struct soc_camera_device *icd, unsigned int fourcc)
-{
-	unsigned int i;
-
-	for (i = 0; i < icd->num_formats; i++)
-		if (icd->formats[i].fourcc == fourcc)
-			return icd->formats + i;
-	return NULL;
-}
-EXPORT_SYMBOL(soc_camera_format_by_fourcc);
-
 const struct soc_camera_format_xlate *soc_camera_xlate_by_fourcc(
 	struct soc_camera_device *icd, unsigned int fourcc)
 {
@@ -207,21 +195,26 @@ static int soc_camera_dqbuf(struct file *file, void *priv,
 /* Always entered with .video_lock held */
 static int soc_camera_init_user_formats(struct soc_camera_device *icd)
 {
+	struct v4l2_subdev *sd = soc_camera_to_subdev(icd);
 	struct soc_camera_host *ici = to_soc_camera_host(icd->dev.parent);
-	int i, fmts = 0, ret;
+	int i, fmts = 0, raw_fmts = 0, ret;
+	enum v4l2_imgbus_pixelcode code;
+
+	while (!v4l2_subdev_call(sd, video, enum_imgbus_fmt, raw_fmts, &code))
+		raw_fmts++;
 
 	if (!ici->ops->get_formats)
 		/*
 		 * Fallback mode - the host will have to serve all
 		 * sensor-provided formats one-to-one to the user
 		 */
-		fmts = icd->num_formats;
+		fmts = raw_fmts;
 	else
 		/*
 		 * First pass - only count formats this host-sensor
 		 * configuration can provide
 		 */
-		for (i = 0; i < icd->num_formats; i++) {
+		for (i = 0; i < raw_fmts; i++) {
 			ret = ici->ops->get_formats(icd, i, NULL);
 			if (ret < 0)
 				return ret;
@@ -242,11 +235,11 @@ static int soc_camera_init_user_formats(struct soc_camera_device *icd)
 
 	/* Second pass - actually fill data formats */
 	fmts = 0;
-	for (i = 0; i < icd->num_formats; i++)
+	for (i = 0; i < raw_fmts; i++)
 		if (!ici->ops->get_formats) {
-			icd->user_formats[i].host_fmt = icd->formats + i;
-			icd->user_formats[i].cam_fmt = icd->formats + i;
-			icd->user_formats[i].buswidth = icd->formats[i].depth;
+			v4l2_subdev_call(sd, video, enum_imgbus_fmt, i, &code);
+			icd->user_formats[i].host_fmt = v4l2_imgbus_get_fmtdesc(code);
+			icd->user_formats[i].code = code;
 		} else {
 			ret = ici->ops->get_formats(icd, i,
 						    &icd->user_formats[fmts]);
@@ -255,7 +248,7 @@ static int soc_camera_init_user_formats(struct soc_camera_device *icd)
 			fmts += ret;
 		}
 
-	icd->current_fmt = icd->user_formats[0].host_fmt;
+	icd->current_fmt = &icd->user_formats[0];
 
 	return 0;
 
@@ -281,7 +274,7 @@ static void soc_camera_free_user_formats(struct soc_camera_device *icd)
 #define pixfmtstr(x) (x) & 0xff, ((x) >> 8) & 0xff, ((x) >> 16) & 0xff, \
 	((x) >> 24) & 0xff
 
-/* Called with .vb_lock held */
+/* Called with .vb_lock held, or from the first open(2), see comment there */
 static int soc_camera_set_fmt(struct soc_camera_file *icf,
 			      struct v4l2_format *f)
 {
@@ -302,7 +295,7 @@ static int soc_camera_set_fmt(struct soc_camera_file *icf,
 	if (ret < 0) {
 		return ret;
 	} else if (!icd->current_fmt ||
-		   icd->current_fmt->fourcc != pix->pixelformat) {
+		   icd->current_fmt->host_fmt->fourcc != pix->pixelformat) {
 		dev_err(&icd->dev,
 			"Host driver hasn't set up current format correctly!\n");
 		return -EINVAL;
@@ -369,8 +362,8 @@ static int soc_camera_open(struct file *file)
 				.width		= icd->user_width,
 				.height		= icd->user_height,
 				.field		= icd->field,
-				.pixelformat	= icd->current_fmt->fourcc,
-				.colorspace	= icd->current_fmt->colorspace,
+				.pixelformat	= icd->current_fmt->host_fmt->fourcc,
+				.colorspace	= icd->current_fmt->host_fmt->colorspace,
 			},
 		};
 
@@ -390,7 +383,12 @@ static int soc_camera_open(struct file *file)
 			goto eiciadd;
 		}
 
-		/* Try to configure with default parameters */
+		/*
+		 * Try to configure with default parameters. Notice: this is the
+		 * very first open, so, we cannot race against other calls,
+		 * apart from someone else calling open() simultaneously, but
+		 * .video_lock is protecting us against it.
+		 */
 		ret = soc_camera_set_fmt(icf, &f);
 		if (ret < 0)
 			goto esfmt;
@@ -534,7 +532,7 @@ static int soc_camera_enum_fmt_vid_cap(struct file *file, void  *priv,
 {
 	struct soc_camera_file *icf = file->private_data;
 	struct soc_camera_device *icd = icf->icd;
-	const struct soc_camera_data_format *format;
+	const struct v4l2_imgbus_pixelfmt *format;
 
 	WARN_ON(priv != file->private_data);
 
@@ -543,7 +541,8 @@ static int soc_camera_enum_fmt_vid_cap(struct file *file, void  *priv,
 
 	format = icd->user_formats[f->index].host_fmt;
 
-	strlcpy(f->description, format->name, sizeof(f->description));
+	if (format->name)
+		strlcpy(f->description, format->name, sizeof(f->description));
 	f->pixelformat = format->fourcc;
 	return 0;
 }
@@ -560,12 +559,14 @@ static int soc_camera_g_fmt_vid_cap(struct file *file, void *priv,
 	pix->width		= icd->user_width;
 	pix->height		= icd->user_height;
 	pix->field		= icf->vb_vidq.field;
-	pix->pixelformat	= icd->current_fmt->fourcc;
-	pix->bytesperline	= pix->width *
-		DIV_ROUND_UP(icd->current_fmt->depth, 8);
+	pix->pixelformat	= icd->current_fmt->host_fmt->fourcc;
+	pix->bytesperline	= v4l2_imgbus_bytes_per_line(pix->width,
+						icd->current_fmt->host_fmt);
+	if (pix->bytesperline < 0)
+		return pix->bytesperline;
 	pix->sizeimage		= pix->height * pix->bytesperline;
 	dev_dbg(&icd->dev, "current_fmt->fourcc: 0x%08x\n",
-		icd->current_fmt->fourcc);
+		icd->current_fmt->host_fmt->fourcc);
 	return 0;
 }
 
@@ -894,7 +895,7 @@ static int soc_camera_probe(struct device *dev)
 	struct soc_camera_link *icl = to_soc_camera_link(icd);
 	struct device *control = NULL;
 	struct v4l2_subdev *sd;
-	struct v4l2_format f = {.type = V4L2_BUF_TYPE_VIDEO_CAPTURE};
+	struct v4l2_imgbus_framefmt imgf;
 	int ret;
 
 	dev_info(dev, "Probing %s\n", dev_name(dev));
@@ -965,9 +966,10 @@ static int soc_camera_probe(struct device *dev)
 
 	/* Try to improve our guess of a reasonable window format */
 	sd = soc_camera_to_subdev(icd);
-	if (!v4l2_subdev_call(sd, video, g_fmt, &f)) {
-		icd->user_width		= f.fmt.pix.width;
-		icd->user_height	= f.fmt.pix.height;
+	if (!v4l2_subdev_call(sd, video, g_imgbus_fmt, &imgf)) {
+		icd->user_width		= imgf.width;
+		icd->user_height	= imgf.height;
+		icd->field		= imgf.field;
 	}
 
 	/* Do we have to sysfs_remove_link() before device_unregister()? */
diff --git a/drivers/media/video/soc_camera_platform.c b/drivers/media/video/soc_camera_platform.c
index c7c9151..573480c 100644
--- a/drivers/media/video/soc_camera_platform.c
+++ b/drivers/media/video/soc_camera_platform.c
@@ -22,7 +22,7 @@
 
 struct soc_camera_platform_priv {
 	struct v4l2_subdev subdev;
-	struct soc_camera_data_format format;
+	struct v4l2_imgbus_framefmt format;
 };
 
 static struct soc_camera_platform_priv *get_priv(struct platform_device *pdev)
@@ -58,36 +58,33 @@ soc_camera_platform_query_bus_param(struct soc_camera_device *icd)
 }
 
 static int soc_camera_platform_try_fmt(struct v4l2_subdev *sd,
-				       struct v4l2_format *f)
+				       struct v4l2_imgbus_framefmt *imgf)
 {
 	struct soc_camera_platform_info *p = v4l2_get_subdevdata(sd);
-	struct v4l2_pix_format *pix = &f->fmt.pix;
 
-	pix->width = p->format.width;
-	pix->height = p->format.height;
+	imgf->width = p->format.width;
+	imgf->height = p->format.height;
 	return 0;
 }
 
-static void soc_camera_platform_video_probe(struct soc_camera_device *icd,
-					    struct platform_device *pdev)
+static struct v4l2_subdev_core_ops platform_subdev_core_ops;
+
+static int soc_camera_platform_enum_fmt(struct v4l2_subdev *sd, int index,
+					enum v4l2_imgbus_pixelcode *code)
 {
-	struct soc_camera_platform_priv *priv = get_priv(pdev);
-	struct soc_camera_platform_info *p = pdev->dev.platform_data;
+	struct soc_camera_platform_info *p = v4l2_get_subdevdata(sd);
 
-	priv->format.name = p->format_name;
-	priv->format.depth = p->format_depth;
-	priv->format.fourcc = p->format.pixelformat;
-	priv->format.colorspace = p->format.colorspace;
+	if (index)
+		return -EINVAL;
 
-	icd->formats = &priv->format;
-	icd->num_formats = 1;
+	*code = p->format.code;
+	return 0;
 }
 
-static struct v4l2_subdev_core_ops platform_subdev_core_ops;
-
 static struct v4l2_subdev_video_ops platform_subdev_video_ops = {
-	.s_stream	= soc_camera_platform_s_stream,
-	.try_fmt	= soc_camera_platform_try_fmt,
+	.s_stream		= soc_camera_platform_s_stream,
+	.try_imgbus_fmt		= soc_camera_platform_try_fmt,
+	.enum_imgbus_fmt	= soc_camera_platform_enum_fmt,
 };
 
 static struct v4l2_subdev_ops platform_subdev_ops = {
@@ -132,8 +129,6 @@ static int soc_camera_platform_probe(struct platform_device *pdev)
 
 	ici = to_soc_camera_host(icd->dev.parent);
 
-	soc_camera_platform_video_probe(icd, pdev);
-
 	v4l2_subdev_init(&priv->subdev, &platform_subdev_ops);
 	v4l2_set_subdevdata(&priv->subdev, p);
 	strncpy(priv->subdev.name, dev_name(&pdev->dev), V4L2_SUBDEV_NAME_SIZE);
diff --git a/drivers/media/video/tw9910.c b/drivers/media/video/tw9910.c
index 35373d8..09ea042 100644
--- a/drivers/media/video/tw9910.c
+++ b/drivers/media/video/tw9910.c
@@ -240,13 +240,8 @@ static const struct regval_list tw9910_default_regs[] =
 	ENDMARKER,
 };
 
-static const struct soc_camera_data_format tw9910_color_fmt[] = {
-	{
-		.name       = "VYUY",
-		.fourcc     = V4L2_PIX_FMT_VYUY,
-		.depth      = 16,
-		.colorspace = V4L2_COLORSPACE_SMPTE170M,
-	}
+static const enum v4l2_imgbus_pixelcode tw9910_color_codes[] = {
+	V4L2_IMGBUS_FMT_VYUY,
 };
 
 static const struct tw9910_scale_ctrl tw9910_ntsc_scales[] = {
@@ -762,11 +757,11 @@ static int tw9910_cropcap(struct v4l2_subdev *sd, struct v4l2_cropcap *a)
 	return 0;
 }
 
-static int tw9910_g_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
+static int tw9910_g_fmt(struct v4l2_subdev *sd,
+			struct v4l2_imgbus_framefmt *imgf)
 {
 	struct i2c_client *client = sd->priv;
 	struct tw9910_priv *priv = to_tw9910(client);
-	struct v4l2_pix_format *pix = &f->fmt.pix;
 
 	if (!priv->scale) {
 		int ret;
@@ -783,74 +778,74 @@ static int tw9910_g_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
 			return ret;
 	}
 
-	f->type			= V4L2_BUF_TYPE_VIDEO_CAPTURE;
-
-	pix->width		= priv->scale->width;
-	pix->height		= priv->scale->height;
-	pix->pixelformat	= V4L2_PIX_FMT_VYUY;
-	pix->colorspace		= V4L2_COLORSPACE_SMPTE170M;
-	pix->field		= V4L2_FIELD_INTERLACED;
+	imgf->width	= priv->scale->width;
+	imgf->height	= priv->scale->height;
+	imgf->code	= V4L2_IMGBUS_FMT_VYUY;
+	imgf->field	= V4L2_FIELD_INTERLACED;
 
 	return 0;
 }
 
-static int tw9910_s_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
+static int tw9910_s_fmt(struct v4l2_subdev *sd,
+			struct v4l2_imgbus_framefmt *imgf)
 {
 	struct i2c_client *client = sd->priv;
 	struct tw9910_priv *priv = to_tw9910(client);
-	struct v4l2_pix_format *pix = &f->fmt.pix;
 	/* See tw9910_s_crop() - no proper cropping support */
 	struct v4l2_crop a = {
 		.c = {
 			.left	= 0,
 			.top	= 0,
-			.width	= pix->width,
-			.height	= pix->height,
+			.width	= imgf->width,
+			.height	= imgf->height,
 		},
 	};
 	int i, ret;
 
+	WARN_ON(imgf->field != V4L2_FIELD_ANY &&
+		imgf->field != V4L2_FIELD_INTERLACED);
+
 	/*
 	 * check color format
 	 */
-	for (i = 0; i < ARRAY_SIZE(tw9910_color_fmt); i++)
-		if (pix->pixelformat == tw9910_color_fmt[i].fourcc)
+	for (i = 0; i < ARRAY_SIZE(tw9910_color_codes); i++)
+		if (imgf->code == tw9910_color_codes[i])
 			break;
 
-	if (i == ARRAY_SIZE(tw9910_color_fmt))
+	if (i == ARRAY_SIZE(tw9910_color_codes))
 		return -EINVAL;
 
 	ret = tw9910_s_crop(sd, &a);
 	if (!ret) {
-		pix->width = priv->scale->width;
-		pix->height = priv->scale->height;
+		imgf->width	= priv->scale->width;
+		imgf->height	= priv->scale->height;
 	}
 	return ret;
 }
 
-static int tw9910_try_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
+static int tw9910_try_fmt(struct v4l2_subdev *sd,
+			  struct v4l2_imgbus_framefmt *imgf)
 {
 	struct i2c_client *client = sd->priv;
 	struct soc_camera_device *icd = client->dev.platform_data;
-	struct v4l2_pix_format *pix = &f->fmt.pix;
 	const struct tw9910_scale_ctrl *scale;
 
-	if (V4L2_FIELD_ANY == pix->field) {
-		pix->field = V4L2_FIELD_INTERLACED;
-	} else if (V4L2_FIELD_INTERLACED != pix->field) {
-		dev_err(&client->dev, "Field type invalid.\n");
+	if (V4L2_FIELD_ANY == imgf->field) {
+		imgf->field = V4L2_FIELD_INTERLACED;
+	} else if (V4L2_FIELD_INTERLACED != imgf->field) {
+		dev_err(&client->dev, "Field type %d invalid.\n", imgf->field);
 		return -EINVAL;
 	}
 
 	/*
 	 * select suitable norm
 	 */
-	scale = tw9910_select_norm(icd, pix->width, pix->height);
+	scale = tw9910_select_norm(icd, imgf->width, imgf->height);
 	if (!scale)
 		return -EINVAL;
 
-	pix->width  = scale->width;
-	pix->height = scale->height;
+	imgf->width	= scale->width;
+	imgf->height	= scale->height;
 
 	return 0;
 }
@@ -878,9 +873,6 @@ static int tw9910_video_probe(struct soc_camera_device *icd,
 		return -ENODEV;
 	}
 
-	icd->formats     = tw9910_color_fmt;
-	icd->num_formats = ARRAY_SIZE(tw9910_color_fmt);
-
 	/*
 	 * check and show Product ID
 	 * So far only revisions 0 and 1 have been seen
@@ -918,14 +910,25 @@ static struct v4l2_subdev_core_ops tw9910_subdev_core_ops = {
 #endif
 };
 
+static int tw9910_enum_fmt(struct v4l2_subdev *sd, int index,
+			   enum v4l2_imgbus_pixelcode *code)
+{
+	if ((unsigned int)index >= ARRAY_SIZE(tw9910_color_codes))
+		return -EINVAL;
+
+	*code = tw9910_color_codes[index];
+	return 0;
+}
+
 static struct v4l2_subdev_video_ops tw9910_subdev_video_ops = {
-	.s_stream	= tw9910_s_stream,
-	.g_fmt		= tw9910_g_fmt,
-	.s_fmt		= tw9910_s_fmt,
-	.try_fmt	= tw9910_try_fmt,
-	.cropcap	= tw9910_cropcap,
-	.g_crop		= tw9910_g_crop,
-	.s_crop		= tw9910_s_crop,
+	.s_stream		= tw9910_s_stream,
+	.g_imgbus_fmt		= tw9910_g_fmt,
+	.s_imgbus_fmt		= tw9910_s_fmt,
+	.try_imgbus_fmt		= tw9910_try_fmt,
+	.cropcap		= tw9910_cropcap,
+	.g_crop			= tw9910_g_crop,
+	.s_crop			= tw9910_s_crop,
+	.enum_imgbus_fmt	= tw9910_enum_fmt,
 };
 
 static struct v4l2_subdev_ops tw9910_subdev_ops = {
diff --git a/include/media/rj54n1cb0c.h b/include/media/rj54n1cb0c.h
new file mode 100644
index 0000000..8ae3288
--- /dev/null
+++ b/include/media/rj54n1cb0c.h
@@ -0,0 +1,19 @@
+/*
+ * RJ54N1CB0C Private data
+ *
+ * Copyright (C) 2009, Guennadi Liakhovetski <g.liakhovetski@gmx.de>
+ *
+ * This program is free software; you can redistribute it and/or modify
+ * it under the terms of the GNU General Public License version 2 as
+ * published by the Free Software Foundation.
+ */
+
+#ifndef __RJ54N1CB0C_H__
+#define __RJ54N1CB0C_H__
+
+struct rj54n1_pdata {
+	unsigned int	mclk_freq;
+	bool		ioctl_high;
+};
+
+#endif
diff --git a/include/media/soc_camera.h b/include/media/soc_camera.h
index 831efff..d0ef622 100644
--- a/include/media/soc_camera.h
+++ b/include/media/soc_camera.h
@@ -26,13 +26,10 @@ struct soc_camera_device {
 	s32 user_height;
 	unsigned char iface;		/* Host number */
 	unsigned char devnum;		/* Device number per host */
-	unsigned char buswidth;		/* See comment in .c */
 	struct soc_camera_sense *sense;	/* See comment in struct definition */
 	struct soc_camera_ops *ops;
 	struct video_device *vdev;
-	const struct soc_camera_data_format *current_fmt;
-	const struct soc_camera_data_format *formats;
-	int num_formats;
+	const struct soc_camera_format_xlate *current_fmt;
 	struct soc_camera_format_xlate *user_formats;
 	int num_user_formats;
 	enum v4l2_field field;		/* Preserve field over close() */
@@ -161,23 +158,13 @@ static inline struct v4l2_subdev *soc_camera_to_subdev(
 int soc_camera_host_register(struct soc_camera_host *ici);
 void soc_camera_host_unregister(struct soc_camera_host *ici);
 
-const struct soc_camera_data_format *soc_camera_format_by_fourcc(
-	struct soc_camera_device *icd, unsigned int fourcc);
 const struct soc_camera_format_xlate *soc_camera_xlate_by_fourcc(
 	struct soc_camera_device *icd, unsigned int fourcc);
 
-struct soc_camera_data_format {
-	const char *name;
-	unsigned int depth;
-	__u32 fourcc;
-	enum v4l2_colorspace colorspace;
-};
-
 /**
  * struct soc_camera_format_xlate - match between host and sensor formats
- * @cam_fmt: sensor format provided by the sensor
- * @host_fmt: host format after host translation from cam_fmt
- * @buswidth: bus width for this format
+ * @code: code of a sensor provided format
+ * @host_fmt: host format after host translation from code
  *
  * Host and sensor translation structure. Used in table of host and sensor
  * formats matchings in soc_camera_device. A host can override the generic list
@@ -185,9 +172,8 @@ struct soc_camera_data_format {
  * format setup.
  */
 struct soc_camera_format_xlate {
-	const struct soc_camera_data_format *cam_fmt;
-	const struct soc_camera_data_format *host_fmt;
-	unsigned char buswidth;
+	enum v4l2_imgbus_pixelcode code;
+	const struct v4l2_imgbus_pixelfmt *host_fmt;
 };
 
 struct soc_camera_ops {
diff --git a/include/media/soc_camera_platform.h b/include/media/soc_camera_platform.h
index 88b3b57..a105268 100644
--- a/include/media/soc_camera_platform.h
+++ b/include/media/soc_camera_platform.h
@@ -19,7 +19,7 @@ struct device;
 struct soc_camera_platform_info {
 	const char *format_name;
 	unsigned long format_depth;
-	struct v4l2_pix_format format;
+	struct v4l2_imgbus_framefmt format;
 	unsigned long bus_param;
 	struct device *dev;
 	int (*set_capture)(struct soc_camera_platform_info *info, int enable);
-- 
1.6.2.4


^ permalink raw reply related	[flat|nested] 51+ messages in thread

* [PATCH/RFC 9/9] mt9t031: make the use of the soc-camera client API optional
  2009-10-30 14:00 [PATCH/RFC 0/9 v2] Image-bus API and accompanying soc-camera patches Guennadi Liakhovetski
                   ` (7 preceding siblings ...)
  2009-10-30 14:01 ` [PATCH/RFC 8/9 v2] soc-camera: convert to the new imagebus API Guennadi Liakhovetski
@ 2009-10-30 14:01 ` Guennadi Liakhovetski
  2009-10-30 15:28   ` Karicheri, Muralidharan
  2009-11-05 15:46   ` [PATCH/RFC 9/9] " Hans Verkuil
  2009-10-30 14:34 ` [PATCH/RFC 0/9 v2] Image-bus API and accompanying soc-camera patches Karicheri, Muralidharan
  9 siblings, 2 replies; 51+ messages in thread
From: Guennadi Liakhovetski @ 2009-10-30 14:01 UTC (permalink / raw)
  To: Linux Media Mailing List
  Cc: Hans Verkuil, Laurent Pinchart, Sakari Ailus, Muralidharan Karicheri

Now that we have moved most of the functions over to the v4l2-subdev API, only
querying and setting bus parameters are still performed using the legacy
soc-camera client API. Make the use of this API optional for mt9t031.

Signed-off-by: Guennadi Liakhovetski <g.liakhovetski@gmx.de>
---

Muralidharan, this one is for you to test. To differentiate between the 
soc-camera case and a generic user I check the i2c client's platform data 
(client->dev.platform_data), so you have to make sure your user doesn't 
use that field for anything else.

One more note: I'm not sure where v4l2_device_unregister_subdev() should 
be called. In soc-camera the core calls v4l2_i2c_new_subdev_board(), 
which then calls v4l2_device_register_subdev(). Logically, it's also the 
core that then calls v4l2_device_unregister_subdev(). However, I see many 
other client drivers call v4l2_device_unregister_subdev() internally. So, 
if your bridge driver does not call v4l2_device_unregister_subdev() 
itself and expects the client to call it, there will be a problem with 
that too.

 drivers/media/video/mt9t031.c |  146 ++++++++++++++++++++---------------------
 1 files changed, 70 insertions(+), 76 deletions(-)

diff --git a/drivers/media/video/mt9t031.c b/drivers/media/video/mt9t031.c
index c95c277..49357bd 100644
--- a/drivers/media/video/mt9t031.c
+++ b/drivers/media/video/mt9t031.c
@@ -204,6 +204,59 @@ static unsigned long mt9t031_query_bus_param(struct soc_camera_device *icd)
 	return soc_camera_apply_sensor_flags(icl, MT9T031_BUS_PARAM);
 }
 
+static const struct v4l2_queryctrl mt9t031_controls[] = {
+	{
+		.id		= V4L2_CID_VFLIP,
+		.type		= V4L2_CTRL_TYPE_BOOLEAN,
+		.name		= "Flip Vertically",
+		.minimum	= 0,
+		.maximum	= 1,
+		.step		= 1,
+		.default_value	= 0,
+	}, {
+		.id		= V4L2_CID_HFLIP,
+		.type		= V4L2_CTRL_TYPE_BOOLEAN,
+		.name		= "Flip Horizontally",
+		.minimum	= 0,
+		.maximum	= 1,
+		.step		= 1,
+		.default_value	= 0,
+	}, {
+		.id		= V4L2_CID_GAIN,
+		.type		= V4L2_CTRL_TYPE_INTEGER,
+		.name		= "Gain",
+		.minimum	= 0,
+		.maximum	= 127,
+		.step		= 1,
+		.default_value	= 64,
+		.flags		= V4L2_CTRL_FLAG_SLIDER,
+	}, {
+		.id		= V4L2_CID_EXPOSURE,
+		.type		= V4L2_CTRL_TYPE_INTEGER,
+		.name		= "Exposure",
+		.minimum	= 1,
+		.maximum	= 255,
+		.step		= 1,
+		.default_value	= 255,
+		.flags		= V4L2_CTRL_FLAG_SLIDER,
+	}, {
+		.id		= V4L2_CID_EXPOSURE_AUTO,
+		.type		= V4L2_CTRL_TYPE_BOOLEAN,
+		.name		= "Automatic Exposure",
+		.minimum	= 0,
+		.maximum	= 1,
+		.step		= 1,
+		.default_value	= 1,
+	}
+};
+
+static struct soc_camera_ops mt9t031_ops = {
+	.set_bus_param		= mt9t031_set_bus_param,
+	.query_bus_param	= mt9t031_query_bus_param,
+	.controls		= mt9t031_controls,
+	.num_controls		= ARRAY_SIZE(mt9t031_controls),
+};
+
 /* target must be _even_ */
 static u16 mt9t031_skip(s32 *source, s32 target, s32 max)
 {
@@ -223,10 +276,9 @@ static u16 mt9t031_skip(s32 *source, s32 target, s32 max)
 }
 
 /* rect is the sensor rectangle, the caller guarantees parameter validity */
-static int mt9t031_set_params(struct soc_camera_device *icd,
+static int mt9t031_set_params(struct i2c_client *client,
 			      struct v4l2_rect *rect, u16 xskip, u16 yskip)
 {
-	struct i2c_client *client = to_i2c_client(to_soc_camera_control(icd));
 	struct mt9t031 *mt9t031 = to_mt9t031(client);
 	int ret;
 	u16 xbin, ybin;
@@ -307,7 +359,7 @@ static int mt9t031_set_params(struct soc_camera_device *icd,
 		if (ret >= 0) {
 			const u32 shutter_max = MT9T031_MAX_HEIGHT + vblank;
 			const struct v4l2_queryctrl *qctrl =
-				soc_camera_find_qctrl(icd->ops,
+				soc_camera_find_qctrl(&mt9t031_ops,
 						      V4L2_CID_EXPOSURE);
 			mt9t031->exposure = (shutter_max / 2 + (total_h - 1) *
 				 (qctrl->maximum - qctrl->minimum)) /
@@ -333,7 +385,6 @@ static int mt9t031_s_crop(struct v4l2_subdev *sd, struct v4l2_crop *a)
 	struct v4l2_rect rect = a->c;
 	struct i2c_client *client = sd->priv;
 	struct mt9t031 *mt9t031 = to_mt9t031(client);
-	struct soc_camera_device *icd = client->dev.platform_data;
 
 	rect.width = ALIGN(rect.width, 2);
 	rect.height = ALIGN(rect.height, 2);
@@ -344,7 +395,7 @@ static int mt9t031_s_crop(struct v4l2_subdev *sd, struct v4l2_crop *a)
 	soc_camera_limit_side(&rect.top, &rect.height,
 		     MT9T031_ROW_SKIP, MT9T031_MIN_HEIGHT, MT9T031_MAX_HEIGHT);
 
-	return mt9t031_set_params(icd, &rect, mt9t031->xskip, mt9t031->yskip);
+	return mt9t031_set_params(client, &rect, mt9t031->xskip, mt9t031->yskip);
 }
 
 static int mt9t031_g_crop(struct v4l2_subdev *sd, struct v4l2_crop *a)
@@ -391,7 +442,6 @@ static int mt9t031_s_fmt(struct v4l2_subdev *sd,
 {
 	struct i2c_client *client = sd->priv;
 	struct mt9t031 *mt9t031 = to_mt9t031(client);
-	struct soc_camera_device *icd = client->dev.platform_data;
 	u16 xskip, yskip;
 	struct v4l2_rect rect = mt9t031->rect;
 
@@ -403,7 +453,7 @@ static int mt9t031_s_fmt(struct v4l2_subdev *sd,
 	yskip = mt9t031_skip(&rect.height, imgf->height, MT9T031_MAX_HEIGHT);
 
 	/* mt9t031_set_params() doesn't change width and height */
-	return mt9t031_set_params(icd, &rect, xskip, yskip);
+	return mt9t031_set_params(client, &rect, xskip, yskip);
 }
 
 /*
@@ -476,59 +526,6 @@ static int mt9t031_s_register(struct v4l2_subdev *sd,
 }
 #endif
 
-static const struct v4l2_queryctrl mt9t031_controls[] = {
-	{
-		.id		= V4L2_CID_VFLIP,
-		.type		= V4L2_CTRL_TYPE_BOOLEAN,
-		.name		= "Flip Vertically",
-		.minimum	= 0,
-		.maximum	= 1,
-		.step		= 1,
-		.default_value	= 0,
-	}, {
-		.id		= V4L2_CID_HFLIP,
-		.type		= V4L2_CTRL_TYPE_BOOLEAN,
-		.name		= "Flip Horizontally",
-		.minimum	= 0,
-		.maximum	= 1,
-		.step		= 1,
-		.default_value	= 0,
-	}, {
-		.id		= V4L2_CID_GAIN,
-		.type		= V4L2_CTRL_TYPE_INTEGER,
-		.name		= "Gain",
-		.minimum	= 0,
-		.maximum	= 127,
-		.step		= 1,
-		.default_value	= 64,
-		.flags		= V4L2_CTRL_FLAG_SLIDER,
-	}, {
-		.id		= V4L2_CID_EXPOSURE,
-		.type		= V4L2_CTRL_TYPE_INTEGER,
-		.name		= "Exposure",
-		.minimum	= 1,
-		.maximum	= 255,
-		.step		= 1,
-		.default_value	= 255,
-		.flags		= V4L2_CTRL_FLAG_SLIDER,
-	}, {
-		.id		= V4L2_CID_EXPOSURE_AUTO,
-		.type		= V4L2_CTRL_TYPE_BOOLEAN,
-		.name		= "Automatic Exposure",
-		.minimum	= 0,
-		.maximum	= 1,
-		.step		= 1,
-		.default_value	= 1,
-	}
-};
-
-static struct soc_camera_ops mt9t031_ops = {
-	.set_bus_param		= mt9t031_set_bus_param,
-	.query_bus_param	= mt9t031_query_bus_param,
-	.controls		= mt9t031_controls,
-	.num_controls		= ARRAY_SIZE(mt9t031_controls),
-};
-
 static int mt9t031_g_ctrl(struct v4l2_subdev *sd, struct v4l2_control *ctrl)
 {
 	struct i2c_client *client = sd->priv;
@@ -565,7 +562,6 @@ static int mt9t031_s_ctrl(struct v4l2_subdev *sd, struct v4l2_control *ctrl)
 {
 	struct i2c_client *client = sd->priv;
 	struct mt9t031 *mt9t031 = to_mt9t031(client);
-	struct soc_camera_device *icd = client->dev.platform_data;
 	const struct v4l2_queryctrl *qctrl;
 	int data;
 
@@ -657,7 +653,8 @@ static int mt9t031_s_ctrl(struct v4l2_subdev *sd, struct v4l2_control *ctrl)
 
 			if (set_shutter(client, total_h) < 0)
 				return -EIO;
-			qctrl = soc_camera_find_qctrl(icd->ops, V4L2_CID_EXPOSURE);
+			qctrl = soc_camera_find_qctrl(&mt9t031_ops,
+						      V4L2_CID_EXPOSURE);
 			mt9t031->exposure = (shutter_max / 2 + (total_h - 1) *
 				 (qctrl->maximum - qctrl->minimum)) /
 				shutter_max + qctrl->minimum;
@@ -751,18 +748,16 @@ static int mt9t031_probe(struct i2c_client *client,
 	struct mt9t031 *mt9t031;
 	struct soc_camera_device *icd = client->dev.platform_data;
 	struct i2c_adapter *adapter = to_i2c_adapter(client->dev.parent);
-	struct soc_camera_link *icl;
 	int ret;
 
-	if (!icd) {
-		dev_err(&client->dev, "MT9T031: missing soc-camera data!\n");
-		return -EINVAL;
-	}
+	if (icd) {
+		struct soc_camera_link *icl = to_soc_camera_link(icd);
+		if (!icl) {
+			dev_err(&client->dev, "MT9T031 driver needs platform data\n");
+			return -EINVAL;
+		}
 
-	icl = to_soc_camera_link(icd);
-	if (!icl) {
-		dev_err(&client->dev, "MT9T031 driver needs platform data\n");
-		return -EINVAL;
+		icd->ops = &mt9t031_ops;
 	}
 
 	if (!i2c_check_functionality(adapter, I2C_FUNC_SMBUS_WORD_DATA)) {
@@ -777,9 +772,6 @@ static int mt9t031_probe(struct i2c_client *client,
 
 	v4l2_i2c_subdev_init(&mt9t031->subdev, client, &mt9t031_subdev_ops);
 
-	/* Second stage probe - when a capture adapter is there */
-	icd->ops		= &mt9t031_ops;
-
 	mt9t031->rect.left	= MT9T031_COLUMN_SKIP;
 	mt9t031->rect.top	= MT9T031_ROW_SKIP;
 	mt9t031->rect.width	= MT9T031_MAX_WIDTH;
@@ -801,7 +793,8 @@ static int mt9t031_probe(struct i2c_client *client,
 	mt9t031_disable(client);
 
 	if (ret) {
-		icd->ops = NULL;
+		if (icd)
+			icd->ops = NULL;
 		i2c_set_clientdata(client, NULL);
 		kfree(mt9t031);
 	}
@@ -814,7 +807,8 @@ static int mt9t031_remove(struct i2c_client *client)
 	struct mt9t031 *mt9t031 = to_mt9t031(client);
 	struct soc_camera_device *icd = client->dev.platform_data;
 
-	icd->ops = NULL;
+	if (icd)
+		icd->ops = NULL;
 	i2c_set_clientdata(client, NULL);
 	client->driver = NULL;
 	kfree(mt9t031);
-- 
1.6.2.4


^ permalink raw reply related	[flat|nested] 51+ messages in thread

* RE: [PATCH/RFC 0/9 v2] Image-bus API and accompanying soc-camera patches
  2009-10-30 14:00 [PATCH/RFC 0/9 v2] Image-bus API and accompanying soc-camera patches Guennadi Liakhovetski
                   ` (8 preceding siblings ...)
  2009-10-30 14:01 ` [PATCH/RFC 9/9] mt9t031: make the use of the soc-camera client API optional Guennadi Liakhovetski
@ 2009-10-30 14:34 ` Karicheri, Muralidharan
  2009-10-30 20:12   ` Guennadi Liakhovetski
  9 siblings, 1 reply; 51+ messages in thread
From: Karicheri, Muralidharan @ 2009-10-30 14:34 UTC (permalink / raw)
  To: Guennadi Liakhovetski, Linux Media Mailing List
  Cc: Hans Verkuil, Laurent Pinchart, Sakari Ailus

Guennadi,

Thanks for updating the driver. I will integrate it when I get a chance and let you know if I see any issues.

BTW, is anyone developing a driver for the MT9P031 sensor, which is very similar to the MT9T031? Would you suggest a separate driver for this sensor, or
adding the support to MT9T031? I need a driver for this and plan to add it soon.

Murali Karicheri
Software Design Engineer
Texas Instruments Inc.
Germantown, MD 20874
email: m-karicheri2@ti.com

>-----Original Message-----
>From: Guennadi Liakhovetski [mailto:g.liakhovetski@gmx.de]
>Sent: Friday, October 30, 2009 10:01 AM
>To: Linux Media Mailing List
>Cc: Hans Verkuil; Laurent Pinchart; Sakari Ailus; Karicheri, Muralidharan
>Subject: [PATCH/RFC 0/9 v2] Image-bus API and accompanying soc-camera
>patches
>
>Hi all
>
>As discussed yesterday, we want to finalise the conversion of soc-camera
>to v4l2-subdev. The presented 9 patches consist of a couple of clean-ups,
>minor additions to existing APIs, and, most importantly, the second
>version of the image-bus API. It hardly changed since v1, only got
>extended with a couple more formats and driver conversions. The last patch
>modifies mt9t031 sensor driver to enable its use outside of soc-camera.
>Muralidharan, hopefully you'd be able to test it. I'll provide more
>comments in the respective mail. A complete current patch-stack is
>available at
>
>http://download.open-technology.de/soc-camera/20091030/
>
>based on 2.6.32-rc5. Patches, not included with these mails have either
>been already pushed via hg, or posted to the list earlier.
>
>Thanks
>Guennadi
>---
>Guennadi Liakhovetski, Ph.D.
>Freelance Open-Source Software Developer
>http://www.open-technology.de/


^ permalink raw reply	[flat|nested] 51+ messages in thread

* RE: [PATCH 2/9] v4l: add new v4l2-subdev sensor operations, use g_skip_top_lines in soc-camera
  2009-10-30 14:01 ` [PATCH 2/9] v4l: add new v4l2-subdev sensor operations, use g_skip_top_lines in soc-camera Guennadi Liakhovetski
@ 2009-10-30 14:43   ` Karicheri, Muralidharan
  2009-10-30 20:31     ` Guennadi Liakhovetski
  2009-11-10 12:55   ` Laurent Pinchart
  1 sibling, 1 reply; 51+ messages in thread
From: Karicheri, Muralidharan @ 2009-10-30 14:43 UTC (permalink / raw)
  To: Guennadi Liakhovetski, Linux Media Mailing List
  Cc: Hans Verkuil, Laurent Pinchart, Sakari Ailus

Guennadi,


> 	mt9m111->rect.left	= MT9M111_MIN_DARK_COLS;
> 	mt9m111->rect.top	= MT9M111_MIN_DARK_ROWS;
>diff --git a/drivers/media/video/mt9t031.c b/drivers/media/video/mt9t031.c
>index 6966f64..57e04e9 100644
>--- a/drivers/media/video/mt9t031.c
>+++ b/drivers/media/video/mt9t031.c
>@@ -301,9 +301,9 @@ static int mt9t031_set_params(struct soc_camera_device
>*icd,
> 		ret = reg_write(client, MT9T031_WINDOW_WIDTH, rect->width - 1);
> 	if (ret >= 0)
> 		ret = reg_write(client, MT9T031_WINDOW_HEIGHT,
>-				rect->height + icd->y_skip_top - 1);
>+				rect->height - 1);
Why is y_skip_top removed? When I connect the sensor output to our SoC input and do format conversion and resize on the fly (frame-by-frame conversion before writing to SDRAM), I have found that the frame completion interrupt fails to get generated with a zero value for y_skip_top. A value
of 10 worked fine for me. So I would like to have an s_skip_top_lines() in the sensor operations, which the host/bridge driver can call to update
this value.

> 	if (ret >= 0 && mt9t031->autoexposure) {
>-		unsigned int total_h = rect->height + icd->y_skip_top + vblank;
>+		unsigned int total_h = rect->height + vblank;
> 		ret = set_shutter(client, total_h);
> 		if (ret >= 0) {
> 			const u32 shutter_max = MT9T031_MAX_HEIGHT + vblank;
>@@ -656,8 +656,7 @@ static int mt9t031_s_ctrl(struct v4l2_subdev *sd,
>struct v4l2_control *ctrl)
> 		if (ctrl->value) {
> 			const u16 vblank = MT9T031_VERTICAL_BLANK;
> 			const u32 shutter_max = MT9T031_MAX_HEIGHT + vblank;
>-			unsigned int total_h = mt9t031->rect.height +
>-				icd->y_skip_top + vblank;
>+			unsigned int total_h = mt9t031->rect.height + vblank;
>
> 			if (set_shutter(client, total_h) < 0)
> 				return -EIO;
>@@ -773,7 +772,6 @@ static int mt9t031_probe(struct i2c_client *client,
>
> 	/* Second stage probe - when a capture adapter is there */
> 	icd->ops		= &mt9t031_ops;
>-	icd->y_skip_top		= 0;
>
> 	mt9t031->rect.left	= MT9T031_COLUMN_SKIP;
> 	mt9t031->rect.top	= MT9T031_ROW_SKIP;
>diff --git a/drivers/media/video/mt9v022.c b/drivers/media/video/mt9v022.c
>index 995607f..b71898f 100644
>--- a/drivers/media/video/mt9v022.c
>+++ b/drivers/media/video/mt9v022.c
>@@ -97,6 +97,7 @@ struct mt9v022 {
> 	__u32 fourcc;
> 	int model;	/* V4L2_IDENT_MT9V022* codes from v4l2-chip-ident.h */
> 	u16 chip_control;
>+	unsigned short y_skip_top;	/* Lines to skip at the top */
> };
>
> static struct mt9v022 *to_mt9v022(const struct i2c_client *client)
>@@ -265,7 +266,6 @@ static int mt9v022_s_crop(struct v4l2_subdev *sd,
>struct v4l2_crop *a)
> 	struct i2c_client *client = sd->priv;
> 	struct mt9v022 *mt9v022 = to_mt9v022(client);
> 	struct v4l2_rect rect = a->c;
>-	struct soc_camera_device *icd = client->dev.platform_data;
> 	int ret;
>
> 	/* Bayer format - even size lengths */
>@@ -287,10 +287,10 @@ static int mt9v022_s_crop(struct v4l2_subdev *sd,
>struct v4l2_crop *a)
> 	if (ret >= 0) {
> 		if (ret & 1) /* Autoexposure */
> 			ret = reg_write(client, MT9V022_MAX_TOTAL_SHUTTER_WIDTH,
>-					rect.height + icd->y_skip_top + 43);
>+					rect.height + mt9v022->y_skip_top + 43);
> 		else
> 			ret = reg_write(client, MT9V022_TOTAL_SHUTTER_WIDTH,
>-					rect.height + icd->y_skip_top + 43);
>+					rect.height + mt9v022->y_skip_top + 43);
> 	}
> 	/* Setup frame format: defaults apart from width and height */
> 	if (!ret)
>@@ -309,7 +309,7 @@ static int mt9v022_s_crop(struct v4l2_subdev *sd,
>struct v4l2_crop *a)
> 		ret = reg_write(client, MT9V022_WINDOW_WIDTH, rect.width);
> 	if (!ret)
> 		ret = reg_write(client, MT9V022_WINDOW_HEIGHT,
>-				rect.height + icd->y_skip_top);
>+				rect.height + mt9v022->y_skip_top);
>
> 	if (ret < 0)
> 		return ret;
>@@ -410,15 +410,15 @@ static int mt9v022_s_fmt(struct v4l2_subdev *sd,
>struct v4l2_format *f)
> static int mt9v022_try_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
> {
> 	struct i2c_client *client = sd->priv;
>-	struct soc_camera_device *icd = client->dev.platform_data;
>+	struct mt9v022 *mt9v022 = to_mt9v022(client);
> 	struct v4l2_pix_format *pix = &f->fmt.pix;
> 	int align = pix->pixelformat == V4L2_PIX_FMT_SBGGR8 ||
> 		pix->pixelformat == V4L2_PIX_FMT_SBGGR16;
>
> 	v4l_bound_align_image(&pix->width, MT9V022_MIN_WIDTH,
> 		MT9V022_MAX_WIDTH, align,
>-		&pix->height, MT9V022_MIN_HEIGHT + icd->y_skip_top,
>-		MT9V022_MAX_HEIGHT + icd->y_skip_top, align, 0);
>+		&pix->height, MT9V022_MIN_HEIGHT + mt9v022->y_skip_top,
>+		MT9V022_MAX_HEIGHT + mt9v022->y_skip_top, align, 0);
>
> 	return 0;
> }
>@@ -787,6 +787,16 @@ static void mt9v022_video_remove(struct
>soc_camera_device *icd)
> 		icl->free_bus(icl);
> }
>
>+static int mt9v022_g_skip_top_lines(struct v4l2_subdev *sd, u32 *lines)
>+{
>+	struct i2c_client *client = sd->priv;
>+	struct mt9v022 *mt9v022 = to_mt9v022(client);
>+
>+	*lines = mt9v022->y_skip_top;
>+
>+	return 0;
>+}
>+
> static struct v4l2_subdev_core_ops mt9v022_subdev_core_ops = {
> 	.g_ctrl		= mt9v022_g_ctrl,
> 	.s_ctrl		= mt9v022_s_ctrl,
>@@ -807,9 +817,14 @@ static struct v4l2_subdev_video_ops
>mt9v022_subdev_video_ops = {
> 	.cropcap	= mt9v022_cropcap,
> };
>
>+static struct v4l2_subdev_sensor_ops mt9v022_subdev_sensor_ops = {
>+	.g_skip_top_lines	= mt9v022_g_skip_top_lines,
>+};
>+
> static struct v4l2_subdev_ops mt9v022_subdev_ops = {
> 	.core	= &mt9v022_subdev_core_ops,
> 	.video	= &mt9v022_subdev_video_ops,
>+	.sensor	= &mt9v022_subdev_sensor_ops,
> };
>
> static int mt9v022_probe(struct i2c_client *client,
>@@ -851,8 +866,7 @@ static int mt9v022_probe(struct i2c_client *client,
> 	 * MT9V022 _really_ corrupts the first read out line.
> 	 * TODO: verify on i.MX31
> 	 */
>-	icd->y_skip_top		= 1;
>-
>+	mt9v022->y_skip_top	= 1;
> 	mt9v022->rect.left	= MT9V022_COLUMN_SKIP;
> 	mt9v022->rect.top	= MT9V022_ROW_SKIP;
> 	mt9v022->rect.width	= MT9V022_MAX_WIDTH;
>diff --git a/drivers/media/video/pxa_camera.c
>b/drivers/media/video/pxa_camera.c
>index 51b683c..4df09a6 100644
>--- a/drivers/media/video/pxa_camera.c
>+++ b/drivers/media/video/pxa_camera.c
>@@ -1051,8 +1051,13 @@ static void pxa_camera_setup_cicr(struct
>soc_camera_device *icd,
> {
> 	struct soc_camera_host *ici = to_soc_camera_host(icd->dev.parent);
> 	struct pxa_camera_dev *pcdev = ici->priv;
>+	struct v4l2_subdev *sd = soc_camera_to_subdev(icd);
> 	unsigned long dw, bpp;
>-	u32 cicr0, cicr1, cicr2, cicr3, cicr4 = 0;
>+	u32 cicr0, cicr1, cicr2, cicr3, cicr4 = 0, y_skip_top;
>+	int ret = v4l2_subdev_call(sd, sensor, g_skip_top_lines,
>&y_skip_top);
>+
>+	if (ret < 0)
>+		y_skip_top = 0;
>
> 	/* Datawidth is now guaranteed to be equal to one of the three values.
> 	 * We fix bit-per-pixel equal to data-width... */
>@@ -1118,7 +1123,7 @@ static void pxa_camera_setup_cicr(struct
>soc_camera_device *icd,
>
> 	cicr2 = 0;
> 	cicr3 = CICR3_LPF_VAL(icd->user_height - 1) |
>-		CICR3_BFW_VAL(min((unsigned short)255, icd->y_skip_top));
>+		CICR3_BFW_VAL(min((u32)255, y_skip_top));
> 	cicr4 |= pcdev->mclk_divisor;
>
> 	__raw_writel(cicr1, pcdev->base + CICR1);
>diff --git a/drivers/media/video/soc_camera_platform.c
>b/drivers/media/video/soc_camera_platform.c
>index b6a575c..c7c9151 100644
>--- a/drivers/media/video/soc_camera_platform.c
>+++ b/drivers/media/video/soc_camera_platform.c
>@@ -128,7 +128,6 @@ static int soc_camera_platform_probe(struct
>platform_device *pdev)
> 	/* Set the control device reference */
> 	dev_set_drvdata(&icd->dev, &pdev->dev);
>
>-	icd->y_skip_top		= 0;
> 	icd->ops		= &soc_camera_platform_ops;
>
> 	ici = to_soc_camera_host(icd->dev.parent);
>diff --git a/include/media/soc_camera.h b/include/media/soc_camera.h
>index c5afc8c..218639f 100644
>--- a/include/media/soc_camera.h
>+++ b/include/media/soc_camera.h
>@@ -24,7 +24,6 @@ struct soc_camera_device {
> 	struct device *pdev;		/* Platform device */
> 	s32 user_width;
> 	s32 user_height;
>-	unsigned short y_skip_top;	/* Lines to skip at the top */
> 	unsigned char iface;		/* Host number */
> 	unsigned char devnum;		/* Device number per host */
> 	unsigned char buswidth;		/* See comment in .c */
>diff --git a/include/media/v4l2-subdev.h b/include/media/v4l2-subdev.h
>index d411345..04193eb 100644
>--- a/include/media/v4l2-subdev.h
>+++ b/include/media/v4l2-subdev.h
>@@ -227,8 +227,20 @@ struct v4l2_subdev_video_ops {
> 	int (*s_crop)(struct v4l2_subdev *sd, struct v4l2_crop *crop);
> 	int (*g_parm)(struct v4l2_subdev *sd, struct v4l2_streamparm *param);
> 	int (*s_parm)(struct v4l2_subdev *sd, struct v4l2_streamparm *param);
>+};
>+
>+/**
>+ * struct v4l2_subdev_sensor_ops - v4l2-subdev sensor operations
>+ * @enum_framesizes: enumerate supported framesizes
>+ * @enum_frameintervals: enumerate supported frame format intervals
>+ * @g_skip_top_lines: number of lines at the top of the image to be
>skipped.
>+ *		      This is needed for some sensors, that always corrupt
>+ *		      several top lines of the output image.
>+ */
>+struct v4l2_subdev_sensor_ops {
> 	int (*enum_framesizes)(struct v4l2_subdev *sd, struct
>v4l2_frmsizeenum *fsize);
> 	int (*enum_frameintervals)(struct v4l2_subdev *sd, struct
>v4l2_frmivalenum *fival);
>+	int (*g_skip_top_lines)(struct v4l2_subdev *sd, u32 *lines);
> };
>
> struct v4l2_subdev_ops {
>@@ -236,6 +248,7 @@ struct v4l2_subdev_ops {
> 	const struct v4l2_subdev_tuner_ops *tuner;
> 	const struct v4l2_subdev_audio_ops *audio;
> 	const struct v4l2_subdev_video_ops *video;
>+	const struct v4l2_subdev_sensor_ops *sensor;
> };
>
> #define V4L2_SUBDEV_NAME_SIZE 32
>--
>1.6.2.4
>


^ permalink raw reply	[flat|nested] 51+ messages in thread

* RE: [PATCH/RFC 9/9] mt9t031: make the use of the soc-camera client API optional
  2009-10-30 14:01 ` [PATCH/RFC 9/9] mt9t031: make the use of the soc-camera client API optional Guennadi Liakhovetski
@ 2009-10-30 15:28   ` Karicheri, Muralidharan
  2009-10-30 20:25     ` Guennadi Liakhovetski
  2009-11-05 15:46   ` [PATCH/RFC 9/9] " Hans Verkuil
  1 sibling, 1 reply; 51+ messages in thread
From: Karicheri, Muralidharan @ 2009-10-30 15:28 UTC (permalink / raw)
  To: Guennadi Liakhovetski, Linux Media Mailing List
  Cc: Hans Verkuil, Laurent Pinchart, Sakari Ailus

[-- Attachment #1: Type: text/plain, Size: 12279 bytes --]

Guennadi,

Thanks for your time updating this driver. But I still don't think
it is in a state to be re-used on TI's VPFE platform. Please see
below for my comments.

Murali Karicheri
Software Design Engineer
Texas Instruments Inc.
Germantown, MD 20874
email: m-karicheri2@ti.com

>-----Original Message-----
>From: Guennadi Liakhovetski [mailto:g.liakhovetski@gmx.de]
>Sent: Friday, October 30, 2009 10:02 AM
>To: Linux Media Mailing List
>Cc: Hans Verkuil; Laurent Pinchart; Sakari Ailus; Karicheri, Muralidharan
>Subject: [PATCH/RFC 9/9] mt9t031: make the use of the soc-camera client API
>optional
>
>Now that we have moved most of the functions over to the v4l2-subdev API,
>only
>quering and setting bus parameters are still performed using the legacy
>soc-camera client API. Make the use of this API optional for mt9t031.
>
>Signed-off-by: Guennadi Liakhovetski <g.liakhovetski@gmx.de>
>---
>
>Muralidharan, this one is for you to test. To differentiate between the
>soc-camera case and a generic user I check i2c client's platform data
>(client->dev.platform_data), so, you have to make sure your user doesn't
>use that field for something else.
>
Currently I am using this field for bus parameters such as pclk polarity.
If there is an API (bus parameter) to set this after probing the sensor, that may work too. I will check your latest driver and let you know if
I see any issues migrating to this version.

>One more note: I'm not sure about where v4l2_device_unregister_subdev()
>should be called. In soc-camera the core calls
>v4l2_i2c_new_subdev_board(), which then calls
>v4l2_device_register_subdev(). Logically, it's also the core that then
>calls v4l2_device_unregister_subdev(). Whereas I see many other client
>drivers call v4l2_device_unregister_subdev() internally. So, if your
>bridge driver does not call v4l2_device_unregister_subdev() itself and
>expects the client to call it, there will be a slight problem with that
>too.

In my bridge driver, v4l2_i2c_new_subdev_board() is also called to load
up the sub device. When the bridge driver is removed (its remove() call), it calls v4l2_device_unregister(), which unregisters the v4l2 device and all
sub devices (in turn calling v4l2_device_unregister_subdev()). But most
of the sub devices also call v4l2_device_unregister_subdev() in their
module's remove() function (as does the version of the mt9t031
that I use). So even if that call is kept in the mt9t031 sensor driver (not sure if anyone uses it as a standalone driver), it would just return, since the v4l2_dev pointer in sd would have been set to NULL as a result of the bridge driver's remove() call. What do you think?

See also some comments below..

>
> drivers/media/video/mt9t031.c |  146 ++++++++++++++++++++-----------------
>----
> 1 files changed, 70 insertions(+), 76 deletions(-)
>
>diff --git a/drivers/media/video/mt9t031.c b/drivers/media/video/mt9t031.c
>index c95c277..49357bd 100644
>--- a/drivers/media/video/mt9t031.c
>+++ b/drivers/media/video/mt9t031.c
>@@ -204,6 +204,59 @@ static unsigned long mt9t031_query_bus_param(struct
>soc_camera_device *icd)
> 	return soc_camera_apply_sensor_flags(icl, MT9T031_BUS_PARAM);
> }
>
>+static const struct v4l2_queryctrl mt9t031_controls[] = {
>+	{
>+		.id		= V4L2_CID_VFLIP,
>+		.type		= V4L2_CTRL_TYPE_BOOLEAN,
>+		.name		= "Flip Vertically",
>+		.minimum	= 0,
>+		.maximum	= 1,
>+		.step		= 1,
>+		.default_value	= 0,
>+	}, {
>+		.id		= V4L2_CID_HFLIP,
>+		.type		= V4L2_CTRL_TYPE_BOOLEAN,
>+		.name		= "Flip Horizontally",
>+		.minimum	= 0,
>+		.maximum	= 1,
>+		.step		= 1,
>+		.default_value	= 0,
>+	}, {
>+		.id		= V4L2_CID_GAIN,
>+		.type		= V4L2_CTRL_TYPE_INTEGER,
>+		.name		= "Gain",
>+		.minimum	= 0,
>+		.maximum	= 127,
>+		.step		= 1,
>+		.default_value	= 64,
>+		.flags		= V4L2_CTRL_FLAG_SLIDER,
>+	}, {
>+		.id		= V4L2_CID_EXPOSURE,
>+		.type		= V4L2_CTRL_TYPE_INTEGER,
>+		.name		= "Exposure",
>+		.minimum	= 1,
>+		.maximum	= 255,
>+		.step		= 1,
>+		.default_value	= 255,
>+		.flags		= V4L2_CTRL_FLAG_SLIDER,
>+	}, {
>+		.id		= V4L2_CID_EXPOSURE_AUTO,
>+		.type		= V4L2_CTRL_TYPE_BOOLEAN,
>+		.name		= "Automatic Exposure",
>+		.minimum	= 0,
>+		.maximum	= 1,
>+		.step		= 1,
>+		.default_value	= 1,
>+	}
>+};
>+
>+static struct soc_camera_ops mt9t031_ops = {
>+	.set_bus_param		= mt9t031_set_bus_param,
>+	.query_bus_param	= mt9t031_query_bus_param,
>+	.controls		= mt9t031_controls,
>+	.num_controls		= ARRAY_SIZE(mt9t031_controls),
>+};
>+

[MK] Why don't you implement the queryctrl op in core_ops? query_bus_param()
& set_bus_param() can be implemented as sub-device operations as well,
right? I think we need to get the bus parameter RFC implemented, and
this driver could be targeted for its first use so that we could
work together to get it accepted. I didn't get a chance to study your bus
image format RFC, but I plan to review it soon and see if it can be
used on my platform as well. For this driver to be usable on our platform,
all references to soc_ must be removed. I am OK if the structure is
re-used, but if this driver calls any soc_camera function, it cannot
be used on my platform.

BTW, I am attaching a version of the driver that we use in our kernel tree for your reference, which will give you an idea of my requirements.

> /* target must be _even_ */
> static u16 mt9t031_skip(s32 *source, s32 target, s32 max)
> {
>@@ -223,10 +276,9 @@ static u16 mt9t031_skip(s32 *source, s32 target, s32
>max)
> }
>
> /* rect is the sensor rectangle, the caller guarantees parameter validity
>*/
>-static int mt9t031_set_params(struct soc_camera_device *icd,
>+static int mt9t031_set_params(struct i2c_client *client,
> 			      struct v4l2_rect *rect, u16 xskip, u16 yskip)
> {
>-	struct i2c_client *client =
>to_i2c_client(to_soc_camera_control(icd));
> 	struct mt9t031 *mt9t031 = to_mt9t031(client);
> 	int ret;
> 	u16 xbin, ybin;
>@@ -307,7 +359,7 @@ static int mt9t031_set_params(struct soc_camera_device
>*icd,
> 		if (ret >= 0) {
> 			const u32 shutter_max = MT9T031_MAX_HEIGHT + vblank;
> 			const struct v4l2_queryctrl *qctrl =
>-				soc_camera_find_qctrl(icd->ops,
>+				soc_camera_find_qctrl(&mt9t031_ops,
> 						      V4L2_CID_EXPOSURE);
> 			mt9t031->exposure = (shutter_max / 2 + (total_h - 1) *
> 				 (qctrl->maximum - qctrl->minimum)) /
>@@ -333,7 +385,6 @@ static int mt9t031_s_crop(struct v4l2_subdev *sd,
>struct v4l2_crop *a)
> 	struct v4l2_rect rect = a->c;
> 	struct i2c_client *client = sd->priv;
> 	struct mt9t031 *mt9t031 = to_mt9t031(client);
>-	struct soc_camera_device *icd = client->dev.platform_data;
>
> 	rect.width = ALIGN(rect.width, 2);
> 	rect.height = ALIGN(rect.height, 2);
>@@ -344,7 +395,7 @@ static int mt9t031_s_crop(struct v4l2_subdev *sd,
>struct v4l2_crop *a)
> 	soc_camera_limit_side(&rect.top, &rect.height,
> 		     MT9T031_ROW_SKIP, MT9T031_MIN_HEIGHT, MT9T031_MAX_HEIGHT);
>
>-	return mt9t031_set_params(icd, &rect, mt9t031->xskip, mt9t031-
>>yskip);
>+	return mt9t031_set_params(client, &rect, mt9t031->xskip, mt9t031-
>>yskip);
> }
>
> static int mt9t031_g_crop(struct v4l2_subdev *sd, struct v4l2_crop *a)
>@@ -391,7 +442,6 @@ static int mt9t031_s_fmt(struct v4l2_subdev *sd,
> {
> 	struct i2c_client *client = sd->priv;
> 	struct mt9t031 *mt9t031 = to_mt9t031(client);
>-	struct soc_camera_device *icd = client->dev.platform_data;
> 	u16 xskip, yskip;
> 	struct v4l2_rect rect = mt9t031->rect;
>
>@@ -403,7 +453,7 @@ static int mt9t031_s_fmt(struct v4l2_subdev *sd,
> 	yskip = mt9t031_skip(&rect.height, imgf->height, MT9T031_MAX_HEIGHT);
>
> 	/* mt9t031_set_params() doesn't change width and height */
>-	return mt9t031_set_params(icd, &rect, xskip, yskip);
>+	return mt9t031_set_params(client, &rect, xskip, yskip);
> }
>
> /*
>@@ -476,59 +526,6 @@ static int mt9t031_s_register(struct v4l2_subdev *sd,
> }
> #endif
>
>-static const struct v4l2_queryctrl mt9t031_controls[] = {
>-	{
>-		.id		= V4L2_CID_VFLIP,
>-		.type		= V4L2_CTRL_TYPE_BOOLEAN,
>-		.name		= "Flip Vertically",
>-		.minimum	= 0,
>-		.maximum	= 1,
>-		.step		= 1,
>-		.default_value	= 0,
>-	}, {
>-		.id		= V4L2_CID_HFLIP,
>-		.type		= V4L2_CTRL_TYPE_BOOLEAN,
>-		.name		= "Flip Horizontally",
>-		.minimum	= 0,
>-		.maximum	= 1,
>-		.step		= 1,
>-		.default_value	= 0,
>-	}, {
>-		.id		= V4L2_CID_GAIN,
>-		.type		= V4L2_CTRL_TYPE_INTEGER,
>-		.name		= "Gain",
>-		.minimum	= 0,
>-		.maximum	= 127,
>-		.step		= 1,
>-		.default_value	= 64,
>-		.flags		= V4L2_CTRL_FLAG_SLIDER,
>-	}, {
>-		.id		= V4L2_CID_EXPOSURE,
>-		.type		= V4L2_CTRL_TYPE_INTEGER,
>-		.name		= "Exposure",
>-		.minimum	= 1,
>-		.maximum	= 255,
>-		.step		= 1,
>-		.default_value	= 255,
>-		.flags		= V4L2_CTRL_FLAG_SLIDER,
>-	}, {
>-		.id		= V4L2_CID_EXPOSURE_AUTO,
>-		.type		= V4L2_CTRL_TYPE_BOOLEAN,
>-		.name		= "Automatic Exposure",
>-		.minimum	= 0,
>-		.maximum	= 1,
>-		.step		= 1,
>-		.default_value	= 1,
>-	}
>-};
>-
>-static struct soc_camera_ops mt9t031_ops = {
>-	.set_bus_param		= mt9t031_set_bus_param,
>-	.query_bus_param	= mt9t031_query_bus_param,
>-	.controls		= mt9t031_controls,
>-	.num_controls		= ARRAY_SIZE(mt9t031_controls),
>-};
>-
> static int mt9t031_g_ctrl(struct v4l2_subdev *sd, struct v4l2_control
>*ctrl)
> {
> 	struct i2c_client *client = sd->priv;
>@@ -565,7 +562,6 @@ static int mt9t031_s_ctrl(struct v4l2_subdev *sd,
>struct v4l2_control *ctrl)
> {
> 	struct i2c_client *client = sd->priv;
> 	struct mt9t031 *mt9t031 = to_mt9t031(client);
>-	struct soc_camera_device *icd = client->dev.platform_data;
> 	const struct v4l2_queryctrl *qctrl;
> 	int data;
>
>@@ -657,7 +653,8 @@ static int mt9t031_s_ctrl(struct v4l2_subdev *sd,
>struct v4l2_control *ctrl)
>
> 			if (set_shutter(client, total_h) < 0)
> 				return -EIO;
>-			qctrl = soc_camera_find_qctrl(icd->ops,
>V4L2_CID_EXPOSURE);
>+			qctrl = soc_camera_find_qctrl(&mt9t031_ops,
>+						      V4L2_CID_EXPOSURE);

[MK] Why do we still need this call? In my version of the sensor driver,
I just implement the queryctrl() operation in core_ops. This cannot work
since soc_camera_find_qctrl() is implemented only in SoC camera.

> 			mt9t031->exposure = (shutter_max / 2 + (total_h - 1) *
> 				 (qctrl->maximum - qctrl->minimum)) /
> 				shutter_max + qctrl->minimum;
>@@ -751,18 +748,16 @@ static int mt9t031_probe(struct i2c_client *client,
> 	struct mt9t031 *mt9t031;
> 	struct soc_camera_device *icd = client->dev.platform_data;
> 	struct i2c_adapter *adapter = to_i2c_adapter(client->dev.parent);
>-	struct soc_camera_link *icl;
> 	int ret;
>
>-	if (!icd) {
>-		dev_err(&client->dev, "MT9T031: missing soc-camera data!\n");
>-		return -EINVAL;
>-	}
>+	if (icd) {
>+		struct soc_camera_link *icl = to_soc_camera_link(icd);
>+		if (!icl) {
>+			dev_err(&client->dev, "MT9T031 driver needs platform
>data\n");
>+			return -EINVAL;
>+		}
>
>-	icl = to_soc_camera_link(icd);
>-	if (!icl) {
>-		dev_err(&client->dev, "MT9T031 driver needs platform data\n");
>-		return -EINVAL;
>+		icd->ops = &mt9t031_ops;
> 	}
>
> 	if (!i2c_check_functionality(adapter, I2C_FUNC_SMBUS_WORD_DATA)) {
>@@ -777,9 +772,6 @@ static int mt9t031_probe(struct i2c_client *client,
>
> 	v4l2_i2c_subdev_init(&mt9t031->subdev, client, &mt9t031_subdev_ops);
>
>-	/* Second stage probe - when a capture adapter is there */
>-	icd->ops		= &mt9t031_ops;
>-
> 	mt9t031->rect.left	= MT9T031_COLUMN_SKIP;
> 	mt9t031->rect.top	= MT9T031_ROW_SKIP;
> 	mt9t031->rect.width	= MT9T031_MAX_WIDTH;
>@@ -801,7 +793,8 @@ static int mt9t031_probe(struct i2c_client *client,
> 	mt9t031_disable(client);
>
> 	if (ret) {
>-		icd->ops = NULL;
>+		if (icd)
>+			icd->ops = NULL;
> 		i2c_set_clientdata(client, NULL);
> 		kfree(mt9t031);
> 	}
>@@ -814,7 +807,8 @@ static int mt9t031_remove(struct i2c_client *client)
> 	struct mt9t031 *mt9t031 = to_mt9t031(client);
> 	struct soc_camera_device *icd = client->dev.platform_data;
>
>-	icd->ops = NULL;
>+	if (icd)
>+		icd->ops = NULL;
> 	i2c_set_clientdata(client, NULL);
> 	client->driver = NULL;
> 	kfree(mt9t031);
>--
>1.6.2.4
>


[-- Attachment #2: mt9t031.c --]
[-- Type: text/plain, Size: 21193 bytes --]

/*
 * Driver for MT9T031 CMOS Image Sensor from Micron
 *
 * Copyright (C) 2008, Guennadi Liakhovetski, DENX Software Engineering <lg@denx.de>
 *
 * This program is free software; you can redistribute it and/or modify
 * it under the terms of the GNU General Public License version 2 as
 * published by the Free Software Foundation.
 */

#include <linux/videodev2.h>
#include <linux/slab.h>
#include <linux/i2c.h>
#include <linux/log2.h>

#include <media/v4l2-device.h>
#include <media/v4l2-common.h>
#include <media/v4l2-chip-ident.h>

/* mt9t031 i2c address 0x5d
 * The platform has to define i2c_board_info
 * and call i2c_register_board_info() */

/* mt9t031 selected register addresses */
#define MT9T031_CHIP_VERSION		0x00
#define MT9T031_ROW_START		0x01
#define MT9T031_COLUMN_START		0x02
#define MT9T031_WINDOW_HEIGHT		0x03
#define MT9T031_WINDOW_WIDTH		0x04
#define MT9T031_HORIZONTAL_BLANKING	0x05
#define MT9T031_VERTICAL_BLANKING	0x06
#define MT9T031_OUTPUT_CONTROL		0x07
#define MT9T031_SHUTTER_WIDTH_UPPER	0x08
#define MT9T031_SHUTTER_WIDTH		0x09
#define MT9T031_PIXEL_CLOCK_CONTROL	0x0a
#define MT9T031_FRAME_RESTART		0x0b
#define MT9T031_SHUTTER_DELAY		0x0c
#define MT9T031_RESET			0x0d
#define MT9T031_READ_MODE_1		0x1e
#define MT9T031_READ_MODE_2		0x20
#define MT9T031_READ_MODE_3		0x21
#define MT9T031_ROW_ADDRESS_MODE	0x22
#define MT9T031_COLUMN_ADDRESS_MODE	0x23
#define MT9T031_GLOBAL_GAIN		0x35
#define MT9T031_CHIP_ENABLE		0xF8

#define MT9T031_MAX_HEIGHT		1536
#define MT9T031_MAX_WIDTH		2048
#define MT9T031_MIN_HEIGHT		2
#define MT9T031_MIN_WIDTH		2
#define MT9T031_HORIZONTAL_BLANK	142
#define MT9T031_VERTICAL_BLANK		25
#define MT9T031_COLUMN_SKIP		32
#define MT9T031_ROW_SKIP		20
#define MT9T031_DEFAULT_WIDTH		640
#define MT9T031_DEFAULT_HEIGHT		480

#define MT9T031_BUS_PARAM	(SOCAM_PCLK_SAMPLE_RISING |	\
	SOCAM_PCLK_SAMPLE_FALLING | SOCAM_HSYNC_ACTIVE_HIGH |	\
	SOCAM_VSYNC_ACTIVE_HIGH | SOCAM_DATA_ACTIVE_HIGH |	\
	SOCAM_MASTER | SOCAM_DATAWIDTH_10)


/* Debug functions */
static int debug;
module_param(debug, bool, 0644);
MODULE_PARM_DESC(debug, "Debug level (0-1)");

static const struct v4l2_fmtdesc mt9t031_formats[] = {
	{
		.index = 0,
		.type = V4L2_BUF_TYPE_VIDEO_CAPTURE,
		.description = "Bayer (sRGB) 10 bit",
		.pixelformat = V4L2_PIX_FMT_SGRBG10,
	},
};
static const unsigned int mt9t031_num_formats = ARRAY_SIZE(mt9t031_formats);

static const struct v4l2_queryctrl mt9t031_controls[] = {
	{
		.id		= V4L2_CID_VFLIP,
		.type		= V4L2_CTRL_TYPE_BOOLEAN,
		.name		= "Flip Vertically",
		.minimum	= 0,
		.maximum	= 1,
		.step		= 1,
		.default_value	= 0,
	}, {
		.id		= V4L2_CID_HFLIP,
		.type		= V4L2_CTRL_TYPE_BOOLEAN,
		.name		= "Flip Horizontally",
		.minimum	= 0,
		.maximum	= 1,
		.step		= 1,
		.default_value	= 0,
	}, {
		.id		= V4L2_CID_GAIN,
		.type		= V4L2_CTRL_TYPE_INTEGER,
		.name		= "Gain",
		.minimum	= 0,
		.maximum	= 127,
		.step		= 1,
		.default_value	= 64,
		.flags		= V4L2_CTRL_FLAG_SLIDER,
	}, {
		.id		= V4L2_CID_EXPOSURE,
		.type		= V4L2_CTRL_TYPE_INTEGER,
		.name		= "Exposure",
		.minimum	= 1,
		.maximum	= 255,
		.step		= 1,
		.default_value	= 255,
		.flags		= V4L2_CTRL_FLAG_SLIDER,
	}, {
		.id		= V4L2_CID_EXPOSURE_AUTO,
		.type		= V4L2_CTRL_TYPE_BOOLEAN,
		.name		= "Automatic Exposure",
		.minimum	= 0,
		.maximum	= 1,
		.step		= 1,
		.default_value	= 1,
	}
};
static const unsigned int mt9t031_num_controls = ARRAY_SIZE(mt9t031_controls);

struct mt9t031 {
	struct v4l2_subdev sd;
	int model;	/* V4L2_IDENT_MT9T031* codes from v4l2-chip-ident.h */
	unsigned char autoexposure;
	u16 xskip;
	u16 yskip;
	u32 width;
	u32 height;
	unsigned short x_min;           /* Camera capabilities */
	unsigned short y_min;
	unsigned short x_current;       /* Current window location */
	unsigned short y_current;
	unsigned short width_min;
	unsigned short width_max;
	unsigned short height_min;
	unsigned short height_max;
	unsigned short y_skip_top;      /* Lines to skip at the top */
	unsigned short gain;
	unsigned short exposure;
};

static inline struct mt9t031 *to_mt9t031(struct v4l2_subdev *sd)
{
	return container_of(sd, struct mt9t031, sd);
}

static int reg_read(struct i2c_client *client, const u8 reg)
{
	s32 data;

	data = i2c_smbus_read_word_data(client, reg);
	return data < 0 ? data : swab16(data);
}

static int reg_write(struct i2c_client *client, const u8 reg,
		     const u16 data)
{
	return i2c_smbus_write_word_data(client, reg, swab16(data));
}

static int reg_set(struct i2c_client *client, const u8 reg,
		   const u16 data)
{
	int ret;

	ret = reg_read(client, reg);
	if (ret < 0)
		return ret;
	return reg_write(client, reg, ret | data);
}

static int reg_clear(struct i2c_client *client, const u8 reg,
		     const u16 data)
{
	int ret;

	ret = reg_read(client, reg);
	if (ret < 0)
		return ret;
	return reg_write(client, reg, ret & ~data);
}

static int set_shutter(struct v4l2_subdev *sd, const u32 data)
{
	struct i2c_client *client = v4l2_get_subdevdata(sd);
	int ret;

	ret = reg_write(client, MT9T031_SHUTTER_WIDTH_UPPER, data >> 16);

	if (ret >= 0)
		ret = reg_write(client, MT9T031_SHUTTER_WIDTH, data & 0xffff);

	return ret;
}

static int get_shutter(struct v4l2_subdev *sd, u32 *data)
{
	int ret;
	struct i2c_client *client = v4l2_get_subdevdata(sd);

	ret = reg_read(client, MT9T031_SHUTTER_WIDTH_UPPER);
	if (ret < 0)
		return ret;
	*data = ret << 16;

	ret = reg_read(client, MT9T031_SHUTTER_WIDTH);
	if (ret < 0)
		return ret;
	*data |= ret & 0xffff;

	return 0;
}

static int mt9t031_init(struct v4l2_subdev *sd, u32 val)
{
	int ret;
	struct i2c_client *client = v4l2_get_subdevdata(sd);

	/* Disable chip output, synchronous option update */
	ret = reg_write(client, MT9T031_RESET, 1);
	if (ret >= 0)
		ret = reg_write(client, MT9T031_RESET, 0);
	if (ret >= 0)
		ret = reg_clear(client, MT9T031_OUTPUT_CONTROL, 2);

	return ret >= 0 ? 0 : -EIO;
}

static int mt9t031_s_stream(struct v4l2_subdev *sd, int enable)
{
	struct i2c_client *client = v4l2_get_subdevdata(sd);

	/* Switch to master "normal" mode */
	if (enable) {
		if (reg_set(client, MT9T031_OUTPUT_CONTROL, 2) < 0)
			return -EIO;
	} else {
		/* Stop sensor readout */
		if (reg_clear(client, MT9T031_OUTPUT_CONTROL, 2) < 0)
			return -EIO;
	}
	return 0;
}

/* Round up minima and round down maxima */
static void recalculate_limits(struct mt9t031 *mt9t031,
			       u16 xskip, u16 yskip)
{
	mt9t031->x_min = (MT9T031_COLUMN_SKIP + xskip - 1) / xskip;
	mt9t031->y_min = (MT9T031_ROW_SKIP + yskip - 1) / yskip;
	mt9t031->width_min = (MT9T031_MIN_WIDTH + xskip - 1) / xskip;
	mt9t031->height_min = (MT9T031_MIN_HEIGHT + yskip - 1) / yskip;
	mt9t031->width_max = MT9T031_MAX_WIDTH / xskip;
	mt9t031->height_max = MT9T031_MAX_HEIGHT / yskip;
}

static const struct v4l2_queryctrl *mt9t031_find_qctrl(u32 id)
{
	int i;

	for (i = 0; i < mt9t031_num_controls; i++) {
		if (mt9t031_controls[i].id == id)
			return &mt9t031_controls[i];
	}
	return NULL;
}

static int mt9t031_set_params(struct v4l2_subdev *sd,
			      struct v4l2_rect *rect, u16 xskip, u16 yskip)
{
	struct mt9t031 *mt9t031 = to_mt9t031(sd);
	struct i2c_client *client = v4l2_get_subdevdata(sd);

	int ret;
	u16 xbin, ybin, width, height, left, top;
	const u16 hblank = MT9T031_HORIZONTAL_BLANK,
		vblank = MT9T031_VERTICAL_BLANK;

	/* Make sure we don't exceed sensor limits */
	if (rect->left + rect->width > mt9t031->width_max)
		rect->left =
		(mt9t031->width_max - rect->width) / 2 + mt9t031->x_min;

	if (rect->top + rect->height > mt9t031->height_max)
		rect->top =
		(mt9t031->height_max - rect->height) / 2 + mt9t031->y_min;

	width = rect->width * xskip;
	height = rect->height * yskip;
	left = rect->left * xskip;
	top = rect->top * yskip;

	xbin = min(xskip, (u16)3);
	ybin = min(yskip, (u16)3);

	v4l2_dbg(1, debug, sd, "xskip %u, width %u/%u, yskip %u, "
		"height %u/%u\n", xskip, width, rect->width, yskip,
		height, rect->height);

	/* Could just do roundup(rect->left, [xy]bin * 2); but this is cheaper */
	switch (xbin) {
	case 2:
		left = (left + 3) & ~3;
		break;
	case 3:
		left = roundup(left, 6);
	}

	switch (ybin) {
	case 2:
		top = (top + 3) & ~3;
		break;
	case 3:
		top = roundup(top, 6);
	}

	/* Disable register update, reconfigure atomically */
	ret = reg_set(client, MT9T031_OUTPUT_CONTROL, 1);
	if (ret < 0)
		return ret;

	/* Blanking and start values - default... */
	ret = reg_write(client, MT9T031_HORIZONTAL_BLANKING, hblank);
	if (ret >= 0)
		ret = reg_write(client, MT9T031_VERTICAL_BLANKING, vblank);

	if (yskip != mt9t031->yskip || xskip != mt9t031->xskip) {
		/* Binning, skipping */
		if (ret >= 0)
			ret = reg_write(client, MT9T031_COLUMN_ADDRESS_MODE,
					((xbin - 1) << 4) | (xskip - 1));
		if (ret >= 0)
			ret = reg_write(client, MT9T031_ROW_ADDRESS_MODE,
					((ybin - 1) << 4) | (yskip - 1));
	}
	v4l2_dbg(1, debug, sd, "new physical left %u, top %u\n", left, top);

	/* The caller provides a supported format, as guaranteed by
	 * icd->try_fmt_cap(), soc_camera_s_crop() and soc_camera_cropcap() */
	if (ret >= 0)
		ret = reg_write(client, MT9T031_COLUMN_START, left);
	if (ret >= 0)
		ret = reg_write(client, MT9T031_ROW_START, top);
	if (ret >= 0)
		ret = reg_write(client, MT9T031_WINDOW_WIDTH, width - 1);
	if (ret >= 0)
		ret = reg_write(client, MT9T031_WINDOW_HEIGHT,
				height + mt9t031->y_skip_top - 1);
	if (ret >= 0 && mt9t031->autoexposure) {
		ret = set_shutter(sd, height + mt9t031->y_skip_top + vblank);
		if (ret >= 0) {
			const u32 shutter_max = MT9T031_MAX_HEIGHT + vblank;
			const struct v4l2_queryctrl *qctrl =
				mt9t031_find_qctrl(V4L2_CID_EXPOSURE);
			mt9t031->exposure = (shutter_max / 2 + (height +
					 mt9t031->y_skip_top + vblank - 1) *
					 (qctrl->maximum - qctrl->minimum)) /
				shutter_max + qctrl->minimum;
		}
	}

	/* Re-enable register update, commit all changes */
	if (ret >= 0) {
		ret = reg_clear(client, MT9T031_OUTPUT_CONTROL, 1);
		/* update the values */
		mt9t031->width	= rect->width;
		mt9t031->height	= rect->height;
		mt9t031->x_current = rect->left;
		mt9t031->y_current = rect->top;
	}
	return ret < 0 ? ret : 0;
}

static int mt9t031_set_fmt(struct v4l2_subdev *sd,
			   struct v4l2_format *f)
{
	struct mt9t031 *mt9t031 = to_mt9t031(sd);
	int ret;
	u16 xskip, yskip;
	struct v4l2_rect rect = {
		.left	= mt9t031->x_current,
		.top	= mt9t031->y_current,
		.width	= f->fmt.pix.width,
		.height	= f->fmt.pix.height,
	};

	/*
	 * try_fmt has put rectangle within limits.
	 * S_FMT - use binning and skipping for scaling, recalculate
	 * limits, used for cropping
	 */
	/* Is this more optimal than just a division? */
	for (xskip = 8; xskip > 1; xskip--)
		if (rect.width * xskip <= MT9T031_MAX_WIDTH)
			break;

	for (yskip = 8; yskip > 1; yskip--)
		if (rect.height * yskip <= MT9T031_MAX_HEIGHT)
			break;

	recalculate_limits(mt9t031, xskip, yskip);

	ret = mt9t031_set_params(sd, &rect, xskip, yskip);
	if (!ret) {
		mt9t031->xskip = xskip;
		mt9t031->yskip = yskip;
	}
	return ret;
}

static int mt9t031_try_fmt(struct v4l2_subdev *sd,
			   struct v4l2_format *f)
{
	struct v4l2_pix_format *pix = &f->fmt.pix;

	if (pix->height < MT9T031_MIN_HEIGHT)
		pix->height = MT9T031_MIN_HEIGHT;
	if (pix->height > MT9T031_MAX_HEIGHT)
		pix->height = MT9T031_MAX_HEIGHT;
	if (pix->width < MT9T031_MIN_WIDTH)
		pix->width = MT9T031_MIN_WIDTH;
	if (pix->width > MT9T031_MAX_WIDTH)
		pix->width = MT9T031_MAX_WIDTH;

	pix->width &= ~0x01; /* has to be even */
	pix->height &= ~0x01; /* has to be even */
	return 0;
}

static int mt9t031_get_chip_id(struct v4l2_subdev *sd,
			       struct v4l2_dbg_chip_ident *id)
{
	struct mt9t031 *mt9t031 = to_mt9t031(sd);
	struct i2c_client *client = v4l2_get_subdevdata(sd);

	if (id->match.type != V4L2_CHIP_MATCH_I2C_ADDR)
		return -EINVAL;

	if (id->match.addr != client->addr)
		return -ENODEV;

	id->ident	= mt9t031->model;
	id->revision	= 0;

	return 0;
}

#ifdef CONFIG_VIDEO_ADV_DEBUG
static int mt9t031_get_register(struct v4l2_subdev *sd,
				struct v4l2_dbg_register *reg)
{
	struct i2c_client *client = v4l2_get_subdevdata(sd);

	if (reg->match.type != V4L2_CHIP_MATCH_I2C_ADDR || reg->reg > 0xff)
		return -EINVAL;

	if (reg->match.addr != client->addr)
		return -ENODEV;

	reg->val = reg_read(client, reg->reg);

	if (reg->val > 0xffff)
		return -EIO;

	return 0;
}

static int mt9t031_set_register(struct v4l2_subdev *sd,
				struct v4l2_dbg_register *reg)
{
	struct i2c_client *client = v4l2_get_subdevdata(sd);

	if (reg->match.type != V4L2_CHIP_MATCH_I2C_ADDR || reg->reg > 0xff)
		return -EINVAL;

	if (reg->match.addr != client->addr)
		return -ENODEV;

	if (reg_write(client, reg->reg, reg->val) < 0)
		return -EIO;

	return 0;
}
#endif


static int mt9t031_get_control(struct v4l2_subdev *, struct v4l2_control *);
static int mt9t031_set_control(struct v4l2_subdev *, struct v4l2_control *);
static int mt9t031_queryctrl(struct v4l2_subdev *, struct v4l2_queryctrl *);

static const struct v4l2_subdev_core_ops mt9t031_core_ops = {
	.g_chip_ident = mt9t031_get_chip_id,
	.init = mt9t031_init,
	.queryctrl = mt9t031_queryctrl,
	.g_ctrl	= mt9t031_get_control,
	.s_ctrl	= mt9t031_set_control,
#ifdef CONFIG_VIDEO_ADV_DEBUG
	.get_register = mt9t031_get_register,
	.set_register = mt9t031_set_register,
#endif
};

static const struct v4l2_subdev_video_ops mt9t031_video_ops = {
	.s_fmt = mt9t031_set_fmt,
	.try_fmt = mt9t031_try_fmt,
	.s_stream = mt9t031_s_stream,
};

static const struct v4l2_subdev_ops mt9t031_ops = {
	.core = &mt9t031_core_ops,
	.video = &mt9t031_video_ops,
};

static int mt9t031_queryctrl(struct v4l2_subdev *sd,
			    struct v4l2_queryctrl *qctrl)
{
	const struct v4l2_queryctrl *temp_qctrl;

	temp_qctrl = mt9t031_find_qctrl(qctrl->id);
	if (!temp_qctrl) {
		v4l2_err(sd, "control id %d not supported\n", qctrl->id);
		return -EINVAL;
	}
	memcpy(qctrl, temp_qctrl, sizeof(*qctrl));
	return 0;
}

static int mt9t031_get_control(struct v4l2_subdev *sd,
			       struct v4l2_control *ctrl)
{
	struct i2c_client *client = v4l2_get_subdevdata(sd);
	struct mt9t031 *mt9t031 = to_mt9t031(sd);
	int data;

	switch (ctrl->id) {
	case V4L2_CID_VFLIP:
		data = reg_read(client, MT9T031_READ_MODE_2);
		if (data < 0)
			return -EIO;
		ctrl->value = !!(data & 0x8000);
		break;
	case V4L2_CID_HFLIP:
		data = reg_read(client, MT9T031_READ_MODE_2);
		if (data < 0)
			return -EIO;
		ctrl->value = !!(data & 0x4000);
		break;
	case V4L2_CID_EXPOSURE_AUTO:
		ctrl->value = mt9t031->autoexposure;
		break;
	}
	return 0;
}

static int mt9t031_set_control(struct v4l2_subdev *sd,
			       struct v4l2_control *ctrl)
{
	struct mt9t031 *mt9t031 = to_mt9t031(sd);
	const struct v4l2_queryctrl *qctrl = NULL;
	int data;
	struct i2c_client *client = v4l2_get_subdevdata(sd);

	if (NULL == ctrl)
		return -EINVAL;

	qctrl = mt9t031_find_qctrl(ctrl->id);
	if (!qctrl) {
		v4l2_err(sd, "control id %d not supported\n", ctrl->id);
		return -EINVAL;
	}

	switch (ctrl->id) {
	case V4L2_CID_VFLIP:
		if (ctrl->value)
			data = reg_set(client, MT9T031_READ_MODE_2, 0x8000);
		else
			data = reg_clear(client, MT9T031_READ_MODE_2, 0x8000);
		if (data < 0)
			return -EIO;
		break;
	case V4L2_CID_HFLIP:
		if (ctrl->value)
			data = reg_set(client, MT9T031_READ_MODE_2, 0x4000);
		else
			data = reg_clear(client, MT9T031_READ_MODE_2, 0x4000);
		if (data < 0)
			return -EIO;
		break;
	case V4L2_CID_GAIN:
		if (ctrl->value > qctrl->maximum || ctrl->value < qctrl->minimum)
			return -EINVAL;
		/* See Datasheet Table 7, Gain settings. */
		if (ctrl->value <= qctrl->default_value) {
			/* Pack it into 0..1 step 0.125, register values 0..8 */
			unsigned long range = qctrl->default_value - qctrl->minimum;
			data = ((ctrl->value - qctrl->minimum) * 8 + range / 2) / range;

			v4l2_dbg(1, debug, sd, "Setting gain %d\n", data);
			data = reg_write(client, MT9T031_GLOBAL_GAIN, data);
			if (data < 0)
				return -EIO;
		} else {
			/* Pack it into 1.125..128 variable step, register values 9..0x7860 */
			/* We assume qctrl->maximum - qctrl->default_value - 1 > 0 */
			unsigned long range = qctrl->maximum - qctrl->default_value - 1;
			/* calculated gain: map 65..127 to 9..1024 step 0.125 */
			unsigned long gain = ((ctrl->value - qctrl->default_value - 1) *
					       1015 + range / 2) / range + 9;

			if (gain <= 32)
				/* calculated gain 9..32 -> 9..32 */
				data = gain;
			else if (gain <= 64)
				/* calculated gain 33..64 -> 0x51..0x60 */
				data = ((gain - 32) * 16 + 16) / 32 + 80;
			else
				/*
				 * calculated gain 65..1024 -> (1..120) << 8 +
				 * 0x60
				 */
				data = (((gain - 64 + 7) * 32) & 0xff00) | 0x60;

			v4l2_dbg(1, debug, sd,
				 "Setting gain from 0x%x to 0x%x\n",
				 reg_read(client, MT9T031_GLOBAL_GAIN), data);

			data = reg_write(client, MT9T031_GLOBAL_GAIN, data);
			if (data < 0)
				return -EIO;
		}

		/* Success */
		mt9t031->gain = ctrl->value;
		break;
	case V4L2_CID_EXPOSURE:
		/* mt9t031 has maximum == default */
		if (ctrl->value > qctrl->maximum ||
		    ctrl->value < qctrl->minimum)
			return -EINVAL;
		else {
			const unsigned long range =
				qctrl->maximum - qctrl->minimum;
			const u32 shutter =
				((ctrl->value - qctrl->minimum) * 1048 +
					range / 2) / range + 1;
			u32 old;

			get_shutter(sd, &old);
			v4l2_dbg(1, debug, sd,
				"Setting shutter width from %u to %u\n",
				old, shutter);
			if (set_shutter(sd, shutter) < 0)
				return -EIO;
			mt9t031->exposure = ctrl->value;
			mt9t031->autoexposure = 0;
		}
		break;
	case V4L2_CID_EXPOSURE_AUTO:
		if (ctrl->value) {
			const u16 vblank = MT9T031_VERTICAL_BLANK;
			const u32 shutter_max = MT9T031_MAX_HEIGHT + vblank;
			if (set_shutter(sd, mt9t031->height +
					mt9t031->y_skip_top + vblank) < 0)
				return -EIO;

			qctrl = mt9t031_find_qctrl(V4L2_CID_EXPOSURE);
			mt9t031->exposure =
				(shutter_max / 2 + (mt9t031->height +
				mt9t031->y_skip_top + vblank - 1) *
				(qctrl->maximum - qctrl->minimum)) /
				shutter_max + qctrl->minimum;
			mt9t031->autoexposure = 1;
		} else
			mt9t031->autoexposure = 0;
		break;
	}
	return 0;
}

/* Interface active, can use i2c. If it fails, it can indeed mean, that
 * this wasn't our capture interface, so, we wait for the right one */
static int mt9t031_detect(struct i2c_client *client, int *model)
{
	s32 data;

	/* Enable the chip */
	data = reg_write(client, MT9T031_CHIP_ENABLE, 1);
	dev_dbg(&client->dev, "write: %d\n", data);

	/* Read out the chip version register */
	data = reg_read(client, MT9T031_CHIP_VERSION);

	switch (data) {
	case 0x1621:
		*model = V4L2_IDENT_MT9T031;
		break;
	default:
		dev_err(&client->dev,
			"No MT9T031 chip detected, register read %x\n", data);
		return -ENODEV;
	}

	dev_info(&client->dev, "Detected a MT9T031 chip ID %x\n", data);
	return 0;
}

static int mt9t031_probe(struct i2c_client *client,
			 const struct i2c_device_id *did)
{
	struct mt9t031 *mt9t031;
	struct v4l2_subdev *sd;
	int pclk_pol;
	int ret;

	if (!i2c_check_functionality(client->adapter,
				     I2C_FUNC_SMBUS_WORD_DATA)) {
		dev_warn(&client->dev,
			 "I2C adapter doesn't support I2C_FUNC_SMBUS_WORD_DATA\n");
		return -EIO;
	}

	if (!client->dev.platform_data) {
		dev_err(&client->dev, "No platform data\n");
		return -ENODEV;
	}

	pclk_pol = (int)(unsigned long)client->dev.platform_data;

	mt9t031 = kzalloc(sizeof(struct mt9t031), GFP_KERNEL);
	if (!mt9t031)
		return -ENOMEM;

	ret = mt9t031_detect(client, &mt9t031->model);
	if (ret)
		goto clean;

	mt9t031->x_min		= MT9T031_COLUMN_SKIP;
	mt9t031->y_min		= MT9T031_ROW_SKIP;
	mt9t031->width		= MT9T031_DEFAULT_WIDTH;
	mt9t031->height		= MT9T031_DEFAULT_HEIGHT;
	mt9t031->x_current	= mt9t031->x_min;
	mt9t031->y_current	= mt9t031->y_min;
	mt9t031->width_min	= MT9T031_MIN_WIDTH;
	mt9t031->width_max	= MT9T031_MAX_WIDTH;
	mt9t031->height_min	= MT9T031_MIN_HEIGHT;
	mt9t031->height_max	= MT9T031_MAX_HEIGHT;
	mt9t031->y_skip_top	= 10;
	mt9t031->autoexposure = 1;
	mt9t031->xskip = 1;
	mt9t031->yskip = 1;

	/* Register with V4L2 layer as slave device */
	sd = &mt9t031->sd;
	v4l2_i2c_subdev_init(sd, client, &mt9t031_ops);
	if (!pclk_pol)
		reg_clear(v4l2_get_subdevdata(sd),
			  MT9T031_PIXEL_CLOCK_CONTROL, 0x8000);
	else
		reg_set(v4l2_get_subdevdata(sd),
			MT9T031_PIXEL_CLOCK_CONTROL, 0x8000);

	v4l2_info(sd, "%s sensor driver registered\n", sd->name);
	return 0;

clean:
	kfree(mt9t031);
	return ret;
}

static int mt9t031_remove(struct i2c_client *client)
{
	struct v4l2_subdev *sd = i2c_get_clientdata(client);
	struct mt9t031 *mt9t031 = to_mt9t031(sd);

	v4l2_device_unregister_subdev(sd);

	kfree(mt9t031);
	return 0;
}

static const struct i2c_device_id mt9t031_id[] = {
	{ "mt9t031", 0 },
	{ }
};
MODULE_DEVICE_TABLE(i2c, mt9t031_id);

static struct i2c_driver mt9t031_i2c_driver = {
	.driver = {
		.name = "mt9t031",
	},
	.probe		= mt9t031_probe,
	.remove		= mt9t031_remove,
	.id_table	= mt9t031_id,
};

static int __init mt9t031_mod_init(void)
{
	return i2c_add_driver(&mt9t031_i2c_driver);
}

static void __exit mt9t031_mod_exit(void)
{
	i2c_del_driver(&mt9t031_i2c_driver);
}

module_init(mt9t031_mod_init);
module_exit(mt9t031_mod_exit);

MODULE_DESCRIPTION("Micron MT9T031 Camera driver");
MODULE_AUTHOR("Guennadi Liakhovetski <lg@denx.de>");
MODULE_LICENSE("GPL v2");


* [PATCH/RFC 8a/9 v2] soc-camera: convert to the new imagebus API
  2009-10-30 14:01 ` [PATCH/RFC 8/9 v2] soc-camera: convert to the new imagebus API Guennadi Liakhovetski
@ 2009-10-30 18:31   ` Guennadi Liakhovetski
  2009-10-30 18:34   ` [PATCH/RFC 8b/9 v3] rj54n1cb0c: Add cropping, auto white balance, restrict sizes, add platform data Guennadi Liakhovetski
  1 sibling, 0 replies; 51+ messages in thread
From: Guennadi Liakhovetski @ 2009-10-30 18:31 UTC (permalink / raw)
  To: Linux Media Mailing List
  Cc: Hans Verkuil, Laurent Pinchart, Sakari Ailus, Muralidharan Karicheri

Signed-off-by: Guennadi Liakhovetski <g.liakhovetski@gmx.de>
---

I did split the 8/9 patch into two - conversion and rj54n1cb0c 
enhancements. This is part "a".

 arch/sh/boards/board-ap325rxa.c            |    4 +-
 drivers/media/video/mt9m001.c              |  117 +++++-----
 drivers/media/video/mt9m111.c              |  139 ++++++------
 drivers/media/video/mt9t031.c              |   67 +++---
 drivers/media/video/mt9v022.c              |  126 +++++------
 drivers/media/video/mx1_camera.c           |   77 +++++--
 drivers/media/video/mx3_camera.c           |  270 +++++++++++++----------
 drivers/media/video/ov772x.c               |  177 ++++++---------
 drivers/media/video/ov9640.c               |   62 +++---
 drivers/media/video/pxa_camera.c           |  265 +++++++++++-----------
 drivers/media/video/rj54n1cb0c.c           |  174 ++++++++++-----
 drivers/media/video/sh_mobile_ceu_camera.c |  335 +++++++++++++++------------
 drivers/media/video/soc_camera.c           |   72 +++---
 drivers/media/video/soc_camera_platform.c  |   37 ++--
 drivers/media/video/tw9910.c               |   91 ++++----
 include/media/soc_camera.h                 |   24 +--
 include/media/soc_camera_platform.h        |    2 +-
 17 files changed, 1092 insertions(+), 947 deletions(-)

diff --git a/arch/sh/boards/board-ap325rxa.c b/arch/sh/boards/board-ap325rxa.c
index a3afe43..30fa8b8 100644
--- a/arch/sh/boards/board-ap325rxa.c
+++ b/arch/sh/boards/board-ap325rxa.c
@@ -317,8 +317,8 @@ static struct soc_camera_platform_info camera_info = {
 	.format_name = "UYVY",
 	.format_depth = 16,
 	.format = {
-		.pixelformat = V4L2_PIX_FMT_UYVY,
-		.colorspace = V4L2_COLORSPACE_SMPTE170M,
+		.code = V4L2_IMGBUS_FMT_UYVY,
+		.field = V4L2_FIELD_NONE,
 		.width = 640,
 		.height = 480,
 	},
diff --git a/drivers/media/video/mt9m001.c b/drivers/media/video/mt9m001.c
index cc90660..e1c8578 100644
--- a/drivers/media/video/mt9m001.c
+++ b/drivers/media/video/mt9m001.c
@@ -48,41 +48,27 @@
 #define MT9M001_COLUMN_SKIP		20
 #define MT9M001_ROW_SKIP		12
 
-static const struct soc_camera_data_format mt9m001_colour_formats[] = {
+static const enum v4l2_imgbus_pixelcode mt9m001_colour_codes[] = {
 	/*
 	 * Order important: first natively supported,
 	 * second supported with a GPIO extender
 	 */
-	{
-		.name		= "Bayer (sRGB) 10 bit",
-		.depth		= 10,
-		.fourcc		= V4L2_PIX_FMT_SBGGR16,
-		.colorspace	= V4L2_COLORSPACE_SRGB,
-	}, {
-		.name		= "Bayer (sRGB) 8 bit",
-		.depth		= 8,
-		.fourcc		= V4L2_PIX_FMT_SBGGR8,
-		.colorspace	= V4L2_COLORSPACE_SRGB,
-	}
+	V4L2_IMGBUS_FMT_SBGGR10,
+	V4L2_IMGBUS_FMT_SBGGR8,
 };
 
-static const struct soc_camera_data_format mt9m001_monochrome_formats[] = {
+static const enum v4l2_imgbus_pixelcode mt9m001_monochrome_codes[] = {
 	/* Order important - see above */
-	{
-		.name		= "Monochrome 10 bit",
-		.depth		= 10,
-		.fourcc		= V4L2_PIX_FMT_Y16,
-	}, {
-		.name		= "Monochrome 8 bit",
-		.depth		= 8,
-		.fourcc		= V4L2_PIX_FMT_GREY,
-	},
+	V4L2_IMGBUS_FMT_Y10,
+	V4L2_IMGBUS_FMT_GREY,
 };
 
 struct mt9m001 {
 	struct v4l2_subdev subdev;
 	struct v4l2_rect rect;	/* Sensor window */
-	__u32 fourcc;
+	enum v4l2_imgbus_pixelcode code;
+	const enum v4l2_imgbus_pixelcode *codes;
+	int num_codes;
 	int model;	/* V4L2_IDENT_MT9M001* codes from v4l2-chip-ident.h */
 	unsigned int gain;
 	unsigned int exposure;
@@ -209,8 +195,7 @@ static int mt9m001_s_crop(struct v4l2_subdev *sd, struct v4l2_crop *a)
 	const u16 hblank = 9, vblank = 25;
 	unsigned int total_h;
 
-	if (mt9m001->fourcc == V4L2_PIX_FMT_SBGGR8 ||
-	    mt9m001->fourcc == V4L2_PIX_FMT_SBGGR16)
+	if (mt9m001->codes == mt9m001_colour_codes)
 		/*
 		 * Bayer format - even number of rows for simplicity,
 		 * but let the user play with the top row.
@@ -290,32 +275,31 @@ static int mt9m001_cropcap(struct v4l2_subdev *sd, struct v4l2_cropcap *a)
 	return 0;
 }
 
-static int mt9m001_g_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
+static int mt9m001_g_fmt(struct v4l2_subdev *sd,
+			 struct v4l2_imgbus_framefmt *imgf)
 {
 	struct i2c_client *client = sd->priv;
 	struct mt9m001 *mt9m001 = to_mt9m001(client);
-	struct v4l2_pix_format *pix = &f->fmt.pix;
 
-	pix->width		= mt9m001->rect.width;
-	pix->height		= mt9m001->rect.height;
-	pix->pixelformat	= mt9m001->fourcc;
-	pix->field		= V4L2_FIELD_NONE;
-	pix->colorspace		= V4L2_COLORSPACE_SRGB;
+	imgf->width	= mt9m001->rect.width;
+	imgf->height	= mt9m001->rect.height;
+	imgf->code	= mt9m001->code;
+	imgf->field	= V4L2_FIELD_NONE;
 
 	return 0;
 }
 
-static int mt9m001_s_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
+static int mt9m001_s_fmt(struct v4l2_subdev *sd,
+			 struct v4l2_imgbus_framefmt *imgf)
 {
 	struct i2c_client *client = sd->priv;
 	struct mt9m001 *mt9m001 = to_mt9m001(client);
-	struct v4l2_pix_format *pix = &f->fmt.pix;
 	struct v4l2_crop a = {
 		.c = {
 			.left	= mt9m001->rect.left,
 			.top	= mt9m001->rect.top,
-			.width	= pix->width,
-			.height	= pix->height,
+			.width	= imgf->width,
+			.height	= imgf->height,
 		},
 	};
 	int ret;
@@ -323,28 +307,27 @@ static int mt9m001_s_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
 	/* No support for scaling so far, just crop. TODO: use skipping */
 	ret = mt9m001_s_crop(sd, &a);
 	if (!ret) {
-		pix->width = mt9m001->rect.width;
-		pix->height = mt9m001->rect.height;
-		mt9m001->fourcc = pix->pixelformat;
+		imgf->width	= mt9m001->rect.width;
+		imgf->height	= mt9m001->rect.height;
+		mt9m001->code	= imgf->code;
 	}
 
 	return ret;
 }
 
-static int mt9m001_try_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
+static int mt9m001_try_fmt(struct v4l2_subdev *sd,
+			   struct v4l2_imgbus_framefmt *imgf)
 {
 	struct i2c_client *client = sd->priv;
 	struct mt9m001 *mt9m001 = to_mt9m001(client);
-	struct v4l2_pix_format *pix = &f->fmt.pix;
 
-	v4l_bound_align_image(&pix->width, MT9M001_MIN_WIDTH,
+	v4l_bound_align_image(&imgf->width, MT9M001_MIN_WIDTH,
 		MT9M001_MAX_WIDTH, 1,
-		&pix->height, MT9M001_MIN_HEIGHT + mt9m001->y_skip_top,
+		&imgf->height, MT9M001_MIN_HEIGHT + mt9m001->y_skip_top,
 		MT9M001_MAX_HEIGHT + mt9m001->y_skip_top, 0, 0);
 
-	if (pix->pixelformat == V4L2_PIX_FMT_SBGGR8 ||
-	    pix->pixelformat == V4L2_PIX_FMT_SBGGR16)
-		pix->height = ALIGN(pix->height - 1, 2);
+	if (mt9m001->codes == mt9m001_colour_codes)
+		imgf->height = ALIGN(imgf->height - 1, 2);
 
 	return 0;
 }
@@ -608,11 +591,11 @@ static int mt9m001_video_probe(struct soc_camera_device *icd,
 	case 0x8411:
 	case 0x8421:
 		mt9m001->model = V4L2_IDENT_MT9M001C12ST;
-		icd->formats = mt9m001_colour_formats;
+		mt9m001->codes = mt9m001_colour_codes;
 		break;
 	case 0x8431:
 		mt9m001->model = V4L2_IDENT_MT9M001C12STM;
-		icd->formats = mt9m001_monochrome_formats;
+		mt9m001->codes = mt9m001_monochrome_codes;
 		break;
 	default:
 		dev_err(&client->dev,
@@ -620,7 +603,7 @@ static int mt9m001_video_probe(struct soc_camera_device *icd,
 		return -ENODEV;
 	}
 
-	icd->num_formats = 0;
+	mt9m001->num_codes = 0;
 
 	/*
 	 * This is a 10bit sensor, so by default we only allow 10bit.
@@ -633,14 +616,14 @@ static int mt9m001_video_probe(struct soc_camera_device *icd,
 		flags = SOCAM_DATAWIDTH_10;
 
 	if (flags & SOCAM_DATAWIDTH_10)
-		icd->num_formats++;
+		mt9m001->num_codes++;
 	else
-		icd->formats++;
+		mt9m001->codes++;
 
 	if (flags & SOCAM_DATAWIDTH_8)
-		icd->num_formats++;
+		mt9m001->num_codes++;
 
-	mt9m001->fourcc = icd->formats->fourcc;
+	mt9m001->code = mt9m001->codes[0];
 
 	dev_info(&client->dev, "Detected a MT9M001 chip ID %x (%s)\n", data,
 		 data == 0x8431 ? "C12STM" : "C12ST");
@@ -686,14 +669,28 @@ static struct v4l2_subdev_core_ops mt9m001_subdev_core_ops = {
 #endif
 };
 
+static int mt9m001_enum_fmt(struct v4l2_subdev *sd, int index,
+			    enum v4l2_imgbus_pixelcode *code)
+{
+	struct i2c_client *client = sd->priv;
+	struct mt9m001 *mt9m001 = to_mt9m001(client);
+
+	if ((unsigned int)index >= mt9m001->num_codes)
+		return -EINVAL;
+
+	*code = mt9m001->codes[index];
+	return 0;
+}
+
 static struct v4l2_subdev_video_ops mt9m001_subdev_video_ops = {
-	.s_stream	= mt9m001_s_stream,
-	.s_fmt		= mt9m001_s_fmt,
-	.g_fmt		= mt9m001_g_fmt,
-	.try_fmt	= mt9m001_try_fmt,
-	.s_crop		= mt9m001_s_crop,
-	.g_crop		= mt9m001_g_crop,
-	.cropcap	= mt9m001_cropcap,
+	.s_stream		= mt9m001_s_stream,
+	.s_imgbus_fmt		= mt9m001_s_fmt,
+	.g_imgbus_fmt		= mt9m001_g_fmt,
+	.try_imgbus_fmt		= mt9m001_try_fmt,
+	.s_crop			= mt9m001_s_crop,
+	.g_crop			= mt9m001_g_crop,
+	.cropcap		= mt9m001_cropcap,
+	.enum_imgbus_fmt	= mt9m001_enum_fmt,
 };
 
 static struct v4l2_subdev_sensor_ops mt9m001_subdev_sensor_ops = {
diff --git a/drivers/media/video/mt9m111.c b/drivers/media/video/mt9m111.c
index 30db625..b5147e8 100644
--- a/drivers/media/video/mt9m111.c
+++ b/drivers/media/video/mt9m111.c
@@ -131,15 +131,15 @@
 #define JPG_FMT(_name, _depth, _fourcc) \
 	COL_FMT(_name, _depth, _fourcc, V4L2_COLORSPACE_JPEG)
 
-static const struct soc_camera_data_format mt9m111_colour_formats[] = {
-	JPG_FMT("CbYCrY 16 bit", 16, V4L2_PIX_FMT_UYVY),
-	JPG_FMT("CrYCbY 16 bit", 16, V4L2_PIX_FMT_VYUY),
-	JPG_FMT("YCbYCr 16 bit", 16, V4L2_PIX_FMT_YUYV),
-	JPG_FMT("YCrYCb 16 bit", 16, V4L2_PIX_FMT_YVYU),
-	RGB_FMT("RGB 565", 16, V4L2_PIX_FMT_RGB565),
-	RGB_FMT("RGB 555", 16, V4L2_PIX_FMT_RGB555),
-	RGB_FMT("Bayer (sRGB) 10 bit", 10, V4L2_PIX_FMT_SBGGR16),
-	RGB_FMT("Bayer (sRGB) 8 bit", 8, V4L2_PIX_FMT_SBGGR8),
+static const enum v4l2_imgbus_pixelcode mt9m111_colour_codes[] = {
+	V4L2_IMGBUS_FMT_UYVY,
+	V4L2_IMGBUS_FMT_VYUY,
+	V4L2_IMGBUS_FMT_YUYV,
+	V4L2_IMGBUS_FMT_YVYU,
+	V4L2_IMGBUS_FMT_RGB565,
+	V4L2_IMGBUS_FMT_RGB555,
+	V4L2_IMGBUS_FMT_SBGGR10_2X8_PADHI_LE,
+	V4L2_IMGBUS_FMT_SBGGR8,
 };
 
 enum mt9m111_context {
@@ -152,7 +152,7 @@ struct mt9m111 {
 	int model;	/* V4L2_IDENT_MT9M11x* codes from v4l2-chip-ident.h */
 	enum mt9m111_context context;
 	struct v4l2_rect rect;
-	u32 pixfmt;
+	enum v4l2_imgbus_pixelcode code;
 	unsigned int gain;
 	unsigned char autoexposure;
 	unsigned char datawidth;
@@ -258,8 +258,8 @@ static int mt9m111_setup_rect(struct i2c_client *client,
 	int width = rect->width;
 	int height = rect->height;
 
-	if (mt9m111->pixfmt == V4L2_PIX_FMT_SBGGR8 ||
-	    mt9m111->pixfmt == V4L2_PIX_FMT_SBGGR16)
+	if (mt9m111->code == V4L2_IMGBUS_FMT_SBGGR8 ||
+	    mt9m111->code == V4L2_IMGBUS_FMT_SBGGR10_2X8_PADHI_LE)
 		is_raw_format = 1;
 	else
 		is_raw_format = 0;
@@ -307,7 +307,8 @@ static int mt9m111_setup_pixfmt(struct i2c_client *client, u16 outfmt)
 
 static int mt9m111_setfmt_bayer8(struct i2c_client *client)
 {
-	return mt9m111_setup_pixfmt(client, MT9M111_OUTFMT_PROCESSED_BAYER);
+	return mt9m111_setup_pixfmt(client, MT9M111_OUTFMT_PROCESSED_BAYER |
+				    MT9M111_OUTFMT_RGB);
 }
 
 static int mt9m111_setfmt_bayer10(struct i2c_client *client)
@@ -401,8 +402,8 @@ static int mt9m111_make_rect(struct i2c_client *client,
 {
 	struct mt9m111 *mt9m111 = to_mt9m111(client);
 
-	if (mt9m111->pixfmt == V4L2_PIX_FMT_SBGGR8 ||
-	    mt9m111->pixfmt == V4L2_PIX_FMT_SBGGR16) {
+	if (mt9m111->code == V4L2_IMGBUS_FMT_SBGGR8 ||
+	    mt9m111->code == V4L2_IMGBUS_FMT_SBGGR10_2X8_PADHI_LE) {
 		/* Bayer format - even size lengths */
 		rect->width	= ALIGN(rect->width, 2);
 		rect->height	= ALIGN(rect->height, 2);
@@ -460,120 +461,120 @@ static int mt9m111_cropcap(struct v4l2_subdev *sd, struct v4l2_cropcap *a)
 	return 0;
 }
 
-static int mt9m111_g_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
+static int mt9m111_g_fmt(struct v4l2_subdev *sd,
+			 struct v4l2_imgbus_framefmt *imgf)
 {
 	struct i2c_client *client = sd->priv;
 	struct mt9m111 *mt9m111 = to_mt9m111(client);
-	struct v4l2_pix_format *pix = &f->fmt.pix;
 
-	pix->width		= mt9m111->rect.width;
-	pix->height		= mt9m111->rect.height;
-	pix->pixelformat	= mt9m111->pixfmt;
-	pix->field		= V4L2_FIELD_NONE;
-	pix->colorspace		= V4L2_COLORSPACE_SRGB;
+	imgf->width	= mt9m111->rect.width;
+	imgf->height	= mt9m111->rect.height;
+	imgf->code	= mt9m111->code;
+	imgf->field	= V4L2_FIELD_NONE;
 
 	return 0;
 }
 
-static int mt9m111_set_pixfmt(struct i2c_client *client, u32 pixfmt)
+static int mt9m111_set_pixfmt(struct i2c_client *client,
+			      enum v4l2_imgbus_pixelcode code)
 {
 	struct mt9m111 *mt9m111 = to_mt9m111(client);
 	int ret;
 
-	switch (pixfmt) {
-	case V4L2_PIX_FMT_SBGGR8:
+	switch (code) {
+	case V4L2_IMGBUS_FMT_SBGGR8:
 		ret = mt9m111_setfmt_bayer8(client);
 		break;
-	case V4L2_PIX_FMT_SBGGR16:
+	case V4L2_IMGBUS_FMT_SBGGR10_2X8_PADHI_LE:
 		ret = mt9m111_setfmt_bayer10(client);
 		break;
-	case V4L2_PIX_FMT_RGB555:
+	case V4L2_IMGBUS_FMT_RGB555:
 		ret = mt9m111_setfmt_rgb555(client);
 		break;
-	case V4L2_PIX_FMT_RGB565:
+	case V4L2_IMGBUS_FMT_RGB565:
 		ret = mt9m111_setfmt_rgb565(client);
 		break;
-	case V4L2_PIX_FMT_UYVY:
+	case V4L2_IMGBUS_FMT_UYVY:
 		mt9m111->swap_yuv_y_chromas = 0;
 		mt9m111->swap_yuv_cb_cr = 0;
 		ret = mt9m111_setfmt_yuv(client);
 		break;
-	case V4L2_PIX_FMT_VYUY:
+	case V4L2_IMGBUS_FMT_VYUY:
 		mt9m111->swap_yuv_y_chromas = 0;
 		mt9m111->swap_yuv_cb_cr = 1;
 		ret = mt9m111_setfmt_yuv(client);
 		break;
-	case V4L2_PIX_FMT_YUYV:
+	case V4L2_IMGBUS_FMT_YUYV:
 		mt9m111->swap_yuv_y_chromas = 1;
 		mt9m111->swap_yuv_cb_cr = 0;
 		ret = mt9m111_setfmt_yuv(client);
 		break;
-	case V4L2_PIX_FMT_YVYU:
+	case V4L2_IMGBUS_FMT_YVYU:
 		mt9m111->swap_yuv_y_chromas = 1;
 		mt9m111->swap_yuv_cb_cr = 1;
 		ret = mt9m111_setfmt_yuv(client);
 		break;
 	default:
 		dev_err(&client->dev, "Pixel format not handled: %x\n",
-			pixfmt);
+			code);
 		ret = -EINVAL;
 	}
 
 	if (!ret)
-		mt9m111->pixfmt = pixfmt;
+		mt9m111->code = code;
 
 	return ret;
 }
 
-static int mt9m111_s_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
+static int mt9m111_s_fmt(struct v4l2_subdev *sd,
+			 struct v4l2_imgbus_framefmt *imgf)
 {
 	struct i2c_client *client = sd->priv;
 	struct mt9m111 *mt9m111 = to_mt9m111(client);
-	struct v4l2_pix_format *pix = &f->fmt.pix;
 	struct v4l2_rect rect = {
 		.left	= mt9m111->rect.left,
 		.top	= mt9m111->rect.top,
-		.width	= pix->width,
-		.height	= pix->height,
+		.width	= imgf->width,
+		.height	= imgf->height,
 	};
 	int ret;
 
 	dev_dbg(&client->dev,
-		"%s fmt=%x left=%d, top=%d, width=%d, height=%d\n", __func__,
-		pix->pixelformat, rect.left, rect.top, rect.width, rect.height);
+		"%s code=%x left=%d, top=%d, width=%d, height=%d\n", __func__,
+		imgf->code, rect.left, rect.top, rect.width, rect.height);
 
 	ret = mt9m111_make_rect(client, &rect);
 	if (!ret)
-		ret = mt9m111_set_pixfmt(client, pix->pixelformat);
+		ret = mt9m111_set_pixfmt(client, imgf->code);
 	if (!ret)
 		mt9m111->rect = rect;
 	return ret;
 }
 
-static int mt9m111_try_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
+static int mt9m111_try_fmt(struct v4l2_subdev *sd,
+			   struct v4l2_imgbus_framefmt *imgf)
 {
-	struct v4l2_pix_format *pix = &f->fmt.pix;
-	bool bayer = pix->pixelformat == V4L2_PIX_FMT_SBGGR8 ||
-		pix->pixelformat == V4L2_PIX_FMT_SBGGR16;
+	bool bayer = imgf->code == V4L2_IMGBUS_FMT_SBGGR8 ||
+		imgf->code == V4L2_IMGBUS_FMT_SBGGR10_2X8_PADHI_LE;
 
 	/*
 	 * With Bayer format enforce even side lengths, but let the user play
 	 * with the starting pixel
 	 */
 
-	if (pix->height > MT9M111_MAX_HEIGHT)
-		pix->height = MT9M111_MAX_HEIGHT;
-	else if (pix->height < 2)
-		pix->height = 2;
+	if (imgf->height > MT9M111_MAX_HEIGHT)
+		imgf->height = MT9M111_MAX_HEIGHT;
+	else if (imgf->height < 2)
+		imgf->height = 2;
 	else if (bayer)
-		pix->height = ALIGN(pix->height, 2);
+		imgf->height = ALIGN(imgf->height, 2);
 
-	if (pix->width > MT9M111_MAX_WIDTH)
-		pix->width = MT9M111_MAX_WIDTH;
-	else if (pix->width < 2)
-		pix->width = 2;
+	if (imgf->width > MT9M111_MAX_WIDTH)
+		imgf->width = MT9M111_MAX_WIDTH;
+	else if (imgf->width < 2)
+		imgf->width = 2;
 	else if (bayer)
-		pix->width = ALIGN(pix->width, 2);
+		imgf->width = ALIGN(imgf->width, 2);
 
 	return 0;
 }
@@ -863,7 +864,7 @@ static int mt9m111_restore_state(struct i2c_client *client)
 	struct mt9m111 *mt9m111 = to_mt9m111(client);
 
 	mt9m111_set_context(client, mt9m111->context);
-	mt9m111_set_pixfmt(client, mt9m111->pixfmt);
+	mt9m111_set_pixfmt(client, mt9m111->code);
 	mt9m111_setup_rect(client, &mt9m111->rect);
 	mt9m111_set_flip(client, mt9m111->hflip, MT9M111_RMB_MIRROR_COLS);
 	mt9m111_set_flip(client, mt9m111->vflip, MT9M111_RMB_MIRROR_ROWS);
@@ -952,9 +953,6 @@ static int mt9m111_video_probe(struct soc_camera_device *icd,
 		goto ei2c;
 	}
 
-	icd->formats = mt9m111_colour_formats;
-	icd->num_formats = ARRAY_SIZE(mt9m111_colour_formats);
-
 	dev_info(&client->dev, "Detected a MT9M11x chip ID %x\n", data);
 
 ei2c:
@@ -971,13 +969,24 @@ static struct v4l2_subdev_core_ops mt9m111_subdev_core_ops = {
 #endif
 };
 
+static int mt9m111_enum_fmt(struct v4l2_subdev *sd, int index,
+			    enum v4l2_imgbus_pixelcode *code)
+{
+	if ((unsigned int)index >= ARRAY_SIZE(mt9m111_colour_codes))
+		return -EINVAL;
+
+	*code = mt9m111_colour_codes[index];
+	return 0;
+}
+
 static struct v4l2_subdev_video_ops mt9m111_subdev_video_ops = {
-	.s_fmt		= mt9m111_s_fmt,
-	.g_fmt		= mt9m111_g_fmt,
-	.try_fmt	= mt9m111_try_fmt,
-	.s_crop		= mt9m111_s_crop,
-	.g_crop		= mt9m111_g_crop,
-	.cropcap	= mt9m111_cropcap,
+	.s_imgbus_fmt		= mt9m111_s_fmt,
+	.g_imgbus_fmt		= mt9m111_g_fmt,
+	.try_imgbus_fmt		= mt9m111_try_fmt,
+	.s_crop			= mt9m111_s_crop,
+	.g_crop			= mt9m111_g_crop,
+	.cropcap		= mt9m111_cropcap,
+	.enum_imgbus_fmt	= mt9m111_enum_fmt,
 };
 
 static struct v4l2_subdev_ops mt9m111_subdev_ops = {
diff --git a/drivers/media/video/mt9t031.c b/drivers/media/video/mt9t031.c
index 0d2a8fd..c95c277 100644
--- a/drivers/media/video/mt9t031.c
+++ b/drivers/media/video/mt9t031.c
@@ -60,13 +60,8 @@
 	SOCAM_VSYNC_ACTIVE_HIGH | SOCAM_DATA_ACTIVE_HIGH |	\
 	SOCAM_MASTER | SOCAM_DATAWIDTH_10)
 
-static const struct soc_camera_data_format mt9t031_colour_formats[] = {
-	{
-		.name		= "Bayer (sRGB) 10 bit",
-		.depth		= 10,
-		.fourcc		= V4L2_PIX_FMT_SGRBG10,
-		.colorspace	= V4L2_COLORSPACE_SRGB,
-	}
+static const enum v4l2_imgbus_pixelcode mt9t031_code[] = {
+	V4L2_IMGBUS_FMT_SGRBG10,
 };
 
 struct mt9t031 {
@@ -377,27 +372,26 @@ static int mt9t031_cropcap(struct v4l2_subdev *sd, struct v4l2_cropcap *a)
 	return 0;
 }
 
-static int mt9t031_g_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
+static int mt9t031_g_fmt(struct v4l2_subdev *sd,
+			 struct v4l2_imgbus_framefmt *imgf)
 {
 	struct i2c_client *client = sd->priv;
 	struct mt9t031 *mt9t031 = to_mt9t031(client);
-	struct v4l2_pix_format *pix = &f->fmt.pix;
 
-	pix->width		= mt9t031->rect.width / mt9t031->xskip;
-	pix->height		= mt9t031->rect.height / mt9t031->yskip;
-	pix->pixelformat	= V4L2_PIX_FMT_SGRBG10;
-	pix->field		= V4L2_FIELD_NONE;
-	pix->colorspace		= V4L2_COLORSPACE_SRGB;
+	imgf->width	= mt9t031->rect.width / mt9t031->xskip;
+	imgf->height	= mt9t031->rect.height / mt9t031->yskip;
+	imgf->code	= V4L2_IMGBUS_FMT_SGRBG10;
+	imgf->field	= V4L2_FIELD_NONE;
 
 	return 0;
 }
 
-static int mt9t031_s_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
+static int mt9t031_s_fmt(struct v4l2_subdev *sd,
+			 struct v4l2_imgbus_framefmt *imgf)
 {
 	struct i2c_client *client = sd->priv;
 	struct mt9t031 *mt9t031 = to_mt9t031(client);
 	struct soc_camera_device *icd = client->dev.platform_data;
-	struct v4l2_pix_format *pix = &f->fmt.pix;
 	u16 xskip, yskip;
 	struct v4l2_rect rect = mt9t031->rect;
 
@@ -405,8 +399,8 @@ static int mt9t031_s_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
 	 * try_fmt has put width and height within limits.
 	 * S_FMT: use binning and skipping for scaling
 	 */
-	xskip = mt9t031_skip(&rect.width, pix->width, MT9T031_MAX_WIDTH);
-	yskip = mt9t031_skip(&rect.height, pix->height, MT9T031_MAX_HEIGHT);
+	xskip = mt9t031_skip(&rect.width, imgf->width, MT9T031_MAX_WIDTH);
+	yskip = mt9t031_skip(&rect.height, imgf->height, MT9T031_MAX_HEIGHT);
 
 	/* mt9t031_set_params() doesn't change width and height */
 	return mt9t031_set_params(icd, &rect, xskip, yskip);
@@ -416,13 +410,12 @@ static int mt9t031_s_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
  * If a user window larger than sensor window is requested, we'll increase the
  * sensor window.
  */
-static int mt9t031_try_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
+static int mt9t031_try_fmt(struct v4l2_subdev *sd,
+			   struct v4l2_imgbus_framefmt *imgf)
 {
-	struct v4l2_pix_format *pix = &f->fmt.pix;
-
 	v4l_bound_align_image(
-		&pix->width, MT9T031_MIN_WIDTH, MT9T031_MAX_WIDTH, 1,
-		&pix->height, MT9T031_MIN_HEIGHT, MT9T031_MAX_HEIGHT, 1, 0);
+		&imgf->width, MT9T031_MIN_WIDTH, MT9T031_MAX_WIDTH, 1,
+		&imgf->height, MT9T031_MIN_HEIGHT, MT9T031_MAX_HEIGHT, 1, 0);
 
 	return 0;
 }
@@ -682,7 +675,6 @@ static int mt9t031_s_ctrl(struct v4l2_subdev *sd, struct v4l2_control *ctrl)
  */
 static int mt9t031_video_probe(struct i2c_client *client)
 {
-	struct soc_camera_device *icd = client->dev.platform_data;
 	struct mt9t031 *mt9t031 = to_mt9t031(client);
 	s32 data;
 	int ret;
@@ -697,8 +689,6 @@ static int mt9t031_video_probe(struct i2c_client *client)
 	switch (data) {
 	case 0x1621:
 		mt9t031->model = V4L2_IDENT_MT9T031;
-		icd->formats = mt9t031_colour_formats;
-		icd->num_formats = ARRAY_SIZE(mt9t031_colour_formats);
 		break;
 	default:
 		dev_err(&client->dev,
@@ -729,14 +719,25 @@ static struct v4l2_subdev_core_ops mt9t031_subdev_core_ops = {
 #endif
 };
 
+static int mt9t031_enum_fmt(struct v4l2_subdev *sd, int index,
+			    enum v4l2_imgbus_pixelcode *code)
+{
+	if ((unsigned int)index >= ARRAY_SIZE(mt9t031_code))
+		return -EINVAL;
+
+	*code = mt9t031_code[index];
+	return 0;
+}
+
 static struct v4l2_subdev_video_ops mt9t031_subdev_video_ops = {
-	.s_stream	= mt9t031_s_stream,
-	.s_fmt		= mt9t031_s_fmt,
-	.g_fmt		= mt9t031_g_fmt,
-	.try_fmt	= mt9t031_try_fmt,
-	.s_crop		= mt9t031_s_crop,
-	.g_crop		= mt9t031_g_crop,
-	.cropcap	= mt9t031_cropcap,
+	.s_stream		= mt9t031_s_stream,
+	.s_imgbus_fmt		= mt9t031_s_fmt,
+	.g_imgbus_fmt		= mt9t031_g_fmt,
+	.try_imgbus_fmt		= mt9t031_try_fmt,
+	.s_crop			= mt9t031_s_crop,
+	.g_crop			= mt9t031_g_crop,
+	.cropcap		= mt9t031_cropcap,
+	.enum_imgbus_fmt	= mt9t031_enum_fmt,
 };
 
 static struct v4l2_subdev_ops mt9t031_subdev_ops = {
diff --git a/drivers/media/video/mt9v022.c b/drivers/media/video/mt9v022.c
index f60a9a1..9fc32d0 100644
--- a/drivers/media/video/mt9v022.c
+++ b/drivers/media/video/mt9v022.c
@@ -64,41 +64,27 @@ MODULE_PARM_DESC(sensor_type, "Sensor type: \"colour\" or \"monochrome\"");
 #define MT9V022_COLUMN_SKIP		1
 #define MT9V022_ROW_SKIP		4
 
-static const struct soc_camera_data_format mt9v022_colour_formats[] = {
+static const enum v4l2_imgbus_pixelcode mt9v022_colour_codes[] = {
 	/*
 	 * Order important: first natively supported,
 	 * second supported with a GPIO extender
 	 */
-	{
-		.name		= "Bayer (sRGB) 10 bit",
-		.depth		= 10,
-		.fourcc		= V4L2_PIX_FMT_SBGGR16,
-		.colorspace	= V4L2_COLORSPACE_SRGB,
-	}, {
-		.name		= "Bayer (sRGB) 8 bit",
-		.depth		= 8,
-		.fourcc		= V4L2_PIX_FMT_SBGGR8,
-		.colorspace	= V4L2_COLORSPACE_SRGB,
-	}
+	V4L2_IMGBUS_FMT_SBGGR10,
+	V4L2_IMGBUS_FMT_SBGGR8,
 };
 
-static const struct soc_camera_data_format mt9v022_monochrome_formats[] = {
+static const enum v4l2_imgbus_pixelcode mt9v022_monochrome_codes[] = {
 	/* Order important - see above */
-	{
-		.name		= "Monochrome 10 bit",
-		.depth		= 10,
-		.fourcc		= V4L2_PIX_FMT_Y16,
-	}, {
-		.name		= "Monochrome 8 bit",
-		.depth		= 8,
-		.fourcc		= V4L2_PIX_FMT_GREY,
-	},
+	V4L2_IMGBUS_FMT_Y10,
+	V4L2_IMGBUS_FMT_GREY,
 };
 
 struct mt9v022 {
 	struct v4l2_subdev subdev;
 	struct v4l2_rect rect;	/* Sensor window */
-	__u32 fourcc;
+	enum v4l2_imgbus_pixelcode code;
+	const enum v4l2_imgbus_pixelcode *codes;
+	int num_codes;
 	int model;	/* V4L2_IDENT_MT9V022* codes from v4l2-chip-ident.h */
 	u16 chip_control;
 	unsigned short y_skip_top;	/* Lines to skip at the top */
@@ -275,8 +261,7 @@ static int mt9v022_s_crop(struct v4l2_subdev *sd, struct v4l2_crop *a)
 	int ret;
 
 	/* Bayer format - even size lengths */
-	if (mt9v022->fourcc == V4L2_PIX_FMT_SBGGR8 ||
-	    mt9v022->fourcc == V4L2_PIX_FMT_SBGGR16) {
+	if (mt9v022->codes == mt9v022_colour_codes) {
 		rect.width	= ALIGN(rect.width, 2);
 		rect.height	= ALIGN(rect.height, 2);
 		/* Let the user play with the starting pixel */
@@ -354,32 +339,31 @@ static int mt9v022_cropcap(struct v4l2_subdev *sd, struct v4l2_cropcap *a)
 	return 0;
 }
 
-static int mt9v022_g_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
+static int mt9v022_g_fmt(struct v4l2_subdev *sd,
+			 struct v4l2_imgbus_framefmt *imgf)
 {
 	struct i2c_client *client = sd->priv;
 	struct mt9v022 *mt9v022 = to_mt9v022(client);
-	struct v4l2_pix_format *pix = &f->fmt.pix;
 
-	pix->width		= mt9v022->rect.width;
-	pix->height		= mt9v022->rect.height;
-	pix->pixelformat	= mt9v022->fourcc;
-	pix->field		= V4L2_FIELD_NONE;
-	pix->colorspace		= V4L2_COLORSPACE_SRGB;
+	imgf->width	= mt9v022->rect.width;
+	imgf->height	= mt9v022->rect.height;
+	imgf->code	= mt9v022->code;
+	imgf->field	= V4L2_FIELD_NONE;
 
 	return 0;
 }
 
-static int mt9v022_s_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
+static int mt9v022_s_fmt(struct v4l2_subdev *sd,
+			 struct v4l2_imgbus_framefmt *imgf)
 {
 	struct i2c_client *client = sd->priv;
 	struct mt9v022 *mt9v022 = to_mt9v022(client);
-	struct v4l2_pix_format *pix = &f->fmt.pix;
 	struct v4l2_crop a = {
 		.c = {
 			.left	= mt9v022->rect.left,
 			.top	= mt9v022->rect.top,
-			.width	= pix->width,
-			.height	= pix->height,
+			.width	= imgf->width,
+			.height	= imgf->height,
 		},
 	};
 	int ret;
@@ -388,14 +372,14 @@ static int mt9v022_s_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
 	 * The caller provides a supported format, as verified per call to
 	 * icd->try_fmt(), datawidth is from our supported format list
 	 */
-	switch (pix->pixelformat) {
-	case V4L2_PIX_FMT_GREY:
-	case V4L2_PIX_FMT_Y16:
+	switch (imgf->code) {
+	case V4L2_IMGBUS_FMT_GREY:
+	case V4L2_IMGBUS_FMT_Y10:
 		if (mt9v022->model != V4L2_IDENT_MT9V022IX7ATM)
 			return -EINVAL;
 		break;
-	case V4L2_PIX_FMT_SBGGR8:
-	case V4L2_PIX_FMT_SBGGR16:
+	case V4L2_IMGBUS_FMT_SBGGR8:
+	case V4L2_IMGBUS_FMT_SBGGR10:
 		if (mt9v022->model != V4L2_IDENT_MT9V022IX7ATC)
 			return -EINVAL;
 		break;
@@ -409,25 +393,25 @@ static int mt9v022_s_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
 	/* No support for scaling on this camera, just crop. */
 	ret = mt9v022_s_crop(sd, &a);
 	if (!ret) {
-		pix->width = mt9v022->rect.width;
-		pix->height = mt9v022->rect.height;
-		mt9v022->fourcc = pix->pixelformat;
+		imgf->width	= mt9v022->rect.width;
+		imgf->height	= mt9v022->rect.height;
+		mt9v022->code	= imgf->code;
 	}
 
 	return ret;
 }
 
-static int mt9v022_try_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
+static int mt9v022_try_fmt(struct v4l2_subdev *sd,
+			   struct v4l2_imgbus_framefmt *imgf)
 {
 	struct i2c_client *client = sd->priv;
 	struct mt9v022 *mt9v022 = to_mt9v022(client);
-	struct v4l2_pix_format *pix = &f->fmt.pix;
-	int align = pix->pixelformat == V4L2_PIX_FMT_SBGGR8 ||
-		pix->pixelformat == V4L2_PIX_FMT_SBGGR16;
+	int align = imgf->code == V4L2_IMGBUS_FMT_SBGGR8 ||
+		imgf->code == V4L2_IMGBUS_FMT_SBGGR10;
 
-	v4l_bound_align_image(&pix->width, MT9V022_MIN_WIDTH,
+	v4l_bound_align_image(&imgf->width, MT9V022_MIN_WIDTH,
 		MT9V022_MAX_WIDTH, align,
-		&pix->height, MT9V022_MIN_HEIGHT + mt9v022->y_skip_top,
+		&imgf->height, MT9V022_MIN_HEIGHT + mt9v022->y_skip_top,
 		MT9V022_MAX_HEIGHT + mt9v022->y_skip_top, align, 0);
 
 	return 0;
@@ -749,17 +733,17 @@ static int mt9v022_video_probe(struct soc_camera_device *icd,
 			    !strcmp("color", sensor_type))) {
 		ret = reg_write(client, MT9V022_PIXEL_OPERATION_MODE, 4 | 0x11);
 		mt9v022->model = V4L2_IDENT_MT9V022IX7ATC;
-		icd->formats = mt9v022_colour_formats;
+		mt9v022->codes = mt9v022_colour_codes;
 	} else {
 		ret = reg_write(client, MT9V022_PIXEL_OPERATION_MODE, 0x11);
 		mt9v022->model = V4L2_IDENT_MT9V022IX7ATM;
-		icd->formats = mt9v022_monochrome_formats;
+		mt9v022->codes = mt9v022_monochrome_codes;
 	}
 
 	if (ret < 0)
 		goto ei2c;
 
-	icd->num_formats = 0;
+	mt9v022->num_codes = 0;
 
 	/*
 	 * This is a 10bit sensor, so by default we only allow 10bit.
@@ -772,14 +756,14 @@ static int mt9v022_video_probe(struct soc_camera_device *icd,
 		flags = SOCAM_DATAWIDTH_10;
 
 	if (flags & SOCAM_DATAWIDTH_10)
-		icd->num_formats++;
+		mt9v022->num_codes++;
 	else
-		icd->formats++;
+		mt9v022->codes++;
 
 	if (flags & SOCAM_DATAWIDTH_8)
-		icd->num_formats++;
+		mt9v022->num_codes++;
 
-	mt9v022->fourcc = icd->formats->fourcc;
+	mt9v022->code = mt9v022->codes[0];
 
 	dev_info(&client->dev, "Detected a MT9V022 chip ID %x, %s sensor\n",
 		 data, mt9v022->model == V4L2_IDENT_MT9V022IX7ATM ?
@@ -823,14 +807,28 @@ static struct v4l2_subdev_core_ops mt9v022_subdev_core_ops = {
 #endif
 };
 
+static int mt9v022_enum_fmt(struct v4l2_subdev *sd, int index,
+			    enum v4l2_imgbus_pixelcode *code)
+{
+	struct i2c_client *client = sd->priv;
+	struct mt9v022 *mt9v022 = to_mt9v022(client);
+
+	if ((unsigned int)index >= mt9v022->num_codes)
+		return -EINVAL;
+
+	*code = mt9v022->codes[index];
+	return 0;
+}
+
 static struct v4l2_subdev_video_ops mt9v022_subdev_video_ops = {
-	.s_stream	= mt9v022_s_stream,
-	.s_fmt		= mt9v022_s_fmt,
-	.g_fmt		= mt9v022_g_fmt,
-	.try_fmt	= mt9v022_try_fmt,
-	.s_crop		= mt9v022_s_crop,
-	.g_crop		= mt9v022_g_crop,
-	.cropcap	= mt9v022_cropcap,
+	.s_stream		= mt9v022_s_stream,
+	.s_imgbus_fmt		= mt9v022_s_fmt,
+	.g_imgbus_fmt		= mt9v022_g_fmt,
+	.try_imgbus_fmt		= mt9v022_try_fmt,
+	.s_crop			= mt9v022_s_crop,
+	.g_crop			= mt9v022_g_crop,
+	.cropcap		= mt9v022_cropcap,
+	.enum_imgbus_fmt	= mt9v022_enum_fmt,
 };
 
 static struct v4l2_subdev_sensor_ops mt9v022_subdev_sensor_ops = {
diff --git a/drivers/media/video/mx1_camera.c b/drivers/media/video/mx1_camera.c
index 659d20a..8e73c77 100644
--- a/drivers/media/video/mx1_camera.c
+++ b/drivers/media/video/mx1_camera.c
@@ -93,9 +93,9 @@
 /* buffer for one video frame */
 struct mx1_buffer {
 	/* common v4l buffer stuff -- must be first */
-	struct videobuf_buffer vb;
-	const struct soc_camera_data_format *fmt;
-	int inwork;
+	struct videobuf_buffer		vb;
+	enum v4l2_imgbus_pixelcode	code;
+	int				inwork;
 };
 
 /*
@@ -127,9 +127,13 @@ static int mx1_videobuf_setup(struct videobuf_queue *vq, unsigned int *count,
 			      unsigned int *size)
 {
 	struct soc_camera_device *icd = vq->priv_data;
+	int bytes_per_line = v4l2_imgbus_bytes_per_line(icd->user_width,
+						icd->current_fmt->host_fmt);
 
-	*size = icd->user_width * icd->user_height *
-		((icd->current_fmt->depth + 7) >> 3);
+	if (bytes_per_line < 0)
+		return bytes_per_line;
+
+	*size = bytes_per_line * icd->user_height;
 
 	if (!*count)
 		*count = 32;
@@ -168,6 +172,11 @@ static int mx1_videobuf_prepare(struct videobuf_queue *vq,
 	struct soc_camera_device *icd = vq->priv_data;
 	struct mx1_buffer *buf = container_of(vb, struct mx1_buffer, vb);
 	int ret;
+	int bytes_per_line = v4l2_imgbus_bytes_per_line(icd->user_width,
+						icd->current_fmt->host_fmt);
+
+	if (bytes_per_line < 0)
+		return bytes_per_line;
 
 	dev_dbg(icd->dev.parent, "%s (vb=0x%p) 0x%08lx %d\n", __func__,
 		vb, vb->baddr, vb->bsize);
@@ -183,18 +192,18 @@ static int mx1_videobuf_prepare(struct videobuf_queue *vq,
 	 */
 	buf->inwork = 1;
 
-	if (buf->fmt	!= icd->current_fmt ||
+	if (buf->code	!= icd->current_fmt->code ||
 	    vb->width	!= icd->user_width ||
 	    vb->height	!= icd->user_height ||
 	    vb->field	!= field) {
-		buf->fmt	= icd->current_fmt;
+		buf->code	= icd->current_fmt->code;
 		vb->width	= icd->user_width;
 		vb->height	= icd->user_height;
 		vb->field	= field;
 		vb->state	= VIDEOBUF_NEEDS_INIT;
 	}
 
-	vb->size = vb->width * vb->height * ((buf->fmt->depth + 7) >> 3);
+	vb->size = bytes_per_line * vb->height;
 	if (0 != vb->baddr && vb->bsize < vb->size) {
 		ret = -EINVAL;
 		goto out;
@@ -496,12 +505,10 @@ static int mx1_camera_set_bus_param(struct soc_camera_device *icd, __u32 pixfmt)
 
 	/* MX1 supports only 8bit buswidth */
 	common_flags = soc_camera_bus_param_compatible(camera_flags,
-							       CSI_BUS_FLAGS);
+						       CSI_BUS_FLAGS);
 	if (!common_flags)
 		return -EINVAL;
 
-	icd->buswidth = 8;
-
 	/* Make choices, based on platform choice */
 	if ((common_flags & SOCAM_VSYNC_ACTIVE_HIGH) &&
 		(common_flags & SOCAM_VSYNC_ACTIVE_LOW)) {
@@ -554,7 +561,8 @@ static int mx1_camera_set_fmt(struct soc_camera_device *icd,
 	struct v4l2_subdev *sd = soc_camera_to_subdev(icd);
 	const struct soc_camera_format_xlate *xlate;
 	struct v4l2_pix_format *pix = &f->fmt.pix;
-	int ret;
+	struct v4l2_imgbus_framefmt imgf;
+	int ret, buswidth;
 
 	xlate = soc_camera_xlate_by_fourcc(icd, pix->pixelformat);
 	if (!xlate) {
@@ -563,12 +571,28 @@ static int mx1_camera_set_fmt(struct soc_camera_device *icd,
 		return -EINVAL;
 	}
 
-	ret = v4l2_subdev_call(sd, video, s_fmt, f);
-	if (!ret) {
-		icd->buswidth = xlate->buswidth;
-		icd->current_fmt = xlate->host_fmt;
+	buswidth = xlate->host_fmt->bits_per_sample;
+	if (buswidth > 8) {
+		dev_warn(icd->dev.parent,
+			 "bits-per-sample %d for format %x unsupported\n",
+			 buswidth, pix->pixelformat);
+		return -EINVAL;
 	}
 
+	imgf.width	= pix->width;
+	imgf.height	= pix->height;
+	imgf.code	= xlate->code;
+	imgf.field	= pix->field;
+
+	ret = v4l2_subdev_call(sd, video, s_imgbus_fmt, &imgf);
+	if (ret < 0)
+		return ret;
+
+	pix->width		= imgf.width;
+	pix->height		= imgf.height;
+	pix->field		= imgf.field;
+	icd->current_fmt	= xlate;
+
 	return ret;
 }
 
@@ -576,10 +600,29 @@ static int mx1_camera_try_fmt(struct soc_camera_device *icd,
 			      struct v4l2_format *f)
 {
 	struct v4l2_subdev *sd = soc_camera_to_subdev(icd);
+	const struct soc_camera_format_xlate *xlate;
+	struct v4l2_pix_format *pix = &f->fmt.pix;
+	struct v4l2_imgbus_framefmt imgf;
+	int ret;
 	/* TODO: limit to mx1 hardware capabilities */
 
+	xlate = soc_camera_xlate_by_fourcc(icd, pix->pixelformat);
+	if (!xlate) {
+		dev_warn(icd->dev.parent, "Format %x not found\n",
+			 pix->pixelformat);
+		return -EINVAL;
+	}
+
 	/* limit to sensor capabilities */
-	return v4l2_subdev_call(sd, video, try_fmt, f);
+	ret = v4l2_subdev_call(sd, video, try_imgbus_fmt, &imgf);
+	if (ret < 0)
+		return ret;
+
+	pix->width	= imgf.width;
+	pix->height	= imgf.height;
+	pix->field	= imgf.field;
+
+	return 0;
 }
 
 static int mx1_camera_reqbufs(struct soc_camera_file *icf,
diff --git a/drivers/media/video/mx3_camera.c b/drivers/media/video/mx3_camera.c
index 545a430..ab551f1 100644
--- a/drivers/media/video/mx3_camera.c
+++ b/drivers/media/video/mx3_camera.c
@@ -62,7 +62,7 @@
 struct mx3_camera_buffer {
 	/* common v4l buffer stuff -- must be first */
 	struct videobuf_buffer			vb;
-	const struct soc_camera_data_format	*fmt;
+	enum v4l2_imgbus_pixelcode		code;
 
 	/* One descriptor per scatterlist (per frame) */
 	struct dma_async_tx_descriptor		*txd;
@@ -117,8 +117,6 @@ struct dma_chan_request {
 	enum ipu_channel	id;
 };
 
-static int mx3_camera_set_bus_param(struct soc_camera_device *icd, __u32 pixfmt);
-
 static u32 csi_reg_read(struct mx3_camera_dev *mx3, off_t reg)
 {
 	return __raw_readl(mx3->base + reg);
@@ -210,17 +208,16 @@ static int mx3_videobuf_setup(struct videobuf_queue *vq, unsigned int *count,
 	struct soc_camera_device *icd = vq->priv_data;
 	struct soc_camera_host *ici = to_soc_camera_host(icd->dev.parent);
 	struct mx3_camera_dev *mx3_cam = ici->priv;
-	/*
-	 * bits-per-pixel (depth) as specified in camera's pixel format does
-	 * not necessarily match what the camera interface writes to RAM, but
-	 * it should be good enough for now.
-	 */
-	unsigned int bpp = DIV_ROUND_UP(icd->current_fmt->depth, 8);
+	int bytes_per_line = v4l2_imgbus_bytes_per_line(icd->user_width,
+						icd->current_fmt->host_fmt);
+
+	if (bytes_per_line < 0)
+		return bytes_per_line;
 
 	if (!mx3_cam->idmac_channel[0])
 		return -EINVAL;
 
-	*size = icd->user_width * icd->user_height * bpp;
+	*size = bytes_per_line * icd->user_height;
 
 	if (!*count)
 		*count = 32;
@@ -240,21 +237,26 @@ static int mx3_videobuf_prepare(struct videobuf_queue *vq,
 	struct mx3_camera_dev *mx3_cam = ici->priv;
 	struct mx3_camera_buffer *buf =
 		container_of(vb, struct mx3_camera_buffer, vb);
-	/* current_fmt _must_ always be set */
-	size_t new_size = icd->user_width * icd->user_height *
-		((icd->current_fmt->depth + 7) >> 3);
+	size_t new_size;
 	int ret;
+	int bytes_per_line = v4l2_imgbus_bytes_per_line(icd->user_width,
+						icd->current_fmt->host_fmt);
+
+	if (bytes_per_line < 0)
+		return bytes_per_line;
+
+	new_size = bytes_per_line * icd->user_height;
 
 	/*
 	 * I think, in buf_prepare you only have to protect global data,
 	 * the actual buffer is yours
 	 */
 
-	if (buf->fmt	!= icd->current_fmt ||
+	if (buf->code	!= icd->current_fmt->code ||
 	    vb->width	!= icd->user_width ||
 	    vb->height	!= icd->user_height ||
 	    vb->field	!= field) {
-		buf->fmt	= icd->current_fmt;
+		buf->code	= icd->current_fmt->code;
 		vb->width	= icd->user_width;
 		vb->height	= icd->user_height;
 		vb->field	= field;
@@ -347,13 +349,13 @@ static void mx3_videobuf_queue(struct videobuf_queue *vq,
 	struct dma_async_tx_descriptor *txd = buf->txd;
 	struct idmac_channel *ichan = to_idmac_chan(txd->chan);
 	struct idmac_video_param *video = &ichan->params.video;
-	const struct soc_camera_data_format *data_fmt = icd->current_fmt;
 	dma_cookie_t cookie;
+	u32 fourcc = icd->current_fmt->host_fmt->fourcc;
 
 	BUG_ON(!irqs_disabled());
 
 	/* This is the configuration of one sg-element */
-	video->out_pixel_fmt	= fourcc_to_ipu_pix(data_fmt->fourcc);
+	video->out_pixel_fmt	= fourcc_to_ipu_pix(fourcc);
 	video->out_width	= icd->user_width;
 	video->out_height	= icd->user_height;
 	video->out_stride	= icd->user_width;
@@ -567,28 +569,33 @@ static int test_platform_param(struct mx3_camera_dev *mx3_cam,
 	 * If requested data width is supported by the platform, use it or any
 	 * possible lower value - i.MX31 is smart enough to shift bits
 	 */
+	if (mx3_cam->platform_flags & MX3_CAMERA_DATAWIDTH_15)
+		*flags |= SOCAM_DATAWIDTH_15 | SOCAM_DATAWIDTH_10 |
+			SOCAM_DATAWIDTH_8 | SOCAM_DATAWIDTH_4;
+	else if (mx3_cam->platform_flags & MX3_CAMERA_DATAWIDTH_10)
+		*flags |= SOCAM_DATAWIDTH_10 | SOCAM_DATAWIDTH_8 |
+			SOCAM_DATAWIDTH_4;
+	else if (mx3_cam->platform_flags & MX3_CAMERA_DATAWIDTH_8)
+		*flags |= SOCAM_DATAWIDTH_8 | SOCAM_DATAWIDTH_4;
+	else if (mx3_cam->platform_flags & MX3_CAMERA_DATAWIDTH_4)
+		*flags |= SOCAM_DATAWIDTH_4;
+
 	switch (buswidth) {
 	case 15:
-		if (!(mx3_cam->platform_flags & MX3_CAMERA_DATAWIDTH_15))
+		if (!(*flags & SOCAM_DATAWIDTH_15))
 			return -EINVAL;
-		*flags |= SOCAM_DATAWIDTH_15 | SOCAM_DATAWIDTH_10 |
-			SOCAM_DATAWIDTH_8 | SOCAM_DATAWIDTH_4;
 		break;
 	case 10:
-		if (!(mx3_cam->platform_flags & MX3_CAMERA_DATAWIDTH_10))
+		if (!(*flags & SOCAM_DATAWIDTH_10))
 			return -EINVAL;
-		*flags |= SOCAM_DATAWIDTH_10 | SOCAM_DATAWIDTH_8 |
-			SOCAM_DATAWIDTH_4;
 		break;
 	case 8:
-		if (!(mx3_cam->platform_flags & MX3_CAMERA_DATAWIDTH_8))
+		if (!(*flags & SOCAM_DATAWIDTH_8))
 			return -EINVAL;
-		*flags |= SOCAM_DATAWIDTH_8 | SOCAM_DATAWIDTH_4;
 		break;
 	case 4:
-		if (!(mx3_cam->platform_flags & MX3_CAMERA_DATAWIDTH_4))
+		if (!(*flags & SOCAM_DATAWIDTH_4))
 			return -EINVAL;
-		*flags |= SOCAM_DATAWIDTH_4;
 		break;
 	default:
 		dev_warn(mx3_cam->soc_host.v4l2_dev.dev,
@@ -637,91 +644,95 @@ static bool chan_filter(struct dma_chan *chan, void *arg)
 		pdata->dma_dev == chan->device->dev;
 }
 
-static const struct soc_camera_data_format mx3_camera_formats[] = {
+static const struct v4l2_imgbus_pixelfmt mx3_camera_formats[] = {
 	{
-		.name		= "Bayer (sRGB) 8 bit",
-		.depth		= 8,
-		.fourcc		= V4L2_PIX_FMT_SBGGR8,
-		.colorspace	= V4L2_COLORSPACE_SRGB,
+		.fourcc			= V4L2_PIX_FMT_SBGGR8,
+		.colorspace		= V4L2_COLORSPACE_SRGB,
+		.name			= "Bayer (sRGB) 8 bit",
+		.bits_per_sample	= 8,
+		.packing		= V4L2_IMGBUS_PACKING_NONE,
+		.order			= V4L2_IMGBUS_ORDER_LE,
 	}, {
-		.name		= "Monochrome 8 bit",
-		.depth		= 8,
-		.fourcc		= V4L2_PIX_FMT_GREY,
-		.colorspace	= V4L2_COLORSPACE_JPEG,
+		.fourcc			= V4L2_PIX_FMT_GREY,
+		.colorspace		= V4L2_COLORSPACE_JPEG,
+		.name			= "Monochrome 8 bit",
+		.bits_per_sample	= 8,
+		.packing		= V4L2_IMGBUS_PACKING_NONE,
+		.order			= V4L2_IMGBUS_ORDER_LE,
 	},
 };
 
-static bool buswidth_supported(struct soc_camera_host *ici, int depth)
+/* This will be extended as more formats are added */
+static bool mx3_camera_packing_supported(const struct v4l2_imgbus_pixelfmt *fmt)
 {
-	struct mx3_camera_dev *mx3_cam = ici->priv;
-
-	switch (depth) {
-	case 4:
-		return !!(mx3_cam->platform_flags & MX3_CAMERA_DATAWIDTH_4);
-	case 8:
-		return !!(mx3_cam->platform_flags & MX3_CAMERA_DATAWIDTH_8);
-	case 10:
-		return !!(mx3_cam->platform_flags & MX3_CAMERA_DATAWIDTH_10);
-	case 15:
-		return !!(mx3_cam->platform_flags & MX3_CAMERA_DATAWIDTH_15);
-	}
-	return false;
+	return	fmt->packing == V4L2_IMGBUS_PACKING_NONE ||
+		(fmt->bits_per_sample == 8 &&
+		 fmt->packing == V4L2_IMGBUS_PACKING_2X8) ||
+		(fmt->bits_per_sample > 8 &&
+		 fmt->packing == V4L2_IMGBUS_PACKING_EXTEND16);
 }
 
 static int mx3_camera_get_formats(struct soc_camera_device *icd, int idx,
 				  struct soc_camera_format_xlate *xlate)
 {
+	struct v4l2_subdev *sd = soc_camera_to_subdev(icd);
+	struct device *dev = icd->dev.parent;
 	struct soc_camera_host *ici = to_soc_camera_host(icd->dev.parent);
-	int formats = 0, buswidth, ret;
+	int formats = 0, ret;
+	enum v4l2_imgbus_pixelcode code;
+	const struct v4l2_imgbus_pixelfmt *fmt;
 
-	buswidth = icd->formats[idx].depth;
+	ret = v4l2_subdev_call(sd, video, enum_imgbus_fmt, idx, &code);
+	if (ret < 0)
+		/* No more formats */
+		return 0;
 
-	if (!buswidth_supported(ici, buswidth))
+	fmt = v4l2_imgbus_get_fmtdesc(code);
+	if (!fmt) {
+		dev_err(icd->dev.parent,
+			"Invalid format code #%d: %d\n", idx, code);
 		return 0;
+	}
 
-	ret = mx3_camera_try_bus_param(icd, buswidth);
+	/* This also checks support for the requested bits-per-sample */
+	ret = mx3_camera_try_bus_param(icd, fmt->bits_per_sample);
 	if (ret < 0)
 		return 0;
 
-	switch (icd->formats[idx].fourcc) {
-	case V4L2_PIX_FMT_SGRBG10:
+	switch (code) {
+	case V4L2_IMGBUS_FMT_SGRBG10:
 		formats++;
 		if (xlate) {
-			xlate->host_fmt = &mx3_camera_formats[0];
-			xlate->cam_fmt = icd->formats + idx;
-			xlate->buswidth = buswidth;
+			xlate->host_fmt	= &mx3_camera_formats[0];
+			xlate->code	= code;
 			xlate++;
-			dev_dbg(icd->dev.parent,
-				"Providing format %s using %s\n",
-				mx3_camera_formats[0].name,
-				icd->formats[idx].name);
+			dev_dbg(dev, "Providing format %s using code %d\n",
+				mx3_camera_formats[0].name, code);
 		}
-		goto passthrough;
-	case V4L2_PIX_FMT_Y16:
+		break;
+	case V4L2_IMGBUS_FMT_Y10:
 		formats++;
 		if (xlate) {
-			xlate->host_fmt = &mx3_camera_formats[1];
-			xlate->cam_fmt = icd->formats + idx;
-			xlate->buswidth = buswidth;
+			xlate->host_fmt	= &mx3_camera_formats[1];
+			xlate->code	= code;
 			xlate++;
-			dev_dbg(icd->dev.parent,
-				"Providing format %s using %s\n",
-				mx3_camera_formats[0].name,
-				icd->formats[idx].name);
+			dev_dbg(dev, "Providing format %s using code %d\n",
+				mx3_camera_formats[1].name, code);
 		}
+		break;
 	default:
-passthrough:
-		/* Generic pass-through */
-		formats++;
-		if (xlate) {
-			xlate->host_fmt = icd->formats + idx;
-			xlate->cam_fmt = icd->formats + idx;
-			xlate->buswidth = buswidth;
-			xlate++;
-			dev_dbg(icd->dev.parent,
-				"Providing format %s in pass-through mode\n",
-				icd->formats[idx].name);
-		}
+		if (!mx3_camera_packing_supported(fmt))
+			return 0;
+	}
+
+	/* Generic pass-through */
+	formats++;
+	if (xlate) {
+		xlate->host_fmt	= fmt;
+		xlate->code	= code;
+		xlate++;
+		dev_dbg(dev, "Providing format %x in pass-through mode\n",
+			fmt->fourcc);
 	}
 
 	return formats;
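For readers new to the image-bus API: the soc-camera core drives this get_formats() callback with an increasing index until the subdev's enum_imgbus_fmt runs out of codes, and each code is translated into a host-side descriptor. The shape of that interaction can be sketched in plain userspace C — all names here (enum_fmt, get_fmtdesc, the format table) are stand-ins for illustration, not the kernel helpers themselves:

```c
#include <assert.h>
#include <stddef.h>

/* Hypothetical stand-ins for the imgbus codes and descriptors */
enum imgbus_code { FMT_SGRBG10, FMT_Y10, FMT_YUYV, FMT_END };

struct imgbus_pixelfmt {
	enum imgbus_code code;
	int bits_per_sample;
};

static const struct imgbus_pixelfmt fmt_table[] = {
	{ FMT_SGRBG10, 10 },
	{ FMT_Y10,     10 },
	{ FMT_YUYV,     8 },
};

/* Mimics v4l2_imgbus_get_fmtdesc(): map a code to its descriptor */
static const struct imgbus_pixelfmt *get_fmtdesc(enum imgbus_code code)
{
	size_t i;

	for (i = 0; i < sizeof(fmt_table) / sizeof(fmt_table[0]); i++)
		if (fmt_table[i].code == code)
			return &fmt_table[i];
	return NULL;
}

/* Mimics enum_imgbus_fmt: 0 on success, negative once idx runs past
 * the sensor's format list */
static int enum_fmt(int idx, enum imgbus_code *code)
{
	if (idx < 0 || idx >= FMT_END)
		return -1;
	*code = (enum imgbus_code)idx;
	return 0;
}

/* The loop the soc-camera core effectively runs over get_formats():
 * count translatable formats until enumeration fails */
static int count_formats(void)
{
	enum imgbus_code code;
	int idx, formats = 0;

	for (idx = 0; !enum_fmt(idx, &code); idx++)
		if (get_fmtdesc(code))	/* skip codes without a descriptor */
			formats++;
	return formats;
}
```

This is why the hunks above return 0 (not an error) both when enumeration fails and when a code has no descriptor: the core treats 0 as "nothing to add for this index" and simply moves on.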
@@ -805,8 +816,7 @@ static int mx3_camera_set_crop(struct soc_camera_device *icd,
 	struct soc_camera_host *ici = to_soc_camera_host(icd->dev.parent);
 	struct mx3_camera_dev *mx3_cam = ici->priv;
 	struct v4l2_subdev *sd = soc_camera_to_subdev(icd);
-	struct v4l2_format f = {.type = V4L2_BUF_TYPE_VIDEO_CAPTURE};
-	struct v4l2_pix_format *pix = &f.fmt.pix;
+	struct v4l2_imgbus_framefmt imgf;
 	int ret;
 
 	soc_camera_limit_side(&rect->left, &rect->width, 0, 2, 4096);
@@ -817,19 +827,19 @@ static int mx3_camera_set_crop(struct soc_camera_device *icd,
 		return ret;
 
 	/* The capture device might have changed its output  */
-	ret = v4l2_subdev_call(sd, video, g_fmt, &f);
+	ret = v4l2_subdev_call(sd, video, g_imgbus_fmt, &imgf);
 	if (ret < 0)
 		return ret;
 
-	if (pix->width & 7) {
+	if (imgf.width & 7) {
 		/* Ouch! We can only handle 8-byte aligned width... */
-		stride_align(&pix->width);
-		ret = v4l2_subdev_call(sd, video, s_fmt, &f);
+		stride_align(&imgf.width);
+		ret = v4l2_subdev_call(sd, video, s_imgbus_fmt, &imgf);
 		if (ret < 0)
 			return ret;
 	}
 
-	if (pix->width != icd->user_width || pix->height != icd->user_height) {
+	if (imgf.width != icd->user_width || imgf.height != icd->user_height) {
 		/*
 		 * We now know pixel formats and can decide upon DMA-channel(s)
 		 * So far only direct camera-to-memory is supported
@@ -840,14 +850,14 @@ static int mx3_camera_set_crop(struct soc_camera_device *icd,
 				return ret;
 		}
 
-		configure_geometry(mx3_cam, pix->width, pix->height);
+		configure_geometry(mx3_cam, imgf.width, imgf.height);
 	}
 
 	dev_dbg(icd->dev.parent, "Sensor cropped %dx%d\n",
-		pix->width, pix->height);
+		imgf.width, imgf.height);
 
-	icd->user_width = pix->width;
-	icd->user_height = pix->height;
+	icd->user_width		= imgf.width;
+	icd->user_height	= imgf.height;
 
 	return ret;
 }
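The stride_align() call in the hunk above is not shown in this patch; on the i.MX3x path it rounds the width up so each line satisfies the IPU's alignment requirement (the `imgf.width & 7` test implies an 8-pixel granularity). A minimal sketch, assuming that rounding rule:

```c
#include <assert.h>

/* Sketch of the stride_align() helper referenced above: round the
 * width up to the next multiple of 8 pixels (rounding rule assumed
 * from the "width & 7" check in the caller). */
static void stride_align(unsigned int *width)
{
	*width = (*width + 7) & ~7U;
}
```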
@@ -860,6 +870,7 @@ static int mx3_camera_set_fmt(struct soc_camera_device *icd,
 	struct v4l2_subdev *sd = soc_camera_to_subdev(icd);
 	const struct soc_camera_format_xlate *xlate;
 	struct v4l2_pix_format *pix = &f->fmt.pix;
+	struct v4l2_imgbus_framefmt imgf;
 	int ret;
 
 	xlate = soc_camera_xlate_by_fourcc(icd, pix->pixelformat);
@@ -884,11 +895,19 @@ static int mx3_camera_set_fmt(struct soc_camera_device *icd,
 
 	configure_geometry(mx3_cam, pix->width, pix->height);
 
-	ret = v4l2_subdev_call(sd, video, s_fmt, f);
-	if (!ret) {
-		icd->buswidth = xlate->buswidth;
-		icd->current_fmt = xlate->host_fmt;
-	}
+	imgf.width	= pix->width;
+	imgf.height	= pix->height;
+	imgf.code	= xlate->code;
+	imgf.field	= pix->field;
+
+	ret = v4l2_subdev_call(sd, video, s_imgbus_fmt, &imgf);
+	if (ret < 0)
+		return ret;
+
+	pix->width		= imgf.width;
+	pix->height		= imgf.height;
+	pix->field		= imgf.field;
+	icd->current_fmt	= xlate;
 
 	dev_dbg(icd->dev.parent, "Sensor set %dx%d\n", pix->width, pix->height);
 
@@ -901,8 +920,8 @@ static int mx3_camera_try_fmt(struct soc_camera_device *icd,
 	struct v4l2_subdev *sd = soc_camera_to_subdev(icd);
 	const struct soc_camera_format_xlate *xlate;
 	struct v4l2_pix_format *pix = &f->fmt.pix;
+	struct v4l2_imgbus_framefmt imgf;
 	__u32 pixfmt = pix->pixelformat;
-	enum v4l2_field field;
 	int ret;
 
 	xlate = soc_camera_xlate_by_fourcc(icd, pixfmt);
@@ -917,23 +936,34 @@ static int mx3_camera_try_fmt(struct soc_camera_device *icd,
 	if (pix->width > 4096)
 		pix->width = 4096;
 
-	pix->bytesperline = pix->width *
-		DIV_ROUND_UP(xlate->host_fmt->depth, 8);
+	ret = v4l2_imgbus_bytes_per_line(pix->width, xlate->host_fmt);
+	if (ret < 0)
+		return ret;
+	pix->bytesperline = ret;
 	pix->sizeimage = pix->height * pix->bytesperline;
 
-	/* camera has to see its format, but the user the original one */
-	pix->pixelformat = xlate->cam_fmt->fourcc;
 	/* limit to sensor capabilities */
-	ret = v4l2_subdev_call(sd, video, try_fmt, f);
-	pix->pixelformat = xlate->host_fmt->fourcc;
+	imgf.width	= pix->width;
+	imgf.height	= pix->height;
+	imgf.field	= pix->field;
+	imgf.code	= xlate->code;
+	ret = v4l2_subdev_call(sd, video, try_imgbus_fmt, &imgf);
+	if (ret < 0)
+		return ret;
 
-	field = pix->field;
+	pix->width	= imgf.width;
+	pix->height	= imgf.height;
 
-	if (field == V4L2_FIELD_ANY) {
+	switch (imgf.field) {
+	case V4L2_FIELD_ANY:
 		pix->field = V4L2_FIELD_NONE;
-	} else if (field != V4L2_FIELD_NONE) {
-		dev_err(icd->dev.parent, "Field type %d unsupported.\n", field);
-		return -EINVAL;
+		break;
+	case V4L2_FIELD_NONE:
+		break;
+	default:
+		dev_err(icd->dev.parent, "Field type %d unsupported.\n",
+			imgf.field);
+		ret = -EINVAL;
 	}
 
 	return ret;
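The v4l2_imgbus_bytes_per_line() helper these hunks call is not part of this diff. Under the packing model the series introduces, a plausible userspace sketch looks like this — the packing names are taken from the constants visible elsewhere in the series, but the exact layout rules are assumptions:

```c
#include <assert.h>

/* Packing variants, names as used elsewhere in this series */
enum packing { PACKING_NONE, PACKING_2X8, PACKING_EXTEND16 };

struct pixelfmt {
	int bits_per_sample;
	enum packing packing;
};

/* Sketch of v4l2_imgbus_bytes_per_line(): bytes needed to store one
 * line in memory, or a negative value for a layout we cannot handle. */
static int bytes_per_line(unsigned int width, const struct pixelfmt *fmt)
{
	switch (fmt->packing) {
	case PACKING_NONE:
		/* samples stored back to back at their native depth */
		return width * fmt->bits_per_sample / 8;
	case PACKING_2X8:
		/* each sample delivered as two 8-bit bus transfers */
	case PACKING_EXTEND16:
		/* sub-16-bit samples zero-extended to 16-bit words */
		return width * 2;
	}
	return -1;
}
```

Note the helper returns a signed int precisely so callers can propagate failure, which is why storing the result straight into the unsigned `pix->bytesperline` before checking it would defeat the error path.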
@@ -969,18 +999,26 @@ static int mx3_camera_set_bus_param(struct soc_camera_device *icd, __u32 pixfmt)
 	struct mx3_camera_dev *mx3_cam = ici->priv;
 	unsigned long bus_flags, camera_flags, common_flags;
 	u32 dw, sens_conf;
-	int ret = test_platform_param(mx3_cam, icd->buswidth, &bus_flags);
+	const struct v4l2_imgbus_pixelfmt *fmt;
+	int buswidth;
+	int ret;
 	const struct soc_camera_format_xlate *xlate;
 	struct device *dev = icd->dev.parent;
 
+	fmt = v4l2_imgbus_get_fmtdesc(icd->current_fmt->code);
+	if (!fmt)
+		return -EINVAL;
+
+	buswidth = fmt->bits_per_sample;
+	ret = test_platform_param(mx3_cam, buswidth, &bus_flags);
+
 	xlate = soc_camera_xlate_by_fourcc(icd, pixfmt);
 	if (!xlate) {
 		dev_warn(dev, "Format %x not found\n", pixfmt);
 		return -EINVAL;
 	}
 
-	dev_dbg(dev, "requested bus width %d bit: %d\n",
-		icd->buswidth, ret);
+	dev_dbg(dev, "requested bus width %d bit: %d\n", buswidth, ret);
 
 	if (ret < 0)
 		return ret;
@@ -1081,7 +1119,7 @@ static int mx3_camera_set_bus_param(struct soc_camera_device *icd, __u32 pixfmt)
 		sens_conf |= 1 << CSI_SENS_CONF_DATA_POL_SHIFT;
 
 	/* Just do what we're asked to do */
-	switch (xlate->host_fmt->depth) {
+	switch (xlate->host_fmt->bits_per_sample) {
 	case 4:
 		dw = 0 << CSI_SENS_CONF_DATA_WIDTH_SHIFT;
 		break;
diff --git a/drivers/media/video/ov772x.c b/drivers/media/video/ov772x.c
index dbaf508..f969011 100644
--- a/drivers/media/video/ov772x.c
+++ b/drivers/media/video/ov772x.c
@@ -382,7 +382,7 @@ struct regval_list {
 };
 
 struct ov772x_color_format {
-	const struct soc_camera_data_format *format;
+	const enum v4l2_imgbus_pixelcode code;
 	u8 dsp3;
 	u8 com3;
 	u8 com7;
@@ -434,93 +434,50 @@ static const struct regval_list ov772x_vga_regs[] = {
 };
 
 /*
- * supported format list
- */
-
-#define SETFOURCC(type) .name = (#type), .fourcc = (V4L2_PIX_FMT_ ## type)
-static const struct soc_camera_data_format ov772x_fmt_lists[] = {
-	{
-		SETFOURCC(YUYV),
-		.depth      = 16,
-		.colorspace = V4L2_COLORSPACE_JPEG,
-	},
-	{
-		SETFOURCC(YVYU),
-		.depth      = 16,
-		.colorspace = V4L2_COLORSPACE_JPEG,
-	},
-	{
-		SETFOURCC(UYVY),
-		.depth      = 16,
-		.colorspace = V4L2_COLORSPACE_JPEG,
-	},
-	{
-		SETFOURCC(RGB555),
-		.depth      = 16,
-		.colorspace = V4L2_COLORSPACE_SRGB,
-	},
-	{
-		SETFOURCC(RGB555X),
-		.depth      = 16,
-		.colorspace = V4L2_COLORSPACE_SRGB,
-	},
-	{
-		SETFOURCC(RGB565),
-		.depth      = 16,
-		.colorspace = V4L2_COLORSPACE_SRGB,
-	},
-	{
-		SETFOURCC(RGB565X),
-		.depth      = 16,
-		.colorspace = V4L2_COLORSPACE_SRGB,
-	},
-};
-
-/*
- * color format list
+ * supported color format list
  */
 static const struct ov772x_color_format ov772x_cfmts[] = {
 	{
-		.format = &ov772x_fmt_lists[0],
-		.dsp3   = 0x0,
-		.com3   = SWAP_YUV,
-		.com7   = OFMT_YUV,
+		.code	= V4L2_IMGBUS_FMT_YUYV,
+		.dsp3	= 0x0,
+		.com3	= SWAP_YUV,
+		.com7	= OFMT_YUV,
 	},
 	{
-		.format = &ov772x_fmt_lists[1],
-		.dsp3   = UV_ON,
-		.com3   = SWAP_YUV,
-		.com7   = OFMT_YUV,
+		.code	= V4L2_IMGBUS_FMT_YVYU,
+		.dsp3	= UV_ON,
+		.com3	= SWAP_YUV,
+		.com7	= OFMT_YUV,
 	},
 	{
-		.format = &ov772x_fmt_lists[2],
-		.dsp3   = 0x0,
-		.com3   = 0x0,
-		.com7   = OFMT_YUV,
+		.code	= V4L2_IMGBUS_FMT_UYVY,
+		.dsp3	= 0x0,
+		.com3	= 0x0,
+		.com7	= OFMT_YUV,
 	},
 	{
-		.format = &ov772x_fmt_lists[3],
-		.dsp3   = 0x0,
-		.com3   = SWAP_RGB,
-		.com7   = FMT_RGB555 | OFMT_RGB,
+		.code	= V4L2_IMGBUS_FMT_RGB555,
+		.dsp3	= 0x0,
+		.com3	= SWAP_RGB,
+		.com7	= FMT_RGB555 | OFMT_RGB,
 	},
 	{
-		.format = &ov772x_fmt_lists[4],
-		.dsp3   = 0x0,
-		.com3   = 0x0,
-		.com7   = FMT_RGB555 | OFMT_RGB,
+		.code	= V4L2_IMGBUS_FMT_RGB555X,
+		.dsp3	= 0x0,
+		.com3	= 0x0,
+		.com7	= FMT_RGB555 | OFMT_RGB,
 	},
 	{
-		.format = &ov772x_fmt_lists[5],
-		.dsp3   = 0x0,
-		.com3   = SWAP_RGB,
-		.com7   = FMT_RGB565 | OFMT_RGB,
+		.code	= V4L2_IMGBUS_FMT_RGB565,
+		.dsp3	= 0x0,
+		.com3	= SWAP_RGB,
+		.com7	= FMT_RGB565 | OFMT_RGB,
 	},
 	{
-		.format = &ov772x_fmt_lists[6],
-		.dsp3   = 0x0,
-		.com3   = 0x0,
-		.com7   = FMT_RGB565 | OFMT_RGB,
+		.code	= V4L2_IMGBUS_FMT_RGB565X,
+		.dsp3	= 0x0,
+		.com3	= 0x0,
+		.com7	= FMT_RGB565 | OFMT_RGB,
 	},
 };
 
@@ -649,8 +606,8 @@ static int ov772x_s_stream(struct v4l2_subdev *sd, int enable)
 
 	ov772x_mask_set(client, COM2, SOFT_SLEEP_MODE, 0);
 
-	dev_dbg(&client->dev, "format %s, win %s\n",
-		priv->fmt->format->name, priv->win->name);
+	dev_dbg(&client->dev, "format %d, win %s\n",
+		priv->fmt->code, priv->win->name);
 
 	return 0;
 }
@@ -806,8 +763,8 @@ static const struct ov772x_win_size *ov772x_select_win(u32 width, u32 height)
 	return win;
 }
 
-static int ov772x_set_params(struct i2c_client *client,
-			     u32 *width, u32 *height, u32 pixfmt)
+static int ov772x_set_params(struct i2c_client *client, u32 *width, u32 *height,
+			     enum v4l2_imgbus_pixelcode code)
 {
 	struct ov772x_priv *priv = to_ov772x(client);
 	int ret = -EINVAL;
@@ -819,7 +776,7 @@ static int ov772x_set_params(struct i2c_client *client,
 	 */
 	priv->fmt = NULL;
 	for (i = 0; i < ARRAY_SIZE(ov772x_cfmts); i++) {
-		if (pixfmt == ov772x_cfmts[i].format->fourcc) {
+		if (code == ov772x_cfmts[i].code) {
 			priv->fmt = ov772x_cfmts + i;
 			break;
 		}
@@ -925,7 +882,7 @@ static int ov772x_set_params(struct i2c_client *client,
 	 */
 	val = priv->win->com7_bit | priv->fmt->com7;
 	ret = ov772x_mask_set(client,
-			      COM7, (SLCT_MASK | FMT_MASK | OFMT_MASK),
+			      COM7, SLCT_MASK | FMT_MASK | OFMT_MASK,
 			      val);
 	if (ret < 0)
 		goto ov772x_set_fmt_error;
@@ -981,54 +938,50 @@ static int ov772x_cropcap(struct v4l2_subdev *sd, struct v4l2_cropcap *a)
 	return 0;
 }
 
-static int ov772x_g_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
+static int ov772x_g_fmt(struct v4l2_subdev *sd,
+			struct v4l2_imgbus_framefmt *imgf)
 {
 	struct i2c_client *client = sd->priv;
 	struct ov772x_priv *priv = to_ov772x(client);
-	struct v4l2_pix_format *pix = &f->fmt.pix;
 
 	if (!priv->win || !priv->fmt) {
 		u32 width = VGA_WIDTH, height = VGA_HEIGHT;
 		int ret = ov772x_set_params(client, &width, &height,
-					    V4L2_PIX_FMT_YUYV);
+					    V4L2_IMGBUS_FMT_YUYV);
 		if (ret < 0)
 			return ret;
 	}
 
-	f->type			= V4L2_BUF_TYPE_VIDEO_CAPTURE;
-
-	pix->width		= priv->win->width;
-	pix->height		= priv->win->height;
-	pix->pixelformat	= priv->fmt->format->fourcc;
-	pix->colorspace		= priv->fmt->format->colorspace;
-	pix->field		= V4L2_FIELD_NONE;
+	imgf->width	= priv->win->width;
+	imgf->height	= priv->win->height;
+	imgf->code	= priv->fmt->code;
+	imgf->field	= V4L2_FIELD_NONE;
 
 	return 0;
 }
 
-static int ov772x_s_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
+static int ov772x_s_fmt(struct v4l2_subdev *sd,
+			struct v4l2_imgbus_framefmt *imgf)
 {
 	struct i2c_client *client = sd->priv;
-	struct v4l2_pix_format *pix = &f->fmt.pix;
 
-	return ov772x_set_params(client, &pix->width, &pix->height,
-				 pix->pixelformat);
+	return ov772x_set_params(client, &imgf->width, &imgf->height,
+				 imgf->code);
 }
 
 static int ov772x_try_fmt(struct v4l2_subdev *sd,
-			  struct v4l2_format *f)
+			  struct v4l2_imgbus_framefmt *imgf)
 {
-	struct v4l2_pix_format *pix = &f->fmt.pix;
 	const struct ov772x_win_size *win;
 
 	/*
 	 * select suitable win
 	 */
-	win = ov772x_select_win(pix->width, pix->height);
+	win = ov772x_select_win(imgf->width, imgf->height);
 
-	pix->width  = win->width;
-	pix->height = win->height;
-	pix->field  = V4L2_FIELD_NONE;
+	imgf->width  = win->width;
+	imgf->height = win->height;
+	imgf->field  = V4L2_FIELD_NONE;
 
 	return 0;
 }
@@ -1057,9 +1010,6 @@ static int ov772x_video_probe(struct soc_camera_device *icd,
 		return -ENODEV;
 	}
 
-	icd->formats     = ov772x_fmt_lists;
-	icd->num_formats = ARRAY_SIZE(ov772x_fmt_lists);
-
 	/*
 	 * check and show product ID and manufacturer ID
 	 */
@@ -1109,13 +1059,24 @@ static struct v4l2_subdev_core_ops ov772x_subdev_core_ops = {
 #endif
 };
 
+static int ov772x_enum_fmt(struct v4l2_subdev *sd, int index,
+			   enum v4l2_imgbus_pixelcode *code)
+{
+	if ((unsigned int)index >= ARRAY_SIZE(ov772x_cfmts))
+		return -EINVAL;
+
+	*code = ov772x_cfmts[index].code;
+	return 0;
+}
+
 static struct v4l2_subdev_video_ops ov772x_subdev_video_ops = {
-	.s_stream	= ov772x_s_stream,
-	.g_fmt		= ov772x_g_fmt,
-	.s_fmt		= ov772x_s_fmt,
-	.try_fmt	= ov772x_try_fmt,
-	.cropcap	= ov772x_cropcap,
-	.g_crop		= ov772x_g_crop,
+	.s_stream		= ov772x_s_stream,
+	.g_imgbus_fmt		= ov772x_g_fmt,
+	.s_imgbus_fmt		= ov772x_s_fmt,
+	.try_imgbus_fmt		= ov772x_try_fmt,
+	.cropcap		= ov772x_cropcap,
+	.g_crop			= ov772x_g_crop,
+	.enum_imgbus_fmt	= ov772x_enum_fmt,
 };
 
 static struct v4l2_subdev_ops ov772x_subdev_ops = {
diff --git a/drivers/media/video/ov9640.c b/drivers/media/video/ov9640.c
index c81ae21..b63d921 100644
--- a/drivers/media/video/ov9640.c
+++ b/drivers/media/video/ov9640.c
@@ -160,13 +160,8 @@ static const struct ov9640_reg ov9640_regs_rgb[] = {
  * this version of the driver. To test and debug these formats add two entries
  * to the below array, see ov722x.c for an example.
  */
-static const struct soc_camera_data_format ov9640_fmt_lists[] = {
-	{
-		.name		= "UYVY",
-		.fourcc		= V4L2_PIX_FMT_UYVY,
-		.depth		= 16,
-		.colorspace	= V4L2_COLORSPACE_JPEG,
-	},
+static const enum v4l2_imgbus_pixelcode ov9640_fmt_codes[] = {
+	V4L2_IMGBUS_FMT_UYVY,
 };
 
 static const struct v4l2_queryctrl ov9640_controls[] = {
@@ -434,20 +429,22 @@ static void ov9640_res_roundup(u32 *width, u32 *height)
 }
 
 /* Prepare necessary register changes depending on color encoding */
-static void ov9640_alter_regs(u32 pixfmt, struct ov9640_reg_alt *alt)
+static void ov9640_alter_regs(enum v4l2_imgbus_pixelcode code,
+			      struct ov9640_reg_alt *alt)
 {
-	switch (pixfmt) {
-	case V4L2_PIX_FMT_UYVY:
+	switch (code) {
+	default:
+	case V4L2_IMGBUS_FMT_UYVY:
 		alt->com12	= OV9640_COM12_YUV_AVG;
 		alt->com13	= OV9640_COM13_Y_DELAY_EN |
 					OV9640_COM13_YUV_DLY(0x01);
 		break;
-	case V4L2_PIX_FMT_RGB555:
+	case V4L2_IMGBUS_FMT_RGB555:
 		alt->com7	= OV9640_COM7_RGB;
 		alt->com13	= OV9640_COM13_RGB_AVG;
 		alt->com15	= OV9640_COM15_RGB_555;
 		break;
-	case V4L2_PIX_FMT_RGB565:
+	case V4L2_IMGBUS_FMT_RGB565:
 		alt->com7	= OV9640_COM7_RGB;
 		alt->com13	= OV9640_COM13_RGB_AVG;
 		alt->com15	= OV9640_COM15_RGB_565;
@@ -456,8 +453,8 @@ static void ov9640_alter_regs(u32 pixfmt, struct ov9640_reg_alt *alt)
 }
 
 /* Setup registers according to resolution and color encoding */
-static int ov9640_write_regs(struct i2c_client *client,
-		u32 width, u32 pixfmt, struct ov9640_reg_alt *alts)
+static int ov9640_write_regs(struct i2c_client *client, u32 width,
+		enum v4l2_imgbus_pixelcode code, struct ov9640_reg_alt *alts)
 {
 	const struct ov9640_reg	*ov9640_regs, *matrix_regs;
 	int			ov9640_regs_len, matrix_regs_len;
@@ -500,7 +497,7 @@ static int ov9640_write_regs(struct i2c_client *client,
 	}
 
 	/* select color matrix configuration for given color encoding */
-	if (pixfmt == V4L2_PIX_FMT_UYVY) {
+	if (code == V4L2_IMGBUS_FMT_UYVY) {
 		matrix_regs	= ov9640_regs_yuv;
 		matrix_regs_len	= ARRAY_SIZE(ov9640_regs_yuv);
 	} else {
@@ -562,15 +559,15 @@ static int ov9640_prog_dflt(struct i2c_client *client)
 }
 
 /* set the format we will capture in */
-static int ov9640_s_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
+static int ov9640_s_fmt(struct v4l2_subdev *sd,
+			struct v4l2_imgbus_framefmt *imgf)
 {
 	struct i2c_client *client = sd->priv;
-	struct v4l2_pix_format *pix = &f->fmt.pix;
 	struct ov9640_reg_alt alts = {0};
 	int ret;
 
-	ov9640_res_roundup(&pix->width, &pix->height);
-	ov9640_alter_regs(pix->pixelformat, &alts);
+	ov9640_res_roundup(&imgf->width, &imgf->height);
+	ov9640_alter_regs(imgf->code, &alts);
 
 	ov9640_reset(client);
 
@@ -578,16 +575,25 @@ static int ov9640_s_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
 	if (ret)
 		return ret;
 
-	return ov9640_write_regs(client, pix->width, pix->pixelformat, &alts);
+	return ov9640_write_regs(client, imgf->width, imgf->code, &alts);
 }
 
-static int ov9640_try_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
+static int ov9640_try_fmt(struct v4l2_subdev *sd,
+			  struct v4l2_imgbus_framefmt *imgf)
 {
-	struct v4l2_pix_format *pix = &f->fmt.pix;
+	ov9640_res_roundup(&imgf->width, &imgf->height);
+	imgf->field  = V4L2_FIELD_NONE;
 
-	ov9640_res_roundup(&pix->width, &pix->height);
-	pix->field  = V4L2_FIELD_NONE;
+	return 0;
+}
 
+static int ov9640_enum_fmt(struct v4l2_subdev *sd, int index,
+			   enum v4l2_imgbus_pixelcode *code)
+{
+	if ((unsigned int)index >= ARRAY_SIZE(ov9640_fmt_codes))
+		return -EINVAL;
+
+	*code = ov9640_fmt_codes[index];
 	return 0;
 }
 
@@ -637,9 +643,6 @@ static int ov9640_video_probe(struct soc_camera_device *icd,
 		goto err;
 	}
 
-	icd->formats		= ov9640_fmt_lists;
-	icd->num_formats	= ARRAY_SIZE(ov9640_fmt_lists);
-
 	/*
 	 * check and show product ID and manufacturer ID
 	 */
@@ -703,8 +706,9 @@ static struct v4l2_subdev_core_ops ov9640_core_ops = {
 
 static struct v4l2_subdev_video_ops ov9640_video_ops = {
 	.s_stream		= ov9640_s_stream,
-	.s_fmt			= ov9640_s_fmt,
-	.try_fmt		= ov9640_try_fmt,
+	.s_imgbus_fmt		= ov9640_s_fmt,
+	.try_imgbus_fmt		= ov9640_try_fmt,
+	.enum_imgbus_fmt	= ov9640_enum_fmt,
 	.cropcap		= ov9640_cropcap,
 	.g_crop			= ov9640_g_crop,
 
diff --git a/drivers/media/video/pxa_camera.c b/drivers/media/video/pxa_camera.c
index f063f59..8dece33 100644
--- a/drivers/media/video/pxa_camera.c
+++ b/drivers/media/video/pxa_camera.c
@@ -183,16 +183,12 @@ struct pxa_cam_dma {
 /* buffer for one video frame */
 struct pxa_buffer {
 	/* common v4l buffer stuff -- must be first */
-	struct videobuf_buffer vb;
-
-	const struct soc_camera_data_format        *fmt;
-
+	struct videobuf_buffer		vb;
+	enum v4l2_imgbus_pixelcode	code;
 	/* our descriptor lists for Y, U and V channels */
-	struct pxa_cam_dma dmas[3];
-
-	int			inwork;
-
-	enum pxa_camera_active_dma active_dma;
+	struct pxa_cam_dma		dmas[3];
+	int				inwork;
+	enum pxa_camera_active_dma	active_dma;
 };
 
 struct pxa_camera_dev {
@@ -243,11 +239,15 @@ static int pxa_videobuf_setup(struct videobuf_queue *vq, unsigned int *count,
 			      unsigned int *size)
 {
 	struct soc_camera_device *icd = vq->priv_data;
+	int bytes_per_line = v4l2_imgbus_bytes_per_line(icd->user_width,
+						icd->current_fmt->host_fmt);
+
+	if (bytes_per_line < 0)
+		return bytes_per_line;
 
 	dev_dbg(icd->dev.parent, "count=%d, size=%d\n", *count, *size);
 
-	*size = roundup(icd->user_width * icd->user_height *
-			((icd->current_fmt->depth + 7) >> 3), 8);
+	*size = bytes_per_line * icd->user_height;
 
 	if (0 == *count)
 		*count = 32;
@@ -433,6 +433,11 @@ static int pxa_videobuf_prepare(struct videobuf_queue *vq,
 	struct pxa_buffer *buf = container_of(vb, struct pxa_buffer, vb);
 	int ret;
 	int size_y, size_u = 0, size_v = 0;
+	int bytes_per_line = v4l2_imgbus_bytes_per_line(icd->user_width,
+						icd->current_fmt->host_fmt);
+
+	if (bytes_per_line < 0)
+		return bytes_per_line;
 
 	dev_dbg(dev, "%s (vb=0x%p) 0x%08lx %d\n", __func__,
 		vb, vb->baddr, vb->bsize);
@@ -456,18 +461,18 @@ static int pxa_videobuf_prepare(struct videobuf_queue *vq,
 	 */
 	buf->inwork = 1;
 
-	if (buf->fmt	!= icd->current_fmt ||
+	if (buf->code	!= icd->current_fmt->code ||
 	    vb->width	!= icd->user_width ||
 	    vb->height	!= icd->user_height ||
 	    vb->field	!= field) {
-		buf->fmt	= icd->current_fmt;
+		buf->code	= icd->current_fmt->code;
 		vb->width	= icd->user_width;
 		vb->height	= icd->user_height;
 		vb->field	= field;
 		vb->state	= VIDEOBUF_NEEDS_INIT;
 	}
 
-	vb->size = vb->width * vb->height * ((buf->fmt->depth + 7) >> 3);
+	vb->size = bytes_per_line * vb->height;
 	if (0 != vb->baddr && vb->bsize < vb->size) {
 		ret = -EINVAL;
 		goto out;
@@ -1157,9 +1162,15 @@ static int pxa_camera_set_bus_param(struct soc_camera_device *icd, __u32 pixfmt)
 	struct soc_camera_host *ici = to_soc_camera_host(icd->dev.parent);
 	struct pxa_camera_dev *pcdev = ici->priv;
 	unsigned long bus_flags, camera_flags, common_flags;
-	int ret = test_platform_param(pcdev, icd->buswidth, &bus_flags);
+	const struct v4l2_imgbus_pixelfmt *fmt;
+	int ret;
 	struct pxa_cam *cam = icd->host_priv;
 
+	fmt = v4l2_imgbus_get_fmtdesc(icd->current_fmt->code);
+	if (!fmt)
+		return -EINVAL;
+
+	ret = test_platform_param(pcdev, fmt->bits_per_sample, &bus_flags);
 	if (ret < 0)
 		return ret;
 
@@ -1223,59 +1234,50 @@ static int pxa_camera_try_bus_param(struct soc_camera_device *icd,
 	return soc_camera_bus_param_compatible(camera_flags, bus_flags) ? 0 : -EINVAL;
 }
 
-static const struct soc_camera_data_format pxa_camera_formats[] = {
+static const struct v4l2_imgbus_pixelfmt pxa_camera_formats[] = {
 	{
-		.name		= "Planar YUV422 16 bit",
-		.depth		= 16,
-		.fourcc		= V4L2_PIX_FMT_YUV422P,
-		.colorspace	= V4L2_COLORSPACE_JPEG,
+		.fourcc			= V4L2_PIX_FMT_YUV422P,
+		.colorspace		= V4L2_COLORSPACE_JPEG,
+		.name			= "Planar YUV422 16 bit",
+		.bits_per_sample	= 8,
+		.packing		= V4L2_IMGBUS_PACKING_2X8,
+		.order			= V4L2_IMGBUS_ORDER_LE,
 	},
 };
 
-static bool buswidth_supported(struct soc_camera_device *icd, int depth)
-{
-	struct soc_camera_host *ici = to_soc_camera_host(icd->dev.parent);
-	struct pxa_camera_dev *pcdev = ici->priv;
-
-	switch (depth) {
-	case 8:
-		return !!(pcdev->platform_flags & PXA_CAMERA_DATAWIDTH_8);
-	case 9:
-		return !!(pcdev->platform_flags & PXA_CAMERA_DATAWIDTH_9);
-	case 10:
-		return !!(pcdev->platform_flags & PXA_CAMERA_DATAWIDTH_10);
-	}
-	return false;
-}
-
-static int required_buswidth(const struct soc_camera_data_format *fmt)
+/* This will be corrected as we get more formats */
+static bool pxa_camera_packing_supported(const struct v4l2_imgbus_pixelfmt *fmt)
 {
-	switch (fmt->fourcc) {
-	case V4L2_PIX_FMT_UYVY:
-	case V4L2_PIX_FMT_VYUY:
-	case V4L2_PIX_FMT_YUYV:
-	case V4L2_PIX_FMT_YVYU:
-	case V4L2_PIX_FMT_RGB565:
-	case V4L2_PIX_FMT_RGB555:
-		return 8;
-	default:
-		return fmt->depth;
-	}
+	return	fmt->packing == V4L2_IMGBUS_PACKING_NONE ||
+		(fmt->bits_per_sample == 8 &&
+		 fmt->packing == V4L2_IMGBUS_PACKING_2X8) ||
+		(fmt->bits_per_sample > 8 &&
+		 fmt->packing == V4L2_IMGBUS_PACKING_EXTEND16);
 }
 
 static int pxa_camera_get_formats(struct soc_camera_device *icd, int idx,
 				  struct soc_camera_format_xlate *xlate)
 {
+	struct v4l2_subdev *sd = soc_camera_to_subdev(icd);
 	struct device *dev = icd->dev.parent;
-	int formats = 0, buswidth, ret;
+	int formats = 0, ret;
 	struct pxa_cam *cam;
+	enum v4l2_imgbus_pixelcode code;
+	const struct v4l2_imgbus_pixelfmt *fmt;
 
-	buswidth = required_buswidth(icd->formats + idx);
+	ret = v4l2_subdev_call(sd, video, enum_imgbus_fmt, idx, &code);
+	if (ret < 0)
+		/* No more formats */
+		return 0;
 
-	if (!buswidth_supported(icd, buswidth))
+	fmt = v4l2_imgbus_get_fmtdesc(code);
+	if (!fmt) {
+		dev_err(dev, "Invalid format code #%d: %d\n", idx, code);
 		return 0;
+	}
 
-	ret = pxa_camera_try_bus_param(icd, buswidth);
+	/* This also checks support for the requested bits-per-sample */
+	ret = pxa_camera_try_bus_param(icd, fmt->bits_per_sample);
 	if (ret < 0)
 		return 0;
 
@@ -1289,45 +1291,40 @@ static int pxa_camera_get_formats(struct soc_camera_device *icd, int idx,
 		cam = icd->host_priv;
 	}
 
-	switch (icd->formats[idx].fourcc) {
-	case V4L2_PIX_FMT_UYVY:
+	switch (code) {
+	case V4L2_IMGBUS_FMT_UYVY:
 		formats++;
 		if (xlate) {
-			xlate->host_fmt = &pxa_camera_formats[0];
-			xlate->cam_fmt = icd->formats + idx;
-			xlate->buswidth = buswidth;
+			xlate->host_fmt	= &pxa_camera_formats[0];
+			xlate->code	= code;
 			xlate++;
-			dev_dbg(dev, "Providing format %s using %s\n",
-				pxa_camera_formats[0].name,
-				icd->formats[idx].name);
+			dev_dbg(dev, "Providing format %s using code %d\n",
+				pxa_camera_formats[0].name, code);
 		}
-	case V4L2_PIX_FMT_VYUY:
-	case V4L2_PIX_FMT_YUYV:
-	case V4L2_PIX_FMT_YVYU:
-	case V4L2_PIX_FMT_RGB565:
-	case V4L2_PIX_FMT_RGB555:
-		formats++;
-		if (xlate) {
-			xlate->host_fmt = icd->formats + idx;
-			xlate->cam_fmt = icd->formats + idx;
-			xlate->buswidth = buswidth;
-			xlate++;
+	case V4L2_IMGBUS_FMT_VYUY:
+	case V4L2_IMGBUS_FMT_YUYV:
+	case V4L2_IMGBUS_FMT_YVYU:
+	case V4L2_IMGBUS_FMT_RGB565:
+	case V4L2_IMGBUS_FMT_RGB555:
+		if (xlate)
 			dev_dbg(dev, "Providing format %s packed\n",
-				icd->formats[idx].name);
-		}
+				fmt->name);
 		break;
 	default:
-		/* Generic pass-through */
-		formats++;
-		if (xlate) {
-			xlate->host_fmt = icd->formats + idx;
-			xlate->cam_fmt = icd->formats + idx;
-			xlate->buswidth = icd->formats[idx].depth;
-			xlate++;
+		if (!pxa_camera_packing_supported(fmt))
+			return 0;
+		if (xlate)
 			dev_dbg(dev,
 				"Providing format %s in pass-through mode\n",
-				icd->formats[idx].name);
-		}
+				fmt->name);
+	}
+
+	/* Generic pass-through */
+	formats++;
+	if (xlate) {
+		xlate->host_fmt	= fmt;
+		xlate->code	= code;
+		xlate++;
 	}
 
 	return formats;
@@ -1339,11 +1336,11 @@ static void pxa_camera_put_formats(struct soc_camera_device *icd)
 	icd->host_priv = NULL;
 }
 
-static int pxa_camera_check_frame(struct v4l2_pix_format *pix)
+static int pxa_camera_check_frame(u32 width, u32 height)
 {
 	/* limit to pxa hardware capabilities */
-	return pix->height < 32 || pix->height > 2048 || pix->width < 48 ||
-		pix->width > 2048 || (pix->width & 0x01);
+	return height < 32 || height > 2048 || width < 48 || width > 2048 ||
+		(width & 0x01);
 }
 
 static int pxa_camera_set_crop(struct soc_camera_device *icd,
@@ -1358,9 +1355,9 @@ static int pxa_camera_set_crop(struct soc_camera_device *icd,
 		.master_clock = pcdev->mclk,
 		.pixel_clock_max = pcdev->ciclk / 4,
 	};
-	struct v4l2_format f;
-	struct v4l2_pix_format *pix = &f.fmt.pix, pix_tmp;
+	struct v4l2_imgbus_framefmt imgf;
 	struct pxa_cam *cam = icd->host_priv;
+	u32 fourcc = icd->current_fmt->host_fmt->fourcc;
 	int ret;
 
 	/* If PCLK is used to latch data from the sensor, check sense */
@@ -1377,27 +1374,23 @@ static int pxa_camera_set_crop(struct soc_camera_device *icd,
 		return ret;
 	}
 
-	f.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
-
-	ret = v4l2_subdev_call(sd, video, g_fmt, &f);
+	ret = v4l2_subdev_call(sd, video, g_imgbus_fmt, &imgf);
 	if (ret < 0)
 		return ret;
 
-	pix_tmp = *pix;
-	if (pxa_camera_check_frame(pix)) {
+	if (pxa_camera_check_frame(imgf.width, imgf.height)) {
 		/*
 		 * Camera cropping produced a frame beyond our capabilities.
 		 * FIXME: just extract a subframe, that we can process.
 		 */
-		v4l_bound_align_image(&pix->width, 48, 2048, 1,
-			&pix->height, 32, 2048, 0,
-			icd->current_fmt->fourcc == V4L2_PIX_FMT_YUV422P ?
-				4 : 0);
-		ret = v4l2_subdev_call(sd, video, s_fmt, &f);
+		v4l_bound_align_image(&imgf.width, 48, 2048, 1,
+			&imgf.height, 32, 2048, 0,
+			fourcc == V4L2_PIX_FMT_YUV422P ? 4 : 0);
+		ret = v4l2_subdev_call(sd, video, s_imgbus_fmt, &imgf);
 		if (ret < 0)
 			return ret;
 
-		if (pxa_camera_check_frame(pix)) {
+		if (pxa_camera_check_frame(imgf.width, imgf.height)) {
 			dev_warn(icd->dev.parent,
 				 "Inconsistent state. Use S_FMT to repair\n");
 			return -EINVAL;
@@ -1414,10 +1407,10 @@ static int pxa_camera_set_crop(struct soc_camera_device *icd,
 		recalculate_fifo_timeout(pcdev, sense.pixel_clock);
 	}
 
-	icd->user_width = pix->width;
-	icd->user_height = pix->height;
+	icd->user_width		= imgf.width;
+	icd->user_height	= imgf.height;
 
-	pxa_camera_setup_cicr(icd, cam->flags, icd->current_fmt->fourcc);
+	pxa_camera_setup_cicr(icd, cam->flags, fourcc);
 
 	return ret;
 }
@@ -1429,14 +1422,13 @@ static int pxa_camera_set_fmt(struct soc_camera_device *icd,
 	struct pxa_camera_dev *pcdev = ici->priv;
 	struct device *dev = icd->dev.parent;
 	struct v4l2_subdev *sd = soc_camera_to_subdev(icd);
-	const struct soc_camera_data_format *cam_fmt = NULL;
 	const struct soc_camera_format_xlate *xlate = NULL;
 	struct soc_camera_sense sense = {
 		.master_clock = pcdev->mclk,
 		.pixel_clock_max = pcdev->ciclk / 4,
 	};
 	struct v4l2_pix_format *pix = &f->fmt.pix;
-	struct v4l2_format cam_f = *f;
+	struct v4l2_imgbus_framefmt imgf;
 	int ret;
 
 	xlate = soc_camera_xlate_by_fourcc(icd, pix->pixelformat);
@@ -1445,26 +1437,27 @@ static int pxa_camera_set_fmt(struct soc_camera_device *icd,
 		return -EINVAL;
 	}
 
-	cam_fmt = xlate->cam_fmt;
-
 	/* If PCLK is used to latch data from the sensor, check sense */
 	if (pcdev->platform_flags & PXA_CAMERA_PCLK_EN)
+		/* The caller holds a mutex. */
 		icd->sense = &sense;
 
-	cam_f.fmt.pix.pixelformat = cam_fmt->fourcc;
-	ret = v4l2_subdev_call(sd, video, s_fmt, &cam_f);
-	cam_f.fmt.pix.pixelformat = pix->pixelformat;
-	*pix = cam_f.fmt.pix;
+	imgf.width	= pix->width;
+	imgf.height	= pix->height;
+	imgf.code	= xlate->code;
+	imgf.field	= pix->field;
+
+	ret = v4l2_subdev_call(sd, video, s_imgbus_fmt, &imgf);
 
 	icd->sense = NULL;
 
 	if (ret < 0) {
 		dev_warn(dev, "Failed to configure for format %x\n",
 			 pix->pixelformat);
-	} else if (pxa_camera_check_frame(pix)) {
+	} else if (pxa_camera_check_frame(imgf.width, imgf.height)) {
 		dev_warn(dev,
 			 "Camera driver produced an unsupported frame %dx%d\n",
-			 pix->width, pix->height);
+			 imgf.width, imgf.height);
 		ret = -EINVAL;
 	} else if (sense.flags & SOCAM_SENSE_PCLK_CHANGED) {
 		if (sense.pixel_clock > sense.pixel_clock_max) {
@@ -1476,10 +1469,13 @@ static int pxa_camera_set_fmt(struct soc_camera_device *icd,
 		recalculate_fifo_timeout(pcdev, sense.pixel_clock);
 	}
 
-	if (!ret) {
-		icd->buswidth = xlate->buswidth;
-		icd->current_fmt = xlate->host_fmt;
-	}
+	if (ret < 0)
+		return ret;
+
+	pix->width		= imgf.width;
+	pix->height		= imgf.height;
+	pix->field		= imgf.field;
+	icd->current_fmt	= xlate;
 
 	return ret;
 }
@@ -1487,17 +1483,16 @@ static int pxa_camera_set_fmt(struct soc_camera_device *icd,
 static int pxa_camera_try_fmt(struct soc_camera_device *icd,
 			      struct v4l2_format *f)
 {
-	struct soc_camera_host *ici = to_soc_camera_host(icd->dev.parent);
 	struct v4l2_subdev *sd = soc_camera_to_subdev(icd);
 	const struct soc_camera_format_xlate *xlate;
 	struct v4l2_pix_format *pix = &f->fmt.pix;
+	struct v4l2_imgbus_framefmt imgf;
 	__u32 pixfmt = pix->pixelformat;
-	enum v4l2_field field;
 	int ret;
 
 	xlate = soc_camera_xlate_by_fourcc(icd, pixfmt);
 	if (!xlate) {
-		dev_warn(ici->v4l2_dev.dev, "Format %x not found\n", pixfmt);
+		dev_warn(icd->dev.parent, "Format %x not found\n", pixfmt);
 		return -EINVAL;
 	}
 
@@ -1511,22 +1506,34 @@ static int pxa_camera_try_fmt(struct soc_camera_device *icd,
 			      &pix->height, 32, 2048, 0,
 			      pixfmt == V4L2_PIX_FMT_YUV422P ? 4 : 0);
 
-	pix->bytesperline = pix->width *
-		DIV_ROUND_UP(xlate->host_fmt->depth, 8);
+	ret = v4l2_imgbus_bytes_per_line(pix->width, xlate->host_fmt);
+	if (ret < 0)
+		return ret;
+	pix->bytesperline = ret;
 	pix->sizeimage = pix->height * pix->bytesperline;
 
-	/* camera has to see its format, but the user the original one */
-	pix->pixelformat = xlate->cam_fmt->fourcc;
 	/* limit to sensor capabilities */
-	ret = v4l2_subdev_call(sd, video, try_fmt, f);
-	pix->pixelformat = pixfmt;
+	imgf.width	= pix->width;
+	imgf.height	= pix->height;
+	imgf.field	= pix->field;
+	imgf.code	= xlate->code;
 
-	field = pix->field;
+	ret = v4l2_subdev_call(sd, video, try_imgbus_fmt, &imgf);
+	if (ret < 0)
+		return ret;
 
-	if (field == V4L2_FIELD_ANY) {
-		pix->field = V4L2_FIELD_NONE;
-	} else if (field != V4L2_FIELD_NONE) {
-		dev_err(icd->dev.parent, "Field type %d unsupported.\n", field);
+	pix->width	= imgf.width;
+	pix->height	= imgf.height;
+
+	switch (imgf.field) {
+	case V4L2_FIELD_ANY:
+	case V4L2_FIELD_NONE:
+		pix->field	= V4L2_FIELD_NONE;
+		break;
+	default:
+		/* TODO: support interlaced at least in pass-through mode */
+		dev_err(icd->dev.parent, "Field type %d unsupported.\n",
+			imgf.field);
 		return -EINVAL;
 	}
 
diff --git a/drivers/media/video/rj54n1cb0c.c b/drivers/media/video/rj54n1cb0c.c
index 373f2a3..0c998e8 100644
--- a/drivers/media/video/rj54n1cb0c.c
+++ b/drivers/media/video/rj54n1cb0c.c
@@ -85,18 +85,16 @@
 
 /* I2C addresses: 0x50, 0x51, 0x60, 0x61 */
 
-static const struct soc_camera_data_format rj54n1_colour_formats[] = {
-	{
-		.name		= "YUYV",
-		.depth		= 16,
-		.fourcc		= V4L2_PIX_FMT_YUYV,
-		.colorspace	= V4L2_COLORSPACE_JPEG,
-	}, {
-		.name		= "RGB565",
-		.depth		= 16,
-		.fourcc		= V4L2_PIX_FMT_RGB565,
-		.colorspace	= V4L2_COLORSPACE_SRGB,
-	}
+static const enum v4l2_imgbus_pixelcode rj54n1_colour_codes[] = {
+	V4L2_IMGBUS_FMT_YUYV,
+	V4L2_IMGBUS_FMT_YVYU,
+	V4L2_IMGBUS_FMT_RGB565,
+	V4L2_IMGBUS_FMT_RGB565X,
+	V4L2_IMGBUS_FMT_SBGGR10_2X8_PADHI_BE,
+	V4L2_IMGBUS_FMT_SBGGR10_2X8_PADLO_BE,
+	V4L2_IMGBUS_FMT_SBGGR10_2X8_PADHI_LE,
+	V4L2_IMGBUS_FMT_SBGGR10_2X8_PADLO_LE,
+	V4L2_IMGBUS_FMT_SBGGR10,
 };
 
 struct rj54n1_clock_div {
@@ -109,12 +107,12 @@ struct rj54n1_clock_div {
 
 struct rj54n1 {
 	struct v4l2_subdev subdev;
+	enum v4l2_imgbus_pixelcode code;
 	struct v4l2_rect rect;	/* Sensor window */
 	unsigned short width;	/* Output window */
 	unsigned short height;
 	unsigned short resize;	/* Sensor * 1024 / resize = Output */
 	struct rj54n1_clock_div clk_div;
-	u32 fourcc;
 	unsigned short scale;
 	u8 bank;
 };
@@ -440,6 +438,16 @@ static int reg_write_multiple(struct i2c_client *client,
 	return 0;
 }
 
+static int rj54n1_enum_fmt(struct v4l2_subdev *sd, int index,
+			   enum v4l2_imgbus_pixelcode *code)
+{
+	if ((unsigned int)index >= ARRAY_SIZE(rj54n1_colour_codes))
+		return -EINVAL;
+
+	*code = rj54n1_colour_codes[index];
+	return 0;
+}
+
 static int rj54n1_s_stream(struct v4l2_subdev *sd, int enable)
 {
 	/* TODO: start / stop streaming */
@@ -527,16 +535,16 @@ static int rj54n1_cropcap(struct v4l2_subdev *sd, struct v4l2_cropcap *a)
 	return 0;
 }
 
-static int rj54n1_g_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
+static int rj54n1_g_fmt(struct v4l2_subdev *sd,
+			struct v4l2_imgbus_framefmt *imgf)
 {
 	struct i2c_client *client = sd->priv;
 	struct rj54n1 *rj54n1 = to_rj54n1(client);
-	struct v4l2_pix_format *pix = &f->fmt.pix;
 
-	pix->pixelformat	= rj54n1->fourcc;
-	pix->field		= V4L2_FIELD_NONE;
-	pix->width		= rj54n1->width;
-	pix->height		= rj54n1->height;
+	imgf->code	= rj54n1->code;
+	imgf->field	= V4L2_FIELD_NONE;
+	imgf->width	= rj54n1->width;
+	imgf->height	= rj54n1->height;
 
 	return 0;
 }
@@ -787,26 +795,33 @@ static int rj54n1_reg_init(struct i2c_client *client)
 }
 
 /* FIXME: streaming output only up to 800x600 is functional */
-static int rj54n1_try_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
+static int rj54n1_try_fmt(struct v4l2_subdev *sd,
+			  struct v4l2_imgbus_framefmt *imgf)
 {
-	struct v4l2_pix_format *pix = &f->fmt.pix;
+	struct i2c_client *client = sd->priv;
+	int align = imgf->code == V4L2_IMGBUS_FMT_SBGGR10 ||
+		imgf->code == V4L2_IMGBUS_FMT_SBGGR10_2X8_PADHI_BE ||
+		imgf->code == V4L2_IMGBUS_FMT_SBGGR10_2X8_PADLO_BE ||
+		imgf->code == V4L2_IMGBUS_FMT_SBGGR10_2X8_PADHI_LE ||
+		imgf->code == V4L2_IMGBUS_FMT_SBGGR10_2X8_PADLO_LE;
 
-	pix->field = V4L2_FIELD_NONE;
+	dev_dbg(&client->dev, "%s: code = %d, width = %u, height = %u\n",
+		__func__, imgf->code, imgf->width, imgf->height);
 
-	if (pix->width > 800)
-		pix->width = 800;
-	if (pix->height > 600)
-		pix->height = 600;
+	imgf->field = V4L2_FIELD_NONE;
+
+	v4l_bound_align_image(&imgf->width, 112, RJ54N1_MAX_WIDTH, align,
+			      &imgf->height, 84, RJ54N1_MAX_HEIGHT, align, 0);
 
 	return 0;
 }
 
-static int rj54n1_s_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
+static int rj54n1_s_fmt(struct v4l2_subdev *sd,
+			struct v4l2_imgbus_framefmt *imgf)
 {
 	struct i2c_client *client = sd->priv;
 	struct rj54n1 *rj54n1 = to_rj54n1(client);
-	struct v4l2_pix_format *pix = &f->fmt.pix;
-	unsigned int output_w, output_h,
+	unsigned int output_w, output_h, max_w, max_h,
 		input_w = rj54n1->rect.width, input_h = rj54n1->rect.height;
 	int ret;
 
@@ -814,7 +829,7 @@ static int rj54n1_s_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
 	 * The host driver can call us without .try_fmt(), so, we have to take
 	 * care ourseleves
 	 */
-	ret = rj54n1_try_fmt(sd, f);
+	ret = rj54n1_try_fmt(sd, imgf);
 
 	/*
 	 * Verify if the sensor has just been powered on. TODO: replace this
@@ -832,49 +847,97 @@ static int rj54n1_s_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
 	}
 
 	/* RA_SEL_UL is only relevant for raw modes, ignored otherwise. */
-	switch (pix->pixelformat) {
-	case V4L2_PIX_FMT_YUYV:
+	switch (imgf->code) {
+	case V4L2_IMGBUS_FMT_YUYV:
 		ret = reg_write(client, RJ54N1_OUT_SEL, 0);
 		if (!ret)
 			ret = reg_set(client, RJ54N1_BYTE_SWAP, 8, 8);
 		break;
-	case V4L2_PIX_FMT_RGB565:
+	case V4L2_IMGBUS_FMT_YVYU:
+		ret = reg_write(client, RJ54N1_OUT_SEL, 0);
+		if (!ret)
+			ret = reg_set(client, RJ54N1_BYTE_SWAP, 0, 8);
+		break;
+	case V4L2_IMGBUS_FMT_RGB565:
 		ret = reg_write(client, RJ54N1_OUT_SEL, 0x11);
 		if (!ret)
 			ret = reg_set(client, RJ54N1_BYTE_SWAP, 8, 8);
 		break;
+	case V4L2_IMGBUS_FMT_RGB565X:
+		ret = reg_write(client, RJ54N1_OUT_SEL, 0x11);
+		if (!ret)
+			ret = reg_set(client, RJ54N1_BYTE_SWAP, 0, 8);
+		break;
+	case V4L2_IMGBUS_FMT_SBGGR10_2X8_PADLO_LE:
+		ret = reg_write(client, RJ54N1_OUT_SEL, 4);
+		if (!ret)
+			ret = reg_set(client, RJ54N1_BYTE_SWAP, 8, 8);
+		if (!ret)
+			ret = reg_write(client, RJ54N1_RA_SEL_UL, 0);
+		break;
+	case V4L2_IMGBUS_FMT_SBGGR10_2X8_PADHI_LE:
+		ret = reg_write(client, RJ54N1_OUT_SEL, 4);
+		if (!ret)
+			ret = reg_set(client, RJ54N1_BYTE_SWAP, 8, 8);
+		if (!ret)
+			ret = reg_write(client, RJ54N1_RA_SEL_UL, 8);
+		break;
+	case V4L2_IMGBUS_FMT_SBGGR10_2X8_PADLO_BE:
+		ret = reg_write(client, RJ54N1_OUT_SEL, 4);
+		if (!ret)
+			ret = reg_set(client, RJ54N1_BYTE_SWAP, 0, 8);
+		if (!ret)
+			ret = reg_write(client, RJ54N1_RA_SEL_UL, 0);
+		break;
+	case V4L2_IMGBUS_FMT_SBGGR10_2X8_PADHI_BE:
+		ret = reg_write(client, RJ54N1_OUT_SEL, 4);
+		if (!ret)
+			ret = reg_set(client, RJ54N1_BYTE_SWAP, 0, 8);
+		if (!ret)
+			ret = reg_write(client, RJ54N1_RA_SEL_UL, 8);
+		break;
+	case V4L2_IMGBUS_FMT_SBGGR10:
+		ret = reg_write(client, RJ54N1_OUT_SEL, 5);
+		break;
 	default:
 		ret = -EINVAL;
 	}
 
+	/* Special case: a raw mode with 10 bits of data per clock tick */
+	if (!ret)
+		ret = reg_set(client, RJ54N1_OCLK_SEL_EN,
+			      (imgf->code == V4L2_IMGBUS_FMT_SBGGR10) << 1, 2);
+
 	if (ret < 0)
 		return ret;
 
-	/* Supported scales 1:1 - 1:16 */
-	if (pix->width < input_w / 16)
-		pix->width = input_w / 16;
-	if (pix->height < input_h / 16)
-		pix->height = input_h / 16;
+	/* Supported scales 1:1 >= scale > 1:16 */
+	max_w = imgf->width * (16 * 1024 - 1) / 1024;
+	if (input_w > max_w)
+		input_w = max_w;
+	max_h = imgf->height * (16 * 1024 - 1) / 1024;
+	if (input_h > max_h)
+		input_h = max_h;
 
-	output_w = pix->width;
-	output_h = pix->height;
+	output_w = imgf->width;
+	output_h = imgf->height;
 
 	ret = rj54n1_sensor_scale(sd, &input_w, &input_h, &output_w, &output_h);
 	if (ret < 0)
 		return ret;
 
-	rj54n1->fourcc		= pix->pixelformat;
+	rj54n1->code		= imgf->code;
 	rj54n1->resize		= ret;
 	rj54n1->rect.width	= input_w;
 	rj54n1->rect.height	= input_h;
 	rj54n1->width		= output_w;
 	rj54n1->height		= output_h;
 
-	pix->width		= output_w;
-	pix->height		= output_h;
-	pix->field		= V4L2_FIELD_NONE;
+	imgf->width		= output_w;
+	imgf->height		= output_h;
+	imgf->field		= V4L2_FIELD_NONE;
 
-	return ret;
+	return 0;
 }
 
 static int rj54n1_g_chip_ident(struct v4l2_subdev *sd,
@@ -1053,12 +1116,13 @@ static struct v4l2_subdev_core_ops rj54n1_subdev_core_ops = {
 };
 
 static struct v4l2_subdev_video_ops rj54n1_subdev_video_ops = {
-	.s_stream	= rj54n1_s_stream,
-	.s_fmt		= rj54n1_s_fmt,
-	.g_fmt		= rj54n1_g_fmt,
-	.try_fmt	= rj54n1_try_fmt,
-	.g_crop		= rj54n1_g_crop,
-	.cropcap	= rj54n1_cropcap,
+	.s_stream		= rj54n1_s_stream,
+	.s_imgbus_fmt		= rj54n1_s_fmt,
+	.g_imgbus_fmt		= rj54n1_g_fmt,
+	.try_imgbus_fmt		= rj54n1_try_fmt,
+	.enum_imgbus_fmt	= rj54n1_enum_fmt,
+	.g_crop			= rj54n1_g_crop,
+	.cropcap		= rj54n1_cropcap,
 };
 
 static struct v4l2_subdev_ops rj54n1_subdev_ops = {
@@ -1153,7 +1220,7 @@ static int rj54n1_probe(struct i2c_client *client,
 	rj54n1->rect.height	= RJ54N1_MAX_HEIGHT;
 	rj54n1->width		= RJ54N1_MAX_WIDTH;
 	rj54n1->height		= RJ54N1_MAX_HEIGHT;
-	rj54n1->fourcc		= V4L2_PIX_FMT_YUYV;
+	rj54n1->code		= rj54n1_colour_codes[0];
 	rj54n1->resize		= 1024;
 
 	ret = rj54n1_video_probe(icd, client);
@@ -1164,9 +1231,6 @@ static int rj54n1_probe(struct i2c_client *client,
 		return ret;
 	}
 
-	icd->formats		= rj54n1_colour_formats;
-	icd->num_formats	= ARRAY_SIZE(rj54n1_colour_formats);
-
 	return ret;
 }
 
diff --git a/drivers/media/video/sh_mobile_ceu_camera.c b/drivers/media/video/sh_mobile_ceu_camera.c
index 0b7e32b..746aed0 100644
--- a/drivers/media/video/sh_mobile_ceu_camera.c
+++ b/drivers/media/video/sh_mobile_ceu_camera.c
@@ -37,6 +37,7 @@
 #include <media/soc_camera.h>
 #include <media/sh_mobile_ceu.h>
 #include <media/videobuf-dma-contig.h>
+#include <media/v4l2-imagebus.h>
 
 /* register offsets for sh7722 / sh7723 */
 
@@ -84,7 +85,7 @@
 /* per video frame buffer */
 struct sh_mobile_ceu_buffer {
 	struct videobuf_buffer vb; /* v4l buffer must be first */
-	const struct soc_camera_data_format *fmt;
+	enum v4l2_imgbus_pixelcode code;
 };
 
 struct sh_mobile_ceu_dev {
@@ -113,8 +114,8 @@ struct sh_mobile_ceu_cam {
 	struct v4l2_rect ceu_rect;
 	unsigned int cam_width;
 	unsigned int cam_height;
-	const struct soc_camera_data_format *extra_fmt;
-	const struct soc_camera_data_format *camera_fmt;
+	const struct v4l2_imgbus_pixelfmt *extra_fmt;
+	enum v4l2_imgbus_pixelcode code;
 };
 
 static unsigned long make_bus_param(struct sh_mobile_ceu_dev *pcdev)
@@ -195,10 +196,13 @@ static int sh_mobile_ceu_videobuf_setup(struct videobuf_queue *vq,
 	struct soc_camera_device *icd = vq->priv_data;
 	struct soc_camera_host *ici = to_soc_camera_host(icd->dev.parent);
 	struct sh_mobile_ceu_dev *pcdev = ici->priv;
-	int bytes_per_pixel = (icd->current_fmt->depth + 7) >> 3;
+	int bytes_per_line = v4l2_imgbus_bytes_per_line(icd->user_width,
+						icd->current_fmt->host_fmt);
 
-	*size = PAGE_ALIGN(icd->user_width * icd->user_height *
-			   bytes_per_pixel);
+	if (bytes_per_line < 0)
+		return bytes_per_line;
+
+	*size = PAGE_ALIGN(bytes_per_line * icd->user_height);
 
 	if (0 == *count)
 		*count = 2;
@@ -282,7 +286,7 @@ static int sh_mobile_ceu_capture(struct sh_mobile_ceu_dev *pcdev)
 		ceu_write(pcdev, CDBYR, phys_addr_bottom);
 	}
 
-	switch (icd->current_fmt->fourcc) {
+	switch (icd->current_fmt->host_fmt->fourcc) {
 	case V4L2_PIX_FMT_NV12:
 	case V4L2_PIX_FMT_NV21:
 	case V4L2_PIX_FMT_NV16:
@@ -309,8 +313,13 @@ static int sh_mobile_ceu_videobuf_prepare(struct videobuf_queue *vq,
 {
 	struct soc_camera_device *icd = vq->priv_data;
 	struct sh_mobile_ceu_buffer *buf;
+	int bytes_per_line = v4l2_imgbus_bytes_per_line(icd->user_width,
+						icd->current_fmt->host_fmt);
 	int ret;
 
+	if (bytes_per_line < 0)
+		return bytes_per_line;
+
 	buf = container_of(vb, struct sh_mobile_ceu_buffer, vb);
 
 	dev_dbg(icd->dev.parent, "%s (vb=0x%p) 0x%08lx %zd\n", __func__,
@@ -329,18 +338,18 @@ static int sh_mobile_ceu_videobuf_prepare(struct videobuf_queue *vq,
 
 	BUG_ON(NULL == icd->current_fmt);
 
-	if (buf->fmt	!= icd->current_fmt ||
+	if (buf->code	!= icd->current_fmt->code ||
 	    vb->width	!= icd->user_width ||
 	    vb->height	!= icd->user_height ||
 	    vb->field	!= field) {
-		buf->fmt	= icd->current_fmt;
+		buf->code	= icd->current_fmt->code;
 		vb->width	= icd->user_width;
 		vb->height	= icd->user_height;
 		vb->field	= field;
 		vb->state	= VIDEOBUF_NEEDS_INIT;
 	}
 
-	vb->size = vb->width * vb->height * ((buf->fmt->depth + 7) >> 3);
+	vb->size = vb->height * bytes_per_line;
 	if (0 != vb->baddr && vb->bsize < vb->size) {
 		ret = -EINVAL;
 		goto out;
@@ -564,7 +573,8 @@ static void sh_mobile_ceu_set_rect(struct soc_camera_device *icd,
 		}
 		width = cdwdr_width = out_width;
 	} else {
-		unsigned int w_factor = (icd->current_fmt->depth + 7) >> 3;
+		unsigned int w_factor = (7 +
+			icd->current_fmt->host_fmt->bits_per_sample) >> 3;
 
 		width = out_width * w_factor / 2;
 
@@ -671,24 +681,24 @@ static int sh_mobile_ceu_set_bus_param(struct soc_camera_device *icd,
 	value = 0x00000010; /* data fetch by default */
 	yuv_lineskip = 0;
 
-	switch (icd->current_fmt->fourcc) {
+	switch (icd->current_fmt->host_fmt->fourcc) {
 	case V4L2_PIX_FMT_NV12:
 	case V4L2_PIX_FMT_NV21:
 		yuv_lineskip = 1; /* skip for NV12/21, no skip for NV16/61 */
 		/* fall-through */
 	case V4L2_PIX_FMT_NV16:
 	case V4L2_PIX_FMT_NV61:
-		switch (cam->camera_fmt->fourcc) {
-		case V4L2_PIX_FMT_UYVY:
+		switch (cam->code) {
+		case V4L2_IMGBUS_FMT_UYVY:
 			value = 0x00000000; /* Cb0, Y0, Cr0, Y1 */
 			break;
-		case V4L2_PIX_FMT_VYUY:
+		case V4L2_IMGBUS_FMT_VYUY:
 			value = 0x00000100; /* Cr0, Y0, Cb0, Y1 */
 			break;
-		case V4L2_PIX_FMT_YUYV:
+		case V4L2_IMGBUS_FMT_YUYV:
 			value = 0x00000200; /* Y0, Cb0, Y1, Cr0 */
 			break;
-		case V4L2_PIX_FMT_YVYU:
+		case V4L2_IMGBUS_FMT_YVYU:
 			value = 0x00000300; /* Y0, Cr0, Y1, Cb0 */
 			break;
 		default:
@@ -696,8 +706,8 @@ static int sh_mobile_ceu_set_bus_param(struct soc_camera_device *icd,
 		}
 	}
 
-	if (icd->current_fmt->fourcc == V4L2_PIX_FMT_NV21 ||
-	    icd->current_fmt->fourcc == V4L2_PIX_FMT_NV61)
+	if (icd->current_fmt->host_fmt->fourcc == V4L2_PIX_FMT_NV21 ||
+	    icd->current_fmt->host_fmt->fourcc == V4L2_PIX_FMT_NV61)
 		value ^= 0x00000100; /* swap U, V to change from NV1x->NVx1 */
 
 	value |= common_flags & SOCAM_VSYNC_ACTIVE_LOW ? 1 << 1 : 0;
@@ -744,7 +754,8 @@ static int sh_mobile_ceu_set_bus_param(struct soc_camera_device *icd,
 	return 0;
 }
 
-static int sh_mobile_ceu_try_bus_param(struct soc_camera_device *icd)
+static int sh_mobile_ceu_try_bus_param(struct soc_camera_device *icd,
+				       unsigned char buswidth)
 {
 	struct soc_camera_host *ici = to_soc_camera_host(icd->dev.parent);
 	struct sh_mobile_ceu_dev *pcdev = ici->priv;
@@ -753,48 +764,79 @@ static int sh_mobile_ceu_try_bus_param(struct soc_camera_device *icd)
 	camera_flags = icd->ops->query_bus_param(icd);
 	common_flags = soc_camera_bus_param_compatible(camera_flags,
 						       make_bus_param(pcdev));
-	if (!common_flags)
+	if (!common_flags || buswidth > 16 ||
+	    (buswidth > 8 && !(common_flags & SOCAM_DATAWIDTH_16)))
 		return -EINVAL;
 
 	return 0;
 }
 
-static const struct soc_camera_data_format sh_mobile_ceu_formats[] = {
-	{
-		.name		= "NV12",
-		.depth		= 12,
-		.fourcc		= V4L2_PIX_FMT_NV12,
-		.colorspace	= V4L2_COLORSPACE_JPEG,
-	},
-	{
-		.name		= "NV21",
-		.depth		= 12,
-		.fourcc		= V4L2_PIX_FMT_NV21,
-		.colorspace	= V4L2_COLORSPACE_JPEG,
-	},
-	{
-		.name		= "NV16",
-		.depth		= 16,
-		.fourcc		= V4L2_PIX_FMT_NV16,
-		.colorspace	= V4L2_COLORSPACE_JPEG,
-	},
+static const struct v4l2_imgbus_pixelfmt sh_mobile_ceu_formats[] = {
 	{
-		.name		= "NV61",
-		.depth		= 16,
-		.fourcc		= V4L2_PIX_FMT_NV61,
-		.colorspace	= V4L2_COLORSPACE_JPEG,
+		.fourcc			= V4L2_PIX_FMT_NV12,
+		.colorspace		= V4L2_COLORSPACE_JPEG,
+		.name			= "NV12",
+		.bits_per_sample	= 12,
+		.packing		= V4L2_IMGBUS_PACKING_NONE,
+		.order			= V4L2_IMGBUS_ORDER_LE,
+	}, {
+		.fourcc			= V4L2_PIX_FMT_NV21,
+		.colorspace		= V4L2_COLORSPACE_JPEG,
+		.name			= "NV21",
+		.bits_per_sample	= 12,
+		.packing		= V4L2_IMGBUS_PACKING_NONE,
+		.order			= V4L2_IMGBUS_ORDER_LE,
+	}, {
+		.fourcc			= V4L2_PIX_FMT_NV16,
+		.colorspace		= V4L2_COLORSPACE_JPEG,
+		.name			= "NV16",
+		.bits_per_sample	= 16,
+		.packing		= V4L2_IMGBUS_PACKING_NONE,
+		.order			= V4L2_IMGBUS_ORDER_LE,
+	}, {
+		.fourcc			= V4L2_PIX_FMT_NV61,
+		.colorspace		= V4L2_COLORSPACE_JPEG,
+		.name			= "NV61",
+		.bits_per_sample	= 16,
+		.packing		= V4L2_IMGBUS_PACKING_NONE,
+		.order			= V4L2_IMGBUS_ORDER_LE,
 	},
 };
 
+/* This will be corrected as we get more formats */
+static bool sh_mobile_ceu_packing_supported(const struct v4l2_imgbus_pixelfmt *fmt)
+{
+	return	fmt->packing == V4L2_IMGBUS_PACKING_NONE ||
+		(fmt->bits_per_sample == 8 &&
+		 fmt->packing == V4L2_IMGBUS_PACKING_2X8_PADHI) ||
+		(fmt->bits_per_sample > 8 &&
+		 fmt->packing == V4L2_IMGBUS_PACKING_EXTEND16);
+}
+
 static int sh_mobile_ceu_get_formats(struct soc_camera_device *icd, int idx,
 				     struct soc_camera_format_xlate *xlate)
 {
+	struct v4l2_subdev *sd = soc_camera_to_subdev(icd);
 	struct device *dev = icd->dev.parent;
 	int ret, k, n;
 	int formats = 0;
 	struct sh_mobile_ceu_cam *cam;
+	enum v4l2_imgbus_pixelcode code;
+	const struct v4l2_imgbus_pixelfmt *fmt;
+
+	ret = v4l2_subdev_call(sd, video, enum_imgbus_fmt, idx, &code);
+	if (ret < 0)
+		/* No more formats */
+		return 0;
 
-	ret = sh_mobile_ceu_try_bus_param(icd);
+	fmt = v4l2_imgbus_get_fmtdesc(code);
+	if (!fmt) {
+		dev_err(icd->dev.parent,
+			"Invalid format code #%d: %d\n", idx, code);
+		return -EINVAL;
+	}
+
+	ret = sh_mobile_ceu_try_bus_param(icd, fmt->bits_per_sample);
 	if (ret < 0)
 		return 0;
 
@@ -812,13 +854,13 @@ static int sh_mobile_ceu_get_formats(struct soc_camera_device *icd, int idx,
 	if (!idx)
 		cam->extra_fmt = NULL;
 
-	switch (icd->formats[idx].fourcc) {
-	case V4L2_PIX_FMT_UYVY:
-	case V4L2_PIX_FMT_VYUY:
-	case V4L2_PIX_FMT_YUYV:
-	case V4L2_PIX_FMT_YVYU:
+	switch (code) {
+	case V4L2_IMGBUS_FMT_UYVY:
+	case V4L2_IMGBUS_FMT_VYUY:
+	case V4L2_IMGBUS_FMT_YUYV:
+	case V4L2_IMGBUS_FMT_YVYU:
 		if (cam->extra_fmt)
-			goto add_single_format;
+			break;
 
 		/*
 		 * Our case is simple so far: for any of the above four camera
@@ -829,32 +871,31 @@ static int sh_mobile_ceu_get_formats(struct soc_camera_device *icd, int idx,
 		 * the host_priv pointer and check whether the format you're
 		 * going to add now is already there.
 		 */
-		cam->extra_fmt = (void *)sh_mobile_ceu_formats;
+		cam->extra_fmt = sh_mobile_ceu_formats;
 
 		n = ARRAY_SIZE(sh_mobile_ceu_formats);
 		formats += n;
 		for (k = 0; xlate && k < n; k++) {
-			xlate->host_fmt = &sh_mobile_ceu_formats[k];
-			xlate->cam_fmt = icd->formats + idx;
-			xlate->buswidth = icd->formats[idx].depth;
+			xlate->host_fmt	= &sh_mobile_ceu_formats[k];
+			xlate->code	= code;
 			xlate++;
-			dev_dbg(dev, "Providing format %s using %s\n",
-				sh_mobile_ceu_formats[k].name,
-				icd->formats[idx].name);
+			dev_dbg(dev, "Providing format %s using code %d\n",
+				sh_mobile_ceu_formats[k].name, code);
 		}
+		break;
 	default:
-add_single_format:
-		/* Generic pass-through */
-		formats++;
-		if (xlate) {
-			xlate->host_fmt = icd->formats + idx;
-			xlate->cam_fmt = icd->formats + idx;
-			xlate->buswidth = icd->formats[idx].depth;
-			xlate++;
-			dev_dbg(dev,
-				"Providing format %s in pass-through mode\n",
-				icd->formats[idx].name);
-		}
+		if (!sh_mobile_ceu_packing_supported(fmt))
+			return 0;
+	}
+
+	/* Generic pass-through */
+	formats++;
+	if (xlate) {
+		xlate->host_fmt	= fmt;
+		xlate->code	= code;
+		xlate++;
+		dev_dbg(dev, "Providing format %s in pass-through mode\n",
+			fmt->name);
 	}
 
 	return formats;
@@ -1034,17 +1075,15 @@ static int client_s_crop(struct v4l2_subdev *sd, struct v4l2_crop *crop,
 static int get_camera_scales(struct v4l2_subdev *sd, struct v4l2_rect *rect,
 			     unsigned int *scale_h, unsigned int *scale_v)
 {
-	struct v4l2_format f;
+	struct v4l2_imgbus_framefmt imgf;
 	int ret;
 
-	f.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
-
-	ret = v4l2_subdev_call(sd, video, g_fmt, &f);
+	ret = v4l2_subdev_call(sd, video, g_imgbus_fmt, &imgf);
 	if (ret < 0)
 		return ret;
 
-	*scale_h = calc_generic_scale(rect->width, f.fmt.pix.width);
-	*scale_v = calc_generic_scale(rect->height, f.fmt.pix.height);
+	*scale_h = calc_generic_scale(rect->width, imgf.width);
+	*scale_v = calc_generic_scale(rect->height, imgf.height);
 
 	return 0;
 }
@@ -1059,32 +1098,29 @@ static int get_camera_subwin(struct soc_camera_device *icd,
 	if (!ceu_rect->width) {
 		struct v4l2_subdev *sd = soc_camera_to_subdev(icd);
 		struct device *dev = icd->dev.parent;
-		struct v4l2_format f;
-		struct v4l2_pix_format *pix = &f.fmt.pix;
+		struct v4l2_imgbus_framefmt imgf;
 		int ret;
 		/* First time */
 
-		f.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
-
-		ret = v4l2_subdev_call(sd, video, g_fmt, &f);
+		ret = v4l2_subdev_call(sd, video, g_imgbus_fmt, &imgf);
 		if (ret < 0)
 			return ret;
 
-		dev_geo(dev, "camera fmt %ux%u\n", pix->width, pix->height);
+		dev_geo(dev, "camera fmt %ux%u\n", imgf.width, imgf.height);
 
-		if (pix->width > 2560) {
+		if (imgf.width > 2560) {
 			ceu_rect->width	 = 2560;
-			ceu_rect->left	 = (pix->width - 2560) / 2;
+			ceu_rect->left	 = (imgf.width - 2560) / 2;
 		} else {
-			ceu_rect->width	 = pix->width;
+			ceu_rect->width	 = imgf.width;
 			ceu_rect->left	 = 0;
 		}
 
-		if (pix->height > 1920) {
+		if (imgf.height > 1920) {
 			ceu_rect->height = 1920;
-			ceu_rect->top	 = (pix->height - 1920) / 2;
+			ceu_rect->top	 = (imgf.height - 1920) / 2;
 		} else {
-			ceu_rect->height = pix->height;
+			ceu_rect->height = imgf.height;
 			ceu_rect->top	 = 0;
 		}
 
@@ -1101,13 +1137,12 @@ static int get_camera_subwin(struct soc_camera_device *icd,
 	return 0;
 }
 
-static int client_s_fmt(struct soc_camera_device *icd, struct v4l2_format *f,
-			bool ceu_can_scale)
+static int client_s_fmt(struct soc_camera_device *icd,
+			struct v4l2_imgbus_framefmt *imgf, bool ceu_can_scale)
 {
 	struct v4l2_subdev *sd = soc_camera_to_subdev(icd);
 	struct device *dev = icd->dev.parent;
-	struct v4l2_pix_format *pix = &f->fmt.pix;
-	unsigned int width = pix->width, height = pix->height, tmp_w, tmp_h;
+	unsigned int width = imgf->width, height = imgf->height, tmp_w, tmp_h;
 	unsigned int max_width, max_height;
 	struct v4l2_cropcap cap;
 	int ret;
@@ -1121,29 +1156,29 @@ static int client_s_fmt(struct soc_camera_device *icd, struct v4l2_format *f,
 	max_width = min(cap.bounds.width, 2560);
 	max_height = min(cap.bounds.height, 1920);
 
-	ret = v4l2_subdev_call(sd, video, s_fmt, f);
+	ret = v4l2_subdev_call(sd, video, s_imgbus_fmt, imgf);
 	if (ret < 0)
 		return ret;
 
-	dev_geo(dev, "camera scaled to %ux%u\n", pix->width, pix->height);
+	dev_geo(dev, "camera scaled to %ux%u\n", imgf->width, imgf->height);
 
-	if ((width == pix->width && height == pix->height) || !ceu_can_scale)
+	if ((width == imgf->width && height == imgf->height) || !ceu_can_scale)
 		return 0;
 
 	/* Camera set a format, but geometry is not precise, try to improve */
-	tmp_w = pix->width;
-	tmp_h = pix->height;
+	tmp_w = imgf->width;
+	tmp_h = imgf->height;
 
 	/* width <= max_width && height <= max_height - guaranteed by try_fmt */
 	while ((width > tmp_w || height > tmp_h) &&
 	       tmp_w < max_width && tmp_h < max_height) {
 		tmp_w = min(2 * tmp_w, max_width);
 		tmp_h = min(2 * tmp_h, max_height);
-		pix->width = tmp_w;
-		pix->height = tmp_h;
-		ret = v4l2_subdev_call(sd, video, s_fmt, f);
+		imgf->width = tmp_w;
+		imgf->height = tmp_h;
+		ret = v4l2_subdev_call(sd, video, s_imgbus_fmt, imgf);
 		dev_geo(dev, "Camera scaled to %ux%u\n",
-			pix->width, pix->height);
+			imgf->width, imgf->height);
 		if (ret < 0) {
 			/* This shouldn't happen */
 			dev_err(dev, "Client failed to set format: %d\n", ret);
@@ -1161,27 +1196,26 @@ static int client_s_fmt(struct soc_camera_device *icd, struct v4l2_format *f,
  */
 static int client_scale(struct soc_camera_device *icd, struct v4l2_rect *rect,
 			struct v4l2_rect *sub_rect, struct v4l2_rect *ceu_rect,
-			struct v4l2_format *f, bool ceu_can_scale)
+			struct v4l2_imgbus_framefmt *imgf, bool ceu_can_scale)
 {
 	struct v4l2_subdev *sd = soc_camera_to_subdev(icd);
 	struct sh_mobile_ceu_cam *cam = icd->host_priv;
 	struct device *dev = icd->dev.parent;
-	struct v4l2_format f_tmp = *f;
-	struct v4l2_pix_format *pix_tmp = &f_tmp.fmt.pix;
+	struct v4l2_imgbus_framefmt imgf_tmp = *imgf;
 	unsigned int scale_h, scale_v;
 	int ret;
 
 	/* 5. Apply iterative camera S_FMT for camera user window. */
-	ret = client_s_fmt(icd, &f_tmp, ceu_can_scale);
+	ret = client_s_fmt(icd, &imgf_tmp, ceu_can_scale);
 	if (ret < 0)
 		return ret;
 
 	dev_geo(dev, "5: camera scaled to %ux%u\n",
-		pix_tmp->width, pix_tmp->height);
+		imgf_tmp.width, imgf_tmp.height);
 
 	/* 6. Retrieve camera output window (g_fmt) */
 
-	/* unneeded - it is already in "f_tmp" */
+	/* unneeded - it is already in "imgf_tmp" */
 
 	/* 7. Calculate new camera scales. */
 	ret = get_camera_scales(sd, rect, &scale_h, &scale_v);
@@ -1190,10 +1224,10 @@ static int client_scale(struct soc_camera_device *icd, struct v4l2_rect *rect,
 
 	dev_geo(dev, "7: camera scales %u:%u\n", scale_h, scale_v);
 
-	cam->cam_width		= pix_tmp->width;
-	cam->cam_height		= pix_tmp->height;
-	f->fmt.pix.width	= pix_tmp->width;
-	f->fmt.pix.height	= pix_tmp->height;
+	cam->cam_width	= imgf_tmp.width;
+	cam->cam_height	= imgf_tmp.height;
+	imgf->width	= imgf_tmp.width;
+	imgf->height	= imgf_tmp.height;
 
 	/*
 	 * 8. Calculate new CEU crop - apply camera scales to previously
@@ -1257,8 +1291,7 @@ static int sh_mobile_ceu_set_crop(struct soc_camera_device *icd,
 	struct v4l2_rect *cam_rect = &cam_crop.c, *ceu_rect = &cam->ceu_rect;
 	struct v4l2_subdev *sd = soc_camera_to_subdev(icd);
 	struct device *dev = icd->dev.parent;
-	struct v4l2_format f;
-	struct v4l2_pix_format *pix = &f.fmt.pix;
+	struct v4l2_imgbus_framefmt imgf;
 	unsigned int scale_comb_h, scale_comb_v, scale_ceu_h, scale_ceu_v,
 		out_width, out_height;
 	u32 capsr, cflcr;
@@ -1307,25 +1340,24 @@ static int sh_mobile_ceu_set_crop(struct soc_camera_device *icd,
 	 * 5. Using actual input window and calculated combined scales calculate
 	 *    camera target output window.
 	 */
-	pix->width		= scale_down(cam_rect->width, scale_comb_h);
-	pix->height		= scale_down(cam_rect->height, scale_comb_v);
+	imgf.width	= scale_down(cam_rect->width, scale_comb_h);
+	imgf.height	= scale_down(cam_rect->height, scale_comb_v);
 
-	dev_geo(dev, "5: camera target %ux%u\n", pix->width, pix->height);
+	dev_geo(dev, "5: camera target %ux%u\n", imgf.width, imgf.height);
 
 	/* 6. - 9. */
-	pix->pixelformat	= cam->camera_fmt->fourcc;
-	pix->colorspace		= cam->camera_fmt->colorspace;
+	imgf.code	= cam->code;
+	imgf.field	= pcdev->is_interlaced ? V4L2_FIELD_INTERLACED :
+						V4L2_FIELD_NONE;
 
 	capsr = capture_save_reset(pcdev);
 	dev_dbg(dev, "CAPSR 0x%x, CFLCR 0x%x\n", capsr, pcdev->cflcr);
 
 	/* Make relative to camera rectangle */
-	rect->left		-= cam_rect->left;
-	rect->top		-= cam_rect->top;
-
-	f.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
+	rect->left	-= cam_rect->left;
+	rect->top	-= cam_rect->top;
 
-	ret = client_scale(icd, cam_rect, rect, ceu_rect, &f,
+	ret = client_scale(icd, cam_rect, rect, ceu_rect, &imgf,
 			   pcdev->image_mode && !pcdev->is_interlaced);
 
 	dev_geo(dev, "6-9: %d\n", ret);
@@ -1373,8 +1405,7 @@ static int sh_mobile_ceu_set_fmt(struct soc_camera_device *icd,
 	struct sh_mobile_ceu_dev *pcdev = ici->priv;
 	struct sh_mobile_ceu_cam *cam = icd->host_priv;
 	struct v4l2_pix_format *pix = &f->fmt.pix;
-	struct v4l2_format cam_f = *f;
-	struct v4l2_pix_format *cam_pix = &cam_f.fmt.pix;
+	struct v4l2_imgbus_framefmt imgf;
 	struct v4l2_subdev *sd = soc_camera_to_subdev(icd);
 	struct device *dev = icd->dev.parent;
 	__u32 pixfmt = pix->pixelformat;
@@ -1443,9 +1474,10 @@ static int sh_mobile_ceu_set_fmt(struct soc_camera_device *icd,
 	 * 4. Calculate camera output window by applying combined scales to real
 	 *    input window.
 	 */
-	cam_pix->width = scale_down(cam_rect->width, scale_h);
-	cam_pix->height = scale_down(cam_rect->height, scale_v);
-	cam_pix->pixelformat = xlate->cam_fmt->fourcc;
+	imgf.width	= scale_down(cam_rect->width, scale_h);
+	imgf.height	= scale_down(cam_rect->height, scale_v);
+	imgf.code	= xlate->code;
+	imgf.field	= pix->field;
 
 	switch (pixfmt) {
 	case V4L2_PIX_FMT_NV12:
@@ -1458,11 +1490,10 @@ static int sh_mobile_ceu_set_fmt(struct soc_camera_device *icd,
 		image_mode = false;
 	}
 
-	dev_geo(dev, "4: camera output %ux%u\n",
-		cam_pix->width, cam_pix->height);
+	dev_geo(dev, "4: camera output %ux%u\n", imgf.width, imgf.height);
 
 	/* 5. - 9. */
-	ret = client_scale(icd, cam_rect, &cam_subrect, &ceu_rect, &cam_f,
+	ret = client_scale(icd, cam_rect, &cam_subrect, &ceu_rect, &imgf,
 			   image_mode && !is_interlaced);
 
 	dev_geo(dev, "5-9: client scale %d\n", ret);
@@ -1470,20 +1501,20 @@ static int sh_mobile_ceu_set_fmt(struct soc_camera_device *icd,
 	/* Done with the camera. Now see if we can improve the result */
 
 	dev_dbg(dev, "Camera %d fmt %ux%u, requested %ux%u\n",
-		ret, cam_pix->width, cam_pix->height, pix->width, pix->height);
+		ret, imgf.width, imgf.height, pix->width, pix->height);
 	if (ret < 0)
 		return ret;
 
 	/* 10. Use CEU scaling to scale to the requested user window. */
 
 	/* We cannot scale up */
-	if (pix->width > cam_pix->width)
-		pix->width = cam_pix->width;
+	if (pix->width > imgf.width)
+		pix->width = imgf.width;
 	if (pix->width > ceu_rect.width)
 		pix->width = ceu_rect.width;
 
-	if (pix->height > cam_pix->height)
-		pix->height = cam_pix->height;
+	if (pix->height > imgf.height)
+		pix->height = imgf.height;
 	if (pix->height > ceu_rect.height)
 		pix->height = ceu_rect.height;
 
@@ -1497,10 +1528,9 @@ static int sh_mobile_ceu_set_fmt(struct soc_camera_device *icd,
 
 	pcdev->cflcr = scale_h | (scale_v << 16);
 
-	icd->buswidth = xlate->buswidth;
-	icd->current_fmt = xlate->host_fmt;
-	cam->camera_fmt = xlate->cam_fmt;
-	cam->ceu_rect = ceu_rect;
+	cam->code		= xlate->code;
+	cam->ceu_rect		= ceu_rect;
+	icd->current_fmt	= xlate;
 
 	pcdev->is_interlaced = is_interlaced;
 	pcdev->image_mode = image_mode;
@@ -1514,6 +1544,7 @@ static int sh_mobile_ceu_try_fmt(struct soc_camera_device *icd,
 	const struct soc_camera_format_xlate *xlate;
 	struct v4l2_pix_format *pix = &f->fmt.pix;
 	struct v4l2_subdev *sd = soc_camera_to_subdev(icd);
+	struct v4l2_imgbus_framefmt imgf;
 	__u32 pixfmt = pix->pixelformat;
 	int width, height;
 	int ret;
@@ -1532,18 +1563,25 @@ static int sh_mobile_ceu_try_fmt(struct soc_camera_device *icd,
 	width = pix->width;
 	height = pix->height;
 
-	pix->bytesperline = pix->width *
-		DIV_ROUND_UP(xlate->host_fmt->depth, 8);
-	pix->sizeimage = pix->height * pix->bytesperline;
-
-	pix->pixelformat = xlate->cam_fmt->fourcc;
+	ret = v4l2_imgbus_bytes_per_line(width, xlate->host_fmt);
+	if (ret < 0)
+		return ret;
+	pix->bytesperline = ret;
+	pix->sizeimage = height * ret;
 
 	/* limit to sensor capabilities */
-	ret = v4l2_subdev_call(sd, video, try_fmt, f);
-	pix->pixelformat = pixfmt;
+	imgf.width	= pix->width;
+	imgf.height	= pix->height;
+	imgf.field	= pix->field;
+	imgf.code	= xlate->code;
+	ret = v4l2_subdev_call(sd, video, try_imgbus_fmt, &imgf);
 	if (ret < 0)
 		return ret;
 
+	pix->width	= imgf.width;
+	pix->height	= imgf.height;
+	pix->field	= imgf.field;
+
 	switch (pixfmt) {
 	case V4L2_PIX_FMT_NV12:
 	case V4L2_PIX_FMT_NV21:
@@ -1555,7 +1592,7 @@ static int sh_mobile_ceu_try_fmt(struct soc_camera_device *icd,
 			int tmp_w = pix->width, tmp_h = pix->height;
 			pix->width = 2560;
 			pix->height = 1920;
-			ret = v4l2_subdev_call(sd, video, try_fmt, f);
+			ret = v4l2_subdev_call(sd, video, try_imgbus_fmt, &imgf);
 			if (ret < 0) {
 				/* Shouldn't actually happen... */
 				dev_err(icd->dev.parent,
@@ -1661,7 +1698,7 @@ static int sh_mobile_ceu_set_ctrl(struct soc_camera_device *icd,
 
 	switch (ctrl->id) {
 	case V4L2_CID_SHARPNESS:
-		switch (icd->current_fmt->fourcc) {
+		switch (icd->current_fmt->host_fmt->fourcc) {
 		case V4L2_PIX_FMT_NV12:
 		case V4L2_PIX_FMT_NV21:
 		case V4L2_PIX_FMT_NV16:
diff --git a/drivers/media/video/soc_camera.c b/drivers/media/video/soc_camera.c
index bf77935..7c624f9 100644
--- a/drivers/media/video/soc_camera.c
+++ b/drivers/media/video/soc_camera.c
@@ -40,18 +40,6 @@ static LIST_HEAD(hosts);
 static LIST_HEAD(devices);
 static DEFINE_MUTEX(list_lock);		/* Protects the list of hosts */
 
-const struct soc_camera_data_format *soc_camera_format_by_fourcc(
-	struct soc_camera_device *icd, unsigned int fourcc)
-{
-	unsigned int i;
-
-	for (i = 0; i < icd->num_formats; i++)
-		if (icd->formats[i].fourcc == fourcc)
-			return icd->formats + i;
-	return NULL;
-}
-EXPORT_SYMBOL(soc_camera_format_by_fourcc);
-
 const struct soc_camera_format_xlate *soc_camera_xlate_by_fourcc(
 	struct soc_camera_device *icd, unsigned int fourcc)
 {
@@ -207,21 +195,26 @@ static int soc_camera_dqbuf(struct file *file, void *priv,
 /* Always entered with .video_lock held */
 static int soc_camera_init_user_formats(struct soc_camera_device *icd)
 {
+	struct v4l2_subdev *sd = soc_camera_to_subdev(icd);
 	struct soc_camera_host *ici = to_soc_camera_host(icd->dev.parent);
-	int i, fmts = 0, ret;
+	int i, fmts = 0, raw_fmts = 0, ret;
+	enum v4l2_imgbus_pixelcode code;
+
+	while (!v4l2_subdev_call(sd, video, enum_imgbus_fmt, raw_fmts, &code))
+		raw_fmts++;
 
 	if (!ici->ops->get_formats)
 		/*
 		 * Fallback mode - the host will have to serve all
 		 * sensor-provided formats one-to-one to the user
 		 */
-		fmts = icd->num_formats;
+		fmts = raw_fmts;
 	else
 		/*
 		 * First pass - only count formats this host-sensor
 		 * configuration can provide
 		 */
-		for (i = 0; i < icd->num_formats; i++) {
+		for (i = 0; i < raw_fmts; i++) {
 			ret = ici->ops->get_formats(icd, i, NULL);
 			if (ret < 0)
 				return ret;
@@ -242,11 +235,11 @@ static int soc_camera_init_user_formats(struct soc_camera_device *icd)
 
 	/* Second pass - actually fill data formats */
 	fmts = 0;
-	for (i = 0; i < icd->num_formats; i++)
+	for (i = 0; i < raw_fmts; i++)
 		if (!ici->ops->get_formats) {
-			icd->user_formats[i].host_fmt = icd->formats + i;
-			icd->user_formats[i].cam_fmt = icd->formats + i;
-			icd->user_formats[i].buswidth = icd->formats[i].depth;
+			v4l2_subdev_call(sd, video, enum_imgbus_fmt, i, &code);
+			icd->user_formats[i].host_fmt = v4l2_imgbus_get_fmtdesc(code);
+			icd->user_formats[i].code = code;
 		} else {
 			ret = ici->ops->get_formats(icd, i,
 						    &icd->user_formats[fmts]);
@@ -255,7 +248,7 @@ static int soc_camera_init_user_formats(struct soc_camera_device *icd)
 			fmts += ret;
 		}
 
-	icd->current_fmt = icd->user_formats[0].host_fmt;
+	icd->current_fmt = &icd->user_formats[0];
 
 	return 0;
 
@@ -281,7 +274,7 @@ static void soc_camera_free_user_formats(struct soc_camera_device *icd)
 #define pixfmtstr(x) (x) & 0xff, ((x) >> 8) & 0xff, ((x) >> 16) & 0xff, \
 	((x) >> 24) & 0xff
 
-/* Called with .vb_lock held */
+/* Called with .vb_lock held, or from the first open(2), see comment there */
 static int soc_camera_set_fmt(struct soc_camera_file *icf,
 			      struct v4l2_format *f)
 {
@@ -302,7 +295,7 @@ static int soc_camera_set_fmt(struct soc_camera_file *icf,
 	if (ret < 0) {
 		return ret;
 	} else if (!icd->current_fmt ||
-		   icd->current_fmt->fourcc != pix->pixelformat) {
+		   icd->current_fmt->host_fmt->fourcc != pix->pixelformat) {
 		dev_err(&icd->dev,
 			"Host driver hasn't set up current format correctly!\n");
 		return -EINVAL;
@@ -369,8 +362,8 @@ static int soc_camera_open(struct file *file)
 				.width		= icd->user_width,
 				.height		= icd->user_height,
 				.field		= icd->field,
-				.pixelformat	= icd->current_fmt->fourcc,
-				.colorspace	= icd->current_fmt->colorspace,
+				.pixelformat	= icd->current_fmt->host_fmt->fourcc,
+				.colorspace	= icd->current_fmt->host_fmt->colorspace,
 			},
 		};
 
@@ -390,7 +383,12 @@ static int soc_camera_open(struct file *file)
 			goto eiciadd;
 		}
 
-		/* Try to configure with default parameters */
+		/*
+		 * Try to configure with default parameters. Note: this is the
+		 * very first open(), so we cannot race against other calls,
+		 * apart from someone else calling open() simultaneously, but
+		 * .video_lock protects us against that.
+		 */
 		ret = soc_camera_set_fmt(icf, &f);
 		if (ret < 0)
 			goto esfmt;
@@ -534,7 +532,7 @@ static int soc_camera_enum_fmt_vid_cap(struct file *file, void  *priv,
 {
 	struct soc_camera_file *icf = file->private_data;
 	struct soc_camera_device *icd = icf->icd;
-	const struct soc_camera_data_format *format;
+	const struct v4l2_imgbus_pixelfmt *format;
 
 	WARN_ON(priv != file->private_data);
 
@@ -543,7 +541,8 @@ static int soc_camera_enum_fmt_vid_cap(struct file *file, void  *priv,
 
 	format = icd->user_formats[f->index].host_fmt;
 
-	strlcpy(f->description, format->name, sizeof(f->description));
+	if (format->name)
+		strlcpy(f->description, format->name, sizeof(f->description));
 	f->pixelformat = format->fourcc;
 	return 0;
 }
@@ -560,12 +559,14 @@ static int soc_camera_g_fmt_vid_cap(struct file *file, void *priv,
 	pix->width		= icd->user_width;
 	pix->height		= icd->user_height;
 	pix->field		= icf->vb_vidq.field;
-	pix->pixelformat	= icd->current_fmt->fourcc;
-	pix->bytesperline	= pix->width *
-		DIV_ROUND_UP(icd->current_fmt->depth, 8);
+	pix->pixelformat	= icd->current_fmt->host_fmt->fourcc;
+	pix->bytesperline	= v4l2_imgbus_bytes_per_line(pix->width,
+						icd->current_fmt->host_fmt);
+	if (pix->bytesperline < 0)
+		return pix->bytesperline;
 	pix->sizeimage		= pix->height * pix->bytesperline;
 	dev_dbg(&icd->dev, "current_fmt->fourcc: 0x%08x\n",
-		icd->current_fmt->fourcc);
+		icd->current_fmt->host_fmt->fourcc);
 	return 0;
 }
 
@@ -894,7 +895,7 @@ static int soc_camera_probe(struct device *dev)
 	struct soc_camera_link *icl = to_soc_camera_link(icd);
 	struct device *control = NULL;
 	struct v4l2_subdev *sd;
-	struct v4l2_format f = {.type = V4L2_BUF_TYPE_VIDEO_CAPTURE};
+	struct v4l2_imgbus_framefmt imgf;
 	int ret;
 
 	dev_info(dev, "Probing %s\n", dev_name(dev));
@@ -965,9 +966,10 @@ static int soc_camera_probe(struct device *dev)
 
 	/* Try to improve our guess of a reasonable window format */
 	sd = soc_camera_to_subdev(icd);
-	if (!v4l2_subdev_call(sd, video, g_fmt, &f)) {
-		icd->user_width		= f.fmt.pix.width;
-		icd->user_height	= f.fmt.pix.height;
+	if (!v4l2_subdev_call(sd, video, g_imgbus_fmt, &imgf)) {
+		icd->user_width		= imgf.width;
+		icd->user_height	= imgf.height;
+		icd->field		= imgf.field;
 	}
 
 	/* Do we have to sysfs_remove_link() before device_unregister()? */
diff --git a/drivers/media/video/soc_camera_platform.c b/drivers/media/video/soc_camera_platform.c
index c7c9151..573480c 100644
--- a/drivers/media/video/soc_camera_platform.c
+++ b/drivers/media/video/soc_camera_platform.c
@@ -22,7 +22,7 @@
 
 struct soc_camera_platform_priv {
 	struct v4l2_subdev subdev;
-	struct soc_camera_data_format format;
+	struct v4l2_imgbus_framefmt format;
 };
 
 static struct soc_camera_platform_priv *get_priv(struct platform_device *pdev)
@@ -58,36 +58,33 @@ soc_camera_platform_query_bus_param(struct soc_camera_device *icd)
 }
 
 static int soc_camera_platform_try_fmt(struct v4l2_subdev *sd,
-				       struct v4l2_format *f)
+				       struct v4l2_imgbus_framefmt *imgf)
 {
 	struct soc_camera_platform_info *p = v4l2_get_subdevdata(sd);
-	struct v4l2_pix_format *pix = &f->fmt.pix;
 
-	pix->width = p->format.width;
-	pix->height = p->format.height;
+	imgf->width = p->format.width;
+	imgf->height = p->format.height;
 	return 0;
 }
 
-static void soc_camera_platform_video_probe(struct soc_camera_device *icd,
-					    struct platform_device *pdev)
+static struct v4l2_subdev_core_ops platform_subdev_core_ops;
+
+static int soc_camera_platform_enum_fmt(struct v4l2_subdev *sd, int index,
+					enum v4l2_imgbus_pixelcode *code)
 {
-	struct soc_camera_platform_priv *priv = get_priv(pdev);
-	struct soc_camera_platform_info *p = pdev->dev.platform_data;
+	struct soc_camera_platform_info *p = v4l2_get_subdevdata(sd);
 
-	priv->format.name = p->format_name;
-	priv->format.depth = p->format_depth;
-	priv->format.fourcc = p->format.pixelformat;
-	priv->format.colorspace = p->format.colorspace;
+	if (index)
+		return -EINVAL;
 
-	icd->formats = &priv->format;
-	icd->num_formats = 1;
+	*code = p->format.code;
+	return 0;
 }
 
-static struct v4l2_subdev_core_ops platform_subdev_core_ops;
-
 static struct v4l2_subdev_video_ops platform_subdev_video_ops = {
-	.s_stream	= soc_camera_platform_s_stream,
-	.try_fmt	= soc_camera_platform_try_fmt,
+	.s_stream		= soc_camera_platform_s_stream,
+	.try_imgbus_fmt		= soc_camera_platform_try_fmt,
+	.enum_imgbus_fmt	= soc_camera_platform_enum_fmt,
 };
 
 static struct v4l2_subdev_ops platform_subdev_ops = {
@@ -132,8 +129,6 @@ static int soc_camera_platform_probe(struct platform_device *pdev)
 
 	ici = to_soc_camera_host(icd->dev.parent);
 
-	soc_camera_platform_video_probe(icd, pdev);
-
 	v4l2_subdev_init(&priv->subdev, &platform_subdev_ops);
 	v4l2_set_subdevdata(&priv->subdev, p);
 	strncpy(priv->subdev.name, dev_name(&pdev->dev), V4L2_SUBDEV_NAME_SIZE);
diff --git a/drivers/media/video/tw9910.c b/drivers/media/video/tw9910.c
index 35373d8..09ea042 100644
--- a/drivers/media/video/tw9910.c
+++ b/drivers/media/video/tw9910.c
@@ -240,13 +240,8 @@ static const struct regval_list tw9910_default_regs[] =
 	ENDMARKER,
 };
 
-static const struct soc_camera_data_format tw9910_color_fmt[] = {
-	{
-		.name       = "VYUY",
-		.fourcc     = V4L2_PIX_FMT_VYUY,
-		.depth      = 16,
-		.colorspace = V4L2_COLORSPACE_SMPTE170M,
-	}
+static const enum v4l2_imgbus_pixelcode tw9910_color_codes[] = {
+	V4L2_IMGBUS_FMT_VYUY,
 };
 
 static const struct tw9910_scale_ctrl tw9910_ntsc_scales[] = {
@@ -762,11 +757,11 @@ static int tw9910_cropcap(struct v4l2_subdev *sd, struct v4l2_cropcap *a)
 	return 0;
 }
 
-static int tw9910_g_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
+static int tw9910_g_fmt(struct v4l2_subdev *sd,
+			struct v4l2_imgbus_framefmt *imgf)
 {
 	struct i2c_client *client = sd->priv;
 	struct tw9910_priv *priv = to_tw9910(client);
-	struct v4l2_pix_format *pix = &f->fmt.pix;
 
 	if (!priv->scale) {
 		int ret;
@@ -783,74 +778,74 @@ static int tw9910_g_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
 			return ret;
 	}
 
-	f->type			= V4L2_BUF_TYPE_VIDEO_CAPTURE;
-
-	pix->width		= priv->scale->width;
-	pix->height		= priv->scale->height;
-	pix->pixelformat	= V4L2_PIX_FMT_VYUY;
-	pix->colorspace		= V4L2_COLORSPACE_SMPTE170M;
-	pix->field		= V4L2_FIELD_INTERLACED;
+	imgf->width	= priv->scale->width;
+	imgf->height	= priv->scale->height;
+	imgf->code	= V4L2_IMGBUS_FMT_VYUY;
+	imgf->field	= V4L2_FIELD_INTERLACED;
 
 	return 0;
 }
 
-static int tw9910_s_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
+static int tw9910_s_fmt(struct v4l2_subdev *sd,
+			struct v4l2_imgbus_framefmt *imgf)
 {
 	struct i2c_client *client = sd->priv;
 	struct tw9910_priv *priv = to_tw9910(client);
-	struct v4l2_pix_format *pix = &f->fmt.pix;
 	/* See tw9910_s_crop() - no proper cropping support */
 	struct v4l2_crop a = {
 		.c = {
 			.left	= 0,
 			.top	= 0,
-			.width	= pix->width,
-			.height	= pix->height,
+			.width	= imgf->width,
+			.height	= imgf->height,
 		},
 	};
 	int i, ret;
 
+	WARN_ON(imgf->field != V4L2_FIELD_ANY &&
+		imgf->field != V4L2_FIELD_INTERLACED);
+
 	/*
 	 * check color format
 	 */
-	for (i = 0; i < ARRAY_SIZE(tw9910_color_fmt); i++)
-		if (pix->pixelformat == tw9910_color_fmt[i].fourcc)
+	for (i = 0; i < ARRAY_SIZE(tw9910_color_codes); i++)
+		if (imgf->code == tw9910_color_codes[i])
 			break;
 
-	if (i == ARRAY_SIZE(tw9910_color_fmt))
+	if (i == ARRAY_SIZE(tw9910_color_codes))
 		return -EINVAL;
 
 	ret = tw9910_s_crop(sd, &a);
 	if (!ret) {
-		pix->width = priv->scale->width;
-		pix->height = priv->scale->height;
+		imgf->width	= priv->scale->width;
+		imgf->height	= priv->scale->height;
 	}
 	return ret;
 }
 
-static int tw9910_try_fmt(struct v4l2_subdev *sd, struct v4l2_format *f)
+static int tw9910_try_fmt(struct v4l2_subdev *sd,
+			  struct v4l2_imgbus_framefmt *imgf)
 {
 	struct i2c_client *client = sd->priv;
 	struct soc_camera_device *icd = client->dev.platform_data;
-	struct v4l2_pix_format *pix = &f->fmt.pix;
 	const struct tw9910_scale_ctrl *scale;
 
-	if (V4L2_FIELD_ANY == pix->field) {
-		pix->field = V4L2_FIELD_INTERLACED;
-	} else if (V4L2_FIELD_INTERLACED != pix->field) {
-		dev_err(&client->dev, "Field type invalid.\n");
+	if (V4L2_FIELD_ANY == imgf->field) {
+		imgf->field = V4L2_FIELD_INTERLACED;
+	} else if (V4L2_FIELD_INTERLACED != imgf->field) {
+		dev_err(&client->dev, "Field type %d invalid.\n", imgf->field);
 		return -EINVAL;
 	}
 
 	/*
 	 * select suitable norm
 	 */
-	scale = tw9910_select_norm(icd, pix->width, pix->height);
+	scale = tw9910_select_norm(icd, imgf->width, imgf->height);
 	if (!scale)
 		return -EINVAL;
 
-	pix->width  = scale->width;
-	pix->height = scale->height;
+	imgf->width	= scale->width;
+	imgf->height	= scale->height;
 
 	return 0;
 }
@@ -878,9 +873,6 @@ static int tw9910_video_probe(struct soc_camera_device *icd,
 		return -ENODEV;
 	}
 
-	icd->formats     = tw9910_color_fmt;
-	icd->num_formats = ARRAY_SIZE(tw9910_color_fmt);
-
 	/*
 	 * check and show Product ID
 	 * So far only revisions 0 and 1 have been seen
@@ -918,14 +910,25 @@ static struct v4l2_subdev_core_ops tw9910_subdev_core_ops = {
 #endif
 };
 
+static int tw9910_enum_fmt(struct v4l2_subdev *sd, int index,
+			   enum v4l2_imgbus_pixelcode *code)
+{
+	if ((unsigned int)index >= ARRAY_SIZE(tw9910_color_codes))
+		return -EINVAL;
+
+	*code = tw9910_color_codes[index];
+	return 0;
+}
+
 static struct v4l2_subdev_video_ops tw9910_subdev_video_ops = {
-	.s_stream	= tw9910_s_stream,
-	.g_fmt		= tw9910_g_fmt,
-	.s_fmt		= tw9910_s_fmt,
-	.try_fmt	= tw9910_try_fmt,
-	.cropcap	= tw9910_cropcap,
-	.g_crop		= tw9910_g_crop,
-	.s_crop		= tw9910_s_crop,
+	.s_stream		= tw9910_s_stream,
+	.g_imgbus_fmt		= tw9910_g_fmt,
+	.s_imgbus_fmt		= tw9910_s_fmt,
+	.try_imgbus_fmt		= tw9910_try_fmt,
+	.cropcap		= tw9910_cropcap,
+	.g_crop			= tw9910_g_crop,
+	.s_crop			= tw9910_s_crop,
+	.enum_imgbus_fmt	= tw9910_enum_fmt,
 };
 
 static struct v4l2_subdev_ops tw9910_subdev_ops = {
diff --git a/include/media/soc_camera.h b/include/media/soc_camera.h
index 831efff..d0ef622 100644
--- a/include/media/soc_camera.h
+++ b/include/media/soc_camera.h
@@ -26,13 +26,10 @@ struct soc_camera_device {
 	s32 user_height;
 	unsigned char iface;		/* Host number */
 	unsigned char devnum;		/* Device number per host */
-	unsigned char buswidth;		/* See comment in .c */
 	struct soc_camera_sense *sense;	/* See comment in struct definition */
 	struct soc_camera_ops *ops;
 	struct video_device *vdev;
-	const struct soc_camera_data_format *current_fmt;
-	const struct soc_camera_data_format *formats;
-	int num_formats;
+	const struct soc_camera_format_xlate *current_fmt;
 	struct soc_camera_format_xlate *user_formats;
 	int num_user_formats;
 	enum v4l2_field field;		/* Preserve field over close() */
@@ -161,23 +158,13 @@ static inline struct v4l2_subdev *soc_camera_to_subdev(
 int soc_camera_host_register(struct soc_camera_host *ici);
 void soc_camera_host_unregister(struct soc_camera_host *ici);
 
-const struct soc_camera_data_format *soc_camera_format_by_fourcc(
-	struct soc_camera_device *icd, unsigned int fourcc);
 const struct soc_camera_format_xlate *soc_camera_xlate_by_fourcc(
 	struct soc_camera_device *icd, unsigned int fourcc);
 
-struct soc_camera_data_format {
-	const char *name;
-	unsigned int depth;
-	__u32 fourcc;
-	enum v4l2_colorspace colorspace;
-};
-
 /**
  * struct soc_camera_format_xlate - match between host and sensor formats
- * @cam_fmt: sensor format provided by the sensor
- * @host_fmt: host format after host translation from cam_fmt
- * @buswidth: bus width for this format
+ * @code: code of a sensor provided format
+ * @host_fmt: host format after host translation from code
  *
  * Host and sensor translation structure. Used in table of host and sensor
  * formats matchings in soc_camera_device. A host can override the generic list
@@ -185,9 +172,8 @@ struct soc_camera_data_format {
  * format setup.
  */
 struct soc_camera_format_xlate {
-	const struct soc_camera_data_format *cam_fmt;
-	const struct soc_camera_data_format *host_fmt;
-	unsigned char buswidth;
+	enum v4l2_imgbus_pixelcode code;
+	const struct v4l2_imgbus_pixelfmt *host_fmt;
 };
 
 struct soc_camera_ops {
diff --git a/include/media/soc_camera_platform.h b/include/media/soc_camera_platform.h
index 88b3b57..a105268 100644
--- a/include/media/soc_camera_platform.h
+++ b/include/media/soc_camera_platform.h
@@ -19,7 +19,7 @@ struct device;
 struct soc_camera_platform_info {
 	const char *format_name;
 	unsigned long format_depth;
-	struct v4l2_pix_format format;
+	struct v4l2_imgbus_framefmt format;
 	unsigned long bus_param;
 	struct device *dev;
 	int (*set_capture)(struct soc_camera_platform_info *info, int enable);
-- 
1.6.2.4
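A recurring change in the patch above is replacing open-coded `pix->width * DIV_ROUND_UP(depth, 8)` computations with `v4l2_imgbus_bytes_per_line()`. For simple packed formats that helper boils down to the following sketch — the struct layout, field name, and function name here are illustrative assumptions, not the series' actual definitions:

```c
#include <assert.h>

/* Illustrative stand-in for the series' v4l2_imgbus_pixelfmt: only the
 * field needed for the bytes-per-line computation is modelled here. */
struct imgbus_pixelfmt_sketch {
	unsigned int bits_per_sample;
};

/* Packed-format case only: width samples per line, each sample rounded
 * up to whole bytes. A real helper would return -EINVAL for unknown
 * formats, which is why callers check "if (bytesperline < 0)". */
static int imgbus_bytes_per_line_sketch(unsigned int width,
					const struct imgbus_pixelfmt_sketch *fmt)
{
	if (!fmt)
		return -22;	/* -EINVAL */
	return width * ((fmt->bits_per_sample + 7) / 8);
}
```

For example, a 640-pixel YUYV line (16 bits per sample) yields 1280 bytes, and a 10-bit raw Bayer sample packed into two bytes yields the same.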



* [PATCH/RFC 8b/9 v3] rj54n1cb0c: Add cropping, auto white balance, restrict sizes, add platform data
  2009-10-30 14:01 ` [PATCH/RFC 8/9 v2] soc-camera: convert to the new imagebus API Guennadi Liakhovetski
  2009-10-30 18:31   ` [PATCH/RFC 8a/9 " Guennadi Liakhovetski
@ 2009-10-30 18:34   ` Guennadi Liakhovetski
  1 sibling, 0 replies; 51+ messages in thread
From: Guennadi Liakhovetski @ 2009-10-30 18:34 UTC (permalink / raw)
  To: Linux Media Mailing List
  Cc: Hans Verkuil, Laurent Pinchart, Sakari Ailus, Muralidharan Karicheri

It has been experimentally found that the sensor only supports video output
up to 512x384 and also has some restrictions on the minimum scale. We
disable the non-working size ranges until, perhaps, someone finds out how to
set them up properly. Also add cropping support, an auto white balance
control, and platform data to specify the master clock frequency and the
polarity of the IOCTL pin. Add this data to the kfr2r09 board.
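The width half of that restriction can be sketched as follows, assuming the driver's 1600-pixel sensor maximum; the function name and exact clamping policy are illustrative (the real logic lives in rj54n1_sensor_scale() below):

```c
#include <assert.h>

#define MAX_WIDTH	1600	/* RJ54N1_MAX_WIDTH */

/* Enforce "output <= max(512, input / 2)" by growing the input window,
 * capping both windows at the sensor limit when the input cannot grow
 * far enough. */
static void clamp_width_sketch(unsigned int *input_w, unsigned int *output_w)
{
	unsigned int limit = *input_w / 2 > 512 ? *input_w / 2 : 512;

	if (*output_w > limit) {
		if (2 * *output_w > MAX_WIDTH) {
			*input_w = MAX_WIDTH;
			*output_w = MAX_WIDTH / 2;
		} else {
			*input_w = *output_w * 2;
		}
	}
}
```

So a 900-pixel output from a full-width 1600-pixel input gets reduced to 800, while a 600-pixel output from a 1000-pixel input instead widens the input to 1200.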

Signed-off-by: Guennadi Liakhovetski <g.liakhovetski@gmx.de>
---

...and part "b". I have also incremented the version, because I swapped
some functions around to reduce the diff, but the content is exactly the same.

 arch/sh/boards/mach-kfr2r09/setup.c |   13 ++-
 drivers/media/video/rj54n1cb0c.c    |  282 ++++++++++++++++++++++++++++-------
 include/media/rj54n1cb0c.h          |   19 +++
 3 files changed, 260 insertions(+), 54 deletions(-)
 create mode 100644 include/media/rj54n1cb0c.h
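The resize arithmetic in the diff below repeatedly uses round-to-nearest fixed-point division, `(a * 1024 + b / 2) / b`. A minimal sketch of the scale-factor computation, including one of the sensor's prohibited value ranges (the helper name is illustrative; only one prohibited range is shown):

```c
#include <assert.h>

/* The sensor scales as input * 1024 / resize = output. Compute the
 * factor with round-to-nearest division, then step over one of the
 * prohibited value ranges listed in the driver (16320..16384 here). */
static unsigned int resize_factor_sketch(unsigned int input,
					 unsigned int output)
{
	unsigned int resize = (input * 1024 + output / 2) / output;

	if (resize >= 16320 && resize <= 16384)
		resize = 16319;
	return resize;
}
```

A factor of 1024 means 1:1; 2048 halves the dimension, and so on up to the hardware limit.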

diff --git a/arch/sh/boards/mach-kfr2r09/setup.c b/arch/sh/boards/mach-kfr2r09/setup.c
index ce01d6a..18df641 100644
--- a/arch/sh/boards/mach-kfr2r09/setup.c
+++ b/arch/sh/boards/mach-kfr2r09/setup.c
@@ -18,6 +18,7 @@
 #include <linux/input.h>
 #include <linux/i2c.h>
 #include <linux/usb/r8a66597.h>
+#include <media/rj54n1cb0c.h>
 #include <media/soc_camera.h>
 #include <media/sh_mobile_ceu.h>
 #include <video/sh_mobile_lcdc.h>
@@ -254,6 +255,9 @@ static struct i2c_board_info kfr2r09_i2c_camera = {
 
 static struct clk *camera_clk;
 
+/* set VIO_CKO clock to 25MHz */
+#define CEU_MCLK_FREQ 25000000
+
 #define DRVCRB 0xA405018C
 static int camera_power(struct device *dev, int mode)
 {
@@ -266,8 +270,7 @@ static int camera_power(struct device *dev, int mode)
 		if (IS_ERR(camera_clk))
 			return PTR_ERR(camera_clk);
 
-		/* set VIO_CKO clock to 25MHz */
-		rate = clk_round_rate(camera_clk, 25000000);
+		rate = clk_round_rate(camera_clk, CEU_MCLK_FREQ);
 		ret = clk_set_rate(camera_clk, rate);
 		if (ret < 0)
 			goto eclkrate;
@@ -317,11 +320,17 @@ eclkrate:
 	return ret;
 }
 
+static struct rj54n1_pdata rj54n1_priv = {
+	.mclk_freq	= CEU_MCLK_FREQ,
+	.ioctl_high	= false,
+};
+
 static struct soc_camera_link rj54n1_link = {
 	.power		= camera_power,
 	.board_info	= &kfr2r09_i2c_camera,
 	.i2c_adapter_id	= 1,
 	.module_name	= "rj54n1cb0c",
+	.priv		= &rj54n1_priv,
 };
 
 static struct platform_device kfr2r09_camera = {
diff --git a/drivers/media/video/rj54n1cb0c.c b/drivers/media/video/rj54n1cb0c.c
index 0c998e8..6c4cba2 100644
--- a/drivers/media/video/rj54n1cb0c.c
+++ b/drivers/media/video/rj54n1cb0c.c
@@ -16,6 +16,7 @@
 #include <media/v4l2-subdev.h>
 #include <media/v4l2-chip-ident.h>
 #include <media/soc_camera.h>
+#include <media/rj54n1cb0c.h>
 
 #define RJ54N1_DEV_CODE			0x0400
 #define RJ54N1_DEV_CODE2		0x0401
@@ -38,6 +39,7 @@
 #define RJ54N1_H_OBEN_OFS		0x0413
 #define RJ54N1_V_OBEN_OFS		0x0414
 #define RJ54N1_RESIZE_CONTROL		0x0415
+#define RJ54N1_STILL_CONTROL		0x0417
 #define RJ54N1_INC_USE_SEL_H		0x0425
 #define RJ54N1_INC_USE_SEL_L		0x0426
 #define RJ54N1_MIRROR_STILL_MODE	0x0427
@@ -49,10 +51,21 @@
 #define RJ54N1_RA_SEL_UL		0x0530
 #define RJ54N1_BYTE_SWAP		0x0531
 #define RJ54N1_OUT_SIGPO		0x053b
+#define RJ54N1_WB_SEL_WEIGHT_I		0x054e
+#define RJ54N1_BIT8_WB			0x0569
+#define RJ54N1_HCAPS_WB			0x056a
+#define RJ54N1_VCAPS_WB			0x056b
+#define RJ54N1_HCAPE_WB			0x056c
+#define RJ54N1_VCAPE_WB			0x056d
+#define RJ54N1_EXPOSURE_CONTROL		0x058c
 #define RJ54N1_FRAME_LENGTH_S_H		0x0595
 #define RJ54N1_FRAME_LENGTH_S_L		0x0596
 #define RJ54N1_FRAME_LENGTH_P_H		0x0597
 #define RJ54N1_FRAME_LENGTH_P_L		0x0598
+#define RJ54N1_PEAK_H			0x05b7
+#define RJ54N1_PEAK_50			0x05b8
+#define RJ54N1_PEAK_60			0x05b9
+#define RJ54N1_PEAK_DIFF		0x05ba
 #define RJ54N1_IOC			0x05ef
 #define RJ54N1_TG_BYPASS		0x0700
 #define RJ54N1_PLL_L			0x0701
@@ -68,6 +81,7 @@
 #define RJ54N1_OCLK_SEL_EN		0x0713
 #define RJ54N1_CLK_RST			0x0717
 #define RJ54N1_RESET_STANDBY		0x0718
+#define RJ54N1_FWFLG			0x07fe
 
 #define E_EXCLK				(1 << 7)
 #define SOFT_STDBY			(1 << 4)
@@ -78,11 +92,18 @@
 #define RESIZE_HOLD_SEL			(1 << 2)
 #define RESIZE_GO			(1 << 1)
 
+/*
+ * When cropping, the camera automatically centers the cropped region; there
+ * doesn't seem to be a way to specify an explicit location of the rectangle.
+ */
 #define RJ54N1_COLUMN_SKIP		0
 #define RJ54N1_ROW_SKIP			0
 #define RJ54N1_MAX_WIDTH		1600
 #define RJ54N1_MAX_HEIGHT		1200
 
+#define PLL_L				2
+#define PLL_N				0x31
+
 /* I2C addresses: 0x50, 0x51, 0x60, 0x61 */
 
 static const enum v4l2_imgbus_pixelcode rj54n1_colour_codes[] = {
@@ -98,7 +119,7 @@ static const enum v4l2_imgbus_pixelcode rj54n1_colour_codes[] = {
 };
 
 struct rj54n1_clock_div {
-	u8 ratio_tg;
+	u8 ratio_tg;	/* can be 0 or an odd number */
 	u8 ratio_t;
 	u8 ratio_r;
 	u8 ratio_op;
@@ -107,12 +128,14 @@ struct rj54n1_clock_div {
 
 struct rj54n1 {
 	struct v4l2_subdev subdev;
+	struct rj54n1_clock_div clk_div;
 	enum v4l2_imgbus_pixelcode code;
 	struct v4l2_rect rect;	/* Sensor window */
+	unsigned int tgclk_mhz;
+	bool auto_wb;
 	unsigned short width;	/* Output window */
 	unsigned short height;
 	unsigned short resize;	/* Sensor * 1024 / resize = Output */
-	struct rj54n1_clock_div clk_div;
 	unsigned short scale;
 	u8 bank;
 };
@@ -169,7 +192,7 @@ const static struct rj54n1_reg_val bank_7[] = {
 	{0x714, 0xff},
 	{0x715, 0xff},
 	{0x716, 0x1f},
-	{0x7FE, 0x02},
+	{0x7FE, 2},
 };
 
 const static struct rj54n1_reg_val bank_8[] = {
@@ -357,7 +380,7 @@ const static struct rj54n1_reg_val bank_8[] = {
 	{0x8BB, 0x00},
 	{0x8BC, 0xFF},
 	{0x8BD, 0x00},
-	{0x8FE, 0x02},
+	{0x8FE, 2},
 };
 
 const static struct rj54n1_reg_val bank_10[] = {
@@ -450,8 +473,10 @@ static int rj54n1_enum_fmt(struct v4l2_subdev *sd, int index,
 
 static int rj54n1_s_stream(struct v4l2_subdev *sd, int enable)
 {
-	/* TODO: start / stop streaming */
-	return 0;
+	struct i2c_client *client = sd->priv;
+
+	/* Switch between preview and still shot modes */
+	return reg_set(client, RJ54N1_STILL_CONTROL, (!enable) << 7, 0x80);
 }
 
 static int rj54n1_set_bus_param(struct soc_camera_device *icd,
@@ -510,6 +535,44 @@ static int rj54n1_commit(struct i2c_client *client)
 	return ret;
 }
 
+static int rj54n1_sensor_scale(struct v4l2_subdev *sd, u32 *in_w, u32 *in_h,
+			       u32 *out_w, u32 *out_h);
+
+static int rj54n1_s_crop(struct v4l2_subdev *sd, struct v4l2_crop *a)
+{
+	struct i2c_client *client = sd->priv;
+	struct rj54n1 *rj54n1 = to_rj54n1(client);
+	struct v4l2_rect *rect = &a->c;
+	unsigned int dummy, output_w, output_h,
+		input_w = rect->width, input_h = rect->height;
+	int ret;
+
+	/* arbitrary minimum width and height, edges unimportant */
+	soc_camera_limit_side(&dummy, &input_w,
+		     RJ54N1_COLUMN_SKIP, 8, RJ54N1_MAX_WIDTH);
+
+	soc_camera_limit_side(&dummy, &input_h,
+		     RJ54N1_ROW_SKIP, 8, RJ54N1_MAX_HEIGHT);
+
+	output_w = (input_w * 1024 + rj54n1->resize / 2) / rj54n1->resize;
+	output_h = (input_h * 1024 + rj54n1->resize / 2) / rj54n1->resize;
+
+	dev_dbg(&client->dev, "Scaling for %ux%u : %u = %ux%u\n",
+		input_w, input_h, rj54n1->resize, output_w, output_h);
+
+	ret = rj54n1_sensor_scale(sd, &input_w, &input_h, &output_w, &output_h);
+	if (ret < 0)
+		return ret;
+
+	rj54n1->width		= output_w;
+	rj54n1->height		= output_h;
+	rj54n1->resize		= ret;
+	rj54n1->rect.width	= input_w;
+	rj54n1->rect.height	= input_h;
+
+	return 0;
+}
+
 static int rj54n1_g_crop(struct v4l2_subdev *sd, struct v4l2_crop *a)
 {
 	struct i2c_client *client = sd->priv;
@@ -558,11 +621,44 @@ static int rj54n1_sensor_scale(struct v4l2_subdev *sd, u32 *in_w, u32 *in_h,
 			       u32 *out_w, u32 *out_h)
 {
 	struct i2c_client *client = sd->priv;
+	struct rj54n1 *rj54n1 = to_rj54n1(client);
 	unsigned int skip, resize, input_w = *in_w, input_h = *in_h,
 		output_w = *out_w, output_h = *out_h;
-	u16 inc_sel;
+	u16 inc_sel, wb_bit8, wb_left, wb_right, wb_top, wb_bottom;
+	unsigned int peak, peak_50, peak_60;
 	int ret;
 
+	/*
+	 * We have a problem with crops where the input window is larger than
+	 * 512x384 and the output window is larger than half of the input one.
+	 * In this case we have to reduce either the input window to 512x384
+	 * or below, or the output window to half of the input or below.
+	 */
+	if (output_w > max(512U, input_w / 2)) {
+		if (2 * output_w > RJ54N1_MAX_WIDTH) {
+			input_w = RJ54N1_MAX_WIDTH;
+			output_w = RJ54N1_MAX_WIDTH / 2;
+		} else {
+			input_w = output_w * 2;
+		}
+
+		dev_dbg(&client->dev, "Adjusted output width: in %u, out %u\n",
+			input_w, output_w);
+	}
+
+	if (output_h > max(384U, input_h / 2)) {
+		if (2 * output_h > RJ54N1_MAX_HEIGHT) {
+			input_h = RJ54N1_MAX_HEIGHT;
+			output_h = RJ54N1_MAX_HEIGHT / 2;
+		} else {
+			input_h = output_h * 2;
+		}
+
+		dev_dbg(&client->dev, "Adjusted output height: in %u, out %u\n",
+			input_h, output_h);
+	}
+
+	/* Idea: use the read mode for snapshots, handle separate geometries */
 	ret = rj54n1_set_rect(client, RJ54N1_X_OUTPUT_SIZE_S_L,
 			      RJ54N1_Y_OUTPUT_SIZE_S_L,
 			      RJ54N1_XY_OUTPUT_SIZE_S_H, output_w, output_h);
@@ -574,17 +670,27 @@ static int rj54n1_sensor_scale(struct v4l2_subdev *sd, u32 *in_w, u32 *in_h,
 	if (ret < 0)
 		return ret;
 
-	if (output_w > input_w || output_h > input_h) {
+	if (output_w > input_w && output_h > input_h) {
 		input_w = output_w;
 		input_h = output_h;
 
 		resize = 1024;
 	} else {
 		unsigned int resize_x, resize_y;
-		resize_x = input_w * 1024 / output_w;
-		resize_y = input_h * 1024 / output_h;
-
-		resize = min(resize_x, resize_y);
+		resize_x = (input_w * 1024 + output_w / 2) / output_w;
+		resize_y = (input_h * 1024 + output_h / 2) / output_h;
+
+		/* We want max(resize_x, resize_y), check if it still fits */
+		if (resize_x > resize_y &&
+		    (output_h * resize_x + 512) / 1024 > RJ54N1_MAX_HEIGHT)
+			resize = (RJ54N1_MAX_HEIGHT * 1024 + output_h / 2) /
+				output_h;
+		else if (resize_y > resize_x &&
+			 (output_w * resize_y + 512) / 1024 > RJ54N1_MAX_WIDTH)
+			resize = (RJ54N1_MAX_WIDTH * 1024 + output_w / 2) /
+				output_w;
+		else
+			resize = max(resize_x, resize_y);
 
 		/* Prohibited value ranges */
 		switch (resize) {
@@ -597,12 +703,9 @@ static int rj54n1_sensor_scale(struct v4l2_subdev *sd, u32 *in_w, u32 *in_h,
 		case 8160 ... 8191:
 			resize = 8159;
 			break;
-		case 16320 ... 16383:
+		case 16320 ... 16384:
 			resize = 16319;
 		}
-
-		input_w = output_w * resize / 1024;
-		input_h = output_h * resize / 1024;
 	}
 
 	/* Set scaling */
@@ -615,9 +718,18 @@ static int rj54n1_sensor_scale(struct v4l2_subdev *sd, u32 *in_w, u32 *in_h,
 
 	/*
 	 * Configure a skipping bitmask. The sensor will select a skipping value
-	 * among set bits automatically.
+	 * among set bits automatically. This is very unclear in the datasheet
+	 * too. I was told that in this register one enables all the skipping
+	 * values required for a specific resize, and the camera automatically
+	 * selects which ones to use. But it is unclear how to identify which
+	 * skipping values are needed. Secondly, why don't we just set all the
+	 * bits and let the camera choose? Would that increase processing time
+	 * and reduce the framerate? Using 0xfffc for INC_USE_SEL doesn't seem
+	 * to improve image quality or stability for larger frames (see the
+	 * comment above), but I didn't check the framerate.
 	 */
 	skip = min(resize / 1024, (unsigned)15);
+
 	inc_sel = 1 << skip;
 
 	if (inc_sel <= 2)
@@ -629,6 +741,43 @@ static int rj54n1_sensor_scale(struct v4l2_subdev *sd, u32 *in_w, u32 *in_h,
 	if (!ret)
 		ret = reg_write(client, RJ54N1_INC_USE_SEL_H, inc_sel >> 8);
 
+	if (!rj54n1->auto_wb) {
+		/* Auto white balance window */
+		wb_left	  = output_w / 16;
+		wb_right  = (3 * output_w / 4 - 3) / 4;
+		wb_top	  = output_h / 16;
+		wb_bottom = (3 * output_h / 4 - 3) / 4;
+		wb_bit8	  = ((wb_left >> 2) & 0x40) | ((wb_top >> 4) & 0x10) |
+			((wb_right >> 6) & 4) | ((wb_bottom >> 8) & 1);
+
+		if (!ret)
+			ret = reg_write(client, RJ54N1_BIT8_WB, wb_bit8);
+		if (!ret)
+			ret = reg_write(client, RJ54N1_HCAPS_WB, wb_left);
+		if (!ret)
+			ret = reg_write(client, RJ54N1_VCAPS_WB, wb_top);
+		if (!ret)
+			ret = reg_write(client, RJ54N1_HCAPE_WB, wb_right);
+		if (!ret)
+			ret = reg_write(client, RJ54N1_VCAPE_WB, wb_bottom);
+	}
+
+	/* Antiflicker */
+	peak = 12 * RJ54N1_MAX_WIDTH * (1 << 14) * resize / rj54n1->tgclk_mhz /
+		10000;
+	peak_50 = peak / 6;
+	peak_60 = peak / 5;
+
+	if (!ret)
+		ret = reg_write(client, RJ54N1_PEAK_H,
+				((peak_50 >> 4) & 0xf0) | (peak_60 >> 8));
+	if (!ret)
+		ret = reg_write(client, RJ54N1_PEAK_50, peak_50);
+	if (!ret)
+		ret = reg_write(client, RJ54N1_PEAK_60, peak_60);
+	if (!ret)
+		ret = reg_write(client, RJ54N1_PEAK_DIFF, peak / 150);
+
 	/* Start resizing */
 	if (!ret)
 		ret = reg_write(client, RJ54N1_RESIZE_CONTROL,
@@ -637,8 +786,6 @@ static int rj54n1_sensor_scale(struct v4l2_subdev *sd, u32 *in_w, u32 *in_h,
 	if (ret < 0)
 		return ret;
 
-	dev_dbg(&client->dev, "resize %u, skip %u\n", resize, skip);
-
 	/* Constant taken from manufacturer's example */
 	msleep(230);
 
@@ -646,11 +793,14 @@ static int rj54n1_sensor_scale(struct v4l2_subdev *sd, u32 *in_w, u32 *in_h,
 	if (ret < 0)
 		return ret;
 
-	*in_w = input_w;
-	*in_h = input_h;
+	*in_w = (output_w * resize + 512) / 1024;
+	*in_h = (output_h * resize + 512) / 1024;
 	*out_w = output_w;
 	*out_h = output_h;
 
+	dev_dbg(&client->dev, "Scaled for %ux%u : %u = %ux%u, skip %u\n",
+		*in_w, *in_h, resize, output_w, output_h, skip);
+
 	return resize;
 }
 
@@ -661,14 +811,14 @@ static int rj54n1_set_clock(struct i2c_client *client)
 
 	/* Enable external clock */
 	ret = reg_write(client, RJ54N1_RESET_STANDBY, E_EXCLK | SOFT_STDBY);
-	/* Leave stand-by */
+	/* Leave stand-by. Note: use this when implementing suspend / resume */
 	if (!ret)
 		ret = reg_write(client, RJ54N1_RESET_STANDBY, E_EXCLK);
 
 	if (!ret)
-		ret = reg_write(client, RJ54N1_PLL_L, 2);
+		ret = reg_write(client, RJ54N1_PLL_L, PLL_L);
 	if (!ret)
-		ret = reg_write(client, RJ54N1_PLL_N, 0x31);
+		ret = reg_write(client, RJ54N1_PLL_N, PLL_N);
 
 	/* TGCLK dividers */
 	if (!ret)
@@ -727,6 +877,7 @@ static int rj54n1_set_clock(struct i2c_client *client)
 			"Resetting RJ54N1CB0C clock failed: %d!\n", ret);
 		return -EIO;
 	}
+
 	/* Start the PLL */
 	ret = reg_set(client, RJ54N1_OCLK_DSP, 1, 1);
 
@@ -739,6 +890,7 @@ static int rj54n1_set_clock(struct i2c_client *client)
 
 static int rj54n1_reg_init(struct i2c_client *client)
 {
+	struct rj54n1 *rj54n1 = to_rj54n1(client);
 	int ret = rj54n1_set_clock(client);
 
 	if (!ret)
@@ -761,14 +913,26 @@ static int rj54n1_reg_init(struct i2c_client *client)
 	if (!ret)
 		ret = reg_write(client, RJ54N1_Y_GAIN, 0x84);
 
-	/* Mirror the image back: default is upside down and left-to-right... */
+	/*
+	 * Mirror the image back: default is upside down and left-to-right...
+	 * Set manual preview / still shot switching
+	 */
 	if (!ret)
-		ret = reg_set(client, RJ54N1_MIRROR_STILL_MODE, 3, 3);
+		ret = reg_write(client, RJ54N1_MIRROR_STILL_MODE, 0x27);
 
 	if (!ret)
 		ret = reg_write_multiple(client, bank_4, ARRAY_SIZE(bank_4));
+
+	/* Auto exposure area */
+	if (!ret)
+		ret = reg_write(client, RJ54N1_EXPOSURE_CONTROL, 0x80);
+	/* Check current auto WB config */
 	if (!ret)
+		ret = reg_read(client, RJ54N1_WB_SEL_WEIGHT_I);
+	if (ret >= 0) {
+		rj54n1->auto_wb = ret & 0x80;
 		ret = reg_write_multiple(client, bank_5, ARRAY_SIZE(bank_5));
+	}
 	if (!ret)
 		ret = reg_write_multiple(client, bank_8, ARRAY_SIZE(bank_8));
 
@@ -785,8 +949,9 @@ static int rj54n1_reg_init(struct i2c_client *client)
 		ret = reg_write(client, RJ54N1_RESET_STANDBY,
 				E_EXCLK | DSP_RSTX | TG_RSTX | SEN_RSTX);
 
+	/* Start register update? Same register as 0x?FE in many bank_* sets */
 	if (!ret)
-		ret = reg_write(client, 0x7fe, 2);
+		ret = reg_write(client, RJ54N1_FWFLG, 2);
 
 	/* Constant taken from manufacturer's example */
 	msleep(700);
@@ -794,7 +959,6 @@ static int rj54n1_reg_init(struct i2c_client *client)
 	return ret;
 }
 
-/* FIXME: streaming output only up to 800x600 is functional */
 static int rj54n1_try_fmt(struct v4l2_subdev *sd,
 			  struct v4l2_imgbus_framefmt *imgf)
 {
@@ -825,18 +989,13 @@ static int rj54n1_s_fmt(struct v4l2_subdev *sd,
 		input_w = rj54n1->rect.width, input_h = rj54n1->rect.height;
 	int ret;
 
-	/*
-	 * The host driver can call us without .try_fmt(), so, we have to take
-	 * care ourseleves
-	 */
-	ret = rj54n1_try_fmt(sd, imgf);
+	rj54n1_try_fmt(sd, imgf);
 
 	/*
 	 * Verify if the sensor has just been powered on. TODO: replace this
 	 * with proper PM, when a suitable API is available.
 	 */
-	if (!ret)
-		ret = reg_read(client, RJ54N1_RESET_STANDBY);
+	ret = reg_read(client, RJ54N1_RESET_STANDBY);
 	if (ret < 0)
 		return ret;
 
@@ -846,6 +1005,9 @@ static int rj54n1_s_fmt(struct v4l2_subdev *sd,
 			return ret;
 	}
 
+	dev_dbg(&client->dev, "%s: code = %d, width = %u, height = %u\n",
+		__func__, imgf->code, imgf->width, imgf->height);
+
 	/* RA_SEL_UL is only relevant for raw modes, ignored otherwise. */
 	switch (imgf->code) {
 	case V4L2_IMGBUS_FMT_YUYV:
@@ -1026,6 +1188,14 @@ static const struct v4l2_queryctrl rj54n1_controls[] = {
 		.step		= 1,
 		.default_value	= 66,
 		.flags		= V4L2_CTRL_FLAG_SLIDER,
+	}, {
+		.id		= V4L2_CID_AUTO_WHITE_BALANCE,
+		.type		= V4L2_CTRL_TYPE_BOOLEAN,
+		.name		= "Auto white balance",
+		.minimum	= 0,
+		.maximum	= 1,
+		.step		= 1,
+		.default_value	= 1,
 	},
 };
 
@@ -1039,6 +1209,7 @@ static struct soc_camera_ops rj54n1_ops = {
 static int rj54n1_g_ctrl(struct v4l2_subdev *sd, struct v4l2_control *ctrl)
 {
 	struct i2c_client *client = sd->priv;
+	struct rj54n1 *rj54n1 = to_rj54n1(client);
 	int data;
 
 	switch (ctrl->id) {
@@ -1061,6 +1232,9 @@ static int rj54n1_g_ctrl(struct v4l2_subdev *sd, struct v4l2_control *ctrl)
 
 		ctrl->value = data / 2;
 		break;
+	case V4L2_CID_AUTO_WHITE_BALANCE:
+		ctrl->value = rj54n1->auto_wb;
+		break;
 	}
 
 	return 0;
@@ -1070,6 +1244,7 @@ static int rj54n1_s_ctrl(struct v4l2_subdev *sd, struct v4l2_control *ctrl)
 {
 	int data;
 	struct i2c_client *client = sd->priv;
+	struct rj54n1 *rj54n1 = to_rj54n1(client);
 	const struct v4l2_queryctrl *qctrl;
 
 	qctrl = soc_camera_find_qctrl(&rj54n1_ops, ctrl->id);
@@ -1100,6 +1275,13 @@ static int rj54n1_s_ctrl(struct v4l2_subdev *sd, struct v4l2_control *ctrl)
 		else if (reg_write(client, RJ54N1_Y_GAIN, ctrl->value * 2) < 0)
 			return -EIO;
 		break;
+	case V4L2_CID_AUTO_WHITE_BALANCE:
+		/* Auto WB area - whole image */
+		if (reg_set(client, RJ54N1_WB_SEL_WEIGHT_I, ctrl->value << 7,
+			    0x80) < 0)
+			return -EIO;
+		rj54n1->auto_wb = ctrl->value;
+		break;
 	}
 
 	return 0;
@@ -1120,12 +1302,10 @@ static struct v4l2_subdev_video_ops rj54n1_subdev_video_ops = {
 	.s_imgbus_fmt		= rj54n1_s_fmt,
 	.g_imgbus_fmt		= rj54n1_g_fmt,
 	.try_imgbus_fmt		= rj54n1_try_fmt,
-	.enum_imgbus_fmt	= rj54n1_enum_fmt,
-	.s_imgbus_fmt		= rj54n1_s_fmt,
-	.g_imgbus_fmt		= rj54n1_g_fmt,
-	.try_imgbus_fmt		= rj54n1_try_fmt,
+	.s_crop			= rj54n1_s_crop,
 	.g_crop			= rj54n1_g_crop,
 	.cropcap		= rj54n1_cropcap,
+	.enum_imgbus_fmt	= rj54n1_enum_fmt,
 };
 
 static struct v4l2_subdev_ops rj54n1_subdev_ops = {
@@ -1133,21 +1313,13 @@ static struct v4l2_subdev_ops rj54n1_subdev_ops = {
 	.video	= &rj54n1_subdev_video_ops,
 };
 
-static int rj54n1_pin_config(struct i2c_client *client)
-{
-	/*
-	 * Experimentally found out IOCTRL wired to 0. TODO: add to platform
-	 * data: 0 or 1 << 7.
-	 */
-	return reg_write(client, RJ54N1_IOC, 0);
-}
-
 /*
  * Interface active, can use i2c. If it fails, it can indeed mean, that
  * this wasn't our capture interface, so, we wait for the right one
  */
 static int rj54n1_video_probe(struct soc_camera_device *icd,
-			      struct i2c_client *client)
+			      struct i2c_client *client,
+			      struct rj54n1_pdata *priv)
 {
 	int data1, data2;
 	int ret;
@@ -1168,7 +1340,8 @@ static int rj54n1_video_probe(struct soc_camera_device *icd,
 		goto ei2c;
 	}
 
-	ret = rj54n1_pin_config(client);
+	/* Configure IOCTL polarity from the platform data: 0 or 1 << 7. */
+	ret = reg_write(client, RJ54N1_IOC, priv->ioctl_high << 7);
 	if (ret < 0)
 		goto ei2c;
 
@@ -1186,6 +1359,7 @@ static int rj54n1_probe(struct i2c_client *client,
 	struct soc_camera_device *icd = client->dev.platform_data;
 	struct i2c_adapter *adapter = to_i2c_adapter(client->dev.parent);
 	struct soc_camera_link *icl;
+	struct rj54n1_pdata *rj54n1_priv;
 	int ret;
 
 	if (!icd) {
@@ -1194,11 +1368,13 @@ static int rj54n1_probe(struct i2c_client *client,
 	}
 
 	icl = to_soc_camera_link(icd);
-	if (!icl) {
+	if (!icl || !icl->priv) {
 		dev_err(&client->dev, "RJ54N1CB0C: missing platform data!\n");
 		return -EINVAL;
 	}
 
+	rj54n1_priv = icl->priv;
+
 	if (!i2c_check_functionality(adapter, I2C_FUNC_SMBUS_BYTE_DATA)) {
 		dev_warn(&adapter->dev,
 			 "I2C-Adapter doesn't support I2C_FUNC_SMBUS_BYTE\n");
@@ -1222,8 +1398,10 @@ static int rj54n1_probe(struct i2c_client *client,
 	rj54n1->height		= RJ54N1_MAX_HEIGHT;
 	rj54n1->code		= rj54n1_colour_codes[0];
 	rj54n1->resize		= 1024;
+	rj54n1->tgclk_mhz	= (rj54n1_priv->mclk_freq / PLL_L * PLL_N) /
+		(clk_div.ratio_tg + 1) / (clk_div.ratio_t + 1);
 
-	ret = rj54n1_video_probe(icd, client);
+	ret = rj54n1_video_probe(icd, client, rj54n1_priv);
 	if (ret < 0) {
 		icd->ops = NULL;
 		i2c_set_clientdata(client, NULL);
diff --git a/include/media/rj54n1cb0c.h b/include/media/rj54n1cb0c.h
new file mode 100644
index 0000000..8ae3288
--- /dev/null
+++ b/include/media/rj54n1cb0c.h
@@ -0,0 +1,19 @@
+/*
+ * RJ54N1CB0C Private data
+ *
+ * Copyright (C) 2009, Guennadi Liakhovetski <g.liakhovetski@gmx.de>
+ *
+ * This program is free software; you can redistribute it and/or modify
+ * it under the terms of the GNU General Public License version 2 as
+ * published by the Free Software Foundation.
+ */
+
+#ifndef __RJ54N1CB0C_H__
+#define __RJ54N1CB0C_H__
+
+struct rj54n1_pdata {
+	unsigned int	mclk_freq;
+	bool		ioctl_high;
+};
+
+#endif
-- 
1.6.2.4


^ permalink raw reply related	[flat|nested] 51+ messages in thread

* RE: [PATCH/RFC 0/9 v2] Image-bus API and accompanying soc-camera patches
  2009-10-30 14:34 ` [PATCH/RFC 0/9 v2] Image-bus API and accompanying soc-camera patches Karicheri, Muralidharan
@ 2009-10-30 20:12   ` Guennadi Liakhovetski
  0 siblings, 0 replies; 51+ messages in thread
From: Guennadi Liakhovetski @ 2009-10-30 20:12 UTC (permalink / raw)
  To: Karicheri, Muralidharan
  Cc: Linux Media Mailing List, Hans Verkuil, Laurent Pinchart, Sakari Ailus

On Fri, 30 Oct 2009, Karicheri, Muralidharan wrote:

> Guennadi,
> 
> Thanks for updating the driver. I will integrate it when I get a chance and let you know if I see any issues.
> 
> BTW, is there someone developing a driver for the MT9P031 sensor, which
> is very similar to the MT9T031? Do you suggest a separate driver for
> this sensor, or adding the support to MT9T031? I need a driver for this
> and plan to add it soon.

It depends on the differences between mt9t031 and mt9p031, of course. I had 
a brief look at the mt9p031 datasheet while placing it next to mt9t031, 
and for my taste there are already too many differences to pack them into 
one driver. MT9T031 was also very similar to MT9M001, I think, but 
copy-pasting actually did me a disservice:-) there turned out to be 
too many subtle differences in the end.

Thanks
Guennadi
---
Guennadi Liakhovetski, Ph.D.
Freelance Open-Source Software Developer
http://www.open-technology.de/


* RE: [PATCH/RFC 9/9] mt9t031: make the use of the soc-camera client API optional
  2009-10-30 15:28   ` Karicheri, Muralidharan
@ 2009-10-30 20:25     ` Guennadi Liakhovetski
  2009-11-02 16:05       ` Karicheri, Muralidharan
  0 siblings, 1 reply; 51+ messages in thread
From: Guennadi Liakhovetski @ 2009-10-30 20:25 UTC (permalink / raw)
  To: Karicheri, Muralidharan
  Cc: Linux Media Mailing List, Hans Verkuil, Laurent Pinchart, Sakari Ailus

On Fri, 30 Oct 2009, Karicheri, Muralidharan wrote:

> Guennadi,
> 
> Thanks for taking the time to update this driver. But I still don't think
> it is in a state to be re-used on TI's VPFE platform. Please see
> below for my comments.
> 
> Murali Karicheri
> Software Design Engineer
> Texas Instruments Inc.
> Germantown, MD 20874
> email: m-karicheri2@ti.com
> 
> >-----Original Message-----
> >From: Guennadi Liakhovetski [mailto:g.liakhovetski@gmx.de]
> >Sent: Friday, October 30, 2009 10:02 AM
> >To: Linux Media Mailing List
> >Cc: Hans Verkuil; Laurent Pinchart; Sakari Ailus; Karicheri, Muralidharan
> >Subject: [PATCH/RFC 9/9] mt9t031: make the use of the soc-camera client API
> >optional
> >
> >Now that we have moved most of the functions over to the v4l2-subdev API,
> >only
> >quering and setting bus parameters are still performed using the legacy
> >soc-camera client API. Make the use of this API optional for mt9t031.
> >
> >Signed-off-by: Guennadi Liakhovetski <g.liakhovetski@gmx.de>
> >---
> >
> >Muralidharan, this one is for you to test. To differentiate between the
> >soc-camera case and a generic user I check i2c client's platform data
> >(client->dev.platform_data), so, you have to make sure your user doesn't
> >use that field for something else.
> >
> Currently I am using this field for bus parameters such as pclk polarity.
> If there is an API (bus parameter) to set this after probing the sensor, 
> that may work too. I will check your latest driver and let you know if
> I see an issue in migrating to this version.

No, that shall come with the bus-configuration API. I was already thinking 
about switching to passing a pointer to struct soc_camera_link or 
something similar in platform_data, because that's exactly what that 
struct is for. Of course, we would have to agree on a specific object with 
platform parameters there for non-soc-camera drivers as well.

> >One more note: I'm not sure about where v4l2_device_unregister_subdev()
> >should be called. In soc-camera the core calls
> >v4l2_i2c_new_subdev_board(), which then calls
> >v4l2_device_register_subdev(). Logically, it's also the core that then
> >calls v4l2_device_unregister_subdev(). Whereas I see many other client
> >drivers call v4l2_device_unregister_subdev() internally. So, if your
> >bridge driver does not call v4l2_device_unregister_subdev() itself and
> >expects the client to call it, there will be a slight problem with that
> >too.
> 
> In my bridge driver, v4l2_i2c_new_subdev_board() is also called to load 
> the subdevice. When the bridge driver is removed (remove() call), it 
> calls v4l2_device_unregister(), which will unregister the v4l2 device and 
> all subdevices (in turn calling v4l2_device_unregister_subdev()). But 
> most of the subdevice drivers also call v4l2_device_unregister_subdev() in 
> the remove() function of the module (so does the version of the mt9t031 
> that I use). So even if that call is kept in the mt9t031 sensor driver 
> (not sure if someone uses it as a standalone driver), it would just 
> return, since the v4l2_dev ptr in the sd ptr would have been set to NULL 
> as a result of the bridge driver's remove() call. What do you think?

...as long as sd has not been freed by then. But in the case of mt9t031 
the subdevice is embedded in the driver-instance object, which is freed 
when the respective I2C device gets unregistered or the driver is 
unloaded. So, yes, you could call it twice here.

> See also some comments below..
> 
> >
> > drivers/media/video/mt9t031.c |  146 ++++++++++++++++++++-----------------
> >----
> > 1 files changed, 70 insertions(+), 76 deletions(-)
> >
> >diff --git a/drivers/media/video/mt9t031.c b/drivers/media/video/mt9t031.c
> >index c95c277..49357bd 100644
> >--- a/drivers/media/video/mt9t031.c
> >+++ b/drivers/media/video/mt9t031.c
> >@@ -204,6 +204,59 @@ static unsigned long mt9t031_query_bus_param(struct
> >soc_camera_device *icd)
> > 	return soc_camera_apply_sensor_flags(icl, MT9T031_BUS_PARAM);
> > }
> >
> >+static const struct v4l2_queryctrl mt9t031_controls[] = {
> >+	{
> >+		.id		= V4L2_CID_VFLIP,
> >+		.type		= V4L2_CTRL_TYPE_BOOLEAN,
> >+		.name		= "Flip Vertically",
> >+		.minimum	= 0,
> >+		.maximum	= 1,
> >+		.step		= 1,
> >+		.default_value	= 0,
> >+	}, {
> >+		.id		= V4L2_CID_HFLIP,
> >+		.type		= V4L2_CTRL_TYPE_BOOLEAN,
> >+		.name		= "Flip Horizontally",
> >+		.minimum	= 0,
> >+		.maximum	= 1,
> >+		.step		= 1,
> >+		.default_value	= 0,
> >+	}, {
> >+		.id		= V4L2_CID_GAIN,
> >+		.type		= V4L2_CTRL_TYPE_INTEGER,
> >+		.name		= "Gain",
> >+		.minimum	= 0,
> >+		.maximum	= 127,
> >+		.step		= 1,
> >+		.default_value	= 64,
> >+		.flags		= V4L2_CTRL_FLAG_SLIDER,
> >+	}, {
> >+		.id		= V4L2_CID_EXPOSURE,
> >+		.type		= V4L2_CTRL_TYPE_INTEGER,
> >+		.name		= "Exposure",
> >+		.minimum	= 1,
> >+		.maximum	= 255,
> >+		.step		= 1,
> >+		.default_value	= 255,
> >+		.flags		= V4L2_CTRL_FLAG_SLIDER,
> >+	}, {
> >+		.id		= V4L2_CID_EXPOSURE_AUTO,
> >+		.type		= V4L2_CTRL_TYPE_BOOLEAN,
> >+		.name		= "Automatic Exposure",
> >+		.minimum	= 0,
> >+		.maximum	= 1,
> >+		.step		= 1,
> >+		.default_value	= 1,
> >+	}
> >+};
> >+
> >+static struct soc_camera_ops mt9t031_ops = {
> >+	.set_bus_param		= mt9t031_set_bus_param,
> >+	.query_bus_param	= mt9t031_query_bus_param,
> >+	.controls		= mt9t031_controls,
> >+	.num_controls		= ARRAY_SIZE(mt9t031_controls),
> >+};
> >+
> 
> [MK] Why don't you implement the queryctrl op in core? query_bus_param()
> & set_bus_param() can be implemented as subdevice operations as well,
> right? I think we need to get the bus parameter RFC implemented, and
> this driver could be targeted for its first use so that we could
> work together to get it accepted. I didn't get a chance to study your 
> bus image format RFC, but plan to review it soon and to see if it can be
> used on my platform as well. For this driver to be used on our platform,
> all references to soc_ must be removed. I am OK if the structure is
> re-used, but if this driver calls any soc_camera function, it cannot
> be used on my platform.

Why? Some soc-camera functions are just library functions; you just have 
to build soc-camera into your kernel (also see below).

> BTW, I am attaching a version of the driver that we use in our kernel 
> tree for your reference which will give you an idea of my requirement.
> 

[snip]

> >@@ -565,7 +562,6 @@ static int mt9t031_s_ctrl(struct v4l2_subdev *sd,
> >struct v4l2_control *ctrl)
> > {
> > 	struct i2c_client *client = sd->priv;
> > 	struct mt9t031 *mt9t031 = to_mt9t031(client);
> >-	struct soc_camera_device *icd = client->dev.platform_data;
> > 	const struct v4l2_queryctrl *qctrl;
> > 	int data;
> >
> >@@ -657,7 +653,8 @@ static int mt9t031_s_ctrl(struct v4l2_subdev *sd,
> >struct v4l2_control *ctrl)
> >
> > 			if (set_shutter(client, total_h) < 0)
> > 				return -EIO;
> >-			qctrl = soc_camera_find_qctrl(icd->ops,
> >V4L2_CID_EXPOSURE);
> >+			qctrl = soc_camera_find_qctrl(&mt9t031_ops,
> >+						      V4L2_CID_EXPOSURE);
> 
> [MK] Why do we still need this call? In my version of the sensor driver,
> I just implement the queryctrl() operation in core_ops. This cannot work
> since soc_camera_find_qctrl() is implemented only in SoC camera.

As mentioned above, that's just a library function without any further 
dependencies, so why reimplement it?

Thanks
Guennadi
---
Guennadi Liakhovetski, Ph.D.
Freelance Open-Source Software Developer
http://www.open-technology.de/


* RE: [PATCH 2/9] v4l: add new v4l2-subdev sensor operations, use g_skip_top_lines in soc-camera
  2009-10-30 14:43   ` Karicheri, Muralidharan
@ 2009-10-30 20:31     ` Guennadi Liakhovetski
  2009-11-02 16:14       ` Karicheri, Muralidharan
  0 siblings, 1 reply; 51+ messages in thread
From: Guennadi Liakhovetski @ 2009-10-30 20:31 UTC (permalink / raw)
  To: Karicheri, Muralidharan
  Cc: Linux Media Mailing List, Hans Verkuil, Laurent Pinchart, Sakari Ailus

On Fri, 30 Oct 2009, Karicheri, Muralidharan wrote:

> Guennadi,
> 
> 
> > 	mt9m111->rect.left	= MT9M111_MIN_DARK_COLS;
> > 	mt9m111->rect.top	= MT9M111_MIN_DARK_ROWS;
> >diff --git a/drivers/media/video/mt9t031.c b/drivers/media/video/mt9t031.c
> >index 6966f64..57e04e9 100644
> >--- a/drivers/media/video/mt9t031.c
> >+++ b/drivers/media/video/mt9t031.c
> >@@ -301,9 +301,9 @@ static int mt9t031_set_params(struct soc_camera_device
> >*icd,
> > 		ret = reg_write(client, MT9T031_WINDOW_WIDTH, rect->width - 1);
> > 	if (ret >= 0)
> > 		ret = reg_write(client, MT9T031_WINDOW_HEIGHT,
> >-				rect->height + icd->y_skip_top - 1);
> >+				rect->height - 1);

> Why is y_skip_top removed?

Because no one ever said they needed it?

> When I connect the sensor output to our SoC 
> input and do format conversion and resize on the fly (frame-by-frame 
> conversion before writing to SDRAM), I have found that the frame 
> completion interrupt fails to get generated with a zero value for 
> y_skip_top. I have used a value
> of 10 and it worked fine for me. So I would like to have an 
> s_skip_top_lines() in the sensor operations which can be called to 
> update this value from the host/bridge driver.

Hm, strange, that's actually not the purpose of this parameter. Wouldn't 
it work for you just as well if you simply request 10 more lines when 
sending s_fmt from your bridge driver?

Thanks
Guennadi
---
Guennadi Liakhovetski, Ph.D.
Freelance Open-Source Software Developer
http://www.open-technology.de/


* RE: [PATCH/RFC 9/9] mt9t031: make the use of the soc-camera client API optional
  2009-10-30 20:25     ` Guennadi Liakhovetski
@ 2009-11-02 16:05       ` Karicheri, Muralidharan
  2009-11-04 16:49         ` [PATCH/RFC 9/9 v2] " Guennadi Liakhovetski
  0 siblings, 1 reply; 51+ messages in thread
From: Karicheri, Muralidharan @ 2009-11-02 16:05 UTC (permalink / raw)
  To: Guennadi Liakhovetski
  Cc: Linux Media Mailing List, Hans Verkuil, Laurent Pinchart, Sakari Ailus

Guennadi,

Thanks for the reply.

>> >+};
>> >+
>> >+static struct soc_camera_ops mt9t031_ops = {
>> >+	.set_bus_param		= mt9t031_set_bus_param,
>> >+	.query_bus_param	= mt9t031_query_bus_param,
>> >+	.controls		= mt9t031_controls,
>> >+	.num_controls		= ARRAY_SIZE(mt9t031_controls),
>> >+};
>> >+
>>
>> [MK] Why don't you implement the queryctrl op in core? query_bus_param()
>> & set_bus_param() can be implemented as subdevice operations as well,
>> right? I think we need to get the bus parameter RFC implemented, and
>> this driver could be targeted for its first use so that we could
>> work together to get it accepted. I didn't get a chance to study your
>> bus image format RFC, but plan to review it soon and to see if it can be
>> used on my platform as well. For this driver to be used on our platform,
>> all references to soc_ must be removed. I am OK if the structure is
>> re-used, but if this driver calls any soc_camera function, it cannot
>> be used on my platform.
>
>Why? Some soc-camera functions are just library functions, you just have
>to build soc-camera into your kernel. (also see below)
>
My point is that the control is for the sensor device, so why implement 
queryctrl in SoC camera? Just for this I need to include SoC camera in my 
build? That doesn't make any sense at all. IMHO, queryctrl() logically 
belongs to this sensor driver, which can be called from the bridge driver 
using a subdev API call. Any reverse dependency from MT9T031 on SoC camera 
is to be removed if it is to be re-used across other platforms. Can we 
agree on this? Did you have a chance to compare the driver file that I had 
sent to you?

Thanks.

Murali
>> BTW, I am attaching a version of the driver that we use in our kernel
>> tree for your reference which will give you an idea of my requirement.
>>
>
>[snip]
>
>> >@@ -565,7 +562,6 @@ static int mt9t031_s_ctrl(struct v4l2_subdev *sd,
>> >struct v4l2_control *ctrl)
>> > {
>> > 	struct i2c_client *client = sd->priv;
>> > 	struct mt9t031 *mt9t031 = to_mt9t031(client);
>> >-	struct soc_camera_device *icd = client->dev.platform_data;
>> > 	const struct v4l2_queryctrl *qctrl;
>> > 	int data;
>> >
>> >@@ -657,7 +653,8 @@ static int mt9t031_s_ctrl(struct v4l2_subdev *sd,
>> >struct v4l2_control *ctrl)
>> >
>> > 			if (set_shutter(client, total_h) < 0)
>> > 				return -EIO;
>> >-			qctrl = soc_camera_find_qctrl(icd->ops,
>> >V4L2_CID_EXPOSURE);
>> >+			qctrl = soc_camera_find_qctrl(&mt9t031_ops,
>> >+						      V4L2_CID_EXPOSURE);
>>
>> [MK] Why do we still need this call? In my version of the sensor driver,
>> I just implement the queryctrl() operation in core_ops. This cannot work
>> since soc_camera_find_qctrl() is implemented only in SoC camera.
>
>As mentioned above, that's just a library function without any further
>dependencies, so, why reimplement it?
>
>Thanks
>Guennadi
>---
>Guennadi Liakhovetski, Ph.D.
>Freelance Open-Source Software Developer
>http://www.open-technology.de/



* RE: [PATCH 2/9] v4l: add new v4l2-subdev sensor operations, use g_skip_top_lines in soc-camera
  2009-10-30 20:31     ` Guennadi Liakhovetski
@ 2009-11-02 16:14       ` Karicheri, Muralidharan
  2009-11-04 19:11         ` Guennadi Liakhovetski
  0 siblings, 1 reply; 51+ messages in thread
From: Karicheri, Muralidharan @ 2009-11-02 16:14 UTC (permalink / raw)
  To: Guennadi Liakhovetski
  Cc: Linux Media Mailing List, Hans Verkuil, Laurent Pinchart, Sakari Ailus

Guennadi,

Murali Karicheri
Software Design Engineer
Texas Instruments Inc.
Germantown, MD 20874
phone: 301-407-9583
email: m-karicheri2@ti.com

>-----Original Message-----
>From: linux-media-owner@vger.kernel.org [mailto:linux-media-
>owner@vger.kernel.org] On Behalf Of Guennadi Liakhovetski
>Sent: Friday, October 30, 2009 4:32 PM
>To: Karicheri, Muralidharan
>Cc: Linux Media Mailing List; Hans Verkuil; Laurent Pinchart; Sakari Ailus
>Subject: RE: [PATCH 2/9] v4l: add new v4l2-subdev sensor operations, use
>g_skip_top_lines in soc-camera
>
>On Fri, 30 Oct 2009, Karicheri, Muralidharan wrote:
>
>> Guennadi,
>>
>>
>> > 	mt9m111->rect.left	= MT9M111_MIN_DARK_COLS;
>> > 	mt9m111->rect.top	= MT9M111_MIN_DARK_ROWS;
>> >diff --git a/drivers/media/video/mt9t031.c
>b/drivers/media/video/mt9t031.c
>> >index 6966f64..57e04e9 100644
>> >--- a/drivers/media/video/mt9t031.c
>> >+++ b/drivers/media/video/mt9t031.c
>> >@@ -301,9 +301,9 @@ static int mt9t031_set_params(struct
>soc_camera_device
>> >*icd,
>> > 		ret = reg_write(client, MT9T031_WINDOW_WIDTH, rect->width - 1);
>> > 	if (ret >= 0)
>> > 		ret = reg_write(client, MT9T031_WINDOW_HEIGHT,
>> >-				rect->height + icd->y_skip_top - 1);
>> >+				rect->height - 1);
>
>> Why is y_skip_top removed?
>
>Because no one ever said they needed it?
>
I suggest you keep it; it can have a default of 0. I have not inspected the 
resulting image to see whether the top lines are corrupted. I just
send it to my display device and I am not seeing any
corruption there. I need to examine the image at some point to check.
>> When I connect the sensor output to our SOC
>> input and do format conversion and resize on the fly (frame by frame
>> conversion before writing to SDRAM) I have found that the frame
>> completion interrupt fails to get generated with zero value for
>> y_skip_top. I have used a value
>> of 10 and it worked fine for me. So I would like to have a
>> s_skip_top_lines() in the sensor operations which can be called to
>> update this value from the host/bridge driver.
>
>Hm, strange, that's actually not the purpose of this parameter. Wouldn't
>it work for you just as well, if you just request 10 more lines when
>sending s_fmt from your bridge driver?
Ok. It might work by asking some additional lines from the bridge driver.
I will try this out.
>
>Thanks
>Guennadi
>---
>Guennadi Liakhovetski, Ph.D.
>Freelance Open-Source Software Developer
>http://www.open-technology.de/
>--
>To unsubscribe from this list: send the line "unsubscribe linux-media" in
>the body of a message to majordomo@vger.kernel.org
>More majordomo info at  http://vger.kernel.org/majordomo-info.html



* [PATCH/RFC 9/9 v2] mt9t031: make the use of the soc-camera client API optional
  2009-11-02 16:05       ` Karicheri, Muralidharan
@ 2009-11-04 16:49         ` Guennadi Liakhovetski
  2009-11-04 16:57           ` Karicheri, Muralidharan
  2009-11-05 15:57           ` Hans Verkuil
  0 siblings, 2 replies; 51+ messages in thread
From: Guennadi Liakhovetski @ 2009-11-04 16:49 UTC (permalink / raw)
  To: Karicheri, Muralidharan
  Cc: Linux Media Mailing List, Hans Verkuil, Laurent Pinchart, Sakari Ailus

Now that we have moved most of the functions over to the v4l2-subdev API, only
quering and setting bus parameters are still performed using the legacy
soc-camera client API. Make the use of this API optional for mt9t031.

Signed-off-by: Guennadi Liakhovetski <g.liakhovetski@gmx.de>
---

On Mon, 2 Nov 2009, Karicheri, Muralidharan wrote:

> >> >+static struct soc_camera_ops mt9t031_ops = {
> >> >+	.set_bus_param		= mt9t031_set_bus_param,
> >> >+	.query_bus_param	= mt9t031_query_bus_param,
> >> >+	.controls		= mt9t031_controls,
> >> >+	.num_controls		= ARRAY_SIZE(mt9t031_controls),
> >> >+};
> >> >+
> >>
> >> [MK] Why don't you implement the queryctrl op in core? query_bus_param()
> >> & set_bus_param() can be implemented as subdevice operations as well,
> >> right? I think we need to get the bus parameter RFC implemented, and
> >> this driver could be targeted for its first use so that we could
> >> work together to get it accepted. I didn't get a chance to study your
> >> bus image format RFC, but plan to review it soon and to see if it can be
> >> used on my platform as well. For this driver to be used on our platform,
> >> all references to soc_ must be removed. I am OK if the structure is
> >> re-used, but if this driver calls any soc_camera function, it cannot
> >> be used on my platform.
> >
> >Why? Some soc-camera functions are just library functions, you just have
> >to build soc-camera into your kernel. (also see below)
> >
> My point is that the control is for the sensor device, so why implement 
> queryctrl in SoC camera? Just for this I need to include SoC camera in 
> my build? That doesn't make any sense at all. IMHO, queryctrl() 
> logically belongs to this sensor driver, which can be called from the 
> bridge driver using a subdev API call. Any reverse dependency from 
> MT9T031 on SoC camera is to be removed if it is to be re-used across 
> other platforms. Can we agree on this?

In general I'm sure you understand that there are lots of functions in 
the kernel that we use in specific modules, not because they interact 
with other systems, but because they implement some common functionality 
and just reduce code duplication. And I can well imagine that in many such 
cases using just one or a couple of such functions will pull in a much 
larger pile of unused code. But in this case those calls can indeed be 
very easily eliminated; please have a look at the version below.

> Did you have a chance to compare the driver file that I had sent to you?

I looked at it, but it is based on an earlier version of the driver, so 
it wasn't very easy to compare. Maybe you could send a diff against the 
mainline version on which it is based?

Thanks
Guennadi

 drivers/media/video/mt9t031.c |  167 +++++++++++++++++++++--------------------
 1 files changed, 85 insertions(+), 82 deletions(-)

diff --git a/drivers/media/video/mt9t031.c b/drivers/media/video/mt9t031.c
index c95c277..86bf8f6 100644
--- a/drivers/media/video/mt9t031.c
+++ b/drivers/media/video/mt9t031.c
@@ -204,6 +204,71 @@ static unsigned long mt9t031_query_bus_param(struct soc_camera_device *icd)
 	return soc_camera_apply_sensor_flags(icl, MT9T031_BUS_PARAM);
 }
 
+enum {
+	MT9T031_CTRL_VFLIP,
+	MT9T031_CTRL_HFLIP,
+	MT9T031_CTRL_GAIN,
+	MT9T031_CTRL_EXPOSURE,
+	MT9T031_CTRL_EXPOSURE_AUTO,
+};
+
+static const struct v4l2_queryctrl mt9t031_controls[] = {
+	[MT9T031_CTRL_VFLIP] = {
+		.id		= V4L2_CID_VFLIP,
+		.type		= V4L2_CTRL_TYPE_BOOLEAN,
+		.name		= "Flip Vertically",
+		.minimum	= 0,
+		.maximum	= 1,
+		.step		= 1,
+		.default_value	= 0,
+	},
+	[MT9T031_CTRL_HFLIP] = {
+		.id		= V4L2_CID_HFLIP,
+		.type		= V4L2_CTRL_TYPE_BOOLEAN,
+		.name		= "Flip Horizontally",
+		.minimum	= 0,
+		.maximum	= 1,
+		.step		= 1,
+		.default_value	= 0,
+	},
+	[MT9T031_CTRL_GAIN] = {
+		.id		= V4L2_CID_GAIN,
+		.type		= V4L2_CTRL_TYPE_INTEGER,
+		.name		= "Gain",
+		.minimum	= 0,
+		.maximum	= 127,
+		.step		= 1,
+		.default_value	= 64,
+		.flags		= V4L2_CTRL_FLAG_SLIDER,
+	},
+	[MT9T031_CTRL_EXPOSURE] = {
+		.id		= V4L2_CID_EXPOSURE,
+		.type		= V4L2_CTRL_TYPE_INTEGER,
+		.name		= "Exposure",
+		.minimum	= 1,
+		.maximum	= 255,
+		.step		= 1,
+		.default_value	= 255,
+		.flags		= V4L2_CTRL_FLAG_SLIDER,
+	},
+	[MT9T031_CTRL_EXPOSURE_AUTO] = {
+		.id		= V4L2_CID_EXPOSURE_AUTO,
+		.type		= V4L2_CTRL_TYPE_BOOLEAN,
+		.name		= "Automatic Exposure",
+		.minimum	= 0,
+		.maximum	= 1,
+		.step		= 1,
+		.default_value	= 1,
+	}
+};
+
+static struct soc_camera_ops mt9t031_ops = {
+	.set_bus_param		= mt9t031_set_bus_param,
+	.query_bus_param	= mt9t031_query_bus_param,
+	.controls		= mt9t031_controls,
+	.num_controls		= ARRAY_SIZE(mt9t031_controls),
+};
+
 /* target must be _even_ */
 static u16 mt9t031_skip(s32 *source, s32 target, s32 max)
 {
@@ -223,10 +288,9 @@ static u16 mt9t031_skip(s32 *source, s32 target, s32 max)
 }
 
 /* rect is the sensor rectangle, the caller guarantees parameter validity */
-static int mt9t031_set_params(struct soc_camera_device *icd,
+static int mt9t031_set_params(struct i2c_client *client,
 			      struct v4l2_rect *rect, u16 xskip, u16 yskip)
 {
-	struct i2c_client *client = to_i2c_client(to_soc_camera_control(icd));
 	struct mt9t031 *mt9t031 = to_mt9t031(client);
 	int ret;
 	u16 xbin, ybin;
@@ -307,8 +371,7 @@ static int mt9t031_set_params(struct soc_camera_device *icd,
 		if (ret >= 0) {
 			const u32 shutter_max = MT9T031_MAX_HEIGHT + vblank;
 			const struct v4l2_queryctrl *qctrl =
-				soc_camera_find_qctrl(icd->ops,
-						      V4L2_CID_EXPOSURE);
+				&mt9t031_controls[MT9T031_CTRL_EXPOSURE];
 			mt9t031->exposure = (shutter_max / 2 + (total_h - 1) *
 				 (qctrl->maximum - qctrl->minimum)) /
 				shutter_max + qctrl->minimum;
@@ -333,7 +396,6 @@ static int mt9t031_s_crop(struct v4l2_subdev *sd, struct v4l2_crop *a)
 	struct v4l2_rect rect = a->c;
 	struct i2c_client *client = sd->priv;
 	struct mt9t031 *mt9t031 = to_mt9t031(client);
-	struct soc_camera_device *icd = client->dev.platform_data;
 
 	rect.width = ALIGN(rect.width, 2);
 	rect.height = ALIGN(rect.height, 2);
@@ -344,7 +406,7 @@ static int mt9t031_s_crop(struct v4l2_subdev *sd, struct v4l2_crop *a)
 	soc_camera_limit_side(&rect.top, &rect.height,
 		     MT9T031_ROW_SKIP, MT9T031_MIN_HEIGHT, MT9T031_MAX_HEIGHT);
 
-	return mt9t031_set_params(icd, &rect, mt9t031->xskip, mt9t031->yskip);
+	return mt9t031_set_params(client, &rect, mt9t031->xskip, mt9t031->yskip);
 }
 
 static int mt9t031_g_crop(struct v4l2_subdev *sd, struct v4l2_crop *a)
@@ -391,7 +453,6 @@ static int mt9t031_s_fmt(struct v4l2_subdev *sd,
 {
 	struct i2c_client *client = sd->priv;
 	struct mt9t031 *mt9t031 = to_mt9t031(client);
-	struct soc_camera_device *icd = client->dev.platform_data;
 	u16 xskip, yskip;
 	struct v4l2_rect rect = mt9t031->rect;
 
@@ -403,7 +464,7 @@ static int mt9t031_s_fmt(struct v4l2_subdev *sd,
 	yskip = mt9t031_skip(&rect.height, imgf->height, MT9T031_MAX_HEIGHT);
 
 	/* mt9t031_set_params() doesn't change width and height */
-	return mt9t031_set_params(icd, &rect, xskip, yskip);
+	return mt9t031_set_params(client, &rect, xskip, yskip);
 }
 
 /*
@@ -476,59 +537,6 @@ static int mt9t031_s_register(struct v4l2_subdev *sd,
 }
 #endif
 
-static const struct v4l2_queryctrl mt9t031_controls[] = {
-	{
-		.id		= V4L2_CID_VFLIP,
-		.type		= V4L2_CTRL_TYPE_BOOLEAN,
-		.name		= "Flip Vertically",
-		.minimum	= 0,
-		.maximum	= 1,
-		.step		= 1,
-		.default_value	= 0,
-	}, {
-		.id		= V4L2_CID_HFLIP,
-		.type		= V4L2_CTRL_TYPE_BOOLEAN,
-		.name		= "Flip Horizontally",
-		.minimum	= 0,
-		.maximum	= 1,
-		.step		= 1,
-		.default_value	= 0,
-	}, {
-		.id		= V4L2_CID_GAIN,
-		.type		= V4L2_CTRL_TYPE_INTEGER,
-		.name		= "Gain",
-		.minimum	= 0,
-		.maximum	= 127,
-		.step		= 1,
-		.default_value	= 64,
-		.flags		= V4L2_CTRL_FLAG_SLIDER,
-	}, {
-		.id		= V4L2_CID_EXPOSURE,
-		.type		= V4L2_CTRL_TYPE_INTEGER,
-		.name		= "Exposure",
-		.minimum	= 1,
-		.maximum	= 255,
-		.step		= 1,
-		.default_value	= 255,
-		.flags		= V4L2_CTRL_FLAG_SLIDER,
-	}, {
-		.id		= V4L2_CID_EXPOSURE_AUTO,
-		.type		= V4L2_CTRL_TYPE_BOOLEAN,
-		.name		= "Automatic Exposure",
-		.minimum	= 0,
-		.maximum	= 1,
-		.step		= 1,
-		.default_value	= 1,
-	}
-};
-
-static struct soc_camera_ops mt9t031_ops = {
-	.set_bus_param		= mt9t031_set_bus_param,
-	.query_bus_param	= mt9t031_query_bus_param,
-	.controls		= mt9t031_controls,
-	.num_controls		= ARRAY_SIZE(mt9t031_controls),
-};
-
 static int mt9t031_g_ctrl(struct v4l2_subdev *sd, struct v4l2_control *ctrl)
 {
 	struct i2c_client *client = sd->priv;
@@ -565,15 +573,9 @@ static int mt9t031_s_ctrl(struct v4l2_subdev *sd, struct v4l2_control *ctrl)
 {
 	struct i2c_client *client = sd->priv;
 	struct mt9t031 *mt9t031 = to_mt9t031(client);
-	struct soc_camera_device *icd = client->dev.platform_data;
 	const struct v4l2_queryctrl *qctrl;
 	int data;
 
-	qctrl = soc_camera_find_qctrl(&mt9t031_ops, ctrl->id);
-
-	if (!qctrl)
-		return -EINVAL;
-
 	switch (ctrl->id) {
 	case V4L2_CID_VFLIP:
 		if (ctrl->value)
@@ -592,6 +594,7 @@ static int mt9t031_s_ctrl(struct v4l2_subdev *sd, struct v4l2_control *ctrl)
 			return -EIO;
 		break;
 	case V4L2_CID_GAIN:
+		qctrl = &mt9t031_controls[MT9T031_CTRL_GAIN];
 		if (ctrl->value > qctrl->maximum || ctrl->value < qctrl->minimum)
 			return -EINVAL;
 		/* See Datasheet Table 7, Gain settings. */
@@ -631,6 +634,7 @@ static int mt9t031_s_ctrl(struct v4l2_subdev *sd, struct v4l2_control *ctrl)
 		mt9t031->gain = ctrl->value;
 		break;
 	case V4L2_CID_EXPOSURE:
+		qctrl = &mt9t031_controls[MT9T031_CTRL_EXPOSURE];
 		/* mt9t031 has maximum == default */
 		if (ctrl->value > qctrl->maximum || ctrl->value < qctrl->minimum)
 			return -EINVAL;
@@ -657,7 +661,7 @@ static int mt9t031_s_ctrl(struct v4l2_subdev *sd, struct v4l2_control *ctrl)
 
 			if (set_shutter(client, total_h) < 0)
 				return -EIO;
-			qctrl = soc_camera_find_qctrl(icd->ops, V4L2_CID_EXPOSURE);
+			qctrl = &mt9t031_controls[MT9T031_CTRL_EXPOSURE];
 			mt9t031->exposure = (shutter_max / 2 + (total_h - 1) *
 				 (qctrl->maximum - qctrl->minimum)) /
 				shutter_max + qctrl->minimum;
@@ -665,6 +669,8 @@ static int mt9t031_s_ctrl(struct v4l2_subdev *sd, struct v4l2_control *ctrl)
 		} else
 			mt9t031->autoexposure = 0;
 		break;
+	default:
+		return -EINVAL;
 	}
 	return 0;
 }
@@ -751,18 +757,16 @@ static int mt9t031_probe(struct i2c_client *client,
 	struct mt9t031 *mt9t031;
 	struct soc_camera_device *icd = client->dev.platform_data;
 	struct i2c_adapter *adapter = to_i2c_adapter(client->dev.parent);
-	struct soc_camera_link *icl;
 	int ret;
 
-	if (!icd) {
-		dev_err(&client->dev, "MT9T031: missing soc-camera data!\n");
-		return -EINVAL;
-	}
+	if (icd) {
+		struct soc_camera_link *icl = to_soc_camera_link(icd);
+		if (!icl) {
+			dev_err(&client->dev, "MT9T031 driver needs platform data\n");
+			return -EINVAL;
+		}
 
-	icl = to_soc_camera_link(icd);
-	if (!icl) {
-		dev_err(&client->dev, "MT9T031 driver needs platform data\n");
-		return -EINVAL;
+		icd->ops = &mt9t031_ops;
 	}
 
 	if (!i2c_check_functionality(adapter, I2C_FUNC_SMBUS_WORD_DATA)) {
@@ -777,9 +781,6 @@ static int mt9t031_probe(struct i2c_client *client,
 
 	v4l2_i2c_subdev_init(&mt9t031->subdev, client, &mt9t031_subdev_ops);
 
-	/* Second stage probe - when a capture adapter is there */
-	icd->ops		= &mt9t031_ops;
-
 	mt9t031->rect.left	= MT9T031_COLUMN_SKIP;
 	mt9t031->rect.top	= MT9T031_ROW_SKIP;
 	mt9t031->rect.width	= MT9T031_MAX_WIDTH;
@@ -801,7 +802,8 @@ static int mt9t031_probe(struct i2c_client *client,
 	mt9t031_disable(client);
 
 	if (ret) {
-		icd->ops = NULL;
+		if (icd)
+			icd->ops = NULL;
 		i2c_set_clientdata(client, NULL);
 		kfree(mt9t031);
 	}
@@ -814,7 +816,8 @@ static int mt9t031_remove(struct i2c_client *client)
 	struct mt9t031 *mt9t031 = to_mt9t031(client);
 	struct soc_camera_device *icd = client->dev.platform_data;
 
-	icd->ops = NULL;
+	if (icd)
+		icd->ops = NULL;
 	i2c_set_clientdata(client, NULL);
 	client->driver = NULL;
 	kfree(mt9t031);
-- 
1.6.2.4


^ permalink raw reply related	[flat|nested] 51+ messages in thread

* RE: [PATCH/RFC 9/9 v2] mt9t031: make the use of the soc-camera client API optional
  2009-11-04 16:49         ` [PATCH/RFC 9/9 v2] " Guennadi Liakhovetski
@ 2009-11-04 16:57           ` Karicheri, Muralidharan
  2009-11-04 17:53             ` Guennadi Liakhovetski
  2009-11-05 15:57           ` Hans Verkuil
  1 sibling, 1 reply; 51+ messages in thread
From: Karicheri, Muralidharan @ 2009-11-04 16:57 UTC (permalink / raw)
  To: Guennadi Liakhovetski
  Cc: Linux Media Mailing List, Hans Verkuil, Laurent Pinchart, Sakari Ailus

Guennadi,

Thanks for the reply. I will have a chance to work on this
sometime in the next two weeks, as I am currently pre-occupied with other
items. I will definitely try to use this version and do my
testing and let you know the result.

Will this apply cleanly to the v4l-dvb linux-next branch?

Murali Karicheri
Software Design Engineer
Texas Instruments Inc.
Germantown, MD 20874
phone: 301-407-9583
email: m-karicheri2@ti.com

>-----Original Message-----
>From: Guennadi Liakhovetski [mailto:g.liakhovetski@gmx.de]
>Sent: Wednesday, November 04, 2009 11:49 AM
>To: Karicheri, Muralidharan
>Cc: Linux Media Mailing List; Hans Verkuil; Laurent Pinchart; Sakari Ailus
>Subject: [PATCH/RFC 9/9 v2] mt9t031: make the use of the soc-camera client
>API optional
>
>Now that we have moved most of the functions over to the v4l2-subdev API,
>only querying and setting bus parameters are still performed using the legacy
>soc-camera client API. Make the use of this API optional for mt9t031.
>
>Signed-off-by: Guennadi Liakhovetski <g.liakhovetski@gmx.de>
>---
>
>On Mon, 2 Nov 2009, Karicheri, Muralidharan wrote:
>
>> >> >+static struct soc_camera_ops mt9t031_ops = {
>> >> >+	.set_bus_param		= mt9t031_set_bus_param,
>> >> >+	.query_bus_param	= mt9t031_query_bus_param,
>> >> >+	.controls		= mt9t031_controls,
>> >> >+	.num_controls		= ARRAY_SIZE(mt9t031_controls),
>> >> >+};
>> >> >+
>> >>
>> >> [MK] Why don't you implement queryctrl ops in core? query_bus_param
>> >> & set_bus_param() can be implemented as a sub device operation as well
>> >> right? I think we need to get the bus parameter RFC implemented and
>> >> this driver could be targeted for it's first use so that we could
>> >> work together to get it accepted. I didn't get a chance to study your
>> >> bus image format RFC, but plan to review it soon and to see if it can
>be
>> >> used in my platform as well. For use of this driver in our platform,
>> >> all reference to soc_ must be removed. I am ok if the structure is
>> >> re-used, but if this driver calls any soc_camera function, it cannot
>> >> be used in my platform.
>> >
>> >Why? Some soc-camera functions are just library functions, you just have
>> >to build soc-camera into your kernel. (also see below)
>> >
>> My point is that the control is for the sensor device, so why implement
>> queryctrl in SoC camera? Just for this I need to include SoC camera in
>> my build? That doesn't make any sense at all. IMHO, queryctrl()
>> logically belongs to this sensor driver, which can be called from the
>> bridge driver using a subdev API call. Any reverse dependency from
>> MT9T031 on SoC camera needs to be removed if it is to be re-used across
>> other platforms. Can we agree on this?
>
>In general I'm sure you understand, that there are lots of functions in
>the kernel, that we use in specific modules, not because they interact
>with other systems, but because they implement some common functionality
>and just reduce code-duplication. And I can well imagine that in many such
>cases using just one or a couple of such functions will pull a much larger
>pile of unused code with them. But in this case those calls can indeed be
>very easily eliminated. Please have a look at the version below.
>
>> Did you have a chance to compare the driver file that I had sent to you?
>
>I looked at it, but it is based on an earlier version of the driver, so,
>it wasn't very easy to compare. Maybe you could send a diff against the
>mainline version, on which it is based?
>
>Thanks
>Guennadi
>
> drivers/media/video/mt9t031.c |  167 +++++++++++++++++++++--------------------
> 1 files changed, 85 insertions(+), 82 deletions(-)
>
>diff --git a/drivers/media/video/mt9t031.c b/drivers/media/video/mt9t031.c
>index c95c277..86bf8f6 100644
>--- a/drivers/media/video/mt9t031.c
>+++ b/drivers/media/video/mt9t031.c
>@@ -204,6 +204,71 @@ static unsigned long mt9t031_query_bus_param(struct soc_camera_device *icd)
> 	return soc_camera_apply_sensor_flags(icl, MT9T031_BUS_PARAM);
> }
>
>+enum {
>+	MT9T031_CTRL_VFLIP,
>+	MT9T031_CTRL_HFLIP,
>+	MT9T031_CTRL_GAIN,
>+	MT9T031_CTRL_EXPOSURE,
>+	MT9T031_CTRL_EXPOSURE_AUTO,
>+};
>+
>+static const struct v4l2_queryctrl mt9t031_controls[] = {
>+	[MT9T031_CTRL_VFLIP] = {
>+		.id		= V4L2_CID_VFLIP,
>+		.type		= V4L2_CTRL_TYPE_BOOLEAN,
>+		.name		= "Flip Vertically",
>+		.minimum	= 0,
>+		.maximum	= 1,
>+		.step		= 1,
>+		.default_value	= 0,
>+	},
>+	[MT9T031_CTRL_HFLIP] = {
>+		.id		= V4L2_CID_HFLIP,
>+		.type		= V4L2_CTRL_TYPE_BOOLEAN,
>+		.name		= "Flip Horizontally",
>+		.minimum	= 0,
>+		.maximum	= 1,
>+		.step		= 1,
>+		.default_value	= 0,
>+	},
>+	[MT9T031_CTRL_GAIN] = {
>+		.id		= V4L2_CID_GAIN,
>+		.type		= V4L2_CTRL_TYPE_INTEGER,
>+		.name		= "Gain",
>+		.minimum	= 0,
>+		.maximum	= 127,
>+		.step		= 1,
>+		.default_value	= 64,
>+		.flags		= V4L2_CTRL_FLAG_SLIDER,
>+	},
>+	[MT9T031_CTRL_EXPOSURE] = {
>+		.id		= V4L2_CID_EXPOSURE,
>+		.type		= V4L2_CTRL_TYPE_INTEGER,
>+		.name		= "Exposure",
>+		.minimum	= 1,
>+		.maximum	= 255,
>+		.step		= 1,
>+		.default_value	= 255,
>+		.flags		= V4L2_CTRL_FLAG_SLIDER,
>+	},
>+	[MT9T031_CTRL_EXPOSURE_AUTO] = {
>+		.id		= V4L2_CID_EXPOSURE_AUTO,
>+		.type		= V4L2_CTRL_TYPE_BOOLEAN,
>+		.name		= "Automatic Exposure",
>+		.minimum	= 0,
>+		.maximum	= 1,
>+		.step		= 1,
>+		.default_value	= 1,
>+	}
>+};
>+
>+static struct soc_camera_ops mt9t031_ops = {
>+	.set_bus_param		= mt9t031_set_bus_param,
>+	.query_bus_param	= mt9t031_query_bus_param,
>+	.controls		= mt9t031_controls,
>+	.num_controls		= ARRAY_SIZE(mt9t031_controls),
>+};
>+
> /* target must be _even_ */
> static u16 mt9t031_skip(s32 *source, s32 target, s32 max)
> {
>@@ -223,10 +288,9 @@ static u16 mt9t031_skip(s32 *source, s32 target, s32 max)
> }
>
> /* rect is the sensor rectangle, the caller guarantees parameter validity */
>-static int mt9t031_set_params(struct soc_camera_device *icd,
>+static int mt9t031_set_params(struct i2c_client *client,
> 			      struct v4l2_rect *rect, u16 xskip, u16 yskip)
> {
>-	struct i2c_client *client = to_i2c_client(to_soc_camera_control(icd));
> 	struct mt9t031 *mt9t031 = to_mt9t031(client);
> 	int ret;
> 	u16 xbin, ybin;
>@@ -307,8 +371,7 @@ static int mt9t031_set_params(struct soc_camera_device *icd,
> 		if (ret >= 0) {
> 			const u32 shutter_max = MT9T031_MAX_HEIGHT + vblank;
> 			const struct v4l2_queryctrl *qctrl =
>-				soc_camera_find_qctrl(icd->ops,
>-						      V4L2_CID_EXPOSURE);
>+				&mt9t031_controls[MT9T031_CTRL_EXPOSURE];
> 			mt9t031->exposure = (shutter_max / 2 + (total_h - 1) *
> 				 (qctrl->maximum - qctrl->minimum)) /
> 				shutter_max + qctrl->minimum;
>@@ -333,7 +396,6 @@ static int mt9t031_s_crop(struct v4l2_subdev *sd, struct v4l2_crop *a)
> 	struct v4l2_rect rect = a->c;
> 	struct i2c_client *client = sd->priv;
> 	struct mt9t031 *mt9t031 = to_mt9t031(client);
>-	struct soc_camera_device *icd = client->dev.platform_data;
>
> 	rect.width = ALIGN(rect.width, 2);
> 	rect.height = ALIGN(rect.height, 2);
>@@ -344,7 +406,7 @@ static int mt9t031_s_crop(struct v4l2_subdev *sd, struct v4l2_crop *a)
> 	soc_camera_limit_side(&rect.top, &rect.height,
> 		     MT9T031_ROW_SKIP, MT9T031_MIN_HEIGHT, MT9T031_MAX_HEIGHT);
>
>-	return mt9t031_set_params(icd, &rect, mt9t031->xskip, mt9t031->yskip);
>+	return mt9t031_set_params(client, &rect, mt9t031->xskip, mt9t031->yskip);
> }
>
> static int mt9t031_g_crop(struct v4l2_subdev *sd, struct v4l2_crop *a)
>@@ -391,7 +453,6 @@ static int mt9t031_s_fmt(struct v4l2_subdev *sd,
> {
> 	struct i2c_client *client = sd->priv;
> 	struct mt9t031 *mt9t031 = to_mt9t031(client);
>-	struct soc_camera_device *icd = client->dev.platform_data;
> 	u16 xskip, yskip;
> 	struct v4l2_rect rect = mt9t031->rect;
>
>@@ -403,7 +464,7 @@ static int mt9t031_s_fmt(struct v4l2_subdev *sd,
> 	yskip = mt9t031_skip(&rect.height, imgf->height, MT9T031_MAX_HEIGHT);
>
> 	/* mt9t031_set_params() doesn't change width and height */
>-	return mt9t031_set_params(icd, &rect, xskip, yskip);
>+	return mt9t031_set_params(client, &rect, xskip, yskip);
> }
>
> /*
>@@ -476,59 +537,6 @@ static int mt9t031_s_register(struct v4l2_subdev *sd,
> }
> #endif
>
>-static const struct v4l2_queryctrl mt9t031_controls[] = {
>-	{
>-		.id		= V4L2_CID_VFLIP,
>-		.type		= V4L2_CTRL_TYPE_BOOLEAN,
>-		.name		= "Flip Vertically",
>-		.minimum	= 0,
>-		.maximum	= 1,
>-		.step		= 1,
>-		.default_value	= 0,
>-	}, {
>-		.id		= V4L2_CID_HFLIP,
>-		.type		= V4L2_CTRL_TYPE_BOOLEAN,
>-		.name		= "Flip Horizontally",
>-		.minimum	= 0,
>-		.maximum	= 1,
>-		.step		= 1,
>-		.default_value	= 0,
>-	}, {
>-		.id		= V4L2_CID_GAIN,
>-		.type		= V4L2_CTRL_TYPE_INTEGER,
>-		.name		= "Gain",
>-		.minimum	= 0,
>-		.maximum	= 127,
>-		.step		= 1,
>-		.default_value	= 64,
>-		.flags		= V4L2_CTRL_FLAG_SLIDER,
>-	}, {
>-		.id		= V4L2_CID_EXPOSURE,
>-		.type		= V4L2_CTRL_TYPE_INTEGER,
>-		.name		= "Exposure",
>-		.minimum	= 1,
>-		.maximum	= 255,
>-		.step		= 1,
>-		.default_value	= 255,
>-		.flags		= V4L2_CTRL_FLAG_SLIDER,
>-	}, {
>-		.id		= V4L2_CID_EXPOSURE_AUTO,
>-		.type		= V4L2_CTRL_TYPE_BOOLEAN,
>-		.name		= "Automatic Exposure",
>-		.minimum	= 0,
>-		.maximum	= 1,
>-		.step		= 1,
>-		.default_value	= 1,
>-	}
>-};
>-
>-static struct soc_camera_ops mt9t031_ops = {
>-	.set_bus_param		= mt9t031_set_bus_param,
>-	.query_bus_param	= mt9t031_query_bus_param,
>-	.controls		= mt9t031_controls,
>-	.num_controls		= ARRAY_SIZE(mt9t031_controls),
>-};
>-
> static int mt9t031_g_ctrl(struct v4l2_subdev *sd, struct v4l2_control *ctrl)
> {
> 	struct i2c_client *client = sd->priv;
>@@ -565,15 +573,9 @@ static int mt9t031_s_ctrl(struct v4l2_subdev *sd, struct v4l2_control *ctrl)
> {
> 	struct i2c_client *client = sd->priv;
> 	struct mt9t031 *mt9t031 = to_mt9t031(client);
>-	struct soc_camera_device *icd = client->dev.platform_data;
> 	const struct v4l2_queryctrl *qctrl;
> 	int data;
>
>-	qctrl = soc_camera_find_qctrl(&mt9t031_ops, ctrl->id);
>-
>-	if (!qctrl)
>-		return -EINVAL;
>-
> 	switch (ctrl->id) {
> 	case V4L2_CID_VFLIP:
> 		if (ctrl->value)
>@@ -592,6 +594,7 @@ static int mt9t031_s_ctrl(struct v4l2_subdev *sd, struct v4l2_control *ctrl)
> 			return -EIO;
> 		break;
> 	case V4L2_CID_GAIN:
>+		qctrl = &mt9t031_controls[MT9T031_CTRL_GAIN];
> 		if (ctrl->value > qctrl->maximum || ctrl->value < qctrl->minimum)
> 			return -EINVAL;
> 		/* See Datasheet Table 7, Gain settings. */
>@@ -631,6 +634,7 @@ static int mt9t031_s_ctrl(struct v4l2_subdev *sd, struct v4l2_control *ctrl)
> 		mt9t031->gain = ctrl->value;
> 		break;
> 	case V4L2_CID_EXPOSURE:
>+		qctrl = &mt9t031_controls[MT9T031_CTRL_EXPOSURE];
> 		/* mt9t031 has maximum == default */
> 		if (ctrl->value > qctrl->maximum || ctrl->value < qctrl->minimum)
> 			return -EINVAL;
>@@ -657,7 +661,7 @@ static int mt9t031_s_ctrl(struct v4l2_subdev *sd, struct v4l2_control *ctrl)
>
> 			if (set_shutter(client, total_h) < 0)
> 				return -EIO;
>-			qctrl = soc_camera_find_qctrl(icd->ops, V4L2_CID_EXPOSURE);
>+			qctrl = &mt9t031_controls[MT9T031_CTRL_EXPOSURE];
> 			mt9t031->exposure = (shutter_max / 2 + (total_h - 1) *
> 				 (qctrl->maximum - qctrl->minimum)) /
> 				shutter_max + qctrl->minimum;
>@@ -665,6 +669,8 @@ static int mt9t031_s_ctrl(struct v4l2_subdev *sd, struct v4l2_control *ctrl)
> 		} else
> 			mt9t031->autoexposure = 0;
> 		break;
>+	default:
>+		return -EINVAL;
> 	}
> 	return 0;
> }
>@@ -751,18 +757,16 @@ static int mt9t031_probe(struct i2c_client *client,
> 	struct mt9t031 *mt9t031;
> 	struct soc_camera_device *icd = client->dev.platform_data;
> 	struct i2c_adapter *adapter = to_i2c_adapter(client->dev.parent);
>-	struct soc_camera_link *icl;
> 	int ret;
>
>-	if (!icd) {
>-		dev_err(&client->dev, "MT9T031: missing soc-camera data!\n");
>-		return -EINVAL;
>-	}
>+	if (icd) {
>+		struct soc_camera_link *icl = to_soc_camera_link(icd);
>+		if (!icl) {
>+			dev_err(&client->dev, "MT9T031 driver needs platform data\n");
>+			return -EINVAL;
>+		}
>
>-	icl = to_soc_camera_link(icd);
>-	if (!icl) {
>-		dev_err(&client->dev, "MT9T031 driver needs platform data\n");
>-		return -EINVAL;
>+		icd->ops = &mt9t031_ops;
> 	}
>
> 	if (!i2c_check_functionality(adapter, I2C_FUNC_SMBUS_WORD_DATA)) {
>@@ -777,9 +781,6 @@ static int mt9t031_probe(struct i2c_client *client,
>
> 	v4l2_i2c_subdev_init(&mt9t031->subdev, client, &mt9t031_subdev_ops);
>
>-	/* Second stage probe - when a capture adapter is there */
>-	icd->ops		= &mt9t031_ops;
>-
> 	mt9t031->rect.left	= MT9T031_COLUMN_SKIP;
> 	mt9t031->rect.top	= MT9T031_ROW_SKIP;
> 	mt9t031->rect.width	= MT9T031_MAX_WIDTH;
>@@ -801,7 +802,8 @@ static int mt9t031_probe(struct i2c_client *client,
> 	mt9t031_disable(client);
>
> 	if (ret) {
>-		icd->ops = NULL;
>+		if (icd)
>+			icd->ops = NULL;
> 		i2c_set_clientdata(client, NULL);
> 		kfree(mt9t031);
> 	}
>@@ -814,7 +816,8 @@ static int mt9t031_remove(struct i2c_client *client)
> 	struct mt9t031 *mt9t031 = to_mt9t031(client);
> 	struct soc_camera_device *icd = client->dev.platform_data;
>
>-	icd->ops = NULL;
>+	if (icd)
>+		icd->ops = NULL;
> 	i2c_set_clientdata(client, NULL);
> 	client->driver = NULL;
> 	kfree(mt9t031);
>--
>1.6.2.4
>


^ permalink raw reply	[flat|nested] 51+ messages in thread

* RE: [PATCH/RFC 9/9 v2] mt9t031: make the use of the soc-camera client API optional
  2009-11-04 16:57           ` Karicheri, Muralidharan
@ 2009-11-04 17:53             ` Guennadi Liakhovetski
  2009-11-05  0:04               ` Guennadi Liakhovetski
  0 siblings, 1 reply; 51+ messages in thread
From: Guennadi Liakhovetski @ 2009-11-04 17:53 UTC (permalink / raw)
  To: Karicheri, Muralidharan
  Cc: Linux Media Mailing List, Hans Verkuil, Laurent Pinchart, Sakari Ailus

On Wed, 4 Nov 2009, Karicheri, Muralidharan wrote:

> Guennadi,
> 
> Thanks for the reply. I will have a chance to work on this
> sometime in the next two weeks, as I am currently pre-occupied with other
> items. I will definitely try to use this version and do my
> testing and let you know the result.
> 
> Will this apply cleanly to the v4l-dvb linux-next branch?

Maybe you can apply the whole set of 9 patches to it, not sure. Better yet 
get the complete stack from the location I provided in the introductory 
mail (0/9) and apply it as instructed there. Just beware that there are 
still some older patch versions, which you would have to replace with the 
ones from this thread. I'll try to update that stack shortly.

Thanks
Guennadi
---
Guennadi Liakhovetski, Ph.D.
Freelance Open-Source Software Developer
http://www.open-technology.de/

^ permalink raw reply	[flat|nested] 51+ messages in thread

* RE: [PATCH 2/9] v4l: add new v4l2-subdev sensor operations, use g_skip_top_lines in soc-camera
  2009-11-02 16:14       ` Karicheri, Muralidharan
@ 2009-11-04 19:11         ` Guennadi Liakhovetski
  0 siblings, 0 replies; 51+ messages in thread
From: Guennadi Liakhovetski @ 2009-11-04 19:11 UTC (permalink / raw)
  To: Karicheri, Muralidharan
  Cc: Linux Media Mailing List, Hans Verkuil, Laurent Pinchart, Sakari Ailus

On Mon, 2 Nov 2009, Karicheri, Muralidharan wrote:

> >> >@@ -301,9 +301,9 @@ static int mt9t031_set_params(struct soc_camera_device *icd,
> >> > 		ret = reg_write(client, MT9T031_WINDOW_WIDTH, rect->width - 1);
> >> > 	if (ret >= 0)
> >> > 		ret = reg_write(client, MT9T031_WINDOW_HEIGHT,
> >> >-				rect->height + icd->y_skip_top - 1);
> >> >+				rect->height - 1);
> >
> >> Why y_skip_top is removed?
> >
> >Because no one ever said they needed it?
> >
> I suggest you keep it. It can have a default of 0. I have not viewed the
> resulting image to see if the top line is corrupted. I just display it
> on my display device and I am not seeing any corruption. I need to view
> the image at some point to check if it has any corruption.

Ok, I preserved it, although I'm not convinced it is indeed needed.

Thanks
Guennadi
---
Guennadi Liakhovetski, Ph.D.
Freelance Open-Source Software Developer
http://www.open-technology.de/

^ permalink raw reply	[flat|nested] 51+ messages in thread

* RE: [PATCH/RFC 9/9 v2] mt9t031: make the use of the soc-camera client API optional
  2009-11-04 17:53             ` Guennadi Liakhovetski
@ 2009-11-05  0:04               ` Guennadi Liakhovetski
  0 siblings, 0 replies; 51+ messages in thread
From: Guennadi Liakhovetski @ 2009-11-05  0:04 UTC (permalink / raw)
  To: Karicheri, Muralidharan
  Cc: Linux Media Mailing List, Hans Verkuil, Laurent Pinchart, Sakari Ailus

On Wed, 4 Nov 2009, Guennadi Liakhovetski wrote:

> On Wed, 4 Nov 2009, Karicheri, Muralidharan wrote:
> 
> > Guennadi,
> > 
> > Thanks for the reply. I will have a chance to work on this
> > sometime in the next two weeks, as I am currently pre-occupied with other
> > items. I will definitely try to use this version and do my
> > testing and let you know the result.
> > 
> > Will this apply cleanly to the v4l-dvb linux-next branch?
> 
> Maybe you can apply the whole set of 9 patches to it, not sure. Better yet 
> get the complete stack from the location I provided in the introductory 
> mail (0/9) and apply it as instructed there. Just beware that there are 
> still some older patch versions, which you would have to replace with the 
> ones from this thread. I'll try to update that stack shortly.

The current stack, based on 2.6.32-rc5, is at

http://download.open-technology.de/soc-camera/20091105/

Thanks
Guennadi
---
Guennadi Liakhovetski, Ph.D.
Freelance Open-Source Software Developer
http://www.open-technology.de/

^ permalink raw reply	[flat|nested] 51+ messages in thread

* Re: [PATCH 4/9] v4l: Add a 10-bit monochrome and missing 8- and 10-bit Bayer fourcc codes
  2009-10-30 14:01 ` [PATCH 4/9] v4l: Add a 10-bit monochrome and missing 8- and 10-bit Bayer fourcc codes Guennadi Liakhovetski
@ 2009-11-05 14:45   ` Hans Verkuil
  2009-11-05 16:29     ` Guennadi Liakhovetski
  0 siblings, 1 reply; 51+ messages in thread
From: Hans Verkuil @ 2009-11-05 14:45 UTC (permalink / raw)
  To: Guennadi Liakhovetski
  Cc: Linux Media Mailing List, Laurent Pinchart, Sakari Ailus,
	Muralidharan Karicheri

On Friday 30 October 2009 15:01:14 Guennadi Liakhovetski wrote:
> The 16-bit monochrome fourcc code has been previously abused for a 10-bit
> format, add a new 10-bit code instead. Also add missing 8- and 10-bit Bayer
> fourcc codes for completeness.

I'm fairly certain that you also have to document these new formats in the
DocBook documentation. Run 'make spec' to verify this.

Regards,

	Hans

> 
> Signed-off-by: Guennadi Liakhovetski <g.liakhovetski@gmx.de>
> ---
>  include/linux/videodev2.h |    7 ++++++-
>  1 files changed, 6 insertions(+), 1 deletions(-)
> 
> diff --git a/include/linux/videodev2.h b/include/linux/videodev2.h
> index b59e78c..9b240d5 100644
> --- a/include/linux/videodev2.h
> +++ b/include/linux/videodev2.h
> @@ -294,6 +294,7 @@ struct v4l2_pix_format {
>  
>  /* Grey formats */
>  #define V4L2_PIX_FMT_GREY    v4l2_fourcc('G', 'R', 'E', 'Y') /*  8  Greyscale     */
> +#define V4L2_PIX_FMT_Y10     v4l2_fourcc('Y', '1', '0', ' ') /* 10  Greyscale     */
>  #define V4L2_PIX_FMT_Y16     v4l2_fourcc('Y', '1', '6', ' ') /* 16  Greyscale     */
>  
>  /* Palette formats */
> @@ -329,7 +330,11 @@ struct v4l2_pix_format {
>  #define V4L2_PIX_FMT_SBGGR8  v4l2_fourcc('B', 'A', '8', '1') /*  8  BGBG.. GRGR.. */
>  #define V4L2_PIX_FMT_SGBRG8  v4l2_fourcc('G', 'B', 'R', 'G') /*  8  GBGB.. RGRG.. */
>  #define V4L2_PIX_FMT_SGRBG8  v4l2_fourcc('G', 'R', 'B', 'G') /*  8  GRGR.. BGBG.. */
> -#define V4L2_PIX_FMT_SGRBG10 v4l2_fourcc('B', 'A', '1', '0') /* 10bit raw bayer */
> +#define V4L2_PIX_FMT_SRGGB8  v4l2_fourcc('R', 'G', 'G', 'B') /*  8  RGRG.. GBGB.. */
> +#define V4L2_PIX_FMT_SBGGR10 v4l2_fourcc('B', 'G', '1', '0') /* 10  BGBG.. GRGR.. */
> +#define V4L2_PIX_FMT_SGBRG10 v4l2_fourcc('G', 'B', '1', '0') /* 10  GBGB.. RGRG.. */
> +#define V4L2_PIX_FMT_SGRBG10 v4l2_fourcc('B', 'A', '1', '0') /* 10  GRGR.. BGBG.. */
> +#define V4L2_PIX_FMT_SRGGB10 v4l2_fourcc('R', 'G', '1', '0') /* 10  RGRG.. GBGB.. */
>  	/* 10bit raw bayer DPCM compressed to 8 bits */
>  #define V4L2_PIX_FMT_SGRBG10DPCM8 v4l2_fourcc('B', 'D', '1', '0')
>  	/*



-- 
Hans Verkuil - video4linux developer - sponsored by TANDBERG Telecom

^ permalink raw reply	[flat|nested] 51+ messages in thread

* Re: [PATCH/RFC 7/9 v2] v4l: add an image-bus API for configuring v4l2 subdev pixel and frame formats
  2009-10-30 14:01 ` [PATCH/RFC 7/9 v2] v4l: add an image-bus API for configuring v4l2 subdev pixel and frame formats Guennadi Liakhovetski
@ 2009-11-05 15:41   ` Hans Verkuil
  2009-11-05 16:51     ` Guennadi Liakhovetski
  2009-11-10 13:51   ` Laurent Pinchart
  2009-11-11  7:55   ` Hans Verkuil
  2 siblings, 1 reply; 51+ messages in thread
From: Hans Verkuil @ 2009-11-05 15:41 UTC (permalink / raw)
  To: Guennadi Liakhovetski
  Cc: Linux Media Mailing List, Laurent Pinchart, Sakari Ailus,
	Muralidharan Karicheri

On Friday 30 October 2009 15:01:27 Guennadi Liakhovetski wrote:
> Video subdevices, like cameras and decoders, connect to video bridges over
> specialised busses. Data is transferred over these busses in various
> formats, which only loosely correspond to fourcc codes, describing how video
> data is stored in RAM. This is not a one-to-one correspondence, therefore we
> cannot use fourcc codes to configure subdevice output data formats. This patch
> adds codes for several such on-the-bus formats and an API, similar to the
> familiar .s_fmt(), .g_fmt(), .try_fmt(), .enum_fmt() API for configuring those
> codes. After all users of the old API in struct v4l2_subdev_video_ops are
> converted, the API will be removed.

OK, this seems to completely disregard points raised in my earlier "bus and
data format negotiation" RFC which is available here once www.mail-archive.org
is working again:

http://www.mail-archive.com/linux-media%40vger.kernel.org/msg09644.html

BTW, ignore the 'Video timings' section of that RFC. That part is wrong.

The big problem I have with this proposal is the unholy mixing of bus and
memory formatting. That should be completely separated. Only the bridge
knows how a bus format can be converted into which memory (pixel) formats.

A bus format is also separate from the colorspace: that is an independent
piece of data. Personally I would just keep using v4l2_pix_format, except
that the fourcc field refers to a busimg format rather than a pixel format
in the case of subdevs. In most non-sensor drivers this field is completely
ignored anyway since the bus format is fixed.

I don't mind if you do a bus format to pixel format mapping inside soc-camera,
but it shouldn't spill over into the v4l core code.

Laurent is also correct that this should be eventually pad-specific, but
we can ignore that for now.

I'm also missing the bus hardware configuration (polarities, sampling on
rising or falling edge). What happened to that? Or is that a next step?

Regards,

	Hans

> 
> Signed-off-by: Guennadi Liakhovetski <g.liakhovetski@gmx.de>
> ---
>  drivers/media/video/Makefile        |    2 +-
>  drivers/media/video/v4l2-imagebus.c |  218 +++++++++++++++++++++++++++++++++++
>  include/media/v4l2-imagebus.h       |   84 ++++++++++++++
>  include/media/v4l2-subdev.h         |   10 ++-
>  4 files changed, 312 insertions(+), 2 deletions(-)
>  create mode 100644 drivers/media/video/v4l2-imagebus.c
>  create mode 100644 include/media/v4l2-imagebus.h
> 
> diff --git a/drivers/media/video/Makefile b/drivers/media/video/Makefile
> index 7a2dcc3..62d8907 100644
> --- a/drivers/media/video/Makefile
> +++ b/drivers/media/video/Makefile
> @@ -10,7 +10,7 @@ stkwebcam-objs	:=	stk-webcam.o stk-sensor.o
>  
>  omap2cam-objs	:=	omap24xxcam.o omap24xxcam-dma.o
>  
> -videodev-objs	:=	v4l2-dev.o v4l2-ioctl.o v4l2-device.o
> +videodev-objs	:=	v4l2-dev.o v4l2-ioctl.o v4l2-device.o v4l2-imagebus.o
>  
>  # V4L2 core modules
>  
> diff --git a/drivers/media/video/v4l2-imagebus.c b/drivers/media/video/v4l2-imagebus.c
> new file mode 100644
> index 0000000..e0a3a83
> --- /dev/null
> +++ b/drivers/media/video/v4l2-imagebus.c
> @@ -0,0 +1,218 @@
> +/*
> + * Image Bus API
> + *
> + * Copyright (C) 2009, Guennadi Liakhovetski <g.liakhovetski@gmx.de>
> + *
> + * This program is free software; you can redistribute it and/or modify
> + * it under the terms of the GNU General Public License version 2 as
> + * published by the Free Software Foundation.
> + */
> +
> +#include <linux/kernel.h>
> +#include <linux/module.h>
> +
> +#include <media/v4l2-device.h>
> +#include <media/v4l2-imagebus.h>
> +
> +static const struct v4l2_imgbus_pixelfmt imgbus_fmt[] = {
> +	[V4L2_IMGBUS_FMT_YUYV] = {
> +		.fourcc			= V4L2_PIX_FMT_YUYV,
> +		.colorspace		= V4L2_COLORSPACE_JPEG,
> +		.name			= "YUYV",
> +		.bits_per_sample	= 8,
> +		.packing		= V4L2_IMGBUS_PACKING_2X8_PADHI,
> +		.order			= V4L2_IMGBUS_ORDER_LE,
> +	}, [V4L2_IMGBUS_FMT_YVYU] = {
> +		.fourcc			= V4L2_PIX_FMT_YVYU,
> +		.colorspace		= V4L2_COLORSPACE_JPEG,
> +		.name			= "YVYU",
> +		.bits_per_sample	= 8,
> +		.packing		= V4L2_IMGBUS_PACKING_2X8_PADHI,
> +		.order			= V4L2_IMGBUS_ORDER_LE,
> +	}, [V4L2_IMGBUS_FMT_UYVY] = {
> +		.fourcc			= V4L2_PIX_FMT_UYVY,
> +		.colorspace		= V4L2_COLORSPACE_JPEG,
> +		.name			= "UYVY",
> +		.bits_per_sample	= 8,
> +		.packing		= V4L2_IMGBUS_PACKING_2X8_PADHI,
> +		.order			= V4L2_IMGBUS_ORDER_LE,
> +	}, [V4L2_IMGBUS_FMT_VYUY] = {
> +		.fourcc			= V4L2_PIX_FMT_VYUY,
> +		.colorspace		= V4L2_COLORSPACE_JPEG,
> +		.name			= "VYUY",
> +		.bits_per_sample	= 8,
> +		.packing		= V4L2_IMGBUS_PACKING_2X8_PADHI,
> +		.order			= V4L2_IMGBUS_ORDER_LE,
> +	}, [V4L2_IMGBUS_FMT_VYUY_SMPTE170M_8] = {
> +		.fourcc			= V4L2_PIX_FMT_VYUY,
> +		.colorspace		= V4L2_COLORSPACE_SMPTE170M,
> +		.name			= "VYUY in SMPTE170M",
> +		.bits_per_sample	= 8,
> +		.packing		= V4L2_IMGBUS_PACKING_2X8_PADHI,
> +		.order			= V4L2_IMGBUS_ORDER_LE,
> +	}, [V4L2_IMGBUS_FMT_VYUY_SMPTE170M_16] = {
> +		.fourcc			= V4L2_PIX_FMT_VYUY,
> +		.colorspace		= V4L2_COLORSPACE_SMPTE170M,
> +		.name			= "VYUY in SMPTE170M, 16bit",
> +		.bits_per_sample	= 16,
> +		.packing		= V4L2_IMGBUS_PACKING_NONE,
> +		.order			= V4L2_IMGBUS_ORDER_LE,
> +	}, [V4L2_IMGBUS_FMT_RGB555] = {
> +		.fourcc			= V4L2_PIX_FMT_RGB555,
> +		.colorspace		= V4L2_COLORSPACE_SRGB,
> +		.name			= "RGB555",
> +		.bits_per_sample	= 8,
> +		.packing		= V4L2_IMGBUS_PACKING_2X8_PADHI,
> +		.order			= V4L2_IMGBUS_ORDER_LE,
> +	}, [V4L2_IMGBUS_FMT_RGB555X] = {
> +		.fourcc			= V4L2_PIX_FMT_RGB555X,
> +		.colorspace		= V4L2_COLORSPACE_SRGB,
> +		.name			= "RGB555X",
> +		.bits_per_sample	= 8,
> +		.packing		= V4L2_IMGBUS_PACKING_2X8_PADHI,
> +		.order			= V4L2_IMGBUS_ORDER_LE,
> +	}, [V4L2_IMGBUS_FMT_RGB565] = {
> +		.fourcc			= V4L2_PIX_FMT_RGB565,
> +		.colorspace		= V4L2_COLORSPACE_SRGB,
> +		.name			= "RGB565",
> +		.bits_per_sample	= 8,
> +		.packing		= V4L2_IMGBUS_PACKING_2X8_PADHI,
> +		.order			= V4L2_IMGBUS_ORDER_LE,
> +	}, [V4L2_IMGBUS_FMT_RGB565X] = {
> +		.fourcc			= V4L2_PIX_FMT_RGB565X,
> +		.colorspace		= V4L2_COLORSPACE_SRGB,
> +		.name			= "RGB565X",
> +		.bits_per_sample	= 8,
> +		.packing		= V4L2_IMGBUS_PACKING_2X8_PADHI,
> +		.order			= V4L2_IMGBUS_ORDER_LE,
> +	}, [V4L2_IMGBUS_FMT_SBGGR8] = {
> +		.fourcc			= V4L2_PIX_FMT_SBGGR8,
> +		.colorspace		= V4L2_COLORSPACE_SRGB,
> +		.name			= "Bayer 8 BGGR",
> +		.bits_per_sample	= 8,
> +		.packing		= V4L2_IMGBUS_PACKING_NONE,
> +		.order			= V4L2_IMGBUS_ORDER_LE,
> +	}, [V4L2_IMGBUS_FMT_SGBRG8] = {
> +		.fourcc			= V4L2_PIX_FMT_SGBRG8,
> +		.colorspace		= V4L2_COLORSPACE_SRGB,
> +		.name			= "Bayer 8 GBRG",
> +		.bits_per_sample	= 8,
> +		.packing		= V4L2_IMGBUS_PACKING_NONE,
> +		.order			= V4L2_IMGBUS_ORDER_LE,
> +	}, [V4L2_IMGBUS_FMT_SGRBG8] = {
> +		.fourcc			= V4L2_PIX_FMT_SGRBG8,
> +		.colorspace		= V4L2_COLORSPACE_SRGB,
> +		.name			= "Bayer 8 GRBG",
> +		.bits_per_sample	= 8,
> +		.packing		= V4L2_IMGBUS_PACKING_NONE,
> +		.order			= V4L2_IMGBUS_ORDER_LE,
> +	}, [V4L2_IMGBUS_FMT_SRGGB8] = {
> +		.fourcc			= V4L2_PIX_FMT_SRGGB8,
> +		.colorspace		= V4L2_COLORSPACE_SRGB,
> +		.name			= "Bayer 8 RGGB",
> +		.bits_per_sample	= 8,
> +		.packing		= V4L2_IMGBUS_PACKING_NONE,
> +		.order			= V4L2_IMGBUS_ORDER_LE,
> +	}, [V4L2_IMGBUS_FMT_SBGGR10] = {
> +		.fourcc			= V4L2_PIX_FMT_SBGGR10,
> +		.colorspace		= V4L2_COLORSPACE_SRGB,
> +		.name			= "Bayer 10 BGGR",
> +		.bits_per_sample	= 10,
> +		.packing		= V4L2_IMGBUS_PACKING_EXTEND16,
> +		.order			= V4L2_IMGBUS_ORDER_LE,
> +	}, [V4L2_IMGBUS_FMT_SGBRG10] = {
> +		.fourcc			= V4L2_PIX_FMT_SGBRG10,
> +		.colorspace		= V4L2_COLORSPACE_SRGB,
> +		.name			= "Bayer 10 GBRG",
> +		.bits_per_sample	= 10,
> +		.packing		= V4L2_IMGBUS_PACKING_EXTEND16,
> +		.order			= V4L2_IMGBUS_ORDER_LE,
> +	}, [V4L2_IMGBUS_FMT_SGRBG10] = {
> +		.fourcc			= V4L2_PIX_FMT_SGRBG10,
> +		.colorspace		= V4L2_COLORSPACE_SRGB,
> +		.name			= "Bayer 10 GRBG",
> +		.bits_per_sample	= 10,
> +		.packing		= V4L2_IMGBUS_PACKING_EXTEND16,
> +		.order			= V4L2_IMGBUS_ORDER_LE,
> +	}, [V4L2_IMGBUS_FMT_SRGGB10] = {
> +		.fourcc			= V4L2_PIX_FMT_SRGGB10,
> +		.colorspace		= V4L2_COLORSPACE_SRGB,
> +		.name			= "Bayer 10 RGGB",
> +		.bits_per_sample	= 10,
> +		.packing		= V4L2_IMGBUS_PACKING_EXTEND16,
> +		.order			= V4L2_IMGBUS_ORDER_LE,
> +	}, [V4L2_IMGBUS_FMT_GREY] = {
> +		.fourcc			= V4L2_PIX_FMT_GREY,
> +		.colorspace		= V4L2_COLORSPACE_JPEG,
> +		.name			= "Grey",
> +		.bits_per_sample	= 8,
> +		.packing		= V4L2_IMGBUS_PACKING_NONE,
> +		.order			= V4L2_IMGBUS_ORDER_LE,
> +	}, [V4L2_IMGBUS_FMT_Y16] = {
> +		.fourcc			= V4L2_PIX_FMT_Y16,
> +		.colorspace		= V4L2_COLORSPACE_JPEG,
> +		.name			= "Grey 16bit",
> +		.bits_per_sample	= 16,
> +		.packing		= V4L2_IMGBUS_PACKING_NONE,
> +		.order			= V4L2_IMGBUS_ORDER_LE,
> +	}, [V4L2_IMGBUS_FMT_Y10] = {
> +		.fourcc			= V4L2_PIX_FMT_Y10,
> +		.colorspace		= V4L2_COLORSPACE_JPEG,
> +		.name			= "Grey 10bit",
> +		.bits_per_sample	= 10,
> +		.packing		= V4L2_IMGBUS_PACKING_EXTEND16,
> +		.order			= V4L2_IMGBUS_ORDER_LE,
> +	}, [V4L2_IMGBUS_FMT_SBGGR10_2X8_PADHI_LE] = {
> +		.fourcc			= V4L2_PIX_FMT_SBGGR10,
> +		.colorspace		= V4L2_COLORSPACE_SRGB,
> +		.name			= "Bayer 10 BGGR",
> +		.bits_per_sample	= 8,
> +		.packing		= V4L2_IMGBUS_PACKING_2X8_PADHI,
> +		.order			= V4L2_IMGBUS_ORDER_LE,
> +	}, [V4L2_IMGBUS_FMT_SBGGR10_2X8_PADLO_LE] = {
> +		.fourcc			= V4L2_PIX_FMT_SBGGR10,
> +		.colorspace		= V4L2_COLORSPACE_SRGB,
> +		.name			= "Bayer 10 BGGR",
> +		.bits_per_sample	= 8,
> +		.packing		= V4L2_IMGBUS_PACKING_2X8_PADLO,
> +		.order			= V4L2_IMGBUS_ORDER_LE,
> +	}, [V4L2_IMGBUS_FMT_SBGGR10_2X8_PADHI_BE] = {
> +		.fourcc			= V4L2_PIX_FMT_SBGGR10,
> +		.colorspace		= V4L2_COLORSPACE_SRGB,
> +		.name			= "Bayer 10 BGGR",
> +		.bits_per_sample	= 8,
> +		.packing		= V4L2_IMGBUS_PACKING_2X8_PADHI,
> +		.order			= V4L2_IMGBUS_ORDER_BE,
> +	}, [V4L2_IMGBUS_FMT_SBGGR10_2X8_PADLO_BE] = {
> +		.fourcc			= V4L2_PIX_FMT_SBGGR10,
> +		.colorspace		= V4L2_COLORSPACE_SRGB,
> +		.name			= "Bayer 10 BGGR",
> +		.bits_per_sample	= 8,
> +		.packing		= V4L2_IMGBUS_PACKING_2X8_PADLO,
> +		.order			= V4L2_IMGBUS_ORDER_BE,
> +	},
> +};
> +
> +const struct v4l2_imgbus_pixelfmt *v4l2_imgbus_get_fmtdesc(
> +	enum v4l2_imgbus_pixelcode code)
> +{
> +	if ((unsigned int)code >= ARRAY_SIZE(imgbus_fmt))
> +		return NULL;
> +	return imgbus_fmt + code;
> +}
> +EXPORT_SYMBOL(v4l2_imgbus_get_fmtdesc);
> +
> +s32 v4l2_imgbus_bytes_per_line(u32 width,
> +			       const struct v4l2_imgbus_pixelfmt *imgf)
> +{
> +	switch (imgf->packing) {
> +	case V4L2_IMGBUS_PACKING_NONE:
> +		return width * imgf->bits_per_sample / 8;
> +	case V4L2_IMGBUS_PACKING_2X8_PADHI:
> +	case V4L2_IMGBUS_PACKING_2X8_PADLO:
> +	case V4L2_IMGBUS_PACKING_EXTEND16:
> +		return width * 2;
> +	}
> +	return -EINVAL;
> +}
> +EXPORT_SYMBOL(v4l2_imgbus_bytes_per_line);
> diff --git a/include/media/v4l2-imagebus.h b/include/media/v4l2-imagebus.h
> new file mode 100644
> index 0000000..022d044
> --- /dev/null
> +++ b/include/media/v4l2-imagebus.h
> @@ -0,0 +1,84 @@
> +/*
> + * Image Bus API header
> + *
> + * Copyright (C) 2009, Guennadi Liakhovetski <g.liakhovetski@gmx.de>
> + *
> + * This program is free software; you can redistribute it and/or modify
> + * it under the terms of the GNU General Public License version 2 as
> + * published by the Free Software Foundation.
> + */
> +
> +#ifndef V4L2_IMGBUS_H
> +#define V4L2_IMGBUS_H
> +
> +enum v4l2_imgbus_packing {
> +	V4L2_IMGBUS_PACKING_NONE,
> +	V4L2_IMGBUS_PACKING_2X8_PADHI,
> +	V4L2_IMGBUS_PACKING_2X8_PADLO,
> +	V4L2_IMGBUS_PACKING_EXTEND16,
> +};
> +
> +enum v4l2_imgbus_order {
> +	V4L2_IMGBUS_ORDER_LE,
> +	V4L2_IMGBUS_ORDER_BE,
> +};
> +
> +enum v4l2_imgbus_pixelcode {
> +	V4L2_IMGBUS_FMT_YUYV,
> +	V4L2_IMGBUS_FMT_YVYU,
> +	V4L2_IMGBUS_FMT_UYVY,
> +	V4L2_IMGBUS_FMT_VYUY,
> +	V4L2_IMGBUS_FMT_VYUY_SMPTE170M_8,
> +	V4L2_IMGBUS_FMT_VYUY_SMPTE170M_16,
> +	V4L2_IMGBUS_FMT_RGB555,
> +	V4L2_IMGBUS_FMT_RGB555X,
> +	V4L2_IMGBUS_FMT_RGB565,
> +	V4L2_IMGBUS_FMT_RGB565X,
> +	V4L2_IMGBUS_FMT_SBGGR8,
> +	V4L2_IMGBUS_FMT_SGBRG8,
> +	V4L2_IMGBUS_FMT_SGRBG8,
> +	V4L2_IMGBUS_FMT_SRGGB8,
> +	V4L2_IMGBUS_FMT_SBGGR10,
> +	V4L2_IMGBUS_FMT_SGBRG10,
> +	V4L2_IMGBUS_FMT_SGRBG10,
> +	V4L2_IMGBUS_FMT_SRGGB10,
> +	V4L2_IMGBUS_FMT_GREY,
> +	V4L2_IMGBUS_FMT_Y16,
> +	V4L2_IMGBUS_FMT_Y10,
> +	V4L2_IMGBUS_FMT_SBGGR10_2X8_PADHI_BE,
> +	V4L2_IMGBUS_FMT_SBGGR10_2X8_PADLO_BE,
> +	V4L2_IMGBUS_FMT_SBGGR10_2X8_PADHI_LE,
> +	V4L2_IMGBUS_FMT_SBGGR10_2X8_PADLO_LE,
> +};
> +
> +/**
> + * struct v4l2_imgbus_pixelfmt - Data format on the image bus
> + * @fourcc:		Fourcc code...
> + * @colorspace:		and colorspace, that will be obtained if the data is
> + *			stored in memory in the following way:
> + * @bits_per_sample:	How many bits the bridge has to sample
> + * @packing:		Type of sample-packing, that has to be used
> + * @order:		Sample order when storing in memory
> + */
> +struct v4l2_imgbus_pixelfmt {
> +	u32				fourcc;
> +	enum v4l2_colorspace		colorspace;
> +	const char			*name;
> +	enum v4l2_imgbus_packing	packing;
> +	enum v4l2_imgbus_order		order;
> +	u8				bits_per_sample;
> +};
> +
> +struct v4l2_imgbus_framefmt {
> +	__u32				width;
> +	__u32				height;
> +	enum v4l2_imgbus_pixelcode	code;
> +	enum v4l2_field			field;
> +};
> +
> +const struct v4l2_imgbus_pixelfmt *v4l2_imgbus_get_fmtdesc(
> +	enum v4l2_imgbus_pixelcode code);
> +s32 v4l2_imgbus_bytes_per_line(u32 width,
> +			       const struct v4l2_imgbus_pixelfmt *imgf);
> +
> +#endif
> diff --git a/include/media/v4l2-subdev.h b/include/media/v4l2-subdev.h
> index 04193eb..1e86f39 100644
> --- a/include/media/v4l2-subdev.h
> +++ b/include/media/v4l2-subdev.h
> @@ -22,6 +22,7 @@
>  #define _V4L2_SUBDEV_H
>  
>  #include <media/v4l2-common.h>
> +#include <media/v4l2-imagebus.h>
>  
>  struct v4l2_device;
>  struct v4l2_subdev;
> @@ -196,7 +197,7 @@ struct v4l2_subdev_audio_ops {
>     s_std_output: set v4l2_std_id for video OUTPUT devices. This is ignored by
>  	video input devices.
>  
> -  s_crystal_freq: sets the frequency of the crystal used to generate the
> +   s_crystal_freq: sets the frequency of the crystal used to generate the
>  	clocks in Hz. An extra flags field allows device specific configuration
>  	regarding clock frequency dividers, etc. If not used, then set flags
>  	to 0. If the frequency is not supported, then -EINVAL is returned.
> @@ -206,6 +207,8 @@ struct v4l2_subdev_audio_ops {
>  
>     s_routing: see s_routing in audio_ops, except this version is for video
>  	devices.
> +
> +   enum_imgbus_fmt: enumerate pixel formats provided by a video data source
>   */
>  struct v4l2_subdev_video_ops {
>  	int (*s_routing)(struct v4l2_subdev *sd, u32 input, u32 output, u32 config);
> @@ -227,6 +230,11 @@ struct v4l2_subdev_video_ops {
>  	int (*s_crop)(struct v4l2_subdev *sd, struct v4l2_crop *crop);
>  	int (*g_parm)(struct v4l2_subdev *sd, struct v4l2_streamparm *param);
>  	int (*s_parm)(struct v4l2_subdev *sd, struct v4l2_streamparm *param);
> +	int (*enum_imgbus_fmt)(struct v4l2_subdev *sd, int index,
> +			       enum v4l2_imgbus_pixelcode *code);
> +	int (*g_imgbus_fmt)(struct v4l2_subdev *sd, struct v4l2_imgbus_framefmt *fmt);
> +	int (*try_imgbus_fmt)(struct v4l2_subdev *sd, struct v4l2_imgbus_framefmt *fmt);
> +	int (*s_imgbus_fmt)(struct v4l2_subdev *sd, struct v4l2_imgbus_framefmt *fmt);
>  };
>  
>  /**



-- 
Hans Verkuil - video4linux developer - sponsored by TANDBERG Telecom
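
The packing logic behind v4l2_imgbus_bytes_per_line() in the patch above can be modelled in plain userspace C. This is a sketch only: the enum names mirror those in the proposed v4l2-imagebus.h, but the function here is a standalone re-implementation for illustration, not the kernel code itself.

```c
#include <assert.h>
#include <stddef.h>

/* Names mirror the V4L2_IMGBUS_PACKING_* values from the proposed header. */
enum imgbus_packing {
	PACKING_NONE,		/* samples stored back to back */
	PACKING_2X8_PADHI,	/* each sample in 2 bytes, padding bits high */
	PACKING_2X8_PADLO,	/* each sample in 2 bytes, padding bits low */
	PACKING_EXTEND16,	/* sample sign/zero-extended to 16 bits */
};

/* Model of v4l2_imgbus_bytes_per_line(): all the 2X8 and EXTEND16
 * packings occupy two bytes per sample in memory, whatever the number
 * of bits actually sampled on the bus. */
static long bytes_per_line(unsigned int width, enum imgbus_packing packing,
			   unsigned int bits_per_sample)
{
	switch (packing) {
	case PACKING_NONE:
		return (long)width * bits_per_sample / 8;
	case PACKING_2X8_PADHI:
	case PACKING_2X8_PADLO:
	case PACKING_EXTEND16:
		return (long)width * 2;
	}
	return -1;	/* stands in for -EINVAL in the kernel */
}
```

So a 640-wide line of 10-bit Bayer data with EXTEND16 packing occupies 1280 bytes, the same as 8-bit data stored with 2X8_PADHI padding.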

^ permalink raw reply	[flat|nested] 51+ messages in thread

* Re: [PATCH/RFC 9/9] mt9t031: make the use of the soc-camera client API optional
  2009-10-30 14:01 ` [PATCH/RFC 9/9] mt9t031: make the use of the soc-camera client API optional Guennadi Liakhovetski
  2009-10-30 15:28   ` Karicheri, Muralidharan
@ 2009-11-05 15:46   ` Hans Verkuil
  1 sibling, 0 replies; 51+ messages in thread
From: Hans Verkuil @ 2009-11-05 15:46 UTC (permalink / raw)
  To: Guennadi Liakhovetski
  Cc: Linux Media Mailing List, Laurent Pinchart, Sakari Ailus,
	Muralidharan Karicheri

On Friday 30 October 2009 15:01:36 Guennadi Liakhovetski wrote:
> Now that we have moved most of the functions over to the v4l2-subdev API, only
> querying and setting bus parameters are still performed using the legacy
> soc-camera client API. Make the use of this API optional for mt9t031.
> 
> Signed-off-by: Guennadi Liakhovetski <g.liakhovetski@gmx.de>
> ---
> 
> Muralidharan, this one is for you to test. To differentiate between the 
> soc-camera case and a generic user I check i2c client's platform data 
> (client->dev.platform_data), so, you have to make sure your user doesn't 
> use that field for something else.
> 
> One more note: I'm not sure about where v4l2_device_unregister_subdev() 
> should be called. In soc-camera the core calls 
> v4l2_i2c_new_subdev_board(), which then calls 
> v4l2_device_register_subdev(). Logically, it's also the core that then 
> calls v4l2_device_unregister_subdev(). Whereas I see many other client 
> drivers call v4l2_device_unregister_subdev() internally. So, if your 
> bridge driver does not call v4l2_device_unregister_subdev() itself and 
> expects the client to call it, there will be a slight problem with that 
> too.

The remove function of an i2c module should call v4l2_device_unregister_subdev.

From the v4l2-framework.txt document:

"Make sure to call v4l2_device_unregister_subdev(sd) when the remove() callback
is called. This will unregister the sub-device from the bridge driver. It is
safe to call this even if the sub-device was never registered.

You need to do this because when the bridge driver destroys the i2c adapter
the remove() callbacks are called of the i2c devices on that adapter.
After that the corresponding v4l2_subdev structures are invalid, so they
have to be unregistered first. Calling v4l2_device_unregister_subdev(sd)
from the remove() callback ensures that this is always done correctly."

Note that this is something that will not normally happen on a SoC, but it
is common for USB or PCI devices.

Regards,

	Hans

> 
>  drivers/media/video/mt9t031.c |  146 ++++++++++++++++++++---------------------
>  1 files changed, 70 insertions(+), 76 deletions(-)
> 
> diff --git a/drivers/media/video/mt9t031.c b/drivers/media/video/mt9t031.c
> index c95c277..49357bd 100644
> --- a/drivers/media/video/mt9t031.c
> +++ b/drivers/media/video/mt9t031.c
> @@ -204,6 +204,59 @@ static unsigned long mt9t031_query_bus_param(struct soc_camera_device *icd)
>  	return soc_camera_apply_sensor_flags(icl, MT9T031_BUS_PARAM);
>  }
>  
> +static const struct v4l2_queryctrl mt9t031_controls[] = {
> +	{
> +		.id		= V4L2_CID_VFLIP,
> +		.type		= V4L2_CTRL_TYPE_BOOLEAN,
> +		.name		= "Flip Vertically",
> +		.minimum	= 0,
> +		.maximum	= 1,
> +		.step		= 1,
> +		.default_value	= 0,
> +	}, {
> +		.id		= V4L2_CID_HFLIP,
> +		.type		= V4L2_CTRL_TYPE_BOOLEAN,
> +		.name		= "Flip Horizontally",
> +		.minimum	= 0,
> +		.maximum	= 1,
> +		.step		= 1,
> +		.default_value	= 0,
> +	}, {
> +		.id		= V4L2_CID_GAIN,
> +		.type		= V4L2_CTRL_TYPE_INTEGER,
> +		.name		= "Gain",
> +		.minimum	= 0,
> +		.maximum	= 127,
> +		.step		= 1,
> +		.default_value	= 64,
> +		.flags		= V4L2_CTRL_FLAG_SLIDER,
> +	}, {
> +		.id		= V4L2_CID_EXPOSURE,
> +		.type		= V4L2_CTRL_TYPE_INTEGER,
> +		.name		= "Exposure",
> +		.minimum	= 1,
> +		.maximum	= 255,
> +		.step		= 1,
> +		.default_value	= 255,
> +		.flags		= V4L2_CTRL_FLAG_SLIDER,
> +	}, {
> +		.id		= V4L2_CID_EXPOSURE_AUTO,
> +		.type		= V4L2_CTRL_TYPE_BOOLEAN,
> +		.name		= "Automatic Exposure",
> +		.minimum	= 0,
> +		.maximum	= 1,
> +		.step		= 1,
> +		.default_value	= 1,
> +	}
> +};
> +
> +static struct soc_camera_ops mt9t031_ops = {
> +	.set_bus_param		= mt9t031_set_bus_param,
> +	.query_bus_param	= mt9t031_query_bus_param,
> +	.controls		= mt9t031_controls,
> +	.num_controls		= ARRAY_SIZE(mt9t031_controls),
> +};
> +
>  /* target must be _even_ */
>  static u16 mt9t031_skip(s32 *source, s32 target, s32 max)
>  {
> @@ -223,10 +276,9 @@ static u16 mt9t031_skip(s32 *source, s32 target, s32 max)
>  }
>  
>  /* rect is the sensor rectangle, the caller guarantees parameter validity */
> -static int mt9t031_set_params(struct soc_camera_device *icd,
> +static int mt9t031_set_params(struct i2c_client *client,
>  			      struct v4l2_rect *rect, u16 xskip, u16 yskip)
>  {
> -	struct i2c_client *client = to_i2c_client(to_soc_camera_control(icd));
>  	struct mt9t031 *mt9t031 = to_mt9t031(client);
>  	int ret;
>  	u16 xbin, ybin;
> @@ -307,7 +359,7 @@ static int mt9t031_set_params(struct soc_camera_device *icd,
>  		if (ret >= 0) {
>  			const u32 shutter_max = MT9T031_MAX_HEIGHT + vblank;
>  			const struct v4l2_queryctrl *qctrl =
> -				soc_camera_find_qctrl(icd->ops,
> +				soc_camera_find_qctrl(&mt9t031_ops,
>  						      V4L2_CID_EXPOSURE);
>  			mt9t031->exposure = (shutter_max / 2 + (total_h - 1) *
>  				 (qctrl->maximum - qctrl->minimum)) /
> @@ -333,7 +385,6 @@ static int mt9t031_s_crop(struct v4l2_subdev *sd, struct v4l2_crop *a)
>  	struct v4l2_rect rect = a->c;
>  	struct i2c_client *client = sd->priv;
>  	struct mt9t031 *mt9t031 = to_mt9t031(client);
> -	struct soc_camera_device *icd = client->dev.platform_data;
>  
>  	rect.width = ALIGN(rect.width, 2);
>  	rect.height = ALIGN(rect.height, 2);
> @@ -344,7 +395,7 @@ static int mt9t031_s_crop(struct v4l2_subdev *sd, struct v4l2_crop *a)
>  	soc_camera_limit_side(&rect.top, &rect.height,
>  		     MT9T031_ROW_SKIP, MT9T031_MIN_HEIGHT, MT9T031_MAX_HEIGHT);
>  
> -	return mt9t031_set_params(icd, &rect, mt9t031->xskip, mt9t031->yskip);
> +	return mt9t031_set_params(client, &rect, mt9t031->xskip, mt9t031->yskip);
>  }
>  
>  static int mt9t031_g_crop(struct v4l2_subdev *sd, struct v4l2_crop *a)
> @@ -391,7 +442,6 @@ static int mt9t031_s_fmt(struct v4l2_subdev *sd,
>  {
>  	struct i2c_client *client = sd->priv;
>  	struct mt9t031 *mt9t031 = to_mt9t031(client);
> -	struct soc_camera_device *icd = client->dev.platform_data;
>  	u16 xskip, yskip;
>  	struct v4l2_rect rect = mt9t031->rect;
>  
> @@ -403,7 +453,7 @@ static int mt9t031_s_fmt(struct v4l2_subdev *sd,
>  	yskip = mt9t031_skip(&rect.height, imgf->height, MT9T031_MAX_HEIGHT);
>  
>  	/* mt9t031_set_params() doesn't change width and height */
> -	return mt9t031_set_params(icd, &rect, xskip, yskip);
> +	return mt9t031_set_params(client, &rect, xskip, yskip);
>  }
>  
>  /*
> @@ -476,59 +526,6 @@ static int mt9t031_s_register(struct v4l2_subdev *sd,
>  }
>  #endif
>  
> -static const struct v4l2_queryctrl mt9t031_controls[] = {
> -	{
> -		.id		= V4L2_CID_VFLIP,
> -		.type		= V4L2_CTRL_TYPE_BOOLEAN,
> -		.name		= "Flip Vertically",
> -		.minimum	= 0,
> -		.maximum	= 1,
> -		.step		= 1,
> -		.default_value	= 0,
> -	}, {
> -		.id		= V4L2_CID_HFLIP,
> -		.type		= V4L2_CTRL_TYPE_BOOLEAN,
> -		.name		= "Flip Horizontally",
> -		.minimum	= 0,
> -		.maximum	= 1,
> -		.step		= 1,
> -		.default_value	= 0,
> -	}, {
> -		.id		= V4L2_CID_GAIN,
> -		.type		= V4L2_CTRL_TYPE_INTEGER,
> -		.name		= "Gain",
> -		.minimum	= 0,
> -		.maximum	= 127,
> -		.step		= 1,
> -		.default_value	= 64,
> -		.flags		= V4L2_CTRL_FLAG_SLIDER,
> -	}, {
> -		.id		= V4L2_CID_EXPOSURE,
> -		.type		= V4L2_CTRL_TYPE_INTEGER,
> -		.name		= "Exposure",
> -		.minimum	= 1,
> -		.maximum	= 255,
> -		.step		= 1,
> -		.default_value	= 255,
> -		.flags		= V4L2_CTRL_FLAG_SLIDER,
> -	}, {
> -		.id		= V4L2_CID_EXPOSURE_AUTO,
> -		.type		= V4L2_CTRL_TYPE_BOOLEAN,
> -		.name		= "Automatic Exposure",
> -		.minimum	= 0,
> -		.maximum	= 1,
> -		.step		= 1,
> -		.default_value	= 1,
> -	}
> -};
> -
> -static struct soc_camera_ops mt9t031_ops = {
> -	.set_bus_param		= mt9t031_set_bus_param,
> -	.query_bus_param	= mt9t031_query_bus_param,
> -	.controls		= mt9t031_controls,
> -	.num_controls		= ARRAY_SIZE(mt9t031_controls),
> -};
> -
>  static int mt9t031_g_ctrl(struct v4l2_subdev *sd, struct v4l2_control *ctrl)
>  {
>  	struct i2c_client *client = sd->priv;
> @@ -565,7 +562,6 @@ static int mt9t031_s_ctrl(struct v4l2_subdev *sd, struct v4l2_control *ctrl)
>  {
>  	struct i2c_client *client = sd->priv;
>  	struct mt9t031 *mt9t031 = to_mt9t031(client);
> -	struct soc_camera_device *icd = client->dev.platform_data;
>  	const struct v4l2_queryctrl *qctrl;
>  	int data;
>  
> @@ -657,7 +653,8 @@ static int mt9t031_s_ctrl(struct v4l2_subdev *sd, struct v4l2_control *ctrl)
>  
>  			if (set_shutter(client, total_h) < 0)
>  				return -EIO;
> -			qctrl = soc_camera_find_qctrl(icd->ops, V4L2_CID_EXPOSURE);
> +			qctrl = soc_camera_find_qctrl(&mt9t031_ops,
> +						      V4L2_CID_EXPOSURE);
>  			mt9t031->exposure = (shutter_max / 2 + (total_h - 1) *
>  				 (qctrl->maximum - qctrl->minimum)) /
>  				shutter_max + qctrl->minimum;
> @@ -751,18 +748,16 @@ static int mt9t031_probe(struct i2c_client *client,
>  	struct mt9t031 *mt9t031;
>  	struct soc_camera_device *icd = client->dev.platform_data;
>  	struct i2c_adapter *adapter = to_i2c_adapter(client->dev.parent);
> -	struct soc_camera_link *icl;
>  	int ret;
>  
> -	if (!icd) {
> -		dev_err(&client->dev, "MT9T031: missing soc-camera data!\n");
> -		return -EINVAL;
> -	}
> +	if (icd) {
> +		struct soc_camera_link *icl = to_soc_camera_link(icd);
> +		if (!icl) {
> +			dev_err(&client->dev, "MT9T031 driver needs platform data\n");
> +			return -EINVAL;
> +		}
>  
> -	icl = to_soc_camera_link(icd);
> -	if (!icl) {
> -		dev_err(&client->dev, "MT9T031 driver needs platform data\n");
> -		return -EINVAL;
> +		icd->ops = &mt9t031_ops;
>  	}
>  
>  	if (!i2c_check_functionality(adapter, I2C_FUNC_SMBUS_WORD_DATA)) {
> @@ -777,9 +772,6 @@ static int mt9t031_probe(struct i2c_client *client,
>  
>  	v4l2_i2c_subdev_init(&mt9t031->subdev, client, &mt9t031_subdev_ops);
>  
> -	/* Second stage probe - when a capture adapter is there */
> -	icd->ops		= &mt9t031_ops;
> -
>  	mt9t031->rect.left	= MT9T031_COLUMN_SKIP;
>  	mt9t031->rect.top	= MT9T031_ROW_SKIP;
>  	mt9t031->rect.width	= MT9T031_MAX_WIDTH;
> @@ -801,7 +793,8 @@ static int mt9t031_probe(struct i2c_client *client,
>  	mt9t031_disable(client);
>  
>  	if (ret) {
> -		icd->ops = NULL;
> +		if (icd)
> +			icd->ops = NULL;
>  		i2c_set_clientdata(client, NULL);
>  		kfree(mt9t031);
>  	}
> @@ -814,7 +807,8 @@ static int mt9t031_remove(struct i2c_client *client)
>  	struct mt9t031 *mt9t031 = to_mt9t031(client);
>  	struct soc_camera_device *icd = client->dev.platform_data;
>  
> -	icd->ops = NULL;
> +	if (icd)
> +		icd->ops = NULL;
>  	i2c_set_clientdata(client, NULL);
>  	client->driver = NULL;
>  	kfree(mt9t031);



-- 
Hans Verkuil - video4linux developer - sponsored by TANDBERG Telecom

^ permalink raw reply	[flat|nested] 51+ messages in thread

* Re: [PATCH/RFC 9/9 v2] mt9t031: make the use of the soc-camera client API optional
  2009-11-04 16:49         ` [PATCH/RFC 9/9 v2] " Guennadi Liakhovetski
  2009-11-04 16:57           ` Karicheri, Muralidharan
@ 2009-11-05 15:57           ` Hans Verkuil
  2009-11-05 16:59             ` Guennadi Liakhovetski
  1 sibling, 1 reply; 51+ messages in thread
From: Hans Verkuil @ 2009-11-05 15:57 UTC (permalink / raw)
  To: Guennadi Liakhovetski
  Cc: Karicheri, Muralidharan, Linux Media Mailing List,
	Laurent Pinchart, Sakari Ailus

On Wednesday 04 November 2009 17:49:28 Guennadi Liakhovetski wrote:
> Now that we have moved most of the functions over to the v4l2-subdev API, only
> querying and setting bus parameters are still performed using the legacy
> soc-camera client API. Make the use of this API optional for mt9t031.
> 
> Signed-off-by: Guennadi Liakhovetski <g.liakhovetski@gmx.de>
> ---
> 
> On Mon, 2 Nov 2009, Karicheri, Muralidharan wrote:
> 
> > >> >+static struct soc_camera_ops mt9t031_ops = {
> > >> >+	.set_bus_param		= mt9t031_set_bus_param,
> > >> >+	.query_bus_param	= mt9t031_query_bus_param,
> > >> >+	.controls		= mt9t031_controls,
> > >> >+	.num_controls		= ARRAY_SIZE(mt9t031_controls),
> > >> >+};
> > >> >+
> > >>
> > >> [MK] Why don't you implement queryctrl ops in core? query_bus_param
> > >> & set_bus_param() can be implemented as a sub device operation as well
> > >> right? I think we need to get the bus parameter RFC implemented and
> > >> this driver could be targeted for it's first use so that we could
> > >> work together to get it accepted. I didn't get a chance to study your
> > >> bus image format RFC, but plan to review it soon and to see if it can be
> > >> used in my platform as well. For use of this driver in our platform,
> > >> all reference to soc_ must be removed. I am ok if the structure is
> > >> re-used, but if this driver calls any soc_camera function, it cannot
> > >> be used in my platform.
> > >
> > >Why? Some soc-camera functions are just library functions, you just have
> > >to build soc-camera into your kernel. (also see below)
> > >
> > >My point is that the control is for the sensor device, so why implement
> > >queryctrl in SoC camera? Just for this I need to include SOC camera in 
> > my build? That doesn't make any sense at all. IMHO, queryctrl() 
> > logically belongs to this sensor driver which can be called from the 
> > >bridge driver using subdev API call. Any reverse dependency from MT9T031 
> > to SoC camera to be removed if it is to be re-used across other 
> > platforms. Can we agree on this?
> 
> In general I'm sure you understand, that there are lots of functions in 
> the kernel, that we use in specific modules, not because they interact 
> with other systems, but because they implement some common functionality 
> and just reduce code-duplication. And I can well imagine that in many such 
> cases using just one or a couple of such functions will pull a much larger 
> pile of unused code with them. But in this case those calls can indeed be 
> very easily eliminated. Please have a look at the version below.

I'm not following this, I'm afraid. The sensor drivers should just support
queryctrl and should use v4l2_ctrl_query_fill() from v4l2-common.c to fill
in the v4l2_queryctrl struct.

This will also make it easy to convert them to the control framework that I
am working on.

Regards,

	Hans

> 
> > Did you have a chance to compare the driver file that I had sent to you?
> 
> I looked at it, but it is based on an earlier version of the driver, so, 
> it wasn't very easy to compare. Maybe you could send a diff against the 
> mainline version, on which it is based?
> 
> Thanks
> Guennadi
> 
>  drivers/media/video/mt9t031.c |  167 +++++++++++++++++++++--------------------
>  1 files changed, 85 insertions(+), 82 deletions(-)
> 
> diff --git a/drivers/media/video/mt9t031.c b/drivers/media/video/mt9t031.c
> index c95c277..86bf8f6 100644
> --- a/drivers/media/video/mt9t031.c
> +++ b/drivers/media/video/mt9t031.c
> @@ -204,6 +204,71 @@ static unsigned long mt9t031_query_bus_param(struct soc_camera_device *icd)
>  	return soc_camera_apply_sensor_flags(icl, MT9T031_BUS_PARAM);
>  }
>  
> +enum {
> +	MT9T031_CTRL_VFLIP,
> +	MT9T031_CTRL_HFLIP,
> +	MT9T031_CTRL_GAIN,
> +	MT9T031_CTRL_EXPOSURE,
> +	MT9T031_CTRL_EXPOSURE_AUTO,
> +};
> +
> +static const struct v4l2_queryctrl mt9t031_controls[] = {
> +	[MT9T031_CTRL_VFLIP] = {
> +		.id		= V4L2_CID_VFLIP,
> +		.type		= V4L2_CTRL_TYPE_BOOLEAN,
> +		.name		= "Flip Vertically",
> +		.minimum	= 0,
> +		.maximum	= 1,
> +		.step		= 1,
> +		.default_value	= 0,
> +	},
> +	[MT9T031_CTRL_HFLIP] = {
> +		.id		= V4L2_CID_HFLIP,
> +		.type		= V4L2_CTRL_TYPE_BOOLEAN,
> +		.name		= "Flip Horizontally",
> +		.minimum	= 0,
> +		.maximum	= 1,
> +		.step		= 1,
> +		.default_value	= 0,
> +	},
> +	[MT9T031_CTRL_GAIN] = {
> +		.id		= V4L2_CID_GAIN,
> +		.type		= V4L2_CTRL_TYPE_INTEGER,
> +		.name		= "Gain",
> +		.minimum	= 0,
> +		.maximum	= 127,
> +		.step		= 1,
> +		.default_value	= 64,
> +		.flags		= V4L2_CTRL_FLAG_SLIDER,
> +	},
> +	[MT9T031_CTRL_EXPOSURE] = {
> +		.id		= V4L2_CID_EXPOSURE,
> +		.type		= V4L2_CTRL_TYPE_INTEGER,
> +		.name		= "Exposure",
> +		.minimum	= 1,
> +		.maximum	= 255,
> +		.step		= 1,
> +		.default_value	= 255,
> +		.flags		= V4L2_CTRL_FLAG_SLIDER,
> +	},
> +	[MT9T031_CTRL_EXPOSURE_AUTO] = {
> +		.id		= V4L2_CID_EXPOSURE_AUTO,
> +		.type		= V4L2_CTRL_TYPE_BOOLEAN,
> +		.name		= "Automatic Exposure",
> +		.minimum	= 0,
> +		.maximum	= 1,
> +		.step		= 1,
> +		.default_value	= 1,
> +	}
> +};
> +
> +static struct soc_camera_ops mt9t031_ops = {
> +	.set_bus_param		= mt9t031_set_bus_param,
> +	.query_bus_param	= mt9t031_query_bus_param,
> +	.controls		= mt9t031_controls,
> +	.num_controls		= ARRAY_SIZE(mt9t031_controls),
> +};
> +
>  /* target must be _even_ */
>  static u16 mt9t031_skip(s32 *source, s32 target, s32 max)
>  {
> @@ -223,10 +288,9 @@ static u16 mt9t031_skip(s32 *source, s32 target, s32 max)
>  }
>  
>  /* rect is the sensor rectangle, the caller guarantees parameter validity */
> -static int mt9t031_set_params(struct soc_camera_device *icd,
> +static int mt9t031_set_params(struct i2c_client *client,
>  			      struct v4l2_rect *rect, u16 xskip, u16 yskip)
>  {
> -	struct i2c_client *client = to_i2c_client(to_soc_camera_control(icd));
>  	struct mt9t031 *mt9t031 = to_mt9t031(client);
>  	int ret;
>  	u16 xbin, ybin;
> @@ -307,8 +371,7 @@ static int mt9t031_set_params(struct soc_camera_device *icd,
>  		if (ret >= 0) {
>  			const u32 shutter_max = MT9T031_MAX_HEIGHT + vblank;
>  			const struct v4l2_queryctrl *qctrl =
> -				soc_camera_find_qctrl(icd->ops,
> -						      V4L2_CID_EXPOSURE);
> +				&mt9t031_controls[MT9T031_CTRL_EXPOSURE];
>  			mt9t031->exposure = (shutter_max / 2 + (total_h - 1) *
>  				 (qctrl->maximum - qctrl->minimum)) /
>  				shutter_max + qctrl->minimum;
> @@ -333,7 +396,6 @@ static int mt9t031_s_crop(struct v4l2_subdev *sd, struct v4l2_crop *a)
>  	struct v4l2_rect rect = a->c;
>  	struct i2c_client *client = sd->priv;
>  	struct mt9t031 *mt9t031 = to_mt9t031(client);
> -	struct soc_camera_device *icd = client->dev.platform_data;
>  
>  	rect.width = ALIGN(rect.width, 2);
>  	rect.height = ALIGN(rect.height, 2);
> @@ -344,7 +406,7 @@ static int mt9t031_s_crop(struct v4l2_subdev *sd, struct v4l2_crop *a)
>  	soc_camera_limit_side(&rect.top, &rect.height,
>  		     MT9T031_ROW_SKIP, MT9T031_MIN_HEIGHT, MT9T031_MAX_HEIGHT);
>  
> -	return mt9t031_set_params(icd, &rect, mt9t031->xskip, mt9t031->yskip);
> +	return mt9t031_set_params(client, &rect, mt9t031->xskip, mt9t031->yskip);
>  }
>  
>  static int mt9t031_g_crop(struct v4l2_subdev *sd, struct v4l2_crop *a)
> @@ -391,7 +453,6 @@ static int mt9t031_s_fmt(struct v4l2_subdev *sd,
>  {
>  	struct i2c_client *client = sd->priv;
>  	struct mt9t031 *mt9t031 = to_mt9t031(client);
> -	struct soc_camera_device *icd = client->dev.platform_data;
>  	u16 xskip, yskip;
>  	struct v4l2_rect rect = mt9t031->rect;
>  
> @@ -403,7 +464,7 @@ static int mt9t031_s_fmt(struct v4l2_subdev *sd,
>  	yskip = mt9t031_skip(&rect.height, imgf->height, MT9T031_MAX_HEIGHT);
>  
>  	/* mt9t031_set_params() doesn't change width and height */
> -	return mt9t031_set_params(icd, &rect, xskip, yskip);
> +	return mt9t031_set_params(client, &rect, xskip, yskip);
>  }
>  
>  /*
> @@ -476,59 +537,6 @@ static int mt9t031_s_register(struct v4l2_subdev *sd,
>  }
>  #endif
>  
> -static const struct v4l2_queryctrl mt9t031_controls[] = {
> -	{
> -		.id		= V4L2_CID_VFLIP,
> -		.type		= V4L2_CTRL_TYPE_BOOLEAN,
> -		.name		= "Flip Vertically",
> -		.minimum	= 0,
> -		.maximum	= 1,
> -		.step		= 1,
> -		.default_value	= 0,
> -	}, {
> -		.id		= V4L2_CID_HFLIP,
> -		.type		= V4L2_CTRL_TYPE_BOOLEAN,
> -		.name		= "Flip Horizontally",
> -		.minimum	= 0,
> -		.maximum	= 1,
> -		.step		= 1,
> -		.default_value	= 0,
> -	}, {
> -		.id		= V4L2_CID_GAIN,
> -		.type		= V4L2_CTRL_TYPE_INTEGER,
> -		.name		= "Gain",
> -		.minimum	= 0,
> -		.maximum	= 127,
> -		.step		= 1,
> -		.default_value	= 64,
> -		.flags		= V4L2_CTRL_FLAG_SLIDER,
> -	}, {
> -		.id		= V4L2_CID_EXPOSURE,
> -		.type		= V4L2_CTRL_TYPE_INTEGER,
> -		.name		= "Exposure",
> -		.minimum	= 1,
> -		.maximum	= 255,
> -		.step		= 1,
> -		.default_value	= 255,
> -		.flags		= V4L2_CTRL_FLAG_SLIDER,
> -	}, {
> -		.id		= V4L2_CID_EXPOSURE_AUTO,
> -		.type		= V4L2_CTRL_TYPE_BOOLEAN,
> -		.name		= "Automatic Exposure",
> -		.minimum	= 0,
> -		.maximum	= 1,
> -		.step		= 1,
> -		.default_value	= 1,
> -	}
> -};
> -
> -static struct soc_camera_ops mt9t031_ops = {
> -	.set_bus_param		= mt9t031_set_bus_param,
> -	.query_bus_param	= mt9t031_query_bus_param,
> -	.controls		= mt9t031_controls,
> -	.num_controls		= ARRAY_SIZE(mt9t031_controls),
> -};
> -
>  static int mt9t031_g_ctrl(struct v4l2_subdev *sd, struct v4l2_control *ctrl)
>  {
>  	struct i2c_client *client = sd->priv;
> @@ -565,15 +573,9 @@ static int mt9t031_s_ctrl(struct v4l2_subdev *sd, struct v4l2_control *ctrl)
>  {
>  	struct i2c_client *client = sd->priv;
>  	struct mt9t031 *mt9t031 = to_mt9t031(client);
> -	struct soc_camera_device *icd = client->dev.platform_data;
>  	const struct v4l2_queryctrl *qctrl;
>  	int data;
>  
> -	qctrl = soc_camera_find_qctrl(&mt9t031_ops, ctrl->id);
> -
> -	if (!qctrl)
> -		return -EINVAL;
> -
>  	switch (ctrl->id) {
>  	case V4L2_CID_VFLIP:
>  		if (ctrl->value)
> @@ -592,6 +594,7 @@ static int mt9t031_s_ctrl(struct v4l2_subdev *sd, struct v4l2_control *ctrl)
>  			return -EIO;
>  		break;
>  	case V4L2_CID_GAIN:
> +		qctrl = &mt9t031_controls[MT9T031_CTRL_GAIN];
>  		if (ctrl->value > qctrl->maximum || ctrl->value < qctrl->minimum)
>  			return -EINVAL;
>  		/* See Datasheet Table 7, Gain settings. */
> @@ -631,6 +634,7 @@ static int mt9t031_s_ctrl(struct v4l2_subdev *sd, struct v4l2_control *ctrl)
>  		mt9t031->gain = ctrl->value;
>  		break;
>  	case V4L2_CID_EXPOSURE:
> +		qctrl = &mt9t031_controls[MT9T031_CTRL_EXPOSURE];
>  		/* mt9t031 has maximum == default */
>  		if (ctrl->value > qctrl->maximum || ctrl->value < qctrl->minimum)
>  			return -EINVAL;
> @@ -657,7 +661,7 @@ static int mt9t031_s_ctrl(struct v4l2_subdev *sd, struct v4l2_control *ctrl)
>  
>  			if (set_shutter(client, total_h) < 0)
>  				return -EIO;
> -			qctrl = soc_camera_find_qctrl(icd->ops, V4L2_CID_EXPOSURE);
> +			qctrl = &mt9t031_controls[MT9T031_CTRL_EXPOSURE];
>  			mt9t031->exposure = (shutter_max / 2 + (total_h - 1) *
>  				 (qctrl->maximum - qctrl->minimum)) /
>  				shutter_max + qctrl->minimum;
> @@ -665,6 +669,8 @@ static int mt9t031_s_ctrl(struct v4l2_subdev *sd, struct v4l2_control *ctrl)
>  		} else
>  			mt9t031->autoexposure = 0;
>  		break;
> +	default:
> +		return -EINVAL;
>  	}
>  	return 0;
>  }
> @@ -751,18 +757,16 @@ static int mt9t031_probe(struct i2c_client *client,
>  	struct mt9t031 *mt9t031;
>  	struct soc_camera_device *icd = client->dev.platform_data;
>  	struct i2c_adapter *adapter = to_i2c_adapter(client->dev.parent);
> -	struct soc_camera_link *icl;
>  	int ret;
>  
> -	if (!icd) {
> -		dev_err(&client->dev, "MT9T031: missing soc-camera data!\n");
> -		return -EINVAL;
> -	}
> +	if (icd) {
> +		struct soc_camera_link *icl = to_soc_camera_link(icd);
> +		if (!icl) {
> +			dev_err(&client->dev, "MT9T031 driver needs platform data\n");
> +			return -EINVAL;
> +		}
>  
> -	icl = to_soc_camera_link(icd);
> -	if (!icl) {
> -		dev_err(&client->dev, "MT9T031 driver needs platform data\n");
> -		return -EINVAL;
> +		icd->ops = &mt9t031_ops;
>  	}
>  
>  	if (!i2c_check_functionality(adapter, I2C_FUNC_SMBUS_WORD_DATA)) {
> @@ -777,9 +781,6 @@ static int mt9t031_probe(struct i2c_client *client,
>  
>  	v4l2_i2c_subdev_init(&mt9t031->subdev, client, &mt9t031_subdev_ops);
>  
> -	/* Second stage probe - when a capture adapter is there */
> -	icd->ops		= &mt9t031_ops;
> -
>  	mt9t031->rect.left	= MT9T031_COLUMN_SKIP;
>  	mt9t031->rect.top	= MT9T031_ROW_SKIP;
>  	mt9t031->rect.width	= MT9T031_MAX_WIDTH;
> @@ -801,7 +802,8 @@ static int mt9t031_probe(struct i2c_client *client,
>  	mt9t031_disable(client);
>  
>  	if (ret) {
> -		icd->ops = NULL;
> +		if (icd)
> +			icd->ops = NULL;
>  		i2c_set_clientdata(client, NULL);
>  		kfree(mt9t031);
>  	}
> @@ -814,7 +816,8 @@ static int mt9t031_remove(struct i2c_client *client)
>  	struct mt9t031 *mt9t031 = to_mt9t031(client);
>  	struct soc_camera_device *icd = client->dev.platform_data;
>  
> -	icd->ops = NULL;
> +	if (icd)
> +		icd->ops = NULL;
>  	i2c_set_clientdata(client, NULL);
>  	client->driver = NULL;
>  	kfree(mt9t031);
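
As a side note on the hunks above: the exposure computation kept in
mt9t031_s_ctrl is a linear map from the shutter width total_h in
[1, shutter_max] back onto [qctrl->minimum, qctrl->maximum], with the
shutter_max / 2 term giving round-to-nearest. A standalone sketch of the
same integer arithmetic (function name invented, just to illustrate):

```c
#include <assert.h>

/*
 * Same integer arithmetic as in mt9t031_s_ctrl above: map a shutter
 * width total_h in [1, shutter_max] onto [qmin, qmax], rounding to
 * nearest via the shutter_max / 2 term.
 * total_h == 1 maps to qmin, total_h == shutter_max maps to qmax.
 */
static int exposure_from_shutter(unsigned long total_h,
				 unsigned long shutter_max,
				 int qmin, int qmax)
{
	return (int)((shutter_max / 2 + (total_h - 1) *
		      (unsigned long)(qmax - qmin)) / shutter_max) + qmin;
}
```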



-- 
Hans Verkuil - video4linux developer - sponsored by TANDBERG Telecom

^ permalink raw reply	[flat|nested] 51+ messages in thread

* Re: [PATCH 4/9] v4l: Add a 10-bit monochrome and missing 8- and 10-bit Bayer fourcc codes
  2009-11-05 14:45   ` Hans Verkuil
@ 2009-11-05 16:29     ` Guennadi Liakhovetski
  2009-11-05 16:32       ` Hans Verkuil
  0 siblings, 1 reply; 51+ messages in thread
From: Guennadi Liakhovetski @ 2009-11-05 16:29 UTC (permalink / raw)
  To: Hans Verkuil
  Cc: Linux Media Mailing List, Laurent Pinchart, Sakari Ailus,
	Muralidharan Karicheri

On Thu, 5 Nov 2009, Hans Verkuil wrote:

> On Friday 30 October 2009 15:01:14 Guennadi Liakhovetski wrote:
> > The 16-bit monochrome fourcc code has been previously abused for a 10-bit
> > format, add a new 10-bit code instead. Also add missing 8- and 10-bit Bayer
> > fourcc codes for completeness.
> 
> I'm fairly certain that you also have to document these new formats in the
> DocBook documentation. Run 'make spec' to verify this.

You mean hg-documentation, don't you? These are _kernel_ git-patches so 
far. When I prepare a pull request I'll (try not to forget to) add the 
docs too.

Thanks
Guennadi
---
Guennadi Liakhovetski, Ph.D.
Freelance Open-Source Software Developer
http://www.open-technology.de/


* Re: [PATCH 4/9] v4l: Add a 10-bit monochrome and missing 8- and 10-bit Bayer fourcc codes
  2009-11-05 16:29     ` Guennadi Liakhovetski
@ 2009-11-05 16:32       ` Hans Verkuil
  0 siblings, 0 replies; 51+ messages in thread
From: Hans Verkuil @ 2009-11-05 16:32 UTC (permalink / raw)
  To: Guennadi Liakhovetski
  Cc: Linux Media Mailing List, Laurent Pinchart, Sakari Ailus,
	Muralidharan Karicheri

On Thursday 05 November 2009 17:29:29 Guennadi Liakhovetski wrote:
> On Thu, 5 Nov 2009, Hans Verkuil wrote:
> 
> > On Friday 30 October 2009 15:01:14 Guennadi Liakhovetski wrote:
> > > The 16-bit monochrome fourcc code has been previously abused for a 10-bit
> > > format, add a new 10-bit code instead. Also add missing 8- and 10-bit Bayer
> > > fourcc codes for completeness.
> > 
> > I'm fairly certain that you also have to document these new formats in the
> > DocBook documentation. Run 'make spec' to verify this.
> 
> You mean hg-documentation, don't you? These are _kernel_ git-patches so 
> far. When I prepare a pull request I'll (try not to forget to) add the 
> docs too.

Yes, that's hg documentation.

Thanks,

	Hans

-- 
Hans Verkuil - video4linux developer - sponsored by TANDBERG Telecom


* Re: [PATCH/RFC 7/9 v2] v4l: add an image-bus API for configuring v4l2 subdev pixel and frame formats
  2009-11-05 15:41   ` Hans Verkuil
@ 2009-11-05 16:51     ` Guennadi Liakhovetski
  2009-11-05 18:11       ` Hans Verkuil
  0 siblings, 1 reply; 51+ messages in thread
From: Guennadi Liakhovetski @ 2009-11-05 16:51 UTC (permalink / raw)
  To: Hans Verkuil
  Cc: Linux Media Mailing List, Laurent Pinchart, Sakari Ailus,
	Muralidharan Karicheri

On Thu, 5 Nov 2009, Hans Verkuil wrote:

> On Friday 30 October 2009 15:01:27 Guennadi Liakhovetski wrote:
> > Video subdevices, like cameras, decoders, connect to video bridges over
> > specialised busses. Data is being transferred over these busses in various
> > formats, which only loosely correspond to fourcc codes, describing how video
> > data is stored in RAM. This is not a one-to-one correspondence, therefore we
> > cannot use fourcc codes to configure subdevice output data formats. This patch
> > adds codes for several such on-the-bus formats and an API, similar to the
> > familiar .s_fmt(), .g_fmt(), .try_fmt(), .enum_fmt() API for configuring those
> > codes. After all users of the old API in struct v4l2_subdev_video_ops are
> > converted, the API will be removed.
> 
> OK, this seems to completely disregard points raised in my earlier "bus and
> data format negotiation" RFC which is available here once www.mail-archive.org
> is working again:
> 
> http://www.mail-archive.com/linux-media%40vger.kernel.org/msg09644.html
> 
> BTW, ignore the 'Video timings' section of that RFC. That part is wrong.
> 
> The big problem I have with this proposal is the unholy mixing of bus and
> memory formatting. That should be completely separated. Only the bridge
> knows how a bus format can be converted into which memory (pixel) formats.

Please, explain why only the bridge knows about that.

My model is the following:

1. we define various data formats on the bus. Each such format variation 
gets a unique identification.

2. given a data format ID, the data format is perfectly defined. This 
means you do not need special knowledge of this specific format to be 
able to handle it in some _generic_ way. A typical such 
generic handling on a bridge is, for instance, copying the data into 
memory "one-to-one." For example, if a sensor delivers 10 bit monochrome 
data over an eight bit bus as follows

y7 y6 y5 y4 y3 y2 y1 y0   xx xx xx xx xx xx y9 y8 ...

then _any_ bridge, capable of just copying data from the bus bytewise into 
RAM will be able to produce little-endian 10-bit grey pixel format in RAM. 
This handling is _not_ bridge specific. This is what I call packing.
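
To illustrate (a quick sketch, function name invented, not code from the
patches): reassembling the 10-bit value from the two bus bytes above is a
plain little-endian read, which is exactly why a dumb bytewise copy
already yields a valid 10-bit grey pixel format in RAM:

```c
#include <stdint.h>

/*
 * Sketch: one 10-bit monochrome sample as it arrives on the 8-bit bus
 * described above -- low 8 bits first, then the top 2 bits in the low
 * positions of the next byte. A bridge that copies the bus bytewise
 * into RAM stores exactly this little-endian 16-bit layout without any
 * format-specific knowledge.
 */
static uint16_t y10_from_bus(uint8_t b0, uint8_t b1)
{
	return (uint16_t)(b0 | ((b1 & 0x03) << 8));
}
```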

3. Therefore, each bridge, capable of handling of some "generic" data 
using some specific packing, can perfectly look through data-format 
descriptors, see if it finds any with the supported packing, and if so, it 
_then_ knows, that it can use that specific data format and the specific 
packing to produce the resulting pixel format from the format descriptor.

> A bus format is also separate from the colorspace: that is an independent
> piece of data.

Sure. TBH, I do not quite see how enum v4l2_colorspace is actually used. Is it 
uniquely defined by each pixel format? So, it can be derived from that? 
Then it is indeed redundant. Can drop, don't care about it that much.

> Personally I would just keep using v4l2_pix_format, except
> that the fourcc field refers to a busimg format rather than a pixel format
> in the case of subdevs. In most non-sensor drivers this field is completely
> ignored anyway since the bus format is fixed.

Example: there are cameras that can be configured to pad the 2 bits from 
the incomplete byte above to 10 either in the high or in the low bits. Do 
you want to introduce a new FOURCC code for those two formats? This is an 
example of what I call packing.
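
A sketch of the two variants (helper names invented): the same 10-bit
sample lands in a 16-bit word either right-justified or left-justified,
so the data is identical but the layout on the bus differs:

```c
#include <stdint.h>

/* 10-bit sample right-justified, padding in the high bits: 000000yy yyyyyyyy */
static uint16_t y10_pad_high(uint16_t y10)
{
	return (uint16_t)(y10 & 0x3ff);
}

/* same sample left-justified, padding in the low bits: yyyyyyyy yy000000 */
static uint16_t y10_pad_low(uint16_t y10)
{
	return (uint16_t)((y10 & 0x3ff) << 6);
}
```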

> I don't mind if you do a bus format to pixel format mapping inside soc-camera,
> but it shouldn't spill over into the v4l core code.

Don't understand. This is not for soc-camera only. This infrastructure 
should be used by all subdev drivers, communicating over a data bus. The 
distinction is quite clear to me: if two entities connect over a bus, they 
use an image-bus data format to describe the data format. If they write 
and read from RAM - that's pixel format.

> Laurent is also correct that this should be eventually pad-specific, but
> we can ignore that for now.
> 
> I'm also missing the bus hardware configuration (polarities, sampling on
> rising or falling edge). What happened to that? Or is that a next step?

It is separate, yes.

Thanks
Guennadi
---
Guennadi Liakhovetski, Ph.D.
Freelance Open-Source Software Developer
http://www.open-technology.de/


* Re: [PATCH/RFC 9/9 v2] mt9t031: make the use of the soc-camera client API optional
  2009-11-05 15:57           ` Hans Verkuil
@ 2009-11-05 16:59             ` Guennadi Liakhovetski
  2009-11-05 17:07               ` Karicheri, Muralidharan
  0 siblings, 1 reply; 51+ messages in thread
From: Guennadi Liakhovetski @ 2009-11-05 16:59 UTC (permalink / raw)
  To: Hans Verkuil
  Cc: Karicheri, Muralidharan, Linux Media Mailing List,
	Laurent Pinchart, Sakari Ailus

On Thu, 5 Nov 2009, Hans Verkuil wrote:

> On Wednesday 04 November 2009 17:49:28 Guennadi Liakhovetski wrote:
> > Now that we have moved most of the functions over to the v4l2-subdev API, only
> > quering and setting bus parameters are still performed using the legacy
> > soc-camera client API. Make the use of this API optional for mt9t031.
> > 
> > Signed-off-by: Guennadi Liakhovetski <g.liakhovetski@gmx.de>
> > ---
> > 
> > On Mon, 2 Nov 2009, Karicheri, Muralidharan wrote:
> > 
> > > >> >+static struct soc_camera_ops mt9t031_ops = {
> > > >> >+	.set_bus_param		= mt9t031_set_bus_param,
> > > >> >+	.query_bus_param	= mt9t031_query_bus_param,
> > > >> >+	.controls		= mt9t031_controls,
> > > >> >+	.num_controls		= ARRAY_SIZE(mt9t031_controls),
> > > >> >+};
> > > >> >+
> > > >>
> > > >> [MK] Why don't you implement queryctrl ops in core? query_bus_param
> > > >> & set_bus_param() can be implemented as a sub device operation as well
> > > >> right? I think we need to get the bus parameter RFC implemented and
> > > >> this driver could be targeted for it's first use so that we could
> > > >> work together to get it accepted. I didn't get a chance to study your
> > > >> bus image format RFC, but plan to review it soon and to see if it can be
> > > >> used in my platform as well. For use of this driver in our platform,
> > > >> all reference to soc_ must be removed. I am ok if the structure is
> > > >> re-used, but if this driver calls any soc_camera function, it canot
> > > >> be used in my platform.
> > > >
> > > >Why? Some soc-camera functions are just library functions, you just have
> > > >to build soc-camera into your kernel. (also see below)
> > > >
> > > My point is that the control is for the sensor device, so why implement
> > > queryctrl in SoC camera? Just for this I need to include SOC camera in 
> > > my build? That doesn't make any sense at all. IMHO, queryctrl() 
> > > logically belongs to this sensor driver which can be called from the 
> > > bridge driver using sudev API call. Any reverse dependency from MT9T031 
> > > to SoC camera to be removed if it is to be re-used across other 
> > > platforms. Can we agree on this?
> > 
> > In general I'm sure you understand, that there are lots of functions in 
> > the kernel, that we use in specific modules, not because they interact 
> > with other systems, but because they implement some common functionality 
> > and just reduce code-duplication. And I can well imagine that in many such 
> > cases using just one or a couple of such functions will pull a much larger 
> > pile of unused code with them. But in this case those calls can indeed be 
> > very easily eliminated. Please have a look at the version below.
> 
> I'm not following this, I'm afraid. The sensor drivers should just support
> queryctrl and should use v4l2_ctrl_query_fill() from v4l2-common.c to fill
> in the v4l2_queryctrl struct.

I think, this is unrelated. Muralidharan just complained about the 
soc_camera_find_qctrl() function being used in client subdev drivers, that 
were to be converted to v4l2-subdev, specifically, in mt9t031.c. And I 
just explained, that that's just a pretty trivial library function, that 
does not introduce any restrictions on how that subdev driver can be used 
in non-soc-camera configurations, apart from the need to build and load 
the soc-camera module. In other words, any v4l2-device bridge driver 
should be able to communicate with such a subdev driver, calling that 
function.

> This will also make it easy to convert them to the control framework that I
> am working on.

Thanks
Guennadi
---
Guennadi Liakhovetski, Ph.D.
Freelance Open-Source Software Developer
http://www.open-technology.de/


* RE: [PATCH/RFC 9/9 v2] mt9t031: make the use of the soc-camera client API optional
  2009-11-05 16:59             ` Guennadi Liakhovetski
@ 2009-11-05 17:07               ` Karicheri, Muralidharan
  2009-11-10 13:54                 ` Laurent Pinchart
  0 siblings, 1 reply; 51+ messages in thread
From: Karicheri, Muralidharan @ 2009-11-05 17:07 UTC (permalink / raw)
  To: Guennadi Liakhovetski, Hans Verkuil
  Cc: Linux Media Mailing List, Laurent Pinchart, Sakari Ailus


Guennadi,

>> in the v4l2_queryctrl struct.
>
>I think, this is unrelated. Muralidharan just complained about the
>soc_camera_find_qctrl() function being used in client subdev drivers, that
>were to be converted to v4l2-subdev, specifically, in mt9t031.c. And I
>just explained, that that's just a pretty trivial library function, that
>does not introduce any restrictions on how that subdev driver can be used
>in non-soc-camera configurations, apart from the need to build and load
>the soc-camera module. In other words, any v4l2-device bridge driver
>should be able to communicate with such a subdev driver, calling that
>function.
>
If soc_camera_find_qctrl() is such a generic function, why don't you
move it to v4l2-common.c so that other platforms don't have to build
the SOC camera subsystem to use this function? Your statement
reinforces this.

>> This will also make it easy to convert them to the control framework that
>I
>> am working on.
>
>Thanks
>Guennadi
>---
>Guennadi Liakhovetski, Ph.D.
>Freelance Open-Source Software Developer
>http://www.open-technology.de/



* Re: [PATCH/RFC 7/9 v2] v4l: add an image-bus API for configuring v4l2 subdev pixel and frame formats
  2009-11-05 16:51     ` Guennadi Liakhovetski
@ 2009-11-05 18:11       ` Hans Verkuil
  2009-11-05 18:56         ` Guennadi Liakhovetski
  0 siblings, 1 reply; 51+ messages in thread
From: Hans Verkuil @ 2009-11-05 18:11 UTC (permalink / raw)
  To: Guennadi Liakhovetski
  Cc: Linux Media Mailing List, Laurent Pinchart, Sakari Ailus,
	Muralidharan Karicheri

On Thursday 05 November 2009 17:51:50 Guennadi Liakhovetski wrote:
> On Thu, 5 Nov 2009, Hans Verkuil wrote:
> 
> > On Friday 30 October 2009 15:01:27 Guennadi Liakhovetski wrote:
> > > Video subdevices, like cameras, decoders, connect to video bridges over
> > > specialised busses. Data is being transferred over these busses in various
> > > formats, which only loosely correspond to fourcc codes, describing how video
> > > data is stored in RAM. This is not a one-to-one correspondence, therefore we
> > > cannot use fourcc codes to configure subdevice output data formats. This patch
> > > adds codes for several such on-the-bus formats and an API, similar to the
> > > familiar .s_fmt(), .g_fmt(), .try_fmt(), .enum_fmt() API for configuring those
> > > codes. After all users of the old API in struct v4l2_subdev_video_ops are
> > > converted, the API will be removed.
> > 
> > OK, this seems to completely disregard points raised in my earlier "bus and
> > data format negotiation" RFC which is available here once www.mail-archive.org
> > is working again:
> > 
> > http://www.mail-archive.com/linux-media%40vger.kernel.org/msg09644.html
> > 
> > BTW, ignore the 'Video timings' section of that RFC. That part is wrong.
> > 
> > The big problem I have with this proposal is the unholy mixing of bus and
> > memory formatting. That should be completely separated. Only the bridge
> > knows how a bus format can be converted into which memory (pixel) formats.
> 
> Please, explain why only the bridge knows about that.
> 
> My model is the following:
> 
> 1. we define various data formats on the bus. Each such format variation 
> gets a unique identification.
> 
> 2. given a data format ID the data format is perfectly defined. This 
> means, you do not have to have a special knowledge about this specific 
> format to be able to handle it in some _generic_ way. A typical such 
> generic handling on a bridge is, for instance, copying the data into 
> memory "one-to-one." For example, if a sensor delivers 10 bit monochrome 
> data over an eight bit bus as follows
> 
> y7 y6 y5 y4 y3 y2 y1 y0   xx xx xx xx xx xx y9 y8 ...
> 
> then _any_ bridge, capable of just copying data from the bus bytewise into 
> RAM will be able to produce little-endian 10-bit grey pixel format in RAM. 
> This handling is _not_ bridge specific. This is what I call packing.

Of course it is bridge dependent. It is the bridge that takes data from the
bus and puts it in memory. In many cases that is done very simply by bytewise
copying. Other bridges can do RGB to YUV or vice versa conversions or can do
endianness conversion or can do JPEG/MPEG compression on the fly or whatever
else hardware designers will think of.

It's no doubt true for the SoCs you have been working with, but it is not so
simple in general.
 
> 3. Therefore, each bridge, capable of handling of some "generic" data 
> using some specific packing, can perfectly look through data-format 
> descriptors, see if it finds any with the supported packing, and if so, it 
> _then_ knows, that it can use that specific data format and the specific 
> packing to produce the resulting pixel format from the format descriptor.
> 
> > A bus format is also separate from the colorspace: that is an independent
> > piece of data.
> 
> > Sure. TBH, I do not quite see how enum v4l2_colorspace is actually used. Is it 
> uniquely defined by each pixel format? So, it can be derived from that? 
> Then it is indeed redundant. Can drop, don't care about it that much.

It's independent from the pixel format. So the same pixel (or bus) format can
have different colorspaces.

> > Personally I would just keep using v4l2_pix_format, except
> > that the fourcc field refers to a bus image format rather than a pixel format
> > in the case of subdevs. In most non-sensor drivers this field is completely
> > ignored anyway since the bus format is fixed.
> 
> Example: there are cameras, that can be configured to pad 2 bits from the 
> incomplete byte above to 10 either in high or in low bits. Do you want to 
> introduce a new FOURCC code for those two formats? This is an example of 
> what I call packing.

If this happens in the sensor, then yes.

> > I don't mind if you do a bus format to pixel format mapping inside soc-camera,
> > but it shouldn't spill over into the v4l core code.
> 
> Don't understand. This is not for soc-camera only. This infrastructure 
> > should be used by all subdev drivers, communicating over a data bus. The 
> distinction is quite clear to me: if two entities connect over a bus, they 
> use an image-bus data format to describe the data format. If they write 
> and read from RAM - that's pixel format.

We agree about that, but why then does struct v4l2_imgbus_framefmt contain
memory-related fields like packing, order and bits_per_sample? A subdev driver
does not care about that. All it has are X pins through which the data has to
> pass. How that will look in memory it doesn't know and doesn't care.

> > Laurent is also correct that this should be eventually pad-specific, but
> > we can ignore that for now.
> > 
> > I'm also missing the bus hardware configuration (polarities, sampling on
> > rising or falling edge). What happened to that? Or is that a next step?
> 
> It is separate, yes.

I saw that as well. I had a lot of emails to go through :-)

Regards,

	Hans

> 
> Thanks
> Guennadi
> ---
> Guennadi Liakhovetski, Ph.D.
> Freelance Open-Source Software Developer
> http://www.open-technology.de/
> 
> 



-- 
Hans Verkuil - video4linux developer - sponsored by TANDBERG Telecom


* Re: [PATCH/RFC 7/9 v2] v4l: add an image-bus API for configuring v4l2 subdev pixel and frame formats
  2009-11-05 18:11       ` Hans Verkuil
@ 2009-11-05 18:56         ` Guennadi Liakhovetski
  2009-11-06  6:47           ` Hans Verkuil
  0 siblings, 1 reply; 51+ messages in thread
From: Guennadi Liakhovetski @ 2009-11-05 18:56 UTC (permalink / raw)
  To: Hans Verkuil
  Cc: Linux Media Mailing List, Laurent Pinchart, Sakari Ailus,
	Muralidharan Karicheri

On Thu, 5 Nov 2009, Hans Verkuil wrote:

> On Thursday 05 November 2009 17:51:50 Guennadi Liakhovetski wrote:
> > On Thu, 5 Nov 2009, Hans Verkuil wrote:
> > 
> > > On Friday 30 October 2009 15:01:27 Guennadi Liakhovetski wrote:
> > > > Video subdevices, like cameras, decoders, connect to video bridges over
> > > > specialised busses. Data is being transferred over these busses in various
> > > > formats, which only loosely correspond to fourcc codes, describing how video
> > > > data is stored in RAM. This is not a one-to-one correspondence, therefore we
> > > > cannot use fourcc codes to configure subdevice output data formats. This patch
> > > > adds codes for several such on-the-bus formats and an API, similar to the
> > > > familiar .s_fmt(), .g_fmt(), .try_fmt(), .enum_fmt() API for configuring those
> > > > codes. After all users of the old API in struct v4l2_subdev_video_ops are
> > > > converted, the API will be removed.
> > > 
> > > OK, this seems to completely disregard points raised in my earlier "bus and
> > > data format negotiation" RFC which is available here once www.mail-archive.org
> > > is working again:
> > > 
> > > http://www.mail-archive.com/linux-media%40vger.kernel.org/msg09644.html
> > > 
> > > BTW, ignore the 'Video timings' section of that RFC. That part is wrong.
> > > 
> > > The big problem I have with this proposal is the unholy mixing of bus and
> > > memory formatting. That should be completely separated. Only the bridge
> > > knows how a bus format can be converted into which memory (pixel) formats.
> > 
> > Please, explain why only the bridge knows about that.
> > 
> > My model is the following:
> > 
> > 1. we define various data formats on the bus. Each such format variation 
> > gets a unique identification.
> > 
> > 2. given a data format ID the data format is perfectly defined. This 
> > means, you do not have to have a special knowledge about this specific 
> > format to be able to handle it in some _generic_ way. A typical such 
> > generic handling on a bridge is, for instance, copying the data into 
> > memory "one-to-one." For example, if a sensor delivers 10 bit monochrome 
> > data over an eight bit bus as follows
> > 
> > y7 y6 y5 y4 y3 y2 y1 y0   xx xx xx xx xx xx y9 y8 ...
> > 
> > then _any_ bridge, capable of just copying data from the bus bytewise into 
> > RAM will be able to produce little-endian 10-bit grey pixel format in RAM. 
> > This handling is _not_ bridge specific. This is what I call packing.
> 
> Of course it is bridge dependent. It is the bridge that takes data from the
> bus and puts it in memory. In many cases that is done very simply by bytewise
> copying. Other bridges can do RGB to YUV or vice versa conversions or can do
> endianness conversion or can do JPEG/MPEG compression on the fly or whatever
> else hardware designers will think of.
> 
> It's no doubt true for the SoCs you have been working with, but it is not so
> simple in general.

Ok, I forgot to mention one more point in the model:

4. Each bridge has _two_ ways to process data: data-format-specific and 
generic (pass-through). It's the _former_ one that is bridge-specific, 
quite right! To process a data format in a _special_ way, a bridge 
doesn't need v4l2_imgbus_pixelfmt at all; the descriptor is only needed 
for data formats that the bridge does _not_ know specifically. In that 
_generic_ case the handling is not bridge-specific and a bridge driver 
can just look into the respective v4l2_imgbus_pixelfmt descriptor.

Consider the following: a bridge can process N formats in a specific way. 
It knows which bits in which order represent which colours, etc. In such a 
case you just tell the driver "format X" and that's all it has to know 
about it to be able to handle it.

The sensor, connected to the bridge, can also provide format Y, which the 
bridge doesn't know about. So what, there's then no way to use that 
format? Or do we have to add a _special_ handling rule for each format to 
each bridge driver?...
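
What I have in mind is roughly the following (all names invented, just a
sketch; the real proposal exposes this via v4l2_imgbus_pixelfmt
descriptors):

```c
/*
 * Sketch only: a bridge first checks its own table of formats it
 * handles in a _special_ way; failing that, it falls back to the
 * generic path if the format's descriptor advertises a packing that
 * the bridge can produce by plain copying. All names are invented.
 */
enum imgbus_packing {
	PACKING_NONE,		/* sample fits one bus word, copy 1:1 */
	PACKING_2X8_PADLO,	/* 2 bytes/sample, padding in low bits */
	PACKING_OPAQUE,		/* needs format-specific treatment     */
};

static int bridge_can_capture(int knows_format_specifically,
			      enum imgbus_packing packing)
{
	if (knows_format_specifically)
		return 1;	/* bridge-specific path */
	/* generic path: a bytewise copy suffices for these packings */
	return packing == PACKING_NONE || packing == PACKING_2X8_PADLO;
}
```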

> > 3. Therefore, each bridge, capable of handling of some "generic" data 
> > using some specific packing, can perfectly look through data-format 
> > descriptors, see if it finds any with the supported packing, and if so, it 
> > _then_ knows, that it can use that specific data format and the specific 
> > packing to produce the resulting pixel format from the format descriptor.
> > 
> > > A bus format is also separate from the colorspace: that is an independent
> > > piece of data.
> > 
> > Sure. TBH, I do not quite see how enum v4l2_colorspace is actually used. Is it 
> > uniquely defined by each pixel format? So, it can be derived from that? 
> > Then it is indeed redundant. Can drop, don't care about it that much.
> 
> It's independent from the pixel format. So the same pixel (or bus) format can
> have different colorspaces.

Then I do not understand what a colourspace means in the v4l context. You 
mean a YUV format can belong to a JPEG, or an sRGB space?...

> > > Personally I would just keep using v4l2_pix_format, except
> > > that the fourcc field refers to a bus image format rather than a pixel format
> > > in the case of subdevs. In most non-sensor drivers this field is completely
> > > ignored anyway since the bus format is fixed.
> > 
> > Example: there are cameras, that can be configured to pad 2 bits from the 
> > incomplete byte above to 10 either in high or in low bits. Do you want to 
> > introduce a new FOURCC code for those two formats? This is an example of 
> > what I call packing.
> 
> If this happens in the sensor, then yes.

No, those are two data formats, as produced by a camera sensor on the bus. 
What is made out of them in RAM is a completely separate issue.

> > > I don't mind if you do a bus format to pixel format mapping inside soc-camera,
> > > but it shouldn't spill over into the v4l core code.
> > 
> > Don't understand. This is not for soc-camera only. This infrastructure 
> > should be used by all subdev drivers, communicating over a data bus. The 
> > distinction is quite clear to me: if two entities connect over a bus, they 
> > use an image-bus data format to describe the data format. If they write 
> > and read from RAM - that's pixel format.
> 
> We agree about that, but why then does struct v4l2_imgbus_framefmt contain
> memory-related fields like packing, order and bits_per_sample? A subdev driver
> does not care about that. All it has are X pins through which the data has to
> pass. How that will look like in memory it doesn't know and doesn't care.

That's right. Subdev drivers do not care about v4l2_imgbus_framefmt; it's 
only bridge drivers that do.

> > > Laurent is also correct that this should be eventually pad-specific, but
> > > we can ignore that for now.
> > > 
> > > I'm also missing the bus hardware configuration (polarities, sampling on
> > > rising or falling edge). What happened to that? Or is that a next step?
> > 
> > It is separate, yes.
> 
> I saw that as well. I had a lot of emails to go through :-)

Thanks
Guennadi
---
Guennadi Liakhovetski, Ph.D.
Freelance Open-Source Software Developer
http://www.open-technology.de/

^ permalink raw reply	[flat|nested] 51+ messages in thread

* Re: [PATCH/RFC 7/9 v2] v4l: add an image-bus API for configuring v4l2 subdev pixel and frame formats
  2009-11-05 18:56         ` Guennadi Liakhovetski
@ 2009-11-06  6:47           ` Hans Verkuil
  2009-11-06  7:42             ` Guennadi Liakhovetski
  0 siblings, 1 reply; 51+ messages in thread
From: Hans Verkuil @ 2009-11-06  6:47 UTC (permalink / raw)
  To: Guennadi Liakhovetski
  Cc: Linux Media Mailing List, Laurent Pinchart, Sakari Ailus,
	Muralidharan Karicheri

On Thursday 05 November 2009 19:56:04 Guennadi Liakhovetski wrote:
> On Thu, 5 Nov 2009, Hans Verkuil wrote:
> 
> > On Thursday 05 November 2009 17:51:50 Guennadi Liakhovetski wrote:
> > > On Thu, 5 Nov 2009, Hans Verkuil wrote:
> > > 
> > > > On Friday 30 October 2009 15:01:27 Guennadi Liakhovetski wrote:
> > > > > Video subdevices, like cameras, decoders, connect to video bridges over
> > > > > specialised busses. Data is being transferred over these busses in various
> > > > > formats, which only loosely correspond to fourcc codes, describing how video
> > > > > data is stored in RAM. This is not a one-to-one correspondence, therefore we
> > > > > cannot use fourcc codes to configure subdevice output data formats. This patch
> > > > > adds codes for several such on-the-bus formats and an API, similar to the
> > > > > familiar .s_fmt(), .g_fmt(), .try_fmt(), .enum_fmt() API for configuring those
> > > > > codes. After all users of the old API in struct v4l2_subdev_video_ops are
> > > > > converted, the API will be removed.
> > > > 
> > > > OK, this seems to completely disregard points raised in my earlier "bus and
> > > > data format negotiation" RFC which is available here once www.mail-archive.org
> > > > is working again:
> > > > 
> > > > http://www.mail-archive.com/linux-media%40vger.kernel.org/msg09644.html
> > > > 
> > > > BTW, ignore the 'Video timings' section of that RFC. That part is wrong.
> > > > 
> > > > The big problem I have with this proposal is the unholy mixing of bus and
> > > > memory formatting. That should be completely separated. Only the bridge
> > > > knows how a bus format can be converted into which memory (pixel) formats.
> > > 
> > > Please, explain why only the bridge knows about that.
> > > 
> > > My model is the following:
> > > 
> > > 1. we define various data formats on the bus. Each such format variation 
> > > gets a unique identification.
> > > 
> > > 2. given a data format ID the data format is perfectly defined. This 
> > > means, you do not have to have a special knowledge about this specific 
> > > format to be able to handle it in some _generic_ way. A typical such 
> > > generic handling on a bridge is, for instance, copying the data into 
> > > memory "one-to-one." For example, if a sensor delivers 10 bit monochrome 
> > > data over an eight bit bus as follows
> > > 
> > > y7 y6 y5 y4 y3 y2 y1 y0   xx xx xx xx xx xx y9 y8 ...
> > > 
> > > then _any_ bridge, capable of just copying data from the bus bytewise into 
> > > RAM will be able to produce little-endian 10-bit grey pixel format in RAM. 
> > > This handling is _not_ bridge specific. This is what I call packing.
> > 
> > Of course it is bridge dependent. It is the bridge that takes data from the
> > bus and puts it in memory. In many cases that is done very simply by bytewise
> > copying. Other bridges can do RGB to YUV or vice versa conversions or can do
> > endianness conversion or can do JPEG/MPEG compression on the fly or whatever
> > else hardware designers will think of.
> > 
> > It's no doubt true for the SoCs you have been working with, but it is not so
> > simple in general.
> 
> Ok, I forgot to mention one more point in the model:
> 
> 4. Each bridge has _two_ ways to process data: data-format-specific and 
> generic (pass-through). It's the _former_ one that is bridge specific, 
> quite right! For a data format that a bridge can process in a _special_ 
> way, it doesn't need v4l2_imgbus_pixelfmt; the descriptor is only needed 
> for data formats that bridges do _not_ know specifically. In that 
> _generic_ case it is not bridge-specific and a bridge driver 
> can just look into the respective v4l2_imgbus_pixelfmt descriptor.
> 
> Consider the following: a bridge can process N formats in a specific way. 
> It knows which bits in which order represent which colours, etc. In such a 
> case you just tell the driver "format X" and that's all it has to know 
> about it to be able to handle it.
> 
> The sensor, connected to the bridge, can also provide format Y, which the 
> bridge doesn't know about. So what, there's then no way to use that 
> format? Or do we have to add a _special_ handling rule for each format to 
> each bridge driver?...
> 
> > > 3. Therefore, each bridge, capable of handling some "generic" data 
> > > using some specific packing, can perfectly look through data-format 
> > > descriptors, see if it finds any with the supported packing, and if so, it 
> > > _then_ knows, that it can use that specific data format and the specific 
> > > packing to produce the resulting pixel format from the format descriptor.
> > > 
> > > > A bus format is also separate from the colorspace: that is an independent
> > > > piece of data.
> > > 
> > > Sure. TBH, I do not quite see how enum v4l2_colorspace is actually used. Is it 
> > > uniquely defined by each pixel format? So, it can be derived from that? 
> > > Then it is indeed redundant. Can drop, don't care about it that much.
> > 
> > It's independent from the pixel format. So the same pixel (or bus) format can
> > have different colorspaces.
> 
> Then I do not understand what a colourspace means in v4l context. You mean 
> a yuv format can belong to a jpeg, or an srgb space?...

No, it's not that extreme, but e.g. the same yuv format can be used with
different colorspaces depending on the source. I don't have the datasheet
handy but I know that for HDMI inputs there are different RGB colorspaces
depending on the input resolution. So while the dataformat is the same,
the colorspace will be different.

> 
> > > > Personally I would just keep using v4l2_pix_format, except
> > > > that the fourcc field refers to a bus format rather than a pixel format
> > > > in the case of subdevs. In most non-sensor drivers this field is completely
> > > > ignored anyway since the bus format is fixed.
> > > 
> > > Example: there are cameras, that can be configured to pad 2 bits from the 
> > > incomplete byte above to 10 either in high or in low bits. Do you want to 
> > > introduce a new FOURCC code for those two formats? This is an example of 
> > > what I call packing.
> > 
> > If this happens in the sensor, then yes.
> 
> No, those are two data formats, as produced by a camera sensor on the bus. 
> What is made out of them in RAM is a completely separate issue.
> 
> > > > I don't mind if you do a bus format to pixel format mapping inside soc-camera,
> > > > but it shouldn't spill over into the v4l core code.
> > > 
> > > Don't understand. This is not for soc-camera only. This infrastructure 
> > > should be used by all subdev drivers, communicating aver a data bus. The 
> > > distinction is quite clear to me: if two entities connect over a bus, they 
> > > use an image-bus data format to describe the data format. If they write 
> > > and read from RAM - that's pixel format.
> > 
> > We agree about that, but why then does struct v4l2_imgbus_framefmt contain
> > memory-related fields like packing, order and bits_per_sample? A subdev driver
> > does not care about that. All it has are X pins through which the data has to
> > pass. How that will look like in memory it doesn't know and doesn't care.
> 
> That's right: subdev drivers do not care about v4l2_imgbus_framefmt; it's 
> only bridge drivers that do.

I think most of my objections would probably go away if you redid your subdev
API so that the subdev only gets the data format and none of the packing data.

The subdev API should not contain anything that it doesn't need. Otherwise it
becomes very confusing.

Regards,

	Hans

> 
> > > > Laurent is also correct that this should be eventually pad-specific, but
> > > > we can ignore that for now.
> > > > 
> > > > I'm also missing the bus hardware configuration (polarities, sampling on
> > > > rising or falling edge). What happened to that? Or is that a next step?
> > > 
> > > It is separate, yes.
> > 
> > I saw that as well. I had a lot of emails to go through :-)
> 
> Thanks
> Guennadi
> ---
> Guennadi Liakhovetski, Ph.D.
> Freelance Open-Source Software Developer
> http://www.open-technology.de/
> 



-- 
Hans Verkuil - video4linux developer - sponsored by TANDBERG Telecom

^ permalink raw reply	[flat|nested] 51+ messages in thread

* Re: [PATCH/RFC 7/9 v2] v4l: add an image-bus API for configuring v4l2 subdev pixel and frame formats
  2009-11-06  6:47           ` Hans Verkuil
@ 2009-11-06  7:42             ` Guennadi Liakhovetski
  2009-11-06  8:28               ` Hans Verkuil
  0 siblings, 1 reply; 51+ messages in thread
From: Guennadi Liakhovetski @ 2009-11-06  7:42 UTC (permalink / raw)
  To: Hans Verkuil
  Cc: Linux Media Mailing List, Laurent Pinchart, Sakari Ailus,
	Muralidharan Karicheri

On Fri, 6 Nov 2009, Hans Verkuil wrote:

> On Thursday 05 November 2009 19:56:04 Guennadi Liakhovetski wrote:
> > On Thu, 5 Nov 2009, Hans Verkuil wrote:
> > 
> > > On Thursday 05 November 2009 17:51:50 Guennadi Liakhovetski wrote:
> > > > On Thu, 5 Nov 2009, Hans Verkuil wrote:
> > > > 
> > > > > On Friday 30 October 2009 15:01:27 Guennadi Liakhovetski wrote:
> > > > > > Video subdevices, like cameras, decoders, connect to video bridges over
> > > > > > specialised busses. Data is being transferred over these busses in various
> > > > > > formats, which only loosely correspond to fourcc codes, describing how video
> > > > > > data is stored in RAM. This is not a one-to-one correspondence, therefore we
> > > > > > cannot use fourcc codes to configure subdevice output data formats. This patch
> > > > > > adds codes for several such on-the-bus formats and an API, similar to the
> > > > > > familiar .s_fmt(), .g_fmt(), .try_fmt(), .enum_fmt() API for configuring those
> > > > > > codes. After all users of the old API in struct v4l2_subdev_video_ops are
> > > > > > converted, the API will be removed.
> > > > > 
> > > > > OK, this seems to completely disregard points raised in my earlier "bus and
> > > > > data format negotiation" RFC which is available here once www.mail-archive.org
> > > > > is working again:
> > > > > 
> > > > > http://www.mail-archive.com/linux-media%40vger.kernel.org/msg09644.html
> > > > > 
> > > > > BTW, ignore the 'Video timings' section of that RFC. That part is wrong.
> > > > > 
> > > > > The big problem I have with this proposal is the unholy mixing of bus and
> > > > > memory formatting. That should be completely separated. Only the bridge
> > > > > knows how a bus format can be converted into which memory (pixel) formats.
> > > > 
> > > > Please, explain why only the bridge knows about that.
> > > > 
> > > > My model is the following:
> > > > 
> > > > 1. we define various data formats on the bus. Each such format variation 
> > > > gets a unique identification.
> > > > 
> > > > 2. given a data format ID the data format is perfectly defined. This 
> > > > means, you do not have to have a special knowledge about this specific 
> > > > format to be able to handle it in some _generic_ way. A typical such 
> > > > generic handling on a bridge is, for instance, copying the data into 
> > > > memory "one-to-one." For example, if a sensor delivers 10 bit monochrome 
> > > > data over an eight bit bus as follows
> > > > 
> > > > y7 y6 y5 y4 y3 y2 y1 y0   xx xx xx xx xx xx y9 y8 ...
> > > > 
> > > > then _any_ bridge, capable of just copying data from the bus bytewise into 
> > > > RAM will be able to produce little-endian 10-bit grey pixel format in RAM. 
> > > > This handling is _not_ bridge specific. This is what I call packing.
> > > 
> > > Of course it is bridge dependent. It is the bridge that takes data from the
> > > bus and puts it in memory. In many cases that is done very simply by bytewise
> > > copying. Other bridges can do RGB to YUV or vice versa conversions or can do
> > > endianness conversion or can do JPEG/MPEG compression on the fly or whatever
> > > else hardware designers will think of.
> > > 
> > > It's no doubt true for the SoCs you have been working with, but it is not so
> > > simple in general.
> > 
> > Ok, I forgot to mention one more point in the model:
> > 
> > 4. Each bridge has _two_ ways to process data: data-format-specific and 
> > generic (pass-through). It's the _former_ one that is bridge specific, 
> > quite right! For a data format that a bridge can process in a _special_ 
> > way, it doesn't need v4l2_imgbus_pixelfmt; the descriptor is only needed 
> > for data formats that bridges do _not_ know specifically. In that 
> > _generic_ case it is not bridge-specific and a bridge driver 
> > can just look into the respective v4l2_imgbus_pixelfmt descriptor.
> > 
> > Consider the following: a bridge can process N formats in a specific way. 
> > It knows which bits in which order represent which colours, etc. In such a 
> > case you just tell the driver "format X" and that's all it has to know 
> > about it to be able to handle it.
> > 
> > The sensor, connected to the bridge, can also provide format Y, which the 
> > bridge doesn't know about. So what, there's then no way to use that 
> > format? Or do we have to add a _special_ handling rule for each format to 
> > each bridge driver?...
> > 
> > > > 3. Therefore, each bridge, capable of handling some "generic" data 
> > > > using some specific packing, can perfectly look through data-format 
> > > > descriptors, see if it finds any with the supported packing, and if so, it 
> > > > _then_ knows, that it can use that specific data format and the specific 
> > > > packing to produce the resulting pixel format from the format descriptor.
> > > > 
> > > > > A bus format is also separate from the colorspace: that is an independent
> > > > > piece of data.
> > > > 
> > > > Sure. TBH, I do not quite see how enum v4l2_colorspace is actually used. Is it 
> > > > uniquely defined by each pixel format? So, it can be derived from that? 
> > > > Then it is indeed redundant. Can drop, don't care about it that much.
> > > 
> > > It's independent from the pixel format. So the same pixel (or bus) format can
> > > have different colorspaces.
> > 
> > Then I do not understand what a colourspace means in v4l context. You mean 
> > a yuv format can belong to a jpeg, or an srgb space?...
> 
> No, it's not that extreme, but e.g. the same yuv format can be used with
> different colorspaces depending on the source. I don't have the datasheet
> handy but I know that for HDMI inputs there are different RGB colorspaces
> depending on the input resolution. So while the dataformat is the same,
> the colorspace will be different.

Ok, so, you mean something like colour components are assigned in the same 
way in data tuples, but, for example, colour value ranges can be 
different?

As for whether a colour-space field is needed in struct 
v4l2_imgbus_pixelfmt, actually, I think, it is. Otherwise how would you 
reply to G_FMT and TRY_FMT requests?

> > > > > Personally I would just keep using v4l2_pix_format, except
> > > > > that the fourcc field refers to a bus format rather than a pixel format
> > > > > in the case of subdevs. In most non-sensor drivers this field is completely
> > > > > ignored anyway since the bus format is fixed.
> > > > 
> > > > Example: there are cameras, that can be configured to pad 2 bits from the 
> > > > incomplete byte above to 10 either in high or in low bits. Do you want to 
> > > > introduce a new FOURCC code for those two formats? This is an example of 
> > > > what I call packing.
> > > 
> > > If this happens in the sensor, then yes.
> > 
> > No, those are two data formats, as produced by a camera sensor on the bus. 
> > What is made out of them in RAM is a completely separate issue.
> > 
> > > > > I don't mind if you do a bus format to pixel format mapping inside soc-camera,
> > > > > but it shouldn't spill over into the v4l core code.
> > > > 
> > > > Don't understand. This is not for soc-camera only. This infrastructure 
> > > > should be used by all subdev drivers, communicating over a data bus. The 
> > > > distinction is quite clear to me: if two entities connect over a bus, they 
> > > > use an image-bus data format to describe the data format. If they write 
> > > > and read from RAM - that's pixel format.
> > > 
> > > We agree about that, but why then does struct v4l2_imgbus_framefmt contain
> > > memory-related fields like packing, order and bits_per_sample? A subdev driver
> > > does not care about that. All it has are X pins through which the data has to
> > > pass. How that will look like in memory it doesn't know and doesn't care.
> > 
> > That's right: subdev drivers do not care about v4l2_imgbus_framefmt; it's 
> > only bridge drivers that do.
> 
> I think most of my objections would probably go away if you redid your subdev
> API so that the subdev only gets the data format and none of the packing data.
> 
> The subdev API should not contain anything that it doesn't need. Otherwise it
> becomes very confusing.

Sorry, I grepped drivers and headers for v4l2_imgbus_pixelfmt and only see 
it used in host drivers. Can you point out more precisely what you mean?

Thanks
Guennadi
---
Guennadi Liakhovetski, Ph.D.
Freelance Open-Source Software Developer
http://www.open-technology.de/

^ permalink raw reply	[flat|nested] 51+ messages in thread

* Re: [PATCH/RFC 7/9 v2] v4l: add an image-bus API for configuring v4l2 subdev pixel and frame formats
  2009-11-06  7:42             ` Guennadi Liakhovetski
@ 2009-11-06  8:28               ` Hans Verkuil
  0 siblings, 0 replies; 51+ messages in thread
From: Hans Verkuil @ 2009-11-06  8:28 UTC (permalink / raw)
  To: Guennadi Liakhovetski
  Cc: Linux Media Mailing List, Laurent Pinchart, Sakari Ailus,
	Muralidharan Karicheri


> On Fri, 6 Nov 2009, Hans Verkuil wrote:
>
>> On Thursday 05 November 2009 19:56:04 Guennadi Liakhovetski wrote:
>> > On Thu, 5 Nov 2009, Hans Verkuil wrote:
>> >
>> > > On Thursday 05 November 2009 17:51:50 Guennadi Liakhovetski wrote:
>> > > > On Thu, 5 Nov 2009, Hans Verkuil wrote:
>> > > >
>> > > > > On Friday 30 October 2009 15:01:27 Guennadi Liakhovetski wrote:
>> > > > > > Video subdevices, like cameras, decoders, connect to video
>> bridges over
>> > > > > > specialised busses. Data is being transferred over these
>> busses in various
>> > > > > > formats, which only loosely correspond to fourcc codes,
>> describing how video
>> > > > > > data is stored in RAM. This is not a one-to-one
>> correspondence, therefore we
>> > > > > > cannot use fourcc codes to configure subdevice output data
>> formats. This patch
>> > > > > > adds codes for several such on-the-bus formats and an API,
>> similar to the
>> > > > > > familiar .s_fmt(), .g_fmt(), .try_fmt(), .enum_fmt() API for
>> configuring those
>> > > > > > codes. After all users of the old API in struct
>> v4l2_subdev_video_ops are
>> > > > > > converted, the API will be removed.
>> > > > >
>> > > > > OK, this seems to completely disregard points raised in my
>> earlier "bus and
>> > > > > data format negotiation" RFC which is available here once
>> www.mail-archive.org
>> > > > > is working again:
>> > > > >
>> > > > > http://www.mail-archive.com/linux-media%40vger.kernel.org/msg09644.html
>> > > > >
>> > > > > BTW, ignore the 'Video timings' section of that RFC. That part
>> is wrong.
>> > > > >
>> > > > > The big problem I have with this proposal is the unholy mixing
>> of bus and
>> > > > > memory formatting. That should be completely separated. Only the
>> bridge
>> > > > > knows how a bus format can be converted into which memory
>> (pixel) formats.
>> > > >
>> > > > Please, explain why only the bridge knows about that.
>> > > >
>> > > > My model is the following:
>> > > >
>> > > > 1. we define various data formats on the bus. Each such format
>> variation
>> > > > gets a unique identification.
>> > > >
>> > > > 2. given a data format ID the data format is perfectly defined.
>> This
>> > > > means, you do not have to have a special knowledge about this
>> specific
>> > > > format to be able to handle it in some _generic_ way. A typical
>> such
>> > > > generic handling on a bridge is, for instance, copying the data
>> into
>> > > > memory "one-to-one." For example, if a sensor delivers 10 bit
>> monochrome
>> > > > data over an eight bit bus as follows
>> > > >
>> > > > y7 y6 y5 y4 y3 y2 y1 y0   xx xx xx xx xx xx y9 y8 ...
>> > > >
>> > > > then _any_ bridge, capable of just copying data from the bus
>> bytewise into
>> > > > RAM will be able to produce little-endian 10-bit grey pixel format
>> in RAM.
>> > > > This handling is _not_ bridge specific. This is what I call
>> packing.
>> > >
>> > > Of course it is bridge dependent. It is the bridge that takes data
>> from the
>> > > bus and puts it in memory. In many cases that is done very simply by
>> bytewise
>> > > copying. Other bridges can do RGB to YUV or vice versa conversions
>> or can do
>> > > endianness conversion or can do JPEG/MPEG compression on the fly or
>> whatever
>> > > else hardware designers will think of.
>> > >
>> > > It's no doubt true for the SoCs you have been working with, but it
>> is not so
>> > > simple in general.
>> >
>> > Ok, I forgot to mention one more point in the model:
>> >
>> > 4. Each bridge has _two_ ways to process data: data-format-specific
>> and
>> > generic (pass-through). It's the _former_ one that is bridge specific,
>> > quite right! For a bridge to be able to process a data format, that it
>> can
>> > process in a _special_ way, it doesn't need v4l2_imgbus_pixelfmt, it's
>> > only for data-formats, that bridges do _not_ know specifically they
>> need
>> > it. In that _generic_ case it is not bridge-specific and a bridge
>> driver
>> > can just look into the respective v4l2_imgbus_pixelfmt descriptor.
>> >
>> > Consider the following: a bridge can process N formats in a specific
>> way.
>> > It knows which bits in which order represent which colours, etc. In
>> such a
>> > case you just tell the driver "format X" and that's all it has to know
>> > about it to be able to handle it.
>> >
>> > The sensor, connected to the bridge, can also provide format Y, which
>> the
>> > bridge doesn't know about. So what, there's then no way to use that
>> > format? Or do we have to add a _special_ handling rule for each format
>> to
>> > each bridge driver?...
>> >
>> > > > 3. Therefore, each bridge, capable of handling some "generic"
>> data
>> > > > using some specific packing, can perfectly look through
>> data-format
>> > > > descriptors, see if it finds any with the supported packing, and
>> if so, it
>> > > > _then_ knows, that it can use that specific data format and the
>> specific
>> > > > packing to produce the resulting pixel format from the format
>> descriptor.
>> > > >
>> > > > > A bus format is also separate from the colorspace: that is an
>> independent
>> > > > > piece of data.
>> > > >
>> > > > Sure. TBH, I do not quite see how enum v4l2_colorspace is actually
>> used. Is it
>> > > > uniquely defined by each pixel format? So, it can be derived from
>> that?
>> > > > Then it is indeed redundant. Can drop, don't care about it that
>> much.
>> > >
>> > > It's independent from the pixel format. So the same pixel (or bus)
>> format can
>> > > have different colorspaces.
>> >
>> > Then I do not understand what a colourspace means in v4l context. You
>> mean
>> > a yuv format can belong to a jpeg, or an srgb space?...
>>
>> No, it's not that extreme, but e.g. the same yuv format can be used with
>> different colorspaces depending on the source. I don't have the
>> datasheet
>> handy but I know that for HDMI inputs there are different RGB
>> colorspaces
>> depending on the input resolution. So while the dataformat is the same,
>> the colorspace will be different.
>
> Ok, so, you mean something like colour components are assigned in the same
> way in data tuples, but, for example, colour value ranges can be
> different?

Right.

> As for whether a colour-space field is needed in struct
> v4l2_imgbus_pixelfmt, actually, I think, it is. Otherwise how would you
> reply to G_FMT and TRY_FMT requests?

Yes, it definitely needs to be passed one way or another.

>
>> > > > > Personally I would just keep using v4l2_pix_format, except
>> > > > > that the fourcc field refers to a bus format rather than a
>> pixel format
>> > > > > in the case of subdevs. In most non-sensor drivers this field is
>> completely
>> > > > > ignored anyway since the bus format is fixed.
>> > > >
>> > > > Example: there are cameras, that can be configured to pad 2 bits
>> from the
>> > > > incomplete byte above to 10 either in high or in low bits. Do you
>> want to
>> > > > introduce a new FOURCC code for those two formats? This is an
>> example of
>> > > > what I call packing.
>> > >
>> > > If this happens in the sensor, then yes.
>> >
>> > No, those are two data formats, as produced by a camera sensor on the
>> bus.
>> > What is made out of them in RAM is a completely separate issue.
>> >
>> > > > > I don't mind if you do a bus format to pixel format mapping
>> inside soc-camera,
>> > > > > but it shouldn't spill over into the v4l core code.
>> > > >
>> > > > Don't understand. This is not for soc-camera only. This
>> infrastructure
>> > > > should be used by all subdev drivers, communicating over a data
>> bus. The
>> > > > distinction is quite clear to me: if two entities connect over a
>> bus, they
>> > > > use an image-bus data format to describe the data format. If they
>> write
>> > > > and read from RAM - that's pixel format.
>> > >
>> > > We agree about that, but why then does struct v4l2_imgbus_framefmt
>> contain
>> > > memory-related fields like packing, order and bits_per_sample? A
>> subdev driver
>> > > does not care about that. All it has are X pins through which the
>> data has to
>> > > pass. How that will look like in memory it doesn't know and doesn't
>> care.
>> >
>> > That's right. subdev drivers do not care about v4l2_imgbus_framefmt,
>> it's
>> > only bridge drivers that do.
>>
>> I think most of my objections would probably go away if you redid your
>> subdev
>> API so that the subdev only gets the data format and none of the packing
>> data.
>>
>> The subdev API should not contain anything that it doesn't need.
>> Otherwise it
>> becomes very confusing.
>
> Sorry, I grepped drivers and headers for v4l2_imgbus_pixelfmt and only see
> it used in host drivers. Can you point out more precisely what you mean?

Aargh! I confused enum v4l2_imgbus_pixelcode with struct
v4l2_imgbus_pixelfmt! Forget what I said, I'll do another review this
weekend.

Regards,

        Hans

-- 
Hans Verkuil - video4linux developer - sponsored by TANDBERG Telecom


^ permalink raw reply	[flat|nested] 51+ messages in thread

* Re: [PATCH 2/9] v4l: add new v4l2-subdev sensor operations, use g_skip_top_lines in soc-camera
  2009-10-30 14:01 ` [PATCH 2/9] v4l: add new v4l2-subdev sensor operations, use g_skip_top_lines in soc-camera Guennadi Liakhovetski
  2009-10-30 14:43   ` Karicheri, Muralidharan
@ 2009-11-10 12:55   ` Laurent Pinchart
  2009-11-10 14:11     ` Guennadi Liakhovetski
  1 sibling, 1 reply; 51+ messages in thread
From: Laurent Pinchart @ 2009-11-10 12:55 UTC (permalink / raw)
  To: Guennadi Liakhovetski
  Cc: Linux Media Mailing List, Hans Verkuil, Sakari Ailus,
	Muralidharan Karicheri

Hi Guennadi,

On Friday 30 October 2009 15:01:06 Guennadi Liakhovetski wrote:
> Introduce new v4l2-subdev sensor operations, move .enum_framesizes() and
> .enum_frameintervals() methods to it,

I understand that we need sensor-specific operations, but I'm not sure if 
those two are really unneeded for "non-sensor" video.

Speaking about enum_framesizes() and enum_frameintervals(), wouldn't it be 
better to provide a static array of data instead of a callback function ? That 
should be dealt with in another patch set of course.

> add a new .g_skip_top_lines() method and switch soc-camera to use it instead
> of .y_skip_top soc_camera_device member, which can now be removed.

BTW, the lines of "garbage" you get at the beginning of the image are actually 
probably meta-data (such as exposure settings). Maybe the g_skip_top_lines() 
operation could be renamed to something meta-data related. Applications could 
also be interested in getting the data.

-- 
Regards,

Laurent Pinchart

^ permalink raw reply	[flat|nested] 51+ messages in thread

* Re: [PATCH/RFC 7/9 v2] v4l: add an image-bus API for configuring v4l2 subdev pixel and frame formats
  2009-10-30 14:01 ` [PATCH/RFC 7/9 v2] v4l: add an image-bus API for configuring v4l2 subdev pixel and frame formats Guennadi Liakhovetski
  2009-11-05 15:41   ` Hans Verkuil
@ 2009-11-10 13:51   ` Laurent Pinchart
  2009-11-10 14:28     ` Guennadi Liakhovetski
  2009-11-11  7:55   ` Hans Verkuil
  2 siblings, 1 reply; 51+ messages in thread
From: Laurent Pinchart @ 2009-11-10 13:51 UTC (permalink / raw)
  To: Guennadi Liakhovetski
  Cc: Linux Media Mailing List, Hans Verkuil, Sakari Ailus,
	Muralidharan Karicheri

On Friday 30 October 2009 15:01:27 Guennadi Liakhovetski wrote:
> Video subdevices, like cameras, decoders, connect to video bridges over
> specialised busses. Data is being transferred over these busses in various
> formats, which only loosely correspond to fourcc codes, describing how
>  video data is stored in RAM. This is not a one-to-one correspondence,
>  therefore we cannot use fourcc codes to configure subdevice output data
>  formats. This patch adds codes for several such on-the-bus formats and an
>  API, similar to the familiar .s_fmt(), .g_fmt(), .try_fmt(), .enum_fmt()
>  API for configuring those codes. After all users of the old API in struct
>  v4l2_subdev_video_ops are converted, the API will be removed.

[snip]

> diff --git a/include/media/v4l2-subdev.h b/include/media/v4l2-subdev.h
> index 04193eb..1e86f39 100644
> --- a/include/media/v4l2-subdev.h
> +++ b/include/media/v4l2-subdev.h

[snip]

> @@ -206,6 +207,8 @@ struct v4l2_subdev_audio_ops {
> 
>     s_routing: see s_routing in audio_ops, except this version is for video
>  	devices.
> +
> +   enum_imgbus_fmt: enumerate pixel formats provided by a video data

Do we need to make that dynamic (and O(n)) or could we use a static array of 
image bus frame formats?

>  source */
>  struct v4l2_subdev_video_ops {
>  	int (*s_routing)(struct v4l2_subdev *sd, u32 input, u32 output, u32
>  config); @@ -227,6 +230,11 @@ struct v4l2_subdev_video_ops {
>  	int (*s_crop)(struct v4l2_subdev *sd, struct v4l2_crop *crop);
>  	int (*g_parm)(struct v4l2_subdev *sd, struct v4l2_streamparm *param);
>  	int (*s_parm)(struct v4l2_subdev *sd, struct v4l2_streamparm *param);
> +	int (*enum_imgbus_fmt)(struct v4l2_subdev *sd, int index,
> +			       enum v4l2_imgbus_pixelcode *code);
> +	int (*g_imgbus_fmt)(struct v4l2_subdev *sd, struct v4l2_imgbus_framefmt
>  *fmt);
> +	int (*try_imgbus_fmt)(struct v4l2_subdev *sd, struct v4l2_imgbus_framefmt
>  *fmt);
> +	int (*s_imgbus_fmt)(struct v4l2_subdev *sd, struct v4l2_imgbus_framefmt
> *fmt); };

Obviously those calls will need to be moved to pad operations later. They will 
be exposed to userspace through ioctls on the media controller device and/or 
the subdevices, so the v4l2_imgbus_pixelcode type shouldn't be an enum.

-- 
Regards,

Laurent Pinchart

^ permalink raw reply	[flat|nested] 51+ messages in thread

* Re: [PATCH/RFC 9/9 v2] mt9t031: make the use of the soc-camera client API optional
  2009-11-05 17:07               ` Karicheri, Muralidharan
@ 2009-11-10 13:54                 ` Laurent Pinchart
  2009-11-10 14:36                   ` Guennadi Liakhovetski
  0 siblings, 1 reply; 51+ messages in thread
From: Laurent Pinchart @ 2009-11-10 13:54 UTC (permalink / raw)
  To: Karicheri, Muralidharan
  Cc: Guennadi Liakhovetski, Hans Verkuil, Linux Media Mailing List,
	Sakari Ailus

Hi Guennadi,

On Thursday 05 November 2009 18:07:09 Karicheri, Muralidharan wrote:
> Guennadi,
> 
> >> in the v4l2_queryctrl struct.
> >
> >I think, this is unrelated. Muralidharan just complained about the
> >soc_camera_find_qctrl() function being used in client subdev drivers, that
> >were to be converted to v4l2-subdev, specifically, in mt9t031.c. And I
> >just explained, that that's just a pretty trivial library function, that
> >does not introduce any restrictions on how that subdev driver can be used
> >in non-soc-camera configurations, apart from the need to build and load
> >the soc-camera module. In other words, any v4l2-device bridge driver
> >should be able to communicate with such a subdev driver, calling that
> >function.
> 
> If soc_camera_find_qctrl() is such a generic function, why don't you
> move it to v4l2-common.c so that other platforms don't have to build
> the SoC camera subsystem to use this function? Your statement reinforces
> this.

I second this. Hans is working on a controls framework that should (hopefully 
:-)) make drivers simpler by handling common tasks in the v4l core.

Do you have any plan to work on the bus hardware configuration API ? When that 
will be available the mt9t031 driver could be made completely soc-camera-free.

-- 
Regards,

Laurent Pinchart


* Re: [PATCH 2/9] v4l: add new v4l2-subdev sensor operations, use g_skip_top_lines in soc-camera
  2009-11-10 12:55   ` Laurent Pinchart
@ 2009-11-10 14:11     ` Guennadi Liakhovetski
  0 siblings, 0 replies; 51+ messages in thread
From: Guennadi Liakhovetski @ 2009-11-10 14:11 UTC (permalink / raw)
  To: Laurent Pinchart
  Cc: Linux Media Mailing List, Hans Verkuil, Sakari Ailus,
	Muralidharan Karicheri

On Tue, 10 Nov 2009, Laurent Pinchart wrote:

> Hi Guennadi,
> 
> On Friday 30 October 2009 15:01:06 Guennadi Liakhovetski wrote:
> > Introduce new v4l2-subdev sensor operations, move .enum_framesizes() and
> > .enum_frameintervals() methods to it,
> 
> I understand that we need sensor-specific operations, but I'm not sure if 
> those two are really unneeded for "non-sensor" video.

I suspect that wasn't my idea:-) Ok, found:

http://thread.gmane.org/gmane.linux.drivers.video-input-infrastructure/8990/focus=9078

> Speaking about enum_framesizes() and enum_frameintervals(), wouldn't it be 
> better to provide a static array of data instead of a callback function ? That 
> should be dealt with in another patch set of course.

TBH, I don't understand why these methods are needed at all - why are the 
existing {S,G,TRY}_FMT calls not enough? So, obviously, this isn't a 
question for me either.

> > add a new .g_skip_top_lines() method and switch soc-camera to use it instead
> > of .y_skip_top soc_camera_device member, which can now be removed.
> 
> BTW, the lines of "garbage" you get at the beginning of the image is actually 
> probably meta-data (such as exposure settings). Maybe the g_skip_top_lines() 
> operation could be renamed to something meta-data related. Applications could 
> also be interested in getting the data.

Aha, that's interesting, thanks! Yes, we could easily rename it to 
.g_metadata_lines() or something like that.

Thanks
Guennadi
---
Guennadi Liakhovetski, Ph.D.
Freelance Open-Source Software Developer
http://www.open-technology.de/


* Re: [PATCH/RFC 7/9 v2] v4l: add an image-bus API for configuring v4l2 subdev pixel and frame formats
  2009-11-10 13:51   ` Laurent Pinchart
@ 2009-11-10 14:28     ` Guennadi Liakhovetski
  0 siblings, 0 replies; 51+ messages in thread
From: Guennadi Liakhovetski @ 2009-11-10 14:28 UTC (permalink / raw)
  To: Laurent Pinchart
  Cc: Linux Media Mailing List, Hans Verkuil, Sakari Ailus,
	Muralidharan Karicheri

On Tue, 10 Nov 2009, Laurent Pinchart wrote:

> On Friday 30 October 2009 15:01:27 Guennadi Liakhovetski wrote:
> > Video subdevices, like cameras, decoders, connect to video bridges over
> > specialised busses. Data is being transferred over these busses in various
> > formats, which only loosely correspond to fourcc codes, describing how
> >  video data is stored in RAM. This is not a one-to-one correspondence,
> >  therefore we cannot use fourcc codes to configure subdevice output data
> >  formats. This patch adds codes for several such on-the-bus formats and an
> >  API, similar to the familiar .s_fmt(), .g_fmt(), .try_fmt(), .enum_fmt()
> >  API for configuring those codes. After all users of the old API in struct
> >  v4l2_subdev_video_ops are converted, the API will be removed.
> 
> [snip]
> 
> > diff --git a/include/media/v4l2-subdev.h b/include/media/v4l2-subdev.h
> > index 04193eb..1e86f39 100644
> > --- a/include/media/v4l2-subdev.h
> > +++ b/include/media/v4l2-subdev.h
> 
> [snip]
> 
> > @@ -206,6 +207,8 @@ struct v4l2_subdev_audio_ops {
> > 
> >     s_routing: see s_routing in audio_ops, except this version is for video
> >  	devices.
> > +
> > +   enum_imgbus_fmt: enumerate pixel formats provided by a video data
> 
> Do we need to make that dynamic (and O(n)) or could we use a static array of 
> image bus frame formats?

The current soc-camera uses a static array. It works for its users, but I 
do not know about others.

> >  source */
> >  struct v4l2_subdev_video_ops {
> >  	int (*s_routing)(struct v4l2_subdev *sd, u32 input, u32 output, u32
> >  config); @@ -227,6 +230,11 @@ struct v4l2_subdev_video_ops {
> >  	int (*s_crop)(struct v4l2_subdev *sd, struct v4l2_crop *crop);
> >  	int (*g_parm)(struct v4l2_subdev *sd, struct v4l2_streamparm *param);
> >  	int (*s_parm)(struct v4l2_subdev *sd, struct v4l2_streamparm *param);
> > +	int (*enum_imgbus_fmt)(struct v4l2_subdev *sd, int index,
> > +			       enum v4l2_imgbus_pixelcode *code);
> > +	int (*g_imgbus_fmt)(struct v4l2_subdev *sd, struct v4l2_imgbus_framefmt
> >  *fmt);
> > +	int (*try_imgbus_fmt)(struct v4l2_subdev *sd, struct v4l2_imgbus_framefmt
> >  *fmt);
> > +	int (*s_imgbus_fmt)(struct v4l2_subdev *sd, struct v4l2_imgbus_framefmt
> > *fmt); };
> 
> Obviously those calls will need to be moved to pad operations later.

Right.

> They will 
> be exposed to userspace through ioctls on the media controller device and/or 
> the subdevices, so the v4l2_imgbus_pixelcode type shouldn't be an enum.

Ok, will use __u32 for it then just as all other enum types...
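
For illustration (an editorial userspace sketch, not code from the patch; all 
names here are hypothetical): the reason an enum is unsuitable in a userspace 
ABI is that the C standard leaves an enum's storage size implementation-defined, 
while a fixed-width __u32 field keeps the struct layout stable across the ioctl 
boundary.

```c
#include <assert.h>
#include <stdint.h>

/* Hypothetical codes: plain macros keep the ABI field a fixed 32-bit
 * integer, unlike an enum whose underlying type the compiler may choose. */
#define IMGBUS_FMT_FIXED 1u
#define IMGBUS_FMT_YUYV  2u

/* Userspace mock of the frame-format struct with a __u32-style code field */
struct imgbus_framefmt {
	uint32_t width;
	uint32_t height;
	uint32_t code;   /* holds an IMGBUS_FMT_* value */
	uint32_t field;
};
```

With four uint32_t members the struct is exactly 16 bytes on any conforming 
compiler, which is the property the ABI needs.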

Thanks
Guennadi
---
Guennadi Liakhovetski, Ph.D.
Freelance Open-Source Software Developer
http://www.open-technology.de/


* Re: [PATCH/RFC 9/9 v2] mt9t031: make the use of the soc-camera client API optional
  2009-11-10 13:54                 ` Laurent Pinchart
@ 2009-11-10 14:36                   ` Guennadi Liakhovetski
  0 siblings, 0 replies; 51+ messages in thread
From: Guennadi Liakhovetski @ 2009-11-10 14:36 UTC (permalink / raw)
  To: Laurent Pinchart
  Cc: Karicheri, Muralidharan, Hans Verkuil, Linux Media Mailing List,
	Sakari Ailus

On Tue, 10 Nov 2009, Laurent Pinchart wrote:

> Hi Guennadi,
> 
> On Thursday 05 November 2009 18:07:09 Karicheri, Muralidharan wrote:
> > Guennadi,
> > 
> > >> in the v4l2_queryctrl struct.
> > >
> > >I think, this is unrelated. Muralidharan just complained about the
> > >soc_camera_find_qctrl() function being used in client subdev drivers, that
> > >were to be converted to v4l2-subdev, specifically, in mt9t031.c. And I
> > >just explained, that that's just a pretty trivial library function, that
> > >does not introduce any restrictions on how that subdev driver can be used
> > >in non-soc-camera configurations, apart from the need to build and load
> > >the soc-camera module. In other words, any v4l2-device bridge driver
> > >should be able to communicate with such a subdev driver, calling that
> > >function.
> > 
> > If soc_camera_find_qctrl() is such a generic function, why don't you
> > move it to v4l2-common.c so that other platforms don't have to build
> > the SoC camera subsystem to use this function? Your statement reinforces
> > this.
> 
> I second this. Hans is working on a controls framework that should (hopefully 
> :-)) make drivers simpler by handling common tasks in the v4l core.

Well, if you look at the function itself and at how it got replaced in 
this version of the patch by O(1) operations, you'll probably agree 
that avoiding that function where possible is better than making it 
generic. But if there are any legitimate users for it - sure, we can make 
it generic too.

> Do you have any plan to work on the bus hardware configuration API ? When that 
> will be available the mt9t031 driver could be made completely soc-camera-free.

I'd love to first push the proposed image-bus API upstream. Even with just 
that, many drivers can already be re-used. As for bus configuration, I 
thought there were enough people working on it already :-) If not, maybe I 
could have a look at it, but we had better reach some agreement on that 
beforehand to avoid duplicating the effort.

Thanks
Guennadi
---
Guennadi Liakhovetski, Ph.D.
Freelance Open-Source Software Developer
http://www.open-technology.de/


* Re: [PATCH/RFC 7/9 v2] v4l: add an image-bus API for configuring v4l2 subdev pixel and frame formats
  2009-10-30 14:01 ` [PATCH/RFC 7/9 v2] v4l: add an image-bus API for configuring v4l2 subdev pixel and frame formats Guennadi Liakhovetski
  2009-11-05 15:41   ` Hans Verkuil
  2009-11-10 13:51   ` Laurent Pinchart
@ 2009-11-11  7:55   ` Hans Verkuil
  2009-11-12  8:08     ` Guennadi Liakhovetski
  2 siblings, 1 reply; 51+ messages in thread
From: Hans Verkuil @ 2009-11-11  7:55 UTC (permalink / raw)
  To: Guennadi Liakhovetski
  Cc: Linux Media Mailing List, Laurent Pinchart, Sakari Ailus,
	Muralidharan Karicheri

Hi Guennadi,

OK, I've looked at this again. See my comments below.

On Friday 30 October 2009 15:01:27 Guennadi Liakhovetski wrote:
> Video subdevices, like cameras, decoders, connect to video bridges over
> specialised busses. Data is being transferred over these busses in various
> formats, which only loosely correspond to fourcc codes, describing how video
> data is stored in RAM. This is not a one-to-one correspondence, therefore we
> cannot use fourcc codes to configure subdevice output data formats. This patch
> adds codes for several such on-the-bus formats and an API, similar to the
> familiar .s_fmt(), .g_fmt(), .try_fmt(), .enum_fmt() API for configuring those
> codes. After all users of the old API in struct v4l2_subdev_video_ops are
> converted, the API will be removed.
> 
> Signed-off-by: Guennadi Liakhovetski <g.liakhovetski@gmx.de>
> ---
>  drivers/media/video/Makefile        |    2 +-
>  drivers/media/video/v4l2-imagebus.c |  218 +++++++++++++++++++++++++++++++++++
>  include/media/v4l2-imagebus.h       |   84 ++++++++++++++
>  include/media/v4l2-subdev.h         |   10 ++-
>  4 files changed, 312 insertions(+), 2 deletions(-)
>  create mode 100644 drivers/media/video/v4l2-imagebus.c
>  create mode 100644 include/media/v4l2-imagebus.h
> 
> diff --git a/drivers/media/video/Makefile b/drivers/media/video/Makefile
> index 7a2dcc3..62d8907 100644
> --- a/drivers/media/video/Makefile
> +++ b/drivers/media/video/Makefile
> @@ -10,7 +10,7 @@ stkwebcam-objs	:=	stk-webcam.o stk-sensor.o
>  
>  omap2cam-objs	:=	omap24xxcam.o omap24xxcam-dma.o
>  
> -videodev-objs	:=	v4l2-dev.o v4l2-ioctl.o v4l2-device.o
> +videodev-objs	:=	v4l2-dev.o v4l2-ioctl.o v4l2-device.o v4l2-imagebus.o
>  
>  # V4L2 core modules
>  
> diff --git a/drivers/media/video/v4l2-imagebus.c b/drivers/media/video/v4l2-imagebus.c
> new file mode 100644
> index 0000000..e0a3a83
> --- /dev/null
> +++ b/drivers/media/video/v4l2-imagebus.c
> @@ -0,0 +1,218 @@
> +/*
> + * Image Bus API
> + *
> + * Copyright (C) 2009, Guennadi Liakhovetski <g.liakhovetski@gmx.de>
> + *
> + * This program is free software; you can redistribute it and/or modify
> + * it under the terms of the GNU General Public License version 2 as
> + * published by the Free Software Foundation.
> + */
> +
> +#include <linux/kernel.h>
> +#include <linux/module.h>
> +
> +#include <media/v4l2-device.h>
> +#include <media/v4l2-imagebus.h>
> +
> +static const struct v4l2_imgbus_pixelfmt imgbus_fmt[] = {
> +	[V4L2_IMGBUS_FMT_YUYV] = {
> +		.fourcc			= V4L2_PIX_FMT_YUYV,
> +		.colorspace		= V4L2_COLORSPACE_JPEG,
> +		.name			= "YUYV",
> +		.bits_per_sample	= 8,
> +		.packing		= V4L2_IMGBUS_PACKING_2X8_PADHI,
> +		.order			= V4L2_IMGBUS_ORDER_LE,
> +	}, [V4L2_IMGBUS_FMT_YVYU] = {
> +		.fourcc			= V4L2_PIX_FMT_YVYU,
> +		.colorspace		= V4L2_COLORSPACE_JPEG,
> +		.name			= "YVYU",
> +		.bits_per_sample	= 8,
> +		.packing		= V4L2_IMGBUS_PACKING_2X8_PADHI,
> +		.order			= V4L2_IMGBUS_ORDER_LE,
> +	}, [V4L2_IMGBUS_FMT_UYVY] = {
> +		.fourcc			= V4L2_PIX_FMT_UYVY,
> +		.colorspace		= V4L2_COLORSPACE_JPEG,
> +		.name			= "UYVY",
> +		.bits_per_sample	= 8,
> +		.packing		= V4L2_IMGBUS_PACKING_2X8_PADHI,
> +		.order			= V4L2_IMGBUS_ORDER_LE,
> +	}, [V4L2_IMGBUS_FMT_VYUY] = {
> +		.fourcc			= V4L2_PIX_FMT_VYUY,
> +		.colorspace		= V4L2_COLORSPACE_JPEG,
> +		.name			= "VYUY",
> +		.bits_per_sample	= 8,
> +		.packing		= V4L2_IMGBUS_PACKING_2X8_PADHI,
> +		.order			= V4L2_IMGBUS_ORDER_LE,
> +	}, [V4L2_IMGBUS_FMT_VYUY_SMPTE170M_8] = {
> +		.fourcc			= V4L2_PIX_FMT_VYUY,
> +		.colorspace		= V4L2_COLORSPACE_SMPTE170M,
> +		.name			= "VYUY in SMPTE170M",
> +		.bits_per_sample	= 8,
> +		.packing		= V4L2_IMGBUS_PACKING_2X8_PADHI,
> +		.order			= V4L2_IMGBUS_ORDER_LE,
> +	}, [V4L2_IMGBUS_FMT_VYUY_SMPTE170M_16] = {
> +		.fourcc			= V4L2_PIX_FMT_VYUY,
> +		.colorspace		= V4L2_COLORSPACE_SMPTE170M,
> +		.name			= "VYUY in SMPTE170M, 16bit",
> +		.bits_per_sample	= 16,
> +		.packing		= V4L2_IMGBUS_PACKING_NONE,
> +		.order			= V4L2_IMGBUS_ORDER_LE,
> +	}, [V4L2_IMGBUS_FMT_RGB555] = {
> +		.fourcc			= V4L2_PIX_FMT_RGB555,
> +		.colorspace		= V4L2_COLORSPACE_SRGB,
> +		.name			= "RGB555",
> +		.bits_per_sample	= 8,
> +		.packing		= V4L2_IMGBUS_PACKING_2X8_PADHI,
> +		.order			= V4L2_IMGBUS_ORDER_LE,
> +	}, [V4L2_IMGBUS_FMT_RGB555X] = {
> +		.fourcc			= V4L2_PIX_FMT_RGB555X,
> +		.colorspace		= V4L2_COLORSPACE_SRGB,
> +		.name			= "RGB555X",
> +		.bits_per_sample	= 8,
> +		.packing		= V4L2_IMGBUS_PACKING_2X8_PADHI,
> +		.order			= V4L2_IMGBUS_ORDER_LE,
> +	}, [V4L2_IMGBUS_FMT_RGB565] = {
> +		.fourcc			= V4L2_PIX_FMT_RGB565,
> +		.colorspace		= V4L2_COLORSPACE_SRGB,
> +		.name			= "RGB565",
> +		.bits_per_sample	= 8,
> +		.packing		= V4L2_IMGBUS_PACKING_2X8_PADHI,
> +		.order			= V4L2_IMGBUS_ORDER_LE,
> +	}, [V4L2_IMGBUS_FMT_RGB565X] = {
> +		.fourcc			= V4L2_PIX_FMT_RGB565X,
> +		.colorspace		= V4L2_COLORSPACE_SRGB,
> +		.name			= "RGB565X",
> +		.bits_per_sample	= 8,
> +		.packing		= V4L2_IMGBUS_PACKING_2X8_PADHI,
> +		.order			= V4L2_IMGBUS_ORDER_LE,
> +	}, [V4L2_IMGBUS_FMT_SBGGR8] = {
> +		.fourcc			= V4L2_PIX_FMT_SBGGR8,
> +		.colorspace		= V4L2_COLORSPACE_SRGB,
> +		.name			= "Bayer 8 BGGR",
> +		.bits_per_sample	= 8,
> +		.packing		= V4L2_IMGBUS_PACKING_NONE,
> +		.order			= V4L2_IMGBUS_ORDER_LE,
> +	}, [V4L2_IMGBUS_FMT_SGBRG8] = {
> +		.fourcc			= V4L2_PIX_FMT_SGBRG8,
> +		.colorspace		= V4L2_COLORSPACE_SRGB,
> +		.name			= "Bayer 8 GBRG",
> +		.bits_per_sample	= 8,
> +		.packing		= V4L2_IMGBUS_PACKING_NONE,
> +		.order			= V4L2_IMGBUS_ORDER_LE,
> +	}, [V4L2_IMGBUS_FMT_SGRBG8] = {
> +		.fourcc			= V4L2_PIX_FMT_SGRBG8,
> +		.colorspace		= V4L2_COLORSPACE_SRGB,
> +		.name			= "Bayer 8 GRBG",
> +		.bits_per_sample	= 8,
> +		.packing		= V4L2_IMGBUS_PACKING_NONE,
> +		.order			= V4L2_IMGBUS_ORDER_LE,
> +	}, [V4L2_IMGBUS_FMT_SRGGB8] = {
> +		.fourcc			= V4L2_PIX_FMT_SRGGB8,
> +		.colorspace		= V4L2_COLORSPACE_SRGB,
> +		.name			= "Bayer 8 RGGB",
> +		.bits_per_sample	= 8,
> +		.packing		= V4L2_IMGBUS_PACKING_NONE,
> +		.order			= V4L2_IMGBUS_ORDER_LE,
> +	}, [V4L2_IMGBUS_FMT_SBGGR10] = {
> +		.fourcc			= V4L2_PIX_FMT_SBGGR10,
> +		.colorspace		= V4L2_COLORSPACE_SRGB,
> +		.name			= "Bayer 10 BGGR",
> +		.bits_per_sample	= 10,
> +		.packing		= V4L2_IMGBUS_PACKING_EXTEND16,
> +		.order			= V4L2_IMGBUS_ORDER_LE,
> +	}, [V4L2_IMGBUS_FMT_SGBRG10] = {
> +		.fourcc			= V4L2_PIX_FMT_SGBRG10,
> +		.colorspace		= V4L2_COLORSPACE_SRGB,
> +		.name			= "Bayer 10 GBRG",
> +		.bits_per_sample	= 10,
> +		.packing		= V4L2_IMGBUS_PACKING_EXTEND16,
> +		.order			= V4L2_IMGBUS_ORDER_LE,
> +	}, [V4L2_IMGBUS_FMT_SGRBG10] = {
> +		.fourcc			= V4L2_PIX_FMT_SGRBG10,
> +		.colorspace		= V4L2_COLORSPACE_SRGB,
> +		.name			= "Bayer 10 GRBG",
> +		.bits_per_sample	= 10,
> +		.packing		= V4L2_IMGBUS_PACKING_EXTEND16,
> +		.order			= V4L2_IMGBUS_ORDER_LE,
> +	}, [V4L2_IMGBUS_FMT_SRGGB10] = {
> +		.fourcc			= V4L2_PIX_FMT_SRGGB10,
> +		.colorspace		= V4L2_COLORSPACE_SRGB,
> +		.name			= "Bayer 10 RGGB",
> +		.bits_per_sample	= 10,
> +		.packing		= V4L2_IMGBUS_PACKING_EXTEND16,
> +		.order			= V4L2_IMGBUS_ORDER_LE,
> +	}, [V4L2_IMGBUS_FMT_GREY] = {
> +		.fourcc			= V4L2_PIX_FMT_GREY,
> +		.colorspace		= V4L2_COLORSPACE_JPEG,
> +		.name			= "Grey",
> +		.bits_per_sample	= 8,
> +		.packing		= V4L2_IMGBUS_PACKING_NONE,
> +		.order			= V4L2_IMGBUS_ORDER_LE,
> +	}, [V4L2_IMGBUS_FMT_Y16] = {
> +		.fourcc			= V4L2_PIX_FMT_Y16,
> +		.colorspace		= V4L2_COLORSPACE_JPEG,
> +		.name			= "Grey 16bit",
> +		.bits_per_sample	= 16,
> +		.packing		= V4L2_IMGBUS_PACKING_NONE,
> +		.order			= V4L2_IMGBUS_ORDER_LE,
> +	}, [V4L2_IMGBUS_FMT_Y10] = {
> +		.fourcc			= V4L2_PIX_FMT_Y10,
> +		.colorspace		= V4L2_COLORSPACE_JPEG,
> +		.name			= "Grey 10bit",
> +		.bits_per_sample	= 10,
> +		.packing		= V4L2_IMGBUS_PACKING_EXTEND16,
> +		.order			= V4L2_IMGBUS_ORDER_LE,
> +	}, [V4L2_IMGBUS_FMT_SBGGR10_2X8_PADHI_LE] = {
> +		.fourcc			= V4L2_PIX_FMT_SBGGR10,
> +		.colorspace		= V4L2_COLORSPACE_SRGB,
> +		.name			= "Bayer 10 BGGR",
> +		.bits_per_sample	= 8,
> +		.packing		= V4L2_IMGBUS_PACKING_2X8_PADHI,
> +		.order			= V4L2_IMGBUS_ORDER_LE,
> +	}, [V4L2_IMGBUS_FMT_SBGGR10_2X8_PADLO_LE] = {
> +		.fourcc			= V4L2_PIX_FMT_SBGGR10,
> +		.colorspace		= V4L2_COLORSPACE_SRGB,
> +		.name			= "Bayer 10 BGGR",
> +		.bits_per_sample	= 8,
> +		.packing		= V4L2_IMGBUS_PACKING_2X8_PADLO,
> +		.order			= V4L2_IMGBUS_ORDER_LE,
> +	}, [V4L2_IMGBUS_FMT_SBGGR10_2X8_PADHI_BE] = {
> +		.fourcc			= V4L2_PIX_FMT_SBGGR10,
> +		.colorspace		= V4L2_COLORSPACE_SRGB,
> +		.name			= "Bayer 10 BGGR",
> +		.bits_per_sample	= 8,
> +		.packing		= V4L2_IMGBUS_PACKING_2X8_PADHI,
> +		.order			= V4L2_IMGBUS_ORDER_BE,
> +	}, [V4L2_IMGBUS_FMT_SBGGR10_2X8_PADLO_BE] = {
> +		.fourcc			= V4L2_PIX_FMT_SBGGR10,
> +		.colorspace		= V4L2_COLORSPACE_SRGB,
> +		.name			= "Bayer 10 BGGR",
> +		.bits_per_sample	= 8,
> +		.packing		= V4L2_IMGBUS_PACKING_2X8_PADLO,
> +		.order			= V4L2_IMGBUS_ORDER_BE,
> +	},
> +};
> +
> +const struct v4l2_imgbus_pixelfmt *v4l2_imgbus_get_fmtdesc(
> +	enum v4l2_imgbus_pixelcode code)
> +{
> +	if ((unsigned int)code >= ARRAY_SIZE(imgbus_fmt))
> +		return NULL;
> +	return imgbus_fmt + code;
> +}
> +EXPORT_SYMBOL(v4l2_imgbus_get_fmtdesc);
> +
> +s32 v4l2_imgbus_bytes_per_line(u32 width,
> +			       const struct v4l2_imgbus_pixelfmt *imgf)
> +{
> +	switch (imgf->packing) {
> +	case V4L2_IMGBUS_PACKING_NONE:
> +		return width * imgf->bits_per_sample / 8;
> +	case V4L2_IMGBUS_PACKING_2X8_PADHI:
> +	case V4L2_IMGBUS_PACKING_2X8_PADLO:
> +	case V4L2_IMGBUS_PACKING_EXTEND16:
> +		return width * 2;
> +	}
> +	return -EINVAL;
> +}
> +EXPORT_SYMBOL(v4l2_imgbus_bytes_per_line);

As you know, I am not convinced that this code belongs in the core. I do not
think this translation from IMGBUS to PIXFMT is generic enough. However, if
you just make this part of soc-camera then I am OK with this.
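
For reference, the packing-to-stride mapping in the quoted helper can be 
exercised as a small userspace sketch (mock enum and function names mirroring 
the patch; this is an editorial illustration, not the kernel code):

```c
#include <assert.h>
#include <errno.h>

/* Userspace mock of the patch's packing enum */
enum imgbus_packing {
	PACKING_NONE,
	PACKING_2X8_PADHI,
	PACKING_2X8_PADLO,
	PACKING_EXTEND16,
};

/* Mirrors v4l2_imgbus_bytes_per_line(): unpacked samples occupy
 * bits_per_sample/8 bytes each; all the packed/extended variants
 * occupy a full 16-bit slot per sample. */
static long bytes_per_line(unsigned int width, unsigned int bits_per_sample,
			   enum imgbus_packing packing)
{
	switch (packing) {
	case PACKING_NONE:
		return (long)width * bits_per_sample / 8;
	case PACKING_2X8_PADHI:
	case PACKING_2X8_PADLO:
	case PACKING_EXTEND16:
		return (long)width * 2;
	}
	return -EINVAL;
}
```

So a 640-pixel 10-bit Bayer line with EXTEND16 packing needs 1280 bytes of 
memory per line, the same as 16-bit unpacked data of the same width.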

> diff --git a/include/media/v4l2-imagebus.h b/include/media/v4l2-imagebus.h
> new file mode 100644
> index 0000000..022d044
> --- /dev/null
> +++ b/include/media/v4l2-imagebus.h
> @@ -0,0 +1,84 @@
> +/*
> + * Image Bus API header
> + *
> + * Copyright (C) 2009, Guennadi Liakhovetski <g.liakhovetski@gmx.de>
> + *
> + * This program is free software; you can redistribute it and/or modify
> + * it under the terms of the GNU General Public License version 2 as
> + * published by the Free Software Foundation.
> + */
> +
> +#ifndef V4L2_IMGBUS_H
> +#define V4L2_IMGBUS_H
> +
> +enum v4l2_imgbus_packing {
> +	V4L2_IMGBUS_PACKING_NONE,
> +	V4L2_IMGBUS_PACKING_2X8_PADHI,
> +	V4L2_IMGBUS_PACKING_2X8_PADLO,
> +	V4L2_IMGBUS_PACKING_EXTEND16,
> +};
> +
> +enum v4l2_imgbus_order {
> +	V4L2_IMGBUS_ORDER_LE,
> +	V4L2_IMGBUS_ORDER_BE,
> +};

For the same reason I think these enums should be soc-camera internal.

> +
> +enum v4l2_imgbus_pixelcode {

It's probably a good idea to start with something like:

	V4L2_IMGBUS_FMT_FIXED = 1,

since many video encoders/decoders have a fixed bus format, so in those cases
there is nothing to set up.

I also like to leave value 0 free; that way it can be used as a special value
internally (or as a sentinel in an imgbus array as suggested below).

One other comment to throw into the pot: what about calling this just
V4L2_BUS_FMT...? So imgbus becomes just bus. For some reason I find imgbus a
bit odd. Probably because I think of it more as a video bus or even as a more
general data bus. For all I know it might be used in the future to choose
between different types of histogram data or something like that.

Or is this just me?

> +	V4L2_IMGBUS_FMT_YUYV,
> +	V4L2_IMGBUS_FMT_YVYU,
> +	V4L2_IMGBUS_FMT_UYVY,
> +	V4L2_IMGBUS_FMT_VYUY,
> +	V4L2_IMGBUS_FMT_VYUY_SMPTE170M_8,
> +	V4L2_IMGBUS_FMT_VYUY_SMPTE170M_16,
> +	V4L2_IMGBUS_FMT_RGB555,
> +	V4L2_IMGBUS_FMT_RGB555X,
> +	V4L2_IMGBUS_FMT_RGB565,
> +	V4L2_IMGBUS_FMT_RGB565X,
> +	V4L2_IMGBUS_FMT_SBGGR8,
> +	V4L2_IMGBUS_FMT_SGBRG8,
> +	V4L2_IMGBUS_FMT_SGRBG8,
> +	V4L2_IMGBUS_FMT_SRGGB8,
> +	V4L2_IMGBUS_FMT_SBGGR10,
> +	V4L2_IMGBUS_FMT_SGBRG10,
> +	V4L2_IMGBUS_FMT_SGRBG10,
> +	V4L2_IMGBUS_FMT_SRGGB10,
> +	V4L2_IMGBUS_FMT_GREY,
> +	V4L2_IMGBUS_FMT_Y16,
> +	V4L2_IMGBUS_FMT_Y10,
> +	V4L2_IMGBUS_FMT_SBGGR10_2X8_PADHI_BE,
> +	V4L2_IMGBUS_FMT_SBGGR10_2X8_PADLO_BE,
> +	V4L2_IMGBUS_FMT_SBGGR10_2X8_PADHI_LE,
> +	V4L2_IMGBUS_FMT_SBGGR10_2X8_PADLO_LE,

Obviously the meaning of these formats needs to be documented in this header
as well. Are all these imgbus formats used? Anything that is not used shouldn't
be in this list IMHO.

> +};
> +
> +/**
> + * struct v4l2_imgbus_pixelfmt - Data format on the image bus
> + * @fourcc:		Fourcc code...
> + * @colorspace:		and colorspace, that will be obtained if the data is
> + *			stored in memory in the following way:
> + * @bits_per_sample:	How many bits the bridge has to sample
> + * @packing:		Type of sample-packing, that has to be used
> + * @order:		Sample order when storing in memory
> + */
> +struct v4l2_imgbus_pixelfmt {
> +	u32				fourcc;
> +	enum v4l2_colorspace		colorspace;
> +	const char			*name;
> +	enum v4l2_imgbus_packing	packing;
> +	enum v4l2_imgbus_order		order;
> +	u8				bits_per_sample;
> +};

Ditto for this struct. Note that the colorspace field should be moved to
imgbus_framefmt.

> +
> +struct v4l2_imgbus_framefmt {
> +	__u32				width;
> +	__u32				height;
> +	enum v4l2_imgbus_pixelcode	code;
> +	enum v4l2_field			field;
> +};

Interesting observation: this struct is almost identical to struct
v4l2_pix_format. Frankly, I really wonder whether we shouldn't reuse that
struct. In many cases (mostly for video encoders/decoders) the VIDIOC_S_FMT
ioctl and friends can just pass the fmt.pix pointer directly to the subdev.

So keeping this struct will make life easier. The only thing we have to make
a note of in the subdev header is that the pixelformat will be interpreted
as an imgbus 'pixelformat' instead.

Note that the current g/s/try_fmt subdev functions receive a struct v4l2_format
pointer. I think that can be replaced by struct v4l2_pix_format. I don't think
that there is any subdev driver that needs anything other than that struct. That
would definitely simplify the driver code.

Regarding the enum_imgbus_fmt: what about just adding a 'const u32 *imgbus_fmts'
field to v4l2_subdev? Or do you think that this might be something that cannot
be const? I.e., that the subdev driver needs to modify the list of available fmts
dynamically?
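
A const, zero-terminated format list of the kind suggested above could look 
roughly like this (an editorial sketch; the array contents and function name 
are hypothetical, and value 0 serves as the sentinel, which is one reason to 
start the real codes at 1):

```c
#include <assert.h>

/* Hypothetical per-subdev list of supported imgbus codes, 0-terminated */
const unsigned int sensor_imgbus_fmts[] = {
	5,	/* e.g. some V4L2_IMGBUS_FMT_* code */
	12,	/* another supported code */
	0	/* sentinel */
};

/* Walks the list like enum_imgbus_fmt() would; returns -1 when the
 * requested index runs past the sentinel. */
int enum_fmt(const unsigned int *fmts, int index, unsigned int *code)
{
	int i;

	for (i = 0; i <= index; i++)
		if (!fmts[i])
			return -1;	/* index out of range */
	*code = fmts[index];
	return 0;
}
```

The bridge can then enumerate formats without a callback into the subdev, and 
a const array keeps the data in rodata.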

Regards,

	Hans

> +
> +const struct v4l2_imgbus_pixelfmt *v4l2_imgbus_get_fmtdesc(
> +	enum v4l2_imgbus_pixelcode code);
> +s32 v4l2_imgbus_bytes_per_line(u32 width,
> +			       const struct v4l2_imgbus_pixelfmt *imgf);
> +
> +#endif
> diff --git a/include/media/v4l2-subdev.h b/include/media/v4l2-subdev.h
> index 04193eb..1e86f39 100644
> --- a/include/media/v4l2-subdev.h
> +++ b/include/media/v4l2-subdev.h
> @@ -22,6 +22,7 @@
>  #define _V4L2_SUBDEV_H
>  
>  #include <media/v4l2-common.h>
> +#include <media/v4l2-imagebus.h>
>  
>  struct v4l2_device;
>  struct v4l2_subdev;
> @@ -196,7 +197,7 @@ struct v4l2_subdev_audio_ops {
>     s_std_output: set v4l2_std_id for video OUTPUT devices. This is ignored by
>  	video input devices.
>  
> -  s_crystal_freq: sets the frequency of the crystal used to generate the
> +   s_crystal_freq: sets the frequency of the crystal used to generate the
>  	clocks in Hz. An extra flags field allows device specific configuration
>  	regarding clock frequency dividers, etc. If not used, then set flags
>  	to 0. If the frequency is not supported, then -EINVAL is returned.
> @@ -206,6 +207,8 @@ struct v4l2_subdev_audio_ops {
>  
>     s_routing: see s_routing in audio_ops, except this version is for video
>  	devices.
> +
> +   enum_imgbus_fmt: enumerate pixel formats provided by a video data source
>   */
>  struct v4l2_subdev_video_ops {
>  	int (*s_routing)(struct v4l2_subdev *sd, u32 input, u32 output, u32 config);
> @@ -227,6 +230,11 @@ struct v4l2_subdev_video_ops {
>  	int (*s_crop)(struct v4l2_subdev *sd, struct v4l2_crop *crop);
>  	int (*g_parm)(struct v4l2_subdev *sd, struct v4l2_streamparm *param);
>  	int (*s_parm)(struct v4l2_subdev *sd, struct v4l2_streamparm *param);
> +	int (*enum_imgbus_fmt)(struct v4l2_subdev *sd, int index,
> +			       enum v4l2_imgbus_pixelcode *code);
> +	int (*g_imgbus_fmt)(struct v4l2_subdev *sd, struct v4l2_imgbus_framefmt *fmt);
> +	int (*try_imgbus_fmt)(struct v4l2_subdev *sd, struct v4l2_imgbus_framefmt *fmt);
> +	int (*s_imgbus_fmt)(struct v4l2_subdev *sd, struct v4l2_imgbus_framefmt *fmt);
>  };
>  
>  /**



-- 
Hans Verkuil - video4linux developer - sponsored by TANDBERG Telecom


* Re: [PATCH/RFC 7/9 v2] v4l: add an image-bus API for configuring v4l2 subdev pixel and frame formats
  2009-11-11  7:55   ` Hans Verkuil
@ 2009-11-12  8:08     ` Guennadi Liakhovetski
  2009-11-15 16:23       ` Hans Verkuil
  0 siblings, 1 reply; 51+ messages in thread
From: Guennadi Liakhovetski @ 2009-11-12  8:08 UTC (permalink / raw)
  To: Hans Verkuil
  Cc: Linux Media Mailing List, Laurent Pinchart, Sakari Ailus,
	Muralidharan Karicheri

Hi Hans

Thanks for the review

On Wed, 11 Nov 2009, Hans Verkuil wrote:

> Hi Guennadi,
> 
> OK, I've looked at this again. See my comments below.
> 
> On Friday 30 October 2009 15:01:27 Guennadi Liakhovetski wrote:
> > Video subdevices, like cameras, decoders, connect to video bridges over
> > specialised busses. Data is being transferred over these busses in various
> > formats, which only loosely correspond to fourcc codes, describing how video
> > data is stored in RAM. This is not a one-to-one correspondence, therefore we
> > cannot use fourcc codes to configure subdevice output data formats. This patch
> > adds codes for several such on-the-bus formats and an API, similar to the
> > familiar .s_fmt(), .g_fmt(), .try_fmt(), .enum_fmt() API for configuring those
> > codes. After all users of the old API in struct v4l2_subdev_video_ops are
> > converted, the API will be removed.
> > 
> > Signed-off-by: Guennadi Liakhovetski <g.liakhovetski@gmx.de>
> > ---
> >  drivers/media/video/Makefile        |    2 +-
> >  drivers/media/video/v4l2-imagebus.c |  218 +++++++++++++++++++++++++++++++++++
> >  include/media/v4l2-imagebus.h       |   84 ++++++++++++++
> >  include/media/v4l2-subdev.h         |   10 ++-
> >  4 files changed, 312 insertions(+), 2 deletions(-)
> >  create mode 100644 drivers/media/video/v4l2-imagebus.c
> >  create mode 100644 include/media/v4l2-imagebus.h
> > 
> > diff --git a/drivers/media/video/Makefile b/drivers/media/video/Makefile
> > index 7a2dcc3..62d8907 100644
> > --- a/drivers/media/video/Makefile
> > +++ b/drivers/media/video/Makefile
> > @@ -10,7 +10,7 @@ stkwebcam-objs	:=	stk-webcam.o stk-sensor.o
> >  
> >  omap2cam-objs	:=	omap24xxcam.o omap24xxcam-dma.o
> >  
> > -videodev-objs	:=	v4l2-dev.o v4l2-ioctl.o v4l2-device.o
> > +videodev-objs	:=	v4l2-dev.o v4l2-ioctl.o v4l2-device.o v4l2-imagebus.o
> >  
> >  # V4L2 core modules
> >  
> > diff --git a/drivers/media/video/v4l2-imagebus.c b/drivers/media/video/v4l2-imagebus.c
> > new file mode 100644
> > index 0000000..e0a3a83
> > --- /dev/null
> > +++ b/drivers/media/video/v4l2-imagebus.c
> > @@ -0,0 +1,218 @@
> > +/*
> > + * Image Bus API
> > + *
> > + * Copyright (C) 2009, Guennadi Liakhovetski <g.liakhovetski@gmx.de>
> > + *
> > + * This program is free software; you can redistribute it and/or modify
> > + * it under the terms of the GNU General Public License version 2 as
> > + * published by the Free Software Foundation.
> > + */
> > +
> > +#include <linux/kernel.h>
> > +#include <linux/module.h>
> > +
> > +#include <media/v4l2-device.h>
> > +#include <media/v4l2-imagebus.h>
> > +
> > +static const struct v4l2_imgbus_pixelfmt imgbus_fmt[] = {
> > +	[V4L2_IMGBUS_FMT_YUYV] = {
> > +		.fourcc			= V4L2_PIX_FMT_YUYV,
> > +		.colorspace		= V4L2_COLORSPACE_JPEG,
> > +		.name			= "YUYV",
> > +		.bits_per_sample	= 8,
> > +		.packing		= V4L2_IMGBUS_PACKING_2X8_PADHI,
> > +		.order			= V4L2_IMGBUS_ORDER_LE,
> > +	}, [V4L2_IMGBUS_FMT_YVYU] = {
> > +		.fourcc			= V4L2_PIX_FMT_YVYU,
> > +		.colorspace		= V4L2_COLORSPACE_JPEG,
> > +		.name			= "YVYU",
> > +		.bits_per_sample	= 8,
> > +		.packing		= V4L2_IMGBUS_PACKING_2X8_PADHI,
> > +		.order			= V4L2_IMGBUS_ORDER_LE,
> > +	}, [V4L2_IMGBUS_FMT_UYVY] = {
> > +		.fourcc			= V4L2_PIX_FMT_UYVY,
> > +		.colorspace		= V4L2_COLORSPACE_JPEG,
> > +		.name			= "UYVY",
> > +		.bits_per_sample	= 8,
> > +		.packing		= V4L2_IMGBUS_PACKING_2X8_PADHI,
> > +		.order			= V4L2_IMGBUS_ORDER_LE,
> > +	}, [V4L2_IMGBUS_FMT_VYUY] = {
> > +		.fourcc			= V4L2_PIX_FMT_VYUY,
> > +		.colorspace		= V4L2_COLORSPACE_JPEG,
> > +		.name			= "VYUY",
> > +		.bits_per_sample	= 8,
> > +		.packing		= V4L2_IMGBUS_PACKING_2X8_PADHI,
> > +		.order			= V4L2_IMGBUS_ORDER_LE,
> > +	}, [V4L2_IMGBUS_FMT_VYUY_SMPTE170M_8] = {
> > +		.fourcc			= V4L2_PIX_FMT_VYUY,
> > +		.colorspace		= V4L2_COLORSPACE_SMPTE170M,
> > +		.name			= "VYUY in SMPTE170M",
> > +		.bits_per_sample	= 8,
> > +		.packing		= V4L2_IMGBUS_PACKING_2X8_PADHI,
> > +		.order			= V4L2_IMGBUS_ORDER_LE,
> > +	}, [V4L2_IMGBUS_FMT_VYUY_SMPTE170M_16] = {
> > +		.fourcc			= V4L2_PIX_FMT_VYUY,
> > +		.colorspace		= V4L2_COLORSPACE_SMPTE170M,
> > +		.name			= "VYUY in SMPTE170M, 16bit",
> > +		.bits_per_sample	= 16,
> > +		.packing		= V4L2_IMGBUS_PACKING_NONE,
> > +		.order			= V4L2_IMGBUS_ORDER_LE,
> > +	}, [V4L2_IMGBUS_FMT_RGB555] = {
> > +		.fourcc			= V4L2_PIX_FMT_RGB555,
> > +		.colorspace		= V4L2_COLORSPACE_SRGB,
> > +		.name			= "RGB555",
> > +		.bits_per_sample	= 8,
> > +		.packing		= V4L2_IMGBUS_PACKING_2X8_PADHI,
> > +		.order			= V4L2_IMGBUS_ORDER_LE,
> > +	}, [V4L2_IMGBUS_FMT_RGB555X] = {
> > +		.fourcc			= V4L2_PIX_FMT_RGB555X,
> > +		.colorspace		= V4L2_COLORSPACE_SRGB,
> > +		.name			= "RGB555X",
> > +		.bits_per_sample	= 8,
> > +		.packing		= V4L2_IMGBUS_PACKING_2X8_PADHI,
> > +		.order			= V4L2_IMGBUS_ORDER_LE,
> > +	}, [V4L2_IMGBUS_FMT_RGB565] = {
> > +		.fourcc			= V4L2_PIX_FMT_RGB565,
> > +		.colorspace		= V4L2_COLORSPACE_SRGB,
> > +		.name			= "RGB565",
> > +		.bits_per_sample	= 8,
> > +		.packing		= V4L2_IMGBUS_PACKING_2X8_PADHI,
> > +		.order			= V4L2_IMGBUS_ORDER_LE,
> > +	}, [V4L2_IMGBUS_FMT_RGB565X] = {
> > +		.fourcc			= V4L2_PIX_FMT_RGB565X,
> > +		.colorspace		= V4L2_COLORSPACE_SRGB,
> > +		.name			= "RGB565X",
> > +		.bits_per_sample	= 8,
> > +		.packing		= V4L2_IMGBUS_PACKING_2X8_PADHI,
> > +		.order			= V4L2_IMGBUS_ORDER_LE,
> > +	}, [V4L2_IMGBUS_FMT_SBGGR8] = {
> > +		.fourcc			= V4L2_PIX_FMT_SBGGR8,
> > +		.colorspace		= V4L2_COLORSPACE_SRGB,
> > +		.name			= "Bayer 8 BGGR",
> > +		.bits_per_sample	= 8,
> > +		.packing		= V4L2_IMGBUS_PACKING_NONE,
> > +		.order			= V4L2_IMGBUS_ORDER_LE,
> > +	}, [V4L2_IMGBUS_FMT_SGBRG8] = {
> > +		.fourcc			= V4L2_PIX_FMT_SGBRG8,
> > +		.colorspace		= V4L2_COLORSPACE_SRGB,
> > +		.name			= "Bayer 8 GBRG",
> > +		.bits_per_sample	= 8,
> > +		.packing		= V4L2_IMGBUS_PACKING_NONE,
> > +		.order			= V4L2_IMGBUS_ORDER_LE,
> > +	}, [V4L2_IMGBUS_FMT_SGRBG8] = {
> > +		.fourcc			= V4L2_PIX_FMT_SGRBG8,
> > +		.colorspace		= V4L2_COLORSPACE_SRGB,
> > +		.name			= "Bayer 8 GRBG",
> > +		.bits_per_sample	= 8,
> > +		.packing		= V4L2_IMGBUS_PACKING_NONE,
> > +		.order			= V4L2_IMGBUS_ORDER_LE,
> > +	}, [V4L2_IMGBUS_FMT_SRGGB8] = {
> > +		.fourcc			= V4L2_PIX_FMT_SRGGB8,
> > +		.colorspace		= V4L2_COLORSPACE_SRGB,
> > +		.name			= "Bayer 8 RGGB",
> > +		.bits_per_sample	= 8,
> > +		.packing		= V4L2_IMGBUS_PACKING_NONE,
> > +		.order			= V4L2_IMGBUS_ORDER_LE,
> > +	}, [V4L2_IMGBUS_FMT_SBGGR10] = {
> > +		.fourcc			= V4L2_PIX_FMT_SBGGR10,
> > +		.colorspace		= V4L2_COLORSPACE_SRGB,
> > +		.name			= "Bayer 10 BGGR",
> > +		.bits_per_sample	= 10,
> > +		.packing		= V4L2_IMGBUS_PACKING_EXTEND16,
> > +		.order			= V4L2_IMGBUS_ORDER_LE,
> > +	}, [V4L2_IMGBUS_FMT_SGBRG10] = {
> > +		.fourcc			= V4L2_PIX_FMT_SGBRG10,
> > +		.colorspace		= V4L2_COLORSPACE_SRGB,
> > +		.name			= "Bayer 10 GBRG",
> > +		.bits_per_sample	= 10,
> > +		.packing		= V4L2_IMGBUS_PACKING_EXTEND16,
> > +		.order			= V4L2_IMGBUS_ORDER_LE,
> > +	}, [V4L2_IMGBUS_FMT_SGRBG10] = {
> > +		.fourcc			= V4L2_PIX_FMT_SGRBG10,
> > +		.colorspace		= V4L2_COLORSPACE_SRGB,
> > +		.name			= "Bayer 10 GRBG",
> > +		.bits_per_sample	= 10,
> > +		.packing		= V4L2_IMGBUS_PACKING_EXTEND16,
> > +		.order			= V4L2_IMGBUS_ORDER_LE,
> > +	}, [V4L2_IMGBUS_FMT_SRGGB10] = {
> > +		.fourcc			= V4L2_PIX_FMT_SRGGB10,
> > +		.colorspace		= V4L2_COLORSPACE_SRGB,
> > +		.name			= "Bayer 10 RGGB",
> > +		.bits_per_sample	= 10,
> > +		.packing		= V4L2_IMGBUS_PACKING_EXTEND16,
> > +		.order			= V4L2_IMGBUS_ORDER_LE,
> > +	}, [V4L2_IMGBUS_FMT_GREY] = {
> > +		.fourcc			= V4L2_PIX_FMT_GREY,
> > +		.colorspace		= V4L2_COLORSPACE_JPEG,
> > +		.name			= "Grey",
> > +		.bits_per_sample	= 8,
> > +		.packing		= V4L2_IMGBUS_PACKING_NONE,
> > +		.order			= V4L2_IMGBUS_ORDER_LE,
> > +	}, [V4L2_IMGBUS_FMT_Y16] = {
> > +		.fourcc			= V4L2_PIX_FMT_Y16,
> > +		.colorspace		= V4L2_COLORSPACE_JPEG,
> > +		.name			= "Grey 16bit",
> > +		.bits_per_sample	= 16,
> > +		.packing		= V4L2_IMGBUS_PACKING_NONE,
> > +		.order			= V4L2_IMGBUS_ORDER_LE,
> > +	}, [V4L2_IMGBUS_FMT_Y10] = {
> > +		.fourcc			= V4L2_PIX_FMT_Y10,
> > +		.colorspace		= V4L2_COLORSPACE_JPEG,
> > +		.name			= "Grey 10bit",
> > +		.bits_per_sample	= 10,
> > +		.packing		= V4L2_IMGBUS_PACKING_EXTEND16,
> > +		.order			= V4L2_IMGBUS_ORDER_LE,
> > +	}, [V4L2_IMGBUS_FMT_SBGGR10_2X8_PADHI_LE] = {
> > +		.fourcc			= V4L2_PIX_FMT_SBGGR10,
> > +		.colorspace		= V4L2_COLORSPACE_SRGB,
> > +		.name			= "Bayer 10 BGGR",
> > +		.bits_per_sample	= 8,
> > +		.packing		= V4L2_IMGBUS_PACKING_2X8_PADHI,
> > +		.order			= V4L2_IMGBUS_ORDER_LE,
> > +	}, [V4L2_IMGBUS_FMT_SBGGR10_2X8_PADLO_LE] = {
> > +		.fourcc			= V4L2_PIX_FMT_SBGGR10,
> > +		.colorspace		= V4L2_COLORSPACE_SRGB,
> > +		.name			= "Bayer 10 BGGR",
> > +		.bits_per_sample	= 8,
> > +		.packing		= V4L2_IMGBUS_PACKING_2X8_PADLO,
> > +		.order			= V4L2_IMGBUS_ORDER_LE,
> > +	}, [V4L2_IMGBUS_FMT_SBGGR10_2X8_PADHI_BE] = {
> > +		.fourcc			= V4L2_PIX_FMT_SBGGR10,
> > +		.colorspace		= V4L2_COLORSPACE_SRGB,
> > +		.name			= "Bayer 10 BGGR",
> > +		.bits_per_sample	= 8,
> > +		.packing		= V4L2_IMGBUS_PACKING_2X8_PADHI,
> > +		.order			= V4L2_IMGBUS_ORDER_BE,
> > +	}, [V4L2_IMGBUS_FMT_SBGGR10_2X8_PADLO_BE] = {
> > +		.fourcc			= V4L2_PIX_FMT_SBGGR10,
> > +		.colorspace		= V4L2_COLORSPACE_SRGB,
> > +		.name			= "Bayer 10 BGGR",
> > +		.bits_per_sample	= 8,
> > +		.packing		= V4L2_IMGBUS_PACKING_2X8_PADLO,
> > +		.order			= V4L2_IMGBUS_ORDER_BE,
> > +	},
> > +};
> > +
> > +const struct v4l2_imgbus_pixelfmt *v4l2_imgbus_get_fmtdesc(
> > +	enum v4l2_imgbus_pixelcode code)
> > +{
> > +	if ((unsigned int)code >= ARRAY_SIZE(imgbus_fmt))
> > +		return NULL;
> > +	return imgbus_fmt + code;
> > +}
> > +EXPORT_SYMBOL(v4l2_imgbus_get_fmtdesc);
> > +
> > +s32 v4l2_imgbus_bytes_per_line(u32 width,
> > +			       const struct v4l2_imgbus_pixelfmt *imgf)
> > +{
> > +	switch (imgf->packing) {
> > +	case V4L2_IMGBUS_PACKING_NONE:
> > +		return width * imgf->bits_per_sample / 8;
> > +	case V4L2_IMGBUS_PACKING_2X8_PADHI:
> > +	case V4L2_IMGBUS_PACKING_2X8_PADLO:
> > +	case V4L2_IMGBUS_PACKING_EXTEND16:
> > +		return width * 2;
> > +	}
> > +	return -EINVAL;
> > +}
> > +EXPORT_SYMBOL(v4l2_imgbus_bytes_per_line);
> 
> As you know, I am not convinced that this code belongs in the core. I do not
> think this translation from IMGBUS to PIXFMT is generic enough. However, if
> you just make this part of soc-camera then I am OK with this.

Are you referring to a specific function like v4l2_imgbus_bytes_per_line 
or to the whole v4l2-imagebus.c? The whole file and the 
v4l2_imgbus_get_fmtdesc() function must be available to all drivers, not 
just to soc-camera, if we want to use {enum,g,s,try}_imgbus_fmt API in 
other drivers too, and we do want to use them, if we want to re-use client 
drivers.

As for this specific function, it is not very generic yet, that's right. 
But I expect it to become more generic as more users and use-cases appear.
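To illustrate what the function does today, the packing-to-bytes translation can be sketched in plain user-space C (the names below are stand-ins for illustration, not the actual kernel symbols):

```c
#include <stdint.h>

enum imgbus_packing {
	PACKING_NONE,		/* samples stored back to back */
	PACKING_2X8_PADHI,	/* sample split over 2 bytes, high byte padded */
	PACKING_2X8_PADLO,	/* sample split over 2 bytes, low byte padded */
	PACKING_EXTEND16,	/* sample zero-extended to 16 bits */
};

struct imgbus_pixelfmt {
	enum imgbus_packing packing;
	uint8_t bits_per_sample;
};

/* Bytes per line in memory, mirroring v4l2_imgbus_bytes_per_line() */
int32_t bytes_per_line(uint32_t width, const struct imgbus_pixelfmt *imgf)
{
	switch (imgf->packing) {
	case PACKING_NONE:
		/* width samples packed tightly */
		return width * imgf->bits_per_sample / 8;
	case PACKING_2X8_PADHI:
	case PACKING_2X8_PADLO:
	case PACKING_EXTEND16:
		/* every sample ends up occupying two bytes in memory */
		return width * 2;
	}
	return -22; /* -EINVAL */
}
```

So a 640-pixel line of 10-bit Bayer with EXTEND16 packing and a 640-pixel line of 16-bit grey both come out as 1280 bytes, which is exactly why the function is not very generic yet.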

> > diff --git a/include/media/v4l2-imagebus.h b/include/media/v4l2-imagebus.h
> > new file mode 100644
> > index 0000000..022d044
> > --- /dev/null
> > +++ b/include/media/v4l2-imagebus.h
> > @@ -0,0 +1,84 @@
> > +/*
> > + * Image Bus API header
> > + *
> > + * Copyright (C) 2009, Guennadi Liakhovetski <g.liakhovetski@gmx.de>
> > + *
> > + * This program is free software; you can redistribute it and/or modify
> > + * it under the terms of the GNU General Public License version 2 as
> > + * published by the Free Software Foundation.
> > + */
> > +
> > +#ifndef V4L2_IMGBUS_H
> > +#define V4L2_IMGBUS_H
> > +
> > +enum v4l2_imgbus_packing {
> > +	V4L2_IMGBUS_PACKING_NONE,
> > +	V4L2_IMGBUS_PACKING_2X8_PADHI,
> > +	V4L2_IMGBUS_PACKING_2X8_PADLO,
> > +	V4L2_IMGBUS_PACKING_EXTEND16,
> > +};
> > +
> > +enum v4l2_imgbus_order {
> > +	V4L2_IMGBUS_ORDER_LE,
> > +	V4L2_IMGBUS_ORDER_BE,
> > +};
> 
> For the same reason I think these enums should be soc-camera internal.

See above.

> > +
> > +enum v4l2_imgbus_pixelcode {
> 
> It's probably a good idea to start with something like:
> 
> 	V4L2_IMGBUS_FMT_FIXED = 1,
> 
> since many video encoders/decoders have a fixed bus format, so in those cases
> there is nothing to set up.

Well, you mean there are many host-client pairs that each have only one 
setting and cannot be reused in other, more generic combinations? Only for 
such devices would a fixed format make sense, yes. It might also be 
useful for cases where we actually have no idea what format of data is being 
transferred over the bus. Is this what you mean? For such cases, yes, we 
might reserve one fixed format.
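Something like this, perhaps (a sketch only; the names and the final list are still open, and the validity helper is purely illustrative):

```c
/* Sketch: reserve 0 as an "invalid" sentinel and start real codes at 1,
 * with FIXED for devices whose bus format cannot be configured at all. */
enum imgbus_pixelcode {
	IMGBUS_FMT_INVALID = 0,	/* never a valid bus format */
	IMGBUS_FMT_FIXED = 1,	/* bus format fixed by the hardware */
	IMGBUS_FMT_YUYV,
	IMGBUS_FMT_UYVY,
};

/* A code is valid iff it is non-zero and within the enumerated range */
int imgbus_code_valid(int code)
{
	return code >= IMGBUS_FMT_FIXED && code <= IMGBUS_FMT_UYVY;
}
```

Leaving 0 free also allows a 0-terminated array of codes, as you suggest below.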

> I also like to leave value 0 free, that way it can be used as a special value
> internally (or as sentinel in a imgbus array as suggested below).

Ok.

> One other comment to throw into the pot: what about calling this just
> V4L2_BUS_FMT...? So imgbus becomes just bus. For some reason I find imgbus a
> bit odd. Probably because I think of it more as a video bus or even as a more
> general data bus. For all I know it might be used in the future to choose
> between different types of histogram data or something like that.

It might well not be the best namespace choice. But just "bus" OTOH seems 
way too generic to me. Maybe some (multi)mediabus? Or is even that too 
generic? It certainly depends on the scope we foresee for this API.

> Or is this just me?

So, here you propose image-bus to be used globally... Sorry, so, shall it 
stay internal to soc-camera or shall it become global?

> > +	V4L2_IMGBUS_FMT_YUYV,
> > +	V4L2_IMGBUS_FMT_YVYU,
> > +	V4L2_IMGBUS_FMT_UYVY,
> > +	V4L2_IMGBUS_FMT_VYUY,
> > +	V4L2_IMGBUS_FMT_VYUY_SMPTE170M_8,
> > +	V4L2_IMGBUS_FMT_VYUY_SMPTE170M_16,
> > +	V4L2_IMGBUS_FMT_RGB555,
> > +	V4L2_IMGBUS_FMT_RGB555X,
> > +	V4L2_IMGBUS_FMT_RGB565,
> > +	V4L2_IMGBUS_FMT_RGB565X,
> > +	V4L2_IMGBUS_FMT_SBGGR8,
> > +	V4L2_IMGBUS_FMT_SGBRG8,
> > +	V4L2_IMGBUS_FMT_SGRBG8,
> > +	V4L2_IMGBUS_FMT_SRGGB8,
> > +	V4L2_IMGBUS_FMT_SBGGR10,
> > +	V4L2_IMGBUS_FMT_SGBRG10,
> > +	V4L2_IMGBUS_FMT_SGRBG10,
> > +	V4L2_IMGBUS_FMT_SRGGB10,
> > +	V4L2_IMGBUS_FMT_GREY,
> > +	V4L2_IMGBUS_FMT_Y16,
> > +	V4L2_IMGBUS_FMT_Y10,
> > +	V4L2_IMGBUS_FMT_SBGGR10_2X8_PADHI_BE,
> > +	V4L2_IMGBUS_FMT_SBGGR10_2X8_PADLO_BE,
> > +	V4L2_IMGBUS_FMT_SBGGR10_2X8_PADHI_LE,
> > +	V4L2_IMGBUS_FMT_SBGGR10_2X8_PADLO_LE,
> 
> Obviously the meaning of these formats need to be documented in this header
> as well. Are all these imgbus formats used? Anything that is not used shouldn't
> be in this list IMHO.

A few of them are, yes; some might not actually be used but have been 
added for completeness. We can take a closer look at them and maybe throw a 
couple of them away, yes.

Document - yes. But, please, under linux/Documentation/video4linux/.

> > +};
> > +
> > +/**
> > + * struct v4l2_imgbus_pixelfmt - Data format on the image bus
> > + * @fourcc:		Fourcc code...
> > + * @colorspace:		and colorspace, that will be obtained if the data is
> > + *			stored in memory in the following way:
> > + * @bits_per_sample:	How many bits the bridge has to sample
> > + * @packing:		Type of sample-packing, that has to be used
> > + * @order:		Sample order when storing in memory
> > + */
> > +struct v4l2_imgbus_pixelfmt {
> > +	u32				fourcc;
> > +	enum v4l2_colorspace		colorspace;
> > +	const char			*name;
> > +	enum v4l2_imgbus_packing	packing;
> > +	enum v4l2_imgbus_order		order;
> > +	u8				bits_per_sample;
> > +};
> 
> Ditto for this struct. Note that the colorspace field should be moved to
> imgbus_framefmt.

Hm, not sure. Consider a simple scenario: the user issues S_FMT. The host 
driver cannot handle that pixel-format in a "special" way, so it goes for 
"pass-through": it has to find an enum v4l2_imgbus_pixelcode value 
from which it can generate the requested pixel-format _and_ colorspace. To 
do that it scans the internal pixel/data format translation table for 
the specific pixel-format and colorspace value, and issues 
s_imgbus_fmt to the client with the respective pixelcode.

Of course, this could also be done differently. In fact, I just do not 
know what client drivers know about colorspaces. Are they fixed per data 
format, and thus also uniquely defined by the latter? If so, no 
client-visible struct needs it. If some pixelcodes can exist with 
different colorspaces, then yes, we might want to pass the colorspace with 
s_imgbus_fmt in struct v4l2_imgbus_framefmt instead of allocating separate 
pixelcodes for them.
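The scan I have in mind would look roughly like this (a toy table with made-up entry values, just to show the idea; the real table is indexed by enum v4l2_imgbus_pixelcode):

```c
#include <stdint.h>
#include <stddef.h>

struct imgbus_pixelfmt {
	uint32_t fourcc;	/* memory format the host would generate */
	int colorspace;		/* colorspace resulting from pass-through */
};

/* Toy translation table indexed by pixelcode; [0] left free/invalid.
 * Fourcc values are little-endian packed 4-char codes. */
static const struct imgbus_pixelfmt fmt_table[] = {
	[1] = { 0x56595559 /* 'YUYV' */, 7 /* JPEG */ },
	[2] = { 0x59565955 /* 'UYVY' */, 7 /* JPEG */ },
	[3] = { 0x59565955 /* 'UYVY' */, 1 /* SMPTE170M */ },
};

/* Find a pixelcode that yields the requested fourcc + colorspace pair,
 * or -1 if the combination cannot be produced in pass-through mode. */
int find_pixelcode(uint32_t fourcc, int colorspace)
{
	size_t i;

	for (i = 0; i < sizeof(fmt_table) / sizeof(fmt_table[0]); i++)
		if (fmt_table[i].fourcc == fourcc &&
		    fmt_table[i].colorspace == colorspace)
			return (int)i;
	return -1;
}
```

Note how the same fourcc appears twice with different colorspaces, which is why the colorspace currently lives in the translation table rather than in the frame format.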

> > +
> > +struct v4l2_imgbus_framefmt {
> > +	__u32				width;
> > +	__u32				height;
> > +	enum v4l2_imgbus_pixelcode	code;
> > +	enum v4l2_field			field;
> > +};
> 
> Interesting observation: this struct is almost identical to struct
> v4l2_pix_format. Frankly, I really wonder whether we shouldn't reuse that
> struct. In many cases (mostly for video encoders/decoders) the VIDIOC_S_FMT
> ioctl and friends can just pass the fmt.pix pointer directly to the subdev.
> 
> So keeping this struct will make life easier. The only thing we have to make
> a note of in the subdev header is that the pixelformat will be interpreted
> as an imgbus 'pixelformat' instead.

I know it is similar, but I would prefer to have a different struct to 
avoid confusion and let the compiler do typechecking. I can well imagine 
that if we re-use the same struct, some drivers will forget to convert between 
pixel and data formats.

> Note that the current g/s/try_fmt subdev functions receive a struct v4l2_format
> pointer. I think that can be replaced by struct v4l2_pix_format. I don't think
> that there is any subdev driver that needs anything other than that struct. That
> would definitely simplify the driver code.

This can be done, yes. It would simplify the code by removing one line 
from each affected function like

	struct v4l2_pix_format *pix = &f->fmt.pix;

but it would negatively affect uniformity with the user-facing API, IMHO. 
In any case we want to eventually remove those *_fmt methods from subdev 
and replace them with respective *_imgbus_fmt counterparts (renaming them 
at the same time), don't we?

> Regarding the enum_imgbus_fmt: what about just adding a 'const u32 *imgbus_fmts'
> field to v4l2_subdev? Or do you think that this might be something that cannot
> be const? I.e., that the subdev driver needs to modify the list of available fmts
> dynamically?

soc-camera has been using static format lists all the time and we haven't 
seen a need for dynamic format lists yet. And no, I so far cannot imagine 
a need for them. Even if some formats may or may not be available 
depending on some run-time conditions, we can always just create a 
complete list and add an "available" or "enabled" field to the format.
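With a static, 0-terminated const list as you suggest, the enumeration helper becomes trivial. A sketch with stand-in code values (nothing here is the final API):

```c
#include <stdint.h>

/* Hypothetical per-sensor format list; 0 terminates the array,
 * which is why pixelcode 0 must stay reserved. */
static const uint32_t sensor_imgbus_fmts[] = {
	5, 8, 12, 0,	/* stand-in pixelcode values */
};

/* enum_imgbus_fmt-style lookup over a 0-terminated code list */
int enum_imgbus_code(const uint32_t *fmts, int index, uint32_t *code)
{
	int i;

	for (i = 0; fmts[i]; i++)
		if (i == index) {
			*code = fmts[i];
			return 0;
		}
	return -22;	/* -EINVAL: index out of range */
}
```

A run-time "enabled" flag would simply be an extra per-entry field checked in the same loop.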

Thanks
Guennadi

> 
> Regards,
> 
> 	Hans
> 
> > +
> > +const struct v4l2_imgbus_pixelfmt *v4l2_imgbus_get_fmtdesc(
> > +	enum v4l2_imgbus_pixelcode code);
> > +s32 v4l2_imgbus_bytes_per_line(u32 width,
> > +			       const struct v4l2_imgbus_pixelfmt *imgf);
> > +
> > +#endif
> > diff --git a/include/media/v4l2-subdev.h b/include/media/v4l2-subdev.h
> > index 04193eb..1e86f39 100644
> > --- a/include/media/v4l2-subdev.h
> > +++ b/include/media/v4l2-subdev.h
> > @@ -22,6 +22,7 @@
> >  #define _V4L2_SUBDEV_H
> >  
> >  #include <media/v4l2-common.h>
> > +#include <media/v4l2-imagebus.h>
> >  
> >  struct v4l2_device;
> >  struct v4l2_subdev;
> > @@ -196,7 +197,7 @@ struct v4l2_subdev_audio_ops {
> >     s_std_output: set v4l2_std_id for video OUTPUT devices. This is ignored by
> >  	video input devices.
> >  
> > -  s_crystal_freq: sets the frequency of the crystal used to generate the
> > +   s_crystal_freq: sets the frequency of the crystal used to generate the
> >  	clocks in Hz. An extra flags field allows device specific configuration
> >  	regarding clock frequency dividers, etc. If not used, then set flags
> >  	to 0. If the frequency is not supported, then -EINVAL is returned.
> > @@ -206,6 +207,8 @@ struct v4l2_subdev_audio_ops {
> >  
> >     s_routing: see s_routing in audio_ops, except this version is for video
> >  	devices.
> > +
> > +   enum_imgbus_fmt: enumerate pixel formats provided by a video data source
> >   */
> >  struct v4l2_subdev_video_ops {
> >  	int (*s_routing)(struct v4l2_subdev *sd, u32 input, u32 output, u32 config);
> > @@ -227,6 +230,11 @@ struct v4l2_subdev_video_ops {
> >  	int (*s_crop)(struct v4l2_subdev *sd, struct v4l2_crop *crop);
> >  	int (*g_parm)(struct v4l2_subdev *sd, struct v4l2_streamparm *param);
> >  	int (*s_parm)(struct v4l2_subdev *sd, struct v4l2_streamparm *param);
> > +	int (*enum_imgbus_fmt)(struct v4l2_subdev *sd, int index,
> > +			       enum v4l2_imgbus_pixelcode *code);
> > +	int (*g_imgbus_fmt)(struct v4l2_subdev *sd, struct v4l2_imgbus_framefmt *fmt);
> > +	int (*try_imgbus_fmt)(struct v4l2_subdev *sd, struct v4l2_imgbus_framefmt *fmt);
> > +	int (*s_imgbus_fmt)(struct v4l2_subdev *sd, struct v4l2_imgbus_framefmt *fmt);
> >  };
> >  
> >  /**
> 
> 
> 
> -- 
> Hans Verkuil - video4linux developer - sponsored by TANDBERG Telecom
> 

---
Guennadi Liakhovetski, Ph.D.
Freelance Open-Source Software Developer
http://www.open-technology.de/

^ permalink raw reply	[flat|nested] 51+ messages in thread

* Re: [PATCH/RFC 7/9 v2] v4l: add an image-bus API for configuring v4l2 subdev pixel and frame formats
  2009-11-12  8:08     ` Guennadi Liakhovetski
@ 2009-11-15 16:23       ` Hans Verkuil
  2009-11-19 22:33         ` Guennadi Liakhovetski
  0 siblings, 1 reply; 51+ messages in thread
From: Hans Verkuil @ 2009-11-15 16:23 UTC (permalink / raw)
  To: Guennadi Liakhovetski
  Cc: Linux Media Mailing List, Laurent Pinchart, Sakari Ailus,
	Muralidharan Karicheri

Hi Guennadi,

Apologies for the late reply, but better late than never :-)

On Thursday 12 November 2009 09:08:42 Guennadi Liakhovetski wrote:
> Hi Hans
> 
> Thanks for the review
> 
> On Wed, 11 Nov 2009, Hans Verkuil wrote:
> 
> > Hi Guennadi,
> > 
> > OK, I've looked at this again. See my comments below.
> > 
> > On Friday 30 October 2009 15:01:27 Guennadi Liakhovetski wrote:
> > > Video subdevices, like cameras, decoders, connect to video bridges over
> > > specialised busses. Data is being transferred over these busses in various
> > > formats, which only loosely correspond to fourcc codes, describing how video
> > > data is stored in RAM. This is not a one-to-one correspondence, therefore we
> > > cannot use fourcc codes to configure subdevice output data formats. This patch
> > > adds codes for several such on-the-bus formats and an API, similar to the
> > > familiar .s_fmt(), .g_fmt(), .try_fmt(), .enum_fmt() API for configuring those
> > > codes. After all users of the old API in struct v4l2_subdev_video_ops are
> > > converted, the API will be removed.
> > > 
> > > Signed-off-by: Guennadi Liakhovetski <g.liakhovetski@gmx.de>
> > > ---
> > >  drivers/media/video/Makefile        |    2 +-
> > >  drivers/media/video/v4l2-imagebus.c |  218 +++++++++++++++++++++++++++++++++++
> > >  include/media/v4l2-imagebus.h       |   84 ++++++++++++++
> > >  include/media/v4l2-subdev.h         |   10 ++-
> > >  4 files changed, 312 insertions(+), 2 deletions(-)
> > >  create mode 100644 drivers/media/video/v4l2-imagebus.c
> > >  create mode 100644 include/media/v4l2-imagebus.h
> > > 
> > > diff --git a/drivers/media/video/Makefile b/drivers/media/video/Makefile
> > > index 7a2dcc3..62d8907 100644
> > > --- a/drivers/media/video/Makefile
> > > +++ b/drivers/media/video/Makefile
> > > @@ -10,7 +10,7 @@ stkwebcam-objs	:=	stk-webcam.o stk-sensor.o
> > >  
> > >  omap2cam-objs	:=	omap24xxcam.o omap24xxcam-dma.o
> > >  
> > > -videodev-objs	:=	v4l2-dev.o v4l2-ioctl.o v4l2-device.o
> > > +videodev-objs	:=	v4l2-dev.o v4l2-ioctl.o v4l2-device.o v4l2-imagebus.o
> > >  
> > >  # V4L2 core modules
> > >  
> > > diff --git a/drivers/media/video/v4l2-imagebus.c b/drivers/media/video/v4l2-imagebus.c
> > > new file mode 100644
> > > index 0000000..e0a3a83
> > > --- /dev/null
> > > +++ b/drivers/media/video/v4l2-imagebus.c
> > > @@ -0,0 +1,218 @@
> > > +/*
> > > + * Image Bus API
> > > + *
> > > + * Copyright (C) 2009, Guennadi Liakhovetski <g.liakhovetski@gmx.de>
> > > + *
> > > + * This program is free software; you can redistribute it and/or modify
> > > + * it under the terms of the GNU General Public License version 2 as
> > > + * published by the Free Software Foundation.
> > > + */
> > > +
> > > +#include <linux/kernel.h>
> > > +#include <linux/module.h>
> > > +
> > > +#include <media/v4l2-device.h>
> > > +#include <media/v4l2-imagebus.h>
> > > +
> > > +static const struct v4l2_imgbus_pixelfmt imgbus_fmt[] = {
> > > +	[V4L2_IMGBUS_FMT_YUYV] = {
> > > +		.fourcc			= V4L2_PIX_FMT_YUYV,
> > > +		.colorspace		= V4L2_COLORSPACE_JPEG,
> > > +		.name			= "YUYV",
> > > +		.bits_per_sample	= 8,
> > > +		.packing		= V4L2_IMGBUS_PACKING_2X8_PADHI,
> > > +		.order			= V4L2_IMGBUS_ORDER_LE,
> > > +	}, [V4L2_IMGBUS_FMT_YVYU] = {
> > > +		.fourcc			= V4L2_PIX_FMT_YVYU,
> > > +		.colorspace		= V4L2_COLORSPACE_JPEG,
> > > +		.name			= "YVYU",
> > > +		.bits_per_sample	= 8,
> > > +		.packing		= V4L2_IMGBUS_PACKING_2X8_PADHI,
> > > +		.order			= V4L2_IMGBUS_ORDER_LE,
> > > +	}, [V4L2_IMGBUS_FMT_UYVY] = {
> > > +		.fourcc			= V4L2_PIX_FMT_UYVY,
> > > +		.colorspace		= V4L2_COLORSPACE_JPEG,
> > > +		.name			= "UYVY",
> > > +		.bits_per_sample	= 8,
> > > +		.packing		= V4L2_IMGBUS_PACKING_2X8_PADHI,
> > > +		.order			= V4L2_IMGBUS_ORDER_LE,
> > > +	}, [V4L2_IMGBUS_FMT_VYUY] = {
> > > +		.fourcc			= V4L2_PIX_FMT_VYUY,
> > > +		.colorspace		= V4L2_COLORSPACE_JPEG,
> > > +		.name			= "VYUY",
> > > +		.bits_per_sample	= 8,
> > > +		.packing		= V4L2_IMGBUS_PACKING_2X8_PADHI,
> > > +		.order			= V4L2_IMGBUS_ORDER_LE,
> > > +	}, [V4L2_IMGBUS_FMT_VYUY_SMPTE170M_8] = {
> > > +		.fourcc			= V4L2_PIX_FMT_VYUY,
> > > +		.colorspace		= V4L2_COLORSPACE_SMPTE170M,
> > > +		.name			= "VYUY in SMPTE170M",
> > > +		.bits_per_sample	= 8,
> > > +		.packing		= V4L2_IMGBUS_PACKING_2X8_PADHI,
> > > +		.order			= V4L2_IMGBUS_ORDER_LE,
> > > +	}, [V4L2_IMGBUS_FMT_VYUY_SMPTE170M_16] = {
> > > +		.fourcc			= V4L2_PIX_FMT_VYUY,
> > > +		.colorspace		= V4L2_COLORSPACE_SMPTE170M,
> > > +		.name			= "VYUY in SMPTE170M, 16bit",
> > > +		.bits_per_sample	= 16,
> > > +		.packing		= V4L2_IMGBUS_PACKING_NONE,
> > > +		.order			= V4L2_IMGBUS_ORDER_LE,
> > > +	}, [V4L2_IMGBUS_FMT_RGB555] = {
> > > +		.fourcc			= V4L2_PIX_FMT_RGB555,
> > > +		.colorspace		= V4L2_COLORSPACE_SRGB,
> > > +		.name			= "RGB555",
> > > +		.bits_per_sample	= 8,
> > > +		.packing		= V4L2_IMGBUS_PACKING_2X8_PADHI,
> > > +		.order			= V4L2_IMGBUS_ORDER_LE,
> > > +	}, [V4L2_IMGBUS_FMT_RGB555X] = {
> > > +		.fourcc			= V4L2_PIX_FMT_RGB555X,
> > > +		.colorspace		= V4L2_COLORSPACE_SRGB,
> > > +		.name			= "RGB555X",
> > > +		.bits_per_sample	= 8,
> > > +		.packing		= V4L2_IMGBUS_PACKING_2X8_PADHI,
> > > +		.order			= V4L2_IMGBUS_ORDER_LE,
> > > +	}, [V4L2_IMGBUS_FMT_RGB565] = {
> > > +		.fourcc			= V4L2_PIX_FMT_RGB565,
> > > +		.colorspace		= V4L2_COLORSPACE_SRGB,
> > > +		.name			= "RGB565",
> > > +		.bits_per_sample	= 8,
> > > +		.packing		= V4L2_IMGBUS_PACKING_2X8_PADHI,
> > > +		.order			= V4L2_IMGBUS_ORDER_LE,
> > > +	}, [V4L2_IMGBUS_FMT_RGB565X] = {
> > > +		.fourcc			= V4L2_PIX_FMT_RGB565X,
> > > +		.colorspace		= V4L2_COLORSPACE_SRGB,
> > > +		.name			= "RGB565X",
> > > +		.bits_per_sample	= 8,
> > > +		.packing		= V4L2_IMGBUS_PACKING_2X8_PADHI,
> > > +		.order			= V4L2_IMGBUS_ORDER_LE,
> > > +	}, [V4L2_IMGBUS_FMT_SBGGR8] = {
> > > +		.fourcc			= V4L2_PIX_FMT_SBGGR8,
> > > +		.colorspace		= V4L2_COLORSPACE_SRGB,
> > > +		.name			= "Bayer 8 BGGR",
> > > +		.bits_per_sample	= 8,
> > > +		.packing		= V4L2_IMGBUS_PACKING_NONE,
> > > +		.order			= V4L2_IMGBUS_ORDER_LE,
> > > +	}, [V4L2_IMGBUS_FMT_SGBRG8] = {
> > > +		.fourcc			= V4L2_PIX_FMT_SGBRG8,
> > > +		.colorspace		= V4L2_COLORSPACE_SRGB,
> > > +		.name			= "Bayer 8 GBRG",
> > > +		.bits_per_sample	= 8,
> > > +		.packing		= V4L2_IMGBUS_PACKING_NONE,
> > > +		.order			= V4L2_IMGBUS_ORDER_LE,
> > > +	}, [V4L2_IMGBUS_FMT_SGRBG8] = {
> > > +		.fourcc			= V4L2_PIX_FMT_SGRBG8,
> > > +		.colorspace		= V4L2_COLORSPACE_SRGB,
> > > +		.name			= "Bayer 8 GRBG",
> > > +		.bits_per_sample	= 8,
> > > +		.packing		= V4L2_IMGBUS_PACKING_NONE,
> > > +		.order			= V4L2_IMGBUS_ORDER_LE,
> > > +	}, [V4L2_IMGBUS_FMT_SRGGB8] = {
> > > +		.fourcc			= V4L2_PIX_FMT_SRGGB8,
> > > +		.colorspace		= V4L2_COLORSPACE_SRGB,
> > > +		.name			= "Bayer 8 RGGB",
> > > +		.bits_per_sample	= 8,
> > > +		.packing		= V4L2_IMGBUS_PACKING_NONE,
> > > +		.order			= V4L2_IMGBUS_ORDER_LE,
> > > +	}, [V4L2_IMGBUS_FMT_SBGGR10] = {
> > > +		.fourcc			= V4L2_PIX_FMT_SBGGR10,
> > > +		.colorspace		= V4L2_COLORSPACE_SRGB,
> > > +		.name			= "Bayer 10 BGGR",
> > > +		.bits_per_sample	= 10,
> > > +		.packing		= V4L2_IMGBUS_PACKING_EXTEND16,
> > > +		.order			= V4L2_IMGBUS_ORDER_LE,
> > > +	}, [V4L2_IMGBUS_FMT_SGBRG10] = {
> > > +		.fourcc			= V4L2_PIX_FMT_SGBRG10,
> > > +		.colorspace		= V4L2_COLORSPACE_SRGB,
> > > +		.name			= "Bayer 10 GBRG",
> > > +		.bits_per_sample	= 10,
> > > +		.packing		= V4L2_IMGBUS_PACKING_EXTEND16,
> > > +		.order			= V4L2_IMGBUS_ORDER_LE,
> > > +	}, [V4L2_IMGBUS_FMT_SGRBG10] = {
> > > +		.fourcc			= V4L2_PIX_FMT_SGRBG10,
> > > +		.colorspace		= V4L2_COLORSPACE_SRGB,
> > > +		.name			= "Bayer 10 GRBG",
> > > +		.bits_per_sample	= 10,
> > > +		.packing		= V4L2_IMGBUS_PACKING_EXTEND16,
> > > +		.order			= V4L2_IMGBUS_ORDER_LE,
> > > +	}, [V4L2_IMGBUS_FMT_SRGGB10] = {
> > > +		.fourcc			= V4L2_PIX_FMT_SRGGB10,
> > > +		.colorspace		= V4L2_COLORSPACE_SRGB,
> > > +		.name			= "Bayer 10 RGGB",
> > > +		.bits_per_sample	= 10,
> > > +		.packing		= V4L2_IMGBUS_PACKING_EXTEND16,
> > > +		.order			= V4L2_IMGBUS_ORDER_LE,
> > > +	}, [V4L2_IMGBUS_FMT_GREY] = {
> > > +		.fourcc			= V4L2_PIX_FMT_GREY,
> > > +		.colorspace		= V4L2_COLORSPACE_JPEG,
> > > +		.name			= "Grey",
> > > +		.bits_per_sample	= 8,
> > > +		.packing		= V4L2_IMGBUS_PACKING_NONE,
> > > +		.order			= V4L2_IMGBUS_ORDER_LE,
> > > +	}, [V4L2_IMGBUS_FMT_Y16] = {
> > > +		.fourcc			= V4L2_PIX_FMT_Y16,
> > > +		.colorspace		= V4L2_COLORSPACE_JPEG,
> > > +		.name			= "Grey 16bit",
> > > +		.bits_per_sample	= 16,
> > > +		.packing		= V4L2_IMGBUS_PACKING_NONE,
> > > +		.order			= V4L2_IMGBUS_ORDER_LE,
> > > +	}, [V4L2_IMGBUS_FMT_Y10] = {
> > > +		.fourcc			= V4L2_PIX_FMT_Y10,
> > > +		.colorspace		= V4L2_COLORSPACE_JPEG,
> > > +		.name			= "Grey 10bit",
> > > +		.bits_per_sample	= 10,
> > > +		.packing		= V4L2_IMGBUS_PACKING_EXTEND16,
> > > +		.order			= V4L2_IMGBUS_ORDER_LE,
> > > +	}, [V4L2_IMGBUS_FMT_SBGGR10_2X8_PADHI_LE] = {
> > > +		.fourcc			= V4L2_PIX_FMT_SBGGR10,
> > > +		.colorspace		= V4L2_COLORSPACE_SRGB,
> > > +		.name			= "Bayer 10 BGGR",
> > > +		.bits_per_sample	= 8,
> > > +		.packing		= V4L2_IMGBUS_PACKING_2X8_PADHI,
> > > +		.order			= V4L2_IMGBUS_ORDER_LE,
> > > +	}, [V4L2_IMGBUS_FMT_SBGGR10_2X8_PADLO_LE] = {
> > > +		.fourcc			= V4L2_PIX_FMT_SBGGR10,
> > > +		.colorspace		= V4L2_COLORSPACE_SRGB,
> > > +		.name			= "Bayer 10 BGGR",
> > > +		.bits_per_sample	= 8,
> > > +		.packing		= V4L2_IMGBUS_PACKING_2X8_PADLO,
> > > +		.order			= V4L2_IMGBUS_ORDER_LE,
> > > +	}, [V4L2_IMGBUS_FMT_SBGGR10_2X8_PADHI_BE] = {
> > > +		.fourcc			= V4L2_PIX_FMT_SBGGR10,
> > > +		.colorspace		= V4L2_COLORSPACE_SRGB,
> > > +		.name			= "Bayer 10 BGGR",
> > > +		.bits_per_sample	= 8,
> > > +		.packing		= V4L2_IMGBUS_PACKING_2X8_PADHI,
> > > +		.order			= V4L2_IMGBUS_ORDER_BE,
> > > +	}, [V4L2_IMGBUS_FMT_SBGGR10_2X8_PADLO_BE] = {
> > > +		.fourcc			= V4L2_PIX_FMT_SBGGR10,
> > > +		.colorspace		= V4L2_COLORSPACE_SRGB,
> > > +		.name			= "Bayer 10 BGGR",
> > > +		.bits_per_sample	= 8,
> > > +		.packing		= V4L2_IMGBUS_PACKING_2X8_PADLO,
> > > +		.order			= V4L2_IMGBUS_ORDER_BE,
> > > +	},
> > > +};
> > > +
> > > +const struct v4l2_imgbus_pixelfmt *v4l2_imgbus_get_fmtdesc(
> > > +	enum v4l2_imgbus_pixelcode code)
> > > +{
> > > +	if ((unsigned int)code >= ARRAY_SIZE(imgbus_fmt))
> > > +		return NULL;
> > > +	return imgbus_fmt + code;
> > > +}
> > > +EXPORT_SYMBOL(v4l2_imgbus_get_fmtdesc);
> > > +
> > > +s32 v4l2_imgbus_bytes_per_line(u32 width,
> > > +			       const struct v4l2_imgbus_pixelfmt *imgf)
> > > +{
> > > +	switch (imgf->packing) {
> > > +	case V4L2_IMGBUS_PACKING_NONE:
> > > +		return width * imgf->bits_per_sample / 8;
> > > +	case V4L2_IMGBUS_PACKING_2X8_PADHI:
> > > +	case V4L2_IMGBUS_PACKING_2X8_PADLO:
> > > +	case V4L2_IMGBUS_PACKING_EXTEND16:
> > > +		return width * 2;
> > > +	}
> > > +	return -EINVAL;
> > > +}
> > > +EXPORT_SYMBOL(v4l2_imgbus_bytes_per_line);
> > 
> > As you know, I am not convinced that this code belongs in the core. I do not
> > think this translation from IMGBUS to PIXFMT is generic enough. However, if
> > you just make this part of soc-camera then I am OK with this.
> 
> Are you referring to a specific function like v4l2_imgbus_bytes_per_line 
> or to the whole v4l2-imagebus.c?

I'm referring to the whole file.

> The whole file and the  
> v4l2_imgbus_get_fmtdesc() function must be available to all drivers, not 
> just to soc-camera, if we want to use {enum,g,s,try}_imgbus_fmt API in 
> other drivers too, and we do want to use them, if we want to re-use client 
> drivers.

The sub-device drivers do not need this source. They just need to report
the supported image bus formats. And I am far from convinced that other bridge
drivers can actually reuse your v4l2-imagebus.c code.

If they can, then we can always rename it from e.g. soc-imagebus.c to
v4l2-imagebus.c. Right now I prefer to keep it inside soc-camera, where it
clearly does work, and when other people start implementing imagebus support
we can refer them to the work you did in soc-camera and we'll see what
happens.

> 
> As for this specific function, it is not very generic yet, that's right. 
> But I expect it to become more generic as more users and use-cases appear.
> 
> > > diff --git a/include/media/v4l2-imagebus.h b/include/media/v4l2-imagebus.h
> > > new file mode 100644
> > > index 0000000..022d044
> > > --- /dev/null
> > > +++ b/include/media/v4l2-imagebus.h
> > > @@ -0,0 +1,84 @@
> > > +/*
> > > + * Image Bus API header
> > > + *
> > > + * Copyright (C) 2009, Guennadi Liakhovetski <g.liakhovetski@gmx.de>
> > > + *
> > > + * This program is free software; you can redistribute it and/or modify
> > > + * it under the terms of the GNU General Public License version 2 as
> > > + * published by the Free Software Foundation.
> > > + */
> > > +
> > > +#ifndef V4L2_IMGBUS_H
> > > +#define V4L2_IMGBUS_H
> > > +
> > > +enum v4l2_imgbus_packing {
> > > +	V4L2_IMGBUS_PACKING_NONE,
> > > +	V4L2_IMGBUS_PACKING_2X8_PADHI,
> > > +	V4L2_IMGBUS_PACKING_2X8_PADLO,
> > > +	V4L2_IMGBUS_PACKING_EXTEND16,
> > > +};
> > > +
> > > +enum v4l2_imgbus_order {
> > > +	V4L2_IMGBUS_ORDER_LE,
> > > +	V4L2_IMGBUS_ORDER_BE,
> > > +};
> > 
> > For the same reason I think these enums should be soc-camera internal.
> 
> See above.

See above :-)

> 
> > > +
> > > +enum v4l2_imgbus_pixelcode {
> > 
> > It's probably a good idea to start with something like:
> > 
> > 	V4L2_IMGBUS_FMT_FIXED = 1,
> > 
> > since many video encoders/decoders have a fixed bus format, so in those cases
> > there is nothing to set up.
> 
> Well, you mean there are many host-client pairs that each have only one 
> setting and cannot be reused in other, more generic combinations?

Yes. Most standard definition video encoders/decoders fall in that category,
and so do pretty much all audio devices. Basically this is true for practically
all non-sensor i2c devices.

> Only for 
> such devices would a fixed format make sense, yes. It might also be 
> useful for cases where we actually have no idea what format of data is being 
> transferred over the bus. Is this what you mean? For such cases, yes, we 
> might reserve one fixed format.

Yes, that's what I mean.

> 
> > I also like to leave value 0 free, that way it can be used as a special value
> > internally (or as sentinel in a imgbus array as suggested below).
> 
> Ok.
> 
> > One other comment to throw into the pot: what about calling this just
> > V4L2_BUS_FMT...? So imgbus becomes just bus. For some reason I find imgbus a
> > bit odd. Probably because I think of it more as a video bus or even as a more
> > general data bus. For all I know it might be used in the future to choose
> > between different types of histogram data or something like that.
> 
> It might well not be the best namespace choice. But just "bus" OTOH seems 
> way too generic to me. Maybe some (multi)mediabus? Or is even that too 
> generic? It certainly depends on the scope we foresee for this API.

Hmm, I like that: 'mediabus'. Much better IMHO than imagebus. Image bus is
too specific to sensor, I think. Media bus is more generic (also for video
and audio formats), but it still clearly refers to the media data flowing
over the bus rather than e.g. control data.
 
> > Or is this just me?
> 
> So, here you propose image-bus to be used globally... Sorry, so, shall it 
> stay internal to soc-camera or shall it become global?

The V4L2_IMGBUS_FMT defines and the struct v4l2_imgbus_framefmt (if we will
use that) are global (i.e. anything we need for v4l2_subdev).

Anything related to struct v4l2_imgbus_pixelfmt is soc-camera specific for
now as far as I am concerned.

> > > +	V4L2_IMGBUS_FMT_YUYV,
> > > +	V4L2_IMGBUS_FMT_YVYU,
> > > +	V4L2_IMGBUS_FMT_UYVY,
> > > +	V4L2_IMGBUS_FMT_VYUY,
> > > +	V4L2_IMGBUS_FMT_VYUY_SMPTE170M_8,
> > > +	V4L2_IMGBUS_FMT_VYUY_SMPTE170M_16,
> > > +	V4L2_IMGBUS_FMT_RGB555,
> > > +	V4L2_IMGBUS_FMT_RGB555X,
> > > +	V4L2_IMGBUS_FMT_RGB565,
> > > +	V4L2_IMGBUS_FMT_RGB565X,
> > > +	V4L2_IMGBUS_FMT_SBGGR8,
> > > +	V4L2_IMGBUS_FMT_SGBRG8,
> > > +	V4L2_IMGBUS_FMT_SGRBG8,
> > > +	V4L2_IMGBUS_FMT_SRGGB8,
> > > +	V4L2_IMGBUS_FMT_SBGGR10,
> > > +	V4L2_IMGBUS_FMT_SGBRG10,
> > > +	V4L2_IMGBUS_FMT_SGRBG10,
> > > +	V4L2_IMGBUS_FMT_SRGGB10,
> > > +	V4L2_IMGBUS_FMT_GREY,
> > > +	V4L2_IMGBUS_FMT_Y16,
> > > +	V4L2_IMGBUS_FMT_Y10,
> > > +	V4L2_IMGBUS_FMT_SBGGR10_2X8_PADHI_BE,
> > > +	V4L2_IMGBUS_FMT_SBGGR10_2X8_PADLO_BE,
> > > +	V4L2_IMGBUS_FMT_SBGGR10_2X8_PADHI_LE,
> > > +	V4L2_IMGBUS_FMT_SBGGR10_2X8_PADLO_LE,
> > 
> > Obviously the meaning of these formats needs to be documented in this header
> > as well. Are all these imgbus formats used? Anything that is not used shouldn't
> > be in this list IMHO.
> 
> A few of them are, yes; some might not actually be used, but have been
> added for completeness. We can have a better look at them and maybe throw a
> couple of them away, yes.
> 
> Document - yes. But, please, under linux/Documentation/video4linux/.

The problem is that people will forget to add it to the documentation when
they add new formats. We have that problem already with PIXFMT, and there you
actually get an error or warning when building the documentation.

I think that the chances of keeping the documentation up to date are much
higher if we document it at the same place that these formats are defined.

> 
> > > +};
> > > +
> > > +/**
> > > + * struct v4l2_imgbus_pixelfmt - Data format on the image bus
> > > + * @fourcc:		Fourcc code...
> > > + * @colorspace:		and colorspace, that will be obtained if the data is
> > > + *			stored in memory in the following way:
> > > + * @bits_per_sample:	How many bits the bridge has to sample
> > > + * @packing:		Type of sample-packing, that has to be used
> > > + * @order:		Sample order when storing in memory
> > > + */
> > > +struct v4l2_imgbus_pixelfmt {
> > > +	u32				fourcc;
> > > +	enum v4l2_colorspace		colorspace;
> > > +	const char			*name;
> > > +	enum v4l2_imgbus_packing	packing;
> > > +	enum v4l2_imgbus_order		order;
> > > +	u8				bits_per_sample;
> > > +};
> > 
> > Ditto for this struct. Note that the colorspace field should be moved to
> > imgbus_framefmt.
> 
> Hm, not sure. Consider a simple scenario: the user issues S_FMT. The host
> driver cannot handle that pixel-format in a "special" way, so it goes for
> "pass-through": it has to find an enum v4l2_imgbus_pixelcode value
> from which it can generate the requested pixel-format _and_ colorspace. To
> do that it scans the internal pixel/data format translation table to look
> for the specific pixel-format and colorspace value, and issues
> s_imgbus_fmt to the client with the respective pixelcode.
> 
> Of course, this could also be done differently. In fact, I just do not 
> know what client drivers know about colorspaces. Are they fixed per data 
> format, and thus also uniquely defined by the latter? If so, no 
> client-visible struct needs it. If some pixelcodes can exist with 
> different colorspaces, then yes, we might want to pass the colorspace with 
> s_imgbus_fmt in struct v4l2_imgbus_framefmt instead of allocating separate 
> pixelcodes for them.

Yes, some video devices have image bus formats that can deliver different
colorspaces. For example, HDMI receivers will typically get information of
the colorspace as part of the datastream. So the same YCbCr bus format might
use either the ITU601 or ITU709 colorspace.

Typically for receivers calling g_imgbus_fmt() will return the colorspace
it currently receives but it will ignore any attempt to set the colorspace.

When programming a HDMI transmitter you will typically have to provide the
colorspace when you set the format since it needs that information to fill
in the colorspace information that it will generate in the datastream.

What is not needed is that you attempt to match a pixelformat to a busformat
and colorspace pair. You can ignore the colorspace for that.
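For readers following the quoted pass-through scenario, the translation-table lookup the host driver performs can be modelled in a standalone sketch. The names here (imgbus_entry, find_pixelcode, the table contents) are illustrative only, not the actual soc-camera code:

```c
#include <assert.h>

/* Hypothetical, simplified model of the pass-through lookup: the host
 * scans a fourcc -> bus-pixelcode table before issuing s_imgbus_fmt. */

enum pixelcode { CODE_INVALID = 0, CODE_YUYV, CODE_SBGGR10 };

struct imgbus_entry {
	unsigned int fourcc;	/* user-visible pixel format */
	enum pixelcode code;	/* bus format that yields it */
};

#define FOURCC(a, b, c, d) \
	((unsigned int)(a) | ((unsigned int)(b) << 8) | \
	 ((unsigned int)(c) << 16) | ((unsigned int)(d) << 24))

static const struct imgbus_entry xlate[] = {
	{ FOURCC('Y', 'U', 'Y', 'V'), CODE_YUYV },
	{ FOURCC('B', 'A', '1', '0'), CODE_SBGGR10 },
};

/* Return the bus pixelcode that can produce the requested fourcc in
 * pass-through mode, or CODE_INVALID if none is known. */
static enum pixelcode find_pixelcode(unsigned int fourcc)
{
	unsigned int i;

	for (i = 0; i < sizeof(xlate) / sizeof(xlate[0]); i++)
		if (xlate[i].fourcc == fourcc)
			return xlate[i].code;
	return CODE_INVALID;
}
```

Per Hans's comment above, such a table would match on the fourcc alone; the colorspace need not be part of the key.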

> 
> > > +
> > > +struct v4l2_imgbus_framefmt {
> > > +	__u32				width;
> > > +	__u32				height;
> > > +	enum v4l2_imgbus_pixelcode	code;
> > > +	enum v4l2_field			field;
> > > +};
> > 
> > Interesting observation: this struct is almost identical to struct
> > v4l2_pix_format. Frankly, I really wonder whether we shouldn't reuse that
> > struct. In many cases (mostly for video encoders/decoders) the VIDIOC_S_FMT
> > ioctl and friends can just pass the fmt.pix pointer directly to the subdev.
> > 
> > So keeping this struct will make life easier. The only thing we have to make
> > a note of in the subdev header is that the pixelformat will be interpreted
> > as an imgbus 'pixelformat' instead.
> 
> I know it is similar, but I would prefer to have a different struct to 
> avoid confusion and let the compiler do typechecking. I can well imagine, 
> if we re-use the same struct, some drivers will forget to convert between 
> pixel and data formats.
> 
> > Note that the current g/s/try_fmt subdev functions receive a struct v4l2_format
> > pointer. I think that can be replaced by struct v4l2_pix_format. I don't think
> > that there is any subdev driver that needs anything other than that struct. That
> > would definitely simplify the driver code.
> 
> This can be done, yes. It would simplify the code by removing one line 
> from each affected function like
> 
> 	struct v4l2_pix_format *pix = &f->fmt.pix;
> 
> but it would negatively affect uniformity with the user-facing API, IMHO. 
> In any case we want to eventually remove those *_fmt methods from subdev 
> and replace them with respective *_imgbus_fmt counterparts (renaming them 
> at the same time), don't we?

I thought about it some more and I agree with you.

Regards,

	Hans

 
> > Regarding the enum_imgbus_fmt: what about just adding a 'const u32 *imgbus_fmts'
> > field to v4l2_subdev? Or do you think that this might be something that cannot
> > be const? I.e., that the subdev driver needs to modify the list of available fmts
> > dynamically?
> 
> soc-camera has been using static format lists all the time and we haven't 
> seen a need for dynamic format lists yet. And no, I so far cannot imagine 
> a need for them. Even if some formats may or may not be available 
> depending on some run-time conditions, we can always just create a 
> complete list and add an "available" or "enabled" field to the format.
> 
> Thanks
> Guennadi
> 
> > 
> > Regards,
> > 
> > 	Hans
> > 
> > > +
> > > +const struct v4l2_imgbus_pixelfmt *v4l2_imgbus_get_fmtdesc(
> > > +	enum v4l2_imgbus_pixelcode code);
> > > +s32 v4l2_imgbus_bytes_per_line(u32 width,
> > > +			       const struct v4l2_imgbus_pixelfmt *imgf);
> > > +
> > > +#endif
> > > diff --git a/include/media/v4l2-subdev.h b/include/media/v4l2-subdev.h
> > > index 04193eb..1e86f39 100644
> > > --- a/include/media/v4l2-subdev.h
> > > +++ b/include/media/v4l2-subdev.h
> > > @@ -22,6 +22,7 @@
> > >  #define _V4L2_SUBDEV_H
> > >  
> > >  #include <media/v4l2-common.h>
> > > +#include <media/v4l2-imagebus.h>
> > >  
> > >  struct v4l2_device;
> > >  struct v4l2_subdev;
> > > @@ -196,7 +197,7 @@ struct v4l2_subdev_audio_ops {
> > >     s_std_output: set v4l2_std_id for video OUTPUT devices. This is ignored by
> > >  	video input devices.
> > >  
> > > -  s_crystal_freq: sets the frequency of the crystal used to generate the
> > > +   s_crystal_freq: sets the frequency of the crystal used to generate the
> > >  	clocks in Hz. An extra flags field allows device specific configuration
> > >  	regarding clock frequency dividers, etc. If not used, then set flags
> > >  	to 0. If the frequency is not supported, then -EINVAL is returned.
> > > @@ -206,6 +207,8 @@ struct v4l2_subdev_audio_ops {
> > >  
> > >     s_routing: see s_routing in audio_ops, except this version is for video
> > >  	devices.
> > > +
> > > +   enum_imgbus_fmt: enumerate pixel formats provided by a video data source
> > >   */
> > >  struct v4l2_subdev_video_ops {
> > >  	int (*s_routing)(struct v4l2_subdev *sd, u32 input, u32 output, u32 config);
> > > @@ -227,6 +230,11 @@ struct v4l2_subdev_video_ops {
> > >  	int (*s_crop)(struct v4l2_subdev *sd, struct v4l2_crop *crop);
> > >  	int (*g_parm)(struct v4l2_subdev *sd, struct v4l2_streamparm *param);
> > >  	int (*s_parm)(struct v4l2_subdev *sd, struct v4l2_streamparm *param);
> > > +	int (*enum_imgbus_fmt)(struct v4l2_subdev *sd, int index,
> > > +			       enum v4l2_imgbus_pixelcode *code);
> > > +	int (*g_imgbus_fmt)(struct v4l2_subdev *sd, struct v4l2_imgbus_framefmt *fmt);
> > > +	int (*try_imgbus_fmt)(struct v4l2_subdev *sd, struct v4l2_imgbus_framefmt *fmt);
> > > +	int (*s_imgbus_fmt)(struct v4l2_subdev *sd, struct v4l2_imgbus_framefmt *fmt);
> > >  };
> > >  
> > >  /**
> > 
> > 
> > 
> > -- 
> > Hans Verkuil - video4linux developer - sponsored by TANDBERG Telecom
> > 
> 
> ---
> Guennadi Liakhovetski, Ph.D.
> Freelance Open-Source Software Developer
> http://www.open-technology.de/
> 



-- 
Hans Verkuil - video4linux developer - sponsored by TANDBERG Telecom

^ permalink raw reply	[flat|nested] 51+ messages in thread

* Re: [PATCH/RFC 7/9 v2] v4l: add an image-bus API for configuring v4l2 subdev pixel and frame formats
  2009-11-15 16:23       ` Hans Verkuil
@ 2009-11-19 22:33         ` Guennadi Liakhovetski
  2009-11-20 12:29           ` Hans Verkuil
  0 siblings, 1 reply; 51+ messages in thread
From: Guennadi Liakhovetski @ 2009-11-19 22:33 UTC (permalink / raw)
  To: Hans Verkuil
  Cc: Linux Media Mailing List, Laurent Pinchart, Sakari Ailus,
	Muralidharan Karicheri

Hi Hans

On Sun, 15 Nov 2009, Hans Verkuil wrote:

[snip]

> > > > +s32 v4l2_imgbus_bytes_per_line(u32 width,
> > > > +			       const struct v4l2_imgbus_pixelfmt *imgf)
> > > > +{
> > > > +	switch (imgf->packing) {
> > > > +	case V4L2_IMGBUS_PACKING_NONE:
> > > > +		return width * imgf->bits_per_sample / 8;
> > > > +	case V4L2_IMGBUS_PACKING_2X8_PADHI:
> > > > +	case V4L2_IMGBUS_PACKING_2X8_PADLO:
> > > > +	case V4L2_IMGBUS_PACKING_EXTEND16:
> > > > +		return width * 2;
> > > > +	}
> > > > +	return -EINVAL;
> > > > +}
> > > > +EXPORT_SYMBOL(v4l2_imgbus_bytes_per_line);
> > > 
> > > As you know, I am not convinced that this code belongs in the core. I do not
> > > think this translation from IMGBUS to PIXFMT is generic enough. However, if
> > > you just make this part of soc-camera then I am OK with this.
> > 
> > Are you referring to a specific function like v4l2_imgbus_bytes_per_line 
> > or to the whole v4l2-imagebus.c?
> 
> I'm referring to the whole file.
> 
> > The whole file and the  
> > v4l2_imgbus_get_fmtdesc() function must be available to all drivers, not 
> > just to soc-camera, if we want to use {enum,g,s,try}_imgbus_fmt API in 
> > other drivers too, and we do want to use them, if we want to re-use client 
> > drivers.
> 
> The sub-device drivers do not need this source. They just need to report
> the supported image bus formats. And I am far from convinced that other bridge
> drivers can actually reuse your v4l2-imagebus.c code.

You mean, all non-soc-camera bridge drivers only handle special client 
formats, no generic pass-through? What about other SoC v4l host drivers, 
not using soc-camera, and willing to switch to v4l2-subdev? Like OMAPs, 
etc? I'm sure they would want to be able to use the pass-through mode.

> If they can, then we can always rename it from e.g. soc-imagebus.c to
> v4l2-imagebus.c. Right now I prefer to keep it inside soc-camera, where it
> clearly does work, and when other people start implementing imagebus support
> we can refer them to the work you did in soc-camera and we'll see what
> happens.

You know how it happens - some authors do not know about some hidden code, 
during the review no one realises that they are re-implementing it... 
Eventually you end up with duplicated, customised, sub-optimal code. A fresh 
example - the whole soc-camera framework:-) I only learned about 
int-device after soc-camera had already been submitted in essentially its 
final form. And I did ask on the lists whether there was any code for such 
systems:-)

I do not quite understand what disturbs you about making this API global. 
It is a completely internal API - no exposure to user-space. We can modify 
or remove it any time.

Then think about wider exposure, testing. If you like we can make it a 
separate module and make soc-camera select it. And we can always degrade 
it back to soc-camera-specific:-)
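For reference, the packing semantics of the helper being debated can be modelled in a standalone user-space sketch. This mirrors the logic of the quoted v4l2_imgbus_bytes_per_line(), with enum names shortened for illustration; it is not the kernel code itself:

```c
#include <assert.h>

/* Simplified model of the bytes-per-line computation on the image bus. */
enum packing {
	PACKING_NONE,		/* samples stored back to back */
	PACKING_2X8_PADHI,	/* sample in 2 bytes, padding bits high */
	PACKING_2X8_PADLO,	/* sample in 2 bytes, padding bits low */
	PACKING_EXTEND16,	/* sample extended to 16 bits */
};

/* Bytes needed to store one line of 'width' samples, or -1 on error. */
static int bytes_per_line(unsigned int width, enum packing packing,
			  unsigned int bits_per_sample)
{
	switch (packing) {
	case PACKING_NONE:
		return width * bits_per_sample / 8;
	case PACKING_2X8_PADHI:
	case PACKING_2X8_PADLO:
	case PACKING_EXTEND16:
		/* all three variants store each sample in two bytes */
		return width * 2;
	}
	return -1;
}
```

So a 640-pixel line of unpacked 8-bit samples occupies 640 bytes, while 10-bit samples padded or extended to two bytes occupy 1280 bytes.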

> > > One other comment to throw into the pot: what about calling this just
> > > V4L2_BUS_FMT...? So imgbus becomes just bus. For some reason I find imgbus a
> > > bit odd. Probably because I think of it more as a video bus or even as a more
> > > general data bus. For all I know it might be used in the future to choose
> > > between different types of histogram data or something like that.
> > 
> > It might well not be the best namespace choice. But just "bus" OTOH seems 
> > way too generic to me. Maybe some (multi)mediabus? Or is even that too 
> > generic? It certainly depends on the scope which we foresee for this API.
> 
> Hmm, I like that: 'mediabus'. Much better IMHO than imagebus. Image bus is
> too specific to sensor, I think. Media bus is more generic (also for video
> and audio formats), but it still clearly refers to the media data flowing
> over the bus rather than e.g. control data.

Well, do we really think it might ever become relevant for audio? We're 
having problems adopting it generically even for video:-)

> > > > +	V4L2_IMGBUS_FMT_YUYV,
> > > > +	V4L2_IMGBUS_FMT_YVYU,
> > > > +	V4L2_IMGBUS_FMT_UYVY,
> > > > +	V4L2_IMGBUS_FMT_VYUY,
> > > > +	V4L2_IMGBUS_FMT_VYUY_SMPTE170M_8,
> > > > +	V4L2_IMGBUS_FMT_VYUY_SMPTE170M_16,
> > > > +	V4L2_IMGBUS_FMT_RGB555,
> > > > +	V4L2_IMGBUS_FMT_RGB555X,
> > > > +	V4L2_IMGBUS_FMT_RGB565,
> > > > +	V4L2_IMGBUS_FMT_RGB565X,
> > > > +	V4L2_IMGBUS_FMT_SBGGR8,
> > > > +	V4L2_IMGBUS_FMT_SGBRG8,
> > > > +	V4L2_IMGBUS_FMT_SGRBG8,
> > > > +	V4L2_IMGBUS_FMT_SRGGB8,
> > > > +	V4L2_IMGBUS_FMT_SBGGR10,
> > > > +	V4L2_IMGBUS_FMT_SGBRG10,
> > > > +	V4L2_IMGBUS_FMT_SGRBG10,
> > > > +	V4L2_IMGBUS_FMT_SRGGB10,
> > > > +	V4L2_IMGBUS_FMT_GREY,
> > > > +	V4L2_IMGBUS_FMT_Y16,
> > > > +	V4L2_IMGBUS_FMT_Y10,
> > > > +	V4L2_IMGBUS_FMT_SBGGR10_2X8_PADHI_BE,
> > > > +	V4L2_IMGBUS_FMT_SBGGR10_2X8_PADLO_BE,
> > > > +	V4L2_IMGBUS_FMT_SBGGR10_2X8_PADHI_LE,
> > > > +	V4L2_IMGBUS_FMT_SBGGR10_2X8_PADLO_LE,
> > > 
> > > Obviously the meaning of these formats needs to be documented in this header
> > > as well. Are all these imgbus formats used? Anything that is not used shouldn't
> > > be in this list IMHO.
> > 
> > A few of them are, yes; some might not actually be used, but have been
> > added for completeness. We can have a better look at them and maybe throw a
> > couple of them away, yes.
> > 
> > Document - yes. But, please, under linux/Documentation/video4linux/.
> 
> The problem is that people will forget to add it to the documentation when
> they add new formats. We have that problem already with PIXFMT, and there you
> actually get an error or warning when building the documentation.
> 
> I think that the chances of keeping the documentation up to date are much
> higher if we document it at the same place that these formats are defined.

Ah, you mean docbook in the code - sure, better yet. I meant in the kernel 
as opposed to the hg documentation collection.

> > > > +};
> > > > +
> > > > +/**
> > > > + * struct v4l2_imgbus_pixelfmt - Data format on the image bus
> > > > + * @fourcc:		Fourcc code...
> > > > + * @colorspace:		and colorspace, that will be obtained if the data is
> > > > + *			stored in memory in the following way:
> > > > + * @bits_per_sample:	How many bits the bridge has to sample
> > > > + * @packing:		Type of sample-packing, that has to be used
> > > > + * @order:		Sample order when storing in memory
> > > > + */
> > > > +struct v4l2_imgbus_pixelfmt {
> > > > +	u32				fourcc;
> > > > +	enum v4l2_colorspace		colorspace;
> > > > +	const char			*name;
> > > > +	enum v4l2_imgbus_packing	packing;
> > > > +	enum v4l2_imgbus_order		order;
> > > > +	u8				bits_per_sample;
> > > > +};
> > > 
> > > Ditto for this struct. Note that the colorspace field should be moved to
> > > imgbus_framefmt.
> > 
> > Hm, not sure. Consider a simple scenario: user issues S_FMT. Host driver 
> > cannot handle that pixel-format in a "special" way, so, it goes for 
> > "pass-through," so it has to find an enum v4l2_imgbus_pixelcode value, 
> > from which it can generate the requested pixel-format _and_ colorspace. To 
> > do that it scans the internal pixel/data format translation table to look 
> > for the specific pixel-format and colorspace value, and issues 
> > s_imgbus_fmt to the client with the respective pixelcode.
> > 
> > Of course, this could also be done differently. In fact, I just do not 
> > know what client drivers know about colorspaces. Are they fixed per data 
> > format, and thus also uniquely defined by the latter? If so, no 
> > client-visible struct needs it. If some pixelcodes can exist with 
> > different colorspaces, then yes, we might want to pass the colorspace with 
> > s_imgbus_fmt in struct v4l2_imgbus_framefmt instead of allocating separate 
> > pixelcodes for them.
> 
> Yes, some video devices have image bus formats that can deliver different
> colorspaces. For example, HDMI receivers will typically get information of
> the colorspace as part of the datastream. So the same YCbCr bus format might
> use either the ITU601 or ITU709 colorspace.
> 
> Typically for receivers calling g_imgbus_fmt() will return the colorspace
> it currently receives but it will ignore any attempt to set the colorspace.
> 
> When programming a HDMI transmitter you will typically have to provide the
> colorspace when you set the format since it needs that information to fill
> in the colorspace information that it will generate in the datastream.
> 
> What is not needed is that you attempt to match a pixelformat to a busformat
> and colorspace pair. You can ignore the colorspace for that.

Ok, thanks, I'll change that.

Thanks
Guennadi
---
Guennadi Liakhovetski, Ph.D.
Freelance Open-Source Software Developer
http://www.open-technology.de/


* Re: [PATCH/RFC 7/9 v2] v4l: add an image-bus API for configuring v4l2 subdev pixel and frame formats
  2009-11-19 22:33         ` Guennadi Liakhovetski
@ 2009-11-20 12:29           ` Hans Verkuil
  2009-11-20 15:07             ` Karicheri, Muralidharan
  0 siblings, 1 reply; 51+ messages in thread
From: Hans Verkuil @ 2009-11-20 12:29 UTC (permalink / raw)
  To: Guennadi Liakhovetski
  Cc: Linux Media Mailing List, Laurent Pinchart, Sakari Ailus,
	Muralidharan Karicheri

On Thursday 19 November 2009 23:33:22 Guennadi Liakhovetski wrote:
> Hi Hans
>
> On Sun, 15 Nov 2009, Hans Verkuil wrote:
>
> [snip]
>
> > > > > +s32 v4l2_imgbus_bytes_per_line(u32 width,
> > > > > +			       const struct v4l2_imgbus_pixelfmt *imgf)
> > > > > +{
> > > > > +	switch (imgf->packing) {
> > > > > +	case V4L2_IMGBUS_PACKING_NONE:
> > > > > +		return width * imgf->bits_per_sample / 8;
> > > > > +	case V4L2_IMGBUS_PACKING_2X8_PADHI:
> > > > > +	case V4L2_IMGBUS_PACKING_2X8_PADLO:
> > > > > +	case V4L2_IMGBUS_PACKING_EXTEND16:
> > > > > +		return width * 2;
> > > > > +	}
> > > > > +	return -EINVAL;
> > > > > +}
> > > > > +EXPORT_SYMBOL(v4l2_imgbus_bytes_per_line);
> > > >
> > > > As you know, I am not convinced that this code belongs in the core.
> > > > I do not think this translation from IMGBUS to PIXFMT is generic
> > > > enough. However, if you just make this part of soc-camera then I am
> > > > OK with this.
> > >
> > > Are you referring to a specific function like
> > > v4l2_imgbus_bytes_per_line or to the whole v4l2-imagebus.c?
> >
> > I'm referring to the whole file.
> >
> > > The whole file and the
> > > v4l2_imgbus_get_fmtdesc() function must be available to all drivers,
> > > not just to soc-camera, if we want to use {enum,g,s,try}_imgbus_fmt
> > > API in other drivers too, and we do want to use them, if we want to
> > > re-use client drivers.
> >
> > The sub-device drivers do not need this source. They just need to
> > report the supported image bus formats. And I am far from convinced
> > that other bridge drivers can actually reuse your v4l2-imagebus.c code.
>
> You mean, all non-soc-camera bridge drivers only handle special client
> formats, no generic pass-through?

That's correct. It's never been a problem until now. Usually the format is
fixed, so there is nothing to configure.

> What about other SoC v4l host drivers, 
> not using soc-camera, and willing to switch to v4l2-subdev? Like OMAPs,
> etc? I'm sure they would want to be able to use the pass-through mode

And if they can reuse your code, then we will rename it to v4l2-imagebus.c.

But I have my doubts about that. I don't like that code, but I also don't 
have the time to think about a better alternative. As long as it is
soc-camera specific, then I don't mind. And if omap3 can reuse it, then I 
clearly was wrong and we can rename it and make it part of the core 
framework.

> > If they can, then we can always rename it from e.g. soc-imagebus.c to
> > v4l2-imagebus.c. Right now I prefer to keep it inside soc-camera where
> > is clearly does work and when other people start implementing imagebus
> > support, then we can refer them to the work you did in soc-camera and
> > we'll see what happens.
>
> You know how it happens - some authors do not know about some hidden
> code, during the review no one realises that they are re-implementing
> that... Eventually you end up with duplicated customised sub-optimal
> code. Fresh example - the whole soc-camera framework:-) I only learned
> about int-device after soc-camera has already been submitted in its
> submission form. And I did ask on lists whether there was any code for
> such systems:-)

All the relevant omap developers are CC-ed in this discussion, and I'm also 
paying fairly close attention to anything SoC related.

> I do not quite understand what disturbs you about making this API global.
> It is a completely internal API - no exposure to user-space. We can
> modify or remove it any time.
>
> Then think about wider exposure, testing. If you like we can make it a
> separate module and make soc-camera select it. And we can always degrade
> it back to soc-camera-specific:-)

Making this API global means that it becomes part of the framework. And I 
want to pay a lot more attention to that code than we did in the past. So I 
have to be convinced that it is code that is really reusable by other 
drivers. And I am not convinced about that. Since I know omap3 will need 
this soon, I want to wait for their experiences with your code before 
making this part of the framework.

> > > > One other comment to throw into the pot: what about calling this
> > > > just V4L2_BUS_FMT...? So imgbus becomes just bus. For some reason I
> > > > find imgbus a bit odd. Probably because I think of it more as a
> > > > video bus or even as a more general data bus. For all I know it
> > > > might be used in the future to choose between different types of
> > > > histogram data or something like that.
> > >
> > > It might well not be the best namespace choice. But just "bus" OTOH
> > > seems way too generic to me. Maybe some (multi)mediabus? Or is even
> > > that too generic? It certainly depends on the scope which we foresee
> > > for this API.
> >
> > Hmm, I like that: 'mediabus'. Much better IMHO than imagebus. Image bus
> > is too specific to sensor, I think. Media bus is more generic (also for
> > video and audio formats), but it still clearly refers to the media data
> > flowing over the bus rather than e.g. control data.
>
> Well, do we really think it might ever become relevant for audio? We're
> having problems adopting it generically for video even:-)

Or VBI data, or whatever we might need in the future that is related to 
media.

> > > > > +	V4L2_IMGBUS_FMT_YUYV,
> > > > > +	V4L2_IMGBUS_FMT_YVYU,
> > > > > +	V4L2_IMGBUS_FMT_UYVY,
> > > > > +	V4L2_IMGBUS_FMT_VYUY,
> > > > > +	V4L2_IMGBUS_FMT_VYUY_SMPTE170M_8,
> > > > > +	V4L2_IMGBUS_FMT_VYUY_SMPTE170M_16,
> > > > > +	V4L2_IMGBUS_FMT_RGB555,
> > > > > +	V4L2_IMGBUS_FMT_RGB555X,
> > > > > +	V4L2_IMGBUS_FMT_RGB565,
> > > > > +	V4L2_IMGBUS_FMT_RGB565X,
> > > > > +	V4L2_IMGBUS_FMT_SBGGR8,
> > > > > +	V4L2_IMGBUS_FMT_SGBRG8,
> > > > > +	V4L2_IMGBUS_FMT_SGRBG8,
> > > > > +	V4L2_IMGBUS_FMT_SRGGB8,
> > > > > +	V4L2_IMGBUS_FMT_SBGGR10,
> > > > > +	V4L2_IMGBUS_FMT_SGBRG10,
> > > > > +	V4L2_IMGBUS_FMT_SGRBG10,
> > > > > +	V4L2_IMGBUS_FMT_SRGGB10,
> > > > > +	V4L2_IMGBUS_FMT_GREY,
> > > > > +	V4L2_IMGBUS_FMT_Y16,
> > > > > +	V4L2_IMGBUS_FMT_Y10,
> > > > > +	V4L2_IMGBUS_FMT_SBGGR10_2X8_PADHI_BE,
> > > > > +	V4L2_IMGBUS_FMT_SBGGR10_2X8_PADLO_BE,
> > > > > +	V4L2_IMGBUS_FMT_SBGGR10_2X8_PADHI_LE,
> > > > > +	V4L2_IMGBUS_FMT_SBGGR10_2X8_PADLO_LE,
> > > >
> > > > Obviously the meaning of these formats needs to be documented in
> > > > this header as well. Are all these imgbus formats used? Anything
> > > > that is not used shouldn't be in this list IMHO.
> > >
> > > A few of them are, yes; some might not actually be used, but have
> > > been added for completeness. We can have a better look at them and
> > > maybe throw a couple of them away, yes.
> > >
> > > Document - yes. But, please, under linux/Documentation/video4linux/.
> >
> > The problem is that people will forget to add it to the documentation
> > when they add new formats. We have that problem already with PIXFMT,
> > and there you actually get an error or warning when building the
> > documentation.
> >
> > I think that the chances of keeping the documentation up to date are
> > much higher if we document it at the same place that these formats are
> > defined.
>
> Ah, you mean docbook in the code - sure, better yet. I meant in the
> kernel as opposed to the hg documentation collection.
>
> > > > > +};
> > > > > +
> > > > > +/**
> > > > > + * struct v4l2_imgbus_pixelfmt - Data format on the image bus
> > > > > + * @fourcc:		Fourcc code...
> > > > > + * @colorspace:		and colorspace, that will be obtained if the data is
> > > > > + *			stored in memory in the following way:
> > > > > + * @bits_per_sample:	How many bits the bridge has to sample
> > > > > + * @packing:		Type of sample-packing, that has to be used
> > > > > + * @order:		Sample order when storing in memory
> > > > > + */
> > > > > +struct v4l2_imgbus_pixelfmt {
> > > > > +	u32				fourcc;
> > > > > +	enum v4l2_colorspace		colorspace;
> > > > > +	const char			*name;
> > > > > +	enum v4l2_imgbus_packing	packing;
> > > > > +	enum v4l2_imgbus_order		order;
> > > > > +	u8				bits_per_sample;
> > > > > +};
> > > >
> > > > Ditto for this struct. Note that the colorspace field should be
> > > > moved to imgbus_framefmt.
> > >
> > > Hm, not sure. Consider a simple scenario: user issues S_FMT. Host
> > > driver cannot handle that pixel-format in a "special" way, so, it
> > > goes for "pass-through," so it has to find an enum
> > > v4l2_imgbus_pixelcode value, from which it can generate the requested
> > > pixel-format _and_ colorspace. To do that it scans the internal
> > > pixel/data format translation table to look for the specific
> > > pixel-format and colorspace value, and issues s_imgbus_fmt to the
> > > client with the respective pixelcode.
> > >
> > > Of course, this could also be done differently. In fact, I just do
> > > not know what client drivers know about colorspaces. Are they fixed
> > > per data format, and thus also uniquely defined by the latter? If so,
> > > no client-visible struct needs it. If some pixelcodes can exist with
> > > different colorspaces, then yes, we might want to pass the colorspace
> > > with s_imgbus_fmt in struct v4l2_imgbus_framefmt instead of
> > > allocating separate pixelcodes for them.
> >
> > Yes, some video devices have image bus formats that can deliver
> > different colorspaces. For example, HDMI receivers will typically get
> > information of the colorspace as part of the datastream. So the same
> > YCbCr bus format might use either the ITU601 or ITU709 colorspace.
> >
> > Typically for receivers calling g_imgbus_fmt() will return the
> > colorspace it currently receives but it will ignore any attempt to set
> > the colorspace.
> >
> > When programming a HDMI transmitter you will typically have to provide
> > the colorspace when you set the format since it needs that information
> > to fill in the colorspace information that it will generate in the
> > datastream.
> >
> > What is not needed is that you attempt to match a pixelformat to a
> > busformat and colorspace pair. You can ignore the colorspace for that.
>
> Ok, thanks, I'll change that.

We are really almost there: rename imgbus to mediabus and rename 
v4l2-imagebus.c to soc-mediabus.c (which we might change back in the 
future). It would be really nice to get this in for 2.6.33.

Regards,

	Hans

-- 
Hans Verkuil - video4linux developer - sponsored by TANDBERG


* RE: [PATCH/RFC 7/9 v2] v4l: add an image-bus API for configuring v4l2 subdev pixel and frame formats
  2009-11-20 12:29           ` Hans Verkuil
@ 2009-11-20 15:07             ` Karicheri, Muralidharan
  0 siblings, 0 replies; 51+ messages in thread
From: Karicheri, Muralidharan @ 2009-11-20 15:07 UTC (permalink / raw)
  To: Hans Verkuil, Guennadi Liakhovetski
  Cc: Linux Media Mailing List, Laurent Pinchart, Sakari Ailus

Hi,

I guess this is only one part of the required API support for setting
bus configuration for which I had sent an RFC some time back. I am sure
we need to set bus image/data format in vpfe/vpbe drivers of DMxxx.
I am starting to do more upstream work for the vpfe capture & display drivers
and would like to submit an updated RFC for bus configuration. I am not sure
if someone is already working on that RFC.

Looks like we need to have two APIs at the sub-device level for handling this:
one for the image data format (which is addressed by this RFC) and another for hardware signals like polarities, bus type, etc. Any comments?
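A rough sketch of what the second, hardware-signal API could carry is given below. This is purely illustrative - the struct, enum, and flag names are invented for this example and do not correspond to an existing kernel interface:

```c
#include <assert.h>

/* Hypothetical bus hardware configuration, kept separate from the image
 * data format: bus type, data width, and signal polarities. */

enum bus_type { BUS_PARALLEL, BUS_BT656 };

#define FLAG_HSYNC_ACTIVE_HIGH	(1u << 0)
#define FLAG_VSYNC_ACTIVE_HIGH	(1u << 1)
#define FLAG_PCLK_SAMPLE_RISING	(1u << 2)

struct bus_hw_config {
	enum bus_type type;
	unsigned int data_width;	/* number of data lines used */
	unsigned int flags;		/* polarity flags above */
};

/* A client and a host would each advertise the signal configurations
 * they support; the bridge driver then picks from the intersection. */
static unsigned int common_flags(unsigned int host, unsigned int client)
{
	return host & client;
}
```

In such a scheme the image-format negotiation (this RFC) and the signal negotiation could be exercised independently, which matches the two-API split suggested above.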

BTW, I haven't had a chance to go over Guennadi's RFC for the bus image format
so far, and hope to spend some time on it next week.

Murali Karicheri
Software Design Engineer
Texas Instruments Inc.
Germantown, MD 20874
phone: 301-407-9583
email: m-karicheri2@ti.com

>-----Original Message-----
>From: Hans Verkuil [mailto:hverkuil@xs4all.nl]
>Sent: Friday, November 20, 2009 7:29 AM
>To: Guennadi Liakhovetski
>Cc: Linux Media Mailing List; Laurent Pinchart; Sakari Ailus; Karicheri,
>Muralidharan
>Subject: Re: [PATCH/RFC 7/9 v2] v4l: add an image-bus API for configuring
>v4l2 subdev pixel and frame formats
>
>On Thursday 19 November 2009 23:33:22 Guennadi Liakhovetski wrote:
>> Hi Hans
>>
>> On Sun, 15 Nov 2009, Hans Verkuil wrote:
>>
>> [snip]
>>
>> > > > > +s32 v4l2_imgbus_bytes_per_line(u32 width,
>> > > > > +			       const struct v4l2_imgbus_pixelfmt *imgf)
>> > > > > +{
>> > > > > +	switch (imgf->packing) {
>> > > > > +	case V4L2_IMGBUS_PACKING_NONE:
>> > > > > +		return width * imgf->bits_per_sample / 8;
>> > > > > +	case V4L2_IMGBUS_PACKING_2X8_PADHI:
>> > > > > +	case V4L2_IMGBUS_PACKING_2X8_PADLO:
>> > > > > +	case V4L2_IMGBUS_PACKING_EXTEND16:
>> > > > > +		return width * 2;
>> > > > > +	}
>> > > > > +	return -EINVAL;
>> > > > > +}
>> > > > > +EXPORT_SYMBOL(v4l2_imgbus_bytes_per_line);
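[Archive note: the packing semantics under discussion can be sketched in
plain C. This is a hypothetical standalone re-implementation for
illustration only - the enum and function names below are not the kernel
symbols:]

```c
#include <assert.h>
#include <stdint.h>

/*
 * Standalone sketch of the bytes-per-line logic quoted above: with no
 * packing the line length follows directly from the sample width, while
 * the 2X8_PADHI/PADLO and EXTEND16 packings store each sample in two
 * bytes regardless of how many of the 16 bits carry data.
 */
enum packing {
	PACKING_NONE,
	PACKING_2X8_PADHI,
	PACKING_2X8_PADLO,
	PACKING_EXTEND16,
};

static long bytes_per_line(uint32_t width, uint8_t bits_per_sample,
			   enum packing p)
{
	switch (p) {
	case PACKING_NONE:
		return (long)width * bits_per_sample / 8;
	case PACKING_2X8_PADHI:
	case PACKING_2X8_PADLO:
	case PACKING_EXTEND16:
		return (long)width * 2;
	}
	return -1;	/* stands in for the kernel's -EINVAL */
}
```

[PADHI/PADLO are understood here as a 10-bit (or narrower) sample padded
in the high or low byte of a two-byte word, which is why both map to
width * 2.]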
>> > > >
>> > > > As you know, I am not convinced that this code belongs in the core.
>> > > > I do not think this translation from IMGBUS to PIXFMT is generic
>> > > > enough. However, if you just make this part of soc-camera then I am
>> > > > OK with this.
>> > >
>> > > Are you referring to a specific function like
>> > > v4l2_imgbus_bytes_per_line or to the whole v4l2-imagebus.c?
>> >
>> > I'm referring to the whole file.
>> >
>> > > The whole file and the
>> > > v4l2_imgbus_get_fmtdesc() function must be available to all drivers,
>> > > not just to soc-camera, if we want to use {enum,g,s,try}_imgbus_fmt
>> > > API in other drivers too, and we do want to use them, if we want to
>> > > re-use client drivers.
>> >
>> > The sub-device drivers do not need this source. They just need to
>> > report the supported image bus formats. And I am far from convinced
>> > that other bridge drivers can actually reuse your v4l2-imagebus.c code.
>>
>> You mean, all non-soc-camera bridge drivers only handle special client
>> formats, no generic pass-through?
>
>That's correct. It's never been a problem until now. Usually the format is
>fixed, so there is nothing to configure.
>
>> What about other SoC v4l host drivers,
>> not using soc-camera, and willing to switch to v4l2-subdev? Like OMAPs,
>> etc? I'm sure they would want to be able to use the pass-through mode.
>
>And if they can reuse your code, then we will rename it to v4l2-busimage.c
>
>But I have my doubts about that. I don't like that code, but I also don't
>have the time to think about a better alternative. As long as it is
>soc-camera specific, then I don't mind. And if omap3 can reuse it, then I
>clearly was wrong and we can rename it and make it part of the core
>framework.
>
>> > If they can, then we can always rename it from e.g. soc-imagebus.c to
>> > v4l2-imagebus.c. Right now I prefer to keep it inside soc-camera where
>> > is clearly does work and when other people start implementing imagebus
>> > support, then we can refer them to the work you did in soc-camera and
>> > we'll see what happens.
>>
>> You know how it happens - some authors do not know about some hidden
>> code, during the review no one realises that they are re-implementing
>> that... Eventually you end up with duplicated customised sub-optimal
>> code. Fresh example - the whole soc-camera framework:-) I only learned
>> about int-device after soc-camera had already been submitted in its
>> submission form. And I did ask on lists whether there was any code for
>> such systems:-)
>
>All the relevant omap developers are CC-ed in this discussion, and I'm also
>paying fairly close attention to anything SoC related.
>
>> I do not quite understand what disturbs you about making this API global.
>> It is a completely internal API - no exposure to user-space. We can
>> modify or remove it any time.
>>
>> Then think about wider exposure, testing. If you like we can make it a
>> separate module and make soc-camera select it. And we can always degrade
>> it back to soc-camera-specific:-)
>
>Making this API global means that it becomes part of the framework. And I
>want to pay a lot more attention to that code than we did in the past. So I
>have to be convinced that it is code that is really reusable by other
>drivers. And I am not convinced about that. Since I know omap3 will need
>this soon, I want to wait for their experiences with your code before
>making this part of the framework.
>
>> > > > One other comment to throw into the pot: what about calling this
>> > > > just V4L2_BUS_FMT...? So imgbus becomes just bus. For some reason I
>> > > > find imgbus a bit odd. Probably because I think of it more as a
>> > > > video bus or even as a more general data bus. For all I know it
>> > > > might be used in the future to choose between different types of
>> > > > histogram data or something like that.
>> > >
>> > > It might well be not the best namespace choice. But just "bus" OTOH
>> > > seems way too generic to me. Maybe some (multi)mediabus? Or is even
>> > > that too generic? It certainly depends on the scope which we foresee
>> > > for this API.
>> >
>> > Hmm, I like that: 'mediabus'. Much better IMHO than imagebus. Image bus
>> > is too specific to sensor, I think. Media bus is more generic (also for
>> > video and audio formats), but it still clearly refers to the media data
>> > flowing over the bus rather than e.g. control data.
>>
>> Well, do we really think it might ever become relevant for audio? We're
>> having problems adopting it generically for video even:-)
>
>Or VBI data, or whatever we might need in the future that is related to
>media.
>
>> > > > > +	V4L2_IMGBUS_FMT_YUYV,
>> > > > > +	V4L2_IMGBUS_FMT_YVYU,
>> > > > > +	V4L2_IMGBUS_FMT_UYVY,
>> > > > > +	V4L2_IMGBUS_FMT_VYUY,
>> > > > > +	V4L2_IMGBUS_FMT_VYUY_SMPTE170M_8,
>> > > > > +	V4L2_IMGBUS_FMT_VYUY_SMPTE170M_16,
>> > > > > +	V4L2_IMGBUS_FMT_RGB555,
>> > > > > +	V4L2_IMGBUS_FMT_RGB555X,
>> > > > > +	V4L2_IMGBUS_FMT_RGB565,
>> > > > > +	V4L2_IMGBUS_FMT_RGB565X,
>> > > > > +	V4L2_IMGBUS_FMT_SBGGR8,
>> > > > > +	V4L2_IMGBUS_FMT_SGBRG8,
>> > > > > +	V4L2_IMGBUS_FMT_SGRBG8,
>> > > > > +	V4L2_IMGBUS_FMT_SRGGB8,
>> > > > > +	V4L2_IMGBUS_FMT_SBGGR10,
>> > > > > +	V4L2_IMGBUS_FMT_SGBRG10,
>> > > > > +	V4L2_IMGBUS_FMT_SGRBG10,
>> > > > > +	V4L2_IMGBUS_FMT_SRGGB10,
>> > > > > +	V4L2_IMGBUS_FMT_GREY,
>> > > > > +	V4L2_IMGBUS_FMT_Y16,
>> > > > > +	V4L2_IMGBUS_FMT_Y10,
>> > > > > +	V4L2_IMGBUS_FMT_SBGGR10_2X8_PADHI_BE,
>> > > > > +	V4L2_IMGBUS_FMT_SBGGR10_2X8_PADLO_BE,
>> > > > > +	V4L2_IMGBUS_FMT_SBGGR10_2X8_PADHI_LE,
>> > > > > +	V4L2_IMGBUS_FMT_SBGGR10_2X8_PADLO_LE,
>> > > >
>> > > > Obviously the meaning of these formats need to be documented in
>> > > > this header as well. Are all these imgbus formats used? Anything
>> > > > that is not used shouldn't be in this list IMHO.
>> > >
>> > > A few of them are, yes, some might not actually be used yet, but have
>> > > been added for completeness. We can have a better look at them and
>> > > maybe throw a couple of them away, yes.
>> > >
>> > > Document - yes. But, please, under linux/Documentation/video4linux/.
>> >
>> > The problem is that people will forget to add it to the documentation
>> > when they add new formats. We have that problem already with PIXFMT,
>> > and there you actually get an error or warning when building the
>> > documentation.
>> >
>> > I think that the chances of keeping the documentation up to date are
>> > much higher if we document it at the same place that these formats are
>> > defined.
>>
>> Ah, you mean docbook in the code - sure, better yet. I meant in the
>> kernel as opposed to the hg documentation collection.
>>
>> > > > > +};
>> > > > > +
>> > > > > +/**
>> > > > > + * struct v4l2_imgbus_pixelfmt - Data format on the image bus
>> > > > > + * @fourcc:		Fourcc code...
>> > > > > + * @colorspace:		and colorspace, that will be obtained if the
>> > > > > + *			data is stored in memory in the following way:
>> > > > > + * @bits_per_sample:	How many bits the bridge has to sample
>> > > > > + * @packing:		Type of sample-packing, that has to be used
>> > > > > + * @order:		Sample order when storing in memory
>> > > > > + */
>> > > > > +struct v4l2_imgbus_pixelfmt {
>> > > > > +	u32				fourcc;
>> > > > > +	enum v4l2_colorspace		colorspace;
>> > > > > +	const char			*name;
>> > > > > +	enum v4l2_imgbus_packing	packing;
>> > > > > +	enum v4l2_imgbus_order		order;
>> > > > > +	u8				bits_per_sample;
>> > > > > +};
>> > > >
>> > > > Ditto for this struct. Note that the colorspace field should be
>> > > > moved to imgbus_framefmt.
>> > >
>> > > Hm, not sure. Consider a simple scenario: user issues S_FMT. Host
>> > > driver cannot handle that pixel-format in a "special" way, so, it
>> > > goes for "pass-through," so it has to find an enum
>> > > v4l2_imgbus_pixelcode value, from which it can generate the requested
>> > > pixel-format _and_ colorspace. To do that it scans the internal
>> > > pixel/data format translation table to look for the specific
>> > > pixel-format and colorspace value, and issues s_imgbus_fmt to the
>> > > client with the respective pixelcode.
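[Archive note: the "internal pixel/data format translation table" scan
described above can be sketched as a linear lookup. All names, codes and
table entries below are illustrative only, not the actual soc-camera
table:]

```c
#include <assert.h>
#include <stdint.h>
#include <stddef.h>

/*
 * Hypothetical miniature of a pixel/data format translation table: each
 * entry maps a bus pixelcode to the fourcc that pass-through storage
 * would yield in memory.
 */
struct fmt_xlate {
	uint32_t pixelcode;	/* value from the bus-format enum */
	uint32_t fourcc;	/* resulting V4L2 pixel format */
};

static const struct fmt_xlate xlate[] = {
	{ 1, 0x56595559 },	/* 'YUYV' */
	{ 2, 0x59565955 },	/* 'UYVY' */
};

/*
 * Scan the table for a pixelcode that produces the requested fourcc,
 * the way a host falling back to pass-through mode would; returns the
 * pixelcode, or -1 when no pass-through match exists.
 */
static long find_pixelcode(uint32_t fourcc)
{
	size_t i;

	for (i = 0; i < sizeof(xlate) / sizeof(xlate[0]); i++)
		if (xlate[i].fourcc == fourcc)
			return xlate[i].pixelcode;
	return -1;
}
```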
>> > >
>> > > Of course, this could also be done differently. In fact, I just do
>> > > not know what client drivers know about colorspaces. Are they fixed
>> > > per data format, and thus also uniquely defined by the latter? If so,
>> > > no client-visible struct needs it. If some pixelcodes can exist with
>> > > different colorspaces, then yes, we might want to pass the colorspace
>> > > with s_imgbus_fmt in struct v4l2_imgbus_framefmt instead of
>> > > allocating separate pixelcodes for them.
>> >
>> > Yes, some video devices have image bus formats that can deliver
>> > different colorspaces. For example, HDMI receivers will typically get
>> > information of the colorspace as part of the datastream. So the same
>> > YCbCr bus format might use either the ITU601 or ITU709 colorspace.
>> >
>> > Typically for receivers calling g_imgbus_fmt() will return the
>> > colorspace it currently receives but it will ignore any attempt to set
>> > the colorspace.
>> >
>> > When programming a HDMI transmitter you will typically have to provide
>> > the colorspace when you set the format since it needs that information
>> > to fill in the colorspace information that it will generate in the
>> > datastream.
>> >
>> > What is not needed is that you attempt to match a pixelformat to a
>> > busformat and colorspace pair. You can ignore the colorspace for that.
>>
>> Ok, thanks, I'll change that.
>
>We are really almost there: rename imgbus to mediabus and rename
>v4l2-imagebus.c to soc-mediabus.c (which we might change back in the
>future). It would be really nice to get this in for 2.6.33.
>
>Regards,
>
>	Hans
>
>--
>Hans Verkuil - video4linux developer - sponsored by TANDBERG


^ permalink raw reply	[flat|nested] 51+ messages in thread

end of thread, other threads:[~2009-11-20 15:07 UTC | newest]

Thread overview: 51+ messages (download: mbox.gz / follow: Atom feed)
-- links below jump to the message on this page --
2009-10-30 14:00 [PATCH/RFC 0/9 v2] Image-bus API and accompanying soc-camera patches Guennadi Liakhovetski
2009-10-30 14:01 ` [PATCH 1/9] soc-camera: remove no longer needed struct members Guennadi Liakhovetski
2009-10-30 14:01 ` [PATCH 2/9] v4l: add new v4l2-subdev sensor operations, use g_skip_top_lines in soc-camera Guennadi Liakhovetski
2009-10-30 14:43   ` Karicheri, Muralidharan
2009-10-30 20:31     ` Guennadi Liakhovetski
2009-11-02 16:14       ` Karicheri, Muralidharan
2009-11-04 19:11         ` Guennadi Liakhovetski
2009-11-10 12:55   ` Laurent Pinchart
2009-11-10 14:11     ` Guennadi Liakhovetski
2009-10-30 14:01 ` [PATCH 3/9] soc-camera: fix multi-line comment coding style Guennadi Liakhovetski
2009-10-30 14:01 ` [PATCH 4/9] v4l: Add a 10-bit monochrome and missing 8- and 10-bit Bayer fourcc codes Guennadi Liakhovetski
2009-11-05 14:45   ` Hans Verkuil
2009-11-05 16:29     ` Guennadi Liakhovetski
2009-11-05 16:32       ` Hans Verkuil
2009-10-30 14:01 ` [PATCH 5/9] soc-camera: add a private field to struct soc_camera_link Guennadi Liakhovetski
2009-10-30 14:01 ` [PATCH 6/9] soc-camera: switch drivers and platforms to use .priv in " Guennadi Liakhovetski
2009-10-30 14:01 ` [PATCH/RFC 7/9 v2] v4l: add an image-bus API for configuring v4l2 subdev pixel and frame formats Guennadi Liakhovetski
2009-11-05 15:41   ` Hans Verkuil
2009-11-05 16:51     ` Guennadi Liakhovetski
2009-11-05 18:11       ` Hans Verkuil
2009-11-05 18:56         ` Guennadi Liakhovetski
2009-11-06  6:47           ` Hans Verkuil
2009-11-06  7:42             ` Guennadi Liakhovetski
2009-11-06  8:28               ` Hans Verkuil
2009-11-10 13:51   ` Laurent Pinchart
2009-11-10 14:28     ` Guennadi Liakhovetski
2009-11-11  7:55   ` Hans Verkuil
2009-11-12  8:08     ` Guennadi Liakhovetski
2009-11-15 16:23       ` Hans Verkuil
2009-11-19 22:33         ` Guennadi Liakhovetski
2009-11-20 12:29           ` Hans Verkuil
2009-11-20 15:07             ` Karicheri, Muralidharan
2009-10-30 14:01 ` [PATCH/RFC 8/9 v2] soc-camera: convert to the new imagebus API Guennadi Liakhovetski
2009-10-30 18:31   ` [PATCH/RFC 8a/9 " Guennadi Liakhovetski
2009-10-30 18:34   ` [PATCH/RFC 8b/9 v3] rj54n1cb0c: Add cropping, auto white balance, restrict sizes, add platform data Guennadi Liakhovetski
2009-10-30 14:01 ` [PATCH/RFC 9/9] mt9t031: make the use of the soc-camera client API optional Guennadi Liakhovetski
2009-10-30 15:28   ` Karicheri, Muralidharan
2009-10-30 20:25     ` Guennadi Liakhovetski
2009-11-02 16:05       ` Karicheri, Muralidharan
2009-11-04 16:49         ` [PATCH/RFC 9/9 v2] " Guennadi Liakhovetski
2009-11-04 16:57           ` Karicheri, Muralidharan
2009-11-04 17:53             ` Guennadi Liakhovetski
2009-11-05  0:04               ` Guennadi Liakhovetski
2009-11-05 15:57           ` Hans Verkuil
2009-11-05 16:59             ` Guennadi Liakhovetski
2009-11-05 17:07               ` Karicheri, Muralidharan
2009-11-10 13:54                 ` Laurent Pinchart
2009-11-10 14:36                   ` Guennadi Liakhovetski
2009-11-05 15:46   ` [PATCH/RFC 9/9] " Hans Verkuil
2009-10-30 14:34 ` [PATCH/RFC 0/9 v2] Image-bus API and accompanying soc-camera patches Karicheri, Muralidharan
2009-10-30 20:12   ` Guennadi Liakhovetski
