* Conversion between unsigned and signed for "ff_effects_max" in uinput and input.
@ 2015-07-16 22:09 Elias Vanderstuyft
  2015-07-16 22:50 ` Dmitry Torokhov
  0 siblings, 1 reply; 2+ messages in thread
From: Elias Vanderstuyft @ 2015-07-16 22:09 UTC (permalink / raw)
  To: open list:HID CORE LAYER, vojtech, Dmitry Torokhov

Hi everyone,

I made some observations based on the following headers:

uinput.h: "(struct uinput_device).ff_effects_max" is defined as "unsigned int".
uapi/uinput.h: "(struct uinput_user_dev).ff_effects_max" is defined as "__u32".

vs

input.h: "(struct ff_device).max_effects" is defined as "int",
however, the signature of input_ff_create() in input.h is:
    int input_ff_create(struct input_dev *dev, unsigned int max_effects)

Why is "(struct ff_device).max_effects" defined as a signed integer,
instead of an unsigned integer, as defined in the majority of the headers?

Assuming that question can be settled, I would like to point out a
potential integer overflow in the assignment in
ff-core.c::input_ff_create():
    ff->max_effects = max_effects;
(http://lxr.free-electrons.com/source/drivers/input/ff-core.c#L337)
Although it is physically unlikely that max_effects would ever exceed
0x7FFFFFFF, shouldn't there be a check to prevent "ff->max_effects"
from becoming negative? Note that through uinput, the user can supply
their own value of max_effects, so adding a check might be justified.

Feedback would be nice.

Thanks,
Elias


* Re: Conversion between unsigned and signed for "ff_effects_max" in uinput and input.
  2015-07-16 22:09 Conversion between unsigned and signed for "ff_effects_max" in uinput and input Elias Vanderstuyft
@ 2015-07-16 22:50 ` Dmitry Torokhov
  0 siblings, 0 replies; 2+ messages in thread
From: Dmitry Torokhov @ 2015-07-16 22:50 UTC (permalink / raw)
  To: Elias Vanderstuyft; +Cc: open list:HID CORE LAYER, vojtech

Hi Elias,

On Fri, Jul 17, 2015 at 12:09:37AM +0200, Elias Vanderstuyft wrote:
> Hi everyone,
> 
> I made some observations based on the following headers:
> 
> uinput.h: "(struct uinput_device).ff_effects_max" is defined as "unsigned int".
> uapi/uinput.h: "(struct uinput_user_dev).ff_effects_max" is defined as "__u32".
> 
> vs
> 
> input.h: "(struct ff_device).max_effects" is defined as "int",
> however, the signature of input_ff_create() in input.h is:
>     int input_ff_create(struct input_dev *dev, unsigned int max_effects)
> 
> Why is "(struct ff_device).max_effects" defined as a signed integer,
> instead of an unsigned integer, as defined in the majority of the headers?

The effect->id is signed (because we treat -1 as a special value when
it is given to us by userspace) and so it made sense to make
max_effects the same type...

> 
> Assuming that question can be settled, I would like to point out a
> potential integer overflow in the assignment in
> ff-core.c::input_ff_create():
>     ff->max_effects = max_effects;
> (http://lxr.free-electrons.com/source/drivers/input/ff-core.c#L337)
> Although it is physically unlikely that max_effects would ever exceed
> 0x7FFFFFFF, shouldn't there be a check to prevent "ff->max_effects"
> from becoming negative? Note that through uinput, the user can supply
> their own value of max_effects, so adding a check might be justified.


Yes, we should compare against INT_MAX and fail with -EINVAL I guess.

Thanks.

-- 
Dmitry

