From: Ingo Molnar <mingo@kernel.org>
To: Mark Rutland <mark.rutland@arm.com>
Cc: Peter Zijlstra <peterz@infradead.org>, linux-arm-kernel@lists.infradead.org, linux-kernel@vger.kernel.org, aryabinin@virtuozzo.com, boqun.feng@gmail.com, catalin.marinas@arm.com, dvyukov@google.com, will.deacon@arm.com
Subject: [PATCH] locking/atomics: Clean up the atomic.h maze of #defines
Date: Sat, 5 May 2018 10:11:00 +0200	[thread overview]
Message-ID: <20180505081100.nsyrqrpzq2vd27bk@gmail.com> (raw)
In-Reply-To: <20180504180909.dnhfflibjwywnm4l@lakrids.cambridge.arm.com>

* Mark Rutland <mark.rutland@arm.com> wrote:

> On Fri, May 04, 2018 at 08:01:05PM +0200, Peter Zijlstra wrote:
> > On Fri, May 04, 2018 at 06:39:32PM +0100, Mark Rutland wrote:
> > > Currently <asm-generic/atomic-instrumented.h> only instruments the fully
> > > ordered variants of atomic functions, ignoring the {relaxed,acquire,release}
> > > ordering variants.
> > >
> > > This patch reworks the header to instrument all ordering variants of the atomic
> > > functions, so that architectures implementing these are instrumented
> > > appropriately.
> > >
> > > To minimise repetition, a macro is used to generate each variant from a common
> > > template. The {full,relaxed,acquire,release} order variants respectively are
> > > then built using this template, where the architecture provides an
> > > implementation.
> >
> >  include/asm-generic/atomic-instrumented.h | 1195 ++++++++++++++++++++++++-----
> >  1 file changed, 1008 insertions(+), 187 deletions(-)
> >
> > Is there really no way to either generate or further macro compress this?
>
> I can definitely macro compress this somewhat, but the bulk of the
> repetition will be the ifdeffery, which can't be macro'd away IIUC.

The thing is, the existing #ifdeffery is suboptimal to begin with.
I just did the following cleanups (patch attached):

 include/linux/atomic.h | 1275 +++++++++++++++++++++---------------------------
 1 file changed, 543 insertions(+), 732 deletions(-)

The gist of the changes is the following simplification of the main construct:

Before:

#ifndef atomic_fetch_dec_relaxed

#ifndef atomic_fetch_dec
#define atomic_fetch_dec(v)		atomic_fetch_sub(1, (v))
#define atomic_fetch_dec_relaxed(v)	atomic_fetch_sub_relaxed(1, (v))
#define atomic_fetch_dec_acquire(v)	atomic_fetch_sub_acquire(1, (v))
#define atomic_fetch_dec_release(v)	atomic_fetch_sub_release(1, (v))
#else /* atomic_fetch_dec */
#define atomic_fetch_dec_relaxed	atomic_fetch_dec
#define atomic_fetch_dec_acquire	atomic_fetch_dec
#define atomic_fetch_dec_release	atomic_fetch_dec
#endif /* atomic_fetch_dec */

#else /* atomic_fetch_dec_relaxed */

#ifndef atomic_fetch_dec_acquire
#define atomic_fetch_dec_acquire(...)					\
	__atomic_op_acquire(atomic_fetch_dec, __VA_ARGS__)
#endif

#ifndef atomic_fetch_dec_release
#define atomic_fetch_dec_release(...)					\
	__atomic_op_release(atomic_fetch_dec, __VA_ARGS__)
#endif

#ifndef atomic_fetch_dec
#define atomic_fetch_dec(...)						\
	__atomic_op_fence(atomic_fetch_dec, __VA_ARGS__)
#endif
#endif /* atomic_fetch_dec_relaxed */

After:

#ifndef atomic_fetch_dec_relaxed
# ifndef atomic_fetch_dec
#  define atomic_fetch_dec(v)			atomic_fetch_sub(1, (v))
#  define atomic_fetch_dec_relaxed(v)		atomic_fetch_sub_relaxed(1, (v))
#  define atomic_fetch_dec_acquire(v)		atomic_fetch_sub_acquire(1, (v))
#  define atomic_fetch_dec_release(v)		atomic_fetch_sub_release(1, (v))
# else
#  define atomic_fetch_dec_relaxed		atomic_fetch_dec
#  define atomic_fetch_dec_acquire		atomic_fetch_dec
#  define atomic_fetch_dec_release		atomic_fetch_dec
# endif
#else
# ifndef atomic_fetch_dec_acquire
#  define atomic_fetch_dec_acquire(...)	__atomic_op_acquire(atomic_fetch_dec, __VA_ARGS__)
# endif
# ifndef atomic_fetch_dec_release
#  define atomic_fetch_dec_release(...)	__atomic_op_release(atomic_fetch_dec, __VA_ARGS__)
# endif
# ifndef atomic_fetch_dec
#  define atomic_fetch_dec(...)		__atomic_op_fence(atomic_fetch_dec, __VA_ARGS__)
# endif
#endif

The new variant is readable at a glance, and the hierarchy of defines is very
obvious as well.

And I think we could do even better - there's absolutely no reason why _every_
operation has to be made conditional on a fine-grained level - they are
overridden in API groups. In fact allowing individual overrides is arguably a
source of fragility.

So we could do the following simplification on top of that:

#ifndef atomic_fetch_dec_relaxed
# ifndef atomic_fetch_dec
#  define atomic_fetch_dec(v)			atomic_fetch_sub(1, (v))
#  define atomic_fetch_dec_relaxed(v)		atomic_fetch_sub_relaxed(1, (v))
#  define atomic_fetch_dec_acquire(v)		atomic_fetch_sub_acquire(1, (v))
#  define atomic_fetch_dec_release(v)		atomic_fetch_sub_release(1, (v))
# else
#  define atomic_fetch_dec_relaxed		atomic_fetch_dec
#  define atomic_fetch_dec_acquire		atomic_fetch_dec
#  define atomic_fetch_dec_release		atomic_fetch_dec
# endif
#else
# ifndef atomic_fetch_dec
#  define atomic_fetch_dec(...)		__atomic_op_fence(atomic_fetch_dec, __VA_ARGS__)
#  define atomic_fetch_dec_acquire(...)	__atomic_op_acquire(atomic_fetch_dec, __VA_ARGS__)
#  define atomic_fetch_dec_release(...)	__atomic_op_release(atomic_fetch_dec, __VA_ARGS__)
# endif
#endif

Note how the grouping of APIs based on 'atomic_fetch_dec' is already an
assumption in the primary !atomic_fetch_dec_relaxed branch.

I much prefer such clear constructs of API mapping versus magic macros.

Thanks,

	Ingo

=============================>
From 0171d4ed840d25befaedcf03e834bb76acb400c0 Mon Sep 17 00:00:00 2001
From: Ingo Molnar <mingo@kernel.org>
Date: Sat, 5 May 2018 09:57:02 +0200
Subject: [PATCH] locking/atomics: Clean up the atomic.h maze of #defines

Use structured defines to make it all much more readable.
Before:

#ifndef atomic_fetch_dec_relaxed

#ifndef atomic_fetch_dec
#define atomic_fetch_dec(v)		atomic_fetch_sub(1, (v))
#define atomic_fetch_dec_relaxed(v)	atomic_fetch_sub_relaxed(1, (v))
#define atomic_fetch_dec_acquire(v)	atomic_fetch_sub_acquire(1, (v))
#define atomic_fetch_dec_release(v)	atomic_fetch_sub_release(1, (v))
#else /* atomic_fetch_dec */
#define atomic_fetch_dec_relaxed	atomic_fetch_dec
#define atomic_fetch_dec_acquire	atomic_fetch_dec
#define atomic_fetch_dec_release	atomic_fetch_dec
#endif /* atomic_fetch_dec */

#else /* atomic_fetch_dec_relaxed */

#ifndef atomic_fetch_dec_acquire
#define atomic_fetch_dec_acquire(...)					\
	__atomic_op_acquire(atomic_fetch_dec, __VA_ARGS__)
#endif

#ifndef atomic_fetch_dec_release
#define atomic_fetch_dec_release(...)					\
	__atomic_op_release(atomic_fetch_dec, __VA_ARGS__)
#endif

#ifndef atomic_fetch_dec
#define atomic_fetch_dec(...)						\
	__atomic_op_fence(atomic_fetch_dec, __VA_ARGS__)
#endif
#endif /* atomic_fetch_dec_relaxed */

After:

#ifndef atomic_fetch_dec_relaxed
# ifndef atomic_fetch_dec
#  define atomic_fetch_dec(v)			atomic_fetch_sub(1, (v))
#  define atomic_fetch_dec_relaxed(v)		atomic_fetch_sub_relaxed(1, (v))
#  define atomic_fetch_dec_acquire(v)		atomic_fetch_sub_acquire(1, (v))
#  define atomic_fetch_dec_release(v)		atomic_fetch_sub_release(1, (v))
# else
#  define atomic_fetch_dec_relaxed		atomic_fetch_dec
#  define atomic_fetch_dec_acquire		atomic_fetch_dec
#  define atomic_fetch_dec_release		atomic_fetch_dec
# endif
#else
# ifndef atomic_fetch_dec_acquire
#  define atomic_fetch_dec_acquire(...)	__atomic_op_acquire(atomic_fetch_dec, __VA_ARGS__)
# endif
# ifndef atomic_fetch_dec_release
#  define atomic_fetch_dec_release(...)	__atomic_op_release(atomic_fetch_dec, __VA_ARGS__)
# endif
# ifndef atomic_fetch_dec
#  define atomic_fetch_dec(...)		__atomic_op_fence(atomic_fetch_dec, __VA_ARGS__)
# endif
#endif

Beyond the linecount reduction this also makes it easier to follow the
various conditions.
Also clean up a few other minor details and make the code more consistent
throughout. No change in functionality.

Cc: Peter Zijlstra <peterz@infradead.org>
Cc: Linus Torvalds <torvalds@linux-foundation.org>
Cc: Andrew Morton <akpm@linux-foundation.org>
Cc: Thomas Gleixner <tglx@linutronix.de>
Cc: Paul E. McKenney <paulmck@linux.vnet.ibm.com>
Cc: Will Deacon <will.deacon@arm.com>
Cc: linux-kernel@vger.kernel.org
Signed-off-by: Ingo Molnar <mingo@kernel.org>
---
 include/linux/atomic.h | 1275 +++++++++++++++++++++---------------------------
 1 file changed, 543 insertions(+), 732 deletions(-)

diff --git a/include/linux/atomic.h b/include/linux/atomic.h
index 01ce3997cb42..dc157c092ae5 100644
--- a/include/linux/atomic.h
+++ b/include/linux/atomic.h
@@ -24,11 +24,11 @@
  */

 #ifndef atomic_read_acquire
-#define atomic_read_acquire(v)		smp_load_acquire(&(v)->counter)
+# define atomic_read_acquire(v)		smp_load_acquire(&(v)->counter)
 #endif

 #ifndef atomic_set_release
-#define atomic_set_release(v, i)	smp_store_release(&(v)->counter, (i))
+# define atomic_set_release(v, i)	smp_store_release(&(v)->counter, (i))
 #endif

 /*
@@ -71,454 +71,351 @@
 })
 #endif

-/* atomic_add_return_relaxed */
-#ifndef atomic_add_return_relaxed
-#define atomic_add_return_relaxed	atomic_add_return
-#define atomic_add_return_acquire	atomic_add_return
-#define atomic_add_return_release	atomic_add_return
-
-#else /* atomic_add_return_relaxed */
-
-#ifndef atomic_add_return_acquire
-#define atomic_add_return_acquire(...)					\
-	__atomic_op_acquire(atomic_add_return, __VA_ARGS__)
-#endif
+/* atomic_add_return_relaxed() et al: */

-#ifndef atomic_add_return_release
-#define atomic_add_return_release(...)					\
-	__atomic_op_release(atomic_add_return, __VA_ARGS__)
-#endif
-
-#ifndef atomic_add_return
-#define atomic_add_return(...)						\
-	__atomic_op_fence(atomic_add_return, __VA_ARGS__)
-#endif
-#endif /* atomic_add_return_relaxed */
+#ifndef atomic_add_return_relaxed
+# define atomic_add_return_relaxed	atomic_add_return
+# define atomic_add_return_acquire	atomic_add_return
+# define atomic_add_return_release	atomic_add_return
+#else
+# ifndef atomic_add_return_acquire
+#  define atomic_add_return_acquire(...)	__atomic_op_acquire(atomic_add_return, __VA_ARGS__)
+# endif
+# ifndef atomic_add_return_release
+#  define atomic_add_return_release(...)	__atomic_op_release(atomic_add_return, __VA_ARGS__)
+# endif
+# ifndef atomic_add_return
+#  define atomic_add_return(...)		__atomic_op_fence(atomic_add_return, __VA_ARGS__)
+# endif
+#endif
+
+/* atomic_inc_return_relaxed() et al: */

-/* atomic_inc_return_relaxed */
 #ifndef atomic_inc_return_relaxed
-#define atomic_inc_return_relaxed	atomic_inc_return
-#define atomic_inc_return_acquire	atomic_inc_return
-#define atomic_inc_return_release	atomic_inc_return
-
-#else /* atomic_inc_return_relaxed */
-
-#ifndef atomic_inc_return_acquire
-#define atomic_inc_return_acquire(...)					\
-	__atomic_op_acquire(atomic_inc_return, __VA_ARGS__)
-#endif
-
-#ifndef atomic_inc_return_release
-#define atomic_inc_return_release(...)					\
-	__atomic_op_release(atomic_inc_return, __VA_ARGS__)
-#endif
-
-#ifndef atomic_inc_return
-#define atomic_inc_return(...)						\
-	__atomic_op_fence(atomic_inc_return, __VA_ARGS__)
-#endif
-#endif /* atomic_inc_return_relaxed */
+# define atomic_inc_return_relaxed	atomic_inc_return
+# define atomic_inc_return_acquire	atomic_inc_return
+# define atomic_inc_return_release	atomic_inc_return
+#else
+# ifndef atomic_inc_return_acquire
+#  define atomic_inc_return_acquire(...)	__atomic_op_acquire(atomic_inc_return, __VA_ARGS__)
+# endif
+# ifndef atomic_inc_return_release
+#  define atomic_inc_return_release(...)	__atomic_op_release(atomic_inc_return, __VA_ARGS__)
+# endif
+# ifndef atomic_inc_return
+#  define atomic_inc_return(...)		__atomic_op_fence(atomic_inc_return, __VA_ARGS__)
+# endif
+#endif
+
+/* atomic_sub_return_relaxed() et al: */

-/* atomic_sub_return_relaxed */
 #ifndef atomic_sub_return_relaxed
-#define atomic_sub_return_relaxed	atomic_sub_return
-#define atomic_sub_return_acquire	atomic_sub_return
-#define atomic_sub_return_release	atomic_sub_return
-
-#else /* atomic_sub_return_relaxed */
-
-#ifndef atomic_sub_return_acquire
-#define atomic_sub_return_acquire(...)					\
-	__atomic_op_acquire(atomic_sub_return, __VA_ARGS__)
-#endif
-
-#ifndef atomic_sub_return_release
-#define atomic_sub_return_release(...)					\
-	__atomic_op_release(atomic_sub_return, __VA_ARGS__)
-#endif
-
-#ifndef atomic_sub_return
-#define atomic_sub_return(...)						\
-	__atomic_op_fence(atomic_sub_return, __VA_ARGS__)
-#endif
-#endif /* atomic_sub_return_relaxed */
+# define atomic_sub_return_relaxed	atomic_sub_return
+# define atomic_sub_return_acquire	atomic_sub_return
+# define atomic_sub_return_release	atomic_sub_return
+#else
+# ifndef atomic_sub_return_acquire
+#  define atomic_sub_return_acquire(...)	__atomic_op_acquire(atomic_sub_return, __VA_ARGS__)
+# endif
+# ifndef atomic_sub_return_release
+#  define atomic_sub_return_release(...)	__atomic_op_release(atomic_sub_return, __VA_ARGS__)
+# endif
+# ifndef atomic_sub_return
+#  define atomic_sub_return(...)		__atomic_op_fence(atomic_sub_return, __VA_ARGS__)
+# endif
+#endif
+
+/* atomic_dec_return_relaxed() et al: */

-/* atomic_dec_return_relaxed */
 #ifndef atomic_dec_return_relaxed
-#define atomic_dec_return_relaxed	atomic_dec_return
-#define atomic_dec_return_acquire	atomic_dec_return
-#define atomic_dec_return_release	atomic_dec_return
-
-#else /* atomic_dec_return_relaxed */
-
-#ifndef atomic_dec_return_acquire
-#define atomic_dec_return_acquire(...)					\
-	__atomic_op_acquire(atomic_dec_return, __VA_ARGS__)
-#endif
+# define atomic_dec_return_relaxed	atomic_dec_return
+# define atomic_dec_return_acquire	atomic_dec_return
+# define atomic_dec_return_release	atomic_dec_return
+#else
+# ifndef atomic_dec_return_acquire
+#  define atomic_dec_return_acquire(...)	__atomic_op_acquire(atomic_dec_return, __VA_ARGS__)
+# endif
+# ifndef atomic_dec_return_release
+#  define atomic_dec_return_release(...)	__atomic_op_release(atomic_dec_return, __VA_ARGS__)
+# endif
+# ifndef atomic_dec_return
+#  define atomic_dec_return(...)		__atomic_op_fence(atomic_dec_return, __VA_ARGS__)
+# endif
+#endif
+
+/* atomic_fetch_add_relaxed() et al: */

-#ifndef atomic_dec_return_release
-#define atomic_dec_return_release(...)					\
-	__atomic_op_release(atomic_dec_return, __VA_ARGS__)
-#endif
-
-#ifndef atomic_dec_return
-#define atomic_dec_return(...)						\
-	__atomic_op_fence(atomic_dec_return, __VA_ARGS__)
-#endif
-#endif /* atomic_dec_return_relaxed */
-
-
-/* atomic_fetch_add_relaxed */
 #ifndef atomic_fetch_add_relaxed
-#define atomic_fetch_add_relaxed	atomic_fetch_add
-#define atomic_fetch_add_acquire	atomic_fetch_add
-#define atomic_fetch_add_release	atomic_fetch_add
-
-#else /* atomic_fetch_add_relaxed */
-
-#ifndef atomic_fetch_add_acquire
-#define atomic_fetch_add_acquire(...)					\
-	__atomic_op_acquire(atomic_fetch_add, __VA_ARGS__)
-#endif
-
-#ifndef atomic_fetch_add_release
-#define atomic_fetch_add_release(...)					\
-	__atomic_op_release(atomic_fetch_add, __VA_ARGS__)
-#endif
+# define atomic_fetch_add_relaxed	atomic_fetch_add
+# define atomic_fetch_add_acquire	atomic_fetch_add
+# define atomic_fetch_add_release	atomic_fetch_add
+#else
+# ifndef atomic_fetch_add_acquire
+#  define atomic_fetch_add_acquire(...)	__atomic_op_acquire(atomic_fetch_add, __VA_ARGS__)
+# endif
+# ifndef atomic_fetch_add_release
+#  define atomic_fetch_add_release(...)	__atomic_op_release(atomic_fetch_add, __VA_ARGS__)
+# endif
+# ifndef atomic_fetch_add
+#  define atomic_fetch_add(...)		__atomic_op_fence(atomic_fetch_add, __VA_ARGS__)
+# endif
+#endif
+
+/* atomic_fetch_inc_relaxed() et al: */

-#ifndef atomic_fetch_add
-#define atomic_fetch_add(...)						\
-	__atomic_op_fence(atomic_fetch_add, __VA_ARGS__)
-#endif
-#endif /* atomic_fetch_add_relaxed */
-
-/* atomic_fetch_inc_relaxed */
 #ifndef atomic_fetch_inc_relaxed
+# ifndef atomic_fetch_inc
+#  define atomic_fetch_inc(v)			atomic_fetch_add(1, (v))
+#  define atomic_fetch_inc_relaxed(v)		atomic_fetch_add_relaxed(1, (v))
+#  define atomic_fetch_inc_acquire(v)		atomic_fetch_add_acquire(1, (v))
+#  define atomic_fetch_inc_release(v)		atomic_fetch_add_release(1, (v))
+# else
+#  define atomic_fetch_inc_relaxed		atomic_fetch_inc
+#  define atomic_fetch_inc_acquire		atomic_fetch_inc
+#  define atomic_fetch_inc_release		atomic_fetch_inc
+# endif
+#else
+# ifndef atomic_fetch_inc_acquire
+#  define atomic_fetch_inc_acquire(...)	__atomic_op_acquire(atomic_fetch_inc, __VA_ARGS__)
+# endif
+# ifndef atomic_fetch_inc_release
+#  define atomic_fetch_inc_release(...)	__atomic_op_release(atomic_fetch_inc, __VA_ARGS__)
+# endif
+# ifndef atomic_fetch_inc
+#  define atomic_fetch_inc(...)		__atomic_op_fence(atomic_fetch_inc, __VA_ARGS__)
+# endif
+#endif
+
+/* atomic_fetch_sub_relaxed() et al: */

-#ifndef atomic_fetch_inc
-#define atomic_fetch_inc(v)		atomic_fetch_add(1, (v))
-#define atomic_fetch_inc_relaxed(v)	atomic_fetch_add_relaxed(1, (v))
-#define atomic_fetch_inc_acquire(v)	atomic_fetch_add_acquire(1, (v))
-#define atomic_fetch_inc_release(v)	atomic_fetch_add_release(1, (v))
-#else /* atomic_fetch_inc */
-#define atomic_fetch_inc_relaxed	atomic_fetch_inc
-#define atomic_fetch_inc_acquire	atomic_fetch_inc
-#define atomic_fetch_inc_release	atomic_fetch_inc
-#endif /* atomic_fetch_inc */
-
-#else /* atomic_fetch_inc_relaxed */
-
-#ifndef atomic_fetch_inc_acquire
-#define atomic_fetch_inc_acquire(...)					\
-	__atomic_op_acquire(atomic_fetch_inc, __VA_ARGS__)
-#endif
-
-#ifndef atomic_fetch_inc_release
-#define atomic_fetch_inc_release(...)					\
-	__atomic_op_release(atomic_fetch_inc, __VA_ARGS__)
-#endif
-
-#ifndef atomic_fetch_inc
-#define atomic_fetch_inc(...)						\
-	__atomic_op_fence(atomic_fetch_inc, __VA_ARGS__)
-#endif
-#endif /* atomic_fetch_inc_relaxed */
-
-/* atomic_fetch_sub_relaxed */
 #ifndef atomic_fetch_sub_relaxed
-#define atomic_fetch_sub_relaxed	atomic_fetch_sub
-#define atomic_fetch_sub_acquire	atomic_fetch_sub
-#define atomic_fetch_sub_release	atomic_fetch_sub
-
-#else /* atomic_fetch_sub_relaxed */
-
-#ifndef atomic_fetch_sub_acquire
-#define atomic_fetch_sub_acquire(...)					\
-	__atomic_op_acquire(atomic_fetch_sub, __VA_ARGS__)
-#endif
+# define atomic_fetch_sub_relaxed	atomic_fetch_sub
+# define atomic_fetch_sub_acquire	atomic_fetch_sub
+# define atomic_fetch_sub_release	atomic_fetch_sub
+#else
+# ifndef atomic_fetch_sub_acquire
+#  define atomic_fetch_sub_acquire(...)	__atomic_op_acquire(atomic_fetch_sub, __VA_ARGS__)
+# endif
+# ifndef atomic_fetch_sub_release
+#  define atomic_fetch_sub_release(...)	__atomic_op_release(atomic_fetch_sub, __VA_ARGS__)
+# endif
+# ifndef atomic_fetch_sub
+#  define atomic_fetch_sub(...)		__atomic_op_fence(atomic_fetch_sub, __VA_ARGS__)
+# endif
+#endif
+
+/* atomic_fetch_dec_relaxed() et al: */

-#ifndef atomic_fetch_sub_release
-#define atomic_fetch_sub_release(...)					\
-	__atomic_op_release(atomic_fetch_sub, __VA_ARGS__)
-#endif
-
-#ifndef atomic_fetch_sub
-#define atomic_fetch_sub(...)						\
-	__atomic_op_fence(atomic_fetch_sub, __VA_ARGS__)
-#endif
-#endif /* atomic_fetch_sub_relaxed */
-
-/* atomic_fetch_dec_relaxed */
 #ifndef atomic_fetch_dec_relaxed
+# ifndef atomic_fetch_dec
+#  define atomic_fetch_dec(v)			atomic_fetch_sub(1, (v))
+#  define atomic_fetch_dec_relaxed(v)		atomic_fetch_sub_relaxed(1, (v))
+#  define atomic_fetch_dec_acquire(v)		atomic_fetch_sub_acquire(1, (v))
+#  define atomic_fetch_dec_release(v)		atomic_fetch_sub_release(1, (v))
+# else
+#  define atomic_fetch_dec_relaxed		atomic_fetch_dec
+#  define atomic_fetch_dec_acquire		atomic_fetch_dec
+#  define atomic_fetch_dec_release		atomic_fetch_dec
+# endif
+#else
+# ifndef atomic_fetch_dec_acquire
+#  define atomic_fetch_dec_acquire(...)	__atomic_op_acquire(atomic_fetch_dec, __VA_ARGS__)
+# endif
+# ifndef atomic_fetch_dec_release
+#  define atomic_fetch_dec_release(...)	__atomic_op_release(atomic_fetch_dec, __VA_ARGS__)
+# endif
+# ifndef atomic_fetch_dec
+#  define atomic_fetch_dec(...)		__atomic_op_fence(atomic_fetch_dec, __VA_ARGS__)
+# endif
+#endif
+
+/* atomic_fetch_or_relaxed() et al: */

-#ifndef atomic_fetch_dec
-#define atomic_fetch_dec(v)		atomic_fetch_sub(1, (v))
-#define atomic_fetch_dec_relaxed(v)	atomic_fetch_sub_relaxed(1, (v))
-#define atomic_fetch_dec_acquire(v)	atomic_fetch_sub_acquire(1, (v))
-#define atomic_fetch_dec_release(v)	atomic_fetch_sub_release(1, (v))
-#else /* atomic_fetch_dec */
-#define atomic_fetch_dec_relaxed	atomic_fetch_dec
-#define atomic_fetch_dec_acquire	atomic_fetch_dec
-#define atomic_fetch_dec_release	atomic_fetch_dec
-#endif /* atomic_fetch_dec */
-
-#else /* atomic_fetch_dec_relaxed */
-
-#ifndef atomic_fetch_dec_acquire
-#define atomic_fetch_dec_acquire(...)					\
-	__atomic_op_acquire(atomic_fetch_dec, __VA_ARGS__)
-#endif
-
-#ifndef atomic_fetch_dec_release
-#define atomic_fetch_dec_release(...)					\
-	__atomic_op_release(atomic_fetch_dec, __VA_ARGS__)
-#endif
-
-#ifndef atomic_fetch_dec
-#define atomic_fetch_dec(...)						\
-	__atomic_op_fence(atomic_fetch_dec, __VA_ARGS__)
-#endif
-#endif /* atomic_fetch_dec_relaxed */
-
-/* atomic_fetch_or_relaxed */
 #ifndef atomic_fetch_or_relaxed
-#define atomic_fetch_or_relaxed	atomic_fetch_or
-#define atomic_fetch_or_acquire	atomic_fetch_or
-#define atomic_fetch_or_release	atomic_fetch_or
-
-#else /* atomic_fetch_or_relaxed */
-
-#ifndef atomic_fetch_or_acquire
-#define atomic_fetch_or_acquire(...)					\
-	__atomic_op_acquire(atomic_fetch_or, __VA_ARGS__)
-#endif
-
-#ifndef atomic_fetch_or_release
-#define atomic_fetch_or_release(...)					\
-	__atomic_op_release(atomic_fetch_or, __VA_ARGS__)
-#endif
-
-#ifndef atomic_fetch_or
-#define atomic_fetch_or(...)						\
-	__atomic_op_fence(atomic_fetch_or, __VA_ARGS__)
-#endif
-#endif /* atomic_fetch_or_relaxed */
+# define atomic_fetch_or_relaxed	atomic_fetch_or
+# define atomic_fetch_or_acquire	atomic_fetch_or
+# define atomic_fetch_or_release	atomic_fetch_or
+#else
+# ifndef atomic_fetch_or_acquire
+#  define atomic_fetch_or_acquire(...)	__atomic_op_acquire(atomic_fetch_or, __VA_ARGS__)
+# endif
+# ifndef atomic_fetch_or_release
+#  define atomic_fetch_or_release(...)	__atomic_op_release(atomic_fetch_or, __VA_ARGS__)
+# endif
+# ifndef atomic_fetch_or
+#  define atomic_fetch_or(...)		__atomic_op_fence(atomic_fetch_or, __VA_ARGS__)
+# endif
+#endif
+
+/* atomic_fetch_and_relaxed() et al: */

-/* atomic_fetch_and_relaxed */
 #ifndef atomic_fetch_and_relaxed
-#define atomic_fetch_and_relaxed	atomic_fetch_and
-#define atomic_fetch_and_acquire	atomic_fetch_and
-#define atomic_fetch_and_release	atomic_fetch_and
-
-#else /* atomic_fetch_and_relaxed */
-
-#ifndef atomic_fetch_and_acquire
-#define atomic_fetch_and_acquire(...)					\
-	__atomic_op_acquire(atomic_fetch_and, __VA_ARGS__)
-#endif
-
-#ifndef atomic_fetch_and_release
-#define atomic_fetch_and_release(...)					\
-	__atomic_op_release(atomic_fetch_and, __VA_ARGS__)
-#endif
-
-#ifndef atomic_fetch_and
-#define atomic_fetch_and(...)						\
-	__atomic_op_fence(atomic_fetch_and, __VA_ARGS__)
+# define atomic_fetch_and_relaxed	atomic_fetch_and
+# define atomic_fetch_and_acquire	atomic_fetch_and
+# define atomic_fetch_and_release	atomic_fetch_and
+#else
+# ifndef atomic_fetch_and_acquire
+#  define atomic_fetch_and_acquire(...)	__atomic_op_acquire(atomic_fetch_and, __VA_ARGS__)
+# endif
+# ifndef atomic_fetch_and_release
+#  define atomic_fetch_and_release(...)	__atomic_op_release(atomic_fetch_and, __VA_ARGS__)
+# endif
+# ifndef atomic_fetch_and
+#  define atomic_fetch_and(...)		__atomic_op_fence(atomic_fetch_and, __VA_ARGS__)
+# endif
 #endif
-#endif /* atomic_fetch_and_relaxed */

 #ifdef atomic_andnot

-/* atomic_fetch_andnot_relaxed */
-#ifndef atomic_fetch_andnot_relaxed
-#define atomic_fetch_andnot_relaxed	atomic_fetch_andnot
-#define atomic_fetch_andnot_acquire	atomic_fetch_andnot
-#define atomic_fetch_andnot_release	atomic_fetch_andnot
-
-#else /* atomic_fetch_andnot_relaxed */
-#ifndef atomic_fetch_andnot_acquire
-#define atomic_fetch_andnot_acquire(...)				\
-	__atomic_op_acquire(atomic_fetch_andnot, __VA_ARGS__)
-#endif
+/* atomic_fetch_andnot_relaxed() et al: */

-#ifndef atomic_fetch_andnot_release
-#define atomic_fetch_andnot_release(...)				\
-	__atomic_op_release(atomic_fetch_andnot, __VA_ARGS__)
+#ifndef atomic_fetch_andnot_relaxed
+# define atomic_fetch_andnot_relaxed	atomic_fetch_andnot
+# define atomic_fetch_andnot_acquire	atomic_fetch_andnot
+# define atomic_fetch_andnot_release	atomic_fetch_andnot
+#else
+# ifndef atomic_fetch_andnot_acquire
+#  define atomic_fetch_andnot_acquire(...)	__atomic_op_acquire(atomic_fetch_andnot, __VA_ARGS__)
+# endif
+# ifndef atomic_fetch_andnot_release
+#  define atomic_fetch_andnot_release(...)	__atomic_op_release(atomic_fetch_andnot, __VA_ARGS__)
+# endif
+# ifndef atomic_fetch_andnot
+#  define atomic_fetch_andnot(...)		__atomic_op_fence(atomic_fetch_andnot, __VA_ARGS__)
+# endif
 #endif

-#ifndef atomic_fetch_andnot
-#define atomic_fetch_andnot(...)					\
-	__atomic_op_fence(atomic_fetch_andnot, __VA_ARGS__)
-#endif
-#endif /* atomic_fetch_andnot_relaxed */
 #endif /* atomic_andnot */

-/* atomic_fetch_xor_relaxed */
-#ifndef atomic_fetch_xor_relaxed
-#define atomic_fetch_xor_relaxed	atomic_fetch_xor
-#define atomic_fetch_xor_acquire	atomic_fetch_xor
-#define atomic_fetch_xor_release	atomic_fetch_xor
-
-#else /* atomic_fetch_xor_relaxed */
+/* atomic_fetch_xor_relaxed() et al: */

-#ifndef atomic_fetch_xor_acquire
-#define atomic_fetch_xor_acquire(...)					\
-	__atomic_op_acquire(atomic_fetch_xor, __VA_ARGS__)
-#endif
-
-#ifndef atomic_fetch_xor_release
-#define atomic_fetch_xor_release(...)					\
-	__atomic_op_release(atomic_fetch_xor, __VA_ARGS__)
+#ifndef atomic_fetch_xor_relaxed
+# define atomic_fetch_xor_relaxed	atomic_fetch_xor
+# define atomic_fetch_xor_acquire	atomic_fetch_xor
+# define atomic_fetch_xor_release	atomic_fetch_xor
+#else
+# ifndef atomic_fetch_xor_acquire
+#  define atomic_fetch_xor_acquire(...)	__atomic_op_acquire(atomic_fetch_xor, __VA_ARGS__)
+# endif
+# ifndef atomic_fetch_xor_release
+#  define atomic_fetch_xor_release(...)	__atomic_op_release(atomic_fetch_xor, __VA_ARGS__)
+# endif
+# ifndef atomic_fetch_xor
+#  define atomic_fetch_xor(...)		__atomic_op_fence(atomic_fetch_xor, __VA_ARGS__)
+# endif
 #endif

-#ifndef atomic_fetch_xor
-#define atomic_fetch_xor(...)						\
-	__atomic_op_fence(atomic_fetch_xor, __VA_ARGS__)
-#endif
-#endif /* atomic_fetch_xor_relaxed */
+/* atomic_xchg_relaxed() et al: */

-/* atomic_xchg_relaxed */
 #ifndef atomic_xchg_relaxed
-#define atomic_xchg_relaxed		atomic_xchg
-#define atomic_xchg_acquire		atomic_xchg
-#define atomic_xchg_release		atomic_xchg
-
-#else /* atomic_xchg_relaxed */
-
-#ifndef atomic_xchg_acquire
-#define atomic_xchg_acquire(...)					\
-	__atomic_op_acquire(atomic_xchg, __VA_ARGS__)
-#endif
-
-#ifndef atomic_xchg_release
-#define atomic_xchg_release(...)					\
-	__atomic_op_release(atomic_xchg, __VA_ARGS__)
-#endif
+#define atomic_xchg_relaxed		atomic_xchg
+#define atomic_xchg_acquire		atomic_xchg
+#define atomic_xchg_release		atomic_xchg
+#else
+# ifndef atomic_xchg_acquire
+#  define atomic_xchg_acquire(...)	__atomic_op_acquire(atomic_xchg, __VA_ARGS__)
+# endif
+# ifndef atomic_xchg_release
+#  define atomic_xchg_release(...)	__atomic_op_release(atomic_xchg, __VA_ARGS__)
+# endif
+# ifndef atomic_xchg
+#  define atomic_xchg(...)		__atomic_op_fence(atomic_xchg, __VA_ARGS__)
+# endif
+#endif
+
+/* atomic_cmpxchg_relaxed() et al: */

-#ifndef atomic_xchg
-#define atomic_xchg(...)						\
-	__atomic_op_fence(atomic_xchg, __VA_ARGS__)
-#endif
-#endif /* atomic_xchg_relaxed */
-
-/* atomic_cmpxchg_relaxed */
 #ifndef atomic_cmpxchg_relaxed
-#define atomic_cmpxchg_relaxed		atomic_cmpxchg
-#define atomic_cmpxchg_acquire		atomic_cmpxchg
-#define atomic_cmpxchg_release		atomic_cmpxchg
-
-#else /* atomic_cmpxchg_relaxed */
-
-#ifndef atomic_cmpxchg_acquire
-#define atomic_cmpxchg_acquire(...)					\
-	__atomic_op_acquire(atomic_cmpxchg, __VA_ARGS__)
+# define atomic_cmpxchg_relaxed		atomic_cmpxchg
+# define atomic_cmpxchg_acquire		atomic_cmpxchg
+# define atomic_cmpxchg_release		atomic_cmpxchg
+#else
+# ifndef atomic_cmpxchg_acquire
+#  define atomic_cmpxchg_acquire(...)	__atomic_op_acquire(atomic_cmpxchg, __VA_ARGS__)
+# endif
+# ifndef atomic_cmpxchg_release
+#  define atomic_cmpxchg_release(...)	__atomic_op_release(atomic_cmpxchg, __VA_ARGS__)
+# endif
+# ifndef atomic_cmpxchg
+#  define atomic_cmpxchg(...)		__atomic_op_fence(atomic_cmpxchg, __VA_ARGS__)
+# endif
 #endif

-#ifndef atomic_cmpxchg_release
-#define atomic_cmpxchg_release(...)					\
-	__atomic_op_release(atomic_cmpxchg, __VA_ARGS__)
-#endif
-
-#ifndef atomic_cmpxchg
-#define atomic_cmpxchg(...)						\
-	__atomic_op_fence(atomic_cmpxchg, __VA_ARGS__)
-#endif
-#endif /* atomic_cmpxchg_relaxed */
-
 #ifndef atomic_try_cmpxchg
-
-#define __atomic_try_cmpxchg(type, _p, _po, _n)				\
-({									\
+# define __atomic_try_cmpxchg(type, _p, _po, _n)			\
+  ({									\
	typeof(_po) __po = (_po);					\
	typeof(*(_po)) __r, __o = *__po;				\
	__r = atomic_cmpxchg##type((_p), __o, (_n));			\
	if (unlikely(__r != __o))					\
		*__po = __r;						\
	likely(__r == __o);						\
-})
-
-#define atomic_try_cmpxchg(_p, _po, _n)		__atomic_try_cmpxchg(, _p, _po, _n)
-#define atomic_try_cmpxchg_relaxed(_p, _po, _n)	__atomic_try_cmpxchg(_relaxed, _p, _po, _n)
-#define atomic_try_cmpxchg_acquire(_p, _po, _n)	__atomic_try_cmpxchg(_acquire, _p, _po, _n)
-#define atomic_try_cmpxchg_release(_p, _po, _n)	__atomic_try_cmpxchg(_release, _p, _po, _n)
-
-#else /* atomic_try_cmpxchg */
-#define atomic_try_cmpxchg_relaxed	atomic_try_cmpxchg
-#define atomic_try_cmpxchg_acquire	atomic_try_cmpxchg
-#define atomic_try_cmpxchg_release	atomic_try_cmpxchg
-#endif /* atomic_try_cmpxchg */
-
-/* cmpxchg_relaxed */
-#ifndef cmpxchg_relaxed
-#define cmpxchg_relaxed		cmpxchg
-#define cmpxchg_acquire		cmpxchg
-#define cmpxchg_release		cmpxchg
-
-#else /* cmpxchg_relaxed */
-
-#ifndef cmpxchg_acquire
-#define cmpxchg_acquire(...)						\
-	__atomic_op_acquire(cmpxchg, __VA_ARGS__)
+  })
+# define atomic_try_cmpxchg(_p, _po, _n)		__atomic_try_cmpxchg(, _p, _po, _n)
+# define atomic_try_cmpxchg_relaxed(_p, _po, _n)	__atomic_try_cmpxchg(_relaxed, _p, _po, _n)
+# define atomic_try_cmpxchg_acquire(_p, _po, _n)	__atomic_try_cmpxchg(_acquire, _p, _po, _n)
+# define atomic_try_cmpxchg_release(_p, _po, _n)	__atomic_try_cmpxchg(_release, _p, _po, _n)
+#else
+# define atomic_try_cmpxchg_relaxed	atomic_try_cmpxchg
+# define atomic_try_cmpxchg_acquire	atomic_try_cmpxchg
+# define atomic_try_cmpxchg_release	atomic_try_cmpxchg
 #endif

-#ifndef cmpxchg_release
-#define cmpxchg_release(...)						\
-	__atomic_op_release(cmpxchg, __VA_ARGS__)
-#endif
+/* cmpxchg_relaxed() et al: */

-#ifndef cmpxchg
-#define cmpxchg(...)							\
-	__atomic_op_fence(cmpxchg, __VA_ARGS__)
-#endif
-#endif /* cmpxchg_relaxed */
+#ifndef cmpxchg_relaxed
+# define cmpxchg_relaxed	cmpxchg
+# define cmpxchg_acquire	cmpxchg
+# define cmpxchg_release	cmpxchg
+#else
+# ifndef cmpxchg_acquire
+#  define cmpxchg_acquire(...)	__atomic_op_acquire(cmpxchg, __VA_ARGS__)
+# endif
+# ifndef cmpxchg_release
+#  define cmpxchg_release(...)	__atomic_op_release(cmpxchg, __VA_ARGS__)
+# endif
+# ifndef cmpxchg
+#  define cmpxchg(...)		__atomic_op_fence(cmpxchg, __VA_ARGS__)
+# endif
+#endif
+
+/* cmpxchg64_relaxed() et al: */

-/* cmpxchg64_relaxed */
 #ifndef cmpxchg64_relaxed
-#define cmpxchg64_relaxed		cmpxchg64
-#define cmpxchg64_acquire		cmpxchg64
-#define cmpxchg64_release		cmpxchg64
-
-#else /* cmpxchg64_relaxed */
-
-#ifndef cmpxchg64_acquire
-#define cmpxchg64_acquire(...)						\
-	__atomic_op_acquire(cmpxchg64, __VA_ARGS__)
-#endif
-
-#ifndef cmpxchg64_release
-#define cmpxchg64_release(...)						\
-	__atomic_op_release(cmpxchg64, __VA_ARGS__)
-#endif
-
-#ifndef cmpxchg64
-#define cmpxchg64(...)							\
-	__atomic_op_fence(cmpxchg64, __VA_ARGS__)
-#endif
-#endif /* cmpxchg64_relaxed */
+# define cmpxchg64_relaxed	cmpxchg64
+# define cmpxchg64_acquire	cmpxchg64
+# define cmpxchg64_release	cmpxchg64
+#else
+# ifndef cmpxchg64_acquire
+#  define cmpxchg64_acquire(...)	__atomic_op_acquire(cmpxchg64, __VA_ARGS__)
+# endif
+# ifndef cmpxchg64_release
+#  define cmpxchg64_release(...)	__atomic_op_release(cmpxchg64, __VA_ARGS__)
+# endif
+# ifndef cmpxchg64
+#  define cmpxchg64(...)		__atomic_op_fence(cmpxchg64, __VA_ARGS__)
+# endif
+#endif
+
+/* xchg_relaxed() et al: */

-/* xchg_relaxed */
 #ifndef xchg_relaxed
-#define xchg_relaxed		xchg
-#define xchg_acquire		xchg
-#define xchg_release		xchg
-
-#else /* xchg_relaxed */
-
-#ifndef xchg_acquire
-#define xchg_acquire(...)	__atomic_op_acquire(xchg, __VA_ARGS__)
-#endif
-
-#ifndef xchg_release
-#define xchg_release(...)	__atomic_op_release(xchg, __VA_ARGS__)
+# define xchg_relaxed		xchg
+# define xchg_acquire		xchg
+# define xchg_release		xchg
+#else
+# ifndef xchg_acquire
+#  define xchg_acquire(...)	__atomic_op_acquire(xchg, __VA_ARGS__)
+# endif
+# ifndef xchg_release
+#  define xchg_release(...)	__atomic_op_release(xchg, __VA_ARGS__)
+# endif
+# ifndef xchg
+#  define xchg(...)		__atomic_op_fence(xchg, __VA_ARGS__)
+# endif
 #endif

-#ifndef xchg
-#define xchg(...)		__atomic_op_fence(xchg, __VA_ARGS__)
-#endif
-#endif /* xchg_relaxed */
-
 /**
  * atomic_add_unless - add unless the number is already a given value
  * @v: pointer of type atomic_t
@@ -541,7 +438,7 @@ static inline int atomic_add_unless(atomic_t *v, int a, int u)
  * Returns non-zero if @v was non-zero, and zero otherwise.
  */
 #ifndef atomic_inc_not_zero
-#define atomic_inc_not_zero(v)		atomic_add_unless((v), 1, 0)
+# define atomic_inc_not_zero(v)		atomic_add_unless((v), 1, 0)
 #endif

 #ifndef atomic_andnot
@@ -607,6 +504,7 @@ static inline int atomic_inc_not_zero_hint(atomic_t *v, int hint)
 static inline int atomic_inc_unless_negative(atomic_t *p)
 {
	int v, v1;
+
	for (v = 0; v >= 0; v = v1) {
		v1 = atomic_cmpxchg(p, v, v + 1);
		if (likely(v1 == v))
@@ -620,6 +518,7 @@ static inline int atomic_inc_unless_negative(atomic_t *p)
 static inline int atomic_dec_unless_positive(atomic_t *p)
 {
	int v, v1;
+
	for (v = 0; v <= 0; v = v1) {
		v1 = atomic_cmpxchg(p, v, v - 1);
		if (likely(v1 == v))
@@ -640,6 +539,7 @@ static inline int atomic_dec_unless_positive(atomic_t *p)
 static inline int atomic_dec_if_positive(atomic_t *v)
 {
	int c, old, dec;
+
	c = atomic_read(v);
	for (;;) {
		dec = c - 1;
@@ -654,400 +554,311 @@ static inline int atomic_dec_if_positive(atomic_t *v)
 }
 #endif

-#define atomic_cond_read_relaxed(v, c)	smp_cond_load_relaxed(&(v)->counter, (c))
-#define atomic_cond_read_acquire(v, c)	smp_cond_load_acquire(&(v)->counter, (c))
+#define atomic_cond_read_relaxed(v, c)	smp_cond_load_relaxed(&(v)->counter, (c))
+#define atomic_cond_read_acquire(v, c)	smp_cond_load_acquire(&(v)->counter, (c))

 #ifdef CONFIG_GENERIC_ATOMIC64
 #include <asm-generic/atomic64.h>
 #endif

 #ifndef atomic64_read_acquire
-#define atomic64_read_acquire(v)	smp_load_acquire(&(v)->counter)
+# define atomic64_read_acquire(v)	smp_load_acquire(&(v)->counter)
 #endif

 #ifndef atomic64_set_release
-#define atomic64_set_release(v, i)	smp_store_release(&(v)->counter, (i))
-#endif
-
-/* atomic64_add_return_relaxed */
-#ifndef atomic64_add_return_relaxed
-#define atomic64_add_return_relaxed	atomic64_add_return
-#define atomic64_add_return_acquire	atomic64_add_return
-#define atomic64_add_return_release	atomic64_add_return
-
-#else /* atomic64_add_return_relaxed */
-
-#ifndef atomic64_add_return_acquire
-#define atomic64_add_return_acquire(...)				\
-	__atomic_op_acquire(atomic64_add_return, __VA_ARGS__)
+# define atomic64_set_release(v, i)	smp_store_release(&(v)->counter, (i))
 #endif

-#ifndef atomic64_add_return_release
-#define atomic64_add_return_release(...)				\
-	__atomic_op_release(atomic64_add_return, __VA_ARGS__)
-#endif
+/* atomic64_add_return_relaxed() et al: */

-#ifndef atomic64_add_return
-#define atomic64_add_return(...)					\
-	__atomic_op_fence(atomic64_add_return, __VA_ARGS__)
-#endif
-#endif /* atomic64_add_return_relaxed */
+#ifndef atomic64_add_return_relaxed
+# define atomic64_add_return_relaxed	atomic64_add_return
+# define atomic64_add_return_acquire	atomic64_add_return
+# define atomic64_add_return_release	atomic64_add_return
+#else
+# ifndef atomic64_add_return_acquire
+#  define atomic64_add_return_acquire(...)	__atomic_op_acquire(atomic64_add_return, __VA_ARGS__)
+# endif
+# ifndef atomic64_add_return_release
+#  define atomic64_add_return_release(...)	__atomic_op_release(atomic64_add_return, __VA_ARGS__)
+# endif
+# ifndef atomic64_add_return
+#  define atomic64_add_return(...)
__atomic_op_fence(atomic64_add_return, __VA_ARGS__) +# endif +#endif + +/* atomic64_inc_return_relaxed() et al: */ -/* atomic64_inc_return_relaxed */ #ifndef atomic64_inc_return_relaxed -#define atomic64_inc_return_relaxed atomic64_inc_return -#define atomic64_inc_return_acquire atomic64_inc_return -#define atomic64_inc_return_release atomic64_inc_return - -#else /* atomic64_inc_return_relaxed */ - -#ifndef atomic64_inc_return_acquire -#define atomic64_inc_return_acquire(...) \ - __atomic_op_acquire(atomic64_inc_return, __VA_ARGS__) -#endif - -#ifndef atomic64_inc_return_release -#define atomic64_inc_return_release(...) \ - __atomic_op_release(atomic64_inc_return, __VA_ARGS__) -#endif - -#ifndef atomic64_inc_return -#define atomic64_inc_return(...) \ - __atomic_op_fence(atomic64_inc_return, __VA_ARGS__) -#endif -#endif /* atomic64_inc_return_relaxed */ - +# define atomic64_inc_return_relaxed atomic64_inc_return +# define atomic64_inc_return_acquire atomic64_inc_return +# define atomic64_inc_return_release atomic64_inc_return +#else +# ifndef atomic64_inc_return_acquire +# define atomic64_inc_return_acquire(...) __atomic_op_acquire(atomic64_inc_return, __VA_ARGS__) +# endif +# ifndef atomic64_inc_return_release +# define atomic64_inc_return_release(...) __atomic_op_release(atomic64_inc_return, __VA_ARGS__) +# endif +# ifndef atomic64_inc_return +# define atomic64_inc_return(...) 
__atomic_op_fence(atomic64_inc_return, __VA_ARGS__) +# endif +#endif + +/* atomic64_sub_return_relaxed() et al: */ -/* atomic64_sub_return_relaxed */ #ifndef atomic64_sub_return_relaxed -#define atomic64_sub_return_relaxed atomic64_sub_return -#define atomic64_sub_return_acquire atomic64_sub_return -#define atomic64_sub_return_release atomic64_sub_return +# define atomic64_sub_return_relaxed atomic64_sub_return +# define atomic64_sub_return_acquire atomic64_sub_return +# define atomic64_sub_return_release atomic64_sub_return +#else +# ifndef atomic64_sub_return_acquire +# define atomic64_sub_return_acquire(...) __atomic_op_acquire(atomic64_sub_return, __VA_ARGS__) +# endif +# ifndef atomic64_sub_return_release +# define atomic64_sub_return_release(...) __atomic_op_release(atomic64_sub_return, __VA_ARGS__) +# endif +# ifndef atomic64_sub_return +# define atomic64_sub_return(...) __atomic_op_fence(atomic64_sub_return, __VA_ARGS__) +# endif +#endif + +/* atomic64_dec_return_relaxed() et al: */ -#else /* atomic64_sub_return_relaxed */ - -#ifndef atomic64_sub_return_acquire -#define atomic64_sub_return_acquire(...) \ - __atomic_op_acquire(atomic64_sub_return, __VA_ARGS__) -#endif - -#ifndef atomic64_sub_return_release -#define atomic64_sub_return_release(...) \ - __atomic_op_release(atomic64_sub_return, __VA_ARGS__) -#endif - -#ifndef atomic64_sub_return -#define atomic64_sub_return(...) \ - __atomic_op_fence(atomic64_sub_return, __VA_ARGS__) -#endif -#endif /* atomic64_sub_return_relaxed */ - -/* atomic64_dec_return_relaxed */ #ifndef atomic64_dec_return_relaxed -#define atomic64_dec_return_relaxed atomic64_dec_return -#define atomic64_dec_return_acquire atomic64_dec_return -#define atomic64_dec_return_release atomic64_dec_return - -#else /* atomic64_dec_return_relaxed */ - -#ifndef atomic64_dec_return_acquire -#define atomic64_dec_return_acquire(...) 
\ - __atomic_op_acquire(atomic64_dec_return, __VA_ARGS__) -#endif - -#ifndef atomic64_dec_return_release -#define atomic64_dec_return_release(...) \ - __atomic_op_release(atomic64_dec_return, __VA_ARGS__) -#endif - -#ifndef atomic64_dec_return -#define atomic64_dec_return(...) \ - __atomic_op_fence(atomic64_dec_return, __VA_ARGS__) -#endif -#endif /* atomic64_dec_return_relaxed */ +# define atomic64_dec_return_relaxed atomic64_dec_return +# define atomic64_dec_return_acquire atomic64_dec_return +# define atomic64_dec_return_release atomic64_dec_return +#else +# ifndef atomic64_dec_return_acquire +# define atomic64_dec_return_acquire(...) __atomic_op_acquire(atomic64_dec_return, __VA_ARGS__) +# endif +# ifndef atomic64_dec_return_release +# define atomic64_dec_return_release(...) __atomic_op_release(atomic64_dec_return, __VA_ARGS__) +# endif +# ifndef atomic64_dec_return +# define atomic64_dec_return(...) __atomic_op_fence(atomic64_dec_return, __VA_ARGS__) +# endif +#endif + +/* atomic64_fetch_add_relaxed() et al: */ - -/* atomic64_fetch_add_relaxed */ #ifndef atomic64_fetch_add_relaxed -#define atomic64_fetch_add_relaxed atomic64_fetch_add -#define atomic64_fetch_add_acquire atomic64_fetch_add -#define atomic64_fetch_add_release atomic64_fetch_add - -#else /* atomic64_fetch_add_relaxed */ - -#ifndef atomic64_fetch_add_acquire -#define atomic64_fetch_add_acquire(...) \ - __atomic_op_acquire(atomic64_fetch_add, __VA_ARGS__) -#endif +# define atomic64_fetch_add_relaxed atomic64_fetch_add +# define atomic64_fetch_add_acquire atomic64_fetch_add +# define atomic64_fetch_add_release atomic64_fetch_add +#else +# ifndef atomic64_fetch_add_acquire +# define atomic64_fetch_add_acquire(...) __atomic_op_acquire(atomic64_fetch_add, __VA_ARGS__) +# endif +# ifndef atomic64_fetch_add_release +# define atomic64_fetch_add_release(...) __atomic_op_release(atomic64_fetch_add, __VA_ARGS__) +# endif +# ifndef atomic64_fetch_add +# define atomic64_fetch_add(...) 
__atomic_op_fence(atomic64_fetch_add, __VA_ARGS__) +# endif +#endif + +/* atomic64_fetch_inc_relaxed() et al: */ -#ifndef atomic64_fetch_add_release -#define atomic64_fetch_add_release(...) \ - __atomic_op_release(atomic64_fetch_add, __VA_ARGS__) -#endif - -#ifndef atomic64_fetch_add -#define atomic64_fetch_add(...) \ - __atomic_op_fence(atomic64_fetch_add, __VA_ARGS__) -#endif -#endif /* atomic64_fetch_add_relaxed */ - -/* atomic64_fetch_inc_relaxed */ #ifndef atomic64_fetch_inc_relaxed +# ifndef atomic64_fetch_inc +# define atomic64_fetch_inc(v) atomic64_fetch_add(1, (v)) +# define atomic64_fetch_inc_relaxed(v) atomic64_fetch_add_relaxed(1, (v)) +# define atomic64_fetch_inc_acquire(v) atomic64_fetch_add_acquire(1, (v)) +# define atomic64_fetch_inc_release(v) atomic64_fetch_add_release(1, (v)) +# else +# define atomic64_fetch_inc_relaxed atomic64_fetch_inc +# define atomic64_fetch_inc_acquire atomic64_fetch_inc +# define atomic64_fetch_inc_release atomic64_fetch_inc +# endif +#else +# ifndef atomic64_fetch_inc_acquire +# define atomic64_fetch_inc_acquire(...) __atomic_op_acquire(atomic64_fetch_inc, __VA_ARGS__) +# endif +# ifndef atomic64_fetch_inc_release +# define atomic64_fetch_inc_release(...) __atomic_op_release(atomic64_fetch_inc, __VA_ARGS__) +# endif +# ifndef atomic64_fetch_inc +# define atomic64_fetch_inc(...) 
__atomic_op_fence(atomic64_fetch_inc, __VA_ARGS__) +# endif +#endif + +/* atomic64_fetch_sub_relaxed() et al: */ -#ifndef atomic64_fetch_inc -#define atomic64_fetch_inc(v) atomic64_fetch_add(1, (v)) -#define atomic64_fetch_inc_relaxed(v) atomic64_fetch_add_relaxed(1, (v)) -#define atomic64_fetch_inc_acquire(v) atomic64_fetch_add_acquire(1, (v)) -#define atomic64_fetch_inc_release(v) atomic64_fetch_add_release(1, (v)) -#else /* atomic64_fetch_inc */ -#define atomic64_fetch_inc_relaxed atomic64_fetch_inc -#define atomic64_fetch_inc_acquire atomic64_fetch_inc -#define atomic64_fetch_inc_release atomic64_fetch_inc -#endif /* atomic64_fetch_inc */ - -#else /* atomic64_fetch_inc_relaxed */ - -#ifndef atomic64_fetch_inc_acquire -#define atomic64_fetch_inc_acquire(...) \ - __atomic_op_acquire(atomic64_fetch_inc, __VA_ARGS__) -#endif - -#ifndef atomic64_fetch_inc_release -#define atomic64_fetch_inc_release(...) \ - __atomic_op_release(atomic64_fetch_inc, __VA_ARGS__) -#endif - -#ifndef atomic64_fetch_inc -#define atomic64_fetch_inc(...) \ - __atomic_op_fence(atomic64_fetch_inc, __VA_ARGS__) -#endif -#endif /* atomic64_fetch_inc_relaxed */ - -/* atomic64_fetch_sub_relaxed */ #ifndef atomic64_fetch_sub_relaxed -#define atomic64_fetch_sub_relaxed atomic64_fetch_sub -#define atomic64_fetch_sub_acquire atomic64_fetch_sub -#define atomic64_fetch_sub_release atomic64_fetch_sub - -#else /* atomic64_fetch_sub_relaxed */ - -#ifndef atomic64_fetch_sub_acquire -#define atomic64_fetch_sub_acquire(...) \ - __atomic_op_acquire(atomic64_fetch_sub, __VA_ARGS__) -#endif - -#ifndef atomic64_fetch_sub_release -#define atomic64_fetch_sub_release(...) \ - __atomic_op_release(atomic64_fetch_sub, __VA_ARGS__) -#endif - -#ifndef atomic64_fetch_sub -#define atomic64_fetch_sub(...) 
\ - __atomic_op_fence(atomic64_fetch_sub, __VA_ARGS__) -#endif -#endif /* atomic64_fetch_sub_relaxed */ +# define atomic64_fetch_sub_relaxed atomic64_fetch_sub +# define atomic64_fetch_sub_acquire atomic64_fetch_sub +# define atomic64_fetch_sub_release atomic64_fetch_sub +#else +# ifndef atomic64_fetch_sub_acquire +# define atomic64_fetch_sub_acquire(...) __atomic_op_acquire(atomic64_fetch_sub, __VA_ARGS__) +# endif +# ifndef atomic64_fetch_sub_release +# define atomic64_fetch_sub_release(...) __atomic_op_release(atomic64_fetch_sub, __VA_ARGS__) +# endif +# ifndef atomic64_fetch_sub +# define atomic64_fetch_sub(...) __atomic_op_fence(atomic64_fetch_sub, __VA_ARGS__) +# endif +#endif + +/* atomic64_fetch_dec_relaxed() et al: */ -/* atomic64_fetch_dec_relaxed */ #ifndef atomic64_fetch_dec_relaxed +# ifndef atomic64_fetch_dec +# define atomic64_fetch_dec(v) atomic64_fetch_sub(1, (v)) +# define atomic64_fetch_dec_relaxed(v) atomic64_fetch_sub_relaxed(1, (v)) +# define atomic64_fetch_dec_acquire(v) atomic64_fetch_sub_acquire(1, (v)) +# define atomic64_fetch_dec_release(v) atomic64_fetch_sub_release(1, (v)) +# else +# define atomic64_fetch_dec_relaxed atomic64_fetch_dec +# define atomic64_fetch_dec_acquire atomic64_fetch_dec +# define atomic64_fetch_dec_release atomic64_fetch_dec +# endif +#else +# ifndef atomic64_fetch_dec_acquire +# define atomic64_fetch_dec_acquire(...) __atomic_op_acquire(atomic64_fetch_dec, __VA_ARGS__) +# endif +# ifndef atomic64_fetch_dec_release +# define atomic64_fetch_dec_release(...) __atomic_op_release(atomic64_fetch_dec, __VA_ARGS__) +# endif +# ifndef atomic64_fetch_dec +# define atomic64_fetch_dec(...) 
__atomic_op_fence(atomic64_fetch_dec, __VA_ARGS__) +# endif +#endif + +/* atomic64_fetch_or_relaxed() et al: */ -#ifndef atomic64_fetch_dec -#define atomic64_fetch_dec(v) atomic64_fetch_sub(1, (v)) -#define atomic64_fetch_dec_relaxed(v) atomic64_fetch_sub_relaxed(1, (v)) -#define atomic64_fetch_dec_acquire(v) atomic64_fetch_sub_acquire(1, (v)) -#define atomic64_fetch_dec_release(v) atomic64_fetch_sub_release(1, (v)) -#else /* atomic64_fetch_dec */ -#define atomic64_fetch_dec_relaxed atomic64_fetch_dec -#define atomic64_fetch_dec_acquire atomic64_fetch_dec -#define atomic64_fetch_dec_release atomic64_fetch_dec -#endif /* atomic64_fetch_dec */ - -#else /* atomic64_fetch_dec_relaxed */ - -#ifndef atomic64_fetch_dec_acquire -#define atomic64_fetch_dec_acquire(...) \ - __atomic_op_acquire(atomic64_fetch_dec, __VA_ARGS__) -#endif - -#ifndef atomic64_fetch_dec_release -#define atomic64_fetch_dec_release(...) \ - __atomic_op_release(atomic64_fetch_dec, __VA_ARGS__) -#endif - -#ifndef atomic64_fetch_dec -#define atomic64_fetch_dec(...) \ - __atomic_op_fence(atomic64_fetch_dec, __VA_ARGS__) -#endif -#endif /* atomic64_fetch_dec_relaxed */ - -/* atomic64_fetch_or_relaxed */ #ifndef atomic64_fetch_or_relaxed -#define atomic64_fetch_or_relaxed atomic64_fetch_or -#define atomic64_fetch_or_acquire atomic64_fetch_or -#define atomic64_fetch_or_release atomic64_fetch_or - -#else /* atomic64_fetch_or_relaxed */ - -#ifndef atomic64_fetch_or_acquire -#define atomic64_fetch_or_acquire(...) \ - __atomic_op_acquire(atomic64_fetch_or, __VA_ARGS__) +# define atomic64_fetch_or_relaxed atomic64_fetch_or +# define atomic64_fetch_or_acquire atomic64_fetch_or +# define atomic64_fetch_or_release atomic64_fetch_or +#else +# ifndef atomic64_fetch_or_acquire +# define atomic64_fetch_or_acquire(...) __atomic_op_acquire(atomic64_fetch_or, __VA_ARGS__) +# endif +# ifndef atomic64_fetch_or_release +# define atomic64_fetch_or_release(...) 
__atomic_op_release(atomic64_fetch_or, __VA_ARGS__) +# endif +# ifndef atomic64_fetch_or +# define atomic64_fetch_or(...) __atomic_op_fence(atomic64_fetch_or, __VA_ARGS__) +# endif #endif -#ifndef atomic64_fetch_or_release -#define atomic64_fetch_or_release(...) \ - __atomic_op_release(atomic64_fetch_or, __VA_ARGS__) -#endif -#ifndef atomic64_fetch_or -#define atomic64_fetch_or(...) \ - __atomic_op_fence(atomic64_fetch_or, __VA_ARGS__) -#endif -#endif /* atomic64_fetch_or_relaxed */ +/* atomic64_fetch_and_relaxed() et al: */ -/* atomic64_fetch_and_relaxed */ #ifndef atomic64_fetch_and_relaxed -#define atomic64_fetch_and_relaxed atomic64_fetch_and -#define atomic64_fetch_and_acquire atomic64_fetch_and -#define atomic64_fetch_and_release atomic64_fetch_and - -#else /* atomic64_fetch_and_relaxed */ - -#ifndef atomic64_fetch_and_acquire -#define atomic64_fetch_and_acquire(...) \ - __atomic_op_acquire(atomic64_fetch_and, __VA_ARGS__) +# define atomic64_fetch_and_relaxed atomic64_fetch_and +# define atomic64_fetch_and_acquire atomic64_fetch_and +# define atomic64_fetch_and_release atomic64_fetch_and +#else +# ifndef atomic64_fetch_and_acquire +# define atomic64_fetch_and_acquire(...) __atomic_op_acquire(atomic64_fetch_and, __VA_ARGS__) +# endif +# ifndef atomic64_fetch_and_release +# define atomic64_fetch_and_release(...) __atomic_op_release(atomic64_fetch_and, __VA_ARGS__) +# endif +# ifndef atomic64_fetch_and +# define atomic64_fetch_and(...) __atomic_op_fence(atomic64_fetch_and, __VA_ARGS__) +# endif #endif -#ifndef atomic64_fetch_and_release -#define atomic64_fetch_and_release(...) \ - __atomic_op_release(atomic64_fetch_and, __VA_ARGS__) -#endif - -#ifndef atomic64_fetch_and -#define atomic64_fetch_and(...) 
\ - __atomic_op_fence(atomic64_fetch_and, __VA_ARGS__) -#endif -#endif /* atomic64_fetch_and_relaxed */ - #ifdef atomic64_andnot -/* atomic64_fetch_andnot_relaxed */ -#ifndef atomic64_fetch_andnot_relaxed -#define atomic64_fetch_andnot_relaxed atomic64_fetch_andnot -#define atomic64_fetch_andnot_acquire atomic64_fetch_andnot -#define atomic64_fetch_andnot_release atomic64_fetch_andnot - -#else /* atomic64_fetch_andnot_relaxed */ -#ifndef atomic64_fetch_andnot_acquire -#define atomic64_fetch_andnot_acquire(...) \ - __atomic_op_acquire(atomic64_fetch_andnot, __VA_ARGS__) -#endif +/* atomic64_fetch_andnot_relaxed() et al: */ -#ifndef atomic64_fetch_andnot_release -#define atomic64_fetch_andnot_release(...) \ - __atomic_op_release(atomic64_fetch_andnot, __VA_ARGS__) +#ifndef atomic64_fetch_andnot_relaxed +# define atomic64_fetch_andnot_relaxed atomic64_fetch_andnot +# define atomic64_fetch_andnot_acquire atomic64_fetch_andnot +# define atomic64_fetch_andnot_release atomic64_fetch_andnot +#else +# ifndef atomic64_fetch_andnot_acquire +# define atomic64_fetch_andnot_acquire(...) __atomic_op_acquire(atomic64_fetch_andnot, __VA_ARGS__) +# endif +# ifndef atomic64_fetch_andnot_release +# define atomic64_fetch_andnot_release(...) __atomic_op_release(atomic64_fetch_andnot, __VA_ARGS__) +# endif +# ifndef atomic64_fetch_andnot +# define atomic64_fetch_andnot(...) __atomic_op_fence(atomic64_fetch_andnot, __VA_ARGS__) +# endif #endif -#ifndef atomic64_fetch_andnot -#define atomic64_fetch_andnot(...) 
\ - __atomic_op_fence(atomic64_fetch_andnot, __VA_ARGS__) -#endif -#endif /* atomic64_fetch_andnot_relaxed */ #endif /* atomic64_andnot */ -/* atomic64_fetch_xor_relaxed */ -#ifndef atomic64_fetch_xor_relaxed -#define atomic64_fetch_xor_relaxed atomic64_fetch_xor -#define atomic64_fetch_xor_acquire atomic64_fetch_xor -#define atomic64_fetch_xor_release atomic64_fetch_xor - -#else /* atomic64_fetch_xor_relaxed */ +/* atomic64_fetch_xor_relaxed() et al: */ -#ifndef atomic64_fetch_xor_acquire -#define atomic64_fetch_xor_acquire(...) \ - __atomic_op_acquire(atomic64_fetch_xor, __VA_ARGS__) -#endif - -#ifndef atomic64_fetch_xor_release -#define atomic64_fetch_xor_release(...) \ - __atomic_op_release(atomic64_fetch_xor, __VA_ARGS__) +#ifndef atomic64_fetch_xor_relaxed +# define atomic64_fetch_xor_relaxed atomic64_fetch_xor +# define atomic64_fetch_xor_acquire atomic64_fetch_xor +# define atomic64_fetch_xor_release atomic64_fetch_xor +#else +# ifndef atomic64_fetch_xor_acquire +# define atomic64_fetch_xor_acquire(...) __atomic_op_acquire(atomic64_fetch_xor, __VA_ARGS__) +# endif +# ifndef atomic64_fetch_xor_release +# define atomic64_fetch_xor_release(...) __atomic_op_release(atomic64_fetch_xor, __VA_ARGS__) +# endif +# ifndef atomic64_fetch_xor +# define atomic64_fetch_xor(...) __atomic_op_fence(atomic64_fetch_xor, __VA_ARGS__) #endif - -#ifndef atomic64_fetch_xor -#define atomic64_fetch_xor(...) \ - __atomic_op_fence(atomic64_fetch_xor, __VA_ARGS__) #endif -#endif /* atomic64_fetch_xor_relaxed */ +/* atomic64_xchg_relaxed() et al: */ -/* atomic64_xchg_relaxed */ #ifndef atomic64_xchg_relaxed -#define atomic64_xchg_relaxed atomic64_xchg -#define atomic64_xchg_acquire atomic64_xchg -#define atomic64_xchg_release atomic64_xchg - -#else /* atomic64_xchg_relaxed */ - -#ifndef atomic64_xchg_acquire -#define atomic64_xchg_acquire(...) 
\ - __atomic_op_acquire(atomic64_xchg, __VA_ARGS__) -#endif +# define atomic64_xchg_relaxed atomic64_xchg +# define atomic64_xchg_acquire atomic64_xchg +# define atomic64_xchg_release atomic64_xchg +#else +# ifndef atomic64_xchg_acquire +# define atomic64_xchg_acquire(...) __atomic_op_acquire(atomic64_xchg, __VA_ARGS__) +# endif +# ifndef atomic64_xchg_release +# define atomic64_xchg_release(...) __atomic_op_release(atomic64_xchg, __VA_ARGS__) +# endif +# ifndef atomic64_xchg +# define atomic64_xchg(...) __atomic_op_fence(atomic64_xchg, __VA_ARGS__) +# endif +#endif + +/* atomic64_cmpxchg_relaxed() et al: */ -#ifndef atomic64_xchg_release -#define atomic64_xchg_release(...) \ - __atomic_op_release(atomic64_xchg, __VA_ARGS__) -#endif - -#ifndef atomic64_xchg -#define atomic64_xchg(...) \ - __atomic_op_fence(atomic64_xchg, __VA_ARGS__) -#endif -#endif /* atomic64_xchg_relaxed */ - -/* atomic64_cmpxchg_relaxed */ #ifndef atomic64_cmpxchg_relaxed -#define atomic64_cmpxchg_relaxed atomic64_cmpxchg -#define atomic64_cmpxchg_acquire atomic64_cmpxchg -#define atomic64_cmpxchg_release atomic64_cmpxchg - -#else /* atomic64_cmpxchg_relaxed */ - -#ifndef atomic64_cmpxchg_acquire -#define atomic64_cmpxchg_acquire(...) \ - __atomic_op_acquire(atomic64_cmpxchg, __VA_ARGS__) -#endif - -#ifndef atomic64_cmpxchg_release -#define atomic64_cmpxchg_release(...) \ - __atomic_op_release(atomic64_cmpxchg, __VA_ARGS__) -#endif - -#ifndef atomic64_cmpxchg -#define atomic64_cmpxchg(...) \ - __atomic_op_fence(atomic64_cmpxchg, __VA_ARGS__) +# define atomic64_cmpxchg_relaxed atomic64_cmpxchg +# define atomic64_cmpxchg_acquire atomic64_cmpxchg +# define atomic64_cmpxchg_release atomic64_cmpxchg +#else +# ifndef atomic64_cmpxchg_acquire +# define atomic64_cmpxchg_acquire(...) __atomic_op_acquire(atomic64_cmpxchg, __VA_ARGS__) +# endif +# ifndef atomic64_cmpxchg_release +# define atomic64_cmpxchg_release(...) 
__atomic_op_release(atomic64_cmpxchg, __VA_ARGS__) +# endif +# ifndef atomic64_cmpxchg +# define atomic64_cmpxchg(...) __atomic_op_fence(atomic64_cmpxchg, __VA_ARGS__) +# endif #endif -#endif /* atomic64_cmpxchg_relaxed */ #ifndef atomic64_try_cmpxchg - -#define __atomic64_try_cmpxchg(type, _p, _po, _n) \ -({ \ +# define __atomic64_try_cmpxchg(type, _p, _po, _n) \ + ({ \ typeof(_po) __po = (_po); \ typeof(*(_po)) __r, __o = *__po; \ __r = atomic64_cmpxchg##type((_p), __o, (_n)); \ if (unlikely(__r != __o)) \ *__po = __r; \ likely(__r == __o); \ -}) - -#define atomic64_try_cmpxchg(_p, _po, _n) __atomic64_try_cmpxchg(, _p, _po, _n) -#define atomic64_try_cmpxchg_relaxed(_p, _po, _n) __atomic64_try_cmpxchg(_relaxed, _p, _po, _n) -#define atomic64_try_cmpxchg_acquire(_p, _po, _n) __atomic64_try_cmpxchg(_acquire, _p, _po, _n) -#define atomic64_try_cmpxchg_release(_p, _po, _n) __atomic64_try_cmpxchg(_release, _p, _po, _n) - -#else /* atomic64_try_cmpxchg */ -#define atomic64_try_cmpxchg_relaxed atomic64_try_cmpxchg -#define atomic64_try_cmpxchg_acquire atomic64_try_cmpxchg -#define atomic64_try_cmpxchg_release atomic64_try_cmpxchg -#endif /* atomic64_try_cmpxchg */ + }) +# define atomic64_try_cmpxchg(_p, _po, _n) __atomic64_try_cmpxchg(, _p, _po, _n) +# define atomic64_try_cmpxchg_relaxed(_p, _po, _n) __atomic64_try_cmpxchg(_relaxed, _p, _po, _n) +# define atomic64_try_cmpxchg_acquire(_p, _po, _n) __atomic64_try_cmpxchg(_acquire, _p, _po, _n) +# define atomic64_try_cmpxchg_release(_p, _po, _n) __atomic64_try_cmpxchg(_release, _p, _po, _n) +#else +# define atomic64_try_cmpxchg_relaxed atomic64_try_cmpxchg +# define atomic64_try_cmpxchg_acquire atomic64_try_cmpxchg +# define atomic64_try_cmpxchg_release atomic64_try_cmpxchg +#endif #ifndef atomic64_andnot static inline void atomic64_andnot(long long i, atomic64_t *v)
	__atomic_op_release(atomic_fetch_dec, __VA_ARGS__)
 # endif
 # ifndef atomic_fetch_dec
 # define atomic_fetch_dec(...)			__atomic_op_fence(atomic_fetch_dec, __VA_ARGS__)
 # endif
 #endif

The new variant is readable at a glance, and the hierarchy of defines is very obvious as well.

And I think we could do even better - there's absolutely no reason why _every_ operation has to be made conditional on a fine-grained level - they are overridden in API groups. In fact, allowing individual overrides is arguably a fragility.

So we could do the following simplification on top of that:

 #ifndef atomic_fetch_dec_relaxed
 # ifndef atomic_fetch_dec
 # define atomic_fetch_dec(v)			atomic_fetch_sub(1, (v))
 # define atomic_fetch_dec_relaxed(v)		atomic_fetch_sub_relaxed(1, (v))
 # define atomic_fetch_dec_acquire(v)		atomic_fetch_sub_acquire(1, (v))
 # define atomic_fetch_dec_release(v)		atomic_fetch_sub_release(1, (v))
 # else
 # define atomic_fetch_dec_relaxed		atomic_fetch_dec
 # define atomic_fetch_dec_acquire		atomic_fetch_dec
 # define atomic_fetch_dec_release		atomic_fetch_dec
 # endif
 #else
 # ifndef atomic_fetch_dec
 # define atomic_fetch_dec(...)			__atomic_op_fence(atomic_fetch_dec, __VA_ARGS__)
 # define atomic_fetch_dec_acquire(...)		__atomic_op_acquire(atomic_fetch_dec, __VA_ARGS__)
 # define atomic_fetch_dec_release(...)		__atomic_op_release(atomic_fetch_dec, __VA_ARGS__)
 # endif
 #endif

Note how the grouping of APIs based on 'atomic_fetch_dec' is already an assumption in the primary !atomic_fetch_dec_relaxed branch.

I much prefer such clear constructs of API mapping versus magic macros.

Thanks,

	Ingo

=============================>

From 0171d4ed840d25befaedcf03e834bb76acb400c0 Mon Sep 17 00:00:00 2001
From: Ingo Molnar <mingo@kernel.org>
Date: Sat, 5 May 2018 09:57:02 +0200
Subject: [PATCH] locking/atomics: Clean up the atomic.h maze of #defines

Use structured defines to make it all much more readable.
Before:

 #ifndef atomic_fetch_dec_relaxed
 #ifndef atomic_fetch_dec
 #define atomic_fetch_dec(v)		atomic_fetch_sub(1, (v))
 #define atomic_fetch_dec_relaxed(v)	atomic_fetch_sub_relaxed(1, (v))
 #define atomic_fetch_dec_acquire(v)	atomic_fetch_sub_acquire(1, (v))
 #define atomic_fetch_dec_release(v)	atomic_fetch_sub_release(1, (v))
 #else /* atomic_fetch_dec */
 #define atomic_fetch_dec_relaxed	atomic_fetch_dec
 #define atomic_fetch_dec_acquire	atomic_fetch_dec
 #define atomic_fetch_dec_release	atomic_fetch_dec
 #endif /* atomic_fetch_dec */
 #else /* atomic_fetch_dec_relaxed */
 #ifndef atomic_fetch_dec_acquire
 #define atomic_fetch_dec_acquire(...)				\
	__atomic_op_acquire(atomic_fetch_dec, __VA_ARGS__)
 #endif
 #ifndef atomic_fetch_dec_release
 #define atomic_fetch_dec_release(...)				\
	__atomic_op_release(atomic_fetch_dec, __VA_ARGS__)
 #endif
 #ifndef atomic_fetch_dec
 #define atomic_fetch_dec(...)					\
	__atomic_op_fence(atomic_fetch_dec, __VA_ARGS__)
 #endif
 #endif /* atomic_fetch_dec_relaxed */

After:

 #ifndef atomic_fetch_dec_relaxed
 # ifndef atomic_fetch_dec
 # define atomic_fetch_dec(v)			atomic_fetch_sub(1, (v))
 # define atomic_fetch_dec_relaxed(v)		atomic_fetch_sub_relaxed(1, (v))
 # define atomic_fetch_dec_acquire(v)		atomic_fetch_sub_acquire(1, (v))
 # define atomic_fetch_dec_release(v)		atomic_fetch_sub_release(1, (v))
 # else
 # define atomic_fetch_dec_relaxed		atomic_fetch_dec
 # define atomic_fetch_dec_acquire		atomic_fetch_dec
 # define atomic_fetch_dec_release		atomic_fetch_dec
 # endif
 #else
 # ifndef atomic_fetch_dec_acquire
 # define atomic_fetch_dec_acquire(...)		__atomic_op_acquire(atomic_fetch_dec, __VA_ARGS__)
 # endif
 # ifndef atomic_fetch_dec_release
 # define atomic_fetch_dec_release(...)		__atomic_op_release(atomic_fetch_dec, __VA_ARGS__)
 # endif
 # ifndef atomic_fetch_dec
 # define atomic_fetch_dec(...)			__atomic_op_fence(atomic_fetch_dec, __VA_ARGS__)
 # endif
 #endif

Beyond the linecount reduction this also makes it easier to follow the
various conditions.
Also clean up a few other minor details and make the code more consistent throughout. No change in functionality. Cc: Peter Zijlstra <peterz@infradead.org> Cc: Linus Torvalds <torvalds@linux-foundation.org> Cc: Andrew Morton <akpm@linux-foundation.org> Cc: Thomas Gleixner <tglx@linutronix.de> Cc: Paul E. McKenney <paulmck@linux.vnet.ibm.com> Cc: Will Deacon <will.deacon@arm.com> Cc: linux-kernel@vger.kernel.org Signed-off-by: Ingo Molnar <mingo@kernel.org> --- include/linux/atomic.h | 1275 +++++++++++++++++++++--------------------------- 1 file changed, 543 insertions(+), 732 deletions(-) diff --git a/include/linux/atomic.h b/include/linux/atomic.h index 01ce3997cb42..dc157c092ae5 100644 --- a/include/linux/atomic.h +++ b/include/linux/atomic.h @@ -24,11 +24,11 @@ */ #ifndef atomic_read_acquire -#define atomic_read_acquire(v) smp_load_acquire(&(v)->counter) +# define atomic_read_acquire(v) smp_load_acquire(&(v)->counter) #endif #ifndef atomic_set_release -#define atomic_set_release(v, i) smp_store_release(&(v)->counter, (i)) +# define atomic_set_release(v, i) smp_store_release(&(v)->counter, (i)) #endif /* @@ -71,454 +71,351 @@ }) #endif -/* atomic_add_return_relaxed */ -#ifndef atomic_add_return_relaxed -#define atomic_add_return_relaxed atomic_add_return -#define atomic_add_return_acquire atomic_add_return -#define atomic_add_return_release atomic_add_return - -#else /* atomic_add_return_relaxed */ - -#ifndef atomic_add_return_acquire -#define atomic_add_return_acquire(...) \ - __atomic_op_acquire(atomic_add_return, __VA_ARGS__) -#endif +/* atomic_add_return_relaxed() et al: */ -#ifndef atomic_add_return_release -#define atomic_add_return_release(...) \ - __atomic_op_release(atomic_add_return, __VA_ARGS__) -#endif - -#ifndef atomic_add_return -#define atomic_add_return(...)
\ - __atomic_op_fence(atomic_add_return, __VA_ARGS__) -#endif -#endif /* atomic_add_return_relaxed */ +#ifndef atomic_add_return_relaxed +# define atomic_add_return_relaxed atomic_add_return +# define atomic_add_return_acquire atomic_add_return +# define atomic_add_return_release atomic_add_return +#else +# ifndef atomic_add_return_acquire +# define atomic_add_return_acquire(...) __atomic_op_acquire(atomic_add_return, __VA_ARGS__) +# endif +# ifndef atomic_add_return_release +# define atomic_add_return_release(...) __atomic_op_release(atomic_add_return, __VA_ARGS__) +# endif +# ifndef atomic_add_return +# define atomic_add_return(...) __atomic_op_fence(atomic_add_return, __VA_ARGS__) +# endif +#endif + +/* atomic_inc_return_relaxed() et al: */ -/* atomic_inc_return_relaxed */ #ifndef atomic_inc_return_relaxed -#define atomic_inc_return_relaxed atomic_inc_return -#define atomic_inc_return_acquire atomic_inc_return -#define atomic_inc_return_release atomic_inc_return - -#else /* atomic_inc_return_relaxed */ - -#ifndef atomic_inc_return_acquire -#define atomic_inc_return_acquire(...) \ - __atomic_op_acquire(atomic_inc_return, __VA_ARGS__) -#endif - -#ifndef atomic_inc_return_release -#define atomic_inc_return_release(...) \ - __atomic_op_release(atomic_inc_return, __VA_ARGS__) -#endif - -#ifndef atomic_inc_return -#define atomic_inc_return(...) \ - __atomic_op_fence(atomic_inc_return, __VA_ARGS__) -#endif -#endif /* atomic_inc_return_relaxed */ +# define atomic_inc_return_relaxed atomic_inc_return +# define atomic_inc_return_acquire atomic_inc_return +# define atomic_inc_return_release atomic_inc_return +#else +# ifndef atomic_inc_return_acquire +# define atomic_inc_return_acquire(...) __atomic_op_acquire(atomic_inc_return, __VA_ARGS__) +# endif +# ifndef atomic_inc_return_release +# define atomic_inc_return_release(...) __atomic_op_release(atomic_inc_return, __VA_ARGS__) +# endif +# ifndef atomic_inc_return +# define atomic_inc_return(...) 
__atomic_op_fence(atomic_inc_return, __VA_ARGS__) +# endif +#endif + +/* atomic_sub_return_relaxed() et al: */ -/* atomic_sub_return_relaxed */ #ifndef atomic_sub_return_relaxed -#define atomic_sub_return_relaxed atomic_sub_return -#define atomic_sub_return_acquire atomic_sub_return -#define atomic_sub_return_release atomic_sub_return - -#else /* atomic_sub_return_relaxed */ - -#ifndef atomic_sub_return_acquire -#define atomic_sub_return_acquire(...) \ - __atomic_op_acquire(atomic_sub_return, __VA_ARGS__) -#endif - -#ifndef atomic_sub_return_release -#define atomic_sub_return_release(...) \ - __atomic_op_release(atomic_sub_return, __VA_ARGS__) -#endif - -#ifndef atomic_sub_return -#define atomic_sub_return(...) \ - __atomic_op_fence(atomic_sub_return, __VA_ARGS__) -#endif -#endif /* atomic_sub_return_relaxed */ +# define atomic_sub_return_relaxed atomic_sub_return +# define atomic_sub_return_acquire atomic_sub_return +# define atomic_sub_return_release atomic_sub_return +#else +# ifndef atomic_sub_return_acquire +# define atomic_sub_return_acquire(...) __atomic_op_acquire(atomic_sub_return, __VA_ARGS__) +# endif +# ifndef atomic_sub_return_release +# define atomic_sub_return_release(...) __atomic_op_release(atomic_sub_return, __VA_ARGS__) +# endif +# ifndef atomic_sub_return +# define atomic_sub_return(...) __atomic_op_fence(atomic_sub_return, __VA_ARGS__) +# endif +#endif + +/* atomic_dec_return_relaxed() et al: */ -/* atomic_dec_return_relaxed */ #ifndef atomic_dec_return_relaxed -#define atomic_dec_return_relaxed atomic_dec_return -#define atomic_dec_return_acquire atomic_dec_return -#define atomic_dec_return_release atomic_dec_return - -#else /* atomic_dec_return_relaxed */ - -#ifndef atomic_dec_return_acquire -#define atomic_dec_return_acquire(...) 
\ - __atomic_op_acquire(atomic_dec_return, __VA_ARGS__) -#endif +# define atomic_dec_return_relaxed atomic_dec_return +# define atomic_dec_return_acquire atomic_dec_return +# define atomic_dec_return_release atomic_dec_return +#else +# ifndef atomic_dec_return_acquire +# define atomic_dec_return_acquire(...) __atomic_op_acquire(atomic_dec_return, __VA_ARGS__) +# endif +# ifndef atomic_dec_return_release +# define atomic_dec_return_release(...) __atomic_op_release(atomic_dec_return, __VA_ARGS__) +# endif +# ifndef atomic_dec_return +# define atomic_dec_return(...) __atomic_op_fence(atomic_dec_return, __VA_ARGS__) +# endif +#endif + +/* atomic_fetch_add_relaxed() et al: */ -#ifndef atomic_dec_return_release -#define atomic_dec_return_release(...) \ - __atomic_op_release(atomic_dec_return, __VA_ARGS__) -#endif - -#ifndef atomic_dec_return -#define atomic_dec_return(...) \ - __atomic_op_fence(atomic_dec_return, __VA_ARGS__) -#endif -#endif /* atomic_dec_return_relaxed */ - - -/* atomic_fetch_add_relaxed */ #ifndef atomic_fetch_add_relaxed -#define atomic_fetch_add_relaxed atomic_fetch_add -#define atomic_fetch_add_acquire atomic_fetch_add -#define atomic_fetch_add_release atomic_fetch_add - -#else /* atomic_fetch_add_relaxed */ - -#ifndef atomic_fetch_add_acquire -#define atomic_fetch_add_acquire(...) \ - __atomic_op_acquire(atomic_fetch_add, __VA_ARGS__) -#endif - -#ifndef atomic_fetch_add_release -#define atomic_fetch_add_release(...) \ - __atomic_op_release(atomic_fetch_add, __VA_ARGS__) -#endif +# define atomic_fetch_add_relaxed atomic_fetch_add +# define atomic_fetch_add_acquire atomic_fetch_add +# define atomic_fetch_add_release atomic_fetch_add +#else +# ifndef atomic_fetch_add_acquire +# define atomic_fetch_add_acquire(...) __atomic_op_acquire(atomic_fetch_add, __VA_ARGS__) +# endif +# ifndef atomic_fetch_add_release +# define atomic_fetch_add_release(...) 
__atomic_op_release(atomic_fetch_add, __VA_ARGS__) +# endif +# ifndef atomic_fetch_add +# define atomic_fetch_add(...) __atomic_op_fence(atomic_fetch_add, __VA_ARGS__) +# endif +#endif + +/* atomic_fetch_inc_relaxed() et al: */ -#ifndef atomic_fetch_add -#define atomic_fetch_add(...) \ - __atomic_op_fence(atomic_fetch_add, __VA_ARGS__) -#endif -#endif /* atomic_fetch_add_relaxed */ - -/* atomic_fetch_inc_relaxed */ #ifndef atomic_fetch_inc_relaxed +# ifndef atomic_fetch_inc +# define atomic_fetch_inc(v) atomic_fetch_add(1, (v)) +# define atomic_fetch_inc_relaxed(v) atomic_fetch_add_relaxed(1, (v)) +# define atomic_fetch_inc_acquire(v) atomic_fetch_add_acquire(1, (v)) +# define atomic_fetch_inc_release(v) atomic_fetch_add_release(1, (v)) +# else +# define atomic_fetch_inc_relaxed atomic_fetch_inc +# define atomic_fetch_inc_acquire atomic_fetch_inc +# define atomic_fetch_inc_release atomic_fetch_inc +# endif +#else +# ifndef atomic_fetch_inc_acquire +# define atomic_fetch_inc_acquire(...) __atomic_op_acquire(atomic_fetch_inc, __VA_ARGS__) +# endif +# ifndef atomic_fetch_inc_release +# define atomic_fetch_inc_release(...) __atomic_op_release(atomic_fetch_inc, __VA_ARGS__) +# endif +# ifndef atomic_fetch_inc +# define atomic_fetch_inc(...) __atomic_op_fence(atomic_fetch_inc, __VA_ARGS__) +# endif +#endif + +/* atomic_fetch_sub_relaxed() et al: */ -#ifndef atomic_fetch_inc -#define atomic_fetch_inc(v) atomic_fetch_add(1, (v)) -#define atomic_fetch_inc_relaxed(v) atomic_fetch_add_relaxed(1, (v)) -#define atomic_fetch_inc_acquire(v) atomic_fetch_add_acquire(1, (v)) -#define atomic_fetch_inc_release(v) atomic_fetch_add_release(1, (v)) -#else /* atomic_fetch_inc */ -#define atomic_fetch_inc_relaxed atomic_fetch_inc -#define atomic_fetch_inc_acquire atomic_fetch_inc -#define atomic_fetch_inc_release atomic_fetch_inc -#endif /* atomic_fetch_inc */ - -#else /* atomic_fetch_inc_relaxed */ - -#ifndef atomic_fetch_inc_acquire -#define atomic_fetch_inc_acquire(...) 
\ - __atomic_op_acquire(atomic_fetch_inc, __VA_ARGS__) -#endif - -#ifndef atomic_fetch_inc_release -#define atomic_fetch_inc_release(...) \ - __atomic_op_release(atomic_fetch_inc, __VA_ARGS__) -#endif - -#ifndef atomic_fetch_inc -#define atomic_fetch_inc(...) \ - __atomic_op_fence(atomic_fetch_inc, __VA_ARGS__) -#endif -#endif /* atomic_fetch_inc_relaxed */ - -/* atomic_fetch_sub_relaxed */ #ifndef atomic_fetch_sub_relaxed -#define atomic_fetch_sub_relaxed atomic_fetch_sub -#define atomic_fetch_sub_acquire atomic_fetch_sub -#define atomic_fetch_sub_release atomic_fetch_sub - -#else /* atomic_fetch_sub_relaxed */ - -#ifndef atomic_fetch_sub_acquire -#define atomic_fetch_sub_acquire(...) \ - __atomic_op_acquire(atomic_fetch_sub, __VA_ARGS__) -#endif +# define atomic_fetch_sub_relaxed atomic_fetch_sub +# define atomic_fetch_sub_acquire atomic_fetch_sub +# define atomic_fetch_sub_release atomic_fetch_sub +#else +# ifndef atomic_fetch_sub_acquire +# define atomic_fetch_sub_acquire(...) __atomic_op_acquire(atomic_fetch_sub, __VA_ARGS__) +# endif +# ifndef atomic_fetch_sub_release +# define atomic_fetch_sub_release(...) __atomic_op_release(atomic_fetch_sub, __VA_ARGS__) +# endif +# ifndef atomic_fetch_sub +# define atomic_fetch_sub(...) __atomic_op_fence(atomic_fetch_sub, __VA_ARGS__) +# endif +#endif + +/* atomic_fetch_dec_relaxed() et al: */ -#ifndef atomic_fetch_sub_release -#define atomic_fetch_sub_release(...) \ - __atomic_op_release(atomic_fetch_sub, __VA_ARGS__) -#endif - -#ifndef atomic_fetch_sub -#define atomic_fetch_sub(...) 
\ - __atomic_op_fence(atomic_fetch_sub, __VA_ARGS__) -#endif -#endif /* atomic_fetch_sub_relaxed */ - -/* atomic_fetch_dec_relaxed */ #ifndef atomic_fetch_dec_relaxed +# ifndef atomic_fetch_dec +# define atomic_fetch_dec(v) atomic_fetch_sub(1, (v)) +# define atomic_fetch_dec_relaxed(v) atomic_fetch_sub_relaxed(1, (v)) +# define atomic_fetch_dec_acquire(v) atomic_fetch_sub_acquire(1, (v)) +# define atomic_fetch_dec_release(v) atomic_fetch_sub_release(1, (v)) +# else +# define atomic_fetch_dec_relaxed atomic_fetch_dec +# define atomic_fetch_dec_acquire atomic_fetch_dec +# define atomic_fetch_dec_release atomic_fetch_dec +# endif +#else +# ifndef atomic_fetch_dec_acquire +# define atomic_fetch_dec_acquire(...) __atomic_op_acquire(atomic_fetch_dec, __VA_ARGS__) +# endif +# ifndef atomic_fetch_dec_release +# define atomic_fetch_dec_release(...) __atomic_op_release(atomic_fetch_dec, __VA_ARGS__) +# endif +# ifndef atomic_fetch_dec +# define atomic_fetch_dec(...) __atomic_op_fence(atomic_fetch_dec, __VA_ARGS__) +# endif +#endif + +/* atomic_fetch_or_relaxed() et al: */ -#ifndef atomic_fetch_dec -#define atomic_fetch_dec(v) atomic_fetch_sub(1, (v)) -#define atomic_fetch_dec_relaxed(v) atomic_fetch_sub_relaxed(1, (v)) -#define atomic_fetch_dec_acquire(v) atomic_fetch_sub_acquire(1, (v)) -#define atomic_fetch_dec_release(v) atomic_fetch_sub_release(1, (v)) -#else /* atomic_fetch_dec */ -#define atomic_fetch_dec_relaxed atomic_fetch_dec -#define atomic_fetch_dec_acquire atomic_fetch_dec -#define atomic_fetch_dec_release atomic_fetch_dec -#endif /* atomic_fetch_dec */ - -#else /* atomic_fetch_dec_relaxed */ - -#ifndef atomic_fetch_dec_acquire -#define atomic_fetch_dec_acquire(...) \ - __atomic_op_acquire(atomic_fetch_dec, __VA_ARGS__) -#endif - -#ifndef atomic_fetch_dec_release -#define atomic_fetch_dec_release(...) \ - __atomic_op_release(atomic_fetch_dec, __VA_ARGS__) -#endif - -#ifndef atomic_fetch_dec -#define atomic_fetch_dec(...) 
\ - __atomic_op_fence(atomic_fetch_dec, __VA_ARGS__) -#endif -#endif /* atomic_fetch_dec_relaxed */ - -/* atomic_fetch_or_relaxed */ #ifndef atomic_fetch_or_relaxed -#define atomic_fetch_or_relaxed atomic_fetch_or -#define atomic_fetch_or_acquire atomic_fetch_or -#define atomic_fetch_or_release atomic_fetch_or - -#else /* atomic_fetch_or_relaxed */ - -#ifndef atomic_fetch_or_acquire -#define atomic_fetch_or_acquire(...) \ - __atomic_op_acquire(atomic_fetch_or, __VA_ARGS__) -#endif - -#ifndef atomic_fetch_or_release -#define atomic_fetch_or_release(...) \ - __atomic_op_release(atomic_fetch_or, __VA_ARGS__) -#endif - -#ifndef atomic_fetch_or -#define atomic_fetch_or(...) \ - __atomic_op_fence(atomic_fetch_or, __VA_ARGS__) -#endif -#endif /* atomic_fetch_or_relaxed */ +# define atomic_fetch_or_relaxed atomic_fetch_or +# define atomic_fetch_or_acquire atomic_fetch_or +# define atomic_fetch_or_release atomic_fetch_or +#else +# ifndef atomic_fetch_or_acquire +# define atomic_fetch_or_acquire(...) __atomic_op_acquire(atomic_fetch_or, __VA_ARGS__) +# endif +# ifndef atomic_fetch_or_release +# define atomic_fetch_or_release(...) __atomic_op_release(atomic_fetch_or, __VA_ARGS__) +# endif +# ifndef atomic_fetch_or +# define atomic_fetch_or(...) __atomic_op_fence(atomic_fetch_or, __VA_ARGS__) +# endif +#endif + +/* atomic_fetch_and_relaxed() et al: */ -/* atomic_fetch_and_relaxed */ #ifndef atomic_fetch_and_relaxed -#define atomic_fetch_and_relaxed atomic_fetch_and -#define atomic_fetch_and_acquire atomic_fetch_and -#define atomic_fetch_and_release atomic_fetch_and - -#else /* atomic_fetch_and_relaxed */ - -#ifndef atomic_fetch_and_acquire -#define atomic_fetch_and_acquire(...) \ - __atomic_op_acquire(atomic_fetch_and, __VA_ARGS__) -#endif - -#ifndef atomic_fetch_and_release -#define atomic_fetch_and_release(...) \ - __atomic_op_release(atomic_fetch_and, __VA_ARGS__) -#endif - -#ifndef atomic_fetch_and -#define atomic_fetch_and(...) 
\ - __atomic_op_fence(atomic_fetch_and, __VA_ARGS__) +# define atomic_fetch_and_relaxed atomic_fetch_and +# define atomic_fetch_and_acquire atomic_fetch_and +# define atomic_fetch_and_release atomic_fetch_and +#else +# ifndef atomic_fetch_and_acquire +# define atomic_fetch_and_acquire(...) __atomic_op_acquire(atomic_fetch_and, __VA_ARGS__) +# endif +# ifndef atomic_fetch_and_release +# define atomic_fetch_and_release(...) __atomic_op_release(atomic_fetch_and, __VA_ARGS__) +# endif +# ifndef atomic_fetch_and +# define atomic_fetch_and(...) __atomic_op_fence(atomic_fetch_and, __VA_ARGS__) +# endif #endif -#endif /* atomic_fetch_and_relaxed */ #ifdef atomic_andnot -/* atomic_fetch_andnot_relaxed */ -#ifndef atomic_fetch_andnot_relaxed -#define atomic_fetch_andnot_relaxed atomic_fetch_andnot -#define atomic_fetch_andnot_acquire atomic_fetch_andnot -#define atomic_fetch_andnot_release atomic_fetch_andnot - -#else /* atomic_fetch_andnot_relaxed */ -#ifndef atomic_fetch_andnot_acquire -#define atomic_fetch_andnot_acquire(...) \ - __atomic_op_acquire(atomic_fetch_andnot, __VA_ARGS__) -#endif +/* atomic_fetch_andnot_relaxed() et al: */ -#ifndef atomic_fetch_andnot_release -#define atomic_fetch_andnot_release(...) \ - __atomic_op_release(atomic_fetch_andnot, __VA_ARGS__) +#ifndef atomic_fetch_andnot_relaxed +# define atomic_fetch_andnot_relaxed atomic_fetch_andnot +# define atomic_fetch_andnot_acquire atomic_fetch_andnot +# define atomic_fetch_andnot_release atomic_fetch_andnot +#else +# ifndef atomic_fetch_andnot_acquire +# define atomic_fetch_andnot_acquire(...) __atomic_op_acquire(atomic_fetch_andnot, __VA_ARGS__) +# endif +# ifndef atomic_fetch_andnot_release +# define atomic_fetch_andnot_release(...) __atomic_op_release(atomic_fetch_andnot, __VA_ARGS__) +# endif +# ifndef atomic_fetch_andnot +# define atomic_fetch_andnot(...) __atomic_op_fence(atomic_fetch_andnot, __VA_ARGS__) +# endif #endif -#ifndef atomic_fetch_andnot -#define atomic_fetch_andnot(...) 
\ - __atomic_op_fence(atomic_fetch_andnot, __VA_ARGS__) -#endif -#endif /* atomic_fetch_andnot_relaxed */ #endif /* atomic_andnot */ -/* atomic_fetch_xor_relaxed */ -#ifndef atomic_fetch_xor_relaxed -#define atomic_fetch_xor_relaxed atomic_fetch_xor -#define atomic_fetch_xor_acquire atomic_fetch_xor -#define atomic_fetch_xor_release atomic_fetch_xor - -#else /* atomic_fetch_xor_relaxed */ +/* atomic_fetch_xor_relaxed() et al: */ -#ifndef atomic_fetch_xor_acquire -#define atomic_fetch_xor_acquire(...) \ - __atomic_op_acquire(atomic_fetch_xor, __VA_ARGS__) -#endif - -#ifndef atomic_fetch_xor_release -#define atomic_fetch_xor_release(...) \ - __atomic_op_release(atomic_fetch_xor, __VA_ARGS__) +#ifndef atomic_fetch_xor_relaxed +# define atomic_fetch_xor_relaxed atomic_fetch_xor +# define atomic_fetch_xor_acquire atomic_fetch_xor +# define atomic_fetch_xor_release atomic_fetch_xor +#else +# ifndef atomic_fetch_xor_acquire +# define atomic_fetch_xor_acquire(...) __atomic_op_acquire(atomic_fetch_xor, __VA_ARGS__) +# endif +# ifndef atomic_fetch_xor_release +# define atomic_fetch_xor_release(...) __atomic_op_release(atomic_fetch_xor, __VA_ARGS__) +# endif +# ifndef atomic_fetch_xor +# define atomic_fetch_xor(...) __atomic_op_fence(atomic_fetch_xor, __VA_ARGS__) +# endif #endif -#ifndef atomic_fetch_xor -#define atomic_fetch_xor(...) \ - __atomic_op_fence(atomic_fetch_xor, __VA_ARGS__) -#endif -#endif /* atomic_fetch_xor_relaxed */ +/* atomic_xchg_relaxed() et al: */ -/* atomic_xchg_relaxed */ #ifndef atomic_xchg_relaxed -#define atomic_xchg_relaxed atomic_xchg -#define atomic_xchg_acquire atomic_xchg -#define atomic_xchg_release atomic_xchg - -#else /* atomic_xchg_relaxed */ - -#ifndef atomic_xchg_acquire -#define atomic_xchg_acquire(...) \ - __atomic_op_acquire(atomic_xchg, __VA_ARGS__) -#endif - -#ifndef atomic_xchg_release -#define atomic_xchg_release(...) 
\ - __atomic_op_release(atomic_xchg, __VA_ARGS__) -#endif +# define atomic_xchg_relaxed atomic_xchg +# define atomic_xchg_acquire atomic_xchg +# define atomic_xchg_release atomic_xchg +#else +# ifndef atomic_xchg_acquire +# define atomic_xchg_acquire(...) __atomic_op_acquire(atomic_xchg, __VA_ARGS__) +# endif +# ifndef atomic_xchg_release +# define atomic_xchg_release(...) __atomic_op_release(atomic_xchg, __VA_ARGS__) +# endif +# ifndef atomic_xchg +# define atomic_xchg(...) __atomic_op_fence(atomic_xchg, __VA_ARGS__) +# endif +#endif + +/* atomic_cmpxchg_relaxed() et al: */ -#ifndef atomic_xchg -#define atomic_xchg(...) \ - __atomic_op_fence(atomic_xchg, __VA_ARGS__) -#endif -#endif /* atomic_xchg_relaxed */ - -/* atomic_cmpxchg_relaxed */ #ifndef atomic_cmpxchg_relaxed -#define atomic_cmpxchg_relaxed atomic_cmpxchg -#define atomic_cmpxchg_acquire atomic_cmpxchg -#define atomic_cmpxchg_release atomic_cmpxchg - -#else /* atomic_cmpxchg_relaxed */ - -#ifndef atomic_cmpxchg_acquire -#define atomic_cmpxchg_acquire(...) \ - __atomic_op_acquire(atomic_cmpxchg, __VA_ARGS__) +# define atomic_cmpxchg_relaxed atomic_cmpxchg +# define atomic_cmpxchg_acquire atomic_cmpxchg +# define atomic_cmpxchg_release atomic_cmpxchg +#else +# ifndef atomic_cmpxchg_acquire +# define atomic_cmpxchg_acquire(...) __atomic_op_acquire(atomic_cmpxchg, __VA_ARGS__) +# endif +# ifndef atomic_cmpxchg_release +# define atomic_cmpxchg_release(...) __atomic_op_release(atomic_cmpxchg, __VA_ARGS__) +# endif +# ifndef atomic_cmpxchg +# define atomic_cmpxchg(...) __atomic_op_fence(atomic_cmpxchg, __VA_ARGS__) +# endif #endif -#ifndef atomic_cmpxchg_release -#define atomic_cmpxchg_release(...)
\ - __atomic_op_fence(atomic_cmpxchg, __VA_ARGS__) -#endif -#endif /* atomic_cmpxchg_relaxed */ - #ifndef atomic_try_cmpxchg - -#define __atomic_try_cmpxchg(type, _p, _po, _n) \ -({ \ +# define __atomic_try_cmpxchg(type, _p, _po, _n) \ + ({ \ typeof(_po) __po = (_po); \ typeof(*(_po)) __r, __o = *__po; \ __r = atomic_cmpxchg##type((_p), __o, (_n)); \ if (unlikely(__r != __o)) \ *__po = __r; \ likely(__r == __o); \ -}) - -#define atomic_try_cmpxchg(_p, _po, _n) __atomic_try_cmpxchg(, _p, _po, _n) -#define atomic_try_cmpxchg_relaxed(_p, _po, _n) __atomic_try_cmpxchg(_relaxed, _p, _po, _n) -#define atomic_try_cmpxchg_acquire(_p, _po, _n) __atomic_try_cmpxchg(_acquire, _p, _po, _n) -#define atomic_try_cmpxchg_release(_p, _po, _n) __atomic_try_cmpxchg(_release, _p, _po, _n) - -#else /* atomic_try_cmpxchg */ -#define atomic_try_cmpxchg_relaxed atomic_try_cmpxchg -#define atomic_try_cmpxchg_acquire atomic_try_cmpxchg -#define atomic_try_cmpxchg_release atomic_try_cmpxchg -#endif /* atomic_try_cmpxchg */ - -/* cmpxchg_relaxed */ -#ifndef cmpxchg_relaxed -#define cmpxchg_relaxed cmpxchg -#define cmpxchg_acquire cmpxchg -#define cmpxchg_release cmpxchg - -#else /* cmpxchg_relaxed */ - -#ifndef cmpxchg_acquire -#define cmpxchg_acquire(...) \ - __atomic_op_acquire(cmpxchg, __VA_ARGS__) + }) +# define atomic_try_cmpxchg(_p, _po, _n) __atomic_try_cmpxchg(, _p, _po, _n) +# define atomic_try_cmpxchg_relaxed(_p, _po, _n) __atomic_try_cmpxchg(_relaxed, _p, _po, _n) +# define atomic_try_cmpxchg_acquire(_p, _po, _n) __atomic_try_cmpxchg(_acquire, _p, _po, _n) +# define atomic_try_cmpxchg_release(_p, _po, _n) __atomic_try_cmpxchg(_release, _p, _po, _n) +#else +# define atomic_try_cmpxchg_relaxed atomic_try_cmpxchg +# define atomic_try_cmpxchg_acquire atomic_try_cmpxchg +# define atomic_try_cmpxchg_release atomic_try_cmpxchg #endif -#ifndef cmpxchg_release -#define cmpxchg_release(...) 
\ - __atomic_op_release(cmpxchg, __VA_ARGS__) -#endif +/* cmpxchg_relaxed() et al: */ -#ifndef cmpxchg -#define cmpxchg(...) \ - __atomic_op_fence(cmpxchg, __VA_ARGS__) -#endif -#endif /* cmpxchg_relaxed */ +#ifndef cmpxchg_relaxed +# define cmpxchg_relaxed cmpxchg +# define cmpxchg_acquire cmpxchg +# define cmpxchg_release cmpxchg +#else +# ifndef cmpxchg_acquire +# define cmpxchg_acquire(...) __atomic_op_acquire(cmpxchg, __VA_ARGS__) +# endif +# ifndef cmpxchg_release +# define cmpxchg_release(...) __atomic_op_release(cmpxchg, __VA_ARGS__) +# endif +# ifndef cmpxchg +# define cmpxchg(...) __atomic_op_fence(cmpxchg, __VA_ARGS__) +# endif +#endif + +/* cmpxchg64_relaxed() et al: */ -/* cmpxchg64_relaxed */ #ifndef cmpxchg64_relaxed -#define cmpxchg64_relaxed cmpxchg64 -#define cmpxchg64_acquire cmpxchg64 -#define cmpxchg64_release cmpxchg64 - -#else /* cmpxchg64_relaxed */ - -#ifndef cmpxchg64_acquire -#define cmpxchg64_acquire(...) \ - __atomic_op_acquire(cmpxchg64, __VA_ARGS__) -#endif - -#ifndef cmpxchg64_release -#define cmpxchg64_release(...) \ - __atomic_op_release(cmpxchg64, __VA_ARGS__) -#endif - -#ifndef cmpxchg64 -#define cmpxchg64(...) \ - __atomic_op_fence(cmpxchg64, __VA_ARGS__) -#endif -#endif /* cmpxchg64_relaxed */ +# define cmpxchg64_relaxed cmpxchg64 +# define cmpxchg64_acquire cmpxchg64 +# define cmpxchg64_release cmpxchg64 +#else +# ifndef cmpxchg64_acquire +# define cmpxchg64_acquire(...) __atomic_op_acquire(cmpxchg64, __VA_ARGS__) +# endif +# ifndef cmpxchg64_release +# define cmpxchg64_release(...) __atomic_op_release(cmpxchg64, __VA_ARGS__) +# endif +# ifndef cmpxchg64 +# define cmpxchg64(...) __atomic_op_fence(cmpxchg64, __VA_ARGS__) +# endif +#endif + +/* xchg_relaxed() et al: */ -/* xchg_relaxed */ #ifndef xchg_relaxed -#define xchg_relaxed xchg -#define xchg_acquire xchg -#define xchg_release xchg - -#else /* xchg_relaxed */ - -#ifndef xchg_acquire -#define xchg_acquire(...) 
__atomic_op_acquire(xchg, __VA_ARGS__) -#endif - -#ifndef xchg_release -#define xchg_release(...) __atomic_op_release(xchg, __VA_ARGS__) +# define xchg_relaxed xchg +# define xchg_acquire xchg +# define xchg_release xchg +#else +# ifndef xchg_acquire +# define xchg_acquire(...) __atomic_op_acquire(xchg, __VA_ARGS__) +# endif +# ifndef xchg_release +# define xchg_release(...) __atomic_op_release(xchg, __VA_ARGS__) +# endif +# ifndef xchg +# define xchg(...) __atomic_op_fence(xchg, __VA_ARGS__) +# endif #endif -#ifndef xchg -#define xchg(...) __atomic_op_fence(xchg, __VA_ARGS__) -#endif -#endif /* xchg_relaxed */ - /** * atomic_add_unless - add unless the number is already a given value * @v: pointer of type atomic_t @@ -541,7 +438,7 @@ static inline int atomic_add_unless(atomic_t *v, int a, int u) * Returns non-zero if @v was non-zero, and zero otherwise. */ #ifndef atomic_inc_not_zero -#define atomic_inc_not_zero(v) atomic_add_unless((v), 1, 0) +# define atomic_inc_not_zero(v) atomic_add_unless((v), 1, 0) #endif #ifndef atomic_andnot @@ -607,6 +504,7 @@ static inline int atomic_inc_not_zero_hint(atomic_t *v, int hint) static inline int atomic_inc_unless_negative(atomic_t *p) { int v, v1; + for (v = 0; v >= 0; v = v1) { v1 = atomic_cmpxchg(p, v, v + 1); if (likely(v1 == v)) @@ -620,6 +518,7 @@ static inline int atomic_inc_unless_negative(atomic_t *p) static inline int atomic_dec_unless_positive(atomic_t *p) { int v, v1; + for (v = 0; v <= 0; v = v1) { v1 = atomic_cmpxchg(p, v, v - 1); if (likely(v1 == v)) @@ -640,6 +539,7 @@ static inline int atomic_dec_unless_positive(atomic_t *p) static inline int atomic_dec_if_positive(atomic_t *v) { int c, old, dec; + c = atomic_read(v); for (;;) { dec = c - 1; @@ -654,400 +554,311 @@ static inline int atomic_dec_if_positive(atomic_t *v) } #endif -#define atomic_cond_read_relaxed(v, c) smp_cond_load_relaxed(&(v)->counter, (c)) -#define atomic_cond_read_acquire(v, c) smp_cond_load_acquire(&(v)->counter, (c)) +#define 
atomic_cond_read_relaxed(v, c) smp_cond_load_relaxed(&(v)->counter, (c)) +#define atomic_cond_read_acquire(v, c) smp_cond_load_acquire(&(v)->counter, (c)) #ifdef CONFIG_GENERIC_ATOMIC64 #include <asm-generic/atomic64.h> #endif #ifndef atomic64_read_acquire -#define atomic64_read_acquire(v) smp_load_acquire(&(v)->counter) +# define atomic64_read_acquire(v) smp_load_acquire(&(v)->counter) #endif #ifndef atomic64_set_release -#define atomic64_set_release(v, i) smp_store_release(&(v)->counter, (i)) -#endif - -/* atomic64_add_return_relaxed */ -#ifndef atomic64_add_return_relaxed -#define atomic64_add_return_relaxed atomic64_add_return -#define atomic64_add_return_acquire atomic64_add_return -#define atomic64_add_return_release atomic64_add_return - -#else /* atomic64_add_return_relaxed */ - -#ifndef atomic64_add_return_acquire -#define atomic64_add_return_acquire(...) \ - __atomic_op_acquire(atomic64_add_return, __VA_ARGS__) +# define atomic64_set_release(v, i) smp_store_release(&(v)->counter, (i)) #endif -#ifndef atomic64_add_return_release -#define atomic64_add_return_release(...) \ - __atomic_op_release(atomic64_add_return, __VA_ARGS__) -#endif +/* atomic64_add_return_relaxed() et al: */ -#ifndef atomic64_add_return -#define atomic64_add_return(...) \ - __atomic_op_fence(atomic64_add_return, __VA_ARGS__) -#endif -#endif /* atomic64_add_return_relaxed */ +#ifndef atomic64_add_return_relaxed +# define atomic64_add_return_relaxed atomic64_add_return +# define atomic64_add_return_acquire atomic64_add_return +# define atomic64_add_return_release atomic64_add_return +#else +# ifndef atomic64_add_return_acquire +# define atomic64_add_return_acquire(...) __atomic_op_acquire(atomic64_add_return, __VA_ARGS__) +# endif +# ifndef atomic64_add_return_release +# define atomic64_add_return_release(...) __atomic_op_release(atomic64_add_return, __VA_ARGS__) +# endif +# ifndef atomic64_add_return +# define atomic64_add_return(...) 
__atomic_op_fence(atomic64_add_return, __VA_ARGS__) +# endif +#endif + +/* atomic64_inc_return_relaxed() et al: */ -/* atomic64_inc_return_relaxed */ #ifndef atomic64_inc_return_relaxed -#define atomic64_inc_return_relaxed atomic64_inc_return -#define atomic64_inc_return_acquire atomic64_inc_return -#define atomic64_inc_return_release atomic64_inc_return - -#else /* atomic64_inc_return_relaxed */ - -#ifndef atomic64_inc_return_acquire -#define atomic64_inc_return_acquire(...) \ - __atomic_op_acquire(atomic64_inc_return, __VA_ARGS__) -#endif - -#ifndef atomic64_inc_return_release -#define atomic64_inc_return_release(...) \ - __atomic_op_release(atomic64_inc_return, __VA_ARGS__) -#endif - -#ifndef atomic64_inc_return -#define atomic64_inc_return(...) \ - __atomic_op_fence(atomic64_inc_return, __VA_ARGS__) -#endif -#endif /* atomic64_inc_return_relaxed */ - +# define atomic64_inc_return_relaxed atomic64_inc_return +# define atomic64_inc_return_acquire atomic64_inc_return +# define atomic64_inc_return_release atomic64_inc_return +#else +# ifndef atomic64_inc_return_acquire +# define atomic64_inc_return_acquire(...) __atomic_op_acquire(atomic64_inc_return, __VA_ARGS__) +# endif +# ifndef atomic64_inc_return_release +# define atomic64_inc_return_release(...) __atomic_op_release(atomic64_inc_return, __VA_ARGS__) +# endif +# ifndef atomic64_inc_return +# define atomic64_inc_return(...) 
						__atomic_op_fence(atomic64_inc_return, __VA_ARGS__)
+# endif
+#endif
+
+/* atomic64_sub_return_relaxed() et al: */
-/* atomic64_sub_return_relaxed */
 #ifndef atomic64_sub_return_relaxed
-#define  atomic64_sub_return_relaxed	atomic64_sub_return
-#define  atomic64_sub_return_acquire	atomic64_sub_return
-#define  atomic64_sub_return_release	atomic64_sub_return
+# define atomic64_sub_return_relaxed		atomic64_sub_return
+# define atomic64_sub_return_acquire		atomic64_sub_return
+# define atomic64_sub_return_release		atomic64_sub_return
+#else
+# ifndef atomic64_sub_return_acquire
+#  define atomic64_sub_return_acquire(...)	__atomic_op_acquire(atomic64_sub_return, __VA_ARGS__)
+# endif
+# ifndef atomic64_sub_return_release
+#  define atomic64_sub_return_release(...)	__atomic_op_release(atomic64_sub_return, __VA_ARGS__)
+# endif
+# ifndef atomic64_sub_return
+#  define atomic64_sub_return(...)		__atomic_op_fence(atomic64_sub_return, __VA_ARGS__)
+# endif
+#endif
+
+/* atomic64_dec_return_relaxed() et al: */
-#else /* atomic64_sub_return_relaxed */
-
-#ifndef atomic64_sub_return_acquire
-#define  atomic64_sub_return_acquire(...)			\
-	__atomic_op_acquire(atomic64_sub_return, __VA_ARGS__)
-#endif
-
-#ifndef atomic64_sub_return_release
-#define  atomic64_sub_return_release(...)			\
-	__atomic_op_release(atomic64_sub_return, __VA_ARGS__)
-#endif
-
-#ifndef atomic64_sub_return
-#define  atomic64_sub_return(...)				\
-	__atomic_op_fence(atomic64_sub_return, __VA_ARGS__)
-#endif
-#endif /* atomic64_sub_return_relaxed */
-
-/* atomic64_dec_return_relaxed */
 #ifndef atomic64_dec_return_relaxed
-#define  atomic64_dec_return_relaxed	atomic64_dec_return
-#define  atomic64_dec_return_acquire	atomic64_dec_return
-#define  atomic64_dec_return_release	atomic64_dec_return
-
-#else /* atomic64_dec_return_relaxed */
-
-#ifndef atomic64_dec_return_acquire
-#define  atomic64_dec_return_acquire(...)			\
-	__atomic_op_acquire(atomic64_dec_return, __VA_ARGS__)
-#endif
-
-#ifndef atomic64_dec_return_release
-#define  atomic64_dec_return_release(...)			\
-	__atomic_op_release(atomic64_dec_return, __VA_ARGS__)
-#endif
-
-#ifndef atomic64_dec_return
-#define  atomic64_dec_return(...)				\
-	__atomic_op_fence(atomic64_dec_return, __VA_ARGS__)
-#endif
-#endif /* atomic64_dec_return_relaxed */
+# define atomic64_dec_return_relaxed		atomic64_dec_return
+# define atomic64_dec_return_acquire		atomic64_dec_return
+# define atomic64_dec_return_release		atomic64_dec_return
+#else
+# ifndef atomic64_dec_return_acquire
+#  define atomic64_dec_return_acquire(...)	__atomic_op_acquire(atomic64_dec_return, __VA_ARGS__)
+# endif
+# ifndef atomic64_dec_return_release
+#  define atomic64_dec_return_release(...)	__atomic_op_release(atomic64_dec_return, __VA_ARGS__)
+# endif
+# ifndef atomic64_dec_return
+#  define atomic64_dec_return(...)		__atomic_op_fence(atomic64_dec_return, __VA_ARGS__)
+# endif
+#endif
+
+/* atomic64_fetch_add_relaxed() et al: */
-
-/* atomic64_fetch_add_relaxed */
 #ifndef atomic64_fetch_add_relaxed
-#define  atomic64_fetch_add_relaxed	atomic64_fetch_add
-#define  atomic64_fetch_add_acquire	atomic64_fetch_add
-#define  atomic64_fetch_add_release	atomic64_fetch_add
-
-#else /* atomic64_fetch_add_relaxed */
-
-#ifndef atomic64_fetch_add_acquire
-#define  atomic64_fetch_add_acquire(...)			\
-	__atomic_op_acquire(atomic64_fetch_add, __VA_ARGS__)
-#endif
+# define atomic64_fetch_add_relaxed		atomic64_fetch_add
+# define atomic64_fetch_add_acquire		atomic64_fetch_add
+# define atomic64_fetch_add_release		atomic64_fetch_add
+#else
+# ifndef atomic64_fetch_add_acquire
+#  define atomic64_fetch_add_acquire(...)	__atomic_op_acquire(atomic64_fetch_add, __VA_ARGS__)
+# endif
+# ifndef atomic64_fetch_add_release
+#  define atomic64_fetch_add_release(...)	__atomic_op_release(atomic64_fetch_add, __VA_ARGS__)
+# endif
+# ifndef atomic64_fetch_add
+#  define atomic64_fetch_add(...)		__atomic_op_fence(atomic64_fetch_add, __VA_ARGS__)
+# endif
+#endif
+
+/* atomic64_fetch_inc_relaxed() et al: */
-#ifndef atomic64_fetch_add_release
-#define  atomic64_fetch_add_release(...)			\
-	__atomic_op_release(atomic64_fetch_add, __VA_ARGS__)
-#endif
-
-#ifndef atomic64_fetch_add
-#define  atomic64_fetch_add(...)				\
-	__atomic_op_fence(atomic64_fetch_add, __VA_ARGS__)
-#endif
-#endif /* atomic64_fetch_add_relaxed */
-
-/* atomic64_fetch_inc_relaxed */
 #ifndef atomic64_fetch_inc_relaxed
+# ifndef atomic64_fetch_inc
+#  define atomic64_fetch_inc(v)			atomic64_fetch_add(1, (v))
+#  define atomic64_fetch_inc_relaxed(v)		atomic64_fetch_add_relaxed(1, (v))
+#  define atomic64_fetch_inc_acquire(v)		atomic64_fetch_add_acquire(1, (v))
+#  define atomic64_fetch_inc_release(v)		atomic64_fetch_add_release(1, (v))
+# else
+#  define atomic64_fetch_inc_relaxed		atomic64_fetch_inc
+#  define atomic64_fetch_inc_acquire		atomic64_fetch_inc
+#  define atomic64_fetch_inc_release		atomic64_fetch_inc
+# endif
+#else
+# ifndef atomic64_fetch_inc_acquire
+#  define atomic64_fetch_inc_acquire(...)	__atomic_op_acquire(atomic64_fetch_inc, __VA_ARGS__)
+# endif
+# ifndef atomic64_fetch_inc_release
+#  define atomic64_fetch_inc_release(...)	__atomic_op_release(atomic64_fetch_inc, __VA_ARGS__)
+# endif
+# ifndef atomic64_fetch_inc
+#  define atomic64_fetch_inc(...)		__atomic_op_fence(atomic64_fetch_inc, __VA_ARGS__)
+# endif
+#endif
+
+/* atomic64_fetch_sub_relaxed() et al: */
-#ifndef atomic64_fetch_inc
-#define  atomic64_fetch_inc(v)		atomic64_fetch_add(1, (v))
-#define  atomic64_fetch_inc_relaxed(v)	atomic64_fetch_add_relaxed(1, (v))
-#define  atomic64_fetch_inc_acquire(v)	atomic64_fetch_add_acquire(1, (v))
-#define  atomic64_fetch_inc_release(v)	atomic64_fetch_add_release(1, (v))
-#else /* atomic64_fetch_inc */
-#define  atomic64_fetch_inc_relaxed	atomic64_fetch_inc
-#define  atomic64_fetch_inc_acquire	atomic64_fetch_inc
-#define  atomic64_fetch_inc_release	atomic64_fetch_inc
-#endif /* atomic64_fetch_inc */
-
-#else /* atomic64_fetch_inc_relaxed */
-
-#ifndef atomic64_fetch_inc_acquire
-#define  atomic64_fetch_inc_acquire(...)			\
-	__atomic_op_acquire(atomic64_fetch_inc, __VA_ARGS__)
-#endif
-
-#ifndef atomic64_fetch_inc_release
-#define  atomic64_fetch_inc_release(...)			\
-	__atomic_op_release(atomic64_fetch_inc, __VA_ARGS__)
-#endif
-
-#ifndef atomic64_fetch_inc
-#define  atomic64_fetch_inc(...)				\
-	__atomic_op_fence(atomic64_fetch_inc, __VA_ARGS__)
-#endif
-#endif /* atomic64_fetch_inc_relaxed */
-
-/* atomic64_fetch_sub_relaxed */
 #ifndef atomic64_fetch_sub_relaxed
-#define  atomic64_fetch_sub_relaxed	atomic64_fetch_sub
-#define  atomic64_fetch_sub_acquire	atomic64_fetch_sub
-#define  atomic64_fetch_sub_release	atomic64_fetch_sub
-
-#else /* atomic64_fetch_sub_relaxed */
-
-#ifndef atomic64_fetch_sub_acquire
-#define  atomic64_fetch_sub_acquire(...)			\
-	__atomic_op_acquire(atomic64_fetch_sub, __VA_ARGS__)
-#endif
-
-#ifndef atomic64_fetch_sub_release
-#define  atomic64_fetch_sub_release(...)			\
-	__atomic_op_release(atomic64_fetch_sub, __VA_ARGS__)
-#endif
-
-#ifndef atomic64_fetch_sub
-#define  atomic64_fetch_sub(...)				\
-	__atomic_op_fence(atomic64_fetch_sub, __VA_ARGS__)
-#endif
-#endif /* atomic64_fetch_sub_relaxed */
+# define atomic64_fetch_sub_relaxed		atomic64_fetch_sub
+# define atomic64_fetch_sub_acquire		atomic64_fetch_sub
+# define atomic64_fetch_sub_release		atomic64_fetch_sub
+#else
+# ifndef atomic64_fetch_sub_acquire
+#  define atomic64_fetch_sub_acquire(...)	__atomic_op_acquire(atomic64_fetch_sub, __VA_ARGS__)
+# endif
+# ifndef atomic64_fetch_sub_release
+#  define atomic64_fetch_sub_release(...)	__atomic_op_release(atomic64_fetch_sub, __VA_ARGS__)
+# endif
+# ifndef atomic64_fetch_sub
+#  define atomic64_fetch_sub(...)		__atomic_op_fence(atomic64_fetch_sub, __VA_ARGS__)
+# endif
+#endif
+
+/* atomic64_fetch_dec_relaxed() et al: */
-/* atomic64_fetch_dec_relaxed */
 #ifndef atomic64_fetch_dec_relaxed
+# ifndef atomic64_fetch_dec
+#  define atomic64_fetch_dec(v)			atomic64_fetch_sub(1, (v))
+#  define atomic64_fetch_dec_relaxed(v)		atomic64_fetch_sub_relaxed(1, (v))
+#  define atomic64_fetch_dec_acquire(v)		atomic64_fetch_sub_acquire(1, (v))
+#  define atomic64_fetch_dec_release(v)		atomic64_fetch_sub_release(1, (v))
+# else
+#  define atomic64_fetch_dec_relaxed		atomic64_fetch_dec
+#  define atomic64_fetch_dec_acquire		atomic64_fetch_dec
+#  define atomic64_fetch_dec_release		atomic64_fetch_dec
+# endif
+#else
+# ifndef atomic64_fetch_dec_acquire
+#  define atomic64_fetch_dec_acquire(...)	__atomic_op_acquire(atomic64_fetch_dec, __VA_ARGS__)
+# endif
+# ifndef atomic64_fetch_dec_release
+#  define atomic64_fetch_dec_release(...)	__atomic_op_release(atomic64_fetch_dec, __VA_ARGS__)
+# endif
+# ifndef atomic64_fetch_dec
+#  define atomic64_fetch_dec(...)		__atomic_op_fence(atomic64_fetch_dec, __VA_ARGS__)
+# endif
+#endif
+
+/* atomic64_fetch_or_relaxed() et al: */
-#ifndef atomic64_fetch_dec
-#define  atomic64_fetch_dec(v)		atomic64_fetch_sub(1, (v))
-#define  atomic64_fetch_dec_relaxed(v)	atomic64_fetch_sub_relaxed(1, (v))
-#define  atomic64_fetch_dec_acquire(v)	atomic64_fetch_sub_acquire(1, (v))
-#define  atomic64_fetch_dec_release(v)	atomic64_fetch_sub_release(1, (v))
-#else /* atomic64_fetch_dec */
-#define  atomic64_fetch_dec_relaxed	atomic64_fetch_dec
-#define  atomic64_fetch_dec_acquire	atomic64_fetch_dec
-#define  atomic64_fetch_dec_release	atomic64_fetch_dec
-#endif /* atomic64_fetch_dec */
-
-#else /* atomic64_fetch_dec_relaxed */
-
-#ifndef atomic64_fetch_dec_acquire
-#define  atomic64_fetch_dec_acquire(...)			\
-	__atomic_op_acquire(atomic64_fetch_dec, __VA_ARGS__)
-#endif
-
-#ifndef atomic64_fetch_dec_release
-#define  atomic64_fetch_dec_release(...)			\
-	__atomic_op_release(atomic64_fetch_dec, __VA_ARGS__)
-#endif
-
-#ifndef atomic64_fetch_dec
-#define  atomic64_fetch_dec(...)				\
-	__atomic_op_fence(atomic64_fetch_dec, __VA_ARGS__)
-#endif
-#endif /* atomic64_fetch_dec_relaxed */
-
-/* atomic64_fetch_or_relaxed */
 #ifndef atomic64_fetch_or_relaxed
-#define  atomic64_fetch_or_relaxed	atomic64_fetch_or
-#define  atomic64_fetch_or_acquire	atomic64_fetch_or
-#define  atomic64_fetch_or_release	atomic64_fetch_or
-
-#else /* atomic64_fetch_or_relaxed */
-
-#ifndef atomic64_fetch_or_acquire
-#define  atomic64_fetch_or_acquire(...)				\
-	__atomic_op_acquire(atomic64_fetch_or, __VA_ARGS__)
+# define atomic64_fetch_or_relaxed		atomic64_fetch_or
+# define atomic64_fetch_or_acquire		atomic64_fetch_or
+# define atomic64_fetch_or_release		atomic64_fetch_or
+#else
+# ifndef atomic64_fetch_or_acquire
+#  define atomic64_fetch_or_acquire(...)	__atomic_op_acquire(atomic64_fetch_or, __VA_ARGS__)
+# endif
+# ifndef atomic64_fetch_or_release
+#  define atomic64_fetch_or_release(...)	__atomic_op_release(atomic64_fetch_or, __VA_ARGS__)
+# endif
+# ifndef atomic64_fetch_or
+#  define atomic64_fetch_or(...)		__atomic_op_fence(atomic64_fetch_or, __VA_ARGS__)
+# endif
 #endif
-#ifndef atomic64_fetch_or_release
-#define  atomic64_fetch_or_release(...)				\
-	__atomic_op_release(atomic64_fetch_or, __VA_ARGS__)
-#endif
-#ifndef atomic64_fetch_or
-#define  atomic64_fetch_or(...)					\
-	__atomic_op_fence(atomic64_fetch_or, __VA_ARGS__)
-#endif
-#endif /* atomic64_fetch_or_relaxed */
+/* atomic64_fetch_and_relaxed() et al: */
-/* atomic64_fetch_and_relaxed */
 #ifndef atomic64_fetch_and_relaxed
-#define  atomic64_fetch_and_relaxed	atomic64_fetch_and
-#define  atomic64_fetch_and_acquire	atomic64_fetch_and
-#define  atomic64_fetch_and_release	atomic64_fetch_and
-
-#else /* atomic64_fetch_and_relaxed */
-
-#ifndef atomic64_fetch_and_acquire
-#define  atomic64_fetch_and_acquire(...)			\
-	__atomic_op_acquire(atomic64_fetch_and, __VA_ARGS__)
+# define atomic64_fetch_and_relaxed		atomic64_fetch_and
+# define atomic64_fetch_and_acquire		atomic64_fetch_and
+# define atomic64_fetch_and_release		atomic64_fetch_and
+#else
+# ifndef atomic64_fetch_and_acquire
+#  define atomic64_fetch_and_acquire(...)	__atomic_op_acquire(atomic64_fetch_and, __VA_ARGS__)
+# endif
+# ifndef atomic64_fetch_and_release
+#  define atomic64_fetch_and_release(...)	__atomic_op_release(atomic64_fetch_and, __VA_ARGS__)
+# endif
+# ifndef atomic64_fetch_and
+#  define atomic64_fetch_and(...)		__atomic_op_fence(atomic64_fetch_and, __VA_ARGS__)
+# endif
 #endif
-#ifndef atomic64_fetch_and_release
-#define  atomic64_fetch_and_release(...)			\
-	__atomic_op_release(atomic64_fetch_and, __VA_ARGS__)
-#endif
-
-#ifndef atomic64_fetch_and
-#define  atomic64_fetch_and(...)				\
-	__atomic_op_fence(atomic64_fetch_and, __VA_ARGS__)
-#endif
-#endif /* atomic64_fetch_and_relaxed */
-
 #ifdef atomic64_andnot
-/* atomic64_fetch_andnot_relaxed */
-#ifndef atomic64_fetch_andnot_relaxed
-#define  atomic64_fetch_andnot_relaxed	atomic64_fetch_andnot
-#define  atomic64_fetch_andnot_acquire	atomic64_fetch_andnot
-#define  atomic64_fetch_andnot_release	atomic64_fetch_andnot
-
-#else /* atomic64_fetch_andnot_relaxed */
-#ifndef atomic64_fetch_andnot_acquire
-#define  atomic64_fetch_andnot_acquire(...)			\
-	__atomic_op_acquire(atomic64_fetch_andnot, __VA_ARGS__)
-#endif
+/* atomic64_fetch_andnot_relaxed() et al: */
-#ifndef atomic64_fetch_andnot_release
-#define  atomic64_fetch_andnot_release(...)			\
-	__atomic_op_release(atomic64_fetch_andnot, __VA_ARGS__)
+#ifndef atomic64_fetch_andnot_relaxed
+# define atomic64_fetch_andnot_relaxed		atomic64_fetch_andnot
+# define atomic64_fetch_andnot_acquire		atomic64_fetch_andnot
+# define atomic64_fetch_andnot_release		atomic64_fetch_andnot
+#else
+# ifndef atomic64_fetch_andnot_acquire
+#  define atomic64_fetch_andnot_acquire(...)	__atomic_op_acquire(atomic64_fetch_andnot, __VA_ARGS__)
+# endif
+# ifndef atomic64_fetch_andnot_release
+#  define atomic64_fetch_andnot_release(...)	__atomic_op_release(atomic64_fetch_andnot, __VA_ARGS__)
+# endif
+# ifndef atomic64_fetch_andnot
+#  define atomic64_fetch_andnot(...)		__atomic_op_fence(atomic64_fetch_andnot, __VA_ARGS__)
+# endif
 #endif
-#ifndef atomic64_fetch_andnot
-#define  atomic64_fetch_andnot(...)				\
-	__atomic_op_fence(atomic64_fetch_andnot, __VA_ARGS__)
-#endif
-#endif /* atomic64_fetch_andnot_relaxed */
 #endif /* atomic64_andnot */
-/* atomic64_fetch_xor_relaxed */
-#ifndef atomic64_fetch_xor_relaxed
-#define  atomic64_fetch_xor_relaxed	atomic64_fetch_xor
-#define  atomic64_fetch_xor_acquire	atomic64_fetch_xor
-#define  atomic64_fetch_xor_release	atomic64_fetch_xor
-
-#else /* atomic64_fetch_xor_relaxed */
+/* atomic64_fetch_xor_relaxed() et al: */
-#ifndef atomic64_fetch_xor_acquire
-#define  atomic64_fetch_xor_acquire(...)			\
-	__atomic_op_acquire(atomic64_fetch_xor, __VA_ARGS__)
-#endif
-
-#ifndef atomic64_fetch_xor_release
-#define  atomic64_fetch_xor_release(...)			\
-	__atomic_op_release(atomic64_fetch_xor, __VA_ARGS__)
+#ifndef atomic64_fetch_xor_relaxed
+# define atomic64_fetch_xor_relaxed		atomic64_fetch_xor
+# define atomic64_fetch_xor_acquire		atomic64_fetch_xor
+# define atomic64_fetch_xor_release		atomic64_fetch_xor
+#else
+# ifndef atomic64_fetch_xor_acquire
+#  define atomic64_fetch_xor_acquire(...)	__atomic_op_acquire(atomic64_fetch_xor, __VA_ARGS__)
+# endif
+# ifndef atomic64_fetch_xor_release
+#  define atomic64_fetch_xor_release(...)	__atomic_op_release(atomic64_fetch_xor, __VA_ARGS__)
+# endif
+# ifndef atomic64_fetch_xor
+#  define atomic64_fetch_xor(...)		__atomic_op_fence(atomic64_fetch_xor, __VA_ARGS__)
 #endif
-
-#ifndef atomic64_fetch_xor
-#define  atomic64_fetch_xor(...)				\
-	__atomic_op_fence(atomic64_fetch_xor, __VA_ARGS__)
 #endif
-#endif /* atomic64_fetch_xor_relaxed */
+/* atomic64_xchg_relaxed() et al: */
-/* atomic64_xchg_relaxed */
 #ifndef atomic64_xchg_relaxed
-#define  atomic64_xchg_relaxed		atomic64_xchg
-#define  atomic64_xchg_acquire		atomic64_xchg
-#define  atomic64_xchg_release		atomic64_xchg
-
-#else /* atomic64_xchg_relaxed */
-
-#ifndef atomic64_xchg_acquire
-#define  atomic64_xchg_acquire(...)				\
-	__atomic_op_acquire(atomic64_xchg, __VA_ARGS__)
-#endif
+# define atomic64_xchg_relaxed			atomic64_xchg
+# define atomic64_xchg_acquire			atomic64_xchg
+# define atomic64_xchg_release			atomic64_xchg
+#else
+# ifndef atomic64_xchg_acquire
+#  define atomic64_xchg_acquire(...)		__atomic_op_acquire(atomic64_xchg, __VA_ARGS__)
+# endif
+# ifndef atomic64_xchg_release
+#  define atomic64_xchg_release(...)		__atomic_op_release(atomic64_xchg, __VA_ARGS__)
+# endif
+# ifndef atomic64_xchg
+#  define atomic64_xchg(...)			__atomic_op_fence(atomic64_xchg, __VA_ARGS__)
+# endif
+#endif
+
+/* atomic64_cmpxchg_relaxed() et al: */
-#ifndef atomic64_xchg_release
-#define  atomic64_xchg_release(...)				\
-	__atomic_op_release(atomic64_xchg, __VA_ARGS__)
-#endif
-
-#ifndef atomic64_xchg
-#define  atomic64_xchg(...)					\
-	__atomic_op_fence(atomic64_xchg, __VA_ARGS__)
-#endif
-#endif /* atomic64_xchg_relaxed */
-
-/* atomic64_cmpxchg_relaxed */
 #ifndef atomic64_cmpxchg_relaxed
-#define  atomic64_cmpxchg_relaxed	atomic64_cmpxchg
-#define  atomic64_cmpxchg_acquire	atomic64_cmpxchg
-#define  atomic64_cmpxchg_release	atomic64_cmpxchg
-
-#else /* atomic64_cmpxchg_relaxed */
-
-#ifndef atomic64_cmpxchg_acquire
-#define  atomic64_cmpxchg_acquire(...)				\
-	__atomic_op_acquire(atomic64_cmpxchg, __VA_ARGS__)
-#endif
-
-#ifndef atomic64_cmpxchg_release
-#define  atomic64_cmpxchg_release(...)				\
-	__atomic_op_release(atomic64_cmpxchg, __VA_ARGS__)
-#endif
-
-#ifndef atomic64_cmpxchg
-#define  atomic64_cmpxchg(...)					\
-	__atomic_op_fence(atomic64_cmpxchg, __VA_ARGS__)
+# define atomic64_cmpxchg_relaxed		atomic64_cmpxchg
+# define atomic64_cmpxchg_acquire		atomic64_cmpxchg
+# define atomic64_cmpxchg_release		atomic64_cmpxchg
+#else
+# ifndef atomic64_cmpxchg_acquire
+#  define atomic64_cmpxchg_acquire(...)		__atomic_op_acquire(atomic64_cmpxchg, __VA_ARGS__)
+# endif
+# ifndef atomic64_cmpxchg_release
+#  define atomic64_cmpxchg_release(...)		__atomic_op_release(atomic64_cmpxchg, __VA_ARGS__)
+# endif
+# ifndef atomic64_cmpxchg
+#  define atomic64_cmpxchg(...)			__atomic_op_fence(atomic64_cmpxchg, __VA_ARGS__)
+# endif
 #endif
-#endif /* atomic64_cmpxchg_relaxed */
 #ifndef atomic64_try_cmpxchg
-
-#define  __atomic64_try_cmpxchg(type, _p, _po, _n)		\
-({								\
+# define __atomic64_try_cmpxchg(type, _p, _po, _n)		\
+	({							\
 	typeof(_po) __po = (_po);				\
 	typeof(*(_po)) __r, __o = *__po;			\
 	__r = atomic64_cmpxchg##type((_p), __o, (_n));		\
 	if (unlikely(__r != __o))				\
 		*__po = __r;					\
 	likely(__r == __o);					\
-})
-
-#define  atomic64_try_cmpxchg(_p, _po, _n)		__atomic64_try_cmpxchg(, _p, _po, _n)
-#define  atomic64_try_cmpxchg_relaxed(_p, _po, _n)	__atomic64_try_cmpxchg(_relaxed, _p, _po, _n)
-#define  atomic64_try_cmpxchg_acquire(_p, _po, _n)	__atomic64_try_cmpxchg(_acquire, _p, _po, _n)
-#define  atomic64_try_cmpxchg_release(_p, _po, _n)	__atomic64_try_cmpxchg(_release, _p, _po, _n)
-
-#else /* atomic64_try_cmpxchg */
-#define  atomic64_try_cmpxchg_relaxed	atomic64_try_cmpxchg
-#define  atomic64_try_cmpxchg_acquire	atomic64_try_cmpxchg
-#define  atomic64_try_cmpxchg_release	atomic64_try_cmpxchg
-#endif /* atomic64_try_cmpxchg */
+	})
+# define atomic64_try_cmpxchg(_p, _po, _n)		__atomic64_try_cmpxchg(, _p, _po, _n)
+# define atomic64_try_cmpxchg_relaxed(_p, _po, _n)	__atomic64_try_cmpxchg(_relaxed, _p, _po, _n)
+# define atomic64_try_cmpxchg_acquire(_p, _po, _n)	__atomic64_try_cmpxchg(_acquire, _p, _po, _n)
+# define atomic64_try_cmpxchg_release(_p, _po, _n)	__atomic64_try_cmpxchg(_release, _p, _po, _n)
+#else
+# define atomic64_try_cmpxchg_relaxed		atomic64_try_cmpxchg
+# define atomic64_try_cmpxchg_acquire		atomic64_try_cmpxchg
+# define atomic64_try_cmpxchg_release		atomic64_try_cmpxchg
+#endif
 #ifndef atomic64_andnot
 static inline void atomic64_andnot(long long i, atomic64_t *v)