* Question about soc_pcm_apply_msb()
@ 2021-12-16  6:53 Kuninori Morimoto
  2021-12-16  9:48 ` Lars-Peter Clausen
  0 siblings, 1 reply; 3+ messages in thread
From: Kuninori Morimoto @ 2021-12-16  6:53 UTC (permalink / raw)
  To: Mark Brown; +Cc: Linux-ALSA, Kuninori Morimoto


Hi ALSA ML

In soc_pcm_apply_msb(),
part (A) finds the max sig_bits for the Codec, and
part (B) finds the max sig_bits for the CPU,
and each is set on the substream via soc_pcm_set_msb() at (X) and (Y).

	static void soc_pcm_apply_msb()
	{
		...
 ^		for_each_rtd_codec_dais(rtd, i, codec_dai) {
(A)			...
 v			bits = max(pcm_codec->sig_bits, bits);
		}

 ^		for_each_rtd_cpu_dais(rtd, i, cpu_dai) {
(B)			...
 v			cpu_bits = max(pcm_cpu->sig_bits, cpu_bits);
		}

(X)		soc_pcm_set_msb(substream, bits);
(Y)		soc_pcm_set_msb(substream, cpu_bits);
	}

I wonder, do we need both (X) and (Y)?
I think we can merge (A) and (B) (= find the Codec/CPU max sig_bits)
and call soc_pcm_set_msb() once, or am I misunderstanding?

We have many patches in this area; below are the main ones,
listed newest first. The first patch has both the (X) and (Y) code.
new
	19bdcc7aeed4169820be6a683c422fc06d030136
	("ASoC: Add multiple CPU DAI support for PCM ops")

	57be92066f68e63bd4a72a65d45c3407c0cb552a
	("ASoC: soc-pcm: cleanup soc_pcm_apply_msb()")

	c8dd1fec47d0b1875f292c40bed381b343e38b40
	("ASoC: pcm: Refactor soc_pcm_apply_msb for multicodecs")

	58ba9b25454fe9b6ded804f69cb7ed4500b685fc
	("ASoC: Allow drivers to specify how many bits are significant on a DAI")
old

Thank you for your help !!

Best regards
---
Kuninori Morimoto


* Re: Question about soc_pcm_apply_msb()
  2021-12-16  6:53 Question about soc_pcm_apply_msb() Kuninori Morimoto
@ 2021-12-16  9:48 ` Lars-Peter Clausen
  2021-12-16 23:18   ` Kuninori Morimoto
  0 siblings, 1 reply; 3+ messages in thread
From: Lars-Peter Clausen @ 2021-12-16  9:48 UTC (permalink / raw)
  To: Kuninori Morimoto, Mark Brown; +Cc: Linux-ALSA, Kuninori Morimoto

On 12/16/21 7:53 AM, Kuninori Morimoto wrote:
> Hi ALSA ML
>
> In soc_pcm_apply_msb(),
> part (A) finds the max sig_bits for the Codec, and
> part (B) finds the max sig_bits for the CPU,
> and each is set on the substream via soc_pcm_set_msb() at (X) and (Y).
>
> 	static void soc_pcm_apply_msb()
> 	{
> 		...
>   ^		for_each_rtd_codec_dais(rtd, i, codec_dai) {
> (A)			...
>   v			bits = max(pcm_codec->sig_bits, bits);
> 		}
>
>   ^		for_each_rtd_cpu_dais(rtd, i, cpu_dai) {
> (B)			...
>   v			cpu_bits = max(pcm_cpu->sig_bits, cpu_bits);
> 		}
>
> (X)		soc_pcm_set_msb(substream, bits);
> (Y)		soc_pcm_set_msb(substream, cpu_bits);
> 	}
>
> I wonder, do we need both (X) and (Y)?
> I think we can merge (A) and (B) (= find the Codec/CPU max sig_bits)
> and call soc_pcm_set_msb() once, or am I misunderstanding?
We need both. Alternatively, you could write
soc_pcm_set_msb(substream, min(bits, cpu_bits)).

What this does is compute the maximum msb bits on both the CPU
side and the CODEC side, then set the msb bits reported to userspace
to the minimum of the two.

The largest number of MSBs we'll see on the CODEC side is its max(),
and the largest number we'll see on the CPU side is its max(). The
number of MSBs that the application will actually be able to see is
the smaller of the two.




* RE: Question about soc_pcm_apply_msb()
  2021-12-16  9:48 ` Lars-Peter Clausen
@ 2021-12-16 23:18   ` Kuninori Morimoto
  0 siblings, 0 replies; 3+ messages in thread
From: Kuninori Morimoto @ 2021-12-16 23:18 UTC (permalink / raw)
  To: Lars-Peter Clausen, Mark Brown; +Cc: Linux-ALSA, Kuninori Morimoto


Hi Lars-Peter

Thank you for your feedback

>> I wonder, do we need both (X) and (Y)?
>> I think we can merge (A) and (B) (= find the Codec/CPU max sig_bits)
>> and call soc_pcm_set_msb() once, or am I misunderstanding?
> We need both. Alternatively, you could write
> soc_pcm_set_msb(substream, min(bits, cpu_bits)).
>
> What this does is compute the maximum msb bits on both the CPU
> side and the CODEC side, then set the msb bits reported to userspace
> to the minimum of the two.
>
> The largest number of MSBs we'll see on the CODEC side is its max(),
> and the largest number we'll see on the CPU side is its max(). The
> number of MSBs that the application will actually be able to see is
> the smaller of the two.

Oh, yes. Thank you for explaining the details.
I think snd_pcm_hw_rule_msbits() was the point I was missing.

Best regards
---
Kuninori Morimoto

