On Sun, Jun 19, 2016 at 11:46:29AM +0200, Richard Cochran wrote:
> On Sun, Jun 19, 2016 at 12:45:50AM +0200, Henrik Austad wrote:
> > edit: this turned out to be a somewhat lengthy answer. I have tried
> > to shorten it down somewhat. It is getting late and I'm getting
> > increasingly incoherent (Richard probably knows what I'm talking
> > about ;) so I'll stop for now.
>
> Thanks for your responses, Henrik. I think your explanations are spot
> on.
>
> > note that an adjustable sample-clock is not a *requirement*, but in
> > general you'd want to avoid resampling in software.
>
> Yes, but..
>
> Adjusting the local clock rate to match the AVB network rate is
> essential. You must be able to *continuously* adjust the rate in
> order to compensate for drift. Again, there are exactly two ways to
> do it, namely in hardware (think VCO) or in software (dynamic
> resampling).

Don't get me wrong, having an adjustable clock for the sampling is
essential - but it is not *required*.

> What you cannot do is simply buffer the AV data and play it out
> blindly at the local clock rate.

No, you cannot do that; that would not be pretty :)

> Regarding the media clock, if I understand correctly, the talker has
> two possibilities. Either the talker samples the stream at the gPTP
> rate, or the talker must tell the listeners the relationship (phase
> offset and frequency ratio) between the media clock and the gPTP
> time. Please correct me if I got the wrong impression...

Taking the last point first: AFAIK, there is no way for the Talker to
tell a Listener the phase offset/frequency ratio other than how each
end-station/bridge in the gPTP domain calculates this on psync_update
event messages. I could be wrong though, and different encoding formats
can probably convey such information. I have not seen any such
mechanisms in the underlying 1722 format though.

So a Talker should send a stream sampled as if the gPTP time drove the
AD/DA sample frequency directly. Whether the local sampling is driven
by gPTP or resampled to match gPTP time prior to transmit is left as
an implementation detail for the end-station.
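For what it's worth, here is a rough sketch of the bookkeeping I have
in mind when I say "resampled to match gPTP time". This is just my
illustration, not code from any driver, and all the names are made up:
you observe the free-running sample counter against gPTP time and
derive the rate ratio that a resampler (or a VCO, in the hardware
case) has to chase.

#include <stdint.h>
#include <stdio.h>

#define NSEC_PER_SEC 1000000000ULL

/* One snapshot of the free-running sample counter against gPTP time.
 * In a real end-station these pairs would come from hardware
 * cross-timestamping or timestamped DMA positions; here they are
 * invented for illustration.
 */
struct clock_obs {
	uint64_t gptp_ns;   /* gPTP timestamp of the observation */
	uint64_t samples;   /* local sample counter at that instant */
};

/* Measured local sample rate over the interval [a, b], in Hz. */
static double measured_rate(const struct clock_obs *a,
			    const struct clock_obs *b)
{
	double dt = (double)(b->gptp_ns - a->gptp_ns) / NSEC_PER_SEC;

	return (double)(b->samples - a->samples) / dt;
}

int main(void)
{
	/* Two observations ~1 second apart; the local oscillator runs
	 * roughly 42 ppm fast against a nominal 48 kHz clock.
	 */
	struct clock_obs a = { .gptp_ns = 0,            .samples = 0 };
	struct clock_obs b = { .gptp_ns = NSEC_PER_SEC, .samples = 48002 };
	double ratio = measured_rate(&a, &b) / 48000.0;

	/* ratio > 1.0 means the local clock is fast: either slow a
	 * VCO down, or resample by 1/ratio before transmit so the
	 * stream looks as if gPTP time drove the AD/DA directly.
	 */
	printf("rate ratio: %.9f (%+.1f ppm)\n", ratio,
	       (ratio - 1.0) * 1e6);
	return 0;
}

The same ratio is, I think, what a Listener would feed its resampler,
only measured against the presentation timestamps in the incoming 1722
stream rather than against the local sample counter alone.

Did all that make sense?

Thanks!

--
Henrik Austad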