On Tue, 18 Jan 2022 10:53:52 +0100
Gerd Hoffmann wrote:

> On Tue, Jan 18, 2022 at 10:33:23AM +0200, Pekka Paalanen wrote:
> > On Mon, 17 Jan 2022 19:47:39 +0100
> > Sven Schnelle wrote:
> >
> > > I also tested the speed on my Thinkpad X1 with Intel graphics, and
> > > there a dmesg with 919 lines on the text console took about 2 s to
> > > display. In X11, I measure 22 ms. This might be unfair because the
> > > encoding might be different, but I cannot confirm the "memcpy is
> > > faster than hardware blitting" point. I think if that were the case,
> > > no one would care about 2D acceleration.
> >
> > I think that is an extremely unfair comparison, because a graphical
> > terminal app is not going to render every line of text streamed to it.
> > It probably renders only the final view if you simply run 'dmesg',
> > skipping the first 800-900 lines completely.
>
> Probably more like "render on every vblank", but yes, unlike fbcon it
> surely wouldn't render every single character sent to the terminal.

Yes, and since 1k lines of dmesg is so little data, I would guess even an
old machine can chew through it in much less than one refresh period, so
there are only going to be one or two screen updates to draw.

Also, since X11 does not have vblank or frame boundaries in the protocol,
a terminal emulator app has to throttle rendering some other way. My wild
guess: render when it temporarily exhausts its input, with a timer as a
deadline in case input just keeps on flooding.

Thanks,
pq
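The throttling scheme guessed at above (repaint when the input stream goes quiet for a moment, but never later than a hard deadline while input keeps flooding in) might look roughly like the following Python sketch. The `pump` helper, the 5 ms quiet window, and the 50 ms deadline are all made up for illustration; real terminal emulators differ in the details.

```python
import os
import select
import time

# Hypothetical values: repaint once input pauses this long, but never go
# longer than DEADLINE without a repaint while input keeps arriving.
QUIET_WINDOW = 0.005
DEADLINE = 0.050

def pump(fd, render):
    """Read from fd until EOF, coalescing data into as few repaints
    as possible. render() is called with each batch of pending bytes."""
    pending = b""
    deadline = None  # wall-clock limit for the current batch, if any
    while True:
        timeout = None
        if deadline is not None:
            # Wake up at the quiet window or the deadline, whichever is sooner.
            timeout = max(0.0, min(QUIET_WINDOW, deadline - time.monotonic()))
        ready, _, _ = select.select([fd], [], [], timeout)
        if ready:
            chunk = os.read(fd, 65536)
            if not chunk:          # EOF: flush whatever is left and stop
                break
            pending += chunk
            now = time.monotonic()
            if deadline is None:
                deadline = now + DEADLINE
            elif now >= deadline:  # input is flooding; repaint anyway
                render(pending)
                pending = b""
                deadline = None
            continue
        # select() timed out: input went quiet, so repaint the batch.
        if pending:
            render(pending)
            pending = b""
        deadline = None
    if pending:
        render(pending)
```

A burst of input (like a full dmesg dump) arriving faster than the quiet window would then collapse into one or two `render()` calls rather than one per line, which is the point being made above.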