The problem with dithering on Intel is that various tested Intel GPUs (Ironlake, IvyBridge, Haswell, Skylake iirc.) dither when they shouldn't. If one has a standard 8 bpc framebuffer feeding into a standard (legacy) 256-slot, 8-bit-wide LUT loaded with an identity mapping, feeding into a standard 8 bpc video output (DVI/HDMI/DP), the expected result is that pixels rendered into the framebuffer show up unmodified at the video output. What happens instead is that dithering is needlessly applied.

This is bad for various neuroscience/medical research equipment that requires pixels to pass through unmodified in a pure 8 bpc configuration, e.g., because some digital info is color-encoded in-band in the rendered image to control research hardware, a la "if rgb pixel (123, 12, 23) is detected in the digital video stream, emit a trigger signal, or timestamp that moment with a hw clock, or start or stop some scientific recording equipment". There also exist specialized visual stimulators that drive special displays with more than 12 bpc, e.g., 16 bpc, by encoding the 8 MSB of each 16 bpc color value in the even columns of the framebuffer and the 8 LSB in the odd columns. Unexpected dithering makes such equipment completely unusable. By now I must have spent months of my life just trying to deal with dithering-induced problems on different GPUs due to hw quirks or bugs somewhere in the graphics stack.
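To make the even/odd column scheme concrete, here is a minimal sketch of how such a stimulator encoding works; the function names are made up for illustration, not any real device's API. The point is that a single dithered LSB in an odd column corrupts the reassembled 16 bpc value:

```c
#include <stddef.h>
#include <stdint.h>

/*
 * Sketch: a 16 bpc (per channel) source row of width w is stored in an
 * 8 bpc framebuffer row of width 2*w, 8 MSB in the even column, 8 LSB
 * in the adjacent odd column. Hypothetical helper names.
 */
static void encode_hi16_row(const uint16_t *src, uint8_t *fb, size_t w)
{
	for (size_t x = 0; x < w; x++) {
		fb[2 * x]     = (uint8_t)(src[x] >> 8);   /* even column: 8 MSB */
		fb[2 * x + 1] = (uint8_t)(src[x] & 0xff); /* odd column:  8 LSB */
	}
}

/* Inverse, as the downstream stimulator hardware would reassemble it. */
static uint16_t decode_hi16_pixel(const uint8_t *fb, size_t x)
{
	return (uint16_t)((fb[2 * x] << 8) | fb[2 * x + 1]);
}
```

Any dithering stage between framebuffer and video output perturbs exactly the low-order bits, so the odd (LSB) columns no longer round-trip.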
Atm. the Intel kms driver disables dithering for anything running at >= 8 bpc, as a workaround for this harmful hardware quirk.
Ideally we'd have uapi that makes dithering controllable per connector (on/off/auto, plus selectable dither depth), in a way that those controls also get exposed as RandR output properties, easily controllable by X clients. And some safe default for the case where the client can't access the properties (as I'd expect to happen with the dozens of Wayland compositors under the sun). Various drivers had this over time, e.g., the classic AMD kms path (if I don't misremember) and nouveau, but some of it got lost in the new atomic kms variants, and Intel never exposed this.
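A rough sketch of how the on/off/auto semantics could resolve in a driver, with the safe default being "off unless the content demonstrably exceeds the output depth"; the enum and function names here are hypothetical, not existing kms uapi:

```c
#include <stdbool.h>

/* Hypothetical per-connector dithering control values. */
enum dither_mode {
	DITHER_OFF,	/* never dither: safe for in-band signaling */
	DITHER_ON,	/* always dither */
	DITHER_AUTO,	/* dither only when content depth > output depth */
};

/* Resolve the effective dithering state for one connector. */
static bool resolve_dithering(enum dither_mode mode,
			      int content_bpc, int output_bpc)
{
	switch (mode) {
	case DITHER_OFF:
		return false;
	case DITHER_ON:
		return true;
	case DITHER_AUTO:
	default:
		return content_bpc > output_bpc;
	}
}
```

With DITHER_AUTO as default, a compositor that never touches the property would still get identity pass-through in the plain 8-bpc-in, 8-bpc-out case.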
Or maybe some method that checks the values actually stored in the hw LUTs, CTM, etc., and disables dithering if those values suggest none is needed. E.g., if the output depth is 8 bpc, one only needs dithering if the slots of the final active hw LUT carry meaningful values in the bits below the top 8 MSB, i.e., if the content actually has > 8 bpc net bit depth.
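That check could look roughly like this, assuming the LUT entries reach the kernel as 16-bit values the way the legacy gamma uapi hands them over; the function name is made up, and the "pure 8 bpc" test accepts both common 8-to-16-bit expansion conventions (low byte zeroed, or MSB replicated into the LSB as an identity ramp like i * 0x101 produces):

```c
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

/*
 * Sketch: with an 8 bpc output, dithering can only help if some LUT
 * slot carries information in the bits below the top 8 MSB.
 */
static bool lut_needs_dithering_8bpc(const uint16_t *lut, size_t slots)
{
	for (size_t i = 0; i < slots; i++) {
		uint8_t msb = (uint8_t)(lut[i] >> 8);
		uint8_t lsb = (uint8_t)(lut[i] & 0xff);

		/* Low byte neither zero nor a replica of the high byte:
		 * the slot really encodes > 8 bpc of precision. */
		if (lsb != 0 && lsb != msb)
			return true;
	}
	return false;
}
```

A driver could run this over the final active LUT at atomic commit time and force dithering off when it returns false, which would cover the identity-mapping case from above without any new uapi.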