Dithering
ninjasand: I'm generating images with 24 bits per color channel, and the screen representation (at 8 bits per channel) shows banding in gradients. The best result so far has been to round up or down semi-randomly to the nearest 8-bit color (depending on the fractional part), as shown in the link below. Any tips for improvements? Will a fixed pattern look better? Bonus: Can I do this while keeping the gamma correct (preferably efficiently on the GPU)?
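For reference, a minimal sketch of the approach described above (an illustration, not the actual code from the link): each channel is scaled to the 8-bit range and rounded up with a probability equal to its fractional part. The rand() call is just a stand-in for whatever noise source the real code uses.

```c
#include <stdint.h>
#include <stdlib.h>

/* 'value' is one color channel in [0, 1] at full precision. */
static uint8_t dither_random(double value)
{
    double scaled = value * 255.0;
    int    base   = (int)scaled;      /* round down           */
    double frac   = scaled - base;    /* fractional remainder */

    /* Round up with probability equal to the fractional part. */
    double r = (double)rand() / ((double)RAND_MAX + 1.0);
    int out = base + (r < frac ? 1 : 0);

    return (uint8_t)(out > 255 ? 255 : out);
}
```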
- I prefer ordered dithering over random dithering patterns or Floyd-Steinberg and similar error-diffusion dithering (see the sketch after this comment).
As for efficiency, everything is easy and fast when you're only dithering one color channel. If you want to dither all RGB channels, then the code for finding the mixture that gives the closest perceptual match will be the most time-consuming part. If you're willing to sacrifice some quality, you can take some shortcuts: truncate the target color to a closely matching color in the screen color cube, and then synthesize a mixture using that color cell and the adjacent cells in the color cube, based on a simple evaluation of the error vector introduced by the truncation. The resulting dithering won't look good on low-bit-depth displays, though.
For good ordered dithering on displays with low bit depth or a limited fixed palette, I've in the past gotten good results by doing perceptual color comparisons in the CIE Lab color space and using those to construct a mixture with minimal distance to the target color. Aug 10, 2011
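A minimal sketch of ordered dithering with a 4x4 Bayer matrix, as one possible reading of the suggestion above (an illustration of the general technique, not the commenter's implementation). The threshold depends only on the pixel position, so the pattern is fixed, and the same arithmetic ports directly to a GLSL fragment shader.

```c
#include <stdint.h>

/* Standard 4x4 Bayer threshold matrix (values 0..15). */
static const int bayer4[4][4] = {
    {  0,  8,  2, 10 },
    { 12,  4, 14,  6 },
    {  3, 11,  1,  9 },
    { 15,  7, 13,  5 },
};

/* 'value' is one color channel in [0, 1]; (x, y) is the pixel position. */
static uint8_t dither_ordered(double value, int x, int y)
{
    double scaled = value * 255.0;
    int    base   = (int)scaled;
    double frac   = scaled - base;

    /* Position-dependent threshold in (0, 1). */
    double threshold = (bayer4[y & 3][x & 3] + 0.5) / 16.0;

    int out = base + (frac > threshold ? 1 : 0);
    return (uint8_t)(out > 255 ? 255 : out);
}
```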
- Thanks a lot! I agree that the pattern looks slightly smoother, and it seems to lend itself nicely to a GLSL implementation. The other methods both seem too expensive for my real-time application, though I can certainly use the one suggested in the comments in the source code (the mixing pool) for exporting images.
As for converting colors to the CIE Lab color space, that could be very handy if I could do it with any certainty about how the result will look on the screen. Currently the output on my screen is not a power function (even disregarding the edges of the color range): it has certain jumps in the output, which leads to yet another kind of banding every 5 color-value increments or so. My guess is that this is OS X trying to give me a correct gamma curve. If I knew the output values in CIE Lab, I'm sure this banding could be mitigated too, but I'm not sure that's even possible. Aug 11, 2011
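For what it's worth, here is a sketch of the CIE Lab conversion mentioned above, assuming the input is ordinary sRGB with a D65 white point (which, as noted, may not match what the screen actually does). The Euclidean distance between two Lab values is the simple CIE76 delta-E measure that can be used to pick the closest mixture.

```c
#include <math.h>

typedef struct { double L, a, b; } Lab;

/* Inverse sRGB companding: gamma-encoded [0, 1] -> linear light. */
static double srgb_to_linear(double c)
{
    return (c <= 0.04045) ? c / 12.92 : pow((c + 0.055) / 1.055, 2.4);
}

/* Helper for the XYZ -> Lab transform. */
static double lab_f(double t)
{
    const double eps = 216.0 / 24389.0;                   /* (6/29)^3 */
    return (t > eps) ? cbrt(t) : (841.0 / 108.0) * t + 4.0 / 29.0;
}

/* Convert an sRGB color (each channel in [0, 1]) to CIE Lab, D65 white. */
static Lab srgb_to_lab(double r, double g, double b)
{
    double R = srgb_to_linear(r);
    double G = srgb_to_linear(g);
    double B = srgb_to_linear(b);

    /* Linear sRGB -> CIE XYZ (D65 primaries). */
    double X = 0.4124 * R + 0.3576 * G + 0.1805 * B;
    double Y = 0.2126 * R + 0.7152 * G + 0.0722 * B;
    double Z = 0.0193 * R + 0.1192 * G + 0.9505 * B;

    /* Normalize by the D65 reference white. */
    double fx = lab_f(X / 0.95047);
    double fy = lab_f(Y / 1.00000);
    double fz = lab_f(Z / 1.08883);

    Lab lab = { 116.0 * fy - 16.0, 500.0 * (fx - fy), 200.0 * (fy - fz) };
    return lab;
}

/* CIE76 delta-E: plain Euclidean distance in Lab space. */
static double delta_e(Lab p, Lab q)
{
    double dL = p.L - q.L, da = p.a - q.a, db = p.b - q.b;
    return sqrt(dL * dL + da * da + db * db);
}
```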
- Regarding the banding you see, it is possible that this is a result of connecting your monitor over DVI, or over HDMI without Deep Color. Graphics adapters typically use per-channel look-up tables (LUTs) to do gamma and other color adjustments. In good old VGA adapters, 8-bit input levels are mapped to 10-bit output levels before being sent to the digital-to-analog converter (DAC).
DVI signals only have an 8-bit color depth per channel. Thus, the LUTs used for DVI map 8-bit levels to 8-bit levels. This means that if there is any gamma or color adjustment going on, a linear sweep from level 0 to level 255 in the input will contain both duplicates and gaps in the output. The banding you're seeing is probably where such duplicates or gaps occur in the output level mapping. Aug 15, 2011
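A small sketch of why that happens: push a linear 0..255 ramp through an 8-bit-to-8-bit LUT built from some non-identity curve (a mild gamma tweak here, chosen purely for illustration) and count how many output levels collide or are never produced.

```c
#include <math.h>
#include <stdio.h>

int main(void)
{
    int used[256] = { 0 };
    int duplicates = 0, gaps = 0;

    for (int in = 0; in < 256; in++) {
        /* 8-bit in -> 8-bit out through a small gamma adjustment.
         * The 1.1 exponent is arbitrary, just to make the effect visible;
         * the real LUT contents depend on the actual calibration. */
        int out = (int)(255.0 * pow(in / 255.0, 1.0 / 1.1) + 0.5);
        if (used[out])
            duplicates++;   /* two input levels collapsed onto one output */
        used[out] = 1;
    }

    for (int level = 0; level < 256; level++)
        if (!used[level])
            gaps++;         /* output level that can never be produced */

    printf("duplicates: %d, gaps: %d\n", duplicates, gaps);
    return 0;
}
```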
- Yes, that would explain perfectly what I'm seeing. So unfortunately, it seems there is little I can do about this from the software side in the general case. I will try it next time I'm at a desktop computer with VGA, though. Aug 17, 2011