
You would never want to ignore it. Gamma is about perceptual intensity, which follows a power law, as opposed to physical intensity (counting photons). That's why it's an "optimization" in the first place. Even in a magical world where all colors were represented with infinite-precision rational numbers, you'd still want to use a power law every time you wanted to display a gradient.
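
To make the power-law point concrete, here is a minimal sketch in Python using the standard sRGB transfer functions (the piecewise constants come from the sRGB spec). It shows that evenly spaced encoded steps, which look like an even gradient, correspond to very unevenly spaced amounts of linear light:

    # Standard sRGB transfer functions (constants from the sRGB spec).
    def srgb_to_linear(v):
        # Decode a gamma-encoded value in [0, 1] to linear light.
        return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

    def linear_to_srgb(v):
        # Encode linear light in [0, 1] back to a gamma-encoded value.
        return v * 12.92 if v <= 0.0031308 else 1.055 * v ** (1 / 2.4) - 0.055

    # A gradient with perceptually even steps: evenly spaced *encoded* values.
    encoded_ramp = [i / 10 for i in range(11)]
    # The linear light needed for each step grows roughly as a power law.
    linear_ramp = [round(srgb_to_linear(v), 4) for v in encoded_ramp]
    print(linear_ramp)
    # e.g. an encoded value of 0.5 corresponds to only ~0.21 of the physical intensity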


Perceptually-uniform color spaces such as CIE L*a*b* (and especially more modern alternatives, such as Oklab [1]) are better for gradients. This page has a simple demo: https://raphlinus.github.io/color/2021/01/18/oklab-critique....

This Observable notebook goes into more detail: https://observablehq.com/@mattdesl/perceptually-smooth-multi...
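
If you want to play with it, here is a minimal sketch of an Oklab gradient in Python. The matrix coefficients are copied from the reference implementation in the Oklab post [1]; oklab_gradient is just a hypothetical helper name, and inputs/outputs are linear sRGB (out-of-gamut intermediate values are possible and left unclamped):

    def linear_srgb_to_oklab(r, g, b):
        # Coefficients from the Oklab reference implementation [1].
        l = 0.4122214708 * r + 0.5363325363 * g + 0.0514459929 * b
        m = 0.2119034982 * r + 0.6806995451 * g + 0.1073969566 * b
        s = 0.0883024619 * r + 0.2817188376 * g + 0.6299787005 * b
        l_, m_, s_ = l ** (1 / 3), m ** (1 / 3), s ** (1 / 3)
        return (0.2104542553 * l_ + 0.7936177850 * m_ - 0.0040720468 * s_,
                1.9779984951 * l_ - 2.4285922050 * m_ + 0.4505937099 * s_,
                0.0259040371 * l_ + 0.7827717662 * m_ - 0.8086757660 * s_)

    def oklab_to_linear_srgb(L, a, b):
        l_ = L + 0.3963377774 * a + 0.2158037573 * b
        m_ = L - 0.1055613458 * a - 0.0638541728 * b
        s_ = L - 0.0894841775 * a - 1.2914855480 * b
        l, m, s = l_ ** 3, m_ ** 3, s_ ** 3
        return ( 4.0767416621 * l - 3.3077115913 * m + 0.2309699292 * s,
                -1.2684380046 * l + 2.6097574011 * m - 0.3413193965 * s,
                -0.0041960863 * l - 0.7034186147 * m + 1.7076147010 * s)

    def oklab_gradient(c0, c1, t):
        # Interpolate two linear-sRGB colors in Oklab for a perceptually smoother ramp.
        a = linear_srgb_to_oklab(*c0)
        b = linear_srgb_to_oklab(*c1)
        mixed = tuple((1 - t) * x + t * y for x, y in zip(a, b))
        return oklab_to_linear_srgb(*mixed)

    # Blue-to-white ramp; compare against a naive lerp of the raw RGB values.
    for t in (0.0, 0.25, 0.5, 0.75, 1.0):
        print(t, [round(c, 3) for c in oklab_gradient((0.0, 0.0, 1.0), (1.0, 1.0, 1.0), t)])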

Gamma is more or less a poor man's perceptual color space, but I don't think it's very useful for that nowadays. There are much better options for image processing that requires perceptual uniformity (which is not everything; blurring, for example, is usually better done in linear RGB). When you don't need perceptual uniformity, I'm not aware of much reason to use gamma other than limited precision and sRGB being a ubiquitous device color space.
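
As a concrete illustration of the "blurring belongs in linear RGB" point, here is a minimal sketch in Python (reusing the standard sRGB transfer functions) that averages two pixels the correct way, in linear light, versus naively in gamma-encoded values:

    def srgb_to_linear(v):
        return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

    def linear_to_srgb(v):
        return v * 12.92 if v <= 0.0031308 else 1.055 * v ** (1 / 2.4) - 0.055

    def average_pixels(a, b):
        # Average in linear light, then re-encode; this is what a correct blur does.
        return linear_to_srgb((srgb_to_linear(a) + srgb_to_linear(b)) / 2)

    black, white = 0.0, 1.0
    print(average_pixels(black, white))   # ~0.735: correct 50% physical mix
    print((black + white) / 2)            # 0.5: naive gamma-space average, too dark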

[1] https://bottosson.github.io/posts/oklab/


Depends on what you mean by "display". If you are talking about a monitor, sure, do whatever magic works best there. But otherwise I prefer to work in a mathematical vector space and ignore it. Raytracing software, for example, can just ignore it as long as whatever reads its output understands the image format.


Indeed, but reading this I definitely get the sense that this would all be easier if gamma were left to displays at the end of the chain, rather than baked into how the information is encoded on disk.

Can obviously understand how we ended up here.



