xrandr: use 1/gamma to compute gamma-correction
author    Andy Ritger <aritger@nvidia.com>
          Fri, 24 Aug 2012 22:53:07 +0000 (15:53 -0700)
committer Aaron Plattner <aplattner@nvidia.com>
          Sat, 25 Aug 2012 03:49:11 +0000 (20:49 -0700)
commit    6bf48ae8d8db58ab74182383e54332f120f024c2
tree      4f78a0af963fe41f27ea4737cad50438123fa7e4
parent    755234bd2ce0f3acde6507aba94b1e53a5a72f9b

To compute a gamma *correction* lookup table, use the specified gamma
value as the divisor in (1.0/gamma).  This matches the semantics of
xgamma(1) and the "gamma-value" and "{red,green,blue}-gamma" xorg.conf(5)
options.

For more details, see:
http://www.poynton.com/PDFs/TIDV/Gamma.pdf (Gamma in computer graphics, page 17)
http://cgit.freedesktop.org/xorg/xserver/tree/hw/xfree86/common/xf86cmap.c:ComputeGamma()

Signed-off-by: Andy Ritger <aritger@nvidia.com>
Reviewed-by: Aaron Plattner <aplattner@nvidia.com>
Signed-off-by: Aaron Plattner <aplattner@nvidia.com>
xrandr.c