Linear vs. Gamma, but Why??

Apple Dragon
3 min read · Nov 12, 2020


A few days ago, while learning how to build our Unity project for Windows/Mac OS X/Linux and WebGL, I found that the course (and Unity) instructs us to use Linear for a desktop build and Gamma for a web app.

Unity — Project Settings -> Player
WebGL tab — Other Settings -> Rendering -> Color Space
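
If you find yourself toggling this per build target, the same setting is scriptable. Here is a minimal editor sketch: PlayerSettings.colorSpace is the actual Unity API behind that dropdown, while the class name and menu paths are my own invention. The file needs to live in an Editor folder.

```csharp
using UnityEditor;
using UnityEngine;

public static class ColorSpaceSwitcher
{
    // Hypothetical menu paths; the property itself is the same one
    // the Project Settings -> Player UI writes to.
    [MenuItem("Tools/Color Space/Linear (Desktop)")]
    static void UseLinear()
    {
        PlayerSettings.colorSpace = ColorSpace.Linear;
        Debug.Log("Color Space set to Linear");
    }

    [MenuItem("Tools/Color Space/Gamma (WebGL)")]
    static void UseGamma()
    {
        PlayerSettings.colorSpace = ColorSpace.Gamma;
        Debug.Log("Color Space set to Gamma");
    }
}
```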

Aside from a little warning box that pops up when switching over to a WebGL build, I realized this is a setting I change when told to, without really understanding it, so I can never remember which setting to use for which build. To reinforce it once and for all, I took this as a personal challenge and went searching for an answer.

A history lesson was necessary to fully understand...

It turns out that the origin of Gamma dates back to the days of CRTs. Maybe you’ve seen those boat anchors in history books or in movies from the 1970s and ’80s. I have actually been zapped a few times while repairing them (8 to 13 kilovolts, as I recall?). Just when I thought those days were behind me, that legacy makes an appearance in my Unity Player Settings!

The short version of this saga is that raw image values did not look right to the human eye when displayed on a CRT, because the tube’s brightness response to its input signal is non-linear. A Gamma function was encoded into the raw image file to pre-compensate, producing a recognizable image on the CRT, one that made sense to the human eye. That encoding has held on in the image formats we continue to use today, such as jpg, tif, png, etc.
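
In code, the whole trick is a pair of power curves. A minimal sketch, assuming the conventional CRT gamma of 2.2 (the class and method names are mine):

```csharp
using UnityEngine;

static class GammaCurve
{
    // 2.2 is the conventional figure for a CRT's response; real tubes varied.
    const float Gamma = 2.2f;

    // What gets baked into the file: pre-compensate so that the CRT's
    // power-law response cancels back to the original intensity.
    public static float Encode(float linear) => Mathf.Pow(linear, 1f / Gamma);

    // What the CRT itself effectively did to the incoming signal.
    public static float Decode(float encoded) => Mathf.Pow(encoded, Gamma);
}
```

Encode the file once, and the tube’s natural response performs the decode for free.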

To accommodate LCD and LED monitors, the sRGB function standardized that Gamma encoding and how to undo it, restoring linearity. Since the most common LCD and LED monitors work with 8 to 10-bit encoding, these legacy formats kept their place as standard image formats.
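
That sRGB function isn’t a plain power law but a published piecewise curve: a short linear toe near black followed by a power segment. Here is a sketch of the standard encode and its inverse (class and method names are my own):

```csharp
using UnityEngine;

static class Srgb
{
    // Standard sRGB encode: linear toe near black, then a power
    // segment; overall it approximates a gamma of roughly 2.2.
    public static float FromLinear(float c) =>
        c <= 0.0031308f
            ? 12.92f * c
            : 1.055f * Mathf.Pow(c, 1f / 2.4f) - 0.055f;

    // The inverse: recover the linear intensity from an sRGB value.
    public static float ToLinear(float c) =>
        c <= 0.04045f
            ? c / 12.92f
            : Mathf.Pow((c + 0.055f) / 1.055f, 2.4f);
}
```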

Note that running the sRGB decode curve against the Gamma-encoded values cancels out to a linear function. With the availability of 16 to 32-bit formats and monitors, none of that is necessary: the EXR format is already linear and displays without any function acrobatics. If the need arises, a LUT (lookup table) can bridge the gap and display such images on 8 or 10-bit monitors.
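
Such a LUT can be as modest as a precomputed table mapping quantized linear intensities to 8-bit display codes. A sketch, with the table size and names being my own choices:

```csharp
using UnityEngine;

static class DisplayLut
{
    const int Size = 1024; // resolution of the lookup; my choice

    // Precomputed table: quantized linear intensity -> 8-bit sRGB code,
    // so a whole linear (e.g. EXR) frame can be shown on an 8-bit
    // display without a Pow() per pixel.
    static readonly byte[] Table = Build();

    static byte[] Build()
    {
        var lut = new byte[Size];
        for (int i = 0; i < Size; i++)
        {
            float linear = i / (float)(Size - 1);
            float srgb = linear <= 0.0031308f
                ? 12.92f * linear
                : 1.055f * Mathf.Pow(linear, 1f / 2.4f) - 0.055f;
            lut[i] = (byte)Mathf.RoundToInt(srgb * 255f);
        }
        return lut;
    }

    public static byte ToDisplay(float linear)
    {
        int index = Mathf.Clamp(Mathf.RoundToInt(linear * (Size - 1)), 0, Size - 1);
        return Table[index];
    }
}
```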

What this boils down to is that a Gamma color space looks lighter, almost reflective, while the Linear one looks darker with richer detail. Linear is the default setting and works for the desktop build and finer display monitors. Gamma is recommended for WebGL, where accommodating legacy displays is more likely necessary.

For a more detailed explanation, check out Gamma Correction and Linear Workflow by LightDark Academy.
