Computer users often complain that images viewed on one computer do not look the same when viewed on another. Sometimes the images look better; often they look worse. This page attempts to address that topic.
(A short discussion of color and how it is displayed on a computer monitor appears at the end of this tutorial, under “What Is Color?” Review that first if you need to.)
Here is the short list of factors that typically cause color display problems:
- Viewing environment
- Monitor settings
- Nonlinear human vision
- Gamma correction
- Color coding
Viewing Environment
The environment a monitor sits in can have a profound effect on the ability to view color images with their proper colors. If you have reflections on the screen, or a room whose ambient brightness differs from that of the room in which the image was created, then that image will look different; sometimes greatly so. As one example, an image developed in a dimly lit room and shown in a bright room will appear contrasty to the viewer; at the same time, ambient light reflecting off the screen can raise the apparent black level and make the image look washed out.
If an image looks “different,” the first thing to investigate is the ambient conditions under which that image was developed. This may give you a clue as to how to make your environment better for viewing that image.
Monitor Settings
Color display is very sensitive to how your monitor’s controls are set. If they are not set properly, the full range of colors in a digital image may not show properly. As an example, if the brightness control is set incorrectly, nearby dark shades may merge and shadow detail will be lost in an image.
To set your monitor (this assumes a CRT), drop out of Windows to the DOS command prompt. The screen should be all dark except for the prompt. Turn the brightness control all the way down, then adjust it upward until the screen just starts to lighten. Back the control down until the screen turns dark again. Once set, don’t move that control further. Now display a representative picture and adjust the contrast control until the image displays accurately (strangely, while doing this you will notice the “contrast” control really affecting the “brightness” of the image). Note: On some monitors the brightness control may be called “Black Level” and the contrast control may be called “Picture.” (Some monitors may appear more contrasty than others due to a black coating on the mask the electron beam passes through.)
Nonlinear Human Vision
One problem with the display of color is that human vision does not respond linearly to changes in brightness. You may have seen professional photographers holding up a gray card to determine exposure. The gray card reflects only 18% of the light falling on it, yet human vision perceives it as about half as bright as white. Computer and display systems have to take this intensity nonlinearity into consideration. Not all do it correctly or well.
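To put a number on this, here is a minimal Python sketch using the CIE lightness formula, a standard model of perceived brightness (the formula is background knowledge, not something given in this tutorial):

```python
# Sketch: why an 18% gray card looks about "half bright."
# The CIE L* formula maps relative luminance (0.0 = black, 1.0 = white)
# to perceived lightness on a 0-100 scale.

def cie_lightness(y):
    """Perceived lightness L* for relative luminance y in [0, 1]."""
    if y <= 0.008856:            # linear segment very near black
        return 903.3 * y
    return 116.0 * y ** (1.0 / 3.0) - 16.0

print(cie_lightness(0.18))  # ~49.5 -- 18% luminance is seen as about half bright
print(cie_lightness(0.50))  # ~76.1 -- 50% luminance looks far brighter than "half"
```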
CRTs also have nonlinearities of their own, which are a function of how the electron beam is generated in the monitor (near saturation, the phosphors may also have some effect). Correcting for these nonlinearities involves the next concept: gamma correction.
Gamma Correction
The intensity of light on a monitor’s screen is nonlinear relative to the voltage applied to generate it: intensity is roughly the applied voltage raised to some power, and that power is typically given the name gamma. If you have an image consisting of linear-light intensities, then some compensation must be made for that image or it will appear murky in the mid-tones. On the other hand, if the image has already had gamma correction and you apply another gamma correction to it, then the image mid-tones will be too light. (Note: Images generated by a camera typically have gamma correction applied to them by the camera. Images generated by other means may not.)
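To make this concrete, here is a minimal sketch in Python; the gamma value of 2.2 is a commonly assumed figure, not one specified in this tutorial:

```python
# Sketch of gamma correction, assuming a display gamma of 2.2
# (a common value; actual monitors vary).

GAMMA = 2.2

def encode(linear):
    """Gamma-correct a linear-light intensity (0.0-1.0) for display."""
    return linear ** (1.0 / GAMMA)

def decode(signal):
    """What the monitor effectively does: raise the signal to the power gamma."""
    return signal ** GAMMA

mid = 0.5
print(decode(mid))                  # ~0.22: a linear image sent as-is displays murky
print(decode(encode(mid)))          # 0.50:  corrected once, it displays correctly
print(decode(encode(encode(mid))))  # ~0.73: corrected twice, mid-tones come out too light
```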
Ideally, when you create an image you should make whatever corrections are necessary to remove the effects of your ambient conditions. The viewer of the image can then apply a transform suited to their own ambient conditions and thus view the image correctly. In practice this ideal is rarely achieved, and so images can vary from system to system.
Color Coding
It’s possible to have a virtually infinite color spectrum on a monitor due to its analog nature; it’s just not possible for the computer to feed that many colors to the monitor. Let’s use the transition between black and white to examine this.
The ratio of intensity between brightest white and darkest black is called the contrast ratio and it changes for each environment. Projected film has a ratio of around 80:1 but typical office conditions limit the ratio of most monitors to around 5:1.
Human vision, on the other hand, can detect quite small differences in intensity: two levels that differ by only about 1% can be told apart. If we code intensities linearly, the steps must be fine enough to meet that threshold at the dark end of the range, so across a 100:1 range between black and white some 9,900 codes (or 14 bits) would be required.
But, as we’ve seen, nothing is linear. If the coding roughly follows the required nonlinearity (small steps where human vision can detect small changes, larger steps where it only detects larger changes), then some 460 codes (about nine bits) would be needed. From a practical standpoint, the computer uses eight bits. Nonlinearly coded, this yields an image sufficient for broadcast-quality digital television.
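Here is a rough Python sketch of the arithmetic behind those code counts; the 100:1 contrast range and 1% threshold are the classic assumptions from the colorspace FAQ tradition, not figures unique to this tutorial:

```python
# Sketch of the code-count arithmetic, assuming a 100:1 contrast
# range and a ~1% just-noticeable difference in intensity.

import math

threshold = 0.01          # a ~1% intensity difference is just noticeable
dark = 1.0
white = dark * 100.0      # white is 100 times the intensity of black

# Linear coding: the step must be 1% of the *darkest* level, and that
# same tiny step is then used across the entire range.
step = dark * threshold
linear_codes = (white - dark) / step
print(linear_codes, math.ceil(math.log2(linear_codes)))
# -> 9900 codes, 14 bits

# Nonlinear coding: each code is about 1% bigger than the one before.
nonlinear_codes = math.log(white / dark) / math.log(1.0 + threshold)
print(round(nonlinear_codes), math.ceil(math.log2(nonlinear_codes)))
# -> ~463 codes, 9 bits
```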
To quote from the colorspace FAQ:
Desktop computers are optimized neither for image synthesis nor for video. They have programmable “gamma” and either poor standards or no standards. Consequently, image interchange among desktop computers is fraught with difficulty.
A Special Case: The Computer’s Color Compromise
Extending the logic above for black to white, consider a model where each primary color is defined by eight bits. Then, using 24 bits, 16,777,216 individual colors (2 to the 24th power) can be represented digitally. This 24-bit model is the standard most computer graphics are currently held to when simulating the true colors seen in nature.
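As a small illustration, here is how those 24 bits are typically divided among the three primaries (the helper name pack_rgb is invented for this sketch):

```python
# Sketch: packing three 8-bit primaries into one 24-bit color value.

def pack_rgb(r, g, b):
    """Combine 8-bit red, green, and blue values into a single 24-bit integer."""
    return (r << 16) | (g << 8) | b

print(2 ** 24)                       # 16777216 possible colors
print(hex(pack_rgb(255, 0, 0)))      # 0xff0000 -- pure red
print(hex(pack_rgb(255, 255, 255)))  # 0xffffff -- white
```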
Some computers, however, compromise by using a 256-color mode of operation. In this mode a palette holds the colors currently in use. Each entry in the palette contains the full 24-bit representation of that color (plus information the system needs). So long as an image can be fully represented by any 256 of the 16 million available colors, the palette method works very well. And using the palette greatly reduces the amount of memory and processing needed to display the image on the screen. (Note: With advances in CPU and graphics processor technology, working in full 24-bit mode has become easier, so more and more systems do not need this compromise.)
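Here is a minimal Python sketch of the palette idea; the palette contents and the tiny image are invented for illustration:

```python
# Sketch of indexed (palette) color: the image stores one small index
# per pixel instead of a full 24-bit color value.

palette = [0x000000, 0xFF0000, 0x00FF00, 0x0000FF]  # up to 256 entries

# A tiny 2x2 "image": each pixel is just an index into the palette.
image = [
    [0, 1],
    [2, 3],
]

for row in image:
    # Look up each pixel's true 24-bit color through the palette.
    print([hex(palette[index]) for index in row])
```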
The problem with the palette compromise is that it is a compromise. If any single image requires more than 256 colors then some grouping of the closest colors must be made and all of these assigned to some intermediate color that exists in the palette. Also, since each image can carry its own palette, if more than one image is shown on the screen there may be a total of more than 256 individual colors represented and so the computer must impose further compromises and combinations.
When dealing with multiple images you can sometimes watch these compromises happening as various images become active on the screen. The remaining images on the screen may shift color as the active palette changes. It’s much worse if images with more than 16 colors are viewed on a 16-color display.
The bottom line is that when color display of an image is involved, you should expect viewing differences instead of expecting consistent images from computer to computer. Different is the norm, not the exception.
What Is Color?
As a refresher, let’s briefly discuss color itself. We’ll use the computer’s color monitor as the basis for this discussion.
First, would you believe that all those colors you see on your computer’s monitor are really only combinations of just three? Believe it; it’s true. The monitor uses a color scheme known as additive color. The three primary colors in this system are red, green, and blue, often referred to simply as RGB. Color on a monitor is obtained by illuminating phosphors that glow in the proper primary-color combinations when hit by an electron beam. The electron beam scans the face of the monitor through a mask which directs the beam to specific phosphors (the phosphors are laid out either in dot groups known as triads or in side-by-side thin lines). To the eye, each triad (or line trio) appears as a single dot. The term additive is used because where no beam falls there is black, and where all three colors are lit together there is white. Where two colors overlap, intermediate colors are created: red plus green makes yellow, red plus blue makes magenta, and green plus blue makes cyan.
Varying the beam intensities varies the color intensities, which translates into many more color varieties. Each pixel in the display is defined by a specific digital code specifying the intensity of each primary color to be applied to the triad or line trio representing that pixel (pixels and triads usually do not match one-to-one on the screen, leading to further compromise).
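As a small closing illustration, here is a Python sketch of additive mixing; the 0-255 intensity scale per primary is the usual 24-bit convention, assumed here for concreteness:

```python
# Sketch of additive RGB mixing: each color is (red, green, blue)
# with intensities from 0 (beam off) to 255 (full intensity).

def add(c1, c2):
    """Additively mix two colors, clamping each primary at full intensity."""
    return tuple(min(a + b, 255) for a, b in zip(c1, c2))

RED, GREEN, BLUE = (255, 0, 0), (0, 255, 0), (0, 0, 255)

print(add(RED, GREEN))             # (255, 255, 0)   -> yellow
print(add(RED, BLUE))              # (255, 0, 255)   -> magenta
print(add(GREEN, BLUE))            # (0, 255, 255)   -> cyan
print(add(add(RED, GREEN), BLUE))  # (255, 255, 255) -> white
# No beam at all gives (0, 0, 0) -> black.
```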