
Representing colors in computers

For many years, images and graphics have been some of the most important things represented by computers - think about how much of your daily interaction with computers is about pretty things on screens, pictures, or videos.

In its most basic form, a picture is just a very large collection of dots (called pixels) that are each a particular color. Get enough dots and you can build a picture. Change the dots rapidly and you get video.

Primary colors of Human Vision - RED, GREEN, BLUE

In art class you may have learned that the primary colors are red, blue, and yellow. Those primary colors apply to mixing paint or markers or crayons. But computers don’t use paint, they use LIGHT. And for humans, the primary colors of light are RED, GREEN, and BLUE. Why? It has to do with how your eyes work - you have three special types of structures in your eyes called CONES that primarily detect red light, green light, and blue light, and all other colors are detected by a mixture of these cones.

Computer monitors and TVs and other devices designed to show colors to humans take advantage of the way our eyes work by emitting only those three colors in various amounts. Even though the sun outputs light at a particular wavelength we call “yellow”, computers don’t - instead, they output equal amounts of red light and green light, and our eyes INTERPRET that as yellow.

This means that to a computer, any color is simply defined by three numbers: how bright should the red glow, how bright should the green glow, and how bright should the blue glow?
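As a minimal sketch of that idea in Python (using the 0-255 range for each value, which is explained in the next section), a color is just a triple of brightness values:

```python
# A color is three brightness values: (red, green, blue).
# 0 means "off" and 255 means "full brightness" (see the next section).
red    = (255, 0, 0)
green  = (0, 255, 0)
blue   = (0, 0, 255)
yellow = (255, 255, 0)    # full red + full green is seen as yellow
white  = (255, 255, 255)  # all three at full brightness
black  = (0, 0, 0)        # all three off

for name, (r, g, b) in [("red", red), ("yellow", yellow), ("white", white)]:
    print(f"{name}: red={r}, green={g}, blue={b}")
```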

You can play with this using the spotlight app below.

[Interactive spotlight applet: adjust the amounts of red, green, and blue light and watch the mixed color and its hex code update. Center Hex: #FFFFFF]

Storing a color value

Colors in a computer are generally represented using one byte for the amount of red, one byte for the amount of green, and one byte for the amount of blue, in that order. This allows for \(256^3\) total possible colors, about 16 million of them, certainly more than the human eye can differentiate.
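As a rough sketch of that byte-per-channel layout (the helper name here is just for illustration), you can pack the three bytes into a single 24-bit number and count the total possibilities:

```python
def pack_rgb(r, g, b):
    """Pack three 0-255 channel values into one 24-bit integer: red, green, blue in that order."""
    assert all(0 <= c <= 255 for c in (r, g, b))
    return (r << 16) | (g << 8) | b

total_colors = 256 ** 3
print(total_colors)           # 16777216 -- about 16 million
print(pack_rgb(255, 255, 0))  # 16776960 -- full red + full green (yellow)
```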

We often represent a color using hexadecimal, as shown in the applet above. Each value ranges from 00 (0) to FF (255), where 00 means none of that color and FF means full brightness. The first two characters represent the red value, the next two the green, and the last two the blue. The hash symbol (#) in front of the value is a clue to the computer that hexadecimal is coming.
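A quick sketch of converting between the three channel values and this hex notation (the function names are made up for this example):

```python
def rgb_to_hex(r, g, b):
    """Format 0-255 channel values as a #RRGGBB hex string."""
    return f"#{r:02X}{g:02X}{b:02X}"

def hex_to_rgb(hex_string):
    """Parse a #RRGGBB string back into (red, green, blue) values."""
    s = hex_string.lstrip("#")
    return int(s[0:2], 16), int(s[2:4], 16), int(s[4:6], 16)

print(rgb_to_hex(255, 255, 0))  # "#FFFF00" -- yellow
print(hex_to_rgb("#FFFFFF"))    # (255, 255, 255) -- white
```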

Sometimes you will see a fourth component of a color, called “alpha”. This doesn’t actually change the color itself, it changes the OPACITY, or transparency, which affects how that object blends in with the background. When all four bytes are used, we call that “32-bit color”.
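One possible layout for those four bytes (an assumption here; real systems order the bytes in several different ways, such as RGBA or ARGB) packs the alpha byte alongside the three color bytes, for 32 bits per pixel:

```python
def pack_rgba(r, g, b, a):
    """Pack red, green, blue, and alpha (each 0-255) into one 32-bit value (RGBA order assumed)."""
    return (r << 24) | (g << 16) | (b << 8) | a

# Fully opaque red (alpha = 255) vs. half-transparent red (alpha = 128).
opaque_red = pack_rgba(255, 0, 0, 255)
ghost_red  = pack_rgba(255, 0, 0, 128)
print(f"{opaque_red:08X}")  # FF0000FF
print(f"{ghost_red:08X}")   # FF000080
```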