Topic: 256 Color, High Color, True Color

Chip Fossa
From: Monson, MA, USA (deceased)
Posted 16 Feb 2001 7:17 am
Hi Everyone,
Can someone explain the difference between
800x600 256 color, 800x600 High Color [16Bit], and 800x600 True Color [24 Bit]?
My present setting is 800x600 High Color [16 Bit], and all seems well.
Thanks, y'all
chipsahoy

David Pennybaker
From: Conroe, TX USA
Posted 16 Feb 2001 10:02 am

Quite simply, 256 color (8-bit) has 2^8 = 256 different colors available in total, each built from R, G, and B (red, green, blue) levels.
16-bit color has 2^16 = 65,536 different values.
24-bit color has 2^24 = 16,777,216 different values.
You'll notice a BIG difference between 8-bit and 16-bit. The difference between 16-bit and 24-bit is more subtle, but it's there. Especially in scanned photographs.
Use the highest color setting possible for your resolution. Depending on your video card, you may have to use a lesser color setting if you increase resolution. A modern video card can easily display 1600x1200 at 32-bit color resolution.
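Those color counts are just powers of two; a quick sketch (Python used here purely for illustration) confirms the arithmetic:

```python
# Number of distinct colors at each common bit depth is 2**bits.
for bits in (8, 16, 24):
    print(f"{bits}-bit color: {2**bits:,} colors")
# 8-bit color: 256 colors
# 16-bit color: 65,536 colors
# 24-bit color: 16,777,216 colors
```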
------------------
The Unofficial Photographer of The Wilkinsons

Chip Fossa
From: Monson, MA, USA (deceased)
Posted 16 Feb 2001 10:19 am

Thanks David. I have an HP Pavilion PC with a 733 MHz processor, a 30 GB hard drive, and 128 MB of RAM. The video card is an Intel 82810 Graphics Controller (driver 4.12.01.2576).
I also have a Voodoo3 2000 graphics card, but I can't seem to install it on the Pavilion. I also have a SoundBlaster MP3+, and this, too, will not install.
The Pavilion came from the factory with Windows ME as the OS.

Bobby Lee
From: Cloverdale, California, USA
Posted 16 Feb 2001 12:13 pm

If you're using an older video card, the amount of memory on it might limit the color depth that you can display. At 800x600, a 256 color display will require 480,000 bytes (about half a meg). For high color, each pixel uses 2 bytes so it's twice that. For true color, each pixel probably uses 4 bytes, so 2 MB of display memory is required. (It's easier for the processor to read 4 bytes than 3, so 24 bit color often uses 32 bit pixels.)
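Those memory figures work out like this (a small sketch; the 4-bytes-per-pixel figure for true color is the common case described above):

```python
# Frame buffer size at 800x600 for each color depth.
# 24-bit pixels are commonly stored 4 bytes wide for alignment.
width, height = 800, 600
for name, bpp in [("256 color (1 byte/px)", 1),
                  ("high color (2 bytes/px)", 2),
                  ("true color (4 bytes/px)", 4)]:
    size = width * height * bpp
    print(f"{name}: {size:,} bytes = {size / 2**20:.2f} MB")
# 256 color (1 byte/px): 480,000 bytes = 0.46 MB
# high color (2 bytes/px): 960,000 bytes = 0.92 MB
# true color (4 bytes/px): 1,920,000 bytes = 1.83 MB
```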
Another interesting twist is color fidelity. 24 bit color ("True Color") gives you 8 bits each of red, green and blue. That's 256 levels of each primary color. 16 bit color ("High Color") is usually set for 5 bits (32 levels) of red and blue, and 6 bits (64 levels) of green. That's because the human eye is more sensitive to variations in green.
8 bit color (256 colors) is actually a lookup table of 256 24-bit colors. 20 of those colors are predefined by Windows, and the rest are under the control of the application.
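That lookup-table scheme can be sketched like this (the palette entry values here are made up for illustration):

```python
# 8-bit mode: each screen pixel is a single byte that indexes into
# a 256-entry palette of full 24-bit (R, G, B) colors.
palette = [(0, 0, 0)] * 256     # 256 slots; Windows predefines 20 of them
palette[20] = (255, 128, 0)     # an application-chosen color (orange)

pixel = 20                      # one byte of display memory
r, g, b = palette[pixel]        # the 24-bit color actually shown
print(r, g, b)                  # 255 128 0
```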
Here's another interesting wrinkle. Windows doesn't really support high color internally. A Windows program deals with 24-bit or 8-bit color. All 16-bit color manipulation is handled by the video card's display driver, which throws away the extra 8 bits (3 red, 3 blue and 2 green) as it reformats the pixels into a 16-bit display buffer. This extra overhead can make the high color mode run a bit slower than true color.
256 colors, the lowest display mode, should really be avoided. Since the color space is so limited, it often resorts to "dithering" to simulate colors that aren't available in the lookup table. The result is a lot of dots and a "grainy" appearance, especially on photographs.
High color mode doesn't really give you the full color fidelity of the system. It's designed to avoid dithering on systems where there isn't enough video memory for true color. I wouldn't use high color (16-bit) if true color (24-bit) were available at the same resolution.
True color, when it's available, is almost always the best choice. The only exception would be if you only use the computer for word processing and spreadsheets. Then you're better off going for the highest resolution your monitor can handle, even if you have to back down to 8-bit or 16-bit color to do it. If you use a web browser or anything else that displays photos, you should use true color.
------------------
Bobby Lee - email: quasar@b0b.com - gigs - CDs
Sierra Session S-12 (E9), Williams DX-10 (E9, D6), Sierra Olympic S-12 (F Diatonic)
Sierra 8 Laptop (D13), Fender Stringmaster D-8 (E13, A6)

Chip Fossa
From: Monson, MA, USA (deceased)
Posted 16 Feb 2001 1:11 pm

WOW b0b,
That's a great explanation of the color formatting. I went ahead and changed over to True Color. It does seem a bit clearer and more intense. Thanks b0b.
chipsahoy

Bobby Lee
From: Cloverdale, California, USA
Posted 16 Feb 2001 2:31 pm

Another thing I should mention: the performance hit of constructing 16-bit pixels might not be as great as the time spent transferring the bigger 24-bit pixels into the card's 32-bit display memory. It's actually a pretty close race. The main reason for using true color is that it looks better. It really does, especially on the Forum background!