Video display standards have evolved from early monochrome to today’s high resolution color. The evolution of these standards is summarized here.
Initial video standards were developed by IBM, then the dominant player in the PC marketplace. As IBM’s influence over the hardware waned (or was diluted, whichever viewpoint you care to take), the Video Electronics Standards Association (VESA) was formed to define new standards for computer video displays.
But, it all started with the…
Monochrome Display Adapter (MDA)
Introduced in 1981, MDA was a pure text display showing 80-character lines with 25 vertical lines on the screen. Typically, the display was green text on a black background. Individual characters were 9 pixels wide by 14 pixels high (7×11 for the character, the rest for spacing). Multiplying that out gives a resolution of 720×350, but since individual pixels could not be addressed, there were no graphics. Some programs nevertheless managed interesting bar charts and line art using various ASCII characters, particularly those above 128 in code page 437.
The IBM MDA card had 4 KB of video memory. Display attributes included invisible, underline, normal, bright/bold, reverse video, and blinking; some attributes could be combined. IBM’s card also contained a parallel printer port, giving it the full name Monochrome Display and Printer Adapter.
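The numbers above are easy to verify: each screen cell stores one character byte plus one attribute byte, and the 9×14 cell grid accounts for the 720×350 raster. A quick sketch (constant names are mine):

```python
# Each MDA screen cell is one character byte plus one attribute byte.
COLS, ROWS = 80, 25
CELL_W, CELL_H = 9, 14  # pixel cell: 7x11 glyph plus spacing

buffer_bytes = COLS * ROWS * 2          # 4000 bytes, fits in 4 KB of video RAM
resolution = (COLS * CELL_W, ROWS * CELL_H)

print(buffer_bytes)   # 4000
print(resolution)     # (720, 350)
```

The attribute byte is where the underline, bright, reverse, and blink settings live, which is why each cell costs two bytes rather than one.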
The monitor’s refresh rate was 50 Hz and users tended to complain about eyestrain after long days in front of the monitor.
Hercules Graphics Card
Noting the 720×350 resolution of the MDA display, a company called Hercules Computer Technology (founded by Van Suwannukul), in 1982, developed an MDA-compatible video card that could display MDA text as well as graphics by having routines to individually address each pixel in the display. Because the screen height had to be a multiple of four, the full resolution of the Hercules Graphics Card was 720×348.
The Hercules card addressed two graphic pages, one at B0000h and the other at B8000h. When the second page was disabled there was no conflict with other adapters and the Hercules card could run in a dual-monitor mode with CGA or other graphics cards on the same computer. Hercules even made a CGA-compatible card called the Hercules Color Card and later the Hercules Graphics Card Plus (June 1986) followed by the Hercules InColor Card (April 1987) which had capabilities similar to EGA cards.
The graphics caught on and not only did Hercules cards multiply rapidly but clones of them started to appear; the ultimate homage to success. Most major software included a Hercules driver.
However, despite its attempts to keep up, Hercules started to fail as a company and was acquired by ELSA in August 1998 for $8.5 million. ELSA then declared bankruptcy in 1999 and the Hercules brand was bought by Guillemot Corporation, a French company, for $1.5 million. In 2004 Guillemot stopped producing graphics cards but Hercules, the name, lives on in some of their software and other products.
But, color was still the ultimate goal and Hercules was pushed out by other IBM specifications…
Color Graphics Adapter (CGA)
IBM came back to the fore when color started to appear in computer displays. The CGA standard, introduced in 1981 and primitive by today’s standards, was still color, even if only 16 colors. Because the first PCs were aimed at business, color did not catch on at first and the monochrome MDA standard was more often used. As prices came down and clones of the IBM PC were introduced, CGA became more of a standard.
The CGA card came with 16 KB of video memory and supported several different modes:
- Text mode which included 80×25 text (like the MDA system) in 16 colors. The resolution, however, was lower as each character was made up of 8×8 pixels instead of the MDA’s 9×14 pixels. A 40×25 text mode was also supported in 16 colors. In both, the foreground and background colors could be changed for each character.
- Monochrome graphics mode which displayed graphics at 640×200 pixels. This was lower than the Hercules card but served the purpose for an initial release; it was quickly superseded by the EGA standard.
- Color graphics mode which came in two flavors: a 320×200 pixel mode with four colors and a lesser-used resolution of 160×200 in 16 colors. The four-color mode only had two official palettes to choose from:
- Magenta, cyan, white and background color (black by default).
- Red, green, brown/yellow and background color (black by default).
The 16-color graphic mode used a composite color mode instead of the 16 colors of the CGA text above. Because the color technique was not supported in the BIOS there was little adoption of that mode except by some games.
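In the 320×200 four-color mode, each byte packs four 2-bit pixels, so one scan line takes 320 / 4 = 80 bytes. A sketch of decoding one byte under the first official palette (the function name and list layout are mine, for illustration):

```python
# In CGA's 320x200 mode each byte holds four 2-bit pixels.
# Palette 1: index 0 is the background color (black by default).
PALETTE_1 = ["background", "cyan", "magenta", "white"]

def decode_byte(b):
    """Unpack four pixels from one byte, leftmost pixel in the high-order bits."""
    return [PALETTE_1[(b >> shift) & 0b11] for shift in (6, 4, 2, 0)]

print(decode_byte(0b00_01_10_11))
# ['background', 'cyan', 'magenta', 'white']
```

Swapping in the red/green/brown palette is just a different lookup table; the pixel packing is identical.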
The CGA color palette was based on the Motorola MC6845 display controller. Red, green, and blue were created by the monitor’s three electron beams, with black being the absence of all three. The other colors were mixes of two beams, and white used all three. An “intensifier” bit gave a brighter version of the basic 8 colors, for a total of 16. There was one exception. In the normal RGB model, color #6 would be a dark yellow (#AAAA00); however, IBM changed the monitor circuitry to detect it and lower its green component to more closely match a brown (#AA5500). Other monitor makers mimicked this, which is why the intense version of #6 turned out to be a bright yellow rather than a lighter brown: the intense version was not so modified. IBM never clearly explained the change, but it’s speculated they wanted to match 3270 mainframe colors. So, the colors appeared as…
- Color 0 – Black – #000000 and the intense version, color 8 – Dark Grey – #555555
- Color 1 – Blue – #0000AA and the intense version, color 9 – Bright Blue – #5555FF
- Color 2 – Green – #00AA00 and the intense version, color 10 – Bright Green – #55FF55
- Color 3 – Cyan – #00AAAA and the intense version, color 11 – Bright Cyan – #55FFFF
- Color 4 – Red – #AA0000 and the intense version, color 12 – Bright Red – #FF5555
- Color 5 – Magenta – #AA00AA and the intense version, color 13 – Bright Magenta – #FF55FF
- Color 6 – Brown – #AA5500 and the intense version, color 14 – Bright Yellow – #FFFF55
- Color 6 on some clone monitors – Yellow – #AAAA00
- Color 7 – Light Grey – #AAAAAA and the intense version, color 15 – Bright White – #FFFFFF
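The whole RGBI scheme, including the brown exception, can be expressed as a small conversion routine (a sketch; the function name is mine):

```python
def rgbi_to_rgb(color):
    """Convert a 4-bit CGA color number (intensity, red, green, blue bits)
    to its 24-bit hex value, including IBM's hardware tweak that turns
    dark yellow (#AAAA00) into brown (#AA5500)."""
    intensity = (color >> 3) & 1
    r = (color >> 2) & 1
    g = (color >> 1) & 1
    b = color & 1
    base, boost = 0xAA, 0x55
    rgb = [c * base + intensity * boost for c in (r, g, b)]
    if color == 6:          # the monitor halves the green beam for color 6 only
        rgb[1] = 0x55
    return "#{:02X}{:02X}{:02X}".format(*rgb)

print(rgbi_to_rgb(6))    # #AA5500 (brown)
print(rgbi_to_rgb(14))   # #FFFF55 (bright yellow; the intense version is unmodified)
```

Dropping the `color == 6` special case reproduces the clone monitors that showed dark yellow instead of brown.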
There were several tweaks to the CGA text and graphics systems which resulted in different default background colors, different colored borders, and other tweaks which gave the appearance of the CGA system having more than the graphic modes above; but, these were all tweaks and not changes to the basic system itself.
Refresh rate for CGA monitors was increased to 60 Hz as a result of eyestrain complaints from the MDA 50 Hz rate. (The higher the refresh rate the less likely pixels on the screen will flicker as the phosphor is refreshed at a faster rate.)
But, the low resolution of CGA begged for higher resolutions. To fill those demands IBM developed EGA…
Enhanced Graphics Adapter (EGA)
The Enhanced Graphics Adapter was introduced by IBM in 1984 as the primary display for the new PC-AT Intel 286-based computer. EGA increased resolution to 640×350 pixels in 16 colors. The card itself contained 16 KB of ROM to extend the system BIOS with graphics functions. The card started with 64 KB of video memory but later cards and clone cards came with 256 KB of video memory to allow full implementation of all EGA modes which included…
- High-resolution mode with 640×350 pixel resolution. On any given screen display a total of 16 colors could be displayed; however, these could be selected from a palette of 64 colors.
- CGA mode included full 16-color versions of the CGA 640×200 and 320×200 graphics modes. The original CGA modes were present in the card but EGA is not 100% hardware-compatible with CGA.
- MDA could be supported to some degree. By setting switches on the card, an MDA monitor could be driven by an EGA card, although only the 640×350 mode was available.
Some EGA clones extended the EGA features to include 640×400, 640×480, and 720×540 along with hardware detection of the attached monitor and a special 400-line interlace mode to use with older CGA monitors. None of these became standard however.
EGA’s life was fairly short as VGA was introduced by IBM in April of 1987 and quickly took over the market. In the meantime, IBM had a brief go with a specialized graphics system called PGC and the 8514 Display Standard…
Professional Graphics Controller (PGC)
The Professional Graphics Controller (PGC) enjoyed a short lifetime between 1984 and 1987. It offered the “high” resolution of 640×480 pixels with 256 colors out of a palette of 4,096 colors. Refresh rate was 60 Hz. The card had 320 KB of video RAM and an on-board microprocessor. The card had a CGA mode as well but this could be turned off in order to maintain a CGA card in the same computer if necessary.
Designed for high-end use, the controller was composed of three(!) adapter cards (two cards, each taking a single adapter slot, and a third card between and attached to each). All were physically connected together with cables.
The price of several thousand dollars and the complicated hardware brought the PGC to a quick end even though it was a very good graphics card for its day.
8514 Display Standard
IBM introduced the 8514 Display Standard in 1987, about the same time as VGA. The companion monitor (model 8514) was also sold by IBM. Together, the 8514/A Display Adapter and the 8514 monitor made up the 8514 Display Standard, generally regarded as the first mass-market video accelerator. It was certainly not the first in the industry, but earlier accelerators were largely designed for workstations. Workstation accelerators were programmable; the 8514 was not. It was a fixed-function accelerator and could therefore be sold at a much lower price for the mass market. 2D drawing functions like line draw, color fill, and BITBLT were typically offloaded to the card while the CPU worked on other tasks.
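What a fixed-function BITBLT does can be sketched in a few lines: copy a rectangle from one place in the framebuffer to another without the CPU touching each pixel. This is an illustration of the operation itself, not the 8514 register interface:

```python
# A minimal sketch of a BITBLT (bit block transfer): copy a w x h
# rectangle within a framebuffer held as a flat list of pixel values.
# All names here are illustrative, not part of any 8514 API.
def bitblt(fb, pitch, src_x, src_y, dst_x, dst_y, w, h):
    for row in range(h):
        src = (src_y + row) * pitch + src_x
        dst = (dst_y + row) * pitch + dst_x
        fb[dst:dst + w] = fb[src:src + w]

fb = list(range(16))            # a tiny 4x4 "screen", pitch = 4
bitblt(fb, 4, 0, 0, 2, 2, 2, 2)  # copy the 2x2 corner at (0,0) to (2,2)
print(fb[10:12], fb[14:16])      # [0, 1] [4, 5]
```

On the real hardware the host wrote the rectangle coordinates to adapter registers and the card performed the loop itself, freeing the CPU.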
The basic modes the 8514 was designed to operate at were…
- 1024×768 pixels at 256 colors and 43.5 Hz interlaced.
- 640×480 pixels at 256 colors and 60 Hz non-interlaced, plus the other regular VGA modes.

The 8514/A card was only responsible for the 1024×768 graphics mode. All other modes were generated by the VGA hardware on the computer’s motherboard, and the video was then fed through the adapter card to the monitor connected to it. 8514 did not support an 800×600 pixel mode even though you might think it could.
Note the difference between the interlaced and non-interlaced modes and their frequencies above. While the 8514 displayed a much higher resolution screen than most other mass-market solutions of the day, the use of an interlaced display (drawing odd and even scan lines on alternate passes) was unusual.
8514 was replaced by IBM’s XGA standard which we’ll talk about later on this page. For now, we’ll get back in sequence with VGA…
Video Graphics Array (VGA)
With VGA you see a change in the terminology from adapter to array. This was a result of the fact that VGA graphics started to come on the motherboard as a single chip and not as plug-in adapter boards that took up an expansion slot in the computer. While since replaced with other standards for general use, VGA’s 640×480 remains a sort of lowest common denominator for all graphics cards. Indeed, even the Windows splash screen logo comes in at 640×480 because it shows before the graphics drivers for higher resolution are loaded into the system.
VGA supports both graphics and text modes of operation and can be used to emulate most (but not all) of the EGA, CGA, and MDA modes of operation. The most common VGA graphics modes include:
- 640×480 in 16 colors. This is a planar mode with four bit planes. When speaking about VGA, this is the mode most often thought of and is often what is meant when some say “VGA.”
- 640×350 in 16 colors.
- 320×200 in 16 colors.
- 320×200 in 256 colors (Mode 13h). This is a packed-pixel mode.
The VGA specification dictated 256KB of video RAM, 16- and 256-color modes, a 262,144 color palette (six bits for each of red, green, and blue), a selectable master clock (25 MHz or 28 MHz), up to 720 horizontal pixels, up to 480 lines, hardware smooth scrolling, split screen support, soft fonts, and more.
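The 262,144-color figure follows directly from the six-bit-per-channel DAC; a sketch (the helper name is mine) also shows the usual scaling of a 6-bit DAC value to the modern 8-bit range:

```python
# VGA's DAC used six bits per red, green, and blue channel,
# so the full palette is (2**6)**3 = 262,144 colors; a 256-color
# mode picks any 256 of them.
BITS = 6
levels = 2 ** BITS
print(levels ** 3)      # 262144

def dac_to_8bit(v6):
    """Scale a 6-bit DAC value (0-63) to the usual 8-bit range (0-255)."""
    return (v6 * 255) // 63

print(dac_to_8bit(63))  # 255
print(dac_to_8bit(42))  # 170
```

This is why VGA palette dumps show channel values of 0 to 63 rather than 0 to 255.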
Another VGA programming trick essentially created another graphics mode: Mode X. By “unchaining” the 256 KB of video RAM into its four separate planes, a 256-color display could be driven with more flexibility than the standard packed-pixel mode allowed. Mode X transferred some video memory operations to the video hardware instead of keeping them with the CPU. This sped up the display for things like games and was most often seen at 320×240 pixels, as that resolution produced square pixels at a 4:3 aspect ratio. Mode X also allowed double buffering: a method of keeping multiple video pages in memory in order to quickly flip between them. All VGA 16-color modes supported double buffering; only Mode X could do it in 256 colors.
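The addressing difference between packed-pixel Mode 13h and unchained Mode X can be sketched as follows (a 320-pixel-wide Mode X is assumed; on real hardware the plane is selected through the VGA sequencer’s map-mask register, not a return value):

```python
# Pixel addressing: packed-pixel Mode 13h vs. planar "Mode X".
WIDTH = 320

def mode13h_offset(x, y):
    """Mode 13h: one byte per pixel, simple linear offset."""
    return y * WIDTH + x

def mode_x_address(x, y):
    """Mode X: consecutive pixels alternate across four planes."""
    plane = x & 3                       # which of the 4 planes holds the pixel
    offset = y * (WIDTH // 4) + x // 4  # byte offset within that plane
    return plane, offset

print(mode13h_offset(5, 2))   # 645
print(mode_x_address(5, 2))   # (1, 161)
```

The planar layout is what makes 320×240 fit: four planes of 64 KB each can hold several full pages, enabling the 256-color double buffering mentioned above.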
Many other programming tweaks to VGA could (and were) also performed. Some, however, caused monitor display problems such as flickering, roll, and other abnormalities so they were not used commercially. Commercial software typically used “safe” VGA modes.
Video memory typically mapped into real mode memory in a PC in the memory spaces…
- B0000h (used for monochrome text mode)
- B8000h (used for color text and CGA graphics modes)
- A0000h (used for EGA/VGA graphics modes)
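As a sketch, the map above can be expressed as a lookup using the conventional window sizes (64 KB for the EGA/VGA window, 32 KB for each text window; the helper is illustrative):

```python
# Real-mode video memory regions and their conventional window sizes.
VIDEO_REGIONS = {
    0xA0000: "EGA/VGA graphics modes",
    0xB0000: "monochrome text mode",
    0xB8000: "color text and CGA graphics modes",
}

def region_for(addr):
    """Return which video region a flat real-mode address falls in
    (64 KB at A0000h, 32 KB at each of B0000h and B8000h)."""
    for base, size in ((0xA0000, 0x10000), (0xB0000, 0x8000), (0xB8000, 0x8000)):
        if base <= addr < base + size:
            return VIDEO_REGIONS[base]
    return "not video memory"

print(region_for(0xB8000))  # color text and CGA graphics modes
```

Because the monochrome and color windows never overlap, a program can write to both at once, which is exactly what the dual-monitor setups described below relied on.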
Note that by using the different memory areas it is possible to have two different monitors attached and running in a single computer. Early on, Lotus 1-2-3 took advantage of this by having the ability to display “high resolution” text on an MDA display along with color (low-resolution) graphics showing an associated graph of some part of the spreadsheet. Other such uses included coding on one screen with debugging information on another and similar applications.
VGA also had a subset called…
Multicolor Graphics Adapter (MCGA)
MCGA shipped first with the IBM PS/2 Model 25 in 1987. MCGA graphics were built into the motherboard of the computer. A sort of step between EGA and VGA, MCGA had a short life: it shipped with only two IBM models, the PS/2 Model 25 and PS/2 Model 30, and was fully discontinued by 1992. The MCGA capabilities were incorporated into VGA. Note: Some say that the 256-color mode of VGA is MCGA but, to be accurate, no MCGA cards were ever made; only the two IBM PS/2 models indicated had true MCGA chips. The 256-color mode of VGA, while similar, stands alone as part of the VGA specification.
The specific MCGA display modes included:
- All CGA modes (except the text mode that allowed connection of the MDA (model 5151) monitor).
- 640×480 monochrome at a 60 Hz refresh rate.
- 320×200 256-color at a 70 Hz refresh rate. The 256 colors were chosen from a palette of 262,144 colors.
Like the other IBM standards, clone makers quickly cloned VGA. Indeed, while IBM produced later graphics specifications as we’ll see below, the VGA specification was the last IBM standard that other manufacturers followed closely. Over time, as extensions to VGA appeared, they were loosely grouped under the name Super VGA.
Super VGA (SVGA)
Super VGA was first defined in 1989 by the Video Electronics Standards Association (VESA); an association dedicated to providing open standards instead of the closed standards from a single company (IBM). While initially defined as 800×600 with 16 colors, SVGA evolved to 1024×768 with 256 colors and even higher resolutions and colors as time went on.
As a result SVGA is more of an umbrella than a fixed standard. Indeed, most any graphics system released between the early 1990s and early 2000s (a decade!) has generally been called SVGA. And, it was up to the user to determine from the specifications if the graphics system supported their needs.
The VESA SVGA standard was also called the VESA BIOS Extension (VBE). VBE could be implemented in either hardware or software. Often you would find a version of the VBE in a graphic card’s hardware BIOS with extensions in software drivers.
How could a standard be so fractured? With the introduction of VGA, the video interface between the adapter and the monitor changed from digital to analog. An analog system can support what is effectively an infinite number of colors, so color depth largely became a function of how the video adapter was constructed, not the monitor. As a result, for a given set of monitors there could be thousands of different video adapters able to connect to and drive them. Of course, the monitors had to handle the various refresh frequencies, and some had to be larger to support the increasing number of pixels, but it was easier to produce a few large multi-frequency monitors than it was to produce the graphics computing power necessary to drive them.
Thus, while SVGA is an accepted term, it has no specific meaning except to indicate a display capability generally somewhere between 800×600 pixels and 1024×768 pixels at color depths ranging from 256 colors (8-bits) to 65,536 colors (16-bits). But, even those values overlap the various XGA standards…
Extended Graphics Array (XGA)
IBM’s XGA was introduced in 1990 and is generally considered to be a 1024×768 pixel display. It would be wrong, however, to consider XGA a successor to SVGA as the two were initially released about the same time. Indeed, the SVGA “definition” has expanded as seen above and one might consider XGA to have been folded under the SVGA umbrella.
Initially, XGA was an enhancement to VGA and added two modes to VGA…
- 800×600 pixels at 16-bits/pixel for 65,536 colors.
- 1024×768 pixels at 8-bits/pixel for 256 colors.
Graphic display processing offloading features from the 8514 system were incorporated into and expanded under XGA. The number and type of drawing primitives were increased over the 8514 and the 16-bit color mode added.
Later, an XGA-2 specification added 640×480 at true color, increased the 1024×768 mode to high color (16 bits/pixel for 65,536 colors), and improved graphics accelerator performance.
Note: XGA was an IBM standard; VESA released a similar standard called Extended Video Graphics Array (EVGA) in 1991. The two should not be confused. EVGA, as a standalone term, never really caught on.
Over time, XGA developed into a family of different standards. The following entries summarize this family…
- Wide Extended Graphics Array (WXGA). A widescreen standard that varied somewhat in the supported resolutions. The most common were 1280×720 (16:9 aspect ratio), 1280×768 (5:3), 1280×800 (8:5), 1360×768 (about 16:9), and 1366×768 (about 16:9). The latter two are generally used for LCD televisions; the first three are often found in notebook computers. The 1280×720 pixel mode is also called 720p when used with HDTV.
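The aspect ratios quoted above come straight from the resolutions; reducing width and height by their greatest common divisor also shows why 1366×768 is only “about” 16:9:

```python
from math import gcd

def aspect_ratio(w, h):
    """Reduce a resolution to its lowest-terms aspect ratio."""
    d = gcd(w, h)
    return w // d, h // d

for res in ((1280, 720), (1280, 768), (1280, 800), (1366, 768)):
    print(res, aspect_ratio(*res))
# (1280, 720)  -> (16, 9)
# (1280, 768)  -> (5, 3)
# (1280, 800)  -> (8, 5)
# (1366, 768)  -> (683, 384)   # ~16:9, but not exactly
```

1366×768 reduces only to 683:384, which is why such panels are described as approximately 16:9.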
Super Extended Graphics Array (SXGA)
Super XGA was another step up in resolution and became a family of its own…
- Super Extended Graphic Array (SXGA and SXGA+). A resolution of 1280×1024 pixels with an aspect ratio of 5:4 and 1.3 million pixels. This resolution is common in 17-inch to 19-inch LCD monitors. The plus version has a resolution of 1400×1050 pixels with an aspect ratio of 4:3 and 1.47 million pixels. You might find this on notebook LCD screens.
- Wide Super Extended Graphics Array (WSXGA and WSXGA+). A resolution of 1440×900 pixels with an aspect ratio of 8:5 and 1.3 million pixels. The Apple widescreen MacBook Pro came with this resolution but it’s not otherwise widely used. The plus version has a resolution of 1680×1050 pixels with an aspect ratio of approximately 25:16 and 1.76 million pixels. Some Dell and Apple products have come with this native resolution.
Ultra Extended Graphics Array (UXGA)
Ultra XGA was another step up in resolution, based on four times the standard 800×600 resolution of SVGA. Its basic format is 1600×1200 pixels and it also became a family of its own…
- Ultra Extended Graphics Array (UXGA). A resolution of 1600×1200 pixels with an aspect ratio of 4:3 and 1.9 million pixels. Some manufacturers refer to this standard as Ultra Graphics Array (UGA) but that is not a recognized official name. There is no “plus” version of UXGA.
- Widescreen Ultra Extended Graphics Array (WUXGA). A resolution of 1920×1200 pixels with an aspect ratio of 8:5 and 2.3 million pixels. There is no “plus” version of WUXGA although one manufacturer sells a 17-inch monitor with this resolution and claims that it is WUXGA+. A number of different manufacturers make products with this resolution.
Quad (Quantum) Extended Graphics Array (QXGA)
As of 2005, the QXGA family offered some of the highest resolutions defined. As of this writing, there are few commercial monitors with these resolutions and only a very few higher-end digital cameras. Expect more as fabrication techniques improve.
The Quad term derives from a multiple of 4 against a lower resolution standard. QXGA, for example, is 4 times the number of pixels of XGA at the same aspect ratio (4 times 786,432 pixels = 3,145,728 pixels which, at a 4:3 aspect ratio becomes a display of 2048×1536 pixels). Sometimes the name quantum is used instead to indicate there are so many pixels you’d have to measure them at the quantum level; a bit of an exaggeration but in keeping with the fun people have inventing names.
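Since pixel count scales with the square of the linear dimensions, multiplying a mode’s pixels by 4 (or by 16, for the later Hex family) doubles (or quadruples) each dimension while preserving the aspect ratio. A quick check:

```python
from math import sqrt

def scale_mode(w, h, factor):
    """Scale a base mode's total pixel count by `factor`, keeping the
    aspect ratio; each linear dimension grows by sqrt(factor)."""
    k = sqrt(factor)
    return round(w * k), round(h * k)

print(scale_mode(1024, 768, 4))    # (2048, 1536)  QXGA = 4x XGA
print(scale_mode(1024, 768, 16))   # (4096, 3072)  HXGA = 16x XGA
print(2048 * 1536)                 # 3145728 = 4 * 786432
```

The same relation links each Wide, Super, and Ultra variant to its Quad and Hex counterparts in the tables below.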
The QXGA family therefore can be summarized as…
- Quad Extended Graphics Array (QXGA). A resolution of 2048×1536 with an aspect ratio of 4:3 and 3.1 million pixels.
- Wide Quad Extended Graphics Array (WQXGA). A resolution of 2560×1600 with an aspect ratio of 8:5 and 4.1 million pixels.
- Quad Super Extended Graphics Array (QSXGA). A resolution of 2560×2048 with an aspect ratio of 5:4 and 5.2 million pixels.
- Wide Quad Super Extended Graphics Array (WQSXGA). A resolution of 3200×2048 with an aspect ratio of 25:16 and 6.6 million pixels.
- Quad Ultra Extended Graphics Array (QUXGA). A resolution of 3200×2400 with an aspect ratio of 4:3 and 7.7 million pixels.
- Wide Quad Ultra Extended Graphics Array (WQUXGA). A resolution of 3840×2400 with an aspect ratio of 8:5 and 9.2 million pixels.
Hex[adecatuple] Extended Graphics Array (HXGA)
As of 2005, the HXGA and related standards are the highest resolution presently defined. As of this writing, there are no commercial monitors with these resolutions and only a very few high-end digital cameras.
The Hex[adecatuple] term derives from a multiple of 16 against a lower resolution standard. HXGA, for example, is 16 times the number of pixels of XGA at the same aspect ratio (16 times 786,432 pixels = 12,582,912 pixels which, at a 4:3 aspect ratio becomes a display of 4096×3072 pixels).
The HXGA family therefore can be summarized as…
- Hex[adecatuple] Extended Graphics Array (HXGA). A resolution of 4096×3072 with an aspect ratio of 4:3 and 12.6 million pixels.
- Wide Hex[adecatuple] Extended Graphics Array (WHXGA). A resolution of 5120×3200 with an aspect ratio of 8:5 and 16.4 million pixels.
- Hex[adecatuple] Super Extended Graphics Array (HSXGA). A resolution of 5120×4096 with an aspect ratio of 5:4 and 21 million pixels.
- Wide Hex[adecatuple] Super Extended Graphics Array (WHSXGA). A resolution of 6400×4096 with an aspect ratio of 25:16 and 26 million pixels.
- Hex[adecatuple] Ultra Extended Graphics Array (HUXGA). A resolution of 6400×4800 with an aspect ratio of 4:3 and 31 million pixels.
- Wide Hex[adecatuple] Ultra Extended Graphics Array (WHUXGA). A resolution of 7680×4800 with an aspect ratio of 8:5 and 37 million pixels.