
Enhanced Graphics Adapter

Webopedia Staff
Last Updated May 24, 2021 7:41 am

The enhanced graphics adapter (EGA) was IBM’s second-generation video card for computer displays. First introduced in 1984, the EGA improved color depth and spatial resolution, offering 16 distinct colors drawn from a palette of 64 and a maximum resolution of 640 x 350 pixels.

The EGA is compatible with its predecessors, the color graphics adapter (CGA) and the monochrome display adapter (MDA), via specific configuration, and with its successor, the video graphics array (VGA), released in 1987.

How does an enhanced graphics adapter work?

The EGA is a video card that enables the computer to output graphics in a range of colors. When a user runs software that draws graphics, the CPU sends the image data to the EGA, and the graphics card sets individual pixels to build the image on the user’s monitor.
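In practice, DOS-era programs usually drove the EGA through the video BIOS rather than programming its registers directly. The sketch below is a minimal illustration of that flow, assuming a 16-bit DOS C compiler (such as Turbo C) that provides int86() in dos.h; it asks the BIOS (interrupt 10h) to switch to the EGA’s 640 x 350, 16-color mode and plot pixels, and the BIOS programs the card’s video memory on the program’s behalf.

```c
/* Minimal sketch of driving the EGA through the BIOS.
 * Assumes a 16-bit DOS compiler (e.g., Turbo C) providing int86() in <dos.h>. */
#include <dos.h>
#include <conio.h>

static void set_video_mode(unsigned char mode)
{
    union REGS r;
    r.h.ah = 0x00;                /* BIOS function 00h: set video mode    */
    r.h.al = mode;                /* mode 10h = 640x350, 16 colors (EGA)  */
    int86(0x10, &r, &r);
}

static void put_pixel(int x, int y, unsigned char color)
{
    union REGS r;
    r.h.ah = 0x0C;                /* BIOS function 0Ch: write pixel */
    r.h.al = color;               /* palette index 0-15             */
    r.h.bh = 0;                   /* display page 0                 */
    r.x.cx = x;                   /* column                         */
    r.x.dx = y;                   /* row                            */
    int86(0x10, &r, &r);
}

int main(void)
{
    int x;
    set_video_mode(0x10);                  /* EGA high-resolution graphics mode */
    for (x = 0; x < 640; x++)
        put_pixel(x, 175, x / 40 % 16);    /* bar cycling through the 16 colors */
    getch();                               /* wait for a keypress               */
    set_video_mode(0x03);                  /* restore 80x25 text mode           */
    return 0;
}
```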

Improvements from CGA

Like the color graphics adapter (CGA), the EGA used an RGB color model with separate signal lines connected directly to the monitor. Both graphics cards could put 16 distinct colors on screen at once, but they did not offer the same range of colors or the same resolution. The CGA’s palette contained just 16 fixed colors, compared with the EGA’s 64-color palette, and it could display all 16 at once only at a resolution of 160 x 100 pixels.
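The EGA’s 64-color range comes from giving each primary color two intensity bits instead of one. The sketch below, which assumes the commonly documented palette-register bit layout (primary red, green, and blue in the low three bits, the lower-intensity secondary bits above them), enumerates all 64 values and converts each to a modern 8-bit-per-channel color.

```c
/* Sketch of the EGA's 64-color range: each palette entry is a 6-bit value
 * with two intensity bits per primary color.  The bit layout assumed here
 * (primaries in bits 0-2, secondary low-intensity bits in bits 3-5) follows
 * the commonly documented EGA palette register format. */
#include <stdio.h>

struct rgb { unsigned char r, g, b; };

/* Convert a 6-bit EGA palette value into an 8-bit-per-channel color.
 * Each channel has four levels: 0, 1/3, 2/3 and full intensity. */
static struct rgb ega_to_rgb(unsigned char v)
{
    struct rgb c;
    unsigned level_r = ((v >> 2) & 1) * 2 + ((v >> 5) & 1);
    unsigned level_g = ((v >> 1) & 1) * 2 + ((v >> 4) & 1);
    unsigned level_b = ((v >> 0) & 1) * 2 + ((v >> 3) & 1);
    c.r = (unsigned char)(level_r * 85);   /* 0, 85, 170, 255 */
    c.g = (unsigned char)(level_g * 85);
    c.b = (unsigned char)(level_b * 85);
    return c;
}

int main(void)
{
    unsigned char v;
    for (v = 0; v < 64; v++) {             /* enumerate the full 64-color range */
        struct rgb c = ega_to_rgb(v);
        printf("%2u -> #%02X%02X%02X\n", v, c.r, c.g, c.b);
    }
    return 0;
}
```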

History of the Enhanced Graphics Adapter

Between 1984 and 1987, the EGA was the standard for PC graphics. With stronger color and resolution, vendors and organizations alike could offer better graphics visibility and text readability. EGA cards remained compatible and widely used into the 1990s, before VGA became the industry standard.

Some of the most popular uses of this advance in graphics technology were video games. Popular programs that used EGA include King’s Quest III (1986) and the original SimCity (1989).
