
Graphics Card

Kaiti Norton
Last Updated May 24, 2021 8:04 am

A graphics card is a computer component that produces images for a display output. Some graphics cards are integrated into a computer's motherboard, while others are designed as removable circuit boards that can be replaced or upgraded as needed. The ability to upgrade is especially important for users who need a high-performance graphics card for detailed visual work like graphic design, video editing, or video gaming.

How does a graphics card work?

A graphics card consists of many of the same components as other printed circuit boards, including:

A graphics processing unit (GPU) that performs the rendering calculations
Video memory (VRAM) that holds textures and frame buffer data
A cooling system, typically a heat sink and one or more fans
An interface connector (usually PCI Express) that attaches the card to the motherboard
Display output ports such as HDMI, DisplayPort, or DVI

Is a graphics card necessary?

All consumer-grade computers (laptops and desktops) come with either an integrated graphics card that's built into the motherboard or a dedicated graphics card that's removable. Without either of these, a computer has no way to render the graphical user interface (GUI) and therefore no way to accept commands from a user. The only computers that have historically not needed graphics cards are large mainframe computers, which have no GUI of their own and are instead accessed through graphics terminals or other external devices connected by the user.
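A rough way to see which of the two a particular machine has is to ask the operating system to list its graphics hardware. The minimal Python sketch below assumes a Linux system with the lspci utility (part of the pciutils package) installed; the list_gpus helper name is purely illustrative, and other platforms need different tools (for example, system_profiler on macOS).

import subprocess

# List the graphics devices the OS reports via lspci (Linux only).
# Integrated and dedicated GPUs both appear as VGA/3D/Display controllers.
def list_gpus():
    output = subprocess.run(
        ["lspci"], capture_output=True, text=True, check=True
    ).stdout
    keywords = ("VGA compatible controller", "3D controller", "Display controller")
    return [line for line in output.splitlines()
            if any(key in line for key in keywords)]

if __name__ == "__main__":
    for gpu in list_gpus():
        print(gpu)

A machine with both integrated and dedicated graphics will typically show two entries: one for the graphics built into the CPU or motherboard and one for the add-in card.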

Dedicated graphics cards are not necessary, but for graphics-heavy applications like video/photo editing or video gaming, a weak graphics card can cause significant lag or program failure. As such, many professionals in these areas require dedicated graphics cards that can keep up with their high-performance demands. Sometimes users will attempt to improve the performance of their integrated graphics by manually raising the clock speed, a practice called overclocking. However, overclocking can overheat and damage computer components and will typically void the manufacturer's warranty.
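For users who do experiment with clock speeds, it helps to monitor what the card is actually doing. The minimal sketch below assumes an NVIDIA card with the nvidia-smi utility (installed alongside NVIDIA's drivers) available on the PATH; the gpu_clock_and_temp helper name is illustrative, and AMD or Intel GPUs require different tools.

import subprocess

# Query the first NVIDIA GPU's current graphics clock, memory clock,
# and temperature using nvidia-smi's CSV output.
def gpu_clock_and_temp():
    result = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=clocks.gr,clocks.mem,temperature.gpu",
         "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    first_gpu = result.stdout.strip().splitlines()[0]
    graphics_clock, memory_clock, temperature = [
        field.strip() for field in first_gpu.split(",")
    ]
    return graphics_clock, memory_clock, temperature

if __name__ == "__main__":
    clock, mem_clock, temp = gpu_clock_and_temp()
    print(f"Graphics clock: {clock}, memory clock: {mem_clock}, temperature: {temp} C")

Watching these readings while a game or editing application is running makes it clear whether the card is stable or being pushed toward overheating.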
