How Graphics Cards Work
The graphics card plays an essential role in the PC. It takes the digital information that the computer produces and turns it into something human beings can see. On most computers, the graphics card converts that digital information into an analog signal for the monitor; laptops use digital displays, so on a laptop the data stays in digital form all the way to the screen.
RADEON™ 64-MB AGP Graphics Card
The goal of a graphics card is to create a set of signals that display the dots on the computer screen.
What is a Graphics Card?
Today's graphics cards are computing systems in their own right. But these cards started out as very simple devices. By understanding the evolution of graphics cards, you can begin to see why they are so powerful today.
Memory: The first thing that a graphics card needs is memory. The memory holds the color of each pixel. In the simplest case, a 640x480 display on which each pixel is only black or white, you need just 1 bit to store each pixel's color (See How Bits and Bytes Work for details.). Since a byte holds 8 bits, you need (640/8) 80 bytes to store the pixel colors for one line of pixels on the display. You need (480 x 80) 38,400 bytes of memory to hold all of the pixels visible on the display.
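The arithmetic above generalizes to any resolution and color depth. Here is a minimal sketch (the function name `framebuffer_bytes` is illustrative, not a real API) that computes the frame buffer size the same way:

```python
def framebuffer_bytes(width, height, bits_per_pixel):
    """Total bytes needed to store one full frame of pixels."""
    bits_per_line = width * bits_per_pixel
    bytes_per_line = bits_per_line // 8   # a byte holds 8 bits
    return height * bytes_per_line

# 1 bit per pixel (black and white): 80 bytes per line, 38,400 total
print(framebuffer_bytes(640, 480, 1))    # 38400
# 8 bits per pixel (256 colors): 307,200 bytes
print(framebuffer_bytes(640, 480, 8))    # 307200
```

The same formula explains why higher resolutions and color depths demand so much more video memory: the total grows with the product of all three factors.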
Computer Interface: The second thing a graphics card needs is a way for the computer to change the graphics card's memory. This is normally done by connecting the graphics card to the card bus on the motherboard. The computer can send signals through the bus to alter the memory.
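From the CPU's point of view, changing a pixel means writing to a particular address in the card's memory. This sketch models that for the 1-bit 640x480 case, with a `bytearray` standing in for the card's memory and `set_pixel` as a hypothetical helper:

```python
# Assumed layout: 1 bit per pixel, rows stored top to bottom,
# most-significant bit of each byte is the leftmost pixel.
WIDTH, HEIGHT = 640, 480
BYTES_PER_LINE = WIDTH // 8           # 80 bytes per line

video_memory = bytearray(HEIGHT * BYTES_PER_LINE)

def set_pixel(x, y, on):
    """Turn pixel (x, y) white (on=True) or black (on=False)."""
    index = y * BYTES_PER_LINE + x // 8   # which byte holds this pixel
    mask = 1 << (7 - x % 8)               # which bit within that byte
    if on:
        video_memory[index] |= mask
    else:
        video_memory[index] &= ~mask

set_pixel(10, 5, True)   # the CPU writes one bit over the "bus"
```

On real hardware the write travels over the bus to memory-mapped addresses on the card, but the index arithmetic is the same idea.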
Video Interface: The next thing that the graphics card needs is a way to generate the signals for the monitor. The card must generate color signals that drive the cathode ray tube (CRT) electron beam, as well as synchronization signals for horizontal and vertical sync (See How Television Works for details.). Let's say that the screen is refreshing at 60 frames per second. This means that the graphics card scans the entire memory array 1 bit at a time and does this 60 times per second. It sends signals to the monitor for each pixel on each line, and then sends a horizontal sync pulse; it does this repeatedly for all 480 lines, and then sends a vertical sync pulse.
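The scan-out sequence described above can be sketched as a loop. The `emit_*` functions are stand-ins for the card's analog output stages; here they simply record each signal in a list so the counts can be checked:

```python
WIDTH, HEIGHT = 640, 480
BYTES_PER_LINE = WIDTH // 8
video_memory = bytearray(HEIGHT * BYTES_PER_LINE)  # all-black frame

signals = []                         # recorded output, illustration only

def emit_pixel(on):
    signals.append("P1" if on else "P0")

def emit_hsync():
    signals.append("HSYNC")

def emit_vsync():
    signals.append("VSYNC")

def scan_out_frame():
    """One refresh: every pixel of every line, with sync pulses."""
    for y in range(HEIGHT):
        for x in range(WIDTH):
            byte = video_memory[y * BYTES_PER_LINE + x // 8]
            emit_pixel(byte & (1 << (7 - x % 8)))
        emit_hsync()                 # end of each of the 480 lines
    emit_vsync()                     # end of the frame

scan_out_frame()
```

At a 60 Hz refresh rate the hardware repeats this entire loop 60 times every second, which is why scan-out is done by dedicated circuitry rather than the CPU.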
The basic parts of a graphics card are the computer interface, memory and video interface.
The other alternative is to use 1 byte per pixel and then use these bytes to index a Color Look-Up Table (CLUT). The CLUT contains 256 entries with 3 or 4 bytes per entry. The CLUT gets loaded with the 256 true colors that the screen will display.
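The CLUT lookup is simple to illustrate. In this sketch, each pixel byte in video memory is an index, and the card translates it to a true (R, G, B) color on the way out:

```python
# A 256-entry Color Look-Up Table, 3 bytes (R, G, B) per entry.
clut = [(0, 0, 0)] * 256          # loaded by software with 256 colors
clut[0] = (0, 0, 0)               # index 0: black
clut[1] = (255, 255, 255)         # index 1: white
clut[2] = (255, 0, 0)             # index 2: red

# Four pixels of indexed video memory, one byte per pixel:
pixel_bytes = bytes([0, 1, 2, 1])

# The card looks up each index to produce the displayed colors:
colors = [clut[index] for index in pixel_bytes]
```

The advantage is that the screen shows "true" colors while video memory stays at 1 byte per pixel; the trade-off is that only 256 distinct colors can appear at once.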
Modern graphics cards have evolved to take some or all of this load off the microprocessor. A modern card contains its own high-power central processing unit (CPU) that is optimized for graphics operations. Depending on the graphics card, this CPU will be either a graphics coprocessor or a graphics accelerator.
Think of a coprocessor as a co-worker, and an accelerator as an assistant. The coprocessor and the CPU work simultaneously, while the accelerator receives instructions from the CPU and carries them out.
In the coprocessor system, the graphics card driver software sends graphics-related tasks directly to the graphics coprocessor. The operating system sends everything else to the CPU.
With a graphics accelerator, the driver software sends everything to the computer's CPU. The CPU then directs the graphics accelerator to perform specific graphics-intensive tasks. For example, the CPU might say to the accelerator, "Draw a polygon with these three vertices," and the accelerator would do the work of painting the pixels of the polygon into video memory.
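What the accelerator does with a "draw a polygon" command can be sketched in software. This toy rasterizer (a simple point-in-triangle test over the bounding box, not how real hardware does it) paints a triangle's pixels into a frame buffer:

```python
def edge(ax, ay, bx, by, px, py):
    """Signed area test: which side of edge A->B is point P on?"""
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def draw_triangle(framebuffer, width, v0, v1, v2, color):
    """Fill the triangle v0-v1-v2 into a 1-byte-per-pixel buffer."""
    xs = [v0[0], v1[0], v2[0]]
    ys = [v0[1], v1[1], v2[1]]
    for y in range(min(ys), max(ys) + 1):
        for x in range(min(xs), max(xs) + 1):
            w0 = edge(v1[0], v1[1], v2[0], v2[1], x, y)
            w1 = edge(v2[0], v2[1], v0[0], v0[1], x, y)
            w2 = edge(v0[0], v0[1], v1[0], v1[1], x, y)
            # inside the triangle if all three edge tests agree in sign
            if (w0 >= 0 and w1 >= 0 and w2 >= 0) or \
               (w0 <= 0 and w1 <= 0 and w2 <= 0):
                framebuffer[y * width + x] = color

fb = [0] * (16 * 16)              # tiny 16x16 frame buffer
draw_triangle(fb, 16, (2, 2), (12, 3), (6, 12), 1)
```

The point of offloading is that the CPU only sends the three vertices; the inner per-pixel loops, which dominate the work, run on the accelerator instead.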
More and more complex graphics operations have moved to the graphics coprocessor or accelerator, including shading, texturing and anti-aliasing.
As graphics cards and coprocessors continue to evolve, their capabilities become more and more impressive. Modern cards can draw millions of polygons per second, making it possible to create extremely realistic games and graphics.
Graphics Card History and Standards
When IBM introduced the Video Graphics Array (VGA) in 1987, a new graphics standard came into being. A VGA display could support up to 256 colors (out of a possible 262,144-color palette) at resolutions up to 720x400. Perhaps the most interesting difference between VGA and the preceding formats is that VGA was analog, whereas displays had been digital up to that point. Going from digital to analog may seem like a step backward, but it actually provided the ability to vary the signal for more possible combinations than the strict on/off nature of digital. Of course, the way we manipulate digital display data has changed significantly since the days of CGA and EGA. Now, graphics-card manufacturers are able to provide all-digital display solutions that can support the same number of colors that analog adapters can.
Over the years, VGA gave way to Super Video Graphics Array (SVGA). SVGA cards were based on VGA, but each card manufacturer added resolutions and increased color depth in different ways. Eventually, the Video Electronics Standards Association (VESA) agreed on a standard implementation of SVGA that provided up to 16.8-million colors and 1280x1024 resolution. Most graphics cards available today support Ultra Extended Graphics Array (UXGA). UXGA can support a palette of up to 16.8-million colors and resolutions up to 1600x1200 pixels.
Graphics cards adhere to industry standards so that you can choose from a variety of cards for your PC. Even though any card you can buy today will offer higher colors and resolution than the basic VGA specification, VGA mode is the de facto standard for graphics and is the minimum on all cards. In addition to including VGA, a graphics card must be able to connect to your computer. While there are still a number of graphics cards that plug into an Industry Standard Architecture (ISA) or Peripheral Component Interconnect (PCI) slot, most current graphics cards use the Accelerated Graphics Port (AGP).