VGA video card outputs

VGA, which stands for Video Graphics Array, is the long-used video standard introduced by IBM in 1987. VGA was originally the name of the main chip on IBM's original VGA video card. Over time, VGA has become a more general term referring to a particular video card standard which includes the VGA output connector. The VGA connector is still commonly used but is in the process of being supplanted by the newer DVI connector. The VGA connector is sometimes called an HD-15 connector or a D-sub 15 connector.

VGA connector

The blue connector shown above is the VGA connector. Unlike DVI, there is only one kind of VGA connector to worry about. All VGA monitors can be plugged into all VGA connectors. All you have to do is make sure that the video card can set a resolution and refresh rate which the monitor can display.

The VGA standard transmits the image from the video card to the monitor using an analog format. The fact that it's analog means that the image can degrade while getting from the video card to the monitor. The higher the image resolution and refresh rate, the more that the image degrades. VGA works by storing its image in video RAM as digital data. The video card fetches the data from video RAM and converts it to analog form. The analog form of the data travels through the cable to the monitor. The analog data in the cable can be degraded by noise and crosstalk.
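The digital-to-analog-and-back path described above can be sketched in code. This is a toy model with assumed numbers (8-bit color values, the nominal 0 to 0.7 volt VGA signal range, a made-up noise figure), not the actual VGA electrical spec, but it shows how cable noise turns into wrong pixel values at the receiving end:

```python
import random

# Toy model of one color channel of a VGA link. The 0.0-0.7 V range is
# the nominal VGA analog signal level; the 5 mV noise figure is an
# arbitrary assumption for illustration.

def dac(value_8bit):
    """Video card's DAC: 8-bit value in video RAM -> analog voltage."""
    return value_8bit / 255 * 0.7

def cable(voltage, noise_mv=5.0):
    """Analog cable: the signal picks up random noise and crosstalk."""
    return voltage + random.uniform(-noise_mv, noise_mv) / 1000

def adc(voltage):
    """Monitor samples the voltage back into an 8-bit value."""
    return max(0, min(255, round(voltage / 0.7 * 255)))

random.seed(1)
original = [0, 64, 128, 192, 255]        # pixel values in video RAM
received = [adc(cable(dac(v))) for v in original]
errors = [r - o for o, r in zip(original, received)]
print("sent:    ", original)
print("received:", received)
print("errors:  ", errors)
```

With one 8-bit step equal to roughly 2.7 mV in this model, even a few millivolts of noise is enough to shift a received pixel value, which is why the analog image degrades.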

VGA cables

Cheap VGA monitor cables can usually display a 1024 x 768 image at 75 hertz with minimal degradation. But using higher screen resolutions or refresh rates with a cheap cable can cause the image to deteriorate noticeably. The VGA cable on the left is a high-quality cable and the one on the right is not. Low-quality VGA cables tend to be thin and use twisted-pair wires for the image color component signals. High-quality cables are thicker and use thin coaxial cables for the color components. The good cables usually mention "coax" or "coaxial" whereas the cheaper cables made with twisted pairs usually don't mention the type of cable. High-quality cables also have ferrites at the ends to reduce electrical and radio noise. The ferrites are the cylindrical objects near the ends of the cable. The longer the VGA cable, the more the image quality deteriorates. So it's especially important to use a high-quality cable if you're going beyond 6 feet (2 meters).
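A back-of-the-envelope calculation shows why cable quality matters more at higher resolutions and refresh rates: the pixel clock, and hence the signal bandwidth the cable must carry cleanly, grows with both. The 25% blanking overhead below is a rough assumption for illustration, not an exact VESA timing figure:

```python
# Rough estimate of the analog pixel clock for a display mode. Real
# VESA timings differ somewhat; the 25% blanking overhead is an
# assumed approximation.

def approx_pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=0.25):
    """Approximate pixel clock in MHz for a given display mode."""
    total_pixels = width * height * (1 + blanking_overhead)
    return total_pixels * refresh_hz / 1e6

for mode in [(640, 480, 60), (1024, 768, 75), (1600, 1200, 85)]:
    mhz = approx_pixel_clock_mhz(*mode)
    print(f"{mode[0]}x{mode[1]} @ {mode[2]} Hz -> ~{mhz:.0f} MHz pixel clock")
```

Going from 1024 x 768 at 75 hertz to 1600 x 1200 at 85 hertz roughly triples the bandwidth the cable has to carry, so losses that were invisible on a cheap cable at the lower mode can become obvious at the higher one.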

VGA converts the digital image in video RAM into analog signals in the cable. If the monitor is an LCD panel then the monitor converts the analog signals back into digital form before displaying the image. So if you're using a VGA output to drive an LCD panel, the image is converted from digital to analog in the video card, transmitted through the cable in analog form, and then converted from analog back to digital in the monitor. Some image quality is lost in those conversions as well as in the analog transmission through the cable. The DVI standard leaves the image in digital form so it avoids the two unnecessary conversions between analog and digital. VGA is fine for connecting video cards to old CRT monitors. But for LCD panels you may get better results using the newer DVI standard.
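The advantage of keeping the image digital can be illustrated with another toy model (assumed voltages and noise levels, not the actual DVI/TMDS protocol): over an analog link, any noise shifts the received value, while over a digital link, each bit is a high or low voltage and small noise is simply thresholded away:

```python
import random

# Toy comparison of an analog pixel link vs a digital one. The 0.7 V
# analog level, 3.3 V logic level, and 10 mV noise are assumptions
# chosen for illustration only.

NOISE_V = 0.01  # +/- 10 mV of cable noise

def noisy(voltage):
    return voltage + random.uniform(-NOISE_V, NOISE_V)

def analog_link(value):
    """Send an 8-bit value as one analog voltage (0 to 0.7 V)."""
    v = noisy(value / 255 * 0.7)
    return max(0, min(255, round(v / 0.7 * 255)))

def digital_link(value):
    """Send the same value as 8 separate high/low (3.3 V / 0 V) bits."""
    bits = [(value >> i) & 1 for i in range(8)]
    received = [1 if noisy(3.3 * b) > 1.65 else 0 for b in bits]
    return sum(b << i for i, b in enumerate(received))

random.seed(0)
pixels = list(range(0, 256, 8))
analog_errs = sum(analog_link(p) != p for p in pixels)
digital_errs = sum(digital_link(p) != p for p in pixels)
print(f"analog errors: {analog_errs}/{len(pixels)}, "
      f"digital errors: {digital_errs}/{len(pixels)}")
```

In this sketch, 10 mV of noise corrupts many analog pixels because one 8-bit step is only a few millivolts, while the digital link delivers every value intact because the noise never comes close to the decision threshold.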