I shamelessly stole this off the web....LOL
VGA:
http://en.wikipedia.org/wiki/VGA
DVI:
http://en.wikipedia.org/wiki/Dvi
The main difference is that VGA is an analog standard for computer
monitors, first marketed by IBM in 1987, whereas DVI is a newer
digital standard (introduced in 1999) that can deliver a noticeably
better picture on flat-panel displays, since the signal stays digital
all the way from the graphics card to the screen.
Here are two relevant excerpts from Wikipedia for your convenience:
"Existing standards, such as VGA, are analog and designed for CRT
based devices. As the source transmits each horizontal line of the
image, it varies its output voltage to represent the desired
brightness. In a CRT device, this is used to vary the intensity of the
scanning beam as it moves across the screen. However, in digital
displays, instead of a scanning beam there is an array of pixels and a
single brightness value must be chosen for each. The decoder does this
by sampling the voltage of the input signal at regular intervals. When
the source is also a digital device (such as a computer), this can
lead to distortion if the samples are not taken at the centre of each
pixel, and in general the crosstalk between adjacent pixels is high."
SOURCE:
http://en.wikipedia.org/wiki/Dvi
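To make the sampling problem above concrete, here is a tiny Python sketch. It is purely illustrative, not a real VGA decoder; the pixel values, sample counts, and function names are all made up. It models one scan line as a stepped "voltage" waveform and shows how sampling away from the pixel centre picks up a neighbouring pixel's value:

def encode_analog(pixels, samples_per_pixel=10):
    # Each pixel's brightness is held for a fixed stretch of the waveform.
    waveform = []
    for level in pixels:
        waveform.extend([level] * samples_per_pixel)
    return waveform

def decode_analog(waveform, num_pixels, samples_per_pixel=10, phase=0):
    # Sample once per pixel; "phase" shifts the sampling point away from
    # the pixel centre, as a badly adjusted monitor would.
    centre = samples_per_pixel // 2
    decoded = []
    for i in range(num_pixels):
        idx = min(i * samples_per_pixel + centre + phase, len(waveform) - 1)
        decoded.append(waveform[idx])
    return decoded

row = [0, 255, 0, 255, 0, 255]            # alternating black/white pixels
signal = encode_analog(row)

print(decode_analog(signal, len(row), phase=0))  # centred samples: row comes back intact
print(decode_analog(signal, len(row), phase=6))  # off-centre: samples land in neighbouring pixels

With phase=0 the original pattern comes back exactly; push the sampling point far enough off centre and the decoder reads the wrong pixels, which is the kind of distortion the excerpt describes (real hardware adds crosstalk and noise on top of this).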
"DVI takes a different approach. The desired brightness of the pixels
is transmitted as a list of binary numbers. When the display is driven
at its native resolution, all it has to do is read each number and
apply that brightness to the appropriate pixel. In this way, each
pixel in the output buffer of the source device corresponds directly
to one pixel in the display device, whereas with an analog signal the
appearance of each pixel may be affected by its adjacent pixels as
well as by electrical noise and other forms of analog distortion."
SOURCE:
http://en.wikipedia.org/wiki/Dvi
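For contrast, here is a minimal sketch of the digital path the second excerpt describes (again just an illustration with made-up names): each brightness value travels as a number and is applied to exactly one pixel, so there is no sampling step that can land in the wrong place.

def transmit_digital(pixels):
    # Each pixel's brightness goes over the link as a number
    # (standing in here for the binary values DVI actually sends).
    return list(pixels)

def apply_to_panel(received):
    # At the panel's native resolution every received value drives
    # exactly one pixel, a straight one-to-one copy.
    return list(received)

row = [0, 255, 0, 255, 0, 255]
assert apply_to_panel(transmit_digital(row)) == row   # pixel-for-pixel identical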