DVI vs Analog Display ??

Euro98ITR

Big Geek
So my friend just bought the ATI 9600XT 128MB card. 9700-series cards are not available here, and the 9800 is overpriced! He likes the card very much, I liked it too, and I'm thinking of getting one for myself.

So the 9600XT is currently connected to a Sony SDM-X52 TFT display via the analog display cable, but both the Sony and the 9600XT have DVI inputs/outputs.
Any advantages of using the DVI connection instead of the analog display?
 
Only if you have a digital display. Even then, it can vary from person to person. Some may notice it, others may not. With a DVI to SVGA adapter, you can run two analog monitors on it, if you so wish.
 
This question has a relatively simple answer.

CRT monitors rarely have DVI support. If your CRT monitor does have a DVI port or cable, then it obviously supports it, but DVI ports are mostly found on LCDs.

Computers are inherently digital devices, while CRT monitors, built around cathode-ray tubes, are analog. The video card takes the digital signal from the computer, converts it to analog, and outputs that data through the VGA port to your CRT. Something is always lost in the translation. If your video card supports DVI output, there is no translation to be made: the signal stays digital from the computer, and the video card simply pipes the data through the DVI connection to a DVI-capable (digital) display, usually an LCD, rarely a CRT.
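To make the conversion loss concrete, here is a rough simulation sketch of the two paths, not a real measurement: the DVI path copies pixel values unchanged, while the VGA path runs them through a DAC, a slightly noisy analog cable (the noise level here is a made-up assumption), and the LCD's ADC, which re-quantizes the signal.

```python
import numpy as np

rng = np.random.default_rng(0)

# 1000 random 8-bit pixel values coming out of the frame buffer
pixels = rng.integers(0, 256, size=1000)

# DVI path: digital end to end, nothing is converted
dvi_out = pixels.copy()

# VGA path: DAC maps 0..255 to a 0..1 "volt" level, the analog cable
# adds a little noise (std dev chosen arbitrarily for illustration),
# then the LCD's ADC rounds the level back to an 8-bit value
voltage = pixels / 255.0
noisy = voltage + rng.normal(0, 0.002, size=voltage.shape)
vga_out = np.clip(np.round(noisy * 255), 0, 255).astype(int)

print("DVI pixels changed:", np.count_nonzero(dvi_out != pixels))
print("VGA pixels changed:", np.count_nonzero(vga_out != pixels))
```

Even a tiny amount of analog noise flips some pixel values after re-quantization, while the all-digital path reproduces every value exactly, which is the point the paragraph above is making.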

So is there a difference? Yes. Is it significant? Maybe; it depends on what you are doing. From a graphic design standpoint, accurate color is important, and the color variance between analog and DVI is significant. Most graphic designers stuck with large CRT monitors during the early stages of LCD development; as LCD technology improved and DVI became the standard connection on every LCD, the transition was easy. If you're word processing or playing games, the difference might be visible in a side-by-side comparison, but a practical difference is unlikely.
 