Graphics gurus, the ATI X1800 spec states "16 bit per channel floating point HDR and 10 bit per channel DVI output." I need some help understanding what this is saying. Does "16 bit per channel floating point" mean 16-bit dynamic range for each of the RGB channels in the frame buffer? Does "10 bit per channel DVI output" mean 10-bit dynamic range for each of the RGB channels on the DVI output? If yes, is 10 bits per RGB channel currently the best there is for PC graphics cards? I couldn't find this kind of info in the Nvidia card specs.
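To make sure I'm asking the right thing, here's roughly how I'm picturing the two formats in memory. This is just my own sketch; the struct and function names are made up for illustration and aren't from any ATI/Nvidia documentation:

```c
#include <stdint.h>
#include <stdio.h>

/* 16-bit floating point (half precision) per channel: each channel is an
 * IEEE 754 half-float (1 sign, 5 exponent, 10 mantissa bits), so an RGBA
 * frame-buffer pixel would be 64 bits. This is how I read "16 bit per
 * channel floating point HDR". */
struct fp16_rgba_pixel {
    uint16_t r, g, b, a;   /* each holds a half-float bit pattern */
};

/* 10 bits per RGB channel packed into one 32-bit word (2 bits unused),
 * which is how I imagine a "10 bit per channel DVI output" value. */
static uint32_t pack_rgb10(uint32_t r, uint32_t g, uint32_t b)
{
    return (r & 0x3FF) | ((g & 0x3FF) << 10) | ((b & 0x3FF) << 20);
}

int main(void)
{
    printf("FP16 RGBA pixel size: %zu bytes\n", sizeof(struct fp16_rgba_pixel));
    printf("10-bit packed RGB word: 0x%08X\n", pack_rgb10(1023, 512, 0));
    return 0;
}
```

If I have the layouts right, the question boils down to whether the card really renders into an FP16-per-channel buffer but then only drives 10 bits per channel out over DVI.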