18th July, 2002, 01:54 PM
Aedan
Chief Systems Administrator
 
Join Date: September 2001
Location: Europe
Posts: 13,075

Quote:
Originally posted by Hamzter
Turns out the rumour was true: Tom's Hardware's preview of the Radeon 9700 claims that DirectX 9.0 will very much support 128-bit colour depth. But the question is, why?
It doesn't. I should have seen this before, so I'm beginning to think I must be going stupid. On top of that, I've led everyone astray too! You're all too sheep-like!

Let me repeat: the Radeon 9700 does NOT support 128-bit colour depth. However, the Radeon 9700 DOES support 128-bit precision.

Why? Putting multiple textures over the same polygon leads to inaccuracies in the rendering, because there aren't enough bits to represent the final resultant texture accurately.
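To make that concrete, here's a toy illustration in plain C (nothing vendor-specific, and the sample values are made up): it modulates four texture samples for one colour channel, once keeping full float precision throughout, and once rounding to 8 bits after every stage, the way an 8-bit-per-channel pipeline has to.

#include <stdio.h>

int main(void)
{
    /* Four made-up texture samples for one colour channel, 0.0 to 1.0. */
    float tex[4] = { 0.80f, 0.35f, 0.60f, 0.90f };
    float exact = 1.0f;
    int coarse = 255;   /* the 8-bit pipeline starts at full intensity */
    int i;

    for (i = 0; i < 4; i++) {
        exact *= tex[i];                        /* keep full float precision */
        coarse = (int)(coarse * tex[i] + 0.5f); /* round to 8 bits each stage */
    }

    printf("float pipeline : %f\n", exact);
    printf("8-bit pipeline : %f\n", coarse / 255.0f);
    return 0;
}

The two results drift apart, because every 8-bit rounding step throws information away. Add more texture stages and the drift only gets worse.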

Textures can be represented as a set of floating point numbers, no problem there.

Doing operations (maths!) with floating point numbers requires a certain level of accuracy. Using 32 bits of floating point versus 128 bits doesn't make much difference to how big the final number can be; it makes a difference to how accurate that number is, because of the rounding errors you get when representing floating point numbers in binary.
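You can see the effect with plain IEEE floats on any C compiler (this has nothing to do with the 9700 in particular):

#include <stdio.h>

int main(void)
{
    /* A 32-bit float reaches about 3.4e38, but only carries about
       7 significant decimal digits. Past 2^24 = 16777216, adding 1
       is simply lost to rounding. */
    float f = 16777216.0f;
    double d = 16777216.0;

    printf("float : %.1f\n", f + 1.0f); /* 16777216.0 - the +1 vanished */
    printf("double: %.1f\n", d + 1.0);  /* 16777217.0 */
    return 0;
}

Same magnitude, different accuracy: the wider format keeps the detail that the narrower one rounds away.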

No, this isn't anything like going from 16-bit to 32-bit textures; that's a colour depth issue, not a floating point issue. This is much more subtle, and it points to the fact that DirectX 9 is going to support more textures in a pass.

It's all to do with floating point maths, not with colour depth!

AidanII

P.S. I'd explain floating point numbers in binary, but it's a bit complex for the "average" person.
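(If you do want a peek anyway, this little C program dumps the bit pattern of a float, assuming the usual IEEE single-precision format and a 32-bit int: 1 sign bit, 8 exponent bits, 23 mantissa bits. 0.1 has no exact binary representation, which is exactly where the rounding errors above come from.)

#include <stdio.h>
#include <string.h>

int main(void)
{
    float f = 0.1f;
    unsigned int bits;
    int i;

    /* Copy the float's raw bytes into an integer so we can shift them. */
    memcpy(&bits, &f, sizeof bits);
    for (i = 31; i >= 0; i--)
        printf("%u", (bits >> i) & 1u);
    printf("\n");   /* 0 01111011 10011001100110011001101 */
    return 0;
}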
__________________
Any views, thoughts and opinions are entirely my own. They don't necessarily represent those of my employer (BlackBerry).

Last edited by Áedán; 18th July, 2002 at 01:56 PM.