Re: z-buffer accuracy: 16bit or 24bit?
#11896 - 02/21/03 02:17
Anonymous
Unregistered
In 32-bit mode or on a 32-bit desktop, a 3D renderer always uses a 32-bit z-buffer; on a 16-bit desktop it uses a 16-bit one.
Re: z-buffer accuracy: 16bit or 24bit?
#11900 - 02/21/03 03:43
ventilator
OP - Senior Expert
Joined: May 2002
Posts: 7,441
I read something about a bug in the nVidia drivers: they use a 16-bit z-buffer although they shouldn't, and the programmer of the 3D application has to force 32-bit in a special way to work around this bug. Could that be the reason? Or is this an OpenGL-only issue? The link: http://www6.tomshardware.com/graphic/20030127/geforce_fx-11.html
Re: z-buffer accuracy: 16bit or 24bit?
#11901 - 02/21/03 04:01
Anonymous
Unregistered
This bug affects only the OpenGL driver.
For increased precision, A5 uses nonlinear z-buffering, referred to as a "w-buffer" in some publications. Nonlinear buffering means that precision increases when you are close to an object but decreases with distance. This could happen in your case if the tree is quite far away; maybe you are using zoom here. The solution is to move closer to the object while zooming out.