View Full Version : Object Clipping
Is there a way to avoid or get rid of this clipping? Up close it looks fine, but from a distance it is awful, and I can't see any way to get rid of it. :(
How do you manage this?
Here is a pic. Of course in-game it is much worse.
Well, you could export it using StrPix to a DXF file. Then use your favourite 3D program to edit it.
The only thing is, it's likely that you'll have to retexture it yourself after that.
EDIT: Or, if there is a mesh behind the clipping, you can texture the clipping faces with the transparent texture (the small purple ones that are invisible in game) and texture the mesh behind it instead. That would be the fastest way.
It's related to your graphics card drivers. At the time of release it was fine on my GeForce III, which I still have. At 32-bit colour depth I don't have this clipping, but at 16-bit the clipping returns.
It's really sad that it exists, but on today's drivers and cards you get this clipping for free :( Unless someone with experience with drivers or this engine can find and change some settings, this clipping can't be solved.
I don't see what the problem is...?
The problem only appears in-game. It can't really be seen on screenshots.
What is it - is it when walls appear to come through objects, and vertices seem to poke through each other?
Yes, here is an example I made some time ago. The clean two-second shot is on my GeForce III, where all was fine. The flickering shots are from my current GeForce 7800 GTX.
This is what GodOfLight just told me on MSN:
"I don't think that it has anything to do with drivers as Piega says... at least not in my case. I have the same computer hooked up to two monitors. One is a flatscreen monitor, the other an old box-shaped one (no idea what those are called... we'll call it the box for now).
On the flatscreen monitor, no clipping or object flickering happens, no matter how badly the faces are aligned. Yet, when playing the exact same level on the box monitor, the horrible flicker and clipping happens. It is always the same computer and the same drivers, only different monitors. Whether this goes hand in hand with or against Piega's theory, I do not know; all I know is that this is the case for me.
I know nothing of drivers and graphics cards etc. All I know is that depending on which monitor I view my level on, the flicker effect changes."
Well, this example looks the same on my flatscreen monitor and my older CRT monitor. It definitely has something to do with drivers:
GeForce III in 16-bit = clipping
GeForce III in 32-bit = no clipping
GeForce 7800 in 16- and 32-bit = clipping
In some cases it could be monitor related, but I don't believe that. A monitor receives 2D data, not 3D. A graphics card translates 3D to 2D, nothing more. When I go back to older drivers on my GeForce III, I have no clipping.
All members with the latest hardware have clipping. Some don't even know that it was never meant to be this way. Tomb Raider is a very old engine and the fault lies there. I guess it could be solved in the engine itself or with a slightly adjusted driver. We live in DirectX 10 times these days, and some older DirectX games, like versions 5 or 6, won't even run at all, even though DirectX is supposed to be backwards compatible. And we haven't even spoken about Windows yet! We must be glad we have adjusted XP patches.
So there is no real, always-working solution to this problem.
That's too bad, because it hurts the atmosphere a lot, as seen in Piega's pic. :(
I posted a request on the Nvidia forum and the DirectX game development forum, hoping some DirectX specialist can dive into the engine and find a permanent solution to this problem, so anyone can play as it was meant to be. I requested a patch. Still waiting to see if a reply ever comes...
This effect is what you get when two faces are too close to each other. When rendering 3D into a 2D image, a depth buffer is used to make sure only the frontmost pixels are shown. This buffer is the same size as the rendering buffer, and when a pixel is rendered, its distance is stored in the depth buffer. Before a pixel is rendered, its distance is compared to the distance of the pixel already rendered there. Depending on the setting (LESS or LESSEQUAL), a pixel is only rendered when it is closer, or also when it is at the same depth.
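To make the explanation above concrete, here is a minimal single-pixel sketch of that depth test. It is hypothetical and simplified (real hardware does this per fragment, in parallel), but it shows how the LESS vs. LESSEQUAL comparison decides whether a new fragment replaces the stored one:

```python
# Sketch of the per-pixel depth test described above. Hypothetical and
# simplified to a single buffer slot; not actual TR4 or Direct3D code.

def try_draw_pixel(depth_buffer, x, new_depth, func="LESS"):
    """Return True (and update the buffer) if the new fragment passes
    the depth test against what is already stored at position x."""
    stored = depth_buffer[x]
    passes = new_depth < stored if func == "LESS" else new_depth <= stored
    if passes:
        depth_buffer[x] = new_depth
    return passes

buf = [1.0]  # depth buffer initialized to the far plane
print(try_draw_pixel(buf, 0, 0.5))               # closer -> drawn: True
print(try_draw_pixel(buf, 0, 0.5, "LESS"))       # equal depth, LESS -> rejected: False
print(try_draw_pixel(buf, 0, 0.5, "LESSEQUAL"))  # equal depth, LESSEQUAL -> drawn: True
```

With LESSEQUAL, the last face drawn at a given depth wins, which is why draw order (and tiny rounding differences) can make coplanar faces flicker from frame to frame.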
This depth buffer has a limited number of values, however. Depending on how close and how far the viewing range has been set, it is possible that two faces that should be one behind the other get the same depth value. Then the game no longer knows which one to render, and because of rounding problems, the smallest change in camera position can flip which face ends up in front.
There isn't much you can do except increase the distance between faces. If no two faces are close to each other, there shouldn't be any clipping problems. I don't know what kind of depth buffer TR4 uses, but I would guess a 16-bit one. Maybe the view distance settings change it as well; I'm not sure how Core handled that, but normally the maximum view distance affects the depth buffer, since the 65536 possible depth values have to be divided over a larger range, which makes each value less accurate.
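The precision problem above can be shown with a few lines of arithmetic. This is a hedged sketch: the near/far plane distances and face positions below are made-up illustration values, not taken from the actual TR4 engine, and the depth mapping is the standard non-linear perspective one (most precision near the camera):

```python
# Sketch: why a 16-bit depth buffer causes clipping/flicker at distance.
# near/far values and face positions are hypothetical, not from TR4.

def depth_16bit(z, near=20.0, far=20480.0):
    """Map a view-space distance z to a quantized 16-bit depth value,
    using the standard perspective mapping (0 at near plane, 1 at far)."""
    d = far * (z - near) / (z * (far - near))  # non-linear in z
    return round(d * 65535)                    # quantize to 65536 buckets

# Two faces 10 units apart, close to the camera: distinct depth values.
print(depth_16bit(100), depth_16bit(110))      # different buckets

# The same 10-unit gap far from the camera: the SAME depth value,
# so the depth test can no longer tell the faces apart -> z-fighting.
print(depth_16bit(18000), depth_16bit(18010))  # identical buckets
```

This is also why a larger maximum view distance makes things worse: stretching the same 65536 buckets over a bigger range leaves even fewer distinct values for faraway faces.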
Yes, sounds interesting. But why does it not happen in 32-bit on my old system? Everything is drawn flawlessly, even from a far distance. Now even normal static stairs on a transparent sloped floor clip into the adjoining floor from a distance.
Are drivers "handling" this info in another way these days? I'm sure it can be solved; modern games do it all the time. Use the fly cheat in the mansion of Anniversary, and you'll see the chimney in the courtyard before the gym is actually a rotated column from the interior of Lara's mansion, lowered into the roof.
And in Legend there are two faces drawn in the same place. Look at the glass window of the tech room: on the right, the frame sits exactly on the column, so your theory is right :)
There is a big difference between how current games handle 3D data and how TR4 does it. Nowadays all rendering is done by customized shaders on the video card, using big streams of data. I'm not sure how TR4 renders its data, but I wouldn't be surprised if it still renders individual faces using a software vertex processor. And it could be that modern-day video card manufacturers no longer spend much time supporting those ancient ways of rendering, and therefore it can be a bit buggy.
vBulletin® v3.8.8, Copyright ©2000-2014, vBulletin Solutions, Inc.