GF8/DX10 - juicy highlights!


Apofiss
14-11-06, 16:19
Ohw! After reading this great review @ guru3d.com, I just couldn't resist posting some of the most visually juicy/practical highlights!

Geometry shader

http://img225.imageshack.us/img225/383/imageviewnk1.jpg

The surface (rock vertices) in this demo is created at random and in real time with a geometry shader. As the camera moves up, the GPU constantly calculates a new surface, endlessly and at random if you wish. This is a very good example of geometry shader usage. A small, interesting side note: the water follows the path of the surface and thus reacts with it, which is a physics model. (A rough idea of what such a shader looks like is sketched below.)
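To give a rough idea of what "creating geometry on the GPU" means in code, here is a minimal sketch, not the demo's actual shader: a Shader Model 4 geometry shader (HLSL kept in a C++ string, compiled at runtime) that expands every incoming point into a quad, so new vertices are invented entirely on the GPU. The device pointer and the quad size are my own illustrative assumptions, and error handling is omitted.

// Minimal sketch, not the demo's actual shader.
#include <d3d10.h>
#include <d3dx10.h>
#include <cstring>

const char* kGsSource = R"(
struct VSOut { float4 pos : SV_POSITION; };

[maxvertexcount(4)]                       // one point in, four vertices out
void main(point VSOut input[1], inout TriangleStream<VSOut> stream)
{
    const float s = 0.05f;                // half-size of the quad (made-up value)
    VSOut v;
    v.pos = input[0].pos + float4(-s, -s, 0, 0); stream.Append(v);
    v.pos = input[0].pos + float4(-s,  s, 0, 0); stream.Append(v);
    v.pos = input[0].pos + float4( s, -s, 0, 0); stream.Append(v);
    v.pos = input[0].pos + float4( s,  s, 0, 0); stream.Append(v);
})";

// Compile and create the shader; 'device' is an assumed existing ID3D10Device*.
ID3D10GeometryShader* CreatePointToQuadGs(ID3D10Device* device)
{
    ID3D10Blob* bytecode = nullptr;
    ID3D10Blob* errors = nullptr;
    D3DX10CompileFromMemory(kGsSource, strlen(kGsSource), nullptr,
                            nullptr, nullptr, "main", "gs_4_0",
                            0, 0, nullptr, &bytecode, &errors, nullptr);
    ID3D10GeometryShader* gs = nullptr;
    device->CreateGeometryShader(bytecode->GetBufferPointer(),
                                 bytecode->GetBufferSize(), &gs);
    return gs;
}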


Stretchy skin - geometry shader

With the introduction of unified shader technology, the industry would also have you believe that GPUs no longer have a pixel pipeline. That's true, but not entirely: we are obviously still dealing with a pixel pipeline, yet the dynamics have simply changed.

http://img225.imageshack.us/img225/9350/imageviewak0.jpg


GPU physical simulations - Stream output

*Litter and debris
*Smoke and fog that move
*Cloth and fluid that flow with objects and characters
*Large amounts of rubble (collapsing buildings, explosions, avalanches)
*Rampaging tornadoes full of debris
*Swarms of insects
*So many more possibilities!

Stream output is a very important and useful new DirectX 10 feature, as it enables data generated by geometry shaders (or vertex shaders, if geometry shaders are not used) to be sent to memory buffers and subsequently forwarded back into the top of the GPU pipeline to be processed again. Basically, what I'm saying here is that such a dataflow permits even more complex geometry processing, advanced lighting calculations and GPU-based physical simulations with little CPU involvement. You simply keep the data to be altered in the pipeline (a rough sketch of this loop follows below).

http://img225.imageshack.us/img225/8929/imageviewvg4.jpg
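To make the "feed it back into the top of the pipeline" part concrete, here is a minimal C++/Direct3D 10 sketch of the two passes. The function names, the buffer size and the layout (a single float4 position per vertex) are my own illustrative assumptions, not something from the article, and error handling is omitted.

#include <d3d10.h>

// One-time setup: a buffer that can receive stream output and later feed
// the input assembler again, plus a GS created with an SO declaration.
void SetupStreamOutPass(ID3D10Device* device, ID3D10Blob* gsBytecode,
                        ID3D10Buffer** outSoBuffer, ID3D10GeometryShader** outGs)
{
    D3D10_BUFFER_DESC desc = {};
    desc.ByteWidth = 1024 * 1024; // 1 MB scratch space (made-up size)
    desc.Usage = D3D10_USAGE_DEFAULT;
    desc.BindFlags = D3D10_BIND_VERTEX_BUFFER | D3D10_BIND_STREAM_OUTPUT;
    device->CreateBuffer(&desc, nullptr, outSoBuffer);

    // Which GS outputs get streamed out: here one float4 position.
    D3D10_SO_DECLARATION_ENTRY soDecl[] = {
        { "SV_POSITION", 0, 0, 4, 0 }, // semantic, index, start comp., count, slot
    };
    device->CreateGeometryShaderWithStreamOutput(
        gsBytecode->GetBufferPointer(), gsBytecode->GetBufferSize(),
        soDecl, 1, sizeof(float) * 4, outGs);
}

void RunOneIteration(ID3D10Device* device, ID3D10Buffer* soBuffer)
{
    // Pass A: GS results are captured into the buffer.
    UINT offset = 0;
    device->SOSetTargets(1, &soBuffer, &offset);
    // ... Draw(...) here: generated vertices land in soBuffer ...

    // Pass B: unbind SO, feed the captured vertices straight back in.
    ID3D10Buffer* nullBuf = nullptr;
    device->SOSetTargets(1, &nullBuf, &offset);
    UINT stride = sizeof(float) * 4;
    device->IASetVertexBuffers(0, 1, &soBuffer, &stride, &offset);
    device->DrawAuto(); // draws however many vertices were streamed out
}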


DX10 improvements (a bit techie here, just as a reminder)
Here you can see how DirectX's Shader Models have evolved ever since DX8's Shader Model 1.

http://img225.imageshack.us/img225/7109/imageviewyt8.jpg


The Lumenex Engine - 16x AA and AF

If a game does not natively support antialiasing, a user can select an NVIDIA driver control panel option called “Override Any Applications Setting”, which allows any control panel AA settings to be used with the game.

...But 16x quality at almost the performance of 4x: really good edges, really good performance, and that is obviously always lovely.

As for AF, how it works with GF8/DX10: sample image (http://www.guru3d.com/admin/imageview.php?image=8792)

More info HERE (http://www.guru3d.com/article/Videocards/391/1/)


Source: guru3d.com

nomedo
14-11-06, 16:47
Ohw! After reading this great review @ guru3d.com, I just couldn't resist posting some of the most visually juicy/practical highlights!


Great idea. I've been waiting for someone to bring all this up. DirectX 10 is going to be really cool.
And there are also some more great things the GF8 can do that aren't bound to DX10, like the angle-independent AF, the new "Q" antialiasing, 128-bit FP HDR and much more. I'll see if I can find where I've read about all that.
The video for the geometry shader on NVIDIA's site is really cool.

Crofty_Tomb
14-11-06, 17:44
This is very stunning. What's the name of the game? It looks interesting. :)

Cochrane
14-11-06, 18:24
That's no game, that is a technology demo.

I agree that the possibilities of these new geometry shaders are amazing. By the way, they are not exclusive to DirectX 10: Nvidia just released a bunch of OpenGL extensions that allow you to use this power with OpenGL 2.1 (even and especially on Windows XP, by the way), too.

Apofiss
14-11-06, 18:35
By the way, they are not exclusive to DirectX 10: Nvidia just released a bunch of OpenGL extensions that allow you to use this power with OpenGL 2.1 (even and especially on Windows XP, by the way), too.

Indeed... great news for those who have never actually been into the DirectX API :D

only Croft
14-11-06, 18:52
My God, this card is amazing. But like everything in life, unfortunately, it is expensive -_-

Mict
14-11-06, 19:53
But will it run Tomb Raider 1?

only Croft
14-11-06, 20:13
Sure, this card can run every game on the planet right now :D.
Although there is a bug (not sure if it is in TR1) where the game goes too fast on high-end graphics cards, there is a way to override that problem (the classic fix is a frame limiter; see the sketch below).
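For the curious, the "too fast" bug usually happens because old games advance their logic once per rendered frame, so a faster card means a faster game. A frame limiter is the classic workaround. Here is a minimal C++ sketch; the 30 FPS budget and the stubbed-out functions are made up for illustration and have nothing to do with TR1's actual code.

#include <chrono>
#include <thread>

void GameLoop()
{
    using clock = std::chrono::steady_clock;
    const auto frameBudget = std::chrono::microseconds(1000000 / 30); // 30 FPS cap

    while (true)
    {
        auto start = clock::now();

        // UpdateGame();  // one fixed logic step per frame
        // RenderFrame(); // a GF8800 finishes this almost instantly

        // Sleep away whatever is left of the frame budget, so the
        // logic rate stays at 30 steps/second on any card.
        auto elapsed = clock::now() - start;
        if (elapsed < frameBudget)
            std::this_thread::sleep_for(frameBudget - elapsed);
    }
}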

Cochrane
14-11-06, 20:28
The card does not offer Glide support, which TR1 uses, so from a technical point of view it does not run (or accelerate) TR1. Of course, since there are translation tools like dgVoodoo that translate Glide to OpenGL (Glide is essentially a stripped-down derivative of OpenGL), it will run like any OpenGL game.

abraham
15-11-06, 07:05
GREAT, THIS IS GREAT FOR 3D DEVELOPMENT!!! CAN'T WAIT! Go go BlueWorld Interactive :jmp:

Joseph
15-11-06, 23:08
Thanks for posting, Apofiss. :) This stuff about DirectX 10 is all very impressive! :tmb:
And the GF8800GTX is a true monster:

One of the most heated issues with the previous generation of products, as opposed to the competition, was the fact that NVIDIA graphics cards could not render AA + HDR at the same time. Well, that was not entirely true, though, as it was possible with the help of shaders, as exactly four games have demonstrated. But it was a far from efficient method; a very far cry (Ed: please, no more puns!) you might say.

So what if I were to say that now you can not only push 16xAA with a single G80 graphics card, but also do full 128-bit FP (floating point) HDR! To give you a clue, the previous architecture could not do HDR + AA, but it could technically do 64-bit HDR (just like the Radeons). So NVIDIA got a good wake-up call and noticed that a lot of people were buying ATI cards just so they could do HDR & AA the way it was intended. Now the G80 will do the same, but even better. Look at 128-bit wide HDR as a palette of brightness/color range that is just amazing. Obviously we'll see this in games as soon as they adopt it, and believe me, they will. 128-bit precision (32-bit floating point values per component) permits almost real-life lighting and shadows. Dark objects can appear extremely dark, and bright objects can be exhaustingly bright, with visible details present at both extremes, in addition to rendering completely smooth gradients in between.

As stated: HDR lighting effects can now be used together with multisampled antialiasing on GeForce 8 Series GPUs, alongside the addition of angle-independent anisotropic filtering. The antialiasing can be used in conjunction with both FP16 (64-bit color) and FP32 (128-bit color) render targets (a rough sketch of what that looks like in code follows below).
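For the technically inclined, here is a small sketch, assuming an existing Direct3D 10 device, of what "MSAA on an FP16 render target" amounts to in code; the resolution and sample count are illustrative choices of mine, not from the article.

#include <d3d10.h>

ID3D10Texture2D* CreateHdrAaTarget(ID3D10Device* device)
{
    // Ask how many quality levels 4x MSAA has for the FP16 format; a
    // result of 0 would mean the combination is unsupported.
    UINT quality = 0;
    device->CheckMultisampleQualityLevels(
        DXGI_FORMAT_R16G16B16A16_FLOAT, 4, &quality);
    if (quality == 0)
        return nullptr;

    D3D10_TEXTURE2D_DESC desc = {};
    desc.Width = 1920;
    desc.Height = 1200;
    desc.MipLevels = 1;
    desc.ArraySize = 1;
    desc.Format = DXGI_FORMAT_R16G16B16A16_FLOAT; // FP16 per channel = 64-bit HDR
    // For 128-bit HDR you would use DXGI_FORMAT_R32G32B32A32_FLOAT instead.
    desc.SampleDesc.Count = 4;           // 4x multisampling together with HDR
    desc.SampleDesc.Quality = quality - 1;
    desc.Usage = D3D10_USAGE_DEFAULT;
    desc.BindFlags = D3D10_BIND_RENDER_TARGET;

    ID3D10Texture2D* tex = nullptr;
    device->CreateTexture2D(&desc, nullptr, &tex);
    return tex;
}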

Improved texture quality is something I MUST mention. We have all been complaining about shimmering effects and lower filtering quality than the Radeons; it's a thing of the past. NVIDIA added raw horsepower for texture filtering, making it really darn good, and in fact claims it's even better than currently the most expensive team red product (X1950 XTX). Well... we can test that!

Allow me to show you. See, I have this little tool called D3D AF Tester, which helps me determine image quality in terms of anisotropic filtering. So basically, we knew that ATI has always been better at IQ compared to NVIDIA.

http://www.jozefdekkers.nl/Trforum/Computer/iq1.jpg
http://www.jozefdekkers.nl/Trforum/Computer/iq2.jpg
http://www.jozefdekkers.nl/Trforum/Computer/iq3.jpg

GeForce 7900 GTX 16xAF (HQ) *** Radeon X1900 XTX 16x HQ AF *** GeForce 8800 GTX 16xAF Default


Now have a look at the images above and let it sink in. It would go too far to explain everything you are looking at, but the rounder the colored circle in the middle, the better the image quality. A perfectly round circle is perfect IQ.

Btw, look at the image quality circle of the Radeon X1900 XTX 16x HQ AF and understand why I love my card! I'm very interested in what ATI will come up with! :)