It's about pixel shaders. Specifically, it's about shader version 3.0. All the rumours on the net suggest that ATI won't support it with their upcoming R420 core. nVidia, of course, is hyping the fact that their forthcoming core, NV40, will.
What nVidia doesn't tell you is that very, very few programs out there right now even use the existing 2.0 shaders; most fall back on the older 1.3 and 1.4 versions.
Even the greatly anticipated titles Doom3 and HL2 won't use 3.0 shaders, simply because those titles are going to have to run on the existing base of Ti4xxx and Radeon 9600 series cards.
Make no mistake; 3.0 shaders will matter... eighteen months from now, just in time for R450 and NV45. Simply put, shader 3.0 support right now doesn't mean squat: it's like having a car that can do 500 mph but has no steering wheel to turn it with.
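To make the fallback point concrete, here's a minimal sketch (my own illustration, not code from any of these games) of how a Direct3D 9 title typically picks a pixel shader profile at startup based on what the installed card reports; the helper name and the exact profile choices are assumptions.

#include <d3d9.h>
#include <string>

// Hypothetical helper: choose the pixel shader profile to compile for,
// based on the capabilities the card reports through Direct3D 9.
std::string ChoosePixelShaderProfile(IDirect3D9* d3d, UINT adapter)
{
    D3DCAPS9 caps = {};
    if (FAILED(d3d->GetDeviceCaps(adapter, D3DDEVTYPE_HAL, &caps)))
        return "fixed_function";                       // be conservative on failure

    const DWORD ver = caps.PixelShaderVersion;         // packed major/minor version
    if (ver >= D3DPS_VERSION(3, 0)) return "ps_3_0";   // NV40-class hardware and later
    if (ver >= D3DPS_VERSION(2, 0)) return "ps_2_0";   // Radeon 9500+ / GeForce FX
    if (ver >= D3DPS_VERSION(1, 4)) return "ps_1_4";   // Radeon 8500 class
    if (ver >= D3DPS_VERSION(1, 3)) return "ps_1_3";   // GeForce4 Ti class
    return "fixed_function";                           // no usable pixel shaders

}

The point is that the 3.0 branch only pays off once a meaningful share of installed cards can take it; until then, games ship and test the older paths.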
Oh, then I should get ATI. I thought they just wouldn't work in the near future, like they wouldn't function in games.
*After a drive-by shooting*
"Hey Lois, why don't we move to England? The worst they have there are drive-by arguments,"
*England*
"Hey Neville, isn't that Stuart?" " I believe it is"
*Drives by*
"Hey Stuart...I DISAGREE!" *Drives away*
-Family Guy
By the time the new shader versions find their way into new games and are actually used by your card, that card will be well over a year old, too slow and obsolete again, and far surpassed by newer cards...
So supposed future kewl rendering methods, shader versions, blah blah blah, will never be useful for currently sold cards. It has always been that way.
When you want a card now, take the best card there is now... Buying a card with future technologies that will not find their way into new games for well over two years (that is at least how long it takes to design a new game engine) is really expensive and a waste of money.
Some 3D technologies were introduced in the past with a lot of publicity and were never even used by game programmers. It takes a lot of time to make your engine use new 3D card technologies, and that will only be done if the effects really cannot be left out, or cannot be achieved with more conventional coding.
Bump mapping was introduced years ago... and only now are there games that actually use it on a more common scale. If you bought the first really expensive top-notch card that featured bump mapping support, it will be very old, slow and outdated now that the technology is finally common in games...
Oh, then I should get ATI. I thought they just wouldn't work in the near future, like they wouldn't function in games.
For functionality's sake, yes. And while I'm no expert, I doubt that they would cease to work entirely, even if every game exclusively used 3.0 shaders; you would simply see a degradation of performance.
For the record, I'd actually like to revise a previous figure: I'd expect to see games seriously begin utilising 3.0 shaders within the next 12-18 months. During the interim, however, both ATI and nVidia will likely release two new cores each (the first within the next 6-8 weeks, the second on the eve of 3.0 shaders coming into use; ATI will likely be first, and nVidia will release one simply to "keep up with the Joneses").
Nope.
Some games wouldn't even play.
Try to play Deus Ex 2 or Prince of Persia with an MX card; it won't work.
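For illustration, here's a rough sketch (not either game's actual code, and the function name is made up) of the kind of startup check that locks those cards out; as far as I know, the GeForce4 MX line reports no pixel shader support at all through Direct3D, so a test like this fails:

#include <d3d9.h>

// Sketch of a minimum-requirements check: an MX card reports
// PixelShaderVersion 0.0 and fails it, so the game refuses to start.
bool MeetsMinimumShaderRequirement(IDirect3D9* d3d)
{
    D3DCAPS9 caps = {};
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        return false;

    // Shader-heavy games of this era want at least ps_1_1-class hardware.
    return caps.PixelShaderVersion >= D3DPS_VERSION(1, 1);
}

If that returns false, you get the "your video card does not meet the minimum requirements" message instead of a game.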
"Ask them, US and Brits didn’t do squat in WW2, it was won by the Soviet all by themselves versus the entire Germans Army and they did it while standing on one leg and with a arm tied behind their back. Western Europe, North Africa, Italy were all side shows" -unknown poster
Just need to clear up some misinformation in this thread.
The real reason GeForce FX cards rarely use 32-bit shaders is that they run them so slowly as to be almost unusable. Developers are forced to use 16-bit shaders if they want their games to run at a decent speed on FX cards.
Nope.
Some games wouldn't even play.
Try to play Deus Ex 2 or Prince of Persia with an MX card; it won't work.
nVidia's MX variants? Those are based on the GeForce2...
Menzo~ Do you happen to know why nVidia's 32-bit shaders are so slow? Is it due to poor design on their part, or a lack of codification or something else entirely?
Just need to clear up some misinformation in this thread.
The real reason GeForce FX cards rarely use 32-bit shaders is that they run them so slowly as to be almost unusable. Developers are forced to use 16-bit shaders if they want their games to run at a decent speed on FX cards.
You've heard something wrong there.
The 32-bit shaders work fine on FX.
It's the 24-bit shaders. As soon as a game uses them, the FX card gets the problems that you mentioned. That's also the reason for their current performance disadvantage against ATI. In that case they can either run them only in software (since the hardware cannot do it) and lose speed significantly, or switch back to 16-bit shaders.
As I said, if game developers used 32-bit shaders, ATI would be at a disadvantage.
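For anyone wondering what "16-bit vs 32-bit shaders" actually looks like to a developer: it mostly comes down to a choice of data type in the shader source. A minimal sketch (my own, not from any shipping game; the names are made up) with two HLSL variants embedded as C++ strings:

// Sketch: the same trivial pixel shader written with full-precision
// and reduced-precision types.
const char* kFullPrecisionPS =
    "sampler2D diffuseTex : register(s0);                      \n"
    "float4 main(float2 uv : TEXCOORD0) : COLOR                \n"
    "{                                                         \n"
    "    float4 c = tex2D(diffuseTex, uv);  // full precision  \n"
    "    return c * float4(0.5, 0.5, 0.5, 1.0);                \n"
    "}                                                         \n";

const char* kPartialPrecisionPS =
    "sampler2D diffuseTex : register(s0);                      \n"
    "float4 main(float2 uv : TEXCOORD0) : COLOR                \n"
    "{                                                         \n"
    "    half4 c = tex2D(diffuseTex, uv);   // 16-bit allowed  \n"
    "    return c * half4(0.5, 0.5, 0.5, 1.0);                 \n"
    "}                                                         \n";

Under Direct3D 9's shader model 2.0, "float" only guarantees at least 24-bit precision and "half" lets the driver drop to 16-bit; how a given chip actually executes each of them is exactly what's being argued about above.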
"I want a medium size with bacon and extra cheese."