Newegg is top notch for me. I have not heard of anyone having problems with them other than when it was their own fault (entered their credit card information incorrectly and the order was canceled). My computer is completely built from parts I purchased on Newegg. Also, my friend owns a networking company and we use Newegg for the smaller purchases, because sometimes we hands down cannot beat their prices with our suppliers. I wish they had business terms, though, so we could buy on credit.
I have similar good things to say about Newegg. I do have business terms with them, and their service to their customers is impeccable. I still recommend the 9800 Pro over all the Nvidia cards, because it's just faster right now. Plus, it is only $212 (on Newegg!). Good luck with whatever you get!
So.... does the ATI 9800 Pro have the 32-bit shaders? I found a really good buy on an ATI 9800 Pro. Hurry! Answer me ASAP! I'm buying one tomorrow!
Thanks.
*After a drive-by shooting*
"Hey Lois, why don't we move to England? The worst they have there are drive-by arguments."
*England*
"Hey Neville, isn't that Stuart?" "I believe it is."
*Drives by*
"Hey Stuart... I DISAGREE!" *Drives away*
-Family Guy
NVIDIA GeForce FX 5900 XT it is then. (So if an ATI card played a game with 32-bit shaders, it wouldn't work? Because I heard that NVIDIA doesn't do DX9 as well as ATI.)
Nvidia cards are slower because they render shaders at 32-bit precision, while ATI uses simpler 24-bit precision. More bits are better for image quality, but not for speed.
You can't tell the difference in image quality by eye, by the way... so there's nothing wrong with ATI cards now or in the near future.
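To make the precision point concrete, here's a minimal sketch. It's plain C++, not actual shader code, and the bit counts are just the mantissa widths of the fp16/fp24/fp32 formats those cards used; it shows how much of a value survives when you keep fewer mantissa bits:

```cpp
// Emulate rounding a value to a given number of floating-point mantissa
// bits: fp16 keeps 10, ATI's 24-bit format keeps 16, full fp32 keeps 23.
#include <cmath>
#include <cstdio>

// Round x to 'bits' mantissa bits (assumes a normalized, non-zero input).
double quantize(double x, int bits) {
    int exp;
    double mant = std::frexp(x, &exp);        // x = mant * 2^exp, mant in [0.5, 1)
    double scale = std::ldexp(1.0, bits);     // 2^bits
    mant = std::round(mant * scale) / scale;  // keep only 'bits' bits
    return std::ldexp(mant, exp);
}

int main() {
    double x = 1.0 / 3.0;  // a value with no exact binary representation
    std::printf("fp16-style (10 bits): %.9f\n", quantize(x, 10));
    std::printf("24-bit     (16 bits): %.9f\n", quantize(x, 16));
    std::printf("fp32       (23 bits): %.9f\n", quantize(x, 23));
    return 0;
}
```

The error per pixel is tiny either way, which is why you can't see the difference on screen; it only starts to matter when many shader operations are chained together.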
A lot of confusion arose when Nvidia "cheated" in the well-known 3DMark test. Nvidia wrote special drivers that recognised the 3DMark program and would switch to even simpler 16-bit rendering for greater speed. The 3DMark developers discovered it and made it public, dealing Nvidia a tremendous negative PR blow. 3DMark was the leading graphics card benchmark program, with a lot of influence on the people and companies that sell computer components and systems. 3DMark was not neutral and favored ATI all along, so one might ask whether business interests were involved (ATI probably paid the 3DMark developers to make ATI cards look much better).
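The detection trick itself is simple, for what it's worth. A sketch of the general idea below (Windows-only, illustration only): match the name of the running executable. The "3dmark" string and the message are placeholders, not anything a real driver shipped:

```cpp
// Illustrative only: the kind of check a driver could use to special-case
// a known benchmark executable by name.
#include <windows.h>
#include <cctype>
#include <cstring>
#include <cstdio>

bool looks_like_benchmark() {
    char path[MAX_PATH] = {0};
    GetModuleFileNameA(NULL, path, MAX_PATH);     // path of the running exe
    for (char* p = path; *p; ++p)
        *p = (char)std::tolower((unsigned char)*p);
    return std::strstr(path, "3dmark") != NULL;   // placeholder match string
}

int main() {
    if (looks_like_benchmark())
        std::printf("detected: a cheating driver could drop precision here\n");
    else
        std::printf("normal rendering path\n");
    return 0;
}
```

This is also why testers caught such cheats by simply renaming the benchmark executable and watching the scores change.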
This fight and conflict of interests caused influential, well-known hardware test resource sites like Tomshardware.com to (partly) abandon 3DMark as their most important benchmark program.
They now use games for that instead, like the Unreal2k3 engine, the QuakeIII engine and other important engines of current games. After all, for expensive, powerful 3D cards you want to know how well they run a game, as that is almost their only purpose. Office machines do not need 3D capability at all (but often have it anyway; there are no more pure "2D" cards).
It's nice to know Tomshardware.com tested the BF1942 engine too, with Secret Weapons of WWII. I posted those results for almost all cards on the market in the BF1942 engine.
HL2 is an interesting case... both companies have already optimised their drivers to run that game as fast as possible, because when released it will be the newest and most influential game engine there is.
Depending on how you look at it, optimising drivers for one game engine is "cheating" too, because other games will not benefit from the speed gained in that particular game.
So when in doubt about which card to buy, check how a card does in several games, and don't rely blindly on an HL2 benchmark or a 3DMark benchmark.
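If you do collect numbers from several games, a geometric mean is a reasonable way to combine them into one figure. A little sketch, using made-up titles and frame rates:

```cpp
// Aggregate benchmark results from several games instead of trusting one
// benchmark. All numbers here are hypothetical.
#include <cmath>
#include <cstdio>

int main() {
    const char* games[] = {"UT2003", "Quake III", "BF1942"};
    double fps[]        = {142.0,    210.0,       88.0};
    int n = sizeof(fps) / sizeof(fps[0]);

    double log_sum = 0.0;
    for (int i = 0; i < n; ++i) {
        std::printf("%-10s %6.1f fps\n", games[i], fps[i]);
        log_sum += std::log(fps[i]);
    }
    // Geometric mean: one outlier title distorts it less than a plain average.
    std::printf("geometric mean: %.1f fps\n", std::exp(log_sum / n));
    return 0;
}
```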
Quote: "NVIDIA GeForce FX 5900 XT it is then. (So if an ATI card played a game with 32-bit shaders, it wouldn't work? Because I heard that NVIDIA doesn't do DX9 as well as ATI.)"
Hold up for a sec, and I'll tell you a tale.
It's about pixel shaders. Specifically, it's about shader version 3.0. All the rumours on the net suggest that ATI won't support this with their upcoming R420 core. nVidia, of course, is making a lot of hype about the fact that their forthcoming core, NV40, will.
What nVidia doesn't tell you is that very, very few programs out there right now even use the 2.0 shaders. Most fall back on the older 1.3 and 1.4 versions.
Even the greatly anticipated titles Doom3 and HL2 won't use 3.0 shaders, due simply to the fact that those titles are going to have to run on the existing base of Ti4xxx and Radeon 9600 series cards.
Make no mistake; 3.0 shaders will matter... eighteen months from now, just in time for R450 and NV45. Simply put, shader 3.0 support doesn't mean squat--it's like having a car that can do 500mph, but doesn't have a steering wheel with which to turn.
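For what it's worth, this is roughly how a Direct3D 9 game of that era picks its shader path at startup: query the device caps and fall back. A minimal sketch (Windows only, error handling trimmed for brevity):

```cpp
// Query the D3D9 device caps and choose a pixel shader path accordingly.
#include <d3d9.h>
#include <cstdio>
#pragma comment(lib, "d3d9.lib")

int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DCAPS9 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    if (caps.PixelShaderVersion >= D3DPS_VERSION(3, 0))
        std::printf("ps_3_0 path\n");              // NV40-class hardware
    else if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        std::printf("ps_2_0 path\n");              // R3xx / NV3x class
    else
        std::printf("ps_1_3/1_4 fallback path\n"); // Ti4xxx class and older

    d3d->Release();
    return 0;
}
```

That fallback branch is exactly why Doom3 and HL2 can't require 3.0 shaders: the installed base sits in the bottom two branches.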