Laz.e.rus April 5, 2005

I can't go 6800... too much. The 6600 GT sounds like what I'm looking for for now. That AGP link someone gave me above scared me, though: all the different slots/keys for comparable 2x/4x/8x cards. I'm not sure which I have or which the card would be. It may work fine in MOST 4x AGP boards, but not mine, or vice versa. As for the links above, Ice_Berge, thank you. That 256-bit 6600 GT was a mistype by them, though; it's listed as 128-bit below in the details. And the first link wouldn't work for me.
Guest zerodamage April 5, 2005

Quote:
"Zero, I'd half agree with what you've said. ATI's boasting higher fps with current games, using current engines. That's all well and good, but nVidia's cards, while slightly behind (we're seriously talking about 5-20 fps in a real-world situation), all have future-compatible components. The future arrived this past week with Splinter Cell: Chaos Theory, the first game specifically utilizing PS 3.0 applets. And the point behind PS engines is to make more complex programs run more efficiently. So yes, a shader effect written in 3.0 could also be done in 2.0, or maybe even 1.4, but it takes less code and will be handled more efficiently in 3.0, using fewer card resources to get the same or a greater effect."

The primary purpose of the bump to 3.0 vs. 2.0 is the nicer effects that can be done, and as has been shown, these nice effects cripple performance. HDR lighting was done in real time on the 6800 cards for Far Cry, and it looked amazing, but at about half the fps or even less compared to having it off. Splinter Cell games have always been better on Nvidia cards due to the shadowing effects being used, sort of like Doom 3 being heavily optimized for Nvidia cards. Beyond that, the current generation of cards is good no matter which company you want. The fact of the matter is that ATI has the speed over Nvidia this time around in current games and future games, as shown via full DX9 games like HL2 and those PS 3.0 effects in Far Cry. I guess the real test will be when the new Unreal engine is out and being tested. That will be a real test for this current generation of cards and the newest ones due really soon. I've recommended Nvidia cards before. Now that the X800 XL is out for around 300 dollars and performs about the same as the 6800 in most games for about 2/3 the cost, that is my recommendation.
If you want to get something on the cheap now, then look at the X700 or 6600. If you want the fastest now and for a while, even after ATI's R5xx series and Nvidia's newest come out, then pick up the X850 XT.
Norguard April 5, 2005

Laz, AGP 2.0, it says, needs to be 1.5V and 0.8V, and all cards with AGP 3.0 (8x) specifications need to be tolerant of 1.5V. The question isn't whether your board will support it, but whether the card will support 1.5V if the board can't do 0.8V. I'm betting money that the board can, though, and that the card will as well. I was right in my initial statement that, like RAM, it'd be a fluke pairing that didn't work under those circumstances. Even in that fluke occurrence, you wouldn't damage the card, any more than you'd damage DDR400 RAM that was clocked down to 333... that is, you wouldn't.

And the primary purpose of the bump from Pixel Shader 1.4 to 2.0 was the level of effects that could be produced. ATI made the upgrade wisely. nVidia's 5XXX series used a proprietary process to get PS 2.0 effects, which nobody supported... so with the card doing all of the recompiling, they took serious hits. The performance of vanilla shader apps, however, has to do with coding more than anything else. If somebody optimizes code well in 3.0, it's going to run better than in 2.0. If they code it poorly, the card has to do more work. With nicer effects comes a performance hit, but a program optimized in 2.0 will take more of a performance hit than the exact same program, at the same detail level, optimized for 3.0. VALVe is about to unveil HDR processing in Half-Life 2. It's going to be done using PS 2.0, almost assuredly, but I'll bet they did better work with the coding than CryTek did with theirs. I don't think any card today is ready for unoptimized, real-time HDR rendering of full environments. I also don't think the Unreal 3.0 tech demo would run at a playable framerate, even on two 6800 Ultra PEs in SLI.

Edited April 5, 2005 by Norguard
Laz.e.rus April 5, 2005

Thanks again. So, the 6600 GT will give me PS 3.0. It will fit in my 4x slot. And later I can either get an 8x board (doubt it) or a whole new mobo and vid card. At under $200 (and with a $50 gift cert I have), less than $150. The mobo and such we're talking probably a year on, so totally different stuff will be out then, and my 6600 GT will still be new enough to basically keep up. So that's what it boils down to. Though not the BEST way to upgrade, for less than $150 it's really the best I CAN do at the moment. Maybe in 4 months things will be better and I'll regret wasting the $150, but probably not. If it goes 8 months to a year... money well spent for keeping up with the games, at least.

EDIT: Hey! Just noticed! <==== I'm not a "NEWBIE" anymore. Now I am just with stupid.

Edited April 5, 2005 by Laz.e.rus
Cujo April 5, 2005

Going back to what I said earlier, here is something that proves my point. Notice how all the results line up nicely? That indicates a CPU bottleneck. Also, consult the rest of the article, as it sums up all the cards in the price range you are looking at.

http://www20.graphics.tomshardware.com/gra...ce_6600-33.html

I don't think it's been asked yet, but why so insistent on an upgrade when you can barely afford it? My 8500 LE with 64 MB of RAM is suiting me just fine at 8x6 with nothing on. I get 30 fps at absolute minimum, and only on Aztec; every other map is 40-120 fps. Your current card is far more powerful than mine. BTW, I'm only running at 4x as well.

Edited April 5, 2005 by [rAv]Cujo
Laz.e.rus April 5, 2005

Why? I'm ready for more. The idea is to upgrade every other model. My previous card was an Elsa GF2 Ultra 64 MB. Went to a Ti 4600. Now I'm ready for another. I WOULD be upgrading the whole thing if not for *ahem* "life changes". But I can't.
Cujo April 5, 2005

Well, if you're hellbent on upgrading, then yeah, the choice is really between a 6600/GT and a 9800 Pro/XT. Personally I'm an ATI fan, and I really only play Source, and ATI is far superior in Source. As long as the card is a 4x/8x card, it'll work. I don't think I've ever seen an 8x card that wasn't backwards compatible, but I suppose such an animal could exist. Yeah, PS 3.0 is nice, but right now it's the same as SSE3: just a marketing ploy that has the occasional benefit. On that point, it's amazing how technical everyone has made this.
Guest zerodamage April 5, 2005

Norguard, I am pleased to know of another video card buff. Glad you are around. I do not take games that are heavily optimized for Nvidia's special paths and instructions at face value; I look at the overall picture. UT2004, HL2, Far Cry, and most games dealing with pure DX9 shaders and instructions show ATI on top. I base my opinion on overall facts. I recommend ATI cards to people playing mostly HL2 and CS: Source, UT2004, etc. If they are playing mostly Doom 3, Splinter Cell, and other games that use heavy shadows, I recommend Nvidia. You cannot go wrong with either card at this point. My biggest recommendation is to get neither right now: wait a month or two until ATI's and Nvidia's newest cards come out, so we see what they can do and the prices of the current stuff drop. I picked up my X800 Pro/VIVO with 16 pipes last summer and will probably keep it until this fall, maybe next year, depending on future games and prices. I recommend that Laz take his time and go piece by piece: get a mobo, wait a bit, get a CPU, then some RAM, then the video card. Get PCI Express to future-proof your system a bit. I seriously would NOT buy an AGP card right now if I were buying brand new.
Cujo April 5, 2005

ZD and I seem to agree. BTW, ZD, it's cool to meet a fellow 16-piper. Too bad I had to RMA mine (nothing to do with the mod). I hope the new one does 16 pipes.
Guest zerodamage April 5, 2005

Quote:
"ZD and I seem to agree. BTW, ZD, it's cool to meet a fellow 16-piper. Too bad I had to RMA mine (nothing to do with the mod). I hope the new one does 16 pipes."

Mine was the Gigabyte version, which had 16 pipes enabled by default. Best card I've ever owned.
Clueless April 5, 2005

Any of the VIVO ATI versions can be flashed to 16 pipes, even if shipped as 12, correct?
Cujo April 6, 2005

The X800 Pro VIVOs can almost all be flashed unless they're hard-modded; you have to check the GPU first, as apparently some of the newer ones are. Just Google it and you'll get tonnes of hits. Mine was a Connect3D. It worked up to 530/530-ish but could not do XT PE speeds, so I just flashed it with an XT BIOS and enjoyed an X800 XT with VIVO. Source at 1024 with full AA and AF was nice and smooth. Can't wait to see what it's like on my A64.
Laz.e.rus April 6, 2005

Here's the problem with piece by piece, as you are all well aware. I buy the mobo (A64). Say a month or 2 later, I buy the CPU. 2 months later (the dusty CPU just dropped $65 from what I paid), I buy the RAM. Gonna need a new power supply! I'm not gonna bog this new baby down with these two 7500 RPM HDDs... gotta RAID, right? A couple months later... oh, that vid card. I had decided on the Nvidia 7400 GT PCIe, but the new 8XXX series is out supporting the new PCIe 2 boards, stock with 512 MB DDR4 and 32 pipes. My dusty, not-yet-used PCIe mobo would really choke that new PCIe 2 card... Oh man, I didn't even get the recommended liquid cooling yet, or the new Windows for the 64-bit processor. Wow, my old mobo and CPU would be $138 cheaper now and I haven't even put them in a case yet... Oh man, a case!... argh. Yeah, you all know.

Edited April 6, 2005 by Laz.e.rus
Cujo April 6, 2005

I never buy parts just to have them sit around; I wait till I can afford everything. IMHO that's just stupid. Who mentioned piece by piece?
Laz.e.rus April 6, 2005

Quote:
"I never buy parts just to have them sit around; I wait till I can afford everything. IMHO that's just stupid. Who mentioned piece by piece?"

Norguard, Zero, and wolfsblood implied it. Realistically, if I saved up that much at once (doubt it), I couldn't convince myself to spend it all on PC/gaming. I'd use it to replace my radiator and all 4 brakes and get a tune-up. The guilt factor would be too high. Know what I mean?
Cujo April 6, 2005

LOL, with the guilt factor in mind, how did you manage to live through spending an insane amount on an Alienware PC?? Seriously though, at this point, due to changing technologies, you are gonna have to start saving to do this upgrade. Even with a new vid card your fps will still drop when there are lots of physics calcs being done. In all honesty, and it's something you probably haven't considered, you could upgrade to an A64 with an nF3 chipset now (I recommend Socket 939), use your existing DDR333 RAM with a divider, and keep your existing AGP card. That would net you a smoother gaming experience in Source as well as give you an upgrade path for a new AGP card and a gig of DDR400.
Laz.e.rus April 6, 2005

1. When I bought it, I owned 1.5 pizza franchises and a 4-story house, etc. Now I have a 450 sq ft apt.
2. A64 mobo/CPU only... cost? Same power supply and all else? Same Windows? Anything else, and the cost?
Ice_Berge_00 April 7, 2005

Word of warning: you probably noticed this already, but the new GF and ATI cards need a power connection for the cooling fan. I think they all recommend a 400-watt power supply minimum.
Laz.e.rus April 7, 2005

Mine's 5-something; no problem there. But I'm not sure if it would work for both an Intel and an AMD board?
Ice_Berge_00 April 7, 2005

I think they're the same.
Norguard April 7, 2005

If it works for Intel, it'll work for AMD, but not necessarily the other way around. If your PSU is revision 2.0 or 2.1 compliant, then it'll work for all of the above.
Cujo April 7, 2005

His PSU probably isn't 2.0, but it'll still work. Unfortunately, with any upgrade other than another Intel you'd probably have to format. You could try a repair (an option when booting the WinXP CD). My friend does it, and it annoys the heck out of me because it means a "dirtier" install, i.e., you'd still have all your Intel drivers in the system. But it would probably still work and you'd never notice; I'm just a purist in the fresh format/install sense. Though he has done it from nF2 to nF3 to nF4, all using Nvidia and AMD, so the drivers are the same.

At this point it makes the most sense to go Socket 754, as there are very few, if any, good 939 boards with the nF3 chipset (needed for AGP).

http://www.newegg.com/app/ViewProductDesc....-103-483&depa=1
http://www.newegg.com/app/ViewProductDesc....-136-147&depa=1

There are my recommended CPU and mobo. All you'd have to do is set the divider right, and your RAM should work just fine. All your other hardware should work just fine too. If you could, list your complete current system specs and, if possible, your power supply stats: just the amps of the 3.3V, 5V, and 12V rails (it should say on your PSU). If I can see that, I can tell you for sure whether the PSU will work. The total without any taxes (do you guys have taxes?) and shipping is $298. Future upgrades would be a gig of DDR400, a new vid card (maybe an X800-something once prices fall some more), and maybe an SATA hard drive. Oh, and maybe a new PSU if you get an X800 card; again, that depends on what your rails are cranking out.
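(A side note on the rail check above: the usable power is just volts times amps per rail, summed. A minimal sketch in Python, with purely hypothetical amperages standing in for whatever is printed on the actual PSU label:)

```python
# Rough PSU capacity check from rail amperages (P = V * A per rail).
# The amperages below are hypothetical examples, not any specific PSU.
rails = {3.3: 20.0, 5.0: 30.0, 12.0: 18.0}  # volts -> amps

per_rail_watts = {volts: volts * amps for volts, amps in rails.items()}
total_watts = sum(per_rail_watts.values())

for volts, watts in per_rail_watts.items():
    print(f"{volts}V rail: {watts:.0f} W")
print(f"Combined: {total_watts:.0f} W")

# As the thread notes, a power-hungry card leans hardest on one rail,
# so the per-rail figure can matter more than the combined total.
```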
Ice_Berge_00 April 9, 2005

Hey, that's my processor!
Laz.e.rus April 9, 2005

OK, since we are going to THAT place, what's the story with a 64-bit 2.x GHz vs., say, a 3.4 GHz Intel? Is the 2.x really faster even though it's 1.x GHz short? Or is it ONLY better for gaming? Etc... I know, I think I opened a can of worms.
Ice_Berge_00 April 9, 2005

AMD uses a different design than Intel. Specifically, the instruction sets, memory handling, and other things are designed differently, so the processor runs at a "slower" clock speed but has the same or better performance. It used to bother me, but basically the xxxx+ number is roughly the equivalent Intel speed. So a 3200+ is theoretically the equivalent of a P4 3.2 GHz, though my impression is that the AMD is actually better than that. There are also other benefits:

- Lower heat produced (AMD 64-bits run significantly cooler than Intel P4s)
- Lower cost (I think I would've had to pay nearly double what I paid for my mobo and proc if I had gone Intel last fall)
- AMD 64-bit processors also have the advantage of the nForce chipsets for mobos, which are pretty sweet!
- The performance gain is seen across the board, not just in gaming.

That's why I have an AMD Athlon 64 3200+ with a Gigabyte GA-K8NS-Pro mobo. I think the mobo was about $117 and the proc $170, but don't quote me on that. That was also last fall, so the prices have gone down since I bought. Mine is also the 754-pin socket.

The 64-bit part also gives you more processing capability. While the gains under 32-bit WinXP aren't significant for gaming (at least that's my impression), if you do any video encoding, audio encoding, or other processor-intensive work, it can be a big boost. 64-bit Windows is due out sometime, and I believe Longhorn is expected to have a 64-bit version too. Just FYI.

If you're thinking about doing the proc and mobo now, I'd say definitely go AMD Athlon 64-bit! You won't regret it! When I plugged my old Radeon 9000 Pro video card into my new system (I had some delay in getting my sweet new video card), I saw a pretty decent performance boost just from the better CPU and mobo. My old system was a P4 1.5 GHz. I used to have problems playing CS 1.6, but after just upgrading the system, the old vid card handled it nicely.
Edited April 9, 2005 by Ice_Berge_00
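(The rating explanation above boils down to one division: the model number is, very roughly, the P4-equivalent clock in MHz. A toy sketch of that rule of thumb in Python; note this is forum folklore of the era, not anything AMD guaranteed:)

```python
# Rule of thumb from the post above: the Athlon 64 "xxxx+" rating is
# approximately the equivalent Pentium 4 clock in MHz, even though the
# chip's actual clock is lower (the "2.x GHz" Laz asked about).
def rating_to_p4_equiv_ghz(rating: int) -> float:
    """Map an AMD performance rating (e.g. 3200) to a rough P4-equivalent clock in GHz."""
    return rating / 1000.0

for rating in (3000, 3200, 3400):
    print(f"Athlon 64 {rating}+ ~ P4 {rating_to_p4_equiv_ghz(rating)} GHz class")
```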