HWarrior (July 24, 2006):
AMD buys ATI. The future is hard to predict, but what happens with nVidia chipsets now that AMD/ATI is a competitor? I'd hate to see AMD go exclusive with ATI chipsets. Not that it would be bad, but as a consumer I want a choice. What do you think?
wayfarer (July 24, 2006):
Wow. Makes you wonder about future products. It looks like a good deal. Not sure if it's a good deal for consumers or not.
Vovik (July 24, 2006):
$5.4 billion... somebody cashed in!
shadow (July 24, 2006):
Two great companies combining into one! They should come out with some fantastic products. (If you can't tell... I'm an ATI and AMD gamer.)
Cujo (July 25, 2006):
Interesting read: http://www.theinquirer.net/default.aspx?article=33219
wayfarer (July 25, 2006):
So basically what that is saying is that, in the not-too-distant future, CPUs will be running everything, including graphics, again. Still unsure if this is beneficial to the average consumer, though he won't have to pay big bucks for dual GPUs (or more).
wayfarer (July 25, 2006):
Wow, things are moving really fast! In less than a year the "top of the line" X850 XT PE has been trumped twice, and it will be trumped a third time with the release of the Radeon X1950 series cards. The article later goes on to say that this is still going to be trumped yet again by the R600 core. Also, there's already info on what plans are ahead for AMD & ATI. Are we sure this isn't a lot of hype to calm down AMD stockholders after the launch of Intel's Conroe, and have them forget AMD's AM2 blunder?
Cujo (July 25, 2006):
I wouldn't call AM2 a blunder. They needed it, as it lays the groundwork for future plans. There's an analysis somewhere on The Inquirer about that as well.
anonymo (July 25, 2006):
Why not just have two motherboards in one computer... one for graphics and one for processing... wait... can I do that now?!
HWarrior (July 25, 2006, thread author):
Yeah, AMD had to make the move to AM2. They needed to catch up in the short term. They saw Conroe coming, and mini-cores are still far down the road, so they needed the change from 939 to AM2 as a stopgap. AM2 adds DDR2 support; Intel has been doing it for a while now. To do this, AMD moved the memory interface from the northbridge to the CPU, which necessitates the change in socket and CPU. If they move to a smaller process (65nm from 90nm), they can do a lot in the short term to stay on Intel's heels (including price cuts). It will be interesting to see where AMD is when Intel comes out with quad-cores.
wayfarer (July 25, 2006):
Ahhh, so 939 is being phased out and replaced by AM2, but this was a step to make AMD more cost-competitive with Intel, not performance-competitive? A short-term move just to keep them in the market vs. Core 2 Duo?
Penty (July 25, 2006):
It wasn't really a cost-competitive move. They switched to AM2 primarily for two reasons: to allow the shift to DDR2, which has finally equaled the performance of DDR, and to give them the extra pins they need for more HyperTransport links. I have a feeling they did not anticipate exactly how good Intel's latest chip is.
Cujo (July 25, 2006):
They didn't, for sure.
HWarrior (July 25, 2006, thread author):
Basically, they needed to simplify production and thereby cut costs. At one point this year they were making 754, 939, AM2 (2x512K L2 cache), and AM2 (2x1MB L2 cache) processors. They are transitioning to making only AM2 (2x512K) processors. 939 will be first to go, then AM2 (2x1MB), then 754; 754 is hanging around for the Semprons. Here is the shipping calendar. The AM2 (2x1MB) processors (4000, 4400, 4800) are going to be in short supply, so I don't see prices dropping on those quickly. They aren't even listed in the official price-drop list.
wayfarer (July 25, 2006):
So I'm guessing that if someone were looking to upgrade in a couple of months, the recommendation would be the Intel Core 2 Duo? ($500-1000 range)
HWarrior (July 25, 2006, thread author):
I am doing AMD 939 upgrades at the end of the month for two other systems, for the simple fact that those are just processor swaps. But for my new gaming rig I am starting from scratch, so I will find it hard not to go Intel.
Dingy (July 26, 2006):
Quoting Cujo: "i wouldn't call am2 a blunder. they needed it as it lays the ground work for future plans. there's an analysis somewhere on theinq about that as well."
There was an article I read this morning (Tom's or Ars) that talked a bit about the merger and how it is good because the GPU is becoming more like a CPU. That, and some other info about the manufacturing of the ATI chips, and how ATI supplies many of the chips for PDAs, phones, etc. If anything, it was an interesting read. We'll just have to wait a while to find out how everything works. The one thing the articles did state was that nVidia was more Intel-like. So does that mean Intel will buy nVidia?
Cujo (July 26, 2006):
I posted one further up in this topic. Great article from The Inquirer all about that stuff.
anonymo (July 28, 2006):
From all the articles I've read about this, what I gather is that this merger will allow ATI to shrink their chips, run them cooler, and produce them for less. So smaller/cooler/cheaper... is that what you got, Cujo? That's obviously just for the short term. I can see that in the next five years, if ATI is still making GPUs like they do today, they'll basically be small motherboards with AMD processors: like having a 939 with an FX-62 and 2 gigs of RAM on a PCI-E card (or whatever slot standard comes into play in the future).
Cujo (July 28, 2006):
Something like that.
anonymo (July 28, 2006):
I wonder if there's some way to turn a computer system into one giant GPU? I guess the only problem would be interfacing it with another computer to actually run the software... would be a neat project.
Dingy (July 28, 2006):
Quoting anonymo: "I wonder if there's some way to turn a computer system into one giant GPU? I guess the only problem would be interfacing it with another computer to actually run the software...would be a neato project"
Not too difficult, really, and there have been many instances; graphics cards work that way right now. There are actually some research projects (and software, I think) that turn an ATI GPU into a transcoding engine (it has been a while since I saw it, so I don't remember exactly what it did), but it sure is neat!

As it is, the PCI/PCIe slots are the interfaces to the computer: you've got your memory, timing, data lines, strobes, etc., so that isn't the hard part. I think the hard part would be shrinking everything so that it fits reasonably well into one slot. For optimum performance (if it were a card), I think a new standard would be needed (hyper-PCI?). Of course, when you start talking about a new standard, much of the supporting silicon would have to change. That might not be a bad thing, because the whole PCI design is getting a bit old. Hence, a new motherboard design. Hopefully one that becomes standardized.

As it is, a GPU on a chip that plugs into the motherboard has been talked about and is an interesting concept. It certainly would be nice to buy a GPU chip as an upgrade instead of a whole card. The big issue, though, would be consumer acceptance. Remember, just because it looks good on paper doesn't mean it will succeed. Remember the PS2? If everything does work out, it certainly will be interesting over the next couple of years. We'll see.
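(For anyone curious about the GPU-as-general-computer idea Dingy mentions: the research projects of that era treated the GPU as a stream processor, applying one small "kernel" function to every element of a data stream in parallel. The sketch below is just a CPU-side illustration of that model in Python; the function names are made up for the example, not from any real GPGPU toolkit.)

```python
# Stream-processing model behind early GPGPU work: one small kernel
# function is applied independently to every element of a stream.
# A GPU runs these applications in parallel; this sketch fakes the
# same idea sequentially on the CPU with a plain loop.

def saturate(value, lo=0, hi=255):
    """Clamp a color channel to the displayable range (a typical GPU op)."""
    return max(lo, min(hi, value))

def run_kernel(kernel, stream):
    """Apply the kernel to each stream element (the GPU's job)."""
    return [kernel(x) for x in stream]

pixels = [-20, 0, 128, 300]
print(run_kernel(saturate, pixels))  # prints [0, 0, 128, 255]
```

Because each element is processed independently, with no shared state between them, the work can be spread across as many parallel units as the hardware has, which is exactly why GPUs were attractive for jobs like the transcoding engine mentioned above.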