AMD's Radeon R9 290: A Mid-range Monster

Yesterday AMD launched the Radeon R9 290, the second card in its all-new Hawaii series of GPUs, designed to take on Nvidia's GK110-based super GPUs. This card is extremely similar to its big brother, the R9 290X, but has slightly lower clock speeds and fewer stream processors, allowing it to come in at a lower price point of $400. Though it was originally designed to take on the formerly $400 GTX 770, AMD is now positioning it against the GTX 780 following Nvidia's recent price drops, which brought the GTX 780 down to $500 and the GTX 770 down to $329. Read on to see how it handles the heat, both literally and figuratively.

Little Hawaii

As the second, lower-priced Hawaii board, you might assume this card has been neutered more than a made-for-TV version of The Big Lebowski, but you would be wrong. Thankfully, AMD has left almost everything from the R9 290X intact, choosing only to reduce the texture units from 176 to 160, the stream processors from 2,816 to 2,560, and the maximum clock speed from 1,000MHz to 947MHz. It still has the same 4GB of memory and the same 512-bit memory bus, and is otherwise the exact same GPU. It also has the same PowerTune hardware and software that lets you dictate maximum fan speeds and core temps.

Before we jump in, let's take a look at the specs for the Hawaii cards along with their Nvidia counterparts:

*We are putting an asterisk next to the AMD cards' TDP because it's not a quoted spec but "standard board power."

As the spec chart shows, this card is almost exactly the same as the R9 290X, much like the GTX 780 and GTX Titan: two cards built on the same die, with one a bit less powerful than the other. The two cards are the same physical size at 11 inches, both require a six-pin and an eight-pin power connector, and both draw a bit over 300 watts. AMD listed the TDP for the R9 290X as 250w, but hedged that answer, presenting it as an estimate rather than an official number.
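As a rough sanity check on the spec numbers above, peak single-precision throughput can be estimated as stream processors x clock x 2 FLOPs per cycle (one fused multiply-add per shader per cycle). A minimal back-of-the-envelope sketch, not an official AMD figure:

```python
# Rough peak single-precision throughput: shaders * clock * 2 (FMA = 2 FLOPs/cycle).
def peak_tflops(stream_processors: int, clock_mhz: int) -> float:
    return stream_processors * clock_mhz * 1e6 * 2 / 1e12

r9_290  = peak_tflops(2560, 947)   # ~4.85 TFLOPS at the 947MHz ceiling
r9_290x = peak_tflops(2816, 1000)  # ~5.63 TFLOPS at the 1,000MHz ceiling
```

By this estimate the R9 290 gives up only about 14 percent of the 290X's peak compute for $150 less, which is why the cut-down card looks so appealing on paper.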
It didn't reply to our emails asking for the TDP of the R9 290, so we'll just put 250w there with an asterisk.

PowerTune, TrueAudio, and XDMA

Like its larger, more powerful sibling, the R9 290 comes with all the baked-in features that define the top tier of this generation of GPUs, namely revamped PowerTune controls, TrueAudio technology, and XDMA CrossFire. XDMA CrossFire is exclusive to the R9 290/290X series of cards, while the current iteration of PowerTune is found on all Rx-based cards, and TrueAudio also appears on the $140 R7 260X board.

Briefly, AMD has changed the PowerTune interface in the Catalyst Control Center to give you an easier way to control clock, memory, and fan speeds. It now has sliders that let you dictate the maximum fan speed and maximum temperature, much like Nvidia's GPU Boost 2.0 technology found in its 700-series GPUs. You can tell the software to hold the card at 90C, for example, and it will throttle the clock speeds to maintain that temperature. Additionally, if you're sensitive to acoustics, you can cap the fan speed while letting the other settings run at maximum. AMD has also provided a "2-dimensional heat map," which we found confusing. We also found in testing that moving some of the sliders too far would hard-lock the entire system and then cause trouble rebooting, so tread carefully here. By default, the fan on the R9 290 runs at a maximum speed of 47 percent.

TrueAudio is also found on the R9 290, and whether it'll make a big difference in the life of the average gamer remains to be seen, as no games that use it have been released yet. Gordon wrote an extremely in-depth article about it, however, so head on over and you'll have all your questions answered.

Finally, XDMA is a new technology appearing for the first time in the Radeon R9 290 series of cards.
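The temperature-target behavior described above boils down to a simple control loop: when the GPU runs hotter than the target, back the clock off; when there's headroom, ramp it back up. This is a hypothetical illustration only, not AMD's actual PowerTune algorithm; the function name, step size, and thresholds are invented for the sketch:

```python
# Hypothetical sketch of a temperature-target throttle loop (NOT AMD's actual
# PowerTune implementation). Clocks step down when over the target temperature
# and step back up when there is comfortable headroom.
def adjust_clock(current_mhz: int, temp_c: float,
                 target_c: float = 90.0,
                 max_mhz: int = 947, min_mhz: int = 300,
                 step_mhz: int = 13) -> int:
    if temp_c > target_c:                 # too hot: throttle down
        return max(min_mhz, current_mhz - step_mhz)
    if temp_c < target_c - 2.0:           # headroom: ramp back toward max
        return min(max_mhz, current_mhz + step_mhz)
    return current_mhz                    # within the band: hold steady

clock = adjust_clock(947, temp_c=94.0)   # over target, clock drops to 934
clock = adjust_clock(clock, temp_c=85.0) # cool again, clock returns to 947
```

The interesting consequence, which matches what the sliders expose, is that the card's real-world clock is whatever the cooling allows, so fan-speed and temperature limits directly trade acoustics against sustained performance.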
It eschews the ribbon cable we've grown so un-fond of over the years, instead using hardware built into the GPUs to let the cards communicate over the PCI Express bus. Though AMD has seemingly wrangled its frame pacing issues with its recent fix, that fix is software-based and remains available for R9 280X cards and lower. For the R9 290 series, those changes are built into the drivers and handled through XDMA. Previous GPUs based on Tahiti and lower will still have to use the ribbon cable, as they lack the dedicated hardware to handle that transaction, which is not surprising. It's also reasonable to assume that all new GPUs going forward will use XDMA.

The main reason for XDMA is to handle the increased traffic resulting from the proliferation of multiple displays as well as 4K panels. If AMD continued using the old ribbon cable, there simply wouldn't be enough bandwidth to drive those displays at 60Hz, so XDMA was both a necessity to prepare for the future and a great way to allow smoother CrossFire at super-high resolutions. AMD claims there is no performance penalty at all to this configuration, but unfortunately there's no real way to run apples-to-apples testing, since the CrossFire connectors have been removed from the cards (though the electrical contacts are still intact). We also don't have a second R9 290X or R9 290 card to test CrossFire with currently, but we hope to get a second card in soon.

(Source: http://www.maximumpc.com/amd_radeon_r9_290_benchmarks)
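To put the bandwidth argument above in numbers: the traffic CrossFire must shuttle between cards grows with resolution and refresh rate. A back-of-the-envelope sketch, assuming 32-bit pixels and one full frame transferred per refresh (a simplification of what the real link carries):

```python
# Back-of-envelope: GB/s needed to move one full frame per refresh,
# assuming 4 bytes per pixel (32-bit color). A simplification for illustration.
def frame_traffic_gbps(width: int, height: int, hz: int = 60) -> float:
    return width * height * 4 * hz / 1e9

frame_traffic_gbps(1920, 1080)  # ~0.5 GB/s for 1080p at 60Hz
frame_traffic_gbps(3840, 2160)  # ~2.0 GB/s for 4K at 60Hz, four times the load
```

A PCIe 3.0 x16 slot offers roughly 16 GB/s in each direction, so routing this traffic over the bus leaves plenty of headroom where the old bridge cable, designed in an era of far lower resolutions, would not.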
Finally got my Eyefinity 3x1 set up and working. Even got a mod that puts the HUD back on the center screen (hopefully that works on our server, haven't tested yet). I'll post a pic soon. First run through a single-player map I got a little dizzy. It's really disorienting at first. The side screens stretch the image so that if you stare solely at the center, it simulates perefer... preif.... side vision. -Cink (still not an admin)