Feature Article

Nvidia vs. AMD: Who has the Best Budget GPU?


Cheap trick.

While it's the upper end of the GPU spectrum that tends to get the most attention--from press and manufacturers alike--I'd argue the more interesting battle for that coveted PCIe slot lies much lower down the product chain. After all, if a particular brand can give someone a great experience with a cheapo card shoved into an old Dell tower, it'll be top of the list when it comes to an upgrade further down the line. At the budget end, factors like physical size, power draw, and of course price matter far more than they do with top-end GPUs like the Nvidia GTX Titan X and AMD R9 295X2, thanks in part to the smaller cases and power supplies that cheaper PCs are built with.

The Test Rigs

To that end, finding out who has the best budget GPU means finding the most reasonable set of components to pair those budget GPUs with; if you're spending under $150 on graphics, spending upwards of $500 on a 6-core i7 CPU isn't the wisest move. Unlike when Peter and I built different AMD and Intel machines in an effort to beat the consoles, to keep things fair and to ensure that the only difference in specs would be the GPU, I'm using two identical machines built by the folks over at Freshtech Solutions in the UK. Both are powered by an Intel i5 4460 CPU, which at $189 (£148) is the go-to budget gaming CPU. While some of AMD's offerings are cheaper, they simply can't compete with the single-threaded performance of Intel's Haswell-based chips, particularly as multithreading is still something of a rarity in games.

Team red on the left, team green on the right.

Backing up the i5 is Gigabyte's H81M-S2H, a micro-ATX motherboard that's got just about everything you need for a budget gaming PC, including support for the latest Haswell processors, plenty of USB and SATA ports, and two RAM slots. Given the board costs just $52 (£36), you do take a hit with the RAM and PCIe speeds, which top out at 1600MHz and version 2.0 respectively. That's not the end of the world by any means, particularly when paired with a $150 GPU. That's also still more than enough grunt if you decide to upgrade the GPU to something like an R9 280, or GTX 960 later on. For storage there's a 1TB Seagate hard drive, while power comes courtesy of an 80Plus Bronze XFX TS 650W unit for worry-free upgrades.

The whole lot is housed inside a CIT Galaxy EVO Gaming case. The EVO isn't bad by any means, particularly given it costs just $44 (£30), but it errs toward being functional rather than optimal for the task at hand. All the usual features are present and correct, including plenty of front-mounted ports, seven expansion slots, four 5.25" external bays, one 3.5" external bay, room for up to five hard drives or SSDs, plus two included 120mm LED fans, and one 80mm LED fan that sits alongside the side panel window. Aesthetics aside, concessions have been made to fit the price, including a lack of cable management, a top mount for the PSU (which means it'll intake warm air), and fans that are on the high end of the noise scale.

Still, the EVO does do a good job of keeping things cool, and despite the lack of any real cable management, it's possible to tuck cables away in the 5.25" bays. There's also plenty of room for longer GPUs for future upgrades. At a total price of £499 from Freshtech (a similar pre-built PC costs around $600 on Newegg in the US), the system is good value and neatly built, coming in at nearly £100 cheaper than building the system from parts.

The GPUs

Ah, but what about the GPUs? Here's where things get a little tricky. Both AMD and Nvidia make a range of budget gaming GPUs, which go as low as $70 for something like a GeForce GT 730. However, those types of GPU simply don't have enough horsepower to drive high-end games at 1080p at reasonable settings. For that, you need to step up to the $150 range, and I'd suggest this is as low as you should go for a gaming GPU, outside of any sweet sales promotions. On the Nvidia side (one-off promotions notwithstanding), around $150 (£99) buys you a GTX 750 Ti, a small GPU powered entirely by the PCIe bus.

                                      GTX 750 Ti                     AMD R9 270
Architecture                          Maxwell                        Pitcairn
CUDA Cores (Nvidia)/Shader Cores (AMD)  640                          1280
ROPs                                  16                             32
Base Clock                            1020MHz                        900MHz
Boost Clock                           1085MHz                        925MHz
Memory                                2GB 128-bit GDDR5 @ 5400MHz    2GB 256-bit GDDR5 @ 1400MHz (5600MHz effective)
Memory Bandwidth                      86.4GB/s                       179.2GB/s
TDP                                   60W                            150W
GFLOPs per Watt (Single Precision)    21.8                           15.4
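If you're curious where those headline figures come from, the short sketch below reproduces them from the core counts, clocks, and bus widths in the table. It uses the standard back-of-the-envelope formulas (two floating-point operations per core per clock, bus width times effective memory clock for bandwidth) rather than anything from the article itself, so treat it as illustrative only.

```python
# A rough sketch (not from the original article) showing how the spec
# table's headline figures fall out of the core counts, clocks, and bus
# widths, using the usual back-of-the-envelope formulas.

def peak_gflops(cores, clock_mhz):
    # 2 floating-point operations per core per clock (fused multiply-add)
    return cores * clock_mhz * 2 / 1000.0

def bandwidth_gb_s(bus_bits, effective_mhz):
    # bytes per transfer x transfers per second
    return (bus_bits / 8) * effective_mhz / 1000.0

cards = {
    "GTX 750 Ti": {"cores": 640,  "base_mhz": 1020, "bus_bits": 128, "mem_mhz": 5400, "tdp_w": 60},
    "R9 270":     {"cores": 1280, "base_mhz": 900,  "bus_bits": 256, "mem_mhz": 5600, "tdp_w": 150},
}

for name, c in cards.items():
    gflops = peak_gflops(c["cores"], c["base_mhz"])
    print(f"{name}: {gflops:.0f} GFLOPs peak, "
          f"{gflops / c['tdp_w']:.1f} GFLOPs per watt, "
          f"{bandwidth_gb_s(c['bus_bits'], c['mem_mhz']):.1f} GB/s")
```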

The 750 Ti is based on Nvidia's latest Maxwell architecture, the same architecture behind the likes of the GTX 980 and Titan X. Maxwell's claim to fame is that it's extremely power efficient, which results in a GPU that runs cool and quiet, and in this case, doesn't require extra power from the PSU. That makes it an ideal fit for small form factor PCs, or cheap off-the-shelf computers from the likes of Dell and HP that often ship with small power supplies lacking the extra power connectors needed for GPU upgrades. Despite all this power efficiency, the 750 Ti's 640 CUDA cores and 2GB of GDDR5 memory put in an impressive performance at 1080p. When I reviewed it at the start of last year, I found that it could run most games at high or medium settings at between 35-60fps, which is extremely good given its paltry TDP of just 60W.

But what if you chucked a bit more juice into the mix? Enter AMD's R9 270, a beefy GPU based on the company's now ageing Pitcairn chip, which was originally released back in 2012 in the HD 7870 and 7850. Given the older architecture, the R9 270 doesn't boast the extreme power efficiency of the 750 Ti and runs with a toastier 150W TDP that requires a single 6-pin power connector. The extra power does get put to good use, though. The R9 270 is essentially just a rebranded 7870 with a slightly slower clock speed, but faster memory attached to a 256-bit memory bus.

That means you get 1280 shader cores, 80 texture units, and 32 ROPs for around $159 (£110). That makes it significantly more powerful on paper, and with the extra wattage on tap, it should overclock nicely too--at least enough to make up for the 75MHz difference between it and the 7870 it's based on.

Overclocking and Benchmarks

One of the great things about GPUs is just how easy they are to overclock, which means that even a budget card can punch far above its weight. Both the R9 270 and the 750 Ti can be given a decent push via software like EVGA's PrecisionX and MSI's Afterburner. However, with the 750 Ti being powered entirely by the PCIe bus (on reference versions at least), there's slightly less scope for overclocking. That said, I was able to add a cool 135MHz to its boost clock and 100MHz to its memory clock, with both remaining stable with a +12mV voltage bump.

Thanks to the R9 270's 6-pin power connector, MSI's Afterburner allows you to give the GPU's power limit a 20 percent boost. With that, the GPU was stable at 1110MHz and the memory at 1500MHz (6000MHz effective). It's worth noting that while the two particular GPUs I used were able to hit these speeds, your mileage may vary. It's easy enough to experiment, though, using software to find the sweet spot for your card. If you're new to overclocking, be sure to check out GameSpot's Overclocking for Beginners, which details the basic principles.
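For a rough sense of what those overclocks amount to, the sketch below works out the percentage gains against the reference clocks listed in the spec table. Exact headroom varies from card to card, so these numbers are illustrative rather than guaranteed.

```python
# Quick percentage view of the overclocks described above, measured
# against the reference clocks in the spec table. Headroom varies from
# card to card, so these figures are illustrative only.

def pct_gain(stock, overclocked):
    return (overclocked - stock) / stock * 100

# GTX 750 Ti: boost clock pushed from 1085MHz to 1220MHz (+135MHz)
print(f"750 Ti boost clock: +{pct_gain(1085, 1220):.1f}%")

# R9 270: core stable at 1110MHz (up from the 925MHz boost clock),
# memory at 6000MHz effective (up from 5600MHz)
print(f"R9 270 core clock:  +{pct_gain(925, 1110):.1f}%")
print(f"R9 270 memory:      +{pct_gain(5600, 6000):.1f}%")
```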

Game                                              GTX 750 Ti    AMD R9 270
Unigine Heaven, Ultra, Ultra Tessellation, 4XAA   30            26
Bioshock Infinite, Ultra, AA                      61            67
Tomb Raider, Ultra, No TressFX, FXAA              52            62
Metro: Last Light, Ultra                          38            41
Far Cry 4, Ultra, SMAA                            36            37
Battlefield 4, Ultra, HBAO, 4XMSAA                29            35 (Mantle)

It's worth noting that while these results are impressive given that all the games were run at very high settings, it is possible to hit a solid 60fps in the likes of Far Cry 4 and Battlefield 4 by switching down to high or medium settings, should you value frame rate over image quality. The Unigine result also shows up the R9 270's ageing architecture, which simply can't keep pace with the 750 Ti in this tessellation-heavy test.
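Those results also let you rough out the performance-per-watt argument that comes up in the verdict below. The sketch assumes each card draws something close to its rated TDP under load, which real-world power draw won't match exactly, so take it as an illustration of the trade-off rather than a measurement.

```python
# Rough frames-per-watt comparison built from the benchmark table above
# and the rated TDPs. Real power draw under load won't match TDP exactly,
# so this is an illustration of the trade-off rather than a measurement.

benchmarks = {  # game: (GTX 750 Ti fps, R9 270 fps)
    "Bioshock Infinite": (61, 67),
    "Tomb Raider":       (52, 62),
    "Metro: Last Light": (38, 41),
    "Far Cry 4":         (36, 37),
    "Battlefield 4":     (29, 35),
}

TDP_750_TI, TDP_R9_270 = 60, 150  # watts

for game, (fps_nv, fps_amd) in benchmarks.items():
    print(f"{game:18}  750 Ti: {fps_nv / TDP_750_TI:.2f} fps/W   "
          f"R9 270: {fps_amd / TDP_R9_270:.2f} fps/W")
```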

Verdict

So, do you go team red or team green if you're after the best budget GPU? Based solely on performance in games, the R9 270 takes it, pushing average frame rates in games like Tomb Raider above that magical 60fps mark. While it doesn't take a huge lead in every game, in general you do get better frame rates than with the 750 Ti, particularly at settings that make the most of its much higher memory bandwidth.

That said, there is a price to be paid for that extra bit of performance, namely heat, noise, and power consumption. With a 150W TDP, the R9 270 doesn't boast anywhere near the performance per watt of the 750 Ti, meaning you need to make sure you've got a spare 6-pin power connector, and some decent airflow in your case to get rid of the extra heat. It's not a whole lot of extra heat by any means, certainly nothing compared to something like an R9 290X, but it's something to bear in mind when you're planning a purchase.

The GTX 750 Ti, while not as powerful, has an astonishing amount of performance per watt. That it comes close to the R9 270 with just a 60W TDP is highly impressive. If you don't have access to a spare power connector, or your case has less-than-stellar cooling, it's a great choice. In the end, though, AMD's cards just offer that little bit more for your money (things like G-Sync notwithstanding), with excellent performance across the board. The company might be lagging behind at the high end, but for those on a budget, team red reigns supreme.



Mark Walton

Mark is a senior staff writer based out of the UK, the home of heavy metal and superior chocolate.

Comments

vtoshkatur:

Also, the fact that Nvidia seems to gimp the memory bus width on their entry-level/mid-range GPUs is shady at best.

vtoshkatur:

AMD is by far the best bang for your buck. I spent less than $250 on my R9 290 (A card that's comparable to the GTX Titan)

Animalosity:

No fan-boy comment here, AMD has always been the better bang for buck go-to card. Does Nvidia produce all around higher frame rates? More than not, they usually do, though up until the GTX 980 and the Titan X, AMD's and Nvidia's flagships always trade blows with each other for the next king of frame rates. The same trend will repeat itself when AMD finally releases the R9-3XX series and if the rumors prove to be even half true, ditching GDDR5 memory for HBM in addition to 20nm transistor architecture, I think Nvidia will have some catching up to do. I've gotten to the point though where frame rate doesn't matter. I suppose that has much to do with my rig running a 295x2, however my crossfire monster is still considerably cooler, and cheaper than the fabled Titan X.


I'm not saying anything about AMD being better or Nvidia being better other than raw bang for buck. AMD typically drops the prices as soon as Nvidia's flagships release, and Nvidia tends to not even remain remotely competitive in the price department. There is no way that I have any interest in spending $1K on discrete graphics that wasn't necessarily intended for gaming to begin with. Even Nvidia outright told its consumer base that the Titan branding was more for heavy compute applications, on par with being a "budget" Quadro series workstation card (minus the FP32/FP64 applications) utilized more in line with CAD functionality. Is it a fully unlocked GM200 (Maxwell) series GPU? Yes, but for me the benchmarks clearly show that 12GB of VRAM is unnecessary, even at 4K resolutions. Even the 295x2 with only effectively 1/3 the VRAM at 4GB (since the 4GB+4GB doesn't stack) has shown to handle 4K gaming just as well if not better. Yes, the 295x2 is a dual GPU on one PCB. I understand the argument here, however one can retort that the price still remains nearly $300 cheaper, and well, it's still a more reasonable solution for those that seek ultimate gaming performance.

Lastly, to preface the multi-GPU arguments before I get flamed by those who are heavily biased against either AMD or Nvidia and their driver profiles: the 295x2 also runs considerably cooler. With some free mods, such as adding a second fan to the already liquid-cooled GPU's radiator and a 4-pin mini PWM fan extension cable plugged directly into a motherboard header for complete VRM shroud fan control, it runs every bit of 20-30 degrees cooler than its competitor. AMD already caps the thermal ceiling at 75 degrees anyway with its firmware, and with the second fan and shroud fan mods it never hits that mark. Depends too on ambient temps in your environment, I suppose.

Anyway, I realize I have gone way off topic here. I was just comparing and contrasting AMD's and Nvidia's offerings at the flagship level, but realize I am replying to a budget-level discrete graphics article. To each their own. Again, I have always been one to fully believe that AMD definitely wins the "bang for buck" game against Nvidia.

flashn00b:

If you're going on a budget, AMD is the obvious choice.

coldfusion25:

I'm satisfied with my 970.

sdtuu:

<< LINK REMOVED >> Agreed, the 970 is the ultimate budget card; price vs performance, nothing can beat it right now. I have two for a little less than a single GTX 980 and my GTX 970 SLI setup blows it away, especially in high-scaling games like BFH and Tomb Raider, and with newer games taking more advantage of SLI there's never been a better time. I will say though, experience of CrossFire shows it struggles with drivers, and actually AMD does struggle with drivers for single-card configs too. I went from a HD9750 (which I loved) to Nvidia and I'm just amazed how good Nvidia are: they have their drivers out on time, mostly before the game comes out, while a lot of the time (see Dying Light) you're waiting months. It's disgusting from AMD, who build cards that cost you more money in the long run (hardly a budget, right?) yet **** you on performance.

IanTheInnocent:

I love my 980.

danp111:

wait you used a 270 instead of a 270x and ti instead of the normal one.. wat

Onefai:

AMD needs to step up its game. I have stopped using AMD CPUs and GPUs even though I was an AMD fan. AMD FreeSync is pretty attractive.

superstition222:

You didn't mention that the 750 can't be crossfired. Also, 900 MHz for the 270 is very slow. My 7870 GHz Edition runs at 1050 MHz with just 1.1 volts. I can get it to run at 900 MHz with 1 volt.

Lord_Meliodas:

<< LINK REMOVED >> I mentioned that in my 1st post. AMD XFire not included. Also, the CPU used was Intel not AMD, which gives the NVIDIA the upper hand in terms of compatibility :3

Lord_Meliodas:

I really don't know why everyone's complaining about power usage, temp and noise. If you ask me, I'll not build a PC with a PSU of less than 500w, I'll also not be building a gaming PC without proper cooling system, and finally, BUY A DAMNED HEADPHONE IF U DON'T LIKE HEARING YOUR PC's NOISE! tch.

Macklin_:

<< LINK REMOVED >> Nothing to do with this conversation but I keep seeing this profile pic all over this site, on both user accounts and spam bots. What is it?

Lord_Meliodas:

<< LINK REMOVED >><< LINK REMOVED >> It's also unknown to me. My guess is it might be the default pic set by the site.

EKWhale:

<< LINK REMOVED >> Probably because, you know, power costs exist.

Since the average European is paying $0.22 per KWh, this means that the 90W difference is

(90W)*(1KW/1000W)*(3 hours per day usage, average)*(365 days per year)*(cost per KWh)
For a difference of 98.5 (cost per KWh) per year.

So the average American is talking about an additional $12 per year in power cost.
The average European is looking at over $20. And it's $10 more expensive up front.

Assuming, since you're getting a budget GPU, you're looking for about 3 years of usage, you're paying an additional hundred freaking dollars if you're not an American, and still over $50 if you ARE an American.

Considering the overclockability of the NVIDIA making it more powerful than the base AMD while STILL using less power, is it really freaking worth doing?

And as for the 500W comment? Why? 500W gets you the latest i7 and a budget card like the ones listed above easily. Even with the 80% factor you should be using bringing you down to 400W.

Getting a PSU that's too big for your machine is wasteful for a number of reasons. Size to your machine, or the machine you plan to build into. Otherwise, you're screwing yourself.
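For reference, the running-cost arithmetic in the comment above works out as follows. The sketch takes the commenter's own assumptions (a 90W difference, three hours of gaming a day, roughly $0.12 per kWh in the US and $0.22 in Europe), which are estimates rather than measured figures.

```python
# Working through the running-cost arithmetic from the comment above.
# The 90W gap, three hours of gaming per day, and the $0.12 (US) /
# $0.22 (EU) per-kWh rates are the commenter's assumptions, not
# measured figures.

WATT_DIFFERENCE = 90      # R9 270 TDP (150W) minus 750 Ti TDP (60W)
HOURS_PER_DAY = 3
DAYS_PER_YEAR = 365

kwh_per_year = WATT_DIFFERENCE / 1000 * HOURS_PER_DAY * DAYS_PER_YEAR  # ~98.6 kWh

for region, rate_per_kwh in (("US", 0.12), ("EU", 0.22)):
    yearly = kwh_per_year * rate_per_kwh
    print(f"{region}: ${yearly:.2f} per year, ${yearly * 3:.2f} over three years")
```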

Lord_Meliodas:

<< LINK REMOVED >><< LINK REMOVED >> And the additional power cost proves what? It does not make the card less powerful nor does it give the other a boost. As per your argument about overclocking, the 750Ti's power is very limited compared to that of the R9 270(as stated in the article) even WITHOUT AMD's XFire tech. Another thing is, a 500w PSU is pretty basic when you're building a gaming rig since you're surely going to have to use a good cooling system and the GPUs stated above. Moreover, having a safe extra power in your PSU is actually beneficial, considering that overclocking can most often be maximized and that the extra power can and will prevent power overdrawing, graphical corruptions, BSODs, and possible short circuits. I actually used to run a dual core(Athlon II x2 3.0ghz) with a 400w PSU but i didn't see any significant change to my electric bill after switching to a 550w one, but I did notice an increase to my power cost when i replaced my CPU with a Phenom II x4(50w~ increase in power usage), so I think you might be overthinking the power cost a little bit :3

Brockelley:

lol what? the newest AMD GPUs are coming in less than like a single month, this article is just...... I mean how much money did Nvidia pay you..

EKWhale:

<< LINK REMOVED >> That's the previous generation of NVIDIA cards, too, genius. And AMD already dropped the prices on their last gen to try to compete better with NVIDIA's current gen.

Do you even read?

arc_salvo:

Freesync is coming out soon, so the G-sync advantage is lessened, imho, and like kitty said, you can overclock the R9 270 to R9 270x performance fairly easily if you do some overclocking with the core and ram.


That said, if you don't have a big power supply or a big case, the 750ti is great. It can often run with most stock 300 watt supplies that come with home PC's that are pre-built by dell and the like, and it creates less heat, which makes it better for smaller form factors.


I usually get medium-size or bigger cases and a 500+ watt PSU, so I generally go AMD, but if I was just going to upgrade a friend or family member's pre-built HP or Dell or whatnot, I'd probably just throw in a 750ti.


Edit: All that said, while AMD cards can be a little bit hotter, I haven't noticed them being particularly noisier as long as you buy the right brand with a good cooler. The cheap cards with cheap coolers run hotter and noisier, but the slightly more expensive ones (and I mean like 5-15 dollars more, not a lot at all) are noticeably quieter and generate less heat.

kitty (Moderator):

When you overclock that R9 270 past 1050, it becomes an R9 270X.
I don't understand why there is a 270 and a 270X when the only difference is like 100 to 150MHz in clock speed.
It's basically the same card. What AMD should have done was make the 270 the 7870 (which is what it basically is, non-GHz edition) and make the 270X the 7870 XT.
With the 7870 XT, you'd get performance closer to a 7950 (R9 280), the difference being the 280 having the 3GB VRAM and better performance.

BravoOneActual:

If you're slapping a card in a three-year-old Dell Dimension you've been gifted or already own: GTX 750Ti


If you're building a moderately priced gaming PC from the ground up for 1080p gaming with surprisingly few compromises: R9 270


If you're saying AMD cards are inherently louder: Ditch reference coolers and quit saying silly things.

kitty (Moderator):

<< LINK REMOVED >> I'm rocking with a 970 atm. But I had two 7870's before, which are R9 270X's. I still own them.
Excellent cards. The two of them in CrossFire can beat a 970 (stock). Great cards for 1080.
I was able to play games at 5760x1080 with them. I've got 3 1080p monitors; it was a blast.

I've moved up to a 970 because of VRAM. Plus I got a 120Hz 1440p monitor. If AMD hadn't waited so long to release the 4GB 270X, I would still be using them, because I would have bought a pair of those instead.

BravoOneActual:

<< LINK REMOVED >><< LINK REMOVED >> I have a 970 in my personal machine and an R9 290 in my son's.


To be honest, in all practicality, they perform identically. I mean this in terms of consistency, not peak FPS or in games that might favor one brand over another. Honestly, I replaced the 290 to ease up on an aging PSU and to maybe SLI down the road, so it wasn't a performance boost that drove my decision.


Anyway, I have also owned a few 270's (X and non-X) and it's really a miraculously good card for the price. If I was not insane and addicted about always upgrading/experimenting, it would be all I - or 95% of gamers - would need for 1080p gaming.

seanwil545:

You mean for ~$150, someone can turn Grandmas old Core 2 Duo/Quad into an entry-level gaming PC?

That can't possibly be possible!!!

Ishak27:

<< LINK REMOVED >> Can confirm that's true. I transformed my Core 2 Quad and replaced the GeForce G210 with a 270X


Runs like a glove.

EKWhale:

<< LINK REMOVED >><< LINK REMOVED >> Gloves don't run. You screwing with me?

seanwil545:

<< LINK REMOVED >><< LINK REMOVED >>

I was just having fun.

I have an AMD 4130 in a spare rig I initially built for streaming. I tossed a 270X in it and it runs Shadow of Mordor High-Medium settings just fine.

xantufrog (Moderator):

<< LINK REMOVED >> it's a christmas miracle!

Gamer_4_Fun:

I switched from an AMD R9 290 to GTX 980 SLI a few months back. The major difference I found with the Nvidia side is just a more fleshed-out experience. While AMD's driver overhead is not as efficient as Nvidia's, it is nowhere near what fanboys complain about. The biggest difference I felt was at my feet - the heat output is so much lower with my rig on the ground, I can finally game comfortably.

andmcq:

<< LINK REMOVED >> This is something most articles never address. The heat output and noise levels from AMD videocards in general is much, much higher than that of Nvidia's.

elessarGObonzo:

<< LINK REMOVED >><< LINK REMOVED >> 8GB PCS+ 290X has never passed 58° running 1440p games on maxed settings. idles 35-38°. i may have better than average case cooling but with good aftermarket coolers R9s do not have the heat issue many claim.

yngsten:

<< LINK REMOVED >><< LINK REMOVED >>

Isn't that only an issue with reference cooling?

Gamer_4_Fun:

<< LINK REMOVED >><< LINK REMOVED >><< LINK REMOVED >> I have the Sapphire Trix (triple fans). It would reach 90 degrees at times. The reference cooler would hit 95 degrees and down throttling would happen.


With the 980 SLI, I don't cross over 68 degrees C. And I live in a region where average ambient temperature is around 35 degrees C.

yngsten:

<< LINK REMOVED >><< LINK REMOVED >><< LINK REMOVED >>

Oh, I see, wow that's quite a difference ain't it.

kitty (Moderator):

<< LINK REMOVED >><< LINK REMOVED >> Never had problems with noise, and I ran two cards in CrossFire. Fan speed never went past 40% on the top card, and on the bottom I don't think the fans changed much at all.
Top card peaked at 67C and bottom 57C.


EKWhale:

<< LINK REMOVED >><< LINK REMOVED >><< LINK REMOVED >> My NVIDIA cards sometime run in the 30C range. Low 40s while gaming. 67C is 125 freaking F. That's "will be unbearably uncomfortable if it's on you" range. Which means if that air is blowing in a smaller space, like your foot hole at your desk, it will be uncomfortable. 20C difference is HUGE.

As for noise, YOU never had a problem with noise. Considering people've routinely measured the noise levels for the various cards and the AMDs are consistently louder (because they have to move more air to keep them cool. This is physics, not a "THEY R NOT USE RIGHT FAN"), I'm saying that's another thing that is fine *for you* that may not be for others.

If you look at AMD long-term, it's just about never worth it. Worse performance for anything other than pushing more pixels (which is fixable for NVIDIA in a software update). They use significantly more energy. If you're a high end gamer, you're talking HUNDREDS of dollars difference over the lifespan of the card (which I'm saying is about 3 years). Just isn't worth it.

Heat all your components up, pay more, and get less performance? What, am I buying a geo with a ferrari skin on it?

lordshifu:

<< LINK REMOVED >><< LINK REMOVED >> true that!

whatsazerg:

I've been a Nvidia guy for the last decade or so.... I'm considering AMD for my next rig. It'll depend on whats available at what price point when the time comes though.

xantufrog (Moderator):

I've always been happy with both brands.

As an fyi, though, you can get better cards for 150 dollars - you can definitely find a 270x, and at times a 280, for 150-160 after rebate. A 280 for 160 would be a far better deal than a 750TI for 150 - if you can't afford the extra tenner then you probably shouldn't be investing in a gaming GPU to begin with.

seanwil545:

<< LINK REMOVED >>

The real problem with those is that the power-supply requirements (500~550W) will certainly exceed most off-the-shelf rigs.

xantufrog (Moderator):

<< LINK REMOVED >><< LINK REMOVED >> well, in truth you probably don't need anything more than a decent 450W for even the 280, based on benchmarks, but I do take your point - an off-the-shelf random Dell might not support a 280 with its PSU

seanwil545:

<< LINK REMOVED >><< LINK REMOVED >>

Agreed, I was just quoting manufacture requirements.

A solid 450W would probably be fine, but we are talking a good unit since 360W total system power consumption puts it at 80% load.


SicoWolf:

AMD's R9 300 series comes out this month or May, and the R7 300 series will follow some weeks after. I'd wait and see where price/performance falls then.


The comments in GPU articles are always funny to read. The hyperbole about drivers, high-end performance, etc that is always present is amusing if nothing else. Bottom line, Nvidia and AMD are extremely competitive in the GPU sector and have been since the early 2000s. You can't go wrong with either the red team or the green team.

arc_salvo:

<< LINK REMOVED >> I agree. I lean AMD nowadays, but I used to favor Nvidia (which is the camp I started with since the old Geforce 2 series) and Evga at that, but I switched to ATI (Before it was bought out by AMD) and found that I noticed no real difference in performance in practical day to day gaming.


So I stuck with ATI until it became AMD to try out red-side for a while. Drivers were a little worse before, but they've gotten better. What I'm happy about is that there's real competition between the two companies, which drives prices down and quality up.

xantufrog (Moderator):

<< LINK REMOVED >> good advice

djpetitte:

If you didn't already know that AMD offers better budget GPUs, you must be new to PC gaming. Which is OK too.

I've owned both over the years and they both have pros and cons. Nvidia makes the GPUs for dummies, as I like to say. AMD's drivers have come a long way but are still not as good as Nvidia's, but a lot of PC players like to tweak and troubleshoot, so that's one reason I have stuck with AMD for many years.
