The Penalty For Pairing A GTX 1060 Or RX 480 With An Old CPU

It's always easier to replace a video card than a CPU and motherboard, so it's not surprising to find people with a GTX 1060 or RX 480 surrounded by comparatively ancient components. These setups sacrifice some performance by bottlenecking the GPU, sure, but exactly how much is going to waste?

Matt Knuppel of Hardware Unboxed found himself in a situation to test the effect of old parts on the latest mid-range cards -- NVIDIA's GTX 1060 and AMD's RX 480.

We all salivate over the likes of the new Titan X, but it's the aforementioned cards that are most likely to end up in the bulk of PCs.

Knuppel managed to dig up two popular AMD and Intel systems from yesteryear -- one sporting a 3.2GHz Phenom II X4 955 and the other a 2.67GHz i5-750 (which I myself have, though overclocked to 3.2GHz). These were compared to a modern setup -- a 4.0-4.2GHz i7-6700K.

So, no surprise the new computer smacked the other two around, but in a few tests the older systems were able to keep up admirably. In particular, Star Wars Battlefront wasn't really CPU-bound and neither was The Division, with all three configurations returning similar framerates.

Where the updated CPU architecture dominated was in the ever-demanding ARMA 3 and The Witcher 3, where the 6700K came out in front by margins of 30-40 per cent. In these cases, this was the difference between playable (30fps or more) and unplayable frame rates.

So, while a new GPU can certainly compensate for an ageing CPU, there is a limit. Given the rise of graphics APIs such as Direct3D 12 and Vulkan, aimed at reducing driver -- and therefore CPU -- overhead, you could hold out on a system upgrade for longer than you might expect.

[Hardware Unboxed]

Originally published on Kotaku Australia.

Comments

    Not surprised by his findings; it's been an accepted fact since they started making GPUs that you need a certain level of CPU power, otherwise they'll be bottlenecked.

    I do find it a little bit unrealistic to base decisions just on benchmarks, since they may not be indicative of real game performance - certainly not all the time. I've found certain games will happily chug along at 100fps until you hit a particular combination of events, at which point the fps plummet. Sometimes it's the CPU, sometimes it's the GPU to blame; hell, with lots of games being online and even featuring "cloud based AI", it could be server problems or internet lag.

    It'd be more meaningful to actually play some of the games for real rather than just running a benchmark. Then do qualitative testing - i.e. make a note of events that cause slowdowns, then try it with different combinations of hardware to see if it's "smoother".

      Not necessarily. The only bottleneck you'll see is if you are running 4K gaming or more than triple screens. Otherwise, an old CPU like the ones above can easily handle 1080p gaming with those cards, as long as the devs are capable of making the game run on the GPU instead of the CPU.

      Another great video which provides a better explanation - https://www.youtube.com/watch?v=DAgpvWc4VBM

        Higher resolutions move the bottleneck more to the GPU than the CPU, so lower resolutions suffer more with weaker CPUs.
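
        To make that concrete, here's a toy frame-time model (a sketch with made-up numbers, not anything Hardware Unboxed measured): assume each frame costs the CPU a fixed slice of time for game logic and draw calls, while the GPU's slice scales with pixel count, and whichever side is slower sets the frame rate.

        # Toy model only -- illustrative numbers, not benchmark data.
        def fps(cpu_ms, gpu_ms_1080p, width, height):
            pixel_scale = (width * height) / (1920 * 1080)
            gpu_ms = gpu_ms_1080p * pixel_scale  # GPU cost grows with resolution
            frame_ms = max(cpu_ms, gpu_ms)       # the slower side is the bottleneck
            return 1000.0 / frame_ms

        for label, cpu_ms in [("old CPU", 12.0), ("new CPU", 6.0)]:
            print("%s  1080p: %.0ffps  4K: %.0ffps" % (
                label, fps(cpu_ms, 8.0, 1920, 1080), fps(cpu_ms, 8.0, 3840, 2160)))

        With those numbers, the old CPU caps 1080p at ~83fps against ~125fps for the new one, yet both land on ~31fps at 4K, because the 32ms of GPU work dominates either way.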

        Sorry, but that's not true for all games. Certain ones are definitely more CPU intensive. Even bloody World of Warcraft with its old, non-graphics-intensive engine has moments that choke the CPU more than the GPU. The RTS games are also notorious for it.

        For most of the more simplistic shooter-type games I certainly agree with you, since they don't really exercise the CPU that much.

        Like I said, it's not even necessarily the CPU *or* GPU that's the problem. Diablo 3, for example, is terrible on a slow internet connection, but it looks like framerate lag, at least in some cases. That's why I suggested that proper testing, rather than just a benchmark run, is needed to determine what the actual bottleneck is.

          Exactly, it all depends on the game. If it's developed well, it will rely heavily on the GPU.

            Sorry, but that's not entirely true either. If it's developed well it'll spread the load well between both. There's no point having a completely unloaded CPU and smashing the GPU, just like it's bad having a smashed CPU and a GPU sitting doing nothing. I know that certain things are easy enough to offload to a GPU, like pathing, but the more you load up the GPU the less time it has for graphics.

            Since the GPU is primarily designed for graphics, it makes sense to use as much of its power as possible for graphics rather than AI.

            The "perfect" game engine would be analyzing CPU and GPU load on the fly and spreading load. But I imagine that's probably difficult and probably has it's own shortcomings.

    I'm running a 980X overclocked to 4GHz with a GTX 1080 and it runs very well.
    I intend to build a new system, and the existing system had a GTX 690, which ran Doom 2016 OK except for the 2GB of video RAM, which was causing intermittent drops to 2fps.
    So I got the GTX 1080 ahead of time to run Doom 2016, which runs very well with its 8GB of RAM.
    It's tempting to delay building the new system, but the mobo and CPU are more than three years old.
    I'd like to comment that the 980X processor really kicks ass; over the last few years I've observed that later CPUs were only marginally better, with improvements in power consumption and heat but not much improvement in benchmarks until recently.
    Years ago, I put a spare gaming GPU into my office machine and its 3D benchmark scores were about 80% as good as my home machine's. Irrelevant, I know, but it was interesting to try.

      Yeah, your CPU was part of the start of this modern period. Things basically stalled after those initial i5s and i7s. I bought in at the 2500K point. Every year I look around and then decide not to upgrade. Last year I just bought another 2500K, because it was so cheap.

      Yes, I checked and my CPU and mobo are six years old.
      The problem with buying the same CPU again years later, even if it's cheap, is that the mobo is also old and hard to replace.
      I considered buying a spare mobo at the same time I got the original gear, but that's a bit like buying a jacket and two pairs of pants. And they are still $300-$400.

      I just bought the MSI GE72VR laptop with the GTX 1060; I thought it was the best compromise on price. I lusted after the GT80 Titan, but it cost $5K. So I might be able to do a direct comparison between the 6700HQ and the 980X, though I'd have to use a CPU benchmark, maybe Geekbench 4?
      I tested it in a cafe by playing WoW on it, but the battery only lasted about 45 minutes with the graphics maxed out, and the battery is not removable unless you unscrew the bottom panel, which makes it look smaller. I expect to use the mains for games anyway, such as in motels.

    Yes, many modern CPUs barely perform any better than CPUs some three generations older... Why? Three reasons:

    (1) Intel is a monopoly and has been doing micro-upgrades - they are simply milking the market.
    (2) Intel has used more transistor space inside those new CPUs for integrated Intel graphics (which is weak) and also for SPUs that handle encryption.
    (3) Sadly, Intel has spent considerable time adding complex back doors for the NSA... The latest Skylake, for example, has an integrated ARC CPU that runs an isolated OS blob and communicates via firmware that regular users have no access to. This makes your computer completely exposed to certain types of surveillance (such as rootkits) even when the CPU has been put into hibernation... http://boingboing.net/2016/06/15/intel-x86-processors-ship-with.html

    In short, our CPUs now contain more useless (and harmful) filler than raw performance. The joke is consumers are paying for it. Most people are dumb.

    The CPUs tested are just behind the curve. By comparison, I've got two 2500Ks and they both run at 4.8GHz. I know other aspects of the architecture matter, but I'm confident they'll handle pretty much everything for a while yet.

    It's sad how computers have stagnated. I doubt many people alive have seen something like it.

    Thanks @loganbooker, this is the kind of article that brought me to Giz in the first place many years ago. If I wanted to read about the latest diet, sex tip or swarm of bees engulfing a car, I'd probably go somewhere else to find those gems... like Lifehacker.

    Bring back the Giz in Gizmodo!

      Hey mate, we write more pure tech than you'd think... If you want to avoid the entertainment or science stories, probably best to look through the specific categories like Computing, Mobile or Car Tech. Cheers!

    Cheers @campbellsimpson,

    Giz is pretty much part of my morning work ritual, so I'm usually here five days a week. It just seems the last year or so it's been less about gizmos, tech and gear, and more about pop culture. Not saying the rest of the stories are always boring or completely irrelevant, I just prefer more tech! Not hating on it, more championing the good stuff!

      I hope you pronounce giz with a hard 'g'.

      Either that or you have very accommodating colleagues.

    Hope everything keeps going this way. I've just bought a used 3570K and I expect it'll serve me very well for the next five years. I'd like to see a more complete comparison, including some middle-aged CPUs (a 4th-gen i5, for instance) and using a 6th-gen i5 rather than a 6700K; results would be even less biased that way. Anyway, all those who are building a PC for the first time will find fair prices for Skylake CPUs in rich countries. In Brazil the price difference is much bigger, and they're worth even less given their marginal increase in performance.
