Conclusion
Overall there are quite a few interesting findings in this
article. One rather obvious thing is the fact that consistent
scaling regarding GPU performance can only be found with theoretical benchmarks.
Looking at 3DMark we see that the overall score is higher by a factor of 1.63 than with one card. Translated into
the raw GPU score we see a scaling factor of 1.84 and 1.82 respectively (depending
on the CPU clocks), which is usually about the
maximum scaling you get with SLI. A closer look at Unigine Heaven reveals that
in case of the Basic preset the scaling factor is as low as 1.14 at stock CPU frequencies
and 1.13 with the CPU at 4.5 GHz. When it comes to the Extreme preset we
see that the GPU has much more influence again, which is actually rather
obvious, since there is a much higher load on the VGA(s). Consequently there
is decent scaling when running the Extreme preset, meaning framerates go up by a
factor of 1.65 with two cards at standard clocks and by a factor of 1.62 with the
CPU at 4.5 GHz.
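In case you want to double-check the numbers, the scaling factors quoted throughout this article are simply the dual-card result divided by the single-card result. Here is a minimal sketch in Python; the fps values are made-up placeholders, not our measurements:

def scaling_factor(single_card_fps, dual_card_fps):
    # how much faster the SLI setup is compared to a single card
    return dual_card_fps / single_card_fps

# hypothetical example: 60 fps with one card, 99 fps with two cards -> 1.65
print(round(scaling_factor(60.0, 99.0), 2))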
Games on the other hand are a completely different story. Some of you might be
asking why in the name of ... we ran an SLI setup at low resolutions. Well, browsing
all the results shows there is actually some scaling going on, and we find it
interesting to see that even at low resolutions certain games benefit from a
second card. Another point is that adding a second card makes dead sure the
VGAs aren't limiting in any case, meaning a CPU limitation becomes even more
visible. But let's go back to the results now. Scaling factors at
low resolutions vary from 1 to 1.57. 1 basically means the fps didn't change at
all, and 1.57 is quite a massive jump, which begs the question why. The game in
question is Call of Duty: Black Ops 2, which we know puts very little load on the
graphics card. If we have a closer look at the scores, we see this jump with the CPU
running at stock clocks.
Now let's focus on high resolutions and high details, where the games put quite
some load on the two graphics cards. In this case the lowest scaling factor is
found with Sleeping Dogs and the CPU at 4.5 GHz, where fps are higher by a factor
of 1.01 than with one card. On the other hand there is Crysis 3, which produces
some values that look rather ridiculous at first glance. With the Intel Core
i7-3930K at stock clocks the scaling factor is 2.81, and at 4.5 GHz it goes up
to 2.85. We've cross-checked these values several times and we always get the same
result. Our explanation goes as follows: since there must be a
multiplying effect, we assume a CPU as well as a GPU bottleneck is being
removed. We think that when we add a second card to the system the benchmark is no
longer VGA limited, which allows the CPU to kick in. We think this is
actually possible since the scaling factor is even higher when we overclock the
CPU to 4.5 GHz, which means the CPU must have a rather drastic effect on the
end result.
To end this conclusion we try to leave you guys with at least some sort of
buying advice. Basically, if you do have the funds to buy two high-end graphics
cards and use them in SLI, it apparently makes sense to combine these two
cards with a powerful CPU, since you are quite likely to run into a CPU limitation
in certain games. On the other hand the question remains how much sense it makes
to buy two high-end cards in the first place, since not too many games are
actually well optimized for multi-card setups. Overall you'll definitely end up
with a highly future-proof setup, but the benefit comes at quite a cost.