Conclusion
As we already mentioned, this is the first article in a series of SLI scaling analyses covering different CPUs, different numbers of graphics cards, and different graphics card models. So far there are already some rather interesting findings. One rather obvious one is that consistent GPU scaling can only be found in synthetic benchmarks. In 3DMark the overall score is between a factor of 1.56 and 1.59 higher than with one card, depending on the CPU preset. Translated into the raw GPU score we see a scaling factor of 1.82, which is about the maximum scaling you usually get with SLI. A closer look at Unigine Heaven reveals that with the Basic preset the scaling factor is as low as 1.20 at stock frequencies and 1.23 with the CPU at 4.5 GHz. With the Extreme preset the GPU has much more influence again, which is actually rather obvious, since there is a much higher load on the VGA(s). Nevertheless it's interesting to see that overclocking the CPU to 4.5 GHz takes the scaling factor from 1.53 at stock clocks to 1.62, meaning the CPU is also limiting the maximum score.
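For reference, here is how we read these scaling factors; a minimal sketch in Python using hypothetical placeholder scores rather than our actual results:

```python
# The scaling factor is simply the score with two cards divided by the
# score with a single card. The scores below are hypothetical placeholders.
def scaling_factor(single_card_score: float, sli_score: float) -> float:
    """How much faster the SLI setup is compared to a single card."""
    return sli_score / single_card_score

# e.g. a hypothetical single-card GPU score of 10000 and an SLI score
# of 18200 would correspond to the factor of 1.82 mentioned above
print(round(scaling_factor(10000, 18200), 2))  # 1.82
```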
Games, on the other hand, are a completely different story. Some of you might be asking why in the name of ... we ran an SLI setup at low resolutions. Well, browsing all the results shows there is actually some scaling going on, and we find it interesting to see that even at low resolutions certain games benefit from a second card. Another point is that adding a second card makes dead sure the VGAs aren't the limiting factor, meaning CPU limitation becomes even more visible. But let's get back to the results now. Scaling factors at low resolutions vary from 1.01 to 1.16. The biggest performance difference can be found in Battlefield 3, where there is a 16 percent benefit. A closer look at the scores shows that overclocking the CPU to 4.5 GHz makes the fps go up by the mentioned 16 percent.
Now let's focus on high resolutions and high details, where the games put quite some load on the two graphics cards. Here the lowest scaling factor is found in Sleeping Dogs with the CPU at 4.5 GHz, where fps are only a factor of 1.01 higher than with one card. At the other end of the spectrum there is Crysis 3, which produces values that look rather ridiculous at first glance. With the Intel Core i7-4930K at stock clocks the scaling factor is 2.79, and at 4.5 GHz it goes up to 2.92. We've cross-checked these values about 20 times in the meantime and we always get the same result. Our explanation goes as follows: since there must be a compounding effect, we assume that a CPU bottleneck as well as a GPU bottleneck are being lifted. We think that with two cards in the system the benchmark is no longer VGA-limited, so the CPU kicks in as the deciding factor. This actually seems plausible, since the scaling factor climbs even higher when we overclock the CPU to 4.5 GHz, which means the CPU must have a rather drastic effect on the end result.
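If our assumption holds, a simple back-of-the-envelope model shows why a factor beyond 2.0 is remarkable in the first place. The following sketch uses a naive min() bottleneck model with hypothetical fps values; neither the model nor the numbers come from our measurements:

```python
# A deliberately simplistic bottleneck model (our assumption, purely for
# illustration): achieved fps is the minimum of what the CPU can feed and
# what the combined GPUs can render. All numbers are hypothetical.
def achieved_fps(cpu_fps: float, gpu_fps_per_card: float, num_cards: int) -> float:
    """fps is capped by the slower of the CPU limit and the combined GPU limit."""
    return min(cpu_fps, gpu_fps_per_card * num_cards)

single = achieved_fps(cpu_fps=90.0, gpu_fps_per_card=30.0, num_cards=1)  # 30.0, GPU-bound
sli = achieved_fps(cpu_fps=90.0, gpu_fps_per_card=30.0, num_cards=2)     # 60.0, GPU-bound
print(sli / single)  # 2.0 -- this model can never exceed num_cards
```

Since even this idealized model tops out at a factor of 2.0 with two cards, a measured factor of 2.79 to 2.92 would imply that the single-card run was held back by more than raw GPU throughput, which fits our observation that overclocking the CPU pushes the factor higher still.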
To close this conclusion, we try to leave you guys with at least some sort of buying advice. Basically, if you have the funds to buy two high-end graphics cards and run them in SLI, it apparently makes sense to combine these two cards with a powerful CPU, since you're quite likely to run into CPU limitation in certain games. On the other hand, the question remains how much sense it makes to buy two high-end cards in the first place, since not too many games are actually well optimized for multi-card setups. Overall you'll definitely end up with a highly future-proof setup, but that benefit comes at quite a cost.