Page 2 - NVIDIA GTX 560 Series Architecture
If anyone can say NVIDIA is not jealous of AMD's current graphics market position with a straight face, they are either an NVIDIA employee, or a really die-hard fanboy -- every true enthusiast knows AMD has been dominating the competition since the Radeon HD 4000 series back in 2008. And as business goes, this is simply way too long. Despite the fact that NVIDIA has been constantly delivering unquestionably competitive products in every imaginable price bracket, the problem is the company is also consistently late with their product release cycles. This is especially true of the bread-and-butter midrange performance cards; while AMD is selling millions of GPUs around the world and laughing all the way to the bank, NVIDIA is still trying to get their shoes on, and their products out the door at acceptable yield levels. With the release of the second generation Fermi processors, NVIDIA finally fixed their yield issues, with improved performance and power consumption to boot. What does this cutthroat competition have in store for us today? Let's start off our review with an in-depth look into the technical design of NVIDIA's latest offering in the $200 midrange performance market, Gigabyte's GeForce GTX 560 1GB.
The NVIDIA GeForce GTX 560 GPU is based on the refreshed Fermi architecture first introduced in the GTX 400 series cards last year. With the launch of the GTX 580 on November 9, 2010, the rest of the lineup followed progressively; the latest release -- the GTX 560 non-Ti version we are covering today -- appeared on the market on May 17, 2011. This DirectX 11 and OpenGL 4.1 compliant family of cards competes directly with AMD's Radeon HD 6000 series. We have covered both the Radeon HD 6850 and Radeon HD 6870 midrange performance cards earlier this year, and we will include them in our benchmarks today.
As the successor to the successful GeForce GTX 460, the company decided it was wise to continue to capitalize on this momentum with the GF114 based NVIDIA GeForce GTX 560. Since the slightly more expensive GTX 560 Ti is already available, in order to address the important $200 market, NVIDIA performed minor surgery on the GF114 core and disabled one of its eight streaming multiprocessors. With all four 64-bit ROP partitions enabled, and each SM containing 48 shader cores, 4 dispatch units, and 8 texture units, the end result is the GPU with 336 shader cores, 56 texture units, and 32 ROPs discussed in the introduction of this review. While the ROP count has not changed, disabling one of the streaming multiprocessors leaves the GTX 560 down 48 shader cores and 8 texture units from its fully unlocked Titanium brother. Thankfully, the memory interface is not crippled; the 256-bit bus width still delivers 128.2GB/s of bandwidth in conjunction with 1GB of GDDR5 memory.
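The unit counts and bandwidth figure above follow directly from the per-SM numbers. As a quick sanity check, here is a minimal sketch of the arithmetic; the constants come from the specifications quoted in this review, and GDDR5's four transfers per clock per pin is a standard property of the memory technology rather than anything specific to this card.

```python
# Deriving the GTX 560's unit counts and memory bandwidth
# from the per-SM figures quoted in the review.

SM_TOTAL = 8          # GF114 contains eight streaming multiprocessors
SM_DISABLED = 1       # one SM is fused off for the non-Ti GTX 560
CORES_PER_SM = 48     # shader cores per SM
TEX_PER_SM = 8        # texture units per SM

active_sms = SM_TOTAL - SM_DISABLED
shader_cores = active_sms * CORES_PER_SM     # 7 x 48 = 336
texture_units = active_sms * TEX_PER_SM      # 7 x 8  = 56

BUS_WIDTH_BITS = 256
MEM_CLOCK_MHZ = 1002  # GDDR5 base clock; GDDR5 is quad data rate
bandwidth_gbs = MEM_CLOCK_MHZ * 1e6 * 4 * (BUS_WIDTH_BITS / 8) / 1e9

print(shader_cores, texture_units, bandwidth_gbs)
# 336 56 128.256
```

The computed 128.256GB/s lines up with the roughly 128.2GB/s figure NVIDIA publishes for the reference memory clock.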
Essentially, the GTX 560 -- built on TSMC's 40nm fabrication process, with a transistor count of 1.95 billion and a die size of 367mm² -- is an overclocked GTX 460 with improved power consumption. Thanks to the updated Fermi architecture's transistor leakage improvements over its first generation counterparts, NVIDIA pegs the GeForce GTX 560's TDP at 160W -- or 10W lower than the GTX 460. We will take a look into the Gigabyte GeForce GTX 560 1GB's power consumption and idle/load temperatures later on in this review to see how it fares against the competition.
The most interesting design aspect of this GPU is that NVIDIA ships the GeForce GTX 560 with no reference board design. The reason behind this boils down to several crucial factors: more and more retail graphics cards are shipped overclocked from the factory, the GF114 core has excellent overclocking characteristics, manufacturers are trying to differentiate their products from each other, and there are no stringent OEM power consumption requirements. As such, NVIDIA has actually left both the board design and clock speeds to its board partners. The recommended, or "stock", clock speed is 810MHz core, 1620MHz shader, and 1001MHz memory. However, almost no one ships the GTX 560 stock, and Gigabyte is no exception -- our supposedly bone-stock Gigabyte GeForce GTX 560 1GB is built on a custom PCB with a Windforce 2X cooler configured at 830MHz core, 1660MHz shader, and 1002MHz memory. We have seen cards clocked as high as 950MHz core from the factory at press time, so how much potential does Gigabyte keep under the hood with this very conservatively clocked card?
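To put those clock speeds in perspective, peak single-precision shader throughput on Fermi is conventionally estimated as two FLOPs (one fused multiply-add) per shader core per shader clock. A minimal sketch, using the shader clocks quoted above; this is a theoretical ceiling, not a measured benchmark result:

```python
# Estimating peak single-precision throughput from the shader clock,
# using the standard Fermi rule of 2 FLOPs per core per shader clock.

SHADER_CORES = 336  # GTX 560 (non-Ti)

def peak_gflops(shader_clock_mhz):
    """Theoretical peak single-precision GFLOPS at a given shader clock."""
    return SHADER_CORES * shader_clock_mhz * 2 / 1000

stock = peak_gflops(1620)      # NVIDIA's recommended clock
gigabyte = peak_gflops(1660)   # this card's factory setting

print(stock, gigabyte)
# 1088.64 1115.52
```

In other words, Gigabyte's mild 40MHz shader bump buys a little under 27 GFLOPS of theoretical headroom over the recommended clocks.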
With that in mind, we will take a close look at the Gigabyte GV-N56GOC-1GI GeForce GTX 560 1GB OC in detail on the next page, followed by our usual battery of benchmarks, before moving on to the power usage, temperature, noise, and overclocking tidbits. Stay tuned for all the juicy details in the rest of this review!
1. Introduction, Specifications, Bundle
2. NVIDIA GTX 560 Series Architecture
3. A Closer Look, Test System
4. Benchmark: 3DMark 11
5. Benchmark: Battlefield: Bad Company 2
6. Benchmark: Call of Duty: Black Ops
7. Benchmark: Colin McRae: DiRT 2
8. Benchmark: Far Cry 2
9. Benchmark: Just Cause 2
10. Benchmark: Metro 2033
11. Benchmark: Unigine: Heaven v2.5
12. Power Usage, Temperature, Noise
13. Overclocking and Conclusion