OCing 8800GTX - manual and factory mode
Author: Luka Rakamaric
Date: 08 Apr 2007
With the 8800 series having been on the market for a few months now, everybody has turned their attention to the newer models, like the 320 MB version of the GTS, or the upcoming R600 or 8900 series. Older cards like the 8800GTX, which is still the fastest card around, get little attention unless they are somehow special, like being factory overclocked or having a custom heatsink. So, when we got Galaxy's 8800GTX and Point of View's 8800GTX XO, we decided to look at the difference between stock and do-it-yourself overclocking, and at the same time check some of the information that recently appeared on the forums about overclocking the 8800 series.
The heart of the 8800GTX is the G80 GPU, the first 'next generation' chip supporting DX10 and using a completely different architecture than its predecessors. NVIDIA has implemented a so-called unified shader architecture, meaning that there are no longer separate pixel, vertex, or geometry shaders, but only one type of shader unit that can do all of those jobs. This was done with DX10 in mind, as it is the first API to fully take advantage of this architecture. The G80 has 128 such stream processors, or SPs, which are clocked much higher than the rest of the GPU. We have seen examples of different regions of a core operating at different frequencies in the past. In the G80, the SPs operate at a stock 1350 MHz, more than twice the speed of the rest of the GPU, which is clocked at 575 MHz. The G80 in the 8800GTX is paired with a 768 MB frame buffer of GDDR3 memory operating at a stock 900 MHz.
So, what happens when you overclock your card? Most utilities, including NVIDIA's own nTune, will offer you +1 MHz increments on both the core clock and the memory clock. No problem, you say: raise the clock from 575 MHz to a round 600, get some performance increase, and go to bed happy.
Recently we got some information from our friends at TheTechRepository that this is not what actually happens. The GPU derives its clock from a frequency oscillator, and in the case of the G80, that clock can't be changed in 1 MHz steps. Through trial and error, and some guessing at the multipliers that might be involved, it was determined that the core clock moves in discrete jumps rather than tracking the value you set. The following numbers (all in MHz) show what clock the core is actually operating at when you select a value in the overclocking application. The list was created using SysTool for overclocking and RivaTuner, whose Hardware monitoring page shows the actual clocks.
from 554 to 571 = 567
from 572 to 584 = 576
from 585 to 603 = 594
from 604 to 616 = 612
from 617 to 634 = 621
from 635 to 661 = 648
from 662 to ??? = 675
So we can see that when you set your core to 600 MHz, it actually runs at 594 MHz, which is exactly one percent less than what you requested.
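The measured brackets above can be sketched as a simple lookup. The range and actual-clock values are the ones reported; the function and its name are my own illustration, not part of SysTool, RivaTuner, or any NVIDIA tool:

```python
# Observed G80 core clock quantization, per the measurements above.
# Each entry is (requested_low, requested_high, actual), all in MHz.
# The upper bound of the last bracket was not determined in the original tests.
STEPS = [
    (554, 571, 567),
    (572, 584, 576),
    (585, 603, 594),
    (604, 616, 612),
    (617, 634, 621),
    (635, 661, 648),
]

def actual_core_clock(requested_mhz: int) -> int:
    """Return the clock the G80 core actually runs at for a requested value."""
    for low, high, actual in STEPS:
        if low <= requested_mhz <= high:
            return actual
    if requested_mhz >= 662:
        return 675  # upper limit of this bracket unknown in the measurements
    raise ValueError("requested clock is below the measured range")

print(actual_core_clock(600))  # 594: a 'round 600' really runs 1% slower
print(actual_core_clock(575))  # 576: even the stock setting maps to a step
```

Incidentally, most of the actual clocks (567, 594, 621, 648, 675) are consecutive multiples of 27 MHz, which may be what the multiplier guesswork mentioned above was circling around; 576 and 612 break that pattern, so the full scheme is evidently more complicated.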