This week the GPU Flashback Archive series turns its attention to the NVIDIA 9 series of graphics cards, which replaced the successful and much-loved 8 series. Arriving in April 2008, the new series featured an updated GPU design that eventually moved to a new 55nm manufacturing process. The period also marks a time when ATI and NVIDIA were trading blows as equals, an era when taking the performance crown was all that mattered, and one that proved just how beneficial healthy competition can be for consumers. Let’s go back in time and revisit the NVIDIA 9 series, the cards that were popular on HWBOT, and some of the more notable scores submitted to the database.
NVIDIA GeForce 9: Overview
The era of the NVIDIA GeForce 9 series is actually one of considerable overlap. When the 9 series became available in stores at launch on April Fools’ Day 2008, a full array of 8 series cards was still available in the retail channel. There’s nothing too odd about that, as the previous generation typically gets a price cut to help clear inventory. It is a little odd, however, that the next-generation GTX 200 series arrived on shelves just three months later. Today we’ll try to keep things simple and focus on exactly what the GeForce 9 series offered. The 9 series may always be compared to the second-revision Tesla chips that followed it, but for now we’ll leave the GTX 200 series for next week’s edition.
At launch on April 1st 2008, NVIDIA brought to the table two new flagship cards that would effectively replace the GeForce 8800 Ultra and 8800 GTX: the GeForce 9800 GTX, which retailed for $299, and the dual-GPU, dual-PCB GeForce 9800 GX2, which arrived demanding a fee of $599. Both cards used the G92 GPU, essentially a redressed and tweaked version of the G80 used in the previous generation.
Here’s a shot of a GeForce 9800 GX2, which features a full shroud cooler not too dissimilar to those used on today’s cards.
The GeForce 9800 GX2 used a pair of PCBs, each fitted with a G92 GPU and memory chips. It was pretty much 2-way SLI in a single, two-slot design, as you can see from the image below. It’s also one of the first cards we see fitted with an HDMI port.
If we compare the G80 and G92 GPUs, there are in fact only a handful of genuine differences to talk about. The G92 was manufactured by TSMC on a 65nm process, improving on the 90nm and 80nm silicon used in the GeForce 8 series. One other salient improvement was the implementation of PCIe Gen 2.0 support, which offered boards more potential bandwidth to the rest of the system. NVIDIA also cited improved color and z-compression as key new features, but one gets the notion that the marketing team was a little limited in terms of ‘new stuff’ to talk about. In reality NVIDIA was busy working on its next-generation 200 series, which would boast a wholly new and improved version of its successful Tesla architecture; the G92 was based on the same Tesla architecture as the previous generation. So why bother creating the 9 series at all?
The graphics card market is one technology space which (on occasion) can be genuinely dynamic and fast paced. ATI (now AMD, of course) and NVIDIA had created a duopoly in which both companies strove to push out GPUs offering leading-edge performance. Despite avid fanboyism in certain quarters, most enthusiasts purchase the graphics card that will give them the highest frame rates at the lowest price. In mid-2008 that was the ATI Radeon HD 4850, a $199 card that was a real hit with enthusiasts. It was the card to get at the time, and it made NVIDIA’s 8 series virtually redundant as a mainstream choice.
NVIDIA’s next-gen Tesla architecture was still several months away, and the move to a newer 55nm process would also prove to be a challenge. Seldom do we see a new architecture arrive on a new manufacturing process; Intel’s ‘tick-tock’ model formalised exactly that principle, alternating process shrinks and architecture changes. NVIDIA knew it was a wise move to create a new graphics card series with aggressive pricing that could also serve as a test run for the new 55nm process. Thus the 9 series was born.
The price of the GeForce 9800 GTX and its 65nm G92 GPU was soon dropped to a $199 price point in June 2008 as a direct response to ATI and the Radeon HD 4850. This period really underlines the benefits of a competitive market: even in a duopoly, with only two players competing, the result is better value for consumers. If AMD were truly competitive today, high-end enthusiast Pascal cards would be considerably cheaper.
The GTX was joined in the market by the GeForce 9800 GTX+, which launched at $229 featuring a 55nm G92b GPU and was tasked with wooing gamers with higher GPU and shader clocks. It also happens to be the most popular 9 series card in terms of submissions to the HWBOT database.
Check out the EVGA take on the GeForce 9800 GTX+ below.
The Most Popular NVIDIA GeForce 9 Card: The GeForce 9800 GTX+
It’s time to take a look at the most popular NVIDIA 9 series cards in terms of submissions to the HWBOT database:
- GeForce 9800 GTX(+) – 19.68%
- GeForce 9600 GT – 18.09%
- GeForce 9800 GX2 – 15.82%
- GeForce 9800 GT – 13.95%
- GeForce 9500 GT (DDR2) – 5.86%
- GeForce 9400 GT (DDR2) – 4.01%
- GeForce 9600 GSO (GDDR3 / G92, 192-bit) – 3.01%
- GeForce 9500 GT (GDDR3, 128-bit) – 2.65%
- GeForce 9600M GT (GDDR3) – 1.45%
- GeForce 9600 GT (1024MB) – 1.41%
The GeForce 9800 GTX+ is the most popular NVIDIA 9 series card, appearing in just under 20% of all 9 series submissions on HWBOT. The more affordable 9600 GT is second with 18%, while the dual-GPU 9800 GX2 is surprisingly popular with more than 15% of the pie. Let’s examine the key features that made the GeForce 9800 GTX+ such a popular card.
Here’s a single slot BFG GeForce 9800 GTX+ card which arrives with a snazzy custom water block.
The GeForce 9800 GTX+ uses a 55nm version of the G92 GPU dubbed the G92-420-B1. The chip itself has a die size of 260mm² and contains 754 million transistors. This is an increase in transistor count compared to the G80 GPU used in the previous generation (682 million transistors), a chip which actually had a significantly larger 484mm² die. In terms of clocks, the 9800 GTX+ has its GPU at 738MHz, up from 675MHz on the original 65nm 9800 GTX. Memory clocks remain unchanged at 1,100MHz (2,200MHz effective) on a narrower 256-bit bus. In terms of shader units, both 9800 GTX cards feature 128, the same as the 8800 GTX. The G92 however features 64 texture mapping units, double that of the previous generation. The card supports the same DX10 standard with Shader Model 4.0, as well as OpenGL 3.3 and OpenCL 1.1. CUDA 1.1 also makes its debut.
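Those memory numbers make it easy to sanity-check the card’s peak bandwidth: GDDR3 transfers data on both clock edges, so the 1,100MHz memory clock gives 2,200MT/s across the 256-bit bus. A minimal sketch of the arithmetic:

```python
def mem_bandwidth_gbps(effective_mts: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s from the effective transfer rate
    (in MT/s) and the memory bus width (in bits)."""
    bytes_per_transfer = bus_width_bits / 8
    return effective_mts * bytes_per_transfer / 1000  # MB/s -> GB/s

# 9800 GTX+: 1,100MHz GDDR3 (2,200MT/s effective) on a 256-bit bus
print(mem_bandwidth_gbps(2200, 256))  # 70.4 GB/s
```

At 70.4GB/s the GTX+ actually trails the 8800 GTX, whose slower 1,800MT/s memory sat on a wider 384-bit bus for 86.4GB/s, which is why the narrower bus was worth noting.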
The card itself featured 512MB of GDDR3 and came with two dual-link DVI outputs and an S-Video out (HDMI would be available with certain custom third-party designs). The card was a two-slot design and, as with previous-generation flagship cards, used a pair of 6-pin power ports. It had a TDP of 141 watts, 14 watts lower than the 8800 GTX and (oddly) 1 watt more than the original 9800 GTX. The reference cooler was also a similar design to what we saw on the 8800 GTX, with fans reaching an audible 40+ dB under full load.
Check out this PCB of a 9800 GTX+ card below:
In terms of overclocking, the 9800 GTX+ actually had plenty of headroom for enthusiasts to play with, even with reference cooling. This is what Nathan Kirsch, writing for Legit Reviews, had to say on the topic back in June 2008:
On the original GeForce 9800 GTX, we were able to take the core from 675MHz all the way up to 800MHz and the memory from 1100MHz to 1200MHz with little effort. The most impressive part of the overclock though was the improvement on the shaders. We were able to overclock the shaders from 1688MHz to an impressive 2025MHz!
With the new 55nm core… the highest settings we could get stable with the GeForce 9800 GTX+ were 855MHz on the core and 2200MHz on the shaders. That is 55MHz higher on the core and 175MHz higher on the shaders than what we could reach on the old 65nm G92 core.
NVIDIA GeForce 9: Record Scores
We can now take a look at some of the highest scores posted on HWBOT using the NVIDIA GeForce 9800 GTX+ card.
Highest GPU Frequency
Although technically speaking GPU frequency (as with CPU frequency) is not a true benchmark, it remains an important metric for many overclockers. The submission with the highest GPU core frequency in the HWBOT database using a 9800 GTX+ comes from AnomanderRake (UK), who pushed his card to 1,240MHz, a massive +83.70% beyond stock settings. His graphics memory was configured at an impressive 1,800MHz (+63.64%). The rig also included an Intel Core i7 3930K ‘Sandy Bridge-E’ processor clocked at 5,118MHz (+59.94%).
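The quoted percentage gains appear to be measured against the original 9800 GTX reference clocks (675MHz core, 1,100MHz memory) rather than the GTX+’s 738MHz core; a small helper, hypothetical and included only to reproduce the quoted figures:

```python
def oc_gain_pct(achieved_mhz: float, stock_mhz: float) -> float:
    """Overclock headroom as a percentage gain over the stock clock."""
    return round((achieved_mhz / stock_mhz - 1) * 100, 2)

# AnomanderRake's run, measured against 675MHz / 1,100MHz reference clocks
print(oc_gain_pct(1240, 675))   # core: 83.7 (the quoted +83.70%)
print(oc_gain_pct(1800, 1100))  # memory: 63.64 (the quoted +63.64%)
```

The same 675MHz baseline reproduces the +60.00% and +52.00% core figures quoted for the other record submissions below.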
You can find the 3DMark Vantage submission from AnomanderRake here on HWBOT: http://hwbot.org/submission/2341145_anomanderrake_3dmark_vantage___performance_geforce_9800_gtx()_9357_marks
Highest 3DMark06 Score
The highest 3DMark06 score submitted to HWBOT using a single NVIDIA GeForce 9800 GTX+ card was made by legendary Aussie GPU pusher SniperOz. He pushed a 9800 GTX+ card to 1,080MHz (+60.00%) on the GPU core and 1,404MHz (+27.64%) on the graphics memory. With this configuration he managed a hardware first place score of 27,341 marks. The submission was actually fairly recent, made in December 2015 as part of Australia’s assault on the HWBOT Country Cup 2015.
Here’s a close up of the LN2 cooled card in action. It also showcases some of the neatest insulation you will ever see.
You can find the submission from SniperOz here on HWBOT: http://hwbot.org/submission/3068131_sniperoz_3dmark06_geforce_9800_gtx()_49.8_hardware_points/
Highest Aquamark Score
In the classic Aquamark benchmark we find that Niuulh (France) has the highest score with a GeForce 9800 GTX+. The G92 GPU on the card was clocked at 1,026MHz (+52.00%) with memory boosted to 1,332MHz (+21.09%). This configuration allowed Niuulh to hit a score of 508,794 marks. The score was made just a few months ago and will no doubt have benefitted from the card being paired with an Intel Core i7 7700K ‘Kaby Lake’ chip clocked at 6,700MHz (+59.52%).
Check out this shot which includes some nice blow torch action.
You can find the submission from Niuulh here on HWBOT: http://hwbot.org/submission/3457755_niuulh_aquamark_geforce_9800_gtx()_508794_marks
Thanks for joining us for this week’s episode of the GPU Flashback Archive series. Come back next week and join us for a look at the landmark NVIDIA GeForce 200 series of graphics processors and cards.