Meet The GeForce GTX 670

Because of GK104's relatively low power consumption compared to past high-end NVIDIA GPUs, NVIDIA has developed a penchant for small cards. While the GTX 680 was a rather standard 10” long, NVIDIA also managed to cram the GTX 690 into the same amount of space. Meanwhile the GTX 670 takes this to a whole new level.

We’ll start at the back, as this is really where NVIDIA’s fascination with small size makes itself apparent. The complete card is 9.5” long; however, the actual PCB is far shorter at only 6.75” long, 3.25” shorter than the GTX 680’s PCB. In fact it would be fair to say that rather than strapping a cooler onto a card, NVIDIA strapped a card onto a cooler. NVIDIA has certainly done short PCBs before – such as with one of the latest GTX 560 Ti designs – but never on a GTX x70 part. Given the similarities between GK104 and GF114, however, this isn’t wholly surprising, if not outright expected.
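To put those figures together, here is a quick back-of-the-envelope check using only the dimensions quoted in this review (the variable names are our own):

```python
# Sanity check on the card dimensions quoted above (all lengths in inches).
gtx680_pcb = 10.0    # GTX 680 reference PCB, the "standard" length
gtx670_card = 9.5    # GTX 670 total card length, cooler included
gtx670_pcb = 6.75    # GTX 670 reference PCB

# How much PCB NVIDIA shaved off relative to the GTX 680
pcb_savings = gtx680_pcb - gtx670_pcb
# How far the cooler hangs past the end of the GTX 670's PCB
cooler_overhang = gtx670_card - gtx670_pcb

print(f'PCB shortened by {pcb_savings}"')                 # 3.25"
print(f'Cooler overhangs the PCB by {cooler_overhang}"')  # 2.75"
```

In other words, nearly a third of the GTX 670's length is cooler with no PCB underneath it at all.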

In any case this odd pairing of a small PCB with a large cooler is no accident. With a TDP of only 170W NVIDIA doesn’t necessarily need a huge PCB, but because they wanted a blower for cooling they needed a long cooler. The positioning of the GPU and various electronic components meant that the only place to put a blower fan was off of the PCB entirely, as the GK104 GPU already sits fairly close to the rear of the card. Meanwhile the choice of a blower seems largely driven by the fact that this is an x70 card – NVIDIA did an excellent job with the GTX 560 Ti’s open air cooler, which was designed for the same 170W TDP, so the choice is effectively arbitrary from a technical standpoint (there’s no reason to believe $400 customers are any less likely to have a well-ventilated case than $250 buyers). Accordingly, it will be NVIDIA’s partners stepping in with open air coolers of their own design.

Starting as always at the top, as we previously mentioned, the reference GTX 670 is outfitted with a 9.5” long fully shrouded blower. NVIDIA tells us that the GTX 670 uses the same fan as the GTX 680, and while the two are nearly identical in design, based on our noise tests they’re likely not identical. On that note, unlike the GTX 680 the fan is no longer placed high to line up with the exhaust vent, so the GTX 670 is a bit more symmetrical in design than the GTX 680 was.


Note: We disassembled the virtually identical EVGA card here instead

Lifting the cooler we can see that NVIDIA has gone with a fairly simple design here. The fan vents into a block-style aluminum heatsink with a copper baseplate, providing cooling for the GPU. Elsewhere we’ll see a moderately sized aluminum heatsink clamped down on top of the VRMs towards the front of the card. There is no cooling provided for the GDDR5 RAM.


Note: We disassembled the virtually identical EVGA card here instead

As for the PCB, as we mentioned previously, due to the lower TDP of the GTX 670 NVIDIA has been able to save some space. The VRM circuitry has been moved to the front of the card, leaving the GPU and the RAM towards the rear and allowing NVIDIA to simply omit a fair bit of PCB space. Of course with such small VRM circuitry the reference GTX 670 isn’t built for heavy overclocking – like the other GTX 600 cards, NVIDIA isn’t even allowing overvolting on reference GTX 670 PCBs – so it will be up to partners with custom PCBs to enable that kind of functionality. Curiously, only 4 of the 8 Hynix R0C GDDR5 RAM chips are on the front side of the PCB; the other 4 are on the rear. We typically only see rear-mounted RAM on cards with 16/24 chips, as 8/12 will easily fit on the same side.
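For those keeping score on the memory layout, the 8-chip arrangement works out as follows – a rough sketch assuming the common 2Gb (256MB), 32-bit-wide GDDR5 parts of the era, which lines up with the card's advertised 2GB/256-bit configuration:

```python
# Rough check of the GTX 670's memory configuration, assuming each
# Hynix GDDR5 chip is a 2Gb (256MB) part with a 32-bit interface.
front_chips = 4
rear_chips = 4
chips = front_chips + rear_chips

chip_capacity_mb = 256   # 2Gb per chip (assumed)
chip_bus_width = 32      # bits per GDDR5 chip (assumed)

total_vram_mb = chips * chip_capacity_mb    # 2048MB = 2GB
total_bus_width = chips * chip_bus_width    # 256-bit memory bus

print(total_vram_mb, total_bus_width)  # 2048 256
```

With only 8 chips needed to populate the full 256-bit bus, the front/rear split appears to be a space-saving choice on the short PCB rather than an electrical necessity.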

Elsewhere at the top of the card we’ll find the PCIe power sockets and SLI connectors. Since NVIDIA isn’t scrambling to save space like they were with the GTX 680, the GTX 670’s PCIe power sockets are laid out in a traditional side-by-side manner. As for the SLI connectors, since this is a high-end GeForce card NVIDIA provides 2 connectors, allowing for the card to be used in 3-way SLI.

Finally at the front of the card NVIDIA is using the same I/O port configuration and bracket that we first saw with the GTX 680. This means 1 DL-DVI-D port, 1 DL-DVI-I port, 1 full size HDMI 1.4 port, and 1 full size DisplayPort 1.2 port. This also means the GTX 670 follows the same rules as the GTX 680 when it comes to being able to idle with multiple monitors.


414 Comments


  • CeriseCogburn - Sunday, May 13, 2012 - link

    And thus we can put the "sad performance increase" of this generation to rest, even though you tried to back chizow on it in the prior pages.

    ROFL - way to go.
  • CknSalad - Thursday, May 10, 2012 - link

    I don't see the point in even getting the GTX 680. Both NVIDIA and AMD show a good improvement of about 30-40% in their flagship over their best last-gen card. Other than that, I can't help but feel underwhelmed with the current $250 and $350 range cards, as they are merely more power efficient (which is great) but with maybe 5% or so better performance. Hopefully we see better midrange to upper-midrange cards in the near future, as I don't want to spend any more than $350, preferably $300-$325. AMD is just looking really bad this round. The only props I can give them is that they made a good gaming and compute card at the same time. Unfortunately, I feel that most gamers will not care too much for compute. It'll depend mainly on whether games become more compute heavy like Metro 2033 or Crysis, but I highly doubt it, as most games are consolized, and even with the next gen consoles, PC games will still be held back a bit.
  • raghu78 - Thursday, May 10, 2012 - link

    I can't help but be surprised by how unintelligent your comments are. Look at the most demanding games released in the last 12 months - BF3, Alan Wake, Crysis 2, Batman: Arkham City, The Witcher 2, Shogun 2: Total War, Anno 2070. Are you able to max out these games at 2560 x 1600 and get 60 fps? No, you can't. I can bet the HD 7970 will win or tie each one of these games against the GTX 680 OC, especially looking at HD 7970 OC editions available at up to 1.1GHz with headroom up to 1300MHz with a voltage OC.
    What happens when games like Crysis 3 and Metro: Last Light release in 2013? They will be hard to max out at 1080p. So please don't make such stupid statements. PC gaming is about gaming at maximum image quality at ultra high resolutions and,
    last but not least, multi-monitor gaming. Also, the PS4 and Xbox Next are to be launched in 2013, so the graphics quality of games will improve significantly.
  • maximumGPU - Thursday, May 10, 2012 - link

    dude please stop it.
    we get it, you're an AMD fanboy whose feelings were hurt when the GPU from his favourite manufacturer lost the race.

    Don't give one website or one review as evidence. Taken as a whole and across many reviews, most put the 680 on top as the fastest card available stock for stock (for gaming; compute is a different story).

    Start talking OC'ed editions and there won't be consistent comparisons, as it's dependent on 3rd party PCBs and coolers. No doubt there will be nicely overclocked cards from both camps, but the most relevant comparison is the one done at stock speeds.
  • raghu78 - Thursday, May 10, 2012 - link

    This is escapist tendency. People running 1150MHz on HD 7970 OC editions at stock voltage are on the forums. I have got their feedback. In fact, others who have pushed to 1250 with extra voltage have said the scaling is phenomenal. So if you know the facts, then speak. At USD 500 you are talking about the high end of the market, where people go for maximum performance with a stock voltage OC or a maximum voltage OC, and some even modify their cards with watercooling. Websites will say what they have to. It's up to you to deduce the real potential and value.
    Then there is the availability question. All this comparison is irrelevant if you are talking about a product like the GTX 680, which 6 weeks after launch is difficult to find. Don't give me the crap that it's only extraordinary demand and not a supply problem. I am not hurt at all. In fact I had an Nvidia 7950 GT before my current HD 6950. What pisses me off is the constant 925MHz stock comparisons raised by Nvidia fanbois when there are cards like the MSI HD 7970 Lightning (1070MHz) and Powercolor HD 7970 Vortex II (1100MHz).
  • ksheltarna - Thursday, May 10, 2012 - link

    I just ordered a GTX 680, and just sold my 2-year-old Radeon HD 5850 for $120.
    I must say, the GTX 680 is the best bang for your buck right now.
    Only 1 shop had it in stock in Denmark; it's selling out as soon as it shows up, so I guess there are plenty of enthusiasts out there.. :)
    I had to add this to my X79 Asus board with the six-core i7 and 32GB of RAM.. and connected to my two 27-inch Dell monitors running at 2560 x 1440..
    Now I can play the best games and do music production, all in one workstation..
    I'm not an Nvidia fan, I was on AMD's side back when they used to make good processors, but since then I moved to Intel cores, and now to Nvidia.
    Gotta be impartial when it comes to hardware. :)
  • CeriseCogburn - Friday, May 11, 2012 - link

    Here rag, have a looksie, and this time I will translate the chart, but before that imagine the hundreds of thousands of angry nVidia fans who have had to look at YEARS of charts with the GTX 570's 732MHz stock core (914MHz average air OC on hwbot) put up against the 6850 or 6970, and now the 7850 and 7870 with much higher stock core clocks...

    Or the GTX 580, an absolute winning monster at stock clocks, and almost always shown at its stock clock of 777MHz, a very low stock clock, when just the average OC is 951MHz, nearly 200MHz of raw power higher...
    http://hwbot.org/hardware/videocard/geforce_gtx_58...

    So your complaint, if you call it valid, applies to the last THREE nVidia generations... at the very least, and can we say, given the same standard you claim is needed, nVidia was SCREWED for years on end here... ?

    YES WE CAN.

    Now how about some truth when it comes to "at the same clockspeed" ? There's the link.

    http://hexus.net/tech/reviews/graphics/37209-gefor...

    At stock 7970 is only 86.6% and 88.4% the speed of the 680 (1920 and 2560 respectively)

    At the same clockspeed as the 680, the 7970 loses twice again, behind about 6% and 4% in each case, 1920X and 2560X resolution.

    The 7970 is SLOWER than the 680 at the same clockspeed.
  • CeriseCogburn - Sunday, May 13, 2012 - link

    this is escapist tendency, you are in the GTX670 thread raghu78

    it is $399 and available

    it beats the 7970

    it's way better

    the 7970 costs way more

    amd drivers suck

    let us know when you've escaped the amd podpeople orbit ship
  • snakefist - Thursday, May 10, 2012 - link

    "PC gaming is about gaming at maximum image quality at ultra high resolutions and in fact last but not the least multi monitor gaming"

    hmmm, so PC gaming is basically a 10x costlier console? why do I occasionally like to play a game on my PC, and am rather successful at having fun with it, when I didn't pay $2000 for it? guess cause I'm unintelligent... oh, ultra high resolution is a nice thing to have, but maximum image quality - often I fail to see any difference, except in framerate, and the faster paced the game is, the harder quality differences are to spot, as opposed to framerate drops

    "Also the PS4 and Xbox Next are to be launched in 2013 so the graphics quality of games will improve significantly."

    you are aware that all three new gaming consoles will be based on mid-range AMD cards, aren't you? and you still expect that graphics quality will increase significantly? both AMD and NVIDIA are likely to have another generation of cards launched by then...
  • raghu78 - Thursday, May 10, 2012 - link

    You want a good example? Alan Wake on Xbox 360 versus Alan Wake on PC at 2560 x 1600 maxed out. Try the difference. I have played Alan Wake on PC at 1080p maxed out, and I can say it was fantastic. I have seen the Xbox 360 version on YouTube; the quality is much better on the PC. It's not just texture quality and anti-aliasing, but even lighting and visual effects like god rays.
    I can say that BF3 would similarly look much better on the PC maxed out in DX11 at 2560 x 1600. If you prefer to cheap out and go for the console, that's fine. But the PC is the ultimate gaming platform.
