NVIDIA Moves Fermi GPUs to Legacy Status, Ends Mainstream Driver Support for 32-bit Operating Systems
by Nate Oh on April 7, 2018 8:00 AM EST
This week, NVIDIA has announced that they are ending mainstream graphics driver support for Fermi-based GeForce GPUs. Effective as of this month (i.e. immediately), all Fermi products are being moved to legacy support status, meaning they will no longer receive Game Ready driver enhancements, performance optimizations, and bugfixes. Instead, they will only receive critical bugfixes through the end of the legacy support phase in January 2019.
While the announcement only mentions ‘Fermi series GeForce GPUs,’ the actual support plan specifies that mainstream driver support is limited to Kepler, Maxwell, and Pascal GPUs, so presumably all Fermi products are affected.
In the same vein, also effective this month is NVIDIA dropping mainstream driver support for 32-bit operating systems, as announced in December 2017. Like Fermi, 32-bit operating systems will still receive critical security updates through January 2019. This update also encompasses GeForce Experience, which will no longer receive software updates for Windows 32-bit operating systems.
Given that the current driver - March’s version 391.35 - is on the Release 390 branch, the next branch is likely due to release later this month, and it will presumably drop support for Fermi and 32-bit operating systems simultaneously.
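To illustrate how the branch numbering works, here is a minimal sketch that maps a driver version string to this reasoning. The cutoff used (Release 390, covering the 390.xx–391.xx drivers, as the last branch with mainstream Fermi support) is an assumption drawn from the article, not an official NVIDIA rule:

```python
def fermi_mainstream_support(driver_version: str) -> bool:
    """Rough check: assumes Release 390 (driver versions 390.xx-391.xx)
    is the last branch with mainstream Fermi support, and that any
    later branch drops it. The 391 cutoff is an assumption."""
    major = int(driver_version.split(".", 1)[0])
    return major <= 391

# March's 391.35 sits on the Release 390 branch, so Fermi is still covered:
print(fermi_mainstream_support("391.35"))  # True
# A hypothetical driver from a later branch would fall outside support:
print(fermi_mainstream_support("395.00"))  # False
```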
In context, NVIDIA’s previous architecture retirement came in March 2014, when their D3D10 Tesla architecture GPUs were moved to legacy status after around 8 years of support. And with this week’s announcement, Fermi has received mainstream support for around the same amount of time, marking the beginning of the end for NVIDIA's first D3D11-class GPU architecture.
However, it's interesting to note that Fermi's legacy support window will end up being much shorter than Tesla's: just ten months, versus two years for Tesla. This may be a distinction that proves important, as complex and highly privileged video drivers have been an ongoing source of security vulnerabilities - including, in NVIDIA's case, as recently as this year. So while the vast majority of Fermi cards have been retired, for any that remain (particularly those in Internet-connected machines) the end of security updates is not a trivial matter.
In comparison, AMD’s GPUs contemporaneous to Fermi were moved to legacy status in 2015, with all pre-Graphics Core Next architectures affected. On AMD’s side, retiring pre-GCN products meant that all their supported GPUs were DX12 capable.
Of NVIDIA’s Fermi, Kepler, Maxwell, and Pascal architectures, Fermi was the only one without DX12 support at launch, and the current state of DX12 on Fermi remains somewhat unclear. Last summer, NVIDIA’s drivers appeared to quietly enable it, and Fermi products are listed as DX12-supported GPUs, but NVIDIA’s DX12 GPU support page still notes that Fermi support is pending. In any case, this retirement puts the focus on D3D12-supported GPUs, comparable to how NVIDIA’s 2014 retirement of D3D10 GPUs meant the retirement of all pre-D3D11 products.
NVIDIA support has also posted a list of Fermi series GeForce GPUs affected by this change.
r3loaded - Saturday, April 7, 2018 - Goodbye Thermi, you were always the butt of many internet jokes and memes.
Samus - Saturday, April 7, 2018 - I'll never forget the nights my GTX470 and I would cuddle up, the sensuous hum at idle ever so present, swiftly heating the cubby near my feet.
Hurr Durr - Saturday, April 7, 2018 - One wonders why it didn't happen sooner.
madwolfa - Saturday, April 7, 2018 - Long Term Support / Enterprise.
DanNeely - Sunday, April 8, 2018 - It's been less than 4 years since the last Fermi products (a few low end 700 series branded cards) were released. Providing at least nominal support (new drivers work, but not much in the way of per game optimization is done) for old architectures until they're woefully obsolete is SOP for gaming cards.
AMD did pull the plug on their similarly old VLIW4 cards a few years sooner; but that was mostly noteworthy because it happened as early as it did. Presumably that happened because the radical architecture change to GCN meant that they weren't able to piggyback on the work done with more modern designs and supporting them was a lot more expensive.
spaceship9876 - Saturday, April 7, 2018 - Yet Nvidia still sells the Fermi GT710 for ~$35, which has no direct modern replacement at this price point. They also sell the GT610 and GT210.
cyrand - Saturday, April 7, 2018 - I thought the Fermi GT710 was OEM only and the GT710 that's currently being sold is the Kepler-based one.
DanNeely - Saturday, April 7, 2018 - Is it actually NV selling them, or just 3rd parties working on ancient inventory?
The growing size of video en/decode blocks means that NV really can't go any smaller than the 1030 with current generation tech. The GP-107 (1050/Ti) was already down to only 50% GPU cores and 50% everything else in die area, while dropping the PCIe link from 16x to 4x and halving the memory bus to 64-bit for GP-108 means there's no room to cut further unless they start cutting into higher end en/decoding hardware. At the point of doing that, though, they might as well just dust off GM108 instead.
427269616e - Sunday, April 8, 2018 - Isn't the replacement integrated graphics? They are surprisingly decent now.
DanNeely - Sunday, April 8, 2018 - Upgrading integrated graphics after 2 or 3 years requires a new CPU, often a new mobo, and potentially new RAM; collectively that tends to be a lot more expensive than popping in a half-sized card. Besides, there's still a significant chunk of the barely gaming capable system buying market that still 'knows' they need a separate GPU to game on because the IGP is horrible.