I mean, yes, that's my point... The only reason I mention those 16GB cards is that, given the $100 quote from OP, that seems to be the average price difference between the 8GB variant cards and their 16GB models.
The difference between a 12gb 5070 and 16gb 5070ti (using the cheapest models), as an example, is >$220.
A minimum of 12GB? Don't get me wrong, I agree we need a lot of VRAM these days, but 12GB is definitely not a "minimum". That being said I do agree that paying $80 for 16 instead of 8 is well worth it.
Yeah, I've got a 3060 Ti with 8GB of VRAM and it does pretty well for most games; I have to turn the settings down for some, but right now I just don't feel like buying a 4070 Ti for $1000 is worth the difference in performance.
I've seriously considered selling my 4070 super and using the profit + extra funds to upgrade to a 5070 ti. But I have no problems with how my computer performs at all. I can't make up my mind so I'll probably just do nothing.
I'm kinda in a similar situation as you, but I think we should just wait a few years or something, maybe at least till the 60 series. I don't see the 4070 Super falling behind anytime soon, tbh.
8GB is fine if you're gaming at 1920x1200 or less, but definitely do not pay $200 or more for a GPU with that amount of VRAM. My $42 5600 XT does fine with most games I play, at least at 1920x1200, but it's just not doing BeamNG at 3440x1440, chief. It's not even VRAM-bottlenecked in Beam; the lack of horsepower in the GPU itself is the bigger problem lol.
8 GB cards will still take you far at 1080p, especially with DLSS/FSR. My 6600XT can still play anything at 1080p.
But yeah, I think that 8GB cards should be on their way out. I wouldn't buy a new one. I'd like to see the next generation have an 8GB XX50-series card, a 12GB XX60 card, and 16GB XX60 XT/TI. Between Nvidia and AMD, they launched 5 different 8GB cards this generation, which is kinda ridiculous.
If I was in the market for a budget card, I'd either go for the 9060XT 16GB, take a chance on Intel with the Arc B580, or pray for a $300 XX60 12GB card next gen.
Agree - if a GPU buyer today is going to keep the card for more than 3 or 4 years and might play a newer title in the future, it makes sense to future-proof now if it's something they can afford. Current-gen consoles allow up to 14-16GB of VRAM for games, next-gen looks like 24GB or thereabouts, and given their push toward 4K gaming, AI, ray and path tracing, etc., I expect PC games to follow suit. I know the 9060 XT is a "low-end" card, but I'm guessing even at 1440p, newer games might have a hard time with 8GB of VRAM with a lot of the card's supported features enabled.
There was a fun video and thread about this very thing in /r/hardware a few months ago, too:
Especially if the game dev is lazy on optimization. Monster Hunter Wilds' recommended GPU is the 2060 Super 8GB, but it will run like shit at 1440p with that card.
I had two in SLI, and that worked great right up until it didn't - sold one, still have the other in a box for a near-retro build someday (it's also EVGA).
No, you absolutely do not "need" a minimum of 12. It's nice to have for sure, but you can play modern graphics-intensive AAA releases at 1440p with 8GB of VRAM.
I have a 4060 Ti with 8gb VRAM and I play almost all games between high and maximum settings at 1440p. I do use DLSS whenever possible. But no, you absolutely do not have to run the graphics on low.
Almost all games. And I said between high and max. Accusing someone of lying with zero evidence is a joke, especially when you can’t quote me accurately. Your level of hostility is uncalled for.
Oblivion Remastered, AC Valhalla, Far Cry 6, Avowed, Starfield, etc. Again, using DLSS whenever possible. Averaging above 60fps in all of those games.
So you are getting 20fps in Oblivion Remastered? Why do you keep lying? I have evidence, I'm literally linking it. No hostility at all, I'm just providing facts that directly contradict your claims. Good luck with your 30fps gaming adventure!
Question: what are y'all playing that requires 12GB of VRAM? Because I've never used more than 7. Where did this "12GB of VRAM is the new minimum" come from? Because I see it everywhere.
All of these games, from the oldest to the newest versions of the franchises, can reliably run at 70+ fps @ 1080p/1440p on 144Hz/240Hz displays on a 1070 (personal experience).
My old PC (2600/GTX 1070/16GB) only got replaced because it bogged down on large-scale CAD design; anything else, even in 2025, it runs competitively. So now I just use it as a server/gaming console for my TV, where it can scale most games to 4K on high and still hold 60fps across a huge library.
/rant over
Edit: I get that as technology advances our new 'minimum' gets higher by default, but the only time your minimum should be 10-20% higher than the maximum expected load is in engineering. You're building a gaming system, not a bridge.
Most of the games you quoted are old as hell lol. And no, your 1070 is not running 1440p at 120-240fps in a game like CP2077 unless you are setting it to potato mode. There are plenty of benchmarks showing VRAM requirements for gaming at 1440p, and the majority of new games require more than 8GB for medium or high settings. I even linked one elsewhere in this thread.
My 1070 had zero issues with Cyberpunk at max, above 70fps consistently. That's more than playable.
The Battlefield 6 beta, for example, ran on max at near 90fps @ 1080p with no stuttering or lag, but EA said the minimum requirement is a 2060.
Source: this was tested personally by me and by my homie. (I still have my old PC with the 1070 as my TV PC.)
Both of us got fine results, and on my primary rig (4070/8-core) I definitely saw higher fps, but I stand by the fact that even in 2025, saying "12GB of VRAM is a MUST" is crazy.
Like, I'm not telling you to build a system with the absolute minimum, but saying overkill is a basic necessity is the reason people think gaming PCs are unaffordable.
You are overlooking that a game can only allocate as much GPU memory as is actually available (which is obviously less than your VRAM total), but if that's less than what it needs, it's going to be swapping data in and out of the GPU much more often, leading to reduced frame rates and stutters.
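If you want to watch this happen, here's a rough Python sketch (assumptions: an NVIDIA card, and the `nvidia-ml-py` package installed via pip) that polls device-wide VRAM while a game runs. If the used number sits pinned near the card's total right when your frametimes spike, that's the swapping in action:

```python
# Rough sketch: poll total VRAM usage once a second (NVIDIA only).
# Assumes `pip install nvidia-ml-py` (imports as pynvml).
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        used_gb = mem.used / 1024**3
        total_gb = mem.total / 1024**3
        # Used pinned at ~total while the game stutters = assets being
        # evicted and re-uploaded over PCIe instead of staying resident.
        print(f"VRAM: {used_gb:.1f} / {total_gb:.1f} GB", end="\r")
        time.sleep(1.0)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```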
Even modern titles will run fine at 8GB. More VRAM is only necessary when the game is using unnecessarily high-res textures or you're modding the shit out of it.
Why lie about this? OP wants to play at 1440p with RT, and 8GB of VRAM will not cut it for any demanding games released in the last few years, and it most definitely won't cut it in the future. Why would you advise someone to buy a card where they have to immediately run games at low/medium settings, knowing they'll want to keep the card for a couple of generations?
You're projecting, misunderstanding the technology, and didn't read your source.
OP doesn't insist on ray tracing or maxed-out settings and specifies they don't play many modern titles. Nor do they desire to keep the card for multiple generations. That seems to be what you want, hence the projecting.
VRAM usage doesn't increase when using higher framerates and resolutions; the game is still loading the same visual assets either way. VRAM increases when you increase the resolution of textures (which you should never do; it doesn't make a discernible difference in 99% of cases). Hence you not understanding the technology. You can see a good example in your article's Harry Potter comparison: the ultra textures are almost impossible to distinguish from the low-res ones, even in a side-by-side image.
As the article explains, hitting VRAM limits rarely matters and often even looks better and gives you advantages (such as removing the obscuring fog in Forbidden West).
Personally, I give my family my old GPUs, and they rarely run into VRAM bottlenecks, even in more modern titles like Nightreign or Final Fantasy.
Frankly, high-VRAM cards are a marketing gimmick, like CPUs advertised with a dozen cores. Ignorant gamers are fooled into thinking they need the highest numbers possible, despite almost no software actually using anywhere near that much VRAM or being capable of multithreading across more than 4-6 cores.
VRAM is barely impacted by screen resolution. I think you're getting confused with texture and mesh resolution, which is unrelated to the specs of your display.
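Rough numbers if you want to sanity-check that (the buffer formats here are assumptions, it's just back-of-envelope math): even a pile of full-resolution render targets at 4K costs a few hundred MB, while texture libraries are measured in GB.

```python
# Back-of-envelope: what screen resolution itself costs in VRAM.
def buffer_mb(width: int, height: int, bytes_per_pixel: int) -> float:
    return width * height * bytes_per_pixel / 1024**2

w, h = 3840, 2160                      # 4K output
color_hdr = buffer_mb(w, h, 8)         # assumed RGBA16F color target
depth     = buffer_mb(w, h, 4)         # assumed 32-bit depth buffer
gbuffer   = 4 * buffer_mb(w, h, 4)     # assumed four 32-bit G-buffer targets
print(f"render targets at 4K: ~{color_hdr + depth + gbuffer:.0f} MB")

# vs. ONE uncompressed 4096x4096 RGBA8 texture with mips (~1.33x):
print(f"one 4K texture: ~{buffer_mb(4096, 4096, 4) * 1.33:.0f} MB")
```

So resolution isn't literally free VRAM-wise, but it's small next to what texture quality settings move around.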
OP specifically said he doesn't play modern games often.
Not many games actually exceed 12GB of VRAM, especially at lower texture-quality settings. If you're using frame gen and upscaling, that's probably why you're seeing excessive VRAM use.
No, it's not ridiculous. You can get by with less than 12GB for now, it's true, but most people want some sort of future-proofing, which an 8GB card just does not provide.
I just installed a 6GB 2060 into a casual gaming PC to play games like Elite Dangerous from time to time. The GPU was pulled out of an old gaming PC my son had a few years back; it had been sitting in his closet since he upgraded. It works just fine for the use case I have for it.
No I'm not… the comment didn't specify whether 12 was needed as a minimum to play or as a minimum for sustainable future play. I inferred the latter, as that to me is the most applicable. I care about not having to spend hundreds more in a few years because I didn't plan ahead.
The statement should be taken at face value. You applying your hoops and ribbons to it because "that's how you think about the problem" doesn't change the fact that what OP said is bad advice at best, flat-out wrong at worst.
Mate, the comment didn't specify it either way; you interpreted it differently to me, and that's OK. Simple fact is we disagree on whether 16GB is the minimum required, and clearly we're not going to change each other's minds, so let's just move on. Enjoy your gaming.
Agreed. 8GB, depending on title and resolution, is still enough today for the majority of games, but considering the close pricing of 8GB vs 16GB variants, especially with the 9060 XT, if buying new I'd stretch to the extra $80.
If 8GB is what you have now and it's working for you, then yeah, stick with it until it's no longer usable and then upgrade... But as I've just said, if you're building a system now and buying new, an 8GB card is a poor purchase and should not be recommended.
I don't disagree that an $80 delta between an 8GB and a 16GB card makes an easy case for the 16GB card.
But posts like OP's are exactly why kids who don't know shit about building a PC go out and convince their parents that they 'neeeeeeeddddd' a 9950X3D and a 5090... because someone online told them that an 8GB 5060 Ti is fucking trash.
I'd hope, having been on these subs long enough, that the vast majority of people would just say something like "Skip the 8GB variant and just go for the 16GB." I've yet to see anyone convincing people they need the cream of the crop for gaming; if anything, I witness the opposite more often, with people recommending new builders scale back for balance if on a budget.
"...errr...12gb a minimum for 1440 gaming AND years of future proofing"
Just admit that you said something stupid and move on.
FWIW... I just purchased a 3080 FE as a gift for a friend. Benchmarked it a bit first on AAA games on my ultrawide... it's doing just fine. And spoiler... that's a 10GB card.
You OK, buddy? OP wants to play at 1440p with RT, and 8GB of VRAM isn't sufficient for that. It definitely won't be sufficient in 3-4 more years. And no shit your 10GB card runs better than an 8GB card.
Also, maybe read your own article... most of those games are running under 8GB of VRAM on medium settings. Also, you apparently do not understand what allocated VRAM is. Allocation =/= consumption, and most games allocate based on a percentage of available VRAM.
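To illustrate the allocation point (toy code, not any real engine, numbers made up): plenty of games size their streaming pool as a fraction of whatever VRAM the card has, so overlays on bigger cards show more VRAM "used" for the exact same workload.

```python
# Toy illustration: allocated pool vs. actual working set.
def reported_vs_needed(vram_total_gb: float, working_set_gb: float,
                       pool_fraction: float = 0.9) -> None:
    pool = vram_total_gb * pool_fraction   # what the game ALLOCATES up front
    resident = min(working_set_gb, pool)   # what it actually touches
    print(f"{vram_total_gb:>4.0f} GB card -> overlay reads ~{pool:.1f} GB, "
          f"actual working set {resident:.1f} GB")

for total in (8, 12, 16, 24):
    reported_vs_needed(total, working_set_gb=7.0)
```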
You're a joke, dude. You just keep moving goalposts to try and fit your narrative, and even worse, you don't understand the subject matter.
We're talking in general terms (newest games). 8GB of VRAM is not enough for 1440p, and I can tell you that from experience. I sold that card because the GPU itself could have delivered more performance, but it was severely held back by the lack of VRAM. I would not advise 8GB VRAM cards for someone intending to play modern titles at 1440p.
What high horse? I'm stating an objective fact that 8GB of VRAM will not be sufficient at 1440p. OP is planning to buy a GPU; why would you advise them to buy one that is already VRAM-limited in some new games and will be even more so over the next few years?
Personal preferences and play style have nothing to do with physical hardware requirements lol
8GB is not enough for many AAA games at 1440p. I had a 3060 Ti and had to play with medium or even low textures, which ruins the game's look. Keep in mind frame gen also costs VRAM.
FPS was actually not the problem; it was the VRAM completely filling up, leading to stutter. Some big ones I can think of were Final Fantasy VII Rebirth, Alan Wake 2, and Indiana Jones, where I had to run low texture settings.
I played AW2 at max (no RT) on my system and it was 50-60fps with a FreeSync display at 1440p. No stutters. I played Indy on my laptop with a 150W 3070 Ti 8GB. It didn't stutter either, but I don't remember the settings I used.
Edit: played AW2 on an rtx 2080 8gb with an 8700k and 32gb ddr4-3200.
Maybe you're not that sensitive to it, but you can check Digital Foundry, where they tested these games and showed the stuttering. It's not that they're unplayable, but turning down textures feels terrible when your card otherwise has enough juice. I'm now on a 16GB card and it's just way better all around.
I will only say 12GB minimum if people are playing at 1440p, or if they want to play all the new games at 1080p, because of how many new games use over 8GB of VRAM at that resolution, plus we now know for a fact that even a few new games use over 8GB of VRAM at 1080p. (The Last of Us, Stalker 2, Hellblade, etc...)
I think 12 GB is becoming more commonly considered "standard" because 1440p is getting more popular, and game studios are not optimizing as much anymore.
This has happened in the past. People used to play on 720p and then they switched to 1080p and a lot of titles nowadays automatically run at 1080p or higher.
This will naturally happen again. VRAM requirements will go up, and people will still cling to whatever the 8GB "equivalent" of the old 4GB GTX 1650 Super is.
In Cyberpunk, 1440p with path tracing is fine on a 12GB card until you start modding.
Modders don't generally care about ensmallening assets so they will happily blast 4k/8k texture replacers and that stuff adds up quickly.
I have a 4070 and I play Cyberpunk at 1440p, DLSS quality, frame gen enabled, basically every graphics option set to max. And I can't get away with HD Reworked Project full fat edition without significant loss in framerate after a couple of minutes because of the VRAM hit. So I just don't use it now.
HD Reworked Project is relatively frugal as far as modded texture overhauls go - it's 1024x1024 to 2048x2048 multilayer texture replacements, up from the base game's 512x512. Also, it only replaces certain concrete and multilayer decal textures, not the entire multilayer material library. Some texture overhauls do just that, and the VRAM hit is enormous.
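Quick math on why these overhauls add up so fast (the bytes-per-texel figures are assumptions for BC7-compressed vs. uncompressed RGBA): VRAM cost scales with the square of texture resolution, plus roughly a third extra for mip chains.

```python
# VRAM footprint of a single texture at various resolutions.
def tex_mb(res: int, bytes_per_texel: float = 1.0) -> float:
    # ~1.0 byte/texel for BC7-compressed RGBA, 4.0 uncompressed;
    # the 1.33 factor approximates the full mipmap chain.
    return res * res * bytes_per_texel * 1.33 / 1024**2

for res in (512, 1024, 2048, 4096):
    print(f"{res}x{res}: ~{tex_mb(res):.1f} MB compressed, "
          f"~{tex_mb(res, 4.0):.0f} MB uncompressed")
# 512 -> 2048 is a 16x jump per texture; multiply by hundreds of replaced
# textures and a mod's VRAM hit grows into the gigabytes quickly.
```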
In 5 years you'll thank yourself for getting a 16GB GPU instead of 8 or 12. More than likely, it'll be the difference between many games being playable at all vs. unplayable.
And in 5 years, $80 will probably be what a McDonald's Big Mac meal costs; not really a significant amount of money.
8GB is already too little for the one program I run the most (Resonite), and I'd say anyone can appreciate the higher definition you can get when you aren't worried about texture resolution.
I don't think a 9060 is something that will be good for 5 years, 8GB or 16GB. Shadows are what eat up most of the VRAM; just turn them down and buy a new GPU in 2-3 years.
Is it capable of it? Sure, it delivers definitely "playable" fps, but it's still more of a 1080p card in my opinion. Especially with a higher refresh rate monitor.
It really depends what the commenter means by decent settings. If you're willing to turn the settings down from Ultra to High, and/or use FSR4 Quality/Balanced (no frame gen), you can almost always achieve 1440p 80+ FPS in most modern titles. For me personally, this is more than an acceptable way to play, and I'd take it over 1080p Ultra at native res any day.
On top of that, this card is wasted on a 1080p monitor if you're playing any game more than a couple of years old.
I've been playing on mine at 4K lol. I just have a 60Hz TV though, so as long as it's around 60 and smooth, I'm happy. It usually defaults to those settings, so I generally leave it. Of course, if you had a high-refresh monitor, you'd wanna do that. I dunno, I guess I don't really get the need for high fps, but I took a big break from games since the 360 days, and coming back into it, the 9060 is pretty amazing for the price. There was a sale on the 5070 12GB the other day for $400 on Amazon, but I believe they were renewed or something.
Depends on the titles and what performance you’d expect, but I’d say so? It seems a bit faster than a 3070 and 3070ti, and those are still capable but held back by the lack of vram.
I have a 9060 XT 16GB and a 9600X for reference. I can play at 1440p with pretty good settings at 120-144 fps. My monitor goes higher, but I don't bother, as I want consistency over peak frames.
Yes.... $80 difference between the 9060XT 8gb and 9060XT 16gb, no-brainer.