Yeah - 8GB cards shouldn't even have been a thing. Games are getting too beefy.
Should've been 12GB and 16GB models - Nvidia is very cheap with the VRAM. Those are all $300 cards in my opinion.
Not true, it depends on resolution and, most importantly, settings.
https://www.tomshardware.com/pc-components/gpus/nvidia-geforce-rtx-5060-ti-16gb-review
My buddy Jarred included both the 8 and 16GB versions of the 4060Ti for comparison.
1080p "ultra melt my PC": both cards deliver roughly the same playable averages and lows.
1440p "ultra melt my PC": now we start to see the 16GB card pull ahead slightly, but both cards are into the "WTF are you doing" range of performance, with sub-60fps averages and ugly lows. This is the situation where it would behoove the player to drop the settings from "ultra" to "high" or "very high" and pull the FPS up a bit.
2160p "ultra melt my PC": now things get really ugly. The 16GB card has a substantial lead, but both cards leave you at very unplayable sub-30fps.
And for reference, at 1080p "medium" both cards hit 100+ fps with lows of 70+, so a smooth experience.
I asked Jarred to include "high" instead of medium next time if possible, because people playing on a lower mainstream 8GB card are the ones most likely to be playing at 1080p/1440p and choosing "high" or "medium". It would be nice to see the dividing line, since there is a massive difference in texture size between medium, high, and ultra/legendary/fusion-reactor settings. Jarred has mentioned before that he didn't see differences at 1080p/1440p high in the 8GB vs 16GB comparison.
Seriously, people need to take YouTubers with a grain of salt; they review everything at "max settings" to compare relative performance between $2000+ USD cards and $400 USD cards. Expecting a $400-500 USD card to function on the same settings as a $2000+ card is absurd.