Good on you, sir! I'm reluctant about anything 12 GB or more. I had an RTX 3060 with 12 GB in my previous machine (don't know when I'll sell it though, but that SHIT was paired with an i5!!!). Don't know why the 50xx series is going 16 to 32 GB of VRAM, that's bonkers! I've speculated whether the optimization of DLSS and its FSR counterpart is going to keep the required amount roughly constant. Because it's just not necessary, considering nobody is going to buy 8K TVs unless EVERYONE buys META headsets. The only rational explanation is our perception of texture resolution: if we can't see it on screen, higher resolutions aren't going to be needed.
And we still haven't really seen anything that needs more VRAM.
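A quick back-of-envelope sketch (the buffer count and pixel format below are my own assumptions, not figures from any vendor) shows why display resolution alone doesn't explain the VRAM jump: even triple-buffered 8K render targets only come to a few hundred MiB, nowhere near the 16 GB extra on those cards.

```python
def framebuffer_mib(width, height, bytes_per_pixel=4, buffers=3):
    """Approximate memory for a set of render targets, in MiB.

    Assumes an 8-bit RGBA swapchain with triple buffering --
    a rough guess, not a real engine's configuration.
    """
    return width * height * bytes_per_pixel * buffers / (1024 ** 2)

uhd_4k = framebuffer_mib(3840, 2160)  # triple-buffered 4K
uhd_8k = framebuffer_mib(7680, 4320)  # triple-buffered 8K

print(f"4K: {uhd_4k:.0f} MiB, 8K: {uhd_8k:.0f} MiB")
# -> 4K: 95 MiB, 8K: 380 MiB
```

Textures and geometry are where the real memory goes, but the point stands: the framebuffer itself only quadruples going from 4K to 8K, and it starts small.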
For the last two years, I've seen nothing but progress on these upscaling technologies. The PS4 Pro has been doing it since Horizon Forbidden West. Aside from the AI rendering of clouds, there isn't much on the backend except an AI rendering engine for atmospherics.
So cloud rendering, which is an operation over particles, is mainly a backend compute problem. Which can be solved if they include a dataset of cloud imagery that's AI-learned, like a model. I mean, they aren't simulating real-time shit like tornadoes. It's all just visual. That means all they need is something like 4 GB of extra VRAM to store it and add/remove it in real time.
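To sanity-check that 4 GB figure, here's a hedged sketch of what a baked "cloud library" might cost in VRAM. The volume size, voxel format, and number of variants are my own guesses, not numbers from any real engine:

```python
def volume_mib(side, bytes_per_voxel=1):
    """Memory for one side^3 density volume, in MiB.

    Assumes a single 8-bit density channel per voxel -- a guess;
    real engines may pack more channels or compress the data.
    """
    return side ** 3 * bytes_per_voxel / (1024 ** 2)

one_cloud = volume_mib(512)   # one 512^3 precomputed cloud volume
library = 32 * one_cloud      # a library of 32 variants to swap in/out

print(f"one volume: {one_cloud:.0f} MiB, library: {library / 1024:.0f} GiB")
# -> one volume: 128 MiB, library: 4 GiB
```

So a few dozen precomputed volumes at modest resolution lands right around that 4 GB ballpark, which is why streaming baked visuals instead of simulating them is cheap on memory.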
We haven't seen a need for more capacity to render real-time shit. Yet.
For my part, I've been looking into what makes Cyberpunk 2077 so interesting. It's been path tracing all along, but path tracing is merely a way to see the environment through a very specific magnifying glass.
If you follow Digital Foundry, those guys have all speculated the same thing. The business is bonkers, really.