[Thread image: nvidia-rtx-5000-r....jpg, 1920x1080]

🧵 How important is VRAM?

Anonymous No. 931104

I can get an RTX 3060 and an RTX 3060 Ti for the same price. The 3060 boasts 12GB of VRAM while the 3060 Ti has 8GB; however, the 3060 Ti is about 30% more powerful. Of course, all that extra power of the Ti means nothing once the card has to fall back to system RAM because it ran out of VRAM. I've seen specific scenarios with Blender and DAZ renders that took the 3060 Ti twice as long to render because of this.

A single high-quality DAZ model is about 1.1GB. Windows takes 1GB because fuck you, and a basic scene is 4-5GB. So the Ti would barely render two models in a scene before slowing to a crawl. Is my assumption correct? I have no idea how Blender and UE5 fare in comparison, but I am expecting similar issues.
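Napkin math version of the above, in case anyone wants to sanity-check me (the numbers are just my estimates from this post, nothing measured):

# Rough VRAM budget sketch using the estimates quoted above (not measured values).
WINDOWS_OVERHEAD_GB = 1.0   # desktop/compositor reservation, my assumption
BASE_SCENE_GB = 4.5         # midpoint of the 4-5GB "basic scene" guess
PER_CHARACTER_GB = 1.1      # one high-quality DAZ character, as quoted

def max_characters(vram_gb: float) -> int:
    """How many characters fit before the renderer spills into system RAM."""
    free = vram_gb - WINDOWS_OVERHEAD_GB - BASE_SCENE_GB
    return max(0, int(free // PER_CHARACTER_GB))

for card, vram in [("RTX 3060 Ti (8GB)", 8.0), ("RTX 3060 (12GB)", 12.0)]:
    print(f"{card}: roughly {max_characters(vram)} characters before spilling to RAM")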

Anonymous No. 931115

>>931104
Non-Quadro cards also pay a tax of 512-1024MB of VRAM when running Iray / Nvidia software (just another ngreedia move to force you to buy a higher tier model).
If you want to do DAZ scenes with multiple characters on screen and do not always want to optimize like a mad dog, then 8GB is really, really cutting it low.
12GB is easily used up and quickly feels limited, but it's leagues above 8GB.
If it wasn't for the fact that Nshitty sits so tight on the industry, then maybe we could actually have normal cards that follow the rest of the PC market when it comes to memory. An 8GB card in late 2022 is fucking stupid (we should be on 16GB as standard by now).
But then again, they know that if you buy a great GPU you can play on it for 5-6 years, since the core will not really falter that much. So they just cut the VRAM to the minimum so that in 2 years max your high-end GPU struggles due to low memory. They have done this for over a decade now.

So to answer your question: if you just want to model in Blender and such, then the 3060 Ti is fine.
If you want to use store assets and do stuff in DAZ, then the 3060 Ti is simply a bad buy.

Anonymous No. 931117

>>931104
Oh, and as for UE5, don't worry. Unlike Blender, Maya, DAZ and so on, UE5 is actually well made.

Anonymous No. 931132

>>931115
Thank you for your detailed reply. I want to make animations and import the models + animations into UE5, so: UE5, Blender, Maya, Substance, Photoshop. Also some AI (Stable Diffusion, PS neural filters). Editing footage in Resolve and Reaper. Playing and capturing video game footage on the side. I would love to get the Ti because the benchmarks are very nice for its cost but the VRAM size of the Ti model is bothering me. I would go for AMD but the software side is currently optimized for Nvidia. I don't hate the player, and hating the game is useless; it is what it is.

The regular 3060 was supposed to have only 6GB initially but then AMD luckily fucked Nvidia in the ass a bit and they realized it would not compete, slapped 12GB on it and now we have a card that sits in a weird place. Not enough power to really use the VRAM for video games so it's useless for gamers, but it looks more attractive to me from a 3D work standpoint. I am thinking the 12GB won't be enough once I get to multiple model animations with rays and other dense effects.

Anonymous No. 931137

I recently had the same decision to make and went for the 3060 with 12GB of VRAM, because you can never have enough VRAM when working with 3D shit, and I can easily take a big stinking dump on the 5 extra seconds it needs to render.
Of course, if you're aiming to make animations like you've said, that time might add up, so of course you want the one with the quickest render times. But then again, someone aiming to get really productive and pro doesn't work with a 3060; he'd try to get at least the 4070 that is about to drop.

Anonymous No. 931140

>>931132
>I would love to get the Ti because the benchmarks are very nice for its cost but the VRAM size of the Ti model is bothering me.
Well, it's like this: the Ti model is made for gaming and everything about it is geared towards gaming. Those 8GB of VRAM are also much, much faster than on the slower 12GB version (which matters a lot for gaming and machine learning, though with only 8GB that aspect does not really matter anyway).
>I would go for AMD
Not to be a downer, since I am a team red fanboy, but when it comes to 3D / machine learning and such, CUDA is king. You can't really get around that if you want to use a wide range of software.
>Also some AI (stable diffusion, PS neural filters)
For basic image manipulation with "AI", your 8GB version will be plenty (for now). For video interpolation or more "advanced" stuff like deepfakes, 8GB starts to become an issue.
>UE5, Blender, Maya
8GB is enough for these as long as you do not bloat scenes, and make up for it by having plenty of system RAM (32GB at minimum, 64GB recommended).
>Substance, Photoshop
Both Substance and Photoshop suck at GPU utilization, so you can get away with quite a shitty card here. Again, what matters here is a fast CPU and plenty of system RAM.
>Edit footage in Resolve and Reaper.
Haven't used those, so I can't really say either way. For something like Adobe Premiere, your 8GB will be fine for 720-1080p editing without a bloated timeline (or you can go ham and just live with bad performance; it will still work).
>The regular 3060 was supposed to have only 6GB initially but then AMD luckily fucked Nvidia in the ass a bit and they realized it would not compete, slapped 12GB on it and now we have a card that sits in a weird place
Well, what happened was that AMD put the expected VRAM on their card and Nvidia followed suit. For some reason "gamers" really want to get fucked in the arse by GPU manufacturers and gladly run systems with some 16-core CPU, 128GB of system RAM, and then some silly 3080 Ti with 12GB of VRAM.

Anonymous No. 931142

>>931132
>Not enough power to really use the VRAM for video games so it's useless for gamers
No. Plain wrong. It's useless right now as all games are made with 6-8GB VRAM in mind.
In 4 years your 3060 12GB will still run games with stuff on low-med but textures at high as 12GB will be the new standard. Your 8-10GB higher-end cards will run stuff on medium but texture detail on low/med. It's the classic planned obsolescence from the GPU makers that you see here.
>from a 3D work standpoint
Well, for anything like Iray it sure is. Also, more VRAM = you can be more lazy. However, I was doing 3D work with much less VRAM years ago in Maya, so it's not like you can't make it work with 8GB; you just have to be more mindful.
>I am thinking the 12GB won't be enough once I get to multiple model animations with rays and other dense effects.
Can't say. Depends on what models, what renderer, what engine and so on. If you plan to make stuff in Maya/Blender and move it to UE5 to render, then with very little optimization those 12GB will get you massive scenes. Even 8GB will serve you well here.
If you choose to do lots of particle-based hair in Iray / insert your ray-traced forward renderer here, then sure, 12/8GB will become an issue quickly.

Anonymous No. 931143

>>931137
>because you can never have enough VRAM when working with 3D
this right here
>someone aiming to get really productive and pro doesn't work with a 3060
Yeah, also true. However, I will add that some people come from parts of the world where they can't just go out and buy the new Titan/Quadro/x090 every gen. And here you still see those people making great stuff; it just takes more effort, more knowledge and more skill. Look at CGI 15 years ago, then look at your 3060 12GB, and know that that card would have been the KING of kings back then.
Working with limited resources also forces you to learn a lot more about how stuff is actually rendered and stored in VRAM.

Anonymous No. 931154

>>931140
>>UE5, Blender, Maya
>8GB is enough for these
Can you explain why these would require less VRAM? That's what I use as well, but I have still been planning to go for a 24GB 3090. All of the game artists who primarily use Unreal also prefer as much VRAM as possible.

I've been thinking about the 4080 as well, but I'm worried 16GB may not be enough. Although it also has fewer CUDA cores, which may be a bigger problem?

Anonymous No. 931159

>>931142
>No. Plain wrong. Its useless right now as all games are made with 6-8GB VRAM in mind.
"right now" with My 3060 (Regular) I can play all games on high at 1080p. 1440p and 4k is unneccessary and bring nothing to the table. Ultra Max settings also bring nothing to the table.

>in 4 years your 3060 12GB will still run games with stuff on low-med but textures at high as 12GB will be the new standard
wrong. Games are made for the consoles first and foremost.

>Also more VRAM = you can be more lazy
>equating more RAM with being lazy when the opposite is true since you have to do more work to author more textures

>>931154
Here's a heads-up that you may not want to hear right now: rendering in Unreal is not a good experience. If you want to output good renders, use a DCC and compositing software like Nuke.

Anonymous No. 931164

>>931154
>All of game artists who primarily use unreal also prefer as much vram as possible.
Well, if you are working professionally on something, then the cost of the GPU does not really matter while we are still talking consumer grade (which is ALL RTX cards). And the more VRAM, the easier it is to work with (you still need to optimize everything in the end, because you don't ship your game to people who all run 3090s).
>Can you explain why would these require less vram?
Because they use a lot more CPU / system RAM for most things. General modeling, history, a lot of the sims are still CPU-bound, and so on. Your GPU is only really utilized in modeling programs when rendering and when running GPU-powered sims (which is more and more the case). Also, both Blender (Eevee) and UE5 use a deferred "real-time" renderer that does not require the entire scene to fit in your VRAM (it can work in chunks, but of course you lose heavily on performance and get stutters).
In Maya you have a host of different renderers, but most are paid options. However, due to the age of Maya, it's a lot more CPU-bound than GPU-bound. 8GB might not be optimal, but doing the work on a character there and then moving the model to UE5 or such to render is not a problem.
>I've been thinking about 4080 as well, but worried 16GB may not be enough. Although it also less cuda cores, which may be a bigger problem?
I would personally ALWAYS take a 3090 over a 4080. No doubt. The 4080 is faster, but the fact that you lose out on a whole 8GB of VRAM makes it no contest for me. The only reason to go for the 4000 series is to get the 4090.

Anonymous No. 931165

>>931159
""right now" with My 3060 (Regular) I can play all games on high at 1080p."
sure? Never said otherwise. I was talking about a couple of years from now.
"wrong. Games are made for the consoles first and foremost"
nope. Used to be the case but there has been a clear push and change the last generation. You see a lot of games being made for current gen PC hardware and then getting downgraded for consoles.
It is still the case for certain series and such (COD for instance).
>equating more RAM with being lazy when the opposite is true since you have to do more work to author more textures
spoken like someone who has never had to work within a tight hardware budget where you have to push everything to get the best possible performance. The whole argument for pushing more resources lately in hardware is = Devs can be less skilled and a lot more lazy.

Anonymous No. 931170

>>931165
Excuse me, but I'm talking about VRAM for rendering scenes on the GPU.

If you have more VRAM, you can put more high-res textures in. Your response?

Anonymous No. 931185

>>931170
>You have more vram you can put more high rez textures in. Your response?
Sure. But it's not like you can skip maps when you have less VRAM; you just have to fit it / compress it more. Also, when you have excess VRAM you start to do lazy stuff like splitting all the textures into individual maps and no longer using channels to drive roughness, spec, metal, opacity and so on. That is what I mean by lazy. I know this from myself: the more hardware I have to work with, the lazier I become, because it's not an issue anymore.
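To make the packing point concrete, here's a minimal sketch of the idea with Pillow (the file names are made up; in practice you'd do this in Substance or at export time):

# Channel-packing sketch (hypothetical file names): pack three greyscale maps
# into the R/G/B channels of one texture instead of shipping three separate
# files, which is roughly a 3x saving on those maps in memory.
from PIL import Image

def pack_orm(ao_path, roughness_path, metallic_path, out_path, size=(2048, 2048)):
    ao = Image.open(ao_path).convert("L").resize(size)
    rough = Image.open(roughness_path).convert("L").resize(size)
    metal = Image.open(metallic_path).convert("L").resize(size)
    packed = Image.merge("RGB", (ao, rough, metal))  # R = AO, G = roughness, B = metallic
    packed.save(out_path)

pack_orm("chair_ao.png", "chair_roughness.png", "chair_metallic.png", "chair_orm.png")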


Anonymous No. 931215

>>931185
>Also when you have excess VRAM you start to do lazy stuff like splitting all the textures into individual maps and stop using channels to drive roughness, spec, metal, opacity and so on. That is what I mean but lazy.

Will you please stop saging your replies - you're not better than the people you're replying to, and all you're doing is making it so I can't see your reply - and stop intentionally using triggering words like "lazy". Everyone here knows how to pack channels in shaders and everyone is the opposite of "lazy", spending hundreds of hours doing the work.

Again, we are talking about offline rendering on the GPU.

When you process textures for real renderers like PRMan, you have to do it on a PER TEXTURE basis; for example, you have TXMAKE, which processes the maps so PRMan can read them. It automatically adjusts to however many channels are in the texture and thus saves memory. There is no being "lazy". If you have a color map and a data map, you can't choose the same color space for both, as it's a per-image setting in TXMAKE. You can render on XPU.

>By default, the number and order of channels in the texture file depends directly on the set of channels in the image file. This holds true for multi-channel OpenEXR files as well. An RGBA picture will generate a four-channel texture. An RGB picture will generate a three-channel texture. An R picture will generate a one-channel texture.
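If you want to sanity-check that behaviour outside of RenderMan, here's a rough Python stand-in (not TXMAKE itself, and the file name is made up) that just reports a map's channel count and the raw memory those channels represent:

# Rough stand-in, not TXMAKE: report a texture's channel count and its raw,
# uncompressed in-memory size, which is what the per-channel behaviour above saves on.
from PIL import Image

def raw_footprint_mb(path, bytes_per_channel=1):
    img = Image.open(path)
    channels = len(img.getbands())   # e.g. 1 for L, 3 for RGB, 4 for RGBA
    width, height = img.size
    return channels, (width * height * channels * bytes_per_channel) / (1024 ** 2)

channels, mb = raw_footprint_mb("roughness_map.png")
print(f"{channels} channel(s), ~{mb:.1f} MB uncompressed")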

Anonymous No. 931221

>>931142
>in 4 years your 3060 12GB will still run games with stuff on low-med but textures at high as 12GB will be the new standard.
I disagree with this because you don't need 4K, and the 3060 is too weak to render 4K in real time anyway. The Ti typically has 20+ extra fps in AAA video games at the same settings (25-30% more power), so from a longevity perspective strictly focused on 1080p video games (which the 3060 Ti was made for), the Ti will last you longer.
>If you chose to do lots of particle based hairs in IRAY/insert your ray traced forward renderer here then sure 12/8GB will become an issue quick.
It sure is nice to have RT and not render it 10x slower because you don't have enough VRAM.

Anonymous No. 931222

>>931142
>then sure 12/8GB will become an issue quick.

8gb is a hell of a lot less than 12gb. Just render as many objects as you can handle and composite.

Anonymous No. 931296

>>931104
It would depend on the scene and the amount of detail you want to render, but basically, if you are going to render big VFX scenes, VRAM will be the bigger player. As for a regular run-of-the-mill render of 2 characters on a simple background, similar to what a future junior animator would do? 8GB is more than OK.

Also, the encoder is usually the biggest thing to consider; VRAM would be an issue for you if your main focus is doing VFX. Now, if you don't already own a 3090 with its 24GB of VRAM, then just go for the 4080, simple as. The dual encoder is no fucking joke; besides, from tests I have seen in other software, it can do 4K HDR encoding like nothing, vs the 3090 which struggles and ends up failing. And as for 3D rendering, you do almost everything twice as fast on the 4000 series vs the 3090.

Another big factor for scenes is system RAM; usually that is what gets assigned first by your software. GPU power comes in more for simulation and render stuff, while system RAM handles more of the UV textures and polygon count, but that can vary depending on how the program works, so check your program's FAQ if you don't know.

Now, if you own a 3090 like I do, I would simply wait until RTX 5000 comes out.

And yes, if you use 3D software, CUDA is basically a must. Radeon cards can work, but they tend to throw errors more often, especially when rendering. Unless you are a Blender user - there you get a bit of a breather - but stuff like Autodesk Maya, which I use, doesn't like them much, especially Arnold.


Anonymous No. 932491

I'm in the exact same boat right now. Stuck with a 5700 XT. AMD cards are the only ones with high VRAM at a sane price, but those cards are only good for gaymen. And if you want an Nvidia card with high VRAM, you gotta spend thousands on a house-burner 4000 series card.

[Image: 3090ti-or-4080.png, 1024x401]

Anonymous No. 932498

So there's only a $39 difference between these two options I've been considering. Which one is the better buy? Is the extra 8GB of VRAM on the 3090 Ti worth having to get a second GPU for AV1 encoding, compared to the 4080?

Anonymous No. 932501

>>931296
>as for regular run of the mill render of 2 characters on a simple background similar to what a future junior animator would do? 8GB is more than ok.
You're being deceptive. A simple playblast would be OK for animators. Animators animate

Anonymous No. 932572

Is a 1650 good for Maya?

Anonymous No. 932594

>>932498
The AV1 codec makes streaming HDR 4K wide-color-gamut video more feasible. The rub is, HDR 4K wide-gamut video that you create in a DCC has to be graded on a $10k (bare minimum) to $30k USD, 5000-10000 nit display, by someone who has experience in grading and knows what they are doing.

Anonymous No. 932600

>>932498
>$39 difference between these two options
price of
either card seems realistic, even used

Anonymous No. 932601

>>932600
phoneposter detected

Anonymous No. 932602

>>932601
What did he mean by this?

Anonymous No. 932604

>>932602
>price of
>either card seems realistic, even used

>phoneposting spacing
>sentence doesn't even make sense

Anonymous No. 932610

>>932604
>has never seen a typo before
I will rewrite the sentence properly so that you won't have to call upon your reading comprehension skills:
The price of either card listed in your picture seems unrealistic, even if they were used.
Ignoring the issue of RAM, your dichotomy is premised on the difference in price being around $40.
Now, even assuming the prices are as you say, a difference of $40 is marginal to the point of irrelevance. The extra 3090 VRAM gives you more capability by allowing larger or more complex scenes. The objectively faster 4080 will only give you faster render times.

Happy now?

Anonymous No. 932615

>>932610
you changed "realistic" to "unrealistic", have phoneposter spacing and now are claiming that other parties in question have reading comprehension issues

Anonymous No. 932622

>>932615
Yes, if you couldn't parse that out, you have reading comprehension issues.
And phoneposter spacing isn't a thing, you newfaggot.

Anonymous No. 932630

>>932622
You changed one word to its polar opposite while phonespacing. Stop phoneposting and start posting something of value. Don't get upset.

Anonymous No. 932975

>>931104
I couldn't render a Genesis 9 character with 8K textures on my 3070's 8GB.
That's how fucked I am.
I think 12GB isn't worth it either.

Anonymous No. 932986

>>932975
You need a more efficient out-of-core renderer.

Anonymous No. 933058

>>931104
I'm going to upgrade to either one or two 4090s whenever my 1080 kicks the bucket. It took like 2 hours to render a 150-frame animation.

Anonymous No. 933069

>>933058
Post your shitty animation

Anonymous No. 933334

>>932975
Consider that your system easily eats 1-2GB of VRAM if you are on Windows. 12GB is then a huge difference compared to 8GB. Of course, you still have to be mindful.
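If you want to see what your own desktop is already eating, query the driver before opening anything heavy (these are the standard nvidia-smi query flags, though output formatting can differ between driver versions):

# Baseline check of VRAM already in use by the OS/desktop before loading a scene.
import subprocess

out = subprocess.run(
    ["nvidia-smi", "--query-gpu=memory.used,memory.total", "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
for line in out.stdout.strip().splitlines():
    used, total = (field.strip() for field in line.split(","))
    print(f"VRAM in use before opening anything: {used} of {total}")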

Anonymous No. 933347

>>932975
You can if you use texture compression.
The quality will suffer but not to the extent that you will notice with a 3070 unless your renders take a week.

Anonymous No. 933349

>>931104
Wow that's crazy
Anyway, you catch the game?

Anonymous No. 933370

>>932975
That's because you should use mipmapping.
When I'm in V-Ray creating large-scale environments and I need to render it all out, I use on-demand mipmapping.
I have only 8GB of VRAM, that's it.
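To put rough numbers on why that helps, here's what each mip level of an 8K map costs if you assume plain uncompressed RGBA8 (real renderers use tiled/compressed formats, so treat this as ballpark only):

# Per-mip memory cost for an 8K RGBA8 texture. A renderer doing on-demand
# mipmapping can stream in only the level it needs for distant objects instead
# of keeping the full-resolution map resident.
BYTES_PER_PIXEL = 4  # uncompressed RGBA8

size, level, total_mb = 8192, 0, 0.0
while size >= 1:
    mb = (size * size * BYTES_PER_PIXEL) / (1024 ** 2)
    total_mb += mb
    print(f"mip {level}: {size}x{size} = {mb:8.2f} MB")
    size //= 2
    level += 1
print(f"full chain: {total_mb:.1f} MB (~33% more than mip 0 alone)")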

Anonymous No. 933561

Is the 4080 with 16 gigs really that bad if I want to primarily use it for Unreal, and maybe some Blender rendering on the side if I ever feel like doing something in an offline renderer?

Literally all 3090s have disappeared from shops in my country; I don't know what's happening. Are the stocks almost gone or what? It was getting really hard to find them even when there were some available, and only questionable brands were remaining.

So I'm just thinking I may get a 4080 instead. Rendering should be faster overall, and the memory may be enough for me, but I can't tell. I just wonder how bad the lower amount of CUDA cores is compared to the 3090 and how that affects working with Unreal.

Anonymous No. 933760

>>932501
>Animators only animate
If an animator doesn't know how to put out a simple render, sorry, but he's actually a fucking retard and will have a harder time landing a job compared to the one who does know, whether the evaluators are animators themselves or not.

Anonymous No. 935805

Bumping this thread to ask: what if I bought a 3060 Ti now and then another 3060 Ti later (or even another GPU)? Do Blender/Maya etc. work well with dual GPUs? Does it combine both cards' VRAM, or will one of the GPUs bottleneck?

Anonymous No. 935813

>>935805
You won't get far with 8GB of VRAM.

Anonymous No. 935826

>>935805
No. You need professional cards for memory pooling or it will work like shit. So wait for a better time to buy if you can.

Anonymous No. 935831

>>935805
Mostly not. The scene (textures, meshes, materials, lights) and the associated data structures (BVHs, etc.) have to go into each card's memory for that card to use them.
Some memory could potentially be saved when each card only renders a section of the image, saving frame-buffer space. But that's hardly anything compared to the other stuff.
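For reference, this is roughly how you enable several cards for Cycles from Blender's Python console; exact property names can shift between Blender versions, so treat it as a sketch. Note that every enabled card still gets its own full copy of the scene, which is the point above:

# Sketch: enable every CUDA device for Cycles. Each enabled card still holds its
# own full copy of the scene data, so VRAM is NOT pooled across cards.
import bpy

prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.compute_device_type = "CUDA"
prefs.get_devices()  # refresh the device list

for device in prefs.devices:
    device.use = (device.type == "CUDA")
    print(device.name, "enabled" if device.use else "skipped")

bpy.context.scene.cycles.device = "GPU"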

Anonymous No. 935965

>>932498
If you only care about rendering go with the 3090. If you also play games get the 4080.

Anonymous No. 935966

>>932975
Unless you are doing really close-up shots, 8K textures are overkill.
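Rough reasoning, as a sketch with a deliberately simple coverage model (it ignores UDIMs, tiling and UV layout): the useful texture resolution is bounded by how many screen pixels the asset actually covers.

# Texel-density sketch: useful texture resolution is bounded by the screen
# pixels the asset covers, not by the source map's resolution.
SCREEN_WIDTH_PX = 1920  # assuming a 1080p frame

def useful_texture_resolution(screen_coverage: float) -> int:
    """screen_coverage: fraction of the frame width the asset spans (0..1)."""
    return int(SCREEN_WIDTH_PX * screen_coverage)

for coverage in (1.0, 0.5, 0.25):
    px = useful_texture_resolution(coverage)
    print(f"asset spanning {coverage:.0%} of a 1080p frame: ~{px}px of texture is actually visible")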

Anonymous No. 935968

>>935965
A new 3090 is extremely expensive - $1700 to $1900 without tax. A 4070 with the same power but half the RAM is $800. The 3090 should also be $800.

Anonymous No. 935970

I can get a mobile workstation from my work with a 12GB A3000, or I can buy my own Legion laptop with a 16GB 3080 Ti. Otherwise similar stats, except the workstation has a higher-res screen.
I want to use it for Blender and DAZ.

What should I do? Is there even going to be a noticeable difference for rendering?

Anonymous No. 935971

>>935970
post your work

Anonymous No. 935973

>>935971
Why tho?

Anonymous No. 935975

>>935973
you know why

Anonymous No. 936238

>>932594
>HDR 4k Wide color gamut video that you create in a DCC has to be graded on a 10k (bare minimum) to at least 30k USD at least 5000-10000 nit display by someone with experience in grading and knows what they are doing.
In a year or two you click one button and AI will grade it for you better than a human.

Anonymous No. 936245

>>936238
Just two more weeks

Anonymous No. 936253

It ain't only about the program, it's also about what you do with it. As an example, Unity works just fine with 4-6GB of VRAM until you try to calculate lightmaps, especially on large/complex scenes. I would go for the 3060 12GB. What I personally did was go for the 3090; it was about $700 used, since mining stopped being profitable.

Anonymous No. 936261

>>936253
>he's building a vidya in his basement

Anonymous No. 936370

Shit tier 3090 or god tier 4080? Which one to go for?

>UE5
>ray tracing
>some offline rendering, probably just Cycles, for fun

Do I need the extra VRAM and CUDA cores of the 3090, or is it better to go for the extra power of the 4080? I've been waiting for a while and only shitty 3090 brands are ever available, like Zotac, Gigabyte and Gainward, and not even the best models of those brands. But now I see the 4080 Asus TUF is available, of course for an extra 200 euros just because muh popular brand.

There's always this small worry: what if I end up needing more than 16 gigs for a project and end up not being able to open it at all? But atm I'm fine even with the 6 gigs of my 1660 Ti; it's just slow as hell.

Anonymous No. 936389

>>936370
4080.
If you are already wasteful enough with resources that you run out of 16 gigs of VRAM, you will run out of 24 gigs as well.
Unreal has auto LOD, Nanite and scalability settings that basically avoid the scenario of not being able to load a project at all.
To run out in Blender you have to do something pretty ridiculous, like using adaptive subdivision across your entire scene.
All you have to do is tweak the settings a little and you are fine again.
The only software I know of that gets you into that situation is DAZ, but even there you can just use a plugin to downscale textures where you don't need them (as if having six 4K textures on a chair in the background would make any difference).
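Something along the lines of that downscaling idea, as a bare-bones sketch with Pillow (the folder path and the 2K cap are made up; the actual DAZ plugins do this per-surface with more smarts):

# Walk a texture folder (hypothetical path) and cap everything at 2K.
from pathlib import Path
from PIL import Image

MAX_SIZE = 2048
texture_dir = Path("scene_textures")  # hypothetical folder

for path in texture_dir.glob("*.png"):
    img = Image.open(path)
    if max(img.size) > MAX_SIZE:
        img.thumbnail((MAX_SIZE, MAX_SIZE))  # resizes in place, keeps aspect ratio
        img.save(path.with_name(path.stem + "_2k.png"))
        print(f"downscaled {path.name} to {img.size}")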

Look at it this way: 16 gigs is a 166% improvement over what you have. Going from 16 to 24 is a 50% improvement.

Anonymous No. 936391

>>936389
Cool, thanks for your thoughts

Anonymous No. 936446

>>932615
it was obvious what he meant you fucking mongoloid

Anonymous No. 936453

>>936389
The thing is, even if bideo games are starting to take up 12 gigs of VRAM in some areas, and they are supposed to be optimized, what are the chances that 16 gigs will be enough for production?

Anonymous No. 936455

>>936453
>production this, production that

post your work

Anonymous No. 936850

>3090 - 1500€
>4080 - 1700€
>4090 - 2100€

I know we're getting robbed here either way, but what would you pick from these?

Anonymous No. 936852

>>936850
concentrate on where the real money is - mobile

Anonymous No. 936859

>>936852
Irrelevant for me t b h

Anonymous No. 936934

Would a 1TB SSD be enough for the OS + main software and the projects I'm currently working on? Or should I go for 2TB? It's 3 times more expensive, though, so I'm not sure.

Anonymous No. 936936

>>936934
I run out of space on even a 10tb HDD. I have 33tb, used to have 43 but one drive died

Anonymous No. 936953

>>936934
I bought a budget prebuilt and I use 500GB on my SSD (came with the PC) and a 2TB HDD for storage. It's not optimal, but I think with a 1tb SSD it would be a great setup.

Anonymous No. 936964

>>936953
>1 tb ssd
>great setup
>tiny amount of space
>not raid

you are not going to go far

Anonymous No. 936965

>>936964
sorry you overpaid ig

Anonymous No. 936966

>>936965
i have a workstation. You have a shitbox

Anonymous No. 936975

>>936936
>>936953
>>936964
Yeah I meant just for OS and software/projects, I would have another larger separate drive for storage.

Anonymous No. 936976

>>936975
sims are huge and you will be simulating everything

Anonymous No. 936984

DDR4 or DDR5? Is paying extra worth it? Idk who's crazy here, but people say it's still expensive, while it looks like I can get DDR5 for 100€ more, and that doesn't sound like a lot to me. But the question is whether I'll notice the difference. Does anyone here use DDR5? Also, consider that when I buy a new PC I don't generally upgrade until I literally can't do stuff with it anymore, so maybe it's worth it for futureproofing?

Anonymous No. 936985

>>936984
128gb ddr5

Anonymous No. 936986

>>936985
What does this mean, no explanation?

Anonymous No. 936992

>>936986
get 128 gb ddr5

Anonymous No. 937642

>>936992
lol, good luck running 128GB DDR5.