EEVEE with my RTX 3090

I recently purchased an RTX 3090 and am wondering if I would be able to use a second RTX 3090 if I’m only using EEVEE. I KNOW EEVEE doesn’t use multiple GPUs, but I wasn’t sure: if I used NVLink, would the computer see them as one GPU?

Also, if I do add a second GPU, would the only benefit be the pooled memory? Or would there be other benefits in EEVEE?

NVLink does not make a single GPU out of all the GPUs it connects. Eevee is currently limited by what OpenGL can do. In the future, when we have a Vulkan backend, multiple GPUs could be supported, but that won’t work out of the box.
There is also no clear design yet for how that should work in Blender, or for what the actual benefits would be for the end user. Scaling across hardware boundaries isn’t cheap and has unpredictable bottlenecks.

Looking at your question, you seem to assume that with NVLink all applications directly benefit from pooled memory. Most of the time an application needs to be modified to make use of it, due to possible bottlenecks. I am not certain that the driver would solve this automagically.
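For what it’s worth, you can check how Blender itself sees the cards. Even with an NVLink bridge installed, the Cycles device list still reports each card separately. A quick check from Blender’s Python console, using the standard bpy preferences API:

```python
import bpy

# Refresh and print the compute devices Cycles can see. Even with an NVLink
# bridge, two 3090s should still show up as two separate entries rather than
# one merged 48 GB device.
prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.get_devices()
for dev in prefs.devices:
    print(dev.type, dev.name, "enabled:", dev.use)
```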

Thank you so much for the reply!
I was just wondering if purchasing a second RTX 3090 would benefit me in any way.
Thank you for clearing that up for me though.
Do you think Cycles would benefit from NVLink if I switched? If so, do you know what kind of gains I would see? Is it just the VRAM amount, or would the computer literally render twice as fast?

And as far as bottlenecking is concerned, I have a Threadripper 2950X, and from my understanding it was made to handle multiple GPUs. But I may be misunderstanding that.

With or without NVLink, Cycles will scale to multiple GPUs easily. As long as the scene fits on a single GPU, NVLink doesn’t give you a benefit. When working with huge scenes that don’t fit on a GPU, NVLink could help, but I haven’t seen this in action or seen any comparisons of what the actual benefit is.
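If you do try Cycles, enabling both cards is just a matter of ticking both devices in the preferences, or doing the same from a script. A minimal sketch using the standard bpy API (OPTIX is available on RTX cards; CUDA would also work):

```python
import bpy

prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.compute_device_type = "OPTIX"  # "CUDA" also works on these cards
prefs.get_devices()

# Enable every non-CPU device; Cycles distributes the work across all of them.
for dev in prefs.devices:
    dev.use = (dev.type != "CPU")

# Tell the current scene to render on the GPU(s).
bpy.context.scene.cycles.device = "GPU"
```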

Thanks again for the response!
My main concern was that the scene I am working in is exceeding my 24 gigs of VRAM.
I bought the RTX 3090 hoping it could power through EEVEE, and it has up until recently. But it seems that if I want to push my computer’s performance any further, I may need to switch to Cycles. It seems like the benefits of a second GPU don’t really play to EEVEE’s strengths.

This test was made using V-Ray; I assume Cycles is similar (or not).
(copy/paste of one of my replies on the Blender Artists forum)

That is really interesting. I wouldn’t have expected those results.

So I wonder what the best option would be moving forward.
I really wanted to stick to EEVEE, but with me running out of VRAM, and EEVEE not being able to use 2 GPUs, I kinda feel like I need to switch to Cycles…

I wonder if it is all around better to use NVLink or not.

If I was in your place, I would ask myself this:

  • How often do I work on huge scale scenes that might not fit on 24GB of VRAM?

    • If the answer is “rarely”, then I would stick to one 3090 and when I need more, I would look for a render farm for those specific rare cases.
    • If it’s more frequent, then I would consider a second 3090 + NVLink.

Also, there is always “out of core” rendering support in Cycles, as far as I remember, which lets you spill over into system RAM when VRAM is not enough (as long as you don’t mind a little, or a lot, of rendering slowdown).

Oh okay, that makes sense. I guess that’s just another limitation of EEVEE… I thought it would be able to do “out of core” rendering… But that doesn’t seem to be the case. Blender closes every time I approach the 24 gigs (while using EEVEE)…

Yeah, I would say (at least I hope) that going over the 24 gigs is not that frequent.

Thank you for the recommendations. I am interested in how much faster I would be able to render just by adding a second GPU but without the NVLink.

Again, not Cycles, but this time multi-GPU scaling on Octane and Redshift:

Seeing those results, I “expect” Cycles to behave the same and offer similar linear scaling.

Really cool! I was hoping it would be something like that.
Thanks again for your help in this.

Looks like I may be switching to Cycles.

Just a note that it isn’t quite right to say that Eevee doesn’t support out-of-core rendering. Eevee supports out-of-core rendering as implemented by its device driver. Normally this means that all data for a single draw call should fit on the GPU; for Eevee, the limiting factor is that all the geometry (VBOs, IBOs), textures, and output buffers of a single material must fit. The GPU driver is what optimizes loading and offloading to the GPU. How the device driver implements this is a black box and could differ per GPU or driver version.
This is different with Cycles, where it is wise to have all the geometry plus textures for a single tile fit on the GPU.
That said, if Blender closes, this might indicate a different issue. You should find out why it closes before assuming something is the cause. You could apply simpler materials to the scene or reduce geometry and see what happens, try out different drivers, etc.
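As a rough way to run the “simpler materials” test from a script, something like the sketch below would work (standard bpy calls; run it on a copy of the file, since it throws away material assignments):

```python
import bpy

# Replace every mesh's materials with one flat, node-free material. If Eevee
# stops crashing afterwards, complex shaders (rather than raw VRAM use) become
# a likely suspect.
test_mat = bpy.data.materials.new(name="DiagnosticFlat")
test_mat.use_nodes = False
test_mat.diffuse_color = (0.8, 0.8, 0.8, 1.0)

for obj in bpy.data.objects:
    if obj.type == 'MESH':
        obj.data.materials.clear()
        obj.data.materials.append(test_mat)
```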

That is a good recommendation. I truly know almost nothing about computers.
I just assumed it was an issue with the VRAM. I was monitoring all of my resources, and noticed that when my VRAM was getting to 100%, the program would close…

Does Blender generate a crash report?

  • If you have a scene that can fit on one GPU
  • and if you want to use EEVEE to scale with multiple GPUs
  • and if you are using Windows
  • and if you don’t mind doing some extra work

Then you can try this technique:

Don’t forget to read the comments, there are extra tips and tricks mentioned by other users.
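In case the idea isn’t clear: the technique boils down to running several Blender instances in parallel, each assigned to its own GPU (per-application GPU selection is done in the NVIDIA control panel on Windows), with each instance rendering a different slice of the animation. A rough sketch of the frame-splitting half, assuming a hypothetical scene.blend with 250 frames:

```python
import subprocess

# One entry per GPU: (blend file, start frame, end frame). The hypothetical
# scene.blend has 250 frames split across two instances; which GPU each
# instance uses is set outside Blender (e.g. in the NVIDIA control panel).
jobs = [
    ("scene.blend", 1, 125),
    ("scene.blend", 126, 250),
]

# -b: run in the background, -s/-e: frame range, -a: render the animation.
procs = [
    subprocess.Popen(["blender", "-b", blend, "-s", str(s), "-e", str(e), "-a"])
    for blend, s, e in jobs
]
for p in procs:
    p.wait()
```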

About a year ago the NVIDIA driver for Linux incorporated OpenGL render offload for the PRIME feature: when you have an Intel iGPU, you can launch an application so that it runs its OpenGL on the NVIDIA card. I want to believe that if they incorporated this feature for PRIME with an Intel iGPU, then somehow it should also be possible when you only have NVIDIA cards. I don’t have multiple NVIDIA cards so I can’t test it. You should ask about this on the NVIDIA Linux forum if you are a Linux user.
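For reference, PRIME render offload is driven by two environment variables documented by NVIDIA. Launching Blender with them set looks roughly like this (shown via Python for consistency, though a plain shell export works just as well):

```python
import os
import subprocess

# NVIDIA's documented PRIME render offload variables: they route this
# process's OpenGL to the NVIDIA GPU instead of the default provider.
env = dict(os.environ)
env["__NV_PRIME_RENDER_OFFLOAD"] = "1"
env["__GLX_VENDOR_LIBRARY_NAME"] = "nvidia"

subprocess.run(["blender"], env=env)
```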

Can I just say that this render looks so great! Oh, I see it’s from the Boxx team.

I saw this! I think it is a great idea.
My problem right now, though, is the scene fitting on one GPU. I’m running out of the 24 gigs of VRAM, which is the main reason I was hoping the computer could see two GPUs through NVLink as one… But apparently that isn’t the case.

I’ll definitely do this while rendering though.