Is Cycles able to run an internal and an external GPU simultaneously?

Hi everyone,
I’ve been planning to upgrade my hardware for a while. The idea was to buy a notebook, because I need something I can bring with me wherever I go. Besides that, I would love the option to add an eGPU in the future, so I can take advantage of all the power I could have at “home” while still having a nice machine on the go.
So my question is: what if I buy a notebook with a pretty decent internal GPU and then pair it with an external sibling of the same card? For instance, a notebook with these specs:

RTX 2070 Max-Q 8G, i7-8750H, 16GB DDR4, 512GB NVMe SSD

paired with an

RTX 2070 eGPU

Will Cycles be able to use both graphics cards simultaneously?

I read a bunch of articles that discourage the use of an eGPU due to the degradation and slowdown of the signal through the Thunderbolt 3 port, but they all considered the “gaming side” of it.

Should I expect the same problem with Cycles?

Can someone help me? Does anyone have experience with this kind of setup?

Thank you all,

Guido

I don’t know about the slowdown of the signal, but you don’t need much bandwidth anyway, since all the data for 3D rendering is stored in VRAM and processed there. What travels over the link in this case is just the data for the frame buffer.
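To put rough numbers on that point, here is a quick back-of-the-envelope sketch of how long one finished frame takes to cross the link. All figures are nominal assumptions (link rates ignore protocol overhead, and 16 bytes/pixel assumes a 32-bit float RGBA buffer), not measurements:

```python
# Rough, illustrative estimate: how long it takes to send one rendered
# frame back over the eGPU link. All figures are nominal assumptions.

def transfer_time_ms(width, height, bytes_per_pixel, link_gbit_per_s):
    """Milliseconds to move one frame buffer over a link of the given rate."""
    frame_bytes = width * height * bytes_per_pixel
    link_bytes_per_s = link_gbit_per_s * 1e9 / 8
    return frame_bytes / link_bytes_per_s * 1000

# 1920x1080 frame at 16 bytes/pixel (assumed 32-bit float RGBA)
frame = (1920, 1080, 16)

# Thunderbolt 3 tunnels roughly a PCIe 3.0 x4 link (~32 Gbit/s payload)
tb3 = transfer_time_ms(*frame, 32)
# Internal PCIe 3.0 x16 (~128 Gbit/s)
x16 = transfer_time_ms(*frame, 128)

print(f"Thunderbolt 3: {tb3:.1f} ms per frame")  # ~8.3 ms
print(f"PCIe 3.0 x16:  {x16:.1f} ms per frame")  # ~2.1 ms
```

Even over the slower link, a frame buffer of this size crosses in single-digit milliseconds, which is negligible next to the seconds-to-minutes a Cycles frame takes to render.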

I would expect it to work; for rendering, slower access to the external GPU is not as much of an issue as it is for gaming. Mainly it could affect viewport interactivity, and rendering scenes that don’t fit in GPU memory and need to access CPU memory.

We don’t have a similar setup to test this ourselves though, so we can’t make many guarantees about it.

If the computer sees the drivers, it simply sees two or more GPUs; I don’t think it distinguishes whether they are external or internal …
And even if they are external, as long as the GPUs are properly powered, the PCI Express bus provides the full bandwidth, and the GPU will hardly be able to saturate it.

Can you please explain further what you mean in the second part of your statement, about the PCIe bus and saturation?

@TonyBalonie welcome!
I reworded the text into a more understandable form by adding “Express” …
that’s why you logged in, right? :wink:
If it’s still unclear, sorry, English is not my first language.

That edit did help. I didn’t catch that you meant bandwidth when you said “whole band”. The other part I was slightly confused about is when you say the GPU will hardly be able to saturate it. Do you mean the GPU would have a hard time using the full transfer capacity of the PCIe bus?

I understand now … it’s because in my language it is called “band length”.

I was referring to the fact that the GPU is not capable of saturating the entire bandwidth of the PCIe bus.

When you say saturate, I assume you mean “use to its fullest ability”. I personally haven’t heard the term saturation used when referring to bandwidth or hardware, so that’s why I’m a little confused about what you mean.

Assuming you do mean “use full capability” by saturation, wouldn’t the bandwidth bottleneck then be not the PCIe bus, but the Thunderbolt link connecting the PCIe enclosure to the laptop? Since the laptop does not expose an external PCIe bus, it has to use a Thunderbolt cable connected to a dock with a PCIe slot.

Sorry if this is slightly off topic, just trying to learn more :slight_smile:
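On that bottleneck question, a rough comparison of nominal link rates may help. The figures below are assumptions (Thunderbolt 3’s PCIe payload is roughly a 3.0 x4 link; real throughput is lower due to protocol overhead), and the 4 GB scene is hypothetical:

```python
# Rough sketch with assumed nominal figures: how long a one-time scene
# upload to the GPU takes over different links.

SCENE_GB = 4.0  # hypothetical scene: geometry plus textures

links_gbit_s = {
    "Thunderbolt 3 (PCIe 3.0 x4 payload)": 32,
    "Internal PCIe 3.0 x16": 128,
}

for name, gbit in links_gbit_s.items():
    seconds = SCENE_GB * 8 / gbit  # GB -> Gbit, divided by link rate
    print(f"{name}: ~{seconds:.2f} s to upload {SCENE_GB:g} GB")
```

So the slower link costs on the order of a second extra, but only once per render: after the upload, Cycles works out of VRAM. That is why the eGPU penalty matters less for final renders than for games, which push data across the bus every frame.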

I think we understood each other, more or less …

Anyway, I’m not an Apple hardware expert (although I’m pretty sure things work the same way).
If I have to connect an external GPU to a laptop PC, I use this:

That connector would require the laptop to have an externally accessible Mini PCIe port, yes?

The one I linked, yes, of course.

Almost all laptops have this port, and it’s not external; some have two, others only one, and it’s often occupied by the Wi-Fi card …
In practice, this adapter has a cable that hooks into the mini PCIe port, which lets you create this “forced” external port.

Obviously, if you replace the Wi-Fi card with this kit … you will need a USB Wi-Fi adapter …
(and the GPU also needs an external power supply)

There are at least two other versions on the market:
an NGFF PCIe version and an ExpressCard PCIe version.

But honestly I’ve never tried them, and I know that some laptops are not compatible regardless of the port.
The manufacturer published a list of the laptops that should work.

Yeah, I haven’t heard of a laptop having one externally available, although I don’t keep up with laptop standards since I mainly use a desktop. Something tells me the OP is not trying to take apart their new laptop haha, which might be why they mention Thunderbolt specifically. Although, if you’re looking for the best performance, it would be worthwhile to try to use the mPCIe port, as it has more than double the transfer speed.

While doing some light reading, I noticed someone say that in newer laptops the mini PCIe slot has been replaced with M.2, and you would need to find an M.2-to-PCIe adapter.

I have already answered you …

My mistake :sweat_smile:, at first glance I thought you were referring to alternatives that still use the mini PCIe slot. I didn’t know that NGFF was an earlier term for M.2; you learn something new every day!

@Ton @brecht @mont29 @sybren @jacqueslucke @GuidoMedici

Blender System Optimizer

Would it be possible to allow the user to assign UI updates and other such tasks to a device?

For example, assigning UI updates to the internal GPU and/or CPU, and viewport rendering to the external GPU, or vice versa?

Such features would allow the user;

  1. To create optimized (load-balancing) profiles based on their system hardware, workflow, workspace, and individual preferences.
  2. The process could be automated for less experienced users.
  3. Such profiles could be assigned to various workspaces, providing further optimizations.

There are many other factors, e.g. workspace settings, that also affect performance.

Workspace Priorities Settings

A unified workspace priority settings list would allow the user to:

  1. Assign various levels of importance to each setting within a workspace from a single window.
  2. Save the user-defined settings in an optimization profile.
  3. Assign or change optimization profiles for a workspace.

Various profiles could be created and used depending on the workflow and workspace, allowing for further optimization and better performance.

Example:
A render time profile could be created to allow for added effects in the final render.

(The implementation of such features would be similar to the overclocking profiles used by various companies, e.g. Nvidia, AMD, etc.)

Imagine

Workspace Setting Profiles

  • Creating various settings profiles and adding them to favorites, then using the pie menu to switch between the settings while prototyping.

Asset Management: Core Assets & Asset Profiles

  • Using similar features to create various settings profiles for nodes, e.g. shaders, particle systems, etc., then using a pie menu or the asset manager to switch (override) just the settings of a node group. This would allow core assets (e.g. the node tree) and asset profiles (e.g. node profile settings/overrides) to be created, where the node tree isn’t replicated and only the node profile settings (overrides) are saved.

In such a system, the ability to hide various inputs would be useful.

General Implementation

Core Asset: creates a Core Asset ID linking the following:

  1. Operator (Node) IDs

Asset Profile: creates an Asset Profile ID linking the following:

  1. Core Asset ID
  2. Operator Settings (User inputs)

@Dev1, please don’t add notifications for everyone, just leave it to people to decide which topics to read.

Cycles settings let you configure which device to use for rendering. Similar control for Eevee would be good, but beyond that there is not a lot that I think we should add.

Adding specific priorities to settings and assets is not practical to implement and would make for a pretty bad user interface as well. There are many more useful things that can be done automatically.

Thank you Viniciuspaciello,
that’s what I thought.

Hi Brecht,
thank you for your time. I decided to go with this kind of setup: I just bought a decent notebook with an RTX 2070, and in the future I think I will try to buy a 2070 eGPU.
I will keep you updated on how well this setup works.
Thank you for the great job you guys on the Blender development team are doing. You are rocking it, especially this last year.
Keep up all the good work!

Thank you, Nokipaike