Python GPU (CUDA…)

Hello everyone,
So this is my first post in this forum, and I was wondering: is it possible to run Blender operators such as object splitting, baking, or other time-consuming tasks on the GPU from within a Python script?

Btw, amazing work on the 2.8 version, keep it up :star_struck::star_struck:

Thank you.

The only Blender built-in operators that can run on the GPU are Cycles/Eevee rendering and baking. This works the same as in the user interface: for Eevee it’s always on the GPU, for Cycles with the appropriate user preferences and scene settings.
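For example, switching Cycles to the GPU from a script looks roughly like this (a sketch, assuming the 2.8x API, a CUDA device, and a scene that is already set up for baking):

```python
import bpy

# Enable CUDA devices in the user preferences (assumes the CUDA backend).
prefs = bpy.context.preferences.addons['cycles'].preferences
prefs.compute_device_type = 'CUDA'
prefs.get_devices()                      # refresh the device list
for device in prefs.devices:
    device.use = True

# Tell the scene to render/bake on the GPU.
bpy.context.scene.cycles.device = 'GPU'
bpy.ops.object.bake(type='COMBINED')     # assumes an active mesh with UVs and a bake image
```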

Python scripts can create completely new operators to run on the GPU (using libraries like PyCUDA for example), though generally this is quite complicated and only helpful for specific types of tasks that require a lot of computation and relatively little interaction with other Blender data structures.
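As a rough illustration of what that involves, here is a minimal PyCUDA sketch (assuming PyCUDA and the CUDA toolkit are available in Blender’s Python environment; the kernel and the array are placeholders, not a Blender API):

```python
import numpy as np
import pycuda.autoinit                   # sets up a CUDA context
import pycuda.driver as cuda
from pycuda.compiler import SourceModule

# Compile a tiny CUDA kernel that scales a flat array of floats.
mod = SourceModule("""
__global__ void scale(float *v, float factor, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        v[i] *= factor;
}
""")
scale = mod.get_function("scale")

# Placeholder data, e.g. flattened vertex coordinates copied out of Blender.
coords = np.random.rand(30000).astype(np.float32)
scale(cuda.InOut(coords), np.float32(2.0), np.int32(coords.size),
      block=(256, 1, 1), grid=((coords.size + 255) // 256, 1))
```

Note that the data has to be copied between Blender’s data structures and the GPU on every call, which is part of why this only pays off for heavy, self-contained computations.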

Thank you for the fast response. Yeah, PyCUDA is complicated.
Just to clarify what exactly I need the GPU for:

So basically I created a script that merges all the objects in the scene, then uses a BVH and ray tracing to select all the vertices visible from the camera, selects the linked geometry, separates everything back into loose parts, and returns the count of the selected objects and the unselected ones.
This process takes a lot of time, which is understandable, especially when there are many objects in the scene.
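Roughly, the operator side of it looks something like this (a simplified sketch, not the actual script, with the visibility test left out):

```python
import bpy

# Select only the mesh objects and make one of them active before joining.
meshes = [o for o in bpy.context.view_layer.objects if o.type == 'MESH']
for o in bpy.context.view_layer.objects:
    o.select_set(o.type == 'MESH')
bpy.context.view_layer.objects.active = meshes[0]
bpy.ops.object.join()                        # merge everything into one mesh
merged = bpy.context.active_object

# ...BVH / ray-cast visibility test sets merged.data.vertices[i].select here...

bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.select_linked()                 # grow the selection to whole islands
bpy.ops.mesh.separate(type='LOOSE')          # split back into loose parts
bpy.ops.object.mode_set(mode='OBJECT')

# Count which of the resulting objects contain selected (visible) vertices.
all_meshes = [o for o in bpy.context.view_layer.objects if o.type == 'MESH']
visible = [o for o in all_meshes if any(v.select for v in o.data.vertices)]
print(len(visible), "visible,", len(all_meshes) - len(visible), "not visible")
```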

I can show the full piece of code if needed.
Do you think PyCUDA is worth a shot in this case?

I don’t think GPU acceleration is worth the effort for this. It should be possible to get decent performance on the CPU with a BVH tree:
https://docs.blender.org/api/blender_python_api_master/mathutils.bvhtree.html
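Something along these lines (an untested sketch, assuming a single merged mesh object and the 2.8 API; “Merged” is a placeholder name) should already be reasonably fast:

```python
import bpy
from mathutils.bvhtree import BVHTree

depsgraph = bpy.context.evaluated_depsgraph_get()
obj = bpy.data.objects["Merged"]             # placeholder: the merged object
cam = bpy.context.scene.camera

# BVHTree.FromObject works in the object's local space, so bring the camera
# position into that space instead of transforming every vertex to world space.
tree = BVHTree.FromObject(obj, depsgraph)
cam_local = obj.matrix_world.inverted() @ cam.matrix_world.translation

visible = set()
for v in obj.data.vertices:
    direction = v.co - cam_local
    hit, normal, index, dist = tree.ray_cast(
        cam_local, direction.normalized(), direction.length + 1e-4)
    # A vertex counts as visible when the first hit is (roughly) the vertex itself.
    if hit is not None and (hit - v.co).length < 1e-4:
        visible.add(v.index)

print(len(visible), "of", len(obj.data.vertices), "vertices visible from the camera")
```

With a single BVH over the merged mesh, each visibility test is one ray cast rather than one test per object.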

Yes, I believe you are right, but the problem is that a scene can have 100 to 5000 objects, mostly cubes, because the script will work with voxel-style objects. But thank you for the clarifications :smiley:

5000 × 8 cube vertices (40,000 rays) shouldn’t be that bad; I expect it’s possible to do the ray tracing in less than 0.1 s.

Ok, thank you for your time.