Can I use the GPU module to parallelize some computations?

I have an intuition that it might be possible to run computations on the GPU by writing a fragment shader and creating some buffers (maybe some textures).

I want to compute a smooth vector field on a mesh, but averaging large numbers of vectors through bmesh is slow in Python. I thought I could build a few textures, pack all the data into them, and run the computations in parallel using GLSL.
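For concreteness, the slow part is usually a per-vertex Python loop like the sketch below. The data layout here is hypothetical (plain lists instead of real bmesh data, where the adjacency would come from something like `v.link_edges`) so the sketch is self-contained:

```python
# Naive pure-Python smoothing step: replace each vertex's vector with the
# average of itself and its neighbors. With tens of thousands of vertices,
# these nested Python loops dominate the runtime.

def smooth_once(vectors, neighbors):
    # vectors:   list of (x, y, z) tuples, one per vertex
    # neighbors: neighbors[i] is a list of vertex indices adjacent to vertex i
    out = []
    for i, v in enumerate(vectors):
        acc = list(v)
        for j in neighbors[i]:
            for c in range(3):
                acc[c] += vectors[j][c]
        n = len(neighbors[i]) + 1  # self plus neighbors
        out.append(tuple(a / n for a in acc))
    return out
```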

Would that be possible?

I have no knowledge of the GPU module whatsoever, but maybe your task can be sped up by using numpy? Numpy is already included in Blender; the reference manual is here:

I tried using numpy, but I didn't get a great speed improvement since my code still relies too heavily on Python calls.
I am not very good at writing algorithms with numpy.

Yes, this is possible, but probably rather difficult. Numpy is your best bet, though it does take some effort to learn how to make the most of it (mostly this involves using numpy functions instead of Python loops).

Edit: Watch this PyCon talk about getting the most out of numpy before resorting to the GPU. The problem with bmesh is that you’re iterating with Python loops, which is far, far slower than numpy’s vectorized operations.
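To illustrate what "vectorized operations instead of loops" means here, this is one way to write a neighbor-averaging step with no per-vertex Python loop. It's a sketch under assumed inputs: an (N, 3) float array of vectors and an (E, 2) int array of edge index pairs (with bmesh you would fill such arrays via `foreach_get` rather than iterating):

```python
import numpy as np

def smooth_once(vectors, edges):
    """One smoothing step: average each vertex's vector with its edge neighbors.

    vectors: (N, 3) float array, one vector per vertex.
    edges:   (E, 2) int array of vertex index pairs.
    """
    acc = vectors.copy()
    count = np.ones(len(vectors))
    # Scatter-add each edge endpoint's vector onto the other endpoint.
    # np.add.at handles repeated indices correctly, unlike acc[idx] += ...
    np.add.at(acc, edges[:, 0], vectors[edges[:, 1]])
    np.add.at(acc, edges[:, 1], vectors[edges[:, 0]])
    np.add.at(count, edges[:, 0], 1)
    np.add.at(count, edges[:, 1], 1)
    return acc / count[:, None]
```

All the work happens inside numpy's compiled loops, so it scales to large meshes far better than iterating vertices in Python.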

How well do you understand tensors? This is definitely doable in theory; the difficulty will depend mostly on your familiarity with the subject matter.

I ended up doing it with the numpy module. I gained some speed, but for such a discrete and repetitive task as generating a cross field, I suspect the GPU would do better.

It’s also possible to get Blender to use a Python distribution other than the bundled one.

So something you might try is the Anaconda Python distribution, which bundles natively accelerated (and possibly GPU-accelerated) versions of numpy etc. That might get you a performance boost for little or no effort, and the conda package manager can also provide just about any Python package you want.

You could then play with things like PyTorch or TensorFlow.

If you have an NVIDIA graphics card, you can use Numba for GPU processing:

My intention is to create an addon that is easy to install and cross-platform, so I can't afford to modify Blender builds or add extra dependencies.

In that case your best bet is a GLSL compute shader. This thread on BA might give you a starting point: