Python API request: Python API for finding nearest element in screen space

Highlighting the elements under the mouse cursor is very important for providing an excellent UX with interactive tools (like Poly Build).

I tried various approaches for this, but in Python they were very expensive and couldn’t reach practical performance on high-poly meshes. In the future, many add-on developers will try to create interactive WorkSpaceTool-based tools like I have. Would you consider adding features for this to the Python API, so that neither I nor they have to reinvent the wheel?

For example, what I want is the following API.

find_nearest(bm, region, rv3D, coord, radius, targettype=('FACE', 'VERT', 'EDGE'))
Return: a list of tuples (geom (BMVert, BMEdge, or BMFace), position in screen space, distance), sorted by distance

Different add-on developers will want different features: an area of the screen, a rectangle, a lasso, the elements along a line or curve, and so on. If you are considering this, I think it would be best to call out to many developers and gather their opinions.
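As an illustration of one of those query shapes, the lasso case reduces to a 2D point-in-polygon test in screen space. A minimal even-odd-rule sketch in plain Python (the function name and signature are my own, not a proposed Blender API):

```python
def point_in_lasso(pt, lasso):
    """Even-odd rule point-in-polygon test in 2D screen space.

    pt is an (x, y) tuple; lasso is a list of (x, y) polygon vertices.
    """
    x, y = pt
    inside = False
    j = len(lasso) - 1
    for i, (xi, yi) in enumerate(lasso):
        xj, yj = lasso[j]
        # Count edge crossings of a horizontal ray going right from pt.
        if (yi > y) != (yj > y) and x < (xj - xi) * (y - yi) / (yj - yi) + xi:
            inside = not inside
        j = i
    return inside
```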


Try this:

  • raycast the scene to get the index of the polygon the ray intersects, plus the hit location
  • query that polygon’s elements and transform them to screen space
  • compute the distances and select only the nearest element in screen space, following your own logic
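The steps above can be sketched in plain Python. Here `project` is a hypothetical stand-in for `view3d_utils.location_3d_to_region_2d`, taking a combined view-projection matrix as nested lists; inside Blender you would use the real utility instead:

```python
def project(mat, co, width, height):
    """Project a 3D point to 2D region coordinates via a 4x4 view-projection
    matrix (a stand-in for view3d_utils.location_3d_to_region_2d)."""
    x, y, z = co
    v = [sum(mat[r][c] * p for c, p in enumerate((x, y, z, 1.0))) for r in range(4)]
    w = v[3] or 1.0  # perspective divide; guard against w == 0
    return ((v[0] / w * 0.5 + 0.5) * width, (v[1] / w * 0.5 + 0.5) * height)

def nearest_vert_2d(mat, verts, cursor, width, height):
    """Steps 2 and 3: transform the hit polygon's verts to screen space and
    return the index of the one nearest the cursor."""
    def d2(p):
        return (p[0] - cursor[0]) ** 2 + (p[1] - cursor[1]) ** 2
    return min(range(len(verts)), key=lambda i: d2(project(mat, verts[i], width, height)))
```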

It will be about as fast as computing all the distances for a single polygon, because the raycast takes care of picking the right one.

I tried this, and the function works well.
However, the CPU load is high: it takes about 1000 ms for 100,000 vertices on an Intel Core i7-7700. This is far from a pleasant user experience.


Yes, a feature like this would be very welcome! I have an addon that does something similar, though I’m using it for selection rather than preselection… kinda like soft-selection, where the user wants nearby elements that are not necessarily connected. It gets significantly laggy over a certain number of verts to search through.

I do think exposing Blender’s internal “find nearest element to cursor” option to the Python API could open up many new possibilities for add-ons and tools, but I don’t know about this particular use case. Preselection highlighting was purposely left out as a feature (see Anti-Features).

This doesn’t really have anything to do with preselection. Sure, it can be used to create a feature like that, and it may even be what @sakana3 wants to use it for, but in my case it’s not for preselection at all. Certain types of addons need to be able to check local proximity, and excluding it because somebody decided that preselection is an ‘anti-feature’ is wrong-headed.

I was addressing this statement from the first post:

I wasn’t arguing that finding the nearest element would not be useful. My point was why this type of selection wasn’t used in Blender and why opening this thread with that as the leading statement may not be a great idea.

On another note, I just remembered there was an experimental Python module that (I think) was able to partially implement this option in Blender with better performance than raycast. More info can be found here. As an FYI, that post is a little over a year old, so the test will likely only work with Blender 2.78 and/or 2.79.

Some initial experimenting.


Keeping a bmesh and kdtree alive in a modal operator. Using the scene raycast, since the object raycast doesn’t work in edit mode. The kdtree holds the vertex coordinates (v.co) and face centers.

The good news:

  • Bmesh and kdtree are pretty fast to initialize.
  • Vertex and face pre-selection works with the kdtree alone.
  • Pre-selection itself is instant on 1M+ tri meshes.
  • Ngons are supported.
  • The bmesh doesn’t need to be re-acquired on depsgraph changes.

The bad news:

  • The kdtree needs to be rebuilt when the geometry changes. Haven’t done any timeits yet.
  • Not sure how to deal with edges properly.

And I don’t even like pre-selection :stuck_out_tongue:

Are you sure you’re not talking about the mathutils.BVHTree class?

With mathutils.KDTree you need to submit every single point with individual .insert() calls, so it’s not as efficient as the BVHTree, which lets you pass in the BMesh object and build a tree from its geometry with fast C loops.
BVHTree supports nearest-point queries as well as raycasts.

Thanks for feedback!

This proposal has nothing to do with preselection. I believe this is not an ‘anti-feature’ as Blender’s standard feature “poly build” already implements highlighting.

As for faces, mathutils.BVHTree gave the best performance. But I do not have very good solutions for verts and edges.

No, I mean kdtree. How do you find verts with a bvhtree? :stuck_out_tongue:

Inserting vco and balancing takes me 27ms on a 200k tris mesh.

Having such an API would be awesome :slight_smile:
From what I remember (not 100% sure it’s correct), Blender uses some sort of GPU query for selection.

wait- so you’re building your kdtree in 27ms for a 200k tri mesh, and getting instant searches afterward? i hardly see that as a downside as I’m currently spending around 400ms per search on a 100k tri mesh sorting a list manually lol

i guess i’ll be looking into kdtree next- any pointers on where to start there? I mean, google obviously- but if you had a particular resource that was useful to you I’d be appreciative of that as well!

All the docs I ever needed for kdtree can be found here.

I find kdtree insertions to be fast enough using bmesh. You can get some 15% increased insertion speed by caching the insert function to eliminate attribute lookups and passing the args in directly.

import bpy
import bmesh
import mathutils

bm = bmesh.from_edit_mesh(bpy.context.edit_object.data)
kd = mathutils.kdtree.KDTree(len(bm.verts))
insert = kd.insert  # cache the bound method to skip attribute lookups

for v in bm.verts:
    insert(v.co, v.index)
kd.balance()


Kdtrees seem to be fine for getting vertices and face centers if you have an actual coordinate to supply as input, but once you start doing proximity searches outside the bounds of the geometry, coupling them with a bvhtree will be helpful.


I’ve done some more experimenting with getting the closest vertex when the cursor is outside geometry. This was done using a barrage of raycasts in a radial pattern that expands per iteration.
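The expanding radial pattern could look something like the sketch below (ring count, samples per ring, and step size are my assumptions, not the uploaded code); each offset would be added to the cursor position before being converted into a ray:

```python
import math

def radial_offsets(rings=3, samples=8, step=5.0):
    """Generate 2D screen-space offsets in rings that expand per iteration."""
    offsets = []
    for ring in range(1, rings + 1):
        radius = ring * step
        for k in range(samples):
            angle = 2.0 * math.pi * k / samples
            offsets.append((radius * math.cos(angle), radius * math.sin(angle)))
    return offsets
```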

I decided to do some optimization to the view_3d_utils to eliminate some overhead during the origin/direction conversion for the bvh raycasts.

The CPU usage tops out at 4% while scanning for hits outside geometry.

I also added a hit buffer to create a hysteresis so that the selection doesn’t jump erratically.
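A hysteresis hit buffer can be sketched as a small majority filter: the reported element only changes once a new candidate wins most of the recent samples (my own sketch of the idea, not the uploaded implementation):

```python
from collections import Counter, deque

class HitBuffer:
    """Keep the last few hits and only switch the reported element when a
    single candidate wins a strict majority of them."""

    def __init__(self, size=5):
        self.samples = deque(maxlen=size)
        self.current = None

    def push(self, hit):
        self.samples.append(hit)
        winner, count = Counter(self.samples).most_common(1)[0]
        if count > len(self.samples) // 2:
            self.current = winner
        return self.current
```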

I’ve uploaded the example here.


Some great efforts here. I just really think preselection is something that should come with Blender at the C++ level. It’s really unfortunate to see the reasoning in the anti-features list; it doesn’t make sense. Sure, Blender may be better than other apps when it comes to mesh element selection, but better than the others still doesn’t mean flawless.


Ideally this should be done as native C functions, since unboxing objects and passing them around in Python adds conversion overhead. Still, it’s possible for this to be fast in Python with the current implementation. A single raycast costs 0.05 ms, ten times less if no hit is found, and even less if the cast distance is clamped.

The main issue I see at the moment is getting edges efficiently. If I can solve this, then I’m pretty much set.

Pre-selection aside, having a high performance version that would return a specific element (vertex, edge, face) would open up possibilities for more elaborate tools that in some way require visual feedback that doesn’t involve actual selections.

Visualizing a target vertex for a target weld tool would be an example, and would show the snapping threshold. That was the idea of the last example.


There is some preselection stuff already present on the C side, used for the Poly Build tool. Not sure how much work it would take to expose it to Python, or how general-purpose it is…

Obviously not the most efficient way, but I would imagine the performance overhead would be minimal given the number of edges a face is likely to have: couldn’t you just get the edges from a BMFace and manually check the distance of each one? So basically, use the kdtree to narrow it down to a small search, then manually search beyond that.

The main problem with edges is that their hitbox, so to speak, is expected to be the edge itself, so a kdtree, which deals with fixed points, would only provide hits at fixed intervals.
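For reference, treating the edge itself as the hitbox amounts to a point-to-segment distance test, which a kdtree of fixed points can’t express. A 2D sketch using squared distances (to skip the sqrt):

```python
def point_segment_dist2(p, a, b):
    """Squared distance from 2D point p to the segment from a to b."""
    (ax, ay), (bx, by), (px, py) = a, b, p
    dx, dy = bx - ax, by - ay
    denom = dx * dx + dy * dy
    # Project p onto the segment, clamping to the endpoints.
    t = 0.0 if denom == 0.0 else max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / denom))
    cx, cy = ax + t * dx, ay + t * dy
    return (px - cx) ** 2 + (py - cy) ** 2
```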

But your suggestion sparked an idea.

Inserting face centers with BMFace.calc_center_median() and using kdtree.find_n(loc, n) where n is the number of closest points, we can use a value of 2 to yield two closest faces, then compare their edges by a simple set() intersection to find the common edge which should also be the closest to the cursor.
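That two-nearest-faces idea can be sketched in plain Python, with faces as vertex-index loops and edges as frozensets so winding direction doesn’t matter (a standalone sketch, not bmesh code):

```python
def face_edges(face):
    """Edges of a face (a list of vertex indices in loop order), as a set
    of frozensets of index pairs."""
    return {frozenset((face[i], face[(i + 1) % len(face)])) for i in range(len(face))}

def shared_edge(face_a, face_b):
    """The common edge of two faces, or None if they don't share one."""
    common = face_edges(face_a) & face_edges(face_b)
    return common.pop() if common else None
```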

One obvious issue is that this fails on thin meshes. find_n() doesn’t care if the geometry is contiguous, so it will yield a face on the opposite side of the mesh if it’s the closest. I’ll try with a bvhtree later.

well- with my suggestion, you’d store face centers in a kdtree, then get the closest face and iterate over BMFace.edges to find the closest one just by doing vector math. It wouldn’t be blazing fast like a kdtree search, but you’d only be dealing with 4 edges in most situations so I doubt it would have any noticeable performance impact

edit: and to clarify on the vector math bit- for each edge you’d be finding each edge center (v1 + v2) / 2, then checking the distance from the raycast position and comparing it to the other edges in that face.

if you do it this way, you only need a single face so there’s no concern about non-contiguous/backfacing geometry causing problems
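That per-face search is cheap enough to sketch in a few lines; `face_verts` is the face’s vertex coordinates in loop order and `hit` is the raycast hit position (names are mine):

```python
def closest_edge(face_verts, hit):
    """Return the (v1, v2) edge of a face whose midpoint is nearest to hit.

    face_verts is a list of (x, y, z) tuples in loop order; hit is (x, y, z).
    """
    def mid(a, b):
        return tuple((a[i] + b[i]) / 2.0 for i in range(3))

    def dist2(a, b):
        return sum((a[i] - b[i]) ** 2 for i in range(3))

    edges = [(face_verts[i], face_verts[(i + 1) % len(face_verts)])
             for i in range(len(face_verts))]
    return min(edges, key=lambda e: dist2(mid(*e), hit))
```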