"I’m trying to access or mimic the orange boundary lines Blender generates in Object Mode to indicate overlapping mesh interactions (see the attached picture). Ideally, I’d like to extract the precise mesh data these lines represent, or at least the logic behind their calculation, for use in scripting custom mesh-cutting operations. My goals:
Direct Data Access: If the orange lines are based on true mesh elements, how can I locate these mesh components within Blender’s data structures?
Algorithmic Recreation: Alternatively, can someone shed light on potential algorithms within Blender responsible for calculating these intersection boundaries with high accuracy during interactive object manipulation?
Even Partial Clues: Any information about relevant Blender development areas or code sections dealing with similar visualizations would be incredibly helpful.
Background: I’ve investigated several areas of Blender’s Python API (collision detection, viewport rendering, etc.). While helpful, nothing seems to provide direct access to the fine-grained overlap visuals in question. Could Blender be doing this as an internal step of its rendering process that isn’t directly accessible through the public APIs?
It’s an image-space effect: Blender writes unique object IDs into an image buffer and detects where adjacent pixels have different IDs. The lines don’t exist as mesh elements.
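To illustrate the idea described above (not Blender’s actual implementation), here is a minimal NumPy sketch: given a per-pixel buffer of object IDs, mark every pixel that differs from a horizontal or vertical neighbour. This is the core of ID-buffer edge detection.

```python
import numpy as np

def id_buffer_edges(id_buffer):
    """Mark pixels where the object ID differs from a neighbour.

    A minimal sketch of image-space silhouette detection:
    id_buffer is a 2D integer array, one object ID per pixel
    (0 = background). Returns a boolean mask of boundary pixels.
    """
    ids = np.asarray(id_buffer)
    edges = np.zeros(ids.shape, dtype=bool)
    # Horizontal pairs: mark both pixels where left and right IDs differ.
    diff_h = ids[:, :-1] != ids[:, 1:]
    edges[:, :-1] |= diff_h
    edges[:, 1:] |= diff_h
    # Vertical pairs: mark both pixels where upper and lower IDs differ.
    diff_v = ids[:-1, :] != ids[1:, :]
    edges[:-1, :] |= diff_v
    edges[1:, :] |= diff_v
    return edges

# Two overlapping "objects" rendered as IDs 1 and 2 on background 0.
buf = np.array([
    [0, 0, 0, 0, 0],
    [0, 1, 1, 2, 0],
    [0, 1, 1, 2, 0],
    [0, 0, 0, 0, 0],
])
print(id_buffer_edges(buf).astype(int))
```

Note that because the effect works on pixels, the result depends on resolution and view angle, which is why there is no mesh-space data to extract from it.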
Thank you for the reply. I have some follow-up ideas and questions:
Shader Logic:
Are there any areas of Blender’s API where custom shaders applied during viewport interaction might be accessible or modifiable? Experimenting by altering how these are drawn could uncover clues related to silhouette rendering.
Depth Buffer Investigation:
If Blender calculates silhouette information for boundary visualization, could depth buffer values captured after object updates and overlaps reveal geometric proximities from which cut planes might be inferred? Are there API mechanisms to retrieve the depth buffer for the user’s view of the 3D scene?
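As a sketch of the depth-buffer idea (the framebuffer reads themselves would need Blender’s `gpu`/`bgl` APIs from within a draw handler, which is outside the scope of this standalone example), one hypothetical approach is to render each object separately and flag pixels where the two depth values nearly coincide — those near-contact pixels could seed a cut region:

```python
import numpy as np

def contact_mask(depth_a, depth_b, eps=1e-3):
    """Flag pixels where two depth renders nearly coincide.

    depth_a, depth_b: 2D float arrays of normalized depth, with 1.0
    meaning "nothing rendered there" (the far plane). Where both
    surfaces are present and |depth_a - depth_b| is small, the two
    surfaces touch or intersect at that pixel -- a hypothetical way
    to locate contact regions from depth data alone.
    """
    a, b = np.asarray(depth_a), np.asarray(depth_b)
    covered = (a < 1.0) & (b < 1.0)  # both surfaces rendered here
    return covered & (np.abs(a - b) < eps)

# Object A slopes through object B's flat surface at depth 0.5.
a = np.array([[0.4, 0.5, 0.6],
              [0.4, 0.5, 0.6],
              [1.0, 1.0, 1.0]])   # bottom row: A not rendered
b = np.full((3, 3), 0.5)
print(contact_mask(a, b, eps=0.01))
```

The `eps` threshold and the two-pass rendering are assumptions of this sketch; depth precision in a real viewport is non-linear, so a fixed threshold would need tuning per scene.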
Vertex Flagging:
I understand this visualization doesn’t modify mesh data. However, might Blender internally store some temporary annotations for vertices relevant to generating these boundary lines? This would happen purely during display logic, making it temporary and not persisting after the interaction ends. Any leads here would help guide targeted Python scripts I could run to see whether vertex data offers hints, even just on the hypothetical algorithm side at first.
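Since the overlay is computed in image space, there are no such vertex flags to find; but if the goal is boundary-like lines derived from real mesh data, the classic mesh-space alternative is the silhouette-edge test: an edge is a silhouette edge when one of its adjacent faces points toward the viewer and the other points away. A hypothetical standalone sketch (not Blender code; in practice you would feed it vertex and face data from `bmesh`):

```python
import numpy as np

def silhouette_edges(verts, faces, view_dir):
    """Find edges shared by a front-facing and a back-facing triangle.

    verts: (N, 3) array of vertex positions.
    faces: list of (i, j, k) vertex-index triples, CCW winding.
    view_dir: direction toward the viewer.
    Returns the silhouette edges as (min_index, max_index) pairs.
    """
    view_dir = np.asarray(view_dir, dtype=float)
    facing = {}  # edge (min, max) -> facing flag of each adjacent face
    for i, j, k in faces:
        # Face normal from the triangle's winding order.
        n = np.cross(verts[j] - verts[i], verts[k] - verts[i])
        front = np.dot(n, view_dir) > 0.0
        for a, b in ((i, j), (j, k), (k, i)):
            facing.setdefault((min(a, b), max(a, b)), []).append(front)
    # Keep edges with exactly two adjacent faces that disagree.
    return [e for e, f in facing.items() if len(f) == 2 and f[0] != f[1]]

# A fold of two triangles sharing edge (1, 2): one faces the +Z
# viewer, the other faces away, so the shared edge is a silhouette.
verts = np.array([[0.0, 0.0, 0.0],
                  [1.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0],
                  [1.0, 1.0, -1.0]])
faces = [(0, 1, 2), (1, 2, 3)]
print(silhouette_edges(verts, faces, view_dir=(0.0, 0.0, 1.0)))
```

This produces view-dependent outlines per object but, unlike the image-space pass, it cannot by itself mark where two different objects overlap; for that you would still need an intersection test between the meshes.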