Texture border/margin generation by copying pixels from the adjacent polygon

Hello everybody,

Inspired by this case I did some experiments on generating a better texture margin.

The current margin is generated by just extending/blurring the pixels on the border. This works reasonably well when the texels are downscaled, but it breaks down when the texels are large on screen.

I generated a new margin by copying pixels from the adjacent polygon.

These pictures are cheating a bit, because they show the optimal case of a perfectly pixel-aligned UV border (except for the sloped border). In the linked case I posted an example of a non-aligned UV map, and while the improvement there is less dramatic, it is still notable.

What I currently do in my (hacked-together) test program is:

  • generate a map of the pixels that need generating, recording which UV edge each pixel borders.
  • for each pixel that needs generating, project the point onto the associated UV edge and sample a pixel at the same distance along the corresponding edge in the adjacent polygon.
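The second step can be sketched like this (a standalone C++ sketch of the idea; the names are mine, not from the test program). The margin pixel is projected onto its UV edge to get a parameter t, and the same t locates the sample point along the matching edge as mapped in the adjacent polygon:

```cpp
#include <cassert>

struct Vec2 { float x, y; };

// Parameter t in [0, 1] of the closest point on segment a-b to point p.
static float project_onto_edge(Vec2 p, Vec2 a, Vec2 b) {
  float dx = b.x - a.x, dy = b.y - a.y;
  float len2 = dx * dx + dy * dy;
  if (len2 == 0.0f) return 0.0f;
  float t = ((p.x - a.x) * dx + (p.y - a.y) * dy) / len2;
  return t < 0.0f ? 0.0f : (t > 1.0f ? 1.0f : t);
}

// Map a margin pixel next to edge (a, b) to the matching point along the
// same mesh edge as it appears in the adjacent polygon's UV space (a2, b2).
static Vec2 sample_point_across_seam(Vec2 p, Vec2 a, Vec2 b, Vec2 a2, Vec2 b2) {
  float t = project_onto_edge(p, a, b);
  return {a2.x + t * (b2.x - a2.x), a2.y + t * (b2.y - a2.y)};
}
```

Sampling the texture at the returned point (stepping slightly into the adjacent polygon) then gives the margin pixel's colour.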

I think the results are promising enough to warrant spending some more time on trying to implement this inside Blender. But I first want to know what people think before I spend time figuring out how to do that.


This is certainly promising, I think it’s worth implementing something like this inside Blender.

From what I understand you are proposing that rather than extrapolating pixels at the edge, you instead bake these pixels on adjacent faces?

I think that perhaps, to give the most continuous result, you could use baked pixels from elsewhere in the image rather than baking additional pixels. That way any distortion or noise from the bake is taken into account, since that also contributes to seams.

I didn’t bake extra pixels. Just resampled them from elsewhere in the image.

This was generated by exporting the UV data with a script, then reading in the original texture and the UV data and using those to generate the margin.

Maybe somebody in the know can tell me if the following functionality is available already somewhere in the code and where to find it:

  • Determine if a (2D) point is within a (2D) polygon, with float coordinates.
    (I strongly suspect this is already available somewhere, but I couldn’t find it.)
  • Determine which UV edge is closest to a certain pixel. In my prototype I build a full distance map (Manhattan distance, Dijkstra) to determine which polygon is the source polygon, but that is rather wasteful.

That sounds good then.

isect_point_tri_v2 is what you would use. You need to do it on a triangulated mesh (looptris) to ensure the triangulation is consistent; you can’t use an arbitrary polygon function.
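For reference, isect_point_tri_v2() lives in BLI_math_geom.h and does a 2D point-in-triangle test. A standalone equivalent using half-plane sign tests looks roughly like this (my sketch, not Blender’s actual implementation):

```cpp
#include <cassert>

// Sign of the cross product (b - a) x (p - a): which side of edge a-b is p on?
static float side(const float p[2], const float a[2], const float b[2]) {
  return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0]);
}

// Point-in-triangle via half-plane tests, analogous to what
// isect_point_tri_v2() in BLI_math_geom.h computes.
static bool point_in_tri_v2(const float p[2], const float v1[2],
                            const float v2[2], const float v3[2]) {
  float d1 = side(p, v1, v2);
  float d2 = side(p, v2, v3);
  float d3 = side(p, v3, v1);
  bool has_neg = (d1 < 0) || (d2 < 0) || (d3 < 0);
  bool has_pos = (d1 > 0) || (d2 > 0) || (d3 > 0);
  return !(has_neg && has_pos); /* all on one side, or exactly on an edge */
}
```

Working per looptri also means the winding can be either direction, which is why the test accepts all-positive or all-negative signs.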

I don’t think we have anything specific for finding the closest edge. BLI_kdopbvh.h could probably be used. Although Dijkstra in pixel space does not seem that bad to me; it’s hard to say which would be more efficient for a high-poly mesh.

Ok, I’ll start by just porting what I did: Dijkstra in pixel space, which needs an extra copy of the texture in memory during the border generation. That solves the point-in-polygon question as well, because I can then just look it up in the map.
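The pixel-space Dijkstra amounts to a multi-source flood fill over the pixel grid: every pixel inside a UV polygon seeds the map with that polygon’s id, and every other pixel ends up owned by the nearest seed by Manhattan distance. A standalone sketch of that idea (names and layout are mine, not the prototype’s):

```cpp
#include <cassert>
#include <climits>
#include <queue>
#include <vector>

struct Cell { int dist, x, y; };
struct Farther {
  bool operator()(const Cell &a, const Cell &b) const { return a.dist > b.dist; }
};

// seeds[y * w + x] holds a polygon id >= 0 for pixels inside a polygon,
// -1 elsewhere. Returns, for every pixel, the id of the nearest seed
// (Manhattan distance, 4-connected grid).
static std::vector<int> build_owner_map(int w, int h, const std::vector<int> &seeds) {
  std::vector<int> owner(w * h, -1);
  std::vector<int> dist(w * h, INT_MAX);
  std::priority_queue<Cell, std::vector<Cell>, Farther> queue;
  for (int y = 0; y < h; y++)
    for (int x = 0; x < w; x++)
      if (seeds[y * w + x] >= 0) {
        owner[y * w + x] = seeds[y * w + x];
        dist[y * w + x] = 0;
        queue.push({0, x, y});
      }
  const int dx[4] = {1, -1, 0, 0}, dy[4] = {0, 0, 1, -1};
  while (!queue.empty()) {
    Cell c = queue.top();
    queue.pop();
    if (c.dist > dist[c.y * w + c.x]) continue; /* stale heap entry */
    for (int i = 0; i < 4; i++) {
      int nx = c.x + dx[i], ny = c.y + dy[i];
      if (nx < 0 || ny < 0 || nx >= w || ny >= h) continue;
      if (c.dist + 1 < dist[ny * w + nx]) {
        dist[ny * w + nx] = c.dist + 1;
        owner[ny * w + nx] = owner[c.y * w + c.x];
        queue.push({c.dist + 1, nx, ny});
      }
    }
  }
  return owner;
}
```

Since all step costs are 1, a plain BFS queue would also work; the heap version just stays correct if the cost metric ever changes.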


I’ve found that probably the most logical thing to do is to pass an MLoopUV * to write_internal_bake_pixels() (this is a static function, so there’s probably no problem adding a parameter there).

Then after that it would be most logical to also add an MLoopUV parameter to RE_bake_margin(). But is this maybe exposed via the API? In which case it’s probably a no-no to add a parameter there?

Also, I’d prefer my code to be C++, and the current filter and baking code is all C. Could I add an extra C++ file somewhere, and what would be the best place to put it? Or should I rewrite my Dijkstra code (which uses std::make_heap, so C++) in C and put everything into the same file as the current extend filter? That feels illogical, because none of the other filter functions use UV info; they all seem purely pixel based.

It’s fine to write it in C++, porting existing files to C++ if needed or moving the code to a new file.

RE_bake_margin is still an API internal to Blender so there is no problem changing it.

Ok, here’s my plan:

  • I add the parameter to RE_bake_margin.
  • I add two files in blenkernel for my C++ code: BKE_texture_margin.h and intern/texture_margin.cc

This way I can leave the rest of bake.c in C, and call into my own C++ code from there.

Is that OK?

I would just put all that code in the render/ module, but that’s details and easy to change.

Ok, first get it working. :smiley:
At least I now have something that compiles and links my own code, so I’m off to start the actual coding…


Some more questions:

I assume that during baking I only have an MDATA mesh (i.e. non edit-mode) available? And that this does not contain adjacency data? I can copy the building of the adjacency table from my prototype, but if that’s already available…

edit: it seems I messed up the terminology, because what I use appears to be the BMesh data (as noted in the DNA_mesh_types.h header). But I still think adjacency data is not readily available during baking?

At the moment I’m done generating the Dijkstra map to determine which border goes with which polygon, so now I need to figure out which edges touch. I’m starting to get a slightly better overview of the baking code, so after a very slow start it’s looking good so far.

The first iteration will only do what my prototype did (i.e. ignore the corners) and probably not support baking multiple objects at once or objects with multiple materials. But first get something working ;-D



I uploaded a first, very rough version. It’s still slightly worse than my prototype and I’m not yet exactly sure why. Probably a rounding problem somewhere.

This still needs lots of work, and I’m not really looking for feedback (yet) as there’s still loads of stuff that I know needs to change. But as I’m not sure I’ll have time to work on it during the week I posted what I have now in case people want to play with it.

Because there’s no option to turn the margin on or off, for now I abuse the margin size as a toggle: even-sized margins are generated by the normal repeat filter, odd-sized margins by my new code. Just so you can compare.


For adjacency data, BMesh always has it, though as far as I know only Mesh is used for baking currently. Mesh does not have adjacency data. Sometimes the algorithm can be adjusted to not require it, sometimes some utility functions like in BKE_mesh_mapping.h can be used to generate temporary adjacency, and sometimes it’s best to temporarily create a BMesh.


Ok, I’ll look into it. For now I copied the adjacency table generation from my prototype. I’ll try to get rid of as much custom code as possible.


I did some minor debugging, and the output of the patch is now equivalent to the output of my prototype.
I posted a simple blend file with the patch to demonstrate.
You don’t actually need the patch to look at the demonstration, as it contains the new texture.

Still lots of things to do, but at least it shows what’s possible.

Currently I only do one step of lookups, so if your border is wider than the adjacent polygon you will get black artifacts.


I’ve noticed some strange light pixels at the edge of a polygon in the baking output:

I thought it was a rounding error in my new code, but 1h debugging later I found out it’s already in the raw cycles baking output.

Is this a known bug? Should I create a bug report? Is it some setting? I thought I saw someone mention ‘artifacts on the edges’ in the chat some days ago as if it was a known problem, but I can’t find any bug reports about it on developer.blender.org.

As it’s right on the edge of the polygon, my guess would be that the differentials are calculated wrongly before the baking pixels are submitted to Cycles… but that’s just a guess.


I updated the patch. It’s still a mess code-style-wise, but it should be functionally complete now if anyone wants to play with it.

I’ll see when I can find the time to clean it up a bit.

Example from the original case which inspired the patch.


Great progress!

It’s possible this is an unknown bug, I can’t tell from the screenshot why this would happen and not sure how to reproduce. The antialiasing/differentials code used is rather weak, so it may be known in that sense.

I’ll see if I can reproduce it. I think I still have the file that generated that. But as I was trying very hard not to get sidetracked I possibly overwrote it.

Updated the patch.

  • tried to adhere to the Blender C++ style guide. I’ve probably missed stuff, but it’s much more in line now.
  • rewrote the adjacency table generation; it’s much more efficient now.
  • chained the extend filter to fill in the pixels this one misses.
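The chaining in the last bullet can be illustrated with a simple dilation pass (my sketch of the idea, not the actual extend filter): any pixel the seam-copy pass left unset copies the value of a set 4-neighbour, and the pass repeats until nothing changes.

```cpp
#include <cassert>
#include <vector>

// One dilation pass over a w x h grid of pixel values. `filled[i]` marks
// pixels already written by the seam-copy pass; unset pixels copy a filled
// 4-neighbour. Returns whether anything changed (call again until false).
static bool dilate_once(int w, int h, std::vector<int> &pixels,
                        std::vector<char> &filled) {
  bool changed = false;
  std::vector<char> prev = filled; /* read from last pass, write into this one */
  const int dx[4] = {1, -1, 0, 0}, dy[4] = {0, 0, 1, -1};
  for (int y = 0; y < h; y++)
    for (int x = 0; x < w; x++) {
      if (prev[y * w + x]) continue;
      for (int i = 0; i < 4; i++) {
        int nx = x + dx[i], ny = y + dy[i];
        if (nx < 0 || ny < 0 || nx >= w || ny >= h || !prev[ny * w + nx]) continue;
        pixels[y * w + x] = pixels[ny * w + nx];
        filled[y * w + x] = 1;
        changed = true;
        break;
      }
    }
  return changed;
}
```

Running this after the UV-based pass means any pixel the one-step lookup misses (e.g. borders wider than the adjacent polygon) still gets a plausible colour instead of black.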

I’m open to comments now. But I guess I’ll have to ping people a week after 3.0 is released :wink: