Grease Pencil & Geometry Nodes Integration - Design Proposal

Thanks for the replies so far. I will update this design task later:

  • Adding a FAQ explaining the differences between curves and grease pencil.
  • Changing the explicit conversion design to have Grease Pencil objects as instances of curves, and using Separate Geometry + Realize Instances (optionally combined in a single Grease Pencil to Curves node); a sketch of this chain follows below. This is the one approach that may bring a big performance gain.
  • Considering “Layer” as simply a selection field.
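For illustration, here is a minimal sketch of that Separate Geometry + Realize Instances chain built as a node group via Python. This assumes the 3.x node-group API (the socket interface API changed in 4.0) and that layers would be exposed on the instance domain; the group and socket names are placeholders, not part of the proposal.

```python
import bpy

# Hypothetical "Grease Pencil to Curves" chain: separate the wanted layer
# instances, then realize them into a single curves geometry.
ng = bpy.data.node_groups.new("GP to Curves (sketch)", 'GeometryNodeTree')
ng.inputs.new('NodeSocketGeometry', "Geometry")
ng.outputs.new('NodeSocketGeometry', "Curves")

group_in = ng.nodes.new('NodeGroupInput')
separate = ng.nodes.new('GeometryNodeSeparateGeometry')  # layer selection would plug into its Selection input
realize = ng.nodes.new('GeometryNodeRealizeInstances')   # flatten the chosen layers into one curves geometry
group_out = ng.nodes.new('NodeGroupOutput')

separate.domain = 'INSTANCE'  # assumes layers are exposed as instances in this design
ng.links.new(group_in.outputs["Geometry"], separate.inputs["Geometry"])
ng.links.new(separate.outputs["Selection"], realize.inputs["Geometry"])
ng.links.new(realize.outputs["Geometry"], group_out.inputs["Curves"])
```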
4 Likes

I know it took a lot of effort to separate GP from the core of Blender into its own module. If something like the Separate Components node gets a GP output, doesn’t that mean entangling the GP module with the Blender core once again?

Layers aside, couldn’t GP strokes just be handled as curves, with some built-in attributes representing the stroke properties stored on stroke points?

1 Like

I thought the performance gain from instances came from shared memory. If each stroke is different, is there a benefit in passing them as instances to geometry nodes? Having to go separate + realize before any real processing sounds like a chore if you’re in a 2D project working mainly with grease pencil data.

Anyway, to answer the original proposal, I think I’d prefer it if that conversion were silent, but it’s a bit hard to tell without knowing more about the performance implications you mention.

While they look similar on the surface, with the GP 3.0 design, curves and grease pencil are very different internally (though they do share the same base storage data structure).

The curves data-block corresponds to a single group of curves, with a set of attributes on curves and points. Conceptually, a grease pencil data-block is more like a collection: it contains a tree of different curves data-blocks. Each might be instanced from another object or unique, and each might have a different set of attributes. This design corresponds better to the existing instances component in geometry nodes.
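For illustration only, a very rough conceptual sketch of that structural difference in plain Python; these are not the actual GP 3.0 C++/DNA types, and the field names are placeholders:

```python
from dataclasses import dataclass, field

# Rough conceptual sketch only -- not the real data structures.
@dataclass
class Curves:
    point_attributes: dict = field(default_factory=dict)  # e.g. "position", "radius"
    curve_attributes: dict = field(default_factory=dict)  # e.g. "material_index"

@dataclass
class GreasePencilLayer:
    name: str
    # each keyframe references its own curves geometry, possibly shared/instanced
    frames: dict[int, Curves] = field(default_factory=dict)

@dataclass
class GreasePencil:
    # a tree of layers/layer groups, each pointing at separate curves data
    layers: list[GreasePencilLayer] = field(default_factory=list)
```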

Performance-wise, it’s essential that we don’t make UX decisions that try to abstract a grease pencil data-block as a single curves geometry. Concatenating an arbitrary number of curves data-blocks on demand to act like a single curves data-block (with increasing indices, etc.) doesn’t work. Since many nodes like “Sample Index” and “Sample Curves” require a single curves data-block as an input (rather than instances, etc.), this means grease pencil just won’t work in some curve nodes.


For the rest of the proposal, I think it will be necessary to figure out what data a “Layer” socket corresponds to, to find out whether such sockets are actually necessary. They’re used more like selections in these examples.

5 Likes

About explicit vs. implicit conversion, I think merging all the layers into a single curves data-block should probably be an explicit conversion. It affects performance quite significantly, and it’s probably unnecessary most of the time.

What I do think is commonly useful is to apply certain operations to all layers, like a transform or other field that doesn’t require interaction between curves, where one curve or curve point can be manipulated at a time. Making that convenient would be nice, but I’m not sure what the right way to fit that in geometry nodes would be. Nodes could apply the same operation to each layer individually, but would that be a special exception? Or would there be a similar mechanism anyway for e.g. volume grids?

About the layer socket, would that effectively be just a string with the layer name? The main advantage I see is that it can provide a better UI, with autocomplete of layer names or drag & drop in node groups and the modifier.

But then would we also add dedicated sockets for attributes, potential hair curve sets, bones, … in the future? Or should we add some metadata to strings instead, and improve the UI that way?

2 Likes

Explicit nodes to select a particular layer or set of layers should perhaps also not be called conversion; it’s more like separate/join?

We do the same thing for instances right now: each nested instance geometry can still be passed through nodes, and the geometry is just processed in its local space, with no idea that it’s actually an instance. For example, because the String to Curves node generates instances, the fill for a “t” curve is only calculated once in the example below, and each letter is processed separately.

Something similar could work for grease pencil layers. Not sure it would be a good idea, but it would even be possible to take that a bit further and just require converting grease pencil to instances for processing.

5 Likes

Re-Creating Modifiers in Geo-Nodes

The best approach would be to try to replicate the existing Grease Pencil modifiers with Geometry Nodes. That would give a better view of what else may be needed or should be considered.

I think this will also reveal some issues/limitations that the current modifiers have. Time Offset would be the modifier I am most interested in seeing as a geometry nodes proposal, as it has many implications for how time works within this setup.
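As a starting point for that inventory, a quick sketch using the current bpy API (assuming the active object is a Grease Pencil object) that lists which modifiers are in use:

```python
import bpy

# List the Grease Pencil modifiers on the active object, as a quick way to
# see which ones would need a node-based equivalent.
obj = bpy.context.object  # assumed to be a Grease Pencil ('GPENCIL') object
for mod in obj.grease_pencil_modifiers:
    print(mod.name, mod.type)  # e.g. the Time Offset modifier reports type 'GP_TIME'
```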

I am working on a public cut-out rigging add-on for 3.6 LTS; it shows some of the complicated, non-conventional ways ‘time’ is used in Grease Pencil to control what is being shown. You can see a preview of that system here: BLOWNAPART 2D Rigging Addon Demo - YouTube

Grease Pencil Frames/Time

Grease Pencil should definitely have more explicit data elements to consider beyond what comes with curves. More performance is cool, but to @HooglyBoogly’s point, there are additional elements within grease pencil that are not part of curves. One big element in addition to layers is frames/time!

Within a grease pencil layer you can have several different frames, each with explicitly different stroke data. I think this will be a large topic for porting grease pencil into geometry nodes: how does a user set which frame they are working with, how do they edit frames other than the current frame (or maybe we can only input data on the scene’s current frame?), and how do we handle the distinction between which frame has data and which frame is being displayed? This has a lot of influence over how cut-out rigging could work.
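To make that structure concrete, a minimal sketch using the current (legacy 'GPENCIL') bpy API, assuming the active object is a Grease Pencil object; every layer owns its own keyframes, and each keyframe holds completely separate stroke data:

```python
import bpy

gp = bpy.context.object.data  # assumed Grease Pencil data-block
for layer in gp.layers:
    for frame in layer.frames:
        # each keyframe stores its own, independent set of strokes
        print(f"layer {layer.info!r}: keyframe at frame {frame.frame_number} "
              f"holds {len(frame.strokes)} strokes")
```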

Now, cut-out rigging doesn’t explicitly require time, just keyframe instancing, but avoiding the concept of frames will be a pain, as a lot of 2D workflows are very time-based. So I hope we retain the ability to control the frame/time property from a geometry node.

For a crazy example, consider a user using simulation nodes to generate some mesh, like a rope moving around, and then feeding that simulation into nodes that generate grease pencil strokes on top of the rope: grease pencil strokes saved on different frames as the simulation plays!

Has anyone else considered how time/frames would work in this proposal?

Line Art Considerations

Considering the New Line Art proposal 2023, feedback? thread from @ChengduLittleA, I would just like to point out that that line art workflow would likely be heavily influenced by this proposal.

For my personal project Tiny Media - YouTube, I have been creating 3D buildings with Line Art outlines to give them a 2D look. Currently it’s an all-or-nothing approach to processing geometry for Line Art. So in addition to having a way to create new grease pencil data, it would also be interesting to use Geometry Proximity or something similar to generate custom outlines with nodes, filter out parts of the geometry for line art, etc.

I haven’t used Geometry Nodes much, but regarding the modifier filters, the design should include the following (some may already be supported by GN):

  • Layer Name
  • Layer Index (Arbitrary number assigned to layers)
  • Material
  • Material Index (Similar to Layer Index)
  • Vertex Group

All filters can be set to include or exclude.

This is just to ensure that the new design supports these options.
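For reference, a sketch of how the existing (3.x) Grease Pencil modifiers expose these filters today, using the Thickness modifier as an example; property names come from the current bpy API and may of course differ in the new node-based design, and the layer/material/group names are hypothetical:

```python
import bpy

obj = bpy.context.object  # assumed to be a Grease Pencil object
mod = obj.grease_pencil_modifiers.new("Thickness", 'GP_THICK')

mod.layer = "Lines"                           # Layer Name
mod.layer_pass = 1                            # Layer Index (pass number assigned to layers)
mod.material = bpy.data.materials.get("Ink")  # Material (hypothetical material name)
mod.pass_index = 2                            # Material Index (pass number)
mod.vertex_group = "Group"                    # Vertex Group
mod.invert_layers = True                      # each filter has an "invert" (exclude) toggle,
                                              # e.g. invert_materials, invert_vertex, ...
```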

1 Like

Another important point that we should never forget is that 2D artists are more artists than technicians, so keeping the tool easy to use is not superfluous. The design should take into account that most 2D artists will not need super complex things (they are already capable of creating many things manually) and should add complexity only for users who need special things. If we make everything complex, we will lose a large part of 2D artists, who will opt for other software that is friendlier to 2D artists.

4 Likes

Don’t make geometry nodes overly simple just for artists; that should be delegated to asset creators, as was done for hair. Extensibility and correctness still seem to be the priority.

EDIT: Though on reflection, there is one problem in my mind: what can artists actually do with nodes, beyond restoring the existing modifiers?

At no time did I want to say that the system should not be powerful; what I wanted to say is that you have to be able to get basic results with a basic setup. We can’t ask a 2D artist to create a bunch of nodes to change the line thickness when they can currently do it in a couple of seconds with a simple modifier. Perhaps predefined node groups as assets could work.

3 Likes

I didn’t mean to contradict you; I just realized that it makes sense to say this.

So far the question for me is: a sculptor, someone who draws with a graphics tablet, …
What can they do with a node system? The hair example shows good things, but I’m not sure a pencil line is something that lends itself to being generated; hair, in itself, is the result of some complex abstraction.
What are the plans for using grease pencil with nodes, and what can potentially be done, beyond improving the management of the already existing modifiers and the integration with other types of geometry?

A lot of people use nodes to build generators of buildings, trees, furniture, rigs of machines and mechanisms, physics simulations, and so on.
I can imagine abstract UI or diagrams made with a pencil, or post-processing effects and overlays. But that doesn’t look like something artists do yet.

EDIT: Why I think this is important: it would give an idea of what kinds of generic operations would be useful, and as a result it would show how to define the complexity. If we consider this a very early stage of the experiment, then the complexity is not so important; it can simply be simplified in the future.

1 Like

You are talking about something different. As I understood it, you are talking about Line Art + Geometry Nodes, and not about drawing grease pencil with geometry nodes for additional effects.

My understanding is that @modmoderVAAAA was talking about drawing nodes / brush nodes? Those would affect the painting/sculpting experience, whereas the GP integration mostly in question here is the one where the artist processes existing strokes, as they currently can with meshes and curves.

No, I’m talking about how you generate pencil strokes with nodes, not about node-based brushes or anything like that. I mentioned artists in the sense that you either make a completely node-based project or draw something with your own hands.

2 Likes

A question, please, concerning GP 3.0 and geometry nodes: will there be Tilt data in the new structure when we convert a curve to grease pencil, so that the tilt data is preserved if we convert it back?

Maybe Tilt data could be used in a GP object to let grease pencil strokes act as a path restriction, with other strokes following along a stroke or around a fill the way a curve object does, at the geometry node level (and/or at the constraint level).

E.g. a bunch of strokes following a path along another stroke, and/or strokes jumping from one path to the next depending on stroke index (stroke 1 = path 1, stroke 2 = path 2, etc.).
The strokes used as paths could be selected by layer or attribute, by vertex group (any stroke that has a point in the vertex group would be selected), or by the materials they use, eventually by pass index like @antonioya mentioned above.

The tilt would be useful to make strokes rotate around the path stroke (follow curve), without forgetting the radius of the stroke.
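For context, a small sketch of where tilt lives on curves today, using the current bpy API (the curve name is hypothetical); the open question above is whether this value survives a curve ↔ grease pencil round trip:

```python
import bpy

curve = bpy.data.curves.get("BezierCurve")  # hypothetical existing curve data-block
if curve is not None:
    for spline in curve.splines:
        # Bezier and poly/NURBS control points both carry a per-point tilt
        points = spline.bezier_points if spline.type == 'BEZIER' else spline.points
        for p in points:
            print(p.tilt)  # rotation of the point around the curve's axis
```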

2 Likes

Hi, thanks everyone for the feedback. Due to scheduling (we have a Nodes Tools workshop this week) I won’t be able to work on this design proposal until next week.

That said, Curve Instances as proposed by Hans may be a good way forward, in particular if they can be more performant and allow for the simplicity of using curve nodes directly on the Grease Pencil data (e.g., Curve Trim).

It is still not clear to me what we get from “instances” itself though.

2 Likes

To add a few thoughts to this exciting ‘GP meets GN’ topic:

Stroke and layer order
In a 2D workflow, stroke and layer order matters: it determines which stroke is painted on top of which. So a design decision has to be made whether or not to support this principle in GN.

  • When supported, a mechanism must be available to control/preserve layer and stroke order.
  • If GN only supports strokes in 3D space, the consequence is that GN can’t be used as a replacement for the existing GP modifiers, because preserving stroke order is a clear requirement for those modifiers.

What are layers?
Layers are mainly mentioned as input selections so far, but the output side is equally important. Generated strokes in GN must somehow be assigned to a layer, otherwise they don’t exist in the GP data structure.
And layers have transform attributes (location, rotation, scale). This is important for the evaluation of strokes to curves in world space, and for the other way around: assigning strokes in world space to layers.
We have to ask Falk about layer groups. Layer groups are added in the new GP data structure – I don’t know if they can have transform attributes as well.

Stroke = curve?
GN curves can serve as GP strokes when they can handle this:

  • For stroke points: position, radius, strength (alpha factor) and vertex color.
  • For strokes: thickness, hardness, material.

Something typical for GP worth mentioning here: the color of a stroke point can be defined by the material color (applied to the entire stroke) or by the vertex color (applied to the point). That differs, I guess, from 3D curves.
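As a reference for that mapping, a small sketch of the data a stroke carries in the current (legacy 'GPENCIL') bpy API, i.e. what a curves-based representation would need to store as attributes; it assumes an active layer with a keyframe containing at least one stroke:

```python
import bpy

gp = bpy.context.object.data  # assumed Grease Pencil data-block
stroke = gp.layers.active.active_frame.strokes[0]

# per-stroke data
print(stroke.line_width, stroke.hardness, stroke.material_index)

# per-point data
for pt in stroke.points:
    print(pt.co, pt.pressure, pt.strength, tuple(pt.vertex_color))
```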

5 Likes

I work on a web-based game that heavily relies on 3D assets for the map; however, we actually export them as SVG via grease pencil.

The problem is that if we want to change the model at all, we have to edit the mesh, convert it to grease pencil, and then redo all the changes we had made on the previous version; in our case we need to actually paint on emulated lighting, because SVG exports don’t support lighting.

So, if there were a way to programmatically generate a grease pencil object based on a given mesh, that would truly be everything I’ve ever wanted from Blender. What was the outcome of this proposal?
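For what it’s worth, something along these lines is already possible today without nodes; a minimal sketch using the current bpy API (assuming the active object is the mesh in question):

```python
import bpy

# Convert the active mesh object into a Grease Pencil object, keeping the original.
# A Line Art modifier ('GP_LINEART') on a Grease Pencil object is the other common
# way to generate strokes from meshes programmatically.
mesh_obj = bpy.context.object  # assumed to be a mesh object
bpy.context.view_layer.objects.active = mesh_obj
bpy.ops.object.convert(target='GPENCIL', keep_original=True)
```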

1 Like