Grease Pencil & Geometry Nodes Integration - Design Proposal


As part of the upcoming Grease Pencil 3.0 project, Grease Pencil will use a new curve data-type representation (the one used for the new hair system in Blender) and will support Geometry Nodes.

This design proposal explores how to integrate the two systems. There are two main lines of thought considered:

  • Implicit conversion
  • Explicit conversion

Implicit conversion means a Grease Pencil geometry could be connected directly into a Curve node. Explicit conversion mirrors the internal data representation, which makes users more conscious of the performance implications and gives them full control over them.

Common design solutions

Both scenarios have some common elements.

We will introduce Grease Pencil as a new geometry type in the Spreadsheet with the control points and spline (similar to curves), but also with the layers.

We also show Grease Pencil in the Separate Components node, in the same order as in the Spreadsheet:



Layers are an important element to control the Grease Pencil. The details are not clear yet, but we will probably need them as a new data-type (socket), as a way to restrict Geometry Nodes operations.

We also need to make sure that a node-group can be re-used for different Grease Pencil objects. That means that layers should be (most of the time) definable at the modifier level as Named Attributes.

A Layer to Selection node could even allow multiple layer inputs. Note that in this design layers are treated as data (circle socket), but they may need to be fields (diamond socket).
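To make the "Layer to Selection" idea concrete, it can be sketched outside Blender as a boolean field over points, assuming a hypothetical per-point layer attribute (the attribute name and layout here are illustrative, not actual API):

```python
import numpy as np

def layers_to_selection(point_layer_ids, selected_layers):
    """Hypothetical 'Layer to Selection': turn one or more layer ids
    into a boolean field over all points (True where the point's
    layer is in the selected set)."""
    return np.isin(point_layer_ids, list(selected_layers))

# Points tagged with the layer they belong to (0 = "Lines", 1 = "Fills", 2 = "FX")
point_layer_ids = np.array([0, 0, 1, 2, 1, 0])

# Multi-input socket: select both "Lines" and "FX"
mask = layers_to_selection(point_layer_ids, {0, 2})
# mask → [True, True, False, True, False, True]
```

A downstream node would then read `mask` like any other selection field, which is why the diamond (field) socket may be the better fit.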


Although layers will be defined at the modifier level, at some points in the workflow (or for advanced use-cases) the option to define a layer inside the node-tree is desirable. Therefore a Named Layer node could work similarly to the Named Attribute node.


Implicit conversion

The idea is to allow Grease Pencil geometry to be used directly in any Curve-compatible node (trim, fillet, resample, reverse, …). This is similar to how both Mesh and Curve geometries can be used in Point nodes (e.g., Points to Volume).

Explicit conversion

The other approach would be to have nodes to convert from/to Grease Pencil and curves.

It is not clear to me whether those would be a Curve to Grease Pencil node (which could then be joined with the other geometry types) or an update of existing Grease Pencil layers.

Here are a few ideas around that:

Final thoughts

The best approach would be to try to replicate the existing Grease Pencil modifiers with Geometry Nodes. That would give a better view of what else may be needed, or should be considered.

But at the core of it I believe most of the problems can be boiled down to the simple case. Anyone who wants to explore more complex scenarios, please share the results of your considerations here.

I also need the rest of the Geometry Nodes team to assess whether there is a performance impact between the solutions. One of the appeals of the new Grease Pencil design is to be more robust and performance efficient.


I would love this in the workflow. GP has some things (UX) curves do not, and curves have some things GP does not; being able to work with both in GN opens many doors!


Looks pretty neat. Since internally they will all share the same curve structure, I believe any generic curve-based node can be chained in this setup as well?

I am not sure I understand what the explicit conversion approach brings. If GP uses the same structure as curves, how is a conversion necessary? Additionally, what is the benefit of knowing that a conversion will happen and has a performance impact, when that conversion is necessary anyway in order to process the GP data?

Speaking of layers, perhaps they could be treated the same way materials are, with similar nodes (Set Layer, Set Layer Index, etc.).
I like the idea of “Layer to Selection”, so that all nodes that deal with points don’t need to be changed to handle GP layers as well.


Thanks for the replies so far. I will update this design task later:

  • Adding a FAQ explaining the differences between curves and grease pencil.
  • Changing the explicit conversion design to have Grease Pencil objects as instances of curves, and using Separate Geometry + Realize Instance (optionally combined in a single Grease Pencil to Curves node). This would be the one approach that may bring a big performance gain.
  • Consider Layer as simply a selection field.

I know it took a lot of effort to separate GP from the core of Blender into its own module. If something like the Separate Components node gets a GP output, doesn’t that mean tangling the GP module with the Blender core once again?

Layers aside, couldn’t GP strokes just be handled as curves with some built-in attributes representing the stroke properties stored on stroke points?
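That idea could be sketched as a flat attribute layout, where stroke properties live alongside positions as ordinary per-point attributes (the attribute names below are hypothetical, not confirmed API):

```python
import numpy as np

# A single grease pencil stroke stored as a curve: positions plus
# per-point stroke properties as generic attributes (names hypothetical).
stroke = {
    "position": np.array([[0.0, 0.0, 0.0], [1.0, 0.5, 0.0], [2.0, 0.0, 0.0]]),
    "radius":   np.array([0.10, 0.15, 0.10]),   # pressure-driven thickness
    "opacity":  np.array([1.0, 1.0, 0.8]),      # per-point strength
}

# Any generic curve field could then read or write these like ordinary
# attributes, e.g. doubling the thickness of the whole stroke:
stroke["radius"] = stroke["radius"] * 2.0
```

Under this model a node like Set Curve Radius would already "just work" on a stroke, since the stroke property is simply another attribute.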


I thought the performance gain from instances came from shared memory. If each stroke is different, is there a benefit in passing them as instances to geometry nodes? Having to go Separate + Realize before any real processing sounds like a chore if you’re in a 2D project working mostly with grease pencil data.

Anyway, to answer the original proposal, I think I’d prefer if that conversion was silent, but then it’s a bit hard to tell without knowing more about these performance implications that you mention.

While they look similar on the surface, with the GP 3.0 design, curves and grease pencil are very different internally (though they do share the same base storage data structure).

The curves data-block corresponds to a single group of curves, with a set of attributes on curves and points. On the other hand, conceptually a grease pencil data-block is more like a collection. It contains a tree of different curves data-blocks. Each might be instanced from another object or unique, and each might have a different set of attributes. This design corresponds better to the existing instances component in geometry nodes.

Performance-wise, it’s essential that we don’t make UX decisions that try to abstract a grease pencil data-block as a single curves geometry. Concatenating an arbitrary number of curves data-blocks on demand to act like a single curves data-block (with increasing indices, etc.) doesn’t work. Since many nodes like “Sample Index” and “Sample Curves” require a single curves data-block as an input (rather than instances, etc.), this means grease pencil just won’t work in some curve nodes.
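A minimal sketch of why that on-demand concatenation is costly, assuming the flat positions-plus-offsets layout the new curves type conceptually uses (names illustrative): every join must copy all point data and rewrite every curve offset.

```python
import numpy as np

def concatenate_curves(blocks):
    """Naive concatenation of several curves data-blocks into one.
    Each block stores flat point positions plus per-curve offsets into
    that array. Every call copies all point data and shifts every
    offset, which is why doing this implicitly behind every node
    would be expensive."""
    positions = np.concatenate([b["position"] for b in blocks])
    offsets = [0]
    for b in blocks:
        base = offsets[-1]
        # skip each block's leading 0 and shift the rest by the running total
        offsets.extend(base + o for o in b["offsets"][1:])
    return {"position": positions, "offsets": np.array(offsets)}

# Two layers, each its own curves data-block:
layer_a = {"position": np.zeros((5, 3)), "offsets": [0, 2, 5]}   # 2 curves
layer_b = {"position": np.ones((3, 3)),  "offsets": [0, 3]}      # 1 curve

merged = concatenate_curves([layer_a, layer_b])
# merged["offsets"] → [0, 2, 5, 8], i.e. 3 curves over 8 points
```

Doing this once, explicitly, when the user asks for it is cheap enough; doing it silently before every curve node would repeat the full copy over and over.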

For the rest of the proposal, I think it will be necessary to figure out what data a “Layer” socket corresponds to, in order to find out if they’re actually necessary. They’re used more like selections in these examples.


About explicit vs. implicit conversion, I think merging all the layers into a single curves datablock should probably be an explicit conversion. It affects performance quite significantly, and it’s probably unnecessary most of the time.

What I do think is commonly useful is to apply certain operations to all layers, like a transform or other field that doesn’t require interaction between curves, where one curve or curve point can be manipulated at a time. Making that convenient would be nice, but I’m not sure what the right way to fit that in geometry nodes would be. Nodes could apply the same operation to each layer individually, but would that be a special exception? Or would there be a similar mechanism anyway for e.g. volume grids?

About the layer socket, would that effectively be just a string with the layer name? The main advantage I see is that it can provide a better UI, with autocomplete of layer names or drag & drop in node groups and the modifier.

But then would we also add dedicated sockets for attributes, potential hair curve sets, bones, … in the future? Or should we add some metadata to strings instead, and improve the UI that way?


Explicit nodes to select a particular layer or set of layers perhaps should also not be called conversion; it’s more like separate/join?

We do the same thing for instances right now: each nested instance geometry can still be passed through nodes; the geometry is just processed in its local space, with no idea that it’s actually an instance. For example, because the String to Curves node generates instances, the fill for a “t” curve is only calculated once in the example below, and each letter is processed separately.
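That once-per-unique-reference behaviour can be sketched like this (a toy model, not Blender code): the expensive operation runs once per referenced geometry, and each instance only carries a transform.

```python
def process_instances(instances, operation):
    """Sketch of instance processing: the referenced geometry is
    evaluated once per unique reference (in its local space), no
    matter how many instances point at it. Names and layout are
    illustrative."""
    cache = {}
    result = []
    for ref, transform in instances:        # e.g. one entry per letter
        if ref not in cache:
            cache[ref] = operation(ref)     # the "fill" runs once per unique glyph
        result.append((cache[ref], transform))
    return result

calls = []
def fill(glyph):
    calls.append(glyph)                     # record how often the fill runs
    return f"filled-{glyph}"

# A "t" appearing twice, as in the string-to-curves example:
out = process_instances([("t", 0), ("e", 1), ("t", 2)], fill)
# calls → ["t", "e"]  (the fill ran twice, not three times)
```

Grease pencil layers could plausibly get the same treatment: each layer's curves processed once in local space, with layer-level data playing the role of the transform.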

Something similar could work for grease pencil layers. Not sure it would be a good idea, but it would even be possible to take that a bit further and just require converting grease pencil to instances for processing.


Re-Creating Modifiers in Geo-Nodes

The best approach would be to try to replicate the existing Grease Pencil modifiers with Geometry Nodes. That would give a better view of what else may be needed, or should be considered.

I think this will also reveal some issues/limitations that the current modifiers have. Time Offset would be the modifier I am most interested in seeing as a geo-node proposal, as this has many implications for how time works within this setup.

I am working on a public cut-out rigging addon for 3.6 LTS; it shows some of the complicated, non-conventional ways ‘time’ is used in Grease Pencil to control what is being shown. You can see a preview of that system here: BLOWNAPART 2D Rigging Addon Demo - YouTube

Grease Pencil Frames/Time

Grease Pencil should definitely have more explicit data elements to consider beyond what comes with curves. More performance is cool, but to @HooglyBoogly’s point, there are additional elements within grease pencil that are not part of curves. One big element, in addition to layers, is frames/time!

Within a grease pencil layer you can have several different frames, each holding explicitly different stroke data. I think this will be a large topic for porting grease pencil into geometry nodes. How does a user set which frame they are working with, and edit frames other than the current one (or can data only be input on the scene’s current frame?)? We also have to consider the relationship between which frame has data and which frame is being displayed. This has a lot of influence over how cut-out rigging could work.
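Conceptually, a layer's frames could be modelled as a mapping from keyframe number to stroke data, with the displayed frame being the latest keyframe at or before the current scene frame. This is only a sketch of grease pencil's hold behaviour, not actual API:

```python
def frame_to_display(layer_frames, current_frame):
    """Which keyframe a layer shows at a given scene frame: the latest
    keyframe at or before the current frame (grease pencil's 'hold'
    behaviour). Returns None before the first keyframe."""
    candidates = [f for f in layer_frames if f <= current_frame]
    return max(candidates) if candidates else None

# One layer whose keyframes each hold their own stroke data:
layer = {1: "pose A strokes", 10: "pose B strokes", 25: "pose C strokes"}

shown = frame_to_display(layer, 12)
# shown → 10, so "pose B strokes" is held and displayed at frame 12
```

Any node that reads or writes "the layer's strokes" would implicitly be going through a lookup like this, which is exactly the which-frame-has-data vs. which-frame-is-displayed distinction.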

Now, cut-out rigging doesn’t explicitly require time, just keyframe instancing, but avoiding the concept of frames will be a pain, as a lot of 2D workflows are very time-based. So I hope we retain the ability to control frame/time properties from a geo-node.

For a crazy example, consider a user using simulation nodes to generate some mesh, like a rope moving around, then feeding that simulation into generating grease pencil strokes on top of the rope: grease pencil strokes saved on different frames as the simulation plays!

Has anyone else considered how time/frames would work in this proposal?

Line Art Considerations

Considering the New Line Art proposal 2023, feedback? thread from @ChengduLittleA, I would just like to point out that the line art workflow would likely be heavily influenced by this proposal.

For my personal project Tiny Media - YouTube I have been creating 3D buildings with Line Art outlines to give them a 2D look. Currently it’s an all-or-nothing approach to processing geometry for Line Art. So in addition to having a way to create new grease pencil data, it would also be interesting to use Geometry Proximity or something similar to generate custom outlines with nodes, filter out parts of the geometry for line art, etc.

I haven’t used Geometry Nodes much, but regarding the modifier filters, the design should include the following (some may already be supported by GN):

  • Layer Name
  • Layer Index (Arbitrary number assigned to layers)
  • Material
  • Material Index (Similar to Layer Index)
  • Vertex Group

All filters can be set to include or exclude.

This is just to ensure that the new design supports these options.


Another important point we should never forget is that 2D artists are more artists than technicians, so keeping the tool easy to use is not superfluous. The design should take into account that most 2D artists will not need super complex things (they are already capable of creating many things manually), and add complexity only for the users who need special things. If we make everything complex, we will lose a large part of 2D artists, who will opt for other software that is friendlier to them.


Don’t make geometry nodes very simple just for artists. That should be delegated to asset creators, as with hair. Extensibility and correctness still seem to be the priority.

EDIT: Though on reflection, there seems to be one problem in my mind: what can artists actually do with nodes, beyond recreating modifiers?

At no time have I wanted to say that the system should not be powerful; what I wanted to say is that you have to be able to get basic results with a basic setup. We can’t ask a 2D artist to create a bunch of nodes to change the line thickness when they can now do it in a couple of seconds with a simple modifier. Perhaps predefined nodes as assets could work.


I didn’t mean to contradict you; I just realized it makes sense to say this.

So far the problem for me is this: a sculptor, someone who draws with a graphics tablet, …
What can a node system do for them? The hair example shows good things, but I’m not sure a pencil line is something that can easily be generated; hair, in itself, is already the result of a complex abstraction.
What are the plans for using a pencil with nodes? What can potentially be done, beyond improving the management of already-existing modifiers and integration with other types of geometry?

A lot of people build generators for buildings, trees, furniture, rigs of machines and mechanisms, physics simulations, … with nodes.
I can imagine abstract UI or diagrams with a pencil, or post-processing effects and overlays. But it doesn’t look like something artists do yet.

EDIT: Why I think this is important: it would give an idea of what kind of generic operations would be useful, and as a result it would show how to define complexity. If we consider this a very early stage of the experiment, then the complexity is not so important; it can simply be simplified in the future.


You are talking about something different. As I understood it, you are talking about Line Art + Geometry Nodes, not drawing with Grease Pencil and using Geometry Nodes for additional effects.

My understanding is that @modmoderVAAAA was talking about drawing nodes / brush nodes? Those would affect the painting/sculpting experience, whereas the GP integration mostly in question here is one where the artist processes existing strokes, as they currently can with meshes & curves.

No, I’m talking about how you generate pencil strokes with nodes, not about node-based brushes and the like. I mentioned artists in the sense that you either make a completely nodal project or draw something by hand.