Clockworx Music Nodes - some issues to solve

Thanks - this is what I thought originally, so I will put it back to how it was before.

On a separate note, at some point I want to introduce “Live” MIDI input from my TMK88 MIDI keyboard. Is there a way to execute the node tree all the time, like you do in AN, that I could use here?

Thanks, Clock.

PS. Got to go out now for a couple of hours, so might not respond so quickly…

To execute code many times per second you can use bpy.app.timers and bpy.app.handlers. It’s not trivial to get right though, depending on what you want to do exactly.
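
As a rough illustration of the timer approach (the function name and interval below are only placeholders), something along these lines re-runs a function roughly 25 times per second:

import bpy

def poll_midi():
    # Do the per-tick work here, e.g. read any pending MIDI events.
    print("tick")
    # Returning a float re-schedules the function after that many seconds;
    # returning None would stop the timer.
    return 0.04  # roughly 25 times per second

bpy.app.timers.register(poll_midi)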

So, the plan: back in 2.79 days I wrote some AN nodes to interface with Pygame.Midi and move objects when MIDI keys are pressed (see here). I had an interface node that monitored the feed from Pygame and fed the MIDI string to the nodes that controlled the objects.

Now I should like to re-create this in my own system, not dependent upon AN - there is far more in AN than I need for this…

The system needs to be running “all the time”, i.e. so many times per second, to monitor the feed from my MIDI keyboard. I had guessed at timers and/or handlers, but I really need something to read that shows me a workflow to achieve this.

I have created a way to animate & make sound files from a MIDI .csv file; example below:

But I really want to add an interface to my MIDI keyboard & MIDI controller.

My work to date is here; it is early days, so no docstrings, etc. yet! I did port my earlier AN work to 2.8, so I have the code necessary to make the new nodes in Clockworx Music, but I need to understand how to run it several times per second (maybe 25?) so I can animate “live” or use the keyboard to set keyframes, etc. as I did before in AN.

Thanks for your help in this, I really appreciate it.

Cheers, Clock.

OK, this is beginning to come together now:

Except (%$£&*^%$%-it :poop:) that when I tried to render an animation - separate frames into a folder - the handlers did not trigger the movements of the piano keys shown here, moved at a certain point in the animation. Oh well, can’t have it all; I suspect I did something wrong, but I have not found out what yet.

Cheers, Clock.

PS. All the sound - which you can’t hear, but I can :rofl: - was made directly from the MIDI file using Aud routines.

PPS. %$£&*^%$% = Anglo-Saxon swear words…

Clock, make us some screen recordings; I’m curious to see how your project works (when it starts working). :blush:

I just need to work out how to trigger a handler before a render frame change; I think it might be a matter of adding the handler to “render_pre”, “render_post” or “render_write”, but I am not yet sure…
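
As a sketch of what I mean (update_keys is just a stand-in for whatever routine moves the piano keys), the candidate hooks would be attached like this; which of them actually fires reliably during an animation render is what I still need to test:

import bpy

def update_keys(scene):
    # Stand-in: move the piano keys according to the current frame / MIDI data.
    pass

# Candidate hook points - frame_change_pre fires on timeline changes,
# render_pre fires before each frame is rendered.
bpy.app.handlers.frame_change_pre.append(update_keys)
bpy.app.handlers.render_pre.append(update_keys)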

As for screen-capture videos, these run at 60 fps on my Mac and it doesn’t run quite quickly enough if there is a lot of animation going on.

I’ll keep looking at the API…

Cheers, Clock.

EDIT:

In the meantime I have got animation working from a MIDI controller:

But I still have to work out how to get this to trigger so many times per second, rather than only on frame change…

None of these work; the animation does not change when rendering to PNG files, it just renders the first frame over & over. Does anyone have any ideas how to do this, please?

Cheers, Clock.

Well Sir, hours of internet searching have not given me a clue on how to make a timer that runs every, say, 0.04 seconds and executes any node with a get_midi() function.

This works as a handler on frame change:

import bpy

def start_midi(scene):
    # Visit every Clockworx Music node tree and ask each node that exposes
    # a get_midi() method to read the latest MIDI input.
    for nodetree in [
        n for n in bpy.data.node_groups
        if n.rna_type.name == "Clockworx Music Editor"
    ]:
        for n in nodetree.nodes:
            if hasattr(n, "get_midi"):
                n.get_midi()

It executes nicely every time I advance the timeline - this then reads the input from my MIDI controller. I just need a way to execute any node with the same function every nth of a second.
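
What I am after is roughly this (just a sketch, reusing the start_midi() function above):

import bpy

def midi_timer():
    # start_midi() ignores its scene argument and finds the node trees via
    # bpy.data, so it can be called straight from a timer.
    start_midi(None)
    return 0.04  # run again in 0.04 s, i.e. about 25 times per second

bpy.app.timers.register(midi_timer)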

Also, I cannot get a similar function to execute when rendering an animation; it just renders the first frame over & over. Is there any reading material you know of that might help me?

Cheers, Clock.

Not sure where you have been looking, but the examples in the API documentation show exactly how to run a function every n seconds: https://docs.blender.org/api/blender2.8/bpy.app.timers.html

Timers do have limitations (e.g. you cannot access the context), but it might work for your use case.
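
To illustrate that limitation (the object name here is made up): inside a timer function it is safer to reach data through bpy.data rather than bpy.context:

import bpy

def safe_tick():
    obj = bpy.data.objects.get("Cube")  # hypothetical object name
    if obj is not None:
        obj.location.x += 0.01
    return 0.04

bpy.app.timers.register(safe_tick)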

I don’t know how to handle the update when rendering right now. You might have to use a combination of handlers. There is not a perfect solution for this yet. This system was rewritten in Animation Nodes multiple times.

I was not sure where to put this code; I now presume the __init__.py file is a good place… I will check this, as it didn’t work elsewhere - thanks!

Sorry to be dumb, this is all new to me…

EDIT:

The __init__.py file was NOT a good place, but I have now cured the problem and it works: I have MIDI input running all the time (well, every 0.04 s), thanks for the pointers… :grin:
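
For anyone reading along, one common pattern (a simplified sketch only - the exact placement in Clockworx may differ) is to start the timer from the add-on’s register() function and remove it again in unregister(), reusing the midi_timer() sketched earlier:

import bpy

def register():
    # ... register node, socket and operator classes here ...
    if not bpy.app.timers.is_registered(midi_timer):
        bpy.app.timers.register(midi_timer, persistent=True)

def unregister():
    if bpy.app.timers.is_registered(midi_timer):
        bpy.app.timers.unregister(midi_timer)
    # ... unregister the classes here ...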

Objects are being moved, rotated and scaled by pressing buttons on my MIDI controller. I still cannot get a set of handlers that works for rendering though…

Cheers, Clock.

PS. Not going to get much more done before I go away for a week on Saturday, so I will think about how to get the handlers working whilst idling my time away on the good ship Fridtjof Nansen… :ship: :cocktail: :cocktail: :cocktail:

Finally!!! I have worked out how to add input & output sockets AND pass the data between them AND keep the last value from my MIDI controller persistent across consecutive executions if nothing has happened:
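
To give a flavour of the idea (all class, socket and function names below are illustrative, not the actual Clockworx code), the last controller value can be kept persistent by storing it in a property on the node and only overwriting it when a new MIDI event arrives:

import bpy

class CMControllerNode(bpy.types.Node):
    bl_idname = "CMControllerNode"
    bl_label = "MIDI Controller (sketch)"

    # Stored on the node, so the value survives between executions.
    last_value: bpy.props.FloatProperty(default=0.0)

    def init(self, context):
        self.outputs.new("NodeSocketFloat", "Value")

    def get_midi(self):
        event = poll_controller()  # hypothetical function reading the MIDI feed
        if event is not None:
            self.last_value = event  # remember the newest value
        # If nothing happened, the previous value is passed on unchanged.
        self.outputs["Value"].default_value = self.last_value

# (Class registration and the custom node tree setup are omitted here.)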

Now I can take my holiday (yes, yet another one) in peace! Despite the fact that I have still not cracked the Render Handlers.

Cheers, Clock. :ship: :cocktail: :older_man:

OK, last few changes before I go play on the boat:

The cone is being moved by a slider on my MIDI controller; I have revised the node so you can specify which controller does what.

See you all in a week’s time - well, I won’t see you unless you are on the same boat as me - and I will have my phone, but not my laptop. So no more work for a week, bye everybody :wave:

EDIT:

Latest code now uploaded to GitHub.

Well, I survived the boat trip! What fun we had in the Irish Sea with 50-knot winds and 7 m waves - good job I don’t get seasick; still, the jacuzzi on our balcony was very nice, particularly with a G&T.

On the home front I got this far:

This “toy” digger is driven from my Korg nanoKontrol2 and my Live MIDI nodes; I will have to get my camera out and make a video of it in operation some time.

Hope you are all well and safe from little bugs. :bug: :bug:

Cheers, Clock. :cocktail:

@jacqueslucke For now I have put many nodes in several files; would I be better off having individual files for each node in a node directory with an empty __init__.py file (roughly the layout sketched below)? The same for operators & sockets?
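
For context, the kind of layout I mean would be roughly this (file names are only examples):

clockworx_music/
    __init__.py
    nodes/
        __init__.py          # empty
        midi_controller.py   # one node per file
        midi_keyboard.py
    sockets/
        __init__.py
    operators/
        __init__.py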

Meantime I now have controls for bones in armatures:

Cheers, Clock.

I prefer the latter, as you can see in Animation Nodes. However, it probably does not matter a lot in your case.

What a cool project Mr. Clock. I have nothing important to add other than moral support.

Thanks Jacques, I spent several hours yesterday getting all the nodes, sockets, operators and menus split into separate files - it was getting very hard to remember which piece of code was where! I have not uploaded the changes yet; I think I must now check everything closely before I go any further.

Thank you for the advice, I really appreciate it.

@DanPool - thanks for the comment, I need moral support at times, working on a project like this is very challenging as most of the Node stuff is completely new to me.

Cheers, Clock.

Not that I foresee having time to commit to it, but I always thought it would be cool to put some audio-processing nodes in place so you could do at least rudimentary audio design driven by the animation or scene elements - for instance, attaching sounds to footsteps, reverb and echo levels to regions of the environment, or motor sounds to vehicles that could be modified by Doppler effects as they approach or speed away. These are all typical things that might be used in a game, but it would be nice to use them in Blender for animation output with sound.

It’s but a dream…

Yes, I have done all this in my AN version of these nodes; I have yet to transfer them to my own nodes, but it will come at some point. Doppler can be done in two ways: either with the Aud library, or by playing with the pitch of cut-up sounds.
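
As a very rough illustration of the Aud route (the file name and pitch factors are made up), shifting the pitch up for an approaching source and down for a receding one looks something like this:

import aud

device = aud.Device()
engine = aud.Sound("engine_loop.ogg")  # hypothetical sound file

approaching = engine.pitch(1.15)  # higher pitch while the object closes in
receding = engine.pitch(0.85)     # lower pitch while it moves away

handle = device.play(approaching)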

Sounds triggered by locations or movements of objects are already available in Clockworx Music Nodes - more later!

Cheers, Clock.

I haven’t transferred the Reverb node yet, but it is on the ToDo list…