Clockworx Music Nodes - some issues to solve

OK, I have a basic DAW Sequencer working: the notes are read, their position determines the note frequency, their length determines the note length, and the output sound can either be written to the VSE or played as a sound animation:
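
For reference, the position-to-frequency mapping can be sketched in plain Python, assuming equal temperament with A4 = 440 Hz (the function name and base frequency are my own illustration, not taken from the add-on):

```python
def note_frequency(semitone_offset, base_freq=440.0):
    """Frequency of a note some semitones above or below a base pitch.

    Equal temperament: each semitone multiplies the frequency by 2**(1/12),
    so an octave (12 semitones) doubles it.
    """
    return base_freq * 2 ** (semitone_offset / 12)
```

So a note placed one octave above the A4 reference line would map to 880 Hz.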

I will investigate how this primitive system might be used to generate sound from objects moving, or interacting.

Cheers, Clock.


OK, Oi Clock - why do you keep starting posts and sentences with “OK” & “So” - such bad English …anyway, I came up with this:

So, having downloaded a “fart” sound from the internet, now, whenever the little empty is at Z = 1 and was not there on the frame before, it makes an, err, ummm, farting noise from my speakers… Oh the things that you have to do to amuse yourself when you get to my age and are told by your government to isolate yourself.
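
The “not there on the frame before” check is just a rising-edge trigger; here is a minimal sketch of the idea in plain Python (the function and parameter names are mine, not the add-on's):

```python
def crossed_threshold(z_now, z_prev, threshold=1.0):
    # Fire only on the first frame the empty reaches Z = threshold,
    # not on every subsequent frame it stays there.
    return z_now >= threshold and z_prev < threshold
```

Tracking the previous frame's Z per object is all that is needed to stop the sound re-triggering every frame.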

BTW Mrs. C has told me she is self-isolating away from the kitchen - not good!

Cheers, Clock. :poop: :woman_health_worker: :dancer: :bug:


Keep dreaming sir, sometimes they come true:

Cheers, Clock. :laughing: :cocktail:



I am stuck in a position I cannot resolve:

I have a node that has three properties and an execute routine that concatenates them together into a list, and this list is returned by a return statement. So far so good. If I plug in a node that reads the output using an operator, I see the correct return. If I plug in a node that reads the incoming socket, I get what is in the first property only - frustrating to say the least.

Then I want to output two variables from another node into two output sockets; whatever I do I always get the same value in both sockets, although my return statement says: return var_1, var_2, or words to that effect.

So, clearly I’m missing something in my node, or nodetree definitions, but I don’t know what. The code is all here, maybe not the latest, but it shows my tree definition in the file.

Can someone point me in the right direction, like some worked examples or explanations as to how to define a tree, nodes, sockets, etc., and how to get multiple output sockets and correctly read input sockets, please?

Believe me, I have tried repeatedly to solve this without success; unless I can put this right I can take this project no further and will have to abandon it.

Here’s hoping for some help, Clock.

Hmmm, let’s try an easier question:

What is the purpose of a “base” node as found in Animation Nodes and Sorcar, for example? What should be in it, and why?

Failing anyone knowing the answer, where might I look for guidance please?

I think I have solved the other issues using dictionaries, but I need further checks, however things like this now work:

In the last image, I have split the sound into 8 different chunks in the Equaliser so I can put a different echo setting, or even a different filter, on each channel.

Cheers, Clock. :poop:

OK, thanks for the help ( :rofl: :rofl: :joy: :wink: :rofl:). I solved the multiple-outputs problem by using a single socket and passing a dictionary of values through it; the connected node then extracts what it wants. That is, I never did find out how to send data to two output sockets, but no worries - this dictionary method works in all situations for me. I used a dictionary so I can extract, for example, a sound: the dictionary key would be “sound” for a sound, and so on.
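
A minimal sketch of that dictionary-payload pattern in plain Python - one socket carries a dict, and each downstream node pulls out only the key it understands (the helper names are hypothetical; only keys like “sound” come from the post above):

```python
def build_payload(**values):
    # Bundle everything a node produces into one dict,
    # suitable for passing through a single output socket
    return {key: val for key, val in values.items() if val is not None}

def read_payload(payload, key, default=None):
    # A downstream node extracts just the entry it cares about
    return payload.get(key, default)

packet = build_payload(sound="fart.wav", note_name="C4", note_duration=0.5)
```

A sound node would ask for "sound", a note node for "note_name", and any key that is absent simply comes back as the default.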

I also sorted the same method to get Note Data from that node, i.e. it passes keys like “note_name”, “note_duration”, etc.

I remain “utterly clueless” as to what the base node is supposed to do; I cannot decipher any meaning from the examples I quoted. Oh well, such is life…

So this now works:

Notes are triggered by objects in the viewport as the timeline is run.

I also built an FIR-IIR Filter, I had fun with tuples here:
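
For anyone curious, the core of an FIR-IIR filter is a short difference equation over tuples of coefficients. Here is a self-contained sketch, not the actual node code - the coefficient values and names are illustrative:

```python
def apply_filter(samples, b=(0.5, 0.5), a=(1.0,)):
    """Direct Form I: y[n] = sum(b[k]*x[n-k]) - sum(a[k]*y[n-k] for k >= 1).

    b holds the feed-forward (FIR) coefficients; a holds the feedback (IIR)
    coefficients with a[0] normalised to 1.  With a == (1.0,) the filter
    is purely FIR.
    """
    x_hist = [0.0] * len(b)              # recent inputs, newest first
    y_hist = [0.0] * max(len(a) - 1, 1)  # recent outputs, newest first
    out = []
    for x in samples:
        x_hist = [x] + x_hist[:-1]
        y = sum(bk * xk for bk, xk in zip(b, x_hist))
        y -= sum(ak * yk for ak, yk in zip(a[1:], y_hist))
        out.append(y)
        y_hist = [y] + y_hist[:-1]
    return out
```

The default b = (0.5, 0.5) is a two-tap moving average, i.e. a gentle low-pass.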

Plodding on remorselessly down life’s lonely road, Clock. :grin:

PS. Is everybody coping with their isolation like I am? :tumbler_glass: :tumbler_glass: :tumbler_glass: :tumbler_glass: :tumbler_glass: :tumbler_glass: :tumbler_glass:


Clock, your work is really cool. When Blender 2.90 has particle nodes and when more of Blender is node-based, is your plan to try and get Clockworx nodes integrated into the main branch of Blender? I am a former concert pianist and played in jazz groups, and it would be great to be able to have an official Blender version that I could hook up to my MIDI keyboard and do cool stuff with.


Maybe one day - it is not ready for that yet, but who knows? It will be released on my GitHub at some point when it is nearer completion; for the moment it is still WIP!

I am working on getting collections and objects into the node structure just now; live MIDI is working, but needs some re-coding to improve the structure:

Cheers, Clock. :bug:

Thank you for the reply, Clock. I would hope that it eventually does get in master. That would be great!


More progress with the Live MIDI:

Better structure, and now I can feed objects and bones… The digger now works via an IK chain as I control the bucket with my Korg nanoKontrol2 :dancer:

Cheers, Clock. :wave:


OK, next issue solved:

These two cubes are being animated in Z axis scale by the nodes here, fed from the sound input of my Mac. :wink:

I just need to work out how to stop “dumb users” connecting sockets that don’t match - any clues, anyone? I cannot find a reference to this, but then again I have asked this before and nobody answered me…

Cheers, Clock. :dancer:


Also works for Armatures:


My developer skills are pretty much zero, so I haven’t been able to offer any help as I follow the development, but I will say this project looks cool!

I know Animation Nodes won’t allow the ‘wrong’ types of sockets to connect, but I don’t know specifically how.


Ahah! I have sorted it so sockets only connect to others of the same kind, other than Generic sockets, which can connect to anything else!
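
For what it's worth, the compatibility rule itself boils down to a tiny predicate. Here is a plain-Python sketch - the socket type names are hypothetical; in Blender you would compare the sockets' bl_idname values, typically inside the node tree's update() method, and remove any link that fails the test:

```python
GENERIC = "CM_SocketGeneric"  # hypothetical bl_idname for the catch-all socket

def sockets_compatible(from_type, to_type):
    # Generic sockets connect to anything; all other types must match exactly
    if GENERIC in (from_type, to_type):
        return True
    return from_type == to_type

# Inside Blender this would drive something like:
#   for link in tree.links:
#       if not sockets_compatible(link.from_socket.bl_idname,
#                                 link.to_socket.bl_idname):
#           tree.links.remove(link)
```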

So, whilst I was lying under my old lady doing a repetitive task that required little brain power, I solved the issue. Now before you all get the wrong end of the stick, I should point out that by “my old lady” I am referring to my glider and obviously not Mrs. Clockmender (some people’s minds…tut, tut), I hope that is all clear now. :roll_eyes:

Speaking of which, or is that witch? :rofl: When I got home I explained to Mrs. C. how I was going to sort the problem, to which she replied that what I had come up with was in fact the correct solution, and that she had not told me earlier because she felt it would be better for me if I worked it out for myself. This is presumably what has happened here also… :stuck_out_tongue_closed_eyes:

As a side issue I also found out how to automatically colour code the nodes based upon what they do:

Whilst you might say that the colours are bloody awful, at least they have proved the point and can be changed. :wink:
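
The colour coding can be sketched as a simple category-to-colour lookup. The categories and RGB values below are invented for illustration; in Blender itself you would then set node.use_custom_color = True and assign node.color when the node is created:

```python
CATEGORY_COLOURS = {
    # Hypothetical node categories with illustrative RGB values (0-1 range)
    "input":  (0.3, 0.5, 0.3),
    "filter": (0.5, 0.3, 0.3),
    "output": (0.3, 0.3, 0.5),
}

def colour_for(category, default=(0.5, 0.5, 0.5)):
    # Look up a node's display colour from what it does,
    # falling back to neutral grey for unknown categories
    return CATEGORY_COLOURS.get(category, default)
```

Keeping the palette in one dict means the “bloody awful” colours can be changed in a single place.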

I have not posted my new nodes onto GitHub yet, I may do this some time next week once I have finished playing around with them, the nodes that is. :face_with_hand_over_mouth:

Cheers, Clock. :tumbler_glass: :crazy_face:


Next thing on my ToDo list was Baking Sound to Controls:

Successfully accomplished: you can specify the sound file, the maximum frequency range, the number of frequency splits (I use harmonic splits based upon the n/12th root of 0.5), and a host of other factors to bake a sound to the controls’ F-Curves.
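
That harmonic split can be sketched directly: each band edge sits a semitone below the previous one, i.e. edge_n = f_max * 0.5 ** (n / 12). The function below is my own illustration of that formula, not the node's code:

```python
def band_edges(f_max, splits):
    # n/12th-root-of-0.5 spacing: successive edges drop by a factor of
    # 0.5 ** (1/12), so 12 splits span exactly one octave.
    return [f_max * 0.5 ** (n / 12) for n in range(splits + 1)]
```

For example, 12 splits starting from 880 Hz give band edges running down to 440 Hz, one per semitone.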

These controls can then be used to animate objects; I have not done that in a project yet, but I already have the necessary nodes in place - these are the ones I use to animate from MIDI controls.

Cheers, Clock. :tumbler_glass:

PS. As a bonus, I also worked out how to switch the context to different editors…

Here are the F-Curves in the Graph Editor:

Yay! :joy: - sound-bake animated objects:


hey clocking mate, I think the time has come to do that famous screen recording that shows the results at the rhythm of music :grin:


Soon! I have not resolved the issue of rendering an animation: currently the nodes do not execute between frames when you render an animation, although if I render a frame, advance the Timeline, then render the next frame, they are actioned. So I have two options:

  • Work out how to “fudge” the Blender render-animation process into working.

  • Write a simple Render Animation routine where the Timeline is advanced, each frame rendered, and the image saved to disc - this I can do quite easily (I hope).

I have one more thing to do before I make a video, which I will record with my camera so you can see the MIDI controller in action; then I will make a compilation of the various bits, like sound production, DAW, Live MIDI, Sound Bake, Sound Animate, etc. That thing is to build the DAW notes from a MIDI file; I have all the code I need in various bits - the ones that bake the MIDI file to controls.

So, a little patience please and we will get there; there is no manual for this stuff - I am having to experiment all the time, and sometimes it takes hours to get a “simple” thing to work.

Cheers, Clock. :dancer:

PS. Screen Capture videos on my Mac are not going to work as “Mac” wants to record at 60fps and that taxes the CPU for this stuff…


This is part of The Entertainer by Scott Joplin taken from a MIDI file I played, these notes took a billionth of a second to write with the new function on the MIDI Bake node:

OK, done that, I am rapidly running out of excuses not to produce a “Features” video…

Cheers, Clock. :dancer:


Still not cured the cursed render animation issue, but then I understand it took @jacqueslucke something like 2.5 millennia and a Cray computer to cure the problem with Animation Nodes…


I’m also eagerly waiting for a video demo!

Regarding rendering the animation, I often use this simple script that I wrote:

import bpy

scene = bpy.context.scene
fp = scene.render.filepath
current_frame = scene.frame_current
frames = range(1, 100)

for i in frames:
    scene.frame_set(i)  # advance the timeline so per-frame updates actually run
    scene.render.filepath = fp + str(i)
    bpy.ops.render.render(write_still=True)

scene.render.filepath = fp      # restore the original output path
scene.frame_set(current_frame)  # and the original frame

Before the Animation Nodes rendering issue was solved, this worked well to render animations with it.


Sir, you are a star, I will try this out later today! Pity I can only give you one “Like” for this…


Brilliant! Here is the result - note the timeline has advanced and the animation has worked. I will output a video later, but this worked, although I made a node of it and altered a few things:

Thank you many times… :joy:

Here is my Operator code:

import bpy

class CM_OT_RenderAnimation(bpy.types.Operator):
    bl_idname = "cm_audio.render_animation"
    bl_label = "Render Animation"

    @classmethod
    def poll(cls, context):
        cm_node = context.node
        return cm_node.render_dir not in ["", "//"]

    def execute(self, context):
        scene = context.scene
        cm_node = context.node
        path = bpy.path.abspath(cm_node.render_dir)
        scene.frame_set(cm_node.strt_frame)

        for i in range(cm_node.strt_frame, cm_node.stop_frame + 1):
            scene.frame_set(i)  # advance the timeline so the nodes execute per frame
            scene.render.filepath = f"{path}render{i}"
            bpy.ops.render.render(write_still=True)

        scene.render.filepath = path
        return {"FINISHED"}

No sound yet, but here is a sample:

Cheers, Clock.