adzialocha / kubismus

Experimental OSC Editor
GNU General Public License v3.0

How to understand and use the kubismus application #1

Open mulgurul opened 3 years ago

mulgurul commented 3 years ago

Hi

I'm working on a project where we need to create a cross-platform app in Electron, implementing OSC as the basis for communicating with assorted music equipment.

I looked around and this project seems very close to what I have in mind, and so well organised. I would really like to examine and understand this application thoroughly, but unfortunately I'm quite new to Node.js development (I come from a .NET / C# background) and I have some problems understanding the application's usage scenario.

Would it be possible for someone to explain a little about the intended use and the thinking behind the app?

I'm trying to figure out how to take an asynchronous source like a UDP OSC socket and integrate it with the state model of a React web app.

In my own project I'm using a worker window for the socket handling and passing data objects through IPC to the renderer, then adding them to a Recoil state. But it seems a bit ugly, with some overhead going from the worker renderer to the main web renderer via the main process.
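Roughly what I mean, as a simplified sketch (the 'osc-data' channel name and the mainWindow reference are made up, this is not my actual code):

```js
// Worker renderer (hidden window with node integration):
// receive UDP datagrams and forward them to the main process over IPC.
const dgram = require('dgram')
const { ipcRenderer } = require('electron')

const socket = dgram.createSocket('udp4')
socket.on('message', (buffer) => {
  ipcRenderer.send('osc-data', buffer)
})
socket.bind(9000)
```

The main process then relays every message with something like `ipcMain.on('osc-data', (event, buffer) => mainWindow.webContents.send('osc-data', buffer))`, and the visible renderer finally decodes the packet and writes it into a Recoil atom.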

This project seems so clean, but I'm totally lost on Redux, so it's quite complicated for me to understand the structure of the app.

I do have Ableton installed. What do I need to do to get the app in use?

Thanks a lot for any help if possible:-)

adzialocha commented 3 years ago

Hi @mulgurul!

Thank you for reaching out! You are the first person to show interest in Kubismus, as I mostly wrote that software for my own purposes. Happy to see that it helps other people as well!

Would it be possible for someone to explain a little about the intended use and the thinking behind the app?

Kubismus originated from multiple projects around my attempts to control Ableton from the outside via OSC commands which turn channels in an Ableton project on or off. It started with these projects: https://github.com/adzialocha/solo-controller and https://github.com/adzialocha/solo-link, using this Max for Live / MaxMSP object to receive the OSC commands and connect them to Ableton parameters: https://github.com/adzialocha/solo-link-m4l. I think nowadays there are some OSC objects you can officially get from Ableton, but in the past I built them myself.

The idea was to have some sort of environment which allows me to play with randomness and simple sequences which trigger different effect channels. I also wanted to have interdependencies between them, e.g. "when one channel gets activated, deactivate the other one" etc. For live concert setups I would have presets I can create and load during the performance. You could see Kubismus as a "command data generator", maybe.
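To make that a bit more concrete, you could imagine a scene / preset as data roughly like this (purely illustrative, the field names are made up and not the exact shape used in the store):

```js
// Purely illustrative: a scene assigns a control behaviour to a set of
// Ableton parameters, with values normalised between 0.0 and 1.0.
const scene = {
  name: 'intro',
  parameters: [
    { address: '/channel/3/volume', behaviour: 'random', min: 0.2, max: 0.8 },
    { address: '/reverb/drywet', behaviour: 'sequence', steps: [0.0, 0.5, 1.0, 0.5] },
  ],
}
```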

The Kubismus version had some ways to control the recording / transport state of Ableton, as we used it in the recording studio, routing different signals into different machines (tape machine, plate reverbs, amps, speakers in different rooms etc.), using Ableton as the routing matrix and Kubismus to control it. When pressing record in Kubismus it automatically also started Ableton, and we could experiment with the routing behaviour while directly recording the music.

The result is an LP you can listen to here: https://andreasdzialocha.bandcamp.com/album/for-always-lp - you can also read more about the process here: https://adz.garden/for-always-lp/

Also, if this is of interest to you, Kubismus was once built as a C++ application using the JUCE framework: https://github.com/adzialocha/kubismus-legacy - it contained some nice tools to generate special OSC patterns with some sort of timeline editor, but I never finished it. It is also my very first C++ project :baby_chick:

In my own project I'm using a worker window for the socket handling and passing data objects through IPC to the renderer, then adding them to a Recoil state. But it seems a bit ugly, with some overhead going from the worker renderer to the main web renderer via the main process.

You will definitely need some sort of bridge to get the data out via UDP, as you can't do this from the browser. I have that bridge in the renderer process, but the whole OSC logic is inside the browser / web view, built on top of my https://github.com/adzialocha/osc-js library.
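As a rough illustration of such a bridge (not the exact code in Kubismus; the host and port here are made up), sending an OSC message over UDP with osc-js from a process that has access to Node APIs could look something like this:

```js
const OSC = require('osc-js')

// UDP transport; this only works where Node APIs are available,
// e.g. an Electron renderer with node integration or the main process.
const osc = new OSC({
  plugin: new OSC.DatagramPlugin({
    send: { host: '127.0.0.1', port: 9000 }, // where the Max for Live object listens
  }),
})

osc.open()

// send a normalised parameter value, e.g. a channel volume
osc.send(new OSC.Message('/channel/1/volume', 0.5))
```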

This project seems so clean, but I'm totally lost on Redux, so it's quite complicated for me to understand the structure of the app.

If I were to write that project again I would not use Redux but Higher-Order Components instead; Redux is indeed not very helpful for clarity and I'm also not sure I made the best use of it. It's been a while since I last looked into the code myself, but let me try a short overview:

  1. The scenes are defined in the global Redux store here: https://github.com/adzialocha/kubismus/blob/master/src/scripts/reducers/scenes.js. They define which parameter (and with parameter I mean something in Ableton, like the volume of a channel or the dry/wet value of a reverb, usually a value between 0.0 and 1.0) is controlled based on which logic (randomness, some pattern, similar to a sequencer etc.). Most of the UI is built around defining, loading, storing and controlling this scene data.

  2. When "playing" a scene, the scene definitions are passed on with their parameters to separate "play" actions: https://github.com/adzialocha/kubismus/blob/master/src/scripts/actions/transport.js#L5

  3. Each separate play action is picked up by the player middleware here: https://github.com/adzialocha/kubismus/blob/master/src/scripts/middlewares/player.js

  4. The player service itself is the logic which has the internal clock and generates the parameter data based on the defined parameter behaviour: https://github.com/adzialocha/kubismus/tree/master/src/scripts/services/player - this part is completely separated from any React / Redux logic and works on its own. The player service can handle multiple "tasks" at the same time, where each task stands for one single parameter. So a running scene usually consists of multiple parameters, i.e. multiple player tasks.

  5. The player service has a callback reporting status changes; these are then dispatched as Redux actions in the middleware: https://github.com/adzialocha/kubismus/blob/master/src/scripts/middlewares/player.js#L18

  6. These then get dispatched again as OSC actions: https://github.com/adzialocha/kubismus/blob/17d0890b49e74b9269eee30f92c30df6423039cb/src/scripts/actions/player.js#L39

  7. They are finally picked up by the OSC middleware, which sends the actual OSC command to the bridge (a simplified sketch of this pattern follows after the list): https://github.com/adzialocha/kubismus/blob/master/src/scripts/middlewares/osc.js
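As a simplified sketch of that middleware pattern (the action type and payload shape here are made up, this is not the actual Kubismus code):

```js
const OSC = require('osc-js')
const { createStore, applyMiddleware } = require('redux')

// A middleware sees every dispatched action and can trigger side effects,
// here turning a (made-up) action into an outgoing OSC message.
const createOscMiddleware = (osc) => (store) => (next) => (action) => {
  if (action.type === 'SEND_OSC_MESSAGE') {
    const { address, value } = action.payload
    osc.send(new OSC.Message(address, value))
  }
  return next(action) // always pass the action on so the reducers still see it
}

// Usage (reducer omitted):
// const osc = new OSC({ plugin: new OSC.DatagramPlugin({ send: { host: '127.0.0.1', port: 9000 } }) })
// const store = createStore(reducer, applyMiddleware(createOscMiddleware(osc)))
```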

I do have Ableton installed. What do I need to do to get the app in use?

Check out that Max object I linked to above. Add it to your Ableton project!

mulgurul commented 3 years ago

Hi Andreas

Thank you so much for your mail and the rich explanation of the project. As an electronic musician myself, I'm very impressed by the work you have done creating this album, involving new creative thinking and approaches together with homemade software of such complexity. I've listened to the album and it's a very inspiring sound universe and artistic approach to electronic composition. Fantastic work :-)

I produce electronic music myself under the name "Circles of sound", and we released our first album a year ago, in fact on the German label "Lemongrass music". I'm also a bass player :-)

But anyway, I'm glad for your explanations, and that you don't praise Redux completely. I have chosen Recoil for state management, as Redux seemed too overwhelming to learn for just this project, and I try to keep it a bit more simple. But again, thanks a lot for your help :-)

Best regards, Peter Meldgaard, Software Architect and Developer

