[Closed] Catsvilles closed this issue 4 years ago.
Hi,
The primary usage is as a control system for scsynth (Synths, Groups, Buffers, etc.). In this respect it is a replacement for sclang.
It can do any control task that the sclang common library can do: load samples, spawn synths, send control messages. The async/promise style used throughout the API makes that kind of code faster and easier to write.
It cannot yet compile synth defs from JavaScript code, but it can start up an sclang interpreter and use that to compile defs. For most purposes that's what we need.
It is purposely limited to the essential communication and control tasks. I'm keeping the core lean. Any higher-order way of working should be developed on top of supercollider.js.
This avoids bloat and tech debt, but I do realize it means that people can't jump in and be productive with a high-level API.
https://crucialfelix.github.io/supercolliderjs/#/packages/server-plus/README adds convenience methods to Server.
Dryads: https://crucialfelix.github.io/supercolliderjs/#/packages/dryads/README are declarative and (partially) live updating. I really hope to finish what I was working on there because it is a powerful idea when fully implemented. That's how I plan to make music with it myself.
You could write code that specifies the sounds/patterns/patching. You play it, then edit your code, hit save and it live updates. It's like React hot loading, or like React's fast updating of the view in response to user actions. It's not that far off from that right now, but I have a lot of other work going on.
There is some sequencing in there now: https://crucialfelix.github.io/supercolliderjs/#/packages/dryads/SynthEventList
and you can do real-time event streams: https://crucialfelix.github.io/supercolliderjs/#/packages/dryads/SynthStream (map live controllers to synth events)
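Mapping a live controller to synth events can be as simple as a pure function from controller values to event objects. A minimal sketch, assuming a MIDI-style CC source; the event shape, the `"saw"` SynthDef name, and the parameter names are illustrative, not the actual SynthStream API:

```javascript
// Convert a MIDI note number to a frequency in Hz (A4 = note 69 = 440 Hz).
function midiToFreq(note) {
  return 440 * Math.pow(2, (note - 69) / 12);
}

// Map a controller change (0-127) into a hypothetical synth event object.
function ccToEvent(ccValue) {
  return {
    defName: "saw", // hypothetical SynthDef name
    args: {
      // Scale the CC range onto roughly five octaves above MIDI note 36.
      freq: midiToFreq(36 + (ccValue / 127) * 60),
      amp: 0.2,
    },
  };
}
```

A stream of such events, one per incoming controller message, is the kind of input a SynthStream-style dryad consumes.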
NRT: not implemented
@crucialfelix thanks for such a detailed answer! Are there any plans for NRT? How hard would it be to implement? With some guidance I could try to get on it and submit a PR, if SuperCollider turns out to be the right thing for me overall. Also, I recently found out about this amazing package (https://github.com/Spacechild1/vstplugin) which basically allows using VSTs with SuperCollider; it comes as a UGen. I suppose it is possible to add custom UGens and operate them through JS with this package? :)
No problem at all to use vstplugin or any other scsynth plugin. You may however find it easier to use standard OS audio busses and do your inter-app patching using that. I usually use supercollider with multi-channel output and patch them all into a DAW like Ableton. Then you have lots of mixing, processing and recording tools. I used to do absolutely everything in supercollider, but that's a bit limiting.
I don't plan on working on NRT. Not quite sure how to go about it yet. But all the OSC generating functions are here: https://crucialfelix.github.io/supercolliderjs/#/packages/server/msg
You just wouldn't want to manually manage and supply all the ids.
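The ID bookkeeping the library handles for you amounts to an allocator that hands out unique node/buffer IDs and recycles freed ones. A minimal sketch of the idea — not the actual supercollider.js allocator:

```javascript
// Minimal node-ID allocator sketch: hands out unique integer IDs and
// reuses freed ones. The real supercollider.js allocators are richer.
class NodeIdAllocator {
  constructor(initial = 1000) {
    this.next = initial; // user nodes conventionally start above the root nodes
    this.freed = [];     // pool of released IDs available for reuse
  }

  allocate() {
    if (this.freed.length > 0) {
      return this.freed.pop();
    }
    return this.next++;
  }

  free(id) {
    this.freed.push(id);
  }
}
```

Doing this by hand for every `/s_new` and `/b_allocRead` message is exactly the tedium the library is there to remove.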
I think you can subscribe to the Server OSC sender:
```js
server.send.subscribe(function(msg) {
  console.log(msg);
});
```
so you could record live coding or anything with that. All the node and buffer ids and sample paths would be in those messages.
You just have to collect all the messages and then save them to a file in NRT format. Then you can render that to an audio file using scsynth.
So maybe it isn't that hard to do.
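The collection step could be as simple as time-stamping each outgoing message relative to the start and keeping them in time order. A sketch of that, assuming nothing beyond plain JavaScript; encoding the result into scsynth's binary NRT score format (timestamped OSC bundles) would still need an OSC library:

```javascript
// Sketch: collect outgoing OSC messages with timestamps so they could
// later be written out as an NRT score. The clock is injectable so the
// logic is testable; by default it uses wall-clock time.
class ScoreRecorder {
  constructor(now = () => Date.now()) {
    this.now = now;
    this.start = now();
    this.events = []; // entries of [timeInSeconds, oscMessageArray]
  }

  // Hook this up via e.g. server.send.subscribe(msg => recorder.record(msg))
  record(msg) {
    const t = (this.now() - this.start) / 1000;
    this.events.push([t, msg]);
  }

  // NRT scores must be time-ordered; return a sorted copy.
  score() {
    return [...this.events].sort((a, b) => a[0] - b[0]);
  }
}
```
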
> No problem at all to use vstplugin or any other scsynth plugin. You may however find it easier to use standard OS audio busses and do your inter-app patching using that. I usually use supercollider with multi-channel output and patch them all into a DAW like Ableton. Then you have lots of mixing, processing and recording tools. I used to do absolutely everything in supercollider, but that's a bit limiting.
Yeah, absolutely, using a DAW would solve the problem overall, but the thing is my initial goal is actually to find a way to make music entirely with code. So I'm trying to assemble a kind of DAW-like thing, which has most of the traditional DAW features (rendering .wav, project files, VSTs, etc.) but uses code instead of user interfaces for composing the music. So I'm jumping around here and there trying to find some kind of audio engine, and for now it looks like SC is the most capable option.
I will try your recommendations; they sound like ideas I already had in mind about how I could do this! Thanks a lot for these details!
You can also control DAWs with code. Reaper and Audacity have full OSC.
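Controlling a DAW from code comes down to sending OSC packets over UDP. A minimal encoder sketch for a single-float message; in practice you would use an OSC library, and the `/track/1/volume` address follows Reaper's default OSC pattern config, so verify it against your own DAW's OSC documentation:

```javascript
// OSC strings and type-tag strings are null-terminated and padded to
// 4-byte boundaries; numeric arguments are big-endian.
function padTo4(buf) {
  const pad = 4 - (buf.length % 4); // always at least one null byte
  return Buffer.concat([buf, Buffer.alloc(pad)]);
}

// Build an OSC message carrying one float argument, e.g. a track volume.
function oscFloatMessage(address, value) {
  const addr = padTo4(Buffer.from(address, "ascii"));
  const tags = padTo4(Buffer.from(",f", "ascii")); // one float argument
  const arg = Buffer.alloc(4);
  arg.writeFloatBE(value, 0);
  return Buffer.concat([addr, tags, arg]);
}

// Usage (assumed port/address, check your DAW's OSC settings):
// const dgram = require("dgram");
// dgram.createSocket("udp4")
//   .send(oscFloatMessage("/track/1/volume", 0.8), 8000, "127.0.0.1");
```
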
@crucialfelix Yeah, I thought so, but I cannot wrap my head around how I would actually do it. I mean, for starters, can I launch the DAW just with OSC? Is it even possible? Could I control every button and knob in the DAW this way? What if I would like my audio engine to be on a separate/remote server (in this case it's the DAW)? Can I then fully control it via OSC (launch it, create music, render, etc.)? Thanks for getting into this discussion; I couldn't find anything about this on the web. But I already found out which DAWs fully support OSC (in addition to the ones you've mentioned); for example, Ardour has full support and very good documentation on OSC.
> Can I, for starters, launch the DAW just with OSC? Is it even possible?

No, but you can use a shell script that launches your whole setup.

> Could I control every button and knob in the DAW this way?

Yes.

> What if I would like my audio engine to be on a separate/remote server (in this case it's the DAW)? Can I then fully control it via OSC (launch it, create music, render, etc.)?

Not launch it, but you could do everything else.
You should just try different ways and see what you can discover. Write up your work so other people can learn from it. (write the article you couldn't find online)
Thanks a lot for your help! Yeah, that's what I always try to do: after I actually dive into a problem, I go back to other people's open issues and share the solutions I could find :) So yeah, there really are questions on many forums about creating music entirely with code, and not many options are given, so I will be happy to share my research once I put everything together! Thanks again, cheers!
Hi! Sorry, this may be a stupid question, but after digging through the code and examples it is still not clear to me whether this is a wrapper around SC, to be used just as some kind of audio engine/server, or whether it's actually possible to code music using only JS, not sclang. Ideally I'm dreaming of finding a decent DAW-like environment to code music patterns and such using only JavaScript or Python. Here are a few questions that could help me decide if this is the right software for me; I will be very thankful to get some answers :)