Open Coupe70 opened 1 year ago
Let me know if you can provide some more info on this one. Thanks!
@ideoforms Ok, I'll try to explain what's going on.
Let's say I have 3 tracks in my set and start listeners for the track names for all 3 tracks (track 0, 1, 2).
Now I re-arrange the tracks in Live and drag the third track (track 2) between the other two. I would expect Live to send an update for listeners 1 & 2, since I would expect the listeners to be tied to a track's position and not to the track itself: the former track 2 should take over listener 1 now that it sits at position 1, and the former track 1 should take over listener 2 now that it sits at position 2. But that doesn't happen.
When I change the name of one of the tracks, it turns out that each track keeps its original listener, no matter where it now sits. So if I rename the track that was originally at position 2, I get a message from listener 2 even though the track now sits at position 1. The same happens when the track order changes because tracks are deleted or added.
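The behavior described above can be reproduced in plain Python (this is a stand-in model, not Live API code): each listener captures the index it was created with, so reordering the track list never re-binds listeners to the new occupant of each position.

```python
# Minimal model of the observed behavior: listeners are attached to track
# objects and capture their creation-time index, so a reorder leaves them stale.

class Track:
    def __init__(self, name):
        self.name = name
        self._listeners = []

    def add_name_listener(self, callback):
        self._listeners.append(callback)

    def set_name(self, name):
        self.name = name
        for callback in self._listeners:
            callback(self)

tracks = [Track("A"), Track("B"), Track("C")]
events = []

# Start a "name listener" per position, capturing the index at creation time.
for index, track in enumerate(tracks):
    track.add_name_listener(lambda t, i=index: events.append((i, t.name)))

# Re-order: drag the third track between the other two.
tracks = [tracks[0], tracks[2], tracks[1]]

# Renaming the track now at position 1 still fires with its old index 2.
tracks[1].set_name("C renamed")
print(events)  # [(2, 'C renamed')]
```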
I COULD handle this in TouchOSC by having it stop all listeners and start new ones. That would be quite involved, but possible in theory (and probably harder with other OSC tools). But the main problem would be: TouchOSC does not know about the change in track order, so it doesn't know it has to make these changes to the listeners.
Does Live give AbletonOSC any information about changes in track order that could be handled by AbletonOSC or at least forwarded via OSC? Or does AbletonOSC also have no idea what's going on in Live in this case?
Quite a strange finding, so maybe I'm just missing something... The Live API lets you keep track of just about EVERYTHING going on in tracks, clips and devices, but we lose the complete overview as soon as tracks are moved? Or the other way round: this would make the listeners fairly useless, since you'd have to manually query everything anyway to recognize certain changes.
Would be interesting to see how listeners for things like clip color and name behave when clips are dragged around, but those are not implemented yet.
Ah! I understand, thanks for the clear explanation. This is because the track index is captured in a closure when creating the listener, meaning that the track ID isn't re-queried each time a change happens... so it falls out of sync when the tracks are reordered.
The problem is that it's slightly inefficient to query the track_index on each change (because the track doesn't know its own index, so it requires an array search), but I think this may be the only way to correct the functionality. Thanks!
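The fix described above can be sketched as follows. This is a hedged illustration in plain Python, not AbletonOSC's actual code: instead of baking the index into the closure, the listener looks the track up in the current track list each time it fires.

```python
# Sketch of the fix: re-query the track's index on every callback.
# The lookup is an O(n) list search, since a track does not know its own index.

class Track:
    def __init__(self, name):
        self.name = name
        self._listeners = []

    def add_name_listener(self, callback):
        self._listeners.append(callback)

    def set_name(self, name):
        self.name = name
        for callback in self._listeners:
            callback(self)

tracks = [Track("A"), Track("B"), Track("C")]
events = []

def make_listener(track_list):
    def on_name_changed(track):
        # Look up the track's *current* position instead of a captured index.
        current_index = track_list.index(track)
        events.append((current_index, track.name))
    return on_name_changed

for track in tracks:
    track.add_name_listener(make_listener(tracks))

# Move the third track between the other two (mutating the list in place),
# then rename it: the reported index now matches its new position.
tracks.insert(1, tracks.pop(2))
tracks[1].set_name("C renamed")
print(events)  # [(1, 'C renamed')]
```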
Why is the track ID not an actual unique internal identifier? This is an incorrect use of the term "unique identifier"; "track index" would be much more appropriate. Also, the fact that the API doesn't expose a real internal ID limits memoization to string keys built from track names and a unique naming convention, which is quite inefficient.
I actually just had a look at this problem. I was able to add a listener to the "tracks" property of the song object. This correctly catches creating and deleting a track, as well as re-ordering them. The problem is that the handlers are not set up to deal with LOM objects, so it borks when trying to serialize them. But some dirty code that simply assumes we are dealing with a track object works, for both listeners and getters. This would require changes in a few places, such as importing Live, checking for object types, and applying proper serialization (or handling it in the handlers themselves).
This is obviously just a very small and simple sample, but it shows that if we somehow store and cache the tracks (which Ableton's internal code also seems to do in multiple places), it should be fast enough to calculate IDs/indexes, and changes to them, in a way that would be more consistent.
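The caching idea above could look roughly like this (plain Python, not Live API code; `diff_tracks` is a hypothetical helper): keep a snapshot of the track list and, when the "tracks" listener fires, diff it against the new list by object identity to detect adds, deletes, and moves.

```python
# Sketch: diff a cached track-list snapshot against the current one to work
# out which tracks were added, removed, or moved to a new index.

class Track:
    def __init__(self, name):
        self.name = name

def diff_tracks(old, new):
    """Return (added, removed, moved); moved maps track name -> (old_i, new_i)."""
    old_index = {id(t): i for i, t in enumerate(old)}
    new_index = {id(t): i for i, t in enumerate(new)}
    added = [t for t in new if id(t) not in old_index]
    removed = [t for t in old if id(t) not in new_index]
    moved = {t.name: (old_index[id(t)], new_index[id(t)])
             for t in new
             if id(t) in old_index and old_index[id(t)] != new_index[id(t)]}
    return added, removed, moved

a, b, c = Track("A"), Track("B"), Track("C")
cached = [a, b, c]

# The "tracks" listener fires after dragging C between A and B:
current = [a, c, b]
added, removed, moved = diff_tracks(cached, current)
print(moved)            # {'C': (2, 1), 'B': (1, 2)}
cached = list(current)  # refresh the cache for the next change
```

With a diff like this, the server could re-index its listeners (or emit an OSC notification about the reorder) instead of silently reporting stale indexes.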
I think you're splitting hairs a bit if the only cost is an array search. Object IDs (tracks, clips, etc.) are non-trivial. AbletonOSC is a powerful tool, but the lack of call IDs and object IDs makes it seem somewhat brittle. It's hard for me to wrap my head around an RPC framework where "likely" is your best bet for both the object you're calling on and the origin of the call.
I'm working on reverse-engineering the entire Live Python API (for Live 12 at the moment, but it should work with 11 too). Essentially it will give auto-completion and a better, more robust development experience. Not directly related, but it should help open up the "black box" that developing for Ableton is unless you're on the inside.
One thing to consider is that the timing of such an event may allow a MIDI script function to be more greedy (within reasonable limits): tracks and other elements are unlikely to be moved or removed during a performance, and such edits will often be done via computer keyboard and mouse.
Details hard to provide because of issue #30.