gogins / michael.gogins.studio

Studio of Michael Gogins: computer music, photographs, writings.

Return to "Mandelbrot set" idea #5

Closed. gogins closed this issue 3 years ago.

gogins commented 7 years ago

Implement a score generator based on my old idea of a "Mandelbrot set" for music or sound.

A recurrent IFS with some additions may be used as a starting point. Either Hilbert indices or interpolation may be used to produce a parametric map.

This score generator may also be used with evolutionary programming, in which parameters in the same space as the IFS parameter block are evolved.

I have not decided whether a 2-dimensional IFS should be used to produce images that are then mapped to scores, or whether an N-dimensional IFS should generate a score directly that is then displayed.

gogins commented 7 years ago

Regarding 2-dimensional images mapped to music, these are the possibilities I have thought of and/or used:

  1. Image pixels are grains of sound.
     1.1. X is time and Y is frequency.
     1.2. X is time and Y is pitch.
  2. Image pixels represent notes. In all cases, X is time and Y is pitch. As one scans across a row of pixels, the value rising above some threshold represents a note on, and the value falling below some threshold represents a note off. The main problem is that there are too many rows of pixels, which means that the note on events must be filtered:
     2.1. The value threshold can be raised.
     2.2. Once a note on event occurs on a row, no more note on events occur on that row until the threshold is crossed again.
     2.3. The image is quantized by the pitch range, e.g. 2304 pixels of height divided by 88 semitones is about 26 pixels per semitone; the value for note on can be the sum of these values.
     2.4. Alternatively, any number of note on events can be triggered within the 26 pixels, but the value threshold is set high.
     2.5. Or, a preset number of sounding voices is allowed, and new voices are not created until one or more of the sounding voices has stopped.

The parameters of each pixel are red, green, and blue, or hue, saturation, and value. It probably makes the most sense to map hue and saturation to instrument choice and stereo location, or something like that. Perhaps the mapping can be overlaid onto the image...
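
A minimal sketch of option 2.2 above, assuming the image arrives as a 2-dimensional array of grayscale values in [0, 1]; the function name and the note fields are hypothetical, not part of any existing code:

```javascript
// Option 2.2: scan each pixel row left to right; a value rising above the
// threshold starts a note, and a value falling below it ends the note.
// `image` is assumed to be an array of rows, each an array of values in [0, 1].
function imageToNotes(image, threshold, secondsPerColumn, lowestKey) {
    const notes = [];
    for (let row = 0; row < image.length; row++) {
        let onsetColumn = null;
        for (let column = 0; column < image[row].length; column++) {
            const value = image[row][column];
            if (onsetColumn === null && value >= threshold) {
                onsetColumn = column; // note on
            } else if (onsetColumn !== null && value < threshold) {
                notes.push({
                    time: onsetColumn * secondsPerColumn,
                    duration: (column - onsetColumn) * secondsPerColumn,
                    key: lowestKey + (image.length - 1 - row), // higher rows sound higher
                    velocity: Math.round(127 * image[row][onsetColumn])
                });
                onsetColumn = null; // note off
            }
        }
        // A note still sounding at the end of the row is dropped in this sketch.
    }
    return notes;
}
```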

gogins commented 7 years ago

Actually I believe score renderings should be used directly, and these should be large enough that some sense of the actual music can be gained by inspection. My laptop screen is 1920 x 1080. The score could be rendered with 2 pixels per semitone. That gives something like at most 4 x 5 = 20 scores on the screen. Then the IFS can be on the dimensions of the notes {i0, t1, d2, k3, v4, p5, K6, T7}, where K is Chord.K(Floor(K6 % 2)) and T is Chord.T(T7). A fixed RIFS can be used with N transformations and an N x N recurrence matrix; N = 6 or so would probably be high enough.
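
A minimal sketch of such a recurrent IFS (an MCRM-style iteration), assuming the transformations are given as functions on note vectors and the recurrence matrix is an N x N array of 0s and 1s; all names here are illustrative, not the Silencio API:

```javascript
// Recurrent IFS on note vectors {i0, t1, d2, k3, v4, p5, K6, T7}.
// `transformations` is an array of N functions mapping a note vector to a new
// note vector; `recurrence[prior][next]` says whether transformation `next`
// may follow transformation `prior`.
function generateScore(transformations, recurrence, iterations, seedNote) {
    const notes = [];
    function iterate(note, priorIndex, depth) {
        if (depth === 0) {
            notes.push(note.slice());
            return;
        }
        for (let index = 0; index < transformations.length; index++) {
            // Apply transformation `index` only if the recurrence matrix
            // permits it to follow the prior transformation.
            if (priorIndex === null || recurrence[priorIndex][index] === 1) {
                iterate(transformations[index](note), index, depth - 1);
            }
        }
    }
    iterate(seedNote, null, iterations);
    return notes;
}
```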

http://subprotocol.com/system/genetic-js.html provides what appears to be a usable genetic algorithm in JavaScript.

gogins commented 7 years ago

If each transformation always brings a point back to the timeline, the measure on that line then traces the function of chords by time.

Does this differ from the projection of the full measure onto the basis?

Each matrix in H is multiplied by the projection matrix, which produces a projective H.
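
A minimal sketch of that step, assuming H is stored as an array of plain matrices (arrays of arrays); whether the projection is applied on the left or the right is a matter of convention, and here it is applied on the right:

```javascript
// Naive matrix product: rows of a times columns of b.
function multiply(a, b) {
    return a.map(row =>
        b[0].map((_, j) => row.reduce((sum, value, k) => sum + value * b[k][j], 0)));
}

// Post-multiply each matrix of the Hutchinson operator by the projection matrix,
// giving a "projective H" whose attractor lies on the projected subspace.
function projectiveHutchinson(hutchinson, projection) {
    return hutchinson.map(matrix => multiply(matrix, projection));
}
```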

gogins commented 6 years ago

This https://www.intechopen.com/books/fractal-analysis-applications-in-health-sciences-and-social-sciences/on-self-affine-and-self-similar-graphs-of-fractal-interpolation-functions-generated-from-iterated-fu appears to be the state of the art in fractal interpolation (up to 2 dimensions, but it looks extensible). However, as far as I am concerned, they always want to go backwards, from data to a constrained IFS or RIFS; I want to generate a constrained IFS or RIFS and then get my score for nothing.

Fractal interpolation continues to be an active field of research in applied and perhaps pure mathematics, driven by geoscience and biomedicine.

gogins commented 6 years ago

My approach has succeeded in using mathematical music theory to identify quantifiable dimensions of harmony, voice-leading, and arrangement as P (set-class), I (inversion), T (transposition), V (clock group of octavewise revoicings of PIT), and i (clock group of instrumentwise rearrangements of N instruments). Then a "piece" is a function from time onto more or less fleeting chords in the vector space P, I, T, V, i.

The math really is: (P, I, T, V, i) = r(t). This is a vector-valued function. I seem to have some sort of confused intuition that the component functions (P(t), I(t), T(t), V(t), i(t)) can be represented all at once by some sort of constrained IFS or RIFS, i.e. a fractal interpolation function (FIF).

I now wish to use IFS, RIFS, or constrained IFS or RIFS to generate interpolable and evolvable families of pieces. For this my confused intuition must be clarified.

Indeed I can prove it: distinct r induce distinct sets of RIFS on PITVi. Also, each component function can be represented by a FIF. These FIFs are orthogonal and thus can literally be added to produce a single FIF for r.

Of course they beat me to it since I am not really a mathematician but at least now I understand what I am doing... http://kpfu.ru/portal/docs/F10683575/e06_11.pdf, http://users.uoa.gr/~ldalla/papers/B4_Bouboulis-Dalla-2005.pdf, and perhaps most relevantly https://www.hindawi.com/journals/aaa/2015/278313/abs/.

And this looks promising as it is directly concerned with IFSs whose attractors are graphs of functions: https://arxiv.org/pdf/1610.01369.pdf. Indeed Barnsley et al. are directly considering how to derive "fractels" or fractal elements that compose FIFs in a general way. Earlier version here: https://arxiv.org/abs/1309.0972.

Massopust slides to illustrate fractels: http://www.fim.uni-passau.de/fileadmin/files/lehrstuhl/forster-heinlein/Bernried2017/Massopoust_Bernried_2017.pdf.

gogins commented 6 years ago

For WebGL renderings of scores, each voice can have a separate color by means of a particle system. This has been quite performant.

gogins commented 6 years ago

Concisely:

Consider a piece P of note-based music to be a succession of more or less fleeting chords, where a chord is a point in the vector space {P, I, T, V, i}. Then P is the graph of a vector function (P, I, T, V, i) = r(t). Any r can be approximated within any desired error by a fractal interpolation function F such that F for r is the sum of the orthogonal fractal interpolation functions f_P, f_I, etc. The matrix representation of the Hutchinson operator H for F is sized for the largest f and padded with unities for smaller fs. Iterating H on {P, I, T, V, i, t} gives P.

A piece P can then be evolved using the genetic algorithm on its H, and interpolations between two Ps can be achieved by interpolating between their two Hs. It follows that a Mandelbrot-like set for all P within some finite range R, within an error bound, consists of the 2-dimensional Hilbert index over R for Hs.
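
A minimal sketch of the interpolation step, assuming each Hutchinson operator is stored as an array of same-shaped affine matrices (plain arrays of arrays); the function name is hypothetical:

```javascript
// Interpolate between the Hutchinson operators of two pieces.
// With parameter t in [0, 1], interpolating the matrices element-wise gives a
// family of operators, and hence of pieces, between the two.
function interpolateHutchinson(hutchinsonA, hutchinsonB, t) {
    return hutchinsonA.map((matrixA, m) =>
        matrixA.map((rowA, r) =>
            rowA.map((valueA, c) => (1 - t) * valueA + t * hutchinsonB[m][r][c])));
}
```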

gogins commented 6 years ago

The literature considers fitting FIFs to sets of interpolation points. I consider generating, evolving, and interpolating between FIFs to obtain function points. In both cases the system of constraints for the Hutchinson operators is the same. http://dx.doi.org/10.1155/2015/278313 derives a matrix integral that, when solved, computes the Hutchinson operator fitted to and optimized for the interpolation points.

If I were starting from interpolation points, then it is possible that I could use a computer algebra system to solve for the matrices in the Hutchinson operator based on equation 22 of http://dx.doi.org/10.1155/2015/278313.

I think this approach is backward, and I just need to figure out what constraints or transformations I need to apply to any old Hutchinson operator in order to make it compute the graph of a vector-valued function.

Starting with one matrix in H, the next can be generated by partitioning the interval of time and the interval of the vector points. Must all transformations in H be shears?

In any event, a vector-valued function is guaranteed if the constraints on t are satisfied.

gogins commented 6 years ago

From Massopust's slides:

D. Hardin proved in 2012 that every compactly supported refinable function is a piecewise fractal function. In particular, the unique compactly supported continuous function determined by the mask of a convergent subdivision scheme is a piecewise fractal function.

Wow! Directly relevant to computer music of the sort concerned with directly generating a piece as a sound.

gogins commented 6 years ago

So maybe what I want is just that the objects I am making up to generate a piece are fractels.

gogins commented 6 years ago

As a way to refine and check my ideas, I have begun writing a paper on this topic in this repository.

gogins commented 6 years ago

I have been breaking my brain on understanding the Read-Bajraktarevic operator. Fortunately I have now made some progress. It's now clear that it depends on not spinning the domain, but shuffling or partitioning it. And I am now confident that the RB operator, because it does produce the graph of a (potentially multi-valued) fractal interpolation function, does at least in theory do exactly what I need.

I think that the concept of a local IFS simply means using one or more partitions of the domain of the total IFS as domains of sub-IFSs. I don't think there is anything there to code except for constraints.

What I now need to understand is how to actually implement the equation for the RB operator as a new variant of my MCRM algorithm. It's now clear how to do it in a computer algebra system, but it's not clear to me how to do it in JavaScript.
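
A minimal sketch of one application of the RB operator to a function sampled on [0, 1], assuming N maps that partition the domain into equal subintervals and a constant base term per subinterval; this is an illustration of the operator, not the MCRM code:

```javascript
// One application of a Read-Bajraktarevic-style operator to sampled values y(t)
// on [0, 1]. Map i sends [0, 1] onto the subinterval [i/N, (i+1)/N] and combines
// a base value with a vertically scaled copy of the old function.
function readBajraktarevic(samples, verticalScalings, baseValues) {
    const N = verticalScalings.length;
    const result = new Array(samples.length);
    for (let j = 0; j < samples.length; j++) {
        const x = j / (samples.length - 1);           // point in [0, 1]
        const i = Math.min(N - 1, Math.floor(x * N)); // which subinterval x falls in
        const preimage = x * N - i;                   // l_i^{-1}(x), back in [0, 1]
        const k = Math.round(preimage * (samples.length - 1));
        result[j] = baseValues[i] + verticalScalings[i] * samples[k];
    }
    return result;
}
// Iterating readBajraktarevic from any starting samples converges, for
// |verticalScalings[i]| < 1, to the graph of a fractal interpolation function.
```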

gogins commented 5 years ago

The experience of composing with Common Music and of composing with RIFS in Silencio has been highly instructive. In general, this experience reinforces my understanding of the value of irreducibility in composition.

What stands in the way of using RIFS as I would wish is the confounding non-orthogonality between scores as sets of notes and scores as sets of chords. The chord space group idea, with voicings, overcomes this in principle but remains more opaque in making variants than I would like.

I had the idea of a note as a vector including the chord generators:

  1. Time
  2. Duration
  3. Instrument
  4. MIDI Key
  5. MIDI Velocity
  6. X
  7. Y
  8. Z
  9. P
  10. I
  11. T
  12. Homogeneity

The non-orthogonality remains, between MIDI Key and P, I, and T. Generating a score with these dimensions would project into two sub-spaces, one for notes, one for chords. That is similar to what I do by attaching P, I, T, and V to notes or to Lindenmayer system turtles, but is slightly more elegant.

This would be similar to defining the MCRM generator signature as: (event, chord, event_transform, chord_transform, score).
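
A minimal sketch of that signature, with all names illustrative:

```javascript
// Each step of the generator transforms both an event (note) and a chord with
// its own operators, writes the result into the score, and returns the pair so
// the recursion can continue from it.
function generator(event, chord, eventTransform, chordTransform, score) {
    const newEvent = eventTransform(event);
    const newChord = chordTransform(chord);
    score.push({event: newEvent, chord: newChord});
    return [newEvent, newChord];
}
```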

gogins commented 5 years ago

What is a canon in PITV space?

gogins commented 5 years ago

I think I now understand how to implement a fractal function based on a local IFS in code.

The basic logic is similar to my simple code for recursively computing a Koch curve.

As I understand it, a local IFS recursively subdivides its fixed point or attractor into a number of ranges and domains. At each stage of iteration or contraction, the domain is divided up into discrete, non-overlapping sub-ranges, one for each transformation (shear or bilinear transformation) in the Hutchinson operator. For a regular IFS, these sub-ranges are the same as the domains of the transformations. For a local IFS, the domains may differ from the sub-ranges.

We may then pass into our implementation function the bounding box that represents the range of the fractal function. This box will shrink as the Hutchinson operator is iterated, thus representing the parent range. From it, the individual transformations will derive their contractions and translations, that is, their sub-ranges. It will then not be necessary to precompute the shear or bilinear transformations, only to specify their local forms.
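
A minimal sketch of that recursion, in the spirit of a recursive Koch curve generator; the range format and names are illustrative, and each transformation is assumed to map a parent range to its own sub-range:

```javascript
// Recursively apply a local IFS. `range` is the bounding box
// {begin, end, lowest, highest} for the current parent sub-range; each
// transformation derives its own sub-range (contraction and translation) from it.
function localIFS(transformations, range, depth, points) {
    if (depth === 0) {
        // At the bottom of the recursion, emit the midpoint of the current range.
        points.push({
            time: (range.begin + range.end) / 2,
            value: (range.lowest + range.highest) / 2
        });
        return points;
    }
    for (const transformation of transformations) {
        // For a local IFS, the domain a transformation reads from may differ
        // from the sub-range it writes into.
        const subRange = transformation(range);
        localIFS(transformations, subRange, depth - 1, points);
    }
    return points;
}
```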

gogins commented 5 years ago

I have modified ChordSpace.js into ChordSpace2.js by adding to the ChordSpaceGroup an additional dimension for the group of k-permutations with replacement of instrument numbers for voices. This is now a drop-in replacement for the original ChordSpace.js.

It remains to implement the local IFS and, of course, to test the code.

It has taken me decades to get to this point. I have so much trouble understanding things that would be immediate for an expert in the field or even for just a plain old professional mathematician. But now I will see if my original idea, seeded in 1983, actually makes sense in practice.

gogins commented 5 years ago

Test regime:

  1. Make sure that ScoreGraph.js loads and initializes.
  2. Make the simplest possible piece that moves on each dimension (in an audible way), e.g. stepping for one iteration.
  3. Make some real pieces.
gogins commented 5 years ago

Immediate syntax errors have been fixed.

Oops, looks like I forgot to rescale the range {P, I, T, V, A} from the computed graph to the actual ranges of the indexes of these dimensions.

gogins commented 5 years ago

To do:

gogins commented 5 years ago

This now gives the impression of more or less working. I will write test cases to sort things out.

gogins commented 5 years ago

This is quite compute intensive. It may only be practical in optimized and concurrent C++. I will try to intensively optimize the JavaScript code.

gogins commented 5 years ago

Writing test cases.

gogins commented 5 years ago

The duration of the timestep depends not only on the number of starting timesteps, but also on the number of iterations. I'm not sure there is a formula for this, as the duration of the timestep may change from interpolation point to interpolation point. For now I can iterate the score graph and determine the average timestep.
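
A minimal sketch of that measurement, assuming the generated notes each carry a time field:

```javascript
// Estimate the average timestep by sorting onsets and averaging their differences.
function averageTimestep(notes) {
    const times = notes.map(note => note.time).sort((a, b) => a - b);
    let sum = 0;
    for (let i = 1; i < times.length; i++) {
        sum += times[i] - times[i - 1];
    }
    return sum / (times.length - 1);
}
```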

gogins commented 5 years ago

There may need to be an overlap factor to get notes to tie. No, changed to Score.tieOverlaps(exact=true).

gogins commented 5 years ago

The basic algorithm is now working to specification.

gogins commented 5 years ago

To do:

gogins commented 5 years ago

Profiling shows that by far the major consumer of time is Score.tieOverlaps. If I can speed this up, and I think I can, the code may actually be fast enough to use in JavaScript.

It looks like property get and set are expensive, so I am replacing them with lower-level code.

gogins commented 5 years ago

I have optimized Score.tieOverlaps, which is now about 2 times faster. I am still not sure if the code is fast enough. I would guess that C++ would be 2 times faster again, or more.

gogins commented 5 years ago

To do:

I keep thinking I'm on the verge of understanding something much more useful, perhaps a more fundamental understanding of how to build a variety of algorithms, as in the original Java version of Silence, but ones that operate in time as the fractal interpolation function does.

gogins commented 5 years ago

The key concept is that harmony (PIT) be generated as a function of time. What is missing from the fractal interpolation function approach is different ways of recursively generating structure within a segment of time. The questions then are:

This question in turn may reduce to that of how to move in time without overlapping in time. And the answer to that must be, move only within a subdivision and move by fraction.

However, it is not necessary that a segment subdivide time in the same way on each iteration. It is only necessary to stay within the parent subdivision, and to pass each child subdivision to the child iteration.

Conceptually the whole thing can be reduced to one function for recursively subdividing time, a function to return functions mapping time to chords, and a function for transforming chords to notes.
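
A minimal sketch of that reduction, with all three functions passed in as parameters and all names illustrative:

```javascript
// `subdivide(begin, end)` returns child intervals within the parent interval;
// `chordForTime(time)` returns the chord sounding at that time;
// `chordToNotes(chord, begin, duration)` turns a chord into notes.
function generate(subdivide, chordForTime, chordToNotes, begin, end, depth) {
    if (depth === 0) {
        const time = (begin + end) / 2;
        return chordToNotes(chordForTime(time), begin, end - begin);
    }
    let notes = [];
    for (const [childBegin, childEnd] of subdivide(begin, end)) {
        // Each child stays within the parent subdivision and recurses.
        notes = notes.concat(
            generate(subdivide, chordForTime, chordToNotes, childBegin, childEnd, depth - 1));
    }
    return notes;
}
```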

gogins commented 5 years ago

Arbitrary precision is required to interpolate between virtual parameters. There is a JavaScript library for this.

There are a number of JavaScript libraries for the genetic algorithm; start by looking at this one.

gogins commented 5 years ago

The biggest sticking point in my research on algorithmic composition, and the only real sticking point, is this tension between the representation of notes and the representation of chords.

gogins commented 5 years ago

I have a hazy notion that the existing overlap between IFS and 0-L systems can be extended to local IFS and FIFs. At each address in the fractal, context can determine one of a set of operations to apply. Not sure about the 0-L system stack.

gogins commented 5 years ago

Recurse, or branch, at each address. The state consists of a function that returns a state for an address, a function that partitions the domain, and a function that transforms the range.

gogins commented 5 years ago

This business has been bugging me ever since I started trying to use IFSs with chord spaces.

The key clearly is to ensure that harmony is a single value, possibly a vector value, that is strictly the graph of a function of time. That can be ensured with any IFS by partitioning time between all transformations of harmony. That is, no two transformations of harmony can be an image of the same time. The other dimensions may contract with rotations and overlaps, but not harmony: it must contract to the graph of a function.

In this case the dimensions of harmony can be much simplified. The rest of the attractor will take care of voicing and arranging. Harmony could even be a single Mason number, or it could be set-class, inversion, and transposition.

I have reduced the question to that of determining how to modify the Hutchinson operator so as to contract harmony only to the graph of a function of time, while leaving the other dimensions open to any contraction.
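
A minimal sketch of the time side of that constraint, assigning each of the N transformations its own disjoint sub-interval of time; names are illustrative:

```javascript
// Partition [begin, end) into N disjoint sub-intervals, one per transformation,
// so that no two transformations of harmony are an image of the same time.
function timePartition(begin, end, N) {
    const width = (end - begin) / N;
    const partitions = [];
    for (let i = 0; i < N; i++) {
        partitions.push({begin: begin + i * width, end: begin + (i + 1) * width});
    }
    return partitions;
}
```

Each transformation then contracts and translates time into its assigned sub-interval, while the other dimensions remain free to contract however the piece requires.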

gogins commented 5 years ago

For the sake of thinking this through, I will assume that harmony is strictly Mason number, an integer modulo 12.

gogins commented 5 years ago

To create an operator that contracts harmony to a function of time, it should suffice that, of all the transformations in the operator that transform harmony at a given time, all but one act as the identity. This is what I was doing by hand in the MCRM pieces. Can this be formalized? Or do all such transformations need to never rotate?

(attached image)

Proof positive that such transformations must not rotate. Hence, all transformations must always partition time and never rotate harmony. Anything else is permitted.

I believe that this can be implemented by setting the rotation of harmony to zero, and preprocessing the other transformations to partition time. Then simple affine transformation matrices, differing perhaps by partition of time and/or depth of iteration, can be used to compose the Hutchinson operator.

gogins commented 5 years ago

Please note: common sense confuses rotation about an axis with rotation in a plane. In 3 dimensions the two are the same thing, but in higher dimensions a rotation in a plane leaves fixed a subspace of 2 or more dimensions, so there is no single axis. Here, we are concerned with rotation in the time-harmony plane.

gogins commented 5 years ago

The matrices then should be set up like this, where harmony consists of one or more dimensions:

Rotation by A in the time/harmony plane

|              | x: time | y: harmony | z: key | u: velocity | homogeneity |
|--------------|---------|------------|--------|-------------|-------------|
| x: time      | cos(A)  | sin(A)     |        |             | 0           |
| y: harmony   | -sin(A) | cos(A)     |        |             | 0           |
| z: key       |         |            | 1      |             | 0           |
| u: velocity  |         |            |        | 1           | 0           |
| translations |         |            |        |             | 1           |

No rotation in the time/harmony plane

|              | x: time | y: harmony | z: key | u: velocity | homogeneity |
|--------------|---------|------------|--------|-------------|-------------|
| x: time      |         | 0          |        |             | 0           |
| y: harmony   | 0       |            |        |             | 0           |
| z: key       |         |            |        |             | 0           |
| u: velocity  |         |            |        |             | 0           |
| translations |         |            |        |             | 1           |

Factors to pre-process for partition of the time/harmony plane by time

|              | x: time | y: harmony | z: key | u: velocity | homogeneity |
|--------------|---------|------------|--------|-------------|-------------|
| x: time      | a       |            |        |             | 0           |
| y: harmony   |         | c          |        |             | 0           |
| z: key       |         |            |        |             | 0           |
| u: velocity  |         |            |        |             | 0           |
| translations | b       | d          |        |             | 1           |
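
A minimal sketch of the third matrix as a plain 5 x 5 array, using the row-vector convention of the tables (translations in the last row); the key and velocity entries are left at unity here, although the tables leave them open:

```javascript
// Affine matrix for (time, harmony, key, velocity, 1) as a row vector:
// contraction a and translation b for time, contraction c and translation d
// for harmony, and no rotation in the time/harmony plane.
function timeHarmonyPartitionMatrix(a, b, c, d) {
    return [
        [a, 0, 0, 0, 0],  // time:    contracted by a
        [0, c, 0, 0, 0],  // harmony: contracted by c
        [0, 0, 1, 0, 0],  // key:     unity here; free to vary per transformation
        [0, 0, 0, 1, 0],  // velocity: likewise
        [b, d, 0, 0, 1]   // translations: b for time, d for harmony
    ];
}
```
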
gogins commented 5 years ago

I made the mistake of thinking that the bilinear transformation can be implemented using a transformation matrix. This is not the case, because the bilinear transformation maps parallel lines to non-parallel lines, whereas only transformations that preserve parallelism can be represented by an affine transformation matrix. However, I think the R-B operator approach still holds if I switch to shear transformations.

gogins commented 5 years ago

I am ready to implement the algorithms and mathematics in the draft paper. I will need fast matrix multiplication.

https://github.com/waylonflinn/weblas

https://github.com/mljs/matrix

https://github.com/lauerfab/MLweb/

https://github.com/toji/gl-matrix only 4x4!?

https://github.com/josdejong/mathjs

gogins commented 5 years ago

I am going with ml-matrix, because math.js doesn't support decompositions, MLweb looks fiddly, and weblas doesn't have many users, doesn't support decompositions, and shows no recent commits.

Oops: it turns out that ml-matrix doesn't work in NW.js, because it depends on import, which NW.js doesn't support.

gogins commented 5 years ago

There's blasjs. I will try this as it is probably less likely to contain errors. The source code is TypeScript and must be compiled to JavaScript before use. There are the BLAS solvers for linear systems but, e.g., no eigenvalue decomposition.

Also numjs. Again, no decompositions.

And that leaves the scijs set of repositories based on ndarray. There is a good chunk of linear algebra and some signal processing here. Do these packages work in NW.js and the browser? It seems you have to use browserify, which I'm not familiar with, but I'll see whether it's simple enough.

gogins commented 5 years ago

And of course, TensorFlow for JavaScript. Good news, this works with the CDN. What is in there? Lots of stuff for machine learning, plus matrix arithmetic and some decompositions, but not eigenvalue decomposition.

As an aside: https://www.npmjs.com/package/@magenta/music. Uses the above.

gogins commented 5 years ago

OK, how to use TensorFlow for JavaScript as a static resource integrated with my build system?

gogins commented 5 years ago

Of course I missed that TensorFlow objects are immutable... I could use TensorFlow.Variables, but that would be iffy, as every operation will produce a TensorFlow object. After all this research, I will end up using numeric.js, last updated in 2012. It supports everything I need, and goes up to eigenvalue decompositions.

gogins commented 5 years ago

Hard to believe, but the basic algorithm actually seems to be working as intended. I will write the following tests:

gogins commented 3 years ago

As evidenced elsewhere on GitHub, I found many bugs in ChordSpace.js. I believe these are fixed now, but only in ChordSpaceBase.hpp. This issue requires moving all code to CsoundAC's ChordSpace classes.

gogins commented 3 years ago

I have moved all code to C++, fixed all ChordSpace bugs (I think), and fixed all HarmonyIFS bugs (I think). These issues have now been taken care of.

However, doing this work has deepened my understanding of potential issues in the underlying concepts and mathematics. I am therefore closing this issue and opening a new one.