Closed gmittal closed 6 years ago
I'm not sure the question is fully understood here; the XY coordinates are resolved to a position in the binary audio file in the application code. The audio pipeline folder contains the scripts used to create these files; modifying [this script](https://github.com/googlecreativelab/open-nsynth-super/blob/master/audio/workdir/06_build_pads.py) to also output the XY coordinates of each sound may accomplish what you're looking for.
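To illustrate the general idea, here is a minimal sketch of how a touchscreen (x, y) pair could be quantized to a grid cell and used to seek into a pre-rendered pad file. The grid size, samples-per-cell, sample format, and file layout below are all assumptions for illustration, not the actual NSynth Super binary format:

```python
# Hypothetical sketch of XY -> audio lookup. GRID_SIZE, SAMPLE_COUNT,
# BYTES_PER_SAMPLE, and the flat row-major file layout are assumptions,
# not the real pad file format.
import struct

GRID_SIZE = 9          # assumed pad resolution (GRID_SIZE x GRID_SIZE cells)
SAMPLE_COUNT = 60000   # assumed number of samples stored per grid cell
BYTES_PER_SAMPLE = 2   # assumed 16-bit PCM

def xy_to_offset(x, y, grid_size=GRID_SIZE):
    """Quantize normalized touch coordinates (0.0-1.0) to a grid cell
    and return the byte offset of that cell's audio in the pad file."""
    col = min(int(x * grid_size), grid_size - 1)
    row = min(int(y * grid_size), grid_size - 1)
    cell_index = row * grid_size + col
    return cell_index * SAMPLE_COUNT * BYTES_PER_SAMPLE

def read_pad_audio(path, x, y):
    """Read the 16-bit PCM samples for the grid cell nearest (x, y)."""
    offset = xy_to_offset(x, y)
    with open(path, "rb") as f:
        f.seek(offset)
        raw = f.read(SAMPLE_COUNT * BYTES_PER_SAMPLE)
    return struct.unpack("<%dh" % (len(raw) // 2), raw)
```

The real application interpolates between neighboring sounds rather than doing a simple nearest-cell lookup, so treat this only as a starting point for understanding the indexing step.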
The implementation for NSynth Super is optimised for running on the device so may need to be modified for other use cases.
Looking to port some of these ideas to other platforms (e.g. iOS). I downloaded the "lite" audio binary files and the corresponding settings.json.
I've looked through some of the source code, but got a little confused in the process. Can someone walk me through how to use the binary audio files to generate a sound wave given an X, Y pair on the touchscreen plane?
Thanks.