
Fragment

Fragment - The Collaborative Graphical Audio Synthesizer

Source code repository for the Fragment app, which can be found at: https://www.fsynth.com

Table of Contents

- About Fragment
- Requirements
- Features
- Tools
- Limitations
- Tips and tricks
- Project organization
- Tech
- Build system
- How to set up your own server
- Prod. system
- Credits
- Fragment on social media
- License

About Fragment

Fragment is a graphical audio synthesizer and collaborative audiovisual live coding web environment with a pixel-based (image-synth) approach to real-time sound synthesis. The sound synthesis is driven by pixel data generated from live GLSL and Processing.js code, with many different types of input data available.

Videos demonstrating most features are available on YouTube

Fragment has a single fragment shader which has the particularity of being shared between all users of an online session; it is updated and compiled on-the-fly as you or other people type. Some settings are also synchronized between users, such as slices and some global settings, with the exception of input data, which is not synchronized.

Fragment has many features that make it a joy to produce any kind of sound, with or without associated visuals. It is aimed at artists seeking a creative environment with few limitations to experiment with: a programmable noise-of-all-kinds software.

To output any sound, the client needs to be used with the Fragment Audio Server, a high-performance native digital synthesizer.

Fragment requires a WebGL 2 compatible browser. Audio can be produced independently from the visuals with the synthOutput vec4 uniform.

For any questions, a message board is available here

Requirements

Note on performance: Fragment performs very well on a modern multi-core system with a browser such as Chrome. However, due to browser UI reflow you may sometimes experience latency, especially when typing in the code editor; this can be solved by using the independent code editor.

Fragment is able to do real-time distributed sound synthesis with its audio server; it supports any number of machines over the network along with multiple cores. This feature also needs the fas_relay to work (see below).

Features

Sound Synthesis

Fragment captures pixel data (1px wide slices) from a WebGL drawing surface at the browser display refresh rate and translates the RGBA pixel values into notes; the notes are then interpreted and played by one or more synthesis methods in real time.

Common to all synthesis methods, the canvas represents frequencies (exponential mapping) on the vertical axis and time on the horizontal axis.

It can be seen as a front-end for a huge bank of oscillators / filters.
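To illustrate the vertical-axis mapping, here is a rough JavaScript sketch of how a pixel row could be converted to a frequency with an exponential mapping; baseFrequency and octaves are illustrative parameter names, not Fragment's actual score settings.

```javascript
// Rough sketch of an exponential vertical-axis frequency mapping
// (illustrative only; `baseFrequency` and `octaves` are made-up names,
// not Fragment's actual score settings).
function rowToFrequency(y, canvasHeight, baseFrequency, octaves) {
    // y = 0 is the top row (highest frequency),
    // y = canvasHeight - 1 is the bottom row (lowest frequency)
    var position = (canvasHeight - 1 - y) / (canvasHeight - 1); // 0..1, bottom to top
    return baseFrequency * Math.pow(2, position * octaves);
}

// example: a 768px high canvas spanning 10 octaves above 16.35 Hz (C0)
console.log(rowToFrequency(767, 768, 16.35, 10)); // bottom row -> 16.35 Hz
console.log(rowToFrequency(0, 768, 16.35, 10));   // top row    -> ~16742 Hz
```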

Audio synthesis is powered by an independent audio server; Fragment doesn't output any real-time sound on its own.

External synthesizers can be triggered via MIDI out.

Slice data can be sent via OSC bundles to use Fragment as an interface.

MIDI

Fragment supports MIDI input and output with compatible browsers.

Features

External synths can be triggered from pixel data via MIDI OUT:

- MIDI devices can be assigned to one or more slices
- RGBA channels can be assigned to user-defined MIDI messages from the slice settings
- Fragment has limited MPE support for output (non-standard for now) to provide polyphony through the 16 MIDI channels; every sounding note is temporarily assigned to its own channel, allowing microtonal playback, individual stereo panning and polyphony (see the sketch below)
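As a loose illustration of this per-note channel assignment (not Fragment's actual code), here is a hedged Web MIDI sketch where each note-on cycles to its own channel:

```javascript
// Hedged sketch of MPE-style per-note channel assignment over Web MIDI.
// Not Fragment's actual implementation; it only shows the idea that every
// sounding note gets its own MIDI channel so per-note tuning/panning is possible.
navigator.requestMIDIAccess().then(function (midiAccess) {
    var output = midiAccess.outputs.values().next().value;
    if (!output) return;

    var nextChannel = 0;

    function noteOn(note, velocity) {
        var channel = nextChannel;
        nextChannel = (nextChannel + 1) % 16; // cycle through the 16 MIDI channels

        output.send([0x90 | channel, note, velocity]); // note on
        return channel; // keep it so the matching note off uses the same channel
    }

    function noteOff(note, channel) {
        output.send([0x80 | channel, note, 0]); // note off
    }

    var ch = noteOn(60, 100);                          // middle C
    setTimeout(function () { noteOff(60, ch); }, 500); // release after 500 ms
});
```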

If you need to control more parameters, see OSC below.

OSC

Fragment supports OSC input and output; an OSC relay which translates WebSocket data to UDP packets must be used for this feature to work.
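The relay's job boils down to forwarding packets between the two transports; below is a minimal Node.js sketch of the idea (not the actual osc_relay implementation, and the ports are made up):

```javascript
// Minimal sketch of what an OSC relay does: forward WebSocket messages as raw
// UDP packets and vice versa. Not the actual osc_relay code; ports are made up.
var WebSocket = require('ws'); // npm install ws
var dgram = require('dgram');

var udpSocket = dgram.createSocket('udp4');
var wss = new WebSocket.Server({ port: 8080 });
var client = null;

wss.on('connection', function (ws) {
    client = ws;
    // browser -> UDP: forward raw OSC packets to the OSC application
    ws.on('message', function (data) {
        udpSocket.send(data, 57120, '127.0.0.1'); // e.g. SuperCollider's default port
    });
});

// UDP -> browser: forward incoming OSC packets back to the connected client
udpSocket.on('message', function (msg) {
    if (client && client.readyState === 1) client.send(msg);
});
udpSocket.bind(57121);
```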

Fragment uniforms can be defined through OSC with two methods:

You can send a message to the /clear address to clear all OSC-defined uniforms.

Open Stage Control can be used to control partials or more parameters through OSC via faders etc.

Tools

Many tools are available to enhance Fragment.

Limitations

Tips and tricks

Project organization

Tech

The Fragment client is a vanilla JavaScript web application; it uses ECMAScript 5 and 6 (due to some APIs requiring it) and many Web API technologies (Web Audio, Web Workers, Web MIDI, WebGL 2, Web Storage, IndexedDB, etc.). It was rapidly built from a prototype and has gone through multiple iterations since then; the UI code is probably the part that changed the least architecturally and is probably the most bloated, along with the slices code.

The Fragment client relies on few dependencies (CodeMirror, sharedb, Recorderjs, etc.) and on some purpose-built libraries such as WUI, which handles all the UI widgets.

The client uses a simple custom build system and is architected around its 'code injection' feature within a single function (see app_fs.js); all other files roughly follow a fields declaration / functions / initialization structure, and code injection and initialization calls are only done in app_fs.js for sanity.
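As a rough illustration only (not actual Fragment code, names are made up), a client file following that layout might look like this:

```javascript
// Illustrative sketch of the "fields / functions / initialization" layout that
// most client files roughly follow. In the real app the initialization call is
// made from app_fs.js after code injection.

/* Fields. */

var _my_widget_element = null;

/* Functions. */

var _myWidgetOnClick = function (ev) {
    console.log("widget clicked", ev);
};

/* Initialization. */

var _myWidgetInit = function () {
    _my_widget_element = document.getElementById("my_widget");
    _my_widget_element.addEventListener("click", _myWidgetOnClick);
};
```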

Most backend apps are built using NodeJS.

Build system

Fragment is built with a custom build system which scans for changes in real time and includes files when it reads /*#include file*/; it executes several programs on the output files, such as a code minifier. The build system was made with Anubis, a functional programming language based on cartesian closed category theory.
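The include mechanism itself boils down to text substitution; here is a hedged JavaScript sketch of the idea (not the Anubis or pyNut implementation, and the file names are illustrative):

```javascript
// Hedged sketch of the /*#include file*/ expansion idea: replace each include
// directive with the referenced file's content, recursively. Not the actual
// build system code.
var fs = require('fs');
var path = require('path');

function expandIncludes(filePath) {
    var dir = path.dirname(filePath);
    var source = fs.readFileSync(filePath, 'utf8');

    return source.replace(/\/\*#include\s+(.+?)\*\//g, function (match, includedFile) {
        // recurse so included files can themselves contain include directives
        return expandIncludes(path.join(dir, includedFile.trim()));
    });
}

// example: produce a single concatenated file from an entry point (illustrative names)
fs.writeFileSync('dist/fragment.js', expandIncludes('app_fs.js'));
```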

Since the Anubis language is mostly private, a simplified Python port of the build system (without live check & build) is available; check out pyNut.

If you want to build it yourself, install the pyNut script somewhere in your PATH, then call the pynutbuild shell script in the client root directory.

app_fs and app_cm are the entry point files used by the build system to produce a single file and a production-ready file in the dist directory.

You may need to install these dependencies (code minifier) globally through NPM:

The Anubis build system can be found here and is called by the shell script named nutbuild (root folder).

How to set up your own server

Fragment makes use of NodeJS, NPM, MongoDB and the Redis database (all on default ports, so it should work out of the box). Install steps with APT (adapt to your package manager):

On Windows the installation is also easy: just download & install each of the dependencies above. Redis does not have a Windows build but it may be replaced by memurai.

Once those are installed, it is easy to run it locally:

Under Linux: proprietary GPU drivers are recommended for performance reasons.

If you just want to try it out without the collaborative features and GLSL code saving, you don't need MongoDB and Redis; you just need "fsws", then point your browser to http://127.0.0.1:3000

If you want to use it with an OSC app like the SuperCollider fs.sc file or Open Stage Control, please look at the osc_relay directory.

To use the OSC relay:

To use the FAS relay:

Prod. system

Credits

Libraries:

Papers:

Data:

The repository for the early proof of concept can be found here.

Fragment on social media

YouTube

Twitter

License

Simplified BSD license

Credits

The main inspiration for all of this is Alexander Zolotov's Virtual ANS software.

Heavily inspired by Shadertoy as well.

Some ideas also come from Sonographic sound processing and Metasynth