The above image remixes the Hydra code "Filet Mignon" from AFALFL and the GLSL shader "Just another cube" from mrange. They are licensed under CC BY-NC-SA 4.0 and CC0, respectively.
Patchies is a tool for building interactive audio-visual patches in the browser, using JavaScript. Try it out at patchies.app - it's open source!
Patchies lets you use the audio-visual tools that you know and love, together in one place. For example:
- P5.js, a JavaScript library for creative coding.
- Hydra, a live-coding video synthesizer.
- Strudel, a TidalCycles-like music environment.
- ChucK, a programming language for real-time sound synthesis.
- ML5.js, a friendly machine learning library for the web.
- Web Audio API, a powerful audio synthesis and processing API.
- GLSL fragment shaders, for complex visual effects.
- ...as well as write JavaScript code directly.
Patchies lets you "patch" multiple objects together using Message Passing, Video Chaining and Audio Chaining. It's inspired by tools such as Max/MSP, Pure Data, TouchDesigner, VVVV, and others.
"What I cannot create, I do not understand. Know how to solve every problem that has been solved." - Richard Feynman
- Go to patchies.app.
- Press `Enter` to create a new object.
- Click and drag the title of the object on the top left to move it.
- When hovering over an object, you'll see icon buttons such as "edit code" and "play/stop" on the top right.
- Use the "Edit Code" button to open the code editor.
- Press `Shift + Enter` while in a code editor to re-run the code, or hit the "Play" icon.
- Click on the title to focus on an object.
- Drag on the title to move the object around.
- Press `Delete` to delete an object.
- Press `Ctrl/Cmd + K` to bring up the command palette.
  - You can do many actions here, such as toggling fullscreen, importing/exporting patch files, saving/loading patches in your browser, setting API keys, opening a secondary output screen, toggling FPS monitors and more.
You can use the Shortcuts button on the bottom right to see a list of shortcuts. Here are some of the most useful ones:
- `Click on object / title`: focus on the object.
- `Drag on object / title`: move the object around.
- `Scroll up`: zoom in.
- `Scroll down`: zoom out.
- `Drag on empty space`: pan the canvas.
- `Enter`: create a new object at the cursor position.
- `Ctrl/Cmd + K`: open the command palette to search for commands.
- `Shift + Enter`: run the code in the code editor within the selected object.
- `Delete`: delete the selected object.
- `Cmd + Z`: undo the last action.
- `Ctrl + C`: copy the selected object.
- `Ctrl + V`: paste the copied object.
You can use the `send()` and `recv()` functions to send and receive messages between objects. This allows you to create complex interactions between different parts of your patch. This is very similar to messages in Max/MSP.

Here is how to use `send` and `recv` in JavaScript objects:
```js
// Object A
send('Hello from Object A')

// Object B
recv((data, meta) => {
  // data = "Hello from Object A"
  console.log('Received message:', data)
})
```
You can use the `send` and `recv` functions in all JavaScript-based objects, such as `js`, `p5`, `hydra`, `strudel` and `canvas`.
The `meta` object includes `inlet`, the index of the inlet that received the message. This is helpful to distinguish inlets. You can also call `send(data, {to: inletIndex})` to send data to only a particular inlet, for example:

```js
recv((data, meta) => {
  send(data, {to: meta.inlet})
})
```
In JavaScript objects such as `js`, `p5` and `hydra`, you can call `setPortCount(inletCount, outletCount)` to set the exact number of message inlets and outlets. For example, `setPortCount(2, 1)` ensures there are 2 message inlets and 1 message outlet.
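As a minimal sketch (assuming a `js` object; the message contents are made up for illustration), combining `setPortCount` with `meta.inlet` lets one object react differently per inlet:

```js
// Hypothetical js object with two message inlets and one outlet.
setPortCount(2, 1)

recv((data, meta) => {
  // meta.inlet tells us which inlet the message arrived on.
  if (meta.inlet === 0) {
    send(`inlet 0 received: ${data}`)
  } else {
    send(`inlet 1 received: ${data}`)
  }
})
```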
You can also `send` messages into GLSL uniforms. If you define a uniform in your GLSL code like so:

```glsl
uniform float iMix;
uniform vec2 iFoo;
```

This will create two inlets in the GLSL object: the first one allows `send(0.5)` for `iMix`, and the other allows `send([0.0, 0.0])` for `iFoo`. When you `send` messages to these inlets, it will set the internal GLSL uniform values for the object.
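For instance, a `js` object connected to the `iMix` inlet could animate that uniform over time. This is only a sketch; the wiring and the 50 ms interval are assumptions:

```js
// Hypothetical js object wired to the GLSL object's iMix (float) inlet.
let t = 0

setInterval(() => {
  t += 0.05
  // Oscillate iMix smoothly between 0 and 1.
  send((Math.sin(t) + 1) / 2)
}, 50)
```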
You can chain video objects together to create complex video effects, by using the output of a video object as an input to another. For example: P5 -> Hydra -> GLSL. This is similar to shader graphs in TouchDesigner.
To leverage video chaining, use the leftmost orange inlets and outlets on the patch. You can connect the orange video outlet of a `p5` object to an orange video inlet of a `hydra` object, and then connect the `hydra` object to a `glsl` object.
This allows you to create video patches that are more powerful than what you can do with a single object. Have fun!
Similar to video chaining, you can chain many audio objects together to create complex audio effects.

- You can use these objects as audio sources: `strudel`, `chuck`, `ai.tts`, `ai.music`, `soundfile~`, `video`, as well as the Web Audio objects (e.g. `osc~`, `sig~`, `mic~`).
  - VERY IMPORTANT: you must connect your audio sources to `dac~` to hear the audio output, otherwise you will hear nothing. Audio sources do not output audio unless connected to `dac~`. Use `gain~` to control the volume.
- You can use these objects to process audio: `gain~`, `fft~`, `+~`, `lowpass~`, `highpass~`, `bandpass~`, `allpass~`, `notch~`, `lowshelf~`, `highshelf~`, `peaking~`, `compressor~`, `pan~`, `delay~`, `waveshaper~`, `convolver~`.
  - These objects correspond to Web Audio API nodes. See the Web Audio API documentation for more details.
- Use the `fft~` object to analyze the frequency spectrum of the audio signal. See the Audio Analysis section below.
- You can use `dac~` to output audio to your speakers.
The `fft~` audio object gives you an array of frequency bins that you can use to create visualizations in your patch. Here is how to use it:

- GLSL: connect the purple "analyzer" outlet to a `sampler2D` GLSL uniform inlet.
  - Hit `Enter` to insert an object, and try out the `fft-freq.gl` and `fft-waveform.gl` presets for working code samples.
  - To get the waveform (time-domain analysis) instead of the frequency analysis, you must name the uniform exactly `uniform sampler2D waveTexture;`. Using other names will give you frequency analysis.
- Hydra and P5.js:
  - Try out the `fft.hydra` preset for Hydra examples.
  - Try out the `fft-capped.p5`, `fft-full.p5` and `rms.p5` presets for P5.js examples.
  - `fft()` defaults to waveform (time-domain analysis). You can also call `fft({type: 'wave'})` to be explicit. `fft({type: 'freq'}).a` gives you the frequency spectrum analysis.
- The `fft()` function returns an `FFTAnalysis` class instance which contains helpful properties and methods:
  - raw frequency bins: `fft().a`
  - bass energy as a float (between 0 and 1): `fft().getEnergy('bass') / 255`. You can use these frequency ranges: `bass`, `lowMid`, `mid`, `highMid`, `treble`.
  - energy between any frequency range as a float (between 0 and 1): `fft().getEnergy(40, 200) / 255`
  - RMS as a float: `fft().rms`
  - average as a float: `fft().avg`
  - spectral centroid as a float: `fft().centroid`
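As a hedged sketch of the P5.js side (the canvas size and drawing choices are arbitrary; it assumes an `fft~` object is connected to this `p5` object):

```js
// Hypothetical p5 sketch driven by a connected fft~ object.
function setup() {
  createCanvas(320, 240)
}

function draw() {
  background(0)

  const freq = fft({type: 'freq'})           // frequency-domain analysis
  const bass = freq.getEnergy('bass') / 255  // 0..1 bass energy

  // A circle that pulses with the bass.
  noStroke()
  fill(255)
  circle(width / 2, height / 2, 40 + bass * 160)

  // A simple spectrum of bars from the raw frequency bins.
  for (let i = 0; i < freq.a.length; i += 8) {
    rect((i / 8) * 4, height, 3, -(freq.a[i] / 255) * height)
  }
}
```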
Here is a non-exhaustive list of the objects that we have in Patchies. You can also hit `n` on your keyboard to see the list of objects to create, as well as drag objects in from the bottom bar.
These objects support video chaining and can be connected to create complex visual effects:
- P5.js is a JavaScript library for creative coding. It provides a simple way to create graphics and animations, but you can do very complex things with it.
- If you are new to P5.js, I recommend watching Patt Vira's tutorials on YouTube, or on her website. They're fantastic for both beginners and experienced developers.
- Read the P5.js documentation to see how P5 works.
- See the P5.js tutorials and OpenProcessing for more inspiration.
- You can also use ML5.js in your P5 sketch to add machine learning capabilities. Call `loadML5()` at the top of your sketch to load the ML5 library.
- You can call these special methods in your sketch (see the sketch after this list):
  - `noDrag()` disables dragging the whole canvas. You must call this method if you want to add interactivity to your sketch, such as adding sliders or mousePressed events. You can call it in your `setup()` function.
    - When `noDrag()` is enabled, you can still drag the "p5" title to move the whole object around.
  - `send(message)` and `recv(callback)`, see Message Passing.
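A minimal sketch of that interactivity, assuming nothing beyond the methods above (the canvas size and message shape are arbitrary):

```js
// Hypothetical p5 sketch: click the canvas to send the click position onward.
function setup() {
  createCanvas(320, 240)
  noDrag() // let the sketch receive mouse events instead of dragging the object
}

function draw() {
  background(20)
  circle(mouseX, mouseY, 30)
}

function mousePressed() {
  send({x: mouseX, y: mouseY}) // forward the click via Message Passing
}
```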
- Hydra is a live coding video synthesizer. You can use it to create complex video effects and animations.
- See the interactive Hydra documentation to learn how to use Hydra.
- Try out the standalone Hydra editor to see how Hydra works.
- You can call these special methods in your Hydra code (see the sketch after this list):
  - `setVideoCount(ins = 1, outs = 1)` creates the specified number of Hydra source ports.
    - For example, `setVideoCount(2)` will initialize `s0` and `s1` with the first two video inlets.
  - The full Hydra synth is available as `h`.
  - Outputs are available as `o0`, `o1`, `o2`, and `o3`.
  - `send(message)` and `recv(callback)`, see Message Passing.
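As a sketch of how those pieces fit together (the specific Hydra chain is just an example), a `hydra` object with two video inlets could blend its inputs like this:

```js
// Hypothetical hydra object sitting after two video sources.
setVideoCount(2) // s0 and s1 are initialized with the first two video inlets

src(s0)
  .blend(src(s1), 0.5) // mix the two incoming video streams
  .kaleid(4)           // add a kaleidoscope effect
  .out(o0)             // render to the first output
```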
- GLSL is a shading language used in OpenGL. You can use it to create complex visual effects and animations.
- You can use video chaining by connecting any video objects (e.g. `p5`, `hydra`, `glsl`, `swgl`, `bchrn`, `ai.img` or `canvas`) to the GLSL object via the four video inlets.
- You can create any number of GLSL uniform inlets by defining them in your GLSL code.
  - For example, if you define `uniform float iMix;`, it will create a float inlet for you to send values to.
  - You can send values to the uniform inlets using Message Passing.
  - If you define the uniform as `sampler2D`, such as `uniform sampler2D iChannel0;`, it will create a video inlet for you to connect video sources to.
- See Shadertoy for examples of GLSL shaders.
  - All shaders on the Shadertoy website are automatically compatible with `glsl`, as they accept the same uniforms.
- I recommend playing with The Book of Shaders to learn the GLSL basics!
- SwissGL is a minimalistic wrapper for WebGL to create compute shaders and GPU-accelerated graphics.
- Perfect for data visualization, image processing, and GPU compute tasks.
- Supports video chaining for complex processing pipelines.
- You can use the HTML5 Canvas to create custom graphics and animations. The rendering context is exposed as `canvas` in the JavaScript code, so you can use methods like `canvas.fill()` to draw on the canvas.
- You can call these special methods in your canvas code (see the sketch after this list):
  - `noDrag()` disables dragging the whole canvas. This is needed if you want to add interactivity to your canvas, such as adding sliders. You can call it in your `setup()` function.
  - `getSource()` gets the video source from the previous video object using Video Chaining. This returns the HTML5 canvas element, which you can use for e.g. copying pixels. You can call this in your `setup()` function.
  - `send(message)` and `recv(callback)`, see Message Passing.
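A rough sketch under stated assumptions: it uses the `setup()` entry point described above, drives its own animation loop with `requestAnimationFrame`, and picks an arbitrary 320x240 size:

```js
// Hypothetical canvas object: copy frames from the chained video source, then dim them.
function setup() {
  noDrag() // keep pointer events for the canvas itself

  const source = getSource() // HTML5 canvas element from the previous video object

  function loop() {
    if (source) {
      canvas.drawImage(source, 0, 0, 320, 240) // copy pixels from the chained source
    }
    canvas.fillStyle = 'rgba(0, 0, 0, 0.2)'
    canvas.fillRect(0, 0, 320, 240) // fade towards black for a trail effect
    requestAnimationFrame(loop)
  }

  loop()
}
```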
- Butterchurn is a JavaScript port of the Winamp Milkdrop visualizer.
- You can use it as a video source and connect it to other video objects (e.g. `hydra` and `glsl`) to derive more visual effects.
- Load and display images from URLs or local files.
- Supports video chaining for image processing pipelines.
- Can be used as texture sources for other visual objects.
- Strudel is a live coding environment based on TidalCycles. You can use it to expressively write dynamic music pieces, as well as create complex audio patterns and effects.
- See the Strudel workshop to learn how to use Strudel.
- Check out the Strudel showcase to get inspiration from how people use Strudel.
- ChucK is a programming language for real-time sound synthesis and music creation.
- Great for algorithmic composition and sound design.
- Runs in the browser via WebChucK.
- Run Python code directly in the browser using Pyodide.
- Great for data processing, scientific computing, and algorithmic composition.
- Full Python standard library available.
- Use `console.log()` to log messages to the virtual console.
- Use `setInterval(callback, ms)` to run a callback every `ms` milliseconds.
  - The code block has a special version of `setInterval` that automatically cleans up the interval on unmount. Do not use `window.setInterval` from the window scope, as that will not clean up.
- Use `send()` and `recv()` to send and receive messages between objects. This also works in other JS-based objects. See the Message Passing section, and the sketch after this list.
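Putting those together, a minimal sketch of a `js` object (the one-second interval and counter payload are arbitrary):

```js
// Hypothetical js object: emit an increasing counter once per second.
let count = 0

setInterval(() => {
  count++
  console.log('tick', count) // appears in the virtual console
  send(count)                // forwarded to any connected objects
}, 1000)
```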
- Evaluate mathematical expressions and formulas.
- Perfect for control signals and parameter mapping.
- Supports variables and mathematical functions.
- Similar to `expr` but runs at audio rate for signal processing.
- Use for audio synthesis and real-time signal manipulation.
- Trigger events and send messages when clicked.
- Perfect for user interaction and patch control.
- Use `send()` to output bang messages or custom data.
- Store and send predefined messages.
- Click to send the stored message to connected objects.
- Great for triggering sequences or sending configuration data.
- Continuous value control with customizable range.
- Perfect for real-time parameter adjustment.
- Outputs numeric values that can control other objects.
- Create Max/MSP-style textual objects with typed inlets and outlets.
- Supports a wide range of audio processing, control, and utility objects.
- Type an object name to create specialized functionality.
Audio Processing:

- `gain~`: Amplifies audio signals with gain control
- `osc~`: Oscillator for generating audio waveforms (sine, square, sawtooth, triangle)
- `lowpass~`, `highpass~`, `bandpass~`, `allpass~`, `notch~`: Various audio filters
- `lowshelf~`, `highshelf~`, `peaking~`: EQ filters for frequency shaping
- `compressor~`: Dynamic range compression for audio
- `pan~`: Stereo positioning control
- `delay~`: Audio delay line with configurable delay time
- `+~`: Audio signal addition
- `sig~`: Generate constant audio signals
- `waveshaper~`: Distortion and waveshaping effects
- `convolver~`: Convolution reverb using impulse responses
- `fft~`: FFT analysis for frequency domain processing

Sound Sources:

- `soundfile~`: Load and play audio files with transport controls
- `sampler~`: Sample playback with triggering capabilities
- `mic~`: Capture audio from microphone input

Control & Utility:

- `mtof`: Convert MIDI note numbers to frequencies
- `loadbang`: Send a bang on patch load
- `metro`: Metronome for regular timing
- `delay`: Message delay (not audio)
- `adsr`: ADSR envelope generator
- `dac~`: Send audio to speakers
- `fslider`: Floating-point slider control
- `bang`: Alias for the button object
- Receive MIDI messages from connected devices.
- Outputs note, velocity, and control change data.
- Perfect for musical controllers and hardware integration.
- Send MIDI messages to external devices or software.
- Control external synthesizers and DAWs.
- Supports note, CC, and system messages.
- Send messages over network protocols.
- Communicate with other applications or devices.
- Supports UDP and TCP protocols.
- Receive messages from network sources.
- Listen for data from other applications.
- Complements `netsend` for network communication.
These objects can be hidden via the "Toggle AI Features" command if you prefer not to use AI:
- Generate text using AI language models.
- Create dynamic content, lyrics, or procedural text.
- Integrates with message system for interactive generation.
- Generate images from text prompts using AI.
- Create visual content programmatically.
- Supports video chaining as texture source.
- Generate musical compositions using AI.
- Create backing tracks, melodies, or soundscapes.
- Outputs audio that can be processed by other objects.
- Convert text to speech using AI voices.
- Create dynamic narration or vocal elements.
- Outputs audio for further processing.
- Set the final output that appears as the background.
- The endpoint for video chaining pipelines.
- Determines what the audience sees as the main visual.
- Render Markdown text as formatted content.
- Perfect for documentation, instructions, or dynamic text display.
- Supports full Markdown syntax including links and formatting.
If you dislike AI features (e.g. text generation, image generation, speech synthesis and music generation), you can hide them by activating the command palette with `Ctrl/Cmd + K`, then searching for "Toggle AI Features". This will hide all AI-related objects and features, such as `ai.txt`, `ai.img`, `ai.tts` and `ai.music`.