Live coded music and visuals using Sonic Pi and Hydra.
Sonic Pi handles the sound — atmospheric jungle, breakbeats, ambient textures. Hydra handles the visuals — audio-reactive patterns, camera effects, generative art. They talk to each other over OSC.
```
.
├── utils.rb                        # helper functions (tempo control, sample patterns, stutter)
├── projects/
│   ├── template.rb                 # full project template with annotated sections
│   ├── atmospheric_jungle_demo.rb
│   ├── atmospheric_jungle_live.rb  # same but with MIDI out for hardware
│   └── tutorial.rb
├── visuals/
│   ├── acid_simulation.js          # video-based, audio-reactive
│   ├── zhizha.js                   # camera + voronoi patterns
│   ├── cam_osc_audio_react.js
│   ├── rain.js                     # generative cellular patterns
│   ├── trippy_sphere.js            # pure math, no camera
│   └── osc_receive.js              # OSC listener template
├── project.rb                      # quick example
└── rain.rb                         # ambient piece
```
- Projects: actual compositions. `atmospheric_jungle_demo.rb` is the main one — chord progressions, amen break chopping, arps, the works.
- Visuals: Hydra sketches, mostly audio-reactive. Paste them into the Hydra editor.
- Utils: shared helpers loaded via `run_file` — pattern-based sequencing, sample wrappers, tempo transitions.
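The actual helpers live in `utils.rb`; as a rough illustration of the pattern-based sequencing idea, here is a hypothetical sketch in plain Ruby (function names `parse_pattern` and `step_at` are made up for this example, not taken from the repo):

```ruby
# Turn a step string like "x--x" into one boolean per step,
# which a live_loop could walk through with .tick.
def parse_pattern(str)
  str.chars.map { |c| c == "x" }
end

# Look up whether the pattern fires on a given tick, wrapping around.
def step_at(pattern, tick)
  pattern[tick % pattern.length]
end

p parse_pattern("x--x")              # => [true, false, false, true]
p step_at(parse_pattern("x--x"), 7)  # tick 7 wraps to index 3 => true
```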
- Install Sonic Pi
- Open a project file (e.g. `projects/atmospheric_jungle_demo.rb`) in Sonic Pi
- Update the `run_file` path at the top to point to your `utils.rb` location
- Run it
For visuals, open Hydra in a browser and paste a script from `visuals/`.
Sonic Pi sends OSC messages to `localhost:51000`. The `osc_receive.js` Hydra script listens on that port. Use this to sync visuals to your music.
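A minimal sketch of the Sonic Pi side, using the built-in `use_osc` and `osc` functions (the `/tick` address and the values sent are assumptions for this example — match them to whatever `osc_receive.js` listens for):

```ruby
# Point Sonic Pi's OSC output at Hydra's listener.
use_osc "localhost", 51000

live_loop :osc_tick do
  # "/tick" is a hypothetical address; send whatever path your Hydra script expects.
  osc "/tick", ring(0.2, 0.5, 0.8, 1.0).tick
  sleep 1
end
```

This runs inside Sonic Pi (it's Sonic Pi's DSL, not plain Ruby), sending one message per beat that Hydra can map onto a visual parameter.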
The live version (`atmospheric_jungle_live.rb`) sends MIDI to a Korg Minilogue. Adjust MIDI port names in the script to match your setup.
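The hardware routing boils down to Sonic Pi's `use_midi_defaults` and `midi` calls, roughly like this sketch (the port name below is a placeholder — copy the exact name from Sonic Pi's IO preferences):

```ruby
# Route notes to external hardware. "minilogue_sound" is a placeholder port name.
use_midi_defaults port: "minilogue_sound", channel: 1

live_loop :hw_bass do
  midi :e1, sustain: 0.2  # sends note on, then note off after 0.2 beats
  sleep 0.5
end
```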
- can-of-sprats — my other music coding project, using Sardine (Python) + SuperCollider
Do what you want with it.