Road work ahead! This post is a work in progress.
Check back later for updates.

Patchline

Mixing music remotely or under pressure often creates friction between creative intent and technical execution. To close that gap, I’m developing an AI-assisted, DAW-agnostic audio plugin that interprets natural-language commands such as “add reverb to the vocals” or “boost the kick around 100 Hz” and automatically applies those changes within the DAW environment.

The concept was inspired by Twitch Plays Pokémon, where a live audience collectively controlled a game through simple text inputs. I reimagined that idea in a studio context: a plugin where collaborators or audiences could shape a live mix in real time using natural language. Over time, the idea evolved into a practical tool for artists, live engineers, and worship teams: anyone who needs fast, intuitive control over a mix without needing to be on-site.

The plugin would use a local or cloud-based language model to interpret incoming commands and translate them into parameter adjustments within the DAW. Built-in safety barriers would prevent destructive changes, and users could lock presets or settings to maintain control over key elements during operation.
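As a rough sketch of that pipeline, here is a toy rule-based interpreter standing in for the language model. Everything here is hypothetical: the names (`ParameterChange`, `interpret`, `SAFE_RANGES`) and the command grammar are illustrative, not part of any real DAW API. The point it shows is the safety-barrier idea: whatever the model asks for, values are clamped to hard limits before reaching the mix.

```python
import re
from dataclasses import dataclass
from typing import Optional

# Hypothetical sketch of the command-to-parameter pipeline described above.
# A real version would call a local or cloud LLM; here a few regexes stand in.

@dataclass
class ParameterChange:
    track: str                      # e.g. "vocals", "kick"
    parameter: str                  # e.g. "reverb_mix", "eq_gain"
    value: float                    # target value after safety clamping
    freq_hz: Optional[float] = None # for EQ-style commands

# Safety barriers: hard limits the interpreter may never exceed,
# regardless of what the model (or user) requests.
SAFE_RANGES = {
    "reverb_mix": (0.0, 0.5),   # wet mix, 0..50%
    "eq_gain": (-12.0, 6.0),    # dB
}

def clamp(parameter: str, value: float) -> float:
    lo, hi = SAFE_RANGES[parameter]
    return max(lo, min(hi, value))

def interpret(command: str) -> Optional[ParameterChange]:
    """Map a natural-language command onto a clamped parameter change."""
    cmd = command.lower()
    m = re.search(r"add reverb to the (\w+)", cmd)
    if m:
        return ParameterChange(m.group(1), "reverb_mix",
                               clamp("reverb_mix", 0.25))
    m = re.search(r"boost the (\w+) (?:by (\d+(?:\.\d+)?) ?db )?"
                  r"around (\d+) ?hz", cmd)
    if m:
        gain = float(m.group(2)) if m.group(2) else 3.0
        return ParameterChange(m.group(1), "eq_gain",
                               clamp("eq_gain", gain),
                               freq_hz=float(m.group(3)))
    return None  # unrecognized command: do nothing rather than guess

change = interpret("Boost the kick by 20 dB around 100 Hz")
# The 20 dB request is clamped to the +6 dB safety ceiling.
```

Returning `None` for unrecognized commands mirrors the non-destructive design goal: the plugin should refuse to act rather than apply a change it can't verify against its limits.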

Now in active development, this tool aims to lower the technical barrier between collaborators and their sound, freeing them to focus on creativity while still enabling precise, responsive control.

Current Step: None