What if you could describe the music you want to hear and have an AI produce it in real-time, sending MIDI notes directly to your DAW? That’s exactly what I built: a Python application that uses AI agents to generate Eurobeat and 90s techno patterns, outputting them as live MIDI to Akai’s MPC Beats.
I’m not a musician. I enjoy playing guitar from time to time, but I have zero experience with music production software. However, I recently gifted myself an Akai MPK Mini Plus MIDI controller, which has 8 knobs and 8 pads, and I experimented with using it to control a music generation agent. No idea what I’m doing, but it’s fun.
Since the Akai controller can be connected to a laptop, where I’ve got Python, this Saturday morning I decided to build a simple prototype that connects an AI agent to MIDI output. The idea is simple: you write a prompt like “Energetic eurobeat in Am, Daft Punk style”, and an AI agent powered by Claude on AWS Bedrock generates patterns for 8 tracks: two drum kits, bass, Rhodes, pluck, pad, and a lead melody. The patterns are sent as MIDI messages to MPC Beats, where each track is routed to a different virtual instrument. You can then modify the music live by writing new instructions, and use the physical knobs and pads on the MPK Mini Plus to mute/unmute tracks, regenerate patterns, or reset the session.
I’m using MPC Beats because it’s free and has a simple MIDI setup, but in theory this could work with any DAW that accepts MIDI input. The whole system is built in Python using Strands Agents for the AI orchestration, mido + python-rtmidi for MIDI I/O, and Rich for the terminal UI.
The Architecture
The flow is straightforward: your prompt goes to a Strands agent, the agent writes patterns into a shared store through tool calls, and threaded MIDI players read that store and stream notes to MPC Beats.
Project Structure
```
src/
  settings.py            # Configuration: BPM, tracks, MIDI devices
  cli.py                 # Click CLI entry point
  commands/
    play.py              # Main play command
  agent/
    prompts.py           # System prompts for the AI producer
    tools.py             # PatternStore + @tool functions
    factory.py           # Agent creation
  midi/
    device.py            # MIDI device detection
    melody_player.py     # Threaded melody loop player
    drum_player.py       # Threaded drum loop player
  session/
    state.py             # State machine (IDLE/GENERATING/PLAYING)
    session.py           # Session orchestrator
  ui/
    menu.py              # Interactive terminal menu
```
Configuration
Everything starts with settings.py. The MIDI devices and AWS region are loaded from environment variables, while the musical parameters are defined as constants:
```python
BPM = 122
BAR_DURATION = round((60 / BPM) * 4, 3)
LOOP_BARS = 4
LOOP_DURATION = round(BAR_DURATION * LOOP_BARS, 3)

TRACKS = {
    1: {"name": "Drums", "channel": 0, "type": "drums"},
    2: {"name": "Drums Detroit", "channel": 1, "type": "drums"},
    3: {"name": "Rhodes", "channel": 2, "type": "melody"},
    4: {"name": "Pluck", "channel": 3, "type": "melody"},
    5: {"name": "Bass", "channel": 4, "type": "melody"},
    6: {"name": "Org Bass", "channel": 5, "type": "melody"},
    7: {"name": "Pad", "channel": 6, "type": "melody"},
    8: {"name": "Lead", "channel": 7, "type": "melody"},
}
```
Each track maps to a MIDI channel. Tracks 1-2 are drum kits (offset-based timing), tracks 3-8 are melodic instruments (duration-based timing). The MPC Beats “House Template” provides the virtual instruments: a Classic drum kit, a Detroit percussion kit, Electric Rhodes, Tube Pluck, Bassline, Organ Bass, Tube Pad, and an Instant Go lead synth.
The Bridge Between AI and MIDI: PatternStore and Tools
The core of the system is the PatternStore, a simple shared store where the AI writes patterns and the MIDI players read them:
```python
class PatternStore:
    def __init__(self):
        self._patterns: dict[int, list] = {}

    def set(self, track_id: int, pattern: list) -> None:
        self._patterns[track_id] = pattern

    def get(self, track_id: int) -> list | None:
        return self._patterns.get(track_id)

    def clear(self) -> None:
        self._patterns.clear()
```
The Strands @tool functions are created via a factory that closes over the store:
```python
def create_tools(store: PatternStore) -> list:
    @tool
    def set_melody_pattern(track_id: int, pattern: str) -> str:
        """Define a melodic line for a specific track."""
        data = json.loads(pattern)
        store.set(track_id, data)
        total = sum(n["duration"] for n in data)
        name = TRACKS[track_id]["name"]
        return f"OK - {name}: {len(data)} notes, total duration {total:.3f}s"

    @tool
    def set_drum_pattern(track_id: int, pattern: str) -> str:
        """Define a drum pattern for a specific drum track."""
        data = json.loads(pattern)
        store.set(track_id, data)
        name = TRACKS[track_id]["name"]
        return f"OK - {name}: {len(data)} hits"

    return [set_drum_pattern, set_melody_pattern]
```
A melody pattern is a JSON array of {note, duration, velocity} objects where the sum of durations must equal LOOP_DURATION (4 bars). A drum pattern uses {note, velocity, offset} where offset is the time in seconds from the loop start. The note value -1 represents silence, which is crucial for creating space in the arrangement.
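To make that concrete, here is a hand-written pair of patterns at the article’s settings (122 BPM, 4 bars). The specific note numbers, velocities, and the four-on-the-floor kick are my own illustrative choices, not agent output:

```python
import json

BPM = 122
BEAT = 60 / BPM                                      # ~0.492 s per beat
BAR_DURATION = round(BEAT * 4, 3)                    # 1.967 s
LOOP_BARS = 4
LOOP_DURATION = round(BAR_DURATION * LOOP_BARS, 3)   # 7.868 s

# A melody pattern: durations must sum to LOOP_DURATION.
# note -1 is a rest, which leaves space in the arrangement.
melody = [
    {"note": 45, "duration": 1.967, "velocity": 100},  # A2, one bar
    {"note": -1, "duration": 1.967, "velocity": 0},    # one bar of silence
    {"note": 48, "duration": 1.967, "velocity": 100},  # C3
    {"note": 52, "duration": 1.967, "velocity": 90},   # E3
]
assert abs(sum(n["duration"] for n in melody) - LOOP_DURATION) < 1e-6

# A drum pattern: offsets are seconds from loop start.
# Here, a kick (note 36) on every beat of the 4 bars.
drums = [{"note": 36, "velocity": 110, "offset": round(i * BEAT, 3)}
         for i in range(16)]

# The agent passes these as JSON strings to the tools:
payload = json.dumps(melody)
```

The agent has to do this arithmetic itself, which is why the tool responses echo back the total duration: it lets the model notice and fix a pattern that doesn’t fill the loop.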
The Agent
The agent is a Strands Agent using Claude Sonnet on AWS Bedrock. The system prompt is packed with music production instructions: frequency ranges for each track, velocity guidelines, and structural rules. The key instruction is “less is more”: not all tracks should play notes all the time.
```python
def create_agent(store: PatternStore) -> Agent:
    return Agent(
        model=BedrockModel(
            model_id=Models.CLAUDE_SONNET,
            region_name=AWS_REGION,
        ),
        tools=create_tools(store),
        system_prompt=SYSTEM_PROMPT,
        callback_handler=None,
    )
```
There are two agents: one for initial generation (calls all 8 tools) and one for live modifications (only modifies the tracks that need to change). A third, lighter agent using Haiku generates the menu suggestions to keep latency and cost low.
MIDI Players
Two player classes handle the actual MIDI output. The MelodyLoopPlayer iterates through note events with durations:
```python
def _loop(self, melody: list):
    while not self.stop_event.is_set():
        current = self.store.get(self.track_id) or melody
        for ev in current:
            if self.stop_event.is_set():
                break
            note = ev["note"]
            vel = ev.get("velocity", 80)
            if note >= 0:
                self._send("note_on", note=note, velocity=vel, channel=self.channel)
            deadline = time.time() + ev["duration"]
            while not self.stop_event.is_set() and time.time() < deadline:
                time.sleep(0.02)
            if note >= 0:
                self._send("note_off", note=note, velocity=0, channel=self.channel)
```
The DrumLoopPlayer uses offset-based timing instead, scheduling hits at specific points within the loop. Both players read from the PatternStore on each loop iteration, which enables hot-swapping patterns during live modifications.
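The drum player isn’t shown in full above, so here is a minimal sketch of what offset-based scheduling can look like. The class and attribute names mirror the article, but the body is my reconstruction under stated assumptions; the real player also handles muting and percussive note_off messages:

```python
import threading
import time


class DrumLoopPlayer:
    """Sketch of an offset-based drum loop player (assumed reconstruction)."""

    def __init__(self, store, track_id, channel, loop_duration, send):
        self.store = store              # shared PatternStore
        self.track_id = track_id
        self.channel = channel
        self.loop_duration = loop_duration
        self.send = send                # e.g. wraps a mido output port
        self.stop_event = threading.Event()

    def _loop(self, pattern: list):
        while not self.stop_event.is_set():
            # Re-read the store each pass so live edits hot-swap in.
            current = self.store.get(self.track_id) or pattern
            start = time.time()
            for hit in sorted(current, key=lambda h: h["offset"]):
                target = start + hit["offset"]
                while not self.stop_event.is_set() and time.time() < target:
                    time.sleep(0.005)
                if self.stop_event.is_set():
                    return
                self.send("note_on", note=hit["note"],
                          velocity=hit["velocity"], channel=self.channel)
            # Idle until the 4-bar boundary, then start the loop over.
            while (not self.stop_event.is_set()
                   and time.time() < start + self.loop_duration):
                time.sleep(0.005)
```

Sleeping in small slices rather than one long `time.sleep` is what lets `stop_event` interrupt playback almost immediately.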
The Session
The Session class orchestrates everything. It manages the state machine (IDLE -> GENERATING -> PLAYING), owns the PatternStore, creates the agents, and handles MIDI input from the controller:
```python
class Session:
    def __init__(self):
        self.state = State.IDLE
        self.store = PatternStore()
        self.agent = create_agent(self.store)
        self.live_agent = create_live_agent(self.store)
        self._agent_busy = threading.Lock()
```
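The State values referenced here can be a plain enum. This is an assumed sketch of what state.py might contain, not the actual file:

```python
from enum import Enum, auto


class State(Enum):
    """Session lifecycle: IDLE -> GENERATING -> PLAYING."""
    IDLE = auto()
    GENERATING = auto()
    PLAYING = auto()
```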
When generation completes, playback starts with a progressive intro – tracks are unmuted one by one with a 2-bar delay between each, creating a build-up effect:
```python
def _start_playback(self):
    self.state = State.PLAYING
    for tid in TRACKS:
        self.players[tid].muted = True
        self.players[tid].start(patterns[tid])
    intro_delay = BAR_DURATION * 2
    for i, tid in enumerate(INTRO_ORDER):
        timer = threading.Timer(intro_delay * i, self._unmute_track, args=(tid,))
        timer.start()
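Working out the numbers: at 122 BPM a bar lasts 1.967 s, so each track enters 3.934 s after the previous one, and the eighth track joins about 27.5 s in. A quick check of that schedule:

```python
BPM = 122
BAR_DURATION = round((60 / BPM) * 4, 3)   # 1.967 s
intro_delay = BAR_DURATION * 2            # 3.934 s between entrances

# Timer delays for the 8 tracks, in seconds from playback start.
schedule = [round(intro_delay * i, 3) for i in range(8)]
```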
How It Works
- Run python cli.py play
- The app detects your MPK Mini Plus and shows a menu with AI-generated suggestions
- Select a suggestion or write your own prompt
- The AI generates 8 track patterns (takes a few seconds)
- Playback begins with a progressive build-up
- Write new instructions to modify the music live
- Use knobs K1-K8 to mute/unmute individual tracks
- PAD 1 regenerates with the same prompt, PAD 2 resets everything
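Purely as an illustration of how that controller input could be routed: the handler below inspects incoming mido messages and dispatches to session actions. The CC and note numbers depend on the MPK Mini Plus program you load, so every number here is an assumption, as are the session method names:

```python
# Hypothetical mapping: knob CCs 70-77 control tracks 1-8.
KNOB_CCS = {70 + i: track_id for i, track_id in enumerate(range(1, 9))}

PAD_REGENERATE = 36   # assumed pad note numbers
PAD_RESET = 37


def handle_message(session, msg):
    """Route one controller message (e.g. a mido.Message) to an action."""
    if msg.type == "control_change" and msg.control in KNOB_CCS:
        tid = KNOB_CCS[msg.control]
        # Treat the knob as a mute switch: left half mutes, right half unmutes.
        session.players[tid].muted = msg.value < 64
    elif msg.type == "note_on" and msg.note == PAD_REGENERATE:
        session.regenerate()
    elif msg.type == "note_on" and msg.note == PAD_RESET:
        session.reset()
```

The handler only needs `.type`, `.control`, `.value`, and `.note` attributes on the message, which is exactly what mido provides, so it stays easy to test without real hardware.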
Tech Stack
- Python 3.13 with Poetry
- Strands Agents for AI agent orchestration
- AWS Bedrock (Claude Sonnet + Haiku) for pattern generation
- mido + python-rtmidi for MIDI I/O
- Akai MPK Mini Plus as MIDI controller
- MPC Beats as the DAW/sound engine
- Rich for terminal UI
- Click for CLI
And that’s all. Full source code available on GitHub.