CrateBot analyzes your music library with machine learning, generating intelligent tags and evocative descriptions that help you find the perfect track in the moment.
You know that track. The one with the hypnotic bassline that works perfectly after a Moodymann record. But what's it called? Who made it? You bought it three years ago and played it twice.
As collections grow into the thousands, human memory fails. You can't remember every track's energy, mood, or the one detail that makes it special. Genre alone doesn't cut it—you need to know how a track feels, when to play it, and what makes it memorable.
Squinting at track names on CDJ displays while the crowd waits. Artist and title tell you nothing about energy or vibe.
That track worked perfectly at 3am in a sweaty basement. But you forgot to tag it, and now you can't find it when you need it.
Creating folders for every mood × energy × genre combination is a full-time job. And tracks belong in multiple categories anyway.
GOSPEL CHOIR SUNRISE ANTHEM
"music is the answer to your problems"
sweating cathedral
Tags that capture how tracks feel—not just what they're called. Evocative descriptions designed for instant recognition, like album art in text form.
The best DJs don't just pick tracks—they navigate energy. Each song is a decision point: go harder or pull back? Keep the vibe or shift the mood?
With proper tags, your library becomes a map. Filter by energy level, mood, and timing to see only tracks that fit this exact moment. No more scrolling through thousands of files hoping something catches your eye. Search "dark + rising + 3am" and let the right tracks surface.
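Under the hood, that kind of query is just all-tags-must-match filtering. A minimal sketch, with an illustrative toy library and a hypothetical `search` helper (not CrateBot's actual API):

```python
# Sketch of multi-tag search: a query like "dark + rising + 3am"
# returns only tracks carrying every requested tag.
# The library data and query syntax here are illustrative assumptions.
library = {
    "Track A": {"dark", "rising", "3am"},
    "Track B": {"dark", "euphoric", "peak"},
    "Track C": {"dark", "rising", "3am", "vocal"},
}

def search(query):
    """All-tags-must-match search over the tagged library."""
    wanted = set(query.split(" + "))
    return sorted(name for name, tags in library.items() if wanted <= tags)

print(search("dark + rising + 3am"))  # -> ['Track A', 'Track C']
```

Every tag in the query must be present, so adding a tag narrows the result set rather than widening it.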
CrateBot tags every track automatically—energy, mood, vibe, memorable hooks—so you can focus on reading the room, not managing metadata.
Three steps to smarter crates, powered by five audio analysis engines.
Point CrateBot at your already-tagged library. It extracts 135+ audio features and learns your classification patterns.
Batch process new tracks. CrateBot predicts genre, mood, timing, and descriptive tags—plus generates creative vibes.
Review predictions with audio playback. Make corrections, train again. Your model improves with each iteration.
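The three steps above form a standard learn/predict/correct loop. As a minimal sketch, here is that cycle as nearest-neighbour matching over audio feature vectors; the feature values, tag labels, and 1-NN model are illustrative assumptions, not CrateBot's actual pipeline:

```python
# Illustrative learn/predict/correct loop (hypothetical features and
# tags; stands in for CrateBot's real model and 135+ features).
import math

# Step 1: learn from an already-tagged library.
# Each feature vector stands in for extracted audio descriptors
# (e.g. tempo, spectral centroid, energy).
tagged = [
    ([120.0, 0.42, 0.88], "deep"),
    ([174.0, 0.71, 0.95], "peak-time"),
    ([98.0, 0.20, 0.35], "ambient"),
]

def predict_mood(features):
    """Return the tag of the closest already-tagged track (1-NN)."""
    return min(tagged, key=lambda t: math.dist(features, t[0]))[1]

# Step 2: predict a tag for a new, unseen track.
print(predict_mood([122.0, 0.45, 0.85]))  # -> deep

# Step 3: a reviewed correction becomes training data for the next round.
tagged.append(([122.0, 0.45, 0.85], "deep"))
```

The key property is the feedback loop in step 3: every correction enlarges the tagged set, so the next training round has more of your own judgment to learn from.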
Built on research from Google, Barcelona's Music Technology Group, and LAION-AI—trained on millions of audio samples and 55,000+ tagged tracks to understand music the way you hear it.
CrateBot handles the tedious work of track administration so you can focus on what you do best—reading the room, building a vibe, and creating a moment.
Genre, Timing, Mood, and Descriptive fields. Each track gets multiple tags per field for nuanced classification that matches how you think.
Automatically finds the memorable vocal phrase. "Music is the answer" beats "Track 04" when you're scanning a CDJ screen mid-set.
Tag thousands of tracks while you sleep. Checkpoint recovery means you can stop and resume without losing progress.
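Checkpoint recovery can be sketched simply: persist progress after every track so a restart skips work already done. The checkpoint filename, format, and `tag_fn` hook below are hypothetical, not CrateBot's actual on-disk format:

```python
# Minimal checkpoint-recovery sketch for a long tagging batch
# (hypothetical checkpoint file and tagging callback).
import json
import os

CHECKPOINT = "tag_progress.json"

def load_done():
    """Read the set of already-tagged tracks, if a checkpoint exists."""
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT) as f:
            return set(json.load(f))
    return set()

def tag_batch(tracks, tag_fn):
    """Tag each track once, checkpointing after every file."""
    done = load_done()
    for track in tracks:
        if track in done:
            continue  # already tagged before the interruption
        tag_fn(track)
        done.add(track)
        # Persist after every track so a crash loses at most one.
        with open(CHECKPOINT, "w") as f:
            json.dump(sorted(done), f)
    return done

tagged = tag_batch(["a.flac", "b.flac"], lambda track: None)
```

Because the checkpoint is rewritten after each file, stopping mid-batch and rerunning the same command resumes from the first untagged track.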
Inline audio playback lets you verify predictions. Corrections improve the model and sync back to your files.
Tags sync directly to your audio files. See them in Rekordbox, Traktor, Serato—or right on the CDJ display.
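Since most DJ software and CDJ screens show flat text fields, multi-value tags have to be packed into a single display string per field. A sketch of that flattening, where the field names and the " / " separator are assumptions for illustration:

```python
# Sketch of packing multi-value tags into the flat text fields that
# DJ software displays (field names and separator are assumptions).
def pack_tags(tags):
    """Join each field's tag list into one display string per field."""
    return {field: " / ".join(values) for field, values in tags.items()}

comment_fields = pack_tags({
    "mood": ["dark", "hypnotic"],
    "timing": ["3am", "closing"],
})
print(comment_fields["mood"])  # -> dark / hypnotic
```

The packed strings can then be written into standard metadata fields (comment, grouping, genre) that Rekordbox, Traktor, and Serato already read.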
New tracks get tagged automatically when you add them. Your library stays organized without constant maintenance.
CrateBot is in early beta. Tell us about your library and we'll get you set up.