A Minor Memory
How I turned ten years of concerts into music notation
Cover view of the score-space and geographic mapping.
I've been to 40 concerts since 2016. I have photos and videos for most of them, tickets for many and setlists for some. For a long time they just sat in a cardboard box and my camera roll.
Spotify knows what I listened to. It doesn't know that I drove four hours to see the same band three times in a year, or that I went to three concerts in ten days in August 2016 and then nothing for months. It doesn't know who was there, or that the gap between 2020 and 2022 wasn't a choice. A concert history at scale contains patterns that only become visible when you lay it out and I wanted to see those patterns.
Finding a form
The first question was what a concert even is, in visual terms. A dot on a timeline? A pin on a map? Both are correct and yet they both felt inadequate. They reduce a concert to a single data point - when, or where - and everything else gets lost.
A musical score handles both. Time runs left to right, pitch runs top to bottom, groupings encode relationships between events. More than that, a score is designed to be performed. I started mapping concerts to staff positions by day of month: days 1-6 at the bottom, days 7-12, 13-18, and 19-24 in between, days 25-31 at the top. Those five positions map to five pitches: A3, C4, E4, G4, B4, the chord tones of an A minor 9. This made the visualisation playable and entirely data-driven.
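The day-to-pitch mapping is simple enough to sketch in a few lines. This is a minimal illustration, not the project's actual code; the function name is my own:

```javascript
// Day-of-month → pitch, following the five staff positions described above.
const CHORD_TONES = ["A3", "C4", "E4", "G4", "B4"]; // A minor 9 chord tones

function pitchForDay(day) {
  // Days 1-6 → A3, 7-12 → C4, 13-18 → E4, 19-24 → G4, 25-31 → B4
  const bucket = Math.min(Math.floor((day - 1) / 6), 4);
  return CHORD_TONES[bucket];
}

console.log(pitchForDay(3));  // "A3"
console.log(pitchForDay(31)); // "B4"
```

Because the last bucket spans seven days instead of six, the index is clamped at 4.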
Three versions
Version 1 (2019)
Version 1 used p5.js, Leaflet, and Mapbox. I drew a five-line staff and plotted concert dots on a map by hand, using techniques I remembered from my Grenzen project. Hover states didn't work properly and the map was static. The score and the map lived in completely separate sections with no connection between them.
Version 2 (2024)
Version 2 introduced d3. Hover effects worked and displayed core details of each concert. It included an abstract map centered on Salzburg with differently sized circles reflecting the number of visits and connections between cities. But the score and the map still lived in separate sections, and memorabilia appeared as a pile at the bottom of the page with no connection to the individual concerts.
Version 3 (2026)
Version 3 also started as a static music staff. I rebuilt the score as a flat canvas before facing the same problem as before: how do we display location? Is there any way to combine them?
This time, I had a lot more options. AI tools had changed what I could prototype quickly, and using Codex for implementation and Claude as a thinking partner to work out which tools to use meant I could explore ideas I hadn't previously considered due to technical limitations.
The answer came from treating time and geography as two views of the same data rather than two separate problems. The score already places concerts in time. Geography is just another layer.
With Codex's support, I brought my sketch to life and built the scene in Three.js with d3-geo for the map. Since the score was already mapped to real pitches, using Tone.js to make the piece play was the natural next step. Once the score and geography shared the same Three.js scene, the isometric view became possible: tilt the staff forward and Europe appears below it, with threads connecting each note to its city. From above, you read a score. Rotated, you read a map.
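To give a feel for how the threads connect notes to cities, here is a hedged sketch of projecting a city's latitude and longitude onto the flat map plane below the staff. The real scene uses d3-geo; this stand-in uses a plain equirectangular projection, and the bounds and plane size are invented for illustration:

```javascript
// Project lat/lon onto a flat map plane lying below the tilted staff.
// The bounds roughly frame Europe; both they and the plane size are assumptions.
function projectToMapPlane(lat, lon, mapWidth = 100, mapHeight = 60) {
  const bounds = { west: -10, east: 30, north: 60, south: 35 };
  const x = ((lon - bounds.west) / (bounds.east - bounds.west)) * mapWidth;
  const z = ((bounds.north - lat) / (bounds.north - bounds.south)) * mapHeight;
  return { x, z }; // y stays 0: the map is the floor of the scene
}

// A thread is then just a line from the note's staff position (y > 0)
// straight down to { x, 0, z } on the map plane.
const salzburg = projectToMapPlane(47.8, 13.05);
```

With both layers in one coordinate system, "tilt the staff forward" is only a camera move, not a data transformation.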
In this version I also expanded the encoding. Stem height represents a ticket price bracket. Soft coloured halos behind each note show who I went with: each person in the data set has a colour, and concerts with multiple people show blended halos. Concerts that happened within 3 days of each other get beams. Each concert has its own detail page, opened with a click, that holds physical tickets, photos, and that night's setlist; some are scanned from physical copies, others I screenshotted and converted to structured text with AI.
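The beam rule is worth making concrete. A rough sketch, assuming each concert carries a JavaScript `Date` (the grouping logic and example dates are mine, not the project's code):

```javascript
// Group consecutive concerts that are no more than three days apart.
function beamGroups(concerts) {
  const sorted = [...concerts].sort((a, b) => a.date - b.date);
  const groups = [];
  for (const concert of sorted) {
    const last = groups[groups.length - 1];
    const prev = last && last[last.length - 1];
    const gapDays = prev ? (concert.date - prev.date) / 86400000 : Infinity;
    if (gapDays <= 3) last.push(concert);
    else groups.push([concert]);
  }
  return groups;
}

const groups = beamGroups([
  { name: "A", date: new Date("2016-08-01") },
  { name: "B", date: new Date("2016-08-03") },
  { name: "C", date: new Date("2016-08-10") },
]);
// A and B beam together; C stands alone
```

Any group longer than one concert gets drawn with a shared beam, exactly like eighth notes in notation.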
The more encodings I defined, the more it started to feel like an actual piece of music. And since it was already a score, it should sound like one too... but more on that later.
The narrative layer
The staff and map show when and where but they don't show what those years actually felt like: who I went with, what the gaps meant, why a specific concert in July 2024 was different from every other one on the score. Without context, the data is just... data.
Using GSAP, I added a narrative layer, triggered by a play button, that walks you through the decade in 12 beats, each focused on a moment or "era". Some beats zoom and pan. Some show handwritten annotations in the margins.
Getting Codex to place those annotations correctly took longer than most other things in the project. Positioning elements in a Three.js scene with a moving camera quickly turns the conversation into: a little up... no, up on the Y axis, not the Z axis. Rather quickly I asked Codex to build two small tools: one where I can lock in the camera position for each beat, and one that lets me draw directly on my iPad via localhost and save the placement of doodle elements into the scene. Later, window.isometricBeatCamera.logCurrentView() became my best friend.
What didn't make the cut
The abstract map graphic from version 2 is gone, and I couldn't quite work out how to include both a geographic map and a frequency visualisation. The isometric view already does a lot: rotate, read the staff, follow the threads down to Europe. Adding a second geographic representation would have been too much.
In the current version, that frequency information is a number in the "? / Learn more" panel. Some data needs to be a chart, some data needs to be numbers.
The setlists went through something similar: I thought about pulling them from setlist.fm via an API, but that would have required an application process and I didn't want to wait for approval. I also considered creating and embedding Spotify playlists based on the setlists but the conditions were too restrictive.
And how do 40 concerts sound?
Getting the piece to "play" was easy. Press play and 40 notes play chronologically in their assigned pitch, producing perfectly boring audio that leaves a lot of data unrepresented. But who likes easy or boring?
Deciding which genre maps to which instrument is both easy and difficult. Naturally, metalcore concerts sound very different from k-pop concerts (well, mostly anyway), but how do you capture the vibe through a single instrument?
I tried Strudel and GarageBand. Strudel keeps you inside the data but makes experimentation slow. GarageBand lets you experiment freely but doesn't know what a CSV is. Instead, I described to Claude what I was aiming for but...
data sonification is hard. Especially if you want it to sound like something you'd actually listen to but you have no experience in making music and Claude has no ears.
| Data | Musical parameter | Range / logic |
|---|---|---|
| Day of month | Pitch | A minor 9 chord tones: days 1-6 → A3, 7-12 → C4, 13-18 → E4, 19-24 → G4, 25-31 → B4 |
| Setlist length | Note duration | Scaled per concert by setlist length |
| City longitude | Stereo pan | West = left, east = right |
| Social context | Echo | Solo, met-someone, and no-companion concerts add echo; group concerts are drier |
| Genre | Instrument layer | Genre-specific Synth/PluckSynth voice plays alongside the main melody |
| Concerts within 3 days | Beaming | Beamed, with a shorter gap between notes |
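The stereo-pan row, for example, is a one-line linear map. A sketch under assumed longitude bounds (the pan range of -1 to +1 matches the usual Web Audio convention; the bounds and function name are mine):

```javascript
// Longitude → stereo pan in [-1, 1]: west of the range pans hard left,
// east pans hard right, everything else lands proportionally in between.
function panForLongitude(lon, west = -10, east = 30) {
  const t = (lon - west) / (east - west);  // 0 = far west, 1 = far east
  const clamped = Math.min(Math.max(t, 0), 1);
  return clamped * 2 - 1;                  // -1 = left, +1 = right
}

console.log(panForLongitude(10)); // 0: centre of the assumed range
```

Clamping keeps an outlier city from producing an out-of-range pan value.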
Claude built me a little visualiser so it was easier to understand which sound correlates with which data point - try it out:
The more musical you make it, the easier it is to drift from the data. The stricter you keep the mapping, the less it sounds like real music. This version is the closest I've come to holding the balance between the two.
Forty concerts. One chord. For a long time it was just a box. Now it's a box and a score and a map and something that plays when you press a button.
Wanna see it live? Explore A Minor Memory