Music isn’t just something you hear. When bass frequencies hit, you feel them in your chest, your bones, your entire body. This isn’t metaphorical - it’s physics meeting biology in ways that reshape what we mean by “listening.”
The Haptic Dimension
Your cochlea captures sound waves and translates them into neural signals. That’s the textbook version of hearing. But low frequencies - below around 250 Hz - do something else entirely. They vibrate your ribcage, your diaphragm, the fluid in your inner ear’s vestibular system. You’re not just hearing the bass. You’re feeling it proprioceptively, the same way you feel your body’s position in space.
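To make the frequency claim concrete: the sub-250 Hz band responsible for this felt component can be isolated from a recording with a simple low-pass operation. A minimal Python sketch (the 250 Hz cutoff is the essay's approximate figure; the brick-wall FFT filter is an illustration, not production audio practice):

```python
import numpy as np

SR = 44_100      # sample rate in Hz
CUTOFF = 250.0   # approximate upper edge of the "felt" bass band

def felt_band(signal: np.ndarray, sr: int = SR, cutoff: float = CUTOFF) -> np.ndarray:
    """Keep only the low-frequency content a listener would feel as vibration."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1 / sr)
    spectrum[freqs > cutoff] = 0          # zero everything above the cutoff
    return np.fft.irfft(spectrum, n=len(signal))

# One second of audio: a 60 Hz bass tone survives the filter intact,
# while a 1 kHz tone is removed almost entirely.
t = np.arange(SR) / SR
bass = np.sin(2 * np.pi * 60 * t)
mid = np.sin(2 * np.pi * 1000 * t)
```

Everything below the cutoff passes through unchanged, which is the point: the "felt" part of a mix is an ordinary, separable slice of the signal, even though headphones can't deliver it to the body.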
This is why bass in headphones, no matter how good, never quite matches standing in front of a speaker cabinet. The cochlear response might be identical, but you’re missing the visceral, full-body component. The music is incomplete.
When Furniture Sings
Julie Freeman’s project Sonaforms® at shapedsound.com embeds transducers in stylish wooden forms so the furniture itself becomes the medium of vibration. People feel the music through contact, not just air pressure.
Evelyn Glennie, the deaf percussionist, doesn’t “hear” in the conventional sense. She feels vibrations through her feet, her hands, bone conduction through her skull. She’s not doing something extraordinary - she’s just making explicit what all of us do when we encounter low frequencies. Music was always tactile. We just forgot to notice.
What Clubs Know
This is why clubs are loud. Not because of poor acoustic design or indifference to hearing safety (though both play a role), but because music as a physical experience requires volume. The threshold where sound becomes a haptic experience sits somewhere around 85-90 dB for most frequencies. Below that, you’re just hearing. Above it, you’re also feeling.
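As rough arithmetic on those numbers: dB SPL is logarithmic in pressure, so the step from conversational loudness (around 60 dB, a common reference figure, not from the essay) up to the ~90 dB haptic threshold is a much bigger physical jump than the digits suggest. A quick sketch:

```python
import math

P_REF = 20e-6  # reference pressure in pascals (approximate threshold of hearing)

def spl_to_pressure(db: float) -> float:
    """Convert a sound pressure level in dB SPL to RMS pressure in pascals."""
    return P_REF * 10 ** (db / 20)

club = spl_to_pressure(90)    # ~0.63 Pa
speech = spl_to_pressure(60)  # ~0.02 Pa
ratio = club / speech         # ~32x the pressure amplitude
```

A 30 dB difference is roughly a 32-fold increase in pressure amplitude, which is why the same bass line that merely registers at speech level starts to shake a ribcage at club level.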
The collective experience of a dancefloor - bodies moving in sync, locked into a shared rhythm - isn’t just auditory synchronization. It’s proprioceptive. Everyone’s ribcage is vibrating at the same frequency. The bass doesn’t just create the beat. It creates a commons of physical sensation.
What Gets Forgotten
When we reduce music to audio files, to waveforms that tickle the cochlea, we’re stripping away half the experience. Current AI can analyze those waveforms, classify them, even generate new ones. But it has no ribcage. No sense of what it means for a frequency to resonate through bone and tissue.
This isn’t a failing of the technology. It’s a reminder of what we’re measuring when we measure music. We’re not capturing the full phenomenon. We’re capturing one dimension of a multisensory experience, then acting surprised when something essential is missing.
The bass you feel in your chest isn’t separable from the bass you hear. They’re the same event, experienced through different sensory channels. Strip one away, and you haven’t preserved the music in another form. You’ve created something else entirely.