A real-time system that translates audio frequencies into color — mapping the physics of sound onto the psychology of color perception. Built with Flask, NumPy, SciPy, and SSE streaming.
Chromesthesia is a form of synesthesia in which sounds involuntarily trigger color experiences. This project translates that synesthetic phenomenon into software — using the physics of sound (frequency, amplitude) to algorithmically produce corresponding colors.
The mapping isn't arbitrary. Low bass frequencies map to deep reds and purples, mid-range tones map to greens and yellows, and high treble maps to blues and whites — following both human perceptual research and physical wavelength relationships between sound and light.
Sound waves have frequency (Hz). Light waves have wavelength (nm). The mapping between the two domains uses a logarithmic scale to match how humans perceive both — neither sound nor color is perceived linearly.
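The log-scale mapping described above can be sketched like this. The range constants and the function name are illustrative assumptions, not the project's actual identifiers: the audible band (20 Hz–20 kHz) is mapped logarithmically onto visible wavelengths (roughly 380–750 nm), inverted so bass lands on long-wavelength reds and treble on short-wavelength blues/violets.

```python
import numpy as np

# Assumed ranges for the sketch: audible band in Hz, visible band in nm.
AUDIO_LO, AUDIO_HI = 20.0, 20_000.0
WAVE_LO, WAVE_HI = 380.0, 750.0  # violet -> red

def freq_to_wavelength(freq_hz: float) -> float:
    """Map an audio frequency to a visible-light wavelength (nm).

    Uses a logarithmic position on the audio axis, matching how
    pitch is perceived; inverted so bass -> red, treble -> violet.
    """
    f = np.clip(freq_hz, AUDIO_LO, AUDIO_HI)
    # 0.0 at 20 Hz, 1.0 at 20 kHz, log-spaced in between
    t = (np.log10(f) - np.log10(AUDIO_LO)) / (np.log10(AUDIO_HI) - np.log10(AUDIO_LO))
    return WAVE_HI - t * (WAVE_HI - WAVE_LO)
```

With this shape, a 20 Hz rumble lands at 750 nm (deep red) and a 20 kHz tone at 380 nm (violet); converting the wavelength to an RGB value is a separate step.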
From microphone input to rendered color — the full data flow through the system.
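The analysis step of that flow — turning a frame of microphone samples into a single frequency — can be sketched with NumPy's FFT. The function name is a hypothetical stand-in for whatever the project calls this step:

```python
import numpy as np

def dominant_frequency(samples: np.ndarray, sample_rate: int) -> float:
    """Return the strongest frequency (Hz) in a mono audio frame.

    FFT the frame, find the peak-magnitude bin, and convert that
    bin index back into Hz.
    """
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    return float(freqs[np.argmax(spectrum)])

# Example: one second of a 440 Hz sine at 44.1 kHz
sr = 44_100
t = np.arange(sr) / sr
tone = np.sin(2 * np.pi * 440.0 * t)
```

Calling `dominant_frequency(tone, sr)` on the synthetic tone recovers 440 Hz; in the real pipeline this frequency would feed the color-mapping stage.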
The original plan used Flask-SocketIO for real-time color updates — but a dependency conflict on the target environment forced a rethink. SSE turned out to be the better choice.
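A minimal sketch of what the SSE approach looks like in Flask — the route name, event payload, and `color_events` generator are assumptions for illustration, not the project's actual code:

```python
# Minimal Flask SSE sketch: the server pushes one JSON color payload
# per event over a long-lived text/event-stream response.
import json
import time

from flask import Flask, Response

app = Flask(__name__)

def color_events():
    """Yield SSE-formatted frames; each "data:" line carries one color."""
    while True:
        color = {"hex": "#ff0000"}  # placeholder for real analysis output
        yield f"data: {json.dumps(color)}\n\n"
        time.sleep(0.1)

@app.route("/stream")
def stream():
    # text/event-stream tells the browser's EventSource to keep the
    # connection open and parse each incoming "data:" frame.
    return Response(color_events(), mimetype="text/event-stream")
```

Unlike WebSockets, this needs no extra server-side dependency beyond Flask itself, which is exactly what made it viable after the Flask-SocketIO conflict.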
SSE also pairs naturally with the browser's built-in EventSource API, so the client needs no extra library.

I enjoy working on projects that sit at the intersection of science, data, and visual design. Let's make something unusual.