Spatial Music Compositor: 3D Audio Visualization Engine (2024)

An immersive spatial audio system that renders music in 3D space, allowing users to navigate through sound. Combines procedural visualization with binaural audio processing for VR/AR music experiences.

Spatial Audio · WebGL · Three.js · VR/AR · Procedural Graphics · Binaural Audio

Project Vision

Spatial Music Compositor reimagines music listening as a navigable 3D experience. Instead of passive stereo playback, users explore musical space—walking around instruments, diving into synth textures, experiencing rhythm as spatial choreography.

Core Concept

Traditional music playback presents all instruments collapsed into a 2D stereo field. Spatial Music Compositor:

  • Separates individual stems (vocals, drums, bass, synths) into discrete 3D objects
  • Positions them in virtual space based on musical characteristics
  • Enables listener movement through the sonic environment
  • Renders binaural audio for realistic spatial perception

Technical Architecture

Audio Processing Pipeline

Source Files → Stem Separation → Spatial Positioning → Binaural Rendering → Output
              (AI-powered)       (Procedural)         (HRTF Processing)
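The staged pipeline above can be modelled as composable functions. This is a minimal sketch, not the project's actual interfaces; the `Stage`, `Stem`, and `PositionedStem` names are illustrative assumptions:

```typescript
// Hypothetical stage types mirroring the arrow diagram above.
type Stage<I, O> = (input: I) => O;

interface Stem {
  name: string;
  samples: Float32Array;
}

interface PositionedStem extends Stem {
  position: [number, number, number];
}

// Compose two stages left-to-right, as in the diagram.
function pipe<A, B, C>(f: Stage<A, B>, g: Stage<B, C>): Stage<A, C> {
  return (x) => g(f(x));
}

// Stub spatialiser stage: attach a placeholder position to each stem.
const positionStems: Stage<Stem[], PositionedStem[]> = (stems) =>
  stems.map((s, i) => ({
    ...s,
    position: [i * 2, 0, 0] as [number, number, number],
  }));
```

Each real stage (stem separation, binaural rendering) would slot into the same `Stage` shape, which keeps the pipeline easy to rearrange or test in isolation.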

Key Technologies

  • Web Audio API: Panner nodes for 3D audio positioning
  • Three.js: 3D environment rendering and navigation
  • Unity3D: Real-time 3D development platform for spatial audio experiences
  • HRTF Processing: Head-related transfer function for realistic binaural audio
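For distance attenuation, the Web Audio `PannerNode` supports an inverse distance model; a sketch of that gain curve as a pure function (default parameter values assumed) looks like:

```typescript
// Gain for PannerNode's "inverse" distance model:
// gain = refDistance / (refDistance + rolloffFactor * (d - refDistance))
function inverseDistanceGain(
  distance: number,
  refDistance = 1,
  rolloffFactor = 1,
): number {
  // No gain boost for sources closer than the reference distance.
  const d = Math.max(distance, refDistance);
  return refDistance / (refDistance + rolloffFactor * (d - refDistance));
}
```

In the browser this is handled by the `PannerNode` itself; a standalone function like this is mainly useful for driving matching visual cues (e.g. fading a stem's geometry with its audio level).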

Visualization Engine

  • Procedural Graphics: Audio-reactive particle systems and geometry
  • Frequency Mapping: Visual representation of spectral content
  • Dynamic Camera: Automated cinematography based on musical structure
  • VR Integration: WebXR support for immersive experiences
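Frequency mapping reduces to translating FFT bins into frequencies and summarising bands; a minimal sketch (works with magnitude arrays such as those produced from an `AnalyserNode`, though the scaling of the input values is left generic here):

```typescript
// Centre frequency of an FFT bin.
function binToFrequency(bin: number, sampleRate: number, fftSize: number): number {
  return (bin * sampleRate) / fftSize;
}

// Average magnitude over the bins covering [lowHz, highHz).
function bandEnergy(
  magnitudes: Float32Array,
  lowHz: number,
  highHz: number,
  sampleRate: number,
  fftSize: number,
): number {
  const hzPerBin = sampleRate / fftSize;
  const lo = Math.floor(lowHz / hzPerBin);
  const hi = Math.min(magnitudes.length, Math.ceil(highHz / hzPerBin));
  let sum = 0;
  for (let i = lo; i < hi; i++) sum += magnitudes[i];
  return hi > lo ? sum / (hi - lo) : 0;
}
```

Per-band energies like these are what typically drive the particle systems and geometry described above, one band per visual element.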

Features

Intelligent Spatial Layout

Positions stems based on:

  • Frequency Content: Low frequencies close, highs distant
  • Energy Level: Louder elements larger in space
  • Rhythmic Density: Percussive elements arranged in rhythmic constellations
  • Harmonic Relationships: Chord tones clustered, tensions spread
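The layout heuristics above can be sketched as a pure placement function. The thresholds, units, and field names here are illustrative assumptions, not the project's tuned values:

```typescript
interface StemFeatures {
  centroidHz: number; // spectral centroid of the stem
  rms: number;        // normalised energy, 0..1
}

interface Placement {
  distance: number; // distance from the listener (arbitrary scene units)
  scale: number;    // visual size of the stem's geometry
}

// Low frequencies close, highs distant; louder stems rendered larger.
function placeStem(f: StemFeatures, maxDistance = 20): Placement {
  const t = Math.min(f.centroidHz / 10000, 1); // normalise centroid to 0..1
  return {
    distance: 1 + t * (maxDistance - 1),
    scale: 0.5 + f.rms * 1.5,
  };
}
```

A bass stem (low centroid, high RMS) lands close and large; a hi-hat stem (high centroid, low RMS) lands distant and small, matching the rules above.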

Navigation Modes

  1. Fly-Through: Free camera movement through the sound space
  2. Orbit: Circle around the musical center
  3. Path-Guided: Follow predetermined cinematic routes
  4. Performance Mode: Motion tracking for VR headset movement
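Orbit mode, for instance, reduces to placing the camera on a circle around the scene centre; a minimal sketch (parameter names are illustrative):

```typescript
// Camera position on a horizontal circle around the scene centre.
function orbitPosition(
  centre: [number, number, number],
  radius: number,
  angleRad: number,
  height = 0,
): [number, number, number] {
  return [
    centre[0] + radius * Math.cos(angleRad),
    centre[1] + height,
    centre[2] + radius * Math.sin(angleRad),
  ];
}
```

Advancing `angleRad` each frame (e.g. proportional to elapsed time or the track's tempo) produces the orbit; the other modes swap in different position functions over the same camera.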

Visual Themes

  • Particle Clouds: Swarms responding to frequency bands
  • Geometric Structures: Rotating polyhedra representing harmonic content
  • Organic Forms: Fluid simulations driven by audio amplitude
  • Abstract Landscapes: Procedural terrains mapped to spectral data

Use Cases

  1. Immersive Listening: New way to experience familiar music
  2. Music Education: Visualize musical structure and arrangement
  3. VR Concerts: Spatial presentation of live or recorded performances
  4. Composition Tool: Explore spatial mixing possibilities
  5. Music Therapy: Engaging visualization for therapeutic applications

Technical Challenges

  • Stem Separation Quality: AI models struggle with complex mixes
  • Spatial Coherence: Maintaining musical unity while spreading sources
  • Performance Optimization: Rendering complex 3D graphics + audio in browser
  • HRTF Personalization: Generic HRTFs don't work equally well for all listeners

Future Development

  • Real-Time Stem Generation: Live performance spatial mixing
  • Multi-User Spaces: Social listening in shared 3D environments
  • Haptic Feedback: Vibrotactile elements for bass and percussion
  • AI Choreography: Automated camera paths based on musical analysis
  • WebGPU Migration: GPU-accelerated graphics and audio processing

Tech Focus: Spatial audio, WebGL/Three.js, ML audio processing, VR/AR, procedural graphics, real-time visualization

Status: Experimental prototype, active R&D