Interacting with Sound and Space

Goals

Sound & Space aims to develop, evaluate, and apply new ways for humans to interact with immersive music and multimedia environments. This includes the design of expressive musical interfaces, spatial audio performance systems, and networked systems for enhanced interaction. We explore various technologies, including augmented reality, computer vision, haptic sensors, and web-based solutions, as creative tools. Through rigorous scientific experiments, we deliver new insights into Human-Computer Interaction for creative applications. As composers and performers, we use these novel technologies and systems to create our own artistic works. This interdisciplinary approach allows us to fully understand and leverage their creative potential.

Issues Involved or Addressed

This multidisciplinary team addresses creative, technological, and scientific issues in immersive music and multimedia interaction. The core question is how these domains can be combined in a way that produces meaningful insights rather than isolated results. The project provides a setting where music, technology, and scientific inquiry intersect, raising practical and conceptual challenges that cannot be resolved within a single discipline.

Technologically, the work spans a wide range of tools and methods. The team engages with spatial audio algorithms, augmented and virtual reality, networked systems for real-time interaction, computer vision pipelines, and haptic input devices. Each area introduces its own constraints: spatial audio requires attention to perception and spatial geometry; AR and VR depend on stable alignment between virtual and physical space; networked interaction raises latency, synchronization, and robustness issues; haptic sensing and computer vision depend on uncertain, noisy signals that must be interpreted sensibly. A recurring issue is selecting the appropriate technology for a given interaction concept, not merely implementing tools for their own sake. Team members are encouraged to propose new modes of interaction and to evaluate them critically.
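As a concrete illustration of the last point, the sketch below shows one common way to tame a noisy control stream before it is mapped to a synthesis parameter: exponential smoothing. It is a minimal Python example, not team code; the names smooth, alpha, and sensor_values are hypothetical.

    # Exponential smoothing of a noisy control signal (illustrative sketch).
    # alpha near 0 -> heavy smoothing; alpha near 1 -> little smoothing.
    def smooth(sensor_values, alpha=0.1):
        smoothed, estimate = [], None
        for x in sensor_values:
            estimate = x if estimate is None else alpha * x + (1 - alpha) * estimate
            smoothed.append(estimate)
        return smoothed

    # Example: a jittery reading hovering around 0.5 settles toward 0.5.
    import random
    noisy = [0.5 + random.uniform(-0.05, 0.05) for _ in range(200)]
    print(smooth(noisy)[-1])

The trade-off is latency versus stability: stronger smoothing removes jitter but makes an instrument feel less responsive, which is exactly the kind of tension the team examines in its user studies.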

Methodologically, the project focuses on usability and user-experience research in creative applications. These contexts present evaluation challenges: “expressiveness” or “playability” cannot be measured in the same way as productivity or precision. This work requires iterative development, comparative studies, and the refinement of methods suited for artistic and exploratory use cases. A long-term, continuing team structure is necessary for establishing research protocols, building shared datasets, and comparing results across different systems and interaction modes.

Finally, the project links technical development and empirical research with active creative practice. This connection is essential: interactive systems reveal their strengths, weaknesses, and implicit assumptions only when used in real artistic contexts. By designing technologies and then employing them in compositional or performative work, the team gains a deeper, practice-based understanding of the issues at hand. This combined approach—development, investigation, and artistic use—supports a more complete understanding of interactive music systems and the broader field of immersive media.

Methods and Technologies

  • AR & VR Development
  • Embedded Audio Systems
  • Qualitative HCI Research
  • Network Audio
  • Artistic Research
  • Linux Audio
  • DSP for Spatial Audio
  • DSP for Sound Synthesis

Majors Sought

Computing: Computational Media, Human-Centered Computing, Human-Computer Interaction

Design: Music Technology

Engineering: Electrical Engineering

Preferred Interests and Preparation

N.A.

Advisors

Henrik von Coler
hvc@gatech.edu

Henry Windish
hwindish@gatech.edu

Tristan Peng
Music Technology
tp@gatech.edu

Orlando Kenny
Music Technology
okenny3@gatech.edu

Day, Time & Location

Full Team Meeting:
11:00-11:50 Friday
Couch 204 / Rich 133

Subteam meetings will be scheduled after classes begin.