Have you ever imagined playing air guitar with just finger gestures and hand motions, with no strings and no grip strength required?
AIR (Augmented Instrumental Reality) is a wearable system that reimagines the guitar as lightweight, portable, and accessible to more people. Traditional guitars present significant barriers to entry: they require considerable finger strength and dexterity, demand precise hand positioning, and their bulky form factor makes them challenging to transport. For individuals with conditions that affect hand strength, such as muscular dystrophy, these physical demands can make guitar playing impossible.
What started as a simple computer vision system with a wired glove evolved substantially through iterative development. Early testing revealed that velocity-based strum detection produced inconsistent results, leading us to switch to angular and linear acceleration measurements, which proved far more reliable. The initial wired connection became a low-latency Bluetooth link, eliminating tethering constraints while maintaining responsiveness. Through team discussions, professor guidance, and particularly through collaboration with CMU’s School of Music, the system expanded beyond basic chord playback to include an interactive game mode for learning pre-programmed songs, as well as multiple timbres (piano, guitar, and vocal) that give users expressive options to match their musical vision.
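To make the detection change concrete, here is a minimal sketch of what threshold-based strum detection on angular and linear acceleration could look like. The StrumDetector class, threshold values, and cooldown period are illustrative assumptions, not the team's actual implementation:

```python
import math

# Hypothetical thresholds; real values would be tuned on recorded IMU data.
ANGULAR_THRESH = 3.5   # rad/s^2: angular acceleration magnitude
LINEAR_THRESH = 12.0   # m/s^2:  linear acceleration magnitude
COOLDOWN_S = 0.15      # refractory period so one strum fires only once


class StrumDetector:
    """Fires a strum event when angular or linear acceleration spikes
    past a threshold, with a short cooldown to debounce the gesture."""

    def __init__(self):
        self.last_strum_t = -math.inf

    def update(self, t, lin_acc, ang_acc):
        """t: timestamp in seconds; lin_acc, ang_acc: (x, y, z) tuples."""
        lin_mag = math.sqrt(sum(a * a for a in lin_acc))
        ang_mag = math.sqrt(sum(a * a for a in ang_acc))
        # Ignore samples inside the cooldown window after a strum.
        if t - self.last_strum_t < COOLDOWN_S:
            return False
        if lin_mag > LINEAR_THRESH or ang_mag > ANGULAR_THRESH:
            self.last_strum_t = t
            return True
        return False
```

The intuition behind the switch is that thresholding on acceleration magnitude registers the sharp "flick" of the wrist itself, so detection does not depend on the hand covering a fixed distance at speed the way a velocity-based scheme does.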
Achieving 61 ms end-to-end latency and exceeding 95% accuracy for gesture recognition, AIR demonstrates that assistive technology can deliver real-time musical performance comparable to traditional instruments.
