Decomposing Sign Language Movements: A Multi-Band Visualization Method for Articulatory Analysis
Proceedings of the Fifteenth Language Resources and Evaluation Conference (LREC 2026)
Abstract
Understanding the structure of sign language movements requires methods that can isolate and analyze the hierarchical and simultaneous nature of sign articulation. We present a method for tracking and visualizing sign language movements that progressively isolates dependent movements within the articulatory chain: hand rotation from arm displacement, and finger movement from hand movement. Using MediaPipe hand tracking on ordinary 2D video, we decompose motion into separate gestural components and compute velocity and direction for each articulator. We display these movement channels in a time-aligned multi-band visualization that reveals temporal structure, bimanual synchronization patterns, and the coordination of different articulatory components. An interactive web-based viewer synchronizes the visualization with video, enabling researchers to efficiently explore movement patterns and their relationship to signing. We demonstrate the method with examples from isolated signs and continuous signing, showing how it reveals patterns that are difficult to observe in raw video, including bimanual coordination, internal movements, and the distinction between linguistic and non-linguistic segments. This approach provides accessible tools for empirical investigation of rhythmic and prosodic patterns in sign languages.
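The core idea of the decomposition can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes per-frame hand landmarks of the kind MediaPipe Hands produces (21 points per hand; index 0 is the wrist, 9 the middle-finger MCP, 8 the index fingertip), here supplied as plain NumPy arrays. Arm displacement is taken as wrist velocity, hand rotation as the change in orientation of the wrist-to-middle-MCP axis, and finger movement as fingertip displacement in a wrist-centered, rotation-normalized hand frame; the specific choice of reference axis is an assumption for this sketch.

```python
import numpy as np

# MediaPipe Hands landmark indices (0 = wrist, 9 = middle-finger MCP,
# 8 = index fingertip).
WRIST, MIDDLE_MCP, INDEX_TIP = 0, 9, 8

def hand_angle(frame):
    """Orientation of the wrist -> middle-MCP axis, in radians."""
    v = frame[MIDDLE_MCP] - frame[WRIST]
    return np.arctan2(v[1], v[0])

def to_hand_frame(frame):
    """Index fingertip position in the wrist-centred, rotation-normalised
    hand coordinate frame (removes arm displacement and hand rotation)."""
    a = hand_angle(frame)
    c, s = np.cos(-a), np.sin(-a)
    rot = np.array([[c, -s], [s, c]])
    return rot @ (frame[INDEX_TIP] - frame[WRIST])

def decompose(frame_a, frame_b):
    """Decompose the motion between two frames of 21 (x, y) landmarks into
    (arm displacement, hand rotation, residual finger motion)."""
    arm_v = frame_b[WRIST] - frame_a[WRIST]          # arm channel
    rot = hand_angle(frame_b) - hand_angle(frame_a)  # hand-rotation channel
    finger_v = to_hand_frame(frame_b) - to_hand_frame(frame_a)  # finger channel
    return arm_v, rot, finger_v
```

For example, a pure translation of the whole hand yields a nonzero arm channel but zero rotation and zero residual finger motion, which is exactly the isolation of dependent movements the method aims for; per-channel velocity magnitudes over a frame sequence would then feed the multi-band visualization.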