Evaluating High-Resolution Piano Sustain Pedal Depth Estimation with Musically Informed Metrics
By: Hanwen Zhang, Kun Fang, Ziyu Wang, and more
Potential Business Impact:
Helps music software understand piano pedal use better.
Evaluation of continuous piano pedal depth estimation remains incomplete when it relies only on conventional frame-level metrics, which overlook musically important features such as direction-change boundaries and pedal curve contours. To provide more interpretable and musically meaningful insights, we propose an evaluation framework that augments standard frame-level metrics with an action-level assessment, which measures direction and timing over segments of press/hold/release states, and a gesture-level analysis, which evaluates the contour similarity of each press-release cycle. We apply this framework to compare an audio-only baseline with two variants: one incorporating symbolic information from MIDI, and another trained in a binary-valued setting, all within a unified architecture. Results show that the MIDI-informed model significantly outperforms the others at the action and gesture levels, despite only modest frame-level gains. These findings demonstrate that our framework captures musically relevant improvements invisible to traditional metrics, offering a more practical and effective approach to evaluating pedal depth estimation models.
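To make the two proposed levels concrete, here is a minimal sketch of how such metrics could be computed from a frame-wise pedal depth curve. This is an illustration under assumptions, not the authors' implementation: the function names (`segment_states`, `contour_similarity`), the derivative threshold `eps`, and the use of Pearson correlation for contour similarity are all hypothetical choices standing in for the paper's actual definitions.

```python
import numpy as np

def segment_states(depth, eps=0.01):
    """Label each frame-to-frame transition as press / hold / release
    from the sign of the first difference of the depth curve.
    `eps` is a hypothetical dead-zone threshold for the 'hold' state."""
    diff = np.diff(np.asarray(depth, dtype=float))
    return np.where(diff > eps, "press",
                    np.where(diff < -eps, "release", "hold"))

def contour_similarity(pred, ref, n=50):
    """Gesture-level sketch: resample predicted and reference contours of
    one press-release cycle to a common length, then return their Pearson
    correlation as a shape-similarity score."""
    x = np.linspace(0.0, 1.0, n)
    p = np.interp(x, np.linspace(0.0, 1.0, len(pred)), pred)
    r = np.interp(x, np.linspace(0.0, 1.0, len(ref)), ref)
    if p.std() == 0.0 or r.std() == 0.0:
        # A flat contour has no shape to correlate against.
        return 0.0
    return float(np.corrcoef(p, r)[0, 1])
```

An action-level score could then compare the press/hold/release segment boundaries of the predicted curve against the ground truth (e.g. boundary timing tolerance), while the gesture-level score averages `contour_similarity` over all detected press-release cycles.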
Similar Papers
High-Resolution Sustain Pedal Depth Estimation from Piano Audio Across Room Acoustics
Sound
Measures how deeply the piano pedal is pressed, from audio alone.
Joint Estimation of Piano Dynamics and Metrical Structure with a Multi-task Multi-Scale Network
Audio and Speech Processing
Helps computers understand piano music's loudness.
Streaming Piano Transcription Based on Consistent Onset and Offset Decoding with Sustain Pedal Detection
Sound
Turns music into notes as it plays.