Violin Note Spectrum Detection Based on a Multi-Fundamental Frequency Estimation Algorithm
Issued Date
2026-01-01
eISSN
21693536
Scopus ID
2-s2.0-105031963498
Journal Title
IEEE Access
Rights Holder(s)
SCOPUS
Bibliographic Citation
IEEE Access (2026)
Suggested Citation
Zhang F., Li Z. Violin Note Spectrum Detection Based on a Multi-Fundamental Frequency Estimation Algorithm. IEEE Access (2026). doi:10.1109/ACCESS.2026.3669557 Retrieved from: https://repository.li.mahidol.ac.th/handle/123456789/115669
Title
Violin Note Spectrum Detection Based on a Multi-Fundamental Frequency Estimation Algorithm
Abstract
Traditional fundamental frequency estimation methods often exhibit misjudgments when processing polyphonic violin music due to harmonic overlaps, which significantly limit the accuracy of automatic music transcription systems. To address this challenge, this paper proposes a method for detecting violin note spectra based on a multi-fundamental frequency estimation algorithm. First, to mitigate the interference of fundamental information in the cepstrum, a note-corrected reverse-banding process is introduced. This approach enhances cepstral peaks while suppressing high-frequency noise, thereby improving the accuracy of fundamental period recognition. Second, a multi-resolution rapid time–frequency analysis method (RTFI) is employed for harmonic extraction, effectively separating overlapping harmonic components and improving the precision of fundamental frequency estimation. Finally, given the temporal variability of note spectral features, a single-frame multi-fundamental-frequency phased-estimation method is developed. This method separately estimates the transient and steady-state stages of each note, further enhancing the accuracy of multi-fundamental frequency estimation. Experimental results demonstrate that, across tests involving one to nine notes, the proposed algorithm outperforms existing approaches such as HPS, ISSA, and JEA in terms of recall, precision, and F-measure metrics. Notably, under the complex scenario of six simultaneous notes, the proposed algorithm achieves an F-measure of 94%, significantly exceeding those of the comparison methods. In addition, the proposed method shows superior performance in fundamental frequency count estimation and note label recognition, with a markedly lower total error rate.
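For context on the baselines named in the abstract, the harmonic product spectrum (HPS) is the simplest of the comparison methods: it multiplies the magnitude spectrum by downsampled copies of itself so that only a true fundamental, whose harmonics align under decimation, survives as a peak. The sketch below is a minimal, generic HPS single-F0 estimator for illustration only; it does not reproduce the paper's cepstral reverse-banding, RTFI harmonic extraction, or phased multi-F0 estimation, and the function name and parameters are our own.

```python
import numpy as np

def hps_f0(signal, sr, n_harmonics=3):
    """Estimate a single fundamental frequency with a basic
    harmonic product spectrum (one of the baselines, not the
    paper's proposed multi-F0 algorithm)."""
    # Magnitude spectrum of the Hann-windowed frame.
    windowed = signal * np.hanning(len(signal))
    mag = np.abs(np.fft.rfft(windowed))

    # Harmonic product spectrum: multiply the spectrum by its
    # r-fold downsampled copies; harmonics of the true F0 line up
    # at the fundamental bin and reinforce each other.
    hps = mag.copy()
    for r in range(2, n_harmonics + 1):
        dec = mag[::r]
        hps[:len(dec)] *= dec

    # Skip near-DC bins, search only where all products were formed,
    # then convert the winning bin index back to Hz.
    lo = int(50 * len(signal) / sr)  # ignore bins below 50 Hz
    hi = len(mag) // n_harmonics
    peak = lo + np.argmax(hps[lo:hi])
    return peak * sr / len(signal)

# Example: a synthetic violin-like tone, 440 Hz with two harmonics.
sr = 44100
t = np.arange(0, 0.1, 1 / sr)
tone = sum(np.sin(2 * np.pi * 440 * k * t) / k for k in (1, 2, 3))
f0 = hps_f0(tone, sr)  # close to 440 Hz
```

As the abstract notes, plain HPS of this kind breaks down under polyphony, since overlapping harmonics from concurrent notes reinforce spurious candidates; that failure mode is what the proposed multi-fundamental estimation addresses.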
