Author_Institution :
Phys. Dept., South China Univ. of Technol., Guangzhou, China
Abstract :
Spatial interpolation of head-related transfer functions (HRTFs) is needed to reconstruct the spatially continuous HRTF and thus to render virtual sound sources in virtual auditory displays. Based on an artificial neural network with radial basis functions, this paper proposes a nonlinear interpolation method for HRTFs. The performance of the proposed method was validated on a high-resolution HRTF database with a directional resolution of 1°. Computational results indicate that the mean signal distortion ratio is 50.8 dB, 41.7 dB, 36.1 dB, 32.1 dB, 28.8 dB, 20.4 dB, and 16.9 dB for azimuthal intervals of 2°, 4°, 6°, 8°, 10°, 20°, and 30°, respectively. Moreover, interpolation performance is better for ipsilateral HRTFs than for contralateral HRTFs.
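As a rough sketch of the underlying idea (not the authors' implementation), the snippet below interpolates HRTF magnitude spectra across azimuth with a Gaussian radial-basis-function network whose output weights come from a single linear solve, and scores the result with a signal-to-distortion ratio of the form 10·log10(reference energy / error energy). The function names, the kernel width `sigma`, the circular-distance handling, and the placeholder data are assumptions made only for illustration.

```python
import numpy as np

def rbf_interpolate_hrtf(az_train, hrtf_train, az_query, sigma=10.0):
    """Interpolate HRTF magnitude spectra over azimuth with a Gaussian-RBF network.

    az_train   : (N,) measured azimuths in degrees
    hrtf_train : (N, K) magnitude spectra (K frequency bins) at those azimuths
    az_query   : (M,) azimuths at which to interpolate
    sigma      : Gaussian kernel width in degrees (hypothetical default)
    """
    def circ_dist(a, b):
        # Circular distance so that 359 deg and 1 deg are treated as 2 deg apart.
        d = np.abs(np.asarray(a)[:, None] - np.asarray(b)[None, :]) % 360.0
        return np.minimum(d, 360.0 - d)

    def gauss(d):
        return np.exp(-(d / sigma) ** 2)

    Phi = gauss(circ_dist(az_train, az_train))       # (N, N) hidden-layer activations
    W = np.linalg.solve(Phi, hrtf_train)              # (N, K) output weights, one set per frequency bin
    return gauss(circ_dist(az_query, az_train)) @ W   # (M, K) interpolated spectra


def signal_to_distortion_ratio(h_ref, h_est):
    # Energy of the reference over energy of the interpolation error, in dB.
    err = np.asarray(h_ref) - np.asarray(h_est)
    return 10.0 * np.log10(np.sum(np.abs(h_ref) ** 2) / np.sum(np.abs(err) ** 2))


# Illustrative use: drop every other measured direction, interpolate it back, and score it.
az = np.arange(0, 360, 2.0)           # hypothetical 2-degree azimuth grid
hrtf = np.random.rand(az.size, 128)   # placeholder spectra, stand-in for measured data
est = rbf_interpolate_hrtf(az[::2], hrtf[::2], az[1::2])
print(signal_to_distortion_ratio(hrtf[1::2], est))
```

The linear solve for the output weights mirrors the usual RBF-network training step; the paper's actual network, kernel placement, and error measure may differ in detail.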
Keywords :
acoustic generators; acoustic signal processing; auditory displays; interpolation; radial basis function networks; transfer functions; artificial neural network; azimuthal intervals; head-related transfer function interpolation; high-resolution HRTF database; mean signal distortion ratio; nonlinear interpolation method; radial basis functions; spatial interpolation; spatially continuous HRTF function reconstruction; virtual auditory displays; virtual sound sources; Artificial neural networks; Azimuth; Databases; Ear; Interpolation; Transfer functions; head-related transfer function; neural network; spatial interpolation; virtual auditory display;