DocumentCode
1483060
Title
A New Framework for Underdetermined Speech Extraction Using Mixture of Beamformers
Author
Dmour, Mohammad A. ; Davies, Mike
Author_Institution
Institute for Digital Communications (IDCOM), University of Edinburgh, Edinburgh, UK
Volume
19
Issue
3
fYear
2011
fDate
March 1, 2011
Firstpage
445
Lastpage
457
Abstract
This paper describes a frequency-domain nonlinear mixture of beamformers that can extract a speech source arriving from a known direction when there are fewer microphones than sources (the underdetermined case). Our approach models the data in each frequency bin with a Gaussian mixture distribution whose parameters are learned using the expectation-maximization algorithm. Model learning uses only the observed mixture signals, and no prior training is required. Nonlinear beamformers are then derived from this model: the proposed estimators are nonlinear weighted sums of linear minimum mean square error or minimum variance distortionless response beamformers. The resulting nonlinear beamformers need neither knowledge nor an estimate of the number of sources, and can be applied to microphone arrays with two or more microphones. We test and evaluate the described methods on underdetermined speech mixtures.
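The recipe the abstract outlines, fit a per-frequency-bin Gaussian mixture to the observed microphone signals with EM, build one MVDR beamformer per mixture component toward the known target direction, then combine the linear beamformer outputs weighted by the component posteriors, can be sketched as below. This is a minimal illustrative sketch for a single frequency bin: the 2-microphone setup, the synthetic data, and all variable names are assumptions, not the paper's actual implementation or experiments.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 2 microphones, one frequency bin, T STFT frames.
M, T, K = 2, 500, 2                      # mics, frames, GMM components
d = np.array([1.0, np.exp(-1j * 0.6)])   # steering vector for the known target direction

# Synthetic stand-in for an observed speech mixture: frames drawn from
# K zero-mean circular complex Gaussians with random covariances.
A = rng.standard_normal((K, M, M)) + 1j * rng.standard_normal((K, M, M))
covs_true = np.array([a @ a.conj().T + 0.1 * np.eye(M) for a in A])
labels = rng.integers(0, K, T)
X = np.empty((T, M), dtype=complex)
for t in range(T):
    L = np.linalg.cholesky(covs_true[labels[t]])
    X[t] = L @ (rng.standard_normal(M) + 1j * rng.standard_normal(M)) / np.sqrt(2)

def em_complex_gmm(X, K, iters=30):
    """EM for a zero-mean complex GMM on the frames of one frequency bin."""
    T, M = X.shape
    w = np.full(K, 1.0 / K)
    # Crude initialization: scaled copies of the global covariance.
    R = np.array([np.cov(X.T) * (1.0 + 0.5 * k) + 1e-6 * np.eye(M) for k in range(K)])
    for _ in range(iters):
        # E-step: log-density of each frame under each component.
        logp = np.empty((T, K))
        for k in range(K):
            Rinv = np.linalg.inv(R[k])
            quad = np.einsum('ti,ij,tj->t', X.conj(), Rinv, X).real
            logp[:, k] = (np.log(w[k]) - quad
                          - np.log(np.linalg.det(R[k]).real) - M * np.log(np.pi))
        logp -= logp.max(axis=1, keepdims=True)
        post = np.exp(logp)
        post /= post.sum(axis=1, keepdims=True)
        # M-step: update mixture weights and component covariances.
        Nk = post.sum(axis=0)
        w = Nk / T
        for k in range(K):
            outer = np.einsum('ti,tj->tij', X, X.conj())
            R[k] = (post[:, k, None, None] * outer).sum(axis=0) / Nk[k]
    return w, R, post

w, R, post = em_complex_gmm(X, K)

# One MVDR beamformer per GMM component: w_k = R_k^{-1} d / (d^H R_k^{-1} d).
W = np.empty((K, M), dtype=complex)
for k in range(K):
    Rinv_d = np.linalg.solve(R[k], d)
    W[k] = Rinv_d / (d.conj() @ Rinv_d)

# Nonlinear mixture of beamformers: the K linear outputs w_k^H x(t),
# weighted frame-by-frame by the component posteriors.
Y = np.einsum('tk,km,tm->t', post, W.conj(), X)
```

Each component beamformer satisfies the distortionless constraint `w_k^H d = 1` toward the target direction, so the posterior weighting switches among linear beamformers rather than distorting the target response; no knowledge of the number of sources enters the construction, matching the abstract's claim.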
Keywords
Gaussian distribution; array signal processing; expectation-maximization algorithm; feature extraction; mean square error methods; microphone arrays; speech processing; Gaussian mixture; linear minimum mean square error; minimum variance distortionless response beamformers; underdetermined speech extraction; data mining; frequency; noise reduction; nonlinear distortion; signal processing; speech enhancement; testing; beamforming; Gaussian mixture model (GMM); speech extraction; speech separation; underdetermined
fLanguage
English
Journal_Title
IEEE Transactions on Audio, Speech, and Language Processing
Publisher
IEEE
ISSN
1558-7916
Type
jour
DOI
10.1109/TASL.2010.2049514
Filename
5457967
Link To Document