DocumentCode :
25820
Title :
Hypothesis Testing in Feedforward Networks With Broadcast Failures
Author :
Zhang, Zhenliang ; Chong, Edwin K. P. ; Pezeshki, Ali ; Moran, William
Author_Institution :
Dept. of Electr. & Comput. Eng., Colorado State Univ., Fort Collins, CO, USA
Volume :
7
Issue :
5
fYear :
2013
fDate :
Oct. 2013
Firstpage :
797
Lastpage :
810
Abstract :
Consider a large number of nodes, which sequentially make decisions between two given hypotheses. Each node takes a measurement of the underlying truth, observes the decisions from some immediate predecessors, and makes a decision between the given hypotheses. We consider two classes of broadcast failures: 1) each node broadcasts a decision to the other nodes, subject to random erasure in the form of a binary erasure channel; 2) each node broadcasts a randomly flipped decision to the other nodes in the form of a binary symmetric channel. We are interested in conditions under which there does (or does not) exist a decision strategy consisting of a sequence of likelihood ratio tests such that the node decisions converge in probability to the underlying truth, as the number of nodes goes to infinity. In both cases, we show that if each node only learns from a bounded number of immediate predecessors, then there does not exist a decision strategy such that the decisions converge in probability to the underlying truth. However, in case 1, we show that if each node learns from an unboundedly growing number of predecessors, then there exists a decision strategy such that the decisions converge in probability to the underlying truth, even when the erasure probabilities converge to 1. We show that a locally optimal strategy, consisting of a sequence of Bayesian likelihood ratio tests, is such a strategy, and we derive the convergence rate of the error probability for this strategy. In case 2, we show that if each node learns from all of its previous predecessors, then there exists a decision strategy such that the decisions converge in probability to the underlying truth when the flipping probabilities of the binary symmetric channels are bounded away from 1/2. Again, we show that a locally optimal strategy achieves this, and we derive the convergence rate of the error probability for it. 
In the case where the flipping probabilities converge to 1/2, we derive a necessary condition on the convergence rate of the flipping probabilities such that the decisions based on the locally optimal strategy still converge to the underlying truth. We also explicitly characterize the relationship between the convergence rate of the error probability and the convergence rate of the flipping probabilities.
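The setting in case 1 can be illustrated with a toy simulation (not from the paper): a chain of nodes, each observing a Gaussian measurement of the truth plus the decisions of all predecessors through a binary erasure channel, and deciding via a log-likelihood ratio test. For simplicity the sketch assumes every predecessor decision is wrong with a fixed known probability `node_error_p`; the paper's locally optimal Bayesian test would instead compute this weight recursively per node. All names and parameters here are illustrative assumptions.

```python
import math
import random

def simulate(num_nodes=200, truth=1, erasure_p=0.3,
             node_error_p=0.2, snr=1.0, seed=0):
    """Toy feedforward chain deciding H0 (mean -snr) vs H1 (mean +snr).

    Each node draws a unit-variance Gaussian measurement, receives each
    predecessor's decision through a binary erasure channel with erasure
    probability erasure_p, and applies a log-likelihood ratio test.
    Received decisions are modeled as bits that are wrong with fixed
    probability node_error_p (a simplifying assumption, not the paper's
    recursively computed locally optimal weights).
    """
    rng = random.Random(seed)
    mean = snr if truth == 1 else -snr
    decisions = []
    for _ in range(num_nodes):
        x = rng.gauss(mean, 1.0)
        # LLR of the node's own measurement for N(+snr,1) vs N(-snr,1)
        llr = 2.0 * snr * x
        for d in decisions:
            if rng.random() < erasure_p:
                continue  # broadcast erased by the BEC
            # Evidence from a non-erased, possibly flipped, decision bit
            w = math.log((1 - node_error_p) / node_error_p)
            llr += w if d == 1 else -w
        decisions.append(1 if llr > 0 else 0)
    return decisions
```

Running this, late nodes usually settle on the true hypothesis because each node sees an unboundedly growing number of (non-erased) predecessor decisions, consistent with the convergence result for case 1; with a fixed weight per bit, however, an early wrong cascade can occasionally persist, which is why the paper's analysis uses the locally optimal Bayesian weights.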
Keywords :
Bayes methods; broadcast channels; convergence; decision making; error statistics; feedforward; maximum likelihood sequence estimation; random sequences; statistical testing; wireless channels; Bayesian likelihood ratio test; binary erasure channel; binary symmetric channel; broadcast failure; convergence rate; decision converge; decision strategy; error probability; feedforward network; flipping probability; hypothesis testing; necessary condition; node broadcast; node decision convergence; optimal strategy; predecessors; random erasure; random flipped decision; random sequence; sequential decision making; Bayes methods; Convergence; Educational institutions; Error probability; Feedforward neural networks; Indexes; Testing; Asymptotic learning; decentralized detection; erasure channel; herding; social learning; symmetric channel;
fLanguage :
English
Journal_Title :
IEEE Journal of Selected Topics in Signal Processing
Publisher :
ieee
ISSN :
1932-4553
Type :
jour
DOI :
10.1109/JSTSP.2013.2258657
Filename :
6504468