DocumentCode :
1575953
Title :
Computation networks to implement optimization
Author :
Sundaram, Ram
Author_Institution :
ECE Dept., Gannon Univ., Erie, PA, USA
fYear :
2010
Firstpage :
400
Lastpage :
403
Abstract :
This paper presents threshold binary networks and block-based gradient estimation networks that optimize the square error objective function and recover the regularized least squares (LS) solution. The threshold binary networks consist of linear processing elements with threshold nonlinearities that produce binary outputs. The objective function is expressed at the bit level, and the optimization takes place on partitions of the networks. A partition is active only if the optimization takes place at locations within that partition. The optimization process switches between active and inactive partitions to progress toward deeper, more stable minima (lower error energy/cost) of the objective function. The block-based gradient networks use local estimates of the gradient and in-place convolution operations to recover the LS estimate, with regularization controlling the rate of convergence to the LS estimate.
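As a rough illustration of the block-based gradient recovery described above, the following Python sketch performs gradient descent on the regularized square-error cost ||y - h*x||^2 + lam*||x||^2 using convolutions. The function name regularized_ls_gradient, the NumPy implementation, and all parameter values are illustrative assumptions, not details taken from the paper.

import numpy as np

def regularized_ls_gradient(y, h, lam=0.05, mu=0.5, iters=300):
    """Recover x from y ~= h * x by gradient descent on the regularized
    square-error cost ||y - h * x||^2 + lam * ||x||^2.
    Illustrative sketch; names and parameters are not from the paper."""
    x = np.zeros_like(y, dtype=float)
    h_flipped = h[::-1]  # convolving with the flipped kernel applies the adjoint H^T
    for _ in range(iters):
        residual = y - np.convolve(x, h, mode="same")              # data mismatch y - Hx
        grad = -np.convolve(residual, h_flipped, mode="same") + lam * x
        x -= mu * grad                                             # step toward the LS estimate
    return x

# Illustrative use: recover a 1-D signal degraded by a short smoothing kernel.
rng = np.random.default_rng(0)
truth = np.zeros(64)
truth[20:30] = 1.0
kernel = np.array([0.25, 0.5, 0.25])
observed = np.convolve(truth, kernel, mode="same") + 0.01 * rng.standard_normal(64)
estimate = regularized_ls_gradient(observed, kernel)

A larger lam improves the conditioning of the regularized normal equations (H^T H + lam I), which is one way regularization can control the rate of convergence toward the LS estimate, at the cost of additional bias.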
Keywords :
image restoration; least squares approximations; optimisation; block-based gradient estimation networks; computation networks; optimization; regularized least squares solution; square error objective function; threshold binary networks; Computer networks; Convolution; Cost function; Degradation; Flowcharts; Image restoration; Inverse problems; Least squares approximation; Switches;
fLanguage :
English
Publisher :
ieee
Conference_Title :
2010 53rd IEEE International Midwest Symposium on Circuits and Systems (MWSCAS)
Conference_Location :
Seattle, WA
ISSN :
1548-3746
Print_ISBN :
978-1-4244-7771-5
Type :
conf
DOI :
10.1109/MWSCAS.2010.5548877
Filename :
5548877