Conference record number:
4780
Paper title:
Image Denoising based on sparse representation in DEA
Authors:
Sahebkheir Sanaz (s.sahebkheir@student.kgut.ac.ir), M.Sc. Student in Remote Sensing Engineering, Department of Surveying Engineering, Graduate University of Advanced Technology, Kerman, Iran; Esmaeily Ali, Department of Surveying Engineering, Graduate University of Advanced Technology, Kerman, Iran; Saba Mohammad, Department of Radiology, Medical Science University, Kerman, Iran
Number of pages:
9
Keywords:
Data envelopment analysis (DEA), denoising, image decomposition, self-learning, sparse representation
Publication year:
1398
Conference title:
11th National Conference on Data Envelopment Analysis
Document language:
English
Abstract:
DEA has been successfully applied to many types of entities engaged in a wide variety of activities in contexts worldwide. This paper presents a method for applying Data Envelopment Analysis (DEA) to sparse inputs and outputs using clustering concepts. The approach is based on decomposing an image into multiple semantic components, which has various image processing applications such as denoising, super-resolution, enhancement, and inpainting. We present a self-learning image decomposition framework based on sparse representation. The proposed framework first learns an over-complete dictionary from the high-spatial-frequency parts of the input image for reconstruction purposes. We then perform unsupervised clustering on the observed dictionary atoms, which allows us to identify image-dependent components with similar context information. Unlike previous sparse-representation image processing work, the proposed method does not require training images. We apply the proposed method to denoising a single image and validate the results against another dictionary learning method, denoising by sparse coding based on the Douglas-Rachford algorithm. Visual comparison and the PSNR improvement of the proposed method demonstrate its robustness.
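The core pipeline the abstract describes, sparse-coding a noisy signal over an over-complete dictionary and reconstructing from the few selected atoms, can be sketched generically as follows. This is not the paper's self-learning/clustering method: the random dictionary, the `omp` (orthogonal matching pursuit) coder, the 1-D test signal, and the `psnr` helper are all illustrative assumptions, standing in for the learned dictionary and single-image setting of the paper.

```python
import numpy as np

def psnr(ref, est):
    """Peak signal-to-noise ratio (dB) of `est` against reference `ref`."""
    mse = np.mean((ref - est) ** 2)
    return 10 * np.log10(np.max(np.abs(ref)) ** 2 / mse)

def omp(D, y, n_nonzero):
    """Orthogonal matching pursuit: greedy sparse code of y over dictionary D."""
    residual, support = y.copy(), []
    x = np.zeros(D.shape[1])
    for _ in range(n_nonzero):
        # Pick the atom most correlated with the current residual.
        support.append(int(np.argmax(np.abs(D.T @ residual))))
        # Re-fit all selected atoms jointly by least squares.
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coef
    x[support] = coef
    return x

rng = np.random.default_rng(0)
n, k = 64, 128                      # signal length, dictionary size (over-complete)
D = rng.normal(size=(n, k))
D /= np.linalg.norm(D, axis=0)      # unit-norm atoms

# Synthetic clean signal that is exactly 3-sparse in D, plus Gaussian noise.
x_true = np.zeros(k)
x_true[[5, 40, 99]] = [1.5, -2.0, 1.0]
clean = D @ x_true
noisy = clean + rng.normal(0.0, 0.02, n)

# Denoise: sparse-code the noisy signal, then reconstruct from the dictionary.
code = omp(D, noisy, n_nonzero=5)
denoised = D @ code
print(f"PSNR noisy: {psnr(clean, noisy):.1f} dB, "
      f"denoised: {psnr(clean, denoised):.1f} dB")
```

The denoising effect comes from the sparsity prior: the clean signal is well explained by a few atoms, while the noise is spread across all dimensions, so projecting onto the few selected atoms discards most of the noise energy, which is what the PSNR comparison in the paper measures.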
Country:
Iran