Title :
Smart compositing: A real-time content-adaptive blending method for remote visual collaboration
Author :
Hong, Wei ; Mitchell, April Slayden ; Trott, Mitchell
Author_Institution :
HP Labs., Palo Alto, CA, USA
Abstract :
This paper proposes a content-adaptive blending method, smart compositing, for displaying two overlapped video frames on the same screen while preserving the readability of both. A pixel-wise adaptive blending factor map is generated from the edge and saturation information of the overlay frame alone. Using this blending factor map, regions of the overlay frame containing edges or saturated color are assigned higher opacity, and the remaining regions are assigned higher transparency. A halo is also created around the edges of the overlay content, which enhances the edges and disambiguates them from the underlying frame. The proposed method is suitable for overlaying many different types of content (e.g., drawings, slides, text, and pictures) and does not require any information (e.g., an opacity mask) from the application that generates the content. The method has low computational complexity and has been implemented to run in real time.
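The abstract's core idea can be sketched as follows: derive a per-pixel alpha from the overlay's edge strength and color saturation, then blend. This is a minimal illustrative sketch, not the paper's exact method; the edge operator, saturation measure, base opacity, and weights are all assumptions, and the halo step is omitted.

```python
import numpy as np

def smart_composite(base, overlay):
    """Content-adaptive blend of an overlay frame onto a base frame.

    Sketch of the abstract's idea: a per-pixel blending factor computed
    from edge and saturation information of the overlay frame only.
    All operators and weights below are illustrative assumptions.
    Frames are float arrays of shape (H, W, 3) with values in [0, 1].
    """
    # Overlay luminance (plain RGB average, an assumption)
    lum = overlay.mean(axis=2)

    # Edge strength via forward differences (a stand-in for whatever
    # edge detector the paper actually uses)
    gx = np.zeros_like(lum)
    gy = np.zeros_like(lum)
    gx[:, :-1] = np.abs(np.diff(lum, axis=1))
    gy[:-1, :] = np.abs(np.diff(lum, axis=0))
    edges = np.clip(gx + gy, 0.0, 1.0)

    # Saturation as per-pixel (max - min) over channels, i.e. chroma
    sat = overlay.max(axis=2) - overlay.min(axis=2)

    # Pixel-wise blending factor map: more opaque where the overlay has
    # edges or saturated color, mostly transparent elsewhere
    # (0.15 base opacity is an assumed constant, not from the paper)
    alpha = np.clip(0.15 + edges + sat, 0.0, 1.0)[..., None]

    # Standard alpha blend; the paper's edge halo is not reproduced here
    return alpha * overlay + (1.0 - alpha) * base
```

On a flat, unsaturated overlay region the factor map falls to its base opacity, so the underlying frame dominates; inside a saturated or high-gradient region the overlay becomes fully opaque, which is the readability-preserving behavior the abstract describes.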
Keywords :
computational complexity; groupware; image colour analysis; real-time systems; video signal processing; overlapped video frames; overlay content; overlay frame; pixel-wise adaptive blending factor map; real-time content-adaptive blending method; remote visual collaboration; saturated color; saturation information; smart compositing; Collaboration; Computational complexity; Image color analysis; Image edge detection; Real time systems; Streaming media; Visualization; Video compositing; content-adaptive; visual collaboration
Conference_Title :
Acoustics, Speech and Signal Processing (ICASSP), 2012 IEEE International Conference on
Conference_Location :
Kyoto
Print_ISBN :
978-1-4673-0045-2
Electronic_ISSN :
1520-6149
DOI :
10.1109/ICASSP.2012.6288378