Title :
Variance invariant adaptive temporal supersampling for motion blurring
Author :
Neilson, Daniel ; Yang, Yee-Hong
Author_Institution :
Dept. of Comput. Sci., Alberta Univ., Edmonton, Alta., Canada
Abstract :
Adaptive temporal sampling, used to create motion blur in distributed ray tracing, generates more sample points in regions with motion blur than in regions without it. When the number of sample points used on stationary objects in motion-blurred regions exceeds the number used elsewhere in the image, the variance in the color of those objects can differ between the two regions. This paper identifies the cause of this variance discrepancy and proposes a modification to existing adaptive temporal sampling algorithms that eliminates it. Our results demonstrate that, with the proposed modification, the variance of stationary objects remains approximately the same throughout the entire image, and that the modification can also improve the running time of existing adaptive temporal sampling algorithms.
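The discrepancy described above follows from basic sampling statistics: a pixel color estimated by averaging n noisy samples has variance proportional to 1/n, so a stationary object supersampled in a motion-blur region looks less noisy than the same object elsewhere. The sketch below is not the paper's algorithm; it is a minimal Monte Carlo illustration with illustrative sample counts (4 vs. 16) and an assumed per-sample noise level.

```python
# Minimal sketch (not the paper's algorithm): why a stationary object's
# pixel variance differs between a region sampled with many temporal
# samples and one sampled with few. Noise level and counts are assumed.
import random

def pixel_estimate(n_samples, rng, mean=0.5, sigma=0.1):
    # Average n jittered color samples; Var(average) ~ sigma^2 / n
    return sum(rng.gauss(mean, sigma) for _ in range(n_samples)) / n_samples

def empirical_variance(n_samples, trials=20000, seed=0):
    # Estimate the variance of the per-pixel color estimate empirically
    rng = random.Random(seed)
    vals = [pixel_estimate(n_samples, rng) for _ in range(trials)]
    m = sum(vals) / trials
    return sum((v - m) ** 2 for v in vals) / trials

# A static region using 4 samples vs. a motion-blur region using 16:
var_static = empirical_variance(4)
var_blur_region = empirical_variance(16)
# The stationary object's color variance is roughly 4x higher in the
# sparsely sampled region, producing visibly different noise levels.
```

The paper's modification keeps this variance approximately invariant across the image rather than letting it track the local sample count.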
Keywords :
colour graphics; image sampling; ray tracing; rendering (computer graphics); adaptive temporal supersampling; distributed ray tracing; image modification; motion blurring; running time improvement; stationary objects; variance discrepancy; variance invariant adaptive temporal supersampling; Computer graphics; Distributed computing; Equations; Image sampling; Layout; Optical reflection; Pixel; Ray tracing; Rendering (computer graphics); Sampling methods;
Conference_Titel :
11th Pacific Conference on Computer Graphics and Applications (PG 2003), Proceedings
Print_ISBN :
0-7695-2028-6
DOI :
10.1109/PCCGA.2003.1238248