
ICCV 2013, Paper 207: Illuminant Chromaticity from Image Sequences


Source: pdf

Authors: Véronique Prinet, Dani Lischinski, Michael Werman

Abstract: We estimate illuminant chromaticity from temporal sequences, for scenes illuminated by one or two dominant illuminants. While there are many methods for illuminant estimation from a single image, few works so far have focused on videos, and even fewer on multiple light sources. Our aim is to leverage the information provided by temporal acquisition, where the objects, the camera, or the light source is in motion, in order to estimate illuminant color without requiring user interaction, strong assumptions, or heuristics. We introduce a simple physically-based formulation built on the assumption that the incident light chromaticity is constant over a short space-time domain. We show that a deterministic approach is not sufficient for accurate and robust estimation; a probabilistic formulation, however, makes it possible to implicitly integrate away hidden factors that the physical model ignores. Experimental results are reported on a dataset of natural video sequences and on the GrayBall benchmark, indicating that we compare favorably with the state of the art.
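
To make the physical formulation concrete: under a dichromatic reflection model, the observation at point p and time t can be written I(p,t) = m_d(p,t) C(p) + m_s(p,t) Γ, where C(p) is the diffuse color, Γ the illuminant color, and m_d, m_s scalar shading factors. If the diffuse term is roughly constant over a short space-time window at corresponding points, then I(p,t1) - I(p,t0) ≈ (m_s(p,t1) - m_s(p,t0)) Γ, i.e., temporal differences point along the illuminant color. The Python sketch below illustrates this idea under those assumptions only; the function name and threshold are ours, and the per-channel median vote is a crude stand-in for the probabilistic formulation the abstract argues is necessary.

import numpy as np

def illuminant_chromaticity_sketch(frame_t0, frame_t1, min_diff=0.02):
    # Hypothetical illustration, not the authors' implementation. Assumes the
    # two frames (float RGB in [0, 1]) are registered, and that over this
    # short time step each significant temporal difference is dominated by a
    # change in the specular term, hence points along the illuminant color.
    diff = frame_t1.astype(np.float64) - frame_t0.astype(np.float64)  # (H, W, 3)
    mag = np.linalg.norm(diff, axis=2)
    votes = diff[mag > min_diff]                      # keep significant changes
    votes /= np.linalg.norm(votes, axis=1, keepdims=True)
    votes[votes.sum(axis=1) < 0] *= -1.0              # resolve sign ambiguity
    # Per-channel median: a crude robust aggregate standing in for the
    # probabilistic integration over hidden factors described in the abstract.
    gamma = np.clip(np.median(votes, axis=0), 0.0, None)
    return gamma / gamma.sum()                        # chromaticity, sums to 1

In practice the frames would first need to be registered or corresponding points tracked, and a fixed threshold cannot separate specular changes from noise, saturation, or diffuse shading changes; handling such hidden factors is precisely what the probabilistic formulation is for.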


reference text

[1] F. Ciurea and B. Funt. A large image database for color constancy research. In Color Imaging Conf., 2003.

[2] M. Ebner. Color constancy based on local space average color. Machine Vision and Applications, 2009.

[3] G. D. Finlayson and G. Schaefer. Solving for colour constancy using a constrained dichromatic reflection model. IJCV, 2001.

[4] A. Gijsenij, T. Gevers, and J. van de Weijer. Generalised gamut mapping using image derivative structures for color constancy. IJCV, 2010.

[5] A. Gijsenij, T. Gevers, and J. van de Weijer. Computational color constancy: Survey and experiments. IEEE TIP, 2011.

[6] A. Gijsenij, R. Lu, and T. Gevers. Color constancy for multiple light sources. IEEE TIP, 2012.

[7] E. Hsu, T. Mertens, S. Paris, S. Avidan, and F. Durand. Light mixture estimation for spatially varying white balance. ACM Trans. Graph., 2008.

[8] Y. Imai, Y. Kato, H. Kadoi, T. Horiuchi, and S. Tominaga. Estimation of multiple illuminants based on specular highlight detection. In Int. Conf. on Computational Color Imaging, 2011.

[9] G. Klinker, S. Shafer, and T. Kanade. The measurement of highlights in color images. IJCV, 1988.

[10] H.-C. Lee. Method for computing the scene-illuminant chromaticity from specular highlights. J. Opt. Soc. Am. A, 3(10):1694–1699, October 1986.

[11] A. Levin, D. Lischinski, and Y. Weiss. A closed-form solution to natural image matting. PAMI, 2008.

[12] C. Liu, J. Yuen, and A. Torralba. SIFT Flow: Dense correspondence across scenes and its applications. PAMI, 2011.

[13] S. K. Nayar, G. Krishnan, M. D. Grossberg, and R. Raskar. Fast separation of direct and global components of a scene using high frequency illumination. ACM Trans. Graph., 2006.

[14] J. Renno, D. Makris, T. Ellis, and G. Jones. Application and evaluation of colour constancy in visual surveillance. In Int. Workshop on Performance Evaluation of Tracking and Surveillance, 2005.

[15] S. A. Shafer. Using color to separate reflection components. Color Research and Application, 1985.

[16] R. Tan and K. Ikeuchi. Estimating chromaticity of multicolored illuminations. In Workshop on Color and Photometric Methods in Computer Vision, 2003.

[17] R. Tan, K. Nishino, and K. Ikeuchi. Illumination chromaticity estimation using inverse-intensity chromaticity space. In CVPR, 2003.

[18] J. van de Weijer, T. Gevers, and A. Gijsenij. Edge-based color constancy. IEEE TIP, 2007.

[19] N. Wang, B. Funt, C. Lang, and D. Xu. Video-based illumination estimation. In Int. Conf. on Computational Color Imaging, 2011.

[20] Q. Yang, S. Wang, N. Ahuja, and R. Yang. A uniform framework for estimating chromaticity, correspondence, and specular reflection. IEEE TIP, 2011.