iccv2013-239-reference knowledge-graph by maker-knowledge-mining
Source: pdf
Author: Jun Wang, Wei Liu, Andy X. Sun, Yu-Gang Jiang
Abstract: Hashing techniques have been intensively investigated in the design of highly efficient search engines for largescale computer vision applications. Compared with prior approximate nearest neighbor search approaches like tree-based indexing, hashing-based search schemes have prominent advantages in terms of both storage and computational efficiencies. Moreover, the procedure of devising hash functions can be easily incorporated into sophisticated machine learning tools, leading to data-dependent and task-specific compact hash codes. Therefore, a number of learning paradigms, ranging from unsupervised to supervised, have been applied to compose appropriate hash functions. However, most of the existing hash function learning methods either treat hash function design as a classification problem or generate binary codes to satisfy pairwise supervision, and have not yet directly optimized the search accuracy. In this paper, we propose to leverage listwise supervision into a principled hash function learning framework. In particular, the ranking information is represented by a set of rank triplets that can be used to assess the quality of ranking. Simple linear projection-based hash functions are solved efficiently through maximizing the ranking quality over the training data. We carry out experiments on large image datasets with size up to one million and compare with the state-of-the-art hashing techniques. The extensive results corroborate that our learned hash codes via listwise supervision can provide superior search accuracy without incurring heavy computational overhead.
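The abstract combines two ingredients: binary codes produced by linear projection-based hash functions, and rank triplets that score how well the Hamming ranking agrees with a ground-truth ranking. The following minimal sketch (not the authors' implementation) illustrates how these pieces fit together; the random, unlearned projections, the Euclidean ground truth, and all helper names such as triplet_agreement are illustrative assumptions, and the paper's actual contribution is learning the projections so that this kind of triplet-based ranking quality is maximized.

```python
# Minimal sketch (assumed names, unlearned random projections): linear
# projection-based hashing and a rank-triplet check of Hamming ranking quality.
import numpy as np

def hash_codes(X, W):
    """Binary codes via linear projections: h(x) = sign(W^T x), mapped to {0, 1}."""
    return (X @ W > 0).astype(np.uint8)

def hamming_dist(code, codes):
    """Hamming distances between one code and a matrix of codes."""
    return (code ^ codes).sum(axis=1)

def triplet_agreement(q_code, db_codes, triplets):
    """Fraction of rank triplets (i, j), meaning item i should rank above item j
    for the query, that the Hamming ranking preserves: d_H(q, i) < d_H(q, j)."""
    d = hamming_dist(q_code, db_codes)
    return sum(d[i] < d[j] for i, j in triplets) / len(triplets)

rng = np.random.default_rng(0)
X = rng.standard_normal((1000, 64))   # database features (toy data)
q = rng.standard_normal(64)           # query feature
W = rng.standard_normal((64, 32))     # 32 random hyperplanes, not learned

db_codes = hash_codes(X, W)
q_code = hash_codes(q[None, :], W)[0]

# Ground-truth rank triplets from Euclidean neighbors of the query:
# pair each of the 50 closest items with one of the 50 farthest items.
true_d = np.linalg.norm(X - q, axis=1)
order = np.argsort(true_d)
triplets = [(order[k], order[-(k + 1)]) for k in range(50)]

print("triplet agreement:", triplet_agreement(q_code, db_codes, triplets))
```

Replacing the random matrix W with projections learned to maximize this kind of ranking quality over training queries is the role of the listwise-supervised learning framework described in the abstract.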
[1] B. Bai, J. Weston, D. Grangier, R. Collobert, K. Sadamasa, Y. Qi, O. Chapelle, and K. Weinberger. Supervised semantic indexing. In Proc. of the 18th ACM CIKM, pages 187–196, 2009.
[2] D. Bertsekas. Constrained optimization and Lagrange multiplier methods. Computer Science and Applied Mathematics, Boston: Academic Press, 1, 1982.
[3] T.-S. Chua, J. Tang, R. Hong, H. Li, Z. Luo, and Y.-T. Zheng. NUS-WIDE: A real-world web image database from National University of Singapore. In Proc. of ACM CIVR, Santorini, Greece, July 2009.
[4] K. Eshghi and S. Rajaram. Locality sensitive hash functions based on concomitant rank order statistics. In Proc. of ACM SIGKDD, pages 221–229, 2008.
[5] A. Gionis, P. Indyk, and R. Motwani. Similarity search in high dimensions via hashing. In Proc. of VLDB, pages 518–529, 1999.
[6] Y. Gong and S. Lazebnik. Iterative quantization: A procrustean approach to learning binary codes. In Proc. of IEEE CVPR, pages 817–824, 2011.
[7] K. Järvelin and J. Kekäläinen. IR evaluation methods for retrieving highly relevant documents. In Proc. of ACM SIGIR, pages 41–48, 2000.
[8] A. Krizhevsky and G. Hinton. Learning multiple layers of features from tiny images. Technical report, Department of Computer Science, University of Toronto, 2009.
[9] B. Kulis and T. Darrell. Learning to Hash with Binary Reconstructive Embeddings. In Proc. of NIPS, volume 22, pages 1042–1050, 2009.
[10] B. Kulis, P. Jain, and K. Grauman. Fast similarity search for learned metrics. IEEE TPAMI, 31(12):2143–2157, 2009.
[11] H. Li. A short introduction to learning to rank. IEICE Trans. on Information and Systems, 94(10):1854–1862, 2011.
[12] W. Liu, J. Wang, R. Ji, Y.-G. Jiang, and S.-F. Chang. Supervised hashing with kernels. In Proc. of IEEE CVPR, Providence, USA, 2012.
[13] W. Liu, J. Wang, S. Kumar, and S.-F. Chang. Hashing with graphs. In Proc. of ICML, Bellevue, USA, 2011.
[14] M. Norouzi and D. Fleet. Minimal loss hashing for compact binary codes. In Proc. of ICML, Bellevue, USA, 2011.
[15] M. Norouzi, D. Fleet, and R. Salakhutdinov. Hamming distance metric learning. In Proc. of NIPS, volume 25, pages 1070–1078, 2012.
[16] P. Ram, D. Lee, H. Ouyang, and A. Gray. Rank-approximate nearest neighbor search: Retaining meaning and speed in high dimensions. In Proc. of NIPS, volume 22, pages 1536–1544, 2009.
[17] R. Salakhutdinov and G. Hinton. Semantic hashing. International Journal of Approximate Reasoning, 50(7):969–978, 2009.
[18] G. Shakhnarovich. Learning task-specific similarity. PhD thesis, Massachusetts Institute of Technology, 2005.
[19] A. Shashua and A. Levin. Ranking with large margin principle: Two approaches. In Proc. of NIPS, volume 16, pages 937–944, 2003.
[20] A. Torralba, R. Fergus, and W. Freeman. 80 million tiny images: A large data set for nonparametric object and scene recognition. IEEE TPAMI, 30(11):1958–1970, 2008.
[21] J. Wang, S. Kumar, and S.-F. Chang. Semi-supervised hashing for scalable image retrieval. In Proc. of IEEE CVPR, pages 3424–3431, San Francisco, USA, June 2010.
[22] J. Wang, S. Kumar, and S.-F. Chang. Sequential projection learning for hashing with compact codes. In Proc. of ICML, pages 1127–1134, Haifa, Israel, 2010.
[23] J. Wang, S. Kumar, and S.-F. Chang. Semi-supervised hashing for large-scale search. IEEE TPAMI, 34(12):2393–2406, 2012.
[24] Y. Weiss, A. Torralba, and R. Fergus. Spectral hashing. In Proc. of NIPS, volume 21, pages 1753–1760, 2008.
[25] F. Xia, T.-Y. Liu, J. Wang, W. Zhang, and H. Li. Listwise approach to learning to rank: theory and algorithm. In Proc. of ICML, pages 1192–1199, 2008.
[26] H. Xu, J. Wang, Z. Li, G. Zeng, S. Li, and N. Yu. Complementary hashing for approximate nearest neighbor search. In Proc. of IEEE ICCV, pages 1631–1638, 2011.
[27] J. Yagnik, D. Strelow, D. Ross, and R.-S. Lin. The power of comparative reasoning. In Proc. of IEEE ICCV, pages 2431–2438, 2011.