cvpr2013-239-reference knowledge-graph by maker-knowledge-mining
Source: pdf
Authors: Paul Bodesheim, Alexander Freytag, Erik Rodner, Michael Kemmler, Joachim Denzler
Abstract: Detecting samples from previously unknown classes is a crucial task in object recognition, especially in real-world applications where the closed-world assumption does not hold. We show how to apply a null space method to novelty detection, mapping all training samples of one class to a single point. Besides modeling a single class, we are able to treat multiple known classes jointly and to detect novelties for a set of classes with a single model. In contrast to modeling the support of each known class individually, our approach uses a projection into a joint subspace in which the training samples of all known classes have zero intra-class variance. This subspace is called the null space of the training data. To decide whether a test sample is novel, our null space approach relies solely on a distance measure instead of performing density estimation directly. We thereby derive a simple yet powerful method for multi-class novelty detection, an important problem that has not been studied sufficiently so far. Our novelty detection approach is assessed in comprehensive multi-class experiments on the publicly available Caltech-256 and ImageNet datasets. The analysis reveals that our null space approach is perfectly suited for multi-class novelty detection, outperforming all other evaluated methods.
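The abstract's core mechanism is a projection onto directions along which the within-class scatter vanishes, so every training sample of a class lands on one point and novelty reduces to a distance to the nearest such point. Below is a minimal NumPy sketch of the linear variant of this idea; the paper itself uses a kernelized null space (the kernel null Foley-Sammon transform of [7, 12]), and the function names, tolerance, and toy data here are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np

def fit_null_space(X, y, tol=1e-10):
    """Fit a joint null-space projection: every training sample of a known
    class maps to a single point (zero intra-class variance). A non-trivial
    null space generally needs more feature dimensions than training samples,
    which the kernelized variant provides implicitly."""
    mu = X.mean(axis=0)
    Xc = X - mu
    classes = np.unique(y)
    # Deviations of samples from their class means (within-class scatter factor)
    Xw = np.vstack([X[y == c] - X[y == c].mean(axis=0) for c in classes])
    # Orthonormal basis of the total-scatter range via SVD of the centered data
    U, s, _ = np.linalg.svd(Xc.T, full_matrices=False)
    B = U[:, s > tol * s.max()]
    # Within-class scatter in that basis; its zero-eigenvalue eigenvectors
    # span the sought null space
    M = (Xw @ B).T @ (Xw @ B)
    evals, evecs = np.linalg.eigh(M)
    V = evecs[:, evals < tol * max(evals.max(), 1.0)]
    W = B @ V  # projection directions with zero intra-class variance
    # Under W, each class collapses to the projection of its mean
    targets = np.stack([(X[y == c].mean(axis=0) - mu) @ W for c in classes])
    return mu, W, targets

def novelty_score(x, mu, W, targets):
    """Distance to the nearest collapsed class point; larger means more novel."""
    z = (x - mu) @ W
    return np.linalg.norm(targets - z, axis=1).min()

# Toy usage: two known classes in a 50-dimensional space; a sample drawn far
# from both classes should receive a clearly higher novelty score.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (10, 50)), rng.normal(3.0, 1.0, (10, 50))])
y = np.array([0] * 10 + [1] * 10)
mu, W, targets = fit_null_space(X, y)
print(novelty_score(X[0], mu, W, targets))                       # known: ~0
print(novelty_score(rng.normal(-5.0, 1.0, 50), mu, W, targets))  # novel: larger
```

Note that, as the abstract states, this treats all known classes with a single model: one projection serves every class, and the score is just the minimum distance over the collapsed class points, with no per-class density estimate.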
[1] C. M. Bishop. Pattern Recognition and Machine Learning. Springer, 2006. 2, 6
[2] C.-C. Chang and C.-J. Lin. LIBSVM: A library for support vector machines. ACM Trans. Intell. Syst. Technol., 2(3):27:1–27:27, 2011. 7
[3] J. Deng, W. Dong, R. Socher, L.-J. Li, K. Li, and L. Fei-Fei. ImageNet: A large-scale hierarchical image database. In Proc. CVPR, pages 248–255, 2009. 1, 6
[4] D. H. Foley and J. W. Sammon. An optimal set of discriminant vectors. IEEE Trans. Comput., C-24(3):281–289, 1975. 2
[5] A. Globerson and S. Roweis. Metric learning by collapsing classes. In Proc. NIPS, pages 451–458, 2005. 4, 8
[6] G. Griffin, A. Holub, and P. Perona. Caltech-256 object category dataset. Technical report, California Institute of Technology, 2007. 6
[7] Y.-F. Guo, L. Wu, H. Lu, Z. Feng, and X. Xue. Null Foley-Sammon transform. Pattern Recog., 39(11):2248–2251, 2006. 2, 3, 4
[8] M. Kemmler, E. Rodner, and J. Denzler. One-class classification with Gaussian processes. In Proc. ACCV, pages 489–500, 2010. 5, 6
[9] C. H. Lampert, H. Nickisch, and S. Harmeling. Learning to detect unseen object classes by between-class attribute transfer. In Proc. CVPR, pages 951–958, 2009. 6
[10] T. Landgrebe, P. Paclik, D. M. J. Tax, and R. P. W. Duin. Optimising two-stage recognition systems. In Multiple Classifier Systems, pages 206–215, 2005. 6, 7
[11] H. Lian. On feature selection with principal component analysis for one-class SVM. Pattern Recog. Lett., 33(9):1027–1031, 2012. 4, 5
[12] Y. Lin, G. Gu, H. Liu, and J. Shen. Kernel null Foley-Sammon transform. In Proc. Int. Conf. Comput. Sci. Software Eng., pages 981–984, 2008. 3, 4
[13] D. G. Lowe. Distinctive image features from scale-invariant keypoints. Int. J. Comput. Vision, 60(2):91–110, 2004. 7
[14] S. Maji, A. C. Berg, and J. Malik. Classification using intersection kernel support vector machines is efficient. In Proc. CVPR, pages 1–8, 2008. 7
[15] M. Markou and S. Singh. Novelty detection: A review-part 1: Statistical approaches. Signal Process., 83(12):2481–2497, 2003. 6
[16] M. Markou and S. Singh. Novelty detection: A review-part 2: Neural network based approaches. Signal Process., 83(12):2499–2521, 2003. 6
[17] T. Mensink, J. Verbeek, F. Perronnin, and G. Csurka. Metric learning for large scale image classification: Generalizing to new classes at near-zero cost. In Proc. ECCV, pages 488–501, 2012. 4, 8
[18] C. E. Rasmussen and C. K. I. Williams. Gaussian Processes for Machine Learning. The MIT Press, 2006. 6
[19] V. Roth. Kernel fisher discriminants for outlier detection. Neural Computation, 18(4):942–960, 2006. 5
[20] B. Schölkopf, J. C. Platt, J. Shawe-Taylor, A. J. Smola, and R. C. Williamson. Estimating the support of a high-dimensional distribution. Neural Computation, 13(7):1443–1471, 2001. 4, 5, 6
[21] D. M. J. Tax and R. P. W. Duin. Support vector data description. Machine Learning, 54(1):45–66, 2004. 5, 6
[22] D. M. J. Tax and R. P. W. Duin. Growing a multi-class classifier with a reject option. Pattern Recog. Lett., 29(10):1565–1570, 2008. 6, 7
[23] D. M. J. Tax and K.-R. Müller. Feature extraction for one-class classification. In Proc. ICANN/ICONIP, pages 342–349, 2003. 5
[24] S. Vempati, A. Vedaldi, A. Zisserman, and C. V. Jawahar. Generalized RBF feature maps for efficient detection. In Proc. BMVC, pages 2.1–2.11, 2010. 7
[25] M. Wu and J. Ye. A small sphere and large margin approach for novelty detection using training data with outliers. IEEE Trans. Pattern Anal. Mach. Intell., 31(11):2088–2092, 2009. 6
[26] W. Zheng, L. Zhao, and C. Zou. Foley-Sammon optimal discriminant vectors using kernel approach. IEEE Trans. Neural Netw., 16(1):1–9, 2005. 3, 4