NIPS 2002, Paper 158
Source: pdf
Author: Elżbieta Pękalska, David Tax, Robert Duin
Abstract: Problems in which abnormal or novel situations should be detected can be approached by describing the domain of the class of typical examples. Such applications arise in machine diagnostics, fault detection and illness identification, and, in principle, in any problem where little knowledge is available outside the typical class. In this paper we explain why proximities are natural representations for domain descriptors, and we propose a simple one-class classifier for dissimilarity representations. By using linear programming, an efficient one-class description can be found, based on a small number of prototype objects. This classifier can be made (1) more robust by transforming the dissimilarities and (2) cheaper to compute by using a reduced representation set. Finally, a comparison with the related one-class classifier by Campbell and Bennett is given.
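The LP-based descriptor outlined in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: it assumes a ν-SVM-style linear program in which a weighted sum of dissimilarities to prototypes is bounded by a threshold ρ with slacks ξ, solved here with `scipy.optimize.linprog`. The function names `fit_lp_one_class` and `accept` and the parameter `nu` are illustrative choices.

```python
# Hedged sketch of a linear-programming one-class classifier on a
# dissimilarity representation. Assumed LP (nu-SVM-style, an assumption):
#   min  rho + (1/(nu*m)) * sum(xi)
#   s.t. D @ w <= rho + xi,  sum(w) = 1,  w >= 0,  xi >= 0
import numpy as np
from scipy.optimize import linprog
from scipy.spatial.distance import cdist

def fit_lp_one_class(D, nu=0.1):
    """D: (m x n) dissimilarities between m training objects and n prototypes.
    Returns prototype weights w (sparse at an LP vertex) and threshold rho."""
    m, n = D.shape
    # variable vector: [w (n), rho (1), xi (m)]
    c = np.concatenate([np.zeros(n), [1.0], np.full(m, 1.0 / (nu * m))])
    # inequality: D @ w - rho - xi <= 0
    A_ub = np.hstack([D, -np.ones((m, 1)), -np.eye(m)])
    b_ub = np.zeros(m)
    # equality: sum(w) = 1
    A_eq = np.concatenate([np.ones(n), [0.0], np.zeros(m)])[None, :]
    b_eq = [1.0]
    bounds = [(0, None)] * n + [(None, None)] + [(0, None)] * m
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
    w, rho = res.x[:n], res.x[n]
    return w, rho

def accept(d_new, w, rho):
    """d_new: dissimilarities of one test object to the n prototypes.
    The object is accepted as typical if its weighted dissimilarity
    does not exceed the learned threshold."""
    return float(d_new @ w) <= rho
```

Because LP solutions lie at vertices of the feasible polytope, many weights come out exactly zero; the prototypes with nonzero weight play the role of the reduced representation set mentioned in the abstract.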
[1] K.P. Bennett and O.L. Mangasarian. Combining support vector and mathematical programming methods for induction. In B. Schölkopf, C.J.C. Burges, and A.J. Smola, editors, Advances in Kernel Methods: Support Vector Learning, pages 307–326. MIT Press, Cambridge, MA, 1999.
[2] C. Berg, J.P.R. Christensen, and P. Ressel. Harmonic Analysis on Semigroups. Springer-Verlag, 1984.
[3] C. Campbell and K.P. Bennett. A linear programming approach to novelty detection. In Neural Information Processing Systems, pages 395–401, 2000.
[4] M.P. Dubuisson and A.K. Jain. Modified Hausdorff distance for object matching. In 12th Internat. Conference on Pattern Recognition, volume 1, pages 566–568, 1994.
[5] R.P.W. Duin. Compactness and complexity of pattern recognition problems. In Internat. Symposium on Pattern Recognition ’In Memoriam Pierre Devijver’, pages 124–128, Royal Military Academy, Brussels, 1999.
[6] R.P.W. Duin and E. Pękalska. Complexity of dissimilarity based pattern classes. In Scandinavian Conference on Image Analysis, 2001.
[7] D.W. Jacobs, D. Weinshall, and Y. Gdalyahu. Classification with non-metric distances: Image retrieval and class representation. IEEE Trans. on PAMI, 22(6):583–600, 2000.
[8] A.K. Jain and D. Zongker. Representation and recognition of handwritten digits using deformable templates. IEEE Trans. on PAMI, 19(12):1386–1391, 1997.
[9] O.L. Mangasarian. Arbitrary-norm separating plane. Operations Research Letters, 24(1-2):15–23, 1999.
[10] E. Pękalska, P. Paclik, and R.P.W. Duin. A generalized kernel approach to dissimilarity-based classification. Journal of Machine Learning Research, 2(2):175–211, 2001.
[11] B. Schölkopf, J.C. Platt, A.J. Smola, and R.C. Williamson. Estimating the support of a high-dimensional distribution. Neural Computation, 13:1443–1471, 2001.
[12] B. Schölkopf, R.C. Williamson, A.J. Smola, J. Shawe-Taylor, and J.C. Platt. Support vector method for novelty detection. In Neural Information Processing Systems, 2000.
[13] D.M.J. Tax. One-class classification. PhD thesis, Delft University of Technology, The Netherlands, 2001.
[14] D.M.J. Tax and R.P.W. Duin. Support vector data description. Machine Learning, 2002. accepted.
[15] V. Vapnik. The Nature of Statistical Learning Theory. Springer, N.Y., 1995.
[16] http://www.sidanet.org.