NIPS 2013, paper 182 — reference knowledge-graph (maker-knowledge-mining)
Source: pdf
Author: Masayuki Karasuyama, Hiroshi Mamitsuka
Abstract: Label propagation is one of the state-of-the-art methods for semi-supervised learning; it estimates labels by propagating label information through a graph. Label propagation assumes that data points (nodes) connected in a graph should have similar labels. Consequently, label estimation depends heavily on the edge weights of the graph, which represent the similarity of each node pair. We propose a method for constructing a graph that captures the manifold structure of the input features using edge weights parameterized by a similarity function. In this approach, each edge weight simultaneously represents both similarity and a local reconstruction weight, both of which are well suited to label propagation. For further justification, we provide analytical considerations, including an interpretation as cross-validation of a propagation model in the feature space and an error analysis based on a low-dimensional manifold model. Experimental results demonstrate the effectiveness of our approach on both synthetic and real datasets.
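The propagation scheme the abstract builds on can be sketched as follows. This is a minimal, standard iterative label propagation (in the style of Zhu et al. [1]), not the paper's parameterized edge-weight method; the function name and toy graph are illustrative assumptions.

```python
import numpy as np

def label_propagation(W, y_labeled, labeled_idx, n_classes, n_iter=1000):
    """Iterative label propagation on a weighted similarity graph.

    W           : (n, n) symmetric edge-weight matrix (the graph)
    y_labeled   : class indices of the labeled nodes
    labeled_idx : positions of the labeled nodes in the graph
    """
    n = W.shape[0]
    # Row-normalize W into a transition matrix P.
    P = W / W.sum(axis=1, keepdims=True)
    # One-hot initialization; unlabeled rows start at zero.
    F = np.zeros((n, n_classes))
    F[labeled_idx, y_labeled] = 1.0
    for _ in range(n_iter):
        F = P @ F                       # propagate scores along edges
        F[labeled_idx] = 0.0            # clamp labeled nodes back
        F[labeled_idx, y_labeled] = 1.0 # to their known labels
    return F.argmax(axis=1)

# Toy graph: two tight clusters joined by one weak (0.01) edge,
# with a single labeled node per cluster (nodes 0 and 3).
W = np.array([
    [0.00, 1.00, 1.00, 0.01, 0.00, 0.00],
    [1.00, 0.00, 1.00, 0.00, 0.00, 0.00],
    [1.00, 1.00, 0.00, 0.00, 0.00, 0.00],
    [0.01, 0.00, 0.00, 0.00, 1.00, 1.00],
    [0.00, 0.00, 0.00, 1.00, 0.00, 1.00],
    [0.00, 0.00, 0.00, 1.00, 1.00, 0.00],
])
labels = label_propagation(W, y_labeled=np.array([0, 1]),
                           labeled_idx=np.array([0, 3]), n_classes=2)
print(labels)  # -> [0 0 0 1 1 1]: labels spread within each cluster
```

Because propagation follows edge weights, the weak 0.01 edge barely leaks label mass across clusters, which is exactly why the graph construction (the paper's subject) matters so much.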
[1] X. Zhu, Z. Ghahramani, and J. D. Lafferty, “Semi-supervised learning using Gaussian fields and harmonic functions,” in Proc. of the 20th ICML (T. Fawcett and N. Mishra, eds.), pp. 912–919, AAAI Press, 2003.
[2] D. Zhou, O. Bousquet, T. N. Lal, J. Weston, and B. Schölkopf, “Learning with local and global consistency,” in Advances in NIPS 16 (S. Thrun, L. Saul, and B. Schölkopf, eds.), MIT Press, 2004.
[3] A. Kapoor, Y. A. Qi, H. Ahn, and R. Picard, “Hyperparameter and kernel learning for graph based semi-supervised classification,” in Advances in NIPS 18 (Y. Weiss, B. Schölkopf, and J. Platt, eds.), pp. 627–634, MIT Press, 2006.
[4] X. Zhang and W. S. Lee, “Hyperparameter learning for graph based semi-supervised learning algorithms,” in Advances in NIPS 19 (B. Schölkopf, J. Platt, and T. Hoffman, eds.), pp. 1585–1592, MIT Press, 2007.
[5] F. Wang and C. Zhang, “Label propagation through linear neighborhoods,” IEEE TKDE, vol. 20, pp. 55–67, 2008.
[6] S. Roweis and L. Saul, “Nonlinear dimensionality reduction by locally linear embedding,” Science, vol. 290, no. 5500, pp. 2323–2326, 2000.
[7] S. I. Daitch, J. A. Kelner, and D. A. Spielman, “Fitting a graph to vector data,” in Proc. of the 26th ICML, (New York, NY, USA), pp. 201–208, ACM, 2009.
[8] H. Cheng, Z. Liu, and J. Yang, “Sparsity induced similarity measure for label propagation,” in IEEE 12th ICCV, pp. 317–324, IEEE, 2009.
[9] W. Liu, J. He, and S.-F. Chang, “Large graph construction for scalable semi-supervised learning,” in Proc. of the 27th ICML, pp. 679–686, Omnipress, 2010.
[10] J. Chen and Y. Liu, “Locally linear embedding: a survey,” Artificial Intelligence Review, vol. 36, pp. 29–48, 2011.
[11] L. K. Saul and S. T. Roweis, “Think globally, fit locally: unsupervised learning of low dimensional manifolds,” JMLR, vol. 4, pp. 119–155, Dec. 2003.
[12] A. Gretton, K. M. Borgwardt, M. J. Rasch, B. Schölkopf, and A. J. Smola, “A kernel method for the two-sample problem,” in Advances in NIPS 19 (B. Schölkopf, J. C. Platt, and T. Hoffman, eds.), pp. 513–520, MIT Press, 2007.
[13] E. Elhamifar and R. Vidal, “Sparse manifold clustering and embedding,” in Advances in NIPS 24 (J. Shawe-Taylor, R. Zemel, P. Bartlett, F. Pereira, and K. Weinberger, eds.), pp. 55–63, 2011.
[14] D. Kong, C. H. Ding, H. Huang, and F. Nie, “An iterative locally linear embedding algorithm,” in Proc. of the 29th ICML (J. Langford and J. Pineau, eds.), pp. 1647–1654, Omnipress, 2012.
[15] X. Zhu, J. Kandola, Z. Ghahramani, and J. Lafferty, “Nonparametric transforms of graph kernels for semi-supervised learning,” in Advances in NIPS 17 (L. K. Saul, Y. Weiss, and L. Bottou, eds.), pp. 1641–1648, MIT Press, 2005.
[16] F. R. Bach and M. I. Jordan, “Learning spectral clustering,” in Advances in NIPS 16 (S. Thrun, L. K. Saul, and B. Schölkopf, eds.), 2004.
[17] T. Jebara, J. Wang, and S.-F. Chang, “Graph construction and b-matching for semi-supervised learning,” in Proc. of the 26th ICML (A. P. Danyluk, L. Bottou, and M. L. Littman, eds.), pp. 441–448, ACM, 2009.
[18] M. S. Baghshah and S. B. Shouraki, “Metric learning for semi-supervised clustering using pairwise constraints and the geometrical structure of data,” Intelligent Data Analysis, vol. 13, no. 6, pp. 887–899, 2009.
[19] B. Shaw, B. Huang, and T. Jebara, “Learning a distance metric from a network,” in Advances in NIPS 24 (J. Shawe-Taylor, R. Zemel, P. Bartlett, F. Pereira, and K. Weinberger, eds.), pp. 1899–1907, 2011.
[20] S. A. Nene, S. K. Nayar, and H. Murase, “Columbia object image library,” tech. rep., CUCS-005-96, 1996.
[21] T. Hastie, R. Tibshirani, and J. H. Friedman, The elements of statistical learning: data mining, inference, and prediction. New York: Springer-Verlag, 2001.
[22] Y. LeCun, L. Bottou, Y. Bengio, and P. Haffner, “Gradient-based learning applied to document recognition,” Proceedings of the IEEE, vol. 86, no. 11, pp. 2278–2324, 1998.
[23] F. Samaria and A. Harter, “Parameterisation of a stochastic model for human face identification,” in Proceedings of the Second IEEE Workshop on Applications of Computer Vision, pp. 138–142, 1994.
[24] A. Asuncion and D. J. Newman, “UCI machine learning repository.” http://www.ics.uci.edu/˜mlearn/MLRepository.html, 2007.
[25] A. Georghiades, P. Belhumeur, and D. Kriegman, “From few to many: Illumination cone models for face recognition under variable lighting and pose,” IEEE TPAMI, vol. 23, no. 6, pp. 643–660, 2001.
[26] D. B. Graham and N. M. Allinson, “Characterizing virtual eigensignatures for general purpose face recognition,” in Face Recognition: From Theory to Applications; NATO ASI Series F, Computer and Systems Sciences (H. Wechsler, P. J. Phillips, V. Bruce, F. Fogelman-Soulie, and T. S. Huang, eds.), vol. 163, pp. 446–456, 1998.
[27] L. Zelnik-Manor and P. Perona, “Self-tuning spectral clustering,” in Advances in NIPS 17, pp. 1601–1608, MIT Press, 2004.