
NIPS 2005, Paper 41: Coarse sample complexity bounds for active learning



Author: Sanjoy Dasgupta

Abstract: We characterize the sample complexity of active learning problems in terms of a parameter which takes into account the distribution over the input space, the specific target hypothesis, and the desired accuracy.
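The kind of gap such a parameter must capture is easiest to see in the textbook contrast between passive and active label complexity for one-dimensional thresholds: querying labels adaptively, by binary search over a sorted unlabeled pool, cuts the number of labels needed from roughly 1/epsilon to roughly log(1/epsilon). The Python sketch below is an illustrative aside rather than code or notation from the paper; the names active_learn_threshold and oracle are hypothetical, and it assumes a noise-free threshold target.

import random

def active_learn_threshold(oracle, pool, budget):
    # Binary-search active learner for a 1-D threshold classifier.
    # oracle(x) returns 0 for points below the (unknown) threshold and 1 at or above it;
    # pool is a sorted list of unlabeled points; budget caps the number of label queries.
    lo, hi = 0, len(pool) - 1
    queries = 0
    while lo < hi and queries < budget:
        mid = (lo + hi) // 2
        if oracle(pool[mid]) == 1:
            hi = mid          # threshold lies at or below pool[mid]
        else:
            lo = mid + 1      # threshold lies above pool[mid]
        queries += 1
    return pool[lo], queries

if __name__ == "__main__":
    true_threshold = 0.37                        # assumed value, for illustration only
    oracle = lambda x: int(x >= true_threshold)
    pool = sorted(random.random() for _ in range(10000))
    estimate, used = active_learn_threshold(oracle, pool, budget=20)
    print("estimated threshold ~ %.3f after %d label queries" % (estimate, used))

With 10,000 unlabeled points the loop asks for about log2(10,000), roughly 14 labels, whereas labeling the entire pool would take 10,000. Whether an exponential saving of this kind is available depends on the input distribution, the target hypothesis, and the accuracy sought, which is exactly what the parameter in the abstract is meant to account for.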


References

[1] D. Angluin. Queries revisited. Algorithmic Learning Theory (ALT), 2001.

[2] M.-F. Balcan and A. Blum. A PAC-style model for learning from labeled and unlabeled data. Eighteenth Annual Conference on Learning Theory (COLT), 2005.

[3] D. Cohn, L. Atlas, and R. Ladner. Improving generalization with active learning. Machine Learning, 15(2):201–221, 1994.

[4] S. Dasgupta. Analysis of a greedy active learning strategy. Advances in Neural Information Processing Systems (NIPS), 2004.

[5] S. Dasgupta. Full version of this paper at www.cs.ucsd.edu/~dasgupta/papers/sample.ps.

[6] S. Dasgupta, A. Kalai, and C. Monteleoni. Analysis of perceptron-based active learning. Eighteenth Annual Conference on Learning Theory (COLT), 2005.

[7] Y. Freund, S. Seung, E. Shamir, and N. Tishby. Selective sampling using the query by committee algorithm. Machine Learning, 28(2-3):133–168, 1997.

[8] S. Goldman and M. Kearns. On the complexity of teaching. Journal of Computer and System Sciences, 50(1):20–31, 1995.

[9] D. Haussler. Decision-theoretic generalizations of the PAC model for neural net and other learning applications. Information and Computation, 100(1):78–150, 1992.

[10] J. Shawe-Taylor, P. Bartlett, R. Williamson, and M. Anthony. Structural risk minimization over data-dependent hierarchies. IEEE Transactions on Information Theory, 44(5):1926–1940, 1998.