NIPS 2004, paper 8
Authors: Olivier Chapelle, Zaïd Harchaoui
Abstract: Choice-based conjoint analysis builds models of consumer preferences over products from answers gathered in questionnaires. Our main goal is to bring tools from the machine learning community to bear on this problem. To that end, we propose two algorithms that quickly and accurately estimate consumer preferences.
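To make the setting concrete, here is a minimal sketch of textbook choice-based conjoint estimation: a respondent answers choice questions, and part-worth utilities are recovered by maximum likelihood under a multinomial logit model. This is a hypothetical illustration of the general problem, not the paper's own algorithms; all variable names and sizes are invented for the example.

```python
import numpy as np

# Hypothetical illustration: recovering part-worth utilities from
# choice-based conjoint answers with a standard multinomial logit
# model (not the algorithms proposed in the paper).

rng = np.random.default_rng(0)
n_attrs = 4          # product attributes (part-worths to estimate)
n_questions = 200    # choice questions in the questionnaire
n_alts = 3           # product profiles shown per question

true_w = np.array([1.0, -0.5, 0.8, 0.3])   # assumed "true" part-worths

# Each question presents n_alts random product profiles (attribute vectors).
X = rng.normal(size=(n_questions, n_alts, n_attrs))

# Simulated answers: the respondent picks the alternative with highest
# utility plus Gumbel noise, i.e. the standard logit choice model.
utils = X @ true_w + rng.gumbel(size=(n_questions, n_alts))
choices = utils.argmax(axis=1)

def neg_log_lik_grad(w):
    """Gradient of the negative log-likelihood of the logit model."""
    v = X @ w                                    # utilities (questions, alts)
    v -= v.max(axis=1, keepdims=True)            # numerical stability
    p = np.exp(v)
    p /= p.sum(axis=1, keepdims=True)            # choice probabilities
    chosen = X[np.arange(n_questions), choices]  # picked profiles
    expected = (p[:, :, None] * X).sum(axis=1)   # probability-weighted profiles
    return (expected - chosen).mean(axis=0)

# Plain gradient descent on the negative log-likelihood.
w = np.zeros(n_attrs)
for _ in range(500):
    w -= 0.5 * neg_log_lik_grad(w)

print(np.round(w, 2))  # estimates should lie near true_w
```

With enough questions the maximum-likelihood estimate approaches the true part-worths; the paper's contribution concerns doing this accurately from far fewer, carefully chosen questions.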
[1] B. E. Boser, I. M. Guyon, and V. N. Vapnik. A training algorithm for optimal margin classifiers. In Proc. 5th Annu. Workshop on Comput. Learning Theory, 1992.
[2] W. Chu and Z. Ghahramani. Gaussian processes for ordinal regression. Technical report, University College London, 2004.
[3] T. Evgeniou, C. Boussios, and G. Zacharia. Generalized robust conjoint estimation. Marketing Science, 25, 2005.
[4] Z. Harchaoui. Statistical learning approaches to conjoint estimation. Technical report, Max Planck Institute for Biological Cybernetics, to appear.
[5] R. Herbrich, T. Graepel, and K. Obermayer. Large margin rank boundaries for ordinal regression. In Advances in Large Margin Classifiers. MIT Press, 2000.
[6] J. Huber and K. Zwerina. The importance of utility balance in efficient choice designs. Journal of Marketing Research, 33, 1996.
[7] T. S. Jaakkola and D. Haussler. Probabilistic kernel regression models. In Artificial Intelligence and Statistics, 1999.
[8] T. S. Jaakkola and M. I. Jordan. Bayesian logistic regression: a variational approach. Statistics and Computing, 10:25–37, 2000.
[9] T. Jebara. Convex invariance learning. In Artificial Intelligence and Statistics, 2003.
[10] C. A. Micchelli and M. Pontil. Kernels for multi–task learning. In Advances in Neural Information Processing Systems 17, 2005.
[11] Sawtooth Software. Research paper series. Available at www.sawtoothsoftware.com/techpap.shtml#hbrel.
[12] B. Schölkopf and A. Smola. Learning with Kernels. MIT Press, 2002.
[13] A. Schwaighofer, V. Tresp, and K. Yu. Hierarchical Bayesian modelling with Gaussian processes. In Advances in Neural Information Processing Systems 17, 2005.
[14] M. Tipping. Bayesian inference: Principles and practice. In Advanced Lectures on Machine Learning. Springer, 2004.
[15] S. Tong and D. Koller. Support vector machine active learning with applications to text classification. Journal of Machine Learning Research, 2, 2001.
[16] O. Toubia, J. R. Hauser, and D. I. Simester. Polyhedral methods for adaptive choice-based conjoint analysis. Journal of Marketing Research, 41(1):116–131, 2004.
[17] V. Vapnik and O. Chapelle. Bounds on error expectation for support vector machines. Neural Computation, 12(9), 2000.
[18] C. K. I. Williams and D. Barber. Bayesian classification with Gaussian processes. IEEE Trans. Pattern Anal. Mach. Intell., 20, 1998.