hunch net-2010-10-17-Partha Niyogi has died

…from brain cancer. I asked Misha, who worked with him, to write about it.
Partha Niyogi, Louis Block Professor in Computer Science and Statistics at the University of Chicago, passed away on October 1, 2010, aged 43.

I first met Partha Niyogi almost exactly ten years ago, when I was a graduate student in math and he had just started as a faculty member in Computer Science and Statistics at the University of Chicago. Strangely, we first talked at length because of a somewhat convoluted mathematical argument in a paper on pattern recognition. I asked him some questions about the paper, and even though the topic was new to him, he put serious thought into it, and we started meeting regularly. We made significant progress and developed a line of research that stemmed initially from just trying to understand that one paper and to simplify one derivation. I think this was typical of Partha, showing both his intellectual curiosity and his intuition for the serendipitous: a sense and focus for inquiries worth pursuing, no matter how remote or challenging, and a unique vision brought to new areas. We worked together continuously from that first meeting until he became too sick to continue. Partha was a great adviser and a close friend to me; I am deeply thankful to him for his guidance, intellectual inspiration, and friendship.

Partha had a broad range of research interests centered on the problem of learning, which had interested him since he was an undergraduate at the Indian Institute of Technology. His research had three general themes: geometric methods in machine learning, particularly manifold methods; language evolution and language learning (he recently published a 500-page monograph on the subject); and speech analysis and recognition. I will not discuss his individual works here; a more in-depth summary of his research is in the University of Chicago Computer Science department obituary. In every one of these areas he had his own approach: distinct, clear, and unafraid to challenge unexamined conventional wisdom. To lose this intellectually rigorous but open-minded vision is a blow not just to those of us who knew him and worked with him, but to the field of machine learning itself.

I owe a lot to Partha, to his insight and his thoughtful attitude toward research and every aspect of life. It was a great privilege to be Partha's student, collaborator, and friend; his passing leaves deep sadness and emptiness. It is hard to believe Partha is no longer with us, but his friendship and what I learned from him will stay with me for the rest of my life.