hunch_net hunch_net-2009 hunch_net-2009-375 knowledge-graph by maker-knowledge-mining

375 hunch net-2009-10-26-NIPS workshops


meta info for this blog

Source: html

Introduction: Many of the NIPS workshops have a deadline about now, and the NIPS early registration deadline is Nov. 6. Several interest me: Adaptive Sensing, Active Learning, and Experimental Design, due 10/27. Discrete Optimization in Machine Learning: Submodularity, Sparsity & Polyhedra, due Nov. 6. Large-Scale Machine Learning: Parallelism and Massive Datasets, due 10/23 (i.e. past). Analysis and Design of Algorithms for Interactive Machine Learning, due 10/30. And I’m sure many of the others interest others. Workshops are great as a mechanism for research, so take a look if there is any chance you might be interested.


Summary: the most important sentences generated by the tfidf model

sentIndex sentText sentNum sentScore

1 Many of the NIPS workshops have a deadline about now, and the NIPS early registration deadline is Nov. [sent-1, score-1.092]

2 Several interest me: Adaptive Sensing, Active Learning, and Experimental Design due 10/27. [sent-3, score-0.54]

3 Discrete Optimization in Machine Learning: Submodularity, Sparsity & Polyhedra , due Nov. [sent-4, score-0.345]

4 Large-Scale Machine Learning: Parallelism and Massive Datasets , due 10/23 (i. [sent-6, score-0.345]

5 past) Analysis and Design of Algorithms for Interactive Machine Learning , due 10/30. [sent-8, score-0.345]

6 And I’m sure many of the others interest others. [sent-9, score-0.469]

7 Workshops are great as a mechanism for research, so take a look if there is any chance you might be interested. [sent-10, score-0.571]
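The sentScore column above appears to be, roughly, a sum of per-word tf-idf weights over each sentence. Below is a minimal sketch of that kind of scoring, reusing a few of the (word, weight) pairs listed further down; the tool's actual tokenization, normalization, and weighting are unknown, so this will not reproduce the exact scores.

```python
# Hypothetical sentence scoring from per-word tf-idf weights (toy subset of the
# weights listed in this dump); real pipeline details are assumptions.
import re

weights = {"due": 0.345, "deadline": 0.265, "workshops": 0.259, "nips": 0.211,
           "registration": 0.162, "early": 0.141, "interest": 0.195}

def sent_score(sentence):
    tokens = re.findall(r"[a-z]+", sentence.lower())
    return sum(weights.get(t, 0.0) for t in tokens)

sentences = [
    "Many of the NIPS workshops have a deadline about now, and the NIPS early registration deadline is Nov.",
    "Several interest me: Adaptive Sensing, Active Learning, and Experimental Design due 10/27.",
]
for s in sorted(sentences, key=sent_score, reverse=True):
    print(round(sent_score(s), 3), s)
```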


similar blogs computed by the tfidf model

tfidf for this blog:

wordName wordTfidf (topN-words)

[('due', 0.345), ('deadline', 0.265), ('submodularity', 0.261), ('workshops', 0.259), ('discrete', 0.228), ('sparsity', 0.217), ('nips', 0.211), ('massive', 0.209), ('parallelism', 0.209), ('sensing', 0.209), ('design', 0.208), ('interest', 0.195), ('adaptive', 0.176), ('interactive', 0.169), ('registration', 0.162), ('experimental', 0.143), ('early', 0.141), ('past', 0.139), ('datasets', 0.134), ('optimization', 0.125), ('chance', 0.121), ('look', 0.117), ('sure', 0.114), ('machine', 0.114), ('mechanism', 0.11), ('active', 0.109), ('analysis', 0.101), ('others', 0.097), ('interested', 0.091), ('take', 0.086), ('great', 0.08), ('many', 0.063), ('learning', 0.062), ('algorithms', 0.059), ('research', 0.057), ('might', 0.057), ('several', 0.05)]
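The simValue numbers in the similar-blogs list that follows are presumably cosine similarities between per-post tf-idf vectors like the one above. Here is a minimal scikit-learn sketch under that assumption; the corpus is a toy stand-in for the full blog archive.

```python
# Hypothetical tf-idf similarity between this post (row 0) and other posts.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

corpus = [
    "Many of the NIPS workshops have a deadline about now ...",          # this post
    "A good workshop is often far more interesting than conference papers ...",
    "I'm the workshops chair for ICML this year ...",
]
X = TfidfVectorizer(stop_words="english").fit_transform(corpus)
sims = cosine_similarity(X[0:1], X).ravel()        # one simValue per post
for idx in sims.argsort()[::-1]:                   # most similar first
    print(idx, round(float(sims[idx]), 4))
```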

similar blogs list:

simIndex simValue blogId blogTitle

same-blog 1 0.99999988 375 hunch net-2009-10-26-NIPS workshops


2 0.21269813 46 hunch net-2005-03-24-The Role of Workshops

Introduction: A good workshop is often far more interesting than the papers at a conference. This happens because a workshop has a much tighter focus than a conference. Since you choose the workshops fitting your interest, the increased relevance can greatly enhance the level of your interest and attention. Roughly speaking, a workshop program consists of elements related to a subject of your interest. The main conference program consists of elements related to someone’s interest (which is rarely your own). Workshops are more about doing research while conferences are more about presenting research. Several conferences have associated workshop programs, some with deadlines due shortly: ICML workshops (due April 1), IJCAI workshops (deadlines vary), KDD workshops (not yet finalized). Anyone going to these conferences should examine the workshops and see if any are of interest. (If none are, then maybe you should organize one next year.)

3 0.20045653 379 hunch net-2009-11-23-ICML 2009 Workshops (and Tutorials)

Introduction: I’m the workshops chair for ICML this year. As such, I would like to personally encourage people to consider running a workshop. My general view of workshops is that they are excellent as opportunities to discuss and develop research directions—some of my best work has come from collaborations at workshops and several workshops have substantially altered my thinking about various problems. My experience running workshops is that setting them up and making them fly often appears much harder than it actually is, and the workshops often come off much better than expected in the end. Submissions are due January 18, two weeks before papers. Similarly, Ben Taskar is looking for good tutorials, which is complementary. Workshops are about exploring a subject, while a tutorial is about distilling it down into an easily taught essence, a vital part of the research process. Tutorials are due February 13, two weeks after papers.

4 0.19552737 216 hunch net-2006-11-02-2006 NIPS workshops

Introduction: I expect the NIPS 2006 workshops to be quite interesting, and recommend going for anyone interested in machine learning research. (Most or all of the workshops webpages can be found two links deep.)

5 0.19113058 264 hunch net-2007-09-30-NIPS workshops are out.

Introduction: Here. I’m particularly interested in the Web Search, Efficient ML, and (of course) Learning Problem Design workshops, but there are many others to check out as well. Workshops are a great chance to make progress on or learn about a topic. Relevance and interaction amongst diverse people can sometimes be magical.

6 0.18916802 285 hunch net-2008-01-23-Why Workshop?

7 0.18003863 71 hunch net-2005-05-14-NIPS

8 0.17210884 482 hunch net-2013-05-04-COLT and ICML registration

9 0.17150095 465 hunch net-2012-05-12-ICML accepted papers and early registration

10 0.17048672 113 hunch net-2005-09-19-NIPS Workshops

11 0.15918103 459 hunch net-2012-03-13-The Submodularity workshop and Lucca Professorship

12 0.14423376 488 hunch net-2013-08-31-Extreme Classification workshop at NIPS

13 0.14312226 124 hunch net-2005-10-19-Workshop: Atomic Learning

14 0.13264336 345 hunch net-2009-03-08-Prediction Science

15 0.13113771 293 hunch net-2008-03-23-Interactive Machine Learning

16 0.12763478 279 hunch net-2007-12-19-Cool and interesting things seen at NIPS

17 0.12103271 266 hunch net-2007-10-15-NIPS workshops extended to 3 days

18 0.12000255 141 hunch net-2005-12-17-Workshops as Franchise Conferences

19 0.11982569 387 hunch net-2010-01-19-Deadline Season, 2010

20 0.11435726 443 hunch net-2011-09-03-Fall Machine Learning Events


similar blogs computed by the lsi model

lsi for this blog:

topicId topicWeight

[(0, 0.175), (1, -0.141), (2, -0.197), (3, -0.218), (4, 0.057), (5, 0.168), (6, 0.137), (7, 0.038), (8, 0.07), (9, 0.069), (10, 0.029), (11, 0.05), (12, 0.02), (13, -0.03), (14, -0.045), (15, 0.046), (16, 0.063), (17, -0.115), (18, -0.031), (19, -0.026), (20, 0.049), (21, 0.072), (22, 0.017), (23, -0.071), (24, -0.114), (25, -0.025), (26, 0.151), (27, -0.103), (28, -0.036), (29, -0.103), (30, 0.104), (31, 0.006), (32, 0.056), (33, -0.108), (34, -0.025), (35, -0.073), (36, -0.104), (37, -0.05), (38, -0.077), (39, -0.009), (40, 0.067), (41, -0.039), (42, 0.063), (43, 0.089), (44, -0.08), (45, -0.041), (46, -0.093), (47, -0.036), (48, 0.036), (49, 0.053)]
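The 50-entry vector above looks like an LSI (latent semantic indexing) embedding of this post, with simValue again a cosine similarity in topic space. Below is a minimal sketch using scikit-learn's TruncatedSVD over a tf-idf matrix; the actual tool may use a different LSI implementation, and the toy corpus and 2 components stand in for the full archive and the ~50 topics shown.

```python
# Hypothetical LSI-style embedding via truncated SVD of the tf-idf matrix.
from sklearn.decomposition import TruncatedSVD
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

corpus = [
    "NIPS workshops have a deadline about now, early registration is November 6",
    "Attendance at the NIPS workshops is highly recommended for research",
    "A good workshop is often far more interesting than conference papers",
]
X = TfidfVectorizer(stop_words="english").fit_transform(corpus)
topics = TruncatedSVD(n_components=2, random_state=0).fit_transform(X)
sims = cosine_similarity(topics[0:1], topics).ravel()   # simValue per post
print(topics[0], sims)
```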

similar blogs list:

simIndex simValue blogId blogTitle

same-blog 1 0.96272177 375 hunch net-2009-10-26-NIPS workshops


2 0.64316756 113 hunch net-2005-09-19-NIPS Workshops

Introduction: Attendance at the NIPS workshops is highly recommended for both research and learning. Unfortunately, there does not yet appear to be a public list of workshops. However, I found the following workshop webpages of interest: Machine Learning in Finance, Learning to Rank, Foundations of Active Learning, and Machine Learning Based Robotics in Unstructured Environments. There are many more workshops. In fact, there are so many that it is not plausible anyone can attend every workshop they are interested in. Maybe in future years the organizers can spread them out over more days to reduce overlap. Many of these workshops are accepting presentation proposals (due mid-October).

3 0.63650101 46 hunch net-2005-03-24-The Role of Workshops

Introduction: A good workshop is often far more interesting than the papers at a conference. This happens because a workshop has a much tighter focus than a conference. Since you choose the workshops fitting your interest, the increased relevance can greatly enhance the level of your interest and attention. Roughly speaking, a workshop program consists of elements related to a subject of your interest. The main conference program consists of elements related to someone’s interest (which is rarely your own). Workshops are more about doing research while conferences are more about presenting research. Several conferences have associated workshop programs, some with deadlines due shortly: ICML workshops (due April 1), IJCAI workshops (deadlines vary), KDD workshops (not yet finalized). Anyone going to these conferences should examine the workshops and see if any are of interest. (If none are, then maybe you should organize one next year.)

4 0.62464279 264 hunch net-2007-09-30-NIPS workshops are out.

Introduction: Here. I’m particularly interested in the Web Search, Efficient ML, and (of course) Learning Problem Design workshops, but there are many others to check out as well. Workshops are a great chance to make progress on or learn about a topic. Relevance and interaction amongst diverse people can sometimes be magical.

5 0.62060511 71 hunch net-2005-05-14-NIPS

Introduction: NIPS is the big winter conference of learning. Paper due date: June 3rd. (Tweaked thanks to Fei Sha.) Location: Vancouver (main program) Dec. 5-8 and Whistler (workshops) Dec. 9-10, BC, Canada. NIPS is larger than all of the other learning conferences, partly because it’s the only one at that time of year. I recommend the workshops which are often quite interesting and energetic.

6 0.60610247 285 hunch net-2008-01-23-Why Workshop?

7 0.59937376 379 hunch net-2009-11-23-ICML 2009 Workshops (and Tutorials)

8 0.5981093 266 hunch net-2007-10-15-NIPS workshops extended to 3 days

9 0.57760751 124 hunch net-2005-10-19-Workshop: Atomic Learning

10 0.57674396 216 hunch net-2006-11-02-2006 NIPS workshops

11 0.5671283 482 hunch net-2013-05-04-COLT and ICML registration

12 0.52861077 459 hunch net-2012-03-13-The Submodularity workshop and Lucca Professorship

13 0.5229435 488 hunch net-2013-08-31-Extreme Classification workshop at NIPS

14 0.52279705 443 hunch net-2011-09-03-Fall Machine Learning Events

15 0.50483632 465 hunch net-2012-05-12-ICML accepted papers and early registration

16 0.50138217 481 hunch net-2013-04-15-NEML II

17 0.49174294 421 hunch net-2011-01-03-Herman Goldstine 2011

18 0.45868865 389 hunch net-2010-02-26-Yahoo! ML events

19 0.44780344 279 hunch net-2007-12-19-Cool and interesting things seen at NIPS

20 0.42769161 137 hunch net-2005-12-09-Machine Learning Thoughts


similar blogs computed by the lda model

lda for this blog:

topicId topicWeight

[(27, 0.179), (38, 0.073), (55, 0.222), (77, 0.275), (94, 0.105)]
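The sparse (topicId, topicWeight) pairs above read like an LDA document-topic distribution, with similarity again computed between those distributions. Below is a minimal sketch with scikit-learn's LatentDirichletAllocation; the tool's actual LDA implementation, topic count, and corpus are assumptions.

```python
# Hypothetical LDA topic distributions and similarity between them.
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.metrics.pairwise import cosine_similarity

corpus = [
    "NIPS workshops have a deadline about now, early registration is November 6",
    "Shravan and Alex's LDA code is released and scales across many machines",
    "Rob Schapire is talking about playing repeated games at the NYC ML meetup",
]
counts = CountVectorizer(stop_words="english").fit_transform(corpus)
lda = LatentDirichletAllocation(n_components=5, random_state=0)
theta = lda.fit_transform(counts)                  # per-post topic distributions

this_post = [(t, round(float(w), 3)) for t, w in enumerate(theta[0]) if w > 0.1]
sims = cosine_similarity(theta[0:1], theta).ravel()
print(this_post, sims)
```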

similar blogs list:

simIndex simValue blogId blogTitle

1 0.91330659 436 hunch net-2011-06-22-Ultra LDA

Introduction: Shravan and Alex’s LDA code is released. On a single machine, I’m not sure how it currently compares to the online LDA in VW, but the ability to effectively scale across very many machines is surely interesting.

2 0.9032135 405 hunch net-2010-08-21-Rob Schapire at NYC ML Meetup

Introduction: I’ve been wanting to attend the NYC ML Meetup for some time and hope to make it next week on the 25th. Rob Schapire is talking about “Playing Repeated Games”, which in my experience is far more relevant to machine learning than the title might indicate.

same-blog 3 0.89013857 375 hunch net-2009-10-26-NIPS workshops


4 0.85279971 206 hunch net-2006-09-09-How to solve an NP hard problem in quadratic time

Introduction: This title is a lie, but it is a special lie which has a bit of truth. If n players each play each other, you have a tournament. How do you order the players from weakest to strongest? The standard first attempt is “find the ordering which agrees with the tournament on as many player pairs as possible”. This is called the “minimum feedback arcset” problem in the CS theory literature and it is a well known NP-hard problem. A basic guarantee holds for the solution to this problem: if there is some “true” intrinsic ordering, and the outcome of the tournament disagrees k times (due to noise for instance), then the output ordering will disagree with the original ordering on at most 2k edges (and no solution can be better). One standard approach to tractably solving an NP-hard problem is to find another algorithm with an approximation guarantee. For example, Don Coppersmith, Lisa Fleischer, and Atri Rudra proved that ordering players according to the number of wins is

5 0.84001803 165 hunch net-2006-03-23-The Approximation Argument

Introduction: An argument is sometimes made that the Bayesian way is the “right” way to do machine learning. This is a serious argument which deserves a serious reply. The approximation argument is a serious reply for which I have not yet seen a reply 2. The idea for the Bayesian approach is quite simple, elegant, and general. Essentially, you first specify a prior P(D) over possible processes D producing the data, observe the data, then condition on the data according to Bayes law to construct a posterior: P(D|x) = P(x|D)P(D)/P(x). After this, hard decisions are made (such as “turn left” or “turn right”) by choosing the one which minimizes the expected (with respect to the posterior) loss. This basic idea is reused thousands of times with various choices of P(D) and loss functions, which is unsurprising given the many nice properties: There is an extremely strong associated guarantee: If the actual distribution generating the data is drawn from P(D) there is no better method.

6 0.8077572 269 hunch net-2007-10-24-Contextual Bandits

7 0.74725503 388 hunch net-2010-01-24-Specializations of the Master Problem

8 0.74647474 317 hunch net-2008-09-12-How do we get weak action dependence for learning with partial observations?

9 0.72484547 270 hunch net-2007-11-02-The Machine Learning Award goes to …

10 0.7186296 453 hunch net-2012-01-28-Why COLT?

11 0.71429759 395 hunch net-2010-04-26-Compassionate Reviewing

12 0.70536959 452 hunch net-2012-01-04-Why ICML? and the summer conferences

13 0.70418257 40 hunch net-2005-03-13-Avoiding Bad Reviewing

14 0.70188904 423 hunch net-2011-02-02-User preferences for search engines

15 0.69516367 379 hunch net-2009-11-23-ICML 2009 Workshops (and Tutorials)

16 0.67947471 116 hunch net-2005-09-30-Research in conferences

17 0.67933577 437 hunch net-2011-07-10-ICML 2011 and the future

18 0.6791454 454 hunch net-2012-01-30-ICML Posters and Scope

19 0.67739052 90 hunch net-2005-07-07-The Limits of Learning Theory

20 0.67555547 44 hunch net-2005-03-21-Research Styles in Machine Learning