hunch_net hunch_net-2005 hunch_net-2005-107 knowledge-graph by maker-knowledge-mining

107 hunch net-2005-09-05-Site Update


meta info for this blog

Source: html

Introduction: I tweaked the site in a number of ways today, including: Updating to WordPress 1.5. Installing and heavily tweaking the Geekniche theme. Update: I switched back to a tweaked version of the old theme. Adding the Customizable Post Listings plugin. Installing the StatTraq plugin. Updating some of the links. I particularly recommend looking at the computer research policy blog. Adding threaded comments. This doesn’t thread old comments obviously, but the extra structure may be helpful for new ones. Overall, I think this is an improvement, and it addresses a few of my earlier problems. If you have any difficulties or anything seems “not quite right”, please speak up. A few other tweaks to the site may happen in the near future.


Summary: the most important sentences generated by the tfidf model

sentIndex sentText sentNum sentScore

1 I tweaked the site in a number of ways today, including: Updating to WordPress 1.5. [sent-1, score-0.615]

2 Installing and heavily tweaking the Geekniche theme. [sent-3, score-0.305]

3 Update: I switched back to a tweaked version of the old theme. [sent-4, score-0.823]

4 I particularly recommend looking at the computer research policy blog. [sent-8, score-0.5]

5 This doesn’t thread old comments obviously, but the extra structure may be helpful for new ones. [sent-10, score-0.909]

6 Overall, I think this is an improvement, and it addresses a few of my earlier problems . [sent-11, score-0.346]

7 If you have any difficulties or anything seems “not quite right”, please speak up. [sent-12, score-0.515]

8 A few other tweaks to the site may happen in the near future. [sent-13, score-0.637]
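The scores above come from weighting each extracted sentence by its TF-IDF terms. A minimal pure-Python sketch of this style of scoring (illustrative only; the sentences and the exact weighting used by the dataset's pipeline are assumptions):

```python
import math
from collections import Counter

def tfidf_sentence_scores(sentences):
    """Score each sentence by the summed TF-IDF weight of its terms.

    TF is the in-sentence term count; IDF is log(N / df), where df is
    the number of sentences containing the term.
    """
    docs = [s.lower().split() for s in sentences]
    n = len(docs)
    df = Counter()
    for doc in docs:
        df.update(set(doc))  # count each term once per sentence
    scores = []
    for doc in docs:
        tf = Counter(doc)
        scores.append(sum(c * math.log(n / df[t]) for t, c in tf.items()))
    return scores

scores = tfidf_sentence_scores([
    "installing and heavily tweaking the theme",
    "updating some of the links",
    "updating the theme",
])
```

Sentences dominated by rare terms score highest, which is why the list above favors the distinctive "tweaked"/"installing" sentences.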


similar blogs computed by the tfidf model

tfidf for this blog:

wordName wordTfidf (topN-words)

[('installing', 0.415), ('updating', 0.307), ('tweaked', 0.285), ('adding', 0.243), ('site', 0.218), ('old', 0.194), ('comments', 0.192), ('thread', 0.184), ('switched', 0.171), ('threaded', 0.171), ('tweaking', 0.171), ('tweaks', 0.161), ('wordpress', 0.154), ('addresses', 0.138), ('heavily', 0.134), ('today', 0.13), ('recommend', 0.127), ('earlier', 0.117), ('anything', 0.115), ('speak', 0.115), ('update', 0.106), ('obviously', 0.104), ('extra', 0.104), ('difficulties', 0.103), ('improvement', 0.101), ('policy', 0.096), ('looking', 0.095), ('happen', 0.095), ('please', 0.094), ('back', 0.094), ('overall', 0.092), ('near', 0.09), ('structure', 0.085), ('computer', 0.082), ('version', 0.079), ('helpful', 0.077), ('including', 0.075), ('may', 0.073), ('doesn', 0.073), ('future', 0.069), ('post', 0.068), ('ways', 0.068), ('particularly', 0.059), ('right', 0.054), ('quite', 0.051), ('think', 0.051), ('number', 0.044), ('research', 0.041), ('problems', 0.04), ('seems', 0.037)]
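The similarity values in the list below are plausibly cosine similarities between such sparse TF-IDF vectors. A hedged sketch, reusing a few of the weights above (the second post's weights are invented for illustration):

```python
import math

def cosine(u, v):
    """Cosine similarity between two sparse term -> weight dicts."""
    dot = sum(w * v.get(t, 0.0) for t, w in u.items())
    norm_u = math.sqrt(sum(w * w for w in u.values()))
    norm_v = math.sqrt(sum(w * w for w in v.values()))
    return dot / (norm_u * norm_v) if norm_u and norm_v else 0.0

# Top TF-IDF weights for this post, from the list above.
site_update = {"installing": 0.415, "updating": 0.307, "tweaked": 0.285}
# Hypothetical weights for a second post.
server_update = {"updating": 0.35, "wordpress": 0.30, "server": 0.40}

sim = cosine(site_update, server_update)
```

Only shared terms ("updating" here) contribute to the dot product, so posts about the same site-maintenance vocabulary rank highest.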

similar blogs list:

simIndex simValue blogId blogTitle

same-blog 1 1.0 107 hunch net-2005-09-05-Site Update


2 0.25838485 354 hunch net-2009-05-17-Server Update

Introduction: The hunch.net server has been updated. I’ve taken the opportunity to upgrade the version of wordpress which caused cascading changes. Old threaded comments are now flattened. The system we used to use ( Brian’s threaded comments ) appears incompatible with the new threading system built into wordpress. I haven’t yet figured out a workaround. I setup a feedburner account . I added an RSS aggregator for both Machine Learning and other research blogs that I like to follow. This is something that I’ve wanted to do for awhile. Many other minor changes in font and format, with some help from Alina . If you have any suggestions for site tweaks, please speak up.

3 0.11688511 25 hunch net-2005-02-20-At One Month

Introduction: This is near the one month point, so it seems appropriate to consider meta-issues for the moment. The number of posts is a bit over 20. The number of people speaking up in discussions is about 10. The number of people viewing the site is somewhat more than 100. I am (naturally) dissatisfied with many things. Many of the potential uses haven’t been realized. This is partly a matter of opportunity (no conferences in the last month), partly a matter of will (no open problems because it’s hard to give them up), and partly a matter of tradition. In academia, there is a strong tradition of trying to get everything perfectly right before presentation. This is somewhat contradictory to the nature of making many posts, and it’s definitely contradictory to the idea of doing “public research”. If that sort of idea is to pay off, it must be significantly more successful than previous methods. In an effort to continue experimenting, I’m going to use the next week as “open problems we

4 0.10608271 71 hunch net-2005-05-14-NIPS

Introduction: NIPS is the big winter conference of learning. Paper due date: June 3rd. (Tweaked thanks to Fei Sha .) Location: Vancouver (main program) Dec. 5-8 and Whistler (workshops) Dec 9-10, BC, Canada NIPS is larger than all of the other learning conferences, partly because it’s the only one at that time of year. I recommend the workshops which are often quite interesting and energetic.

5 0.10052036 191 hunch net-2006-07-08-MaxEnt contradicts Bayes Rule?

Introduction: A few weeks ago I read this . David Blei and I spent some time thinking hard about this a few years back (thanks to Kary Myers for pointing us to it): In short I was thinking that “bayesian belief updating” and “maximum entropy” were two orthogonal principles. But it appears that they are not, and that they can even be in conflict ! Example (from Kass 1996); consider a Die (6 sides), consider prior knowledge E[X]=3.5. Maximum entropy leads to P(X)= (1/6, 1/6, 1/6, 1/6, 1/6, 1/6). Now consider a new piece of evidence A=”X is an odd number” Bayesian posterior P(X|A)= P(A|X) P(X) = (1/3, 0, 1/3, 0, 1/3, 0). But MaxEnt with the constraints E[X]=3.5 and E[Indicator function of A]=1 leads to (.22, 0, .32, 0, .47, 0) !! (note that E[Indicator function of A]=P(A)) Indeed, for MaxEnt, because there is no more ‘6′, big numbers must be more probable to ensure an average of 3.5. For bayesian updating, P(X|A) doesn’t have to have a 3.5

6 0.094774708 182 hunch net-2006-06-05-Server Shift, Site Tweaks, Suggestions?

7 0.092073351 220 hunch net-2006-11-27-Continuizing Solutions

8 0.090984009 297 hunch net-2008-04-22-Taking the next step

9 0.082514428 363 hunch net-2009-07-09-The Machine Learning Forum

10 0.081050448 342 hunch net-2009-02-16-KDNuggets

11 0.079652853 108 hunch net-2005-09-06-A link

12 0.078163445 225 hunch net-2007-01-02-Retrospective

13 0.077581808 122 hunch net-2005-10-13-Site tweak

14 0.071196124 382 hunch net-2009-12-09-Future Publication Models @ NIPS

15 0.069602944 365 hunch net-2009-07-31-Vowpal Wabbit Open Source Project

16 0.058850199 458 hunch net-2012-03-06-COLT-ICML Open Questions and ICML Instructions

17 0.055678025 393 hunch net-2010-04-14-MLcomp: a website for objectively comparing ML algorithms

18 0.052936234 490 hunch net-2013-11-09-Graduates and Postdocs

19 0.052509092 311 hunch net-2008-07-26-Compositional Machine Learning Algorithm Design

20 0.05248655 401 hunch net-2010-06-20-2010 ICML discussion site


similar blogs computed by the lsi model

lsi for this blog:

topicId topicWeight

[(0, 0.105), (1, -0.029), (2, -0.039), (3, 0.054), (4, -0.026), (5, 0.022), (6, 0.011), (7, -0.057), (8, -0.012), (9, 0.024), (10, -0.04), (11, -0.017), (12, -0.044), (13, 0.038), (14, 0.044), (15, -0.08), (16, -0.094), (17, -0.102), (18, 0.038), (19, 0.069), (20, -0.048), (21, -0.038), (22, -0.051), (23, -0.119), (24, -0.104), (25, 0.004), (26, -0.072), (27, 0.081), (28, 0.014), (29, -0.061), (30, -0.026), (31, 0.086), (32, 0.013), (33, -0.078), (34, 0.144), (35, 0.106), (36, -0.022), (37, 0.018), (38, -0.078), (39, 0.038), (40, -0.066), (41, 0.015), (42, 0.05), (43, 0.014), (44, 0.061), (45, 0.005), (46, 0.156), (47, -0.028), (48, 0.127), (49, 0.001)]
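The 50 signed weights above are this post's coordinates in a latent semantic (LSI) space, obtained by truncated SVD of the term-document matrix. A toy sketch of that projection (the matrix values are illustrative, assuming numpy is available):

```python
import numpy as np

# Toy term-document count matrix: rows are terms, columns are posts.
X = np.array([
    [2.0, 0.0, 1.0],
    [1.0, 1.0, 0.0],
    [0.0, 2.0, 1.0],
    [1.0, 0.0, 2.0],
])

# Truncated SVD keeps only the k strongest latent "topics".
k = 2
U, S, Vt = np.linalg.svd(X, full_matrices=False)

# Each column of Vt, scaled by the singular values, gives one post's
# coordinates in the k-dimensional topic space.
doc_topics = (np.diag(S[:k]) @ Vt[:k]).T  # shape: (num_posts, k)
```

Similarity between two posts is then a cosine in this dense low-dimensional space rather than over raw term counts, which is why the lsi list below differs from the tfidf list above.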

similar blogs list:

simIndex simValue blogId blogTitle

same-blog 1 0.98151243 107 hunch net-2005-09-05-Site Update


2 0.82921571 354 hunch net-2009-05-17-Server Update

Introduction: The hunch.net server has been updated. I’ve taken the opportunity to upgrade the version of wordpress which caused cascading changes. Old threaded comments are now flattened. The system we used to use ( Brian’s threaded comments ) appears incompatible with the new threading system built into wordpress. I haven’t yet figured out a workaround. I setup a feedburner account . I added an RSS aggregator for both Machine Learning and other research blogs that I like to follow. This is something that I’ve wanted to do for awhile. Many other minor changes in font and format, with some help from Alina . If you have any suggestions for site tweaks, please speak up.

3 0.59437031 122 hunch net-2005-10-13-Site tweak

Introduction: Several people have had difficulty with comments which seem to have an allowed language significantly poorer than posts. The set of allowed html tags has been increased and the markdown filter has been put in place to try to make commenting easier. I’ll put some examples into the comments of this post.

4 0.55093551 25 hunch net-2005-02-20-At One Month

Introduction: This is near the one month point, so it seems appropriate to consider meta-issues for the moment. The number of posts is a bit over 20. The number of people speaking up in discussions is about 10. The number of people viewing the site is somewhat more than 100. I am (naturally) dissatisfied with many things. Many of the potential uses haven’t been realized. This is partly a matter of opportunity (no conferences in the last month), partly a matter of will (no open problems because it’s hard to give them up), and partly a matter of tradition. In academia, there is a strong tradition of trying to get everything perfectly right before presentation. This is somewhat contradictory to the nature of making many posts, and it’s definitely contradictory to the idea of doing “public research”. If that sort of idea is to pay off, it must be significantly more successful than previous methods. In an effort to continue experimenting, I’m going to use the next week as “open problems we

5 0.49834946 182 hunch net-2006-06-05-Server Shift, Site Tweaks, Suggestions?

Introduction: Hunch.net has shifted to a new server, and wordpress has been updated to the latest version. If anyone notices difficulties associated with this, please comment. (Note that DNS updates can take awhile so the shift may not yet be complete.) More generally, this is a good time to ask for suggestions. What would make this blog more useful?

6 0.48302302 294 hunch net-2008-04-12-Blog compromised

7 0.47276291 297 hunch net-2008-04-22-Taking the next step

8 0.45635322 246 hunch net-2007-06-13-Not Posting

9 0.43164361 363 hunch net-2009-07-09-The Machine Learning Forum

10 0.40480739 108 hunch net-2005-09-06-A link

11 0.38418606 191 hunch net-2006-07-08-MaxEnt contradicts Bayes Rule?

12 0.36776072 1 hunch net-2005-01-19-Why I decided to run a weblog.

13 0.36069593 220 hunch net-2006-11-27-Continuizing Solutions

14 0.35410401 401 hunch net-2010-06-20-2010 ICML discussion site

15 0.34778109 240 hunch net-2007-04-21-Videolectures.net

16 0.34704518 151 hunch net-2006-01-25-1 year

17 0.33974391 225 hunch net-2007-01-02-Retrospective

18 0.33496422 340 hunch net-2009-01-28-Nielsen’s talk

19 0.32299626 134 hunch net-2005-12-01-The Webscience Future

20 0.32229668 172 hunch net-2006-04-14-JMLR is a success


similar blogs computed by the lda model

lda for this blog:

topicId topicWeight

[(10, 0.039), (27, 0.1), (53, 0.684), (94, 0.04)]
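The LDA weights above are this post's sparse topic distribution (topic 53 dominates at 0.684). Similarity between posts can then be measured between their topic distributions; one common choice is Hellinger distance. A sketch using the weights above (the second post's distribution is invented, and whether this dataset used Hellinger rather than cosine is an assumption):

```python
import math

def hellinger(p, q):
    """Hellinger distance between two sparse topicId -> weight dists.

    0 means identical distributions; 1 means disjoint support.
    """
    topics = set(p) | set(q)
    s = sum((math.sqrt(p.get(t, 0.0)) - math.sqrt(q.get(t, 0.0))) ** 2
            for t in topics)
    return math.sqrt(s / 2.0)

# Topic weights for this post, from the list above.
site_update = {10: 0.039, 27: 0.1, 53: 0.684, 94: 0.04}
# Hypothetical distribution for another post.
other_post = {27: 0.2, 53: 0.7, 80: 0.1}

dist = hellinger(site_update, other_post)
```

A small distance (large similarity) here is driven almost entirely by the shared dominant topic 53.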

similar blogs list:

simIndex simValue blogId blogTitle

same-blog 1 0.98906648 107 hunch net-2005-09-05-Site Update


2 0.97891122 56 hunch net-2005-04-14-Families of Learning Theory Statements

Introduction: The diagram above shows a very broad viewpoint of learning theory. arrow Typical statement Examples Past->Past Some prediction algorithm A does almost as well as any of a set of algorithms. Weighted Majority Past->Future Assuming independent samples, past performance predicts future performance. PAC analysis, ERM analysis Future->Future Future prediction performance on subproblems implies future prediction performance using algorithm A . ECOC, Probing A basic question is: Are there other varieties of statements of this type? Avrim noted that there are also “arrows between arrows”: generic methods for transforming between Past->Past statements and Past->Future statements. Are there others?

3 0.97338259 16 hunch net-2005-02-09-Intuitions from applied learning

Introduction: Since learning is far from an exact science, it’s good to pay attention to basic intuitions of applied learning. Here are a few I’ve collected. Integration In Bayesian learning, the posterior is computed by an integral, and the optimal thing to do is to predict according to this integral. This phenomena seems to be far more general. Bagging, Boosting, SVMs, and Neural Networks all take advantage of this idea to some extent. The phenomena is more general: you can average over many different classification predictors to improve performance. Sources: Zoubin , Caruana Differentiation Different pieces of an average should differentiate to achieve good performance by different methods. This is known as the ‘symmetry breaking’ problem for neural networks, and it’s why weights are initialized randomly. Boosting explicitly attempts to achieve good differentiation by creating new, different, learning problems. Sources: Yann LeCun , Phil Long Deep Representation Ha

4 0.942788 91 hunch net-2005-07-10-Thinking the Unthought

Introduction: One thing common to much research is that the researcher must be the first person ever to have some thought. How do you think of something that has never been thought of? There seems to be no methodical manner of doing this, but there are some tricks. The easiest method is to just have some connection come to you. There is a trick here however: you should write it down and fill out the idea immediately because it can just as easily go away. A harder method is to set aside a block of time and simply think about an idea. Distraction elimination is essential here because thinking about the unthought is hard work which your mind will avoid. Another common method is in conversation. Sometimes the process of verbalizing implies new ideas come up and sometimes whoever you are talking to replies just the right way. This method is dangerous though—you must speak to someone who helps you think rather than someone who occupies your thoughts. Try to rephrase the problem so the a

5 0.93787247 2 hunch net-2005-01-24-Holy grails of machine learning?

Introduction: Let me kick things off by posing this question to ML researchers: What do you think are some important holy grails of machine learning? For example: – “A classifier with SVM-level performance but much more scalable” – “Practical confidence bounds (or learning bounds) for classification” – “A reinforcement learning algorithm that can handle the ___ problem” – “Understanding theoretically why ___ works so well in practice” etc. I pose this question because I believe that when goals are stated explicitly and well (thus providing clarity as well as opening up the problems to more people), rather than left implicit, they are likely to be achieved much more quickly. I would also like to know more about the internal goals of the various machine learning sub-areas (theory, kernel methods, graphical models, reinforcement learning, etc) as stated by people in these respective areas. This could help people cross sub-areas.

6 0.91754383 367 hunch net-2009-08-16-Centmail comments

7 0.91628534 145 hunch net-2005-12-29-Deadline Season

8 0.86671263 6 hunch net-2005-01-27-Learning Complete Problems

9 0.70824748 21 hunch net-2005-02-17-Learning Research Programs

10 0.59030986 201 hunch net-2006-08-07-The Call of the Deep

11 0.58660787 191 hunch net-2006-07-08-MaxEnt contradicts Bayes Rule?

12 0.58100861 151 hunch net-2006-01-25-1 year

13 0.57781899 60 hunch net-2005-04-23-Advantages and Disadvantages of Bayesian Learning

14 0.57112229 141 hunch net-2005-12-17-Workshops as Franchise Conferences

15 0.56412911 265 hunch net-2007-10-14-NIPS workshp: Learning Problem Design

16 0.54463857 283 hunch net-2008-01-07-2008 Summer Machine Learning Conference Schedule

17 0.5349105 321 hunch net-2008-10-19-NIPS 2008 workshop on Kernel Learning

18 0.53220695 152 hunch net-2006-01-30-Should the Input Representation be a Vector?

19 0.51463652 407 hunch net-2010-08-23-Boosted Decision Trees for Deep Learning

20 0.51066017 249 hunch net-2007-06-21-Presentation Preparation