hunch_net-2009-354 knowledge-graph by maker-knowledge-mining

354 hunch net-2009-05-17-Server Update


meta info for this blog

Source: html

Introduction: The hunch.net server has been updated. I’ve taken the opportunity to upgrade the version of WordPress, which caused cascading changes. Old threaded comments are now flattened. The system we used to use (Brian’s threaded comments) appears incompatible with the new threading system built into WordPress. I haven’t yet figured out a workaround. I set up a Feedburner account. I added an RSS aggregator for both Machine Learning and other research blogs that I like to follow. This is something that I’ve wanted to do for a while. Many other minor changes in font and format, with some help from Alina. If you have any suggestions for site tweaks, please speak up.


Summary: the most important sentences generated by the tf-idf model

sentIndex sentText sentNum sentScore

1 I’ve taken the opportunity to upgrade the version of WordPress, which caused cascading changes. [sent-3, score-0.74]

2 The system we used to use (Brian’s threaded comments) appears incompatible with the new threading system built into WordPress. [sent-5, score-1.724]

3 I added an RSS aggregator for both Machine Learning and other research blogs that I like to follow. [sent-8, score-0.377]

4 This is something that I’ve wanted to do for a while. [sent-9, score-0.194]

5 Many other minor changes in font and format, with some help from Alina. [sent-10, score-0.551]

6 If you have any suggestions for site tweaks, please speak up. [sent-11, score-0.505]


similar blogs computed by the tf-idf model

tf-idf for this blog:

wordName wordTfidf (topN-words)

[('threaded', 0.404), ('comments', 0.227), ('threading', 0.218), ('rss', 0.218), ('upgrade', 0.218), ('figured', 0.202), ('font', 0.202), ('incompatible', 0.202), ('tweaks', 0.191), ('brian', 0.191), ('wordpress', 0.182), ('server', 0.175), ('blogs', 0.159), ('built', 0.147), ('minor', 0.144), ('system', 0.141), ('speak', 0.136), ('alina', 0.136), ('ve', 0.131), ('format', 0.131), ('opportunity', 0.129), ('site', 0.129), ('suggestions', 0.129), ('wanted', 0.127), ('added', 0.127), ('setup', 0.121), ('taken', 0.117), ('account', 0.117), ('changes', 0.117), ('old', 0.115), ('please', 0.111), ('haven', 0.11), ('appears', 0.1), ('version', 0.094), ('help', 0.088), ('yet', 0.07), ('something', 0.067), ('used', 0.059), ('research', 0.048), ('use', 0.046), ('like', 0.043), ('new', 0.039), ('machine', 0.032), ('many', 0.026), ('learning', 0.013)]
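The sentence scores in the summary above and the word weights here presumably come from a standard tf-idf weighting of the post's vocabulary against the rest of the blog. Below is a minimal sketch of that idea in plain Python; the toy corpus, the tokenizer, and the sum-of-word-weights sentence score are assumptions for illustration, not the actual maker-knowledge-mining pipeline.

```python
# Minimal tf-idf sketch (illustrative only; not the actual pipeline used for this page).
# A word's weight in a post is tf * idf; a sentence is then scored by summing the
# weights of its words, similar in spirit to the sentScore column above.
import math
from collections import Counter

corpus = {  # hypothetical toy corpus: post id -> text
    "354": "server update wordpress threaded comments rss aggregator",
    "107": "site update wordpress theme plugin threaded comments",
    "182": "server shift site tweaks wordpress suggestions",
}

def tokenize(text):
    return text.lower().split()

doc_tokens = {doc_id: tokenize(text) for doc_id, text in corpus.items()}
n_docs = len(doc_tokens)

# Document frequency: in how many posts does each word appear?
df = Counter()
for tokens in doc_tokens.values():
    df.update(set(tokens))

def tfidf(doc_id):
    """tf-idf weight for every word of one post."""
    tf = Counter(doc_tokens[doc_id])
    total = sum(tf.values())
    return {w: (c / total) * math.log(n_docs / df[w]) for w, c in tf.items()}

weights = tfidf("354")

# Rank words, analogous to the (wordName, wordTfidf) list above.
print(sorted(weights.items(), key=lambda kv: kv[1], reverse=True)[:5])

# Score a sentence by summing the tf-idf weights of its words,
# analogous to the sentScore column in the summary table.
sentence = "old threaded comments are now flattened"
print(sum(weights.get(w, 0.0) for w in tokenize(sentence)))
```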

similar blogs list:

simIndex simValue blogId blogTitle

same-blog 1 1.0 354 hunch net-2009-05-17-Server Update

Introduction: The hunch.net server has been updated. I’ve taken the opportunity to upgrade the version of WordPress, which caused cascading changes. Old threaded comments are now flattened. The system we used to use (Brian’s threaded comments) appears incompatible with the new threading system built into WordPress. I haven’t yet figured out a workaround. I set up a Feedburner account. I added an RSS aggregator for both Machine Learning and other research blogs that I like to follow. This is something that I’ve wanted to do for a while. Many other minor changes in font and format, with some help from Alina. If you have any suggestions for site tweaks, please speak up.

2 0.25838485 107 hunch net-2005-09-05-Site Update

Introduction: I tweaked the site in a number of ways today, including: Updating to WordPress 1.5. Installing and heavily tweaking the Geekniche theme. Update: I switched back to a tweaked version of the old theme. Adding the Customizable Post Listings plugin. Installing the StatTraq plugin. Updating some of the links. I particularly recommend looking at the computer research policy blog. Adding threaded comments. This doesn’t thread old comments obviously, but the extra structure may be helpful for new ones. Overall, I think this is an improvement, and it addresses a few of my earlier problems. If you have any difficulties or anything seems “not quite right”, please speak up. A few other tweaks to the site may happen in the near future.

3 0.14133656 182 hunch net-2006-06-05-Server Shift, Site Tweaks, Suggestions?

Introduction: Hunch.net has shifted to a new server, and WordPress has been updated to the latest version. If anyone notices difficulties associated with this, please comment. (Note that DNS updates can take a while, so the shift may not yet be complete.) More generally, this is a good time to ask for suggestions. What would make this blog more useful?

4 0.12660706 25 hunch net-2005-02-20-At One Month

Introduction: This is near the one month point, so it seems appropriate to consider meta-issues for the moment. The number of posts is a bit over 20. The number of people speaking up in discussions is about 10. The number of people viewing the site is somewhat more than 100. I am (naturally) dissatisfied with many things. Many of the potential uses haven’t been realized. This is partly a matter of opportunity (no conferences in the last month), partly a matter of will (no open problems because it’s hard to give them up), and partly a matter of tradition. In academia, there is a strong tradition of trying to get everything perfectly right before presentation. This is somewhat contradictory to the nature of making many posts, and it’s definitely contradictory to the idea of doing “public research”. If that sort of idea is to pay off, it must be significantly more successful than previous methods. In an effort to continue experimenting, I’m going to use the next week as “open problems we

5 0.10272529 401 hunch net-2010-06-20-2010 ICML discussion site

Introduction: A substantial difficulty with the 2009 and 2008 ICML discussion system was a communication vacuum, where authors were not informed of comments, and commenters were not informed of responses to their comments without explicit monitoring. Mark Reid has set up a new discussion system for 2010 with the goal of addressing this. Mark didn’t want to make it too intrusive, so you must opt in. As an author, find your paper and “Subscribe by email” to the comments. As a commenter, you have the option of providing an email for follow-up notification.

6 0.095371373 297 hunch net-2008-04-22-Taking the next step

7 0.08579693 240 hunch net-2007-04-21-Videolectures.net

8 0.081540063 122 hunch net-2005-10-13-Site tweak

9 0.072171785 246 hunch net-2007-06-13-Not Posting

10 0.06737268 356 hunch net-2009-05-24-2009 ICML discussion site

11 0.06717515 225 hunch net-2007-01-02-Retrospective

12 0.066376559 365 hunch net-2009-07-31-Vowpal Wabbit Open Source Project

13 0.066046208 208 hunch net-2006-09-18-What is missing for online collaborative research?

14 0.065606728 428 hunch net-2011-03-27-Vowpal Wabbit, v5.1

15 0.06478247 381 hunch net-2009-12-07-Vowpal Wabbit version 4.0, and a NIPS heresy

16 0.064090699 363 hunch net-2009-07-09-The Machine Learning Forum

17 0.064040557 447 hunch net-2011-10-10-ML Symposium and ICML details

18 0.060142178 151 hunch net-2006-01-25-1 year

19 0.055840112 15 hunch net-2005-02-08-Some Links

20 0.054349676 475 hunch net-2012-10-26-ML Symposium and Strata-Hadoop World


similar blogs computed by the LSI model

LSI for this blog:

topicId topicWeight

[(0, 0.095), (1, -0.045), (2, -0.054), (3, 0.061), (4, -0.035), (5, 0.023), (6, -0.025), (7, -0.117), (8, -0.039), (9, 0.004), (10, -0.054), (11, -0.04), (12, -0.044), (13, 0.072), (14, 0.083), (15, -0.073), (16, -0.179), (17, -0.047), (18, 0.018), (19, 0.095), (20, -0.017), (21, -0.04), (22, -0.063), (23, -0.092), (24, -0.108), (25, -0.018), (26, -0.03), (27, 0.066), (28, 0.013), (29, -0.047), (30, 0.008), (31, 0.094), (32, 0.003), (33, -0.101), (34, 0.124), (35, 0.086), (36, -0.022), (37, -0.003), (38, -0.058), (39, -0.009), (40, -0.135), (41, 0.027), (42, 0.013), (43, -0.042), (44, 0.02), (45, -0.003), (46, 0.082), (47, 0.005), (48, 0.05), (49, 0.018)]
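The simValue column below presumably comes from comparing topic-weight vectors like the one above. A minimal sketch follows, assuming cosine similarity as the measure (an assumption; the source does not state which similarity is used). The first vector reuses the first few LSI weights listed above; the second post's vector is hypothetical.

```python
# Cosine similarity between topic-weight vectors (illustrative assumption:
# the simValue column may be computed this way; the source does not say).
import math

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

# Truncated topic vector for post 354 (first five LSI weights above);
# the vector for the other post is made up for illustration.
post_354 = [0.095, -0.045, -0.054, 0.061, -0.035]
post_107 = [0.090, -0.050, -0.048, 0.055, -0.030]

print(round(cosine(post_354, post_107), 4))  # close to 1.0 for very similar posts
```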

similar blogs list:

simIndex simValue blogId blogTitle

same-blog 1 0.97388273 354 hunch net-2009-05-17-Server Update

Introduction: The hunch.net server has been updated. I’ve taken the opportunity to upgrade the version of WordPress, which caused cascading changes. Old threaded comments are now flattened. The system we used to use (Brian’s threaded comments) appears incompatible with the new threading system built into WordPress. I haven’t yet figured out a workaround. I set up a Feedburner account. I added an RSS aggregator for both Machine Learning and other research blogs that I like to follow. This is something that I’ve wanted to do for a while. Many other minor changes in font and format, with some help from Alina. If you have any suggestions for site tweaks, please speak up.

2 0.83972758 107 hunch net-2005-09-05-Site Update

Introduction: I tweaked the site in a number of ways today, including: Updating to WordPress 1.5. Installing and heavily tweaking the Geekniche theme. Update: I switched back to a tweaked version of the old theme. Adding the Customizable Post Listings plugin. Installing the StatTraq plugin. Updating some of the links. I particularly recommend looking at the computer research policy blog. Adding threaded comments. This doesn’t thread old comments obviously, but the extra structure may be helpful for new ones. Overall, I think this is an improvement, and it addresses a few of my earlier problems. If you have any difficulties or anything seems “not quite right”, please speak up. A few other tweaks to the site may happen in the near future.

3 0.60331762 25 hunch net-2005-02-20-At One Month

Introduction: This is near the one month point, so it seems appropriate to consider meta-issues for the moment. The number of posts is a bit over 20. The number of people speaking up in discussions is about 10. The number of people viewing the site is somewhat more than 100. I am (naturally) dissatisfied with many things. Many of the potential uses haven’t been realized. This is partly a matter of opportunity (no conferences in the last month), partly a matter of will (no open problems because it’s hard to give them up), and partly a matter of tradition. In academia, there is a strong tradition of trying to get everything perfectly right before presentation. This is somewhat contradictory to the nature of making many posts, and it’s definitely contradictory to the idea of doing “public research”. If that sort of idea is to pay off, it must be significantly more successful than previous methods. In an effort to continue experimenting, I’m going to use the next week as “open problems we

4 0.6017493 122 hunch net-2005-10-13-Site tweak

Introduction: Several people have had difficulty with comments which seem to have an allowed language significantly poorer than posts. The set of allowed html tags has been increased and the markdown filter has been put in place to try to make commenting easier. I’ll put some examples into the comments of this post.

5 0.54183775 297 hunch net-2008-04-22-Taking the next step

Introduction: At the last ICML, Tom Dietterich asked me to look into systems for commenting on papers. I’ve been slow getting to this, but it’s relevant now. The essential observation is that we now have many tools for online collaboration, but they are not yet much used in academic research. If we can find the right way to use them, then perhaps great things might happen, with extra kudos to the first conference that manages to really create an online community. Various conferences have been poking at this. For example, UAI has set up a wiki, COLT has started using Joomla, with some dynamic content, and AAAI has been setting up a “student blog”. Similarly, Dinoj Surendran set up a twiki for the Chicago Machine Learning Summer School, which was quite useful for coordinating events and other things. I believe the most important thing is a willingness to experiment. A good place to start seems to be enhancing existing conference websites. For example, the ICML 2007 papers pag

6 0.53315711 294 hunch net-2008-04-12-Blog compromised

7 0.51166558 246 hunch net-2007-06-13-Not Posting

8 0.49845919 182 hunch net-2006-06-05-Server Shift, Site Tweaks, Suggestions?

9 0.46122459 401 hunch net-2010-06-20-2010 ICML discussion site

10 0.44659209 363 hunch net-2009-07-09-The Machine Learning Forum

11 0.414078 278 hunch net-2007-12-17-New Machine Learning mailing list

12 0.38985822 137 hunch net-2005-12-09-Machine Learning Thoughts

13 0.38844153 240 hunch net-2007-04-21-Videolectures.net

14 0.38707057 151 hunch net-2006-01-25-1 year

15 0.37428483 225 hunch net-2007-01-02-Retrospective

16 0.37320796 223 hunch net-2006-12-06-The Spam Problem

17 0.37206224 1 hunch net-2005-01-19-Why I decided to run a weblog.

18 0.36347586 134 hunch net-2005-12-01-The Webscience Future

19 0.36267504 487 hunch net-2013-07-24-ICML 2012 videos lost

20 0.35516515 81 hunch net-2005-06-13-Wikis for Summer Schools and Workshops


similar blogs computed by the LDA model

LDA for this blog:

topicId topicWeight

[(10, 0.107), (27, 0.214), (53, 0.15), (79, 0.391)]

similar blogs list:

simIndex simValue blogId blogTitle

same-blog 1 0.87576485 354 hunch net-2009-05-17-Server Update

Introduction: The hunch.net server has been updated. I’ve taken the opportunity to upgrade the version of WordPress, which caused cascading changes. Old threaded comments are now flattened. The system we used to use (Brian’s threaded comments) appears incompatible with the new threading system built into WordPress. I haven’t yet figured out a workaround. I set up a Feedburner account. I added an RSS aggregator for both Machine Learning and other research blogs that I like to follow. This is something that I’ve wanted to do for a while. Many other minor changes in font and format, with some help from Alina. If you have any suggestions for site tweaks, please speak up.

2 0.75883681 254 hunch net-2007-07-12-ICML Trends

Introduction: Mark Reid did a post on ICML trends that I found interesting.

3 0.72276741 248 hunch net-2007-06-19-How is Compressed Sensing going to change Machine Learning ?

Introduction: Compressed Sensing (CS) is a new framework developed by Emmanuel Candes, Terry Tao and David Donoho. To summarize, if you acquire a signal in some basis that is incoherent with the basis in which you know the signal to be sparse, it is very likely you will be able to reconstruct the signal from these incoherent projections. Terry Tao, the recent Fields medalist, does a very nice job at explaining the framework here. He goes further in the theory description in this post where he mentions the central issue of the Uniform Uncertainty Principle. It so happens that random projections are on average incoherent, within the UUP meaning, with most known bases (sines, polynomials, splines, wavelets, curvelets …) and are therefore an ideal basis for Compressed Sensing. [For more in-depth information on the subject, the Rice group has done a very good job at providing a central library of papers relevant to the growing subject: http://www.dsp.ece.rice.edu/cs/ ] The Machine

4 0.70415652 27 hunch net-2005-02-23-Problem: Reinforcement Learning with Classification

Introduction: At an intuitive level, the question here is “Can reinforcement learning be solved with classification?” Problem: Construct a reinforcement learning algorithm with near-optimal expected sum of rewards in the direct experience model, given access to a classifier learning algorithm which has a small error rate or regret on all posed classification problems. The definition of “posed” here is slightly murky. I consider a problem “posed” if there is an algorithm for constructing labeled classification examples. Past Work: There exists a reduction of reinforcement learning to classification given a generative model. A generative model is an inherently stronger assumption than the direct experience model. Other work on learning reductions may be important. Several algorithms for solving reinforcement learning in the direct experience model exist. Most, such as E^3, Factored-E^3, metric-E^3, and Rmax, require that the observation be the state. Recent work

5 0.66985154 162 hunch net-2006-03-09-Use of Notation

Introduction: For most people, a mathematical notation is like a language: you learn it and stick with it. For people doing mathematical research, however, this is not enough: they must design new notations for new problems. The design of good notation is both hard and worthwhile, since a bad initial notation can retard a line of research greatly. Before we had mathematical notation, equations were all written out in language. Since words have multiple meanings and variable precedences, long equations written out in language can be extraordinarily difficult and sometimes fundamentally ambiguous. A good representative example of this is the legalese in the tax code. Since we want greater precision and clarity, we adopt mathematical notation. One fundamental thing to understand about mathematical notation is that humans, as logic verifiers, are barely capable. This is the fundamental reason why one notation can be much better than another. This observation is easier to miss than you might

6 0.57789266 204 hunch net-2006-08-28-Learning Theory standards for NIPS 2006

7 0.53785795 423 hunch net-2011-02-02-User preferences for search engines

8 0.52069533 332 hunch net-2008-12-23-Use of Learning Theory

9 0.51915848 201 hunch net-2006-08-07-The Call of the Deep

10 0.51660049 6 hunch net-2005-01-27-Learning Complete Problems

11 0.50333649 55 hunch net-2005-04-10-Is the Goal Understanding or Prediction?

12 0.49555999 483 hunch net-2013-06-10-The Large Scale Learning class notes

13 0.49423599 158 hunch net-2006-02-24-A Fundamentalist Organization of Machine Learning

14 0.49079645 478 hunch net-2013-01-07-NYU Large Scale Machine Learning Class

15 0.48996174 199 hunch net-2006-07-26-Two more UAI papers of interest

16 0.4890058 60 hunch net-2005-04-23-Advantages and Disadvantages of Bayesian Learning

17 0.4886598 227 hunch net-2007-01-10-A Deep Belief Net Learning Problem

18 0.48859555 152 hunch net-2006-01-30-Should the Input Representation be a Vector?

19 0.48646581 367 hunch net-2009-08-16-Centmail comments

20 0.48479119 347 hunch net-2009-03-26-Machine Learning is too easy