hunch_net hunch_net-2007 hunch_net-2007-265 knowledge-graph by maker-knowledge-mining

265 hunch net-2007-10-14-NIPS workshop: Learning Problem Design


meta info for this blog

Source: html

Introduction: Alina and I are organizing a workshop on Learning Problem Design at NIPS. What is learning problem design? It’s about being clever in creating learning problems from otherwise unlabeled data. Read the webpage above for examples. I want to participate! Email us before Nov. 1 with a description of what you want to talk about.


Summary: the most important sentences generated by the tfidf model

sentIndex sentText sentNum sentScore

1 Alina and I are organizing a workshop on Learning Problem Design at NIPS. [sent-1, score-0.438]

2 It’s about being clever in creating learning problems from otherwise unlabeled data. [sent-3, score-1.105]

3 1 with a description of what you want to talk about. [sent-7, score-0.662]
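The sentence scores above come from ranking each sentence by the tf-idf weight of its words. As an illustrative sketch only (not the pipeline's actual code; the function name and scoring details are assumptions), an extractive scorer of this kind might look like:

```python
import math
from collections import Counter

def tfidf_sentence_scores(sentences):
    """Score each sentence by the average tf-idf weight of its words.

    Illustrative sketch: each sentence is treated as one "document"
    for the idf computation, so words shared by every sentence score 0.
    """
    docs = [s.lower().split() for s in sentences]
    n = len(docs)
    df = Counter()                      # document frequency per word
    for doc in docs:
        df.update(set(doc))
    scores = []
    for doc in docs:
        tf = Counter(doc)
        total = sum((count / len(doc)) * math.log(n / df[w])
                    for w, count in tf.items())
        scores.append(total / len(tf) if tf else 0.0)
    return scores
```

A sentence made entirely of words that appear in every other sentence gets score 0, while sentences with distinctive vocabulary score higher, which is the effect visible in the sentScore column.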


similar blogs computed by tfidf model

tfidf for this blog:

wordName wordTfidf (topN-words)

[('design', 0.324), ('clever', 0.304), ('webpage', 0.295), ('participate', 0.287), ('organizing', 0.28), ('alina', 0.253), ('description', 0.248), ('unlabeled', 0.248), ('email', 0.24), ('want', 0.237), ('read', 0.202), ('otherwise', 0.197), ('creating', 0.195), ('talk', 0.177), ('nips', 0.165), ('workshop', 0.158), ('problem', 0.147), ('us', 0.132), ('problems', 0.089), ('learning', 0.072)]
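The simValue numbers in the lists below are cosine similarities between sparse tf-idf vectors like the (wordName, wordTfidf) pairs above; a post compared with itself scores 1.0, as the same-blog rows show. A minimal sketch of that comparison, assuming dict-of-floats vectors (the function name is hypothetical):

```python
import math

def cosine_similarity(u, v):
    """Cosine similarity between two sparse tf-idf vectors,
    each given as a {word: weight} dict."""
    dot = sum(w * v.get(t, 0.0) for t, w in u.items())
    nu = math.sqrt(sum(w * w for w in u.values()))
    nv = math.sqrt(sum(w * w for w in v.values()))
    if nu == 0.0 or nv == 0.0:
        return 0.0
    return dot / (nu * nv)
```

Vectors with no words in common score 0, and any vector compared with itself scores 1, matching the 1.0 simValue on the same-blog row.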

similar blogs list:

simIndex simValue blogId blogTitle

same-blog 1 1.0 265 hunch net-2007-10-14-NIPS workshop: Learning Problem Design

Introduction: Alina and I are organizing a workshop on Learning Problem Design at NIPS. What is learning problem design? It’s about being clever in creating learning problems from otherwise unlabeled data. Read the webpage above for examples. I want to participate! Email us before Nov. 1 with a description of what you want to talk about.

2 0.14460781 234 hunch net-2007-02-22-Create Your Own ICML Workshop

Introduction: As usual ICML 2007 will be hosting a workshop program to be held this year on June 24th. The success of the program depends on having researchers like you propose interesting workshop topics and then organize the workshops. I’d like to encourage all of you to consider sending a workshop proposal. The proposal deadline has been extended to March 5. See the workshop web-site for details. Organizing a workshop is a unique way to gather an international group of researchers together to focus for an entire day on a topic of your choosing. I’ve always found that the cost of organizing a workshop is not so large, and very low compared to the benefits. The topic and format of a workshop are limited only by your imagination (and the attractiveness to potential participants) and need not follow the usual model of a mini-conference on a particular ML sub-area. Hope to see some interesting proposals rolling in.

3 0.13377763 488 hunch net-2013-08-31-Extreme Classification workshop at NIPS

Introduction: Manik and I are organizing the extreme classification workshop at NIPS this year. We have a number of good speakers lined up, but I would further encourage anyone working in the area to submit an abstract by October 9. I believe this is an idea whose time has now come. The NIPS website doesn’t have other workshops listed yet, but I expect several others to be of significant interest.

4 0.12754296 404 hunch net-2010-08-20-The Workshop on Cores, Clusters, and Clouds

Introduction: Alekh, John, Ofer, and I are organizing a workshop at NIPS this year on learning in parallel and distributed environments. The general interest level in parallel learning seems to be growing rapidly, so I expect quite a bit of attendance. Please join us if you are parallel-interested. And, if you are working in the area of parallel learning, please consider submitting an abstract due Oct. 17 for presentation at the workshop.

5 0.12502326 277 hunch net-2007-12-12-Workshop Summary—Principles of Learning Problem Design

Introduction: This is a summary of the workshop on Learning Problem Design which Alina and I ran at NIPS this year. The first question many people have is “What is learning problem design?” This workshop is about admitting that solving learning problems does not start with labeled data, but rather somewhere before. When humans are hired to produce labels, this is usually not a serious problem because you can tell them precisely what semantics you want the labels to have, and we can fix some set of features in advance. However, when other methods are used this becomes more problematic. This focus is important for Machine Learning because there are very large quantities of data which are not labeled by a hired human. The title of the workshop was a bit ambitious, because a workshop is not long enough to synthesize a diversity of approaches into a coherent set of principles. For me, the posters at the end of the workshop were quite helpful in getting approaches to gel. Here are some an

6 0.10651372 375 hunch net-2009-10-26-NIPS workshops

7 0.10388055 195 hunch net-2006-07-12-Who is having visa problems reaching US conferences?

8 0.10102824 143 hunch net-2005-12-27-Automated Labeling

9 0.097831063 444 hunch net-2011-09-07-KDD and MUCMD 2011

10 0.093022093 321 hunch net-2008-10-19-NIPS 2008 workshop on Kernel Learning

11 0.09007898 401 hunch net-2010-06-20-2010 ICML discussion site

12 0.088343531 345 hunch net-2009-03-08-Prediction Science

13 0.087024882 141 hunch net-2005-12-17-Workshops as Franchise Conferences

14 0.085947141 411 hunch net-2010-09-21-Regretting the dead

15 0.083807558 127 hunch net-2005-11-02-Progress in Active Learning

16 0.080652043 167 hunch net-2006-03-27-Gradients everywhere

17 0.077115215 370 hunch net-2009-09-18-Necessary and Sufficient Research

18 0.073092744 161 hunch net-2006-03-05-“Structural” Learning

19 0.072221413 114 hunch net-2005-09-20-Workshop Proposal: Atomic Learning

20 0.071288921 458 hunch net-2012-03-06-COLT-ICML Open Questions and ICML Instructions


similar blogs computed by lsi model

lsi for this blog:

topicId topicWeight

[(0, 0.124), (1, -0.02), (2, -0.074), (3, -0.046), (4, 0.056), (5, 0.104), (6, 0.078), (7, 0.015), (8, 0.014), (9, -0.027), (10, 0.004), (11, -0.059), (12, -0.03), (13, 0.129), (14, -0.058), (15, -0.059), (16, -0.103), (17, 0.108), (18, -0.136), (19, 0.042), (20, -0.173), (21, -0.031), (22, -0.022), (23, -0.026), (24, 0.036), (25, 0.059), (26, 0.11), (27, 0.018), (28, 0.12), (29, -0.015), (30, -0.077), (31, 0.091), (32, -0.007), (33, -0.084), (34, -0.052), (35, 0.009), (36, -0.028), (37, 0.042), (38, -0.033), (39, -0.037), (40, 0.057), (41, 0.086), (42, 0.011), (43, 0.018), (44, 0.075), (45, 0.008), (46, -0.115), (47, -0.022), (48, 0.094), (49, 0.068)]
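The (topicId, topicWeight) pairs above are a document's coordinates in the reduced space of an LSI model: each tf-idf vector is projected onto topic directions obtained from a truncated SVD of the corpus term-document matrix. As a hedged sketch of just the projection step (the function name is hypothetical, and the topic vectors are assumed to be precomputed):

```python
def project_to_topics(doc_vec, topics):
    """Project a tf-idf document vector ({word: weight}) onto a list
    of topic vectors ({word: weight} each), yielding per-topic weights
    like the (topicId, topicWeight) pairs above.

    In a real LSI model the topic vectors come from a truncated SVD of
    the corpus term-document matrix; here they are simply passed in.
    """
    return [sum(w * topic.get(word, 0.0) for word, w in doc_vec.items())
            for topic in topics]
```

Similarity between two posts is then computed between their topic-weight vectors rather than the full tf-idf vectors, which is why the lsi simValues differ from the tfidf ones.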

similar blogs list:

simIndex simValue blogId blogTitle

same-blog 1 0.95487511 265 hunch net-2007-10-14-NIPS workshop: Learning Problem Design

Introduction: Alina and I are organizing a workshop on Learning Problem Design at NIPS. What is learning problem design? It’s about being clever in creating learning problems from otherwise unlabeled data. Read the webpage above for examples. I want to participate! Email us before Nov. 1 with a description of what you want to talk about.

2 0.60834152 114 hunch net-2005-09-20-Workshop Proposal: Atomic Learning

Introduction: This is a proposal for a workshop. It may or may not happen depending on the level of interest. If you are interested, feel free to indicate so (by email or comments). Description: Assume(*) that any system for solving large difficult learning problems must decompose into repeated use of basic elements (i.e. atoms). There are many basic questions which remain: What are the viable basic elements? What makes a basic element viable? What are the viable principles for the composition of these basic elements? What are the viable principles for learning in such systems? What problems can this approach handle? Hal Daume adds: Can composition of atoms be (semi-) automatically constructed[?] When atoms are constructed through reductions, is there some notion of the “naturalness” of the created learning problems? Other than Markov fields/graphical models/Bayes nets, is there a good language for representing atoms and their compositions? The answer to these a

3 0.5915401 234 hunch net-2007-02-22-Create Your Own ICML Workshop

Introduction: As usual ICML 2007 will be hosting a workshop program to be held this year on June 24th. The success of the program depends on having researchers like you propose interesting workshop topics and then organize the workshops. I’d like to encourage all of you to consider sending a workshop proposal. The proposal deadline has been extended to March 5. See the workshop web-site for details. Organizing a workshop is a unique way to gather an international group of researchers together to focus for an entire day on a topic of your choosing. I’ve always found that the cost of organizing a workshop is not so large, and very low compared to the benefits. The topic and format of a workshop are limited only by your imagination (and the attractiveness to potential participants) and need not follow the usual model of a mini-conference on a particular ML sub-area. Hope to see some interesting proposals rolling in.

4 0.59042877 277 hunch net-2007-12-12-Workshop Summary—Principles of Learning Problem Design

Introduction: This is a summary of the workshop on Learning Problem Design which Alina and I ran at NIPS this year. The first question many people have is “What is learning problem design?” This workshop is about admitting that solving learning problems does not start with labeled data, but rather somewhere before. When humans are hired to produce labels, this is usually not a serious problem because you can tell them precisely what semantics you want the labels to have, and we can fix some set of features in advance. However, when other methods are used this becomes more problematic. This focus is important for Machine Learning because there are very large quantities of data which are not labeled by a hired human. The title of the workshop was a bit ambitious, because a workshop is not long enough to synthesize a diversity of approaches into a coherent set of principles. For me, the posters at the end of the workshop were quite helpful in getting approaches to gel. Here are some an

5 0.58219093 319 hunch net-2008-10-01-NIPS 2008 workshop on ‘Learning over Empirical Hypothesis Spaces’

Introduction: This workshop asks for insights into how far we can push the theoretical boundary of using data in the design of learning machines. Can we express our classification rule in terms of the sample, or do we have to stick to a core assumption of classical statistical learning theory, namely that the hypothesis space is to be defined independent from the sample? This workshop is particularly interested in – but not restricted to – the ‘luckiness framework’ and the recently introduced notion of ‘compatibility functions’ in a semi-supervised learning context (more information can be found at http://www.kuleuven.be/wehys ).

6 0.54603118 198 hunch net-2006-07-25-Upcoming conference

7 0.52694863 321 hunch net-2008-10-19-NIPS 2008 workshop on Kernel Learning

8 0.4799754 411 hunch net-2010-09-21-Regretting the dead

9 0.44880515 340 hunch net-2009-01-28-Nielsen’s talk

10 0.44492757 144 hunch net-2005-12-28-Yet more nips thoughts

11 0.43185309 80 hunch net-2005-06-10-Workshops are not Conferences

12 0.42128235 161 hunch net-2006-03-05-“Structural” Learning

13 0.42061916 404 hunch net-2010-08-20-The Workshop on Cores, Clusters, and Clouds

14 0.41743198 431 hunch net-2011-04-18-A paper not at Snowbird

15 0.41621608 444 hunch net-2011-09-07-KDD and MUCMD 2011

16 0.41489008 124 hunch net-2005-10-19-Workshop: Atomic Learning

17 0.40669307 195 hunch net-2006-07-12-Who is having visa problems reaching US conferences?

18 0.4028897 381 hunch net-2009-12-07-Vowpal Wabbit version 4.0, and a NIPS heresy

19 0.39686728 488 hunch net-2013-08-31-Extreme Classification workshop at NIPS

20 0.38509339 113 hunch net-2005-09-19-NIPS Workshops


similar blogs computed by lda model

lda for this blog:

topicId topicWeight

[(11, 0.522), (53, 0.178), (55, 0.116)]
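Here the (topicId, topicWeight) pairs come from an LDA model, which infers a per-document distribution over latent topics. A toy collapsed-Gibbs sketch of that inference follows; it is purely illustrative (function name, hyperparameters, and iteration count are all assumptions, and a real pipeline would use a proper LDA library):

```python
import random
from collections import defaultdict

def lda_topic_weights(docs, n_topics, iters=50, alpha=0.1, beta=0.01, seed=0):
    """Toy collapsed-Gibbs LDA: returns per-document topic weights
    like the (topicId, topicWeight) pairs above.

    `docs` is a list of token lists. Illustrative sketch only.
    """
    rng = random.Random(seed)
    vocab_size = len({w for d in docs for w in d})
    # z[d][i]: current topic of word i in doc d, plus count tables
    z = [[rng.randrange(n_topics) for _ in d] for d in docs]
    ndk = [[0] * n_topics for _ in docs]               # doc-topic counts
    nkw = [defaultdict(int) for _ in range(n_topics)]  # topic-word counts
    nk = [0] * n_topics                                # topic totals
    for d, doc in enumerate(docs):
        for i, w in enumerate(doc):
            k = z[d][i]
            ndk[d][k] += 1; nkw[k][w] += 1; nk[k] += 1
    for _ in range(iters):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                k = z[d][i]
                ndk[d][k] -= 1; nkw[k][w] -= 1; nk[k] -= 1
                # conditional topic probabilities for this word
                probs = [(ndk[d][t] + alpha) * (nkw[t][w] + beta)
                         / (nk[t] + vocab_size * beta)
                         for t in range(n_topics)]
                r, acc = rng.uniform(0, sum(probs)), 0.0
                for t, p in enumerate(probs):
                    acc += p
                    if r <= acc:
                        k = t
                        break
                z[d][i] = k
                ndk[d][k] += 1; nkw[k][w] += 1; nk[k] += 1
    return [[(c + alpha) / (len(doc) + n_topics * alpha) for c in ndk[d]]
            for d, doc in enumerate(docs)]
```

Each returned row is a probability distribution over topics (it sums to 1), and similarity between posts is then computed between these topic distributions.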

similar blogs list:

simIndex simValue blogId blogTitle

1 0.8604719 402 hunch net-2010-07-02-MetaOptimize

Introduction: Joseph Turian creates MetaOptimize for discussion of NLP and ML on big datasets. This includes a blog, but perhaps more importantly a question and answer section. I’m hopeful it will take off.

2 0.85926002 487 hunch net-2013-07-24-ICML 2012 videos lost

Introduction: A big ouch—all the videos for ICML 2012 were lost in a shuffle. Rajnish sends the below, but if anyone can help that would be greatly appreciated. —————————————————————————— Sincere apologies to ICML community for losing 2012 archived videos What happened: In order to publish 2013 videos, we decided to move 2012 videos to another server. We have a weekly backup service from the provider but after removing the videos from the current server, when we tried to retrieve the 2012 videos from backup service, the backup did not work because of provider-specific requirements that we had ignored while removing the data from previous server. What are we doing about this: At this point, we are still looking into raw footage to find if we can retrieve some of the videos, but following are the steps we are taking to make sure this does not happen again in future: (1) We are going to create a channel on Vimeo (and potentially on YouTube) and we will publish there the p-in-p- or slide-vers

same-blog 3 0.78562307 265 hunch net-2007-10-14-NIPS workshop: Learning Problem Design

Introduction: Alina and I are organizing a workshop on Learning Problem Design at NIPS. What is learning problem design? It’s about being clever in creating learning problems from otherwise unlabeled data. Read the webpage above for examples. I want to participate! Email us before Nov. 1 with a description of what you want to talk about.

4 0.62767041 287 hunch net-2008-01-28-Sufficient Computation

Introduction: Do we have computer hardware sufficient for AI? This question is difficult to answer, but here’s a try: One way to achieve AI is by simulating a human brain. A human brain has about 10^15 synapses which operate at about 10^2 per second, implying about 10^17 bit ops per second. A modern computer runs at 10^9 cycles/second and operates on 10^2 bits per cycle, implying 10^11 bits processed per second. The gap here is only 6 orders of magnitude, which can be plausibly surpassed via cluster machines. For example, the BlueGene/L operates 10^5 nodes (one order of magnitude short). Its peak recorded performance is about 0.5*10^15 FLOPS, which translates to about 10^16 bit ops per second, which is nearly 10^17. There are many criticisms (both positive and negative) for this argument. Simulation of a human brain might require substantially more detail. Perhaps an additional 10^2 is required per neuron. We may not need to simulate a human brain to achieve AI. Ther

5 0.38734704 264 hunch net-2007-09-30-NIPS workshops are out.

Introduction: Here. I’m particularly interested in the Web Search, Efficient ML, and (of course) Learning Problem Design workshops but there are many others to check out as well. Workshops are a great chance to make progress on or learn about a topic. Relevance and interaction amongst diverse people can sometimes be magical.

6 0.36843255 145 hunch net-2005-12-29-Deadline Season

7 0.31982142 107 hunch net-2005-09-05-Site Update

8 0.30859473 16 hunch net-2005-02-09-Intuitions from applied learning

9 0.29961517 91 hunch net-2005-07-10-Thinking the Unthought

10 0.29856449 56 hunch net-2005-04-14-Families of Learning Theory Statements

11 0.29815277 2 hunch net-2005-01-24-Holy grails of machine learning?

12 0.29294723 367 hunch net-2009-08-16-Centmail comments

13 0.28237793 292 hunch net-2008-03-15-COLT Open Problems

14 0.2756241 331 hunch net-2008-12-12-Summer Conferences

15 0.26460326 21 hunch net-2005-02-17-Learning Research Programs

16 0.2573117 6 hunch net-2005-01-27-Learning Complete Problems

17 0.25456044 356 hunch net-2009-05-24-2009 ICML discussion site

18 0.25269648 422 hunch net-2011-01-16-2011 Summer Conference Deadline Season

19 0.25238106 305 hunch net-2008-06-30-ICML has a comment system

20 0.25204951 387 hunch net-2010-01-19-Deadline Season, 2010