nips nips2005 nips2005-200 knowledge-graph by maker-knowledge-mining

200 nips-2005-Variable KD-Tree Algorithms for Spatial Pattern Search and Discovery


Source: pdf

Author: Jeremy Kubica, Joseph Masiero, Robert Jedicke, Andrew Connolly, Andrew W. Moore

Abstract: In this paper we consider the problem of finding sets of points that conform to a given underlying model from within a dense, noisy set of observations. This problem is motivated by the task of efficiently linking faint asteroid detections, but is applicable to a range of spatial queries. We survey current tree-based approaches, showing a trade-off exists between single tree and multiple tree algorithms. To this end, we present a new type of multiple tree algorithm that uses a variable number of trees to exploit the advantages of both approaches. We empirically show that this algorithm performs well using both simulated and astronomical data.

Reference: text


Summary: the most important sentences generated by the tfidf model

sentIndex sentText sentNum sentScore

1 In this paper we consider the problem of finding sets of points that conform to a given underlying model from within a dense, noisy set of observations. [sent-11, score-0.36]

2 This problem is motivated by the task of efficiently linking faint asteroid detections, but is applicable to a range of spatial queries. [sent-12, score-0.649]

3 We survey current tree-based approaches, showing a trade-off exists between single tree and multiple tree algorithms. [sent-13, score-1.197]

4 To this end, we present a new type of multiple tree algorithm that uses a variable number of trees to exploit the advantages of both approaches. [sent-14, score-0.777]

5 As we look for really dim objects, the number of noise points increases greatly and swamps the number of detections of real objects. [sent-22, score-0.377]

6 For example, we may want to detect a specific configuration of corner points in an image or search for multi-way structure in scientific data. [sent-26, score-0.461]

7 Returning to the asteroid linking example, this corresponds to finding a handful of points that lie along a line within a data set of millions of detections. [sent-28, score-0.704]

8 We show that both single tree and conventional multiple tree algorithms can be inefficient and that a trade-off exists between these approaches. [sent-30, score-1.191]

9 To this end, we propose a new type of multiple tree algorithm that uses a variable number of tree nodes. [sent-31, score-1.195]

10 2 Problem Definition: Our problem consists of finding sets of points that fit a given underlying spatial model. [sent-33, score-0.401]

11 In general, we are interested in finding sets with k or more points, thus providing a sufficient amount of support to confirm the discovery. [sent-35, score-0.365]

12 Finding this structure within the data may either be our end goal, such as in asteroid linkage, or may just be a preprocessor for a more sophisticated statistical test, such as renewal strings [1]. [sent-36, score-0.493]

13 Model points are the c points used to fully define the underlying model. [sent-42, score-0.456]

14 Support points are the remaining points used to confirm the model. [sent-43, score-0.456]

15 For example, if we are searching for sets of k linear points, we could use a set’s endpoints as model points and treat the middle k − 2 as support points. [sent-44, score-0.674]

16 The prototypical example used in this paper is the (linear) asteroid linkage problem: For each pair of points find the k − 2 best support points for the line that they define (such that we use at most one point at each time step). [sent-46, score-1.19]

17 1 Constructive Algorithms: Constructive algorithms “build up” valid sets of points by repeatedly finding additional points that are compatible with the current set. [sent-50, score-0.782]

18 First, we exhaustively test all sets of c points to determine if they define a valid model. [sent-52, score-0.392]

19 Then, for each valid set we test all of the remaining points for support. [sent-53, score-0.331]

20 For example in the asteroid linkage problem, we can initially search over all O(N 2 ) pairs of points and for each of the resulting lines test all O(N) points to determine if they support that line. [sent-54, score-1.423]
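The brute-force constructive search described above can be sketched in a few lines (a minimal illustration only; the 2-D tuple representation, function names, and distance-threshold support test are our assumptions, not the paper's implementation):

```python
import itertools

def point_line_distance(p, a, b):
    """Perpendicular distance from point p to the infinite line through a and b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    length = (dx * dx + dy * dy) ** 0.5
    # |cross product| divided by the base length gives the point-to-line distance.
    return abs(dx * (py - ay) - dy * (px - ax)) / length

def brute_force_link(points, k, threshold):
    """Test all O(N^2) pairs as candidate models, then scan all O(N)
    remaining points for support -- the exhaustive baseline."""
    results = []
    for a, b in itertools.combinations(points, 2):
        support = [p for p in points
                   if p not in (a, b) and point_line_distance(p, a, b) <= threshold]
        if 2 + len(support) >= k:  # 2 model points plus support must reach k
            results.append((a, b, support))
    return results
```

Each candidate pair costs a further O(N) scan to confirm, which is what motivates the tree-based pruning the survey turns to next.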

21 A similar approach within the domain of target tracking is sequential tracking (for a good introduction see [3]), where points at early time steps are used to estimate a track that is then projected to later time steps to find additional support points. [sent-55, score-0.712]

22 Again returning to our asteroid example, we can place the points in a KD-tree [4]. [sent-57, score-0.684]

23 We can then limit the number of initial pairs examined by using this tree to find points compatible with our velocity constraints. [sent-58, score-0.867]

24 Further, we can use the KD-tree to only search for support points in localized regions around the line, ignoring large numbers of obviously infeasible points. [sent-59, score-0.747]
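The node-level pruning a single tree enables hinges on a feasibility test for whole regions: if a KD-tree node's bounding box lies farther from the candidate line than the support threshold, every point inside it can be ignored at once. A hedged 2-D sketch of such a test (our own formulation, not code from the paper):

```python
def box_line_min_distance(lo, hi, a, b):
    """Minimum distance from the axis-aligned box [lo, hi] to the infinite
    line through a and b (2-D). If the corners' signed distances straddle
    zero, the line cuts the box and the distance is zero."""
    (ax, ay), (bx, by) = a, b
    dx, dy = bx - ax, by - ay
    length = (dx * dx + dy * dy) ** 0.5
    corners = [(lo[0], lo[1]), (lo[0], hi[1]), (hi[0], lo[1]), (hi[0], hi[1])]
    signed = [(dx * (y - ay) - dy * (x - ax)) / length for x, y in corners]
    if min(signed) <= 0.0 <= max(signed):
        return 0.0
    return min(abs(s) for s in signed)

def node_is_feasible(lo, hi, a, b, threshold):
    """A tree node can be pruned outright when every point it could
    contain is farther than `threshold` from the candidate line."""
    return box_line_min_distance(lo, hi, a, b) <= threshold
```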

25 Similarly, trees have been used in tracking algorithms to efficiently find points near predicted track positions [5]. [sent-60, score-0.486]

26 We call these adaptations single tree algorithms, because at any given time the algorithm is searching at most one tree. [sent-61, score-0.586]

27 2 Parameter Space Methods: Another approach is to search for valid sets of points by searching the model’s parameter space, such as in the Hough transform [6]. [sent-63, score-0.713]

28 The idea behind these approaches is that we can test whether each point is compatible with a small set of model parameters, allowing us to search parameter space to find the valid sets. [sent-64, score-0.45]

29 However, there is a clear potential to push further and use structure from multiple aspects of the search at the same time. [sent-70, score-0.367]

30 For example, in the domain of asteroid linkage we may be able to limit the number of short, initial associations that we have to consider by using information from later time steps. [sent-72, score-0.576]

31 This idea forms the basis of multiple tree search algorithms [7, 8, 9]. [sent-73, score-0.898]

32 Multiple tree methods explicitly search for the entire set of points at once by searching over combinations of tree nodes. [sent-74, score-1.6]

33 In standard single tree algorithms, the search tries to find individual points satisfying some criteria (e.g. the next point to add) and the search state is represented by a single node that could contain such a point. [sent-75, score-0.987] [sent-77, score-0.321]

35 In contrast, multiple tree algorithms represent the current search state with multiple tree nodes that could contain points that together conform to the model. [sent-78, score-2.012]

36 Initially, the algorithm begins with k root nodes from either the same or different tree data structures, representing the k different points that must be found. [sent-79, score-0.958]
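The multiple tree recursion described here can be sketched as a depth-first search over tuples of nodes. The `Node` structure, branch-on-widest heuristic, and callback interface below are illustrative assumptions rather than the authors' implementation:

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    lo: float
    hi: float
    children: list = field(default_factory=list)

    @property
    def is_leaf(self):
        return not self.children

    @property
    def width(self):
        return self.hi - self.lo

def multiple_tree_search(nodes, mutually_compatible, emit):
    """Depth-first search over combinations of tree nodes, one node per
    point sought. Prunes whole combinations whose bounds cannot contain
    a conforming set; emits all-leaf combinations for exhaustive testing."""
    if not mutually_compatible(nodes):
        return  # no point set drawn from these regions can fit the model
    if all(n.is_leaf for n in nodes):
        emit(nodes)  # test the underlying points exhaustively
        return
    # Recurse on the widest unexpanded node (one common heuristic).
    i = max((j for j, n in enumerate(nodes) if not n.is_leaf),
            key=lambda j: nodes[j].width)
    for child in nodes[i].children:
        multiple_tree_search(nodes[:i] + [child] + nodes[i + 1:],
                             mutually_compatible, emit)
```

Note how the state is a whole tuple of nodes: with k trees the branching factor grows accordingly, which is exactly the drawback raised below.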

37 At each step in the search, it narrows in on a set of mutually compatible spatial regions and thus a set of individual points that fit the model by picking one of the model nodes and recursively exploring its children. [sent-80, score-0.689]

38 There are several important drawbacks to multiple tree algorithms. [sent-82, score-0.614]

39 First, additional trees introduce a higher branching factor in the search and increase the potential for taking deep “wrong turns.” [sent-83, score-0.437]

40 Second, care must be taken in order to deal with missing or a variable number of support points. [sent-84, score-0.362]

41 Prior work discusses the use of an additional “missing” tree node to handle these cases [9]. [sent-87, score-0.614]

42 4 Variable Tree Algorithms: In general we would like to exploit structural information from all aspects of our search problem, but do so while branching the search on just the parameters of interest. [sent-89, score-0.576]

43 To this end we propose a new type of search that uses a variable number of tree nodes. [sent-90, score-0.851]

44 Like a standard multiple tree algorithm, the variable tree algorithm searches combinations of tree nodes to find valid sets of points. [sent-91, score-2.164]

45 However, we limit this search to just those points required. [sent-92, score-1.317]

(Figure 1 caption, part A: The model nodes’ bounds (1 and 2) define a region of feasible support (shaded) for any combination of model points from those nodes.)

46 As shown in (B), we can classify entire support tree nodes as feasible (node b) or infeasible (nodes a and c). [sent-93, score-1.091]

47 Specifically, we use M model tree nodes (see footnote 1), which guide the recursion and thus the search. [sent-95, score-0.557]

48 In addition, throughout the search we maintain information about other potential supporting points that can be used to confirm the final track or prune the search due to a lack of support. [sent-96, score-0.903]

49 For example in the asteroid linking problem each line is defined by only 2 points, thus we can efficiently search through the models using a multiple tree search with 2 model trees. [sent-97, score-1.547]

50 As shown in Figure 1A, the spatial bounds of our current model nodes immediately limit the set of feasible support points for all line segments compatible with these nodes. [sent-99, score-1.048]

51 If we track which support points are feasible, we can use this information to prune the search due to a lack of support for any model defined by the points in those nodes. [sent-100, score-1.361]

52 The key idea behind the variable tree search is that we can use a dynamic representation of the potential support. [sent-101, score-0.86]

53 Specifically, we can place the support points in trees and maintain a dynamic list of currently valid support nodes. [sent-102, score-1.057]

54 As shown in Figure 1B, by only testing entire nodes (instead of individual points), we are using spatial coherence of the support points to remove the expense of testing each support point at each step in the search. [sent-104, score-1.088]

55 And by maintaining a list of support tree nodes, we are no longer branching the search over these trees. [sent-105, score-1.155]

56 Further, using a combination of a list and a tree for our representation allows us to refine our support representation on the fly. [sent-107, score-0.842]

57 If we reach a point in the search where a support node is no longer valid, we can simply drop it off the list. [sent-108, score-0.578]

58 And if we reach a point where a support node provides too coarse a representation of the current support space, we can simply remove it and add both of its children to the list. [sent-109, score-0.755]

59 This leaves the question of when to split support nodes. [sent-110, score-0.346]

60 If we split them too soon, we may end up with many support nodes in our list and mitigate the benefits of the nodes’ spatial coherence. [sent-111, score-0.73]

61 If we wait too long to split them, then we may have a few large support nodes that cannot efficiently be pruned. [sent-112, score-0.522]

62 Although we are still investigating splitting strategies, the experiments in this paper use a heuristic that seeks to provide a small number of support nodes that are a reasonable fit to the feasible region. [sent-113, score-0.566]

63 We effectively split a support node if doing so would allow one of its two children to be pruned. [sent-114, score-0.475]
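The dynamic support list and split heuristic described above might look like the following refinement pass, run as the model nodes narrow (the `SupportNode` record and the feasibility callback are our assumptions, sketched for illustration):

```python
from dataclasses import dataclass, field

@dataclass
class SupportNode:
    lo: float
    hi: float
    n_points: int = 0
    children: list = field(default_factory=list)

def update_support_list(support_nodes, feasible, count_points):
    """One refinement pass over the dynamic support list: drop nodes that
    are no longer feasible, and split a node when doing so would let one
    of its children be pruned."""
    refined = []
    for node in support_nodes:
        if not feasible(node):
            continue  # drop infeasible support wholesale, one test per node
        if node.children and not all(feasible(c) for c in node.children):
            # Splitting pays off: at least one child is prunable.
            refined.extend(c for c in node.children if feasible(c))
        else:
            refined.append(node)
    valid_support = sum(count_points(n) for n in refined)
    return refined, valid_support
```

Dropping a node discards all of its points with a single test, while splitting only when a child becomes prunable keeps the list short, the trade-off between too-eager and too-lazy splitting noted above.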

64 The full variable tree algorithm is given in Figure 2. [sent-116, score-0.581]

65 A simple example of finding linear tracks, using the track’s endpoints (earliest and latest in time) as model points, is illustrated in Figure 3. (Footnote 1: Typically M = c, although in some cases it may be beneficial to use a different number of model nodes.) [sent-117, score-0.327]

66 (from the Figure 2 pseudocode)
IF we have enough valid support points:
    IF all of m ∈ M are leaves:
        Test all combinations of points owned by the model nodes, using the support nodes’ points as potential support. [sent-135, score-1.177]

67 (from the Figure 2 pseudocode)
    ELSE: Let m∗ be the non-leaf model tree node that owns the most points. [sent-137, score-0.645]

68 Figure 2: A simple variable tree algorithm for spatial structure search. [sent-140, score-0.693]

69 The algorithm shown uses simple heuristics such as searching the model node with the most points and splitting a support node if it is too wide. [sent-141, score-0.812]

70 In that example, all other points are used for support. [sent-143, score-0.485]

71 The first column shows all the tree nodes that are currently part of the search. [sent-144, score-0.73]

72 The second and third columns show the search’s position on the two model trees and the current set of valid support nodes respectively. [sent-145, score-0.701]

73 Unlike the pure multiple tree search, the variable tree search does not “branch off” on the support trees, allowing us to consider multiple support nodes from the same time step at any point in the search. [sent-146, score-2.234]

74 Again, it is important to note that by testing the support points as we search, we are both incorporating support information into the pruning decisions and “pruning” the support points for entire sets of models at once. [sent-147, score-1.329]

75 5 Results on the Asteroid Linking Domain: The goal of the single-night asteroid linkage problem is to find sets of 2-dimensional point detections that correspond to a roughly linear motion model. [sent-148, score-0.687]

76 The asteroid detection data consists of detections from 8 images of the night sky separated by half-hour intervals. [sent-156, score-0.597]

77 It should be noted that only limited preprocessing was done to the data, resulting in a very high level of noise. [sent-162, score-0.845]

(Figure 3, panels Search Step 1, Search Step 2, and Search Step 5: The variable tree algorithm performs a depth first search over the model nodes.)

78 At each level of the search the model nodes are checked for mutual compatibility and each support node on the list is checked for compatibility with the set of model nodes. [sent-163, score-1.003]

79 Since we are not branching on the support nodes, we can split a support node and add both children to our list. [sent-164, score-0.838]

80 This figure shows the current model and support nodes and their spatial regions. [sent-165, score-0.632]

81 Table 1: The running times (in seconds) for the asteroid linkers with different detection thresholds σ and thus different numbers N and densities of observations. [sent-166, score-0.464]

82 The results on the intra-night asteroid tracking domain, shown in Table 1, illustrate a clear advantage to using a variable tree approach. [sent-174, score-1.024]

83 As the significance threshold σ decreases, the number and density of detections increases, allowing the support tree nodes to capture feasibility information for a large number of support points. [sent-175, score-1.5]

84 In contrast, neither the full multiple tree algorithm nor the single-tree algorithm performed well. [sent-176, score-0.614]

85 For the multiple tree algorithm, this decrease in performance is likely due to a combination of the high number of time steps, the allowance of a missing observation, and the high density. [sent-177, score-0.664]

86 Table 2: Average running times (in seconds) for a 2-dimensional rectangle search with different numbers of points N. [sent-179, score-0.545]

87 Table 3: Average running times (in seconds) for a rectangle search with different numbers of required corners k. [sent-209, score-0.367]

88 6 Experiments on the Simulated Rectangle Domain: We can apply the above techniques to a range of other model-based spatial search problems. [sent-226, score-0.345]

89 Potential support points are those points that fall within some threshold of the other 2D − 2 corners. [sent-230, score-0.761]
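In the 2-D rectangle domain, two opposite corners serve as model points and fully determine the remaining corners, so support points are those that land near a predicted corner. A small sketch (the per-coordinate, Chebyshev-style threshold test is our simplification; the paper only specifies "within some threshold"):

```python
def predicted_corners(a, b):
    """Given two opposite corners of an axis-aligned 2-D rectangle,
    the remaining two corners are fully determined."""
    (ax, ay), (bx, by) = a, b
    return [(ax, by), (bx, ay)]

def rectangle_support(points, a, b, threshold):
    """Support points fall within `threshold` (per coordinate) of one
    of the corners predicted from the two model points."""
    support = []
    for p in points:
        if any(max(abs(p[0] - cx), abs(p[1] - cy)) <= threshold
               for cx, cy in predicted_corners(a, b)):
            support.append(p)
    return support
```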

90 The results, shown in Table 2, show a graceful scaling of all of the multiple tree algorithms. [sent-240, score-0.614]

91 In contrast, the brute force and single tree algorithms run into trouble as the number of points becomes moderately large. [sent-241, score-0.937]

92 The variable tree algorithm consistently performs the best, as it is able to avoid significant amounts of redundant computation. [sent-242, score-0.581]

93 One potential drawback of the full multiple tree algorithm is that since it branches on all points, it may become inefficient as the allowable number of missing support points grows. [sent-243, score-1.235]

94 To test this we looked at 3-dimensional data and varied the minimum number of required support points k. [sent-244, score-0.485]

95 As shown in Table 3, all multiple tree methods become more expensive as the number of required support points decreases. [sent-245, score-1.127]

96 7 Conclusions: Tree-based spatial algorithms provide the potential for significant computational savings, with multiple tree algorithms providing further opportunities to exploit structure in the data. [sent-248, score-1.044]

97 We presented a variable tree approach that exploits the advantages of both single tree and multiple tree algorithms. [sent-250, score-1.721]

98 A combinatorial search is carried out over just the minimum number of model points, while still tracking the feasibility of the various support points. [sent-251, score-0.62]

99 As shown in the above experiments, this approach provides significant computational savings over both the traditional single tree and multiple tree searches. [sent-252, score-1.177]

100 Finally, it is interesting to note that the dynamic support technique described in this paper is general and may be applied to a range of other algorithms, such as the Fast Hough Transform [10], that maintain information on which points support a given model. [sent-253, score-0.778]


similar papers computed by tfidf model

tfidf for this paper:

wordName wordTfidf (topN-words)

[('tree', 0.526), ('asteroid', 0.377), ('support', 0.257), ('search', 0.233), ('points', 0.228), ('nodes', 0.204), ('detections', 0.149), ('spatial', 0.112), ('valid', 0.103), ('faint', 0.101), ('linkage', 0.1), ('multiple', 0.088), ('node', 0.088), ('kubica', 0.088), ('compatible', 0.083), ('branching', 0.08), ('brute', 0.08), ('trees', 0.078), ('connolly', 0.075), ('hough', 0.075), ('scurr', 0.075), ('feasible', 0.075), ('tracking', 0.066), ('astronomy', 0.066), ('prune', 0.064), ('track', 0.063), ('sets', 0.061), ('split', 0.061), ('searching', 0.06), ('linking', 0.059), ('list', 0.059), ('opportunities', 0.056), ('variable', 0.055), ('add', 0.054), ('pittsburgh', 0.052), ('force', 0.052), ('algorithms', 0.051), ('apple', 0.05), ('hawaii', 0.05), ('honolulu', 0.05), ('jedicke', 0.05), ('masiero', 0.05), ('rectangle', 0.05), ('compatibility', 0.05), ('corners', 0.05), ('missing', 0.05), ('searches', 0.048), ('threshold', 0.048), ('andrew', 0.047), ('providing', 0.047), ('potential', 0.046), ('nding', 0.044), ('night', 0.044), ('renewal', 0.044), ('child', 0.043), ('pruning', 0.041), ('children', 0.041), ('astronomical', 0.04), ('returning', 0.04), ('handful', 0.04), ('gb', 0.04), ('jeremy', 0.04), ('allowable', 0.04), ('conform', 0.04), ('place', 0.039), ('moore', 0.038), ('rectangles', 0.037), ('observational', 0.037), ('endpoints', 0.037), ('ghz', 0.037), ('associations', 0.037), ('savings', 0.037), ('end', 0.037), ('maintain', 0.036), ('strings', 0.035), ('running', 0.034), ('feasibility', 0.033), ('domain', 0.032), ('model', 0.031), ('inef', 0.031), ('limit', 0.03), ('remove', 0.03), ('exploit', 0.03), ('constructive', 0.03), ('heuristics', 0.03), ('splitting', 0.03), ('infeasible', 0.029), ('survey', 0.029), ('transform', 0.028), ('expensive', 0.028), ('pa', 0.028), ('effectively', 0.028), ('current', 0.028), ('leaves', 0.028), ('detection', 0.027), ('seconds', 0.027), ('combinations', 0.027), ('con', 0.027), ('table', 0.027), ('density', 0.026)]

similar papers list:

simIndex simValue paperId paperTitle

same-paper 1 0.99999917 200 nips-2005-Variable KD-Tree Algorithms for Spatial Pattern Search and Discovery

Author: Jeremy Kubica, Joseph Masiero, Robert Jedicke, Andrew Connolly, Andrew W. Moore

Abstract: In this paper we consider the problem of finding sets of points that conform to a given underlying model from within a dense, noisy set of observations. This problem is motivated by the task of efficiently linking faint asteroid detections, but is applicable to a range of spatial queries. We survey current tree-based approaches, showing a trade-off exists between single tree and multiple tree algorithms. To this end, we present a new type of multiple tree algorithm that uses a variable number of trees to exploit the advantages of both approaches. We empirically show that this algorithm performs well using both simulated and astronomical data.

2 0.1583807 154 nips-2005-Preconditioner Approximations for Probabilistic Graphical Models

Author: John D. Lafferty, Pradeep K. Ravikumar

Abstract: We present a family of approximation techniques for probabilistic graphical models, based on the use of graphical preconditioners developed in the scientific computing literature. Our framework yields rigorous upper and lower bounds on event probabilities and the log partition function of undirected graphical models, using non-iterative procedures that have low time complexity. As in mean field approaches, the approximations are built upon tractable subgraphs; however, we recast the problem of optimizing the tractable distribution parameters and approximate inference in terms of the well-studied linear systems problem of obtaining a good matrix preconditioner. Experiments are presented that compare the new approximation schemes to variational methods. 1

3 0.14544286 125 nips-2005-Message passing for task redistribution on sparse graphs

Author: K. Wong, Zhuo Gao, David Tax

Abstract: The problem of resource allocation in sparse graphs with real variables is studied using methods of statistical physics. An efficient distributed algorithm is devised on the basis of insight gained from the analysis and is examined using numerical simulations, showing excellent performance and full agreement with the theoretical results.

4 0.14482473 175 nips-2005-Sequence and Tree Kernels with Statistical Feature Mining

Author: Jun Suzuki, Hideki Isozaki

Abstract: This paper proposes a new approach to feature selection based on a statistical feature mining technique for sequence and tree kernels. Since natural language data take discrete structures, convolution kernels, such as sequence and tree kernels, are advantageous for both the concept and accuracy of many natural language processing tasks. However, experiments have shown that the best results can only be achieved when limited small sub-structures are dealt with by these kernels. This paper discusses this issue of convolution kernels and then proposes a statistical feature selection that enable us to use larger sub-structures effectively. The proposed method, in order to execute efficiently, can be embedded into an original kernel calculation process by using sub-structure mining algorithms. Experiments on real NLP tasks confirm the problem in the conventional method and compare the performance of a conventional method to that of the proposed method.

5 0.13432463 204 nips-2005-Walk-Sum Interpretation and Analysis of Gaussian Belief Propagation

Author: Dmitry Malioutov, Alan S. Willsky, Jason K. Johnson

Abstract: This paper presents a new framework based on walks in a graph for analysis and inference in Gaussian graphical models. The key idea is to decompose correlations between variables as a sum over all walks between those variables in the graph. The weight of each walk is given by a product of edgewise partial correlations. We provide a walk-sum interpretation of Gaussian belief propagation in trees and of the approximate method of loopy belief propagation in graphs with cycles. This perspective leads to a better understanding of Gaussian belief propagation and of its convergence in loopy graphs. 1

6 0.1333099 153 nips-2005-Policy-Gradient Methods for Planning

7 0.11016823 69 nips-2005-Fast Gaussian Process Regression using KD-Trees

8 0.090055831 70 nips-2005-Fast Information Value for Graphical Models

9 0.08833731 11 nips-2005-A Hierarchical Compositional System for Rapid Object Detection

10 0.085387066 18 nips-2005-Active Learning For Identifying Function Threshold Boundaries

11 0.083066002 78 nips-2005-From Weighted Classification to Policy Search

12 0.081774928 121 nips-2005-Location-based activity recognition

13 0.08121831 124 nips-2005-Measuring Shared Information and Coordinated Activity in Neuronal Networks

14 0.060480736 192 nips-2005-The Information-Form Data Association Filter

15 0.059642099 5 nips-2005-A Computational Model of Eye Movements during Object Class Detection

16 0.058409203 149 nips-2005-Optimal cue selection strategy

17 0.058188196 123 nips-2005-Maximum Margin Semi-Supervised Learning for Structured Variables

18 0.057904918 171 nips-2005-Searching for Character Models

19 0.056273017 96 nips-2005-Inference with Minimal Communication: a Decision-Theoretic Variational Approach

20 0.055519551 122 nips-2005-Logic and MRF Circuitry for Labeling Occluding and Thinline Visual Contours


similar papers computed by lsi model

lsi for this paper:

topicId topicWeight

[(0, 0.196), (1, 0.029), (2, 0.065), (3, 0.038), (4, -0.054), (5, 0.024), (6, -0.158), (7, 0.23), (8, 0.082), (9, 0.24), (10, 0.141), (11, -0.109), (12, -0.002), (13, 0.001), (14, 0.042), (15, 0.006), (16, -0.103), (17, -0.09), (18, -0.023), (19, 0.034), (20, 0.048), (21, -0.028), (22, -0.074), (23, -0.15), (24, -0.029), (25, 0.04), (26, -0.107), (27, -0.136), (28, 0.094), (29, 0.098), (30, 0.03), (31, -0.03), (32, 0.127), (33, 0.093), (34, 0.098), (35, 0.056), (36, -0.032), (37, -0.084), (38, 0.017), (39, 0.084), (40, 0.034), (41, -0.002), (42, 0.038), (43, 0.055), (44, 0.046), (45, 0.011), (46, 0.045), (47, -0.031), (48, 0.023), (49, -0.049)]

similar papers list:

simIndex simValue paperId paperTitle

same-paper 1 0.97940075 200 nips-2005-Variable KD-Tree Algorithms for Spatial Pattern Search and Discovery

Author: Jeremy Kubica, Joseph Masiero, Robert Jedicke, Andrew Connolly, Andrew W. Moore

Abstract: In this paper we consider the problem of finding sets of points that conform to a given underlying model from within a dense, noisy set of observations. This problem is motivated by the task of efficiently linking faint asteroid detections, but is applicable to a range of spatial queries. We survey current tree-based approaches, showing a trade-off exists between single tree and multiple tree algorithms. To this end, we present a new type of multiple tree algorithm that uses a variable number of trees to exploit the advantages of both approaches. We empirically show that this algorithm performs well using both simulated and astronomical data.

2 0.70058244 204 nips-2005-Walk-Sum Interpretation and Analysis of Gaussian Belief Propagation

Author: Dmitry Malioutov, Alan S. Willsky, Jason K. Johnson

Abstract: This paper presents a new framework based on walks in a graph for analysis and inference in Gaussian graphical models. The key idea is to decompose correlations between variables as a sum over all walks between those variables in the graph. The weight of each walk is given by a product of edgewise partial correlations. We provide a walk-sum interpretation of Gaussian belief propagation in trees and of the approximate method of loopy belief propagation in graphs with cycles. This perspective leads to a better understanding of Gaussian belief propagation and of its convergence in loopy graphs. 1

3 0.67763609 125 nips-2005-Message passing for task redistribution on sparse graphs

Author: K. Wong, Zhuo Gao, David Tax

Abstract: The problem of resource allocation in sparse graphs with real variables is studied using methods of statistical physics. An efficient distributed algorithm is devised on the basis of insight gained from the analysis and is examined using numerical simulations, showing excellent performance and full agreement with the theoretical results.

4 0.6448518 175 nips-2005-Sequence and Tree Kernels with Statistical Feature Mining

Author: Jun Suzuki, Hideki Isozaki

Abstract: This paper proposes a new approach to feature selection based on a statistical feature mining technique for sequence and tree kernels. Since natural language data take discrete structures, convolution kernels, such as sequence and tree kernels, are advantageous for both the concept and accuracy of many natural language processing tasks. However, experiments have shown that the best results can only be achieved when limited small sub-structures are dealt with by these kernels. This paper discusses this issue of convolution kernels and then proposes a statistical feature selection that enable us to use larger sub-structures effectively. The proposed method, in order to execute efficiently, can be embedded into an original kernel calculation process by using sub-structure mining algorithms. Experiments on real NLP tasks confirm the problem in the conventional method and compare the performance of a conventional method to that of the proposed method.

5 0.56754905 154 nips-2005-Preconditioner Approximations for Probabilistic Graphical Models

Author: John D. Lafferty, Pradeep K. Ravikumar

Abstract: We present a family of approximation techniques for probabilistic graphical models, based on the use of graphical preconditioners developed in the scientific computing literature. Our framework yields rigorous upper and lower bounds on event probabilities and the log partition function of undirected graphical models, using non-iterative procedures that have low time complexity. As in mean field approaches, the approximations are built upon tractable subgraphs; however, we recast the problem of optimizing the tractable distribution parameters and approximate inference in terms of the well-studied linear systems problem of obtaining a good matrix preconditioner. Experiments are presented that compare the new approximation schemes to variational methods. 1

6 0.49317658 69 nips-2005-Fast Gaussian Process Regression using KD-Trees

7 0.482779 121 nips-2005-Location-based activity recognition

8 0.47501907 153 nips-2005-Policy-Gradient Methods for Planning

9 0.47218853 11 nips-2005-A Hierarchical Compositional System for Rapid Object Detection

10 0.46661299 18 nips-2005-Active Learning For Identifying Function Threshold Boundaries

11 0.41324082 70 nips-2005-Fast Information Value for Graphical Models

12 0.40818894 96 nips-2005-Inference with Minimal Communication: a Decision-Theoretic Variational Approach

13 0.40326834 122 nips-2005-Logic and MRF Circuitry for Labeling Occluding and Thinline Visual Contours

14 0.39094591 103 nips-2005-Kernels for gene regulatory regions

15 0.37358704 185 nips-2005-Subsequence Kernels for Relation Extraction

16 0.37307888 124 nips-2005-Measuring Shared Information and Coordinated Activity in Neuronal Networks

17 0.36105451 139 nips-2005-Non-iterative Estimation with Perturbed Gaussian Markov Processes

18 0.34899557 171 nips-2005-Searching for Character Models

19 0.33205155 46 nips-2005-Consensus Propagation

20 0.32575601 71 nips-2005-Fast Krylov Methods for N-Body Learning


similar papers computed by lda model

lda for this paper:

topicId topicWeight

[(3, 0.04), (10, 0.035), (27, 0.039), (31, 0.098), (34, 0.079), (39, 0.023), (41, 0.016), (55, 0.067), (67, 0.239), (69, 0.114), (73, 0.038), (88, 0.082), (91, 0.029)]

similar papers list:

simIndex simValue paperId paperTitle

1 0.91792351 174 nips-2005-Separation of Music Signals by Harmonic Structure Modeling

Author: Yun-gang Zhang, Chang-shui Zhang

Abstract: Separation of music signals is an interesting but difficult problem. It is helpful for many other music researches such as audio content analysis. In this paper, a new music signal separation method is proposed, which is based on harmonic structure modeling. The main idea of harmonic structure modeling is that the harmonic structure of a music signal is stable, so a music signal can be represented by a harmonic structure model. Accordingly, a corresponding separation algorithm is proposed. The main idea is to learn a harmonic structure model for each music signal in the mixture, and then separate signals by using these models to distinguish harmonic structures of different signals. Experimental results show that the algorithm can separate signals and obtain not only a very high Signalto-Noise Ratio (SNR) but also a rather good subjective audio quality. 1

same-paper 2 0.78041089 200 nips-2005-Variable KD-Tree Algorithms for Spatial Pattern Search and Discovery

Author: Jeremy Kubica, Joseph Masiero, Robert Jedicke, Andrew Connolly, Andrew W. Moore

Abstract: In this paper we consider the problem of finding sets of points that conform to a given underlying model from within a dense, noisy set of observations. This problem is motivated by the task of efficiently linking faint asteroid detections, but is applicable to a range of spatial queries. We survey current tree-based approaches, showing a trade-off exists between single tree and multiple tree algorithms. To this end, we present a new type of multiple tree algorithm that uses a variable number of trees to exploit the advantages of both approaches. We empirically show that this algorithm performs well using both simulated and astronomical data.

3 0.77496433 63 nips-2005-Efficient Unsupervised Learning for Localization and Detection in Object Categories

Author: Nicolas Loeff, Himanshu Arora, Alexander Sorokin, David Forsyth

Abstract: We describe a novel method for learning templates for recognition and localization of objects drawn from categories. A generative model represents the configuration of multiple object parts with respect to an object coordinate system; these parts in turn generate image features. The complexity of the model in the number of features is low, meaning our model is much more efficient to train than comparative methods. Moreover, a variational approximation is introduced that allows learning to be orders of magnitude faster than previous approaches while incorporating many more features. This results in both accuracy and localization improvements. Our model has been carefully tested on standard datasets; we compare with a number of recent template models. In particular, we demonstrate state-of-the-art results for detection and localization.

4 0.61730313 72 nips-2005-Fast Online Policy Gradient Learning with SMD Gain Vector Adaptation

Author: Jin Yu, Douglas Aberdeen, Nicol N. Schraudolph

Abstract: Reinforcement learning by direct policy gradient estimation is attractive in theory but in practice leads to notoriously ill-behaved optimization problems. We improve its robustness and speed of convergence with stochastic meta-descent, a gain vector adaptation method that employs fast Hessian-vector products. In our experiments the resulting algorithms outperform previously employed online stochastic, offline conjugate, and natural policy gradient methods.

5 0.60755891 96 nips-2005-Inference with Minimal Communication: a Decision-Theoretic Variational Approach

Author: O. P. Kreidl, Alan S. Willsky

Abstract: Given a directed graphical model with binary-valued hidden nodes and real-valued noisy observations, consider deciding upon the maximum a-posteriori (MAP) or the maximum posterior-marginal (MPM) assignment under the restriction that each node broadcasts only to its children exactly one single-bit message. We present a variational formulation, viewing the processing rules local to all nodes as degrees-of-freedom, that minimizes the loss in expected (MAP or MPM) performance subject to such online communication constraints. The approach leads to a novel message-passing algorithm to be executed offline, or before observations are realized, which mitigates the performance loss by iteratively coupling all rules in a manner implicitly driven by global statistics. We also provide (i) illustrative examples, (ii) assumptions that guarantee convergence and efficiency and (iii) connections to active research areas.

6 0.59271914 153 nips-2005-Policy-Gradient Methods for Planning

7 0.59052944 78 nips-2005-From Weighted Classification to Policy Search

8 0.58403009 154 nips-2005-Preconditioner Approximations for Probabilistic Graphical Models

9 0.57937396 45 nips-2005-Conditional Visual Tracking in Kernel Space

10 0.57640159 144 nips-2005-Off-policy Learning with Options and Recognizers

11 0.57326353 124 nips-2005-Measuring Shared Information and Coordinated Activity in Neuronal Networks

12 0.57256156 181 nips-2005-Spiking Inputs to a Winner-take-all Network

13 0.57198989 90 nips-2005-Hot Coupling: A Particle Approach to Inference and Normalization on Pairwise Undirected Graphs

14 0.5705871 36 nips-2005-Bayesian models of human action understanding

15 0.56922477 132 nips-2005-Nearest Neighbor Based Feature Selection for Regression and its Application to Neural Activity

16 0.56629819 67 nips-2005-Extracting Dynamical Structure Embedded in Neural Activity

17 0.56584579 30 nips-2005-Assessing Approximations for Gaussian Process Classification

18 0.56405371 43 nips-2005-Comparing the Effects of Different Weight Distributions on Finding Sparse Representations

19 0.5634622 187 nips-2005-Temporal Abstraction in Temporal-difference Networks

20 0.5632323 102 nips-2005-Kernelized Infomax Clustering