nips nips2000 nips2000-132 knowledge-graph by maker-knowledge-mining
Title: The Interplay of Symbolic and Subsymbolic Processes in Anagram Problem Solving
Source: pdf
Author: David B. Grimes, Michael Mozer
Abstract: Although connectionist models have provided insights into the nature of perception and motor control, connectionist accounts of higher cognition seldom go beyond an implementation of traditional symbol-processing theories. We describe a connectionist constraint satisfaction model of how people solve anagram problems. The model exploits statistics of English orthography, but also addresses the interplay of subsymbolic and symbolic computation by a mechanism that extracts approximate symbolic representations (partial orderings of letters) from subsymbolic structures and injects the extracted representation back into the model to assist in the solution of the anagram. We show the computational benefit of this extraction-injection process and discuss its relationship to conscious mental processes and working memory. We also account for experimental data concerning the difficulty of anagram solution based on the orthographic structure of the anagram string and the target word.

Historically, the mind has been viewed from two opposing computational perspectives. The symbolic perspective views the mind as a symbolic information processing engine. According to this perspective, cognition operates on representations that encode logical relationships among discrete symbolic elements, such as stacks and structured trees, and cognition involves basic operations such as means-ends analysis and best-first search. In contrast, the subsymbolic perspective views the mind as performing statistical inference, with basic operations such as constraint-satisfaction search; the data structures on which these operations take place are numerical vectors. In some domains of cognition, significant progress has been made through analysis from one computational perspective or the other. The thesis of our work is that many of these domains might be understood more completely by focusing on the interplay of subsymbolic and symbolic information processing.

Consider the higher-cognitive domain of problem solving. At an abstract level of description, problem-solving tasks can readily be formalized in terms of symbolic representations and operations. However, the neurobiological hardware that underlies human cognition appears to be subsymbolic: representations are noisy and graded, and the brain operates and adapts in a continuous fashion that is difficult to characterize in discrete symbolic terms. At some level, between the computational level of the task description and the implementation level of human neurobiology, the symbolic and subsymbolic accounts must come into contact with one another. We focus on this point of contact by proposing mechanisms by which symbolic representations can modulate subsymbolic processing, and mechanisms by which subsymbolic representations are made symbolic. We conjecture that these mechanisms not only provide an account of the interplay of symbolic and subsymbolic processes in cognition, but also form a sensible computational strategy that outperforms purely subsymbolic computation; hence, symbolic reasoning makes sense from an evolutionary perspective.

In this paper, we apply our approach to a high-level cognitive task, anagram problem solving. An anagram is a nonsense string of letters whose letters can be rearranged to form a word. For example, the solution to the anagram puzzle RYTEHO is THEORY.
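The anagram relation itself is easy to pin down in code. The small check below is our own illustration, not anything from the paper: a candidate solves a puzzle exactly when the two strings contain the same multiset of letters.

```python
from collections import Counter

def solves_anagram(puzzle: str, candidate: str) -> bool:
    """True when candidate is a rearrangement of the puzzle's letters."""
    return Counter(puzzle.upper()) == Counter(candidate.upper())

print(solves_anagram("RYTEHO", "THEORY"))  # True
print(solves_anagram("RYTEHO", "THERMO"))  # False: different letters
```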
Anagram solving is an interesting task because it taps higher cognitive abilities and issues of awareness, it has a tractable state space, and relevant psychological data are available to model.

1 A Subsymbolic Computational Model

We start by presenting a purely subsymbolic model of anagram processing. By subsymbolic, we mean that the model utilizes only English orthographic statistics and does not have access to an English lexicon. We will argue that this model proves insufficient to explain human performance on anagram problem solving. However, it is a key component of a hybrid symbolic-subsymbolic model we propose, and is thus described in detail.

1.1 Problem Representation

A computational model of anagram processing must represent letter orderings. For example, the model must be capable of representing a solution such as
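To illustrate what scoring strings with orthographic statistics but no lexicon can mean, here is a minimal sketch assuming those statistics are letter-bigram frequencies; the toy counts, the '#' boundary marker, and the function name are illustrative assumptions, not the paper's model.

```python
import math

# Toy bigram counts standing in for statistics of English orthography;
# '#' marks a word boundary. A real table would be estimated from a corpus.
BIGRAM_COUNTS = {
    ("#", "T"): 900, ("T", "H"): 800, ("H", "E"): 950, ("E", "O"): 120,
    ("O", "R"): 400, ("R", "Y"): 300, ("Y", "#"): 250, ("#", "R"): 300,
    ("Y", "T"): 5, ("T", "E"): 500, ("E", "H"): 30, ("H", "O"): 90,
    ("O", "#"): 60,
}
TOTAL = sum(BIGRAM_COUNTS.values())

def bigram_logprob(ordering: str) -> float:
    """Score a letter ordering by summed bigram log-probabilities.

    No lexicon is consulted: a high score means the string looks like
    English orthographically, whether or not it is actually a word.
    """
    padded = ["#"] + list(ordering.upper()) + ["#"]
    score = 0.0
    for a, b in zip(padded, padded[1:]):
        count = BIGRAM_COUNTS.get((a, b), 1)  # crude add-one smoothing
        score += math.log(count / TOTAL)
    return score

# Under English-like statistics, THEORY outscores its scramble RYTEHO,
# even though neither scorer call knows THEORY is a word.
print(bigram_logprob("THEORY") > bigram_logprob("RYTEHO"))  # True
```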