2005 - David E. Rumelhart Prize for Contributions to the Theoretical Foundations of Human Cognition
1995 - Fellow of the John Simon Guggenheim Memorial Foundation
Paul Smolensky's research centers on Artificial intelligence, Linguistics, Connectionism, Optimality theory and Cognitive science. His Artificial intelligence work is multidisciplinary, drawing on Machine learning, Relevance and Natural language processing. His Natural language processing research incorporates themes from Mathematical economics as well as Syllable structure, Metrical phonology and Syllable weight.
His study of Connectionism combines subjects such as Epistemology, Cognitive architecture and Symbolic computation. His work on Optimality theory contributes to the literature on Constraint. Within Cognitive science, he concentrates on Information processing and, occasionally, Set, Cognitive model and Dynamicism.
His main research areas are Artificial intelligence, Natural language processing, Linguistics, Connectionism and Theoretical computer science. His Artificial intelligence work incorporates Optimality theory, Constraint and Grammar, and his Optimality theory research in turn connects to Algorithms and Learnability.
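The link between Optimality theory and algorithms can be made concrete: under a strict constraint ranking, candidates are compared lexicographically on their violation counts, highest-ranked constraint first. A minimal sketch, with illustrative constraint names and candidates (not drawn from Smolensky's own analyses):

```python
def optimal(candidates, ranking, violations):
    """Return the candidate whose violation profile is lexicographically
    minimal under the given strict constraint ranking."""
    def profile(cand):
        # Violation counts ordered from highest- to lowest-ranked constraint.
        return tuple(violations[cand].get(c, 0) for c in ranking)
    return min(candidates, key=profile)

# Illustrative tableau: Onset outranks NoCoda.
ranking = ["Onset", "NoCoda"]
candidates = ["ta", "at", "tat"]
violations = {
    "ta":  {"Onset": 0, "NoCoda": 0},
    "at":  {"Onset": 1, "NoCoda": 1},
    "tat": {"Onset": 0, "NoCoda": 1},
}

print(optimal(candidates, ranking, violations))  # prints: ta
```

Lexicographic comparison captures strict domination: no number of violations of a lower-ranked constraint can outweigh a single violation of a higher-ranked one.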
In his research, Branching connects closely with Theoretical linguistics, within the broader field of Natural language processing. His Connectionism work brings together Variable, Cognitive science, DUAL, Harmonic Grammar and Symbolic computation. His Theoretical computer science studies take up themes such as Sequence, Representation, Tensor product and Formal language.
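Harmonic Grammar, mentioned above, replaces strict constraint ranking with numerical weights: the optimal candidate maximizes harmony, the negated weighted sum of constraint violations. A minimal sketch with illustrative weights and candidates:

```python
# Illustrative weights: Onset violations cost twice as much as NoCoda.
weights = {"Onset": 2.0, "NoCoda": 1.0}

violations = {
    "ta":  {"Onset": 0, "NoCoda": 0},
    "at":  {"Onset": 1, "NoCoda": 1},
    "tat": {"Onset": 0, "NoCoda": 1},
}

def harmony(cand):
    # Harmony is the negated weighted violation count; higher is better.
    return -sum(weights[c] * n for c, n in violations[cand].items())

best = max(violations, key=harmony)
print(best)  # prints: ta
```

Unlike strict ranking, weighted constraints allow "gang effects", where several low-weight violations jointly outweigh one high-weight violation.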
Much of his research addresses Artificial intelligence, Theoretical computer science, Natural language processing, Artificial neural networks and Tensor products. His Artificial intelligence work frequently links to Grammar, while his Theoretical computer science studies engage Parsing, Formal language, Knowledge bases, Property and Sequence.
His Natural language processing research draws on both Theoretical linguistics and Symbolic computation. His Artificial neural network studies are interwoven with Speech recognition, Speech production, Structure and the Principle of compositionality; his work on compositionality in turn integrates Structure, Training sets, Numeral systems, Cognitive science and Decomposition.
His investigations also cover Theoretical computer science, Natural language processing, Artificial intelligence, Sequence and Tensor products. His Theoretical computer science research takes up Relational encoding and Natural language, while his Natural language processing work brings together Deep learning, Theoretical linguistics and Symbolic computation.
His work on neural architectures focuses on the Transformer. His Sequence research integrates Tuples, Formal language, Encoders, Relations and Lisp, and his Tensor product research connects to Autoencoders, intersecting with problems in Sentence and Synthetic data.
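The tensor product representations recurring above (Smolensky, 1990) bind each symbolic filler to a structural role via an outer product and superpose the bindings into one tensor; with orthonormal role vectors, unbinding by contraction is exact. A minimal sketch, with illustrative vectors and dimensions:

```python
import numpy as np

# Filler vectors for the symbols A and B (dimensions are illustrative).
fillers = {"A": np.array([1.0, 0.0, 0.0]),
           "B": np.array([0.0, 1.0, 0.0])}

# Orthonormal role vectors for two structural positions.
roles = {"r1": np.array([1.0, 0.0]),
         "r2": np.array([0.0, 1.0])}

# Bind filler to role with an outer product, then superpose:
# T = A (x) r1 + B (x) r2 -- one tensor encodes the whole structure.
T = np.outer(fillers["A"], roles["r1"]) + np.outer(fillers["B"], roles["r2"])

# Unbinding: contracting T with a role vector recovers that role's filler
# exactly, because the roles are orthonormal.
recovered_A = T @ roles["r1"]
recovered_B = T @ roles["r2"]

print(np.allclose(recovered_A, fillers["A"]))  # prints: True
print(np.allclose(recovered_B, fillers["B"]))  # prints: True
```

Because binding and unbinding are linear maps, such representations let a neural network hold and manipulate symbolic structure in distributed activation patterns.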
This overview was generated by a machine learning system which analysed the scientist's body of work.
Optimality Theory: Constraint Interaction in Generative Grammar
Alan S. Prince;Paul Smolensky.
On the proper treatment of connectionism
Behavioral and Brain Sciences (1993)
Information processing in dynamical systems: foundations of harmony theory
Parallel distributed processing: explorations in the microstructure of cognition, vol. 1 (1986)
Learnability in Optimality Theory
Bruce Tesar;Paul Smolensky.
Tensor product variable binding and the representation of symbolic structures in connectionist systems
Artificial Intelligence (1990)
Skeletonization: A Technique for Trimming the Fat from a Network via Relevance Assessment
Michael C. Mozer;Paul Smolensky.
Neural Information Processing Systems (1988)
On the comprehension/production dilemma in child language
Linguistic Inquiry (1996)
The harmonic mind: From neural computation to optimality-theoretic grammar (Cognitive architecture), Vol. 1
Paul Smolensky;Géraldine Legendre.
Optimality: From Neural Networks to Universal Grammar
Alan Prince;Paul Smolensky.
Using Relevance to Reduce Network Size Automatically
Michael C. Mozer;Paul Smolensky.
Connection Science (1989)