Anders Lansner's research centres on Neuroscience, Artificial neural networks, Artificial intelligence, BCPNN and Electrophysiology. His Neuroscience work on the Central nervous system, Biological neural networks and Inhibitory postsynaptic potentials frequently involves the lamprey, linking diverse disciplines. His Artificial neural network research takes in the Cerebral cortex, Neurophysiology, Nerve nets and Computational models.
His Artificial intelligence research intersects with Machine learning, Task and Attractors. Within Machine learning, his work on Spiking neural networks overlaps with areas such as Open source. His studies of Hebbian theory focus on Algorithms and the Neocortex.
Anders Lansner mainly investigates Neuroscience, Artificial intelligence, Artificial neural networks, Attractors and Attractor networks. His Neuroscience studies are multidisciplinary, often centring on the lamprey. His Artificial intelligence research incorporates elements of Machine learning and Pattern recognition.
His study of Content-addressable memory, often connected to BCPNN, forms part of his broader work on Artificial neural networks. His research on Attractors intersects with the Neocortex and Theoretical computer science, while his work on Attractor networks draws on Context, Lateral inhibition and Pattern completion.
His primary areas of investigation include Neuroscience, Artificial neural networks, Artificial intelligence, Hebbian theory and Learning rules. Within Neuroscience he addresses Synaptic plasticity, which connects to disciplines such as the Reward system. In his Artificial neural network research he brings together Network models, Bayesian probability, Computer architecture and Implementation.
Within Artificial intelligence, his study focuses on Attractor networks in particular, drawing on both Stimulus and Excitatory postsynaptic potentials. His Hebbian theory research frequently involves the adjacent topic of Attractors.
His scientific interests lie mostly in Artificial neural networks, Artificial intelligence, BCPNN, Neuroscience and Hebbian theory. His Artificial neural network research includes Network models and Bioinformatics. His Artificial intelligence work brings together the Olfactory bulb, Computer architecture and the Olfactory system.
His Neuroscience research frequently connects to adjacent fields such as Synaptic scaling. In his Hebbian theory work he takes up themes such as the Prefrontal cortex, Attractors, Bayesian probability and Bayes' theorem. His Working memory research combines topics from a wide range of disciplines, including Neuroplasticity, Long-term memory and Explicit memory.
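The attractor-network, content-addressable-memory and pattern-completion themes above can be illustrated with a minimal sketch. Note this is a generic Hopfield-style network trained with the classic Hebbian outer-product rule, not Lansner's BCPNN model; all names and parameters here are illustrative.

```python
import numpy as np

def train_hebbian(patterns):
    """Hebbian outer-product rule: w_ij accumulates x_i * x_j; no self-connections."""
    n = patterns.shape[1]
    w = np.zeros((n, n))
    for p in patterns:
        w += np.outer(p, p)
    np.fill_diagonal(w, 0.0)
    return w / n

def recall(w, cue, steps=10):
    """Iterate synchronous +/-1 updates until the state settles on an attractor."""
    state = cue.copy()
    for _ in range(steps):
        new = np.where(w @ state >= 0, 1, -1)
        if np.array_equal(new, state):
            break
        state = new
    return state

# Store one +/-1 pattern, then complete it from a corrupted cue.
stored = np.array([1, 1, -1, -1, 1, -1, 1, -1])
w = train_hebbian(stored[None, :])
cue = stored.copy()
cue[:2] *= -1                # flip two bits to corrupt the cue
result = recall(w, cue)      # the dynamics restore the stored pattern
```

The network acts as a content-addressable memory: presenting a partial or noisy cue retrieves the full stored pattern, which is the pattern-completion behaviour the overview refers to.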
This overview was generated by a machine learning system which analysed the scientist's body of work.
Simulation of networks of spiking neurons: A review of tools and strategies
Romain Brette;Michelle Rudolph;Ted Carnevale;Michael L. Hines.
Journal of Computational Neuroscience (2007)
A Bayesian neural network method for adverse drug reaction signal generation
Andrew Bate;M. Lindquist;I.R. Edwards;S. Olsson.
European Journal of Clinical Pharmacology (1998)
Neural networks that co-ordinate locomotion and body orientation in lamprey
S. Grillner;T. Deliagina;A. El Manira;R.H. Hill.
Trends in Neurosciences (1995)
Neuronal network generating locomotor behavior in lamprey: circuitry, transmitters, membrane properties, and simulation.
Sten Grillner;Peter Wallen;Lennart Brodin;Anders Lansner.
Annual Review of Neuroscience (1991)
Neurocognitive Architecture of Working Memory
Johan Eriksson;Edward K. Vogel;Anders B. Lansner;Fredrik Bergstrom.
The cortex as a central pattern generator.
Rafael Yuste;Jason N MacLean;Jeffrey Smith;Anders Lansner.
Nature Reviews Neuroscience (2005)
Intrinsic function of a neuronal network - a vertebrate central pattern generator.
Sten Grillner;Örjan Ekeberg;Abdeljabbar El Manira;Anders Lansner.
Brain Research Reviews (1998)
Computing the Local Field Potential (LFP) from Integrate-and-Fire Network Models.
Alberto Mazzoni;Henrik Lindén;Hermann Cuntz;Anders Lansner.
PLOS Computational Biology (2015)
A computer based model for realistic simulations of neural networks
Ö. Ekeberg;P. Wallén;A. Lansner;H. Tråvén.
Biological Cybernetics (1991)
Associative memory models: from the cell-assembly theory to biophysically detailed cortex simulations.
Trends in Neurosciences (2009)