His primary areas of investigation include Information transfer, Theoretical computer science, Transfer entropy, Complex system and Entropy. His Information transfer research integrates concerns from other disciplines, such as Machine learning, Elementary cognitive task, Functional magnetic resonance imaging and Information processing, while his Theoretical computer science work takes in themes such as Computation and Cellular automaton.
Joseph T. Lizier focuses on Transfer entropy, Information transfer, Theoretical computer science, Complex system and Information theory. He brings together Data mining, Mutual information, Statistical physics, Multivariate statistics and Algorithm in his investigation of Transfer entropy. He works mostly in the field of Information transfer, narrowing it down to concerns involving Cellular automaton and, occasionally, Causal structure and Contrast.
Joseph T. Lizier focuses mostly on Theoretical computer science, narrowing it down to matters related to Computation and, in some cases, Swarm behaviour. His Complex system research combines topics in areas such as Artificial life and Information processing. He also investigates the connection between Information theory and Entropy.
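For orientation, the transfer entropy that recurs throughout this work is Schreiber's measure of directed information flow from a source process Y to a target process X. A standard form, conditioning on k past states of the target (the history length k is the usual embedding parameter), is

\[
T_{Y \to X} = \sum_{x_{n+1},\, x_n^{(k)},\, y_n} p\left(x_{n+1}, x_n^{(k)}, y_n\right) \log_2 \frac{p\left(x_{n+1} \mid x_n^{(k)}, y_n\right)}{p\left(x_{n+1} \mid x_n^{(k)}\right)},
\]

i.e. the extra predictability about the target's next state gained from the source's past, beyond what the target's own past already provides.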
His scientific interests lie mostly in Transfer entropy, Multivariate statistics, Pairwise comparison, Statistical hypothesis testing and Information transfer. His Transfer entropy research includes themes of Bivariate analysis, Information theory, Network model and Algorithm. His work in the field of Bivariate analysis draws on Complex system, Mutual information and Data mining.
His Information transfer research incorporates themes from Statistical physics and Information processing. His studies connect Information processing with genetics, Biological system and Artificial neural network. The concepts of his Random variable study are interwoven with issues in Theoretical computer science and Entropy.
Transfer entropy, Multivariate statistics, Information theory, Statistical hypothesis testing and Group behavior are his primary areas of study. His work deals with themes such as Polarization, Time series and Topology, which intersect with Transfer entropy. His research in Multivariate statistics intersects with topics in Pearson product-moment correlation coefficient, Econometrics and Autocorrelation.
His study of Information theory is interdisciplinary in nature, drawing on Sample size determination, Network model, Data mining, Word error rate and Statistical inference. His Statistical hypothesis testing studies deal with areas such as False positive rate, Autoregressive model, Algorithm, Granger causality and Sampling distribution. Group behavior is connected with Shoaling and schooling, Affect, Conformity and Zoology in his work.
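The hypothesis-testing theme above is commonly handled with surrogate data: the observed transfer entropy is compared against a null distribution obtained by shuffling the source series, which controls the false positive rate. The following is a minimal, illustrative Python sketch of that idea for binary time series, using a simple plug-in estimator and a permutation test; it is not code from Lizier's JIDT toolkit (listed below), which provides more careful estimators and built-in significance testing.

import numpy as np
from collections import Counter

def transfer_entropy(source, target, k=1):
    """Plug-in estimate (in bits) of transfer entropy source -> target, with target history length k."""
    n = len(target)
    joint = Counter()  # counts of (target_next, target_past, source_past)
    for t in range(k, n):
        joint[(target[t], tuple(target[t - k:t]), source[t - 1])] += 1
    total = sum(joint.values())
    past_src = Counter()   # counts of (target_past, source_past)
    next_past = Counter()  # counts of (target_next, target_past)
    past = Counter()       # counts of (target_past,)
    for (x_next, x_past, y), c in joint.items():
        past_src[(x_past, y)] += c
        next_past[(x_next, x_past)] += c
        past[x_past] += c
    te = 0.0
    for (x_next, x_past, y), c in joint.items():
        p_cond_full = c / past_src[(x_past, y)]                    # p(x_next | x_past, y_past)
        p_cond_past = next_past[(x_next, x_past)] / past[x_past]   # p(x_next | x_past)
        te += (c / total) * np.log2(p_cond_full / p_cond_past)
    return te

def permutation_test(source, target, k=1, n_surrogates=200, seed=None):
    """p-value: fraction of source-shuffled surrogates whose TE is >= the observed TE."""
    rng = np.random.default_rng(seed)
    observed = transfer_entropy(source, target, k)
    null = [transfer_entropy(rng.permutation(source), target, k) for _ in range(n_surrogates)]
    p_value = float(np.mean([te >= observed for te in null]))
    return observed, p_value

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    y = rng.integers(0, 2, 2000)                                   # source series
    x = np.empty_like(y)
    x[0] = 0
    x[1:] = y[:-1] ^ (rng.random(len(y) - 1) < 0.1).astype(int)    # target copies source with 10% bit flips
    print(permutation_test(y, x, k=1, n_surrogates=100, seed=1))

Shuffling destroys the temporal relationship between source and target while preserving each series' marginal statistics, so the surrogate TE values approximate the sampling distribution under the null hypothesis of no directed coupling.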
JIDT: an information-theoretic toolkit for studying the dynamics of complex systems
Joseph Troy Lizier.
Frontiers in Robotics and AI (2014)
Local information transfer as a spatiotemporal filter for complex systems.
Joseph T. Lizier;Mikhail Prokopenko;Albert Y. Zomaya.
Physical Review E (2008)
Measuring Information-Transfer Delays
Michael Wibral;Nicolae Pampu;Viola Priesemann;Felix Siebenhühner.
PLOS ONE (2013)
Information processing in echo state networks at the edge of chaos
Joschka Boedecker;Oliver Obst;Joseph T. Lizier;N. Michael Mayer.
Theory in Biosciences (2012)
Differentiating information transfer and causal effect
Joseph T. Lizier;Mikhail Prokopenko.
European Physical Journal B (2010)
An Introduction to Transfer Entropy: Information Flow in Complex Systems
Terry Bossomaier;Lionel Barnett;Michael Harré;Joseph T. Lizier.
Springer (2016)
Local active information storage as a tool to understand distributed neural information processing
Michael Wibral;Joseph T. Lizier;Sebastian Vögler;Viola Priesemann.
Frontiers in Neuroinformatics (2014)
Multivariate information-theoretic measures reveal directed information structure and task relevant changes in fMRI connectivity
Joseph T. Lizier;Jakob Heinzle;Annette Horstmann;John-Dylan Haynes.
Journal of Computational Neuroscience (2011)
Local measures of information storage in complex distributed computation
Joseph T. Lizier;Mikhail Prokopenko;Albert Y. Zomaya.
Information Sciences (2012)
Directed Information Measures in Neuroscience
Michael Wibral;Raul Vicente;Joseph T. Lizier.
Understanding Complex Systems (2014)
University of Sydney
University of Göttingen
University of Sydney
University of Sydney
Max Planck Institute for Mathematics in the Sciences
Charité - University Medicine Berlin
Santa Fe Institute
University of Sydney
Norwegian University of Life Sciences
Centre national de la recherche scientifique, CNRS
École de Technologie Supérieure
University of Michigan–Ann Arbor
Portland State University
Grenoble Alpes University
École Polytechnique Fédérale de Lausanne
Texas A&M University
University of Cologne
University of Bonn
Prefectural University of Hiroshima
Toyohashi University of Technology
GNS Science
Michigan State University
University of Ulm
Brighton and Sussex Medical School
Charité - University Medicine Berlin
Rice University