2022 - Research.com Rising Star of Science Award
His scientific interests lie mostly in Artificial intelligence, Natural language processing, Word, State and Margin. His Artificial intelligence research incorporates themes from Pattern recognition, Speech recognition and Orders of magnitude, while his Speech recognition work draws on Word representation, Word lists by frequency and Word embedding.
His work in the fields of Language model and Syntactic structure overlaps with areas such as Swahili and Hebrew. His Word study integrates areas such as Analogy, Online encyclopedia and Quality. His State research includes themes of Translation and Inference.
His main research concerns Artificial intelligence, Natural language processing, Language model, Word and Machine learning. His work ties his expertise in Pattern recognition to the subject of Artificial intelligence. Within Natural language processing, he focuses on Generative model, at times addressing concerns connected to Syntax, Principle of compositionality and Latent variable.
His study of Language model is interdisciplinary, drawing on Transfer of learning, Cognitive psychology, Theoretical computer science and Transformer. His Word study is interwoven with issues in Representation, Translation and State. He also investigates the connection between Machine learning and topics such as Class, which intersect with problems in Binary classification.
Edouard Grave mainly investigates Question answering, Transformer, Artificial intelligence, Machine translation and Information retrieval. His Transformer research integrates issues of Language model, Algorithm, Quantization and Cross lingual. His Artificial intelligence research is multidisciplinary, relying on both Machine learning and Natural language processing.
His Natural language processing research incorporates elements of Translation and Training set. His Machine translation study deals with Computation, which intersects with Deep learning and Inference. His Information retrieval research spans several fields, including Artificial neural network and Generative grammar.
This overview was generated by a machine learning system which analysed the scientist's body of work.
Enriching Word Vectors with Subword Information
Piotr Bojanowski;Edouard Grave;Armand Joulin;Tomas Mikolov.
Transactions of the Association for Computational Linguistics (2017)
Bag of Tricks for Efficient Text Classification
Armand Joulin;Edouard Grave;Piotr Bojanowski;Tomas Mikolov.
Conference of the European Chapter of the Association for Computational Linguistics (2017)
Unsupervised Cross-lingual Representation Learning at Scale
Alexis Conneau;Kartikay Khandelwal;Naman Goyal;Vishrav Chaudhary.
Meeting of the Association for Computational Linguistics (2020)
Learning Word Vectors for 157 Languages
Edouard Grave;Piotr Bojanowski;Prakhar Gupta;Armand Joulin.
Language Resources and Evaluation (2018)
Advances in Pre-Training Distributed Word Representations
Tomas Mikolov;Edouard Grave;Piotr Bojanowski;Christian Puhrsch.
Language Resources and Evaluation (2017)
Parseval Networks: Improving Robustness to Adversarial Examples
Moustapha Cisse;Piotr Bojanowski;Edouard Grave;Yann Dauphin.
International Conference on Machine Learning (2017)
FastText.zip: Compressing text classification models
Armand Joulin;Edouard Grave;Piotr Bojanowski;Matthijs Douze.
arXiv: Computation and Language (2016)
Colorless green recurrent networks dream hierarchically
Kristina Gulordava;Piotr Bojanowski;Edouard Grave;Tal Linzen.
North American Chapter of the Association for Computational Linguistics (2018)
Reducing Transformer Depth on Demand with Structured Dropout
Angela Fan;Edouard Grave;Armand Joulin.
International Conference on Learning Representations (2020)
Loss in Translation: Learning Bilingual Word Mapping with a Retrieval Criterion
Armand Joulin;Piotr Bojanowski;Tomas Mikolov;Hervé Jégou.
Empirical Methods in Natural Language Processing (2018)