Yoav Goldberg

H-Index & Metrics

Discipline          H-index   Citations   Publications   World Ranking   National Ranking
Computer Science    53        19,999      141            2481            43

Overview

What is he best known for?

The fields of study he is best known for:

  • Artificial intelligence
  • Machine learning
  • Natural language processing

His scientific interests lie mostly in Artificial intelligence, Natural language processing, Word, Theoretical computer science and Recurrent neural networks. His research investigates the connection between Artificial intelligence and Machine learning, with intersections in Adversarial systems. His Natural language processing work deals with themes such as Context, Training sets, Artificial neural networks, Information retrieval and Test sets.

His study of Word embeddings is also linked to topics like Gender bias. In his research on Theoretical computer science, Parsing, Parser combinators and Range are strongly related to Algorithms. Within the same line of work, Yoav Goldberg frequently deals with Recurrent neural networks, concentrating on Deep learning and often touching on Nervous system network models, Types of artificial neural networks, Neural network software, Cellular neural networks and Time delay neural networks.
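As a concrete illustration of the word-embedding work mentioned above: the most cited paper listed below, "Neural Word Embedding as Implicit Matrix Factorization", shows that skip-gram with negative sampling implicitly factorizes a word-context PMI matrix shifted by log k (k being the number of negative samples), and that an SVD of the shifted positive PMI matrix yields comparable embeddings. The following is only a minimal NumPy sketch of that construction, with a made-up toy corpus and arbitrary parameter choices, not code from the paper.

    # Sketch: word embeddings via SVD over a shifted positive PMI matrix
    # (the matrix that skip-gram with negative sampling implicitly factorizes).
    # The corpus, window size and dimensionality are toy/hypothetical choices.
    import numpy as np
    from collections import Counter

    corpus = "the cat sat on the mat the dog sat on the rug".split()
    window, dim, k = 2, 2, 5          # context window, embedding size, negative samples

    vocab = sorted(set(corpus))
    idx = {w: i for i, w in enumerate(vocab)}

    # Count word-context co-occurrences within the window.
    counts = Counter()
    for i, w in enumerate(corpus):
        for j in range(max(0, i - window), min(len(corpus), i + window + 1)):
            if i != j:
                counts[(idx[w], idx[corpus[j]])] += 1

    M = np.zeros((len(vocab), len(vocab)))
    for (wi, ci), c in counts.items():
        M[wi, ci] = c

    # Shifted positive PMI: max(PMI(w, c) - log k, 0).
    total = M.sum()
    pw = M.sum(axis=1, keepdims=True) / total
    pc = M.sum(axis=0, keepdims=True) / total
    with np.errstate(divide="ignore"):
        pmi = np.log((M / total) / (pw * pc))
    sppmi = np.maximum(pmi - np.log(k), 0)

    # Truncated SVD gives dense word vectors.
    U, S, _ = np.linalg.svd(sppmi)
    word_vectors = U[:, :dim] * np.sqrt(S[:dim])
    print(word_vectors.shape)         # (vocabulary size, dim)

In practice the co-occurrence matrix is large and sparse, so sparse counting and truncated SVD routines would replace the dense arrays used in this sketch.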

His most cited works include:

  • Neural Word Embedding as Implicit Matrix Factorization (1213 citations)
  • Improving Distributional Similarity with Lessons Learned from Word Embeddings (914 citations)
  • word2vec Explained: deriving Mikolov et al.'s negative-sampling word-embedding method. (839 citations)

What are the main themes of his work throughout his whole career to date?

Yoav Goldberg mainly focuses on Artificial intelligence, Natural language processing, Parsing, Word and Machine learning. Many of his studies involve connections with topics such as Pattern recognition and Artificial intelligence. His Natural language processing research incorporates themes from Context, Speech recognition and Hebrew.

His Parsing studies cover Algorithm topics that intersect with Theoretical computer science. His Word research is multidisciplinary, incorporating elements of Semantics and Representation. His Language model research is multidisciplinary, drawing on Artificial neural networks, Focus and Domain.
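Parsing and Algorithm come up repeatedly in this profile, and transition-based parsing is central to much of that literature. Below is a generic, textbook-style sketch of the arc-standard transition system, not code from any of the listed papers; the sentence and action sequence are invented, and in a real parser a trained classifier picks the action at each configuration.

    # Sketch: arc-standard transition-based dependency parsing on a toy input.
    from typing import List, Tuple

    def parse(words: List[str], actions: List[str]) -> List[Tuple[int, int]]:
        """Apply SHIFT / LEFT / RIGHT transitions; return (head, dependent) arcs."""
        stack: List[int] = []
        buffer = list(range(len(words)))      # token indices, left to right
        arcs: List[Tuple[int, int]] = []
        for action in actions:
            if action == "SHIFT":             # move next token onto the stack
                stack.append(buffer.pop(0))
            elif action == "LEFT":            # second-to-top is a dependent of top
                dep = stack.pop(-2)
                arcs.append((stack[-1], dep))
            elif action == "RIGHT":           # top is a dependent of second-to-top
                dep = stack.pop()
                arcs.append((stack[-1], dep))
        return arcs

    words = ["She", "ate", "pizza"]
    # Build: She <- ate -> pizza
    print(parse(words, ["SHIFT", "SHIFT", "LEFT", "SHIFT", "RIGHT"]))
    # [(1, 0), (1, 2)]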

He most often published in these fields:

  • Artificial intelligence (81.78%)
  • Natural language processing (62.15%)
  • Parsing (25.70%)

What were the highlights of his more recent work (between 2020 and 2021)?

In recent papers he was focusing on the following fields of study:

  • Artificial intelligence (81.78%)
  • Natural language processing (62.15%)
  • Process (11.68%)

Yoav Goldberg mostly deals with Artificial intelligence, Natural language processing, Process, Natural language and Language model. His work in the field of Embedding overlaps with other areas such as Structure. His study spans a wide range of topics, including Historical linguistics, Sign language, Sound change and Relation.

His study of Natural language is interdisciplinary, drawing from Distress, Association, Lexicon, Semantics and Multilevel models. Yoav Goldberg interconnects Equivalence, Signal, Cognitive science and Meaning in his investigation of Language models. His Space research is multidisciplinary, incorporating perspectives on Value, Event, Contrast and Representation.

Between 2020 and 2021, his most popular works were:

  • Amnesic Probing: Behavioral Explanation With Amnesic Counterfactuals (8 citations)
  • Measuring and Improving Consistency in Pretrained Language Models (5 citations)
  • Contrastive Explanations for Model Interpretability. (3 citations)

In his most recent research, the most cited papers focused on:

  • Artificial intelligence
  • Machine learning
  • Programming language

Yoav Goldberg mainly investigates Attribution, Property, Point, Natural language processing and Artificial intelligence. In his research, Attribution is connected with Process, Causal chain, Interpretation, Causality and Event. His work on Property intersects with Language model and Consistency.

In his studies, Point is connected with Scrutiny, Intervention, Alternative methods, Representation and Cognitive psychology. His Natural language processing work combines topics from a wide range of disciplines, such as Contrast, Space, Representation, Interpretability and Value.

This overview was generated by a machine learning system which analysed the scientist's body of work.

Top Publications

Neural Word Embedding as Implicit Matrix Factorization

Omer Levy; Yoav Goldberg.
Neural Information Processing Systems (2014)

1767 Citations

Improving Distributional Similarity with Lessons Learned from Word Embeddings

Omer Levy; Yoav Goldberg; Ido Dagan.
Transactions of the Association for Computational Linguistics (2015)

1353 Citations

word2vec Explained: deriving Mikolov et al.'s negative-sampling word-embedding method.

Yoav Goldberg; Omer Levy.
arXiv: Computation and Language (2014)

1320 Citations

Dependency-Based Word Embeddings

Omer Levy; Yoav Goldberg.
Meeting of the Association for Computational Linguistics (2014)

1221 Citations

Universal Dependencies v1: A Multilingual Treebank Collection

Joakim Nivre; Marie-Catherine de Marneffe; Filip Ginter; Yoav Goldberg.
Language Resources and Evaluation (2016)

819 Citations

Neural Network Methods in Natural Language Processing

Yoav Goldberg; Graeme Hirst.
(2017)

779 Citations

A primer on neural network models for natural language processing

Yoav Goldberg.
Journal of Artificial Intelligence Research (2016)

753 Citations

Linguistic Regularities in Sparse and Explicit Word Representations

Omer Levy; Yoav Goldberg.
Conference on Computational Natural Language Learning (2014)

604 Citations

Universal Dependency Annotation for Multilingual Parsing

Ryan McDonald; Joakim Nivre; Yvonne Quirmbach-Brundage; Yoav Goldberg.
Meeting of the Association for Computational Linguistics (2013)

523 Citations

DyNet: The Dynamic Neural Network Toolkit

Graham Neubig; Chris Dyer; Yoav Goldberg; Austin Matthews.
arXiv: Machine Learning (2017)

511 Citations

Profile was last updated on December 6th, 2021.
Research.com Ranking is based on data retrieved from the Microsoft Academic Graph (MAG).
The ranking h-index is inferred from publications deemed to belong to the considered discipline.

Top Scientists Citing Yoav Goldberg

Noah A. Smith
University of Washington
Publications: 83

Anders Søgaard
University of Copenhagen
Publications: 75

Yue Zhang
Westlake University
Publications: 63

Graham Neubig
Carnegie Mellon University
Publications: 59

Joakim Nivre
Uppsala University
Publications: 59

Hinrich Schütze
Ludwig-Maximilians-Universität München
Publications: 53

Anna Korhonen
University of Cambridge
Publications: 47

Chris Dyer
Google (United States)
Publications: 43

Yonatan Belinkov
Technion – Israel Institute of Technology
Publications: 43

Ivan Vulić
University of Cambridge
Publications: 42

Jiawei Han
University of Illinois at Urbana-Champaign
Publications: 41

Iryna Gurevych
University of Paderborn
Publications: 41

Eduard Hovy
Carnegie Mellon University
Publications: 39

Kai-Wei Chang
University of California, Los Angeles
Publications: 39

Ido Dagan
Bar-Ilan University
Publications: 37
