July 1, 2020

Nine scientific papers accepted for ACL 2020

Alexander Koller will present together with co-author Emily Bender at the 58th Annual Meeting of the Association for Computational Linguistics (ACL 2020). Their paper "Climbing towards NLU: On Meaning, Form, and Understanding in the Age of Data" features a clever octopus in the leading role and is also available as audio.
In total, nine papers by DFKI and UdS, together with Tianjin University and the University of Washington, were accepted for ACL 2020:

  • Christoph Alt, Aleksandra Gabryszak and Leonhard Hennig. "Probing Linguistic Features of Sentence-Level Representations in Relation Extraction", long paper
  • Christoph Alt, Aleksandra Gabryszak and Leonhard Hennig. "TACRED Revisited: A Thorough Evaluation of the TACRED Relation Extraction Task", long paper
  • Emily M. Bender and Alexander Koller. "Climbing towards NLU: On Meaning, Form, and Understanding in the Age of Data", long paper
  • David Harbecke and Christoph Alt. "Considering Likelihood in NLP Classification Explanations with Occlusion and Language Modeling", student paper
  • Nico Herbig, Tim Düwel, Santanu Pal, Kalliopi Maria Meladaki, Mahsa Monshizadeh, Antonio Krüger and Josef van Genabith. "MMPE: A Multi-Modal Interface for Post-Editing Machine Translation", long paper
  • Nico Herbig, Santanu Pal, Tim Düwel, Kalliopi Maria Meladaki, Mahsa Monshizadeh, Vladislav Hnatovskiy, Antonio Krüger and Josef van Genabith. "MMPE: A Multi-Modal Interface using Handwriting, Touch Reordering, and Speech Commands for Post-Editing Machine Translation", demo paper
  • Hongfei Xu, Josef van Genabith, Deyi Xiong, Qiuhui Liu and Jingyi Zhang. "Learning Source Phrase Representations for Neural Machine Translation", long paper
  • Hongfei Xu, Josef van Genabith, Deyi Xiong and Qiuhui Liu. "Dynamically Adjusting Transformer Batch Size by Monitoring Gradient Direction Change", short paper
  • Hongfei Xu, Qiuhui Liu, Josef van Genabith, Deyi Xiong and Jingyi Zhang. "Lipschitz Constrained Parameter Initialization for Deep Transformers", short paper