I'm an AI Research Scientist at FAIR (Meta AI) working on Massively Multilingual Machine Translation models. I completed my PhD in 2020 from Université Grenoble Alpes, where I worked at LIG (GETALP) and at Inria (Thoth) under the supervision of Jakob Verbeek and Laurent Besacier on the subject of Neural Machine Translation and more broadly on sequence-to-sequence prediction.
Prior to my PhD, I graduated from Centrale Paris (P2016) and the École normale supérieure de Cachan, where I completed the MVA master's program. More details can be found in my resume.

What's new?
Publications
Online Versus Offline NMT Quality: An In-depth Analysis on English-German and German-English
Maha Elbayad, Michael Ustaszewski, Emmanuelle Esperança-Rodier, Francis Brunet Manquat, Jakob Verbeek, Laurent Besacier
COLING, 2020
ArXiv Slides Video Code Bibtex
  @inproceedings{elbayad2020online,
    title={Online Versus Offline NMT Quality: An In-depth Analysis on English-German and German-English},
    author={Elbayad, Maha and Ustaszewski, Michael and Esperan{\c{c}}a-Rodier, Emmanuelle and Brunet Manquat, Francis and Verbeek, Jakob and Besacier, Laurent},
    booktitle={COLING},
    year={2020}
  }
 
Efficient Wait-k Models for Simultaneous Machine Translation
Maha Elbayad, Laurent Besacier, Jakob Verbeek
INTERSPEECH, 2020
ArXiv In proceedings Slides Video Code Bibtex
  @inproceedings{elbayad20waitk,
    title={Efficient Wait-k Models for Simultaneous Machine Translation},
    author={Elbayad, Maha and Besacier, Laurent and Verbeek, Jakob},
    booktitle ={INTERSPEECH},
    year={2020}
  }
 
ON-TRAC Consortium for End-to-End and Simultaneous Speech Translation Challenge Tasks at IWSLT 2020
Maha Elbayad*, Ha Nguyen*, Fethi Bougares, Natalia Tomashenko, Antoine Caubrière, Benjamin Lecouteux, Yannick Estève, Laurent Besacier
* Equal contribution
IWSLT, 2020
ArXiv In proceedings Video Bibtex
  @inproceedings{Elbayad20iwslt,
    title = "{ON}-{TRAC} Consortium for End-to-End and Simultaneous Speech Translation Challenge Tasks at {IWSLT} 2020",
    author = "Elbayad, Maha and Nguyen, Ha
              and Bougares, Fethi and Tomashenko, Natalia
              and Caubri{\`e}re, Antoine and Lecouteux, Benjamin
              and Est{\`e}ve, Yannick and Besacier, Laurent",
    booktitle = "IWSLT",
    year = "2020"
    }
 
Depth-Adaptive Transformer
Maha Elbayad*, Jiatao Gu, Edouard Grave, Michael Auli
* Work done while interning at Facebook AI
Eighth International Conference on Learning Representations (ICLR), 2020
ArXiv In proceedings Video Bibtex
  @InProceedings{elbayad19arxiv,
    author ="Elbayad, Maha and Gu, Jiatao and Grave, Edouard and Auli, Michael",
    title = "Depth-Adaptive Transformer",
    booktitle = "ICLR",
    year = "2020",
 }
 
Pervasive Attention - 2D Convolutional Neural Networks for Sequence-to-Sequence Prediction
Maha Elbayad, Laurent Besacier, Jakob Verbeek
The SIGNLL Conference on Computational Natural Language Learning (CoNLL), 2018
ArXiv In proceedings Poster Code Bibtex
  @InProceedings{elbayad18conll,
    author ="Elbayad, Maha and Besacier, Laurent and Verbeek, Jakob",
    title = "Pervasive Attention: 2D Convolutional Neural Networks for Sequence-to-Sequence Prediction",
    booktitle = "Proceedings of the 22nd Conference on Computational Natural Language Learning",
    year = "2018",
 }
 
Token-level and sequence-level loss smoothing for RNN language models
Maha Elbayad, Laurent Besacier, Jakob Verbeek
Annual Meeting of the Association for Computational Linguistics (ACL), 2018
ArXiv In proceedings Slides Video Code Bibtex
  @InProceedings{elbayad18acl,
  author = "Elbayad, Maha and Besacier, Laurent and Verbeek, Jakob",
  title = "Token-level and sequence-level loss smoothing for RNN language models",
  booktitle = "Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
  year = "2018",
}
 
Ph.D. Thesis

Rethinking the Design of Sequence-to-Sequence Models for Efficient Machine Translation
Defended on June 22nd, 2020
Manuscript Slides
Talks
Teaching