Machel Reid
Researcher, University of Tokyo; incoming PhD Student, University of Washington
Verified email at cs.washington.edu - Homepage
Title
Cited by
Year
LEWIS: Levenshtein Editing for Unsupervised Text Style Transfer
M Reid, V Zhong
ACL-IJCNLP 2021 Findings, 2021
18 · 2021
Large Language Models are Zero-Shot Reasoners
T Kojima, SS Gu, M Reid, Y Matsuo, Y Iwasawa
arXiv preprint arXiv:2205.11916, 2022
12 · 2022
Can Wikipedia Help Offline Reinforcement Learning?
M Reid, Y Yamada, SS Gu
arXiv preprint arXiv:2201.12122, 2022
12 · 2022
VCDM: Leveraging Variational Bi-encoding and Deep Contextualized Word Representations for Improved Definition Modeling
M Reid, E Marrese-Taylor, Y Matsuo
EMNLP 2020, 2020
9 · 2020
Subformer: Exploring Weight Sharing for Parameter Efficiency in Generative Transformers
M Reid, E Marrese-Taylor, Y Matsuo
EMNLP 2021 Findings, 2021
4 · 2021
AfroMT: Pretraining Strategies and Reproducible Benchmarks for Translation of 8 African Languages
M Reid, J Hu, G Neubig, Y Matsuo
EMNLP 2021, 2021
3 · 2021
Variational Inference for Learning Representations of Natural Language Edits
E Marrese-Taylor, M Reid, Y Matsuo
AAAI 2021, 2020
3 · 2020
PARADISE: Exploiting Parallel Data for Multilingual Sequence-to-Sequence Pretraining
M Reid, M Artetxe
NAACL 2022, 2021
2 · 2021
Low-Resource Machine Translation Using Cross-Lingual Language Model Pretraining
F Zheng, M Reid, E Marrese-Taylor, Y Matsuo
AmericasNLP Workshop, NAACL 2021, 2021
2 · 2021
On the Impact of Data Augmentation on Downstream Performance in Natural Language Processing
I Okimura, M Reid, M Kawano, Y Matsuo
Proceedings of the Third Workshop on Insights from Negative Results in NLP …, 2022
1 · 2022
Learning to Model Editing Processes
M Reid, G Neubig
arXiv preprint arXiv:2205.12374, 2022
2022
A Few Thousand Translations Go a Long Way! Leveraging Pre-trained Models for African News Translation
DI Adelani, JO Alabi, A Fan, J Kreutzer, X Shen, M Reid, D Ruiter, ...
NAACL 2022, 2022
2022
A Few Thousand Translations Go a Long Way! Leveraging Pre-trained Models for African News Translation
DI Adelani, JO Alabi, A Fan, J Kreutzer, X Shen, M Reid, ...
arXiv e-prints, arXiv:2205.02022, 2022
2022
Combining Pretrained High-Resource Embeddings and Subword Representations for Low-Resource Languages
M Reid, E Marrese-Taylor, Y Matsuo
AfricaNLP Workshop, ICLR 2020, 2020
2020
Articles 1–14