Shaojie Bai
Machine Learning Department, Carnegie Mellon University
Verified email at cs.cmu.edu
Title · Cited by · Year
An empirical evaluation of generic convolutional and recurrent networks for sequence modeling
S Bai, JZ Kolter, V Koltun
arXiv preprint arXiv:1803.01271, 2018
Cited by 905 · 2018
Multimodal Transformer for Unaligned Multimodal Language Sequences
YHH Tsai, S Bai, PP Liang, JZ Kolter, LP Morency, R Salakhutdinov
Proceedings of the Annual Meeting of the Association for Computational …, 2019
Cited by 49 · 2019
Deep equilibrium models
S Bai, JZ Kolter, V Koltun
Advances in Neural Information Processing Systems, 688-699, 2019
Cited by 36 · 2019
Trellis networks for sequence modeling
S Bai, JZ Kolter, V Koltun
International Conference on Learning Representations (ICLR), 2019
Cited by 29 · 2019
Convolutional sequence modeling revisited
S Bai, JZ Kolter, V Koltun
International Conference on Learning Representations (ICLR), 2018
Cited by 22 · 2018
Transformer Dissection: An Unified Understanding for Transformer's Attention via the Lens of Kernel
YHH Tsai, S Bai, M Yamada, LP Morency, R Salakhutdinov
Proceedings of the Conference on Empirical Methods in Natural Language …, 2019
Cited by 14 · 2019
An empirical evaluation of generic convolutional and recurrent networks for sequence modeling. CoRR abs/1803.01271 (2018)
S Bai, JZ Kolter, V Koltun
arXiv preprint arXiv:1803.01271, 2018
Cited by 10 · 2018
Multiscale deep equilibrium models
S Bai, V Koltun, JZ Kolter
arXiv preprint arXiv:2006.08656, 2020
2020