Takeshi Kojima
Unknown affiliation
Verified email at weblab.t.u-tokyo.ac.jp
Title
Cited by
Year
Large language models are zero-shot reasoners
T Kojima, SS Gu, M Reid, Y Matsuo, Y Iwasawa
Advances in neural information processing systems 35, 22199-22213, 2022
Cited by 1923 · 2022
Robustifying Vision Transformer without Retraining from Scratch by Test-Time Class-Conditional Feature Alignment
T Kojima, Y Matsuo, Y Iwasawa
Proceedings of the 31st International Joint Conference on Artificial …, 2022
Cited by 19 · 2022
Unnatural Error Correction: GPT-4 Can Almost Perfectly Handle Unnatural Scrambled Text
Q Cao, T Kojima, Y Matsuo, Y Iwasawa
Proceedings of the 2023 Conference on Empirical Methods in Natural Language …, 2023
Cited by 3 · 2023
Making Use of Latent Space in Language GANs for Generating Diverse Text without Pre-training
T Kojima, Y Iwasawa, Y Matsuo
Proceedings of the 16th Conference of the European Chapter of the …, 2021
Cited by 2 · 2021
Robustifying Vision Transformer Without Retraining from Scratch Using Attention-Based Test-Time Adaptation
T Kojima, Y Iwasawa, Y Matsuo
New Generation Computing 41 (1), 5-24, 2023
Cited by 1 · 2023
On the Multilingual Ability of Decoder-based Pre-trained Language Models: Finding and Controlling Language-Specific Neurons
T Kojima, I Okimura, Y Iwasawa, H Yanaka, Y Matsuo
arXiv preprint arXiv:2404.02431, 2024
2024
Cycle Sketch GAN: Unpaired Sketch to Sketch Translation Based on Cycle GAN Algorithm
T Kojima
Proceedings of the Annual Conference of JSAI 33rd (2019), 3B3E203-3B3E203, 2019
2019
Articles 1–7