Yan Xu
Co-Chief Technology Officer @ Linklogis
Classifying relations via long short term memory networks along shortest dependency paths
Y Xu, L Mou, G Li, Y Chen, H Peng, Z Jin
Proceedings of the 2015 conference on empirical methods in natural language …, 2015
Natural language inference by tree-based convolution and heuristic matching
L Mou, R Men, G Li, Y Xu, L Zhang, R Yan, Z Jin
arXiv preprint arXiv:1512.08422, 2015
How Transferable are Neural Networks in NLP Applications?
L Mou, Z Meng, R Yan, G Li, Y Xu, L Zhang, Z Jin
Proc. EMNLP2016, 2016
Improved Relation Classification by Deep Recurrent Neural Networks with Data Augmentation
Y Xu, R Jia, L Mou, G Li, Y Chen, Y Lu, Z Jin
Proc. COLING2016, 2016
Tree-based convolution: A new architecture for sentence modeling
L Mou, H Peng, G Li, Y Xu, L Zhang, Z Jin
arXiv preprint arXiv:1504.01106, 2015
Building program vector representations for deep learning
L Mou, G Li, Y Liu, H Peng, Z Jin, Y Xu, L Zhang
arXiv preprint arXiv:1409.3358, 2014
Compressing Neural Language Models by Sparse Word Representations
Y Chen, L Mou, Y Xu, G Li, Z Jin
Proc. ACL2016, 2016
Distilling word embeddings: An encoding approach
L Mou, G Li, Y Xu, L Zhang, Z Jin
Proc. CIKM2016, 2016
An engineerable ontology based approach for requirements elicitation in process centered problem domain
G Li, Z Jin, Y Xu, Y Lu
Knowledge Science, Engineering and Management: 5th International Conference …, 2011
Learning non-taxonomic relations on demand for ontology extension
Y Xu, G Li, L Mou, Y Lu
International journal of software engineering and knowledge engineering 24 …, 2014
Q Wei, Z Jin, Y Xu
Journal of Software (软件学报) 25 (8), 1640-1658, 2014
Internet of Things service modeling: An environment modeling-based approach
G Li, Q Wei, L Li, Z Jin, Y Xu, L Zheng
Science China: Information Sciences (中国科学: 信息科学), 1198-1218, 2013
Tree-based convolution: A new neural architecture for sentence modeling
L Mou, H Peng, G Li, Y Xu, L Zhang, Z Jin
Proceedings of Conference on Empirical Methods in Natural Language …, 2015
Topic concept network acquisition from multiple Web information sources
Y Xu, Z Jin, G Li, Q Wei
Journal of Computer Research and Development (计算机研究与发展) 50 (9), 1843-1854, 2013