Optimal ANN-SNN Conversion for High-Accuracy and Ultra-Low-Latency Spiking Neural Networks. T Bu, W Fang, J Ding, PL Dai, Z Yu, T Huang. arXiv preprint arXiv:2303.04347, 2023. Cited by 147.
Optimized Potential Initialization for Low-Latency Spiking Neural Networks. T Bu, J Ding, Z Yu, T Huang. Proceedings of the AAAI Conference on Artificial Intelligence 36 (1), 11-20, 2022. Cited by 69.
Reducing ANN-SNN Conversion Error Through Residual Membrane Potential. Z Hao, T Bu, J Ding, T Huang, Z Yu. Proceedings of the AAAI Conference on Artificial Intelligence 37 (1), 11-21, 2023. Cited by 31.
Bridging the Gap Between ANNs and SNNs by Calibrating Offset Spikes. Z Hao, J Ding, T Bu, T Huang, Z Yu. arXiv preprint arXiv:2302.10685, 2023. Cited by 24.
SNN-RAT: Robustness-Enhanced Spiking Neural Network Through Regularized Adversarial Training. J Ding, T Bu, Z Yu, T Huang, J Liu. Advances in Neural Information Processing Systems 35, 24780-24793, 2022. Cited by 21.
Decoding Pixel-Level Image Features From Two-Photon Calcium Signals of Macaque Visual Cortex. Y Zhang, T Bu, J Zhang, S Tang, Z Yu, JK Liu, T Huang. Neural Computation 34 (6), 1369-1397, 2022. Cited by 12.
Rate Gradient Approximation Attack Threats Deep Spiking Neural Networks. T Bu, J Ding, Z Hao, Z Yu. Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2023. Cited by 9.
Robust Decoding of Rich Dynamical Visual Scenes With Retinal Spikes. Z Yu, T Bu, Y Zhang, S Jia, T Huang, JK Liu. IEEE Transactions on Neural Networks and Learning Systems, 2024. Cited by 1.
A Progressive Training Framework for Spiking Neural Networks With Learnable Multi-Hierarchical Model. Z Hao, X Shi, Z Huang, T Bu, Z Yu, T Huang. The Twelfth International Conference on Learning Representations, 2024. Cited by 1.
Training Adversarially Robust SNNs With Gradient Sparsity Regularization. Y Liu, T Bu, Z Yu, T Huang. 2023.
Threaten Spiking Neural Networks Through Combining Rate and Temporal Information. Z Hao, T Bu, X Shi, Z Huang, Z Yu, T Huang. The Twelfth International Conference on Learning Representations, 2024.