Official PyTorch implementation of LinK, from the following paper:
LinK: Linear Kernel for LiDAR-based 3D Perception. CVPR 2023.
Tao Lu, Xiang Ding, Haisong Liu, Gangshan Wu, Limin Wang
Multimedia Computing Group, Nanjing University
[arXiv] [Conference version]
LinK is a large-kernel backbone for LiDAR-based 3D perception tasks. It consists of a linear kernel generator and a pre-aggregation strategy, which together extend the perceptive range to 21x21x21 while keeping linear complexity.
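The cosine form of the kernel is what enables the pre-aggregation. Below is a minimal 1-D NumPy sketch of the underlying trick (an illustration, not the official implementation; all variable names are ours): because cos(w*(q - p) + b) splits, via the angle-sum identity, into terms that depend only on the neighbor coordinate q or only on the query coordinate p, a whole neighborhood sum can be assembled from two pre-aggregated prefix sums, so the cost no longer grows with the kernel radius.

```python
# 1-D sketch of a LinK-style cosine kernel with pre-aggregation.
# Identity used: cos(w*(q - p) + b) = cos(w*q + b)*cos(w*p) + sin(w*q + b)*sin(w*p)
import numpy as np

rng = np.random.default_rng(0)
N, R = 64, 10                  # signal length; kernel radius (window = 2R+1)
omega, b = 0.3, 0.7            # learnable parameters in the real model
x = rng.standard_normal(N)     # toy per-voxel features
q = np.arange(N)

# Naive aggregation: O(N * R), cost grows with the kernel radius
y_naive = np.zeros(N)
for p in range(N):
    lo, hi = max(0, p - R), min(N, p + R + 1)
    delta = np.arange(lo, hi) - p
    y_naive[p] = np.sum(np.cos(omega * delta + b) * x[lo:hi])

# LinK-style aggregation: O(N), independent of R.
# Pre-aggregate the two branches once, then window them with prefix sums.
c = np.cos(omega * q + b) * x
s = np.sin(omega * q + b) * x
C = np.concatenate([[0.0], np.cumsum(c)])
S = np.concatenate([[0.0], np.cumsum(s)])
lo = np.maximum(0, q - R)
hi = np.minimum(N, q + R + 1)
y_fast = np.cos(omega * q) * (C[hi] - C[lo]) + np.sin(omega * q) * (S[hi] - S[lo])

assert np.allclose(y_naive, y_fast)  # same result, radius-independent cost
```

In the sparse 3D setting of the paper, the same identity is applied per voxel block, so each block's partial sums are shared by every overlapping large-kernel neighborhood instead of being recomputed.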
Semantic segmentation (SemanticKITTI):

name | kernel config | mIoU | model |
---|---|---|---|
LinK | cos_x:(2x3)^3 | 67.72 | model |
LinK | cos:(3x7)^3 | 67.50 | model |
LinK (encoder-only) | cos_x:(2x3)^3 | 67.33 | model |
LinK (encoder-only) | cos:(3x5)^3 | 67.07 | model |
3D detection (nuScenes):

- Validation
name | kernel config | NDS | mAP | model |
---|---|---|---|---|
LinK | cos:(3x7)^3 | 69.5 | 63.6 | model |
- Test
name | kernel config | NDS | mAP | model |
---|---|---|---|---|
LinK | cos:(3x7)^3 | 71.0 | 66.3 | model |
LinK (TTA) | cos:(3x7)^3 | 73.4 | 69.8 | model |
Clone this repo to your workspace:

```bash
git clone https://github.com/MCG-NJU/LinK.git
cd LinK
```
- Segmentation: please check segmentation/INSTALL.md and segmentation/GET_STARTED.md.
- Detection: please check detection/INSTALL.md and detection/GET_STARTED.md.
If you find our work helpful, please consider citing:
```bibtex
@InProceedings{lu2023link,
  author    = {Lu, Tao and Ding, Xiang and Liu, Haisong and Wu, Gangshan and Wang, Limin},
  title     = {LinK: Linear Kernel for LiDAR-Based 3D Perception},
  booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
  month     = {June},
  year      = {2023},
  pages     = {1105-1115}
}

@article{lu2022app,
  title   = {APP-Net: Auxiliary-point-based Push and Pull Operations for Efficient Point Cloud Classification},
  author  = {Lu, Tao and Liu, Chunxu and Chen, Youxin and Wu, Gangshan and Wang, Limin},
  journal = {arXiv preprint arXiv:2205.00847},
  year    = {2022}
}
```
Contact:

- Tao Lu: taolu@smail.nju.edu.cn
- Xiang Ding: xding@smail.nju.edu.cn
- Haisong Liu: liuhs@smail.nju.edu.cn
Our code is based on CenterPoint, SPVNAS, spconv, and torchsparse. We thank Ruixiang Zhang, Xu Yan, and Yukang Chen for their kind help.