<p style="font-size:50px;text-align:center">Kaer(Carl) Huang 黄卡尔</p>
<p></p>
<p></p>
<p>I am a senior researcher at Lenovo Research. My research interests are in computer vision, including detection, segmentation, and tracking algorithms (MOT/MOTS/VIS/VOS/VOT/3D Tracking), visual foundation models, and deep generative models (diffusion models and their applications).</p>
<p></p>
<p></p>
<p style="color:red;"> <b>I have several openings for self-motivated research interns on these topics. Please feel free to drop me an e-mail (huangke1@lenovo.com).</b> </p>
<p></p>
<p style="font-size:40px;">Competition Achievements</p>
<p><a href="https://cvpr2022.wad.vision/">1st in WAD BDD MOT Challenge at CVPR2022</a></p>
<p><a href="https://sslad2022.github.io/">1st in SSLAD BDD MOT Challenge at ECCV2022</a></p>
<p><a href="https://sslad2022.github.io/">1st in SSLAD BDD MOTS Challenge at ECCV2022</a></p>
<p><a href="https://sslad2022.github.io/">1st in SSLAD BDD SSMOT Challenge at ECCV2022</a></p>
<p><a href="https://sslad2022.github.io/">1st in SSLAD BDD SSMOTS Challenge at ECCV2022</a></p>
<p><a href="https://taodataset.org/workshop/cvpr23/">1st in Tracking VIS Long Tail Challenge at CVPR2023</a></p>
<p><a href="https://taodataset.org/workshop/cvpr23/">1st in Tracking VIS Open World Challenge at CVPR2023</a></p>
<p><a href="https://cvpr2023.wad.vision/">1st in WAD BDD MOTS Challenge at CVPR2023</a></p>
<p><a href="https://cvpr2023.wad.vision/">2nd in WAD BDD MOT Challenge at CVPR2023</a></p>
<p><a href="https://cvpr2023.wad.vision/">1st in WAD Argoverse End-to-End Forecasting Challenge at CVPR2023</a></p>
<p><a href="https://macvi.org/">2nd in MaCVi BoaTrack Challenge at WACV2024</a></p>
<p><a href="https://macvi.org/">3rd in MaCVi SeaDronesSee-MOT Challenge at WACV2024</a></p>
<p></p>
<p></p>
<p style="font-size:40px;">Academic talks</p>
<p><a href="https://www.youtube.com/watch?v=13PjjBEgEcM&t=334s">CVPR2022 WAD Invited speaker (BDD section)</a></p>
<p><a href="https://sslad2022.github.io/pages/challenge.html">ECCV2022 SSLAD Invited speaker (Tracking Challenge section) </a></p>
<p><a href="https://www.youtube.com/watch?v=BLMaacUEkxo&t=736s">CVPR2023 WAD Invited speaker (BDD section)</a></p>
<p><a href="https://taodataset.org/workshop/cvpr23/">CVPR2023 Tracking Workshop Invited speaker (Challenge section)</a></p>
<p></p>
<p></p>
<p style="font-size:40px;">Academic Service</p>
<p>Reviewer for the following journals/conferences: IEEE TPAMI, IEEE TVT, CVPR</p>
<p>Applying to co-organize the Tracking Workshop at CVPR2024</p>
<p>Member of CAAI (Chinese Association for Artificial Intelligence)</p>
<p>Member of CCF (China Computer Federation)</p>
<p>Member of IEEE (Institute of Electrical and Electronics Engineers)</p>
<p></p>
<p></p>
<p style="font-size:40px;">Publications</p>
<p><a href="https://patents.google.com/?inventor=%E9%BB%84%E5%8D%A1%E5%B0%94&oq=%E9%BB%84%E5%8D%A1%E5%B0%94">16 published patents </a></p>
<p><a href="https://scholar.google.com/citations?view_op=view_citation&hl=en&user=Zh2ihGcAAAAJ&citation_for_view=Zh2ihGcAAAAJ:LkGwnXOMwfcC">One paper was accepted to ICCV2023</a></p>
<p><a href="https://scholar.google.com/citations?view_op=view_citation&hl=en&user=Zh2ihGcAAAAJ&citation_for_view=Zh2ihGcAAAAJ:IjCSPb-OGe4C">Multi-Object Tracking by Self-Supervised Learning Appearance Model(CVPR2023)</a></p>
<p><a href="https://scholar.google.com/citations?view_op=view_citation&hl=en&user=Zh2ihGcAAAAJ&citation_for_view=Zh2ihGcAAAAJ:u-x6o8ySG0sC">One paper was accepted to ECCV2022 W</a></p>
<p><a href="https://scholar.google.com/citations?view_op=view_citation&hl=en&user=Zh2ihGcAAAAJ&citation_for_view=Zh2ihGcAAAAJ:d1gkVwhDpl0C">Lane detection with position embedding</a></p>
<p>Two papers accepted to WACV2024</p>
<p><a href="https://scholar.google.com/citations?view_op=view_citation&hl=en&user=Zh2ihGcAAAAJ&citation_for_view=Zh2ihGcAAAAJ:ufrVoPGSRksC">Technical report for the 1st Place Solution for the CVPR2023 BURST Long Tail and Open World Challenges</a></p>
<p><a href="https://scholar.google.com/citations?view_op=view_citation&hl=en&user=Zh2ihGcAAAAJ&citation_for_view=Zh2ihGcAAAAJ:WF5omc3nYNoC">Technical report for the 1st/2nd Place Solutions for the CVPR2023 WAD BDD MOT/MOTS Tracking Challenges</a></p>
<p><a href="https://scholar.google.com/citations?view_op=view_citation&hl=en&user=Zh2ihGcAAAAJ&citation_for_view=Zh2ihGcAAAAJ:_FxGoFyzp5QC">Technical report for the Argoverse Challenges on Unified Sensor-based Detection, Tracking, and Forecasting</a></p>
<p><a href="https://scholar.google.com/citations?view_op=view_citation&hl=en&user=Zh2ihGcAAAAJ&citation_for_view=Zh2ihGcAAAAJ:Se3iqnhoufwC">ReIDTracker Sea: technical report for the BoaTrack and SeaDronesSee-MOT challenges at MaCVi, WACV2024</a></p>
<p>Two papers submitted to IEEE Transactions on Circuits and Systems for Video Technology (under review)</p>
<p>One paper submitted to CVPR2024 (under review)</p>
<p>One paper submitted to IJCAI24 (under review)</p>
<p>One paper submitted to ICML24 (under review)</p>