Hey there 👋 Ever wondered how we can make colonoscopies more effective? Well, buckle up, because we're diving into the exciting world of intelligent colonoscopy!
- [2024/10/30] Released the IntelliScope project, pushing colonoscopy research from pure vision to multimodal analysis
- [2024/09/01] Created this welcome page
Illustration of a colonoscope inside the large intestine (colon). Image credit: https://www.vecteezy.com
Let's face it - colorectal cancer is a big deal. It's one of the top cancer killers worldwide. But here's the good news: we can often prevent it if we catch it early. That's where colonoscopies come in. They're our best shot at finding and removing those sneaky precancerous polyps before they cause trouble.
But here's the catch - colonoscopies are only as good as the doctor performing them. And let's be real, even the best doctors are human. They get tired, they might miss things sometimes, and some are more experienced than others.
This is where AI swoops in like a superhero! We're using cutting-edge artificial intelligence to give colonoscopy a major upgrade. Think of it as giving doctors a pair of super-AI-powered glasses that help them spot things they might otherwise miss.
That's why we're going to explore the critical role of AI in colonoscopy. Here's what AI brings to the table:
- 🔍 Improved polyp detection rates
  - AI is like a tireless assistant, constantly scanning for even the tiniest polyps that human eyes might overlook.
- 🎯 High sensitivity in distinguishing precancerous polyps
  - Not all polyps are created equal. AI can be trained to differentiate between the harmless ones and those that could become cancerous, helping doctors prioritize treatment.
- 🖼️ Enhanced overall precision of colon evaluation
  - It's not just about spotting polyps. AI provides a comprehensive view of the colon, helping doctors make more accurate assessments.
- 😀 No added risk to colonoscopy
  - Here's the best part - all these benefits come with zero additional risk to the patient. It's like getting a free upgrade on your health check!
The past few years have been a wild ride in the world of intelligent colonoscopy techniques. Let me tell you about one of our proudest achievements:
- 🔥 [2024] ColonINST & ColonGPT (arXiv Paper & Project page): This year, we're taking intelligent colonoscopy to the next level, a multimodal world, with three groundbreaking initiatives:
  - 💥 Collecting ColonINST, a large-scale multimodal instruction tuning dataset featuring 300K+ colonoscopy images, 62 categories, 128K+ GPT-4V-generated medical captions, and 450K+ human-machine dialogues.
  - 💥 Developing ColonGPT, the first colonoscopy-specific multimodal language model, which can handle conversational tasks based on user preferences.
  - 💥 Launching a multimodal benchmark to enable fair and rapid comparisons going forward.
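To make "instruction tuning dataset" concrete, here is a minimal sketch of what one image-grounded dialogue sample could look like as a JSONL record. The field names (`image`, `category`, `conversations`) and the file path are illustrative assumptions, not the released ColonINST schema:

```python
import json

# Hypothetical instruction-tuning sample: field names and structure are
# illustrative assumptions, NOT the actual ColonINST format.
sample = {
    "image": "colonoscopy/frame_000123.jpg",  # path to one colonoscopy image
    "category": "polyp",                      # one of the dataset's categories
    "conversations": [
        {"role": "user", "content": "What abnormality is visible in this image?"},
        {"role": "assistant", "content": "A sessile polyp is visible on the colon wall."},
    ],
}

# One sample per line (JSONL) is a common on-disk layout for tuning data.
line = json.dumps(sample)
print(line)
```

A training pipeline would typically stream such lines, load each referenced image, and feed the dialogue turns to the model as a multi-turn conversation.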
- 🔥 [MIR-2024] Drag&Drop (Paper & Code). Authors: Yu-Cheng Chou (🇺🇸 Johns Hopkins University), Bowen Li (🇺🇸 Johns Hopkins University), Deng-Ping Fan (🇨🇭 ETH Zürich), Alan Yuille (🇺🇸 Johns Hopkins University), Zongwei Zhou (🇺🇸 Johns Hopkins University)
  - 💥 We introduce the first high-dimensional annotation method, eliminating the need for slice-by-slice labeling.
    - Cuts annotation effort by 87.5% in video polyp detection while maintaining or improving performance.
    - Increases polyp detection precision by 7.8% with the same annotation budget compared to per-pixel methods.
- 🔥 [MIR-2022] SUN-SEG (Paper & Code). Authors: Ge-Peng Ji (🇦🇺 Australian National University), Guobao Xiao (🇨🇳 Minjiang University), Yu-Cheng Chou (🇺🇸 Johns Hopkins University), Deng-Ping Fan (🇨🇭 ETH Zürich), Kai Zhao (🇺🇸 University of California, Los Angeles), Geng Chen (🇦🇪 Inception Institute of Artificial Intelligence), Luc Van Gool (🇨🇭 ETH Zürich)
  - 💥 We introduce a large-scale, high-quality, per-frame annotated VPS dataset, named SUN-SEG, which includes 158,690 frames from the famous SUN database. We extend the expert labels with diverse types, i.e., object mask, boundary, scribble, polygon, and visual attribute.
    - Paving the way for future research in colonoscopy video content analysis with fine-grained categories and annotations.
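To illustrate how two of those label types relate, here is a minimal pure-Python sketch that derives a boundary annotation from a binary object mask. The 7×7 toy mask and the 4-connectivity rule are assumptions for illustration only, not the SUN-SEG annotation pipeline:

```python
# Toy 7x7 binary object mask (1 = polyp pixel), standing in for a real
# per-frame annotation; pure Python, no external dependencies.
H, W = 7, 7
mask = [[1 if 2 <= r <= 4 and 2 <= c <= 5 else 0 for c in range(W)]
        for r in range(H)]

def is_boundary(r, c):
    """A mask pixel with at least one background 4-neighbour (or on the image edge)."""
    if not mask[r][c]:
        return False
    for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
        nr, nc = r + dr, c + dc
        if not (0 <= nr < H and 0 <= nc < W) or not mask[nr][nc]:
            return True
    return False

boundary = [[int(is_boundary(r, c)) for c in range(W)] for r in range(H)]
print(sum(map(sum, boundary)), "boundary pixels out of", sum(map(sum, mask)))
# → 10 boundary pixels out of 12
```

The same idea scales to real frames: the object mask is the dense label, while boundary, scribble, and polygon labels are sparser views of the same region, useful for weakly supervised training.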
- 🔥 [MICCAI'2021] PNS-Net (Paper & Code). Authors: Ge-Peng Ji (🇦🇪 Inception Institute of Artificial Intelligence), Yu-Cheng Chou (🇨🇳 Wuhan University), Deng-Ping Fan (🇦🇪 Inception Institute of Artificial Intelligence), Geng Chen (🇦🇪 Inception Institute of Artificial Intelligence), Huazhu Fu (🇦🇪 Inception Institute of Artificial Intelligence), Debesh Jha (🇳🇴 SimulaMet), Ling Shao (🇦🇪 Inception Institute of Artificial Intelligence)
  - 💥 This super-efficient model (~140fps) is designed for video-level polyp segmentation.
    - We received the MICCAI Student Travel Award (link).
    - It's already racked up over 130 citations (Google Scholar).
- 🔥 [MICCAI'2020] PraNet (Paper & Code). Authors: Deng-Ping Fan (🇦🇪 Inception Institute of Artificial Intelligence), Ge-Peng Ji (🇨🇳 Wuhan University), Tao Zhou (🇦🇪 Inception Institute of Artificial Intelligence), Geng Chen (🇦🇪 Inception Institute of Artificial Intelligence), Huazhu Fu (🇦🇪 Inception Institute of Artificial Intelligence), Jianbing Shen (🇦🇪 Inception Institute of Artificial Intelligence), Ling Shao (🇦🇪 Inception Institute of Artificial Intelligence)
  - 💥 A golden baseline for image-level polyp segmentation.
    - It's racked up over 1,100 citations on Google Scholar, and counting!
    - Our GitHub repo is very popular, with more than 400 stars!
This is just the start of building our Roman Empire 🔱. We're on a mission to make colonoscopies smarter, more accurate, and ultimately, save more lives. Want to join us on this exciting journey? Stay tuned, and let's revolutionize cancer prevention together! Feel free to reach out (📧 gepengai.ji@gmail.com) if you're interested in collaborating and pushing the boundaries of intelligent colonoscopy.

Welcome to our AI4Colonoscopy Discussion Forum:
- (SUBFORUM#1) ask any questions 😥 "Stuck on a paper? Can't get the code to run?"
- (SUBFORUM#2) showcase/promote your work 😥 "Want to boost your paper's impact? How do you promote your work to the community?"
- (SUBFORUM#3) access data resources 😥 "Can't download the data? How do you use or process the data you have?"
- (SUBFORUM#4) share research ideas 😥 "Can't find collaborators? Struggling to come up with interesting ideas?"