# ViM-UNet: Vision Mamba in Biomedical Segmentation

We introduce ViM-UNet, a novel segmentation architecture based on Vision Mamba for instance segmentation in microscopy.

This page documents the installation instructions, known issues with suggested fixes, and the benchmarking scripts.

## TLDR

  1. Please install torch-em and ViM (based on our fork: https://github.com/anwai98/Vim).
  2. ViM-UNet supports the ViM Tiny and ViM Small backbones for 2d segmentation (see the sketch after this list).
  3. Check out our preprint (accepted at MIDL 2024 - Short Paper) for more details.
    • The key observation: "ViM-UNet performs similarly or better than UNet (depending on the task), and outperforms UNETR while being more efficient." Its main advantage is for segmentation problems that rely on larger context.
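For orientation, here is a minimal sketch of how a ViM-UNet model could be instantiated through torch-em. The import path, function name, and keyword arguments below are assumptions and may differ from the actual API; please check the torch-em model module and the benchmarking scripts for the definitive usage.

```python
# Minimal sketch, assuming torch-em exposes a helper for building ViM-UNet models.
# The import path, function name, and keyword arguments are assumptions -- please
# check torch-em and the benchmarking scripts for the actual interface.
from torch_em.model import get_vimunet_model  # assumed entry point

# Build a 2d ViM-UNet; "vim_t" selects the ViM Tiny backbone, "vim_s" ViM Small.
model = get_vimunet_model(out_channels=1, model_type="vim_t")

# Count trainable parameters as a quick check that the model was built.
n_params = sum(p.numel() for p in model.parameters() if p.requires_grad)
print(f"ViM-UNet (Tiny) with {n_params / 1e6:.1f}M trainable parameters")
```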

## Benchmarking Methods

Re-implemented methods in torch-em:

  1. ViM-UNet
  2. UNet
  3. UNETR

External methods:

Here are the scripts to run the benchmarking for the reference methods.

  1. nnU-Net (see here for installation instructions)
  2. U-Mamba (see here for installation instructions, and issues encountered with our suggestions to take care of them)

## Installation

For ViM-UNet:

  1. Create a new environment and activate it:
$ mamba create -n vimunet python=3.10.13
$ mamba activate vimunet
  2. Install torch-em from source.

  3. Install PyTorch:

$ pip install torch==2.1.1 torchvision==0.16.1 torchaudio==2.1.1 --index-url https://download.pytorch.org/whl/cu118

Q1. Why use pip? For installation consistency.

Q2. Why choose CUDA 11.8? Vim seems to prefer CUDA $\le$ 11.8 (see here).

  4. Install ViM and related dependencies (causal-conv1d**, mamba, Vim***):
$ git clone https://github.com/anwai98/Vim.git
$ cd Vim
$ git checkout dev  # Our latest changes are hosted at 'dev'.
$ pip install -r vim/vim_requirements.txt
$ pip install -e causal-conv1d
$ pip install -e mamba-1p1p1
$ pip install -e .

NOTE: The installation is sometimes a bit tricky, but following the steps and keeping the footnotes in mind should do the trick. We are working on providing an easier and more stable installation, see this issue.
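Once the steps above went through, a quick sanity check that the core dependencies are importable and that PyTorch sees the GPU can look like this (a sketch; the import names below are the standard ones for these packages, adjust if your installation differs):

```python
# Sanity-check sketch for the ViM-UNet installation: verify that the key packages
# import and that PyTorch was installed with a working CUDA backend.
import torch
import causal_conv1d  # installed from the causal-conv1d sub-package
import mamba_ssm      # installed from the mamba-1p1p1 sub-package

print("torch:", torch.__version__)
print("CUDA version:", torch.version.cuda)
print("GPU available:", torch.cuda.is_available())
```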

For UNet and UNETR:

  1. Install torch-em from source.
  2. Install segment-anything from source.
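One possible way to install both from source is sketched below (check the respective repositories for their recommended installation steps and dependencies):

```bash
# Sketch of a source installation for torch-em and segment-anything;
# the clone-and-editable-install pattern mirrors the ViM installation above.
$ git clone https://github.com/constantinpape/torch-em.git
$ cd torch-em && pip install -e . && cd ..

$ git clone https://github.com/facebookresearch/segment-anything.git
$ cd segment-anything && pip install -e . && cd ..
```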

## Known Issues and Suggestions

  • GLIBCXX_<VERSION> related issues (a quick diagnostic is sketched at the end of this section):

    • Suggestion: Add the lib directory of your mamba environment to LD_LIBRARY_PATH. For example:
    $ export LD_LIBRARY_PATH=/scratch/usr/nimanwai/micromamba/envs/vimunet/lib/
  • FileNotFoundError: [Errno 2] No such file or directory: 'ldconfig':

    • Suggestion: The likely reason is that the PATH variable isn't set correctly. We found the discussion here quite useful. You can set it as in the following example:
    $ export PATH=$PATH:/usr/sbin  # 'ldconfig' could also be located at /usr/bin, etc.; please check your system configuration.
  • **NameError: name 'bare_metal_version' is not defined while installing causal-conv1d:

    • Suggestion: This one's a bit tricky. From our findings, the likely issue is that the path to CUDA_HOME isn't visible to the installed PyTorch. The quickest way to test this is: python -c "from torch.utils.cpp_extension import CUDA_HOME; print(CUDA_HOME)". CUDA is often installed at /usr/local/cuda, so you can expose the path with: export CUDA_HOME=/usr/local/cuda.

      NOTE: If you are using your cluster's CUDA installation and are not sure where it is located, this should do the trick: module show cuda/$VERSION

  • ***Remember to install ViM from the suggested branch. This is important because it enables a few changes: a) automatically installing Vision Mamba as a development module, and b) setting AMP to false to avoid known issues (see mention 1 and mention 2 for hints).
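As a quick diagnostic for the GLIBCXX issue mentioned above, you can check which GLIBCXX versions the libstdc++ inside your environment provides (the path below is only an example, replace it with your environment's lib directory):

```bash
# Diagnostic sketch: list the GLIBCXX symbol versions provided by the
# libstdc++ shipped with your mamba environment.
$ strings /path/to/your/mamba/envs/vimunet/lib/libstdc++.so.6 | grep GLIBCXX
```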