DCLNet

Description

This is a PyTorch implementation of DCLNet: "Disentangled Contour Learning for Quadrilateral Text Detection".

ICDAR2017 MLT   Precision   Recall   F-score
DCLNet          81.0        66.9     73.3
DCLNet*         81.9        71.4     76.3

Prerequisites

Only tested with the following environment (a setup sketch follows the list):

  • Anaconda3
  • python 3.7.1
  • torch 1.2.0
  • torchvision 0.4.0
  • opencv-python 4.4.0.42
  • easydict 1.9
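
For reference, a matching conda environment can be created like this (a sketch only; the environment name is arbitrary and the torch/torchvision wheels must match your CUDA version):

# sketch only: environment name is arbitrary, versions follow the list above
conda create -n dclnet python=3.7.1
conda activate dclnet
pip install torch==1.2.0 torchvision==0.4.0 opencv-python==4.4.0.42 easydict==1.9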

Installation

1. LANMS

Refer to LANMS
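
A typical build looks like the following (a sketch only, assuming the lanms package used by EAST-style repositories, which ships a Makefile and needs a C++ compiler plus the Python development headers):

# sketch: assumes the lanms directory sits in the repository root with its Makefile
cd lanms
make
cd ..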

2. DCN

Refer to the DCN implementation in DBNet. Note that this repo puts dcn in the model directory.
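
A minimal build sketch, assuming the DCN operator from DBNet is copied into model/dcn and compiled in place with a CUDA toolchain matching your PyTorch build:

# sketch: paths follow the note above; setup.py comes from DBNet's dcn package
cd model/dcn
python setup.py build_ext --inplace
cd ../..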

3. Clone

git clone https://github.com/SakuraRiven/DCLNet.git
cd DCLNet

4. Data & Pre-Trained Model

Make a new folder pths and put the downloaded .pth files into it:

mkdir pths
cd pths
mkdir backbone pretrain train
cd ..
mv resnet50-19c8e357.pth pths/backbone/
mv model_epoch_5.pth pths/pretrain/
mv model_epoch_150.pth pths/train/

Here is an example:

.
├── DCLNet
│   ├── model
│   │   └── dcn
│   └── pths
│       ├── backbone
│       ├── pretrain
│       └── train
└── data
    ├── ICDAR2017
    │   ├── train_img
    │   ├── train_gt
    │   ├── valid_img
    │   ├── valid_gt
    │   └── test_img
    └── SynthText
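
The data directories in the layout above can be created as follows (a sketch; the datasets themselves must be downloaded and extracted into these folders, and the folder names must match the example):

# sketch: run from inside DCLNet, so data sits next to the repository
mkdir -p ../data/ICDAR2017/{train_img,train_gt,valid_img,valid_gt,test_img}
mkdir -p ../data/SynthText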

Train

CUDA_VISIBLE_DEVICES=0,1,2,3 python train.py pretrain

Finetune

CUDA_VISIBLE_DEVICES=0,1,2,3 python train.py finetune

Evaluate

CUDA_VISIBLE_DEVICES=0 python eval.py
CUDA_VISIBLE_DEVICES=0 python multi_scale_eval.py

Detect

CUDA_VISIBLE_DEVICES=0 python detect.py

Citation

Please cite the related work in your publications if it helps your research:

@inproceedings{bi2021disentangled,
  title={Disentangled Contour Learning for Quadrilateral Text Detection},
  author={Bi, Yanguang and Hu, Zhiqiang},
  booktitle={Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision},
  pages={909--918},
  year={2021}
}
