# Chinese Named Entity Recognition using BERT + Softmax

## Introduction

[ALGORITHM]

```bibtex
@article{devlin2018bert,
  title={BERT: Pre-training of deep bidirectional transformers for language understanding},
  author={Devlin, Jacob and Chang, Ming-Wei and Lee, Kenton and Toutanova, Kristina},
  journal={arXiv preprint arXiv:1810.04805},
  year={2018}
}
```
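The model pairs a BERT encoder with a per-token softmax classification head: each token's hidden state is projected onto the tag set and the highest-probability tag is taken. The sketch below illustrates just that head with NumPy on random stand-in features; the shapes, tag set, and weights are toy assumptions (a real BERT encoder produces 768-dimensional hidden states), not the repository's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sizes (assumptions): 5 tokens, 8-dim features, 3 tags (e.g. O, B-PER, I-PER).
seq_len, hidden, num_tags = 5, 8, 3
H = rng.normal(size=(seq_len, hidden))   # stand-in for BERT token hidden states
W = rng.normal(size=(hidden, num_tags))  # classification weight
b = np.zeros(num_tags)                   # classification bias

logits = H @ W + b                       # (seq_len, num_tags)

# Per-token softmax over the tag set (subtract max for numerical stability).
probs = np.exp(logits - logits.max(axis=-1, keepdims=True))
probs /= probs.sum(axis=-1, keepdims=True)

tags = probs.argmax(axis=-1)             # one predicted tag id per token
```

At inference time the argmax is taken independently per token; unlike a CRF head, no transition constraints between adjacent tags are enforced.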

## Dataset

### Train Dataset

| trainset    | text_num | entity_num |
| :---------- | :------- | :--------- |
| CLUENER2020 | 10748    | 23338      |

### Test Dataset

| testset     | text_num | entity_num |
| :---------- | :------- | :--------- |
| CLUENER2020 | 1343     | 2982       |

## Results and models

| Method       | Pretrain | Precision | Recall | F1-Score | Download     |
| :----------- | :------- | :-------- | :----- | :------- | :----------- |
| bert_softmax | pretrain | 0.7885    | 0.7998 | 0.7941   | model \| log |
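The reported F1-score is the harmonic mean of precision and recall, which can be checked directly from the table:

```python
# Sanity-check the reported F1 as the harmonic mean of precision and recall.
precision, recall = 0.7885, 0.7998
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 4))  # → 0.7941
```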