BERT checkpoint (ckpt) download
Posted: 2023-10-23 20:13:46
The pre-trained BERT models can be downloaded from Google's official GitHub repository (google-research/bert). The download links are:
1. BERT-Base, Uncased: 12-layer, 768-hidden, 12-heads, 110M parameters
- [Download link](https://storage.googleapis.com/bert_models/2018_10_18/uncased_L-12_H-768_A-12.zip)
2. BERT-Large, Uncased: 24-layer, 1024-hidden, 16-heads, 340M parameters
- [Download link](https://storage.googleapis.com/bert_models/2018_10_18/uncased_L-24_H-1024_A-16.zip)
3. BERT-Base, Cased: 12-layer, 768-hidden, 12-heads, 110M parameters
- [Download link](https://storage.googleapis.com/bert_models/2018_10_18/cased_L-12_H-768_A-12.zip)
4. BERT-Large, Cased: 24-layer, 1024-hidden, 16-heads, 340M parameters
- [Download link](https://storage.googleapis.com/bert_models/2018_10_18/cased_L-24_H-1024_A-16.zip)
5. BERT-Base, Multilingual Cased: 104 languages, 12-layer, 768-hidden, 12-heads, 110M parameters
- [Download link](https://storage.googleapis.com/bert_models/2018_11_23/multi_cased_L-12_H-768_A-12.zip)
6. BERT-Base, Multilingual Uncased: 102 languages, 12-layer, 768-hidden, 12-heads, 110M parameters
- [Download link](https://storage.googleapis.com/bert_models/2018_11_03/multilingual_L-12_H-768_A-12.zip)
After downloading and unzipping an archive, you can use the pre-trained BERT model locally for fine-tuning or other tasks. Each archive contains the TensorFlow checkpoint (`bert_model.ckpt.*`), the model configuration (`bert_config.json`), and the vocabulary file (`vocab.txt`).
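All the links above follow one naming scheme: a release date folder plus a model name encoding casing, layer count (`L`), hidden size (`H`), and attention heads (`A`). As a minimal sketch (the helper function name is my own, but the base URL and naming pattern come directly from the links listed above), you can build any of these URLs programmatically:

```python
# Sketch: construct an official BERT checkpoint URL from its configuration.
# The base URL and the {case}_L-{layers}_H-{hidden}_A-{heads}.zip naming
# scheme are taken from the download links listed above.
BASE = "https://storage.googleapis.com/bert_models"

def bert_url(release: str, case: str, layers: int, hidden: int, heads: int) -> str:
    """Return the zip URL for a given BERT configuration."""
    name = f"{case}_L-{layers}_H-{hidden}_A-{heads}"
    return f"{BASE}/{release}/{name}.zip"

# BERT-Base, Uncased (matches link 1 above):
print(bert_url("2018_10_18", "uncased", 12, 768, 12))
# -> https://storage.googleapis.com/bert_models/2018_10_18/uncased_L-12_H-768_A-12.zip
```

Note that the release-date folder differs between models (for example, the multilingual cased model lives under `2018_11_23`), so the date is a required argument rather than a constant.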