---
language:
- pt
tags:
- ner
metrics:
- f1
- accuracy
- precision
- recall
---
# RiskData Brazilian Portuguese NER Model

## Model description

This model is a fine-tuned version of Neuralmind's BERTimbau for the Portuguese language.

## Intended uses & limitations

### How to use
```python
from transformers import BertForTokenClassification, DistilBertTokenizerFast, pipeline

model = BertForTokenClassification.from_pretrained('monilouise/ner_pt_br')
tokenizer = DistilBertTokenizerFast.from_pretrained('neuralmind/bert-base-portuguese-cased',
                                                    model_max_length=512,
                                                    do_lower_case=False)
nlp = pipeline('ner', model=model, tokenizer=tokenizer, grouped_entities=True)
result = nlp("O Tribunal de Contas da União é localizado em Brasília e foi fundado por Rui Barbosa.")
```
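With `grouped_entities=True`, the pipeline returns a list of dicts, each with `entity_group`, `word`, `score`, `start`, and `end` keys. A minimal post-processing sketch follows; the contents of `result` below are hypothetical, not actual model output:

```python
# Hypothetical pipeline output for illustration; the real entities and
# scores depend on the model weights and the input sentence.
result = [
    {"entity_group": "ORGANIZATION", "word": "Tribunal de Contas da União",
     "score": 0.99, "start": 2, "end": 29},
    {"entity_group": "LOCAL", "word": "Brasília", "score": 0.98,
     "start": 47, "end": 55},
    {"entity_group": "PERSON", "word": "Rui Barbosa", "score": 0.97,
     "start": 72, "end": 83},
]

# Keep only confident predictions and group the surface forms by entity type.
by_type = {}
for ent in result:
    if ent["score"] >= 0.90:
        by_type.setdefault(ent["entity_group"], []).append(ent["word"])

print(by_type)
# {'ORGANIZATION': ['Tribunal de Contas da União'], 'LOCAL': ['Brasília'], 'PERSON': ['Rui Barbosa']}
```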
### Limitations and bias

- The fine-tuned model was trained on a corpus of about 180 news articles crawled from Google News. The original project aimed at recognizing named entities in news related to fraud and corruption, classifying those entities into four classes: PERSON, ORGANIZATION, PUBLIC INSTITUTION, and LOCAL.
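The evaluation code further down relies on an `id2tag` mapping from label ids to tag strings. The card does not publish the exact label inventory; a plausible BIO-style tag set for the four classes above could be built like this (names, order, and ids are all assumptions for illustration):

```python
# Illustrative BIO tag set for the four entity classes described above.
# The model's real label inventory may differ.
tags = ["O"] + [
    f"{prefix}-{cls}"
    for cls in ("PERSON", "ORGANIZATION", "PUBLIC INSTITUTION", "LOCAL")
    for prefix in ("B", "I")
]
id2tag = dict(enumerate(tags))          # id -> tag, used when decoding predictions
tag2id = {tag: i for i, tag in enumerate(tags)}  # tag -> id, used when encoding labels

print(id2tag[0], id2tag[1])  # O B-PERSON
```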
## Training procedure
## Eval results

- accuracy: 0.98
- precision: 0.86
- recall: 0.91
- f1: 0.88
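As a quick arithmetic check, the reported F1 is consistent with the precision and recall above, since F1 is their harmonic mean:

```python
# F1 is the harmonic mean of precision and recall.
precision, recall = 0.86, 0.91
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 2))  # 0.88
```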
The scores were computed with the following code:
```python
from typing import Dict, List, Tuple

import numpy as np
# Assumed source of the metric functions: seqeval, whose metrics operate on
# lists of tag sequences (lists of lists of strings).
from seqeval.metrics import accuracy_score, f1_score, precision_score, recall_score
from torch import nn
from transformers import EvalPrediction

# id2tag (label id -> tag string) is assumed to be defined elsewhere.


def align_predictions(predictions: np.ndarray, label_ids: np.ndarray) -> Tuple[List[int], List[int]]:
    preds = np.argmax(predictions, axis=2)
    batch_size, seq_len = preds.shape
    out_label_list = [[] for _ in range(batch_size)]
    preds_list = [[] for _ in range(batch_size)]

    for i in range(batch_size):
        for j in range(seq_len):
            # Skip padded positions, which carry the loss ignore index (-100).
            if label_ids[i, j] != nn.CrossEntropyLoss().ignore_index:
                out_label_list[i].append(id2tag[label_ids[i][j]])
                preds_list[i].append(id2tag[preds[i][j]])

    return preds_list, out_label_list


def compute_metrics(p: EvalPrediction) -> Dict:
    preds_list, out_label_list = align_predictions(p.predictions, p.label_ids)
    return {
        "accuracy_score": accuracy_score(out_label_list, preds_list),
        "precision": precision_score(out_label_list, preds_list),
        "recall": recall_score(out_label_list, preds_list),
        "f1": f1_score(out_label_list, preds_list),
    }
```
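The alignment logic can be exercised on toy data. The sketch below reimplements the padded-position check with the literal ignore index -100 (the value of `nn.CrossEntropyLoss().ignore_index`) so it runs without PyTorch, and uses a made-up two-tag `id2tag` mapping:

```python
import numpy as np

# Made-up tag inventory, for the demo only.
id2tag = {0: "O", 1: "B-PERSON"}

# One sequence of 4 token positions: per-token logits over the two tags.
# The last position is padding, labelled with the ignore index -100.
predictions = np.array([[[0.1, 0.9], [0.8, 0.2], [0.7, 0.3], [0.6, 0.4]]])
label_ids = np.array([[1, 0, 0, -100]])

preds = np.argmax(predictions, axis=2)  # argmax over the tag dimension
preds_list, out_label_list = [], []
for j in range(preds.shape[1]):
    if label_ids[0, j] != -100:  # skip padded positions
        out_label_list.append(id2tag[label_ids[0, j]])
        preds_list.append(id2tag[preds[0, j]])

print(preds_list)      # ['B-PERSON', 'O', 'O']
print(out_label_list)  # ['B-PERSON', 'O', 'O']
```

The padding position is dropped from both lists, so the metric functions only ever compare real tokens.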
## References and citation information

For more information on the BERTimbau language model:
```bibtex
@inproceedings{souza2020bertimbau,
  author    = {Souza, F{\'a}bio and Nogueira, Rodrigo and Lotufo, Roberto},
  title     = {{BERT}imbau: pretrained {BERT} models for {B}razilian {P}ortuguese},
  booktitle = {9th Brazilian Conference on Intelligent Systems, {BRACIS}, Rio Grande do Sul, Brazil, October 20-23 (to appear)},
  year      = {2020}
}

@article{souza2019portuguese,
  title   = {Portuguese Named Entity Recognition using BERT-CRF},
  author  = {Souza, F{\'a}bio and Nogueira, Rodrigo and Lotufo, Roberto},
  journal = {arXiv preprint arXiv:1909.10649},
  url     = {http://arxiv.org/abs/1909.10649},
  year    = {2019}
}
```