
Baseline for the Passage Retrieval Track of the 2022 Language and Intelligence Technology Competition

Date: 2025-07-16

Passage retrieval is the task of finding, in a large-scale corpus, the passages most relevant to a user query. As a key component of many natural language processing tasks, passage retrieval is an important frontier topic in NLP and artificial intelligence, and it has received broad attention from both academia and industry in recent years.

Task Description

Given a query q, the set Pq of all passages relevant to q (where p ∈ Pq denotes a single relevant passage), and a collection P containing all candidate passages, the participating system must retrieve from P all passages in Pq that are relevant to q, and rank the passages of Pq as close to the top of the retrieval result list as possible.

This project provides a runnable example of the open-source baseline system built on the PaddlePaddle deep learning framework.

Baseline Overview

The baseline system consists of the following two steps:

Step 1: train a dual-encoder model for the recall stage

Step 2: train a cross-encoder model for the reranking stage

For more details, see the RocketQA paper (Qu et al., 2021). The baseline implements the first two steps described in that paper.
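The two stages differ in how the query and the passage interact: the dual encoder embeds them independently (so passage vectors can be precomputed and searched at scale), while the cross encoder scores each (query, passage) pair jointly and is therefore run only on the recalled candidates. A minimal sketch with toy vectors; the two scoring functions here are illustrative stand-ins, not the actual ERNIE-based models:

```python
import numpy as np

def dual_encoder_recall(q_vec, passage_vecs, top_k):
    """Stage 1 (recall): independent embeddings, dot-product similarity."""
    scores = passage_vecs @ q_vec
    order = np.argsort(-scores)           # descending by score
    return order[:top_k]                  # indices of the top-k candidates

def cross_encoder_rerank(q_vec, passage_vecs, candidates):
    """Stage 2 (rerank): rescore only the recalled candidates with a
    (toy) joint pairwise score, then sort them by the new score."""
    scores = [float(q_vec @ passage_vecs[i]) ** 2 for i in candidates]  # toy score
    return [c for _, c in sorted(zip(scores, candidates), reverse=True)]

q = np.array([1.0, 0.0])
P = np.array([[0.9, 0.1], [0.2, 0.8], [0.7, 0.3]])
top = dual_encoder_recall(q, P, top_k=2)        # cheap, over all of P
reranked = cross_encoder_rerank(q, P, top)      # expensive, only over top-k
```

This division of labor is why the baseline (and this tutorial) lets you improve the reranker in isolation: the recall stage only has to put the relevant passages somewhere in the top 50.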

To help participants build the baseline quickly and at low cost, and then iterate on it, this tutorial provides the training and inference code for the cross-encoder reranker, together with a fine-tuned reranking model.

Participants can take the baseline top-50 recall results provided with the competition data download and focus on improving the reranker to boost the final ranking quality.

The complete baseline code is available on GitHub: https://github.com/PaddlePaddle/RocketQA/tree/main/research/DuReader-Retrieval-Baseline

Environment Setup

Download the warm-start models

This includes the original pretrained ERNIE 1.0 model and the fine-tuned baseline model.

In [22]

%cd /home/aistudio/work/
!wget https://dataset-bj.cdn.bcebos.com/qianyan/ernie_base_1.0_CN.tar.gz
!tar -zxvf ernie_base_1.0_CN.tar.gz
!mv ernie_base_1.0_CN pretrained-models
!rm ernie_base_1.0_CN.tar.gz
!wget https://dataset-bj.cdn.bcebos.com/qianyan/cross_finetuned_params.tar.gz
!tar -zxvf cross_finetuned_params.tar.gz
!mv cross_params finetuned-models
!rm cross_finetuned_params.tar.gz

(Output truncated: on 2022-04-27, wget downloads ernie_base_1.0_CN.tar.gz (770 MB) and cross_finetuned_params.tar.gz (348 MB) at roughly 60 MB/s, and tar lists the extracted parameter files under ernie_base_1.0_CN/params/ and cross_params/. The recorded run ends with "mv: cannot stat 'cross_finetuned_params': No such file or directory", since the second archive extracts to cross_params/.)

Training Data

The project environment provides the baseline training data under /home/aistudio/data/data142459/:

  • cross.train.tsv — baseline training data for the cross-encoder reranker
  • dev.json — validation set
  • dev.retrieval.top50.res.tsv — top-50 retrieval results of the baseline dual-encoder model on the validation set

The reranker training data is constructed as follows: for each relevant positive example in the labeled training set, negatives are randomly sampled from the top-50 recall results of the baseline dual-encoder model, and together they form one training sample.
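The construction above can be sketched as follows. The field names and the one-positive-plus-N-negatives layout are assumptions for illustration; the actual column format of cross.train.tsv may differ:

```python
import random

def build_cross_encoder_samples(query, positives, top50, num_neg=4, seed=0):
    """For each labeled positive passage, randomly sample negatives from the
    dual-encoder top-50 recall results (excluding known positives)."""
    rng = random.Random(seed)
    candidates = [p for p in top50 if p not in set(positives)]
    samples = []
    for pos in positives:
        negs = rng.sample(candidates, min(num_neg, len(candidates)))
        # one training sample: the query, a positive passage, sampled negatives
        samples.append({"query": query, "pos": pos, "negs": negs})
    return samples

samples = build_cross_encoder_samples(
    "what is passage retrieval",
    positives=["p1"],
    top50=[f"p{i}" for i in range(50)],
)
```

Sampling negatives from the recall top 50 (rather than from the whole corpus) gives the reranker "hard" negatives: passages the dual encoder already considers similar to the query.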

Model Training

Three ways to launch reranker training are shown below:

  • single-GPU training (demo data)
  • single-GPU training (full data)
  • multi-GPU training (full data)

Before running a command, you can adjust the parameters in script/run_cross_encoder_train.sh.

During training, logs can be inspected under /home/aistudio/work/log/.

In [5]

%cd /home/aistudio/work/
# single-GPU training (demo data)
!export CUDA_VISIBLE_DEVICES=0
TRAIN_SET="../data/data142459/cross.train.demo.tsv"
MODEL_PATH="./pretrained-models/ernie_base_1.0_CN/params"
!sh script/run_cross_encoder_train.sh $TRAIN_SET $MODEL_PATH 3 1

/home/aistudio/work
+ export FLAGS_eager_delete_tensor_gb=0
+ export FLAGS_sync_nccl_allreduce=1
+ export FLAGS_fraction_of_gpu_memory_to_use=0.95
+ export GLOG_v=1
+ [ 4 != 4 ]
+ TRAIN_SET=../data/data142459/cross.train.demo.tsv
+ MODEL_PATH=./pretrained-models/ernie_base_1.0_CN/params
+ epoch=3
+ node=1
+ CHECKPOINT_PATH=output
+ [ ! -d output ]
+ [ ! -d log ]
+ lr=1e-5
+ batch_size=32
+ cat ../data/data142459/cross.train.demo.tsv
+ wc -l
+ train_exampls=5000
+ expr 5000 / 32 / 1
+ save_steps=156
+ expr 156 * 32 * 1
+ data_size=4992
+ expr 156 * 3 / 2
+ new_save_steps=234
+ echo 4992
4992
+ python -m paddle.distributed.launch --log_dir log ./src/train_ce.py --use_cuda true --verbose true --do_train true --do_val false --do_test false --use_mix_precision false --train_data_size 4992 --batch_size 32 --init_pretraining_params ./pretrained-models/ernie_base_1.0_CN/params --train_set ../data/data142459/cross.train.demo.tsv --save_steps 234 --validation_steps 234 --checkpoints output --weight_decay 0.01 --warmup_proportion 0.0 --epoch 3 --max_seq_len 384 --for_cn true --vocab_path ./pretrained-models/ernie_base_1.0_CN/vocab.txt --ernie_config_path ./pretrained-models/ernie_base_1.0_CN/ernie_config.json --learning_rate 1e-5 --skip_steps 10 --num_iteration_per_drop_scope 1 --num_labels 2 --random_seed 1

In [?]

%cd /home/aistudio/work/
# single-GPU training (full data)
!export CUDA_VISIBLE_DEVICES=0
TRAIN_SET="../data/data142459/cross.train.tsv"
MODEL_PATH="./pretrained-models/ernie_base_1.0_CN/params"
!sh script/run_cross_encoder_train.sh $TRAIN_SET $MODEL_PATH 3 1

In [7]

%cd /home/aistudio/work/
# multi-GPU training (full data)
!export CUDA_VISIBLE_DEVICES=0,1,2,3
TRAIN_SET="../data/data142459/cross.train.tsv"
MODEL_PATH="./pretrained-models/ernie_base_1.0_CN/params"
!sh script/run_cross_encoder_train.sh $TRAIN_SET $MODEL_PATH 3 4

/home/aistudio/work
+ export FLAGS_eager_delete_tensor_gb=0
+ export FLAGS_sync_nccl_allreduce=1
+ export FLAGS_fraction_of_gpu_memory_to_use=0.95
+ export GLOG_v=1
+ [ 4 != 4 ]
+ TRAIN_SET=../data/data142459/cross.train.demo.tsv
+ MODEL_PATH=pretrained-models/ernie_base_1.0_CN/params
+ epoch=3
+ node=4
+ CHECKPOINT_PATH=output
+ [ ! -d output ]
+ mkdir output
+ [ ! -d log ]
+ mkdir log
+ lr=1e-5
+ batch_size=32
+ cat ../data/data142459/cross.train.demo.tsv
+ wc -l
+ train_exampls=5000
+ save_steps=$[5000/32/4]
+ data_size=$[$[5000/32/4]*32*4]
+ new_save_steps=$[$[5000/32/4]*3/2]
+ python -m paddle.distributed.launch --log_dir log ./src/train_ce.py --use_cuda true --verbose true --do_train true --do_val false --do_test false --use_mix_precision false --train_data_size $[$[5000/32/4]*32*4] --batch_size 32 --init_pretraining_params pretrained-models/ernie_base_1.0_CN/params --train_set ../data/data142459/cross.train.demo.tsv --save_steps $[$[5000/32/4]*3/2] --validation_steps $[$[5000/32/4]*3/2] --checkpoints output --weight_decay 0.01 --warmup_proportion 0.0 --epoch 3 --max_seq_len 384 --for_cn true --vocab_path pretrained-models/ernie_base_1.0_CN/vocab.txt --ernie_config_path pretrained-models/ernie_base_1.0_CN/ernie_config.json --learning_rate 1e-5 --skip_steps 10 --num_iteration_per_drop_scope 1 --num_labels 2 --random_seed 1

Model Inference and Evaluation

After training completes, the model parameters are saved under /home/aistudio/work/output/.

As an example, the fine-tuned baseline model is used here to score the top-50 results on the validation set and then evaluate them.

Scoring with the model

The following command scores the top-50 validation results with the cross-encoder reranker.

The scores are saved under /home/aistudio/work/output/.

In [1]

%cd /home/aistudio/work/
!export CUDA_VISIBLE_DEVICES=0
TEST_SET="../data/data142459/dev.retrieval.top50.res.tsv"
MODEL_PATH="finetuned-models/cross_params/"
!sh script/run_cross_encoder_inference.sh $TEST_SET $MODEL_PATH

/home/aistudio/work
+ export FLAGS_eager_delete_tensor_gb=0
+ export FLAGS_sync_nccl_allreduce=1
+ export FLAGS_fraction_of_gpu_memory_to_use=0.95
+ export GLOG_v=1
+ [ 2 != 2 ]
+ TASK_DATA=../data/data142459/dev.retrieval.top50.res.tsv
+ MODEL_PATH=finetuned-models/cross_params/
+ batch_size=128
+ [ ! -d output ]
+ mkdir output
+ [ ! -d log ]
+ mkdir log
+ python -u ./src/train_ce.py --use_cuda true --verbose true --do_train false --do_val false --do_test true --batch_size 128 --init_checkpoint finetuned-models/cross_params/ --test_set ../data/data142459/dev.retrieval.top50.res.tsv --test_save output/../data/data142459/dev.retrieval.top50.res.tsv.score --max_seq_len 384 --for_cn true --vocab_path pretrained-models/ernie_base_1.0_CN/vocab.txt --ernie_config_path pretrained-models/ernie_base_1.0_CN/ernie_config.json

Evaluation

First, convert the scores into the standard input form for evaluation:

In [8]

MODEL_OUTPUT="output/dev.retrieval.top50.res.tsv.score.0.0"
ID_MAP="../data/data142459/dev.retrieval.top50.res.id_map.tsv"
!python metric/convert_rerank_res_to_json.py $MODEL_OUTPUT $ID_MAP

The output file is saved to output/cross_res.json.

This file is in exactly the submission format required by the competition system.
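Conceptually, the conversion done by convert_rerank_res_to_json.py amounts to grouping the per-pair scores by query, sorting each group by descending score, and emitting an ordered passage-id list per query. A sketch; the exact JSON schema and field names here are assumptions, not the script's verified output:

```python
import json
from collections import defaultdict

def to_submission(scored_pairs):
    """scored_pairs: iterable of (query_id, passage_id, score) tuples.
    Returns {query_id: [passage_id, ...]} ranked by descending score."""
    by_query = defaultdict(list)
    for qid, pid, score in scored_pairs:
        by_query[qid].append((score, pid))
    return {qid: [pid for _, pid in sorted(pairs, reverse=True)]
            for qid, pairs in by_query.items()}

pairs = [("q1", "pA", 0.2), ("q1", "pB", 0.9), ("q2", "pC", 0.5)]
submission = to_submission(pairs)
print(json.dumps(submission, ensure_ascii=False))
# {"q1": ["pB", "pA"], "q2": ["pC"]}
```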

Run the following command to obtain the final evaluation results:

In [9]

REFERENCE_FILE="../data/data142459/dev.json"
PREDICTION_FILE="output/cross_res.json"
!python metric/evaluation.py $REFERENCE_FILE $PREDICTION_FILE

{"MRR@10": 0.7284081349206347, "QueriesRanked": 2000, "recall@1": 0.641, "recall@50": 0.9175}
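The metrics reported above can be computed as below. This is a simplified sketch; the official metric/evaluation.py may handle ties and multiple positives differently. Here MRR@10 credits the highest-ranked relevant passage within the top 10, and recall@k is the fraction of a query's relevant passages found in the top k:

```python
def mrr_at_10(ranked, relevant):
    """Reciprocal rank of the first relevant passage within the top 10."""
    for rank, pid in enumerate(ranked[:10], start=1):
        if pid in relevant:
            return 1.0 / rank
    return 0.0

def recall_at_k(ranked, relevant, k):
    """Fraction of relevant passages that appear in the top k."""
    return len(set(ranked[:k]) & set(relevant)) / len(relevant)

ranked = ["p3", "p7", "p1"]       # system output for one query
relevant = {"p7", "p9"}           # labeled relevant passages
assert mrr_at_10(ranked, relevant) == 0.5        # first hit at rank 2
assert recall_at_k(ranked, relevant, 3) == 0.5   # one of two positives found
```

The corpus-level numbers in the output above are these per-query values averaged over the 2000 ranked validation queries.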
