
Autonomous Driving Road Segmentation Model Based on PaddleSeg and BiSeNet


Autonomous Driving Road Segmentation Model Based on PaddleSeg

Lane line segmentation is a hot topic in the image segmentation field and has important applications in autonomous driving.

In this project we train an autonomous driving road segmentation model based on PaddlePaddle's PaddleSeg suite, using the real-time semantic segmentation network BiSeNet. The model is trained on several preprocessed datasets, evaluated and visualized from multiple angles after training, and finally saved and exported for the next stage of the research practicum.

Shandong University, Data Science and Artificial Intelligence Experimental Class

Team members: 陈其轩, 王启帆, 何金原, 高梓又

References:

PaddleSeg in Practice: Portrait Segmentation (latest version) https://aistudio.baidu.com/aistudio/projectdetail/2189481

The lightweight real-time semantic segmentation classic BiSeNet and its successor BiSeNet V2 https://zhuanlan.zhihu.com/p/141692672

BiSeNet: Bilateral Segmentation Network for Real-time Semantic Segmentation

1. PaddleSeg Installation and Environment Setup

PaddleSeg is an end-to-end image segmentation development kit built on the PaddlePaddle framework. It covers a large number of high-quality segmentation models, from high-accuracy to lightweight ones. Thanks to its modular design, it can be driven either by configuration files or through its API, helping developers complete the whole image segmentation workflow from training to deployment more conveniently.

In [1]

# Download PaddleSeg with git
# GitHub mirror
#! git clone https://github.com/PaddlePaddle/PaddleSeg
# Gitee mirror (recommended)
! git clone https://gitee.com/paddlepaddle/PaddleSeg.git

Cloning into 'PaddleSeg'...
remote: Enumerating objects: 18055, done.
remote: Counting objects: 100% (3018/3018), done.
remote: Compressing objects: 100% (1534/1534), done.
remote: Total 18055 (delta 1698), reused 2505 (delta 1435), pack-reused 15037
Receiving objects: 100% (18055/18055), 341.84 MiB | 11.88 MiB/s, done.
Resolving deltas: 100% (11562/11562), done.
Checking connectivity... done.

In [2]

# Install the dependencies
!pip install -r PaddleSeg/requirements.txt
# Verification run; can be skipped to save time
#! python train.py --config configs/quick_start/bisenet_optic_disc_512x512_1k.yml

Looking in indexes: https://pypi.tuna.tsinghua.edu.cn/simple
Requirement already satisfied: pyyaml>=5.1 in /opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages (from -r PaddleSeg/requirements.txt (line 1)) (5.1.2)
Requirement already satisfied: visualdl>=2.0.0 ... (2.3.0)
Requirement already satisfied: opencv-python ... (4.1.1.26)
Requirement already satisfied: tqdm ... (4.27.0)
Requirement already satisfied: filelock ... (3.0.12)
Requirement already satisfied: scipy ... (1.6.3)
Requirement already satisfied: prettytable ... (0.7.2)
Requirement already satisfied: sklearn ... (0.0)
... (all transitive dependencies already satisfied)

2. Dataset Preparation

2.1 Dataset Selection

To meet the assignment requirement of using as many datasets as possible, this project uses four datasets in total. Because AI Studio limits how many datasets a project can mount (at most two), three of them were processed and packed into one large dataset collection, which is then mounted. The datasets are:

Lane Line Detection - Preliminary Round https://aistudio.baidu.com/aistudio/datasetdetail/54076

Lane Line Detection Dataset https://aistudio.baidu.com/aistudio/datasetdetail/68698

2020 China Hualu Cup Data Lake Algorithm Competition, Targeted Algorithm Track (Lane Line Recognition) https://aistudio.baidu.com/aistudio/datasetdetail/54289

Intelligent Vehicle 2022 Baseline Dataset https://aistudio.baidu.com/aistudio/datasetdetail/125507

2.2 Dataset Structure

Before training on a custom dataset, we organize the data into the following structure:

custom_dataset
|
|--images
|  |--image1.jpg
|  |--image2.jpg
|  |--...
|
|--labels
|  |--label1.png
|  |--label2.png
|  |--...
|
|--train.txt
|--val.txt
|--test.txt

The contents of train.txt and val.txt look like this:

images/image1.jpg labels/label1.png
images/image2.jpg labels/label2.png
...
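For this project the list files were prepared in advance; purely as a reference, a minimal sketch for generating them from the layout above could look like the following (the 9:1 train/val split and the script itself are our own illustration, not part of the original pipeline):

import os
import random

root = "custom_dataset"                      # dataset root laid out as above
images = sorted(os.listdir(os.path.join(root, "images")))
random.seed(0)
random.shuffle(images)

split = int(len(images) * 0.9)               # assumed 9:1 train/val split
subsets = {"train.txt": images[:split], "val.txt": images[split:]}

for name, subset in subsets.items():
    with open(os.path.join(root, name), "w") as f:
        for img in subset:
            label = os.path.splitext(img)[0] + ".png"   # label shares the image's base name
            f.write(f"images/{img} labels/{label}\n")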

2.3 YAML Configuration and Parameter Tuning

After completing the steps above, we need to write the custom dataset's information into a yml configuration file.

File to modify: /home/aistudio/PaddleSeg/configs/quick_start/bisenet_optic_disc_512x512_1k.yml

The yml file has already been prepared here, so we simply replace the original one with a Linux shell command.

In [3]

! mv bisenet_optic_disc_512x512_1k.yml /home/aistudio/PaddleSeg/configs/quick_start/

Taking this yml file as an example, here is how the configuration is written:

batch_size: 4            # number of samples per training batch
iters: 20000             # number of training iterations

train_dataset:
  type: Dataset
  dataset_root: /home/aistudio/custom_dataset/dataset2          # directory containing train.txt
  train_path: /home/aistudio/custom_dataset/dataset2/train.txt  # path to train.txt
  num_classes: 20        # number of classes
  transforms:            # image transforms
    - type: Resize
      target_size: [512, 512]        # resize target
    - type: RandomHorizontalFlip     # data augmentation
    - type: Normalize                # normalization
  mode: train

val_dataset:
  type: Dataset
  dataset_root: /home/aistudio/custom_dataset/dataset2          # directory containing val.txt
  val_path: /home/aistudio/custom_dataset/dataset2/val.txt      # path to val.txt
  num_classes: 20        # number of classes
  transforms:            # image transforms
    - type: Resize
      target_size: [512, 512]        # resize target
    - type: Normalize                # normalization
  mode: val

optimizer:               # optimizer
  type: sgd              # stochastic gradient descent
  momentum: 0.9          # momentum
  weight_decay: 4.0e-5   # weight decay

lr_scheduler:            # learning-rate schedule
  type: PolynomialDecay  # polynomial decay
  learning_rate: 0.01    # initial learning rate
  end_lr: 0
  power: 0.9

loss:                    # loss function
  types:
    - type: CrossEntropyLoss         # cross-entropy loss
  coef: [1, 1, 1, 1, 1]

model:
  type: BiSeNetV2        # BiSeNet V2
  pretrained: Null
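Since num_classes is set to 20, every pixel in the label images must hold a class ID in [0, 19]; 255 is treated as ignore_index by CrossEntropyLoss. A quick sanity check, assuming the dataset layout configured above (this check is our own addition, not part of the original notebook):

import os
import numpy as np
from PIL import Image

root = "/home/aistudio/custom_dataset/dataset2"
with open(os.path.join(root, "train.txt")) as f:
    pairs = [line.split() for line in f if line.strip()]

for _, label_path in pairs[:20]:                      # spot-check the first 20 samples
    label = np.array(Image.open(os.path.join(root, label_path)))
    values = np.unique(label)
    bad = values[(values >= 20) & (values != 255)]    # 255 = ignore_index
    if bad.size:
        print(label_path, "has unexpected class IDs:", bad)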

2.4 Unzipping the Datasets

In [4]

# Unzip the dataset collection with Python
! python /home/aistudio/unzip.py
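The helper script unzip.py itself is not shown in this write-up; a minimal sketch of what such a script might do is given below (the archive location /home/aistudio/data and the extraction target are assumptions, not the project's actual paths):

import glob
import os
import zipfile

src_dir = "/home/aistudio/data"            # assumed location of the mounted zip archives
dst_dir = "/home/aistudio/custom_dataset"  # assumed extraction target
os.makedirs(dst_dir, exist_ok=True)

# Extract every zip archive found under the source directory
for archive in glob.glob(os.path.join(src_dir, "**", "*.zip"), recursive=True):
    with zipfile.ZipFile(archive) as zf:
        zf.extractall(dst_dir)
        print("extracted", archive)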

3. Model Training

3.1 Model Selection

3.1.1 Lightweight Semantic Segmentation

Lightweight network design is a popular research direction: researchers look for a balance between computation, parameter count, and accuracy, aiming for high accuracy with as little computation and as few parameters as possible. The main lightweight models today include SqueezeNet, the MobileNet family, and the ShuffleNet family. They have performed well in image classification and can serve as backbone networks for semantic segmentation tasks.

In semantic segmentation, however, every pixel of the input image must be classified, which makes the computation heavy. There are generally two ways to reduce it: shrinking the input image and reducing model complexity. Shrinking the image reduces computation most directly, but the image loses a great deal of detail, which hurts accuracy. Reducing model complexity weakens the feature-extraction capability and likewise degrades segmentation accuracy. Applying lightweight models to semantic segmentation while balancing real-time speed and accuracy is therefore quite challenging.
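To make the input-size trade-off concrete: the cost of a convolution layer scales with the number of output pixels, so halving the input side length cuts its multiply-accumulate count to roughly a quarter. A back-of-the-envelope check (the layer shape is chosen purely for illustration):

def conv_macs(h, w, c_in, c_out, k=3):
    """Approximate multiply-accumulates of a k x k convolution (stride 1, same padding)."""
    return h * w * c_in * c_out * k * k

full = conv_macs(1024, 1024, 64, 64)   # e.g. a 64->64 3x3 conv on a 1024x1024 input
half = conv_macs(512, 512, 64, 64)     # the same layer after resizing to 512x512
print(full / half)                     # -> 4.0: quartering the pixels quarters the cost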

3.1.2 Mainstream Acceleration Approaches

There are currently three main ways to speed up segmentation:

  • Restrict the input size by cropping or resizing to lower computational complexity. Simple and effective as this is, the loss of spatial detail degrades the prediction, especially near boundaries, lowering both metric scores and visual quality;
  • Reduce the number of network channels to speed up processing, especially in the early stages of the backbone, which however weakens the spatial information;
  • Drop the final stages of the model in pursuit of an extremely compact framework (e.g. ENet). The drawback is obvious: because ENet discards the downsampling of the last stage, the model's receptive field is not large enough to cover large objects, which hurts its discriminative ability.

These acceleration tricks lose a lot of spatial detail or sacrifice spatial capacity, causing a sharp drop in accuracy. To compensate for the lost spatial information, some methods adopt a U-shape structure to recover it. However, the U-shape slows the network down, and much of the lost information cannot simply be recovered by fusing shallow features.

3.1.3 BiSeNet

The paper BiSeNet: Bilateral Segmentation Network for Real-time Semantic Segmentation proposes a new bilateral segmentation network, BiSeNet. It first designs a spatial path with small stride to preserve spatial information and produce high-resolution feature maps, and in parallel a context path with a fast downsampling rate to obtain a sufficiently large receptive field. On top of these two paths, a new feature fusion module merges their feature maps, striking a balance between speed and accuracy.
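For intuition only, the bilateral idea can be sketched as a shallow spatial path that keeps 1/8 resolution plus an aggressively downsampled context path, with the two feature maps fused before the per-pixel classifier. This is a schematic toy model written against the paddle API, not the BiSeNetV2 implementation that PaddleSeg actually trains below; all layer widths are arbitrary:

import paddle
import paddle.nn as nn
import paddle.nn.functional as F

def conv_bn_relu(c_in, c_out, stride=1):
    return nn.Sequential(
        nn.Conv2D(c_in, c_out, 3, stride=stride, padding=1),
        nn.BatchNorm2D(c_out),
        nn.ReLU())

class TinyBilateralNet(nn.Layer):
    """Schematic two-path network: spatial path keeps 1/8 resolution, context path goes to 1/32."""
    def __init__(self, num_classes=20):
        super().__init__()
        # Spatial path: small stride, preserves spatial detail
        self.spatial = nn.Sequential(
            conv_bn_relu(3, 64, stride=2),
            conv_bn_relu(64, 64, stride=2),
            conv_bn_relu(64, 128, stride=2))
        # Context path: fast downsampling for a large receptive field
        self.context = nn.Sequential(
            conv_bn_relu(3, 32, stride=2),
            conv_bn_relu(32, 64, stride=2),
            conv_bn_relu(64, 128, stride=2),
            conv_bn_relu(128, 128, stride=2),
            conv_bn_relu(128, 128, stride=2))
        # Feature fusion and per-pixel classifier
        self.fuse = conv_bn_relu(256, 128)
        self.head = nn.Conv2D(128, num_classes, 1)

    def forward(self, x):
        detail = self.spatial(x)                                  # 1/8 resolution features
        context = self.context(x)                                 # 1/32 resolution features
        context = F.interpolate(context, size=detail.shape[2:],   # upsample to match the detail map
                                mode='bilinear', align_corners=False)
        fused = self.fuse(paddle.concat([detail, context], axis=1))
        logits = self.head(fused)
        return F.interpolate(logits, size=x.shape[2:],            # back to input resolution
                             mode='bilinear', align_corners=False)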

3.2 BiSeNet Model Training

In [5]

# Train the model
! python /home/aistudio/PaddleSeg/train.py --config /home/aistudio/PaddleSeg/configs/quick_start/bisenet_optic_disc_512x512_1k.yml --do_eval --use_vdl --save_interval 4000 --save_dir output

/home/aistudio/PaddleSeg/paddleseg/models/losses/rmi_loss.py:78: DeprecationWarning: invalid escape sequence \i
2022-07-21 14:55:26 [INFO]
------------Environment Information-------------
platform: Linux-4.15.0-140-generic-x86_64-with-debian-stretch-sid
Python: 3.7.4 (default, Aug 13 2019, 20:35:49) [GCC 7.3.0]
Paddle compiled with cuda: True
NVCC: Cuda compilation tools, release 10.1, V10.1.243
cudnn: 7.6
GPUs used: 1
CUDA_VISIBLE_DEVICES: None
GPU: ['GPU 0: Tesla V100-SXM2-16GB']
GCC: gcc (Ubuntu 7.5.0-3ubuntu1~16.04) 7.5.0
PaddleSeg: 2.5.0
PaddlePaddle: 2.3.1
OpenCV: 4.1.1
------------------------------------------------
2022-07-21 14:55:27 [INFO]
---------------Config Information---------------
... (same settings as the yml file in section 2.3)
------------------------------------------------
W0721 14:55:27.411620  1235 gpu_resources.cc:61] Please NOTE: device: 0, GPU Compute Capability: 7.0, Driver API Version: 11.2, Runtime API Version: 10.1
W0721 14:55:27.411733  1235 gpu_resources.cc:91] device: 0, cuDNN Version: 7.6.
2022-07-21 14:55:54 [INFO] [TRAIN] epoch: 1, iter: 10/20000, loss: 11.9167, lr: 0.009996, batch_cost: 2.2641, reader_cost: 0.77581, ips: 1.7667 samples/sec | ETA 12:34:18
2022-07-21 14:55:57 [INFO] [TRAIN] epoch: 1, iter: 20/20000, loss: 2.5850, lr: 0.009991, batch_cost: 0.2598, reader_cost: 0.13210, ips: 15.3944 samples/sec | ETA 01:26:31
2022-07-21 14:55:59 [INFO] [TRAIN] epoch: 1, iter: 30/20000, loss: 1.1348, lr: 0.009987, batch_cost: 0.2198, reader_cost: 0.09766, ips: 18.1970 samples/sec | ETA 01:13:09
2022-07-21 14:56:01 [INFO] [TRAIN] epoch: 1, iter: 40/20000, loss: 0.7021, lr: 0.009982, batch_cost: 0.2444, reader_cost: 0.11737, ips: 16.3676 samples/sec | ETA 01:21:17
2022-07-21 14:56:04 [INFO] [TRAIN] epoch: 1, iter: 50/20000, loss: 0.6947, lr: 0.009978, batch_cost: 0.2438, reader_cost: 0.12147, ips: 16.4102 samples/sec | ETA 01:21:02
...
2022-07-21 15:08:26 [INFO] [TRAIN] epoch: 1, iter: 2990/20000, loss: 0.4838, lr: 0.008644, batch_cost: 0.2356, reader_cost: 0.11406, ips: 16.9800 samples/sec | ETA 01:06:47
2022-07-21 15:08:28 [INFO] [TRAIN] epoch: 1, iter: 3000/20000, loss: 0.4002, lr: 0.008640, batch_cost: 0.2298, reader_cost: 0.10783, ips: 17.4047 ...
(training log truncated)
samples/sec | ETA 01:05:072022-07-21 15:08:31 [INFO][TRAIN] epoch: 1, iter: 3010/20000, loss: 0.3220, lr: 0.008635, batch_cost: 0.2573, reader_cost: 0.13225, ips: 15.5464 samples/sec | ETA 01:12:512022-07-21 15:08:33 [INFO][TRAIN] epoch: 1, iter: 3020/20000, loss: 0.4214, lr: 0.008631, batch_cost: 0.2399, reader_cost: 0.11714, ips: 16.6726 samples/sec | ETA 01:07:532022-07-21 15:08:35 [INFO][TRAIN] epoch: 1, iter: 3030/20000, loss: 0.4681, lr: 0.008626, batch_cost: 0.2418, reader_cost: 0.11388, ips: 16.5410 samples/sec | ETA 01:08:232022-07-21 15:08:38 [INFO][TRAIN] epoch: 1, iter: 3040/20000, loss: 0.4758, lr: 0.008621, batch_cost: 0.3001, reader_cost: 0.13911, ips: 13.3291 samples/sec | ETA 01:24:492022-07-21 15:08:41 [INFO][TRAIN] epoch: 1, iter: 3050/20000, loss: 0.3360, lr: 0.008617, batch_cost: 0.2690, reader_cost: 0.13264, ips: 14.8697 samples/sec | ETA 01:15:592022-07-21 15:08:44 [INFO][TRAIN] epoch: 1, iter: 3060/20000, loss: 0.4790, lr: 0.008612, batch_cost: 0.2590, reader_cost: 0.09907, ips: 15.4436 samples/sec | ETA 01:13:072022-07-21 15:08:46 [INFO][TRAIN] epoch: 1, iter: 3070/20000, loss: 0.5001, lr: 0.008608, batch_cost: 0.2672, reader_cost: 0.13652, ips: 14.9673 samples/sec | ETA 01:15:242022-07-21 15:08:49 [INFO][TRAIN] epoch: 1, iter: 3080/20000, loss: 0.4304, lr: 0.008603, batch_cost: 0.2563, reader_cost: 0.12152, ips: 15.6095 samples/sec | ETA 01:12:152022-07-21 15:08:51 [INFO][TRAIN] epoch: 1, iter: 3090/20000, loss: 0.3764, lr: 0.008599, batch_cost: 0.2280, reader_cost: 0.09550, ips: 17.5428 samples/sec | ETA 01:04:152022-07-21 15:08:54 [INFO][TRAIN] epoch: 1, iter: 3100/20000, loss: 0.4385, lr: 0.008594, batch_cost: 0.2351, reader_cost: 0.10701, ips: 17.0156 samples/sec | ETA 01:06:122022-07-21 15:08:56 [INFO][TRAIN] epoch: 1, iter: 3110/20000, loss: 0.4904, lr: 0.008589, batch_cost: 0.2319, reader_cost: 0.10158, ips: 17.2489 samples/sec | ETA 01:05:162022-07-21 15:08:58 [INFO][TRAIN] epoch: 1, iter: 3120/20000, loss: 0.4073, lr: 0.008585, batch_cost: 0.2333, reader_cost: 0.10906, ips: 17.1489 samples/sec | ETA 01:05:372022-07-21 15:09:01 [INFO][TRAIN] epoch: 1, iter: 3130/20000, loss: 0.5778, lr: 0.008580, batch_cost: 0.2443, reader_cost: 0.11885, ips: 16.3709 samples/sec | ETA 01:08:412022-07-21 15:09:03 [INFO][TRAIN] epoch: 1, iter: 3140/20000, loss: 0.3426, lr: 0.008576, batch_cost: 0.2364, reader_cost: 0.11508, ips: 16.9217 samples/sec | ETA 01:06:252022-07-21 15:09:06 [INFO][TRAIN] epoch: 1, iter: 3150/20000, loss: 0.4100, lr: 0.008571, batch_cost: 0.2420, reader_cost: 0.11801, ips: 16.5299 samples/sec | ETA 01:07:572022-07-21 15:09:08 [INFO][TRAIN] epoch: 1, iter: 3160/20000, loss: 0.5320, lr: 0.008567, batch_cost: 0.2782, reader_cost: 0.14625, ips: 14.3782 samples/sec | ETA 01:18:042022-07-21 15:09:11 [INFO][TRAIN] epoch: 1, iter: 3170/20000, loss: 0.4115, lr: 0.008562, batch_cost: 0.2430, reader_cost: 0.11660, ips: 16.4625 samples/sec | ETA 01:08:092022-07-21 15:09:13 [INFO][TRAIN] epoch: 1, iter: 3180/20000, loss: 0.3951, lr: 0.008557, batch_cost: 0.2533, reader_cost: 0.11296, ips: 15.7895 samples/sec | ETA 01:11:012022-07-21 15:09:17 [INFO][TRAIN] epoch: 1, iter: 3190/20000, loss: 0.5235, lr: 0.008553, batch_cost: 0.3509, reader_cost: 0.17824, ips: 11.4000 samples/sec | ETA 01:38:182022-07-21 15:09:19 [INFO][TRAIN] epoch: 1, iter: 3200/20000, loss: 0.3579, lr: 0.008548, batch_cost: 0.2595, reader_cost: 0.13354, ips: 15.4148 samples/sec | ETA 01:12:392022-07-21 15:09:22 [INFO][TRAIN] epoch: 1, iter: 3210/20000, loss: 0.4029, lr: 0.008544, batch_cost: 
0.2347, reader_cost: 0.10922, ips: 17.0406 samples/sec | ETA 01:05:412022-07-21 15:09:24 [INFO][TRAIN] epoch: 1, iter: 3220/20000, loss: 0.3379, lr: 0.008539, batch_cost: 0.2402, reader_cost: 0.11842, ips: 16.6521 samples/sec | ETA 01:07:102022-07-21 15:09:27 [INFO][TRAIN] epoch: 1, iter: 3230/20000, loss: 0.3596, lr: 0.008534, batch_cost: 0.2484, reader_cost: 0.12530, ips: 16.1001 samples/sec | ETA 01:09:262022-07-21 15:09:29 [INFO][TRAIN] epoch: 1, iter: 3240/20000, loss: 0.3281, lr: 0.008530, batch_cost: 0.2375, reader_cost: 0.11461, ips: 16.8400 samples/sec | ETA 01:06:202022-07-21 15:09:31 [INFO][TRAIN] epoch: 1, iter: 3250/20000, loss: 0.4098, lr: 0.008525, batch_cost: 0.2428, reader_cost: 0.12111, ips: 16.4771 samples/sec | ETA 01:07:462022-07-21 15:09:34 [INFO][TRAIN] epoch: 1, iter: 3260/20000, loss: 0.5891, lr: 0.008521, batch_cost: 0.2124, reader_cost: 0.09167, ips: 18.8334 samples/sec | ETA 00:59:152022-07-21 15:09:36 [INFO][TRAIN] epoch: 1, iter: 3270/20000, loss: 0.3651, lr: 0.008516, batch_cost: 0.2390, reader_cost: 0.11789, ips: 16.7350 samples/sec | ETA 01:06:382022-07-21 15:09:38 [INFO][TRAIN] epoch: 1, iter: 3280/20000, loss: 0.5262, lr: 0.008512, batch_cost: 0.2387, reader_cost: 0.11501, ips: 16.7596 samples/sec | ETA 01:06:302022-07-21 15:09:41 [INFO][TRAIN] epoch: 1, iter: 3290/20000, loss: 0.4970, lr: 0.008507, batch_cost: 0.2960, reader_cost: 0.13647, ips: 13.5137 samples/sec | ETA 01:22:262022-07-21 15:09:44 [INFO][TRAIN] epoch: 1, iter: 3300/20000, loss: 0.3668, lr: 0.008502, batch_cost: 0.2492, reader_cost: 0.11549, ips: 16.0526 samples/sec | ETA 01:09:212022-07-21 15:09:46 [INFO][TRAIN] epoch: 1, iter: 3310/20000, loss: 0.3535, lr: 0.008498, batch_cost: 0.2451, reader_cost: 0.11636, ips: 16.3201 samples/sec | ETA 01:08:102022-07-21 15:09:49 [INFO][TRAIN] epoch: 1, iter: 3320/20000, loss: 0.3139, lr: 0.008493, batch_cost: 0.2465, reader_cost: 0.11889, ips: 16.2254 samples/sec | ETA 01:08:322022-07-21 15:09:51 [INFO][TRAIN] epoch: 1, iter: 3330/20000, loss: 0.4525, lr: 0.008489, batch_cost: 0.2469, reader_cost: 0.11901, ips: 16.1990 samples/sec | ETA 01:08:362022-07-21 15:09:54 [INFO][TRAIN] epoch: 1, iter: 3340/20000, loss: 0.3956, lr: 0.008484, batch_cost: 0.2545, reader_cost: 0.11526, ips: 15.7196 samples/sec | ETA 01:10:392022-07-21 15:09:56 [INFO][TRAIN] epoch: 1, iter: 3350/20000, loss: 0.3553, lr: 0.008479, batch_cost: 0.2430, reader_cost: 0.11879, ips: 16.4589 samples/sec | ETA 01:07:262022-07-21 15:09:59 [INFO][TRAIN] epoch: 1, iter: 3360/20000, loss: 0.2864, lr: 0.008475, batch_cost: 0.2427, reader_cost: 0.11591, ips: 16.4789 samples/sec | ETA 01:07:192022-07-21 15:10:01 [INFO][TRAIN] epoch: 1, iter: 3370/20000, loss: 0.3513, lr: 0.008470, batch_cost: 0.2428, reader_cost: 0.12043, ips: 16.4723 samples/sec | ETA 01:07:182022-07-21 15:10:03 [INFO][TRAIN] epoch: 1, iter: 3380/20000, loss: 0.5496, lr: 0.008466, batch_cost: 0.2468, reader_cost: 0.12361, ips: 16.2046 samples/sec | ETA 01:08:222022-07-21 15:10:06 [INFO][TRAIN] epoch: 1, iter: 3390/20000, loss: 0.4567, lr: 0.008461, batch_cost: 0.2314, reader_cost: 0.10747, ips: 17.2866 samples/sec | ETA 01:04:032022-07-21 15:10:08 [INFO][TRAIN] epoch: 1, iter: 3400/20000, loss: 0.3446, lr: 0.008457, batch_cost: 0.2300, reader_cost: 0.10583, ips: 17.3882 samples/sec | ETA 01:03:382022-07-21 15:10:11 [INFO][TRAIN] epoch: 1, iter: 3410/20000, loss: 0.4683, lr: 0.008452, batch_cost: 0.2498, reader_cost: 0.12146, ips: 16.0114 samples/sec | ETA 01:09:042022-07-21 15:10:13 [INFO][TRAIN] epoch: 1, iter: 3420/20000, 
loss: 0.3715, lr: 0.008447, batch_cost: 0.2760, reader_cost: 0.14626, ips: 14.4949 samples/sec | ETA 01:16:152022-07-21 15:10:16 [INFO][TRAIN] epoch: 1, iter: 3430/20000, loss: 0.2925, lr: 0.008443, batch_cost: 0.2426, reader_cost: 0.11785, ips: 16.4884 samples/sec | ETA 01:06:592022-07-21 15:10:18 [INFO][TRAIN] epoch: 1, iter: 3440/20000, loss: 0.3652, lr: 0.008438, batch_cost: 0.2420, reader_cost: 0.11866, ips: 16.5270 samples/sec | ETA 01:06:472022-07-21 15:10:21 [INFO][TRAIN] epoch: 1, iter: 3450/20000, loss: 0.4360, lr: 0.008434, batch_cost: 0.3193, reader_cost: 0.13887, ips: 12.5285 samples/sec | ETA 01:28:032022-07-21 15:10:25 [INFO][TRAIN] epoch: 1, iter: 3460/20000, loss: 0.3057, lr: 0.008429, batch_cost: 0.3115, reader_cost: 0.16518, ips: 12.8408 samples/sec | ETA 01:25:522022-07-21 15:10:27 [INFO][TRAIN] epoch: 1, iter: 3470/20000, loss: 0.3782, lr: 0.008424, batch_cost: 0.2622, reader_cost: 0.13734, ips: 15.2554 samples/sec | ETA 01:12:142022-07-21 15:10:29 [INFO][TRAIN] epoch: 1, iter: 3480/20000, loss: 0.4177, lr: 0.008420, batch_cost: 0.2305, reader_cost: 0.10780, ips: 17.3554 samples/sec | ETA 01:03:272022-07-21 15:10:32 [INFO][TRAIN] epoch: 1, iter: 3490/20000, loss: 0.4046, lr: 0.008415, batch_cost: 0.2430, reader_cost: 0.11984, ips: 16.4613 samples/sec | ETA 01:06:512022-07-21 15:10:34 [INFO][TRAIN] epoch: 1, iter: 3500/20000, loss: 0.5678, lr: 0.008411, batch_cost: 0.2346, reader_cost: 0.11176, ips: 17.0526 samples/sec | ETA 01:04:302022-07-21 15:10:37 [INFO][TRAIN] epoch: 1, iter: 3510/20000, loss: 0.3199, lr: 0.008406, batch_cost: 0.2688, reader_cost: 0.14579, ips: 14.8797 samples/sec | ETA 01:13:522022-07-21 15:10:39 [INFO][TRAIN] epoch: 1, iter: 3520/20000, loss: 0.4008, lr: 0.008402, batch_cost: 0.2495, reader_cost: 0.12521, ips: 16.0340 samples/sec | ETA 01:08:312022-07-21 15:10:42 [INFO][TRAIN] epoch: 1, iter: 3530/20000, loss: 0.3773, lr: 0.008397, batch_cost: 0.2504, reader_cost: 0.12554, ips: 15.9746 samples/sec | ETA 01:08:442022-07-21 15:10:45 [INFO][TRAIN] epoch: 1, iter: 3540/20000, loss: 0.3974, lr: 0.008392, batch_cost: 0.3120, reader_cost: 0.15467, ips: 12.8215 samples/sec | ETA 01:25:352022-07-21 15:10:48 [INFO][TRAIN] epoch: 1, iter: 3550/20000, loss: 0.5607, lr: 0.008388, batch_cost: 0.2503, reader_cost: 0.11908, ips: 15.9779 samples/sec | ETA 01:08:382022-07-21 15:10:50 [INFO][TRAIN] epoch: 1, iter: 3560/20000, loss: 0.4736, lr: 0.008383, batch_cost: 0.2383, reader_cost: 0.11640, ips: 16.7887 samples/sec | ETA 01:05:162022-07-21 15:10:52 [INFO][TRAIN] epoch: 1, iter: 3570/20000, loss: 0.4516, lr: 0.008379, batch_cost: 0.2380, reader_cost: 0.10482, ips: 16.8095 samples/sec | ETA 01:05:092022-07-21 15:10:55 [INFO][TRAIN] epoch: 1, iter: 3580/20000, loss: 0.4875, lr: 0.008374, batch_cost: 0.2502, reader_cost: 0.10365, ips: 15.9847 samples/sec | ETA 01:08:282022-07-21 15:10:57 [INFO][TRAIN] epoch: 1, iter: 3590/20000, loss: 0.3739, lr: 0.008369, batch_cost: 0.2681, reader_cost: 0.13759, ips: 14.9211 samples/sec | ETA 01:13:192022-07-21 15:11:00 [INFO][TRAIN] epoch: 1, iter: 3600/20000, loss: 0.3562, lr: 0.008365, batch_cost: 0.2696, reader_cost: 0.13799, ips: 14.8369 samples/sec | ETA 01:13:412022-07-21 15:11:03 [INFO][TRAIN] epoch: 1, iter: 3610/20000, loss: 0.3799, lr: 0.008360, batch_cost: 0.2501, reader_cost: 0.12651, ips: 15.9953 samples/sec | ETA 01:08:182022-07-21 15:11:05 [INFO][TRAIN] epoch: 1, iter: 3620/20000, loss: 0.3751, lr: 0.008356, batch_cost: 0.2542, reader_cost: 0.13135, ips: 15.7363 samples/sec | ETA 01:09:232022-07-21 15:11:08 
[INFO][TRAIN] epoch: 1, iter: 3630/20000, loss: 0.3792, lr: 0.008351, batch_cost: 0.2369, reader_cost: 0.11338, ips: 16.8852 samples/sec | ETA 01:04:372022-07-21 15:11:10 [INFO][TRAIN] epoch: 1, iter: 3640/20000, loss: 0.5071, lr: 0.008346, batch_cost: 0.2429, reader_cost: 0.12067, ips: 16.4672 samples/sec | ETA 01:06:132022-07-21 15:11:12 [INFO][TRAIN] epoch: 1, iter: 3650/20000, loss: 0.4547, lr: 0.008342, batch_cost: 0.2361, reader_cost: 0.11423, ips: 16.9431 samples/sec | ETA 01:04:192022-07-21 15:11:15 [INFO][TRAIN] epoch: 1, iter: 3660/20000, loss: 0.3599, lr: 0.008337, batch_cost: 0.2553, reader_cost: 0.11689, ips: 15.6654 samples/sec | ETA 01:09:322022-07-21 15:11:17 [INFO][TRAIN] epoch: 1, iter: 3670/20000, loss: 0.3250, lr: 0.008333, batch_cost: 0.2538, reader_cost: 0.12995, ips: 15.7627 samples/sec | ETA 01:09:032022-07-21 15:11:20 [INFO][TRAIN] epoch: 1, iter: 3680/20000, loss: 0.3162, lr: 0.008328, batch_cost: 0.2244, reader_cost: 0.10295, ips: 17.8235 samples/sec | ETA 01:01:022022-07-21 15:11:22 [INFO][TRAIN] epoch: 1, iter: 3690/20000, loss: 0.5091, lr: 0.008323, batch_cost: 0.2178, reader_cost: 0.09620, ips: 18.3664 samples/sec | ETA 00:59:122022-07-21 15:11:24 [INFO][TRAIN] epoch: 1, iter: 3700/20000, loss: 0.4181, lr: 0.008319, batch_cost: 0.2374, reader_cost: 0.11249, ips: 16.8500 samples/sec | ETA 01:04:292022-07-21 15:11:27 [INFO][TRAIN] epoch: 1, iter: 3710/20000, loss: 0.4653, lr: 0.008314, batch_cost: 0.3121, reader_cost: 0.15375, ips: 12.8181 samples/sec | ETA 01:24:432022-07-21 15:11:31 [INFO][TRAIN] epoch: 1, iter: 3720/20000, loss: 0.4560, lr: 0.008310, batch_cost: 0.3251, reader_cost: 0.15987, ips: 12.3042 samples/sec | ETA 01:28:122022-07-21 15:11:33 [INFO][TRAIN] epoch: 1, iter: 3730/20000, loss: 0.5724, lr: 0.008305, batch_cost: 0.2188, reader_cost: 0.09669, ips: 18.2796 samples/sec | ETA 00:59:202022-07-21 15:11:35 [INFO][TRAIN] epoch: 1, iter: 3740/20000, loss: 0.3923, lr: 0.008301, batch_cost: 0.2312, reader_cost: 0.10808, ips: 17.2975 samples/sec | ETA 01:02:402022-07-21 15:11:38 [INFO][TRAIN] epoch: 1, iter: 3750/20000, loss: 0.5189, lr: 0.008296, batch_cost: 0.2368, reader_cost: 0.11319, ips: 16.8946 samples/sec | ETA 01:04:072022-07-21 15:11:40 [INFO][TRAIN] epoch: 1, iter: 3760/20000, loss: 0.3511, lr: 0.008291, batch_cost: 0.2488, reader_cost: 0.12282, ips: 16.0798 samples/sec | ETA 01:07:192022-07-21 15:11:43 [INFO][TRAIN] epoch: 1, iter: 3770/20000, loss: 0.3467, lr: 0.008287, batch_cost: 0.2530, reader_cost: 0.12898, ips: 15.8103 samples/sec | ETA 01:08:262022-07-21 15:11:45 [INFO][TRAIN] epoch: 1, iter: 3780/20000, loss: 0.5450, lr: 0.008282, batch_cost: 0.2489, reader_cost: 0.12431, ips: 16.0705 samples/sec | ETA 01:07:172022-07-21 15:11:48 [INFO][TRAIN] epoch: 1, iter: 3790/20000, loss: 0.3520, lr: 0.008278, batch_cost: 0.3173, reader_cost: 0.17016, ips: 12.6081 samples/sec | ETA 01:25:422022-07-21 15:11:51 [INFO][TRAIN] epoch: 1, iter: 3800/20000, loss: 0.4174, lr: 0.008273, batch_cost: 0.2581, reader_cost: 0.12382, ips: 15.4997 samples/sec | ETA 01:09:402022-07-21 15:11:53 [INFO][TRAIN] epoch: 1, iter: 3810/20000, loss: 0.3643, lr: 0.008268, batch_cost: 0.2344, reader_cost: 0.11195, ips: 17.0658 samples/sec | ETA 01:03:142022-07-21 15:11:55 [INFO][TRAIN] epoch: 1, iter: 3820/20000, loss: 0.4225, lr: 0.008264, batch_cost: 0.2220, reader_cost: 0.09959, ips: 18.0177 samples/sec | ETA 00:59:522022-07-21 15:11:58 [INFO][TRAIN] epoch: 1, iter: 3830/20000, loss: 0.4806, lr: 0.008259, batch_cost: 0.2449, reader_cost: 0.12058, ips: 16.3342 
samples/sec | ETA 01:05:592022-07-21 15:12:00 [INFO][TRAIN] epoch: 1, iter: 3840/20000, loss: 0.4841, lr: 0.008255, batch_cost: 0.2516, reader_cost: 0.12001, ips: 15.8976 samples/sec | ETA 01:07:462022-07-21 15:12:03 [INFO][TRAIN] epoch: 1, iter: 3850/20000, loss: 0.3473, lr: 0.008250, batch_cost: 0.2560, reader_cost: 0.12997, ips: 15.6278 samples/sec | ETA 01:08:532022-07-21 15:12:06 [INFO][TRAIN] epoch: 1, iter: 3860/20000, loss: 0.3111, lr: 0.008245, batch_cost: 0.2639, reader_cost: 0.13834, ips: 15.1549 samples/sec | ETA 01:11:002022-07-21 15:12:08 [INFO][TRAIN] epoch: 1, iter: 3870/20000, loss: 0.4574, lr: 0.008241, batch_cost: 0.2412, reader_cost: 0.11422, ips: 16.5823 samples/sec | ETA 01:04:502022-07-21 15:12:10 [INFO][TRAIN] epoch: 1, iter: 3880/20000, loss: 0.3299, lr: 0.008236, batch_cost: 0.2274, reader_cost: 0.10385, ips: 17.5868 samples/sec | ETA 01:01:062022-07-21 15:12:13 [INFO][TRAIN] epoch: 1, iter: 3890/20000, loss: 0.3384, lr: 0.008232, batch_cost: 0.2360, reader_cost: 0.11281, ips: 16.9478 samples/sec | ETA 01:03:222022-07-21 15:12:15 [INFO][TRAIN] epoch: 1, iter: 3900/20000, loss: 0.4258, lr: 0.008227, batch_cost: 0.2426, reader_cost: 0.11998, ips: 16.4885 samples/sec | ETA 01:05:052022-07-21 15:12:17 [INFO][TRAIN] epoch: 1, iter: 3910/20000, loss: 0.3746, lr: 0.008222, batch_cost: 0.2416, reader_cost: 0.11829, ips: 16.5560 samples/sec | ETA 01:04:472022-07-21 15:12:20 [INFO][TRAIN] epoch: 1, iter: 3920/20000, loss: 0.5183, lr: 0.008218, batch_cost: 0.2990, reader_cost: 0.15620, ips: 13.3796 samples/sec | ETA 01:20:072022-07-21 15:12:23 [INFO][TRAIN] epoch: 1, iter: 3930/20000, loss: 0.4921, lr: 0.008213, batch_cost: 0.2354, reader_cost: 0.10985, ips: 16.9924 samples/sec | ETA 01:03:022022-07-21 15:12:25 [INFO][TRAIN] epoch: 1, iter: 3940/20000, loss: 0.2848, lr: 0.008209, batch_cost: 0.2443, reader_cost: 0.11316, ips: 16.3739 samples/sec | ETA 01:05:232022-07-21 15:12:28 [INFO][TRAIN] epoch: 1, iter: 3950/20000, loss: 0.3905, lr: 0.008204, batch_cost: 0.2505, reader_cost: 0.12296, ips: 15.9700 samples/sec | ETA 01:07:002022-07-21 15:12:30 [INFO][TRAIN] epoch: 1, iter: 3960/20000, loss: 0.5582, lr: 0.008199, batch_cost: 0.2369, reader_cost: 0.11257, ips: 16.8871 samples/sec | ETA 01:03:192022-07-21 15:12:33 [INFO][TRAIN] epoch: 1, iter: 3970/20000, loss: 0.3308, lr: 0.008195, batch_cost: 0.2711, reader_cost: 0.14767, ips: 14.7564 samples/sec | ETA 01:12:252022-07-21 15:12:36 [INFO][TRAIN] epoch: 1, iter: 3980/20000, loss: 0.3520, lr: 0.008190, batch_cost: 0.3625, reader_cost: 0.18163, ips: 11.0350 samples/sec | ETA 01:36:462022-07-21 15:12:39 [INFO][TRAIN] epoch: 1, iter: 3990/20000, loss: 0.4225, lr: 0.008186, batch_cost: 0.2915, reader_cost: 0.15763, ips: 13.7239 samples/sec | ETA 01:17:462022-07-21 15:12:42 [INFO][TRAIN] epoch: 1, iter: 4000/20000, loss: 0.4342, lr: 0.008181, batch_cost: 0.2236, reader_cost: 0.09971, ips: 17.8880 samples/sec | ETA 00:59:372022-07-21 15:12:42 [INFO]Start evaluating (total_samples: 7361, total_iters: 7361)...7361/7361 [==============================] - 456s 62ms/step - batch_cost: 0.0617 - reader cost: 0.0083: 5:19 - batch_cost: 0.0579 - rea - ETA: 5:11 - - ETA: 5:08 - batch_cos - ETA: 5:07 - batch_cost: 0.0586 - ETA: 5:08 - batch_cost: 0.0591 - reader - ETA: 5:06 - batch_cost: 0. 
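Every [TRAIN] line above reports the same fields: the current iteration, the training loss, the decaying learning rate, the time per batch (batch_cost), the time spent reading data (reader_cost), the throughput in images per second (ips), and the estimated time remaining (ETA). The values are internally consistent: ips is approximately batch_size / batch_cost (the numbers match a batch size of 4), and ETA is approximately (20000 - iter) x batch_cost. As a minimal sketch, assuming the console output above has been redirected to a hypothetical train.log file, the iteration/loss pairs can be pulled out with a regular expression and plotted with matplotlib (already installed with PaddleSeg's requirements):

```python
import re
import matplotlib.pyplot as plt

# Hypothetical path: assumes the console output above was saved to this file.
LOG_FILE = 'train.log'

# Each record looks like: "[TRAIN] epoch: 1, iter: 2180/20000, loss: 0.6096, ..."
pattern = re.compile(r'\[TRAIN\] epoch: \d+, iter: (\d+)/\d+, loss: ([\d.]+)')

iters, losses = [], []
with open(LOG_FILE) as f:
    for line in f:
        # a single physical line may hold several concatenated records
        for it, loss in pattern.findall(line):
            iters.append(int(it))
            losses.append(float(loss))

plt.plot(iters, losses, linewidth=0.8)
plt.xlabel('iteration')
plt.ylabel('training loss')
plt.savefig('loss_curve.png')
```

Alternatively, if training is launched with VisualDL logging enabled, the same loss curve can be inspected interactively in VisualDL instead of being parsed from the text log.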
2022-07-21 15:12:42 [INFO]Start evaluating (total_samples: 7361, total_iters: 7361)...
7361/7361 [==============================] - 456s 62ms/step - batch_cost: 0.0617 - reader cost: 0.0083
... (per-batch evaluation progress output omitted) ...
2022-07-21 15:20:18 [INFO][EVAL] #Images: 7361 mIoU: 0.0894 Acc: 0.9832 Kappa: 0.2964 Dice: 0.1093
2022-07-21 15:20:18 [INFO][EVAL] Class IoU: [0.9849 0.1079 0.0229 0.008 0. 0. 0. 0. 0.1458 0.0001 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.519 ]
2022-07-21 15:20:18 [INFO][EVAL] Class Precision: [0.9857 0.3042 0.2541 0.4175 0. 0. 0. 0. 0.5409 0.0209 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.768 ]
2022-07-21 15:20:18 [INFO][EVAL] Class Recall: [0.9991 0.1433 0.0246 0.0081 0. 0. 0. 0. 0.1664 0.0001 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.6155]
2022-07-21 15:20:18 [INFO]The model with the best validation mIoU (0.0894) was saved at iter 4000.
2022-07-21 15:20:20 [INFO][TRAIN] epoch: 1, iter: 4010/20000, loss: 0.4041, lr: 0.008176, batch_cost: 0.1634, reader_cost: 0.02467, ips: 24.4795 samples/sec | ETA 00:43:32
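Before the training log continues, a note on the [EVAL] numbers: at this first checkpoint the per-class IoU shows that only class 0 (presumably the background) and the last class are segmented well, which is why pixel accuracy is already high (0.9832) while mIoU stays low (0.0894). For reference, a minimal NumPy sketch of the standard pixel-level definitions behind these metrics, computed from a per-class confusion matrix; this illustrates the formulas and is not PaddleSeg's own implementation:

```python
import numpy as np

def eval_metrics(conf):
    """conf: (C, C) confusion matrix with conf[i, j] = number of pixels
    whose ground-truth class is i and whose predicted class is j."""
    conf = conf.astype(np.float64)
    tp = np.diag(conf)                      # correctly classified pixels per class
    fp = conf.sum(axis=0) - tp              # predicted as the class, but wrong
    fn = conf.sum(axis=1) - tp              # belonging to the class, but missed
    eps = 1e-12

    class_iou = tp / (tp + fp + fn + eps)
    class_precision = tp / (tp + fp + eps)
    class_recall = tp / (tp + fn + eps)
    class_dice = 2 * tp / (2 * tp + fp + fn + eps)

    total = conf.sum()
    acc = tp.sum() / total                  # overall pixel accuracy
    p_e = (conf.sum(axis=0) * conf.sum(axis=1)).sum() / (total * total)
    kappa = (acc - p_e) / (1 - p_e + eps)   # Cohen's kappa

    return {'mIoU': class_iou.mean(), 'Acc': acc, 'Kappa': kappa,
            'Dice': class_dice.mean(), 'Class IoU': class_iou,
            'Class Precision': class_precision, 'Class Recall': class_recall}
```

With 20 classes, a single well-segmented background class dominates Acc but contributes only 1/20 of the mIoU, which explains the gap between the two numbers this early in training.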
... (training log for iterations 4020-6200 omitted; it keeps the same format, with the loss fluctuating between roughly 0.28 and 0.64 and the learning rate decaying from about 0.00817 to 0.00716; the epoch counter switches from 1 to 2 at iteration 4960) ...
2022-07-21 15:29:30 [INFO][TRAIN] epoch: 2, iter: 6210/20000, loss: 0.5062, lr: 0.007157, batch_cost: 0.2191, reader_cost: 0.09666, ips: 18.2534
samples/sec | ETA 00:50:212022-07-21 15:29:32 [INFO][TRAIN] epoch: 2, iter: 6220/20000, loss: 0.4042, lr: 0.007152, batch_cost: 0.2349, reader_cost: 0.11141, ips: 17.0302 samples/sec | ETA 00:53:562022-07-21 15:29:35 [INFO][TRAIN] epoch: 2, iter: 6230/20000, loss: 0.3566, lr: 0.007147, batch_cost: 0.2344, reader_cost: 0.10760, ips: 17.0631 samples/sec | ETA 00:53:482022-07-21 15:29:37 [INFO][TRAIN] epoch: 2, iter: 6240/20000, loss: 0.3222, lr: 0.007143, batch_cost: 0.2779, reader_cost: 0.15093, ips: 14.3943 samples/sec | ETA 01:03:432022-07-21 15:29:40 [INFO][TRAIN] epoch: 2, iter: 6250/20000, loss: 0.6252, lr: 0.007138, batch_cost: 0.2204, reader_cost: 0.09867, ips: 18.1477 samples/sec | ETA 00:50:302022-07-21 15:29:42 [INFO][TRAIN] epoch: 2, iter: 6260/20000, loss: 0.4654, lr: 0.007133, batch_cost: 0.2322, reader_cost: 0.10986, ips: 17.2274 samples/sec | ETA 00:53:102022-07-21 15:29:44 [INFO][TRAIN] epoch: 2, iter: 6270/20000, loss: 0.3417, lr: 0.007129, batch_cost: 0.2425, reader_cost: 0.11935, ips: 16.4967 samples/sec | ETA 00:55:292022-07-21 15:29:47 [INFO][TRAIN] epoch: 2, iter: 6280/20000, loss: 0.4055, lr: 0.007124, batch_cost: 0.2401, reader_cost: 0.11454, ips: 16.6604 samples/sec | ETA 00:54:542022-07-21 15:29:49 [INFO][TRAIN] epoch: 2, iter: 6290/20000, loss: 0.2659, lr: 0.007119, batch_cost: 0.2366, reader_cost: 0.11102, ips: 16.9042 samples/sec | ETA 00:54:042022-07-21 15:29:52 [INFO][TRAIN] epoch: 2, iter: 6300/20000, loss: 0.3332, lr: 0.007115, batch_cost: 0.2472, reader_cost: 0.10840, ips: 16.1789 samples/sec | ETA 00:56:272022-07-21 15:29:55 [INFO][TRAIN] epoch: 2, iter: 6310/20000, loss: 0.2858, lr: 0.007110, batch_cost: 0.3016, reader_cost: 0.14486, ips: 13.2616 samples/sec | ETA 01:08:492022-07-21 15:29:57 [INFO][TRAIN] epoch: 2, iter: 6320/20000, loss: 0.3522, lr: 0.007105, batch_cost: 0.2498, reader_cost: 0.12548, ips: 16.0159 samples/sec | ETA 00:56:562022-07-21 15:29:59 [INFO][TRAIN] epoch: 2, iter: 6330/20000, loss: 0.3278, lr: 0.007101, batch_cost: 0.2126, reader_cost: 0.09122, ips: 18.8142 samples/sec | ETA 00:48:262022-07-21 15:30:01 [INFO][TRAIN] epoch: 2, iter: 6340/20000, loss: 0.5020, lr: 0.007096, batch_cost: 0.2230, reader_cost: 0.09922, ips: 17.9361 samples/sec | ETA 00:50:462022-07-21 15:30:04 [INFO][TRAIN] epoch: 2, iter: 6350/20000, loss: 0.3229, lr: 0.007091, batch_cost: 0.2230, reader_cost: 0.10007, ips: 17.9377 samples/sec | ETA 00:50:432022-07-21 15:30:06 [INFO][TRAIN] epoch: 2, iter: 6360/20000, loss: 0.4023, lr: 0.007087, batch_cost: 0.2410, reader_cost: 0.11804, ips: 16.6000 samples/sec | ETA 00:54:462022-07-21 15:30:08 [INFO][TRAIN] epoch: 2, iter: 6370/20000, loss: 0.3320, lr: 0.007082, batch_cost: 0.2278, reader_cost: 0.10464, ips: 17.5570 samples/sec | ETA 00:51:452022-07-21 15:30:11 [INFO][TRAIN] epoch: 2, iter: 6380/20000, loss: 0.4018, lr: 0.007077, batch_cost: 0.2465, reader_cost: 0.12232, ips: 16.2254 samples/sec | ETA 00:55:572022-07-21 15:30:13 [INFO][TRAIN] epoch: 2, iter: 6390/20000, loss: 0.3027, lr: 0.007073, batch_cost: 0.2460, reader_cost: 0.12364, ips: 16.2610 samples/sec | ETA 00:55:472022-07-21 15:30:15 [INFO][TRAIN] epoch: 2, iter: 6400/20000, loss: 0.4074, lr: 0.007068, batch_cost: 0.2258, reader_cost: 0.10257, ips: 17.7144 samples/sec | ETA 00:51:102022-07-21 15:30:18 [INFO][TRAIN] epoch: 2, iter: 6410/20000, loss: 0.2781, lr: 0.007063, batch_cost: 0.2333, reader_cost: 0.10916, ips: 17.1432 samples/sec | ETA 00:52:502022-07-21 15:30:20 [INFO][TRAIN] epoch: 2, iter: 6420/20000, loss: 0.3350, lr: 0.007058, batch_cost: 
0.2306, reader_cost: 0.10815, ips: 17.3487 samples/sec | ETA 00:52:112022-07-21 15:30:22 [INFO][TRAIN] epoch: 2, iter: 6430/20000, loss: 0.3672, lr: 0.007054, batch_cost: 0.2367, reader_cost: 0.11009, ips: 16.8966 samples/sec | ETA 00:53:322022-07-21 15:30:26 [INFO][TRAIN] epoch: 2, iter: 6440/20000, loss: 0.3798, lr: 0.007049, batch_cost: 0.3796, reader_cost: 0.16630, ips: 10.5381 samples/sec | ETA 01:25:472022-07-21 15:30:29 [INFO][TRAIN] epoch: 2, iter: 6450/20000, loss: 0.3803, lr: 0.007044, batch_cost: 0.3094, reader_cost: 0.15834, ips: 12.9279 samples/sec | ETA 01:09:522022-07-21 15:30:32 [INFO][TRAIN] epoch: 2, iter: 6460/20000, loss: 0.3558, lr: 0.007040, batch_cost: 0.2249, reader_cost: 0.10135, ips: 17.7841 samples/sec | ETA 00:50:452022-07-21 15:30:34 [INFO][TRAIN] epoch: 2, iter: 6470/20000, loss: 0.5293, lr: 0.007035, batch_cost: 0.2245, reader_cost: 0.10168, ips: 17.8178 samples/sec | ETA 00:50:372022-07-21 15:30:36 [INFO][TRAIN] epoch: 2, iter: 6480/20000, loss: 0.4786, lr: 0.007030, batch_cost: 0.2322, reader_cost: 0.10624, ips: 17.2272 samples/sec | ETA 00:52:192022-07-21 15:30:39 [INFO][TRAIN] epoch: 2, iter: 6490/20000, loss: 0.3329, lr: 0.007026, batch_cost: 0.2670, reader_cost: 0.13913, ips: 14.9824 samples/sec | ETA 01:00:062022-07-21 15:30:41 [INFO][TRAIN] epoch: 2, iter: 6500/20000, loss: 0.5508, lr: 0.007021, batch_cost: 0.2443, reader_cost: 0.11347, ips: 16.3752 samples/sec | ETA 00:54:572022-07-21 15:30:44 [INFO][TRAIN] epoch: 2, iter: 6510/20000, loss: 0.4365, lr: 0.007016, batch_cost: 0.2287, reader_cost: 0.10330, ips: 17.4884 samples/sec | ETA 00:51:252022-07-21 15:30:46 [INFO][TRAIN] epoch: 2, iter: 6520/20000, loss: 0.3806, lr: 0.007012, batch_cost: 0.2295, reader_cost: 0.10389, ips: 17.4276 samples/sec | ETA 00:51:332022-07-21 15:30:48 [INFO][TRAIN] epoch: 2, iter: 6530/20000, loss: 0.4894, lr: 0.007007, batch_cost: 0.2373, reader_cost: 0.11379, ips: 16.8541 samples/sec | ETA 00:53:162022-07-21 15:30:51 [INFO][TRAIN] epoch: 2, iter: 6540/20000, loss: 0.3412, lr: 0.007002, batch_cost: 0.2369, reader_cost: 0.11469, ips: 16.8848 samples/sec | ETA 00:53:082022-07-21 15:30:53 [INFO][TRAIN] epoch: 2, iter: 6550/20000, loss: 0.2933, lr: 0.006998, batch_cost: 0.2646, reader_cost: 0.12953, ips: 15.1166 samples/sec | ETA 00:59:192022-07-21 15:30:56 [INFO][TRAIN] epoch: 2, iter: 6560/20000, loss: 0.3020, lr: 0.006993, batch_cost: 0.2739, reader_cost: 0.13773, ips: 14.6032 samples/sec | ETA 01:01:212022-07-21 15:30:59 [INFO][TRAIN] epoch: 2, iter: 6570/20000, loss: 0.4782, lr: 0.006988, batch_cost: 0.2658, reader_cost: 0.13235, ips: 15.0464 samples/sec | ETA 00:59:302022-07-21 15:31:01 [INFO][TRAIN] epoch: 2, iter: 6580/20000, loss: 0.3459, lr: 0.006984, batch_cost: 0.2291, reader_cost: 0.10386, ips: 17.4576 samples/sec | ETA 00:51:142022-07-21 15:31:03 [INFO][TRAIN] epoch: 2, iter: 6590/20000, loss: 0.3739, lr: 0.006979, batch_cost: 0.2280, reader_cost: 0.10387, ips: 17.5423 samples/sec | ETA 00:50:572022-07-21 15:31:06 [INFO][TRAIN] epoch: 2, iter: 6600/20000, loss: 0.3185, lr: 0.006974, batch_cost: 0.2263, reader_cost: 0.10475, ips: 17.6731 samples/sec | ETA 00:50:322022-07-21 15:31:08 [INFO][TRAIN] epoch: 2, iter: 6610/20000, loss: 0.5080, lr: 0.006970, batch_cost: 0.2301, reader_cost: 0.10776, ips: 17.3827 samples/sec | ETA 00:51:212022-07-21 15:31:10 [INFO][TRAIN] epoch: 2, iter: 6620/20000, loss: 0.4798, lr: 0.006965, batch_cost: 0.2330, reader_cost: 0.11024, ips: 17.1642 samples/sec | ETA 00:51:582022-07-21 15:31:13 [INFO][TRAIN] epoch: 2, iter: 6630/20000, 
loss: 0.3907, lr: 0.006960, batch_cost: 0.2349, reader_cost: 0.11189, ips: 17.0318 samples/sec | ETA 00:52:202022-07-21 15:31:15 [INFO][TRAIN] epoch: 2, iter: 6640/20000, loss: 0.3904, lr: 0.006955, batch_cost: 0.2374, reader_cost: 0.11479, ips: 16.8476 samples/sec | ETA 00:52:512022-07-21 15:31:17 [INFO][TRAIN] epoch: 2, iter: 6650/20000, loss: 0.4754, lr: 0.006951, batch_cost: 0.2165, reader_cost: 0.09473, ips: 18.4784 samples/sec | ETA 00:48:092022-07-21 15:31:19 [INFO][TRAIN] epoch: 2, iter: 6660/20000, loss: 0.3973, lr: 0.006946, batch_cost: 0.2231, reader_cost: 0.09956, ips: 17.9285 samples/sec | ETA 00:49:362022-07-21 15:31:22 [INFO][TRAIN] epoch: 2, iter: 6670/20000, loss: 0.3188, lr: 0.006941, batch_cost: 0.2422, reader_cost: 0.11782, ips: 16.5133 samples/sec | ETA 00:53:482022-07-21 15:31:24 [INFO][TRAIN] epoch: 2, iter: 6680/20000, loss: 0.5073, lr: 0.006937, batch_cost: 0.2397, reader_cost: 0.11378, ips: 16.6856 samples/sec | ETA 00:53:132022-07-21 15:31:26 [INFO][TRAIN] epoch: 2, iter: 6690/20000, loss: 0.3654, lr: 0.006932, batch_cost: 0.2301, reader_cost: 0.10683, ips: 17.3831 samples/sec | ETA 00:51:022022-07-21 15:31:29 [INFO][TRAIN] epoch: 2, iter: 6700/20000, loss: 0.4763, lr: 0.006927, batch_cost: 0.2476, reader_cost: 0.11176, ips: 16.1574 samples/sec | ETA 00:54:522022-07-21 15:31:32 [INFO][TRAIN] epoch: 2, iter: 6710/20000, loss: 0.3630, lr: 0.006923, batch_cost: 0.3222, reader_cost: 0.15137, ips: 12.4137 samples/sec | ETA 01:11:222022-07-21 15:31:36 [INFO][TRAIN] epoch: 2, iter: 6720/20000, loss: 0.3713, lr: 0.006918, batch_cost: 0.3409, reader_cost: 0.17863, ips: 11.7336 samples/sec | ETA 01:15:272022-07-21 15:31:38 [INFO][TRAIN] epoch: 2, iter: 6730/20000, loss: 0.4624, lr: 0.006913, batch_cost: 0.2346, reader_cost: 0.11056, ips: 17.0489 samples/sec | ETA 00:51:532022-07-21 15:31:40 [INFO][TRAIN] epoch: 2, iter: 6740/20000, loss: 0.3269, lr: 0.006909, batch_cost: 0.2424, reader_cost: 0.12124, ips: 16.5028 samples/sec | ETA 00:53:332022-07-21 15:31:43 [INFO][TRAIN] epoch: 2, iter: 6750/20000, loss: 0.4448, lr: 0.006904, batch_cost: 0.2418, reader_cost: 0.11775, ips: 16.5428 samples/sec | ETA 00:53:232022-07-21 15:31:45 [INFO][TRAIN] epoch: 2, iter: 6760/20000, loss: 0.3073, lr: 0.006899, batch_cost: 0.2401, reader_cost: 0.11143, ips: 16.6620 samples/sec | ETA 00:52:582022-07-21 15:31:47 [INFO][TRAIN] epoch: 2, iter: 6770/20000, loss: 0.3713, lr: 0.006895, batch_cost: 0.2375, reader_cost: 0.10728, ips: 16.8456 samples/sec | ETA 00:52:212022-07-21 15:31:50 [INFO][TRAIN] epoch: 2, iter: 6780/20000, loss: 0.3410, lr: 0.006890, batch_cost: 0.2395, reader_cost: 0.11575, ips: 16.7001 samples/sec | ETA 00:52:462022-07-21 15:31:52 [INFO][TRAIN] epoch: 2, iter: 6790/20000, loss: 0.3680, lr: 0.006885, batch_cost: 0.2391, reader_cost: 0.11785, ips: 16.7317 samples/sec | ETA 00:52:382022-07-21 15:31:55 [INFO][TRAIN] epoch: 2, iter: 6800/20000, loss: 0.5045, lr: 0.006880, batch_cost: 0.2253, reader_cost: 0.10000, ips: 17.7566 samples/sec | ETA 00:49:332022-07-21 15:31:57 [INFO][TRAIN] epoch: 2, iter: 6810/20000, loss: 0.3553, lr: 0.006876, batch_cost: 0.2863, reader_cost: 0.13632, ips: 13.9714 samples/sec | ETA 01:02:562022-07-21 15:32:00 [INFO][TRAIN] epoch: 2, iter: 6820/20000, loss: 0.4394, lr: 0.006871, batch_cost: 0.2254, reader_cost: 0.09981, ips: 17.7441 samples/sec | ETA 00:49:312022-07-21 15:32:02 [INFO][TRAIN] epoch: 2, iter: 6830/20000, loss: 0.4109, lr: 0.006866, batch_cost: 0.2639, reader_cost: 0.11929, ips: 15.1562 samples/sec | ETA 00:57:552022-07-21 15:32:05 
[INFO][TRAIN] epoch: 2, iter: 6840/20000, loss: 0.4091, lr: 0.006862, batch_cost: 0.2323, reader_cost: 0.11129, ips: 17.2187 samples/sec | ETA 00:50:572022-07-21 15:32:07 [INFO][TRAIN] epoch: 2, iter: 6850/20000, loss: 0.3348, lr: 0.006857, batch_cost: 0.2203, reader_cost: 0.10030, ips: 18.1532 samples/sec | ETA 00:48:172022-07-21 15:32:09 [INFO][TRAIN] epoch: 2, iter: 6860/20000, loss: 0.4871, lr: 0.006852, batch_cost: 0.2327, reader_cost: 0.10903, ips: 17.1882 samples/sec | ETA 00:50:572022-07-21 15:32:12 [INFO][TRAIN] epoch: 2, iter: 6870/20000, loss: 0.3589, lr: 0.006848, batch_cost: 0.2375, reader_cost: 0.11095, ips: 16.8451 samples/sec | ETA 00:51:572022-07-21 15:32:14 [INFO][TRAIN] epoch: 2, iter: 6880/20000, loss: 0.3127, lr: 0.006843, batch_cost: 0.2106, reader_cost: 0.08431, ips: 18.9931 samples/sec | ETA 00:46:032022-07-21 15:32:16 [INFO][TRAIN] epoch: 2, iter: 6890/20000, loss: 0.4708, lr: 0.006838, batch_cost: 0.2371, reader_cost: 0.10906, ips: 16.8734 samples/sec | ETA 00:51:472022-07-21 15:32:18 [INFO][TRAIN] epoch: 2, iter: 6900/20000, loss: 0.3590, lr: 0.006834, batch_cost: 0.2223, reader_cost: 0.09993, ips: 17.9914 samples/sec | ETA 00:48:322022-07-21 15:32:21 [INFO][TRAIN] epoch: 2, iter: 6910/20000, loss: 0.3546, lr: 0.006829, batch_cost: 0.2304, reader_cost: 0.10793, ips: 17.3615 samples/sec | ETA 00:50:152022-07-21 15:32:23 [INFO][TRAIN] epoch: 2, iter: 6920/20000, loss: 0.4434, lr: 0.006824, batch_cost: 0.2357, reader_cost: 0.10859, ips: 16.9676 samples/sec | ETA 00:51:232022-07-21 15:32:25 [INFO][TRAIN] epoch: 2, iter: 6930/20000, loss: 0.3885, lr: 0.006819, batch_cost: 0.2202, reader_cost: 0.09591, ips: 18.1615 samples/sec | ETA 00:47:582022-07-21 15:32:28 [INFO][TRAIN] epoch: 2, iter: 6940/20000, loss: 0.2846, lr: 0.006815, batch_cost: 0.2415, reader_cost: 0.11246, ips: 16.5612 samples/sec | ETA 00:52:342022-07-21 15:32:30 [INFO][TRAIN] epoch: 2, iter: 6950/20000, loss: 0.4283, lr: 0.006810, batch_cost: 0.2324, reader_cost: 0.10792, ips: 17.2152 samples/sec | ETA 00:50:322022-07-21 15:32:32 [INFO][TRAIN] epoch: 2, iter: 6960/20000, loss: 0.4757, lr: 0.006805, batch_cost: 0.2465, reader_cost: 0.11929, ips: 16.2254 samples/sec | ETA 00:53:342022-07-21 15:32:35 [INFO][TRAIN] epoch: 2, iter: 6970/20000, loss: 0.3434, lr: 0.006801, batch_cost: 0.2578, reader_cost: 0.11153, ips: 15.5159 samples/sec | ETA 00:55:592022-07-21 15:32:37 [INFO][TRAIN] epoch: 2, iter: 6980/20000, loss: 0.4000, lr: 0.006796, batch_cost: 0.2313, reader_cost: 0.10361, ips: 17.2901 samples/sec | ETA 00:50:122022-07-21 15:32:41 [INFO][TRAIN] epoch: 2, iter: 6990/20000, loss: 0.3531, lr: 0.006791, batch_cost: 0.3619, reader_cost: 0.17299, ips: 11.0534 samples/sec | ETA 01:18:282022-07-21 15:32:43 [INFO][TRAIN] epoch: 2, iter: 7000/20000, loss: 0.3181, lr: 0.006787, batch_cost: 0.2608, reader_cost: 0.12082, ips: 15.3372 samples/sec | ETA 00:56:302022-07-21 15:32:46 [INFO][TRAIN] epoch: 2, iter: 7010/20000, loss: 0.4069, lr: 0.006782, batch_cost: 0.2448, reader_cost: 0.11956, ips: 16.3398 samples/sec | ETA 00:52:592022-07-21 15:32:48 [INFO][TRAIN] epoch: 2, iter: 7020/20000, loss: 0.4162, lr: 0.006777, batch_cost: 0.2301, reader_cost: 0.10395, ips: 17.3832 samples/sec | ETA 00:49:462022-07-21 15:32:50 [INFO][TRAIN] epoch: 2, iter: 7030/20000, loss: 0.4281, lr: 0.006773, batch_cost: 0.2239, reader_cost: 0.09777, ips: 17.8620 samples/sec | ETA 00:48:242022-07-21 15:32:53 [INFO][TRAIN] epoch: 2, iter: 7040/20000, loss: 0.3480, lr: 0.006768, batch_cost: 0.2610, reader_cost: 0.13134, ips: 15.3233 
samples/sec | ETA 00:56:232022-07-21 15:32:55 [INFO][TRAIN] epoch: 2, iter: 7050/20000, loss: 0.4102, lr: 0.006763, batch_cost: 0.2447, reader_cost: 0.12001, ips: 16.3442 samples/sec | ETA 00:52:492022-07-21 15:32:58 [INFO][TRAIN] epoch: 2, iter: 7060/20000, loss: 0.4236, lr: 0.006758, batch_cost: 0.2739, reader_cost: 0.13169, ips: 14.6030 samples/sec | ETA 00:59:042022-07-21 15:33:01 [INFO][TRAIN] epoch: 2, iter: 7070/20000, loss: 0.4329, lr: 0.006754, batch_cost: 0.2470, reader_cost: 0.11624, ips: 16.1927 samples/sec | ETA 00:53:142022-07-21 15:33:03 [INFO][TRAIN] epoch: 2, iter: 7080/20000, loss: 0.3161, lr: 0.006749, batch_cost: 0.2269, reader_cost: 0.10461, ips: 17.6300 samples/sec | ETA 00:48:512022-07-21 15:33:06 [INFO][TRAIN] epoch: 2, iter: 7090/20000, loss: 0.2810, lr: 0.006744, batch_cost: 0.2844, reader_cost: 0.14339, ips: 14.0633 samples/sec | ETA 01:01:112022-07-21 15:33:08 [INFO][TRAIN] epoch: 2, iter: 7100/20000, loss: 0.3926, lr: 0.006740, batch_cost: 0.2178, reader_cost: 0.09589, ips: 18.3622 samples/sec | ETA 00:46:502022-07-21 15:33:11 [INFO][TRAIN] epoch: 2, iter: 7110/20000, loss: 0.4195, lr: 0.006735, batch_cost: 0.2533, reader_cost: 0.12724, ips: 15.7908 samples/sec | ETA 00:54:252022-07-21 15:33:13 [INFO][TRAIN] epoch: 2, iter: 7120/20000, loss: 0.2977, lr: 0.006730, batch_cost: 0.2414, reader_cost: 0.11868, ips: 16.5669 samples/sec | ETA 00:51:492022-07-21 15:33:15 [INFO][TRAIN] epoch: 2, iter: 7130/20000, loss: 0.3706, lr: 0.006725, batch_cost: 0.2425, reader_cost: 0.11565, ips: 16.4922 samples/sec | ETA 00:52:012022-07-21 15:33:18 [INFO][TRAIN] epoch: 2, iter: 7140/20000, loss: 0.3393, lr: 0.006721, batch_cost: 0.2215, reader_cost: 0.09762, ips: 18.0617 samples/sec | ETA 00:47:282022-07-21 15:33:20 [INFO][TRAIN] epoch: 2, iter: 7150/20000, loss: 0.3397, lr: 0.006716, batch_cost: 0.2352, reader_cost: 0.10913, ips: 17.0082 samples/sec | ETA 00:50:222022-07-21 15:33:22 [INFO][TRAIN] epoch: 2, iter: 7160/20000, loss: 0.3851, lr: 0.006711, batch_cost: 0.2252, reader_cost: 0.10122, ips: 17.7590 samples/sec | ETA 00:48:122022-07-21 15:33:24 [INFO][TRAIN] epoch: 2, iter: 7170/20000, loss: 0.2772, lr: 0.006707, batch_cost: 0.2318, reader_cost: 0.10843, ips: 17.2552 samples/sec | ETA 00:49:342022-07-21 15:33:27 [INFO][TRAIN] epoch: 2, iter: 7180/20000, loss: 0.4299, lr: 0.006702, batch_cost: 0.2216, reader_cost: 0.09848, ips: 18.0497 samples/sec | ETA 00:47:212022-07-21 15:33:29 [INFO][TRAIN] epoch: 2, iter: 7190/20000, loss: 0.3898, lr: 0.006697, batch_cost: 0.2397, reader_cost: 0.11387, ips: 16.6887 samples/sec | ETA 00:51:102022-07-21 15:33:31 [INFO][TRAIN] epoch: 2, iter: 7200/20000, loss: 0.3092, lr: 0.006693, batch_cost: 0.2285, reader_cost: 0.10607, ips: 17.5063 samples/sec | ETA 00:48:442022-07-21 15:33:34 [INFO][TRAIN] epoch: 2, iter: 7210/20000, loss: 0.3152, lr: 0.006688, batch_cost: 0.2418, reader_cost: 0.11595, ips: 16.5425 samples/sec | ETA 00:51:322022-07-21 15:33:36 [INFO][TRAIN] epoch: 2, iter: 7220/20000, loss: 0.3713, lr: 0.006683, batch_cost: 0.2498, reader_cost: 0.11376, ips: 16.0160 samples/sec | ETA 00:53:112022-07-21 15:33:39 [INFO][TRAIN] epoch: 2, iter: 7230/20000, loss: 0.5055, lr: 0.006678, batch_cost: 0.2535, reader_cost: 0.12327, ips: 15.7762 samples/sec | ETA 00:53:572022-07-21 15:33:41 [INFO][TRAIN] epoch: 2, iter: 7240/20000, loss: 0.4321, lr: 0.006674, batch_cost: 0.2336, reader_cost: 0.10987, ips: 17.1228 samples/sec | ETA 00:49:402022-07-21 15:33:44 [INFO][TRAIN] epoch: 2, iter: 7250/20000, loss: 0.3176, lr: 0.006669, batch_cost: 
0.2408, reader_cost: 0.10967, ips: 16.6090 samples/sec | ETA 00:51:102022-07-21 15:33:47 [INFO][TRAIN] epoch: 2, iter: 7260/20000, loss: 0.3554, lr: 0.006664, batch_cost: 0.3600, reader_cost: 0.17208, ips: 11.1109 samples/sec | ETA 01:16:262022-07-21 15:33:50 [INFO][TRAIN] epoch: 2, iter: 7270/20000, loss: 0.3413, lr: 0.006660, batch_cost: 0.2777, reader_cost: 0.12316, ips: 14.4033 samples/sec | ETA 00:58:552022-07-21 15:33:52 [INFO][TRAIN] epoch: 2, iter: 7280/20000, loss: 0.3042, lr: 0.006655, batch_cost: 0.2337, reader_cost: 0.11097, ips: 17.1188 samples/sec | ETA 00:49:322022-07-21 15:33:55 [INFO][TRAIN] epoch: 2, iter: 7290/20000, loss: 0.3453, lr: 0.006650, batch_cost: 0.2533, reader_cost: 0.12670, ips: 15.7913 samples/sec | ETA 00:53:392022-07-21 15:33:58 [INFO][TRAIN] epoch: 2, iter: 7300/20000, loss: 0.3940, lr: 0.006645, batch_cost: 0.2672, reader_cost: 0.10734, ips: 14.9722 samples/sec | ETA 00:56:322022-07-21 15:34:00 [INFO][TRAIN] epoch: 2, iter: 7310/20000, loss: 0.4133, lr: 0.006641, batch_cost: 0.2700, reader_cost: 0.12531, ips: 14.8164 samples/sec | ETA 00:57:052022-07-21 15:34:03 [INFO][TRAIN] epoch: 2, iter: 7320/20000, loss: 0.3483, lr: 0.006636, batch_cost: 0.2414, reader_cost: 0.11520, ips: 16.5671 samples/sec | ETA 00:51:012022-07-21 15:34:05 [INFO][TRAIN] epoch: 2, iter: 7330/20000, loss: 0.3672, lr: 0.006631, batch_cost: 0.2220, reader_cost: 0.09866, ips: 18.0156 samples/sec | ETA 00:46:532022-07-21 15:34:07 [INFO][TRAIN] epoch: 2, iter: 7340/20000, loss: 0.3119, lr: 0.006627, batch_cost: 0.2347, reader_cost: 0.10806, ips: 17.0433 samples/sec | ETA 00:49:312022-07-21 15:34:10 [INFO][TRAIN] epoch: 2, iter: 7350/20000, loss: 0.3093, lr: 0.006622, batch_cost: 0.2608, reader_cost: 0.12743, ips: 15.3397 samples/sec | ETA 00:54:582022-07-21 15:34:12 [INFO][TRAIN] epoch: 2, iter: 7360/20000, loss: 0.3871, lr: 0.006617, batch_cost: 0.2405, reader_cost: 0.11722, ips: 16.6308 samples/sec | ETA 00:50:402022-07-21 15:34:14 [INFO][TRAIN] epoch: 2, iter: 7370/20000, loss: 0.4515, lr: 0.006613, batch_cost: 0.2287, reader_cost: 0.10703, ips: 17.4892 samples/sec | ETA 00:48:082022-07-21 15:34:17 [INFO][TRAIN] epoch: 2, iter: 7380/20000, loss: 0.3057, lr: 0.006608, batch_cost: 0.2244, reader_cost: 0.10009, ips: 17.8228 samples/sec | ETA 00:47:122022-07-21 15:34:19 [INFO][TRAIN] epoch: 2, iter: 7390/20000, loss: 0.3382, lr: 0.006603, batch_cost: 0.2351, reader_cost: 0.10947, ips: 17.0145 samples/sec | ETA 00:49:242022-07-21 15:34:21 [INFO][TRAIN] epoch: 2, iter: 7400/20000, loss: 0.2837, lr: 0.006598, batch_cost: 0.2228, reader_cost: 0.09866, ips: 17.9522 samples/sec | ETA 00:46:472022-07-21 15:34:24 [INFO][TRAIN] epoch: 2, iter: 7410/20000, loss: 0.3093, lr: 0.006594, batch_cost: 0.2476, reader_cost: 0.12520, ips: 16.1520 samples/sec | ETA 00:51:572022-07-21 15:34:26 [INFO][TRAIN] epoch: 2, iter: 7420/20000, loss: 0.3254, lr: 0.006589, batch_cost: 0.2503, reader_cost: 0.12771, ips: 15.9809 samples/sec | ETA 00:52:282022-07-21 15:34:29 [INFO][TRAIN] epoch: 2, iter: 7430/20000, loss: 0.3488, lr: 0.006584, batch_cost: 0.2514, reader_cost: 0.12685, ips: 15.9080 samples/sec | ETA 00:52:402022-07-21 15:34:31 [INFO][TRAIN] epoch: 2, iter: 7440/20000, loss: 0.3824, lr: 0.006580, batch_cost: 0.2377, reader_cost: 0.11510, ips: 16.8279 samples/sec | ETA 00:49:452022-07-21 15:34:33 [INFO][TRAIN] epoch: 2, iter: 7450/20000, loss: 0.4773, lr: 0.006575, batch_cost: 0.2243, reader_cost: 0.09735, ips: 17.8353 samples/sec | ETA 00:46:542022-07-21 15:34:36 [INFO][TRAIN] epoch: 2, iter: 7460/20000, 
loss: 0.4035, lr: 0.006570, batch_cost: 0.2231, reader_cost: 0.09667, ips: 17.9265 samples/sec | ETA 00:46:382022-07-21 15:34:38 [INFO][TRAIN] epoch: 2, iter: 7470/20000, loss: 0.4438, lr: 0.006565, batch_cost: 0.2397, reader_cost: 0.11276, ips: 16.6844 samples/sec | ETA 00:50:042022-07-21 15:34:41 [INFO][TRAIN] epoch: 2, iter: 7480/20000, loss: 0.4695, lr: 0.006561, batch_cost: 0.2707, reader_cost: 0.12761, ips: 14.7776 samples/sec | ETA 00:56:282022-07-21 15:34:43 [INFO][TRAIN] epoch: 2, iter: 7490/20000, loss: 0.3455, lr: 0.006556, batch_cost: 0.2425, reader_cost: 0.11669, ips: 16.4946 samples/sec | ETA 00:50:332022-07-21 15:34:46 [INFO][TRAIN] epoch: 2, iter: 7500/20000, loss: 0.3670, lr: 0.006551, batch_cost: 0.2513, reader_cost: 0.12014, ips: 15.9160 samples/sec | ETA 00:52:212022-07-21 15:34:48 [INFO][TRAIN] epoch: 2, iter: 7510/20000, loss: 0.3403, lr: 0.006547, batch_cost: 0.2347, reader_cost: 0.10734, ips: 17.0463 samples/sec | ETA 00:48:502022-07-21 15:34:50 [INFO][TRAIN] epoch: 2, iter: 7520/20000, loss: 0.3827, lr: 0.006542, batch_cost: 0.2336, reader_cost: 0.10740, ips: 17.1268 samples/sec | ETA 00:48:342022-07-21 15:34:54 [INFO][TRAIN] epoch: 2, iter: 7530/20000, loss: 0.3406, lr: 0.006537, batch_cost: 0.3588, reader_cost: 0.17673, ips: 11.1481 samples/sec | ETA 01:14:342022-07-21 15:34:57 [INFO][TRAIN] epoch: 2, iter: 7540/20000, loss: 0.3298, lr: 0.006532, batch_cost: 0.2528, reader_cost: 0.10945, ips: 15.8209 samples/sec | ETA 00:52:302022-07-21 15:34:59 [INFO][TRAIN] epoch: 2, iter: 7550/20000, loss: 0.3142, lr: 0.006528, batch_cost: 0.2399, reader_cost: 0.11716, ips: 16.6760 samples/sec | ETA 00:49:462022-07-21 15:35:01 [INFO][TRAIN] epoch: 2, iter: 7560/20000, loss: 0.4109, lr: 0.006523, batch_cost: 0.2498, reader_cost: 0.11896, ips: 16.0115 samples/sec | ETA 00:51:472022-07-21 15:35:04 [INFO][TRAIN] epoch: 2, iter: 7570/20000, loss: 0.4276, lr: 0.006518, batch_cost: 0.2987, reader_cost: 0.16518, ips: 13.3921 samples/sec | ETA 01:01:522022-07-21 15:35:07 [INFO][TRAIN] epoch: 2, iter: 7580/20000, loss: 0.4390, lr: 0.006513, batch_cost: 0.2366, reader_cost: 0.11201, ips: 16.9075 samples/sec | ETA 00:48:582022-07-21 15:35:09 [INFO][TRAIN] epoch: 2, iter: 7590/20000, loss: 0.3332, lr: 0.006509, batch_cost: 0.2265, reader_cost: 0.10376, ips: 17.6634 samples/sec | ETA 00:46:502022-07-21 15:35:12 [INFO][TRAIN] epoch: 2, iter: 7600/20000, loss: 0.3464, lr: 0.006504, batch_cost: 0.2777, reader_cost: 0.13289, ips: 14.4023 samples/sec | ETA 00:57:232022-07-21 15:35:14 [INFO][TRAIN] epoch: 2, iter: 7610/20000, loss: 0.3708, lr: 0.006499, batch_cost: 0.2247, reader_cost: 0.09532, ips: 17.8030 samples/sec | ETA 00:46:232022-07-21 15:35:16 [INFO][TRAIN] epoch: 2, iter: 7620/20000, loss: 0.3830, lr: 0.006495, batch_cost: 0.2380, reader_cost: 0.11453, ips: 16.8087 samples/sec | ETA 00:49:062022-07-21 15:35:19 [INFO][TRAIN] epoch: 2, iter: 7630/20000, loss: 0.4397, lr: 0.006490, batch_cost: 0.2215, reader_cost: 0.09795, ips: 18.0593 samples/sec | ETA 00:45:392022-07-21 15:35:21 [INFO][TRAIN] epoch: 2, iter: 7640/20000, loss: 0.3209, lr: 0.006485, batch_cost: 0.2401, reader_cost: 0.11539, ips: 16.6577 samples/sec | ETA 00:49:272022-07-21 15:35:23 [INFO][TRAIN] epoch: 2, iter: 7650/20000, loss: 0.3499, lr: 0.006480, batch_cost: 0.2323, reader_cost: 0.11072, ips: 17.2181 samples/sec | ETA 00:47:492022-07-21 15:35:26 [INFO][TRAIN] epoch: 2, iter: 7660/20000, loss: 0.4862, lr: 0.006476, batch_cost: 0.2282, reader_cost: 0.10474, ips: 17.5320 samples/sec | ETA 00:46:552022-07-21 15:35:28 
[INFO][TRAIN] epoch: 2, iter: 7670/20000, loss: 0.2943, lr: 0.006471, batch_cost: 0.2314, reader_cost: 0.10949, ips: 17.2882 samples/sec | ETA 00:47:322022-07-21 15:35:30 [INFO][TRAIN] epoch: 2, iter: 7680/20000, loss: 0.5478, lr: 0.006466, batch_cost: 0.2173, reader_cost: 0.09371, ips: 18.4044 samples/sec | ETA 00:44:372022-07-21 15:35:32 [INFO][TRAIN] epoch: 2, iter: 7690/20000, loss: 0.4967, lr: 0.006462, batch_cost: 0.2220, reader_cost: 0.09944, ips: 18.0159 samples/sec | ETA 00:45:332022-07-21 15:35:35 [INFO][TRAIN] epoch: 2, iter: 7700/20000, loss: 0.3755, lr: 0.006457, batch_cost: 0.2333, reader_cost: 0.10901, ips: 17.1437 samples/sec | ETA 00:47:492022-07-21 15:35:37 [INFO][TRAIN] epoch: 2, iter: 7710/20000, loss: 0.4112, lr: 0.006452, batch_cost: 0.2435, reader_cost: 0.12051, ips: 16.4247 samples/sec | ETA 00:49:532022-07-21 15:35:39 [INFO][TRAIN] epoch: 2, iter: 7720/20000, loss: 0.4762, lr: 0.006447, batch_cost: 0.2273, reader_cost: 0.09972, ips: 17.6010 samples/sec | ETA 00:46:302022-07-21 15:35:42 [INFO][TRAIN] epoch: 2, iter: 7730/20000, loss: 0.4150, lr: 0.006443, batch_cost: 0.2274, reader_cost: 0.10446, ips: 17.5882 samples/sec | ETA 00:46:302022-07-21 15:35:44 [INFO][TRAIN] epoch: 2, iter: 7740/20000, loss: 0.4047, lr: 0.006438, batch_cost: 0.2722, reader_cost: 0.12576, ips: 14.6930 samples/sec | ETA 00:55:372022-07-21 15:35:47 [INFO][TRAIN] epoch: 2, iter: 7750/20000, loss: 0.2710, lr: 0.006433, batch_cost: 0.2150, reader_cost: 0.08840, ips: 18.6030 samples/sec | ETA 00:43:532022-07-21 15:35:49 [INFO][TRAIN] epoch: 2, iter: 7760/20000, loss: 0.2710, lr: 0.006428, batch_cost: 0.2315, reader_cost: 0.10775, ips: 17.2759 samples/sec | ETA 00:47:142022-07-21 15:35:51 [INFO][TRAIN] epoch: 2, iter: 7770/20000, loss: 0.3553, lr: 0.006424, batch_cost: 0.2330, reader_cost: 0.11110, ips: 17.1694 samples/sec | ETA 00:47:292022-07-21 15:35:53 [INFO][TRAIN] epoch: 2, iter: 7780/20000, loss: 0.3030, lr: 0.006419, batch_cost: 0.2246, reader_cost: 0.10236, ips: 17.8115 samples/sec | ETA 00:45:442022-07-21 15:35:56 [INFO][TRAIN] epoch: 2, iter: 7790/20000, loss: 0.3578, lr: 0.006414, batch_cost: 0.2449, reader_cost: 0.11758, ips: 16.3347 samples/sec | ETA 00:49:492022-07-21 15:35:59 [INFO][TRAIN] epoch: 2, iter: 7800/20000, loss: 0.4215, lr: 0.006410, batch_cost: 0.3072, reader_cost: 0.14530, ips: 13.0201 samples/sec | ETA 01:02:282022-07-21 15:36:02 [INFO][TRAIN] epoch: 2, iter: 7810/20000, loss: 0.4009, lr: 0.006405, batch_cost: 0.3266, reader_cost: 0.16279, ips: 12.2478 samples/sec | ETA 01:06:212022-07-21 15:36:05 [INFO][TRAIN] epoch: 2, iter: 7820/20000, loss: 0.3907, lr: 0.006400, batch_cost: 0.2655, reader_cost: 0.11928, ips: 15.0649 samples/sec | ETA 00:53:542022-07-21 15:36:07 [INFO][TRAIN] epoch: 2, iter: 7830/20000, loss: 0.3368, lr: 0.006395, batch_cost: 0.2361, reader_cost: 0.10799, ips: 16.9426 samples/sec | ETA 00:47:532022-07-21 15:36:10 [INFO][TRAIN] epoch: 2, iter: 7840/20000, loss: 0.4152, lr: 0.006391, batch_cost: 0.2572, reader_cost: 0.12811, ips: 15.5531 samples/sec | ETA 00:52:072022-07-21 15:36:12 [INFO][TRAIN] epoch: 2, iter: 7850/20000, loss: 0.3609, lr: 0.006386, batch_cost: 0.2520, reader_cost: 0.12739, ips: 15.8737 samples/sec | ETA 00:51:012022-07-21 15:36:15 [INFO][TRAIN] epoch: 2, iter: 7860/20000, loss: 0.4380, lr: 0.006381, batch_cost: 0.2521, reader_cost: 0.10999, ips: 15.8697 samples/sec | ETA 00:50:592022-07-21 15:36:17 [INFO][TRAIN] epoch: 2, iter: 7870/20000, loss: 0.3375, lr: 0.006376, batch_cost: 0.2436, reader_cost: 0.11675, ips: 16.4235 
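Each [TRAIN] line in the log reports the current training loss, the decayed learning rate, batch_cost (seconds per iteration), reader_cost (seconds spent waiting for data), ips (images processed per second) and an estimate of the remaining training time. The logged values are consistent with ips = batch_size / batch_cost and therefore with a batch size of 4; the snippet below is just that arithmetic spelled out with the numbers from the iter 5390 line (an illustrative check, not part of the training code).

# Relation between the throughput fields in the PaddleSeg training log,
# using the values logged at iter 5390 above.
batch_cost = 0.2267   # seconds per training iteration
ips = 17.6426         # images per second, as reported by the log
print(round(batch_cost * ips))  # -> 4, the implied batch size
print(4 / batch_cost)           # -> ~17.64, close to the logged ips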
2022-07-21 15:36:20 [INFO][TRAIN] epoch: 2, iter: 7880/20000, loss: 0.5077, lr: 0.006372, batch_cost: 0.2351, reader_cost: 0.11235, ips: 17.0105 samples/sec | ETA 00:47:30
... (iterations 7890–7990 abridged) ...
2022-07-21 15:36:48 [INFO][TRAIN] epoch: 2, iter: 8000/20000, loss: 0.3836, lr: 0.006315, batch_cost: 0.2449, reader_cost: 0.10571, ips: 16.3356 samples/sec | ETA 00:48:58
2022-07-21 15:36:48 [INFO] Start evaluating (total_samples: 7361, total_iters: 7361)...
7361/7361 [==============================] - 471s 64ms/step - batch_cost: 0.0636 - reader cost: 0.0079 (per-step progress output omitted)
2022-07-21 15:44:39 [INFO][EVAL] #Images: 7361 mIoU: 0.1066 Acc: 0.9839 Kappa: 0.4030 Dice: 0.1351
2022-07-21 15:44:39 [INFO][EVAL] Class IoU: [0.9857 0.1112 0.0521 0.1735 0.0001 0. 0. 0. 0.2299 0.0064 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.5722]
2022-07-21 15:44:39 [INFO][EVAL] Class Precision: [0.9877 0.3447 0.3259 0.5863 0.1206 0. 0. 0. 0.555 0.1666 0. 0. 0. 0.0962 0. 0. 0. 0. 0. 0.6706]
2022-07-21 15:44:39 [INFO][EVAL] Class Recall: [0.998 0.141 0.0584 0.1978 0.0001 0. 0. 0. 0.2819 0.0066 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.796]
2022-07-21 15:44:39 [INFO][EVAL] The model with the best validation mIoU (0.1066) was saved at iter 8000.
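The evaluation numbers look contradictory at first sight: overall pixel accuracy is 0.9839, yet mIoU is only 0.1066. This is the usual picture for lane-marking data at an early training stage: the background class dominates the pixel count (its IoU is already 0.9857) and a few frequent classes are partially learned, while most of the 20 classes still have an IoU of 0, which drags the class-averaged mIoU down even though almost every pixel is classified correctly. The sketch below (plain NumPy, illustrative only, not PaddleSeg's internal metric code; the function name segmentation_metrics is made up here) shows how accuracy and mIoU are derived from one confusion matrix and why a rare, poorly segmented class hurts mIoU far more than accuracy.

import numpy as np

def segmentation_metrics(conf):
    # conf: (num_classes, num_classes) confusion matrix,
    # rows = ground-truth class, columns = predicted class
    conf = conf.astype(np.float64)
    tp = np.diag(conf)                                # correctly classified pixels per class
    acc = tp.sum() / conf.sum()                       # overall pixel accuracy
    union = conf.sum(axis=0) + conf.sum(axis=1) - tp  # predicted + ground truth - intersection
    iou = np.where(union > 0, tp / np.maximum(union, 1), 0.0)
    return acc, iou, iou.mean()

# Toy 2-class example: a dominant background class and a rare class that is
# mostly missed. Accuracy stays high while mIoU collapses.
conf = np.array([[9800, 100],
                 [  90,  10]])
acc, iou, miou = segmentation_metrics(conf)
print(acc)   # ~0.981
print(iou)   # ~[0.981, 0.05]
print(miou)  # ~0.52

If the evaluation needs to be repeated outside of training, PaddleSeg also ships a val.py script that takes the same config plus a trained weight file, e.g. python PaddleSeg/val.py --config <your_bisenet_config>.yml --model_path output/best_model/model.pdparams (the exact weight path depends on the --save_dir used during training, and flag names may vary slightly between PaddleSeg releases).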
2022-07-21 15:44:40 [INFO][TRAIN] epoch: 2, iter: 8010/20000, loss: 0.2766, lr: 0.006310, batch_cost: 0.1433, reader_cost: 0.01768, ips: 27.9072 samples/sec | ETA 00:28:38
... (iterations 8020–8610 abridged: the loss fluctuates roughly between 0.27 and 0.52, the learning rate decays from 0.006305 to 0.006025, batch_cost stays around 0.21–0.37 s) ...
2022-07-21 15:47:12 [INFO][TRAIN] epoch: 2, iter: 8620/20000, loss: 0.3741, lr: 0.006021, batch_cost:
0.2400, reader_cost: 0.10857, ips: 16.6696 samples/sec | ETA 00:45:302022-07-21 15:47:15 [INFO][TRAIN] epoch: 2, iter: 8630/20000, loss: 0.3756, lr: 0.006016, batch_cost: 0.3348, reader_cost: 0.16522, ips: 11.9457 samples/sec | ETA 01:03:272022-07-21 15:47:18 [INFO][TRAIN] epoch: 2, iter: 8640/20000, loss: 0.5047, lr: 0.006011, batch_cost: 0.3262, reader_cost: 0.14204, ips: 12.2641 samples/sec | ETA 01:01:452022-07-21 15:47:21 [INFO][TRAIN] epoch: 2, iter: 8650/20000, loss: 0.2849, lr: 0.006006, batch_cost: 0.2326, reader_cost: 0.10460, ips: 17.1996 samples/sec | ETA 00:43:592022-07-21 15:47:23 [INFO][TRAIN] epoch: 2, iter: 8660/20000, loss: 0.5139, lr: 0.006001, batch_cost: 0.2351, reader_cost: 0.10916, ips: 17.0174 samples/sec | ETA 00:44:252022-07-21 15:47:26 [INFO][TRAIN] epoch: 2, iter: 8670/20000, loss: 0.3818, lr: 0.005997, batch_cost: 0.2662, reader_cost: 0.13807, ips: 15.0279 samples/sec | ETA 00:50:152022-07-21 15:47:28 [INFO][TRAIN] epoch: 2, iter: 8680/20000, loss: 0.3030, lr: 0.005992, batch_cost: 0.2644, reader_cost: 0.12243, ips: 15.1304 samples/sec | ETA 00:49:522022-07-21 15:47:31 [INFO][TRAIN] epoch: 2, iter: 8690/20000, loss: 0.2631, lr: 0.005987, batch_cost: 0.2462, reader_cost: 0.11931, ips: 16.2447 samples/sec | ETA 00:46:242022-07-21 15:47:34 [INFO][TRAIN] epoch: 2, iter: 8700/20000, loss: 0.3303, lr: 0.005982, batch_cost: 0.2877, reader_cost: 0.13708, ips: 13.9044 samples/sec | ETA 00:54:102022-07-21 15:47:36 [INFO][TRAIN] epoch: 2, iter: 8710/20000, loss: 0.4510, lr: 0.005978, batch_cost: 0.2357, reader_cost: 0.11370, ips: 16.9683 samples/sec | ETA 00:44:212022-07-21 15:47:38 [INFO][TRAIN] epoch: 2, iter: 8720/20000, loss: 0.2980, lr: 0.005973, batch_cost: 0.2384, reader_cost: 0.11671, ips: 16.7756 samples/sec | ETA 00:44:492022-07-21 15:47:41 [INFO][TRAIN] epoch: 2, iter: 8730/20000, loss: 0.4752, lr: 0.005968, batch_cost: 0.2336, reader_cost: 0.11292, ips: 17.1205 samples/sec | ETA 00:43:532022-07-21 15:47:43 [INFO][TRAIN] epoch: 2, iter: 8740/20000, loss: 0.2895, lr: 0.005963, batch_cost: 0.2120, reader_cost: 0.08844, ips: 18.8679 samples/sec | ETA 00:39:472022-07-21 15:47:45 [INFO][TRAIN] epoch: 2, iter: 8750/20000, loss: 0.5014, lr: 0.005959, batch_cost: 0.2244, reader_cost: 0.10152, ips: 17.8225 samples/sec | ETA 00:42:042022-07-21 15:47:48 [INFO][TRAIN] epoch: 2, iter: 8760/20000, loss: 0.4452, lr: 0.005954, batch_cost: 0.2496, reader_cost: 0.11774, ips: 16.0262 samples/sec | ETA 00:46:452022-07-21 15:47:50 [INFO][TRAIN] epoch: 2, iter: 8770/20000, loss: 0.4412, lr: 0.005949, batch_cost: 0.2501, reader_cost: 0.12233, ips: 15.9915 samples/sec | ETA 00:46:482022-07-21 15:47:52 [INFO][TRAIN] epoch: 2, iter: 8780/20000, loss: 0.4256, lr: 0.005944, batch_cost: 0.2300, reader_cost: 0.10605, ips: 17.3880 samples/sec | ETA 00:43:012022-07-21 15:47:55 [INFO][TRAIN] epoch: 2, iter: 8790/20000, loss: 0.3645, lr: 0.005940, batch_cost: 0.2510, reader_cost: 0.12926, ips: 15.9356 samples/sec | ETA 00:46:532022-07-21 15:47:57 [INFO][TRAIN] epoch: 2, iter: 8800/20000, loss: 0.3587, lr: 0.005935, batch_cost: 0.2469, reader_cost: 0.11739, ips: 16.2007 samples/sec | ETA 00:46:052022-07-21 15:48:00 [INFO][TRAIN] epoch: 2, iter: 8810/20000, loss: 0.4339, lr: 0.005930, batch_cost: 0.2665, reader_cost: 0.11200, ips: 15.0073 samples/sec | ETA 00:49:422022-07-21 15:48:02 [INFO][TRAIN] epoch: 2, iter: 8820/20000, loss: 0.3538, lr: 0.005925, batch_cost: 0.2394, reader_cost: 0.11325, ips: 16.7088 samples/sec | ETA 00:44:362022-07-21 15:48:05 [INFO][TRAIN] epoch: 2, iter: 8830/20000, 
loss: 0.3803, lr: 0.005920, batch_cost: 0.2275, reader_cost: 0.10072, ips: 17.5838 samples/sec | ETA 00:42:202022-07-21 15:48:07 [INFO][TRAIN] epoch: 2, iter: 8840/20000, loss: 0.4038, lr: 0.005916, batch_cost: 0.2326, reader_cost: 0.10755, ips: 17.1938 samples/sec | ETA 00:43:162022-07-21 15:48:09 [INFO][TRAIN] epoch: 2, iter: 8850/20000, loss: 0.4135, lr: 0.005911, batch_cost: 0.2374, reader_cost: 0.11548, ips: 16.8507 samples/sec | ETA 00:44:062022-07-21 15:48:12 [INFO][TRAIN] epoch: 2, iter: 8860/20000, loss: 0.3957, lr: 0.005906, batch_cost: 0.2330, reader_cost: 0.10611, ips: 17.1695 samples/sec | ETA 00:43:152022-07-21 15:48:14 [INFO][TRAIN] epoch: 2, iter: 8870/20000, loss: 0.4652, lr: 0.005901, batch_cost: 0.2313, reader_cost: 0.10478, ips: 17.2972 samples/sec | ETA 00:42:532022-07-21 15:48:16 [INFO][TRAIN] epoch: 2, iter: 8880/20000, loss: 0.3779, lr: 0.005897, batch_cost: 0.2346, reader_cost: 0.10679, ips: 17.0474 samples/sec | ETA 00:43:292022-07-21 15:48:19 [INFO][TRAIN] epoch: 2, iter: 8890/20000, loss: 0.3451, lr: 0.005892, batch_cost: 0.2330, reader_cost: 0.10531, ips: 17.1666 samples/sec | ETA 00:43:082022-07-21 15:48:21 [INFO][TRAIN] epoch: 2, iter: 8900/20000, loss: 0.3680, lr: 0.005887, batch_cost: 0.2422, reader_cost: 0.11150, ips: 16.5143 samples/sec | ETA 00:44:482022-07-21 15:48:25 [INFO][TRAIN] epoch: 2, iter: 8910/20000, loss: 0.2894, lr: 0.005882, batch_cost: 0.3805, reader_cost: 0.19572, ips: 10.5125 samples/sec | ETA 01:10:192022-07-21 15:48:27 [INFO][TRAIN] epoch: 2, iter: 8920/20000, loss: 0.3174, lr: 0.005878, batch_cost: 0.2441, reader_cost: 0.11382, ips: 16.3896 samples/sec | ETA 00:45:042022-07-21 15:48:30 [INFO][TRAIN] epoch: 2, iter: 8930/20000, loss: 0.3215, lr: 0.005873, batch_cost: 0.2370, reader_cost: 0.10949, ips: 16.8807 samples/sec | ETA 00:43:432022-07-21 15:48:32 [INFO][TRAIN] epoch: 2, iter: 8940/20000, loss: 0.3899, lr: 0.005868, batch_cost: 0.2687, reader_cost: 0.11786, ips: 14.8853 samples/sec | ETA 00:49:322022-07-21 15:48:35 [INFO][TRAIN] epoch: 2, iter: 8950/20000, loss: 0.3870, lr: 0.005863, batch_cost: 0.2755, reader_cost: 0.13071, ips: 14.5207 samples/sec | ETA 00:50:432022-07-21 15:48:38 [INFO][TRAIN] epoch: 2, iter: 8960/20000, loss: 0.4225, lr: 0.005858, batch_cost: 0.2351, reader_cost: 0.11167, ips: 17.0171 samples/sec | ETA 00:43:152022-07-21 15:48:40 [INFO][TRAIN] epoch: 2, iter: 8970/20000, loss: 0.3613, lr: 0.005854, batch_cost: 0.2311, reader_cost: 0.10717, ips: 17.3105 samples/sec | ETA 00:42:282022-07-21 15:48:42 [INFO][TRAIN] epoch: 2, iter: 8980/20000, loss: 0.2971, lr: 0.005849, batch_cost: 0.2455, reader_cost: 0.11656, ips: 16.2912 samples/sec | ETA 00:45:052022-07-21 15:48:45 [INFO][TRAIN] epoch: 2, iter: 8990/20000, loss: 0.5250, lr: 0.005844, batch_cost: 0.2521, reader_cost: 0.12502, ips: 15.8681 samples/sec | ETA 00:46:152022-07-21 15:48:47 [INFO][TRAIN] epoch: 2, iter: 9000/20000, loss: 0.4248, lr: 0.005839, batch_cost: 0.2316, reader_cost: 0.10610, ips: 17.2729 samples/sec | ETA 00:42:272022-07-21 15:48:50 [INFO][TRAIN] epoch: 2, iter: 9010/20000, loss: 0.3118, lr: 0.005835, batch_cost: 0.2553, reader_cost: 0.12845, ips: 15.6706 samples/sec | ETA 00:46:452022-07-21 15:48:52 [INFO][TRAIN] epoch: 2, iter: 9020/20000, loss: 0.4037, lr: 0.005830, batch_cost: 0.2249, reader_cost: 0.10420, ips: 17.7819 samples/sec | ETA 00:41:092022-07-21 15:48:54 [INFO][TRAIN] epoch: 2, iter: 9030/20000, loss: 0.3256, lr: 0.005825, batch_cost: 0.2361, reader_cost: 0.11303, ips: 16.9399 samples/sec | ETA 00:43:102022-07-21 15:48:57 
[INFO][TRAIN] epoch: 2, iter: 9040/20000, loss: 0.3372, lr: 0.005820, batch_cost: 0.2265, reader_cost: 0.10272, ips: 17.6593 samples/sec | ETA 00:41:222022-07-21 15:48:59 [INFO][TRAIN] epoch: 2, iter: 9050/20000, loss: 0.3766, lr: 0.005815, batch_cost: 0.2168, reader_cost: 0.09411, ips: 18.4498 samples/sec | ETA 00:39:342022-07-21 15:49:01 [INFO][TRAIN] epoch: 2, iter: 9060/20000, loss: 0.3565, lr: 0.005811, batch_cost: 0.2506, reader_cost: 0.12767, ips: 15.9612 samples/sec | ETA 00:45:412022-07-21 15:49:04 [INFO][TRAIN] epoch: 2, iter: 9070/20000, loss: 0.3560, lr: 0.005806, batch_cost: 0.2670, reader_cost: 0.14238, ips: 14.9817 samples/sec | ETA 00:48:382022-07-21 15:49:06 [INFO][TRAIN] epoch: 2, iter: 9080/20000, loss: 0.2961, lr: 0.005801, batch_cost: 0.2451, reader_cost: 0.12062, ips: 16.3178 samples/sec | ETA 00:44:362022-07-21 15:49:09 [INFO][TRAIN] epoch: 2, iter: 9090/20000, loss: 0.2933, lr: 0.005796, batch_cost: 0.2376, reader_cost: 0.11447, ips: 16.8330 samples/sec | ETA 00:43:122022-07-21 15:49:11 [INFO][TRAIN] epoch: 2, iter: 9100/20000, loss: 0.3168, lr: 0.005792, batch_cost: 0.2478, reader_cost: 0.12324, ips: 16.1412 samples/sec | ETA 00:45:012022-07-21 15:49:14 [INFO][TRAIN] epoch: 2, iter: 9110/20000, loss: 0.4104, lr: 0.005787, batch_cost: 0.2392, reader_cost: 0.11436, ips: 16.7220 samples/sec | ETA 00:43:242022-07-21 15:49:16 [INFO][TRAIN] epoch: 2, iter: 9120/20000, loss: 0.3228, lr: 0.005782, batch_cost: 0.2454, reader_cost: 0.12269, ips: 16.3001 samples/sec | ETA 00:44:292022-07-21 15:49:18 [INFO][TRAIN] epoch: 2, iter: 9130/20000, loss: 0.4526, lr: 0.005777, batch_cost: 0.2445, reader_cost: 0.11609, ips: 16.3603 samples/sec | ETA 00:44:172022-07-21 15:49:21 [INFO][TRAIN] epoch: 2, iter: 9140/20000, loss: 0.3368, lr: 0.005772, batch_cost: 0.2613, reader_cost: 0.13485, ips: 15.3073 samples/sec | ETA 00:47:172022-07-21 15:49:24 [INFO][TRAIN] epoch: 2, iter: 9150/20000, loss: 0.4126, lr: 0.005768, batch_cost: 0.2453, reader_cost: 0.11612, ips: 16.3097 samples/sec | ETA 00:44:202022-07-21 15:49:26 [INFO][TRAIN] epoch: 2, iter: 9160/20000, loss: 0.3478, lr: 0.005763, batch_cost: 0.2270, reader_cost: 0.10271, ips: 17.6245 samples/sec | ETA 00:41:002022-07-21 15:49:29 [INFO][TRAIN] epoch: 2, iter: 9170/20000, loss: 0.3692, lr: 0.005758, batch_cost: 0.2762, reader_cost: 0.13346, ips: 14.4845 samples/sec | ETA 00:49:502022-07-21 15:49:32 [INFO][TRAIN] epoch: 2, iter: 9180/20000, loss: 0.3648, lr: 0.005753, batch_cost: 0.3676, reader_cost: 0.17534, ips: 10.8810 samples/sec | ETA 01:06:172022-07-21 15:49:35 [INFO][TRAIN] epoch: 2, iter: 9190/20000, loss: 0.3475, lr: 0.005748, batch_cost: 0.2755, reader_cost: 0.12560, ips: 14.5174 samples/sec | ETA 00:49:382022-07-21 15:49:38 [INFO][TRAIN] epoch: 2, iter: 9200/20000, loss: 0.4395, lr: 0.005744, batch_cost: 0.2706, reader_cost: 0.13512, ips: 14.7833 samples/sec | ETA 00:48:422022-07-21 15:49:40 [INFO][TRAIN] epoch: 2, iter: 9210/20000, loss: 0.3647, lr: 0.005739, batch_cost: 0.2508, reader_cost: 0.12625, ips: 15.9458 samples/sec | ETA 00:45:062022-07-21 15:49:43 [INFO][TRAIN] epoch: 2, iter: 9220/20000, loss: 0.3713, lr: 0.005734, batch_cost: 0.2344, reader_cost: 0.10845, ips: 17.0652 samples/sec | ETA 00:42:062022-07-21 15:49:45 [INFO][TRAIN] epoch: 2, iter: 9230/20000, loss: 0.3257, lr: 0.005729, batch_cost: 0.2239, reader_cost: 0.10052, ips: 17.8662 samples/sec | ETA 00:40:112022-07-21 15:49:47 [INFO][TRAIN] epoch: 2, iter: 9240/20000, loss: 0.3977, lr: 0.005725, batch_cost: 0.2162, reader_cost: 0.09413, ips: 18.5008 
samples/sec | ETA 00:38:462022-07-21 15:49:49 [INFO][TRAIN] epoch: 2, iter: 9250/20000, loss: 0.3130, lr: 0.005720, batch_cost: 0.2389, reader_cost: 0.11616, ips: 16.7427 samples/sec | ETA 00:42:482022-07-21 15:49:52 [INFO][TRAIN] epoch: 2, iter: 9260/20000, loss: 0.3605, lr: 0.005715, batch_cost: 0.2394, reader_cost: 0.11673, ips: 16.7109 samples/sec | ETA 00:42:502022-07-21 15:49:54 [INFO][TRAIN] epoch: 2, iter: 9270/20000, loss: 0.3616, lr: 0.005710, batch_cost: 0.2490, reader_cost: 0.12577, ips: 16.0629 samples/sec | ETA 00:44:312022-07-21 15:49:57 [INFO][TRAIN] epoch: 2, iter: 9280/20000, loss: 0.5049, lr: 0.005705, batch_cost: 0.2452, reader_cost: 0.11809, ips: 16.3111 samples/sec | ETA 00:43:482022-07-21 15:49:59 [INFO][TRAIN] epoch: 2, iter: 9290/20000, loss: 0.3136, lr: 0.005701, batch_cost: 0.2408, reader_cost: 0.11760, ips: 16.6134 samples/sec | ETA 00:42:582022-07-21 15:50:01 [INFO][TRAIN] epoch: 2, iter: 9300/20000, loss: 0.3065, lr: 0.005696, batch_cost: 0.2323, reader_cost: 0.10813, ips: 17.2167 samples/sec | ETA 00:41:252022-07-21 15:50:04 [INFO][TRAIN] epoch: 2, iter: 9310/20000, loss: 0.3286, lr: 0.005691, batch_cost: 0.2347, reader_cost: 0.11183, ips: 17.0432 samples/sec | ETA 00:41:482022-07-21 15:50:06 [INFO][TRAIN] epoch: 2, iter: 9320/20000, loss: 0.2784, lr: 0.005686, batch_cost: 0.2697, reader_cost: 0.12694, ips: 14.8334 samples/sec | ETA 00:47:592022-07-21 15:50:09 [INFO][TRAIN] epoch: 2, iter: 9330/20000, loss: 0.3032, lr: 0.005681, batch_cost: 0.2374, reader_cost: 0.11419, ips: 16.8499 samples/sec | ETA 00:42:122022-07-21 15:50:11 [INFO][TRAIN] epoch: 2, iter: 9340/20000, loss: 0.3179, lr: 0.005677, batch_cost: 0.2362, reader_cost: 0.11396, ips: 16.9381 samples/sec | ETA 00:41:572022-07-21 15:50:14 [INFO][TRAIN] epoch: 2, iter: 9350/20000, loss: 0.3131, lr: 0.005672, batch_cost: 0.2359, reader_cost: 0.11221, ips: 16.9556 samples/sec | ETA 00:41:522022-07-21 15:50:16 [INFO][TRAIN] epoch: 2, iter: 9360/20000, loss: 0.3538, lr: 0.005667, batch_cost: 0.2456, reader_cost: 0.12091, ips: 16.2841 samples/sec | ETA 00:43:332022-07-21 15:50:18 [INFO][TRAIN] epoch: 2, iter: 9370/20000, loss: 0.4056, lr: 0.005662, batch_cost: 0.2277, reader_cost: 0.10336, ips: 17.5696 samples/sec | ETA 00:40:202022-07-21 15:50:21 [INFO][TRAIN] epoch: 2, iter: 9380/20000, loss: 0.4474, lr: 0.005657, batch_cost: 0.2408, reader_cost: 0.11824, ips: 16.6138 samples/sec | ETA 00:42:362022-07-21 15:50:23 [INFO][TRAIN] epoch: 2, iter: 9390/20000, loss: 0.3254, lr: 0.005653, batch_cost: 0.2435, reader_cost: 0.11575, ips: 16.4271 samples/sec | ETA 00:43:032022-07-21 15:50:26 [INFO][TRAIN] epoch: 2, iter: 9400/20000, loss: 0.4234, lr: 0.005648, batch_cost: 0.2457, reader_cost: 0.11927, ips: 16.2787 samples/sec | ETA 00:43:242022-07-21 15:50:28 [INFO][TRAIN] epoch: 2, iter: 9410/20000, loss: 0.4784, lr: 0.005643, batch_cost: 0.2479, reader_cost: 0.10087, ips: 16.1355 samples/sec | ETA 00:43:452022-07-21 15:50:30 [INFO][TRAIN] epoch: 2, iter: 9420/20000, loss: 0.3366, lr: 0.005638, batch_cost: 0.2370, reader_cost: 0.09561, ips: 16.8767 samples/sec | ETA 00:41:472022-07-21 15:50:33 [INFO][TRAIN] epoch: 2, iter: 9430/20000, loss: 0.3297, lr: 0.005633, batch_cost: 0.2407, reader_cost: 0.11481, ips: 16.6201 samples/sec | ETA 00:42:232022-07-21 15:50:35 [INFO][TRAIN] epoch: 2, iter: 9440/20000, loss: 0.3596, lr: 0.005629, batch_cost: 0.2400, reader_cost: 0.10771, ips: 16.6645 samples/sec | ETA 00:42:142022-07-21 15:50:40 [INFO][TRAIN] epoch: 2, iter: 9450/20000, loss: 0.3022, lr: 0.005624, batch_cost: 
0.4813, reader_cost: 0.24061, ips: 8.3100 samples/sec | ETA 01:24:382022-07-21 15:50:43 [INFO][TRAIN] epoch: 2, iter: 9460/20000, loss: 0.3640, lr: 0.005619, batch_cost: 0.2802, reader_cost: 0.13757, ips: 14.2760 samples/sec | ETA 00:49:132022-07-21 15:50:45 [INFO][TRAIN] epoch: 2, iter: 9470/20000, loss: 0.5272, lr: 0.005614, batch_cost: 0.2343, reader_cost: 0.11118, ips: 17.0746 samples/sec | ETA 00:41:062022-07-21 15:50:48 [INFO][TRAIN] epoch: 2, iter: 9480/20000, loss: 0.3791, lr: 0.005610, batch_cost: 0.2344, reader_cost: 0.11076, ips: 17.0641 samples/sec | ETA 00:41:052022-07-21 15:50:50 [INFO][TRAIN] epoch: 2, iter: 9490/20000, loss: 0.3625, lr: 0.005605, batch_cost: 0.2301, reader_cost: 0.10060, ips: 17.3806 samples/sec | ETA 00:40:182022-07-21 15:50:52 [INFO][TRAIN] epoch: 2, iter: 9500/20000, loss: 0.3030, lr: 0.005600, batch_cost: 0.2301, reader_cost: 0.10167, ips: 17.3864 samples/sec | ETA 00:40:152022-07-21 15:50:55 [INFO][TRAIN] epoch: 2, iter: 9510/20000, loss: 0.4320, lr: 0.005595, batch_cost: 0.2331, reader_cost: 0.10934, ips: 17.1587 samples/sec | ETA 00:40:452022-07-21 15:50:57 [INFO][TRAIN] epoch: 2, iter: 9520/20000, loss: 0.4028, lr: 0.005590, batch_cost: 0.2372, reader_cost: 0.11497, ips: 16.8659 samples/sec | ETA 00:41:252022-07-21 15:50:59 [INFO][TRAIN] epoch: 2, iter: 9530/20000, loss: 0.4041, lr: 0.005586, batch_cost: 0.2418, reader_cost: 0.11759, ips: 16.5450 samples/sec | ETA 00:42:112022-07-21 15:51:02 [INFO][TRAIN] epoch: 2, iter: 9540/20000, loss: 0.3686, lr: 0.005581, batch_cost: 0.2354, reader_cost: 0.10995, ips: 16.9915 samples/sec | ETA 00:41:022022-07-21 15:51:04 [INFO][TRAIN] epoch: 2, iter: 9550/20000, loss: 0.3841, lr: 0.005576, batch_cost: 0.2310, reader_cost: 0.10705, ips: 17.3196 samples/sec | ETA 00:40:132022-07-21 15:51:06 [INFO][TRAIN] epoch: 2, iter: 9560/20000, loss: 0.3477, lr: 0.005571, batch_cost: 0.2485, reader_cost: 0.12605, ips: 16.0950 samples/sec | ETA 00:43:142022-07-21 15:51:09 [INFO][TRAIN] epoch: 2, iter: 9570/20000, loss: 0.3367, lr: 0.005566, batch_cost: 0.2214, reader_cost: 0.09981, ips: 18.0638 samples/sec | ETA 00:38:292022-07-21 15:51:11 [INFO][TRAIN] epoch: 2, iter: 9580/20000, loss: 0.2829, lr: 0.005561, batch_cost: 0.2722, reader_cost: 0.12292, ips: 14.6926 samples/sec | ETA 00:47:162022-07-21 15:51:14 [INFO][TRAIN] epoch: 2, iter: 9590/20000, loss: 0.3803, lr: 0.005557, batch_cost: 0.2274, reader_cost: 0.10367, ips: 17.5907 samples/sec | ETA 00:39:272022-07-21 15:51:16 [INFO][TRAIN] epoch: 2, iter: 9600/20000, loss: 0.3151, lr: 0.005552, batch_cost: 0.2375, reader_cost: 0.11134, ips: 16.8441 samples/sec | ETA 00:41:092022-07-21 15:51:18 [INFO][TRAIN] epoch: 2, iter: 9610/20000, loss: 0.3803, lr: 0.005547, batch_cost: 0.2342, reader_cost: 0.11246, ips: 17.0824 samples/sec | ETA 00:40:322022-07-21 15:51:21 [INFO][TRAIN] epoch: 2, iter: 9620/20000, loss: 0.4055, lr: 0.005542, batch_cost: 0.2347, reader_cost: 0.11084, ips: 17.0450 samples/sec | ETA 00:40:352022-07-21 15:51:23 [INFO][TRAIN] epoch: 2, iter: 9630/20000, loss: 0.3595, lr: 0.005537, batch_cost: 0.2347, reader_cost: 0.10829, ips: 17.0422 samples/sec | ETA 00:40:332022-07-21 15:51:26 [INFO][TRAIN] epoch: 2, iter: 9640/20000, loss: 0.3313, lr: 0.005533, batch_cost: 0.2593, reader_cost: 0.13146, ips: 15.4273 samples/sec | ETA 00:44:462022-07-21 15:51:28 [INFO][TRAIN] epoch: 2, iter: 9650/20000, loss: 0.3234, lr: 0.005528, batch_cost: 0.2241, reader_cost: 0.10134, ips: 17.8528 samples/sec | ETA 00:38:382022-07-21 15:51:30 [INFO][TRAIN] epoch: 2, iter: 9660/20000, 
loss: 0.2805, lr: 0.005523, batch_cost: 0.2413, reader_cost: 0.11405, ips: 16.5764 samples/sec | ETA 00:41:352022-07-21 15:51:33 [INFO][TRAIN] epoch: 2, iter: 9670/20000, loss: 0.3706, lr: 0.005518, batch_cost: 0.2376, reader_cost: 0.10820, ips: 16.8353 samples/sec | ETA 00:40:542022-07-21 15:51:35 [INFO][TRAIN] epoch: 2, iter: 9680/20000, loss: 0.3469, lr: 0.005513, batch_cost: 0.2386, reader_cost: 0.11377, ips: 16.7613 samples/sec | ETA 00:41:022022-07-21 15:51:37 [INFO][TRAIN] epoch: 2, iter: 9690/20000, loss: 0.3864, lr: 0.005509, batch_cost: 0.2255, reader_cost: 0.09816, ips: 17.7407 samples/sec | ETA 00:38:442022-07-21 15:51:40 [INFO][TRAIN] epoch: 2, iter: 9700/20000, loss: 0.2648, lr: 0.005504, batch_cost: 0.2294, reader_cost: 0.10375, ips: 17.4368 samples/sec | ETA 00:39:222022-07-21 15:51:43 [INFO][TRAIN] epoch: 2, iter: 9710/20000, loss: 0.3084, lr: 0.005499, batch_cost: 0.2924, reader_cost: 0.14242, ips: 13.6820 samples/sec | ETA 00:50:082022-07-21 15:51:47 [INFO][TRAIN] epoch: 2, iter: 9720/20000, loss: 0.2482, lr: 0.005494, batch_cost: 0.4534, reader_cost: 0.24070, ips: 8.8216 samples/sec | ETA 01:17:412022-07-21 15:51:50 [INFO][TRAIN] epoch: 2, iter: 9730/20000, loss: 0.3241, lr: 0.005489, batch_cost: 0.2865, reader_cost: 0.12204, ips: 13.9601 samples/sec | ETA 00:49:022022-07-21 15:51:52 [INFO][TRAIN] epoch: 2, iter: 9740/20000, loss: 0.3215, lr: 0.005485, batch_cost: 0.2376, reader_cost: 0.10781, ips: 16.8354 samples/sec | ETA 00:40:372022-07-21 15:51:55 [INFO][TRAIN] epoch: 2, iter: 9750/20000, loss: 0.3278, lr: 0.005480, batch_cost: 0.2538, reader_cost: 0.13096, ips: 15.7593 samples/sec | ETA 00:43:212022-07-21 15:51:57 [INFO][TRAIN] epoch: 2, iter: 9760/20000, loss: 0.3433, lr: 0.005475, batch_cost: 0.2389, reader_cost: 0.11304, ips: 16.7427 samples/sec | ETA 00:40:462022-07-21 15:52:00 [INFO][TRAIN] epoch: 2, iter: 9770/20000, loss: 0.3239, lr: 0.005470, batch_cost: 0.2266, reader_cost: 0.10443, ips: 17.6500 samples/sec | ETA 00:38:382022-07-21 15:52:02 [INFO][TRAIN] epoch: 2, iter: 9780/20000, loss: 0.3441, lr: 0.005465, batch_cost: 0.2384, reader_cost: 0.11380, ips: 16.7755 samples/sec | ETA 00:40:362022-07-21 15:52:04 [INFO][TRAIN] epoch: 2, iter: 9790/20000, loss: 0.3238, lr: 0.005461, batch_cost: 0.2428, reader_cost: 0.12142, ips: 16.4738 samples/sec | ETA 00:41:192022-07-21 15:52:07 [INFO][TRAIN] epoch: 2, iter: 9800/20000, loss: 0.4424, lr: 0.005456, batch_cost: 0.2271, reader_cost: 0.10291, ips: 17.6137 samples/sec | ETA 00:38:362022-07-21 15:52:09 [INFO][TRAIN] epoch: 2, iter: 9810/20000, loss: 0.4047, lr: 0.005451, batch_cost: 0.2239, reader_cost: 0.10091, ips: 17.8667 samples/sec | ETA 00:38:012022-07-21 15:52:11 [INFO][TRAIN] epoch: 2, iter: 9820/20000, loss: 0.2742, lr: 0.005446, batch_cost: 0.2292, reader_cost: 0.10405, ips: 17.4501 samples/sec | ETA 00:38:532022-07-21 15:52:14 [INFO][TRAIN] epoch: 2, iter: 9830/20000, loss: 0.4473, lr: 0.005441, batch_cost: 0.2813, reader_cost: 0.13170, ips: 14.2214 samples/sec | ETA 00:47:402022-07-21 15:52:16 [INFO][TRAIN] epoch: 2, iter: 9840/20000, loss: 0.2918, lr: 0.005436, batch_cost: 0.2409, reader_cost: 0.11454, ips: 16.6021 samples/sec | ETA 00:40:472022-07-21 15:52:19 [INFO][TRAIN] epoch: 2, iter: 9850/20000, loss: 0.3203, lr: 0.005432, batch_cost: 0.2265, reader_cost: 0.10607, ips: 17.6608 samples/sec | ETA 00:38:182022-07-21 15:52:21 [INFO][TRAIN] epoch: 2, iter: 9860/20000, loss: 0.2830, lr: 0.005427, batch_cost: 0.2316, reader_cost: 0.10870, ips: 17.2721 samples/sec | ETA 00:39:082022-07-21 15:52:23 
[INFO][TRAIN] epoch: 2, iter: 9870/20000, loss: 0.4303, lr: 0.005422, batch_cost: 0.2382, reader_cost: 0.11487, ips: 16.7905 samples/sec | ETA 00:40:132022-07-21 15:52:26 [INFO][TRAIN] epoch: 2, iter: 9880/20000, loss: 0.3192, lr: 0.005417, batch_cost: 0.2230, reader_cost: 0.09983, ips: 17.9351 samples/sec | ETA 00:37:372022-07-21 15:52:28 [INFO][TRAIN] epoch: 2, iter: 9890/20000, loss: 0.3606, lr: 0.005412, batch_cost: 0.2333, reader_cost: 0.11001, ips: 17.1450 samples/sec | ETA 00:39:182022-07-21 15:52:30 [INFO][TRAIN] epoch: 2, iter: 9900/20000, loss: 0.4961, lr: 0.005408, batch_cost: 0.2359, reader_cost: 0.11176, ips: 16.9560 samples/sec | ETA 00:39:422022-07-21 15:52:33 [INFO][TRAIN] epoch: 3, iter: 9910/20000, loss: 0.3578, lr: 0.005403, batch_cost: 0.2364, reader_cost: 0.11463, ips: 16.9240 samples/sec | ETA 00:39:442022-07-21 15:52:35 [INFO][TRAIN] epoch: 3, iter: 9920/20000, loss: 0.3289, lr: 0.005398, batch_cost: 0.2442, reader_cost: 0.11116, ips: 16.3774 samples/sec | ETA 00:41:012022-07-21 15:52:38 [INFO][TRAIN] epoch: 3, iter: 9930/20000, loss: 0.3882, lr: 0.005393, batch_cost: 0.2397, reader_cost: 0.11101, ips: 16.6864 samples/sec | ETA 00:40:132022-07-21 15:52:40 [INFO][TRAIN] epoch: 3, iter: 9940/20000, loss: 0.4135, lr: 0.005388, batch_cost: 0.2484, reader_cost: 0.12390, ips: 16.1000 samples/sec | ETA 00:41:392022-07-21 15:52:42 [INFO][TRAIN] epoch: 3, iter: 9950/20000, loss: 0.2918, lr: 0.005383, batch_cost: 0.2357, reader_cost: 0.10805, ips: 16.9740 samples/sec | ETA 00:39:282022-07-21 15:52:45 [INFO][TRAIN] epoch: 3, iter: 9960/20000, loss: 0.3873, lr: 0.005379, batch_cost: 0.2779, reader_cost: 0.14092, ips: 14.3958 samples/sec | ETA 00:46:292022-07-21 15:52:48 [INFO][TRAIN] epoch: 3, iter: 9970/20000, loss: 0.2681, lr: 0.005374, batch_cost: 0.3078, reader_cost: 0.15138, ips: 12.9947 samples/sec | ETA 00:51:272022-07-21 15:52:51 [INFO][TRAIN] epoch: 3, iter: 9980/20000, loss: 0.3009, lr: 0.005369, batch_cost: 0.2595, reader_cost: 0.12844, ips: 15.4138 samples/sec | ETA 00:43:202022-07-21 15:52:54 [INFO][TRAIN] epoch: 3, iter: 9990/20000, loss: 0.3348, lr: 0.005364, batch_cost: 0.3607, reader_cost: 0.17451, ips: 11.0892 samples/sec | ETA 01:00:102022-07-21 15:52:57 [INFO][TRAIN] epoch: 3, iter: 10000/20000, loss: 0.3239, lr: 0.005359, batch_cost: 0.2901, reader_cost: 0.15077, ips: 13.7867 samples/sec | ETA 00:48:212022-07-21 15:53:00 [INFO][TRAIN] epoch: 3, iter: 10010/20000, loss: 0.2764, lr: 0.005355, batch_cost: 0.2375, reader_cost: 0.11299, ips: 16.8409 samples/sec | ETA 00:39:322022-07-21 15:53:02 [INFO][TRAIN] epoch: 3, iter: 10020/20000, loss: 0.3123, lr: 0.005350, batch_cost: 0.2486, reader_cost: 0.12681, ips: 16.0912 samples/sec | ETA 00:41:202022-07-21 15:53:05 [INFO][TRAIN] epoch: 3, iter: 10030/20000, loss: 0.4088, lr: 0.005345, batch_cost: 0.2389, reader_cost: 0.11469, ips: 16.7464 samples/sec | ETA 00:39:412022-07-21 15:53:07 [INFO][TRAIN] epoch: 3, iter: 10040/20000, loss: 0.4949, lr: 0.005340, batch_cost: 0.2340, reader_cost: 0.11082, ips: 17.0952 samples/sec | ETA 00:38:502022-07-21 15:53:09 [INFO][TRAIN] epoch: 3, iter: 10050/20000, loss: 0.3278, lr: 0.005335, batch_cost: 0.2228, reader_cost: 0.10070, ips: 17.9558 samples/sec | ETA 00:36:562022-07-21 15:53:12 [INFO][TRAIN] epoch: 3, iter: 10060/20000, loss: 0.4127, lr: 0.005330, batch_cost: 0.2446, reader_cost: 0.11689, ips: 16.3563 samples/sec | ETA 00:40:302022-07-21 15:53:14 [INFO][TRAIN] epoch: 3, iter: 10070/20000, loss: 0.4148, lr: 0.005326, batch_cost: 0.2330, reader_cost: 0.10767, ips: 17.1692 
samples/sec | ETA 00:38:332022-07-21 15:53:16 [INFO][TRAIN] epoch: 3, iter: 10080/20000, loss: 0.3814, lr: 0.005321, batch_cost: 0.2506, reader_cost: 0.11859, ips: 15.9607 samples/sec | ETA 00:41:262022-07-21 15:53:19 [INFO][TRAIN] epoch: 3, iter: 10090/20000, loss: 0.4552, lr: 0.005316, batch_cost: 0.2461, reader_cost: 0.11514, ips: 16.2538 samples/sec | ETA 00:40:382022-07-21 15:53:21 [INFO][TRAIN] epoch: 3, iter: 10100/20000, loss: 0.3444, lr: 0.005311, batch_cost: 0.2309, reader_cost: 0.10600, ips: 17.3266 samples/sec | ETA 00:38:052022-07-21 15:53:24 [INFO][TRAIN] epoch: 3, iter: 10110/20000, loss: 0.3150, lr: 0.005306, batch_cost: 0.2322, reader_cost: 0.10636, ips: 17.2276 samples/sec | ETA 00:38:162022-07-21 15:53:26 [INFO][TRAIN] epoch: 3, iter: 10120/20000, loss: 0.3417, lr: 0.005301, batch_cost: 0.2222, reader_cost: 0.09689, ips: 18.0056 samples/sec | ETA 00:36:342022-07-21 15:53:28 [INFO][TRAIN] epoch: 3, iter: 10130/20000, loss: 0.3496, lr: 0.005297, batch_cost: 0.2202, reader_cost: 0.09236, ips: 18.1683 samples/sec | ETA 00:36:132022-07-21 15:53:30 [INFO][TRAIN] epoch: 3, iter: 10140/20000, loss: 0.3413, lr: 0.005292, batch_cost: 0.2240, reader_cost: 0.09447, ips: 17.8580 samples/sec | ETA 00:36:482022-07-21 15:53:33 [INFO][TRAIN] epoch: 3, iter: 10150/20000, loss: 0.2704, lr: 0.005287, batch_cost: 0.2354, reader_cost: 0.10732, ips: 16.9914 samples/sec | ETA 00:38:382022-07-21 15:53:35 [INFO][TRAIN] epoch: 3, iter: 10160/20000, loss: 0.3593, lr: 0.005282, batch_cost: 0.2290, reader_cost: 0.10392, ips: 17.4683 samples/sec | ETA 00:37:332022-07-21 15:53:37 [INFO][TRAIN] epoch: 3, iter: 10170/20000, loss: 0.3323, lr: 0.005277, batch_cost: 0.2395, reader_cost: 0.11741, ips: 16.7047 samples/sec | ETA 00:39:132022-07-21 15:53:39 [INFO][TRAIN] epoch: 3, iter: 10180/20000, loss: 0.4802, lr: 0.005272, batch_cost: 0.2095, reader_cost: 0.08479, ips: 19.0976 samples/sec | ETA 00:34:162022-07-21 15:53:42 [INFO][TRAIN] epoch: 3, iter: 10190/20000, loss: 0.4059, lr: 0.005268, batch_cost: 0.2396, reader_cost: 0.10577, ips: 16.6920 samples/sec | ETA 00:39:102022-07-21 15:53:44 [INFO][TRAIN] epoch: 3, iter: 10200/20000, loss: 0.4751, lr: 0.005263, batch_cost: 0.2310, reader_cost: 0.10209, ips: 17.3184 samples/sec | ETA 00:37:432022-07-21 15:53:46 [INFO][TRAIN] epoch: 3, iter: 10210/20000, loss: 0.3373, lr: 0.005258, batch_cost: 0.2412, reader_cost: 0.11351, ips: 16.5824 samples/sec | ETA 00:39:212022-07-21 15:53:49 [INFO][TRAIN] epoch: 3, iter: 10220/20000, loss: 0.3796, lr: 0.005253, batch_cost: 0.2572, reader_cost: 0.10753, ips: 15.5544 samples/sec | ETA 00:41:552022-07-21 15:53:52 [INFO][TRAIN] epoch: 3, iter: 10230/20000, loss: 0.2732, lr: 0.005248, batch_cost: 0.2907, reader_cost: 0.14148, ips: 13.7601 samples/sec | ETA 00:47:202022-07-21 15:53:54 [INFO][TRAIN] epoch: 3, iter: 10240/20000, loss: 0.4746, lr: 0.005243, batch_cost: 0.2260, reader_cost: 0.10285, ips: 17.6980 samples/sec | ETA 00:36:452022-07-21 15:53:57 [INFO][TRAIN] epoch: 3, iter: 10250/20000, loss: 0.3503, lr: 0.005239, batch_cost: 0.2385, reader_cost: 0.10945, ips: 16.7731 samples/sec | ETA 00:38:452022-07-21 15:54:00 [INFO][TRAIN] epoch: 3, iter: 10260/20000, loss: 0.3372, lr: 0.005234, batch_cost: 0.2941, reader_cost: 0.14212, ips: 13.6023 samples/sec | ETA 00:47:442022-07-21 15:54:03 [INFO][TRAIN] epoch: 3, iter: 10270/20000, loss: 0.3882, lr: 0.005229, batch_cost: 0.3358, reader_cost: 0.15928, ips: 11.9105 samples/sec | ETA 00:54:272022-07-21 15:54:05 [INFO][TRAIN] epoch: 3, iter: 10280/20000, loss: 0.2789, lr: 
0.005224, batch_cost: 0.2406, reader_cost: 0.11797, ips: 16.6285 samples/sec | ETA 00:38:582022-07-21 15:54:08 [INFO][TRAIN] epoch: 3, iter: 10290/20000, loss: 0.5647, lr: 0.005219, batch_cost: 0.2501, reader_cost: 0.12470, ips: 15.9940 samples/sec | ETA 00:40:282022-07-21 15:54:10 [INFO][TRAIN] epoch: 3, iter: 10300/20000, loss: 0.3400, lr: 0.005214, batch_cost: 0.2398, reader_cost: 0.11780, ips: 16.6792 samples/sec | ETA 00:38:462022-07-21 15:54:12 [INFO][TRAIN] epoch: 3, iter: 10310/20000, loss: 0.3769, lr: 0.005210, batch_cost: 0.2310, reader_cost: 0.10378, ips: 17.3141 samples/sec | ETA 00:37:182022-07-21 15:54:15 [INFO][TRAIN] epoch: 3, iter: 10320/20000, loss: 0.3506, lr: 0.005205, batch_cost: 0.2356, reader_cost: 0.10864, ips: 16.9767 samples/sec | ETA 00:38:002022-07-21 15:54:17 [INFO][TRAIN] epoch: 3, iter: 10330/20000, loss: 0.3014, lr: 0.005200, batch_cost: 0.2316, reader_cost: 0.10440, ips: 17.2733 samples/sec | ETA 00:37:192022-07-21 15:54:20 [INFO][TRAIN] epoch: 3, iter: 10340/20000, loss: 0.2694, lr: 0.005195, batch_cost: 0.2560, reader_cost: 0.11977, ips: 15.6275 samples/sec | ETA 00:41:122022-07-21 15:54:22 [INFO][TRAIN] epoch: 3, iter: 10350/20000, loss: 0.4047, lr: 0.005190, batch_cost: 0.2604, reader_cost: 0.11892, ips: 15.3624 samples/sec | ETA 00:41:522022-07-21 15:54:25 [INFO][TRAIN] epoch: 3, iter: 10360/20000, loss: 0.2904, lr: 0.005185, batch_cost: 0.2414, reader_cost: 0.11777, ips: 16.5733 samples/sec | ETA 00:38:462022-07-21 15:54:27 [INFO][TRAIN] epoch: 3, iter: 10370/20000, loss: 0.3791, lr: 0.005181, batch_cost: 0.2288, reader_cost: 0.10654, ips: 17.4815 samples/sec | ETA 00:36:432022-07-21 15:54:29 [INFO][TRAIN] epoch: 3, iter: 10380/20000, loss: 0.3920, lr: 0.005176, batch_cost: 0.2410, reader_cost: 0.11813, ips: 16.5955 samples/sec | ETA 00:38:382022-07-21 15:54:32 [INFO][TRAIN] epoch: 3, iter: 10390/20000, loss: 0.3336, lr: 0.005171, batch_cost: 0.2230, reader_cost: 0.09472, ips: 17.9339 samples/sec | ETA 00:35:432022-07-21 15:54:34 [INFO][TRAIN] epoch: 3, iter: 10400/20000, loss: 0.3088, lr: 0.005166, batch_cost: 0.2232, reader_cost: 0.09979, ips: 17.9245 samples/sec | ETA 00:35:422022-07-21 15:54:36 [INFO][TRAIN] epoch: 3, iter: 10410/20000, loss: 0.3351, lr: 0.005161, batch_cost: 0.2323, reader_cost: 0.10919, ips: 17.2224 samples/sec | ETA 00:37:072022-07-21 15:54:39 [INFO][TRAIN] epoch: 3, iter: 10420/20000, loss: 0.3815, lr: 0.005156, batch_cost: 0.2383, reader_cost: 0.11296, ips: 16.7848 samples/sec | ETA 00:38:032022-07-21 15:54:41 [INFO][TRAIN] epoch: 3, iter: 10430/20000, loss: 0.2595, lr: 0.005152, batch_cost: 0.2350, reader_cost: 0.11235, ips: 17.0211 samples/sec | ETA 00:37:282022-07-21 15:54:43 [INFO][TRAIN] epoch: 3, iter: 10440/20000, loss: 0.3088, lr: 0.005147, batch_cost: 0.2350, reader_cost: 0.11090, ips: 17.0202 samples/sec | ETA 00:37:262022-07-21 15:54:46 [INFO][TRAIN] epoch: 3, iter: 10450/20000, loss: 0.4280, lr: 0.005142, batch_cost: 0.2510, reader_cost: 0.12292, ips: 15.9372 samples/sec | ETA 00:39:562022-07-21 15:54:48 [INFO][TRAIN] epoch: 3, iter: 10460/20000, loss: 0.3732, lr: 0.005137, batch_cost: 0.2471, reader_cost: 0.12032, ips: 16.1857 samples/sec | ETA 00:39:172022-07-21 15:54:51 [INFO][TRAIN] epoch: 3, iter: 10470/20000, loss: 0.4362, lr: 0.005132, batch_cost: 0.2267, reader_cost: 0.10085, ips: 17.6419 samples/sec | ETA 00:36:002022-07-21 15:54:54 [INFO][TRAIN] epoch: 3, iter: 10480/20000, loss: 0.3784, lr: 0.005127, batch_cost: 0.3331, reader_cost: 0.17735, ips: 12.0078 samples/sec | ETA 00:52:512022-07-21 15:54:57 
[INFO][TRAIN] epoch: 3, iter: 10490/20000, loss: 0.4448, lr: 0.005122, batch_cost: 0.2652, reader_cost: 0.13542, ips: 15.0831 samples/sec | ETA 00:42:022022-07-21 15:54:59 [INFO][TRAIN] epoch: 3, iter: 10500/20000, loss: 0.4418, lr: 0.005118, batch_cost: 0.2290, reader_cost: 0.10606, ips: 17.4706 samples/sec | ETA 00:36:152022-07-21 15:55:01 [INFO][TRAIN] epoch: 3, iter: 10510/20000, loss: 0.5078, lr: 0.005113, batch_cost: 0.2308, reader_cost: 0.10844, ips: 17.3320 samples/sec | ETA 00:36:302022-07-21 15:55:04 [INFO][TRAIN] epoch: 3, iter: 10520/20000, loss: 0.2750, lr: 0.005108, batch_cost: 0.2409, reader_cost: 0.11809, ips: 16.6065 samples/sec | ETA 00:38:032022-07-21 15:55:06 [INFO][TRAIN] epoch: 3, iter: 10530/20000, loss: 0.4096, lr: 0.005103, batch_cost: 0.2558, reader_cost: 0.11244, ips: 15.6394 samples/sec | ETA 00:40:222022-07-21 15:55:10 [INFO][TRAIN] epoch: 3, iter: 10540/20000, loss: 0.4072, lr: 0.005098, batch_cost: 0.3609, reader_cost: 0.15664, ips: 11.0831 samples/sec | ETA 00:56:542022-07-21 15:55:12 [INFO][TRAIN] epoch: 3, iter: 10550/20000, loss: 0.3686, lr: 0.005093, batch_cost: 0.2410, reader_cost: 0.11334, ips: 16.6008 samples/sec | ETA 00:37:572022-07-21 15:55:14 [INFO][TRAIN] epoch: 3, iter: 10560/20000, loss: 0.4262, lr: 0.005088, batch_cost: 0.2265, reader_cost: 0.09857, ips: 17.6614 samples/sec | ETA 00:35:372022-07-21 15:55:17 [INFO][TRAIN] epoch: 3, iter: 10570/20000, loss: 0.2958, lr: 0.005084, batch_cost: 0.2283, reader_cost: 0.10472, ips: 17.5177 samples/sec | ETA 00:35:532022-07-21 15:55:19 [INFO][TRAIN] epoch: 3, iter: 10580/20000, loss: 0.3897, lr: 0.005079, batch_cost: 0.2086, reader_cost: 0.08689, ips: 19.1788 samples/sec | ETA 00:32:442022-07-21 15:55:21 [INFO][TRAIN] epoch: 3, iter: 10590/20000, loss: 0.3123, lr: 0.005074, batch_cost: 0.2475, reader_cost: 0.12526, ips: 16.1601 samples/sec | ETA 00:38:492022-07-21 15:55:24 [INFO][TRAIN] epoch: 3, iter: 10600/20000, loss: 0.4368, lr: 0.005069, batch_cost: 0.2748, reader_cost: 0.13262, ips: 14.5583 samples/sec | ETA 00:43:022022-07-21 15:55:26 [INFO][TRAIN] epoch: 3, iter: 10610/20000, loss: 0.3426, lr: 0.005064, batch_cost: 0.2439, reader_cost: 0.11943, ips: 16.4035 samples/sec | ETA 00:38:092022-07-21 15:55:29 [INFO][TRAIN] epoch: 3, iter: 10620/20000, loss: 0.3202, lr: 0.005059, batch_cost: 0.2388, reader_cost: 0.11636, ips: 16.7510 samples/sec | ETA 00:37:192022-07-21 15:55:31 [INFO][TRAIN] epoch: 3, iter: 10630/20000, loss: 0.2912, lr: 0.005055, batch_cost: 0.2378, reader_cost: 0.11496, ips: 16.8231 samples/sec | ETA 00:37:072022-07-21 15:55:34 [INFO][TRAIN] epoch: 3, iter: 10640/20000, loss: 0.3233, lr: 0.005050, batch_cost: 0.2386, reader_cost: 0.11546, ips: 16.7642 samples/sec | ETA 00:37:132022-07-21 15:55:36 [INFO][TRAIN] epoch: 3, iter: 10650/20000, loss: 0.3079, lr: 0.005045, batch_cost: 0.2278, reader_cost: 0.10596, ips: 17.5566 samples/sec | ETA 00:35:302022-07-21 15:55:38 [INFO][TRAIN] epoch: 3, iter: 10660/20000, loss: 0.3593, lr: 0.005040, batch_cost: 0.2401, reader_cost: 0.11651, ips: 16.6612 samples/sec | ETA 00:37:222022-07-21 15:55:41 [INFO][TRAIN] epoch: 3, iter: 10670/20000, loss: 0.4755, lr: 0.005035, batch_cost: 0.2249, reader_cost: 0.10180, ips: 17.7819 samples/sec | ETA 00:34:582022-07-21 15:55:43 [INFO][TRAIN] epoch: 3, iter: 10680/20000, loss: 0.4357, lr: 0.005030, batch_cost: 0.2271, reader_cost: 0.09812, ips: 17.6121 samples/sec | ETA 00:35:162022-07-21 15:55:45 [INFO][TRAIN] epoch: 3, iter: 10690/20000, loss: 0.3285, lr: 0.005025, batch_cost: 0.2585, reader_cost: 0.12330, 
ips: 15.4711 samples/sec | ETA 00:40:072022-07-21 15:55:48 [INFO][TRAIN] epoch: 3, iter: 10700/20000, loss: 0.3483, lr: 0.005021, batch_cost: 0.2360, reader_cost: 0.10595, ips: 16.9480 samples/sec | ETA 00:36:342022-07-21 15:55:50 [INFO][TRAIN] epoch: 3, iter: 10710/20000, loss: 0.3591, lr: 0.005016, batch_cost: 0.2230, reader_cost: 0.09556, ips: 17.9350 samples/sec | ETA 00:34:312022-07-21 15:55:52 [INFO][TRAIN] epoch: 3, iter: 10720/20000, loss: 0.3940, lr: 0.005011, batch_cost: 0.2351, reader_cost: 0.10884, ips: 17.0162 samples/sec | ETA 00:36:212022-07-21 15:55:55 [INFO][TRAIN] epoch: 3, iter: 10730/20000, loss: 0.2967, lr: 0.005006, batch_cost: 0.2581, reader_cost: 0.09984, ips: 15.4986 samples/sec | ETA 00:39:522022-07-21 15:55:58 [INFO][TRAIN] epoch: 3, iter: 10740/20000, loss: 0.4202, lr: 0.005001, batch_cost: 0.3204, reader_cost: 0.14373, ips: 12.4847 samples/sec | ETA 00:49:262022-07-21 15:56:00 [INFO][TRAIN] epoch: 3, iter: 10750/20000, loss: 0.3601, lr: 0.004996, batch_cost: 0.2311, reader_cost: 0.10619, ips: 17.3072 samples/sec | ETA 00:35:372022-07-21 15:56:03 [INFO][TRAIN] epoch: 3, iter: 10760/20000, loss: 0.3168, lr: 0.004991, batch_cost: 0.2322, reader_cost: 0.10846, ips: 17.2287 samples/sec | ETA 00:35:452022-07-21 15:56:05 [INFO][TRAIN] epoch: 3, iter: 10770/20000, loss: 0.2984, lr: 0.004987, batch_cost: 0.2201, reader_cost: 0.09724, ips: 18.1719 samples/sec | ETA 00:33:512022-07-21 15:56:07 [INFO][TRAIN] epoch: 3, iter: 10780/20000, loss: 0.4584, lr: 0.004982, batch_cost: 0.2130, reader_cost: 0.08798, ips: 18.7773 samples/sec | ETA 00:32:442022-07-21 15:56:09 [INFO][TRAIN] epoch: 3, iter: 10790/20000, loss: 0.4133, lr: 0.004977, batch_cost: 0.2290, reader_cost: 0.10402, ips: 17.4689 samples/sec | ETA 00:35:082022-07-21 15:56:12 [INFO][TRAIN] epoch: 3, iter: 10800/20000, loss: 0.3553, lr: 0.004972, batch_cost: 0.2545, reader_cost: 0.12877, ips: 15.7181 samples/sec | ETA 00:39:012022-07-21 15:56:15 [INFO][TRAIN] epoch: 3, iter: 10810/20000, loss: 0.4532, lr: 0.004967, batch_cost: 0.3258, reader_cost: 0.16190, ips: 12.2783 samples/sec | ETA 00:49:532022-07-21 15:56:18 [INFO][TRAIN] epoch: 3, iter: 10820/20000, loss: 0.5089, lr: 0.004962, batch_cost: 0.3150, reader_cost: 0.14615, ips: 12.6987 samples/sec | ETA 00:48:112022-07-21 15:56:21 [INFO][TRAIN] epoch: 3, iter: 10830/20000, loss: 0.3970, lr: 0.004957, batch_cost: 0.2382, reader_cost: 0.11346, ips: 16.7921 samples/sec | ETA 00:36:242022-07-21 15:56:23 [INFO][TRAIN] epoch: 3, iter: 10840/20000, loss: 0.4134, lr: 0.004952, batch_cost: 0.2375, reader_cost: 0.11493, ips: 16.8393 samples/sec | ETA 00:36:152022-07-21 15:56:26 [INFO][TRAIN] epoch: 3, iter: 10850/20000, loss: 0.3937, lr: 0.004948, batch_cost: 0.2406, reader_cost: 0.11623, ips: 16.6255 samples/sec | ETA 00:36:412022-07-21 15:56:28 [INFO][TRAIN] epoch: 3, iter: 10860/20000, loss: 0.2770, lr: 0.004943, batch_cost: 0.2811, reader_cost: 0.13504, ips: 14.2323 samples/sec | ETA 00:42:482022-07-21 15:56:31 [INFO][TRAIN] epoch: 3, iter: 10870/20000, loss: 0.3401, lr: 0.004938, batch_cost: 0.2472, reader_cost: 0.12098, ips: 16.1807 samples/sec | ETA 00:37:372022-07-21 15:56:33 [INFO][TRAIN] epoch: 3, iter: 10880/20000, loss: 0.3897, lr: 0.004933, batch_cost: 0.2374, reader_cost: 0.11579, ips: 16.8493 samples/sec | ETA 00:36:052022-07-21 15:56:36 [INFO][TRAIN] epoch: 3, iter: 10890/20000, loss: 0.3537, lr: 0.004928, batch_cost: 0.2343, reader_cost: 0.11153, ips: 17.0733 samples/sec | ETA 00:35:342022-07-21 15:56:38 [INFO][TRAIN] epoch: 3, iter: 10900/20000, loss: 
0.2812, lr: 0.004923, batch_cost: 0.2284, reader_cost: 0.10619, ips: 17.5164 samples/sec | ETA 00:34:382022-07-21 15:56:40 [INFO][TRAIN] epoch: 3, iter: 10910/20000, loss: 0.2629, lr: 0.004918, batch_cost: 0.2351, reader_cost: 0.11163, ips: 17.0144 samples/sec | ETA 00:35:372022-07-21 15:56:43 [INFO][TRAIN] epoch: 3, iter: 10920/20000, loss: 0.3870, lr: 0.004914, batch_cost: 0.2523, reader_cost: 0.11973, ips: 15.8538 samples/sec | ETA 00:38:102022-07-21 15:56:45 [INFO][TRAIN] epoch: 3, iter: 10930/20000, loss: 0.4101, lr: 0.004909, batch_cost: 0.2524, reader_cost: 0.12279, ips: 15.8486 samples/sec | ETA 00:38:092022-07-21 15:56:48 [INFO][TRAIN] epoch: 3, iter: 10940/20000, loss: 0.3804, lr: 0.004904, batch_cost: 0.2538, reader_cost: 0.12336, ips: 15.7621 samples/sec | ETA 00:38:192022-07-21 15:56:50 [INFO][TRAIN] epoch: 3, iter: 10950/20000, loss: 0.3426, lr: 0.004899, batch_cost: 0.2415, reader_cost: 0.11585, ips: 16.5661 samples/sec | ETA 00:36:252022-07-21 15:56:52 [INFO][TRAIN] epoch: 3, iter: 10960/20000, loss: 0.3328, lr: 0.004894, batch_cost: 0.2280, reader_cost: 0.10674, ips: 17.5467 samples/sec | ETA 00:34:202022-07-21 15:56:55 [INFO][TRAIN] epoch: 3, iter: 10970/20000, loss: 0.4474, lr: 0.004889, batch_cost: 0.2472, reader_cost: 0.11906, ips: 16.1843 samples/sec | ETA 00:37:112022-07-21 15:56:57 [INFO][TRAIN] epoch: 3, iter: 10980/20000, loss: 0.2909, lr: 0.004884, batch_cost: 0.2302, reader_cost: 0.10617, ips: 17.3749 samples/sec | ETA 00:34:362022-07-21 15:57:00 [INFO][TRAIN] epoch: 3, iter: 10990/20000, loss: 0.4413, lr: 0.004879, batch_cost: 0.3208, reader_cost: 0.15816, ips: 12.4702 samples/sec | ETA 00:48:102022-07-21 15:57:03 [INFO][TRAIN] epoch: 3, iter: 11000/20000, loss: 0.3464, lr: 0.004875, batch_cost: 0.2964, reader_cost: 0.13717, ips: 13.4961 samples/sec | ETA 00:44:272022-07-21 15:57:06 [INFO][TRAIN] epoch: 3, iter: 11010/20000, loss: 0.3409, lr: 0.004870, batch_cost: 0.2245, reader_cost: 0.10200, ips: 17.8149 samples/sec | ETA 00:33:382022-07-21 15:57:08 [INFO][TRAIN] epoch: 3, iter: 11020/20000, loss: 0.3205, lr: 0.004865, batch_cost: 0.2352, reader_cost: 0.11222, ips: 17.0040 samples/sec | ETA 00:35:122022-07-21 15:57:10 [INFO][TRAIN] epoch: 3, iter: 11030/20000, loss: 0.3511, lr: 0.004860, batch_cost: 0.2361, reader_cost: 0.11405, ips: 16.9453 samples/sec | ETA 00:35:172022-07-21 15:57:13 [INFO][TRAIN] epoch: 3, iter: 11040/20000, loss: 0.3380, lr: 0.004855, batch_cost: 0.2315, reader_cost: 0.10828, ips: 17.2815 samples/sec | ETA 00:34:332022-07-21 15:57:15 [INFO][TRAIN] epoch: 3, iter: 11050/20000, loss: 0.4317, lr: 0.004850, batch_cost: 0.2415, reader_cost: 0.11677, ips: 16.5648 samples/sec | ETA 00:36:012022-07-21 15:57:17 [INFO][TRAIN] epoch: 3, iter: 11060/20000, loss: 0.3769, lr: 0.004845, batch_cost: 0.2237, reader_cost: 0.10177, ips: 17.8787 samples/sec | ETA 00:33:202022-07-21 15:57:20 [INFO][TRAIN] epoch: 3, iter: 11070/20000, loss: 0.4009, lr: 0.004840, batch_cost: 0.2288, reader_cost: 0.09976, ips: 17.4833 samples/sec | ETA 00:34:032022-07-21 15:57:24 [INFO][TRAIN] epoch: 3, iter: 11080/20000, loss: 0.3407, lr: 0.004836, batch_cost: 0.3922, reader_cost: 0.20090, ips: 10.1988 samples/sec | ETA 00:58:182022-07-21 15:57:26 [INFO][TRAIN] epoch: 3, iter: 11090/20000, loss: 0.3852, lr: 0.004831, batch_cost: 0.2595, reader_cost: 0.11877, ips: 15.4113 samples/sec | ETA 00:38:322022-07-21 15:57:28 [INFO][TRAIN] epoch: 3, iter: 11100/20000, loss: 0.2889, lr: 0.004826, batch_cost: 0.2349, reader_cost: 0.10722, ips: 17.0307 samples/sec | ETA 00:34:502022-07-21 
15:57:31 [INFO][TRAIN] epoch: 3, iter: 11110/20000, loss: 0.2835, lr: 0.004821, batch_cost: 0.2804, reader_cost: 0.13581, ips: 14.2651 samples/sec | ETA 00:41:322022-07-21 15:57:34 [INFO][TRAIN] epoch: 3, iter: 11120/20000, loss: 0.4029, lr: 0.004816, batch_cost: 0.2270, reader_cost: 0.10339, ips: 17.6192 samples/sec | ETA 00:33:352022-07-21 15:57:36 [INFO][TRAIN] epoch: 3, iter: 11130/20000, loss: 0.3379, lr: 0.004811, batch_cost: 0.2266, reader_cost: 0.10365, ips: 17.6507 samples/sec | ETA 00:33:302022-07-21 15:57:38 [INFO][TRAIN] epoch: 3, iter: 11140/20000, loss: 0.3615, lr: 0.004806, batch_cost: 0.2238, reader_cost: 0.10295, ips: 17.8735 samples/sec | ETA 00:33:022022-07-21 15:57:40 [INFO][TRAIN] epoch: 3, iter: 11150/20000, loss: 0.3973, lr: 0.004801, batch_cost: 0.2343, reader_cost: 0.10983, ips: 17.0699 samples/sec | ETA 00:34:332022-07-21 15:57:43 [INFO][TRAIN] epoch: 3, iter: 11160/20000, loss: 0.3186, lr: 0.004796, batch_cost: 0.2280, reader_cost: 0.10207, ips: 17.5435 samples/sec | ETA 00:33:352022-07-21 15:57:45 [INFO][TRAIN] epoch: 3, iter: 11170/20000, loss: 0.3561, lr: 0.004792, batch_cost: 0.2598, reader_cost: 0.13363, ips: 15.3943 samples/sec | ETA 00:38:142022-07-21 15:57:48 [INFO][TRAIN] epoch: 3, iter: 11180/20000, loss: 0.3956, lr: 0.004787, batch_cost: 0.2450, reader_cost: 0.11123, ips: 16.3279 samples/sec | ETA 00:36:002022-07-21 15:57:50 [INFO][TRAIN] epoch: 3, iter: 11190/20000, loss: 0.3258, lr: 0.004782, batch_cost: 0.2264, reader_cost: 0.09722, ips: 17.6708 samples/sec | ETA 00:33:142022-07-21 15:57:52 [INFO][TRAIN] epoch: 3, iter: 11200/20000, loss: 0.3449, lr: 0.004777, batch_cost: 0.2251, reader_cost: 0.10289, ips: 17.7662 samples/sec | ETA 00:33:012022-07-21 15:57:55 [INFO][TRAIN] epoch: 3, iter: 11210/20000, loss: 0.3543, lr: 0.004772, batch_cost: 0.2351, reader_cost: 0.11186, ips: 17.0167 samples/sec | ETA 00:34:262022-07-21 15:57:57 [INFO][TRAIN] epoch: 3, iter: 11220/20000, loss: 0.3273, lr: 0.004767, batch_cost: 0.2311, reader_cost: 0.10625, ips: 17.3112 samples/sec | ETA 00:33:482022-07-21 15:57:59 [INFO][TRAIN] epoch: 3, iter: 11230/20000, loss: 0.2933, lr: 0.004762, batch_cost: 0.2267, reader_cost: 0.10459, ips: 17.6409 samples/sec | ETA 00:33:082022-07-21 15:58:02 [INFO][TRAIN] epoch: 3, iter: 11240/20000, loss: 0.3187, lr: 0.004757, batch_cost: 0.2527, reader_cost: 0.12374, ips: 15.8276 samples/sec | ETA 00:36:532022-07-21 15:58:05 [INFO][TRAIN] epoch: 3, iter: 11250/20000, loss: 0.4843, lr: 0.004753, batch_cost: 0.3389, reader_cost: 0.16831, ips: 11.8028 samples/sec | ETA 00:49:252022-07-21 15:58:08 [INFO][TRAIN] epoch: 3, iter: 11260/20000, loss: 0.3408, lr: 0.004748, batch_cost: 0.2812, reader_cost: 0.13217, ips: 14.2244 samples/sec | ETA 00:40:572022-07-21 15:58:10 [INFO][TRAIN] epoch: 3, iter: 11270/20000, loss: 0.3243, lr: 0.004743, batch_cost: 0.2385, reader_cost: 0.10677, ips: 16.7682 samples/sec | ETA 00:34:422022-07-21 15:58:13 [INFO][TRAIN] epoch: 3, iter: 11280/20000, loss: 0.3574, lr: 0.004738, batch_cost: 0.2384, reader_cost: 0.11048, ips: 16.7754 samples/sec | ETA 00:34:392022-07-21 15:58:15 [INFO][TRAIN] epoch: 3, iter: 11290/20000, loss: 0.3157, lr: 0.004733, batch_cost: 0.2441, reader_cost: 0.11981, ips: 16.3860 samples/sec | ETA 00:35:262022-07-21 15:58:17 [INFO][TRAIN] epoch: 3, iter: 11300/20000, loss: 0.5059, lr: 0.004728, batch_cost: 0.2214, reader_cost: 0.09555, ips: 18.0666 samples/sec | ETA 00:32:062022-07-21 15:58:20 [INFO][TRAIN] epoch: 3, iter: 11310/20000, loss: 0.4382, lr: 0.004723, batch_cost: 0.2311, reader_cost: 
0.10699, ips: 17.3106 samples/sec | ETA 00:33:28
2022-07-21 15:58:22 [INFO][TRAIN] epoch: 3, iter: 11320/20000, loss: 0.2926, lr: 0.004718, batch_cost: 0.2631, reader_cost: 0.13541, ips: 15.2007 samples/sec | ETA 00:38:04
2022-07-21 15:58:25 [INFO][TRAIN] epoch: 3, iter: 11330/20000, loss: 0.3461, lr: 0.004713, batch_cost: 0.2410, reader_cost: 0.11314, ips: 16.5994 samples/sec | ETA 00:34:49
……（iter 11340～11990 的训练日志格式相同，损失大致在 0.24～0.68 之间波动，此处从略）……
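日志中各字段可以这样理解：batch_cost 是单个 iter 的总耗时（秒），reader_cost 是其中数据读取部分的耗时，ips 是每秒处理的样本数，ETA 是按当前 batch_cost 估算的剩余训练时间。下面给出一个简化的换算示意（假设 batch_size 为 4，这一假设与日志中 ips ≈ 4 / batch_cost 的关系一致，仅作说明，并非训练脚本的源码）：

```python
# 简化示意：根据训练日志字段估算吞吐量与剩余时间（batch_size=4 为推测值，仅作说明）
batch_size = 4
batch_cost = 0.2631              # 单个 iter 耗时（秒），取自 iter 11320 的日志
cur_iter, total_iter = 11320, 20000

ips = batch_size / batch_cost                     # ≈ 15.2 samples/sec，与日志一致
eta_seconds = (total_iter - cur_iter) * batch_cost
print(f"ips: {ips:.4f}, ETA: {eta_seconds / 60:.1f} min")  # ≈ 38 min，与日志中 ETA 00:38:04 接近
```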
2022-07-21 16:01:13 [INFO][TRAIN] epoch: 3, iter: 12000/20000, loss: 0.4184, lr: 0.004384, batch_cost: 0.2673, reader_cost: 0.12172, ips: 14.9621 samples/sec | ETA 00:35:38
2022-07-21 16:01:13 [INFO] Start evaluating (total_samples: 7361, total_iters: 7361)...
7361/7361 [==============================] - 489s 66ms/step - batch_cost: 0.0661 - reader cost: 0.0071
……（验证过程的进度条输出从略）……
2022-07-21 16:09:22 [INFO][EVAL] #Images: 7361 mIoU: 0.1177 Acc: 0.9842 Kappa: 0.4602 Dice: 0.1504
2022-07-21 16:09:22 [INFO][EVAL] Class IoU: [0.9862 0.1398 0.0514 0.2486 0.0168 0. 0. 0. 0.3248 0.0074 0. 0. 0. 0.0029 0. 0. 0. 0. 0. 0.5755]
2022-07-21 16:09:22 [INFO][EVAL] Class Precision: [0.989 0.4111 0.4251 0.5449 0.2797 0. 0. 0. 0.5079 0.152 0. 0. 0. 0.1955 0. 0. 0. 0. 0. 0.6405]
2022-07-21 16:09:22 [INFO][EVAL] Class Recall: [0.9971 0.1748 0.0552 0.3138 0.0176 0. 0. 0. 0.474 0.0077 0. 0. 0. 0.0029 0. 0. 0. 0. 0. 0.8501]
2022-07-21 16:09:22 [INFO][EVAL] The model with the best validation mIoU (0.1177) was saved at iter 12000.
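评估输出中的 mIoU、Acc、Class IoU、Class Precision、Class Recall 等指标都可以由预测结果与标注之间的混淆矩阵直接算出，例如本次评估各类 IoU 的平均值即为 0.1177。下面用 NumPy 给出一个简化的计算示意（仅演示各指标的定义，并非 PaddleSeg 的实际实现）：

```python
import numpy as np

def segmentation_metrics(pred, label, num_classes):
    """简化示意：由展平后的预测/标注类别索引数组计算分割指标。"""
    # 混淆矩阵：行为标注类别，列为预测类别
    conf = np.bincount(label * num_classes + pred,
                       minlength=num_classes ** 2).reshape(num_classes, num_classes)
    inter = np.diag(conf)                    # 每个类别预测正确的像素数
    pred_area = conf.sum(axis=0)             # 每个类别被预测为该类的像素数
    label_area = conf.sum(axis=1)            # 每个类别的标注像素数
    union = pred_area + label_area - inter

    class_iou = inter / np.maximum(union, 1)           # 对应日志中的 Class IoU
    miou = class_iou.mean()                            # mIoU：各类 IoU 的平均值
    acc = inter.sum() / conf.sum()                      # 像素准确率 Acc
    class_precision = inter / np.maximum(pred_area, 1)  # 对应 Class Precision
    class_recall = inter / np.maximum(label_area, 1)    # 对应 Class Recall
    return miou, acc, class_iou, class_precision, class_recall
```

可以看到，虽然整体像素准确率 Acc 高达 0.9842（背景类占绝大多数像素），但多数小类别的 IoU 仍接近 0，因此 mIoU 只有 0.1177，这也是车道线分割任务中更应关注 mIoU 而非 Acc 的原因。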
2022-07-21 16:09:24 [INFO][TRAIN] epoch: 3, iter: 12010/20000, loss: 0.3974, lr: 0.004379, batch_cost: 0.1724, reader_cost: 0.04518, ips: 23.2063 samples/sec | ETA 00:22:57
……（iter 12020～13640 的训练日志格式相同，损失大致在 0.23～0.57 之间波动，此处从略）……
2022-07-21 16:16:16 [INFO][TRAIN] epoch: 3, iter: 13650/20000, loss: 0.2559, lr:
0.003561, batch_cost: 0.2360, reader_cost: 0.11158, ips: 16.9481 samples/sec | ETA 00:24:582022-07-21 16:16:18 [INFO][TRAIN] epoch: 3, iter: 13660/20000, loss: 0.3401, lr: 0.003556, batch_cost: 0.2319, reader_cost: 0.10788, ips: 17.2471 samples/sec | ETA 00:24:302022-07-21 16:16:21 [INFO][TRAIN] epoch: 3, iter: 13670/20000, loss: 0.3627, lr: 0.003551, batch_cost: 0.2444, reader_cost: 0.12086, ips: 16.3700 samples/sec | ETA 00:25:462022-07-21 16:16:24 [INFO][TRAIN] epoch: 3, iter: 13680/20000, loss: 0.2888, lr: 0.003546, batch_cost: 0.2658, reader_cost: 0.12308, ips: 15.0485 samples/sec | ETA 00:27:592022-07-21 16:16:26 [INFO][TRAIN] epoch: 3, iter: 13690/20000, loss: 0.3825, lr: 0.003541, batch_cost: 0.2664, reader_cost: 0.12257, ips: 15.0146 samples/sec | ETA 00:28:012022-07-21 16:16:28 [INFO][TRAIN] epoch: 3, iter: 13700/20000, loss: 0.2699, lr: 0.003536, batch_cost: 0.2215, reader_cost: 0.09976, ips: 18.0555 samples/sec | ETA 00:23:152022-07-21 16:16:31 [INFO][TRAIN] epoch: 3, iter: 13710/20000, loss: 0.4682, lr: 0.003531, batch_cost: 0.2364, reader_cost: 0.11247, ips: 16.9210 samples/sec | ETA 00:24:462022-07-21 16:16:33 [INFO][TRAIN] epoch: 3, iter: 13720/20000, loss: 0.3835, lr: 0.003526, batch_cost: 0.2296, reader_cost: 0.10620, ips: 17.4219 samples/sec | ETA 00:24:012022-07-21 16:16:35 [INFO][TRAIN] epoch: 3, iter: 13730/20000, loss: 0.2929, lr: 0.003521, batch_cost: 0.2191, reader_cost: 0.09589, ips: 18.2535 samples/sec | ETA 00:22:532022-07-21 16:16:38 [INFO][TRAIN] epoch: 3, iter: 13740/20000, loss: 0.3053, lr: 0.003516, batch_cost: 0.2290, reader_cost: 0.10539, ips: 17.4668 samples/sec | ETA 00:23:532022-07-21 16:16:40 [INFO][TRAIN] epoch: 3, iter: 13750/20000, loss: 0.2821, lr: 0.003511, batch_cost: 0.2332, reader_cost: 0.11000, ips: 17.1519 samples/sec | ETA 00:24:172022-07-21 16:16:42 [INFO][TRAIN] epoch: 3, iter: 13760/20000, loss: 0.4535, lr: 0.003506, batch_cost: 0.2338, reader_cost: 0.10142, ips: 17.1106 samples/sec | ETA 00:24:182022-07-21 16:16:45 [INFO][TRAIN] epoch: 3, iter: 13770/20000, loss: 0.3371, lr: 0.003501, batch_cost: 0.3000, reader_cost: 0.14707, ips: 13.3323 samples/sec | ETA 00:31:092022-07-21 16:16:48 [INFO][TRAIN] epoch: 3, iter: 13780/20000, loss: 0.3767, lr: 0.003496, batch_cost: 0.2748, reader_cost: 0.13811, ips: 14.5558 samples/sec | ETA 00:28:292022-07-21 16:16:51 [INFO][TRAIN] epoch: 3, iter: 13790/20000, loss: 0.3630, lr: 0.003491, batch_cost: 0.2950, reader_cost: 0.14252, ips: 13.5591 samples/sec | ETA 00:30:312022-07-21 16:16:53 [INFO][TRAIN] epoch: 3, iter: 13800/20000, loss: 0.3170, lr: 0.003486, batch_cost: 0.2299, reader_cost: 0.10210, ips: 17.4004 samples/sec | ETA 00:23:452022-07-21 16:16:56 [INFO][TRAIN] epoch: 3, iter: 13810/20000, loss: 0.3258, lr: 0.003481, batch_cost: 0.2508, reader_cost: 0.11260, ips: 15.9507 samples/sec | ETA 00:25:522022-07-21 16:16:59 [INFO][TRAIN] epoch: 3, iter: 13820/20000, loss: 0.3228, lr: 0.003476, batch_cost: 0.2864, reader_cost: 0.13395, ips: 13.9649 samples/sec | ETA 00:29:302022-07-21 16:17:01 [INFO][TRAIN] epoch: 3, iter: 13830/20000, loss: 0.4448, lr: 0.003471, batch_cost: 0.2310, reader_cost: 0.10422, ips: 17.3149 samples/sec | ETA 00:23:452022-07-21 16:17:03 [INFO][TRAIN] epoch: 3, iter: 13840/20000, loss: 0.4262, lr: 0.003465, batch_cost: 0.2519, reader_cost: 0.12812, ips: 15.8824 samples/sec | ETA 00:25:512022-07-21 16:17:06 [INFO][TRAIN] epoch: 3, iter: 13850/20000, loss: 0.3111, lr: 0.003460, batch_cost: 0.2294, reader_cost: 0.10311, ips: 17.4334 samples/sec | ETA 00:23:312022-07-21 16:17:08 
[INFO][TRAIN] epoch: 3, iter: 13860/20000, loss: 0.2712, lr: 0.003455, batch_cost: 0.2495, reader_cost: 0.12217, ips: 16.0315 samples/sec | ETA 00:25:312022-07-21 16:17:11 [INFO][TRAIN] epoch: 3, iter: 13870/20000, loss: 0.3911, lr: 0.003450, batch_cost: 0.2241, reader_cost: 0.10057, ips: 17.8508 samples/sec | ETA 00:22:532022-07-21 16:17:13 [INFO][TRAIN] epoch: 3, iter: 13880/20000, loss: 0.4230, lr: 0.003445, batch_cost: 0.2337, reader_cost: 0.10888, ips: 17.1158 samples/sec | ETA 00:23:502022-07-21 16:17:15 [INFO][TRAIN] epoch: 3, iter: 13890/20000, loss: 0.2647, lr: 0.003440, batch_cost: 0.2339, reader_cost: 0.10882, ips: 17.1023 samples/sec | ETA 00:23:492022-07-21 16:17:18 [INFO][TRAIN] epoch: 3, iter: 13900/20000, loss: 0.3779, lr: 0.003435, batch_cost: 0.2639, reader_cost: 0.11680, ips: 15.1584 samples/sec | ETA 00:26:492022-07-21 16:17:20 [INFO][TRAIN] epoch: 3, iter: 13910/20000, loss: 0.2978, lr: 0.003430, batch_cost: 0.2242, reader_cost: 0.10098, ips: 17.8441 samples/sec | ETA 00:22:452022-07-21 16:17:22 [INFO][TRAIN] epoch: 3, iter: 13920/20000, loss: 0.3572, lr: 0.003425, batch_cost: 0.2298, reader_cost: 0.10547, ips: 17.4066 samples/sec | ETA 00:23:172022-07-21 16:17:25 [INFO][TRAIN] epoch: 3, iter: 13930/20000, loss: 0.3518, lr: 0.003420, batch_cost: 0.2233, reader_cost: 0.09908, ips: 17.9127 samples/sec | ETA 00:22:352022-07-21 16:17:27 [INFO][TRAIN] epoch: 3, iter: 13940/20000, loss: 0.2793, lr: 0.003415, batch_cost: 0.2314, reader_cost: 0.10555, ips: 17.2891 samples/sec | ETA 00:23:222022-07-21 16:17:30 [INFO][TRAIN] epoch: 3, iter: 13950/20000, loss: 0.3047, lr: 0.003410, batch_cost: 0.2874, reader_cost: 0.13738, ips: 13.9176 samples/sec | ETA 00:28:582022-07-21 16:17:33 [INFO][TRAIN] epoch: 3, iter: 13960/20000, loss: 0.4667, lr: 0.003405, batch_cost: 0.2726, reader_cost: 0.12461, ips: 14.6755 samples/sec | ETA 00:27:262022-07-21 16:17:35 [INFO][TRAIN] epoch: 3, iter: 13970/20000, loss: 0.3491, lr: 0.003400, batch_cost: 0.2159, reader_cost: 0.09272, ips: 18.5242 samples/sec | ETA 00:21:422022-07-21 16:17:37 [INFO][TRAIN] epoch: 3, iter: 13980/20000, loss: 0.4031, lr: 0.003394, batch_cost: 0.2267, reader_cost: 0.10225, ips: 17.6441 samples/sec | ETA 00:22:442022-07-21 16:17:39 [INFO][TRAIN] epoch: 3, iter: 13990/20000, loss: 0.4246, lr: 0.003389, batch_cost: 0.2318, reader_cost: 0.10871, ips: 17.2581 samples/sec | ETA 00:23:122022-07-21 16:17:42 [INFO][TRAIN] epoch: 3, iter: 14000/20000, loss: 0.3284, lr: 0.003384, batch_cost: 0.2410, reader_cost: 0.10647, ips: 16.5960 samples/sec | ETA 00:24:062022-07-21 16:17:44 [INFO][TRAIN] epoch: 3, iter: 14010/20000, loss: 0.3234, lr: 0.003379, batch_cost: 0.2578, reader_cost: 0.12144, ips: 15.5138 samples/sec | ETA 00:25:442022-07-21 16:17:47 [INFO][TRAIN] epoch: 3, iter: 14020/20000, loss: 0.3457, lr: 0.003374, batch_cost: 0.2720, reader_cost: 0.13646, ips: 14.7060 samples/sec | ETA 00:27:062022-07-21 16:17:50 [INFO][TRAIN] epoch: 3, iter: 14030/20000, loss: 0.3627, lr: 0.003369, batch_cost: 0.2999, reader_cost: 0.11946, ips: 13.3364 samples/sec | ETA 00:29:502022-07-21 16:17:53 [INFO][TRAIN] epoch: 3, iter: 14040/20000, loss: 0.4321, lr: 0.003364, batch_cost: 0.2913, reader_cost: 0.14322, ips: 13.7338 samples/sec | ETA 00:28:552022-07-21 16:17:55 [INFO][TRAIN] epoch: 3, iter: 14050/20000, loss: 0.3620, lr: 0.003359, batch_cost: 0.2471, reader_cost: 0.09610, ips: 16.1902 samples/sec | ETA 00:24:302022-07-21 16:17:58 [INFO][TRAIN] epoch: 3, iter: 14060/20000, loss: 0.2997, lr: 0.003354, batch_cost: 0.2368, reader_cost: 0.10946, 
ips: 16.8904 samples/sec | ETA 00:23:262022-07-21 16:18:00 [INFO][TRAIN] epoch: 3, iter: 14070/20000, loss: 0.4528, lr: 0.003349, batch_cost: 0.2402, reader_cost: 0.11450, ips: 16.6513 samples/sec | ETA 00:23:442022-07-21 16:18:03 [INFO][TRAIN] epoch: 3, iter: 14080/20000, loss: 0.3690, lr: 0.003344, batch_cost: 0.2836, reader_cost: 0.13599, ips: 14.1024 samples/sec | ETA 00:27:592022-07-21 16:18:06 [INFO][TRAIN] epoch: 3, iter: 14090/20000, loss: 0.3414, lr: 0.003339, batch_cost: 0.2621, reader_cost: 0.12322, ips: 15.2632 samples/sec | ETA 00:25:482022-07-21 16:18:08 [INFO][TRAIN] epoch: 3, iter: 14100/20000, loss: 0.3633, lr: 0.003334, batch_cost: 0.2368, reader_cost: 0.11421, ips: 16.8911 samples/sec | ETA 00:23:172022-07-21 16:18:10 [INFO][TRAIN] epoch: 3, iter: 14110/20000, loss: 0.3567, lr: 0.003328, batch_cost: 0.2249, reader_cost: 0.10097, ips: 17.7869 samples/sec | ETA 00:22:042022-07-21 16:18:13 [INFO][TRAIN] epoch: 3, iter: 14120/20000, loss: 0.4106, lr: 0.003323, batch_cost: 0.2411, reader_cost: 0.10597, ips: 16.5919 samples/sec | ETA 00:23:372022-07-21 16:18:15 [INFO][TRAIN] epoch: 3, iter: 14130/20000, loss: 0.3890, lr: 0.003318, batch_cost: 0.2411, reader_cost: 0.11544, ips: 16.5881 samples/sec | ETA 00:23:352022-07-21 16:18:17 [INFO][TRAIN] epoch: 3, iter: 14140/20000, loss: 0.4635, lr: 0.003313, batch_cost: 0.2338, reader_cost: 0.11145, ips: 17.1115 samples/sec | ETA 00:22:492022-07-21 16:18:20 [INFO][TRAIN] epoch: 3, iter: 14150/20000, loss: 0.3102, lr: 0.003308, batch_cost: 0.2507, reader_cost: 0.12666, ips: 15.9569 samples/sec | ETA 00:24:262022-07-21 16:18:23 [INFO][TRAIN] epoch: 3, iter: 14160/20000, loss: 0.3972, lr: 0.003303, batch_cost: 0.2839, reader_cost: 0.14589, ips: 14.0880 samples/sec | ETA 00:27:382022-07-21 16:18:25 [INFO][TRAIN] epoch: 3, iter: 14170/20000, loss: 0.2893, lr: 0.003298, batch_cost: 0.2361, reader_cost: 0.11419, ips: 16.9445 samples/sec | ETA 00:22:562022-07-21 16:18:27 [INFO][TRAIN] epoch: 3, iter: 14180/20000, loss: 0.2962, lr: 0.003293, batch_cost: 0.2201, reader_cost: 0.09680, ips: 18.1755 samples/sec | ETA 00:21:202022-07-21 16:18:30 [INFO][TRAIN] epoch: 3, iter: 14190/20000, loss: 0.4161, lr: 0.003288, batch_cost: 0.2310, reader_cost: 0.10840, ips: 17.3147 samples/sec | ETA 00:22:222022-07-21 16:18:32 [INFO][TRAIN] epoch: 3, iter: 14200/20000, loss: 0.4552, lr: 0.003283, batch_cost: 0.2347, reader_cost: 0.11211, ips: 17.0399 samples/sec | ETA 00:22:412022-07-21 16:18:34 [INFO][TRAIN] epoch: 3, iter: 14210/20000, loss: 0.2943, lr: 0.003278, batch_cost: 0.2459, reader_cost: 0.11932, ips: 16.2660 samples/sec | ETA 00:23:432022-07-21 16:18:37 [INFO][TRAIN] epoch: 3, iter: 14220/20000, loss: 0.4190, lr: 0.003272, batch_cost: 0.2897, reader_cost: 0.13566, ips: 13.8075 samples/sec | ETA 00:27:542022-07-21 16:18:40 [INFO][TRAIN] epoch: 3, iter: 14230/20000, loss: 0.3272, lr: 0.003267, batch_cost: 0.2562, reader_cost: 0.12882, ips: 15.6139 samples/sec | ETA 00:24:382022-07-21 16:18:42 [INFO][TRAIN] epoch: 3, iter: 14240/20000, loss: 0.3409, lr: 0.003262, batch_cost: 0.2605, reader_cost: 0.12941, ips: 15.3542 samples/sec | ETA 00:25:002022-07-21 16:18:45 [INFO][TRAIN] epoch: 3, iter: 14250/20000, loss: 0.3570, lr: 0.003257, batch_cost: 0.2621, reader_cost: 0.12856, ips: 15.2637 samples/sec | ETA 00:25:062022-07-21 16:18:47 [INFO][TRAIN] epoch: 3, iter: 14260/20000, loss: 0.4985, lr: 0.003252, batch_cost: 0.2371, reader_cost: 0.10879, ips: 16.8688 samples/sec | ETA 00:22:412022-07-21 16:18:50 [INFO][TRAIN] epoch: 3, iter: 14270/20000, loss: 
0.4282, lr: 0.003247, batch_cost: 0.2375, reader_cost: 0.11141, ips: 16.8445 samples/sec | ETA 00:22:402022-07-21 16:18:53 [INFO][TRAIN] epoch: 3, iter: 14280/20000, loss: 0.3649, lr: 0.003242, batch_cost: 0.2738, reader_cost: 0.13070, ips: 14.6112 samples/sec | ETA 00:26:052022-07-21 16:18:56 [INFO][TRAIN] epoch: 3, iter: 14290/20000, loss: 0.2680, lr: 0.003237, batch_cost: 0.3308, reader_cost: 0.16248, ips: 12.0937 samples/sec | ETA 00:31:282022-07-21 16:18:59 [INFO][TRAIN] epoch: 3, iter: 14300/20000, loss: 0.4472, lr: 0.003232, batch_cost: 0.2598, reader_cost: 0.12644, ips: 15.3948 samples/sec | ETA 00:24:412022-07-21 16:19:01 [INFO][TRAIN] epoch: 3, iter: 14310/20000, loss: 0.3202, lr: 0.003227, batch_cost: 0.2444, reader_cost: 0.11226, ips: 16.3646 samples/sec | ETA 00:23:102022-07-21 16:19:03 [INFO][TRAIN] epoch: 3, iter: 14320/20000, loss: 0.3634, lr: 0.003221, batch_cost: 0.2296, reader_cost: 0.10357, ips: 17.4246 samples/sec | ETA 00:21:432022-07-21 16:19:06 [INFO][TRAIN] epoch: 3, iter: 14330/20000, loss: 0.2965, lr: 0.003216, batch_cost: 0.2303, reader_cost: 0.10758, ips: 17.3699 samples/sec | ETA 00:21:452022-07-21 16:19:08 [INFO][TRAIN] epoch: 3, iter: 14340/20000, loss: 0.4227, lr: 0.003211, batch_cost: 0.2646, reader_cost: 0.13230, ips: 15.1156 samples/sec | ETA 00:24:572022-07-21 16:19:11 [INFO][TRAIN] epoch: 3, iter: 14350/20000, loss: 0.3770, lr: 0.003206, batch_cost: 0.2932, reader_cost: 0.14847, ips: 13.6420 samples/sec | ETA 00:27:362022-07-21 16:19:13 [INFO][TRAIN] epoch: 3, iter: 14360/20000, loss: 0.3882, lr: 0.003201, batch_cost: 0.2287, reader_cost: 0.10493, ips: 17.4870 samples/sec | ETA 00:21:302022-07-21 16:19:16 [INFO][TRAIN] epoch: 3, iter: 14370/20000, loss: 0.3372, lr: 0.003196, batch_cost: 0.2308, reader_cost: 0.10581, ips: 17.3329 samples/sec | ETA 00:21:392022-07-21 16:19:18 [INFO][TRAIN] epoch: 3, iter: 14380/20000, loss: 0.3628, lr: 0.003191, batch_cost: 0.2561, reader_cost: 0.12147, ips: 15.6173 samples/sec | ETA 00:23:592022-07-21 16:19:21 [INFO][TRAIN] epoch: 3, iter: 14390/20000, loss: 0.4594, lr: 0.003186, batch_cost: 0.2545, reader_cost: 0.12537, ips: 15.7192 samples/sec | ETA 00:23:472022-07-21 16:19:23 [INFO][TRAIN] epoch: 3, iter: 14400/20000, loss: 0.3006, lr: 0.003181, batch_cost: 0.2233, reader_cost: 0.10046, ips: 17.9097 samples/sec | ETA 00:20:502022-07-21 16:19:26 [INFO][TRAIN] epoch: 3, iter: 14410/20000, loss: 0.2654, lr: 0.003176, batch_cost: 0.2728, reader_cost: 0.12928, ips: 14.6620 samples/sec | ETA 00:25:252022-07-21 16:19:28 [INFO][TRAIN] epoch: 3, iter: 14420/20000, loss: 0.2890, lr: 0.003170, batch_cost: 0.2429, reader_cost: 0.12110, ips: 16.4694 samples/sec | ETA 00:22:352022-07-21 16:19:31 [INFO][TRAIN] epoch: 3, iter: 14430/20000, loss: 0.3687, lr: 0.003165, batch_cost: 0.2405, reader_cost: 0.11629, ips: 16.6329 samples/sec | ETA 00:22:192022-07-21 16:19:33 [INFO][TRAIN] epoch: 3, iter: 14440/20000, loss: 0.2872, lr: 0.003160, batch_cost: 0.2278, reader_cost: 0.10585, ips: 17.5598 samples/sec | ETA 00:21:062022-07-21 16:19:35 [INFO][TRAIN] epoch: 3, iter: 14450/20000, loss: 0.2794, lr: 0.003155, batch_cost: 0.2460, reader_cost: 0.12061, ips: 16.2615 samples/sec | ETA 00:22:452022-07-21 16:19:38 [INFO][TRAIN] epoch: 3, iter: 14460/20000, loss: 0.3934, lr: 0.003150, batch_cost: 0.2342, reader_cost: 0.11225, ips: 17.0772 samples/sec | ETA 00:21:372022-07-21 16:19:40 [INFO][TRAIN] epoch: 3, iter: 14470/20000, loss: 0.2750, lr: 0.003145, batch_cost: 0.2450, reader_cost: 0.12010, ips: 16.3242 samples/sec | ETA 00:22:352022-07-21 
16:19:43 [INFO][TRAIN] epoch: 3, iter: 14480/20000, loss: 0.3107, lr: 0.003140, batch_cost: 0.2724, reader_cost: 0.11796, ips: 14.6826 samples/sec | ETA 00:25:032022-07-21 16:19:46 [INFO][TRAIN] epoch: 3, iter: 14490/20000, loss: 0.3587, lr: 0.003135, batch_cost: 0.2889, reader_cost: 0.14516, ips: 13.8468 samples/sec | ETA 00:26:312022-07-21 16:19:48 [INFO][TRAIN] epoch: 3, iter: 14500/20000, loss: 0.4358, lr: 0.003129, batch_cost: 0.2308, reader_cost: 0.10889, ips: 17.3283 samples/sec | ETA 00:21:092022-07-21 16:19:50 [INFO][TRAIN] epoch: 3, iter: 14510/20000, loss: 0.3517, lr: 0.003124, batch_cost: 0.2386, reader_cost: 0.11430, ips: 16.7647 samples/sec | ETA 00:21:492022-07-21 16:19:53 [INFO][TRAIN] epoch: 3, iter: 14520/20000, loss: 0.3754, lr: 0.003119, batch_cost: 0.2406, reader_cost: 0.11697, ips: 16.6234 samples/sec | ETA 00:21:582022-07-21 16:19:55 [INFO][TRAIN] epoch: 3, iter: 14530/20000, loss: 0.4326, lr: 0.003114, batch_cost: 0.2406, reader_cost: 0.11505, ips: 16.6264 samples/sec | ETA 00:21:552022-07-21 16:19:58 [INFO][TRAIN] epoch: 3, iter: 14540/20000, loss: 0.3045, lr: 0.003109, batch_cost: 0.3118, reader_cost: 0.14864, ips: 12.8269 samples/sec | ETA 00:28:222022-07-21 16:20:01 [INFO][TRAIN] epoch: 3, iter: 14550/20000, loss: 0.4008, lr: 0.003104, batch_cost: 0.2459, reader_cost: 0.11302, ips: 16.2660 samples/sec | ETA 00:22:202022-07-21 16:20:03 [INFO][TRAIN] epoch: 3, iter: 14560/20000, loss: 0.2957, lr: 0.003099, batch_cost: 0.2450, reader_cost: 0.11846, ips: 16.3256 samples/sec | ETA 00:22:122022-07-21 16:20:06 [INFO][TRAIN] epoch: 3, iter: 14570/20000, loss: 0.3560, lr: 0.003094, batch_cost: 0.2354, reader_cost: 0.10576, ips: 16.9917 samples/sec | ETA 00:21:182022-07-21 16:20:08 [INFO][TRAIN] epoch: 3, iter: 14580/20000, loss: 0.3629, lr: 0.003088, batch_cost: 0.2370, reader_cost: 0.11511, ips: 16.8742 samples/sec | ETA 00:21:242022-07-21 16:20:10 [INFO][TRAIN] epoch: 3, iter: 14590/20000, loss: 0.3456, lr: 0.003083, batch_cost: 0.2401, reader_cost: 0.11782, ips: 16.6608 samples/sec | ETA 00:21:382022-07-21 16:20:13 [INFO][TRAIN] epoch: 3, iter: 14600/20000, loss: 0.4028, lr: 0.003078, batch_cost: 0.2391, reader_cost: 0.11177, ips: 16.7321 samples/sec | ETA 00:21:302022-07-21 16:20:16 [INFO][TRAIN] epoch: 3, iter: 14610/20000, loss: 0.4459, lr: 0.003073, batch_cost: 0.2848, reader_cost: 0.14412, ips: 14.0456 samples/sec | ETA 00:25:342022-07-21 16:20:18 [INFO][TRAIN] epoch: 3, iter: 14620/20000, loss: 0.4086, lr: 0.003068, batch_cost: 0.2584, reader_cost: 0.12515, ips: 15.4777 samples/sec | ETA 00:23:102022-07-21 16:20:21 [INFO][TRAIN] epoch: 3, iter: 14630/20000, loss: 0.2587, lr: 0.003063, batch_cost: 0.2434, reader_cost: 0.11734, ips: 16.4330 samples/sec | ETA 00:21:472022-07-21 16:20:23 [INFO][TRAIN] epoch: 3, iter: 14640/20000, loss: 0.3626, lr: 0.003058, batch_cost: 0.2539, reader_cost: 0.12237, ips: 15.7537 samples/sec | ETA 00:22:402022-07-21 16:20:26 [INFO][TRAIN] epoch: 3, iter: 14650/20000, loss: 0.2772, lr: 0.003053, batch_cost: 0.2354, reader_cost: 0.09634, ips: 16.9927 samples/sec | ETA 00:20:592022-07-21 16:20:28 [INFO][TRAIN] epoch: 3, iter: 14660/20000, loss: 0.4344, lr: 0.003047, batch_cost: 0.2685, reader_cost: 0.12384, ips: 14.8976 samples/sec | ETA 00:23:532022-07-21 16:20:31 [INFO][TRAIN] epoch: 3, iter: 14670/20000, loss: 0.3363, lr: 0.003042, batch_cost: 0.2405, reader_cost: 0.11400, ips: 16.6343 samples/sec | ETA 00:21:212022-07-21 16:20:33 [INFO][TRAIN] epoch: 3, iter: 14680/20000, loss: 0.3946, lr: 0.003037, batch_cost: 0.2273, reader_cost: 
0.10328, ips: 17.5977 samples/sec | ETA 00:20:092022-07-21 16:20:35 [INFO][TRAIN] epoch: 3, iter: 14690/20000, loss: 0.4087, lr: 0.003032, batch_cost: 0.2390, reader_cost: 0.11751, ips: 16.7362 samples/sec | ETA 00:21:092022-07-21 16:20:38 [INFO][TRAIN] epoch: 3, iter: 14700/20000, loss: 0.2771, lr: 0.003027, batch_cost: 0.2325, reader_cost: 0.11036, ips: 17.2027 samples/sec | ETA 00:20:322022-07-21 16:20:40 [INFO][TRAIN] epoch: 3, iter: 14710/20000, loss: 0.3055, lr: 0.003022, batch_cost: 0.2294, reader_cost: 0.10547, ips: 17.4374 samples/sec | ETA 00:20:132022-07-21 16:20:42 [INFO][TRAIN] epoch: 3, iter: 14720/20000, loss: 0.3472, lr: 0.003017, batch_cost: 0.2398, reader_cost: 0.11609, ips: 16.6777 samples/sec | ETA 00:21:062022-07-21 16:20:45 [INFO][TRAIN] epoch: 3, iter: 14730/20000, loss: 0.3014, lr: 0.003011, batch_cost: 0.2357, reader_cost: 0.10920, ips: 16.9676 samples/sec | ETA 00:20:422022-07-21 16:20:47 [INFO][TRAIN] epoch: 3, iter: 14740/20000, loss: 0.2833, lr: 0.003006, batch_cost: 0.2368, reader_cost: 0.10726, ips: 16.8922 samples/sec | ETA 00:20:452022-07-21 16:20:50 [INFO][TRAIN] epoch: 3, iter: 14750/20000, loss: 0.3608, lr: 0.003001, batch_cost: 0.2909, reader_cost: 0.13151, ips: 13.7498 samples/sec | ETA 00:25:272022-07-21 16:20:52 [INFO][TRAIN] epoch: 3, iter: 14760/20000, loss: 0.3495, lr: 0.002996, batch_cost: 0.2239, reader_cost: 0.09342, ips: 17.8647 samples/sec | ETA 00:19:332022-07-21 16:20:55 [INFO][TRAIN] epoch: 3, iter: 14770/20000, loss: 0.3258, lr: 0.002991, batch_cost: 0.2329, reader_cost: 0.10769, ips: 17.1741 samples/sec | ETA 00:20:182022-07-21 16:20:57 [INFO][TRAIN] epoch: 3, iter: 14780/20000, loss: 0.2743, lr: 0.002986, batch_cost: 0.2256, reader_cost: 0.10397, ips: 17.7321 samples/sec | ETA 00:19:372022-07-21 16:20:59 [INFO][TRAIN] epoch: 3, iter: 14790/20000, loss: 0.3429, lr: 0.002981, batch_cost: 0.2641, reader_cost: 0.12344, ips: 15.1460 samples/sec | ETA 00:22:552022-07-21 16:21:03 [INFO][TRAIN] epoch: 3, iter: 14800/20000, loss: 0.3172, lr: 0.002975, batch_cost: 0.3150, reader_cost: 0.15627, ips: 12.6971 samples/sec | ETA 00:27:182022-07-21 16:21:05 [INFO][TRAIN] epoch: 3, iter: 14810/20000, loss: 0.3988, lr: 0.002970, batch_cost: 0.2322, reader_cost: 0.10606, ips: 17.2293 samples/sec | ETA 00:20:042022-07-21 16:21:07 [INFO][TRAIN] epoch: 3, iter: 14820/20000, loss: 0.2964, lr: 0.002965, batch_cost: 0.2352, reader_cost: 0.10580, ips: 17.0053 samples/sec | ETA 00:20:182022-07-21 16:21:10 [INFO][TRAIN] epoch: 3, iter: 14830/20000, loss: 0.3013, lr: 0.002960, batch_cost: 0.2348, reader_cost: 0.10046, ips: 17.0367 samples/sec | ETA 00:20:132022-07-21 16:21:12 [INFO][TRAIN] epoch: 3, iter: 14840/20000, loss: 0.2646, lr: 0.002955, batch_cost: 0.2428, reader_cost: 0.11880, ips: 16.4753 samples/sec | ETA 00:20:522022-07-21 16:21:14 [INFO][TRAIN] epoch: 3, iter: 14850/20000, loss: 0.3200, lr: 0.002950, batch_cost: 0.2383, reader_cost: 0.11511, ips: 16.7832 samples/sec | ETA 00:20:272022-07-21 16:21:17 [INFO][TRAIN] epoch: 4, iter: 14860/20000, loss: 0.3473, lr: 0.002945, batch_cost: 0.2487, reader_cost: 0.12159, ips: 16.0815 samples/sec | ETA 00:21:182022-07-21 16:21:19 [INFO][TRAIN] epoch: 4, iter: 14870/20000, loss: 0.3490, lr: 0.002939, batch_cost: 0.2465, reader_cost: 0.12202, ips: 16.2292 samples/sec | ETA 00:21:042022-07-21 16:21:22 [INFO][TRAIN] epoch: 4, iter: 14880/20000, loss: 0.3526, lr: 0.002934, batch_cost: 0.2857, reader_cost: 0.13732, ips: 14.0008 samples/sec | ETA 00:24:222022-07-21 16:21:25 [INFO][TRAIN] epoch: 4, iter: 14890/20000, 
loss: 0.3753, lr: 0.002929, batch_cost: 0.2746, reader_cost: 0.13293, ips: 14.5674 samples/sec | ETA 00:23:232022-07-21 16:21:27 [INFO][TRAIN] epoch: 4, iter: 14900/20000, loss: 0.3155, lr: 0.002924, batch_cost: 0.2282, reader_cost: 0.10472, ips: 17.5274 samples/sec | ETA 00:19:232022-07-21 16:21:30 [INFO][TRAIN] epoch: 4, iter: 14910/20000, loss: 0.3952, lr: 0.002919, batch_cost: 0.2518, reader_cost: 0.12868, ips: 15.8845 samples/sec | ETA 00:21:212022-07-21 16:21:32 [INFO][TRAIN] epoch: 4, iter: 14920/20000, loss: 0.4983, lr: 0.002914, batch_cost: 0.2624, reader_cost: 0.10346, ips: 15.2423 samples/sec | ETA 00:22:132022-07-21 16:21:35 [INFO][TRAIN] epoch: 4, iter: 14930/20000, loss: 0.2874, lr: 0.002908, batch_cost: 0.2425, reader_cost: 0.11460, ips: 16.4973 samples/sec | ETA 00:20:292022-07-21 16:21:37 [INFO][TRAIN] epoch: 4, iter: 14940/20000, loss: 0.4542, lr: 0.002903, batch_cost: 0.2234, reader_cost: 0.09874, ips: 17.9028 samples/sec | ETA 00:18:502022-07-21 16:21:39 [INFO][TRAIN] epoch: 4, iter: 14950/20000, loss: 0.4010, lr: 0.002898, batch_cost: 0.2324, reader_cost: 0.10761, ips: 17.2110 samples/sec | ETA 00:19:332022-07-21 16:21:42 [INFO][TRAIN] epoch: 4, iter: 14960/20000, loss: 0.3147, lr: 0.002893, batch_cost: 0.2451, reader_cost: 0.12059, ips: 16.3217 samples/sec | ETA 00:20:352022-07-21 16:21:44 [INFO][TRAIN] epoch: 4, iter: 14970/20000, loss: 0.4767, lr: 0.002888, batch_cost: 0.2271, reader_cost: 0.10396, ips: 17.6139 samples/sec | ETA 00:19:022022-07-21 16:21:47 [INFO][TRAIN] epoch: 4, iter: 14980/20000, loss: 0.3531, lr: 0.002883, batch_cost: 0.2550, reader_cost: 0.12724, ips: 15.6892 samples/sec | ETA 00:21:192022-07-21 16:21:49 [INFO][TRAIN] epoch: 4, iter: 14990/20000, loss: 0.3619, lr: 0.002877, batch_cost: 0.2525, reader_cost: 0.13161, ips: 15.8408 samples/sec | ETA 00:21:052022-07-21 16:21:51 [INFO][TRAIN] epoch: 4, iter: 15000/20000, loss: 0.3679, lr: 0.002872, batch_cost: 0.2228, reader_cost: 0.10007, ips: 17.9496 samples/sec | ETA 00:18:342022-07-21 16:21:54 [INFO][TRAIN] epoch: 4, iter: 15010/20000, loss: 0.3451, lr: 0.002867, batch_cost: 0.2608, reader_cost: 0.12476, ips: 15.3359 samples/sec | ETA 00:21:412022-07-21 16:21:57 [INFO][TRAIN] epoch: 4, iter: 15020/20000, loss: 0.2717, lr: 0.002862, batch_cost: 0.2984, reader_cost: 0.13735, ips: 13.4043 samples/sec | ETA 00:24:462022-07-21 16:21:59 [INFO][TRAIN] epoch: 4, iter: 15030/20000, loss: 0.3472, lr: 0.002857, batch_cost: 0.2309, reader_cost: 0.10664, ips: 17.3263 samples/sec | ETA 00:19:072022-07-21 16:22:02 [INFO][TRAIN] epoch: 4, iter: 15040/20000, loss: 0.3858, lr: 0.002852, batch_cost: 0.2339, reader_cost: 0.10763, ips: 17.1046 samples/sec | ETA 00:19:192022-07-21 16:22:05 [INFO][TRAIN] epoch: 4, iter: 15050/20000, loss: 0.3670, lr: 0.002846, batch_cost: 0.3064, reader_cost: 0.14486, ips: 13.0545 samples/sec | ETA 00:25:162022-07-21 16:22:07 [INFO][TRAIN] epoch: 4, iter: 15060/20000, loss: 0.4614, lr: 0.002841, batch_cost: 0.2510, reader_cost: 0.11279, ips: 15.9370 samples/sec | ETA 00:20:392022-07-21 16:22:10 [INFO][TRAIN] epoch: 4, iter: 15070/20000, loss: 0.3803, lr: 0.002836, batch_cost: 0.2261, reader_cost: 0.09997, ips: 17.6879 samples/sec | ETA 00:18:342022-07-21 16:22:12 [INFO][TRAIN] epoch: 4, iter: 15080/20000, loss: 0.2541, lr: 0.002831, batch_cost: 0.2310, reader_cost: 0.10701, ips: 17.3185 samples/sec | ETA 00:18:562022-07-21 16:22:14 [INFO][TRAIN] epoch: 4, iter: 15090/20000, loss: 0.3563, lr: 0.002826, batch_cost: 0.2321, reader_cost: 0.10113, ips: 17.2362 samples/sec | ETA 
00:18:592022-07-21 16:22:17 [INFO][TRAIN] epoch: 4, iter: 15100/20000, loss: 0.2794, lr: 0.002821, batch_cost: 0.2575, reader_cost: 0.12402, ips: 15.5343 samples/sec | ETA 00:21:012022-07-21 16:22:19 [INFO][TRAIN] epoch: 4, iter: 15110/20000, loss: 0.4846, lr: 0.002815, batch_cost: 0.2210, reader_cost: 0.09949, ips: 18.0960 samples/sec | ETA 00:18:002022-07-21 16:22:21 [INFO][TRAIN] epoch: 4, iter: 15120/20000, loss: 0.2890, lr: 0.002810, batch_cost: 0.2283, reader_cost: 0.10504, ips: 17.5238 samples/sec | ETA 00:18:332022-07-21 16:22:24 [INFO][TRAIN] epoch: 4, iter: 15130/20000, loss: 0.2673, lr: 0.002805, batch_cost: 0.2282, reader_cost: 0.10426, ips: 17.5274 samples/sec | ETA 00:18:312022-07-21 16:22:26 [INFO][TRAIN] epoch: 4, iter: 15140/20000, loss: 0.3420, lr: 0.002800, batch_cost: 0.2390, reader_cost: 0.11151, ips: 16.7381 samples/sec | ETA 00:19:212022-07-21 16:22:28 [INFO][TRAIN] epoch: 4, iter: 15150/20000, loss: 0.3243, lr: 0.002795, batch_cost: 0.2551, reader_cost: 0.10808, ips: 15.6825 samples/sec | ETA 00:20:372022-07-21 16:22:31 [INFO][TRAIN] epoch: 4, iter: 15160/20000, loss: 0.2929, lr: 0.002789, batch_cost: 0.2819, reader_cost: 0.13142, ips: 14.1905 samples/sec | ETA 00:22:442022-07-21 16:22:34 [INFO][TRAIN] epoch: 4, iter: 15170/20000, loss: 0.3605, lr: 0.002784, batch_cost: 0.2516, reader_cost: 0.12346, ips: 15.9005 samples/sec | ETA 00:20:152022-07-21 16:22:37 [INFO][TRAIN] epoch: 4, iter: 15180/20000, loss: 0.3459, lr: 0.002779, batch_cost: 0.2841, reader_cost: 0.14864, ips: 14.0805 samples/sec | ETA 00:22:492022-07-21 16:22:39 [INFO][TRAIN] epoch: 4, iter: 15190/20000, loss: 0.2900, lr: 0.002774, batch_cost: 0.2277, reader_cost: 0.10521, ips: 17.5708 samples/sec | ETA 00:18:142022-07-21 16:22:41 [INFO][TRAIN] epoch: 4, iter: 15200/20000, loss: 0.2976, lr: 0.002769, batch_cost: 0.2300, reader_cost: 0.10447, ips: 17.3922 samples/sec | ETA 00:18:232022-07-21 16:22:44 [INFO][TRAIN] epoch: 4, iter: 15210/20000, loss: 0.3704, lr: 0.002763, batch_cost: 0.2361, reader_cost: 0.10755, ips: 16.9399 samples/sec | ETA 00:18:512022-07-21 16:22:46 [INFO][TRAIN] epoch: 4, iter: 15220/20000, loss: 0.3081, lr: 0.002758, batch_cost: 0.2341, reader_cost: 0.10684, ips: 17.0873 samples/sec | ETA 00:18:382022-07-21 16:22:48 [INFO][TRAIN] epoch: 4, iter: 15230/20000, loss: 0.3395, lr: 0.002753, batch_cost: 0.2344, reader_cost: 0.11158, ips: 17.0618 samples/sec | ETA 00:18:382022-07-21 16:22:51 [INFO][TRAIN] epoch: 4, iter: 15240/20000, loss: 0.4646, lr: 0.002748, batch_cost: 0.2325, reader_cost: 0.11134, ips: 17.2047 samples/sec | ETA 00:18:262022-07-21 16:22:53 [INFO][TRAIN] epoch: 4, iter: 15250/20000, loss: 0.3403, lr: 0.002743, batch_cost: 0.2324, reader_cost: 0.11027, ips: 17.2096 samples/sec | ETA 00:18:242022-07-21 16:22:55 [INFO][TRAIN] epoch: 4, iter: 15260/20000, loss: 0.3188, lr: 0.002738, batch_cost: 0.2375, reader_cost: 0.11454, ips: 16.8404 samples/sec | ETA 00:18:452022-07-21 16:22:58 [INFO][TRAIN] epoch: 4, iter: 15270/20000, loss: 0.4598, lr: 0.002732, batch_cost: 0.2411, reader_cost: 0.11706, ips: 16.5921 samples/sec | ETA 00:19:002022-07-21 16:23:00 [INFO][TRAIN] epoch: 4, iter: 15280/20000, loss: 0.2373, lr: 0.002727, batch_cost: 0.2690, reader_cost: 0.13571, ips: 14.8705 samples/sec | ETA 00:21:092022-07-21 16:23:03 [INFO][TRAIN] epoch: 4, iter: 15290/20000, loss: 0.4385, lr: 0.002722, batch_cost: 0.2803, reader_cost: 0.12886, ips: 14.2699 samples/sec | ETA 00:22:002022-07-21 16:23:06 [INFO][TRAIN] epoch: 4, iter: 15300/20000, loss: 0.3754, lr: 0.002717, batch_cost: 
0.2466, reader_cost: 0.10748, ips: 16.2225 samples/sec | ETA 00:19:182022-07-21 16:23:09 [INFO][TRAIN] epoch: 4, iter: 15310/20000, loss: 0.4110, lr: 0.002712, batch_cost: 0.3118, reader_cost: 0.15561, ips: 12.8273 samples/sec | ETA 00:24:222022-07-21 16:23:11 [INFO][TRAIN] epoch: 4, iter: 15320/20000, loss: 0.2612, lr: 0.002706, batch_cost: 0.2528, reader_cost: 0.12914, ips: 15.8206 samples/sec | ETA 00:19:432022-07-21 16:23:14 [INFO][TRAIN] epoch: 4, iter: 15330/20000, loss: 0.4192, lr: 0.002701, batch_cost: 0.2213, reader_cost: 0.09758, ips: 18.0781 samples/sec | ETA 00:17:132022-07-21 16:23:16 [INFO][TRAIN] epoch: 4, iter: 15340/20000, loss: 0.3655, lr: 0.002696, batch_cost: 0.2436, reader_cost: 0.11748, ips: 16.4233 samples/sec | ETA 00:18:542022-07-21 16:23:18 [INFO][TRAIN] epoch: 4, iter: 15350/20000, loss: 0.3330, lr: 0.002691, batch_cost: 0.2338, reader_cost: 0.11031, ips: 17.1059 samples/sec | ETA 00:18:072022-07-21 16:23:21 [INFO][TRAIN] epoch: 4, iter: 15360/20000, loss: 0.3326, lr: 0.002685, batch_cost: 0.2395, reader_cost: 0.11115, ips: 16.7003 samples/sec | ETA 00:18:312022-07-21 16:23:23 [INFO][TRAIN] epoch: 4, iter: 15370/20000, loss: 0.2416, lr: 0.002680, batch_cost: 0.2474, reader_cost: 0.11756, ips: 16.1690 samples/sec | ETA 00:19:052022-07-21 16:23:26 [INFO][TRAIN] epoch: 4, iter: 15380/20000, loss: 0.2805, lr: 0.002675, batch_cost: 0.2419, reader_cost: 0.11672, ips: 16.5340 samples/sec | ETA 00:18:372022-07-21 16:23:28 [INFO][TRAIN] epoch: 4, iter: 15390/20000, loss: 0.4279, lr: 0.002670, batch_cost: 0.2368, reader_cost: 0.11416, ips: 16.8904 samples/sec | ETA 00:18:112022-07-21 16:23:30 [INFO][TRAIN] epoch: 4, iter: 15400/20000, loss: 0.2665, lr: 0.002665, batch_cost: 0.2355, reader_cost: 0.11197, ips: 16.9848 samples/sec | ETA 00:18:032022-07-21 16:23:33 [INFO][TRAIN] epoch: 4, iter: 15410/20000, loss: 0.4077, lr: 0.002659, batch_cost: 0.2332, reader_cost: 0.10417, ips: 17.1493 samples/sec | ETA 00:17:502022-07-21 16:23:35 [INFO][TRAIN] epoch: 4, iter: 15420/20000, loss: 0.4203, lr: 0.002654, batch_cost: 0.2686, reader_cost: 0.13020, ips: 14.8942 samples/sec | ETA 00:20:302022-07-21 16:23:39 [INFO][TRAIN] epoch: 4, iter: 15430/20000, loss: 0.3220, lr: 0.002649, batch_cost: 0.3321, reader_cost: 0.16730, ips: 12.0454 samples/sec | ETA 00:25:172022-07-21 16:23:41 [INFO][TRAIN] epoch: 4, iter: 15440/20000, loss: 0.4047, lr: 0.002644, batch_cost: 0.2486, reader_cost: 0.11869, ips: 16.0903 samples/sec | ETA 00:18:532022-07-21 16:23:44 [INFO][TRAIN] epoch: 4, iter: 15450/20000, loss: 0.3373, lr: 0.002639, batch_cost: 0.2476, reader_cost: 0.11954, ips: 16.1555 samples/sec | ETA 00:18:462022-07-21 16:23:46 [INFO][TRAIN] epoch: 4, iter: 15460/20000, loss: 0.3584, lr: 0.002633, batch_cost: 0.2302, reader_cost: 0.10422, ips: 17.3743 samples/sec | ETA 00:17:252022-07-21 16:23:48 [INFO][TRAIN] epoch: 4, iter: 15470/20000, loss: 0.3882, lr: 0.002628, batch_cost: 0.2459, reader_cost: 0.12352, ips: 16.2659 samples/sec | ETA 00:18:332022-07-21 16:23:51 [INFO][TRAIN] epoch: 4, iter: 15480/20000, loss: 0.3442, lr: 0.002623, batch_cost: 0.2384, reader_cost: 0.11696, ips: 16.7796 samples/sec | ETA 00:17:572022-07-21 16:23:53 [INFO][TRAIN] epoch: 4, iter: 15490/20000, loss: 0.3754, lr: 0.002618, batch_cost: 0.2225, reader_cost: 0.09933, ips: 17.9771 samples/sec | ETA 00:16:432022-07-21 16:23:55 [INFO][TRAIN] epoch: 4, iter: 15500/20000, loss: 0.3105, lr: 0.002612, batch_cost: 0.2442, reader_cost: 0.11664, ips: 16.3823 samples/sec | ETA 00:18:182022-07-21 16:23:58 [INFO][TRAIN] epoch: 4, 
iter: 15510/20000, loss: 0.3793, lr: 0.002607, batch_cost: 0.2309, reader_cost: 0.10154, ips: 17.3264 samples/sec | ETA 00:17:162022-07-21 16:24:00 [INFO][TRAIN] epoch: 4, iter: 15520/20000, loss: 0.4220, lr: 0.002602, batch_cost: 0.2337, reader_cost: 0.10484, ips: 17.1187 samples/sec | ETA 00:17:262022-07-21 16:24:02 [INFO][TRAIN] epoch: 4, iter: 15530/20000, loss: 0.3624, lr: 0.002597, batch_cost: 0.2271, reader_cost: 0.10319, ips: 17.6105 samples/sec | ETA 00:16:552022-07-21 16:24:05 [INFO][TRAIN] epoch: 4, iter: 15540/20000, loss: 0.3577, lr: 0.002592, batch_cost: 0.2495, reader_cost: 0.12027, ips: 16.0295 samples/sec | ETA 00:18:322022-07-21 16:24:07 [INFO][TRAIN] epoch: 4, iter: 15550/20000, loss: 0.3513, lr: 0.002586, batch_cost: 0.2604, reader_cost: 0.11881, ips: 15.3635 samples/sec | ETA 00:19:182022-07-21 16:24:11 [INFO][TRAIN] epoch: 4, iter: 15560/20000, loss: 0.3372, lr: 0.002581, batch_cost: 0.3220, reader_cost: 0.14760, ips: 12.4227 samples/sec | ETA 00:23:492022-07-21 16:24:14 [INFO][TRAIN] epoch: 4, iter: 15570/20000, loss: 0.3610, lr: 0.002576, batch_cost: 0.3118, reader_cost: 0.15280, ips: 12.8274 samples/sec | ETA 00:23:012022-07-21 16:24:16 [INFO][TRAIN] epoch: 4, iter: 15580/20000, loss: 0.3087, lr: 0.002571, batch_cost: 0.2307, reader_cost: 0.10645, ips: 17.3390 samples/sec | ETA 00:16:592022-07-21 16:24:19 [INFO][TRAIN] epoch: 4, iter: 15590/20000, loss: 0.4706, lr: 0.002565, batch_cost: 0.2388, reader_cost: 0.11675, ips: 16.7524 samples/sec | ETA 00:17:322022-07-21 16:24:21 [INFO][TRAIN] epoch: 4, iter: 15600/20000, loss: 0.4105, lr: 0.002560, batch_cost: 0.2266, reader_cost: 0.10440, ips: 17.6520 samples/sec | ETA 00:16:372022-07-21 16:24:23 [INFO][TRAIN] epoch: 4, iter: 15610/20000, loss: 0.5791, lr: 0.002555, batch_cost: 0.2333, reader_cost: 0.10990, ips: 17.1431 samples/sec | ETA 00:17:042022-07-21 16:24:26 [INFO][TRAIN] epoch: 4, iter: 15620/20000, loss: 0.3018, lr: 0.002550, batch_cost: 0.2536, reader_cost: 0.12416, ips: 15.7742 samples/sec | ETA 00:18:302022-07-21 16:24:28 [INFO][TRAIN] epoch: 4, iter: 15630/20000, loss: 0.3361, lr: 0.002544, batch_cost: 0.2725, reader_cost: 0.14074, ips: 14.6801 samples/sec | ETA 00:19:502022-07-21 16:24:31 [INFO][TRAIN] epoch: 4, iter: 15640/20000, loss: 0.2910, lr: 0.002539, batch_cost: 0.2338, reader_cost: 0.10386, ips: 17.1115 samples/sec | ETA 00:16:592022-07-21 16:24:33 [INFO][TRAIN] epoch: 4, iter: 15650/20000, loss: 0.3814, lr: 0.002534, batch_cost: 0.2118, reader_cost: 0.08993, ips: 18.8901 samples/sec | ETA 00:15:212022-07-21 16:24:35 [INFO][TRAIN] epoch: 4, iter: 15660/20000, loss: 0.3085, lr: 0.002529, batch_cost: 0.2325, reader_cost: 0.11085, ips: 17.2013 samples/sec | ETA 00:16:492022-07-21 16:24:38 [INFO][TRAIN] epoch: 4, iter: 15670/20000, loss: 0.3170, lr: 0.002523, batch_cost: 0.2342, reader_cost: 0.11135, ips: 17.0821 samples/sec | ETA 00:16:532022-07-21 16:24:40 [INFO][TRAIN] epoch: 4, iter: 15680/20000, loss: 0.4325, lr: 0.002518, batch_cost: 0.2575, reader_cost: 0.11466, ips: 15.5344 samples/sec | ETA 00:18:322022-07-21 16:24:44 [INFO][TRAIN] epoch: 4, iter: 15690/20000, loss: 0.3639, lr: 0.002513, batch_cost: 0.4026, reader_cost: 0.18907, ips: 9.9358 samples/sec | ETA 00:28:552022-07-21 16:24:47 [INFO][TRAIN] epoch: 4, iter: 15700/20000, loss: 0.4558, lr: 0.002508, batch_cost: 0.2847, reader_cost: 0.11027, ips: 14.0497 samples/sec | ETA 00:20:242022-07-21 16:24:49 [INFO][TRAIN] epoch: 4, iter: 15710/20000, loss: 0.3030, lr: 0.002503, batch_cost: 0.2518, reader_cost: 0.11486, ips: 15.8861 samples/sec | 
ETA 00:18:002022-07-21 16:24:52 [INFO][TRAIN] epoch: 4, iter: 15720/20000, loss: 0.4171, lr: 0.002497, batch_cost: 0.2617, reader_cost: 0.12805, ips: 15.2824 samples/sec | ETA 00:18:402022-07-21 16:24:54 [INFO][TRAIN] epoch: 4, iter: 15730/20000, loss: 0.3975, lr: 0.002492, batch_cost: 0.2260, reader_cost: 0.09745, ips: 17.7015 samples/sec | ETA 00:16:042022-07-21 16:24:57 [INFO][TRAIN] epoch: 4, iter: 15740/20000, loss: 0.3671, lr: 0.002487, batch_cost: 0.2569, reader_cost: 0.12780, ips: 15.5676 samples/sec | ETA 00:18:142022-07-21 16:24:59 [INFO][TRAIN] epoch: 4, iter: 15750/20000, loss: 0.4847, lr: 0.002482, batch_cost: 0.2457, reader_cost: 0.11100, ips: 16.2828 samples/sec | ETA 00:17:242022-07-21 16:25:02 [INFO][TRAIN] epoch: 4, iter: 15760/20000, loss: 0.3049, lr: 0.002476, batch_cost: 0.2381, reader_cost: 0.10760, ips: 16.7972 samples/sec | ETA 00:16:492022-07-21 16:25:04 [INFO][TRAIN] epoch: 4, iter: 15770/20000, loss: 0.2701, lr: 0.002471, batch_cost: 0.2443, reader_cost: 0.11582, ips: 16.3759 samples/sec | ETA 00:17:132022-07-21 16:25:07 [INFO][TRAIN] epoch: 4, iter: 15780/20000, loss: 0.3293, lr: 0.002466, batch_cost: 0.2343, reader_cost: 0.10814, ips: 17.0753 samples/sec | ETA 00:16:282022-07-21 16:25:09 [INFO][TRAIN] epoch: 4, iter: 15790/20000, loss: 0.3796, lr: 0.002460, batch_cost: 0.2382, reader_cost: 0.11090, ips: 16.7911 samples/sec | ETA 00:16:422022-07-21 16:25:11 [INFO][TRAIN] epoch: 4, iter: 15800/20000, loss: 0.2752, lr: 0.002455, batch_cost: 0.2542, reader_cost: 0.12833, ips: 15.7381 samples/sec | ETA 00:17:472022-07-21 16:25:15 [INFO][TRAIN] epoch: 4, iter: 15810/20000, loss: 0.2905, lr: 0.002450, batch_cost: 0.3404, reader_cost: 0.15840, ips: 11.7494 samples/sec | ETA 00:23:462022-07-21 16:25:18 [INFO][TRAIN] epoch: 4, iter: 15820/20000, loss: 0.4268, lr: 0.002445, batch_cost: 0.3449, reader_cost: 0.17212, ips: 11.5963 samples/sec | ETA 00:24:012022-07-21 16:25:21 [INFO][TRAIN] epoch: 4, iter: 15830/20000, loss: 0.2623, lr: 0.002439, batch_cost: 0.2324, reader_cost: 0.11054, ips: 17.2092 samples/sec | ETA 00:16:092022-07-21 16:25:23 [INFO][TRAIN] epoch: 4, iter: 15840/20000, loss: 0.3059, lr: 0.002434, batch_cost: 0.2345, reader_cost: 0.11027, ips: 17.0558 samples/sec | ETA 00:16:152022-07-21 16:25:25 [INFO][TRAIN] epoch: 4, iter: 15850/20000, loss: 0.5222, lr: 0.002429, batch_cost: 0.2348, reader_cost: 0.11144, ips: 17.0360 samples/sec | ETA 00:16:142022-07-21 16:25:28 [INFO][TRAIN] epoch: 4, iter: 15860/20000, loss: 0.3866, lr: 0.002424, batch_cost: 0.2362, reader_cost: 0.11400, ips: 16.9382 samples/sec | ETA 00:16:172022-07-21 16:25:30 [INFO][TRAIN] epoch: 4, iter: 15870/20000, loss: 0.3467, lr: 0.002418, batch_cost: 0.2439, reader_cost: 0.11691, ips: 16.3998 samples/sec | ETA 00:16:472022-07-21 16:25:32 [INFO][TRAIN] epoch: 4, iter: 15880/20000, loss: 0.3281, lr: 0.002413, batch_cost: 0.2305, reader_cost: 0.10599, ips: 17.3538 samples/sec | ETA 00:15:492022-07-21 16:25:35 [INFO][TRAIN] epoch: 4, iter: 15890/20000, loss: 0.2739, lr: 0.002408, batch_cost: 0.2352, reader_cost: 0.10399, ips: 17.0075 samples/sec | ETA 00:16:062022-07-21 16:25:37 [INFO][TRAIN] epoch: 4, iter: 15900/20000, loss: 0.3812, lr: 0.002403, batch_cost: 0.2392, reader_cost: 0.11734, ips: 16.7249 samples/sec | ETA 00:16:202022-07-21 16:25:40 [INFO][TRAIN] epoch: 4, iter: 15910/20000, loss: 0.3547, lr: 0.002397, batch_cost: 0.2423, reader_cost: 0.11416, ips: 16.5100 samples/sec | ETA 00:16:302022-07-21 16:25:42 [INFO][TRAIN] epoch: 4, iter: 15920/20000, loss: 0.3626, lr: 0.002392, batch_cost: 
0.2403, reader_cost: 0.11142, ips: 16.6475 samples/sec | ETA 00:16:202022-07-21 16:25:45 [INFO][TRAIN] epoch: 4, iter: 15930/20000, loss: 0.4052, lr: 0.002387, batch_cost: 0.2464, reader_cost: 0.11963, ips: 16.2340 samples/sec | ETA 00:16:422022-07-21 16:25:47 [INFO][TRAIN] epoch: 4, iter: 15940/20000, loss: 0.2843, lr: 0.002381, batch_cost: 0.2680, reader_cost: 0.12545, ips: 14.9270 samples/sec | ETA 00:18:072022-07-21 16:25:51 [INFO][TRAIN] epoch: 4, iter: 15950/20000, loss: 0.3054, lr: 0.002376, batch_cost: 0.3529, reader_cost: 0.18425, ips: 11.3338 samples/sec | ETA 00:23:492022-07-21 16:25:54 [INFO][TRAIN] epoch: 4, iter: 15960/20000, loss: 0.3957, lr: 0.002371, batch_cost: 0.3149, reader_cost: 0.18443, ips: 12.7022 samples/sec | ETA 00:21:122022-07-21 16:25:57 [INFO][TRAIN] epoch: 4, iter: 15970/20000, loss: 0.3262, lr: 0.002366, batch_cost: 0.2664, reader_cost: 0.14262, ips: 15.0170 samples/sec | ETA 00:17:532022-07-21 16:25:59 [INFO][TRAIN] epoch: 4, iter: 15980/20000, loss: 0.2920, lr: 0.002360, batch_cost: 0.2508, reader_cost: 0.12519, ips: 15.9463 samples/sec | ETA 00:16:482022-07-21 16:26:01 [INFO][TRAIN] epoch: 4, iter: 15990/20000, loss: 0.3457, lr: 0.002355, batch_cost: 0.2275, reader_cost: 0.10235, ips: 17.5801 samples/sec | ETA 00:15:122022-07-21 16:26:04 [INFO][TRAIN] epoch: 4, iter: 16000/20000, loss: 0.3446, lr: 0.002350, batch_cost: 0.2304, reader_cost: 0.10264, ips: 17.3620 samples/sec | ETA 00:15:212022-07-21 16:26:04 [INFO]Start evaluating (total_samples: 7361, total_iters: 7361)...7361/7361 [==============================] - 502s 68ms/step - batch_cost: 0.0679 - reader cost: 0.0083: 19:38 - batch_cost: 0.1600 - rea - ETA: 16:40 - batch_cost: 0.1358 - reader cost: 0.04 - ETA: 16:22 - batch_cost: 0.1335 - rea - ETA: 13:58 - batch_cost: 0.1139 - r - ETA: 11:27 - batch_cost: 0.0936 - reader cost - ETA: 10:37 - batch_cost: 0.0868 - reader cost: 0. - ETA: 10:23 - batch_cost: 0.0850 - reader cost: 0.01 - ETA: 10:13 - batch_cost: 0.0836 - reader cost: - ETA: 9:49 - batch_cost: 0.0803 - reade - ETA: 8:41 - batch_cost: 0.0714 - reader cost: - ETA: 8:25 - batch_cost: 0.0693 - reader cost: 0.008 - ETA: 8:23 - batch_cost: 0.0690 - reader cost: - ETA: 8:08 - batch_cost: 0.0671 - rea - ETA: 7:47 - batch_cost: 0.0645 - read - ETA: 7:32 - batch_cost: 0.0626 - reader cost: 0. - ETA: 7:29 - batch_cost: 0.0623 - reader - ETA: 7:24 - batch_cost: 0.0618 - reader c - ETA: 7:16 - batch_cost: 0.0609 - reader cost: - ETA: 7:20 - batch_cost: 0.0614 - reader cost: 0 - ETA: 7:23 - batch_cost: 0.0619 - reader cost: 0.00 - ETA: 7:23 - batch_cost: 0.0620 - reader cost: 0 - ETA: 7:27 - batch_cost: 0.0626 - reader cost: 0.003 - ETA: 7:28 - batch_cost: 0.0627 - reader cost: 0 - ETA: 7:33 - batch_cost: 0.0635 - reader cost: 0.0 - ETA: 7:36 - batch_cost: 0.0639 - reader cost: 0.0 - ETA: 7:38 - batch_cost: 0.0643 - reader cost: 0.0 - ETA: 7:40 - batch_cost: 0.0645 - reader cost: 0.0 - ETA: 7:41 - batch_cost: 0.0647 - reader cost: 0.00 - ETA: 7:42 - batch_cost: 0.0649 - reader cost: 0.00 - ETA: 7:45 - batch_cost: 0.0653 - reader cost: 0.00 - ETA: 7:46 - batch_cost: 0.0655 - reader cost: 0.003 - ETA: 7:47 - batch_cost: 0.0657 - reader cost: 0.003 - ETA: 7:48 - batch_cost: 0.0658 - reader cost: 0.00 - ETA: 7:50 - batch_cost: 0.0660 - reader cost - ETA: 7:52 - batch_cost: 0.0665 - reader cost: - ETA: 7:54 - batch_cost: 0.0668 - reader cost: 0.0 - ETA: 7:54 - batch_cost: 0.0669 - reader cost: 0.0 - ETA: 7:54 - batch_cost: 0.0669 - reader cost: 0. 
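The per-iteration [TRAIN] records above are easier to read as curves than as raw text. The following is only a minimal sketch for doing that offline: it assumes the console output was saved to a file named train.log (a hypothetical name) in exactly the format shown above, and it uses the standard re and matplotlib libraries to extract and plot loss and learning rate against the iteration number.

import re
import matplotlib.pyplot as plt

# Hypothetical file holding the [TRAIN] log lines in the format shown above.
LOG_FILE = "train.log"

# Each record looks like:
# 2022-07-21 16:13:44 [INFO][TRAIN] epoch: 3, iter: 13040/20000, loss: 0.3337, lr: 0.003868, ...
PATTERN = re.compile(
    r"\[TRAIN\] epoch: (\d+), iter: (\d+)/\d+, loss: ([\d.]+), lr: ([\d.]+)"
)

iters, losses, lrs = [], [], []
with open(LOG_FILE, encoding="utf-8") as f:
    # finditer over the whole text also copes with records fused onto one line.
    for match in PATTERN.finditer(f.read()):
        _epoch, it, loss, lr = match.groups()
        iters.append(int(it))
        losses.append(float(loss))
        lrs.append(float(lr))

# Plot loss on the left axis and learning rate on the right axis.
fig, ax1 = plt.subplots(figsize=(8, 4))
ax1.plot(iters, losses, color="tab:blue", label="loss")
ax1.set_xlabel("iter")
ax1.set_ylabel("loss")
ax2 = ax1.twinx()
ax2.plot(iters, lrs, color="tab:orange", label="lr")
ax2.set_ylabel("lr")
fig.tight_layout()
plt.savefig("train_loss_lr.png")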
- ETA: 4:57 - batch_cost: 0.0683 - reader cost: 0.0 - ETA: 4:56 - batch_cost: 0.0684 - reader cost: 0.00 - ETA: 4:56 - batch_cost: 0.0684 - reader cost: 0.00 - ETA: 4:57 - batch_cost: 0.0684 - reader cost: 0.0 - ETA: 4:56 - batch_cost: 0.0685 - reader cost: - ETA: 4:56 - batch_cost: 0.0685 - reader cost: 0.01 - ETA: 4:56 - batch_cost: 0.0686 - reader cost: 0.010 - ETA: 4:56 - batch_cost: 0.0686 - ETA: 4:54 - batch_cost: 0.0686 - r - ETA: 4:53 - batch_cost: 0.0686 - reader cost: - ETA: 4:52 - batch_cost: 0.0686 - reader cost: 0.0 - ETA: 4:52 - batch_cost: 0.0686 - reader co - ETA: 4:51 - batch_cost: 0.0686 - reader cost: 0.0 - ETA: 4:51 - batch_cost: 0.0686 - reader cost: 0.0 - ETA: 4:51 - batch_cost: 0.0686 - reader cost: 0.0 - ETA: 4:51 - batch_cost: 0.0686 - reader cost: 0.01 - ETA: 4:50 - batch_cost: 0.0686 - reader cost: - ETA: 4:50 - batch_cost: 0.0687 - reader cost: - ETA: 4:50 - batch_cost: 0.0687 - reader cost: 0.0 - ETA: 4:50 - batch_cost: 0.0687 - reader cost: - ETA: 4:50 - batch_cost: 0.0688 - reader cost: 0.01 - ETA: 4:50 - batch_cost: 0.0689 - reader cost: 0.01 - ETA: 4:49 - batch_cost: 0.0689 - reader cost: 0.010 - ETA: 4:49 - batch_cost: 0.0689 - reader co - ETA: 4:49 - batch_cost: 0.0689 - reader cost: 0.01 - ETA: 4:49 - batch_cost: 0.0690 - reader cos - ETA: 4:48 - batch_cost: 0.0690 - reader co - ETA: 4:48 - batch_cost: 0.0691 - reader cost: - ETA: 4:48 - batch_cost: 0.0691 - reader cost: 0 - ETA: 4:47 - batch_cost: 0.0692 - reader - ETA: 4:46 - batch_cost: 0.0692 - read - ETA: 4:45 - batch_cost: 0.0691 - reade - ETA: 4:44 - batch_cost: 0.0691 - reader cost: 0.010 - ETA: 4:43 - batch_cost: 0.0691 - reader cost: 0.01 - ETA: 4:43 - batch_cost: 0.0691 - reader cost: 0.010 - ETA: 4:43 - batch_cost: 0.0691 - reader - ETA: 4:42 - batch_cost: 0.0691 - reader cost: 0.0 - ETA: 4:42 - batch_cost: 0.0691 - re - ETA: 4:41 - batch_cost: 0.0691 - reader - ETA: 4:40 - batch_cost: 0.0691 - reader cost: 0.01 - ETA: 4:40 - batch_cost: 0.0691 - reader cost: 0.0 - ETA: 4:39 - batch_cost: 0.0691 - reader cost - ETA: 4:39 - batch_cost: 0.0691 - reader - ETA: 4:38 - batch_cost: 0.0692 - reader cost: 0.010 - ETA: 4:38 - batch_cost: 0.0692 - reader cost: 0.010 - ETA: 4:37 - batch_cost: 0.0692 - reader cost: 0.01 - ETA: 4:37 - batch_cost: 0.0692 - reader cost: 0.010 - ETA: 4:37 - batch_cost: 0.0692 - reader cost: 0.0 - ETA: 4:37 - batch_cost: 0.0692 - reader cost: - ETA: 4:36 - batch_cost: 0.0692 - reader cost: 0.010 - ETA: 4:36 - batch_cost: 0.0692 - reader cost: 0.0 - ETA: 4:36 - batch_cost: 0.0691 - reader cost: 0.01 - ETA: 4:36 - batch_cost: 0.0691 - reade - ETA: 4:34 - batch_cost: 0.0691 - reader cost: 0.0 - ETA: 4:34 - batch_cost: 0 - ETA: 4:32 - batch_cost: 0.0691 - read - ETA: 4:30 - batch_cost: 0.0691 - reader co - ETA: 4:30 - batch_cost: 0.0691 - reader - ETA: 4:29 - batch_cost: 0.06 - ETA: 4:27 - batch_cost: 0.0692 - reader co - ETA: 4:26 - batch_cost: 0.0692 - reader c - ETA: 4:25 - batch_cost: 0.0692 - re - ETA: 4:24 - batch_cost: 0.0693 - reader cost: 0.01 - ETA: 4:24 - batch_cost: 0.0692 - reader cost: 0.0 - ETA: 4:23 - batch_cost: 0.0692 - re - ETA: 4:22 - batch_cost: 0.0693 - reader cost: - ETA: 4:22 - batch_cost: 0.0693 - read - ETA: 4:20 - batch_cost: 0.0693 - reader cost: 0.01 - ETA: 4:20 - batch_cost: 0.0693 - reader cost: 0.0 - ETA: 4:20 - batch_cost: 0.0693 - reader cost: 0.0 - ETA: 4:20 - batch_cost: 0.0693 - reader cos - ETA: 4:19 - batch_cost: 0.0693 - reader cost: 0 - ETA: 4:19 - batch_cost: 0.0693 - reader cost: - ETA: 4:19 - batch_cost: 0.0694 - reader cost: 0. 
- ETA: 4:18 - batch_cost: 0.0694 - reader cost: 0.01 - ETA: 4:18 - batch_cost: 0.0694 - reader cost: 0 - ETA: 4:18 - batch_cost: 0.0695 - reader cost: 0.01 - ETA: 4:18 - batch_cost: 0.0695 - reader cost: 0.010 - ETA: 4:18 - batch_cost: 0.0695 - reader cost: 0.0 - ETA: 4:18 - batch_cost: 0.0 - ETA: 4:17 - batch_cost: 0.0697 - reader cost: - ETA: 4:16 - batch_cost: 0.0696 - reader cost: 0.010 - ETA: 4:16 - batch_cost: 0.0697 - reader cos - ETA: 4:15 - batch_cost: 0.0696 - reader - ETA: 4:14 - batch_cost: 0.0696 - reader cost - ETA: 4:13 - batch_cost: 0.0696 - reader - ETA: 4:11 - batch_cost: 0.0696 - reader cost: 0.010 - ETA: 4:11 - batch_cost: 0.0696 - reader cost: - ETA: 4:11 - batch_cost: 0.0696 - reader cost: 0.0 - ETA: 4:11 - batch_cost: 0.0696 - reader cost: 0. - ETA: 4:10 - batch_cost: 0.0696 - reader cost: 0 - ETA: 4:10 - batch_cost: 0.0696 - reader cost: 0. - ETA: 4:09 - batch_cost: 0.0696 - reader cost: 0.010 - ETA: 4:09 - batch_cost: 0.0696 - reader cost: 0. - ETA: 4:09 - batch_cost: 0.0696 - reader cost: 0.01 - ETA: 4:09 - batch_cost: 0.0696 - reader cost: 0.0 - ETA: 4:08 - batch_cost: 0.0696 - reader cost: 0.010 - ETA: 4:08 - batch_cost: 0.0696 - re - ETA: 4:07 - batch_cost: 0.0696 - reader cost: 0.010 - ETA: 4:07 - batch_cost: 0.0696 - reader co - ETA: 4:05 - batch_cost: 0.0696 - reader - ETA: 4:04 - batch_cost: 0.0696 - reader - ETA: 4:03 - batch_cost: 0.0695 - reader - ETA: 4:02 - batch_cost: 0.0696 - reader cost: 0. - ETA: 4:02 - batch_cost: 0.0696 - reader cost: 0. - ETA: 4:02 - batch_cost: 0.0697 - reader cost: 0.011 - ETA: 4:02 - batch_cost: 0.0697 - reader cost: - ETA: 4:02 - batch_cost: 0.0698 - reader cost: 0.0 - ETA: 4:02 - batch_cost: 0.0698 - reader cost: 0.01 - ETA: 4:02 - batch_cost: 0.0698 - reader cost: 0.0 - ETA: 4:01 - batch_cost: 0.0698 - reader cost: 0 - ETA: 4:01 - batch_cost: 0.0698 - reader cost: 0.011 - ETA: 4:01 - batch_cost: 0.0698 - reader cost: 0 - ETA: 4:01 - batch_cost: 0.0699 - reader cost: 0 - ETA: 4:00 - batch_cost: 0.0699 - reader cost: 0 - ETA: 4:00 - batch_cost: 0.0698 - reader cost: 0.01 - ETA: 4:00 - batch_cost: 0.0698 - reader cost - ETA: 3:59 - batch_cost: 0.0698 - reader cost - ETA: 3:58 - batch_cost: 0.0698 - reader cost: 0.011 - ETA: 3:58 - batch_cost: 0.0698 - reader co - ETA: 3:57 - batch_cost: 0.0698 - reader cos - ETA: 3:56 - batch_cost: 0.0698 - reader cost: 0.011 - ETA: 3:56 - batch_cost: 0.0698 - reader cost: 0.01 - ETA: 3:56 - batch_cost: 0.0698 - reader cost: 0. - ETA: 3:56 - batch_cost: 0.0698 - reader cost: - ETA: 3:55 - batch_cost: - ETA: 3:52 - batch_cost: 0.0698 - reader cost: 0.01 - ETA: 3:52 - batch_cost: 0.0698 - reader cost: 0. - ETA: 3:52 - batch_cost: 0.0697 - reader cost: - ETA: 3:51 - batch_cost: 0.0698 - reader cost: 0.0 - ETA: 3:51 - batch_cost: 0.0697 - reader cost: 0.01 - ETA: 3:51 - batch_cost: 0.0697 - reader cost: - ETA: 3:50 - batch_cost: 0.0697 - reader cost: 0.01 - ETA: 3:50 - batch_cost: 0.0697 - reader cost: 0 - ETA: 3:49 - batch_cost: 0.0698 - reader cost: 0.01 - ETA: 3:49 - batch_cost: 0.0698 - reader cost: 0. - ETA: 3:49 - batch_cost: 0.0698 - reader cost: 0.011 - ETA: 3:49 - batch_cost: 0.0698 - reader cost: 0. - ETA: 3:49 - batch_cost: 0.0698 - reader cost: 0. - ETA: 3:49 - batch_cost: 0.0699 - reader cost: 0.0 - ETA: 3:48 - batch_cost: 0.0699 - reader cost: 0. 
- ETA: 3:48 - batch_cost: 0.0699 - reader cost: 0.011 - ETA: 3:48 - batch_cost: 0.0699 - reader cost: 0.01 - ETA: 3:48 - batch_cost: 0.0699 - reader cost: 0.0 - ETA: 3:48 - batch_cost: 0.0700 - reader cost: 0.01 - ETA: 3:48 - batch_cost: 0.0700 - reader cost: 0.0 - ETA: 3:48 - batch_cost: 0.0700 - reader cost: 0.0 - ETA: 3:47 - batch_cost: 0.0700 - reader co - ETA: 3:46 - batch_cost: 0.0700 - reader cost: 0.011 - ETA: 3:46 - batch_cost: 0.0700 - reader cost: 0. - ETA: 3:46 - batch_cost: 0.0700 - reader cost: 0.011 - ETA: 3:46 - batch_cost: 0.0700 - reader cost: 0.011 - ETA: 3:46 - batch_cost: 0.0700 - reader cost: - ETA: 3:45 - batch_cost: 0.0700 - reader co - ETA: 3:44 - batch_cost: 0 - ETA: 3:42 - batch_cost: 0.0700 - reader co - ETA: 3:41 - batch_cost: 0.0700 - reader cost: 0.01 - ETA: 3:41 - batch_cost: 0.0700 - reader cost - ETA: 3:40 - batch_cost: 0.0700 - reader cost: 0.011 - ETA: 3:40 - batch_cost: 0.0700 - reader cost: 0.01 - ETA: 3:40 - batch_cost: 0.0700 - reader cost: 0.011 - ETA: 3:40 - batch_cost: 0.0700 - rea - ETA: 3:38 - batch_cost: 0.0700 - reader cost: 0. - ETA: 3:38 - batch_cost: 0.0700 - reader cost: - ETA: 3:37 - batch_cost: 0.0700 - reader cost: 0.0 - ETA: 3:37 - batch_cost: 0.0699 - reader cost - ETA: 3:36 - batch_cost: 0.0700 - reader cost: 0.01 - ETA: 3:36 - batch_cost: 0.0699 - reader cost: 0.0 - ETA: 3:36 - batch_cost: 0.0699 - reader cost: 0.011 - ETA: 3:36 - batch_cost: 0.0699 - reader cost: 0 - ETA: 3:35 - batch_cost: 0.0699 - reader co - ETA: 3:34 - batch_cost: 0.0699 - reader cos - ETA: 3:34 - batch_cost: 0.0699 - reader cost: 0.0 - ETA: 3:33 - batch_cost: 0.0699 - read - ETA: 3:32 - batch_cost: 0.0699 - reader cost: 0.01 - ETA: 3:32 - batch_cost: 0.0699 - reader cost: 0 - ETA: 3:31 - batch_cost: 0.0699 - reader cost: 0.01 - ETA: 3:31 - batch_cost: 0.0699 - reader cost: 0.01 - ETA: 3:31 - batch_cost: 0.0699 - reader cost - ETA: 3:31 - batch_cost: 0.0700 - reader cost: - ETA: 3:30 - batch_cost: 0.0700 - reader cost: 0.01 - ETA: 3:30 - batch_cost: 0.0700 - reader cost: - ETA: 3:29 - batch_cost: 0.0700 - reader co - ETA: 3:29 - batch_cost: 0.0700 - reader - ETA: 3:27 - batch_cost: 0.0700 - reader cost: 0.011 - ETA: 3:27 - batch_cost: 0.0700 - reader - ETA: 3:26 - batch_cost: 0.0700 - reader cost: 0.011 - ETA: 3:26 - batch_cost: 0.0700 - reader cost: - ETA: 3:26 - batch_cost: 0.0700 - reader cost: 0.011 - ETA: 3:26 - batch_cost: 0.0700 - reader cost: 0. - ETA: 3:26 - batch_cost: 0.0700 - reader cost: 0.011 - ETA: 3:25 - batch_cost: 0.0700 - reader cost: 0. - ETA: 3:25 - batch_cost: 0.0700 - rea - ETA: 3:24 - batch_cost: 0.0700 - reader cost: 0.01 - ETA: 3:24 - batch_cost: 0.0700 - reader cost: 0. - ETA: 3:23 - batch_cost: 0.0700 - reader cost: - ETA: 3:23 - batch_cost: 0.0700 - reader cost: 0.011 - ETA: 3:23 - batch_cost: 0.0700 - reader cost: 0 - ETA: 3:22 - batch_cost: 0.0700 - reader cost: 0.0 - ETA: 3:22 - batch_cost: 0.0700 - reader cost: 0.01 - ETA: 3:22 - batch_cost: 0.0700 - reader cost: 0.011 - ETA: 3:22 - batch_cost: 0.0700 - rea - ETA: 3:20 - batch_cost: 0.0700 - reader - ETA: 3:19 - batch_c - ETA: 3:16 - batch_cost: 0.0700 - reader cost: 0.0 - ETA: 3:16 - batch_cost: 0.0700 - reader cost: 0.011 - ETA: 3:16 - batch_cost: 0.0700 - reader cost: 0. - ETA: 3:15 - batch_cost: 0.0700 - reader cos - ETA: 3:15 - batch_cost: 0.0701 - reader cost - ETA: 3:14 - batch_cost: 0.0701 - reader cost: 0.011 - ETA: 3:14 - batch_cost: 0.0701 - reader cost: - ETA: 3:14 - batch_cost: 0.0701 - reader cost: 0. 
- ETA: 3:14 - batch_cost: 0.0702 - reader cost: 0.01 - ETA: 3:13 - batch_cost: 0.0702 - reader cost: - ETA: 3:13 - batch_cost: 0.0702 - reader cost: 0.0 - ETA: 3:13 - batch_cost: 0.0702 - reader cost: 0.011 - ETA: 3:13 - batch_cost: 0.0702 - reader cost - ETA: 3:12 - batch_cost: 0.0702 - reader co - ETA: 3:11 - batch_cost: 0.0702 - reader cost: 0. - ETA: 3:11 - batch_cost: 0.0702 - reader cost: 0.01 - ETA: 3:11 - batch_cost: 0.0702 - reader cost: 0.011 - ETA: 3:11 - batch_cost: 0.0702 - reader cost: 0.01 - ETA: 3:11 - batch_cost: 0.0702 - reader cos - ETA: 3:10 - batch_cost: 0.0702 - reader cost: 0.01 - ETA: 3:09 - batch_cost: 0.0701 - reader cos - ETA: 3:08 - batch_cost: 0.0701 - reader cost: 0.011 - ETA: 3:08 - batch_cost: 0.0701 - reader cost: 0 - ETA: 3:08 - batch_cost: 0.0701 - reader cost: 0 - ETA: 3:07 - batch_cost: 0.0701 - reader cost: 0.01 - ETA: 3:07 - batch_cost: 0.0701 - reader co - ETA: 3:06 - batch_cost: 0.0701 - re - ETA: 3:04 - batch_cost: 0.0701 - reader cost: - ETA: 3:04 - batch_cost: 0.0701 - read - ETA: 3:03 - batch_cost: 0.0701 - reader cost: 0.0 - ETA: 3:03 - batch_cost: 0.0701 - reader cost: - ETA: 3:02 - batch_cost: 0.0702 - reader cost: 0. - ETA: 3:02 - batch_cost: 0.0702 - reade - ETA: 3:01 - batch_cost: 0.0704 - reader cost: 0. - ETA: 3:01 - batch_cost: 0.070 - ETA: 2:59 - batch_cost: 0.0704 - reader cost: - ETA: 2:59 - batch_cost: 0.0704 - reader cost: - ETA: 2:58 - batch_cost: 0.0704 - reader cost: - ETA: 2:57 - batch_cost: 0.0704 - reader cost: 0.01 - ETA: 2:57 - batch_cost: 0.0704 - reader cost: 0.011 - ETA: 2:57 - batch_cost: 0.0704 - reader cost: 0.011 - ETA: 2:57 - batch_cost: 0.0704 - read - ETA: 2:56 - batch_cost: 0.0704 - reader cost: 0.011 - ETA: 2:56 - batch_cost: 0.0704 - reader cost: 0.011 - ETA: 2:56 - batch_cost: 0.0704 - reader cost - ETA: 2:55 - batch_cost: 0.0704 - reader cost: 0.011 - ETA: 2:55 - batch_cost: 0.0704 - re - ETA: 2:54 - batch_cost: 0.0704 - reader c - ETA: 2:53 - batch_cost: 0.0704 - reader cost: 0.0 - ETA: 2:52 - batch_cost: 0.0704 - reader cost: 0.011 - ETA: 2:52 - batch_cost: 0.0704 - reade - ETA: 2:51 - batch_cost: 0.0704 - reader - ETA: 2:50 - batch_cost: 0.0704 - reader cost: - ETA: 2:50 - batch_cost: 0.0704 - reader cost: 0.01 - ETA: 2:49 - batch_cost: 0.0704 - reader cost: 0.01 - ETA: 2:49 - batch_cost: 0.0 - ETA: 2:47 - batch_cost: 0.0705 - reader cost: 0.01 - ETA: 2:47 - batch_cost: 0.0705 - reader cost: 0.0 - ETA: 2:47 - batch_cost: 0.0705 - reader cost: 0.011 - ETA: 2:47 - batch_cost: 0.0705 - reader cost: 0.01 - ETA: 2:47 - batch_cost: 0.0705 - reader cost: 0.011 - ETA: 2:47 - batch_cost: 0.0705 - reader cost: 0.0 - ETA: 2:47 - batch_cost: 0.0706 - reader cost: - ETA: 2:46 - batch_cost: 0.0706 - reader cost: 0. - ETA: 2:46 - batch_cost: 0.0706 - reader cost: - ETA: 2:45 - batch_cost: 0.0707 - reader cost: 0. - ETA: 2:45 - batch_cost: 0.0707 - reader cost: 0.011 - ETA: 2:45 - batch_cost: 0.0707 - reader cost: 0 - ETA: 2:44 - batch_cost: 0.0706 - reader cost: - ETA: 2:44 - batch_cost: 0.0706 - reader cos - ETA: 2:43 - batch_cost: 0.0706 - reader cost: 0.011 - ETA: 2:43 - batch_cost: 0.0706 - reader cost: 0. 
- ETA: 2:42 - batch_cost: 0.0706 - reader cost - ETA: 2:42 - batch_cost: 0.0706 - reader cost: 0.011 - ETA: 2:41 - batch_cost: 0.0706 - reader cost: 0.0 - ETA: 2:41 - batch_cost: 0.0706 - reader cost: - ETA: 2:40 - batch_cost: 0.0706 - reader cost: 0.0 - ETA: 2:40 - batch_cost: 0.0706 - reader cost: 0.011 - ETA: 2:40 - batch_cost: 0.0706 - reader cost: 0.0 - ETA: 2:40 - batch_cost: 0.0706 - reader cost: - ETA: 2:39 - batch_cost: 0.0706 - - ETA: 2:36 - batch_cost: 0.0705 - reader co - ETA: 2:35 - batch_cost: 0.0704 - reader cost: 0.011 - ETA: 2:35 - batch_cost: 0.0705 - reader - ETA: 2:33 - batch_cost: 0.0704 - reader cost: 0 - ETA: 2:33 - batch_cost: 0.0704 - reader cost: - ETA: 2:32 - batch_cost: 0.0704 - reader - ETA: 2:30 - batch_cost: 0.0703 - reader cost: 0 - ETA: 2:30 - batch_cost: 0.0703 - reader cost: 0.0 - ETA: 2:30 - batch_cost: 0.0703 - reader cost: 0. - ETA: 2:29 - batch_cost: 0.0703 - reader cost: 0. - ETA: 2:29 - batch_cost: 0.0704 - reader cost: 0.011 - ETA: 2:29 - batch_cost: 0.0704 - reader cost: - ETA: 2:29 - batch_cost: 0.0704 - reader co - ETA: 2:27 - batch_cost: 0.0703 - reader cost: 0.0 - ETA: 2:27 - batch_cost: 0.0703 - reader cost: 0.0 - ETA: 2:27 - batch_cost: 0.0703 - reader cost: 0.011 - ETA: 2:26 - batch_cost: 0.0703 - reader co - ETA: 2:25 - batch_cost: 0.0703 - reader cost: - ETA: 2:24 - batch_cost: 0.0702 - reader cost: - ETA: 2:23 - batch_cost: 0.0702 - reader cost - ETA: 2:22 - batch_cost: 0.0701 - reader cost: 0.01 - ETA: 2:22 - batch_cost: 0.0701 - reader cost: 0 - ETA: 2:21 - batch_cost: 0.0701 - reader cost: 0.0 - ETA: 2:21 - batch_cost: 0.0701 - reader cost: 0.011 - ETA: 2:21 - batch_cost: 0.0701 - reader cost: - ETA: 2:20 - batch_cost: 0.0701 - reader cost: 0.0 - ETA: 2:19 - batch_cost: 0.0700 - reader cost: 0.01 - ETA: 2:19 - batch_cost: 0.0700 - reader cost: - ETA: 2:18 - batch_cost: 0.0700 - reader co - ETA: 2:17 - batch_cost: 0.0700 - reader cost: 0.01 - ETA: 2:17 - batch_cost: 0.0700 - reader cost: 0.011 - ETA: 2:17 - batch_cost: 0.0700 - reader cost - ETA: 2:16 - batch_cost: 0.0699 - reader cost: 0.011 - ETA: 2:16 - batch_cost: 0.0699 - rea - ETA: 2:14 - batch_cost: 0.0699 - reader cost: 0 - ETA: 2:13 - batch_cost: 0.0698 - reader cost: 0.01 - ETA: 2:13 - batch_cost: 0.0698 - - ETA: 2:10 - batch_cost: 0.0698 - reader cos - ETA: 2:09 - batch_cost: 0.0697 - reader cost: 0 - ETA: 2:09 - batch_cost: 0.0697 - reader cos - ETA: 2:07 - batch_cost: 0.0697 - reader cost: 0.01 - ETA: 2:07 - batch_cost: - ETA: 2:04 - batch_cost: 0.0697 - reader cost: 0. - ETA: 2:04 - batch_cost: 0.0697 - reader cost: 0.01 - ETA: 2:04 - batch_cost: 0.0697 - reader cost: 0.01 - ETA: 2:04 - batch_cost: 0.0697 - reader cost: 0.0 - ETA: 2:03 - batch_cost: 0.0697 - reader cost: 0.01 - ETA: 2:03 - batch_cost: 0.0697 - reader cost - ETA: 2:02 - batch_cost: 0.0697 - reader cost: 0.01 - ETA: 2:02 - batch_cost: 0.0697 - reader cost: 0.010 - ETA: 2:02 - batch_cost: 0.0697 - reader cost: - ETA: 2:01 - batch_cost: 0.0697 - reader cost: 0 - ETA: 2:01 - batch_cost: 0.0697 - reader cost: 0.010 - ETA: 2:01 - batch_cost: 0.0697 - rea - ETA: 1:58 - batch_cost: 0.0696 - reader cost: 0.0 - ETA: 1:58 - batch_cost: 0.0696 - reader cost: 0.0 - ETA: 1:58 - batch_cost: 0.0696 - reader cost: 0.01 - ETA: 1:57 - batch_cost: 0.0696 - reader cost: 0.010 - ETA: 1:57 - batch_cost: 0.0696 - reader c - ETA: 1:56 - batch_cost: 0.0695 - reader cost: 0.0 - ETA: 1:56 - batch_cost: 0.0695 - reader cost: - ETA: 1:55 - batch_cost: 0.0695 - re - ETA: 1:53 - batch_cost: 0.0695 - reader cost: 0. 
- ETA: 1:53 - batch_cost: 0.0695 - reader cost: 0.01 - ETA: 1:53 - batch_cost: 0.0695 - reader cost: 0.010 - ETA: 1:52 - batch_cost: 0.0695 - reader cost: 0.01 - ETA: 1:52 - batch_cost: 0.0695 - reader cost: 0.01 - ETA: 1:52 - batch_cost: 0.0695 - reader cost - ETA: 1:51 - batch_cost: 0.0696 - reader cost: 0.01 - ETA: 1:51 - batch_cost: 0.0696 - reader cos - ETA: 1:51 - batch_cost: 0.0696 - reader cost: 0.0 - ETA: 1:50 - batch_cost: 0.0696 - reader c - ETA: 1:49 - batch_cost: 0.0695 - reader cost: 0 - ETA: 1:48 - batch_cost: 0.0695 - reader cost: - ETA: 1:47 - batch_cost: 0.0695 - reader cost: 0.010 - ETA: 1:47 - batch_cost: 0.0695 - reader cost: 0.01 - ETA: 1:47 - batch_cost: 0.0695 - reader cost: 0.010 - ETA: 1:47 - batch_cost: 0.0695 - reade - ETA: 1:45 - batch_cost: 0.0694 - reader cost: 0 - ETA: 1:44 - batch_cost: 0.0694 - reader cost: - ETA: 1:43 - batch_cost: 0.0694 - reader cost: - ETA: 1:43 - batch_cost: 0.0693 - reade - ETA: 1:41 - batch_cost: 0.0693 - reader cost: 0.0 - ETA: 1:41 - batch_cost: 0.0693 - reader cost: - ETA: 1:39 - batch_cost: 0.0692 - reader cost: 0.01 - ETA: 1:39 - batch_cost: 0.0692 - reader cost: - ETA: 1:38 - batch_cost: 0.0692 - reader cost: 0.01 - ETA: 1:38 - batch_cost: 0.0692 - reader c - ETA: 1:37 - batch_cost: 0.0692 - reader cost: 0.0 - ETA: 1:36 - batch_cost: 0.0691 - reader - ETA: 1:35 - batch_cost: 0.0691 - reader - ETA: 1:33 - batch_cost: 0.0691 - reader cost: 0.0 - ETA: 1:33 - batch_cost: 0.0691 - reader cost: 0. - ETA: 1:33 - batch_cost: 0.0690 - reader cost: 0.0 - ETA: 1:32 - batch_cost: 0.0690 - reader cost: 0.01 - ETA: 1:32 - batch_cost: 0.0690 - reader cost: 0. - ETA: 1:31 - batch_cost: 0.0690 - reader cost: 0.01 - ETA: 1:31 - batch_cost: 0.0690 - reader co - ETA: 1:30 - batch_cost: 0.0690 - reader - ETA: 1:29 - batch_cost: 0.0690 - reader c - ETA: 1:28 - batch_cost: 0.0690 - reader co - ETA: 1:27 - batch_cost: 0.0690 - reader cost: 0.0 - ETA: 1:27 - batch_cost: 0.0690 - reader cost: 0.01 - ETA: 1:27 - batch_cost: 0.0690 - reader cost: 0. - ETA: 1:26 - batch_cost: 0.0690 - - ETA: 1:24 - batch_cost: 0.0690 - rea - ETA: 1:23 - batch_cost: 0.0690 - reade - ETA: 1:21 - batch_cost: 0.0690 - reader cost: - ETA: 1:20 - batch_cost: 0.0690 - reader cost: - ETA: 1:19 - batch_cost: 0.0689 - reader - ETA: 1:18 - batch_cost: 0.0689 - reader cost: - ETA: 1:17 - batch_cost: 0.0689 - reader cost: - ETA: 1:16 - batch_cost: 0.0688 - reader cost: - ETA: 1:16 - batch_cost: 0.0688 - reader c - ETA: 1:14 - batch_cost: 0.0688 - reader cost: 0. - ETA: 1:14 - batch_cost: 0.0688 - reader cost: 0.009 - ETA: 1:14 - batch_cost: 0.0688 - reader cost: 0. - ETA: 1:14 - batch_cost: 0.0688 - reader cost: - ETA: 1:13 - batch_cost: 0.0689 - reader cost: 0.009 - ETA: 1:13 - batch_cost: 0.0689 - reader cost: 0 - ETA: 1:13 - batch_cost: 0.0689 - reader cost - ETA: 1:12 - batch_cost: 0.0689 - reader cost: 0.009 - ETA: 1:12 - batch_cost: 0.0689 - - ETA: 1:10 - batch_cost: 0.0688 - reader cost: 0. - ETA: 1:09 - batch_cost: 0.0688 - reader cost: - ETA: 1:08 - batch_cost: 0.0688 - - ETA: 1:06 - batch_cost: 0.0687 - reader cost: 0.00 - ETA: 1:05 - batch_cost: 0.0687 - reader cost: 0.00 - ETA: 1:05 - batch_cost: 0.0687 - reader co - ETA: 1:04 - batch_cost: 0.0687 - reader cost: 0.009 - ETA: 1:04 - batch_cost: 0.0686 - ETA: 1:01 - batch_cost: 0.0686 - reader cos - ETA: 1:00 - batch_cost: 0.0685 - reader cost: 0.0 - ETA: 1:00 - batch_cost: 0.0685 - - ETA: 59s - batch_cost: 0.0685 - reader cost: 0.00 - ETA: 58s - batch_cost: 0.0685 - reader cost: 0. 
- ETA: 58s - batch_cost: 0.0685 - reader cost - ETA: 58s - batch_cost: 0.0685 - reader cost - ETA: 57s - batch_cost: 0.0685 - reade - ETA: 57s - batch_cost: 0. - ETA: 56s - ETA: 49s - batch_cost: 0.0683 - reader cost: 0. - ETA: 49s - batch_cost: 0.0683 - rea - ETA: 48s - batch_cost: 0.0683 - reader cost: - ETA: 48s - batch_cost: 0.0683 - reader cost: 0. - ETA: 48s - batch_cost: 0.0683 - reader - ETA: 48s - batch_cost: 0.0683 - reader co - ETA: 47s - batch_cost: 0.0683 - reader cost: 0. - ETA: 47s - batch_cost: 0.0684 - reader cost: 0. - ETA: 47s - batch_cost: 0.0683 - r - ETA: 46s - batch_cost: 0.0684 - reader cost: 0.00 - ETA: 46s - batch_cost: 0.0684 - reade - ETA: 45s - batch_cost: 0.0684 - reader cost: 0.00 - ETA: 45s - batch_cost: 0.06 - ETA: 44s - batch_cost: 0.0684 - reader co - ETA: 44s - batch_cost: 0.0684 - reade - ETA: 43s - batch_cost: 0.0683 - reader cost: 0. - ETA: 42s - batch_cost: 0.0683 - rea - ETA: 42s - batch_cost: 0.0683 - r - ETA: 41s - batch_cost: 0.0683 - reader - ETA: 40s - batch_cost: 0.0683 - reader cost: - ETA: 40s - batch_cost: 0.0683 - reader cost: 0.00 - ETA: 40s - batch_cost: 0.0683 - reader cost: 0. - ETA: 39s - batch_cost: 0.0683 - reader cost: - ETA: 39s - batch_cost: 0.0683 - reader cost: 0.00 - ETA: 39s - batch_cost: 0.0683 - reader cost: 0. - ETA: 39s - batch_cost: 0.0683 - reader - ETA: 39s - batch_cost: 0.0684 - reader cost: 0.00 - ETA: 39s - batch_cost: 0.0684 - rea - ETA: 38s - batch_cost: 0.0684 - reader cost: 0. - ETA: 38s - batch_cost: 0.0684 - reader - ETA: 37s - batch_cost: 0.0684 - reader co - ETA: 37s - batch_cost: 0.0684 - reader cost: 0.00 - ETA: 36s - batch_cost: 0.0684 - reader cost - ETA: 36s - batch_cost - ETA: 34s - batch_cost: 0.0683 - reader cost: 0.00 - ETA: 34s - batch_cost: 0.0683 - reader cost: 0.00 - ETA: 34s - batch_cost: 0.0683 - reader cost: - ETA: 34s - batch_cost: 0.0683 - reader - ETA: 33s - batch_cost: 0. - ETA: 31s - batch_cost: 0. - ETA: 30s - batch_cost: 0.0683 - reader - ETA: 29s - batch_cost: 0.0683 - reader cost - ETA: 29s - batch_cost: 0.0683 - reader cost: 0.00 - ETA: 29s - batch_ - ETA: 24s - batch_cost: 0.0682 - reader cost: 0.00 - ETA: 24s - batch_cost: 0.0682 - reader cost - ETA: 23s - batch_cost: 0.0681 - ETA: 22s - batch_cost: 0.0681 - reader cost: 0. - ETA: 22s - batch_cost: 0.0681 - reade - ETA: 21s - batch_cost: 0.0681 - reader - ETA: 20s - batch_cost: 0.0681 - reader co - ETA: 20s - batch_cost: 0.0681 - reader co - ETA: 19s - batch_cost: 0.0681 - reade - ETA: 18s - batch_cost: 0.0680 - reader co - ETA: 18s - batch_cost: 0.0680 - reader cost: - ETA: 17s - batch_cost: 0.0680 - ETA: 16s - batch_cost: 0.0680 - reader cost: 0. - ETA: 16s - batch_cost: 0.0680 - reader cost: - ETA: 15s - batch_cost: 0.0680 - reader cost: 0. - ETA: 15s - batch_cost: 0.0680 - reader cost: - ETA: 15s - batch_cost: 0.0680 - reader cost: 0. - ETA: 15s - batch_cost: 0.0680 - reader cost: 0. - ETA: 15s - batch_cost: 0.0680 - reader cost: - ETA: 14s - batch_cost: 0.0680 - reader cost: 0. - ETA: 14s - batch_cost: 0.0680 - reader cost: 0. 
- ETA: 14s - batch_cost: 0.0680 - reader cost: - ETA: 14s - batch_cost: 0.0680 - reader - ETA: 13s - batch_cost: 0.0680 - reader cost: 0.00 - ETA: 13s - batch_cost: 0.0680 - reader co - ETA: 13s - batch_cost: 0.0681 - rea - ETA: 12s - batch_cost: 0.0681 - reader cost: - ETA: 12s - batch_cost: 0.0681 - reader cost - ETA: 11s - batch_cost: 0.0681 - reader cost: - ETA: 11s - batch_cost: 0.0681 - reader cost: - ETA: 11s - batch_cost: 0.0681 - reader cost - ETA: 10s - batch_cost: 0.0681 - reader cost - ETA: 10s - batch_cost: 0.0681 - reader cost: 0. - ETA: 10s - batch_cost: 0.0680 - reader - ETA: 9s - batch_cost: 0.0680 - reader cost: - ETA: 8s - batch_cost: 0.0680 - reader cost - ETA: 7s - batch_cost: 0.0680 - reader cost - ETA: 6s - batch_cost: 0.0680 - reader co - ETA: 5s - batch_cost: 0.0680 - reader cost: 0.0 - ETA: 4s - batch_cost: 0.0679 - reader cost: 0 - ETA: 4s - batch_cost: 0.0679 - reader cost: 0. - ETA: 4s - batch_cost: 0.0679 - reader cost: 0.0 - ETA: 3s - batch_cost: 0.0679 - reader cost: 0.00 - ETA: 3s - batch_cost: 0.0679 - reader cost: 0. - ETA: 3s - batch_cost: 0.0680 - reader cost: 0.00 - ETA: 3s - batch_cost: 0.0680 - reader cost: 0.0 - ETA: 2s - batch_cost: 0.0680 - reader cost: 0.00 - ETA: 2s - batch_cost: 0.0680 - reader cost: 0.00 - ETA: 2s - batch_cost: 0.0680 - reader cost: 0 - ETA: 1s - batch_cost: 0.0680 - reader cost: 0.00 - ETA: 1s - batch_cost: 0.0680 - reader cost: - ETA: 1s - batch_cost: 0.0680 - reader cos2022-07-21 16:34:26 [INFO][EVAL] #Images: 7361 mIoU: 0.1224 Acc: 0.9847 Kappa: 0.4763 Dice: 0.15582022-07-21 16:34:26 [INFO][EVAL] Class IoU: [0.9866 0.1488 0.0672 0.2721 0.0186 0. 0. 0. 0.3188 0.011 0. 0. 0. 0.0003 0. 0. 0. 0. 0. 0.6237]2022-07-21 16:34:26 [INFO][EVAL] Class Precision: [0.9893 0.4079 0.4243 0.5645 0.26 0. 0. 0. 0.5582 0.2027 0. 0. 0. 0.1797 0. 0. 0. 0. 0. 0.7015]2022-07-21 16:34:26 [INFO][EVAL] Class Recall: [0.9973 0.1897 0.0739 0.3444 0.0196 0. 0. 0. 0.4264 0.0115 0. 0. 0. 0.0003 0. 0. 0. 0. 0. 
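Two things are worth noting about the evaluation output above and the [TRAIN] records that follow. First, the summary metrics are internally consistent: mIoU is the unweighted mean of the 20 per-class IoU values, and the reported Dice matches averaging the per-class Dice obtained from IoU via Dice = 2*IoU/(1+IoU); the high pixel accuracy (0.9847) next to the low mIoU (0.1224) mostly reflects class imbalance, since the background class dominates the pixels while many rare classes score an IoU of 0. Second, the bookkeeping fields in each training record are related by ips ≈ batch_size / batch_cost (about 4 samples per step here) and ETA ≈ remaining iterations x batch_cost, and the lr column is consistent with a polynomial decay from a base lr of 0.01 with power 0.9 over the 20000 iterations. The following standalone sketch, which is not part of the original notebook and whose scheduler settings are inferred from the logged numbers rather than read from the config, reproduces these quantities:

import numpy as np

# --- Evaluation summary at iter 16000 ---------------------------------
# Per-class IoU exactly as printed in the [EVAL] log above (20 classes).
class_iou = np.array([0.9866, 0.1488, 0.0672, 0.2721, 0.0186, 0., 0., 0.,
                      0.3188, 0.011, 0., 0., 0., 0.0003, 0., 0., 0., 0., 0., 0.6237])
miou = class_iou.mean()                               # -> 0.1224, matches the logged mIoU
mean_dice = (2 * class_iou / (1 + class_iou)).mean()  # -> 0.1558, matches the logged Dice
print(f"mIoU = {miou:.4f}, mean Dice = {mean_dice:.4f}")

# --- Bookkeeping fields of one [TRAIN] record --------------------------
# Values copied from the record at iter 16010 below:
# batch_cost 0.1498 s, ips 26.6975 samples/sec, lr 0.002344, ETA 00:09:57
total_iters, cur_iter = 20000, 16010
batch_cost, ips = 0.1498, 26.6975

batch_size = round(ips * batch_cost)               # -> 4 samples per step
eta_sec = (total_iters - cur_iter) * batch_cost    # -> ~598 s, i.e. about 09:57
print(f"batch_size = {batch_size}, ETA = {int(eta_sec // 60):02d}:{int(eta_sec % 60):02d}")

# Assumed PolynomialDecay(learning_rate=0.01, power=0.9) over total_iters;
# this reproduces the logged lr of 0.002344 at iter 16010.
base_lr, power = 0.01, 0.9
lr = base_lr * (1 - cur_iter / total_iters) ** power
print(f"lr = {lr:.6f}")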
2022-07-21 16:34:27 [INFO][TRAIN] epoch: 4, iter: 16010/20000, loss: 0.2846, lr: 0.002344, batch_cost: 0.1498, reader_cost: 0.02214, ips: 26.6975 samples/sec | ETA 00:09:57
2022-07-21 16:34:30 [INFO][TRAIN] epoch: 4, iter: 16020/20000, loss: 0.3530, lr: 0.002339, batch_cost: 0.2334, reader_cost: 0.10982, ips: 17.1365 samples/sec | ETA 00:15:29
2022-07-21 16:34:32 [INFO][TRAIN] epoch: 4, iter: 16030/20000, loss: 0.3177, lr: 0.002334, batch_cost: 0.2393, reader_cost: 0.11262, ips: 16.7174 samples/sec | ETA 00:15:49
2022-07-21 16:34:34 [INFO][TRAIN] epoch: 4, iter: 16040/20000, loss: 0.3182, lr: 0.002329, batch_cost: 0.2332, reader_cost: 0.10916, ips: 17.1516 samples/sec | ETA 00:15:23
2022-07-21 16:34:37 [INFO][TRAIN] epoch: 4, iter: 16050/20000, loss: 0.2947, lr: 0.002323, batch_cost: 0.2105, reader_cost: 0.08299, ips: 18.9997 samples/sec | ETA 00:13:51
[Training log entries for iter 16060 through 18050 follow the same format and are omitted here; loss fluctuates roughly between 0.22 and 0.68 while lr decays from 0.002318 to 0.001231 and batch_cost stays around 0.21-0.39 s]
2022-07-21 16:42:55 [INFO][TRAIN] epoch: 4, iter: 18060/20000, loss: 0.2595, lr: 0.001225, batch_cost: 0.2560, reader_cost: 0.12065, 
ips: 15.6266 samples/sec | ETA 00:08:162022-07-21 16:42:58 [INFO][TRAIN] epoch: 4, iter: 18070/20000, loss: 0.3746, lr: 0.001220, batch_cost: 0.2331, reader_cost: 0.10674, ips: 17.1621 samples/sec | ETA 00:07:292022-07-21 16:43:00 [INFO][TRAIN] epoch: 4, iter: 18080/20000, loss: 0.3823, lr: 0.001214, batch_cost: 0.2232, reader_cost: 0.09937, ips: 17.9193 samples/sec | ETA 00:07:082022-07-21 16:43:02 [INFO][TRAIN] epoch: 4, iter: 18090/20000, loss: 0.4127, lr: 0.001208, batch_cost: 0.2379, reader_cost: 0.11174, ips: 16.8167 samples/sec | ETA 00:07:342022-07-21 16:43:05 [INFO][TRAIN] epoch: 4, iter: 18100/20000, loss: 0.4017, lr: 0.001203, batch_cost: 0.2365, reader_cost: 0.10917, ips: 16.9156 samples/sec | ETA 00:07:292022-07-21 16:43:08 [INFO][TRAIN] epoch: 4, iter: 18110/20000, loss: 0.3205, lr: 0.001197, batch_cost: 0.2902, reader_cost: 0.13906, ips: 13.7816 samples/sec | ETA 00:09:082022-07-21 16:43:10 [INFO][TRAIN] epoch: 4, iter: 18120/20000, loss: 0.3054, lr: 0.001191, batch_cost: 0.2855, reader_cost: 0.14280, ips: 14.0123 samples/sec | ETA 00:08:562022-07-21 16:43:13 [INFO][TRAIN] epoch: 4, iter: 18130/20000, loss: 0.3014, lr: 0.001186, batch_cost: 0.2268, reader_cost: 0.10366, ips: 17.6380 samples/sec | ETA 00:07:042022-07-21 16:43:15 [INFO][TRAIN] epoch: 4, iter: 18140/20000, loss: 0.3647, lr: 0.001180, batch_cost: 0.2401, reader_cost: 0.11623, ips: 16.6586 samples/sec | ETA 00:07:262022-07-21 16:43:18 [INFO][TRAIN] epoch: 4, iter: 18150/20000, loss: 0.3410, lr: 0.001174, batch_cost: 0.2796, reader_cost: 0.12443, ips: 14.3071 samples/sec | ETA 00:08:372022-07-21 16:43:20 [INFO][TRAIN] epoch: 4, iter: 18160/20000, loss: 0.3350, lr: 0.001168, batch_cost: 0.2315, reader_cost: 0.10178, ips: 17.2753 samples/sec | ETA 00:07:062022-07-21 16:43:23 [INFO][TRAIN] epoch: 4, iter: 18170/20000, loss: 0.3538, lr: 0.001163, batch_cost: 0.2462, reader_cost: 0.11517, ips: 16.2489 samples/sec | ETA 00:07:302022-07-21 16:43:25 [INFO][TRAIN] epoch: 4, iter: 18180/20000, loss: 0.3078, lr: 0.001157, batch_cost: 0.2629, reader_cost: 0.11478, ips: 15.2145 samples/sec | ETA 00:07:582022-07-21 16:43:28 [INFO][TRAIN] epoch: 4, iter: 18190/20000, loss: 0.2645, lr: 0.001151, batch_cost: 0.2430, reader_cost: 0.11523, ips: 16.4626 samples/sec | ETA 00:07:192022-07-21 16:43:30 [INFO][TRAIN] epoch: 4, iter: 18200/20000, loss: 0.3690, lr: 0.001146, batch_cost: 0.2399, reader_cost: 0.11456, ips: 16.6731 samples/sec | ETA 00:07:112022-07-21 16:43:33 [INFO][TRAIN] epoch: 4, iter: 18210/20000, loss: 0.3548, lr: 0.001140, batch_cost: 0.2429, reader_cost: 0.11971, ips: 16.4667 samples/sec | ETA 00:07:142022-07-21 16:43:35 [INFO][TRAIN] epoch: 4, iter: 18220/20000, loss: 0.3407, lr: 0.001134, batch_cost: 0.2283, reader_cost: 0.10402, ips: 17.5200 samples/sec | ETA 00:06:462022-07-21 16:43:37 [INFO][TRAIN] epoch: 4, iter: 18230/20000, loss: 0.3595, lr: 0.001128, batch_cost: 0.2309, reader_cost: 0.10602, ips: 17.3261 samples/sec | ETA 00:06:482022-07-21 16:43:40 [INFO][TRAIN] epoch: 4, iter: 18240/20000, loss: 0.4048, lr: 0.001123, batch_cost: 0.2557, reader_cost: 0.12182, ips: 15.6413 samples/sec | ETA 00:07:302022-07-21 16:43:43 [INFO][TRAIN] epoch: 4, iter: 18250/20000, loss: 0.3643, lr: 0.001117, batch_cost: 0.2870, reader_cost: 0.11354, ips: 13.9364 samples/sec | ETA 00:08:222022-07-21 16:43:45 [INFO][TRAIN] epoch: 4, iter: 18260/20000, loss: 0.5052, lr: 0.001111, batch_cost: 0.2898, reader_cost: 0.14121, ips: 13.8043 samples/sec | ETA 00:08:242022-07-21 16:43:48 [INFO][TRAIN] epoch: 4, iter: 18270/20000, loss: 
0.3598, lr: 0.001105, batch_cost: 0.2325, reader_cost: 0.10471, ips: 17.2056 samples/sec | ETA 00:06:422022-07-21 16:43:50 [INFO][TRAIN] epoch: 4, iter: 18280/20000, loss: 0.3890, lr: 0.001100, batch_cost: 0.2499, reader_cost: 0.12160, ips: 16.0052 samples/sec | ETA 00:07:092022-07-21 16:43:53 [INFO][TRAIN] epoch: 4, iter: 18290/20000, loss: 0.3751, lr: 0.001094, batch_cost: 0.2550, reader_cost: 0.11970, ips: 15.6837 samples/sec | ETA 00:07:162022-07-21 16:43:56 [INFO][TRAIN] epoch: 4, iter: 18300/20000, loss: 0.4246, lr: 0.001088, batch_cost: 0.2925, reader_cost: 0.13949, ips: 13.6764 samples/sec | ETA 00:08:172022-07-21 16:43:59 [INFO][TRAIN] epoch: 4, iter: 18310/20000, loss: 0.3192, lr: 0.001082, batch_cost: 0.2760, reader_cost: 0.12363, ips: 14.4939 samples/sec | ETA 00:07:462022-07-21 16:44:01 [INFO][TRAIN] epoch: 4, iter: 18320/20000, loss: 0.3047, lr: 0.001077, batch_cost: 0.2413, reader_cost: 0.11762, ips: 16.5765 samples/sec | ETA 00:06:452022-07-21 16:44:03 [INFO][TRAIN] epoch: 4, iter: 18330/20000, loss: 0.3497, lr: 0.001071, batch_cost: 0.2326, reader_cost: 0.10930, ips: 17.1949 samples/sec | ETA 00:06:282022-07-21 16:44:06 [INFO][TRAIN] epoch: 4, iter: 18340/20000, loss: 0.4100, lr: 0.001065, batch_cost: 0.2245, reader_cost: 0.09357, ips: 17.8175 samples/sec | ETA 00:06:122022-07-21 16:44:08 [INFO][TRAIN] epoch: 4, iter: 18350/20000, loss: 0.3541, lr: 0.001059, batch_cost: 0.2537, reader_cost: 0.12684, ips: 15.7668 samples/sec | ETA 00:06:582022-07-21 16:44:11 [INFO][TRAIN] epoch: 4, iter: 18360/20000, loss: 0.4204, lr: 0.001054, batch_cost: 0.2915, reader_cost: 0.16476, ips: 13.7210 samples/sec | ETA 00:07:582022-07-21 16:44:13 [INFO][TRAIN] epoch: 4, iter: 18370/20000, loss: 0.3242, lr: 0.001048, batch_cost: 0.2496, reader_cost: 0.10274, ips: 16.0263 samples/sec | ETA 00:06:462022-07-21 16:44:17 [INFO][TRAIN] epoch: 4, iter: 18380/20000, loss: 0.2942, lr: 0.001042, batch_cost: 0.3023, reader_cost: 0.15332, ips: 13.2331 samples/sec | ETA 00:08:092022-07-21 16:44:19 [INFO][TRAIN] epoch: 4, iter: 18390/20000, loss: 0.3629, lr: 0.001036, batch_cost: 0.2799, reader_cost: 0.14895, ips: 14.2908 samples/sec | ETA 00:07:302022-07-21 16:44:22 [INFO][TRAIN] epoch: 4, iter: 18400/20000, loss: 0.4051, lr: 0.001030, batch_cost: 0.2561, reader_cost: 0.12108, ips: 15.6167 samples/sec | ETA 00:06:492022-07-21 16:44:24 [INFO][TRAIN] epoch: 4, iter: 18410/20000, loss: 0.3234, lr: 0.001025, batch_cost: 0.2298, reader_cost: 0.09871, ips: 17.4076 samples/sec | ETA 00:06:052022-07-21 16:44:27 [INFO][TRAIN] epoch: 4, iter: 18420/20000, loss: 0.4615, lr: 0.001019, batch_cost: 0.2435, reader_cost: 0.11555, ips: 16.4260 samples/sec | ETA 00:06:242022-07-21 16:44:30 [INFO][TRAIN] epoch: 4, iter: 18430/20000, loss: 0.3095, lr: 0.001013, batch_cost: 0.2902, reader_cost: 0.12474, ips: 13.7842 samples/sec | ETA 00:07:352022-07-21 16:44:32 [INFO][TRAIN] epoch: 4, iter: 18440/20000, loss: 0.2559, lr: 0.001007, batch_cost: 0.2537, reader_cost: 0.12511, ips: 15.7681 samples/sec | ETA 00:06:352022-07-21 16:44:34 [INFO][TRAIN] epoch: 4, iter: 18450/20000, loss: 0.3964, lr: 0.001001, batch_cost: 0.2293, reader_cost: 0.10498, ips: 17.4435 samples/sec | ETA 00:05:552022-07-21 16:44:37 [INFO][TRAIN] epoch: 4, iter: 18460/20000, loss: 0.3936, lr: 0.000996, batch_cost: 0.2301, reader_cost: 0.10736, ips: 17.3801 samples/sec | ETA 00:05:542022-07-21 16:44:39 [INFO][TRAIN] epoch: 4, iter: 18470/20000, loss: 0.3150, lr: 0.000990, batch_cost: 0.2263, reader_cost: 0.09688, ips: 17.6790 samples/sec | ETA 00:05:462022-07-21 
16:44:42 [INFO][TRAIN] epoch: 4, iter: 18480/20000, loss: 0.3567, lr: 0.000984, batch_cost: 0.2618, reader_cost: 0.13121, ips: 15.2814 samples/sec | ETA 00:06:372022-07-21 16:44:44 [INFO][TRAIN] epoch: 4, iter: 18490/20000, loss: 0.3925, lr: 0.000978, batch_cost: 0.2659, reader_cost: 0.13373, ips: 15.0420 samples/sec | ETA 00:06:412022-07-21 16:44:47 [INFO][TRAIN] epoch: 4, iter: 18500/20000, loss: 0.4031, lr: 0.000972, batch_cost: 0.2640, reader_cost: 0.11993, ips: 15.1520 samples/sec | ETA 00:06:352022-07-21 16:44:50 [INFO][TRAIN] epoch: 4, iter: 18510/20000, loss: 0.3335, lr: 0.000967, batch_cost: 0.3289, reader_cost: 0.17031, ips: 12.1617 samples/sec | ETA 00:08:102022-07-21 16:44:53 [INFO][TRAIN] epoch: 4, iter: 18520/20000, loss: 0.3444, lr: 0.000961, batch_cost: 0.2806, reader_cost: 0.13826, ips: 14.2564 samples/sec | ETA 00:06:552022-07-21 16:44:55 [INFO][TRAIN] epoch: 4, iter: 18530/20000, loss: 0.3294, lr: 0.000955, batch_cost: 0.2384, reader_cost: 0.11076, ips: 16.7797 samples/sec | ETA 00:05:502022-07-21 16:44:58 [INFO][TRAIN] epoch: 4, iter: 18540/20000, loss: 0.2476, lr: 0.000949, batch_cost: 0.2269, reader_cost: 0.10119, ips: 17.6275 samples/sec | ETA 00:05:312022-07-21 16:45:00 [INFO][TRAIN] epoch: 4, iter: 18550/20000, loss: 0.3437, lr: 0.000943, batch_cost: 0.2412, reader_cost: 0.10693, ips: 16.5826 samples/sec | ETA 00:05:492022-07-21 16:45:03 [INFO][TRAIN] epoch: 4, iter: 18560/20000, loss: 0.3507, lr: 0.000937, batch_cost: 0.2540, reader_cost: 0.11654, ips: 15.7469 samples/sec | ETA 00:06:052022-07-21 16:45:05 [INFO][TRAIN] epoch: 4, iter: 18570/20000, loss: 0.3024, lr: 0.000931, batch_cost: 0.2379, reader_cost: 0.11078, ips: 16.8115 samples/sec | ETA 00:05:402022-07-21 16:45:07 [INFO][TRAIN] epoch: 4, iter: 18580/20000, loss: 0.3560, lr: 0.000926, batch_cost: 0.2396, reader_cost: 0.11049, ips: 16.6936 samples/sec | ETA 00:05:402022-07-21 16:45:10 [INFO][TRAIN] epoch: 4, iter: 18590/20000, loss: 0.3287, lr: 0.000920, batch_cost: 0.2312, reader_cost: 0.10658, ips: 17.2983 samples/sec | ETA 00:05:262022-07-21 16:45:12 [INFO][TRAIN] epoch: 4, iter: 18600/20000, loss: 0.3491, lr: 0.000914, batch_cost: 0.2324, reader_cost: 0.10677, ips: 17.2080 samples/sec | ETA 00:05:252022-07-21 16:45:14 [INFO][TRAIN] epoch: 4, iter: 18610/20000, loss: 0.3205, lr: 0.000908, batch_cost: 0.2376, reader_cost: 0.07993, ips: 16.8324 samples/sec | ETA 00:05:302022-07-21 16:45:17 [INFO][TRAIN] epoch: 4, iter: 18620/20000, loss: 0.3397, lr: 0.000902, batch_cost: 0.2448, reader_cost: 0.11803, ips: 16.3379 samples/sec | ETA 00:05:372022-07-21 16:45:19 [INFO][TRAIN] epoch: 4, iter: 18630/20000, loss: 0.3408, lr: 0.000896, batch_cost: 0.2413, reader_cost: 0.11349, ips: 16.5779 samples/sec | ETA 00:05:302022-07-21 16:45:23 [INFO][TRAIN] epoch: 4, iter: 18640/20000, loss: 0.3002, lr: 0.000890, batch_cost: 0.3697, reader_cost: 0.17199, ips: 10.8192 samples/sec | ETA 00:08:222022-07-21 16:45:25 [INFO][TRAIN] epoch: 4, iter: 18650/20000, loss: 0.3334, lr: 0.000884, batch_cost: 0.2402, reader_cost: 0.11300, ips: 16.6555 samples/sec | ETA 00:05:242022-07-21 16:45:28 [INFO][TRAIN] epoch: 4, iter: 18660/20000, loss: 0.3306, lr: 0.000879, batch_cost: 0.2236, reader_cost: 0.10116, ips: 17.8854 samples/sec | ETA 00:04:592022-07-21 16:45:30 [INFO][TRAIN] epoch: 4, iter: 18670/20000, loss: 0.2854, lr: 0.000873, batch_cost: 0.2504, reader_cost: 0.12352, ips: 15.9730 samples/sec | ETA 00:05:332022-07-21 16:45:33 [INFO][TRAIN] epoch: 4, iter: 18680/20000, loss: 0.3058, lr: 0.000867, batch_cost: 0.2581, reader_cost: 
0.10817, ips: 15.4998 samples/sec | ETA 00:05:402022-07-21 16:45:35 [INFO][TRAIN] epoch: 4, iter: 18690/20000, loss: 0.3625, lr: 0.000861, batch_cost: 0.2591, reader_cost: 0.12959, ips: 15.4374 samples/sec | ETA 00:05:392022-07-21 16:45:38 [INFO][TRAIN] epoch: 4, iter: 18700/20000, loss: 0.5366, lr: 0.000855, batch_cost: 0.2492, reader_cost: 0.12334, ips: 16.0529 samples/sec | ETA 00:05:232022-07-21 16:45:40 [INFO][TRAIN] epoch: 4, iter: 18710/20000, loss: 0.3782, lr: 0.000849, batch_cost: 0.2328, reader_cost: 0.10903, ips: 17.1793 samples/sec | ETA 00:05:002022-07-21 16:45:42 [INFO][TRAIN] epoch: 4, iter: 18720/20000, loss: 0.2714, lr: 0.000843, batch_cost: 0.2432, reader_cost: 0.11843, ips: 16.4484 samples/sec | ETA 00:05:112022-07-21 16:45:45 [INFO][TRAIN] epoch: 4, iter: 18730/20000, loss: 0.4481, lr: 0.000837, batch_cost: 0.2346, reader_cost: 0.11132, ips: 17.0472 samples/sec | ETA 00:04:572022-07-21 16:45:47 [INFO][TRAIN] epoch: 4, iter: 18740/20000, loss: 0.2563, lr: 0.000831, batch_cost: 0.2368, reader_cost: 0.11394, ips: 16.8912 samples/sec | ETA 00:04:582022-07-21 16:45:50 [INFO][TRAIN] epoch: 4, iter: 18750/20000, loss: 0.3025, lr: 0.000825, batch_cost: 0.2468, reader_cost: 0.12310, ips: 16.2064 samples/sec | ETA 00:05:082022-07-21 16:45:52 [INFO][TRAIN] epoch: 4, iter: 18760/20000, loss: 0.3749, lr: 0.000819, batch_cost: 0.2277, reader_cost: 0.10718, ips: 17.5644 samples/sec | ETA 00:04:422022-07-21 16:45:54 [INFO][TRAIN] epoch: 4, iter: 18770/20000, loss: 0.2954, lr: 0.000813, batch_cost: 0.2460, reader_cost: 0.12087, ips: 16.2603 samples/sec | ETA 00:05:022022-07-21 16:45:57 [INFO][TRAIN] epoch: 4, iter: 18780/20000, loss: 0.3439, lr: 0.000807, batch_cost: 0.3001, reader_cost: 0.14680, ips: 13.3296 samples/sec | ETA 00:06:062022-07-21 16:46:00 [INFO][TRAIN] epoch: 4, iter: 18790/20000, loss: 0.3191, lr: 0.000801, batch_cost: 0.2618, reader_cost: 0.12123, ips: 15.2785 samples/sec | ETA 00:05:162022-07-21 16:46:02 [INFO][TRAIN] epoch: 4, iter: 18800/20000, loss: 0.3043, lr: 0.000796, batch_cost: 0.2450, reader_cost: 0.11892, ips: 16.3266 samples/sec | ETA 00:04:532022-07-21 16:46:05 [INFO][TRAIN] epoch: 4, iter: 18810/20000, loss: 0.3671, lr: 0.000790, batch_cost: 0.2642, reader_cost: 0.12469, ips: 15.1407 samples/sec | ETA 00:05:142022-07-21 16:46:07 [INFO][TRAIN] epoch: 4, iter: 18820/20000, loss: 0.3310, lr: 0.000784, batch_cost: 0.2222, reader_cost: 0.09866, ips: 18.0021 samples/sec | ETA 00:04:222022-07-21 16:46:10 [INFO][TRAIN] epoch: 4, iter: 18830/20000, loss: 0.3617, lr: 0.000778, batch_cost: 0.2373, reader_cost: 0.11341, ips: 16.8567 samples/sec | ETA 00:04:372022-07-21 16:46:12 [INFO][TRAIN] epoch: 4, iter: 18840/20000, loss: 0.3233, lr: 0.000772, batch_cost: 0.2406, reader_cost: 0.11709, ips: 16.6217 samples/sec | ETA 00:04:392022-07-21 16:46:14 [INFO][TRAIN] epoch: 4, iter: 18850/20000, loss: 0.2867, lr: 0.000766, batch_cost: 0.2223, reader_cost: 0.10031, ips: 17.9970 samples/sec | ETA 00:04:152022-07-21 16:46:17 [INFO][TRAIN] epoch: 4, iter: 18860/20000, loss: 0.3541, lr: 0.000760, batch_cost: 0.2378, reader_cost: 0.11327, ips: 16.8236 samples/sec | ETA 00:04:312022-07-21 16:46:19 [INFO][TRAIN] epoch: 4, iter: 18870/20000, loss: 0.3220, lr: 0.000754, batch_cost: 0.2175, reader_cost: 0.09165, ips: 18.3943 samples/sec | ETA 00:04:052022-07-21 16:46:21 [INFO][TRAIN] epoch: 4, iter: 18880/20000, loss: 0.3190, lr: 0.000748, batch_cost: 0.2486, reader_cost: 0.09745, ips: 16.0915 samples/sec | ETA 00:04:382022-07-21 16:46:24 [INFO][TRAIN] epoch: 4, iter: 18890/20000, 
loss: 0.3509, lr: 0.000742, batch_cost: 0.2518, reader_cost: 0.09951, ips: 15.8858 samples/sec | ETA 00:04:392022-07-21 16:46:27 [INFO][TRAIN] epoch: 4, iter: 18900/20000, loss: 0.3212, lr: 0.000736, batch_cost: 0.3003, reader_cost: 0.14561, ips: 13.3190 samples/sec | ETA 00:05:302022-07-21 16:46:30 [INFO][TRAIN] epoch: 4, iter: 18910/20000, loss: 0.3066, lr: 0.000730, batch_cost: 0.3037, reader_cost: 0.14641, ips: 13.1703 samples/sec | ETA 00:05:312022-07-21 16:46:32 [INFO][TRAIN] epoch: 4, iter: 18920/20000, loss: 0.3510, lr: 0.000724, batch_cost: 0.2342, reader_cost: 0.11043, ips: 17.0809 samples/sec | ETA 00:04:122022-07-21 16:46:35 [INFO][TRAIN] epoch: 4, iter: 18930/20000, loss: 0.4091, lr: 0.000718, batch_cost: 0.2323, reader_cost: 0.10741, ips: 17.2188 samples/sec | ETA 00:04:082022-07-21 16:46:37 [INFO][TRAIN] epoch: 4, iter: 18940/20000, loss: 0.3246, lr: 0.000712, batch_cost: 0.2746, reader_cost: 0.12678, ips: 14.5688 samples/sec | ETA 00:04:512022-07-21 16:46:40 [INFO][TRAIN] epoch: 4, iter: 18950/20000, loss: 0.3130, lr: 0.000706, batch_cost: 0.2829, reader_cost: 0.15526, ips: 14.1402 samples/sec | ETA 00:04:572022-07-21 16:46:43 [INFO][TRAIN] epoch: 4, iter: 18960/20000, loss: 0.3352, lr: 0.000699, batch_cost: 0.2465, reader_cost: 0.11569, ips: 16.2280 samples/sec | ETA 00:04:162022-07-21 16:46:45 [INFO][TRAIN] epoch: 4, iter: 18970/20000, loss: 0.3020, lr: 0.000693, batch_cost: 0.2581, reader_cost: 0.12633, ips: 15.4988 samples/sec | ETA 00:04:252022-07-21 16:46:48 [INFO][TRAIN] epoch: 4, iter: 18980/20000, loss: 0.3266, lr: 0.000687, batch_cost: 0.2441, reader_cost: 0.11360, ips: 16.3854 samples/sec | ETA 00:04:092022-07-21 16:46:50 [INFO][TRAIN] epoch: 4, iter: 18990/20000, loss: 0.3414, lr: 0.000681, batch_cost: 0.2139, reader_cost: 0.08711, ips: 18.7012 samples/sec | ETA 00:03:362022-07-21 16:46:52 [INFO][TRAIN] epoch: 4, iter: 19000/20000, loss: 0.3091, lr: 0.000675, batch_cost: 0.2434, reader_cost: 0.12211, ips: 16.4358 samples/sec | ETA 00:04:032022-07-21 16:46:55 [INFO][TRAIN] epoch: 4, iter: 19010/20000, loss: 0.2751, lr: 0.000669, batch_cost: 0.2317, reader_cost: 0.10775, ips: 17.2620 samples/sec | ETA 00:03:492022-07-21 16:46:57 [INFO][TRAIN] epoch: 4, iter: 19020/20000, loss: 0.2641, lr: 0.000663, batch_cost: 0.2472, reader_cost: 0.12447, ips: 16.1833 samples/sec | ETA 00:04:022022-07-21 16:46:59 [INFO][TRAIN] epoch: 4, iter: 19030/20000, loss: 0.3579, lr: 0.000657, batch_cost: 0.2466, reader_cost: 0.12384, ips: 16.2193 samples/sec | ETA 00:03:592022-07-21 16:47:02 [INFO][TRAIN] epoch: 4, iter: 19040/20000, loss: 0.4441, lr: 0.000651, batch_cost: 0.2593, reader_cost: 0.12432, ips: 15.4259 samples/sec | ETA 00:04:082022-07-21 16:47:05 [INFO][TRAIN] epoch: 4, iter: 19050/20000, loss: 0.3610, lr: 0.000645, batch_cost: 0.2898, reader_cost: 0.14023, ips: 13.8010 samples/sec | ETA 00:04:352022-07-21 16:47:08 [INFO][TRAIN] epoch: 4, iter: 19060/20000, loss: 0.3090, lr: 0.000639, batch_cost: 0.2827, reader_cost: 0.14639, ips: 14.1480 samples/sec | ETA 00:04:252022-07-21 16:47:10 [INFO][TRAIN] epoch: 4, iter: 19070/20000, loss: 0.2957, lr: 0.000633, batch_cost: 0.2395, reader_cost: 0.11223, ips: 16.6990 samples/sec | ETA 00:03:422022-07-21 16:47:13 [INFO][TRAIN] epoch: 4, iter: 19080/20000, loss: 0.3436, lr: 0.000626, batch_cost: 0.2454, reader_cost: 0.11462, ips: 16.2997 samples/sec | ETA 00:03:452022-07-21 16:47:15 [INFO][TRAIN] epoch: 4, iter: 19090/20000, loss: 0.3893, lr: 0.000620, batch_cost: 0.2447, reader_cost: 0.11871, ips: 16.3468 samples/sec | ETA 
00:03:422022-07-21 16:47:18 [INFO][TRAIN] epoch: 4, iter: 19100/20000, loss: 0.3444, lr: 0.000614, batch_cost: 0.2546, reader_cost: 0.12783, ips: 15.7106 samples/sec | ETA 00:03:492022-07-21 16:47:20 [INFO][TRAIN] epoch: 4, iter: 19110/20000, loss: 0.3575, lr: 0.000608, batch_cost: 0.2177, reader_cost: 0.09667, ips: 18.3733 samples/sec | ETA 00:03:132022-07-21 16:47:22 [INFO][TRAIN] epoch: 4, iter: 19120/20000, loss: 0.3471, lr: 0.000602, batch_cost: 0.2310, reader_cost: 0.10612, ips: 17.3163 samples/sec | ETA 00:03:232022-07-21 16:47:24 [INFO][TRAIN] epoch: 4, iter: 19130/20000, loss: 0.3624, lr: 0.000596, batch_cost: 0.2256, reader_cost: 0.09557, ips: 17.7315 samples/sec | ETA 00:03:162022-07-21 16:47:27 [INFO][TRAIN] epoch: 4, iter: 19140/20000, loss: 0.3317, lr: 0.000590, batch_cost: 0.2980, reader_cost: 0.12915, ips: 13.4250 samples/sec | ETA 00:04:162022-07-21 16:47:30 [INFO][TRAIN] epoch: 4, iter: 19150/20000, loss: 0.2806, lr: 0.000583, batch_cost: 0.2591, reader_cost: 0.13204, ips: 15.4400 samples/sec | ETA 00:03:402022-07-21 16:47:32 [INFO][TRAIN] epoch: 4, iter: 19160/20000, loss: 0.3098, lr: 0.000577, batch_cost: 0.2407, reader_cost: 0.11279, ips: 16.6166 samples/sec | ETA 00:03:222022-07-21 16:47:35 [INFO][TRAIN] epoch: 4, iter: 19170/20000, loss: 0.2535, lr: 0.000571, batch_cost: 0.2923, reader_cost: 0.14086, ips: 13.6854 samples/sec | ETA 00:04:022022-07-21 16:47:38 [INFO][TRAIN] epoch: 4, iter: 19180/20000, loss: 0.3059, lr: 0.000565, batch_cost: 0.2359, reader_cost: 0.10550, ips: 16.9546 samples/sec | ETA 00:03:132022-07-21 16:47:41 [INFO][TRAIN] epoch: 4, iter: 19190/20000, loss: 0.3966, lr: 0.000559, batch_cost: 0.2933, reader_cost: 0.14676, ips: 13.6380 samples/sec | ETA 00:03:572022-07-21 16:47:43 [INFO][TRAIN] epoch: 4, iter: 19200/20000, loss: 0.3419, lr: 0.000553, batch_cost: 0.2482, reader_cost: 0.11678, ips: 16.1157 samples/sec | ETA 00:03:182022-07-21 16:47:46 [INFO][TRAIN] epoch: 4, iter: 19210/20000, loss: 0.2859, lr: 0.000546, batch_cost: 0.2685, reader_cost: 0.12701, ips: 14.8954 samples/sec | ETA 00:03:322022-07-21 16:47:48 [INFO][TRAIN] epoch: 4, iter: 19220/20000, loss: 0.3202, lr: 0.000540, batch_cost: 0.2646, reader_cost: 0.12188, ips: 15.1148 samples/sec | ETA 00:03:262022-07-21 16:47:51 [INFO][TRAIN] epoch: 4, iter: 19230/20000, loss: 0.4274, lr: 0.000534, batch_cost: 0.2354, reader_cost: 0.10745, ips: 16.9904 samples/sec | ETA 00:03:012022-07-21 16:47:53 [INFO][TRAIN] epoch: 4, iter: 19240/20000, loss: 0.3821, lr: 0.000528, batch_cost: 0.2351, reader_cost: 0.11293, ips: 17.0123 samples/sec | ETA 00:02:582022-07-21 16:47:56 [INFO][TRAIN] epoch: 4, iter: 19250/20000, loss: 0.2976, lr: 0.000521, batch_cost: 0.2546, reader_cost: 0.13166, ips: 15.7104 samples/sec | ETA 00:03:102022-07-21 16:47:58 [INFO][TRAIN] epoch: 4, iter: 19260/20000, loss: 0.3125, lr: 0.000515, batch_cost: 0.2267, reader_cost: 0.10435, ips: 17.6474 samples/sec | ETA 00:02:472022-07-21 16:48:00 [INFO][TRAIN] epoch: 4, iter: 19270/20000, loss: 0.3288, lr: 0.000509, batch_cost: 0.2351, reader_cost: 0.10520, ips: 17.0173 samples/sec | ETA 00:02:512022-07-21 16:48:03 [INFO][TRAIN] epoch: 4, iter: 19280/20000, loss: 0.2812, lr: 0.000503, batch_cost: 0.2331, reader_cost: 0.10631, ips: 17.1604 samples/sec | ETA 00:02:472022-07-21 16:48:05 [INFO][TRAIN] epoch: 4, iter: 19290/20000, loss: 0.3167, lr: 0.000496, batch_cost: 0.2409, reader_cost: 0.11606, ips: 16.6034 samples/sec | ETA 00:02:512022-07-21 16:48:07 [INFO][TRAIN] epoch: 4, iter: 19300/20000, loss: 0.3659, lr: 0.000490, batch_cost: 
0.2359, reader_cost: 0.10972, ips: 16.9551 samples/sec | ETA 00:02:452022-07-21 16:48:10 [INFO][TRAIN] epoch: 4, iter: 19310/20000, loss: 0.3546, lr: 0.000484, batch_cost: 0.2998, reader_cost: 0.13846, ips: 13.3428 samples/sec | ETA 00:03:262022-07-21 16:48:13 [INFO][TRAIN] epoch: 4, iter: 19320/20000, loss: 0.3676, lr: 0.000477, batch_cost: 0.2983, reader_cost: 0.14149, ips: 13.4076 samples/sec | ETA 00:03:222022-07-21 16:48:16 [INFO][TRAIN] epoch: 4, iter: 19330/20000, loss: 0.3037, lr: 0.000471, batch_cost: 0.2128, reader_cost: 0.08910, ips: 18.7934 samples/sec | ETA 00:02:222022-07-21 16:48:18 [INFO][TRAIN] epoch: 4, iter: 19340/20000, loss: 0.3862, lr: 0.000465, batch_cost: 0.2379, reader_cost: 0.11121, ips: 16.8141 samples/sec | ETA 00:02:372022-07-21 16:48:20 [INFO][TRAIN] epoch: 4, iter: 19350/20000, loss: 0.2730, lr: 0.000458, batch_cost: 0.2382, reader_cost: 0.11397, ips: 16.7901 samples/sec | ETA 00:02:342022-07-21 16:48:22 [INFO][TRAIN] epoch: 4, iter: 19360/20000, loss: 0.3416, lr: 0.000452, batch_cost: 0.2212, reader_cost: 0.09920, ips: 18.0847 samples/sec | ETA 00:02:212022-07-21 16:48:25 [INFO][TRAIN] epoch: 4, iter: 19370/20000, loss: 0.3029, lr: 0.000446, batch_cost: 0.2461, reader_cost: 0.11741, ips: 16.2510 samples/sec | ETA 00:02:352022-07-21 16:48:27 [INFO][TRAIN] epoch: 4, iter: 19380/20000, loss: 0.3307, lr: 0.000439, batch_cost: 0.2282, reader_cost: 0.09761, ips: 17.5314 samples/sec | ETA 00:02:212022-07-21 16:48:30 [INFO][TRAIN] epoch: 4, iter: 19390/20000, loss: 0.3165, lr: 0.000433, batch_cost: 0.3036, reader_cost: 0.15668, ips: 13.1742 samples/sec | ETA 00:03:052022-07-21 16:48:33 [INFO][TRAIN] epoch: 4, iter: 19400/20000, loss: 0.3986, lr: 0.000427, batch_cost: 0.2423, reader_cost: 0.11826, ips: 16.5081 samples/sec | ETA 00:02:252022-07-21 16:48:35 [INFO][TRAIN] epoch: 4, iter: 19410/20000, loss: 0.3153, lr: 0.000420, batch_cost: 0.2377, reader_cost: 0.11068, ips: 16.8256 samples/sec | ETA 00:02:202022-07-21 16:48:37 [INFO][TRAIN] epoch: 4, iter: 19420/20000, loss: 0.3733, lr: 0.000414, batch_cost: 0.2406, reader_cost: 0.11596, ips: 16.6283 samples/sec | ETA 00:02:192022-07-21 16:48:40 [INFO][TRAIN] epoch: 4, iter: 19430/20000, loss: 0.3203, lr: 0.000407, batch_cost: 0.2731, reader_cost: 0.13268, ips: 14.6475 samples/sec | ETA 00:02:352022-07-21 16:48:43 [INFO][TRAIN] epoch: 4, iter: 19440/20000, loss: 0.3608, lr: 0.000401, batch_cost: 0.3213, reader_cost: 0.15083, ips: 12.4480 samples/sec | ETA 00:02:592022-07-21 16:48:46 [INFO][TRAIN] epoch: 4, iter: 19450/20000, loss: 0.3388, lr: 0.000395, batch_cost: 0.2470, reader_cost: 0.11526, ips: 16.1915 samples/sec | ETA 00:02:152022-07-21 16:48:48 [INFO][TRAIN] epoch: 4, iter: 19460/20000, loss: 0.2871, lr: 0.000388, batch_cost: 0.2492, reader_cost: 0.12521, ips: 16.0533 samples/sec | ETA 00:02:142022-07-21 16:48:51 [INFO][TRAIN] epoch: 4, iter: 19470/20000, loss: 0.3600, lr: 0.000382, batch_cost: 0.2603, reader_cost: 0.13242, ips: 15.3680 samples/sec | ETA 00:02:172022-07-21 16:48:53 [INFO][TRAIN] epoch: 4, iter: 19480/20000, loss: 0.2687, lr: 0.000375, batch_cost: 0.2400, reader_cost: 0.11135, ips: 16.6660 samples/sec | ETA 00:02:042022-07-21 16:48:56 [INFO][TRAIN] epoch: 4, iter: 19490/20000, loss: 0.2976, lr: 0.000369, batch_cost: 0.2511, reader_cost: 0.12215, ips: 15.9322 samples/sec | ETA 00:02:082022-07-21 16:48:58 [INFO][TRAIN] epoch: 4, iter: 19500/20000, loss: 0.2969, lr: 0.000362, batch_cost: 0.2387, reader_cost: 0.11117, ips: 16.7606 samples/sec | ETA 00:01:592022-07-21 16:49:01 [INFO][TRAIN] epoch: 4, 
iter: 19510/20000, loss: 0.3131, lr: 0.000356, batch_cost: 0.2303, reader_cost: 0.10041, ips: 17.3662 samples/sec | ETA 00:01:522022-07-21 16:49:03 [INFO][TRAIN] epoch: 4, iter: 19520/20000, loss: 0.5328, lr: 0.000349, batch_cost: 0.2133, reader_cost: 0.09208, ips: 18.7503 samples/sec | ETA 00:01:422022-07-21 16:49:05 [INFO][TRAIN] epoch: 4, iter: 19530/20000, loss: 0.2964, lr: 0.000343, batch_cost: 0.2319, reader_cost: 0.10860, ips: 17.2480 samples/sec | ETA 00:01:482022-07-21 16:49:07 [INFO][TRAIN] epoch: 4, iter: 19540/20000, loss: 0.3551, lr: 0.000336, batch_cost: 0.2256, reader_cost: 0.10293, ips: 17.7288 samples/sec | ETA 00:01:432022-07-21 16:49:10 [INFO][TRAIN] epoch: 4, iter: 19550/20000, loss: 0.2200, lr: 0.000329, batch_cost: 0.2427, reader_cost: 0.11899, ips: 16.4830 samples/sec | ETA 00:01:492022-07-21 16:49:12 [INFO][TRAIN] epoch: 4, iter: 19560/20000, loss: 0.3193, lr: 0.000323, batch_cost: 0.2337, reader_cost: 0.10925, ips: 17.1145 samples/sec | ETA 00:01:422022-07-21 16:49:14 [INFO][TRAIN] epoch: 4, iter: 19570/20000, loss: 0.2911, lr: 0.000316, batch_cost: 0.2420, reader_cost: 0.10620, ips: 16.5319 samples/sec | ETA 00:01:442022-07-21 16:49:18 [INFO][TRAIN] epoch: 4, iter: 19580/20000, loss: 0.4047, lr: 0.000310, batch_cost: 0.3698, reader_cost: 0.19224, ips: 10.8162 samples/sec | ETA 00:02:352022-07-21 16:49:21 [INFO][TRAIN] epoch: 4, iter: 19590/20000, loss: 0.2873, lr: 0.000303, batch_cost: 0.2479, reader_cost: 0.11459, ips: 16.1336 samples/sec | ETA 00:01:412022-07-21 16:49:23 [INFO][TRAIN] epoch: 4, iter: 19600/20000, loss: 0.4859, lr: 0.000296, batch_cost: 0.2204, reader_cost: 0.09549, ips: 18.1471 samples/sec | ETA 00:01:282022-07-21 16:49:25 [INFO][TRAIN] epoch: 4, iter: 19610/20000, loss: 0.2806, lr: 0.000290, batch_cost: 0.2237, reader_cost: 0.10082, ips: 17.8825 samples/sec | ETA 00:01:272022-07-21 16:49:27 [INFO][TRAIN] epoch: 4, iter: 19620/20000, loss: 0.2777, lr: 0.000283, batch_cost: 0.2250, reader_cost: 0.10082, ips: 17.7744 samples/sec | ETA 00:01:252022-07-21 16:49:30 [INFO][TRAIN] epoch: 4, iter: 19630/20000, loss: 0.3308, lr: 0.000276, batch_cost: 0.2340, reader_cost: 0.11167, ips: 17.0905 samples/sec | ETA 00:01:262022-07-21 16:49:33 [INFO][TRAIN] epoch: 4, iter: 19640/20000, loss: 0.2588, lr: 0.000270, batch_cost: 0.2923, reader_cost: 0.14120, ips: 13.6838 samples/sec | ETA 00:01:452022-07-21 16:49:35 [INFO][TRAIN] epoch: 4, iter: 19650/20000, loss: 0.4791, lr: 0.000263, batch_cost: 0.2365, reader_cost: 0.11018, ips: 16.9130 samples/sec | ETA 00:01:222022-07-21 16:49:37 [INFO][TRAIN] epoch: 4, iter: 19660/20000, loss: 0.3626, lr: 0.000256, batch_cost: 0.2450, reader_cost: 0.11834, ips: 16.3261 samples/sec | ETA 00:01:232022-07-21 16:49:40 [INFO][TRAIN] epoch: 4, iter: 19670/20000, loss: 0.3362, lr: 0.000249, batch_cost: 0.2232, reader_cost: 0.09895, ips: 17.9228 samples/sec | ETA 00:01:132022-07-21 16:49:42 [INFO][TRAIN] epoch: 4, iter: 19680/20000, loss: 0.4149, lr: 0.000243, batch_cost: 0.2299, reader_cost: 0.10370, ips: 17.3982 samples/sec | ETA 00:01:132022-07-21 16:49:44 [INFO][TRAIN] epoch: 4, iter: 19690/20000, loss: 0.3533, lr: 0.000236, batch_cost: 0.2410, reader_cost: 0.11593, ips: 16.5980 samples/sec | ETA 00:01:142022-07-21 16:49:48 [INFO][TRAIN] epoch: 4, iter: 19700/20000, loss: 0.3928, lr: 0.000229, batch_cost: 0.3335, reader_cost: 0.16931, ips: 11.9924 samples/sec | ETA 00:01:402022-07-21 16:49:51 [INFO][TRAIN] epoch: 4, iter: 19710/20000, loss: 0.2789, lr: 0.000222, batch_cost: 0.3082, reader_cost: 0.15962, ips: 12.9791 samples/sec 
| ETA 00:01:292022-07-21 16:49:53 [INFO][TRAIN] epoch: 4, iter: 19720/20000, loss: 0.2763, lr: 0.000215, batch_cost: 0.2329, reader_cost: 0.11030, ips: 17.1751 samples/sec | ETA 00:01:052022-07-21 16:49:56 [INFO][TRAIN] epoch: 4, iter: 19730/20000, loss: 0.3507, lr: 0.000208, batch_cost: 0.2377, reader_cost: 0.10352, ips: 16.8255 samples/sec | ETA 00:01:042022-07-21 16:49:58 [INFO][TRAIN] epoch: 4, iter: 19740/20000, loss: 0.3579, lr: 0.000201, batch_cost: 0.2452, reader_cost: 0.11798, ips: 16.3146 samples/sec | ETA 00:01:032022-07-21 16:50:00 [INFO][TRAIN] epoch: 4, iter: 19750/20000, loss: 0.2775, lr: 0.000194, batch_cost: 0.2240, reader_cost: 0.10053, ips: 17.8571 samples/sec | ETA 00:00:562022-07-21 16:50:03 [INFO][TRAIN] epoch: 4, iter: 19760/20000, loss: 0.3391, lr: 0.000187, batch_cost: 0.2378, reader_cost: 0.11604, ips: 16.8208 samples/sec | ETA 00:00:572022-07-21 16:50:05 [INFO][TRAIN] epoch: 4, iter: 19770/20000, loss: 0.3645, lr: 0.000180, batch_cost: 0.2384, reader_cost: 0.11331, ips: 16.7787 samples/sec | ETA 00:00:542022-07-21 16:50:07 [INFO][TRAIN] epoch: 4, iter: 19780/20000, loss: 0.3372, lr: 0.000173, batch_cost: 0.2472, reader_cost: 0.12335, ips: 16.1786 samples/sec | ETA 00:00:542022-07-21 16:50:10 [INFO][TRAIN] epoch: 4, iter: 19790/20000, loss: 0.3502, lr: 0.000166, batch_cost: 0.2141, reader_cost: 0.09006, ips: 18.6786 samples/sec | ETA 00:00:442022-07-21 16:50:12 [INFO][TRAIN] epoch: 4, iter: 19800/20000, loss: 0.3825, lr: 0.000159, batch_cost: 0.2281, reader_cost: 0.10500, ips: 17.5339 samples/sec | ETA 00:00:452022-07-21 16:50:14 [INFO][TRAIN] epoch: 5, iter: 19810/20000, loss: 0.3732, lr: 0.000152, batch_cost: 0.2405, reader_cost: 0.11709, ips: 16.6301 samples/sec | ETA 00:00:452022-07-21 16:50:17 [INFO][TRAIN] epoch: 5, iter: 19820/20000, loss: 0.3280, lr: 0.000145, batch_cost: 0.2362, reader_cost: 0.11404, ips: 16.9321 samples/sec | ETA 00:00:422022-07-21 16:50:19 [INFO][TRAIN] epoch: 5, iter: 19830/20000, loss: 0.3091, lr: 0.000138, batch_cost: 0.2547, reader_cost: 0.11251, ips: 15.7024 samples/sec | ETA 00:00:432022-07-21 16:50:22 [INFO][TRAIN] epoch: 5, iter: 19840/20000, loss: 0.4444, lr: 0.000130, batch_cost: 0.2514, reader_cost: 0.11718, ips: 15.9121 samples/sec | ETA 00:00:402022-07-21 16:50:25 [INFO][TRAIN] epoch: 5, iter: 19850/20000, loss: 0.3429, lr: 0.000123, batch_cost: 0.3100, reader_cost: 0.15504, ips: 12.9034 samples/sec | ETA 00:00:462022-07-21 16:50:27 [INFO][TRAIN] epoch: 5, iter: 19860/20000, loss: 0.3297, lr: 0.000116, batch_cost: 0.2396, reader_cost: 0.11101, ips: 16.6919 samples/sec | ETA 00:00:332022-07-21 16:50:29 [INFO][TRAIN] epoch: 5, iter: 19870/20000, loss: 0.3267, lr: 0.000108, batch_cost: 0.2274, reader_cost: 0.10614, ips: 17.5882 samples/sec | ETA 00:00:292022-07-21 16:50:32 [INFO][TRAIN] epoch: 5, iter: 19880/20000, loss: 0.3257, lr: 0.000101, batch_cost: 0.2415, reader_cost: 0.11649, ips: 16.5635 samples/sec | ETA 00:00:282022-07-21 16:50:35 [INFO][TRAIN] epoch: 5, iter: 19890/20000, loss: 0.4353, lr: 0.000093, batch_cost: 0.2901, reader_cost: 0.13848, ips: 13.7872 samples/sec | ETA 00:00:312022-07-21 16:50:37 [INFO][TRAIN] epoch: 5, iter: 19900/20000, loss: 0.3346, lr: 0.000086, batch_cost: 0.2205, reader_cost: 0.09630, ips: 18.1412 samples/sec | ETA 00:00:222022-07-21 16:50:39 [INFO][TRAIN] epoch: 5, iter: 19910/20000, loss: 0.3142, lr: 0.000078, batch_cost: 0.2331, reader_cost: 0.11010, ips: 17.1579 samples/sec | ETA 00:00:202022-07-21 16:50:42 [INFO][TRAIN] epoch: 5, iter: 19920/20000, loss: 0.3014, lr: 0.000070, 
batch_cost: 0.2365, reader_cost: 0.10561, ips: 16.9123 samples/sec | ETA 00:00:182022-07-21 16:50:44 [INFO][TRAIN] epoch: 5, iter: 19930/20000, loss: 0.2803, lr: 0.000062, batch_cost: 0.2324, reader_cost: 0.10681, ips: 17.2151 samples/sec | ETA 00:00:162022-07-21 16:50:46 [INFO][TRAIN] epoch: 5, iter: 19940/20000, loss: 0.3316, lr: 0.000054, batch_cost: 0.2298, reader_cost: 0.10150, ips: 17.4098 samples/sec | ETA 00:00:132022-07-21 16:50:49 [INFO][TRAIN] epoch: 5, iter: 19950/20000, loss: 0.2373, lr: 0.000046, batch_cost: 0.2522, reader_cost: 0.12966, ips: 15.8578 samples/sec | ETA 00:00:122022-07-21 16:50:51 [INFO][TRAIN] epoch: 5, iter: 19960/20000, loss: 0.3187, lr: 0.000038, batch_cost: 0.2644, reader_cost: 0.12060, ips: 15.1296 samples/sec | ETA 00:00:102022-07-21 16:50:54 [INFO][TRAIN] epoch: 5, iter: 19970/20000, loss: 0.3909, lr: 0.000030, batch_cost: 0.2806, reader_cost: 0.12679, ips: 14.2573 samples/sec | ETA 00:00:082022-07-21 16:50:57 [INFO][TRAIN] epoch: 5, iter: 19980/20000, loss: 0.3619, lr: 0.000021, batch_cost: 0.2840, reader_cost: 0.14115, ips: 14.0867 samples/sec | ETA 00:00:052022-07-21 16:50:59 [INFO][TRAIN] epoch: 5, iter: 19990/20000, loss: 0.3193, lr: 0.000012, batch_cost: 0.2216, reader_cost: 0.09452, ips: 18.0467 samples/sec | ETA 00:00:022022-07-21 16:51:02 [INFO][TRAIN] epoch: 5, iter: 20000/20000, loss: 0.2613, lr: 0.000001, batch_cost: 0.2486, reader_cost: 0.12090, ips: 16.0915 samples/sec | ETA 00:00:002022-07-21 16:51:02 [INFO]Start evaluating (total_samples: 7361, total_iters: 7361)...7361/7361 [==============================] - 517s 70ms/step - batch_cost: 0.0700 - reader cost: 0.0083: 18:55 - batch_cost: 0.1541 - reade - ETA: 10:23 - batch_cost: 0.0850 - reader cost: 0.01 - ETA: 10:22 - batch_cost: 0.0848 - reader c - ETA: 9:43 - batch_cost: 0.0797 - reader cost: 0.014 - ETA: 9:37 - batch_cost: 0.0789 - reader cost: 0 - ETA: 9:22 - batch_cost: 0.0770 - reade - ETA: 8:36 - batch_cost: 0.0709 - reader cost: 0.0 - ETA: 8:30 - batch_cost: 0.0702 - reader cost: 0. - ETA: 8:21 - batch_cost: 0.0689 - reader cos - ETA: 8:08 - batch_cost: 0.0672 - reader cost: 0.0 - ETA: 8:05 - batch_cost: 0.0669 - reader cos - ETA: 7:52 - batch_cost: 0.0654 - reader cost: - ETA: 7:48 - batch_cost: 0.0648 - reader cost: - ETA: 7:41 - batch_cost: 0.0640 - reader c - ETA: 7:32 - batch_cost: 0.0630 - reader - ETA: 7:28 - batch_cost: 0.0625 - read - ETA: 7:18 - batch_cost: 0.0614 - reader cost: 0. - ETA: 7:19 - batch_cost: 0.0616 - reader cost: 0 - ETA: 7:17 - batch_cost: 0.0613 - reade - ETA: 7:10 - batch_cost: 0.0606 - reader cost: 0. - ETA: 7:09 - batch_cost: 0.0606 - reader cost: 0.0 - ETA: 7:10 - batch_cost: 0.0607 - reader cost: - ETA: 7:11 - batch_cost: 0.0608 - reader cost: 0.0 - ETA: 7:11 - batch_cost: 0.0610 - reader cost: 0 - ETA: 7:14 - batch_cost: 0.0613 - reader cost: 0. - ETA: 7:15 - batch_cost: 0.0615 - reader cost: 0. 
- ETA: 4:20 - batch_cost: 0.0713 - reader cos - ETA: 4:19 - batch_cost: 0.0713 - reader cos - ETA: 4:18 - batch_cost: 0.0712 - reader cost - ETA: 4:17 - batch_cost: 0.0712 - reader cost: - ETA: 4:17 - batch_cost: 0.0713 - reader cost: 0.010 - ETA: 4:17 - batch_cost: 0.0713 - reader cost: 0.0 - ETA: 4:16 - batch_cost: 0.0713 - reader cost: 0.0 - ETA: 4:16 - batch_cost: 0.0712 - reader cost: 0.0 - ETA: 4:16 - batch_cost: 0.0712 - reader cost: 0.0 - ETA: 4:16 - batch_cost: 0.0712 - reader cost: - ETA: 4:15 - batch_cost: 0.0712 - reader cost: - ETA: 4:14 - batch_cost: 0.0712 - ETA: 4:12 - batch_cost: 0.0712 - reader cost: 0.0 - ETA: 4:12 - batch_cost: 0.0712 - reader cost: 0 - ETA: 4:12 - batch_cost: 0.0712 - reader cost: - ETA: 4:11 - batch_cost: 0.0712 - reader cost: 0.010 - ETA: 4:11 - batch_cost: 0.0712 - reader cost - ETA: 4:10 - batch_cost: 0.0712 - reader cost: 0 - ETA: 4:10 - batch_cost: 0.0712 - reader cost: 0.01 - ETA: 4:10 - batch_cost: 0.0712 - reader cost: 0.0 - ETA: 4:09 - batch_cost: 0.0712 - reader cost: 0 - ETA: 4:09 - batch_cost: 0.0712 - reader cost: 0. - ETA: 4:09 - batch_cost: 0.0713 - reader cost: 0.01 - ETA: 4:09 - batch_cost: 0.0713 - reader cost: 0.01 - ETA: 4:09 - batch_cost: 0.0713 - reader cost: 0.010 - ETA: 4:09 - batch_cost: 0.0713 - reader cost: 0.010 - ETA: 4:09 - batch_cost: 0.0713 - reader cost: 0 - ETA: 4:08 - batch_cost: 0.0713 - reader cost: 0.0 - ETA: 4:08 - batch_cost: 0.0713 - reader cost: 0 - ETA: 4:08 - batch_cost: 0.0713 - reader cost: 0.01 - ETA: 4:08 - batch_cost: 0.0714 - reader cost: 0.01 - ETA: 4:07 - batch_cost: 0.0714 - reader cost: 0.01 - ETA: 4:07 - batch_cost: 0.0714 - reader cost: 0.01 - ETA: 4:07 - batch_cost: 0.0714 - reader cost: 0.0 - ETA: 4:07 - batch_cost: 0.0714 - reader cost: 0.01 - ETA: 4:06 - batch_cost: 0.0714 - reader - ETA: 4:05 - batch_cost: 0.0713 - reader cost: 0.01 - ETA: 4:05 - batch_cost: 0.0714 - reader cost: - ETA: 4:04 - batch_cost: 0.0714 - reader cost: 0 - ETA: 4:04 - batch_cost: 0.0713 - reader cost: 0.01 - ETA: 4:04 - batch_cost: 0.0714 - reader cost: 0.0 - ETA: 4:04 - batch_cost: 0.0714 - reader cost: 0.01 - ETA: 4:04 - batch_cost: 0.0714 - reader cost: 0.0 - ETA: 4:04 - batch_cost: 0.0714 - reader cost: 0.01 - ETA: 4:03 - batch_cost: 0.0714 - reader cost: 0.01 - ETA: 4:03 - batch_cost: 0.0715 - reader cost: 0 - ETA: 4:03 - batch_cost: 0.0715 - reader cost: 0.01 - ETA: 4:03 - batch_cost: 0.0715 - reader cost: 0.01 - ETA: 4:03 - batch_cost: 0.0715 - reader cost: 0. - ETA: 4:02 - batch_cost: 0.0715 - reader cost: 0.01 - ETA: 4:02 - batch_cost: 0.0715 - reader cost: 0.011 - ETA: 4:02 - batch_cost: 0.0715 - reader cost: 0. - ETA: 4:02 - batch_cost: 0.0715 - reader cost: - ETA: 4:01 - batch_cost: 0.0715 - reader cost: 0.0 - ETA: 4:01 - batch_cost: 0.0715 - reader cost: 0.01 - ETA: 4:00 - batch_cost: 0.0715 - reader cost: - ETA: 4:00 - batch_cost: 0.0715 - reader cost: 0.011 - ETA: 4:00 - batch_cost: 0.0715 - reader cost: 0. - ETA: 3:59 - batch_cost: 0.0715 - reader cost: 0.01 - ETA: 3:59 - batch_cost: 0.0715 - reader cost: 0.011 - ETA: 3:59 - batch_cost: 0.0715 - reader cost: 0.0 - ETA: 3:59 - batch_cost: 0.0715 - reader cost: 0.01 - ETA: 3:59 - batch_cost: 0.0715 - reade - ETA: 3:57 - batch_cost: 0.0714 - reader cost: 0.011 - ETA: 3:57 - batch_cost: 0.0714 - reader cost: 0. - ETA: 3:57 - batch_cost: 0.0714 - reader cost: 0.01 - ETA: 3:56 - batch_cost: 0.0714 - reader cost: 0.01 - ETA: 3:56 - batch_cost: 0.0714 - reader c - ETA: 3:55 - batch_cost: 0.0714 - rea - ETA: 3:54 - batch_cost: 0.0714 - reader cost: 0. 
- ETA: 3:54 - batch_cost: 0.0714 - reader cost: 0.01 - ETA: 3:53 - batch_cost: 0.0714 - reader cost: - ETA: 3:53 - batch_cost: 0.0714 - reader c - ETA: 3:52 - batch_cost: 0.0714 - reader cost: 0 - ETA: 3:52 - batch_cost: 0.0714 - reader cost - ETA: 3:51 - batch_cost: 0.0715 - reader cost: 0.0 - ETA: 3:51 - batch_cost: 0.0715 - reader cost: - ETA: 3:51 - batch_cost: 0.0716 - reader cost: 0.01 - ETA: 3:50 - batch_cost: 0.0716 - reader cost: 0.01 - ETA: 3:50 - batch_cost: 0.0716 - reader cost: 0 - ETA: 3:50 - batch_cost: 0.0716 - reader cost: 0.011 - ETA: 3:50 - batch_cost: 0.0716 - reader cost: - ETA: 3:50 - batch_cost: 0.0717 - reader cost: 0.01 - ETA: 3:50 - batch_cost: 0.0717 - reader cost: 0.011 - ETA: 3:50 - batch_cost: 0.0717 - reader cost: 0.0 - ETA: 3:49 - batch_cost: 0.0717 - reader cos - ETA: 3:48 - batch_cost: 0.0717 - reader cost: 0. - ETA: 3:48 - batch_cost: 0.0717 - reader cost: 0. - ETA: 3:48 - batch_cost: 0.0717 - reader cost: - ETA: 3:47 - batch_cost: 0.0717 - reader cost: 0.01 - ETA: 3:47 - batch_cost: 0.0717 - reader cost: 0.0 - ETA: 3:47 - batch_cost: 0.0717 - reader cost: 0.011 - ETA: 3:47 - batch_cost: 0.0717 - reader cost: 0.0 - ETA: 3:47 - batch_cost: 0.0717 - reader cost: 0. - ETA: 3:46 - batch_cost: 0.0717 - reader cost: - ETA: 3:46 - batch_cost: 0.0717 - reader cost: 0.011 - ETA: 3:46 - batch_cost: 0.0717 - reader cost: 0. - ETA: 3:45 - batch_cost: 0.0717 - reader cost: 0.011 - ETA: 3:45 - batch_cost: 0.0717 - reader cost - ETA: 3:44 - batch_cost: 0.0717 - reader cost: 0 - ETA: 3:44 - batch_cost: 0.0717 - reader cost: 0.01 - ETA: 3:44 - batch_cost: 0.0717 - reader cost - ETA: 3:43 - batch_cost: 0.0717 - reader cost: 0.011 - ETA: 3:43 - batch_cost: 0.0717 - reader cost: 0 - ETA: 3:42 - batch_cost: 0.0717 - reader cost: 0.0 - ETA: 3:42 - batch_cost: 0.0717 - reader cost: 0. - ETA: 3:42 - batch_cost: 0.0717 - reader cost: 0.0 - ETA: 3:41 - batch_cost: 0.0717 - reader cost: - ETA: 3:41 - batch_cost: 0.0717 - reader cost: 0.01 - ETA: 3:41 - batch_cost: 0.0717 - reader cost: 0.0 - ETA: 3:41 - batch_cost: 0.0718 - reader cost: 0. - ETA: 3:41 - batch_cost: 0.0718 - reader cost: 0.0 - ETA: 3:40 - batch_cost: 0.0718 - reader cost: - ETA: 3:40 - batch_cost: 0.0718 - reader cost: 0.01 - ETA: 3:39 - batch_cost: 0.0718 - reader cost: 0.01 - ETA: 3:39 - batch_cost: 0.0718 - reader cost: 0.011 - ETA: 3:39 - batch_cost: 0.0718 - reade - ETA: 3:38 - batch_cost: 0.0719 - reader cost: - ETA: 3:37 - batch_cost: 0.0718 - reader cost: 0.01 - ETA: 3:37 - batch_cost: 0.0718 - reader cost: - ETA: 3:36 - batch_cost: 0.0718 - reader - ETA: 3:35 - batch_cost: 0.0718 - reader - ETA: 3:34 - batch_cost: 0.0718 - reader co - ETA: 3:33 - batch_cost: 0.0718 - reader cost: 0.01 - ETA: 3:33 - batch_cost: 0.0718 - reader c - ETA: 3:32 - batch_cost: 0.0718 - reader cost: 0.01 - ETA: 3:32 - batch_cost: 0.0718 - reader cost: 0.01 - ETA: 3:32 - batch_cost: 0.0718 - reader cost: - ETA: 3:31 - batch_cost: 0.0718 - reader cost: 0 - ETA: 3:31 - batch_cost: 0.0718 - reader cost: 0. - ETA: 3:31 - batch_cost: 0.0718 - reader cost: 0.01 - ETA: 3:30 - batch_cost: 0.0718 - reader cost: - ETA: 3:30 - batch_cost: 0.0718 - reader cost: 0.01 - ETA: 3:30 - batch_cost: 0.0718 - reader cost: 0. 
- ETA: 3:29 - batch_cost: 0.0718 - reader cost: 0.0 - ETA: 3:29 - batch_cost: 0.0718 - reader cost: 0.0 - ETA: 3:29 - batch_cost: 0.0718 - reader - ETA: 3:28 - batch_cost: 0.0718 - reader cost: 0 - ETA: 3:28 - batch_cost: 0.0718 - reader cost: 0.01 - ETA: 3:27 - batch_cost: 0.0718 - reader co - ETA: 3:26 - batch_cost: 0.0718 - reader cost: 0. - ETA: 3:26 - batch_cost: 0.0718 - reade - ETA: 3:25 - batch_cost: 0.0719 - reader cost: 0.01 - ETA: 3:24 - batch_cost: 0.0719 - reader cost - ETA: 3:24 - batch_cost: 0.0719 - reader cost: 0.0 - ETA: 3:24 - batch_cost: 0.0719 - reader cost: 0. - ETA: 3:23 - batch_cost: 0.0719 - reader cost: 0 - ETA: 3:23 - batch_cost: 0.0719 - reader cos - ETA: 3:23 - batch_cost: 0.0720 - reader cost: 0.01 - ETA: 3:22 - batch_cost: 0.0720 - reader cost: 0 - ETA: 3:22 - batch_cost: 0.0720 - reader cost: 0.0 - ETA: 3:22 - batch_cost: 0.0720 - reader cos - ETA: 3:21 - batch_cost: 0.0721 - reader cost: 0.01 - ETA: 3:21 - batch_cost: 0.0721 - reader co - ETA: 3:20 - batch_cost: 0.0721 - reader cost: 0.01 - ETA: 3:20 - batch_cost: 0.0721 - reader cost: - ETA: 3:19 - batch_cost: 0.0721 - reader cost: 0.011 - ETA: 3:19 - batch_cost: 0.0721 - reader cost: 0.01 - ETA: 3:19 - batch_cost: 0.0721 - reader cost: 0.0 - ETA: 3:19 - batch_cost: 0.0721 - reader cost: 0.011 - ETA: 3:19 - batch_cost: 0.0721 - reade - ETA: 3:17 - batch_cost: 0.0721 - reader cost: 0 - ETA: 3:17 - batch_cost: 0.0721 - reader cost: 0.01 - ETA: 3:17 - batch_cost: 0.0721 - reader cost: 0.0 - ETA: 3:17 - batch_cost: 0.0721 - reader cost: 0.011 - ETA: 3:16 - batch_cost: 0.0721 - reader cost: 0.0 - ETA: 3:16 - batch_cost: 0.0721 - reader cos - ETA: 3:15 - batch_cost: 0.0721 - reade - ETA: 3:14 - batch_cost: 0.0721 - reader cost: 0.01 - ETA: 3:14 - batch_cost: 0.0721 - reader cost: 0 - ETA: 3:14 - batch_cost: 0.0721 - reader cost: 0.011 - ETA: 3:13 - batch_cost: 0.0721 - reader cost: 0. - ETA: 3:13 - batch_cost: 0.0721 - reader cost: 0.01 - ETA: 3:13 - batch_cost: 0.0721 - reader cost: 0.011 - ETA: 3:13 - batch_cost: 0.0721 - reader cost: 0.011 - ETA: 3:13 - batch_cost: 0.0721 - reader cost: 0. - ETA: 3:12 - batch_cost: 0.0721 - reader - ETA: 3:11 - batch_cost: 0.0722 - reader cost: 0. - ETA: 3:11 - batch_cost: 0.0722 - reader cost: 0.011 - ETA: 3:11 - batch_cost: 0.0722 - reader cost: 0.0 - ETA: 3:11 - batch_cost: 0.0722 - reader cost: 0. - ETA: 3:10 - batch_cost: 0.0722 - reader cost: 0. - ETA: 3:10 - batch_cost: 0.0722 - r - ETA: 3:08 - batch_cost: 0.0722 - reader cost: 0.01 - ETA: 3:08 - batch_cost: 0.0722 - reader cost - ETA: 3:08 - batch_cost: 0.0723 - reader cost: 0. - ETA: 3:07 - batch_cost: 0.0723 - reader cost: 0.01 - ETA: 3:07 - batch_cost: 0.0723 - reader cost: 0.01 - ETA: 3:07 - batch_cost: 0.0723 - reader cost: 0.01 - ETA: 3:07 - batch_cost: 0.0723 - reader cos - ETA: 3:06 - batch_cost: 0.0724 - rea - ETA: 3:05 - batch_cost: 0.0724 - reader cost: 0 - ETA: 3:05 - batch_cost: 0.0724 - reader cost: 0. 
- ETA: 3:04 - batch_cost: 0.0724 - reader cost: 0.0 - ETA: 3:04 - batch_cost: 0.0724 - - ETA: 3:02 - batch_cost: 0.0724 - reader cost - ETA: 3:01 - batch_cost: 0.0724 - reader cost: 0.011 - ETA: 3:01 - batch_cost: 0.0724 - reader cost: 0.011 - ETA: 3:01 - batch_cost: 0.0724 - reader cost: 0.0 - ETA: 3:01 - batch_cost: 0.0724 - reader cost: 0.01 - ETA: 3:01 - batch_cost: 0.0724 - reade - ETA: 3:00 - batch_cost: 0.0724 - reader cost: 0 - ETA: 2:59 - batch_cost: 0.0724 - reader cost: 0.0 - ETA: 2:59 - batch_cost: 0.0724 - reader cost: - ETA: 2:58 - batch_cost: 0.0724 - reader cost: 0.011 - ETA: 2:58 - batch_cost: 0.0724 - reader cost: - ETA: 2:58 - batch_cost: 0.0724 - reader cost: 0.011 - ETA: 2:58 - batch_cost: 0.0724 - reader cost: 0.011 - ETA: 2:57 - batch_cost: 0.0724 - reader cost: 0.011 - ETA: 2:57 - batch_cost: 0.0724 - reader cost: 0.011 - ETA: 2:57 - batch_cost: 0.0724 - reader cost: 0.011 - ETA: 2:57 - batch_cost: 0.0724 - reader cost: - ETA: 2:57 - batch_cost: 0.0724 - reader cost: 0.0 - ETA: 2:56 - batch_cost: 0.0724 - reader cost: 0.01 - ETA: 2:56 - batch_cost: 0.0724 - reader cost: 0.011 - ETA: 2:56 - batch_cost: 0.0724 - reader cost: 0.01 - ETA: 2:56 - batch_cost: 0.0724 - reader cost: 0. - ETA: 2:56 - batch_cost: 0.0724 - reader - ETA: 2:54 - batch_cost: 0.0724 - reader cost: 0 - ETA: 2:54 - batch_cost: 0.0724 - reader cost: 0. - ETA: 2:54 - batch_cost: 0.0724 - reader cost: 0.01 - ETA: 2:54 - batch_cost: 0.0724 - reader cost: 0. - ETA: 2:53 - batch_cost: 0.0724 - reader cost: 0.011 - ETA: 2:53 - batch_cost: 0.0724 - reader cost: 0.011 - ETA: 2:53 - batch_cost: 0.0724 - reader cost: - ETA: 2:53 - batch_cost: 0.0724 - reader cost - ETA: 2:52 - batch_cost: 0.0724 - reader cost: 0.0 - ETA: 2:51 - batch_cost: 0.0724 - reader cost: 0.011 - ETA: 2:51 - batch_cost: 0.0724 - reader cost: 0.01 - ETA: 2:51 - batch_cost: 0.0724 - reader cost: 0.0 - ETA: 2:51 - batch_cost: 0.0725 - reader cost: 0 - ETA: 2:51 - batch_cost: 0.0725 - reader cost: 0.01 - ETA: 2:51 - batch_cost: 0.0725 - reader cost: 0.01 - ETA: 2:51 - batch_cost: 0.0726 - reader cost: 0.01 - ETA: 2:50 - batch_cost: 0.0726 - rea - ETA: 2:49 - batch_cost: 0.0726 - reader cost: 0. - ETA: 2:49 - batch_cost: 0.0726 - reader c - ETA: 2:48 - batch_cost: 0.0727 - reader cost: 0.012 - ETA: 2:48 - batch_cost: 0.0727 - reader - ETA: 2:47 - batch_cost: 0.0727 - reader cost: 0.0 - ETA: 2:47 - batch_cost: 0.0727 - reader cost - ETA: 2:46 - batch_cost: 0.0726 - reader cost: 0.01 - ETA: 2:46 - batch_cost: 0.0726 - reader cost: 0.012 - ETA: 2:46 - batch_cost: 0.0726 - reader cost: 0 - ETA: 2:45 - batch_cost: 0.0726 - reader cost: 0. - ETA: 2:45 - batch_cost: 0.0726 - reader cost: 0.01 - ETA: 2:44 - batch_cost: 0.0726 - reader co - ETA: 2:43 - batch_cost: 0.0726 - reader cost: 0.01 - ETA: 2:43 - batch_cost: 0.0726 - reader cost: 0. 
- ETA: 2:43 - batch_cost: 0.0726 - reader cost: 0.01 - ETA: 2:43 - batch_cost: 0.0726 - reader cost: 0.011 - ETA: 2:43 - batch_cost: 0.0727 - reader cost: - ETA: 2:42 - batch_cost: 0.0727 - reader cost: 0.011 - ETA: 2:42 - batch_cost: 0.0727 - reader cost: 0 - ETA: 2:42 - batch_cost: 0.0727 - reader cost: 0.011 - ETA: 2:42 - batch_cost: 0.0727 - reader cost: 0.0 - ETA: 2:42 - batch_cost: 0.0727 - reader cost: 0.01 - ETA: 2:41 - batch_cost: 0.0727 - reader cost: 0.011 - ETA: 2:41 - batch_cost: 0.0727 - reader cost: 0.0 - ETA: 2:41 - batch_cost: 0.0727 - reader cost: 0.011 - ETA: 2:41 - batch_cost: 0.0727 - reader cost: 0.011 - ETA: 2:41 - batch_cost: 0.0727 - reader cost: 0.011 - ETA: 2:41 - batch_cost: 0.0727 - reader cost: 0. - ETA: 2:40 - batch_cost: 0.0727 - reader cost: - ETA: 2:39 - batch_cost: 0.0726 - reader cost: 0.01 - ETA: 2:39 - batch_cost: 0.0726 - reader co - ETA: 2:38 - batch_cost: 0.0726 - reader cost: 0 - ETA: 2:37 - batch_cost: 0.0725 - reader cost: 0.01 - ETA: 2:37 - batch_cost: 0.0725 - reader cost: 0.0 - ETA: 2:36 - batch_cost: 0.0725 - reader cost: 0.0 - ETA: 2:36 - batch_cost: 0.0725 - reader cost: 0.0 - ETA: 2:35 - batch_cost: 0.0725 - reader cost: 0.01 - ETA: 2:35 - batch_cost: 0.0725 - reader cost: 0. - ETA: 2:35 - batch_cost: 0.0725 - reader cost: 0 - ETA: 2:34 - batch_cost: 0.0724 - reader cost: 0.011 - ETA: 2:34 - batch_cost: 0.0724 - reader cost: 0.011 - ETA: 2:34 - batch_cost: 0.0724 - reader cost: 0.011 - ETA: 2:34 - batch_cost: 0.0724 - reader cost: 0.01 - ETA: 2:33 - batch_cost: 0.0724 - reader cost: 0.011 - ETA: 2:33 - batch_cost: 0.0724 - reader cost: 0 - ETA: 2:33 - batch_cost: 0.0724 - reader cost: - ETA: 2:32 - batch_cost: 0.0724 - reader cost: - ETA: 2:31 - batch_cost: 0.0723 - reader cost: 0.01 - ETA: 2:31 - batch_cost: 0.0723 - reader cost: 0 - ETA: 2:30 - batch_cost: 0.0723 - reader cost: 0.01 - ETA: 2:30 - batch_cost: 0.0723 - reader cos - ETA: 2:28 - batch_cost: 0.0722 - reader c - ETA: 2:27 - batch_cost: 0.0 - ETA: 2:24 - batch_cost: 0.0721 - reader co - ETA: 2:22 - batch_cost: 0.0720 - reader cost: 0.011 - ETA: 2:22 - batch_cost: 0.0720 - reader cost: 0.01 - ETA: 2:22 - batch_cost: 0.0720 - reader cost: 0.01 - ETA: 2:22 - batch_cost: 0.0720 - reader cost: 0 - ETA: 2:21 - batch_cost: 0.0720 - reader cost: 0.011 - ETA: 2:21 - batch_cost: 0.0720 - reader cost: 0.011 - ETA: 2:21 - batch_cost: 0.0720 - reader cost: 0.0 - ETA: 2:20 - batch_cost: 0.0720 - reader cost: 0.01 - ETA: 2:20 - batch_cost: 0.0720 - reader cost: 0 - ETA: 2:20 - batch_cost: 0.0720 - reade - ETA: 2:18 - batch_cost: 0.0719 - reader cost: 0.011 - ETA: 2:18 - batch_cost: 0.0719 - reader cost: 0.011 - ETA: 2:18 - batch_cost: 0.0719 - re - ETA: 2:16 - batch_cost: 0.0719 - reader cost: 0.01 - ETA: 2:15 - batch_cost: 0.0719 - reader cost: 0.01 - ETA: 2:15 - batch_cost: 0.0719 - reader cost: - ETA: 2:15 - batch_cost: 0.0719 - reader cost: 0.01 - ETA: 2:14 - batch_cost: 0.0719 - reader cost: 0.01 - ETA: 2:14 - batch_cost: 0.0719 - reader cost: 0 - ETA: 2:14 - batch_cost: 0.0719 - reader cost: 0.0 - ETA: 2:14 - batch_cost: 0.0719 - reader cost: - ETA: 2:13 - batch_cost: 0.0719 - reader cost: 0.01 - ETA: 2:13 - batch_cost: 0.0719 - reader cost - ETA: 2:12 - batch_cost: 0.0720 - reader cost: 0.011 - ETA: 2:12 - batch_cost: 0.0720 - reader cost: - ETA: 2:12 - batch_cost: 0.0720 - reader cost: 0 - ETA: 2:11 - batch_cost: 0.0720 - reader cost: 0.01 - ETA: 2:11 - batch_cost: 0.0720 - reader cost: 0.011 - ETA: 2:11 - batch_cost: 0.0720 - reader cost - ETA: 2:10 - batch_cost: 0.0719 - reader 
cost: 0.01 - ETA: 2:09 - batch_cost: 0.0719 - reade - ETA: 2:07 - batch_cost: 0.0719 - reader cost: 0 - ETA: 2:07 - batch_cost: 0.0718 - reader cost: 0.01 - ETA: 2:07 - batch_cost: 0.0718 - reader cos - ETA: 2:06 - batch_cost: 0.0718 - reader cost: - ETA: 2:05 - batch_cost: 0.0718 - reader - ETA: 2:04 - batch_cost: 0.0719 - reader cost: 0.01 - ETA: 2:04 - batch_cost: 0.0719 - reader cost: - ETA: 2:04 - batch_cost: 0.0719 - reader cost: 0.010 - ETA: 2:04 - batch_cost: 0.0719 - reader cost: 0.01 - ETA: 2:03 - batch_cost: 0.0719 - reader cost: 0.0 - ETA: 2:03 - batch_cost: 0.0719 - reader cost: 0.01 - ETA: 2:03 - batch_cost: 0.0719 - reader cost: 0.01 - ETA: 2:02 - batch_cost: 0.0719 - reader cost: 0. - ETA: 2:02 - batch_cost: 0.0719 - reader cost - ETA: 2:02 - batch_cost: 0.0719 - reader cost: 0.01 - ETA: 2:01 - batch_cost: 0.0719 - reader cost: 0.0 - ETA: 2:01 - batch_cost: 0.0720 - reader cost: 0.01 - ETA: 2:01 - batch_cost: 0.0720 - reader cost: 0.010 - ETA: 2:01 - batch_cost: 0.0720 - reader cost: 0. - ETA: 2:01 - batch_cost: 0.0720 - reader cost: 0.01 - ETA: 2:00 - batch_cost: 0.0720 - reader cost: 0 - ETA: 2:00 - batch_cost: 0.0720 - reader cost: 0.01 - ETA: 1:59 - batch_cost: 0.0719 - reader cost: 0.010 - ETA: 1:59 - batch_cost: 0.0719 - reader cost: 0 - ETA: 1:58 - batch_cost: 0.0719 - reader cost: 0.01 - ETA: 1:58 - batch_cost: 0.0719 - reader cost: 0.0 - ETA: 1:58 - batch_cost: 0.0719 - reader cost: 0.0 - ETA: 1:57 - batch_cost: 0.0719 - reader cost: 0.010 - ETA: 1:57 - batch_cost: 0.0719 - reader cost: 0.0 - ETA: 1:57 - batch_cost: 0.0719 - reader cost: 0. - ETA: 1:56 - batch_cost: 0.0718 - reader cost: 0.01 - ETA: 1:56 - batch_cost: 0.0718 - reader - ETA: 1:54 - batch_cost: 0.0718 - reader cost - ETA: 1:53 - batch_cost: 0.0717 - reader cost: - ETA: 1:52 - batch_cost: 0.0717 - reader cost: 0.0 - ETA: 1:52 - batch_cost: 0.0717 - reader cost: 0.01 - ETA: 1:52 - batch_cost: 0.0717 - reader cost: 0. - ETA: 1:51 - batch_cost: 0.0717 - reader cost: - ETA: 1:50 - batch_cost: 0.0716 - reader cost: 0. - ETA: 1:50 - batch_cost: 0.0716 - reader cost: 0. - ETA: 1:49 - batch_cost: 0.0716 - reader cost: 0.01 - ETA: 1:49 - batch_cost: 0.0716 - reader cost: 0.010 - ETA: 1:49 - batch_cost: 0.0716 - reader cost: 0.01 - ETA: 1:49 - batch_cost: 0.0716 - reader co - ETA: 1:47 - batch_cost: 0.0715 - reader cost - ETA: 1:46 - batch_cost: 0.0715 - reader cost: 0.01 - ETA: 1:46 - batch_cost: 0.0715 - reader cost: 0.0 - ETA: 1:46 - batch_cost: 0.0715 - reader cost: 0 - ETA: 1:45 - batch_cost: 0.0715 - reader cost: 0.01 - ETA: 1:45 - batch_cost: 0.0715 - reader cost: 0.0 - ETA: 1:45 - batch_cost: 0.0715 - reader cost: - ETA: 1:44 - batch_cost: 0.0714 - reader cost: 0. - ETA: 1:43 - batch_cost: 0.0714 - re - ETA: 1:41 - batch_cost: 0.0713 - reader cost: 0.0 - ETA: 1:41 - batch_cost: 0.0713 - reader cost: 0.010 - ETA: 1:40 - batch_cost: 0.0713 - reader cost: 0.01 - ETA: 1:40 - batch_cost: 0.0713 - reader cost: 0.0 - ETA: 1:40 - batch_cost: 0.0713 - reader cost: 0.01 - ETA: 1:40 - batch_cost: 0.0713 - reader cost: 0.0 - ETA: 1:39 - batch_cost: 0.0713 - reader cost: 0.01 - ETA: 1:39 - batch_cost: 0.0713 - reader cost: 0. 
- ETA: 1:38 - batch_cost: 0.0713 - reader cost: 0.0 - ETA: 1:38 - batch_cost: 0.0713 - reader cost: 0.01 - ETA: 1:38 - batch_cost: 0.0713 - reader cost: 0.01 - ETA: 1:38 - batch_cost: 0.0713 - reader cost: 0.01 - ETA: 1:37 - batch_cost: 0.0713 - reader cost: 0.0 - ETA: 1:37 - batch_cost: 0.0712 - reader cost: 0.01 - ETA: 1:37 - batch_cost: 0.0712 - reader cost: - ETA: 1:36 - batch_cost: 0.0712 - reader cost: 0.010 - ETA: 1:36 - batch_cost: 0.0712 - reader cost: 0.01 - ETA: 1:36 - batch_cost: 0.0712 - reader cost: 0.01 - ETA: 1:36 - batch_cost: 0.0712 - reader cost: 0.010 - ETA: 1:35 - batch_cost: 0.0712 - reader cost: 0.010 - ETA: 1:35 - batch_cost: 0.0712 - reader cost: 0.01 - ETA: 1:35 - batch_cost: 0.0712 - reader cost: 0.0 - ETA: 1:35 - batch_cost: 0.0712 - reader cos - ETA: 1:34 - batch_cost: 0.0712 - - ETA: 1:32 - batch_cost: 0.0713 - reader cost: 0.0 - ETA: 1:32 - batch_cost: 0.0713 - reader cost: 0.010 - ETA: 1:32 - batch_cost: 0.0713 - reader cost: 0.0 - ETA: 1:32 - batch_cost: 0.0713 - reader cost: 0.01 - ETA: 1:32 - batch_cost: 0.0713 - reader cost: 0.0 - ETA: 1:31 - batch_cost: 0.0713 - reader cost: 0. - ETA: 1:31 - batch_cost: 0.0713 - reader cost: 0. - ETA: 1:30 - batch_cost: 0.0713 - reader cost: - ETA: 1:30 - batch_cost: 0.0713 - reader cost: - ETA: 1:29 - batch_cost: 0.0713 - reader cos - ETA: 1:28 - batch_cost: 0.0713 - reader cost - ETA: 1:28 - batch_cost: 0.0713 - ETA: 1:25 - batch_cost: 0.0713 - reader cost: 0. - ETA: 1:25 - batch_cost: 0.0713 - reader cost: 0 - ETA: 1:24 - batch_cost: 0.0713 - reader cost: 0. - ETA: 1:24 - batch_cost: 0.0713 - reader cost: 0.009 - ETA: 1:24 - batch_cost: 0.0712 - reader cost: - ETA: 1:23 - batch_cost: 0.0712 - reader cos - ETA: 1:22 - batch_cost: 0.0712 - reader cost: 0.00 - ETA: 1:22 - batch_cost: 0.0712 - reader cost: 0 - ETA: 1:21 - batch_cost: 0.0712 - reader cost: 0.009 - ETA: 1:21 - batch_cost: 0.0712 - reader cost: 0.009 - ETA: 1:21 - batch_cost: 0.0712 - reader cost: - ETA: 1:20 - batch_cost: 0.0711 - reader cost: 0.00 - ETA: 1:20 - batch_cost: 0.071 - ETA: 1:17 - batch_cost: 0.0711 - reader cost: 0.009 - ETA: 1:17 - batch_cost: 0.0711 - reader cost: 0.00 - ETA: 1:17 - batch_cost: 0.0711 - reader cost: 0 - ETA: 1:16 - batch_cost: 0.0710 - reader cost: 0. - ETA: 1:16 - batch_cost: 0.0710 - reader cost: 0.0 - ETA: 1:15 - batch_cost: 0.0710 - reader cost: 0.00 - ETA: 1:15 - batch_cost: 0.0710 - reader co - ETA: 1:14 - batch_cost: 0.0710 - reader cost: 0.0 - ETA: 1:13 - batch_cost: 0.0710 - reader cost: 0. - ETA: 1:13 - batch_cost: 0.0710 - reader cost: 0.00 - ETA: 1:13 - batch_cost: 0.0710 - reader cost: 0 - ETA: 1:12 - batch_cost: 0.0709 - reader cost: 0.00 - ETA: 1:12 - batch_cost: 0.0709 - reader cost: 0.0 - ETA: 1:11 - batch_cost: 0.0709 - reader co - ETA: 1:10 - batch_cost: 0.0709 - reader cost: 0.00 - ETA: 1:10 - batch_cost: 0.0709 - reader cost: 0. 
- ETA: 1:09 - batch_cost: 0.0709 - reader cost: 0.009 - ETA: 1:09 - batch_cost: 0.0709 - reader cost: 0 - ETA: 1:09 - batch_cost: 0.0708 - reader cost: 0.00 - ETA: 1:08 - batch_cost: 0.0708 - reader cost: 0.009 - ETA: 1:08 - batch_cost: 0.0708 - reader cost: 0.00 - ETA: 1:08 - batch_cost: 0.0708 - reader cost: 0.00 - ETA: 1:08 - batch_cost: 0.0708 - rea - ETA: 1:06 - batch_cost: 0.0708 - reader cost - ETA: 1:05 - batch_cost: 0.0708 - reader cos - ETA: 1:04 - batch_cost: 0.0707 - reader c - ETA: 1:03 - batch_cost: 0.0708 - reader cost: - ETA: 1:03 - batch_cost: 0.0708 - reader cost: 0.0 - ETA: 1:02 - batch_cost: 0.0708 - reader cost: 0.009 - ETA: 1:02 - batch_cost: 0.0708 - reader cost: 0.00 - ETA: 1:02 - batch_cost: 0.0707 - reader cost: 0.009 - ETA: 1:02 - batch_cost: 0.0708 - reader cost: 0. - ETA: 1:01 - batch_cost: 0.0707 - reader cost - ETA: 1:00 - batch_cost: 0.0707 - reader cost: 0.00 - ETA: 1:00 - batch_cost: 0.0707 - reader cost: 0. - ETA: 1:00 - batch_cost: 0.0707 - reader cost: 0. - ETA: 59s - batch_cost: 0.0707 - reader cost: - ETA: 59s - batch_cost: 0.0707 - reader cost: - ETA: 59s - batch_cost: 0.0707 - reader - ETA: 58s - batch_cost: 0.0707 - reader cost - ETA: 58s - batch_cost: 0.0707 - rea - ETA: 57s - batch_cost: 0.0707 - reader cost: 0. - ETA: 57s - batch_cost: 0.0707 - reader cost - ETA: 57s - batch_cost: 0.0707 - reader cost - ETA: 56s - batch_cost: 0.0707 - reader co - ETA: 56s - batch_cost: 0.0707 - reader cost - ETA: 55s - batch_cost: 0.0708 - reader cost: 0. - ETA: 55s - batch_cost: 0.0708 - reader cost: - ETA: 55s - batch_cost: 0.0708 - ETA: 54s - batch_cost: 0.0708 - reade - ETA: 53s - batch_cost: 0.0708 - ETA: 52s - batch_cost: 0.0707 - reader cost - ETA: 52s - batch_cost: 0.0707 - reader cost: 0.00 - ETA: 52s - batch_cost: 0.0707 - reader cost: 0. - ETA: 51s - batch_cost: 0.0707 - reader cost: 0.00 - ETA: 51s - batch_cost: 0.0708 - reader cost: 0.00 - ETA: 51s - batch_cost: 0.0708 - reader cost: - ETA: 51s - batch_cost: 0.0708 - reader cost: 0. - ETA: 51s - batch_cost: 0.0708 - reader cost - ETA: 50s - batch_cost: 0.0708 - reade - ETA: 50s - batch_cost: 0.0708 - reader cost: 0.00 - ETA: 50s - batch_cost: 0.0708 - reader - ETA: 49s - batch_cost: 0.0708 - reader cost: 0. - ETA: 49s - batch_ - ETA: 47s - batch_cost: 0.0708 - reader cost: 0. - ETA: 47s - batch_cost: 0.0708 - reader - ETA: 46s - batch_cost: 0.0708 - reader co - ETA: 46s - batch_cost: 0.0708 - reader cost: 0. - ETA: 45s - batch_cost: 0.0708 - - ETA: 44s - batch_cost: 0.0707 - reader co - ETA: 44s - batch_cost: 0.0707 - reader co - ETA: 43s - batch_cost: 0.0707 - r - ETA: 42s - batch_cost: 0.0707 - reader cost: 0.00 - ETA: 42s - batch_cost: 0.0706 - reader cost: 0.00 - ETA: 42s - batch_cost: 0.0706 - reader cost - ETA: 41s - batch_cost: 0.0706 - reader cost - ETA: 41s - batch_cost: 0.0706 - reader co - ETA: 40s - batch_cost: 0.0706 - reader cost: 0. - ETA: 40s - batch_cost: 0.0706 - reader - ETA: 39s - batch_cost: 0.0706 - reader cost: 0. 
- ETA: 39s - batch_cost: 0.0706 - reader cost: - ETA: 38s - batch_cost: 0.0706 - reader cost: 0.00 - ETA: 38s - batch_cost: 0.0706 - reader cost: 0.00 - ETA: 38s - batch_co - ETA: 36s - batch_cost: 0.0705 - reader co - ETA: 36s - batch_cost: 0.0705 - reader cost - ETA: 35s - batch_cost: 0.0705 - - ETA: 34s - batch_cost: 0.0705 - reade - ETA: 33s - batch_cost: 0.0704 - reader co - ETA: 33s - batch_cost: 0.0704 - reader - ETA: 32s - batch_cost - ETA: 30s - batch_cost: 0.0704 - reader cost: - ETA: 30s - batch_cost: 0.0704 - reader cost: - ETA: 29s - batch_cost: 0.0703 - reader cost: 0. - ETA: 29s - batch_cost: 0.0703 - reader cost: 0. - ETA: 29s - batch_cost: 0.0703 - reader cost: - ETA: 28s - batch_cost: 0.0703 - reader - ETA: 28s - batch_cost: 0.07 - ETA: 26s - batch_cost: 0.0703 - reader cost: 0. - ETA: 26s - batch_cost: 0.0703 - reader co - ETA: 26s - batch_cost: 0.0703 - reader cost: - ETA: 25s - batch_cost: 0.0703 - reader co - ETA: 25s - batch_cost: 0.0702 - reader cost: 0. - ETA: 24s - batch_cost: 0.0702 - r - ETA: 23s - batch_cost: 0.0702 - reader cost: 0. - ETA: 23s - batch_cost: 0.0702 - rea - ETA: 22s - batch_cost: 0.0702 - reader cost: 0. - ETA: 22s - batch_cost: 0.0702 - reader co - ETA: 21s - batch_cost: 0.0702 - reader cost: 0. - ETA: 21s - batch_cost: 0.0702 - reader cost: - ETA: 21s - batch_cost: 0.0701 - reader co - ETA: 20s - batch_cost: 0.0701 - reader cost: 0. - ETA: 20s - batch_cost: 0.0701 - reader - ETA: 19s - batch_cost: 0.0701 - reader cost: - ETA: 19s - batch_cost: 0.0701 - reader cost: 0.00 - ETA: 19s - batch_cost: 0.0701 - reader cost - ETA: 18s - batch_cost: 0.0701 - - ETA: 17s - batch_cost: 0.0700 - reader cost: - ETA: 16s - batch_cost: 0.0700 - reader cost: 0. - ETA: 16s - batch_cost: 0.0700 - reader - ETA: 16s - batch_cost: 0.0700 - reader co - ETA: 15s - batch_cost: 0. - ETA: 14s - batch_cost: 0.0700 - ETA: 13s - batch_cost: 0.0701 - reader cost: 0.00 - ETA: 13s - batch_cost: 0.0701 - reader cost: 0. - ETA: 13s - batch_cost: 0.0701 - reader co - ETA: 12s - batch_cost: 0.0701 - reade - ETA: 12s - batch_cost: 0.0702 - reader cost: 0.00 - ETA: 12s - batch_cost: 0.0702 - reader cost: 0.00 - ETA: 12s - batch_cost: 0.0702 - reader cost: 0.00 - ETA: 12s - batch_cost: 0.0702 - reader cost: - ETA: 11s - batch_cost: 0.0702 - reader cost: 0. - ETA: 11s - batch_cost: 0.0702 - reader cost: - ETA: 11s - batch_cost: 0.0702 - reader cost: - ETA: 11s - batch_cost: 0.0702 - reader cost: - ETA: 11s - batch_cost: 0.0702 - reader cost: 0. - ETA: 10s - batch_cost: 0.0702 - reader cost: 0. - ETA: 10s - batch_cost: 0.0702 - reader - ETA: 10s - batch_cost: 0.0702 - reader cost: 0. - ETA: 9s - batch_cost: 0.0702 - reader cost: 0.00 - ETA: 9s - batch_cost: 0.0702 - reader cost: 0 - ETA: 8s - batch_cost: 0.0702 - reader cost: - ETA: 8s - batch_cost: 0.0702 - reader cost: - ETA: 7s - batch_cost: 0.0701 - reader cost: - ETA: 6s - batch_cost: 0.0701 - reader cost: 0.00 - ETA: 6s - batch_cost: 0.0701 - reader cost: 0 - ETA: 5s - batch_cost: 0.0701 - reader cost: 0.00 - ETA: 5s - batch_cost: 0.0701 - reader cos - ETA: 4s - batch_cost: 0.0701 - reader cost: 0.00 - ETA: 4s - batch_cost: 0.0701 - reader co - ETA: 3s - batch_cost: 0.0700 - reader cost: 0. - ETA: 2s - batch_cost: 0.0700 - reader cost: - ETA: 1s - batch_cost: 0.0700 - reader cost: 0. 
2022-07-21 16:59:39 [INFO] [EVAL] #Images: 7361 mIoU: 0.1235 Acc: 0.9842 Kappa: 0.4840 Dice: 0.1590
2022-07-21 16:59:39 [INFO] [EVAL] Class IoU: [0.9862 0.1565 0.0647 0.2815 0.0385 0. 0. 0. 0.3483 0.0126 0. 0. 0. 0.003 0. 0. 0. 0. 0. 0.5792]
2022-07-21 16:59:39 [INFO] [EVAL] Class Precision: [0.9898 0.4227 0.4373 0.5226 0.3097 0. 0. 0. 0.5097 0.2016 0. 0. 0. 0.3583 0. 0. 0. 0. 0. 0.6356]
2022-07-21 16:59:39 [INFO] [EVAL] Class Recall: [0.9963 0.1991 0.0706 0.379 0.0421 0. 0. 0. 0.5238 0.0132 0. 0. 0. 0.003 0. 0. 0. 0. 0. 0.8672]
2022-07-21 16:59:39 [INFO] [EVAL] The model with the best validation mIoU (0.1235) was saved at iter 20000.
<class 'paddle.nn.layer.conv.Conv2D'>'s flops has been counted
Customize Function has been applied to <class 'paddle.nn.layer.norm.SyncBatchNorm'>
<class 'paddle.nn.layer.activation.ReLU'>'s flops has been counted
Cannot find suitable count function for <class 'paddle.nn.layer.pooling.MaxPool2D'>. Treat it as zero FLOPs.
<class 'paddle.nn.layer.pooling.AdaptiveAvgPool2D'>'s flops has been counted
Cannot find suitable count function for <class 'paddleseg.models.layers.wrap_functions.Add'>. Treat it as zero FLOPs.
<class 'paddle.nn.layer.pooling.AvgPool2D'>'s flops has been counted
Cannot find suitable count function for <class 'paddle.nn.layer.activation.Sigmoid'>. Treat it as zero FLOPs.
<class 'paddle.nn.layer.common.Dropout'>'s flops has been counted
Total Flops: 8070562560     Total Params: 2330668

4. Model Evaluation

4.1 Model Validation

Once the model has been saved, we can evaluate it with the val.py script.

In [7]

! python /home/aistudio/PaddleSeg/val.py --config /home/aistudio/PaddleSeg/configs/quick_start/bisenet_optic_disc_512x512_1k.yml --model_path /home/aistudio/output/iter_20000/model.pdparams

2022-07-21 17:35:43 [INFO] ---------------Config Information---------------
batch_size: 4
iters: 20000
loss:
  coef:
  - 1
  - 1
  - 1
  - 1
  - 1
  types:
  - type: CrossEntropyLoss
lr_scheduler:
  end_lr: 0
  learning_rate: 0.01
  power: 0.9
  type: PolynomialDecay
model:
  pretrained: null
  type: BiSeNetV2
optimizer:
  momentum: 0.9
  type: sgd
  weight_decay: 4.0e-05
train_dataset:
  dataset_root: /home/aistudio/custom_dataset/dataset2
  mode: train
  num_classes: 20
  train_path: /home/aistudio/custom_dataset/dataset2/train.txt
  transforms:
  - target_size:
    - 512
    - 512
    type: Resize
  - type: RandomHorizontalFlip
  - type: Normalize
  type: Dataset
val_dataset:
  dataset_root: /home/aistudio/custom_dataset/dataset2
  mode: val
  num_classes: 20
  transforms:
  - target_size:
    - 512
    - 512
    type: Resize
  - type: Normalize
  type: Dataset
  val_path: /home/aistudio/custom_dataset/dataset2/val.txt
------------------------------------------------
W0721 17:35:43.288358 19127 gpu_resources.cc:61] Please NOTE: device: 0, GPU Compute Capability: 7.0, Driver API Version: 11.2, Runtime API Version: 10.1
W0721 17:35:43.288410 19127 gpu_resources.cc:91] device: 0, cuDNN Version: 7.6.
2022-07-21 17:35:44 [INFO] Loading pretrained model from /home/aistudio/output/iter_20000/model.pdparams
2022-07-21 17:35:44 [INFO] There are 356/356 variables loaded into BiSeNetV2.
2022-07-21 17:35:44 [INFO] Loaded trained params of model successfully
2022-07-21 17:35:44 [INFO] Start evaluating (total_samples: 7361, total_iters: 7361)...
7361/7361 [==============================] - 441s 60ms/step - batch_cost: 0.0596 - reader cost: 0.0074
2022-07-21 17:43:05 [INFO] [EVAL] #Images: 7361 mIoU: 0.1235 Acc: 0.9842 Kappa: 0.4840 Dice: 0.1590
2022-07-21 17:43:05 [INFO] [EVAL] Class IoU: [0.9862 0.1565 0.0647 0.2815 0.0385 0. 0. 0. 0.3483 0.0126 0. 0. 0. 0.003 0. 0. 0. 0. 0. 0.5792]
2022-07-21 17:43:05 [INFO] [EVAL] Class Precision: [0.9898 0.4227 0.4373 0.5226 0.3097 0. 0. 0. 0.5097 0.2016 0. 0. 0. 0.3583 0. 0. 0. 0. 0. 0.6356]
2022-07-21 17:43:05 [INFO] [EVAL] Class Recall: [0.9963 0.1991 0.0706 0.379 0.0421 0. 0. 0. 0.5238 0.0132 0. 0. 0. 0.003 0. 0. 0. 0. 0. 0.8672]
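
For reference, the per-class IoU and mIoU reported above are derived from a confusion matrix accumulated over the validation set. The following is a minimal NumPy sketch of that computation, not PaddleSeg's internal implementation; the confusion matrix used here is a toy example.

import numpy as np

def miou_from_confusion(conf):
    # conf[i, j] counts pixels whose ground-truth class is i and predicted class is j
    tp = np.diag(conf).astype(np.float64)
    fp = conf.sum(axis=0) - tp                      # predicted as class k, but another class in truth
    fn = conf.sum(axis=1) - tp                      # class k in truth, but predicted as another class
    union = tp + fp + fn
    iou = np.where(union > 0, tp / np.maximum(union, 1), 0.0)   # per-class IoU
    return iou, iou.mean()                          # mIoU is the unweighted mean over classes

# toy 3-class confusion matrix
conf = np.array([[50, 2, 1],
                 [3, 10, 0],
                 [1, 0, 5]], dtype=np.int64)
class_iou, miou = miou_from_confusion(conf)
print(class_iou, miou)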

4.2 Multi-scale and Flip Evaluation

Multi-scale and flip evaluation is enabled by passing --aug_eval, with the scale factors given via --scales; --flip_horizontal enables horizontal flipping and --flip_vertical enables vertical flipping. A usage example:

In [22]

! python /home/aistudio/PaddleSeg/val.py --config /home/aistudio/PaddleSeg/configs/quick_start/bisenet_optic_disc_512x512_1k.yml --model_path /home/aistudio/output/iter_20000/model.pdparams --aug_eval --scales 0.75 1.0 1.25 --flip_horizontal

2022-07-21 18:01:51 [INFO] ---------------Config Information---------------
(config identical to the dump in 4.1)
------------------------------------------------
W0721 18:01:52.069475 22292 gpu_resources.cc:61] Please NOTE: device: 0, GPU Compute Capability: 7.0, Driver API Version: 11.2, Runtime API Version: 10.1
W0721 18:01:52.069526 22292 gpu_resources.cc:91] device: 0, cuDNN Version: 7.6.
2022-07-21 18:01:53 [INFO] Loading pretrained model from /home/aistudio/output/iter_20000/model.pdparams
2022-07-21 18:01:53 [INFO] There are 356/356 variables loaded into BiSeNetV2.
2022-07-21 18:01:53 [INFO] Loaded trained params of model successfully
2022-07-21 18:01:53 [INFO] Start evaluating (total_samples: 7361, total_iters: 7361)...
7361/7361 [==============================] - 1126s 153ms/step - batch_cost: 0.1527 - reader cost: 1.6901e-04
2022-07-21 18:20:39 [INFO] [EVAL] #Images: 7361 mIoU: 0.1235 Acc: 0.9847 Kappa: 0.4788 Dice: 0.1566
2022-07-21 18:20:39 [INFO] [EVAL] Class IoU: [0.9864 0.1447 0.0531 0.2827 0.0259 0. 0. 0. 0.3494 0.0034 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.6249]
2022-07-21 18:20:39 [INFO] [EVAL] Class Precision: [0.9893 0.4479 0.5183 0.5357 0.3257 0. 0. 0. 0.5209 0.2834 0. 0. 0. 0.0178 0. 0. 0. 0. 0. 0.7068]
2022-07-21 18:20:39 [INFO] [EVAL] Class Recall: [0.9971 0.1761 0.0559 0.3745 0.0274 0. 0. 0. 0.5148 0.0034 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.8436]
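
Conceptually, --aug_eval runs inference at each scale (and on the flipped image), maps the predictions back to the original resolution, and averages the class probabilities. Below is a rough sketch of that idea, not PaddleSeg's code; infer_prob(image) is a hypothetical function returning per-class probability maps of shape (num_classes, H, W), and OpenCV is assumed to be available for resizing.

import numpy as np
import cv2  # assumed available for resizing

def aug_eval_predict(image, infer_prob, scales=(0.75, 1.0, 1.25), flip_horizontal=True):
    h, w = image.shape[:2]
    acc = None
    for s in scales:
        scaled = cv2.resize(image, (int(w * s), int(h * s)))
        views = [scaled]
        if flip_horizontal:
            views.append(scaled[:, ::-1])                    # horizontally flipped view
        for k, view in enumerate(views):
            prob = infer_prob(view)                          # (num_classes, H', W') probabilities
            if k == 1:
                prob = prob[:, :, ::-1]                      # flip the prediction back
            prob = np.stack([cv2.resize(p, (w, h)) for p in prob])  # back to the original size
            acc = prob if acc is None else acc + prob        # accumulate probabilities
    return acc.argmax(axis=0)                                # final label map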

4.3 Sliding-Window Evaluation

Sliding-window evaluation is enabled by passing --is_slide; --crop_size sets the window size and --stride sets the step between windows. A usage example:

In [24]

!python /home/aistudio/PaddleSeg/val.py --config /home/aistudio/PaddleSeg/configs/quick_start/bisenet_optic_disc_512x512_1k.yml --model_path /home/aistudio/output/iter_20000/model.pdparams --is_slide --crop_size 256 256 --stride 128 128

2022-07-21 18:58:17 [INFO] ---------------Config Information---------------
(config identical to the dump in 4.1)
------------------------------------------------
W0721 18:58:17.756866 28601 gpu_resources.cc:61] Please NOTE: device: 0, GPU Compute Capability: 7.0, Driver API Version: 11.2, Runtime API Version: 10.1
W0721 18:58:17.756922 28601 gpu_resources.cc:91] device: 0, cuDNN Version: 7.6.
2022-07-21 18:58:19 [INFO] Loading pretrained model from /home/aistudio/output/iter_20000/model.pdparams
2022-07-21 18:58:19 [INFO] There are 356/356 variables loaded into BiSeNetV2.
2022-07-21 18:58:19 [INFO] Loaded trained params of model successfully
2022-07-21 18:58:19 [INFO] Start evaluating (total_samples: 7361, total_iters: 7361)...
7361/7361 [==============================] - 2958s 402ms/step - batch_cost: 0.4016 - reader cost: 2.0547e-04
2022-07-21 19:47:37 [INFO] [EVAL] #Images: 7361 mIoU: 0.1097 Acc: 0.9836 Kappa: 0.3929 Dice: 0.1376
2022-07-21 19:47:37 [INFO] [EVAL] Class IoU: [0.9849 0.0874 0.013 0.2107 0.0135 0. 0. 0. 0.2979 0.0001 0. 0. 0. 0.0005 0. 0. 0. 0. 0. 0.5858]
2022-07-21 19:47:37 [INFO] [EVAL] Class Precision: [0.9873 0.3738 0.4391 0.4821 0.2876 0. 0. 0. 0.5352 0.0299 0. 0. 0. 0.3669 0. 0. 0. 0. 0. 0.7497]
2022-07-21 19:47:37 [INFO] [EVAL] Class Recall: [0.9976 0.1024 0.0132 0.2724 0.014 0. 0. 0. 0.4019 0.0001 0. 0. 0. 0.0005 0. 0. 0. 0. 0. 0.7282]
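
The idea behind --is_slide is to crop overlapping windows from each image, run inference on every window, and accumulate the logits, dividing by how many windows covered each pixel. The following is a simplified sketch, not PaddleSeg's implementation; infer_logits(patch) is a hypothetical function returning logits of shape (num_classes, crop_h, crop_w), and edge alignment (shifting the last window to the border) is ignored.

import numpy as np

def slide_inference(image, infer_logits, num_classes, crop=(256, 256), stride=(128, 128)):
    h, w = image.shape[:2]
    ch, cw = crop
    sh, sw = stride
    logits = np.zeros((num_classes, h, w), dtype=np.float32)
    counts = np.zeros((h, w), dtype=np.float32)
    for y in range(0, max(h - ch, 0) + 1, sh):
        for x in range(0, max(w - cw, 0) + 1, sw):
            y2, x2 = min(y + ch, h), min(x + cw, w)
            patch = image[y:y2, x:x2]
            logits[:, y:y2, x:x2] += infer_logits(patch)   # accumulate window logits
            counts[y:y2, x:x2] += 1                        # how often each pixel was covered
    logits /= np.maximum(counts, 1)                        # average overlapping predictions
    return logits.argmax(axis=0)                           # final label map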

5. Model Visualization

On AI Studio we can monitor training through the Visualization button in the left sidebar: set the logdir to /home/aistudio/output and the model path to /home/aistudio/output/iter_20000/model.pdparams, then start the visualization. Part of the visualized training data is shown below.
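
Outside of AI Studio, the same logs can be viewed by launching VisualDL (already in the requirements) from the command line; the port below is an arbitrary choice:

! visualdl --logdir /home/aistudio/output --port 8040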

(Screenshots of the training visualization curves)

6. Model Prediction

In [25]

! python /home/aistudio/PaddleSeg/predict.py --config /home/aistudio/PaddleSeg/configs/quick_start/bisenet_optic_disc_512x512_1k.yml --model_path /home/aistudio/output/iter_20000/model.pdparams --image_path /home/aistudio/custom_dataset/dataset2/test --save_dir output/result

2022-07-21 19:47:41 [INFO] ---------------Config Information---------------
(config identical to the dump in 4.1)
------------------------------------------------
W0721 19:47:41.718698 1631 gpu_resources.cc:61] Please NOTE: device: 0, GPU Compute Capability: 7.0, Driver API Version: 11.2, Runtime API Version: 10.1
W0721 19:47:41.718751 1631 gpu_resources.cc:91] device: 0, cuDNN Version: 7.6.
2022-07-21 19:47:43 [INFO] Number of predict images = 1000
2022-07-21 19:47:43 [INFO] Loading pretrained model from /home/aistudio/output/iter_20000/model.pdparams
2022-07-21 19:47:43 [INFO] There are 356/356 variables loaded into BiSeNetV2.
2022-07-21 19:47:43 [INFO] Start to predict...
1000/1000 [==============================] - 136s 136ms/step

Sample prediction results are shown below (added_prediction / pseudo_color_prediction):

(Sample images from output/result/added_prediction and output/result/pseudo_color_prediction)

7. Model Export

To make industrial-grade deployment easier, we use PaddleSeg's export function to convert the trained dynamic-graph model into static-graph form.

In [27]

! python /home/aistudio/PaddleSeg/export.py --config /home/aistudio/PaddleSeg/configs/quick_start/bisenet_optic_disc_512x512_1k.yml --model_path /home/aistudio/output/iter_20000/model.pdparams

W0721 19:54:16.925318 2444 gpu_resources.cc:61] Please NOTE: device: 0, GPU Compute Capability: 7.0, Driver API Version: 11.2, Runtime API Version: 10.1
W0721 19:54:16.930387 2444 gpu_resources.cc:91] device: 0, cuDNN Version: 7.6.
2022-07-21 19:54:18 [INFO] Loaded trained params of model successfully.
/opt/conda/envs/python35-paddle120-env/lib/python3.7/site-packages/paddle/fluid/layers/math_op_patch.py:341: UserWarning: The behavior of expression A + B has been unified with elementwise_add(X, Y, axis=-1) from Paddle 2.0. If your code works well in the older versions but crashes in this version, try to use elementwise_add(X, Y, axis=0) instead of A + B. This transitional warning will be dropped in the future.
(The same transitional UserWarning is repeated several times for A + B and A * B expressions generated during export.)
2022-07-21 19:54:21 [INFO] Model is saved in ./output.

The exported model has the following layout:

output
├── deploy.yaml            # deployment-related config file
├── model.pdiparams        # static-graph model parameters
├── model.pdiparams.info   # extra parameter info, usually not needed
└── model.pdmodel          # static-graph model file
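
For the next stage, the exported static-graph model can be loaded with the Paddle Inference API. The snippet below is a minimal sketch, assuming the files above sit in ./output and that the input is a normalized 512x512 image; it is not the official deployment script, and the dummy input and output interpretation are placeholders.

import numpy as np
import paddle.inference as paddle_infer

# Build an inference config from the exported static-graph files
config = paddle_infer.Config("output/model.pdmodel", "output/model.pdiparams")
predictor = paddle_infer.create_predictor(config)

# Dummy normalized input of shape [batch, channels, height, width] (placeholder for a real image)
img = np.random.rand(1, 3, 512, 512).astype("float32")

input_name = predictor.get_input_names()[0]
input_handle = predictor.get_input_handle(input_name)
input_handle.reshape(img.shape)
input_handle.copy_from_cpu(img)

predictor.run()

output_name = predictor.get_output_names()[0]
output_handle = predictor.get_output_handle(output_name)
pred = output_handle.copy_to_cpu()   # predicted label map or logits, depending on deploy.yaml
print(pred.shape)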
