Epoch 1 [38.4 s] loss=1.1197 [0.0 s]
Epoch 2 [19.6 s] loss=0.7469, map=0.0661, prec@1=0.1755, prec@5=0.0869, prec@10=0.0630, recall@1=0.0252, recall@5=0.0613, recall@10=0.0885 [626.0 s]
Epoch 3 [19.7 s] loss=0.5829 [0.0 s]
Epoch 4 [19.5 s] loss=0.4982, map=0.0783, prec@1=0.1951, prec@5=0.0974, prec@10=0.0716, recall@1=0.0295, recall@5=0.0714, recall@10=0.1039 [628.0 s]
Epoch 5 [23.9 s] loss=0.4435 [0.0 s]
Epoch 6 [20.7 s] loss=0.4032, map=0.0830, prec@1=0.1894, prec@5=0.1039, prec@10=0.0767, recall@1=0.0295, recall@5=0.0772, recall@10=0.1125 [599.4 s]
Epoch 7 [60.0 s] loss=0.3728 [0.0 s]
Epoch 8 [29.7 s] loss=0.3487, map=0.0854, prec@1=0.1919, prec@5=0.1065, prec@10=0.0782, recall@1=0.0295, recall@5=0.0790, recall@10=0.1147 [588.8 s]
Epoch 9 [27.3 s] loss=0.3271 [0.0 s]
Epoch 10 [23.7 s] loss=0.3083, map=0.0888, prec@1=0.2027, prec@5=0.1109, prec@10=0.0806, recall@1=0.0316, recall@5=0.0831, recall@10=0.1180 [570.3 s]
Epoch 11 [30.1 s] loss=0.2936 [0.0 s]
Epoch 12 [25.7 s] loss=0.2786, map=0.0883, prec@1=0.1951, prec@5=0.1079, prec@10=0.0796, recall@1=0.0306, recall@5=0.0807, recall@10=0.1172 [576.2 s]
Epoch 13 [50.6 s] loss=0.2659 [0.0 s]
Epoch 14 [35.1 s] loss=0.2540, map=0.0918, prec@1=0.2033, prec@5=0.1119, prec@10=0.0825, recall@1=0.0321, recall@5=0.0841, recall@10=0.1213 [574.4 s]
Epoch 15 [25.8 s] loss=0.2427 [0.0 s]
Epoch 16 [22.4 s] loss=0.2342, map=0.0924, prec@1=0.2014, prec@5=0.1145, prec@10=0.0841, recall@1=0.0317, recall@5=0.0862, recall@10=0.1231 [575.3 s]
Epoch 17 [27.1 s] loss=0.2246 [0.0 s]
Epoch 18 [23.5 s] loss=0.2171, map=0.0914, prec@1=0.1972, prec@5=0.1119, prec@10=0.0827, recall@1=0.0314, recall@5=0.0847, recall@10=0.1222 [572.8 s]
Epoch 19 [22.5 s] loss=0.2087 [0.0 s]
Epoch 20 [21.7 s] loss=0.2027, map=0.0939, prec@1=0.2068, prec@5=0.1137, prec@10=0.0837, recall@1=0.0328, recall@5=0.0859, recall@10=0.1237 [575.0 s]
Epoch 21 [23.8 s] loss=0.1962 [0.0 s]
Epoch 22 [20.7 s] loss=0.1912, map=0.0949, prec@1=0.2043, prec@5=0.1154, prec@10=0.0856, recall@1=0.0327, recall@5=0.0876, recall@10=0.1269 [563.9 s]

Requested: a ~400-character analysis of these results.
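To study the trends in a log like this programmatically, each epoch line can be parsed with a regular expression. A minimal sketch follows; the regex, the variable names, and the four-line sample are illustrative assumptions, not taken from the original training script (the real log has 22 epochs, and the prec@k/recall@k fields can be captured the same way as `map`):

```python
import re

# Four sample lines in the same format as the log above.
log = """\
Epoch 1 [38.4 s] loss=1.1197
Epoch 2 [19.6 s] loss=0.7469, map=0.0661
Epoch 3 [19.7 s] loss=0.5829
Epoch 4 [19.5 s] loss=0.4982, map=0.0783"""

# map=... appears only on evaluation epochs, so that group is optional.
pattern = re.compile(r"Epoch (\d+) \[([\d.]+) s\] loss=([\d.]+)(?:.*?map=([\d.]+))?")

epochs, losses, maps = [], [], []
for m in pattern.finditer(log):
    epochs.append(int(m.group(1)))
    losses.append(float(m.group(3)))
    maps.append(float(m.group(4)) if m.group(4) else None)

print(epochs)  # [1, 2, 3, 4]
print(losses)  # [1.1197, 0.7469, 0.5829, 0.4982]
```

With the series extracted, the loss and mAP curves can be plotted or checked for plateaus directly, instead of reading the raw log by eye.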
Posted: 2023-06-26 15:04:53
This is the training log of a neural network. `loss` is the model's loss on the training data; `map` is mean average precision; `prec@1/5/10` and `recall@1/5/10` are the model's precision and recall when its top 1, 5, or 10 predictions are considered. Watching how these metrics evolve shows how the model is doing during training and guides tuning and optimization. In this log the metrics improve steadily (loss falls from 1.1197 to 0.1912, mAP rises from 0.0661 to 0.0949, and recall@10 from 0.0885 to 0.1269), which indicates the training is effective.
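The metrics named above can be computed per query and then averaged. A minimal sketch follows, assuming each query yields a ranked list of predicted items and a set of truly relevant items (the function and variable names are illustrative, not from the original code):

```python
def precision_recall_at_k(ranked_items, relevant, k):
    """precision@k and recall@k for one query's ranked prediction list."""
    top_k = ranked_items[:k]
    hits = sum(1 for item in top_k if item in relevant)
    precision = hits / k                                   # fraction of top-k that is relevant
    recall = hits / len(relevant) if relevant else 0.0     # fraction of relevant items found
    return precision, recall

def average_precision(ranked_items, relevant):
    """AP for one query; the log's mAP is the mean of AP over all queries."""
    hits, score = 0, 0.0
    for rank, item in enumerate(ranked_items, start=1):
        if item in relevant:
            hits += 1
            score += hits / rank  # precision at each rank where a relevant item appears
    return score / len(relevant) if relevant else 0.0

# Toy example: 10 ranked predictions, 4 truly relevant items.
ranked = [3, 7, 1, 9, 4, 2, 8, 5, 0, 6]
relevant = {7, 9, 2, 6}
print(precision_recall_at_k(ranked, relevant, 5))          # (0.4, 0.5)
print(round(average_precision(ranked, relevant), 4))       # 0.475
```

Note why precision@k falls as k grows in the log (prec@1=0.20 vs. prec@10=0.09) while recall@k rises: a longer cutoff dilutes the top ranks with non-relevant items but covers more of the relevant set.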