Epoch, Iteration, Batch Size
An epoch is one complete pass of the entire training dataset through the neural network; the epoch count is the number of such passes performed during training.
An iteration is one update step, i.e. one batch of data processed by the model. The number of iterations per epoch is the dataset size divided by the batch size, rounded to an integer. For example, with a batch size of 32 and 1000 training samples, each epoch has ceil(1000/32) = 32 iterations (the final batch contains only 8 samples), or 31 iterations if the incomplete final batch is dropped.
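The arithmetic above can be checked directly; this small snippet (plain Python, using the sample counts from the example) shows both conventions for handling the final partial batch:

```python
import math

num_samples = 1000
batch_size = 32

# Keep the final, smaller batch (the common default):
iters_keep_last = math.ceil(num_samples / batch_size)  # 32; last batch has 1000 - 31*32 = 8 samples
# Drop the incomplete batch (e.g. drop_last=True in PyTorch's DataLoader):
iters_drop_last = num_samples // batch_size            # 31

print(iters_keep_last, iters_drop_last)  # 32 31
```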
Batch size is the number of training samples presented to the network in a single iteration. Larger batch sizes give smoother gradient estimates and make better use of hardware parallelism, which can speed up training, but they require more memory. Smaller batch sizes introduce gradient noise that can help the model generalize, though each epoch then takes more iterations.
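To see all three concepts together, here is a minimal training-loop sketch in PyTorch; the toy dataset (1000 samples, 10 features), the linear model, and the hyperparameters are assumptions chosen purely for illustration:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Hypothetical toy dataset: 1000 samples, 10 features, binary labels.
X = torch.randn(1000, 10)
y = torch.randint(0, 2, (1000,))
dataset = TensorDataset(X, y)

BATCH_SIZE = 32
# shuffle=True reshuffles each epoch; the final partial batch (8 samples) is kept.
loader = DataLoader(dataset, batch_size=BATCH_SIZE, shuffle=True)

model = torch.nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = torch.nn.CrossEntropyLoss()

EPOCHS = 3
for epoch in range(EPOCHS):                # one epoch = one full pass over the dataset
    for iteration, (xb, yb) in enumerate(loader):  # one iteration = one batch
        optimizer.zero_grad()
        loss = loss_fn(model(xb), yb)
        loss.backward()
        optimizer.step()
    # 1000 samples / batch size 32 -> 32 iterations per epoch
    print(f"epoch {epoch}: {iteration + 1} iterations")
```

Passing drop_last=True to the DataLoader would yield 31 iterations per epoch instead, discarding the 8 leftover samples each pass.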