"实验一: 俞哲轩的网络层神经元工具,学习率与批处理大小对Sin运算的影响"

Updated 2024-04-17 · PDF, 2.09 MB
In Lab 1 of the course, 18302010018-俞哲轩 explored how NetworkLayerNeuronUtil can be used to implement a neural network, in this case one trained to approximate the sin function. The student experimented with several parameters: the learning rate, the batch size, the activation function (Tanh), and the number of neurons in the hidden layer. Through systematic testing and analysis, it was found that a smaller learning rate combined with a larger batch size produced more stable training, and that the Tanh activation function outperformed the alternatives tried. The student also studied how the number of hidden-layer neurons affects the network's accuracy, finding that too many neurons can lead to overfitting. Overall, the lab exercise offered valuable insight into designing and training neural networks, underscoring the importance of parameter tuning and careful experimentation.
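The report itself is not reproduced here, so as an illustration only, the setup it describes can be sketched as a small NumPy MLP with a Tanh hidden layer, trained by mini-batch gradient descent to fit sin(x). The function name `train_sin_mlp` and all hyperparameter defaults below are hypothetical stand-ins, not values taken from the lab; the learning rate, batch size, and hidden-layer width correspond to the knobs the student varied.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical re-creation of the lab setup: a 1-hidden-layer MLP with
# Tanh activation, trained with mini-batch gradient descent on MSE loss
# to approximate sin(x) on [-pi, pi]. Defaults are illustrative, not
# the values from the report.
def train_sin_mlp(hidden=16, lr=0.05, batch_size=32, epochs=2000):
    X = rng.uniform(-np.pi, np.pi, size=(512, 1))
    y = np.sin(X)
    W1 = rng.normal(0.0, 0.5, (1, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0.0, 0.5, (hidden, 1)); b2 = np.zeros(1)
    for _ in range(epochs):
        idx = rng.permutation(len(X))
        for s in range(0, len(X), batch_size):
            xb, yb = X[idx[s:s + batch_size]], y[idx[s:s + batch_size]]
            h = np.tanh(xb @ W1 + b1)              # hidden layer (Tanh)
            out = h @ W2 + b2                      # linear output layer
            grad_out = 2.0 * (out - yb) / len(xb)  # d(MSE)/d(out)
            grad_W2 = h.T @ grad_out
            grad_b2 = grad_out.sum(0)
            grad_h = grad_out @ W2.T * (1.0 - h ** 2)  # tanh'(z) = 1 - tanh(z)^2
            grad_W1 = xb.T @ grad_h
            grad_b1 = grad_h.sum(0)
            W2 -= lr * grad_W2; b2 -= lr * grad_b2  # gradient-descent step
            W1 -= lr * grad_W1; b1 -= lr * grad_b1
    # Report the final mean-squared error on the training points.
    h = np.tanh(X @ W1 + b1)
    return float(np.mean((h @ W2 + b2 - y) ** 2))

mse = train_sin_mlp()
print(f"final MSE: {mse:.4f}")
```

Varying `lr`, `batch_size`, and `hidden` in calls to `train_sin_mlp` reproduces the kind of comparison the report describes: a very large learning rate makes the loss oscillate or diverge, while a larger batch size smooths the gradient estimates and stabilizes training.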