(1) Use the Sequential class to build a multilayer perceptron with a single hidden layer: the input has 5 nodes, the hidden layer outputs 10 nodes with a ReLU activation, and the output layer has 1 node; (2) initialize all weights as Gaussian random variables with standard deviation 0.1, and set the bias parameters to 0; (3) then re-initialize all weights to random numbers; (4) access the current weights of each layer and print them.
Here is a code implementation:
```python
import tensorflow as tf
# Build the model: 5 inputs -> Dense(10, ReLU) -> Dense(1)
model = tf.keras.Sequential([
    tf.keras.layers.Dense(10, activation='relu', input_shape=(5,)),
    tf.keras.layers.Dense(1, activation=None)
])

# Create the variables (optional here: passing input_shape to the first layer
# already builds the model), then initialize the weights from a Gaussian with
# stddev 0.1 and set the biases to 0
model.build()
for layer in model.layers:
    layer.kernel.assign(tf.random.normal(shape=layer.kernel.shape, stddev=0.1))
    layer.bias.assign(tf.zeros(shape=layer.bias.shape))

# Re-initialize all weights (and biases) with random numbers
for layer in model.layers:
    layer.kernel.assign(tf.random.normal(shape=layer.kernel.shape))
    layer.bias.assign(tf.random.normal(shape=layer.bias.shape))

# Print the weights of each layer
for i, layer in enumerate(model.layers):
    print(f'Layer {i+1} weights: {layer.kernel.numpy()}')
```
The output looks like the following (the exact values will differ, since the weights are random):
```
Layer 1 weights: [[-0.82739973 -0.22300278 -0.06361455 -0.08001348 -0.43104798 -0.35633057 -0.28472397 -0.36076325 -0.13374843 -0.301946 ]
[ 0.36279508 0.04574547 0.00451807 -0.41656652 -0.08029188 -0.04022505 -0.14344898 -0.5470057 -0.00725754 0.13976447]
[-0.39154255 0.00664707 0.05755947 -0.2173245 -0.07895101 -0.19095698 -0.23719023 0.14332367 -0.24465872 -0.08144979]
[-0.16912833 0.12903008 0.05319814 -0.03778693 -0.36169696 -0.0949882 -0.20283875 -0.06129897 0.25281808 -0.00886572]
[-0.06081725 -0.1490145 0.22994742 -0.08046582 0.08069289 0.07472831 0.10460652 0.03614265 0.1381255 -0.1180113 ]]
Layer 2 weights: [[-0.2200344 ]
[ 0.35842368]
[-0.5438103 ]
[ 0.47617692]
[-0.350791 ]
[-0.3845467 ]
[ 0.88350683]
[-0.32671356]
[-0.05020124]
[-0.4781108 ]]
```
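As a side note, the same initialization scheme can also be declared when the layers are created, instead of assigning values after `build()`. The minimal sketch below assumes TF 2.x and uses the built-in `RandomNormal` and `Zeros` initializers; it is an alternative to the approach above, not part of the original answer.

```python
import tensorflow as tf

# Sketch: specify the Gaussian (stddev=0.1) weight init and zero bias init
# directly on the layers via kernel_initializer / bias_initializer
init_w = tf.keras.initializers.RandomNormal(stddev=0.1)
model = tf.keras.Sequential([
    tf.keras.layers.Dense(10, activation='relu', input_shape=(5,),
                          kernel_initializer=init_w, bias_initializer='zeros'),
    tf.keras.layers.Dense(1, activation=None,
                          kernel_initializer=init_w, bias_initializer='zeros')
])

# get_weights() returns [kernel, bias] for each Dense layer
for i, layer in enumerate(model.layers):
    kernel, bias = layer.get_weights()
    print(f'Layer {i+1} kernel shape: {kernel.shape}, bias shape: {bias.shape}')
```

This keeps the initialization with the layer definition, which is convenient when the weights only need to be drawn once at construction time.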