```python
import tensorflow as tf

model = tf.keras.Sequential()
# First LSTM layer: 80 units, ReLU activation, returns the full sequence
# so the next LSTM layer can consume it; input is 10 time steps x 14 features
model.add(tf.keras.layers.LSTM(80, activation='relu', return_sequences=True, input_shape=(10, 14)))
model.add(tf.keras.layers.Dropout(0.2))
# Second LSTM layer: returns only the final hidden state
model.add(tf.keras.layers.LSTM(80, activation='relu'))
model.add(tf.keras.layers.Dense(80))
# Output layer: 28 values
model.add(tf.keras.layers.Dense(28))
model.compile(metrics=['accuracy'], loss='mean_squared_error', optimizer='adam')
model.summary()
```
This is a sequential model with two LSTM layers. The first LSTM layer has 80 units, uses ReLU activation, and returns the full sequence; its input shape is (10, 14), i.e. 10 time steps with 14 features each. A Dropout layer with rate 0.2 follows it. The second LSTM layer also has 80 units with ReLU activation and returns only its final output. These are followed by a fully connected layer with 80 units and an output layer with 28 units. The model is compiled with mean squared error as the loss and Adam as the optimizer, and it reports accuracy as an additional metric (note that accuracy is generally not meaningful for a regression loss such as MSE).
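As a quick sanity check of the input and output shapes, one could fit the compiled model on random dummy data before using real inputs. The sketch below assumes the `model` from the question has already been built; the array names, sample count, and epoch/batch settings are illustrative only:

```python
import numpy as np

# Hypothetical dummy data: 32 samples, each a sequence of 10 time steps
# with 14 features, and a 28-dimensional regression target per sample.
x_dummy = np.random.rand(32, 10, 14).astype("float32")
y_dummy = np.random.rand(32, 28).astype("float32")

model.fit(x_dummy, y_dummy, epochs=2, batch_size=8)
print(model.predict(x_dummy).shape)  # (32, 28)
```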
The model summary is as follows:
```
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
lstm (LSTM)                  (None, 10, 80)            30400
_________________________________________________________________
dropout (Dropout)            (None, 10, 80)            0
_________________________________________________________________
lstm_1 (LSTM)                (None, 80)                51520
_________________________________________________________________
dense (Dense)                (None, 80)                6480
_________________________________________________________________
dense_1 (Dense)              (None, 28)                2268
=================================================================
Total params: 90,668
Trainable params: 90,668
Non-trainable params: 0
_________________________________________________________________
```
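For reference, the parameter counts above can be reproduced by hand: an LSTM layer has 4 * (units * (input_dim + units) + units) parameters (four gates, each with an input kernel, a recurrent kernel, and a bias), and a Dense layer has input_dim * units + units. A minimal sketch in plain Python (the helper names are illustrative):

```python
def lstm_params(input_dim, units):
    # 4 gates, each with an input kernel, a recurrent kernel, and a bias vector
    return 4 * (units * (input_dim + units) + units)

def dense_params(input_dim, units):
    return input_dim * units + units

print(lstm_params(14, 80))    # 30400  -> lstm
print(lstm_params(80, 80))    # 51520  -> lstm_1
print(dense_params(80, 80))   # 6480   -> dense
print(dense_params(80, 28))   # 2268   -> dense_1
print(30400 + 51520 + 6480 + 2268)  # 90668 total parameters
```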