Turn the following MyBatis mapper fragment into a plain SQL statement:

SELECT scm.english_name as englishName, scm.city_simplicity as citySimplicity, scm.initials, scm.city_name as city, scm.city_three as cityCode
FROM sd_airport_new san
INNER JOIN sys_city_management scm ON scm.city_three = san.city_code
WHERE san.del_flag=0
<if test="hotHave !=null and hotHave != ''"> and scm.is_hot = 1 </if>
GROUP BY scm.city_three
This is already essentially a SQL statement; it just contains MyBatis dynamic SQL tags. With the dynamic tag removed (taking the branch where hotHave is non-empty, so the is_hot condition applies), it simplifies to:
SELECT scm.english_name as englishName, scm.city_simplicity as citySimplicity, scm.initials, scm.city_name as city, scm.city_three as cityCode
FROM sd_airport_new san
INNER JOIN sys_city_management scm ON scm.city_three = san.city_code
WHERE san.del_flag=0
AND scm.is_hot = 1
GROUP BY scm.city_three
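Here the GROUP BY scm.city_three simply collapses duplicate rows produced by the join, since one city can match several airports. For completeness, the `<if test="...">` tag means the is_hot filter is appended only when the hotHave parameter is non-null and non-empty; when it is absent, the other branch of the same mapper fragment generates:
```
SELECT scm.english_name as englishName, scm.city_simplicity as citySimplicity, scm.initials, scm.city_name as city, scm.city_three as cityCode
FROM sd_airport_new san
INNER JOIN sys_city_management scm ON scm.city_three = san.city_code
WHERE san.del_flag=0
GROUP BY scm.city_three
```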
Related question
import numpy as np
import tensorflow as tf
from tensorflow import keras
import matplotlib.pyplot as plt

## Let us define a plotting function for simplicity
def plt_loss(x, training_metric, testing_metric, ax, colors=['b']):
    ax.plot(x, training_metric, 'b', label='Train')
    ax.plot(x, testing_metric, 'k', label='Test')
    ax.set_xlabel('Epochs')
    ax.set_ylabel('Accuracy')  # ax.set_ylabel('Categorical Crossentropy Loss')
    plt.legend()
    plt.grid()
    plt.show()

tf.keras.utils.set_random_seed(1)

## We import the MNIST dataset using keras.datasets
(train_data, train_labels), (test_data, test_labels) = keras.datasets.mnist.load_data()

## We first vectorize each image (28*28) into a vector (784)
train_data = train_data.reshape(train_data.shape[0], train_data.shape[1]*train_data.shape[2])  # 60000*784
test_data = test_data.reshape(test_data.shape[0], test_data.shape[1]*test_data.shape[2])  # 10000*784

## We next change each label to a 10-dimensional one-hot vector, e.g., 1 -> [0,1,0,0,0,0,0,0,0,0]
train_labels = keras.utils.to_categorical(train_labels, 10)
test_labels = keras.utils.to_categorical(test_labels, 10)

## Start to build an MLP model
N_batch_size = 5000
N_epochs = 100
lr = 0.01

## We build a model with two hidden layers, 784 -> 128 -> 64 -> 10
MLP_4 = keras.models.Sequential([
    keras.layers.Dense(128, input_shape=(784,), activation='relu'),
    keras.layers.Dense(64, activation='relu'),
    keras.layers.Dense(10, activation='softmax')
])

MLP_4.compile(
    optimizer=keras.optimizers.Adam(lr),
    loss='categorical_crossentropy',
    metrics=['accuracy']
)

History = MLP_4.fit(train_data[:10000], train_labels[:10000],
                    batch_size=N_batch_size, epochs=N_epochs,
                    validation_data=(test_data, test_labels), shuffle=False)

train_acc = History.history['accuracy']
test_acc = History.history['val_accuracy']

Add early stopping to this model, configured with monitor='loss' and patience=2.
Sure, here is the updated code with early stopping:
```
import numpy as np
import tensorflow as tf
from tensorflow import keras
import matplotlib.pyplot as plt

## Let us define a plotting function for simplicity
def plt_loss(x, training_metric, testing_metric, ax, colors=['b']):
    ax.plot(x, training_metric, 'b', label='Train')
    ax.plot(x, testing_metric, 'k', label='Test')
    ax.set_xlabel('Epochs')
    ax.set_ylabel('Accuracy')  # ax.set_ylabel('Categorical Crossentropy Loss')
    plt.legend()
    plt.grid()
    plt.show()

tf.keras.utils.set_random_seed(1)

## We import the MNIST dataset using keras.datasets
(train_data, train_labels), (test_data, test_labels) = keras.datasets.mnist.load_data()

## We first vectorize each image (28*28) into a vector (784)
train_data = train_data.reshape(train_data.shape[0], train_data.shape[1]*train_data.shape[2])  # 60000*784
test_data = test_data.reshape(test_data.shape[0], test_data.shape[1]*test_data.shape[2])  # 10000*784

## We next change each label to a 10-dimensional one-hot vector, e.g., 1 -> [0,1,0,0,0,0,0,0,0,0]
train_labels = keras.utils.to_categorical(train_labels, 10)
test_labels = keras.utils.to_categorical(test_labels, 10)

## Start to build an MLP model
N_batch_size = 5000
N_epochs = 100
lr = 0.01

## We build a model with two hidden layers, 784 -> 128 -> 64 -> 10
MLP_4 = keras.models.Sequential([
    keras.layers.Dense(128, input_shape=(784,), activation='relu'),
    keras.layers.Dense(64, activation='relu'),
    keras.layers.Dense(10, activation='softmax')
])

MLP_4.compile(
    optimizer=keras.optimizers.Adam(lr),
    loss='categorical_crossentropy',
    metrics=['accuracy']
)

## Add early stopping: halt training when the monitored loss stops improving
early_stop = tf.keras.callbacks.EarlyStopping(monitor='loss', patience=2)

History = MLP_4.fit(train_data[:10000], train_labels[:10000],
                    batch_size=N_batch_size,
                    epochs=N_epochs,
                    validation_data=(test_data, test_labels),
                    shuffle=False,
                    callbacks=[early_stop])

train_acc = History.history['accuracy']
test_acc = History.history['val_accuracy']
```
In this code, we added the early stopping callback by creating an instance of `tf.keras.callbacks.EarlyStopping` and passing it to the `callbacks` argument of the `fit` method. With `monitor='loss'` and `patience=2`, training stops once the training loss (not the validation loss) has failed to improve for 2 consecutive epochs.
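If you would rather stop on validation performance, a common variant monitors `val_loss` and restores the model's best weights. This is a minimal alternative sketch, not what the question asked for:
```
## Alternative configuration: watch the validation loss instead of the
## training loss, and roll the model back to its best epoch on stop.
early_stop = tf.keras.callbacks.EarlyStopping(
    monitor='val_loss',          # requires validation_data to be passed to fit()
    patience=2,                  # tolerate 2 epochs without improvement
    restore_best_weights=True    # reload weights from the best-scoring epoch
)
```
Since `validation_data=(test_data, test_labels)` is already passed to `fit`, `val_loss` is available to the callback.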
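As a follow-up, the recorded accuracies can be visualized with the `plt_loss` helper defined at the top; a minimal sketch, assuming the code above has already run:
```
## Plot per-epoch train/test accuracy; len(train_acc) reflects the epoch
## at which early stopping actually halted training.
fig, ax = plt.subplots()
epochs = range(1, len(train_acc) + 1)
plt_loss(epochs, train_acc, test_acc, ax)
```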