Q: Write code to add the CBAM attention mechanism to SinGAN.
Posted: 2023-03-14 21:11:12
A: A CBAM block applies channel attention (a shared MLP over globally average- and max-pooled features) followed by spatial attention (a 7x7 convolution over channel-wise average and max maps), multiplying the input feature map by both masks so its shape is preserved. A Keras implementation that can wrap the feature maps at each SinGAN scale:

import tensorflow as tf
from tensorflow.keras.layers import (Activation, Add, Concatenate, Conv2D,
    Dense, GlobalAveragePooling2D, GlobalMaxPooling2D, Lambda, Multiply, Reshape)

def CBAM_Block(x, ratio=8):
    channels = x.shape[-1]
    # Channel attention: shared MLP over global average- and max-pooled features
    dense1 = Dense(channels // ratio, activation='relu')
    dense2 = Dense(channels)
    avg = dense2(dense1(GlobalAveragePooling2D()(x)))
    mx = dense2(dense1(GlobalMaxPooling2D()(x)))
    ca = Reshape((1, 1, channels))(Activation('sigmoid')(Add()([avg, mx])))
    x = Multiply()([x, ca])
    # Spatial attention: 7x7 conv over channel-wise average- and max-pooled maps
    avg_map = Lambda(lambda t: tf.reduce_mean(t, axis=-1, keepdims=True))(x)
    max_map = Lambda(lambda t: tf.reduce_max(t, axis=-1, keepdims=True))(x)
    sa = Conv2D(1, kernel_size=7, padding='same', activation='sigmoid')(
        Concatenate()([avg_map, max_map]))
    return Multiply()([x, sa])

Because the output has the same shape as the input, the block can be inserted between any two convolutional layers of SinGAN's per-scale generator without other changes.
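Note that the official SinGAN implementation is written in PyTorch rather than Keras, so a PyTorch version of the same block may be more directly usable. The sketch below is one such equivalent; the module name and the suggestion of where to place it (after each scale's conv blocks) are illustrative, not taken from the SinGAN repository:

```python
import torch
import torch.nn as nn

class CBAM(nn.Module):
    """Channel attention followed by spatial attention; shape-preserving."""
    def __init__(self, channels, ratio=8, kernel_size=7):
        super().__init__()
        # Shared MLP (as 1x1 convs) applied to avg- and max-pooled descriptors
        self.mlp = nn.Sequential(
            nn.Conv2d(channels, channels // ratio, 1, bias=False),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // ratio, channels, 1, bias=False),
        )
        # 7x7 conv over the 2-channel [avg, max] spatial descriptor
        self.spatial = nn.Conv2d(2, 1, kernel_size,
                                 padding=kernel_size // 2, bias=False)

    def forward(self, x):
        # Channel attention mask, broadcast over H and W
        ca = torch.sigmoid(self.mlp(x.mean((2, 3), keepdim=True)) +
                           self.mlp(x.amax((2, 3), keepdim=True)))
        x = x * ca
        # Spatial attention mask, broadcast over channels
        sa = torch.sigmoid(self.spatial(torch.cat(
            [x.mean(1, keepdim=True), x.amax(1, keepdim=True)], dim=1)))
        return x * sa
```

In a SinGAN-style generator one would typically interleave this module with the existing conv-BN-LeakyReLU blocks at each scale, e.g. `nn.Sequential(conv_block, CBAM(num_features), ...)`.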