Anomaly Transformer program
Anomaly Transformer is a Transformer-based anomaly detection model that uses an adaptive and adversarial training procedure. Its architecture allows fast training and inference and can handle long input sequences. Compared with a plain Transformer-based encoder-decoder network, the adversarial training procedure mitigates the weaknesses of relying on reconstruction error alone, so anomalies are detected more reliably. [2]
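To make the reconstruction-error idea concrete, here is a minimal, generic sketch of scoring a multivariate time-series window with a plain Transformer encoder. This is only an illustration under assumed settings; the class `TinyReconstructor` and all hyperparameters are invented here and are not the actual Anomaly Transformer or TranAD architecture.

```python
import torch
import torch.nn as nn

class TinyReconstructor(nn.Module):
    """Toy Transformer-encoder reconstructor (illustrative only)."""
    def __init__(self, n_features: int, d_model: int = 64):
        super().__init__()
        self.proj_in = nn.Linear(n_features, d_model)
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.proj_out = nn.Linear(d_model, n_features)

    def forward(self, x):  # x: (batch, time, features)
        return self.proj_out(self.encoder(self.proj_in(x)))

model = TinyReconstructor(n_features=8)
window = torch.randn(1, 100, 8)   # one sliding window of a multivariate series
recon = model(window)

# Point-wise reconstruction error as a simple anomaly score:
# time points the model cannot reconstruct well get high scores.
score = (window - recon).pow(2).mean(dim=-1)  # shape: (batch, time)
print(score.shape)
```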
In the Anomaly Transformer code discussed in [3], an encoder named EncoderAtt is used. To run on a chosen device, the original .cuda() calls are replaced with .to(device), which moves the model onto that device for training and inference. Concretely, the line self.encoder = EncoderAtt(input_size=self.X.shape[1], hidden_size=encoder_hidden_size, T=T).to(device) moves the EncoderAtt model to the specified device; a generic sketch of this pattern follows below. [3]
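A rough sketch of this device-agnostic pattern, assuming a standard PyTorch setup; the nn.Linear module here is just a stand-in for EncoderAtt, not the repository's actual code:

```python
import torch

# Pick the GPU when available, otherwise fall back to CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Any nn.Module can be moved with .to(device); in the referenced
# code this would be EncoderAtt(...).to(device).
encoder = torch.nn.Linear(38, 64).to(device)

# Inputs must live on the same device as the model.
x = torch.randn(32, 38).to(device)
out = encoder(x)
```

Unlike .cuda(), which fails on machines without a GPU, .to(device) lets the same script run on either CPU or GPU, which is exactly what the CPU reproduction in [3] relies on.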
The Anomaly Transformer code is available on GitHub: GitHub - thuml/Anomaly-Transformer: Code release for "Anomaly Transformer: Time Series Anomaly Detection with Association Discrepancy" (ICLR 2022 Spotlight) [1]. The repository contains further implementation details and usage instructions.
#### References
- *1* *3* [Anomaly-Transformer (ICLR 2022) code reproduced on CPU](https://blog.csdn.net/weixin_44385635/article/details/130146282)
- *2* [TranAD: Deep Transformer Networks for Anomaly Detection in Multivariate Time Series Data](https://blog.csdn.net/zj_18706809267/article/details/125059124)