How do I import MultiOptimizer in torch 1.13.1?
Posted: 2023-09-21 10:03:08
`MultiOptimizer` is not a class that ships with PyTorch, so there is nothing to import from `torch` in 1.13.1 (or any other version). It is usually a small wrapper class you define yourself to drive several optimizers and learning-rate schedulers together, one per model. For example:
```
from torch.optim import AdamW  # built into torch; transformers.AdamW is deprecated
from transformers import get_linear_schedule_with_warmup


class MultiOptimizer:
    """Holds one AdamW optimizer and one warmup scheduler per model."""

    def __init__(self, models, learning_rate, warmup_steps, num_training_steps):
        self.optimizers = []
        self.schedulers = []
        for model in models:
            param_optimizer = list(model.named_parameters())
            # Biases and LayerNorm parameters are conventionally exempt from weight decay
            no_decay = ['bias', 'LayerNorm.bias', 'LayerNorm.weight']
            optimizer_grouped_parameters = [
                {'params': [p for n, p in param_optimizer
                            if not any(nd in n for nd in no_decay)],
                 'weight_decay': 0.01},  # AdamW expects 'weight_decay', not 'weight_decay_rate'
                {'params': [p for n, p in param_optimizer
                            if any(nd in n for nd in no_decay)],
                 'weight_decay': 0.0},
            ]
            optimizer = AdamW(optimizer_grouped_parameters, lr=learning_rate)
            self.optimizers.append(optimizer)
            self.schedulers.append(get_linear_schedule_with_warmup(
                optimizer, num_warmup_steps=warmup_steps,
                num_training_steps=num_training_steps))

    def step(self):
        # Advance every optimizer first, then every scheduler
        for optimizer in self.optimizers:
            optimizer.step()
        for scheduler in self.schedulers:
            scheduler.step()

    def zero_grad(self):
        for optimizer in self.optimizers:
            optimizer.zero_grad()
```
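The parameter grouping in `__init__` relies only on substring matching against parameter names. A dependency-free sketch of that filter (using hypothetical parameter names standing in for `model.named_parameters()` output) shows which tensors would receive weight decay:

```
# Hypothetical (name, tensor) pairs, mimicking model.named_parameters()
named_params = [
    ('encoder.layer.0.attention.weight', 'W_attn'),
    ('encoder.layer.0.attention.bias', 'b_attn'),
    ('encoder.layer.0.LayerNorm.weight', 'g_ln'),
    ('encoder.layer.0.LayerNorm.bias', 'b_ln'),
]
no_decay = ['bias', 'LayerNorm.bias', 'LayerNorm.weight']

# Same substring test used when building optimizer_grouped_parameters
decayed = [n for n, p in named_params if not any(nd in n for nd in no_decay)]
undecayed = [n for n, p in named_params if any(nd in n for nd in no_decay)]

print(decayed)    # only the attention weight matrix gets weight decay
print(undecayed)  # biases and LayerNorm parameters are exempt
```

Because the check is a plain substring match, any parameter whose name merely contains `bias` is exempted, which is the intended behavior for transformer-style models.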