pad_sequence() got an unexpected keyword argument 'maxlen'
The `pad_sequence()` function in PyTorch does not accept a `maxlen` argument (that keyword belongs to Keras's `pad_sequences()`). If you want to cap your sequences at a maximum length, you have to enforce it yourself, for example by slicing the result of `pad_sequence()`. Here's an example:
```python
import torch
from torch.nn.utils.rnn import pad_sequence

# Define some variable-length sequences
seqs = [torch.tensor([1, 2, 3]),
        torch.tensor([4, 5]),
        torch.tensor([6])]

# Define the maximum length
max_len = 4

# Pad to the longest sequence in the batch, then truncate to max_len
padded_seqs = pad_sequence(seqs, batch_first=True, padding_value=0)[:, :max_len]
```
In this example, `pad_sequence()` is called with `batch_first=True` so that the output has shape `(batch, seq_len)`, and the `padding_value` argument specifies the value used for padding (here, 0). The slice `[:, :max_len]` then truncates any sequence longer than `max_len`. Note that slicing only truncates: if every sequence in the batch is shorter than `max_len`, the result is still only as wide as the longest sequence, as shown in the sketch below.
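If you need the output to be exactly `max_len` wide regardless of the batch contents, you can fill the missing columns with `torch.nn.functional.pad` after truncating. Here's a minimal sketch (the helper name `pad_to_length` is my own, not part of PyTorch):
```python
import torch
import torch.nn.functional as F
from torch.nn.utils.rnn import pad_sequence

def pad_to_length(seqs, max_len, padding_value=0):
    """Pad a list of 1-D tensors to exactly max_len, truncating longer ones."""
    # Pad to the length of the longest sequence in the batch
    padded = pad_sequence(seqs, batch_first=True, padding_value=padding_value)
    # Truncate sequences that exceed max_len
    padded = padded[:, :max_len]
    # If the longest sequence was shorter than max_len, pad the remaining columns
    if padded.size(1) < max_len:
        padded = F.pad(padded, (0, max_len - padded.size(1)), value=padding_value)
    return padded

seqs = [torch.tensor([1, 2, 3]), torch.tensor([4, 5]), torch.tensor([6])]
print(pad_to_length(seqs, max_len=4))
# tensor([[1, 2, 3, 0],
#         [4, 5, 0, 0],
#         [6, 0, 0, 0]])
```
This mirrors what Keras's `maxlen` does in one call: pad short sequences and truncate long ones to a fixed width.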