```python
def _break_up_pc(self, pc):
    xyz = pc[..., 0:3].contiguous()
    features = (
        pc[..., 3:].transpose(1, 2).contiguous()
        if pc.size(-1) > 3 else None
    )
    return xyz, features

def forward(self, pointcloud: torch.cuda.FloatTensor):
    xyz, features = self._break_up_pc(pointcloud)

    l_xyz, l_features = [xyz], [features]
    for i in range(len(self.SA_modules)):
        li_xyz, li_features = self.SA_modules[i](l_xyz[i], l_features[i])
        l_xyz.append(li_xyz)
        l_features.append(li_features)
```

I clearly pass only two arguments, `l_xyz[i]` and `l_features[i]`, when calling `SA_modules[i]`. Why do I get this error?

```
File "/root/autodl-tmp/project/tools/../lib/net/pointnet2_msg.py", line 154, in forward
    li_xyz, li_features = self.SA_modules[i](l_xyz[i], l_features[i])
File "/root/miniconda3/lib/python3.8/site-packages/torch/nn/modules/module.py", line 727, in _call_impl
    result = self.forward(*input, **kwargs)
TypeError: forward() takes 2 positional arguments but 3 were given
```
Posted: 2023-07-15 10:13:32 · Views: 134
The error message means that `forward()` actually received three positional arguments, not two. In Python, the bound instance counts as a positional argument: when `nn.Module._call_impl` dispatches `self.forward(*input, **kwargs)`, your call becomes `forward(self, l_xyz[i], l_features[i])` — three arguments in total. "takes 2 positional arguments" therefore tells you that the `forward()` of the module stored in `SA_modules` is defined with only one parameter besides `self`, e.g. `def forward(self, xyz)`. Check the class of the modules you put into `SA_modules` and give its `forward()` a second (ideally optional) `features` parameter. Alternatively, if `l_features[i]` is `None`, you can pass only `l_xyz[i]`.
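A minimal sketch reproducing the mismatch and the fix. `BrokenSAModule` and `FixedSAModule` are hypothetical stand-ins for whatever class is actually stored in `SA_modules` (the real PointNet++ SA module does real sampling/grouping; here the bodies are placeholders):

```python
import torch
import torch.nn as nn

class BrokenSAModule(nn.Module):
    # forward takes only xyz besides self, so calling module(xyz, features)
    # dispatches forward(self, xyz, features) -> 3 positional args given
    def forward(self, xyz):
        return xyz, None

class FixedSAModule(nn.Module):
    # accept the (possibly None) features tensor as a second parameter
    def forward(self, xyz, features=None):
        new_xyz = xyz            # placeholder for sampling/grouping
        new_features = features  # placeholder for per-point feature pooling
        return new_xyz, new_features

xyz = torch.rand(2, 128, 3)       # (B, N, 3) point coordinates
features = torch.rand(2, 6, 128)  # (B, C, N) extra per-point features

try:
    BrokenSAModule()(xyz, features)
except TypeError as e:
    print(e)  # forward() takes 2 positional arguments but 3 were given

new_xyz, new_features = FixedSAModule()(xyz, features)
```

Making `features` default to `None` keeps the call site `self.SA_modules[i](l_xyz[i], l_features[i])` valid both for the first layer (where `l_features[0]` may be `None` if the cloud has no extra channels) and for later layers.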