Hi, I fixed it by editing network_swinir.py:
1. Replace

```python
B = int(windows.shape[0] / (H * W / window_size / window_size))
x = windows.view(B, H // window_size, W // window_size, window_size, window_size, -1)
x = x.permute(0, 1, 3, 2, 4, 5).contiguous().view(B, H, W, -1)
return x
```

with

```python
B = windows.shape[0] / (H * W / window_size / window_size)
x = windows.view(B.numel(), H // window_size, W // window_size, window_size, window_size, -1)
x = x.permute(0, 1, 3, 2, 4, 5).contiguous().view(B.numel(), H, W, -1)
return x
```
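The root cause is that `int(...)` converts a traced shape into a Python constant, so the export-time batch size gets baked into the ONNX graph. A commonly used alternative to the edit above is to let `view` infer the batch dimension with `-1`, which keeps the op shape-agnostic. A minimal sketch (the function mirrors the shape logic of SwinIR's `window_reverse`; the concrete sizes below are illustrative, not from the issue):

```python
import torch

def window_reverse(windows, window_size, H, W):
    # Merge per-window tensors (num_windows*B, ws, ws, C) back into (B, H, W, C).
    # Using -1 for the batch dim avoids computing B as a Python int, so no
    # constant is baked into a traced/exported graph.
    C = windows.shape[-1]
    x = windows.view(-1, H // window_size, W // window_size,
                     window_size, window_size, C)
    x = x.permute(0, 1, 3, 2, 4, 5).contiguous().view(-1, H, W, C)
    return x

# 2 images of 64x64 with window_size=8 -> 64 windows each -> 128 windows total
windows = torch.randn(128, 8, 8, 6)
out = window_reverse(windows, 8, 64, 64)
print(out.shape)  # torch.Size([2, 64, 64, 6])
```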
2. Replace

```python
# W-MSA/SW-MSA (to be compatible for testing on images whose shapes are the multiple of window size)
if self.input_resolution == x_size:
    attn_windows = self.attn(x_windows, mask=self.attn_mask)  # nW*B, window_size*window_size, C
else:
    attn_windows = self.attn(x_windows, mask=self.calculate_mask(x_size).to(x.device))
```

with

```python
# W-MSA/SW-MSA (to be compatible for testing on images whose shapes are the multiple of window size)
attn_windows = self.attn(x_windows, mask=self.calculate_mask(x_size).to(x.device))
```
Then redo the PyTorch-to-ONNX export; dynamic input shapes should work when running inference with the ONNX model. Hope it helps.
After exporting the model to ONNX, onnxruntime fails to run inference on inputs whose shape differs from the one used to export the model.
To reproduce:
error message:
```
[E:onnxruntime:, sequential_executor.cc:514 ExecuteKernel] Non-zero status code returned while running Reshape node. Name:'/layers.0/residual_group/blocks.1/attn/Reshape_1' Status Message: /onnxruntime_src/onnxruntime/core/providers/cpu/tensor/reshape_helper.h:40 onnxruntime::ReshapeHelper::ReshapeHelper(const onnxruntime::TensorShape&, onnxruntime::TensorShapeVector&, bool) gsl::narrow_cast<int64_t>(input_shape.Size()) == size was false. The input tensor cannot be reshaped to the requested shape. Input shape:{88,6,64,64}, requested shape:{1,64,6,64,64}
```
environment: pytorch==1.13.1, onnxruntime==1.15.1, onnx==1.12.0