Hi Dan & Hermann,
I am trying to implement a short causal conv. I am doing it this way:
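(The original snippet is not preserved in this extract. As a rough, hypothetical reconstruction of the kind of setup described, here is a causal short depthwise conv in plain PyTorch; the `padding = k - 1` choice, the output truncation, and the batch size are assumptions, picked to be consistent with the shapes in the traceback below. Presumably the fused depthwise conv from flashfftconv stands in for `nn.Conv1d` in the real code.)

```python
# Hypothetical sketch, not the poster's original code: a causal short depthwise
# conv in plain PyTorch. Symmetric padding of k - 1 followed by truncation to the
# original length is one common causal pattern.
import torch
import torch.nn as nn

B, d, L, k = 4, 768, 1024, 5  # d, L, k taken from the traceback; B is assumed

causal_conv = nn.Conv1d(
    in_channels=d,
    out_channels=d,
    kernel_size=k,
    groups=d,        # depthwise
    padding=k - 1,   # pad both sides; output length becomes L + k - 1 = 1028
)

x = torch.randn(B, d, L, requires_grad=True)
y = causal_conv(x)[..., :L]  # keep the first L positions -> causal, length 1024
```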
However, I noticed an error during the backward pass (`self.shortconv_ks=5` in this example):
```
Traceback (most recent call last):
  File "/home/Projects/imaginaire4/projects/mesh2mesh/modules/hyena.py", line 395, in <module>
    grad = torch.autograd.grad(y[:, 10, :].sum(), x)[0]
  File "/usr/local/lib/python3.10/dist-packages/torch/autograd/__init__.py", line 399, in grad
    result = Variable._execution_engine.run_backward(  # Calls into the C++ engine to run the backward pass
  File "/usr/local/lib/python3.10/dist-packages/torch/autograd/function.py", line 289, in apply
    return user_fn(self, *args)
  File "/usr/local/lib/python3.10/dist-packages/flashfftconv/depthwise_1d.py", line 20, in backward
    du, dk, dbias = conv1d_backward(dout, input, weight, bias, ctx.padding, ctx.is_bhl)
RuntimeError: Expected size for first two dimensions of batch2 tensor to be: [768, 1024] but got: [768, 1028].
```
Note that, given the padding, `[768, 1024]` is indeed the shape that is to be expected.
Any idea what might be the cause, how to solve it, or how to properly implement a causal short 1D conv with FlashFFTConv?
Thank you in advance! :)
Best,
David
I do not really remember, to be honest. If I recall correctly, the model handles the padding automatically, which means it should be set to zero (or left at the default). I think the output shapes match with that.
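(For context on the shape mismatch: for a 1D convolution the output length is `L_out = L + 2*padding - k + 1`, and the `depthwise_1d` backward in the traceback appears to expect the output to have the same length as the input (1024), whereas `padding = k - 1` produces 1028. Below is a quick plain-PyTorch check of that arithmetic; it is only an illustration with `nn.Conv1d` and does not restate what padding the flashfftconv depthwise conv actually defaults to.)

```python
# Plain-PyTorch illustration of how the padding choice changes the output length
# for a depthwise conv with the sizes from the traceback (d=768, L=1024, k=5).
import torch
import torch.nn as nn

d, L, k = 768, 1024, 5

for padding in (k - 1, k // 2, 0):
    conv = nn.Conv1d(d, d, k, groups=d, padding=padding)
    out = conv(torch.randn(1, d, L))
    print(padding, out.shape[-1])  # L_out = L + 2*padding - k + 1

# padding = 4 -> 1028  (the "but got" shape in the RuntimeError)
# padding = 2 -> 1024  (matches the length the backward expects)
# padding = 0 -> 1020
```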