I have the code below, which concatenates zero-filled tensors onto the input tensor before a conv layer. The original corresponding PyTorch code is: x = torch.nn.functional.pad(x, pad, mode="constant", value=0).
In AIT, assume the input tensor shape is (N, H, W, Cin),
Conversion code 1, pad H first followed by W:
T1 = ops.full()(shape=[N, 1, W, Cin], fill_value=0)      # zero row to append along H
R = ops.concatenate()(inputs=[x, T1], dim=1)             # (N, H+1, W, Cin)
T2 = ops.full()(shape=[N, H + 1, 1, Cin], fill_value=0)  # zero column to append along W
x = ops.concatenate()(inputs=[R, T2], dim=2)             # (N, H+1, W+1, Cin)
and this will give different results than
Conversion code 2, pad W first followed by H:
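The second block was omitted above; by symmetry with conversion code 1 it presumably looks like the following (a reconstruction, not taken from the original post; T3, S, and T4 are placeholder names I introduced):

```python
# Hypothetical mirror of conversion code 1, padding the W axis first:
T3 = ops.full()(shape=[N, H, 1, Cin], fill_value=0)      # zero column to append along W
S = ops.concatenate()(inputs=[x, T3], dim=2)             # (N, H, W+1, Cin)
T4 = ops.full()(shape=[N, 1, W + 1, Cin], fill_value=0)  # zero row to append along H
x = ops.concatenate()(inputs=[S, T4], dim=1)             # (N, H+1, W+1, Cin)
```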
This code runs many times during inference. I find that with the 1st block (H-first concat), the converted model only works when H > W; the 2nd conversion (W-first concat) only works when W > H. Square inputs work with either approach. Example (H, W) dimensions to try: (896, 1152), (1152, 896), (1024, 1024).
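To confirm the two concat orders are mathematically equivalent (so any mismatch must come from the AIT conversion/runtime, not the math), here is a minimal NumPy sketch of both orders checked against a plain constant pad. Shapes are scaled-down stand-ins for the real (896, 1152)-style inputs:

```python
import numpy as np

N, Cin = 2, 3
for H, W in [(7, 9), (9, 7), (8, 8)]:  # H>W, W>H, and square cases
    x = np.random.rand(N, H, W, Cin).astype(np.float32)

    # Conversion 1: pad H first, then W.
    t1 = np.zeros((N, 1, W, Cin), dtype=x.dtype)
    r = np.concatenate([x, t1], axis=1)            # (N, H+1, W, Cin)
    t2 = np.zeros((N, H + 1, 1, Cin), dtype=x.dtype)
    out1 = np.concatenate([r, t2], axis=2)         # (N, H+1, W+1, Cin)

    # Conversion 2: pad W first, then H.
    t3 = np.zeros((N, H, 1, Cin), dtype=x.dtype)
    s = np.concatenate([x, t3], axis=2)            # (N, H, W+1, Cin)
    t4 = np.zeros((N, 1, W + 1, Cin), dtype=x.dtype)
    out2 = np.concatenate([s, t4], axis=1)         # (N, H+1, W+1, Cin)

    # Reference: constant pad of one bottom row and one right column.
    ref = np.pad(x, ((0, 0), (0, 1), (0, 1), (0, 0)), mode="constant")

    assert out1.shape == out2.shape == (N, H + 1, W + 1, Cin)
    assert np.array_equal(out1, out2) and np.array_equal(out1, ref)
```

Both orders agree with np.pad for every shape, including non-square ones, so the H>W / W>H dependence reported above points at the AIT concatenate path rather than the padding logic itself.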