
view size is not compatible with input tensor's size and stride (at least one dimension spans across two contiguous subspaces). Use .reshape(...) instead. #86

Open
ghost opened this issue Jan 20, 2022 · 2 comments


ghost commented Jan 20, 2022

While trying to run the code, I came across an error when testing the GMM model.

---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
~\AppData\Local\Temp\ipykernel_17948\1395829211.py in <module>
    196     print('Finished test %s, named: %s!' % (opt['stage'], opt['name']))
    197 
--> 198 main()

~\AppData\Local\Temp\ipykernel_17948\1395829211.py in main()
    184         load_checkpoint(model, opt['checkpoint'])
    185         with torch.no_grad():
--> 186             test_gmm(opt, test_loader, model)
    187     elif opt['stage'] == 'TOM':
    188         # model = UnetGenerator(25, 4, 6, ngf=64, norm_layer=nn.InstanceNorm2d)  # CP-VTON

~\AppData\Local\Temp\ipykernel_17948\1395829211.py in test_gmm(opt, test_loader, model)
     73         print(cm.shape)
     74 
---> 75         grid, theta = model(agnostic, cm)
     76         warped_cloth = F.grid_sample(c, grid, padding_mode='border')
     77         warped_mask = F.grid_sample(cm, grid, padding_mode='zeros')

~\miniconda3\envs\fashion\lib\site-packages\torch\nn\modules\module.py in _call_impl(self, *input, **kwargs)
   1100         if not (self._backward_hooks or self._forward_hooks or self._forward_pre_hooks or _global_backward_hooks
   1101                 or _global_forward_hooks or _global_forward_pre_hooks):
-> 1102             return forward_call(*input, **kwargs)
   1103         # Do not call functions when jit is used
   1104         full_backward_hooks, non_full_backward_hooks = [], []

~\AppData\Local\Temp\ipykernel_17948\3890865466.py in forward(self, inputA, inputB)
    521         correlation = self.correlation(featureA, featureB)
    522 
--> 523         theta = self.regression(correlation)
    524         grid = self.gridGen(theta)
    525         return grid, theta

~\miniconda3\envs\fashion\lib\site-packages\torch\nn\modules\module.py in _call_impl(self, *input, **kwargs)
   1100         if not (self._backward_hooks or self._forward_hooks or self._forward_pre_hooks or _global_backward_hooks
   1101                 or _global_forward_hooks or _global_forward_pre_hooks):
-> 1102             return forward_call(*input, **kwargs)
   1103         # Do not call functions when jit is used
   1104         full_backward_hooks, non_full_backward_hooks = [], []

~\AppData\Local\Temp\ipykernel_17948\3890865466.py in forward(self, x)
    132     def forward(self, x):
    133         x = self.conv(x)
--> 134         x = x.view(x.size(0), -1)
    135         x = self.linear(x)
    136         x = self.tanh(x)

RuntimeError: view size is not compatible with input tensor's size and stride (at least one dimension spans across two contiguous subspaces). Use .reshape(...) instead.

I have not changed any part of the code yet; I wanted to run it as is before trying different things. Can you help me understand why this error occurs and how to fix it?

Thank you!

EDIT: I replaced the view function with reshape, as suggested in the error message, and it works. However, I am still not sure of the difference between the two functions in this context.
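
For anyone else wondering about the same thing: .view() only reinterprets the existing memory layout and never copies, so it fails when the tensor is not contiguous, while .reshape() falls back to copying in that case. A minimal, self-contained sketch (my own example, not from this repository) that reproduces the situation:

import torch

x = torch.randn(2, 3, 4)          # contiguous tensor
y = x.permute(0, 2, 1)            # same data, strides changed -> non-contiguous
print(y.is_contiguous())          # False

# y.view(2, -1) would raise:
# RuntimeError: view size is not compatible with input tensor's size and stride ...
# because .view() cannot produce the requested shape without copying.

z = y.reshape(2, -1)              # .reshape() copies the data when a plain view is impossible
print(z.shape)                    # torch.Size([2, 12])

# Equivalent explicit fix: make the memory contiguous first, then view.
z2 = y.contiguous().view(2, -1)
print(torch.equal(z, z2))         # True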

@ZeroSum0x00

You can change x = x.view(x.size(0), -1) to x = torch.reshape(x, (x.shape[0], -1)) in the FeatureRegression class in networks.py. It works for me.
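
For context, the change lands in the forward method shown in the traceback. A sketch of the patched version (the trailing return x is assumed, since the traceback cuts off before it):

def forward(self, x):
    x = self.conv(x)
    # Flatten to (batch, features). torch.reshape copies the data if the conv
    # output is not contiguous, which is exactly the case .view() rejects.
    x = torch.reshape(x, (x.shape[0], -1))
    x = self.linear(x)
    x = self.tanh(x)
    return x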

@thaithanhtuan
Collaborator

I think this problem comes from a difference in PyTorch versions.
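
If anyone wants to confirm that, a quick diagnostic (my own suggestion, not part of the repository) is to print the installed PyTorch version and check whether the conv output is contiguous right before the flatten:

import torch
print(torch.__version__)

# Inside FeatureRegression.forward, right after x = self.conv(x):
# print(x.is_contiguous())   # False here would explain why .view() fails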
