I realize the PyTorch checkpoints for online BootsTAPIR in `torch_causal_tapir_demo.ipynb` and the README are not the same. The former is https://storage.googleapis.com/dm-tapnet/causal_bootstapir_checkpoint.pt, but the latter is https://storage.googleapis.com/dm-tapnet/bootstap/causal_bootstapir_checkpoint.pt, with an extra `bootstap` directory. The one from the demo works okay, but there are runtime problems with the checkpoint from the README. I have listed all the problems below. Could you please clarify the following questions?
1. Which checkpoint is the right one to use? Right now only the one from the demo works, but I am not sure whether that is the right one.
2. If the one in the README is the right one, how do I make it work?
3. Do you have a PyTorch checkpoint for online TAPIR?
**`pyramid_level=1`**

```shell
wget -P F:\Wei\tapnet\tapnet\checkpoints https://storage.googleapis.com/dm-tapnet/bootstap/causal_bootstapir_checkpoint.pt
```

```python
import torch
from tapnet.torch import tapir_model

model = tapir_model.TAPIR(pyramid_level=1, use_casual_conv=True)
model.load_state_dict(torch.load("tapnet/checkpoints/causal_bootstapir_checkpoint.pt"))
```
```
Traceback (most recent call last):
  File "F:\Wei\tapnet\tapnet\live_demo_thorlab_camera_torch.py", line 17, in <module>
    model.load_state_dict(torch.load("tapnet/checkpoints/causal_bootstapir_checkpoint.pt"))
  File "C:\Users\NOCB\anaconda3\envs\tapnet_torch\Lib\site-packages\torch\nn\modules\module.py", line 2215, in load_state_dict
    raise RuntimeError('Error(s) in loading state_dict for {}:\n\t{}'.format(
RuntimeError: Error(s) in loading state_dict for TAPIR:
    size mismatch for extra_convs.blocks.0.conv.weight: copying a param with shape torch.Size([1024, 768, 3, 3]) from checkpoint, the shape in current model is torch.Size([1024, 256, 3, 3]).
    size mismatch for extra_convs.blocks.1.conv.weight: copying a param with shape torch.Size([1024, 768, 3, 3]) from checkpoint, the shape in current model is torch.Size([1024, 256, 3, 3]).
    size mismatch for extra_convs.blocks.2.conv.weight: copying a param with shape torch.Size([1024, 768, 3, 3]) from checkpoint, the shape in current model is torch.Size([1024, 256, 3, 3]).
    size mismatch for extra_convs.blocks.3.conv.weight: copying a param with shape torch.Size([1024, 768, 3, 3]) from checkpoint, the shape in current model is torch.Size([1024, 256, 3, 3]).
    size mismatch for extra_convs.blocks.4.conv.weight: copying a param with shape torch.Size([1024, 768, 3, 3]) from checkpoint, the shape in current model is torch.Size([1024, 256, 3, 3]).
```
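A quick way to list every mismatched tensor at once is a comparison like the sketch below (it assumes `torch.load` returns a plain state dict for this file, which the usage above suggests):

```python
import torch
from tapnet.torch import tapir_model

# Load the raw state dict from the README checkpoint and compare tensor
# shapes against a freshly constructed model to see all mismatches at once.
ckpt = torch.load("tapnet/checkpoints/causal_bootstapir_checkpoint.pt")
model = tapir_model.TAPIR(pyramid_level=1, use_casual_conv=True)
model_sd = model.state_dict()

for name, tensor in ckpt.items():
    if name not in model_sd:
        print(f"unexpected key: {name}")
    elif model_sd[name].shape != tensor.shape:
        print(f"{name}: checkpoint {tuple(tensor.shape)} vs model {tuple(model_sd[name].shape)}")
for name in model_sd:
    if name not in ckpt:
        print(f"missing key: {name}")
```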
**`pyramid_level=0`**

```shell
wget -P F:\Wei\tapnet\tapnet\checkpoints https://storage.googleapis.com/dm-tapnet/bootstap/causal_bootstapir_checkpoint.pt
```

```python
import torch
from tapnet.torch import tapir_model

model = tapir_model.TAPIR(pyramid_level=0, use_casual_conv=True)
model.load_state_dict(torch.load("tapnet/checkpoints/causal_bootstapir_checkpoint.pt"))
```
```
Traceback (most recent call last):
  File "F:\Wei\tapnet\tapnet\live_demo_thorlab_camera_torch.py", line 17, in <module>
    model.load_state_dict(torch.load("tapnet/checkpoints/causal_bootstapir_checkpoint.pt"))
  File "C:\Users\NOCB\anaconda3\envs\tapnet_torch\Lib\site-packages\torch\nn\modules\module.py", line 2215, in load_state_dict
    raise RuntimeError('Error(s) in loading state_dict for {}:\n\t{}'.format(
RuntimeError: Error(s) in loading state_dict for TAPIR:
    size mismatch for torch_pips_mixer.linear.weight: copying a param with shape torch.Size([512, 535]) from checkpoint, the shape in current model is torch.Size([512, 486]).
    size mismatch for extra_convs.blocks.0.conv.weight: copying a param with shape torch.Size([1024, 768, 3, 3]) from checkpoint, the shape in current model is torch.Size([1024, 256, 3, 3]).
    size mismatch for extra_convs.blocks.1.conv.weight: copying a param with shape torch.Size([1024, 768, 3, 3]) from checkpoint, the shape in current model is torch.Size([1024, 256, 3, 3]).
    size mismatch for extra_convs.blocks.2.conv.weight: copying a param with shape torch.Size([1024, 768, 3, 3]) from checkpoint, the shape in current model is torch.Size([1024, 256, 3, 3]).
    size mismatch for extra_convs.blocks.3.conv.weight: copying a param with shape torch.Size([1024, 768, 3, 3]) from checkpoint, the shape in current model is torch.Size([1024, 256, 3, 3]).
    size mismatch for extra_convs.blocks.4.conv.weight: copying a param with shape torch.Size([1024, 768, 3, 3]) from checkpoint, the shape in current model is torch.Size([1024, 256, 3, 3]).
```
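To double-check that the shape mismatches above are the only problems, one can load everything except the mismatched tensors. This is a diagnostic sketch only, not a fix: the skipped layers would stay randomly initialized, so tracking quality would likely be broken.

```python
import torch
from tapnet.torch import tapir_model

ckpt = torch.load("tapnet/checkpoints/causal_bootstapir_checkpoint.pt")
model = tapir_model.TAPIR(pyramid_level=0, use_casual_conv=True)
model_sd = model.state_dict()

# Keep only tensors whose shapes match the current model; everything else
# is skipped, so the load succeeds but the skipped layers are untrained.
filtered = {k: v for k, v in ckpt.items()
            if k in model_sd and v.shape == model_sd[k].shape}
incompatible = model.load_state_dict(filtered, strict=False)
print("skipped:", sorted(set(ckpt) - set(filtered)))
print("missing:", incompatible.missing_keys)
```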