### 🐛 Describe the bug
I tried to download the pretrained efficientnet_b3 model with the following code:

model_ft = models.efficientnet_b3(weights=models.EfficientNet_B3_Weights.IMAGENET1K_V1)

There is no issue when I run it from PyCharm (torchvision 0.15.2). But when I run the same code in a Jupyter Notebook on an AWS server, also with torchvision 0.15.2, the last line of the error message is:

RuntimeError: invalid hash value (expected "cf984f9c", got "b3899882250c22946d0229d266049fcd133c169233530b36b9ffa7983988362f")
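To narrow this down, here is a minimal diagnostic sketch (not part of my original code) that downloads the same checkpoint URL without the hash check and recomputes its SHA-256 locally. This should show whether the AWS environment simply receives a different file than PyCharm does (for example a truncated or proxy-modified download). The local filename `efficientnet_b3_check.pth` is just an illustrative name.

```python
import hashlib

import torch
from torchvision import models

# Download the same checkpoint without the hash check (hash_prefix=None) and
# hash it by hand, mirroring the sha256 comparison done in torch/hub.py.
url = models.EfficientNet_B3_Weights.IMAGENET1K_V1.url
dst = "efficientnet_b3_check.pth"  # illustrative local path
torch.hub.download_url_to_file(url, dst, hash_prefix=None, progress=True)

with open(dst, "rb") as f:
    print("sha256:", hashlib.sha256(f.read()).hexdigest())
```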
### Versions
torchvision: 0.15.2 (same version in both PyCharm and the AWS Jupyter environment)
Python: 3.10 (conda env `pytorch_p310`)
### Full traceback from running the code
Cell In[33], line 2
      1 # efficientnet_b3
----> 2 model_ft = models.efficientnet_b3(weights=models.EfficientNet_B3_Weights.IMAGENET1K_V1)

File ~/anaconda3/envs/pytorch_p310/lib/python3.10/site-packages/torchvision/models/_utils.py:142, in kwonly_to_pos_or_kw.<locals>.wrapper(*args, **kwargs)
    135     warnings.warn(
    136         f"Using {sequence_to_str(tuple(keyword_only_kwargs.keys()), separate_last='and ')} as positional "
    137         f"parameter(s) is deprecated since 0.13 and may be removed in the future. Please use keyword parameter(s) "
    138         f"instead."
    139     )
    140 kwargs.update(keyword_only_kwargs)
--> 142 return fn(*args, **kwargs)

File ~/anaconda3/envs/pytorch_p310/lib/python3.10/site-packages/torchvision/models/_utils.py:228, in handle_legacy_interface.<locals>.outer_wrapper.<locals>.inner_wrapper(*args, **kwargs)
    225     del kwargs[pretrained_param]
    226     kwargs[weights_param] = default_weights_arg
--> 228 return builder(*args, **kwargs)

File ~/anaconda3/envs/pytorch_p310/lib/python3.10/site-packages/torchvision/models/efficientnet.py:863, in efficientnet_b3(weights, progress, **kwargs)
    860 weights = EfficientNet_B3_Weights.verify(weights)
    862 inverted_residual_setting, last_channel = _efficientnet_conf("efficientnet_b3", width_mult=1.2, depth_mult=1.4)
--> 863 return _efficientnet(
    864     inverted_residual_setting,
    865     kwargs.pop("dropout", 0.3),
    866     last_channel,
    867     weights,
    868     progress,
    869     **kwargs,
    870 )

File ~/anaconda3/envs/pytorch_p310/lib/python3.10/site-packages/torchvision/models/efficientnet.py:360, in _efficientnet(inverted_residual_setting, dropout, last_channel, weights, progress, **kwargs)
    357 model = EfficientNet(inverted_residual_setting, dropout, last_channel=last_channel, **kwargs)
    359 if weights is not None:
--> 360     model.load_state_dict(weights.get_state_dict(progress=progress, check_hash=True))
    362 return model

File ~/anaconda3/envs/pytorch_p310/lib/python3.10/site-packages/torchvision/models/_api.py:90, in WeightsEnum.get_state_dict(self, *args, **kwargs)
     89 def get_state_dict(self, *args: Any, **kwargs: Any) -> Mapping[str, Any]:
---> 90     return load_state_dict_from_url(self.url, *args, **kwargs)

File ~/anaconda3/envs/pytorch_p310/lib/python3.10/site-packages/torch/hub.py:766, in load_state_dict_from_url(url, model_dir, map_location, progress, check_hash, file_name, weights_only)
    764     r = HASH_REGEX.search(filename)  # r is Optional[Match[str]]
    765     hash_prefix = r.group(1) if r else None
--> 766 download_url_to_file(url, cached_file, hash_prefix, progress=progress)
    768 if _is_legacy_zip_format(cached_file):
    769     return _legacy_zip_load(cached_file, model_dir, map_location, weights_only)

File ~/anaconda3/envs/pytorch_p310/lib/python3.10/site-packages/torch/hub.py:663, in download_url_to_file(url, dst, hash_prefix, progress)
    661 digest = sha256.hexdigest()
    662 if digest[:len(hash_prefix)] != hash_prefix:
--> 663     raise RuntimeError(f'invalid hash value (expected "{hash_prefix}", got "{digest}")')
    664 shutil.move(f.name, dst)
    665 finally:

RuntimeError: invalid hash value (expected "cf984f9c", got "b3899882250c22946d0229d266049fcd133c169233530b36b9ffa7983988362f")
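If the downloaded file turns out to be a valid checkpoint that just no longer matches the 8-character prefix baked into the filename, a possible stopgap (only a sketch, and it skips the integrity check, so not a proper fix) is to fetch the state dict directly with `check_hash=False` and load it into a model built with `weights=None`:

```python
import torch
from torchvision import models

# Stopgap sketch: fetch the state dict without the filename-prefix hash check
# and load it into a freshly constructed efficientnet_b3 (weights=None).
weights = models.EfficientNet_B3_Weights.IMAGENET1K_V1
state_dict = torch.hub.load_state_dict_from_url(weights.url, check_hash=False)

model_ft = models.efficientnet_b3(weights=None)
model_ft.load_state_dict(state_dict)
```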