load checkpoint model error #8050
morestart asked this question in code help: CV
This is my model:

```python
def __init__(self, num_classes: int, image_channels: int = 3, drop_rate: int = 0.5,
             filter_config: tuple = (64, 128, 256, 512, 512), attention=False):
```

This is my load code:

```python
m = SegNet(num_classes=1)
model = m.load_from_checkpoint('checkpoints/epoch=99-step=312499.ckpt')
```

When I try to load a checkpoint this way, I get this error:

Who can help me?
Answered by tchaton, Jun 21, 2021
Replies: 1 comment 2 replies
Dear @morestart,

Great question! You should use the `save_hyperparameters()` function so that Lightning can save your `__init__` arguments inside the checkpoint for future reloading. Here is the associated doc: https://pytorch-lightning.readthedocs.io/en/stable/common/lightning_module.html?highlight=save_hyperparameters#save-hyperparameters

Best,
T.C
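To make the mechanism concrete, here is an illustrative sketch of what `save_hyperparameters()` / `load_from_checkpoint()` accomplish. This is NOT Lightning's actual implementation, just a minimal stand-in (the `SketchModule` class is invented for illustration), reusing the `SegNet` signature from the question: the init arguments get recorded on the instance, and the classmethod rebuilds the module from the values stored in a checkpoint dict.

```python
import inspect

class SketchModule:
    """Illustrative stand-in for LightningModule (NOT the real implementation)."""

    def save_hyperparameters(self):
        # Look at the caller's frame (__init__) and store every named
        # init argument, so a checkpoint can persist them.
        frame = inspect.currentframe().f_back
        init_params = inspect.signature(type(self).__init__).parameters
        self.hparams = {name: frame.f_locals[name]
                        for name in init_params if name != "self"}

    @classmethod
    def load_from_checkpoint(cls, checkpoint: dict, **overrides):
        # Rebuild the module from the stored hyperparameters; explicit
        # keyword overrides win over the checkpointed values.
        hparams = {**checkpoint["hyper_parameters"], **overrides}
        return cls(**hparams)


class SegNet(SketchModule):  # signature taken from the question
    def __init__(self, num_classes: int, image_channels: int = 3,
                 drop_rate: float = 0.5,
                 filter_config: tuple = (64, 128, 256, 512, 512),
                 attention: bool = False):
        self.save_hyperparameters()


# Simulate what saving a checkpoint would persist:
model = SegNet(num_classes=1)
checkpoint = {"hyper_parameters": model.hparams}

# Without the stored hyperparameters, load_from_checkpoint would not
# know how to call SegNet.__init__ -- which is the error in the question.
restored = SegNet.load_from_checkpoint(checkpoint)
print(restored.hparams["num_classes"])  # → 1
```

The key point: `load_from_checkpoint` is a classmethod that constructs a fresh module, so the init arguments must live inside the checkpoint itself, which is exactly what `save_hyperparameters()` arranges.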
Answer selected by morestart