load checkpoint model error #8050

Jun 21, 2021 · 1 comment · 2 replies

Dear @morestart,

Great question! You should use the `save_hyperparameters` function so Lightning can save your init arguments inside the checkpoint for future reloading.

Here is the associated doc: https://pytorch-lightning.readthedocs.io/en/stable/common/lightning_module.html?highlight=save_hyperparameters#save-hyperparameters

class SegNet(LightningModule):

    def __init__(
        self,
        num_classes: int,
        image_channels: int = 3,
        drop_rate: float = 0.5,
        filter_config: tuple = (64, 128, 256, 512, 512),
        attention: bool = False,
    ):
        super().__init__()
        self.save_hyperparameters()

        ...

Best,
T.C

Answer selected by morestart