
Conversation

tushar00jain (Contributor)

Differential Revision: D83846965
@facebook-github-bot

@tushar00jain has exported this pull request. If you are a Meta employee, you can view the originating Diff in D83846965.

enable: bool = False
"""Whether to enable checkpoint"""

enable_ft: bool = True
Contributor:

enable_ft is too vague; I mistook it for something similar to FT.enabled. Can we use a longer, more meaningful name?
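
A minimal sketch of the kind of rename being asked for, assuming the field lives in the checkpoint section of the job config; the name below is hypothetical and not something decided in this PR:

from dataclasses import dataclass

@dataclass
class Checkpoint:
    enable: bool = False
    """Whether to enable checkpoint"""

    # Hypothetical, more descriptive replacement for `enable_ft`, chosen so it
    # cannot be mistaken for the top-level fault-tolerance (FT.enabled) flag.
    dataloader_state_in_ft_checkpoints: bool = True
    """Whether to save the data loader state when fault-tolerant training is active."""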

def load_state_dict(self, state_dict: dict[str, Any]):
    self.step = state_dict["step"]
    self.ntokens_seen = state_dict["ntokens_seen"]
    if not self.job_config.checkpoint.enable_ft and self.ft_manager is not None:
Contributor:

Setting the dataloader state_dict here is not safe: data.loader_state_dict() may be invoked either before or after this load_state_dict().
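
To make the ordering hazard concrete, here is a minimal, self-contained sketch; the class and method names are stand-ins, not the actual torchtitan APIs. Whichever restore runs last wins, so having the trainer overwrite data loader state inside its own load_state_dict gives order-dependent results:

class DataLoader:
    """Stand-in for a stateful data loader."""
    def __init__(self):
        self.state = {"batch_idx": 0}

    def state_dict(self):
        return dict(self.state)

    def load_state_dict(self, sd):
        self.state = dict(sd)

class Trainer:
    """Stand-in for a trainer whose load_state_dict also touches the data loader."""
    def __init__(self, dataloader):
        self.dataloader = dataloader
        self.step = 0

    def load_state_dict(self, state_dict):
        self.step = state_dict["step"]
        # The questionable part: overwriting data loader state here.
        self.dataloader.load_state_dict({"batch_idx": self.step})

loader = DataLoader()
trainer = Trainer(loader)

# Order A: checkpointer restores the data loader first, trainer second.
loader.load_state_dict({"batch_idx": 100})
trainer.load_state_dict({"step": 5})
print(loader.state)  # {'batch_idx': 5} -- the trainer's value wins

# Order B: trainer first, checkpointer second.
trainer.load_state_dict({"step": 5})
loader.load_state_dict({"batch_idx": 100})
print(loader.state)  # {'batch_idx': 100} -- the checkpointer's value wins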

fegin (Contributor) commented Oct 3, 2025

I would also suggest not landing the internal code first. The internal TorchTitan is always days behind the OSS one, so doing so can cause merge issues.


enable_ft: bool = True
"""
Checkpoints data loader state if enabled. Otherwise infers the data loader
Contributor:

The description is a bit vague to me. How does this option help checkpointing? Do I have to turn on FT.enabled to use it?
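
A hypothetical, more explicit docstring along the lines the reviewer is asking for; the wording is an assumption about the intended semantics (data loader state is written into the checkpoint when enabled, reconstructed otherwise) and about the relationship to fault tolerance, not text from the PR:

enable_ft: bool = True
"""
Whether to include the data loader state in checkpoints when fault-tolerant
training (torchft) is in use. If disabled, the data loader state is not saved
and is instead inferred on load (e.g. from the training step). This option is
only consulted when a fault-tolerance manager is present; it is independent of
checkpoint.enable.
"""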
