
Support for Huggingface safetensors and tokenizer.json for LLM exporting #7696

@ProjectProgramAMark

Description


🚀 The feature, motivation and pitch

Currently, one has to manually convert model*.safetensors to checkpoint.pth before using ExecuTorch's export. Perhaps there should be support for safetensors files right out of the box?

Alternatives

No response

Additional context

No response

RFC (Optional)

No response

cc @JacobSzwejbka @angelayi

Activity

added labels enhancement (Not as big of a feature, but technically not a bug. Should be easy to fix) and module: exir (Issues related to Export IR and the code under exir/) on Jan 17, 2025
mcr229 (Contributor) commented on Jan 21, 2025

added label triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module) on Feb 4, 2025
iseeyuan (Contributor) commented on Mar 6, 2025

cc @jackzhxng This is similar to the issue I created the other day.

moved this from To triage to Backlog in ExecuTorch Core on Mar 6, 2025

Metadata

Assignees

No one assigned

Labels

enhancement (Not as big of a feature, but technically not a bug. Should be easy to fix)
module: exir (Issues related to Export IR and the code under exir/)
triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module)

Projects

Status

Backlog

Milestone

No milestone

Relationships

None yet

Development

No branches or pull requests

Participants

@digantdesai, @ProjectProgramAMark, @iseeyuan, @jackzhxng, @mcr229

Support for Huggingface safetensors and tokenizer.json for LLM exporting · Issue #7696 · pytorch/executorch