
Make export_llm as a separate binary #8432


Description

@kimishpatel
Contributor

🚀 The feature, motivation and pitch

Today users invoke python examples.models.llama.export_llama... to export a model. It would be nice to have a set of binary utilities installed as part of pip install executorch that can be used for model export and lowering.
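One way to expose such a binary is a console-script entry point in ExecuTorch's packaging metadata. The sketch below is illustrative only: the module path and `main` function are assumptions, not the project's actual layout.

```toml
[project.scripts]
# Installs an `export_llm` executable alongside `pip install executorch`.
# The target module and function here are hypothetical placeholders.
export_llm = "executorch.examples.models.llama.export_llama:main"
```

With an entry point like this, users could run `export_llm ...` directly instead of invoking the module through the Python interpreter.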

Alternatives

Continue to use python examples.models.llama.export_llama...

Additional context

This may have some overlap with the export wizard @byjlw, however this issue is focused more on generative AI use cases.

RFC (Optional)

No response

cc @mergennachin @iseeyuan @lucylq @helunwencser @tarun292 @jackzhxng

Activity

Changed the title from "Make export_llm/lmm as a separate binary" to "Make export_llm/llm as a separate binary" on Feb 12, 2025
Changed the title from "Make export_llm/llm as a separate binary" to "Make export_llm as a separate binary" on Feb 12, 2025
Moved from To triage to Backlog in ExecuTorch Core on Feb 13, 2025
Added labels on Feb 14, 2025:
- module: examples — Issues related to demos under examples/
- triaged — This issue has been looked at by a team member, and triaged and prioritized into an appropriate module

Participants

@larryliu0820, @kimishpatel, @jackzhxng
