torchao.float8 not working on PyTorch 2.4.1 and how does torchao handle FP8 autocast? #1159

@zigzagcai

Description

We know that Transformer_Engine provides a context manager named fp8_autocast to handle FP8 autocasting.

But I cannot find a similar context manager in torchao, so could anybody provide some hints about how torchao handles FP8 autocasting?

Thanks!
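For context, here is a minimal stdlib-only sketch contrasting the two API styles the question is about: a Transformer_Engine-style context manager that toggles FP8 for everything inside a `with` block, versus a one-time module-swap conversion (my understanding of the style torchao.float8 follows, where FP8 layers replace regular ones up front so no context manager is needed at call time). All class and function names here are illustrative, not the real APIs of either library:

```python
from contextlib import contextmanager

# Style 1 (Transformer_Engine-like, illustrative): a context manager flips
# a process-wide flag, and layers check it on every forward call.
_FP8_ENABLED = False

@contextmanager
def fp8_autocast(enabled=True):
    global _FP8_ENABLED
    prev = _FP8_ENABLED
    _FP8_ENABLED = enabled
    try:
        yield
    finally:
        _FP8_ENABLED = prev  # restore on exit, even after exceptions

class Linear:
    """Stand-in for a regular linear layer; reports which precision it ran in."""
    def forward(self, x):
        return "fp8" if _FP8_ENABLED else "fp32"

# Style 2 (module-swap, illustrative): a one-time conversion replaces each
# Linear with an FP8 variant, so the precision choice is baked into the
# module itself and no context manager is needed afterwards.
class Float8Linear(Linear):
    def forward(self, x):
        return "fp8"

def convert_to_float8(module):
    """Swap a Linear for its FP8 counterpart; leave other modules alone."""
    return Float8Linear() if isinstance(module, Linear) else module

if __name__ == "__main__":
    lin = Linear()
    print(lin.forward(None))           # fp32 outside the context
    with fp8_autocast():
        print(lin.forward(None))       # fp8 inside the context
    print(convert_to_float8(lin).forward(None))  # fp8 always, after the swap
```

The practical difference: with the context-manager style, FP8 is scoped to a region of code; with the swap style, it is scoped to the converted modules, which is why a library taking the second approach would not need an fp8_autocast equivalent.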
