
About inference speed #15

@krennic999


Hi, thank you very much for your great work. When running inference on a 512×512 image, I found that it takes about 120 seconds to generate a single image. I have already tried strategies such as FlashAttention and torch.compile. Are there any other suggestions for faster inference?
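For context, here is a minimal sketch of the setup I am experimenting with, assuming a diffusers-style pipeline; the checkpoint id, the `unet` attribute, and the step count are placeholders rather than this repo's actual API:

```python
import torch
from diffusers import DiffusionPipeline

# Placeholder checkpoint id -- substitute the model this repo actually ships.
MODEL_ID = "path/or/hub-id-of-the-model"

# Load in half precision to reduce memory traffic and speed up matmuls.
pipe = DiffusionPipeline.from_pretrained(MODEL_ID, torch_dtype=torch.float16)
pipe = pipe.to("cuda")

# Compile the denoising network once; the first call pays the compile cost,
# later calls reuse the compiled graph. Assumes the pipeline exposes `unet`.
pipe.unet = torch.compile(pipe.unet, mode="reduce-overhead")

# Fewer sampling steps is usually the biggest lever, at some quality cost.
image = pipe(
    "a prompt",
    height=512,
    width=512,
    num_inference_steps=25,
).images[0]
image.save("out.png")
```

Even with fp16 and torch.compile applied as above, generation still takes around 120 seconds per image, which is why I am asking whether there are other recommended options.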
