
load_in_8bit is not working for some Hugging Face models #14

@sanyalsunny111

Description


I have updated the transformers package and I am using the ViLT model: https://huggingface.co/docs/transformers/model_doc/vilt#transformers.ViltForQuestionAnswering

[screenshot]

I am getting the error below. Is load_in_8bit not integrated with all Hugging Face models? Could you please let me know how to use load_in_8bit for any Hugging Face model, not just BLOOM and T5?

[screenshot of the error]
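
For reference, a minimal sketch of the usual load_in_8bit call, shown here with a T5 checkpoint (one of the architectures the issue mentions as working). It assumes bitsandbytes and accelerate are installed; the checkpoint name "t5-small" is only illustrative. Whether the same pattern works for ViLT depends on that architecture supporting int8 loading and device_map placement.

```python
# Sketch of the standard load_in_8bit usage pattern in transformers.
# Assumptions: bitsandbytes and accelerate are installed, and "t5-small"
# is an example checkpoint, not one taken from the issue itself.
from transformers import AutoTokenizer, T5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained(
    "t5-small",
    load_in_8bit=True,   # weights quantized to int8 via bitsandbytes
    device_map="auto",   # accelerate places layers on available devices
)
```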
