example code problem #27092

@iplayfast

Description


While trying the example from https://huggingface.co/amazon/MistralLite, I get the error below.
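For reference, a minimal sketch of examplecode.py, reconstructed from the model card example (approximate, not the verbatim script; exact line numbers may differ from the traceback):

```python
# Sketch of examplecode.py, reconstructed from the model card at
# https://huggingface.co/amazon/MistralLite (approximate).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "amazon/MistralLite"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    use_flash_attention_2=True,  # the flag the ValueError below points at
    device_map="auto",
)
```

Running the script produces: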

```
python examplecode.py
Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
Traceback (most recent call last):
  File "/home/chris/ai/text-generation-webui/amazonmistral/examplecode.py", line 8, in <module>
    model = AutoModelForCausalLM.from_pretrained(model_id,
  File "/home/chris/anaconda3/envs/textgen/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 571, in from_pretrained
    return model_class.from_pretrained(
  File "/home/chris/anaconda3/envs/textgen/lib/python3.10/site-packages/transformers/modeling_utils.py", line 3076, in from_pretrained
    config = cls._check_and_enable_flash_attn_2(config, torch_dtype=torch_dtype, device_map=device_map)
  File "/home/chris/anaconda3/envs/textgen/lib/python3.10/site-packages/transformers/modeling_utils.py", line 1265, in _check_and_enable_flash_attn_2
    raise ValueError(
ValueError: The current architecture does not support Flash Attention 2.0. Please open an issue on GitHub to request support for this architecture: https://github.com/huggingface/transformers/issues/new
```
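
Possibly relevant: `use_flash_attention_2=True` is what triggers this check, and the installed transformers build evidently does not recognize Flash Attention 2 for this architecture. A workaround sketch, assuming that flag is the culprit (dropping it falls back to the standard attention path):

```python
# Workaround sketch: load MistralLite without Flash Attention 2
# (assumption: use_flash_attention_2=True raises the ValueError above;
# removing it uses the default attention implementation, just slower).
import torch
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "amazon/MistralLite",
    torch_dtype=torch.bfloat16,
    device_map="auto",
)
```

Upgrading transformers (pip install -U transformers) may also resolve it, if Flash Attention 2 support for this architecture landed in a newer release.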
