LLama3ChatSession example fails to work with LLama 2 model #937

@asmirnov82

Description


The LLama3ChatSession example is documented as supporting "models such as llama3, llama2, phi3, qwen1.5, etc.".
However, with a LLama 2 model it fails for several reasons:

  1. A NullReferenceException (for a llama2 model, model.Tokens.EndOfTurnToken is not defined in the metadata and is therefore null).
  2. PromptTemplateTransformer doesn't work correctly with llama2. According to https://www.llama.com/docs/model-cards-and-prompt-formats/meta-llama-2 and https://huggingface.co/blog/llama2#how-to-prompt-llama-2, llama2 should be prompted using this template:
<s>[INST] <<SYS>>
{{ system_prompt }}
<</SYS>>

{{ user_message_1 }} [/INST] {{ model_answer_1 }} </s>
<s>[INST] {{ user_message_2 }} [/INST]
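For reference, the template above can be sketched as a small formatting helper. This is Python pseudocode of the logic only (the function name and signature are illustrative, not LLamaSharp API):

```python
def format_llama2_prompt(system_prompt, turns, new_user_message):
    """Build a Llama 2 chat prompt per Meta's documented template.

    `turns` is a list of completed (user_message, model_answer) pairs;
    `new_user_message` is the message the model should answer next.
    """
    # The first [INST] block carries the system prompt inside <<SYS>> tags.
    prompt = f"<s>[INST] <<SYS>>\n{system_prompt}\n<</SYS>>\n\n"
    for i, (user, answer) in enumerate(turns):
        # Every exchange after the first opens its own <s>[INST] block.
        if i > 0:
            prompt += "<s>[INST] "
        prompt += f"{user} [/INST] {answer} </s>"
    if turns:
        prompt += "<s>[INST] "
    # The pending user message is left open so the model generates the answer.
    prompt += f"{new_user_message} [/INST]"
    return prompt
```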

Llama2 doesn't provide a chat template in its metadata, so llama.cpp falls back to a different default template:

<|im_start|>user
hello<|im_end|>
<|im_start|>assistant
response<|im_end|>
<|im_start|>user
again<|im_end|>
<|im_start|>assistant
response<|im_end|>

As a result, after fixing the NullReferenceException, a conversation run through the LLama3 example looks broken (see the attached screenshot).

It would be nice to have an example that uses a custom HistoryTransform for llama2.
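Such a transform would essentially render the chat history into the Llama 2 format above. A minimal sketch of that rendering, as Python pseudocode to be ported to a C# HistoryTransform (role names mirror a System/User/Assistant enum; the function name is illustrative, not LLamaSharp API):

```python
def llama2_history_to_text(messages):
    """Render a list of (role, text) messages into the Llama 2 prompt format.

    Assumes messages alternate User/Assistant after an optional leading
    System message, and that the last message is an unanswered User turn.
    """
    system = ""
    exchanges = []  # (user_message, assistant_answer_or_None) pairs
    for role, text in messages:
        if role == "System":
            system = text
        elif role == "User":
            exchanges.append([text, None])
        elif role == "Assistant":
            exchanges[-1][1] = text  # close out the latest user turn

    out = []
    for i, (user, answer) in enumerate(exchanges):
        # Only the first [INST] block carries the system prompt.
        prefix = f"<<SYS>>\n{system}\n<</SYS>>\n\n" if i == 0 and system else ""
        if answer is None:
            # Unanswered turn: leave the block open for generation.
            out.append(f"<s>[INST] {prefix}{user} [/INST]")
        else:
            out.append(f"<s>[INST] {prefix}{user} [/INST] {answer} </s>")
    return "".join(out)
```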
