Misc. bug: llama-server does not read the "--keep" param the user passes on the CLI #12927

@ZUIcat

Description

Name and Version

version b5124

Operating systems

Windows

Which llama.cpp modules do you know to be affected?

llama-server

Command line

"llama-server.exe" ^
-m test.gguf ^
--host 127.0.0.1 --port 8090 ^
--keep 0

Problem description & steps to reproduce

llama-server does not read the "--keep" param that the user passes on the CLI.
I have read the code. Here, https://github.com/ggml-org/llama.cpp/blob/bc091a4dc585af25c438c8473285a8cfec5c7695/examples/server/server.cpp#L242

params.n_keep = json_value(data, "n_keep", defaults.n_keep);

should be

params.n_keep = json_value(data, "n_keep", params_base.n_keep);

First Bad Commit

No response
