Name and Version
maxk\src\llama.cpp\build-win> .\bin\llama-cli.exe --version
register_backend: registered backend CPU (1 devices)
register_device: registered device CPU (AMD EPYC 7763 64-Core Processor                )
version: 5629 (d1660c8)
built with MSVC 19.42.34433.0 for Windows AMD64
Operating systems
Windows
Which llama.cpp modules do you know to be affected?
Test code
Command line
Problem description & steps to reproduce
I tried macOS and Linux builds, and test-chat works fine.
I then did a clean Windows x86-64 build (with BUILD_SHARED_LIBS=OFF), and test-chat aborts on mistralai-Mistral-Nemo-Instruct-2407.jinja and some other templates.
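For reference, a minimal sketch of the repro steps. Only BUILD_SHARED_LIBS=OFF is from the report; the generator/toolchain choice and the CTest invocation are assumptions (adjust for your setup; `--test-dir` needs CMake 3.20+):

```shell
# Clean static Windows x86-64 build (run from an x64 Native Tools prompt; toolchain assumed)
cmake -B build-win -DBUILD_SHARED_LIBS=OFF
cmake --build build-win --config Release

# Run only the failing test and show its output (test name from the report)
ctest --test-dir build-win -C Release -R test-chat --output-on-failure
```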
@slaren, it looks like you were the last to commit; could you check this on your setup?
First Bad Commit
No response
Relevant log output
See the CI job output here:
https://github.com/ggml-org/llama.cpp/actions/runs/15570861515/job/43846147913?pr=14003