This is a minimalist, terminal-based AI assistant powered by a local GGUF model using llama-cpp-python.
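
Under the hood, the assistant talks to the model through llama-cpp-python. A minimal sketch of that flow, assuming the high-level `Llama` chat API and a placeholder model path (not the project's actual file):

```python
from llama_cpp import Llama

# Load a local GGUF model. The path below is a placeholder -- point it at your own file.
llm = Llama(model_path="models/your-model.gguf", n_ctx=2048, verbose=False)

# Run one chat turn and print the assistant's reply.
reply = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Hello!"}]
)
print(reply["choices"][0]["message"]["content"])
```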
- Configurable system prompt and few-shot examples (sketched below)
- Clean terminal interface
- Logs conversations to `/logs`
- Easily forked and modified for your own builds
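
A configurable system prompt and few-shot examples are usually injected by prepending them to the message list before each model call. The helper below only illustrates that pattern; the `system_prompt` and `few_shot` keys are hypothetical, not this project's actual config schema.

```python
def build_messages(config, history, user_input):
    # System prompt first, then few-shot example pairs, then the live
    # conversation history, then the new user turn.
    messages = [{"role": "system", "content": config["system_prompt"]}]
    for example in config.get("few_shot", []):
        messages.append({"role": "user", "content": example["user"]})
        messages.append({"role": "assistant", "content": example["assistant"]})
    messages.extend(history)
    messages.append({"role": "user", "content": user_input})
    return messages
```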
```
pip install -r requirements.txt
python main.py
```
If preferred, use the included `start.bat` or `start.sh` launchers.
Edit `config/config.json` to:

- Point to your `.gguf` model file
- Adjust temp, top_p, and output tokens (sketched below)
- Enable/disable logging or rich output
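
The actual schema of `config/config.json` isn't reproduced here; the sketch below shows how such settings are commonly read and handed to the sampler, with all key names as assumptions.

```python
import json
from datetime import datetime
from pathlib import Path

# Key names ("temperature", "top_p", "max_tokens", "log_enabled") are
# illustrative assumptions -- check config/config.json for the real ones.
with open("config/config.json", encoding="utf-8") as f:
    cfg = json.load(f)

sampler_args = {
    "temperature": cfg.get("temperature", 0.7),
    "top_p": cfg.get("top_p", 0.95),
    "max_tokens": cfg.get("max_tokens", 256),
}
# sampler_args can be passed straight to Llama.create_chat_completion().

if cfg.get("log_enabled", True):
    # Conversation transcripts go under logs/ with a timestamped filename.
    log_dir = Path("logs")
    log_dir.mkdir(exist_ok=True)
    log_file = log_dir / f"session-{datetime.now():%Y%m%d-%H%M%S}.txt"
```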
- llama-cpp-python
- rich (optional, but looks good)
Built to be modified. Enjoy.