No context length setting in advanced configuration (contrary to the documentation) #2462

@Fade78

Description

App Version

3.11.12

API Provider

Ollama

Model Used

qwq:32b, qwen2.5-coder:32b, and many more.

Actual vs. Expected Behavior

I set up my context length to fit the VRAM capacity of my GPUs. For a 32b model, that means around 24k tokens. But the Roo Code context length display says my model has a 128k context length.

The documentation lists an optional step:

  • (Optional) Configure Model context size in Advanced settings, so Roo Code knows how to manage its sliding window.

I couldn't find this setting anywhere in the Advanced settings.
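A possible workaround until the setting is exposed in Roo Code: Ollama itself can pin a model's context window via a Modelfile, using its standard `num_ctx` parameter. The model name and the 24576 value below are just examples matching the ~24k context mentioned above:

```
FROM qwq:32b
PARAMETER num_ctx 24576
```

Building a variant with `ollama create qwq-24k -f Modelfile` should then make the model always load with that context size, regardless of what the client requests.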

Detailed Steps to Reproduce

Follow the documentation for setting up an Ollama connection.

Relevant API Request Output

Additional Context

I want to change the context length to see whether it affects the file_count bug I'm experiencing.

Metadata

Assignees

Labels

Issue - In Progress: Someone is actively working on this. Should link to a PR soon.
bug: Something isn't working

Type

No type

Projects

Status

Done

Milestone

No milestone

Relationships

None yet

Development

No branches or pull requests
