Basic checks
- I searched existing issues - this hasn't been reported
- I can reproduce this consistently
- This is a RubyLLM bug, not my application code
What's broken?
RubyLLM sets its own default temperature of 0.7 irrespective of the model. This is not in line with what users of LLM APIs expect: RubyLLM's default should mirror the model's own default, since temperature variations can radically change a model's behavior relative to expectations.
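As a workaround until the default changes, the injected value can be overridden explicitly. A minimal sketch, assuming the chainable `with_temperature` setter from the gem's documentation is available in the installed version:

```ruby
require "ruby_llm"

# Pin the temperature to Gemini's own default (1.0) so RubyLLM's
# injected 0.7 never reaches the request.
chat = RubyLLM.chat(model: "gemini-2.5-pro").with_temperature(1.0)
chat.ask("Say hello in one word.")
```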
How to reproduce
```ruby
> RubyLLM.chat(model: "gemini-2.5-pro").instance_variable_get(:@temperature)
=> 0.7
```
Expected behavior
=> 1
because that is Gemini's default.
What actually happened
0.7
Environment
Any
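One possible direction for a fix, sketched below with hypothetical names (`build_payload` and its keyword argument are assumptions, not RubyLLM's actual internals): default the temperature to nil and omit the parameter from the request body entirely, so each provider's API falls back to the model's own default.

```ruby
# Hypothetical provider-side payload builder; names are illustrative,
# not RubyLLM's real internals.
def build_payload(messages, temperature: nil)
  payload = { contents: messages }
  # Only send temperature when the caller set one explicitly;
  # leaving it out lets the API apply the model's own default.
  payload[:generationConfig] = { temperature: temperature } unless temperature.nil?
  payload
end
```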