[BUG] Use model's default temperature if not set #349

@qertoip

Description

Basic checks

  • I searched existing issues - this hasn't been reported
  • I can reproduce this consistently
  • This is a RubyLLM bug, not my application code

What's broken?

RubyLLM sets its own default temperature of 0.7 irrespective of the model.

This is not in line with the behavior users of LLM APIs expect.

When no temperature is given, RubyLLM's default should mirror the model's default.

Temperature variations can radically change model behavior relative to expectations.
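A minimal sketch of one possible fix, in the spirit of "only send what the user explicitly set" (the class and method names here are illustrative assumptions, not RubyLLM's actual internals):

# Illustrative sketch only -- Chat and request_payload are assumed
# names, not RubyLLM's real internals.
class Chat
  def initialize(model:, temperature: nil)
    @model = model
    # Keep @temperature nil unless the caller sets it explicitly.
    @temperature = temperature
  end

  def request_payload
    payload = { model: @model }
    # Omit :temperature when unset so the provider applies the
    # model's own default (e.g. 1.0 for gemini-2.5-pro).
    payload[:temperature] = @temperature unless @temperature.nil?
    payload
  end
end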

How to reproduce

> RubyLLM.chat(model: "gemini-2.5-pro").instance_variable_get(:@temperature)
=> 0.7

Expected behavior

=> 1.0 because that's Gemini's default temperature.

What actually happened

0.7
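As a workaround until the default changes, the model's own default can be set explicitly. Assuming 1.0 is Gemini's documented default, a chat's temperature can be overridden with with_temperature:

chat = RubyLLM.chat(model: "gemini-2.5-pro").with_temperature(1.0)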

Environment

Any
