
Conversation

isaacbmiller
Collaborator

This will allow users to just write dspy.LM("gpt-5-nano"), but there is also the case where it overrides a user's settings.

I can see an argument against this change, but the direction we seem to be heading in is that API users have less explicit control over temperature.


    if model_pattern:
-       if max_tokens < 16000 or temperature != 1.0:
+       if temperature == 0 and max_tokens == 4000:
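
For context, here is a minimal runnable sketch of the kind of check that diff reflects. The regex, function name, and fallback values are assumptions for illustration, not the exact PR code:

    import re

    # Assumed pattern for OpenAI reasoning-style models (o1/o3/gpt-5 families);
    # the real matcher in dspy may differ.
    REASONING_MODEL_PATTERN = re.compile(r"^(o[13])(-mini)?|^gpt-5")

    def resolve_defaults(model, temperature, max_tokens):
        """If the caller left the old dspy defaults (temperature=0, max_tokens=4000)
        untouched, substitute values that reasoning models accept."""
        if REASONING_MODEL_PATTERN.match(model):
            if temperature == 0 and max_tokens == 4000:
                return 1.0, 16000  # assumed reasoning-model defaults
        return temperature, max_tokens

Under this approach, resolve_defaults("gpt-5-nano", 0, 4000) silently becomes (1.0, 16000), which is exactly the settings-override concern raised above.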
Collaborator


Per our offline discussion, let's proceed with temperature=None and max_tokens=None as the default, which aligns with litellm defaults.
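
A hedged sketch of what the None-default decision looks like in practice; the constructor shape is assumed and simplified, not the actual dspy.LM signature:

    from typing import Any, Optional

    class LM:
        # Simplified stand-in for dspy.LM; the real class takes more parameters.
        def __init__(self, model: str, temperature: Optional[float] = None,
                     max_tokens: Optional[int] = None, **kwargs: Any):
            self.model = model
            self.kwargs = dict(kwargs)
            # Forward only values the user explicitly set; when both stay None,
            # litellm (and ultimately the provider) applies its own defaults.
            if temperature is not None:
                self.kwargs["temperature"] = temperature
            if max_tokens is not None:
                self.kwargs["max_tokens"] = max_tokens

With this, LM("gpt-5-nano") sends no temperature or max_tokens at all, so reasoning models get provider defaults, while LM("gpt-4o-mini", temperature=0.7) still honors an explicit setting.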

isaacbmiller changed the title from "Set default parameters for reasoning model if not passed in." to "fix(LM): Change default temperature and max_tokens to be None" on Oct 6, 2025
chenmoneygithub
Collaborator

LGTM with one comment, thank you for the fix!

isaacbmiller merged commit 15cbbd4 into main on Oct 8, 2025
19 of 20 checks passed
isaacbmiller deleted the isaac/gpt-5-warning branch on October 8, 2025 at 18:50