
Prevent potential AttributeError when accessing reasoning_model #34

@dkqjrm


Description of the bug:

Currently, an AttributeError can occur in the agent's graph logic when configurable.reasoning_model is accessed. The issue arises if the reasoning_model field is not explicitly defined in the Configuration class located at backend/src/agent/configuration.py.
The problematic line is in backend/src/agent/graph.py:
reasoning_model = state.get("reasoning_model") or configurable.reasoning_model
If Configuration is built without reasoning_model being declared or set (e.g., through environment variables or RunnableConfig), configurable.reasoning_model refers to an attribute that does not exist. Accessing it then raises an AttributeError at runtime and crashes the application whenever no value for reasoning_model is supplied through other means.
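A minimal sketch of how the failure shows up, assuming Configuration is a pydantic BaseModel as in backend/src/agent/configuration.py (the query_generator_model field and its default are only illustrative):

```python
from pydantic import BaseModel


class Configuration(BaseModel):
    # reasoning_model is NOT declared here, so instances have no such attribute.
    query_generator_model: str = "gemini-2.0-flash"  # illustrative field only


configurable = Configuration()
state: dict = {}  # state without a "reasoning_model" entry

# Mirrors the lookup in backend/src/agent/graph.py:
reasoning_model = state.get("reasoning_model") or configurable.reasoning_model
# -> AttributeError: 'Configuration' object has no attribute 'reasoning_model'
```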

Actual vs expected behavior:

When reasoning_model is not defined in Configuration and not provided via state or environment variables, accessing configurable.reasoning_model in backend/src/agent/graph.py results in an AttributeError. This causes the application to fail at runtime.

Expected Behavior:
configurable.reasoning_model should always resolve to a valid string, so the agent's reasoning model can be configured reliably without runtime errors even when no explicit value is provided. If no other configuration is given, it should gracefully fall back to a sensible default.
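One possible fix, sketched below, is to declare reasoning_model as a field with a default in Configuration so the attribute always exists; the default model name here is a placeholder, not the project's actual choice:

```python
from pydantic import BaseModel, Field


class Configuration(BaseModel):
    # Declaring the field guarantees configurable.reasoning_model is defined.
    reasoning_model: str = Field(
        default="gemini-2.5-flash",  # placeholder default, pick the project's preferred model
        description="Model used for the agent's reasoning step.",
    )


configurable = Configuration()
state: dict = {}

# The lookup from graph.py now always resolves to a string:
reasoning_model = state.get("reasoning_model") or configurable.reasoning_model
print(reasoning_model)  # -> "gemini-2.5-flash"
```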

Any other information you'd like to share?

No response
