Closed
Labels: bug (Something isn't working)
Initial Checks
- I confirm that I'm using the latest version of Pydantic AI
- I confirm that I searched for my issue in https://github.com/pydantic/pydantic-ai/issues before opening this issue
Description
When using a local Ollama model (e.g., qwen2.5 or qwen3) with an agent that has tools, the agent stops immediately after the model calls a tool and returns no final output.
As reported in https://pydanticlogfire.slack.com/archives/C083V7PMHHA/p1752874628320719?thread_ts=1752791433.109389&cid=C083V7PMHHA
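For context on why "returns nothing" can happen at this point: with the OpenAI-compatible chat API, a tool call arrives as a completion whose `finish_reason` is `"tool_calls"` and whose `content` is empty; the user-visible text only appears in a follow-up completion sent after the client returns the tool result. A minimal sketch of that decision point (the payload below is hand-written for illustration, not captured from Ollama):

```python
# Hand-written example of an OpenAI-compatible chat completion in which the
# model requests a tool call instead of producing text. Not a real capture.
first_response = {
    "choices": [{
        "finish_reason": "tool_calls",
        "message": {
            "role": "assistant",
            "content": None,
            "tool_calls": [{
                "id": "call_1",
                "type": "function",
                "function": {"name": "roll_dice", "arguments": "{}"},
            }],
        },
    }]
}


def next_step(response: dict) -> str:
    """Decide what the client must do after receiving a chat completion."""
    choice = response["choices"][0]
    if choice["finish_reason"] == "tool_calls":
        # No user-visible text yet: the client must execute each requested
        # tool and send the results back in a second completion request.
        return "execute tools and request another completion"
    # Otherwise the assistant message content is the final answer.
    return choice["message"]["content"] or ""


print(next_step(first_response))
```

If the agent ends the run at the first completion instead of issuing the follow-up request, the observed behavior (tool runs, then nothing is streamed) is exactly what you would see.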
Example Code
import asyncio
import os
import random

import logfire
from dotenv import load_dotenv
from pydantic_ai import Agent
from pydantic_ai.models.groq import GroqModel
from pydantic_ai.models.openai import OpenAIModel
from pydantic_ai.providers.openai import OpenAIProvider

# Load environment variables from a .env file if it exists
env_ok = load_dotenv(os.path.join('..', '.env'))
if not env_ok:
    print("Warning: .env file not found. Environment variables may not be set correctly.")

logfire.configure()
logfire.instrument_pydantic_ai()


def roll_dice() -> str:
    """Roll a six-sided die and return the result."""
    return str(random.randint(1, 6))


# --- Agent and Model Configuration ---
ollama_model = OpenAIModel(
    model_name='qwen2.5',
    provider=OpenAIProvider(base_url='http://localhost:11434/v1'),
    settings={"temperature": 0.2},
)
groq_model = GroqModel(
    model_name='qwen/qwen3-32b',
    settings={"temperature": 0.2},
)

default_system_prompt = """
You are a helpful assistant.
Do not ask any follow-up questions.
"""

tools = [roll_dice]
agent = Agent(
    model=ollama_model,
    tools=tools,
    system_prompt=default_system_prompt,
    output_type=str,
)


async def main():
    prompt = 'roll the dice'
    async with agent.run_stream(prompt) as result:
        async for message in result.stream():
            print(message)
        print(result.usage())


if __name__ == '__main__':
    asyncio.run(main())

Python, Pydantic AI & LLM client version

python 3.11.13, pydantic-ai 0.4.2, LLM: ollama 0.9.6, qwen2.5 and qwen3 latest