feat: add dynamic model discovery for xAI provider #8900
base: main
Conversation
Reviewed the latest commit (8c81c55). The changes only update documentation URLs to point to the correct xAI documentation pages. No issues found.
Pull Request Overview
This PR migrates the xAI provider from static model configuration to dynamic model fetching, enabling real-time model discovery and pricing information from the xAI API. This brings xAI in line with other dynamic providers such as LiteLLM and DeepInfra.
- Implements dynamic model fetching from xAI's `/v1/language-models` endpoint (see the fetcher sketch after this list)
- Updates the xAI provider to be a "dynamic provider" with API key-based model discovery
- Adds UI components for refreshing models and handling context window overrides
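For illustration, a fetcher along these lines could back the endpoint integration. The payload shape (a `models` array, the `context_window` field) and the fallback context window are assumptions for this sketch, not the PR's actual `src/api/providers/fetchers/xai.ts`:

```typescript
// Hedged sketch of a dynamic model fetcher for xAI's /v1/language-models endpoint.
// The payload shape, field names, and fallback values are assumptions.
interface XaiLanguageModel {
  id: string
  context_window?: number // assumed field name
}

interface XaiModelEntry {
  contextWindow: number
  description?: string
}

async function getXaiModels(apiKey: string, baseUrl = "https://api.x.ai"): Promise<Record<string, XaiModelEntry>> {
  const url = `${baseUrl}/v1/language-models`
  const response = await fetch(url, {
    headers: { Authorization: `Bearer ${apiKey}` },
  })
  if (!response.ok) {
    // Surface detailed status and URL information, as the PR's error-logging note describes.
    throw new Error(`Failed to fetch xAI models: HTTP ${response.status} ${response.statusText} (${url})`)
  }
  const payload = (await response.json()) as { models?: XaiLanguageModel[] }
  const models: Record<string, XaiModelEntry> = {}
  for (const model of payload.models ?? []) {
    models[model.id] = {
      contextWindow: model.context_window ?? 131_072, // fallback is an assumption
      description: `${model.id} (fetched from the xAI API)`,
    }
  }
  return models
}
```

A settings screen could then call `getXaiModels(apiKey)` when the user triggers a refresh and cache the result per provider.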
Reviewed Changes
Copilot reviewed 14 out of 14 changed files in this pull request and generated 5 comments.
| File | Description |
|---|---|
| src/api/providers/fetchers/xai.ts | New fetcher implementation for xAI models API with pricing and modality parsing |
| packages/types/src/providers/xai.ts | Refactored static registry to store only contextWindow/maxTokens/description, removing pricing fields |
| webview-ui/src/components/settings/providers/XAI.tsx | Added model refresh UI, ModelPicker integration, and context window override |
| src/api/providers/xai.ts | Updated to use context window override and construct complete ModelInfo from partial static data |
| src/shared/api.ts | Added xAI to dynamic provider configuration |
| src/core/webview/webviewMessageHandler.ts | Added xAI model fetching to requestRouterModels handler |
| webview-ui/src/components/ui/hooks/useSelectedModel.ts | Updated to prioritize dynamic models over static for xAI (see the sketch after this table) |
| webview-ui/src/components/settings/ApiOptions.tsx | Integrated xAI into dynamic model flow and hid generic model picker |
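To illustrate the dynamic-over-static priority noted for `useSelectedModel.ts` above, a resolution helper might look like the following; the names, record shape, and default context window are assumptions rather than the hook's actual code:

```typescript
// Illustrative only: prefer dynamically fetched xAI models over the partial
// static registry, and always hand the UI a complete record.
interface XaiModelDetails {
  contextWindow: number
  maxTokens?: number
  description?: string
}

function resolveXaiModel(
  modelId: string,
  dynamicModels: Record<string, XaiModelDetails> | undefined,
  staticModels: Record<string, Partial<XaiModelDetails>>,
): XaiModelDetails {
  const dynamic = dynamicModels?.[modelId]
  if (dynamic) {
    return dynamic // real-time data from /v1/language-models wins
  }
  const fallback = staticModels[modelId] ?? {}
  return {
    contextWindow: fallback.contextWindow ?? 131_072, // default is an assumption
    maxTokens: fallback.maxTokens,
    description: fallback.description,
  }
}
```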
Force-pushed from c0dbf8a to c0a0dcf.
Force-pushed from c0a0dcf to 4b39e2e.
- Implemented dynamic model discovery for the xAI provider using the /v1/language-models endpoint
- Models are now fetched at runtime with real-time pricing and capabilities
- Fixed pricing conversion: the xAI API returns fractional cents (basis points), so divide by 10,000, not 100 (see the sketch after this list)
- This fixes the pricing display showing $20.00 instead of $0.20 per 1M tokens
- Removed static xAI models from MODELS_BY_PROVIDER to rely on dynamic discovery
- Enhanced error logging with detailed status and URL information
- Support dynamic model context window overrides from the API
- Fixed parseApiPrice to handle zero values correctly (for free models)
- Provided complete ModelInfo fallback in useSelectedModel for UI type safety
- Added comprehensive test coverage for cost utilities and the xAI fetcher
- Updated all tests to reflect correct pricing scale and dynamic model behavior
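As a rough sketch of the pricing conversion and zero handling described above, assuming the raw value really is in fractional cents and that dividing by 10,000 yields the USD-per-1M-tokens figure shown in the UI (the helper name is hypothetical; the repository's `parseApiPrice` may differ):

```typescript
// Illustrative conversion from the raw price reported by the xAI API
// (fractional cents / basis points, per the commit message above) to the
// USD-per-1M-tokens value shown in the UI. The helper name is hypothetical.
function xaiPriceToUsd(raw: number | string | null | undefined): number | undefined {
  if (raw === null || raw === undefined || raw === "") {
    return undefined
  }
  const value = typeof raw === "string" ? Number(raw) : raw
  if (Number.isNaN(value)) {
    return undefined
  }
  // Zero is a legitimate price for free models and must not be treated as missing.
  return value / 10_000 // dividing by 100 instead would show $20.00 rather than $0.20
}
```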
…le undefined cacheWritesPrice
- Resolved conflicts in the calculateApiCostOpenAI signature (added reasoningTokens param)
- Updated cerebras.ts to use the new cost calculation with reasoning tokens
- Updated groq.ts to use the new cost calculation with reasoning tokens
- Updated lite-llm.ts to use the new cost calculation with reasoning tokens
- Updated openai-native.ts to use the new cost calculation with reasoning tokens
- All providers now properly handle reasoning token costs
Both calculateApiCostAnthropic and calculateApiCostOpenAI now return a number directly (the total cost) instead of an object with totalCost, totalInputTokens, and totalOutputTokens properties.
- Fixed anthropic.ts to destructure totalCost correctly
- Fixed deepinfra.ts to destructure totalCost correctly
- Updated the cost test file for both the Anthropic and OpenAI functions
The cost calculation functions now return a number directly instead of an object with totalCost, totalInputTokens, and totalOutputTokens properties.
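A hedged sketch of the reshaped helper is below. The source confirms the name `calculateApiCostOpenAI`, the added `reasoningTokens` parameter, and the numeric return value; the remaining parameters, the per-1M-token pricing units, and billing reasoning tokens at the output rate are assumptions of this sketch:

```typescript
// Illustrative cost helper returning a single number (total cost in USD) rather
// than a { totalCost, totalInputTokens, totalOutputTokens } object. Prices are
// treated as USD per 1M tokens; reasoning tokens are billed at the output rate,
// which is an assumption of this sketch.
interface CostModelInfo {
  inputPrice?: number
  outputPrice?: number
  cacheReadsPrice?: number
  cacheWritesPrice?: number
}

function calculateApiCostOpenAI(
  info: CostModelInfo,
  inputTokens: number,
  outputTokens: number,
  cacheWriteTokens = 0,
  cacheReadTokens = 0,
  reasoningTokens = 0,
): number {
  const perToken = (pricePerMillion = 0) => pricePerMillion / 1_000_000
  // Assumed OpenAI-style accounting: reported input tokens include cached tokens.
  const nonCachedInputTokens = Math.max(0, inputTokens - cacheWriteTokens - cacheReadTokens)
  return (
    perToken(info.inputPrice) * nonCachedInputTokens +
    perToken(info.cacheWritesPrice) * cacheWriteTokens +
    perToken(info.cacheReadsPrice) * cacheReadTokens +
    perToken(info.outputPrice) * (outputTokens + reasoningTokens)
  )
}
```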
Force-pushed from 23e76db to 27830f2.
Successfully resolved merge conflicts for PR #8900 "feat(xai): add dynamic model discovery with correct pricing".
Implements dynamic model discovery for xAI using their REST API endpoints instead of hard-coded model IDs.
Changes
Testing
Important
Implements dynamic model discovery for the xAI provider using the xAI API, updates the UI with a model picker and caching, and adds relevant tests.
- Fetches xAI models from the `/v1/language-models` API endpoint.
- Handles `reasoning_effort` for incompatible models (see the sketch after this list).
- Updates the `XAI` component in `XAI.tsx` for xAI provider settings.
- Updates `ApiOptions.tsx` to integrate the xAI model picker and refresh logic.
- Updates `ApiOptions.spec.tsx` to test xAI-specific UI behavior.
- Adds tests in `xai.spec.ts`.
- Removes `xaiModels` from static model lists in `constants.ts` and `provider-settings.ts`.
- Updates `calculateApiCostInternal()` in `cost.ts` to return a number instead of an object.
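For illustration only, omitting `reasoning_effort` for models that do not accept it could look like the sketch below; the capability check and option names are hypothetical, not the PR's actual logic:

```typescript
// Hypothetical guard: only attach reasoning_effort when the selected xAI model
// is assumed to support it. The model check below is illustrative only.
type ReasoningEffort = "low" | "medium" | "high"

function buildXaiCompletionOptions(modelId: string, reasoningEffort?: ReasoningEffort) {
  const supportsReasoningEffort = /grok-3-mini/.test(modelId) // assumption for this sketch
  return {
    model: modelId,
    ...(supportsReasoningEffort && reasoningEffort ? { reasoning_effort: reasoningEffort } : {}),
  }
}
```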