
Conversation


@daniel-lxs daniel-lxs commented Jun 23, 2025

Description

This PR fixes an issue where the LM Studio provider showed only the currently loaded model instead of all downloaded models. Previously, users had to restart VSCode to see newly downloaded models.

Problem

As reported in the community:

"So, anyone else have Roo not being able to detect the whole list of available models in LM Studio? It used to be able to, but now it only detects whatever I currently have loaded - and that only if I close and reopen VSCode. :/"

Solution

The root cause was that the code used client.llm.listLoaded(), which returns only the currently loaded models. This PR changes it to client.system.listDownloadedModels("llm"), which fetches ALL models downloaded in LM Studio.
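The change, together with its version fallback, can be sketched as follows. This is an illustrative sketch, not the actual lmstudio.ts code: the mock clients and synchronous signatures stand in for the async @lmstudio/sdk client, and the model paths are made up.

```typescript
// Minimal shapes standing in for the SDK types (assumptions, not the real API).
interface ModelInfo {
	path: string
}

interface LmsClient {
	system: { listDownloadedModels(kind: "llm"): ModelInfo[] }
	llm: { listLoaded(): ModelInfo[] }
}

// Prefer the downloads listing (all models); fall back to listLoaded()
// when the method is unavailable on older LM Studio versions.
function fetchLmStudioModels(client: LmsClient): ModelInfo[] {
	try {
		return client.system.listDownloadedModels("llm")
	} catch {
		return client.llm.listLoaded()
	}
}

// Mock of a recent LM Studio: two models downloaded, one loaded.
const modernClient: LmsClient = {
	system: {
		listDownloadedModels: () => [{ path: "org/model-a" }, { path: "org/model-b" }],
	},
	llm: { listLoaded: () => [{ path: "org/model-a" }] },
}

// Mock of an older LM Studio without the downloads API.
const legacyClient: LmsClient = {
	system: {
		listDownloadedModels: (): ModelInfo[] => {
			throw new Error("not supported")
		},
	},
	llm: { listLoaded: () => [{ path: "org/model-a" }] },
}
```

With modernClient the fetch returns both downloaded models; with legacyClient it falls back to the single loaded model instead of failing outright.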

Changes

Backend Changes

  1. Updated model fetching logic in src/api/providers/fetchers/lmstudio.ts:

    • Changed from listLoaded() to listDownloadedModels("llm")
    • Added fallback to listLoaded() for backward compatibility with older LM Studio versions
    • Updated model key to use the model path for downloaded models
  2. Fixed type handling to support both model types:

    • LLMInstanceInfo (from loaded models)
    • LLMInfo (from downloaded models)
    • The context length is read from the correct property for each type
  3. Added comprehensive tests to ensure the new functionality works correctly
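The context-length handling above can be illustrated with a small type narrowing. The simplified shapes and property names below (contextLength on loaded instances, maxContextLength on downloaded models) are assumptions for the sketch, not the verified SDK types:

```typescript
// Simplified stand-ins for the two SDK types (assumed shapes).
type LoadedModel = { identifier: string; contextLength: number } // LLMInstanceInfo
type DownloadedModel = { path: string; maxContextLength: number } // LLMInfo

// Read the context length from whichever property the type carries,
// narrowing with the `in` operator.
function getContextLength(model: LoadedModel | DownloadedModel): number {
	return "contextLength" in model ? model.contextLength : model.maxContextLength
}
```

This keeps a single code path for both the loaded-model and downloaded-model cases instead of branching at every call site.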

UI Improvements

  1. Auto-refresh on mount for both Ollama and LM Studio settings:
    • Models are automatically refreshed when opening the settings page
    • Cached models remain visible while fresh models load
    • No more blank states during refresh
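The "cached models remain visible" behavior reduces, at its core, to a small selection rule. The function name and shape below are illustrative, not the actual component code:

```typescript
// While a refresh is in flight, `fresh` is still undefined; keep showing
// the cached list so the model dropdown never goes blank mid-refresh.
function visibleModels(cached: string[], fresh: string[] | undefined): string[] {
	return fresh ?? cached
}
```

Once the refreshed list arrives it simply replaces the cached one, so the user never sees an empty state between the two.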

Testing

  • ✅ All existing tests pass
  • ✅ Added new tests for downloaded models functionality
  • ✅ Added test for fallback behavior
  • ✅ Verified context length detection works for both model types
  • ✅ Verified UI shows cached models during refresh

Impact

  • Users will now see all downloaded models in LM Studio without needing to restart VSCode
  • Models automatically refresh when opening settings
  • Context length will be properly detected for all models
  • Backward compatibility is maintained for older LM Studio versions
  • Better user experience with no blank loading states

Related Issues

This addresses the model detection issue mentioned in PR #4314's discussion.


Important

Improves LM Studio to display all downloaded models by updating model fetching logic and adding UI auto-refresh.

  • Behavior:
    • Changed model fetching in lmstudio.ts from listLoaded() to listDownloadedModels("llm") to show all downloaded models.
    • Added fallback to listLoaded() for older LM Studio versions.
    • Updated model key to use model path for downloaded models.
  • Type Handling:
    • Supports both LLMInstanceInfo and LLMInfo types.
    • Handles different context length properties.
  • Testing:
    • Added tests for downloaded models and fallback behavior in lmstudio.test.ts.
  • UI Improvements:
    • Auto-refresh models on mount in LMStudio.tsx and Ollama.tsx.
    • Cached models visible during refresh.
  • Misc:
    • Added logging for fallback and error scenarios in webviewMessageHandler.ts.

This description was created by Ellipsis for b5ae4d3.

- Change from listLoaded() to listDownloadedModels() to fetch all available models
- Add fallback to listLoaded() for backward compatibility
- Fix context length detection for both loaded and downloaded model types
- Update tests to cover new functionality

This fixes the issue where only the currently loaded model was visible and users had to restart VSCode to see newly downloaded models.
@hannesrudolph hannesrudolph added the Issue/PR - Triage New issue. Needs quick review to confirm validity and assign labels. label Jun 23, 2025
@daniel-lxs daniel-lxs marked this pull request as ready for review June 23, 2025 18:24
@daniel-lxs daniel-lxs requested review from cte, jr and mrubens as code owners June 23, 2025 18:24
@dosubot dosubot bot added size:L This PR changes 100-499 lines, ignoring generated files. bug Something isn't working labels Jun 23, 2025
@dosubot dosubot bot added size:M This PR changes 30-99 lines, ignoring generated files. and removed size:L This PR changes 100-499 lines, ignoring generated files. labels Jun 23, 2025
@dosubot dosubot bot added the lgtm This PR has been approved by a maintainer label Jun 23, 2025
@daniel-lxs daniel-lxs moved this from Triage to PR [Needs Review] in Roo Code Roadmap Jun 23, 2025
@mrubens mrubens merged commit 041c28d into main Jun 23, 2025
14 of 15 checks passed
@mrubens mrubens deleted the fix/lmstudio-model-detection branch June 23, 2025 18:46
@github-project-automation github-project-automation bot moved this from PR [Needs Review] to Done in Roo Code Roadmap Jun 23, 2025
@github-project-automation github-project-automation bot moved this from New to Done in Roo Code Roadmap Jun 23, 2025
daniel-lxs added a commit that referenced this pull request Jun 26, 2025
- Fix import button getting stuck on 'Importing...' when user cancels file dialog
- Add missing 'importing' translation key to all locales
- Prevent error logging for user cancellation in import flow

Fixes #5047
hannesrudolph pushed a commit that referenced this pull request Jun 29, 2025
- Fix import button getting stuck on 'Importing...' when user cancels file dialog
- Add missing 'importing' translation key to all locales
- Prevent error logging for user cancellation in import flow

Fixes #5047