Pinned
- vllm-project/vllm — A high-throughput and memory-efficient inference and serving engine for LLMs
- oobabooga/text-generation-webui — LLM UI with advanced features, easy setup, and multiple backend support.
- OpenBMB/MiniCPM-o — MiniCPM-V 4.5: A GPT-4o-level MLLM for single-image, multi-image, and video understanding on your phone
- huggingface/llm-vscode — LLM-powered development for VSCode
- llm-vscode-inference-server — An endpoint server for efficiently serving quantized open-source LLMs for code.