Commit f261705

docs!: adjust external provider docs
Now that we consolidated the ProviderSpec types and got rid of `AdapterSpec`, adjust external.md and external-providers-guide.md. external-providers-guide.md specifically needed updates around the proper fields required in a ProviderSpec, beyond the changes in this PR.

BREAKING CHANGE: external providers must update their `get_provider_spec` function to use `RemoteProviderSpec` properly.

Signed-off-by: Charlie Doern <[email protected]>
1 parent: ceca3c0 · commit: f261705
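For provider authors hit by the breaking change, the before/after shape of `get_provider_spec` is easiest to see side by side. The sketch below is reconstructed from the hunks in this commit (the `ramalama` values come from the updated guide); it is illustrative, not an excerpt from any single file.

```python
from llama_stack.providers.datatypes import Api, ProviderSpec, RemoteProviderSpec

# Before this change, the same spec was built with the now-removed AdapterSpec:
#
#   return remote_provider_spec(
#       api=Api.inference,
#       adapter=AdapterSpec(
#           adapter_type="ramalama",
#           pip_packages=["ramalama>=0.8.5", "pymilvus"],
#           config_class="ramalama_stack.config.RamalamaImplConfig",
#           module="ramalama_stack",
#       ),
#   )


def get_provider_spec() -> ProviderSpec:
    # After the consolidation, the adapter fields sit directly on RemoteProviderSpec.
    return RemoteProviderSpec(
        api=Api.inference,
        adapter_type="ramalama",
        pip_packages=["ramalama>=0.8.5", "pymilvus"],
        config_class="ramalama_stack.config.RamalamaImplConfig",
        module="ramalama_stack",
    )
```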

2 files changed: +52 −97 lines

docs/docs/concepts/apis/external.mdx

Lines changed: 8 additions & 12 deletions
@@ -152,7 +152,6 @@ __all__ = ["WeatherAPI", "available_providers"]
 from typing import Protocol
 
 from llama_stack.providers.datatypes import (
-    AdapterSpec,
     Api,
     ProviderSpec,
     RemoteProviderSpec,
@@ -166,12 +165,10 @@ def available_providers() -> list[ProviderSpec]:
             api=Api.weather,
             provider_type="remote::kaze",
             config_class="llama_stack_provider_kaze.KazeProviderConfig",
-            adapter=AdapterSpec(
-                adapter_type="kaze",
-                module="llama_stack_provider_kaze",
-                pip_packages=["llama_stack_provider_kaze"],
-                config_class="llama_stack_provider_kaze.KazeProviderConfig",
-            ),
+            adapter_type="kaze",
+            module="llama_stack_provider_kaze",
+            pip_packages=["llama_stack_provider_kaze"],
+            config_class="llama_stack_provider_kaze.KazeProviderConfig",
         ),
     ]
 
@@ -325,11 +322,10 @@ class WeatherKazeAdapter(WeatherProvider):
 
 ```yaml
 # ~/.llama/providers.d/remote/weather/kaze.yaml
-adapter:
-  adapter_type: kaze
-  pip_packages: ["llama_stack_provider_kaze"]
-  config_class: llama_stack_provider_kaze.config.KazeProviderConfig
-  module: llama_stack_provider_kaze
+adapter_type: kaze
+pip_packages: ["llama_stack_provider_kaze"]
+config_class: llama_stack_provider_kaze.config.KazeProviderConfig
+module: llama_stack_provider_kaze
 optional_api_dependencies: []
 ```
 
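Read together, the hunks above leave the kaze example in `external.mdx` looking roughly like the sketch below. The lines outside the hunks (the `return [` / `RemoteProviderSpec(` wrapper) are assumed from the surrounding context, and the `config_class` line, which ends up duplicated in the patched example, is collapsed to a single occurrence here.

```python
from llama_stack.providers.datatypes import (
    Api,
    ProviderSpec,
    RemoteProviderSpec,
)


def available_providers() -> list[ProviderSpec]:
    # Flattened spec: the former AdapterSpec fields now live on RemoteProviderSpec.
    return [
        RemoteProviderSpec(
            api=Api.weather,
            provider_type="remote::kaze",
            adapter_type="kaze",
            module="llama_stack_provider_kaze",
            pip_packages=["llama_stack_provider_kaze"],
            config_class="llama_stack_provider_kaze.KazeProviderConfig",
        ),
    ]
```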
docs/docs/providers/external/external-providers-guide.mdx

Lines changed: 44 additions & 85 deletions
@@ -11,76 +11,52 @@ an example entry in your build.yaml should look like:
 module: ramalama_stack
 ```
 
-Additionally you can configure the `external_providers_dir` in your Llama Stack configuration. This method is in the process of being deprecated in favor of the `module` method. If using this method, the external provider directory should contain your external provider specifications:
-
-```yaml
-external_providers_dir: ~/.llama/providers.d/
-```
-
-## Directory Structure
-
-The external providers directory should follow this structure:
-
-```
-providers.d/
-  remote/
-    inference/
-      custom_ollama.yaml
-      vllm.yaml
-    vector_io/
-      qdrant.yaml
-    safety/
-      llama-guard.yaml
-  inline/
-    inference/
-      custom_ollama.yaml
-      vllm.yaml
-    vector_io/
-      qdrant.yaml
-    safety/
-      llama-guard.yaml
-```
-
-Each YAML file in these directories defines a provider specification for that particular API.
-
 ## Provider Types
 
 Llama Stack supports two types of external providers:
 
 1. **Remote Providers**: Providers that communicate with external services (e.g., cloud APIs)
 2. **Inline Providers**: Providers that run locally within the Llama Stack process
 
+
+### Provider Specification (Common between inline and remote providers)
+
+- `provider_type`: The type of the provider to be installed (remote or inline). eg. `remote::ollama`
+- `api`: The API for this provider, eg. `inference`
+- `config_class`: The full path to the configuration class
+- `module`: The Python module containing the provider implementation
+- `optional_api_dependencies`: List of optional Llama Stack APIs that this provider can use
+- `api_dependencies`: List of Llama Stack APIs that this provider depends on
+- `provider_data_validator`: Optional validator for provider data.
+- `pip_packages`: List of Python packages required by the provider
+
 ### Remote Provider Specification
 
 Remote providers are used when you need to communicate with external services. Here's an example for a custom Ollama provider:
 
 ```yaml
-adapter:
-  adapter_type: custom_ollama
-  pip_packages:
-  - ollama
-  - aiohttp
-  config_class: llama_stack_ollama_provider.config.OllamaImplConfig
-  module: llama_stack_ollama_provider
+adapter_type: custom_ollama
+provider_type: "remote::ollama"
+pip_packages:
+- ollama
+- aiohttp
+config_class: llama_stack_ollama_provider.config.OllamaImplConfig
+module: llama_stack_ollama_provider
 api_dependencies: []
 optional_api_dependencies: []
 ```
 
-#### Adapter Configuration
+#### Remote Provider Configuration
 
-The `adapter` section defines how to load and configure the provider:
-
-- `adapter_type`: A unique identifier for this adapter
-- `pip_packages`: List of Python packages required by the provider
-- `config_class`: The full path to the configuration class
-- `module`: The Python module containing the provider implementation
+- `adapter_type`: A unique identifier for this adapter, eg. `ollama`
 
 ### Inline Provider Specification
 
 Inline providers run locally within the Llama Stack process. Here's an example for a custom vector store provider:
 
 ```yaml
 module: llama_stack_vector_provider
+provider_type: inline::llama_stack_vector_provider
 config_class: llama_stack_vector_provider.config.VectorStoreConfig
 pip_packages:
 - faiss-cpu
@@ -95,12 +71,6 @@ container_image: custom-vector-store:latest # optional
 
 #### Inline Provider Fields
 
-- `module`: The Python module containing the provider implementation
-- `config_class`: The full path to the configuration class
-- `pip_packages`: List of Python packages required by the provider
-- `api_dependencies`: List of Llama Stack APIs that this provider depends on
-- `optional_api_dependencies`: List of optional Llama Stack APIs that this provider can use
-- `provider_data_validator`: Optional validator for provider data
 - `container_image`: Optional container image to use instead of pip packages
 
 ## Required Fields
@@ -113,20 +83,17 @@ All providers must contain a `get_provider_spec` function in their `provider` module
 from llama_stack.providers.datatypes import (
     ProviderSpec,
     Api,
-    AdapterSpec,
-    remote_provider_spec,
+    RemoteProviderSpec,
 )
 
 
 def get_provider_spec() -> ProviderSpec:
-    return remote_provider_spec(
+    return RemoteProviderSpec(
         api=Api.inference,
-        adapter=AdapterSpec(
-            adapter_type="ramalama",
-            pip_packages=["ramalama>=0.8.5", "pymilvus"],
-            config_class="ramalama_stack.config.RamalamaImplConfig",
-            module="ramalama_stack",
-        ),
+        adapter_type="ramalama",
+        pip_packages=["ramalama>=0.8.5", "pymilvus"],
+        config_class="ramalama_stack.config.RamalamaImplConfig",
+        module="ramalama_stack",
     )
 ```
 
@@ -197,18 +164,16 @@ information. Execute the test for the Provider type you are developing.
 If your external provider isn't being loaded:
 
 1. Check that `module` points to a published pip package with a top level `provider` module including `get_provider_spec`.
-1. Check that the `external_providers_dir` path is correct and accessible.
 2. Verify that the YAML files are properly formatted.
 3. Ensure all required Python packages are installed.
 4. Check the Llama Stack server logs for any error messages - turn on debug logging to get more
    information using `LLAMA_STACK_LOGGING=all=debug`.
-5. Verify that the provider package is installed in your Python environment if using `external_providers_dir`.
 
 ## Examples
 
-### Example using `external_providers_dir`: Custom Ollama Provider
+### How to create an external provider module
 
-Here's a complete example of creating and using a custom Ollama provider:
+If you are creating a new external provider called `llama-stack-provider-ollama` here is how you would set up the package properly:
 
 1. First, create the provider package:
 
@@ -230,33 +195,28 @@ requires-python = ">=3.12"
 dependencies = ["llama-stack", "pydantic", "ollama", "aiohttp"]
 ```
 
-3. Create the provider specification:
-
-```yaml
-# ~/.llama/providers.d/remote/inference/custom_ollama.yaml
-adapter:
-  adapter_type: custom_ollama
-  pip_packages: ["ollama", "aiohttp"]
-  config_class: llama_stack_provider_ollama.config.OllamaImplConfig
-  module: llama_stack_provider_ollama
-api_dependencies: []
-optional_api_dependencies: []
-```
-
-4. Install the provider:
+3. Install the provider:
 
 ```bash
 uv pip install -e .
 ```
 
-5. Configure Llama Stack to use external providers:
+4. Edit `provider.py`
 
-```yaml
-external_providers_dir: ~/.llama/providers.d/
-```
+provider.py must be updated to contain `get_provider_spec`. This is used by llama stack to install the provider.
 
-The provider will now be available in Llama Stack with the type `remote::custom_ollama`.
+```python
+def get_provider_spec() -> ProviderSpec:
+    return RemoteProviderSpec(
+        api=Api.inference,
+        adapter_type="llama-stack-provider-ollama",
+        pip_packages=["ollama", "aiohttp"],
+        config_class="llama_stack_provider_ollama.config.OllamaImplConfig",
+        module="llama_stack_provider_ollama",
+    )
+```
 
+5. Implement the provider as outlined above with `get_provider_impl` or `get_adapter_impl`, etc.
 
 ### Example using `module`: ramalama-stack
 
@@ -275,7 +235,6 @@ distribution_spec:
 module: ramalama_stack==0.3.0a0
 image_type: venv
 image_name: null
-external_providers_dir: null
 additional_pip_packages:
 - aiosqlite
 - sqlalchemy[asyncio]
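Putting the guide's steps together, the `provider.py` of the hypothetical `llama-stack-provider-ollama` package ends up as one small module: the import block mirrors the one shown under Required Fields, and the spec is the one from step 4 above. This is a sketch of how the pieces combine, not additional required code.

```python
# provider.py: minimal sketch combining the guide's import block with step 4.
from llama_stack.providers.datatypes import (
    Api,
    ProviderSpec,
    RemoteProviderSpec,
)


def get_provider_spec() -> ProviderSpec:
    # Called by Llama Stack to discover how to install and load this provider.
    return RemoteProviderSpec(
        api=Api.inference,
        adapter_type="llama-stack-provider-ollama",
        pip_packages=["ollama", "aiohttp"],
        config_class="llama_stack_provider_ollama.config.OllamaImplConfig",
        module="llama_stack_provider_ollama",
    )
```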
