
Commit 53e74d1

docs!: adjust external provider docs
now that we consolidated the providerspec types and got rid of `AdapterSpec`, adjust external.md and external-provider-guide.md

external_providers_guide.md specifically needed some updates around the proper fields required in a ProviderSpec beyond the changes in this PR.

BREAKING CHANGE: external providers must update their `get_provider_spec` function to use `RemoteProviderSpec` properly

Signed-off-by: Charlie Doern <[email protected]>
1 parent 8422bd1 commit 53e74d1

2 files changed, +40 -46 lines

docs/source/apis/external.md

Lines changed: 8 additions & 12 deletions
@@ -146,7 +146,6 @@ __all__ = ["WeatherAPI", "available_providers"]
 from typing import Protocol
 
 from llama_stack.providers.datatypes import (
-    AdapterSpec,
     Api,
     ProviderSpec,
     RemoteProviderSpec,
@@ -160,12 +159,10 @@ def available_providers() -> list[ProviderSpec]:
             api=Api.weather,
             provider_type="remote::kaze",
             config_class="llama_stack_provider_kaze.KazeProviderConfig",
-            adapter=AdapterSpec(
-                adapter_type="kaze",
-                module="llama_stack_provider_kaze",
-                pip_packages=["llama_stack_provider_kaze"],
-                config_class="llama_stack_provider_kaze.KazeProviderConfig",
-            ),
+            adapter_type="kaze",
+            module="llama_stack_provider_kaze",
+            pip_packages=["llama_stack_provider_kaze"],
+            config_class="llama_stack_provider_kaze.KazeProviderConfig",
         ),
     ]
@@ -319,11 +316,10 @@ class WeatherKazeAdapter(WeatherProvider):
 
 ```yaml
 # ~/.llama/providers.d/remote/weather/kaze.yaml
-adapter:
-  adapter_type: kaze
-  pip_packages: ["llama_stack_provider_kaze"]
-  config_class: llama_stack_provider_kaze.config.KazeProviderConfig
-  module: llama_stack_provider_kaze
+adapter_type: kaze
+pip_packages: ["llama_stack_provider_kaze"]
+config_class: llama_stack_provider_kaze.config.KazeProviderConfig
+module: llama_stack_provider_kaze
 optional_api_dependencies: []
 ```
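
Putting the hunks above together, here is a minimal sketch of what the updated `available_providers()` from the kaze example might look like after this change. The enclosing `RemoteProviderSpec(...)` call is an assumption based on the updated import list; the field values are taken from the diff.

```python
# Sketch only: the kaze example registry after dropping AdapterSpec.
# Assumes each list entry is a RemoteProviderSpec, per the updated imports above.
from llama_stack.providers.datatypes import (
    Api,
    ProviderSpec,
    RemoteProviderSpec,
)


def available_providers() -> list[ProviderSpec]:
    return [
        RemoteProviderSpec(
            api=Api.weather,  # the external "weather" API from this doc's example
            provider_type="remote::kaze",
            config_class="llama_stack_provider_kaze.KazeProviderConfig",
            adapter_type="kaze",
            module="llama_stack_provider_kaze",
            pip_packages=["llama_stack_provider_kaze"],
        ),
    ]
```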

docs/source/providers/external/external-providers-guide.md

Lines changed: 32 additions & 34 deletions
@@ -50,37 +50,45 @@ Llama Stack supports two types of external providers:
 1. **Remote Providers**: Providers that communicate with external services (e.g., cloud APIs)
 2. **Inline Providers**: Providers that run locally within the Llama Stack process
 
+
+### Provider Specification (Common between inline and remote providers)
+
+- `provider_type`: The type of the provider to be installed (remote or inline). eg. `remote::ollama`
+- `api`: The API for this provider, eg. `inference`
+- `config_class`: The full path to the configuration class
+- `module`: The Python module containing the provider implementation
+- `optional_api_dependencies`: List of optional Llama Stack APIs that this provider can use
+- `api_dependencies`: List of Llama Stack APIs that this provider depends on
+- `provider_data_validator`: Optional validator for provider data.
+- `pip_packages`: List of Python packages required by the provider
+
 ### Remote Provider Specification
 
 Remote providers are used when you need to communicate with external services. Here's an example for a custom Ollama provider:
 
 ```yaml
-adapter:
-  adapter_type: custom_ollama
-  pip_packages:
-    - ollama
-    - aiohttp
-  config_class: llama_stack_ollama_provider.config.OllamaImplConfig
-  module: llama_stack_ollama_provider
+adapter_type: custom_ollama
+provider_type: "remote::ollama"
+pip_packages:
+- ollama
+- aiohttp
+config_class: llama_stack_ollama_provider.config.OllamaImplConfig
+module: llama_stack_ollama_provider
 api_dependencies: []
 optional_api_dependencies: []
 ```
 
-#### Adapter Configuration
+#### Remote Provider Configuration
 
-The `adapter` section defines how to load and configure the provider:
-
-- `adapter_type`: A unique identifier for this adapter
-- `pip_packages`: List of Python packages required by the provider
-- `config_class`: The full path to the configuration class
-- `module`: The Python module containing the provider implementation
+- `adapter_type`: A unique identifier for this adapter, eg. `ollama`
 
 ### Inline Provider Specification
 
 Inline providers run locally within the Llama Stack process. Here's an example for a custom vector store provider:
 
 ```yaml
 module: llama_stack_vector_provider
+provider_type: inline::llama_stack_vector_provider
 config_class: llama_stack_vector_provider.config.VectorStoreConfig
 pip_packages:
   - faiss-cpu
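
As a companion to the YAML above, here is a hedged sketch of the same custom Ollama spec expressed in Python. It assumes the `RemoteProviderSpec` fields mirror the YAML keys one-to-one and that the example targets `Api.inference`; neither detail is stated in this hunk.

```python
# Sketch: the custom Ollama remote spec from the YAML above, expressed in Python.
# Assumes RemoteProviderSpec accepts these fields directly (no nested adapter).
from llama_stack.providers.datatypes import Api, ProviderSpec, RemoteProviderSpec


def get_provider_spec() -> ProviderSpec:
    return RemoteProviderSpec(
        api=Api.inference,  # assumed: the custom Ollama example is an inference provider
        provider_type="remote::ollama",
        adapter_type="custom_ollama",
        pip_packages=["ollama", "aiohttp"],
        config_class="llama_stack_ollama_provider.config.OllamaImplConfig",
        module="llama_stack_ollama_provider",
        api_dependencies=[],
        optional_api_dependencies=[],
    )
```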
@@ -95,12 +103,6 @@ container_image: custom-vector-store:latest # optional
 
 #### Inline Provider Fields
 
-- `module`: The Python module containing the provider implementation
-- `config_class`: The full path to the configuration class
-- `pip_packages`: List of Python packages required by the provider
-- `api_dependencies`: List of Llama Stack APIs that this provider depends on
-- `optional_api_dependencies`: List of optional Llama Stack APIs that this provider can use
-- `provider_data_validator`: Optional validator for provider data
 - `container_image`: Optional container image to use instead of pip packages
 
 ## Required Fields
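
For the inline case, a similar hedged sketch follows, assuming an `InlineProviderSpec` class exists alongside `RemoteProviderSpec` in `llama_stack.providers.datatypes` and that `Api.vector_io` is the right API for the vector store example; both names are assumptions, not taken from this diff.

```python
# Sketch: the inline vector store spec from the YAML above, expressed in Python.
# InlineProviderSpec and Api.vector_io are assumed names, not confirmed by this commit.
from llama_stack.providers.datatypes import Api, InlineProviderSpec, ProviderSpec


def get_provider_spec() -> ProviderSpec:
    return InlineProviderSpec(
        api=Api.vector_io,
        provider_type="inline::llama_stack_vector_provider",
        module="llama_stack_vector_provider",
        config_class="llama_stack_vector_provider.config.VectorStoreConfig",
        pip_packages=["faiss-cpu"],
        api_dependencies=[],
        optional_api_dependencies=[],
    )
```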
@@ -113,20 +115,17 @@ All providers must contain a `get_provider_spec` function in their `provider` module
 from llama_stack.providers.datatypes import (
     ProviderSpec,
     Api,
-    AdapterSpec,
-    remote_provider_spec,
+    RemoteProviderSpec,
 )
 
 
 def get_provider_spec() -> ProviderSpec:
-    return remote_provider_spec(
+    return RemoteProviderSpec(
         api=Api.inference,
-        adapter=AdapterSpec(
-            adapter_type="ramalama",
-            pip_packages=["ramalama>=0.8.5", "pymilvus"],
-            config_class="ramalama_stack.config.RamalamaImplConfig",
-            module="ramalama_stack",
-        ),
+        adapter_type="ramalama",
+        pip_packages=["ramalama>=0.8.5", "pymilvus"],
+        config_class="ramalama_stack.config.RamalamaImplConfig",
+        module="ramalama_stack",
     )
 ```
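
After migrating, a quick local sanity check can confirm the new spec shape. This is a hypothetical snippet: the import path `ramalama_stack.provider` is assumed from the "provider module" convention mentioned above.

```python
# Hypothetical check: import path follows the "provider module" convention above.
from ramalama_stack.provider import get_provider_spec

spec = get_provider_spec()
assert spec.adapter_type == "ramalama"
assert spec.module == "ramalama_stack"
print(spec)
```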

@@ -234,11 +233,10 @@ dependencies = ["llama-stack", "pydantic", "ollama", "aiohttp"]
 
 ```yaml
 # ~/.llama/providers.d/remote/inference/custom_ollama.yaml
-adapter:
-  adapter_type: custom_ollama
-  pip_packages: ["ollama", "aiohttp"]
-  config_class: llama_stack_provider_ollama.config.OllamaImplConfig
-  module: llama_stack_provider_ollama
+adapter_type: custom_ollama
+pip_packages: ["ollama", "aiohttp"]
+config_class: llama_stack_provider_ollama.config.OllamaImplConfig
+module: llama_stack_provider_ollama
 api_dependencies: []
 optional_api_dependencies: []
 ```
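
For providers configured via `providers.d` YAML rather than Python, a small sketch (assuming PyYAML is available) can verify a migrated file no longer carries the nested `adapter:` block:

```python
# Sketch: validate a migrated provider YAML after flattening the adapter block.
from pathlib import Path

import yaml

path = Path("~/.llama/providers.d/remote/inference/custom_ollama.yaml").expanduser()
spec = yaml.safe_load(path.read_text())

assert "adapter" not in spec, "nested adapter block should be removed"
for key in ("adapter_type", "module", "config_class", "pip_packages"):
    assert key in spec, f"missing expected field: {key}"
print("OK:", spec["adapter_type"])
```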
