Commit 9970f58

fixes for pre-commit issues

Signed-off-by: Tsai, Louie <[email protected]>

1 parent: d0a19d3

File tree: 7 files changed, +18 −14 lines

ChatQnA/README.md

Lines changed: 9 additions & 5 deletions

@@ -90,8 +90,9 @@ cd GenAIExamples/ChatQnA/docker_compose/intel/cpu/xeon/
 # cd GenAIExamples/ChatQnA/docker_compose/nvidia/gpu/
 docker compose up -d
 ```
-To enable Open Telemetry Tracing, compose_telemetry.yaml file need to be merged along with default compose.yaml file.
-CPU example with Open Telemetry feature:
+
+To enable Open Telemetry Tracing, compose_telemetry.yaml file need to be merged along with default compose.yaml file.
+CPU example with Open Telemetry feature:
 
 ```bash
 cd GenAIExamples/ChatQnA/docker_compose/intel/cpu/xeon/

@@ -239,7 +240,8 @@ cd GenAIExamples/ChatQnA/docker_compose/intel/hpu/gaudi/
 docker compose up -d
 ```
 
-To enable Open Telemetry Tracing, compose_telemetry.yaml file need to be merged along with default compose.yaml file.
+To enable Open Telemetry Tracing, compose_telemetry.yaml file need to be merged along with default compose.yaml file.
+
 ```bash
 cd GenAIExamples/ChatQnA/docker_compose/intel/hpu/gaudi/
 docker compose -f compose.yaml -f compose_telemetry.yaml up -d

@@ -255,7 +257,9 @@ Find the corresponding [compose.yaml](./docker_compose/intel/cpu/xeon/compose.ya
 cd GenAIExamples/ChatQnA/docker_compose/intel/cpu/xeon/
 docker compose up -d
 ```
-To enable Open Telemetry Tracing, compose_telemetry.yaml file need to be merged along with default compose.yaml file.
+
+To enable Open Telemetry Tracing, compose_telemetry.yaml file need to be merged along with default compose.yaml file.
+
 ```bash
 cd GenAIExamples/ChatQnA/docker_compose/intel/cpu/xeon/
 docker compose -f compose.yaml -f compose_telemetry.yaml up -d

@@ -364,7 +368,7 @@ OPEA microservice deployment can easily be monitored through Grafana dashboards
 
 ## Tracing Services with OpenTelemetry Tracing and Jaeger
 
-> NOTE: This feature is disabled by default. Please check the Deploy ChatQnA sessions for how to enable this feature with compose_telemetry.yaml file.
+> NOTE: This feature is disabled by default. Please check the Deploy ChatQnA sessions for how to enable this feature with compose_telemetry.yaml file.
 
 OPEA microservice and TGI/TEI serving can easily be traced through Jaeger dashboards in conjunction with OpenTelemetry Tracing feature. Follow the [README](https://github.com/opea-project/GenAIComps/tree/main/comps/cores/telemetry#tracing) to trace additional functions if needed.

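The overlay mechanism these hunks document is Docker Compose's multi-file merge: files passed with `-f` are merged left to right, so a later file only needs to declare the services and keys it adds or overrides. As a rough sketch of what a telemetry overlay of this kind can look like (the service names, image, ports, and environment variable below are illustrative assumptions, not the actual contents of the repo's compose_telemetry.yaml):

```yaml
# Hypothetical telemetry overlay, merged on top of the base file with:
#   docker compose -f compose.yaml -f compose_telemetry.yaml up -d
# Everything below is a sketch; the real file in GenAIExamples may differ.
services:
  jaeger:
    image: jaegertracing/all-in-one:latest
    ports:
      - "16686:16686" # Jaeger UI
      - "4318:4318"   # OTLP/HTTP trace ingest
  chatqna-backend-server: # assumed service name from the base compose.yaml
    environment:
      - TELEMETRY_ENDPOINT=http://jaeger:4318/v1/traces
```

Running with only `-f compose.yaml` never loads the overlay, which is why the NOTE in the last hunk can say the feature is disabled by default.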
ChatQnA/docker_compose/intel/cpu/xeon/README.md

Lines changed: 5 additions & 5 deletions

@@ -44,12 +44,12 @@ To set up environment variables for deploying ChatQnA services, follow these ste
 docker compose up -d
 ```
 
-To enable Open Telemetry Tracing, compose_telemetry.yaml file need to be merged along with default compose.yaml file.
-CPU example with Open Telemetry feature:
+To enable Open Telemetry Tracing, compose.telemetry.yaml file need to be merged along with default compose.yaml file.
+CPU example with Open Telemetry feature:
 
 ```bash
 cd GenAIExamples/ChatQnA/docker_compose/intel/cpu/xeon/
-docker compose -f compose.yaml -f compose_telemetry.yaml up -d
+docker compose -f compose.yaml -f compose.telemetry.yaml up -d
 ```
 
 It will automatically download the docker image on `docker hub`:

@@ -272,15 +272,15 @@ docker compose -f compose.yaml up -d
 # Start ChatQnA without Rerank Pipeline
 docker compose -f compose_without_rerank.yaml up -d
 # Start ChatQnA with Rerank Pipeline and Open Telemetry Tracing
-docker compose -f compose.yaml -f compose_telemetry.yaml up -d
+docker compose -f compose.yaml -f compose.telemetry.yaml up -d
 ```
 
 If use TGI as the LLM serving backend.
 
 ```bash
 docker compose -f compose_tgi.yaml up -d
 # Start ChatQnA with Open Telemetry Tracing
-docker compose -f compose_tgi.yaml -f compose_tgi_telemetry.yaml up -d
+docker compose -f compose_tgi.yaml -f compose_tgi.telemetry.yaml up -d
 ```
 
 ### Validate Microservices

ChatQnA/docker_compose/intel/hpu/gaudi/README.md

Lines changed: 4 additions & 4 deletions

@@ -45,10 +45,10 @@ To set up environment variables for deploying ChatQnA services, follow these ste
 docker compose up -d
 ```
 
-To enable Open Telemetry Tracing, compose_telemetry.yaml file need to be merged along with default compose.yaml file.
+To enable Open Telemetry Tracing, compose.telemetry.yaml file need to be merged along with default compose.yaml file.
 
 ```bash
-docker compose -f compose.yaml -f compose_telemetry.yaml up -d
+docker compose -f compose.yaml -f compose.telemetry.yaml up -d
 ```
 
 It will automatically download the docker image on `docker hub`:

@@ -266,15 +266,15 @@ docker compose -f compose.yaml up -d
 # Start ChatQnA without Rerank Pipeline
 docker compose -f compose_without_rerank.yaml up -d
 # Start ChatQnA with Rerank Pipeline and Open Telemetry Tracing
-docker compose -f compose.yaml -f compose_telemetry.yaml up -d
+docker compose -f compose.yaml -f compose.telemetry.yaml up -d
 ```
 
 If use TGI as the LLM serving backend.
 
 ```bash
 docker compose -f compose_tgi.yaml up -d
 # Start ChatQnA with Open Telemetry Tracing
-docker compose -f compose_tgi.yaml -f compose_tgi_telemetry.yaml up -d
+docker compose -f compose_tgi.yaml -f compose_tgi.telemetry.yaml up -d
 ```
 
 If you want to enable guardrails microservice in the pipeline, please follow the below command instead:

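The xeon and gaudi hunks also rename the overlay references from compose_telemetry.yaml / compose_tgi_telemetry.yaml to compose.telemetry.yaml / compose_tgi.telemetry.yaml. For the TGI path the same merge pattern applies with a separate overlay that presumably wires tracing into the serving container; a sketch under that assumption (the service name here is hypothetical, and while TGI's launcher does document an `--otlp-endpoint` flag, the repo's actual overlay may configure tracing differently):

```yaml
# Hypothetical TGI telemetry overlay, merged with:
#   docker compose -f compose_tgi.yaml -f compose_tgi.telemetry.yaml up -d
# Service name and command are illustrative assumptions.
services:
  tgi-service: # assumed name of the TGI container in compose_tgi.yaml
    command: --model-id ${LLM_MODEL_ID} --otlp-endpoint http://jaeger:4317
```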