A comprehensive collection of ready-to-deploy AI/ML examples for OpenShift AI using GitOps and modern cloud-native technologies. These examples demonstrate best practices for deploying and serving AI models on OpenShift using KServe, vLLM, Kubeflow Pipelines, and other AI acceleration tools.
These examples are designed to work directly with the Red Hat AI Services AI Accelerator repo for configuring an OpenShift AI cluster.
Deploy any example with a single command:
```shell
./bootstrap.sh
```
This interactive script will:
- ✅ Check your OpenShift login status
- 📋 Show available examples
- 🎯 Guide you through deployment options
- 🚀 Deploy the selected example using ArgoCD
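The flow above can be sketched in shell; the function names and logic here are illustrative assumptions, not the actual contents of `bootstrap.sh`:

```shell
#!/usr/bin/env bash
# Minimal sketch of the first two bootstrap steps (function names are
# hypothetical; the real script's internals may differ).

check_login() {
  # Confirm an active OpenShift session before deploying anything.
  if ! oc whoami >/dev/null 2>&1; then
    echo "Not logged in. Run: oc login --server=<cluster-url> --token=<token>" >&2
    return 1
  fi
  echo "Logged in as $(oc whoami)"
}

list_examples() {
  # Every top-level directory under examples/ is a deployable example.
  find examples -mindepth 1 -maxdepth 1 -type d -printf '%f\n' | sort
}
```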
- **vLLM ModelCar Serverless** (`examples/vllm-modelcar-serverless/`) - Deploy the Granite 3.3 2B Instruct model using vLLM in a serverless configuration
- **vLLM ModelCar Multinode** (`examples/vllm-multinode-modelcar/`) - Deploy the Qwen2.5 7B Instruct model using vLLM with multi-node pipeline parallelism
- **Kubeflow Pipelines with Tekton** (`examples/kfp-pipeline-example/`) - End-to-end ML pipeline with data ingestion, training, and deployment
```
ai-accelerator-examples/
├── bootstrap.sh                 # Quick deployment entry point
├── scripts/
│   ├── bootstrap.sh             # Main deployment script
│   ├── functions.sh             # Utility functions
│   └── validate_manifests.sh    # Manifest validation
├── charts/
│   └── argocd-appgenerator/     # Helm chart for ArgoCD ApplicationSet generation
│       ├── Chart.yaml           # Chart metadata
│       ├── values.yaml          # Default configuration
│       ├── templates/           # Kubernetes resource templates
│       └── README.md            # Chart-specific documentation
├── examples/
│   └── <examples>
└── README.md                    # This file
```
Each example lists its own prerequisites, indicating which RHOAI features or other Operators are required to use it.
Most of these prerequisites are fulfilled by the Red Hat AI Services AI Accelerator repo used to configure an OpenShift AI cluster.
- OpenShift cluster admin privileges
- Logged into OpenShift:

  ```shell
  oc login --server=<cluster-url> --token=<token>
  ```
```shell
./bootstrap.sh
```
- Automatically detects your environment
- Prompts for repository URL configuration
- Lists available examples
- Handles deployment complexity
```shell
# Navigate to desired example
cd examples/<example-name>

# Deploy namespace
oc apply -k namespaces/overlays/default

# Deploy application components
oc apply -k <component>/overlays/default
```
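The manual steps can be collected into a small helper; `deploy_example` is hypothetical, shown only to illustrate the apply order (namespace first, then components):

```shell
#!/usr/bin/env bash
# Hypothetical helper wrapping the manual deployment steps: apply the
# namespace overlay first, then each component overlay in order.
deploy_example() {
  local example="$1"; shift
  oc apply -k "examples/${example}/namespaces/overlays/default"
  local component
  for component in "$@"; do
    oc apply -k "examples/${example}/${component}/overlays/default"
  done
}
```

For instance, `deploy_example vllm-modelcar-serverless <component>` would create the namespace and then apply the named component overlay.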
```shell
# Using the ArgoCD app generator Helm chart
helm install ai-examples ./charts/argocd-appgenerator -n openshift-gitops \
  --set git.repoURL=https://github.com/your-org/your-fork.git \
  --set git.revision=main \
  --set 'git.directories[0].path=examples/*/overlays/default'
```

Note the quotes around the last `--set`: they prevent the shell from glob-expanding `*` or interpreting the brackets before Helm sees them.
```shell
# Validate Kustomize manifests and Helm charts
./scripts/validate_manifests.sh

# Install Python dependencies for linting
pip install -r requirements.txt

# Run YAML linting
yamllint .

# Run spell checking
pyspelling
```
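A rough sketch of the kind of check a manifest-validation script performs (the real `scripts/validate_manifests.sh` may differ): render every Kustomize overlay and report any that fail to build.

```shell
#!/usr/bin/env bash
# Sketch of a manifest-validation loop; validate_overlays is an illustrative
# name, not the actual API of validate_manifests.sh. It renders every
# directory containing a kustomization.yaml and reports failures.
validate_overlays() {
  local root="${1:-examples}" status=0 dir
  for dir in $(find "$root" -name kustomization.yaml -exec dirname {} \; | sort -u); do
    if oc kustomize "$dir" >/dev/null 2>&1; then
      echo "OK   $dir"
    else
      echo "FAIL $dir" >&2
      status=1
    fi
  done
  return $status
}
```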
See CONTRIBUTING.md for detailed guidelines on adding new examples, testing, and submitting contributions.
- OpenShift AI Documentation
- KServe Documentation
- vLLM Documentation
- Kubeflow Pipelines Documentation
- ArgoCD Documentation
- Red Hat AI Services on GitHub
- Red Hat AI Services ModelCar Catalog
- Red Hat AI Services Helm Charts
- Red Hat AI Services AI Accelerator
#OpenShift #AI #MachineLearning #KServe #vLLM #Kubeflow #GitOps #ArgoCD #Tekton #ModelServing #MLOps