This repo contains a sample of a Process Framework runtime that uses the Durable Task Scheduler as the underlying workflow engine.
The Process Framework allows you to define a structured set of activities or tasks as an event-driven workflow using code. It currently ships with two runtimes:
- In-memory: This is the default runtime that uses the in-memory scheduler to execute tasks. It is suitable for local development and testing.
- Dapr: A distributed runtime that uses Dapr Actors to coordinate and execute tasks. It is suitable for scenarios where you need to scale out your workflows across multiple nodes on a Dapr-compatible compute platform.
This project explores a third option: running the Process Framework on top of the Durable Task Scheduler, enabling a broader set of potential compute platforms with a smaller footprint than Dapr.
Process Framework workflows (processes) can be serialized into a JSON document. These documents can then be processed by pre-defined Durable Task orchestrations to allow them to execute durably and in a distributed manner. Stateless steps in the process are executed as Durable Task activities, while stateful steps are executed as Durable Task entities.
Workflow state is stored in the Durable Task Scheduler's storage, which is also responsible for scheduling the execution of activities and entities across the different compute nodes in a reliable and coordinated way. The state of the process workflow can also be observed using the Durable Task Scheduler's monitoring dashboard.
The easiest way to run the sample is to use the Durable Task Scheduler's local development emulator. You can do so using the following Docker command:
```shell
docker pull mcr.microsoft.com/dts/dts-emulator:v0.0.6
docker run -it -p 8080:8080 -p 8082:8082 mcr.microsoft.com/dts/dts-emulator:v0.0.6
```
This will start the emulator, which includes both the Durable Task Scheduler backend (on port 8080) and the monitoring dashboard (on port 8082).
Next you'll need to set up an Azure OpenAI model deployment. Once done, set the following environment variables:

- `AZURE_OPENAI_ENDPOINT` - for example, `https://<your-openai-resource-name>.openai.azure.com/`
- `AZURE_OPENAI_DEPLOYMENT_NAME` - for example, `gpt-4o-mini`
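In a local shell, these can be exported directly. The resource and deployment names below are placeholders; substitute your own values:

```shell
# Placeholder values -- substitute your own Azure OpenAI resource and deployment names.
export AZURE_OPENAI_ENDPOINT="https://my-openai-resource.openai.azure.com/"
export AZURE_OPENAI_DEPLOYMENT_NAME="gpt-4o-mini"
```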
Note that authentication to Azure OpenAI is done using local development credentials, such as an Azure CLI (`az login`) or Visual Studio sign-in. This sample does not currently support API key authentication.
From there, you can run the sample console app using the following command:
```shell
cd SampleApp
dotnet run
```
To view the monitoring dashboard, open your browser and navigate to `http://localhost:8082/subscriptions/default/schedulers/default/taskhubs/default/orchestrations`. You should see the list of orchestrations and their status. You can click on an orchestration to view its details, including the input and output of each activity and entity.
This sample can be containerized and/or deployed as a web app to run in an Azure PaaS service. In that case, instead of using the local emulator, you would configure the app to use an Azure-hosted Durable Task Scheduler resource and update the connection string accordingly by setting a `DTS_CONNECTION_STRING` environment variable with a value like `Endpoint=https://{my-dts-resource}.{region}.durabletask.io;TaskHub={my-task-hub};Authentication=ManagedIdentity;`.
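For instance, assuming a hypothetical scheduler resource named `my-dts-resource` in the `westus2` region with a task hub named `my-task-hub`, the variable could be set like this:

```shell
# Hypothetical resource, region, and task hub names -- replace with your own.
export DTS_CONNECTION_STRING="Endpoint=https://my-dts-resource.westus2.durabletask.io;TaskHub=my-task-hub;Authentication=ManagedIdentity;"
```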
More information on how to run durable task scheduler-enabled apps in Azure can be found in the Durable Task Scheduler quickstart documentation.