diff --git a/docs/README.md b/docs/README.md index 412ed30cf..4224102f1 100644 --- a/docs/README.md +++ b/docs/README.md @@ -201,7 +201,7 @@ text: ``` -A PDL program computes 2 data structures. The first is a JSON corresponding to the result of the overall program, obtained by aggregating the results of each block. This is what is printed by default when we run the interpreter. The second is a conversational background context, which is a list of role/content pairs, where we implicitly keep track of roles and content for the purpose of communicating with models that support chat APIs. The contents in the latter correspond to the results of each block. The conversational background context is the list of messages used to make calls to LLMs via LiteLLM. +A PDL program computes two data structures. The first is a JSON corresponding to the result of the overall program, obtained by aggregating the results of each block. This is what is printed by default when we run the interpreter. The second is a conversational background context, which is a list of role/content pairs, where we implicitly keep track of roles and content for the purpose of communicating with models that support chat APIs. The contents in the latter correspond to the results of each block. The conversational background context is the list of messages used to make calls to LLMs via LiteLLM. The PDL interpreter can also stream the background conversation instead of the result: diff --git a/docs/tutorial.md b/docs/tutorial.md index 2f0f37d70..2e6f84fc3 100644 --- a/docs/tutorial.md +++ b/docs/tutorial.md @@ -35,42 +35,39 @@ Hello, world! In this program ([file](https://github.com/IBM/prompt-declaration-language//blob/main/examples/tutorial/calling_llm.pdl)), the `text` starts with the word `"Hello\n"`, and we call a model (`ollama/granite3.2:2b`) with this as input prompt. The model is passed a parameter `stop` to indicate the stop sequences. -A PDL program computes 2 data structures. The first is a JSON corresponding to the result of the overall program, obtained by aggregating the results of each block. This is what is printed by default when we run the interpreter. The second is a conversational background context, which is a list of role/content pairs (list of messages), where we implicitly keep track of roles and content for the purpose of communicating with models that support chat APIs. The contents in the latter correspond to the results of each block. The conversational background context is what is used to make calls to LLMs via LiteLLM. +A PDL program computes two data structures. The first is a JSON corresponding to the result of the overall program, obtained by aggregating the results of each block. This is what is printed by default when we run the interpreter. The second is a conversational background context, which is a list of role/content pairs (list of messages), where we implicitly keep track of roles and content for the purpose of communicating with models that support chat APIs. The contents in the latter correspond to the results of each block. The conversational background context is what is used to make calls to LLMs via LiteLLM. -In this example, since the `input` field is not specified in the model call, the entire text up to that point is passed to the model as input context, using the +In this example, the input of the model is `[{"role": "user", "content": "Hello\n"}]` which corresponds to the entire text up to that point using the default role `user`. 
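+
+As an illustration, the same call can also be written with the message passed explicitly through the `input` field, a feature described in more detail below (a minimal sketch in the style of the `input` examples that follow, not one of the tutorial files):
+
+```yaml
+defs:
+  prompt:
+    array:
+    - role: user
+      content: "Hello\n"
+text:
+- "Hello\n"
+- model: ollama_chat/granite3.2:2b
+  input: ${ prompt }
+```
+
+Both forms send the same list of messages to the model; the rest of this section uses the implicit form.
+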
-When we execute this program using the interpreter, we obtain:
+When we execute this program using the interpreter, we obtain the following result, where the second `Hello` has been generated by Granite:
```
Hello
Hello
```
-where the second `Hello` has been generated by Granite.
-
-Here's another example of model call that includes an `input` field ([file](https://github.com/IBM/prompt-declaration-language//blob/main/examples/tutorial/calling_llm_with_input.pdl)):
+The input to the model can also be provided explicitly using the `input` field.
+Here is an example of a model call using this feature ([file](https://github.com/IBM/prompt-declaration-language//blob/main/examples/tutorial/calling_llm_with_input.pdl)):
```yaml
--8<-- "./examples/tutorial/calling_llm_with_input.pdl"
```
-In this case, the input passed to the model is the sentence: `Translate the word 'Hello' to French` and nothing else from the surrounding document. When we execute this program, we obtain:
+In this case, the input passed to the model is `[{"role": "user", "content": "Translate the word 'Hello' to French"}]` and nothing else from the surrounding document. When we execute this program, we obtain the following result, where the second line is generated by the model:
```
Hello
Bonjour (pronounced bon-zhoor) is the translation for "Hello" in French. It's an informal greeting used during the day, similar to how we use "Hi" or "Hello." For a more formal context, you might say "Bonjour," which means "Good day."
```
-where the second line is generated by the model.
-
Using the `input` field, we can also give a directly an array of messages (`role`/`content`) to the model ([file](https://github.com/IBM/prompt-declaration-language//blob/main/examples/tutorial/calling_llm_with_input_messages.pdl)):
```yaml
--8<-- "./examples/tutorial/calling_llm_with_input_messages.pdl"
```
-This has the same output as the previous program. An alternative way of writing this is [this](https://github.com/IBM/prompt-declaration-language//blob/main/examples/tutorial/calling_llm_with_input_messages_var.pdl) program.
+This has a similar output to the previous program. An alternative way of writing this program, using a variable to store the prompt, is [this program](https://github.com/IBM/prompt-declaration-language//blob/main/examples/tutorial/calling_llm_with_input_messages_var.pdl).
### Parameter defaults for watsonx Granite models
@@ -89,25 +86,45 @@ When using Granite models, we use the following defaults for model parameters:
The user can override these defaults by explicitly including them in the model call.
-## Variable Definition and Use
-Any block can define a variable using a `def: <var>` field. This means that the output of that block is assigned to the variable `<var>`, which may be reused at a later point in the document.
+## Building the background context with `lastOf`
-Consider the following example ([file](https://github.com/IBM/prompt-declaration-language//blob/main/examples/tutorial/variable_def_use.pdl)):
+The previous example explicitly provides a list of messages with different roles to the LLM call. This can also be done implicitly using the background context.
+
+Each block can be annotated with a `role` field indicating the role that is used when a message is added to the background context by the block or any of its sub-blocks that do not redefine it.
+In this example, we add a `system` message asking the model to answer formally ([file](https://github.com/IBM/prompt-declaration-language//blob/main/examples/tutorial/role.pdl)):
```yaml
---8<-- "./examples/tutorial/variable_def_use.pdl"
+--8<-- "./examples/tutorial/role.pdl"
```
-Here we assign the output of the model to variable `GEN` using the `def` field. The last line of the program prints out the value of `GEN`. Notice the notation `${ }` for accessing the value of a variable. Any [Jinja](https://jinja.palletsprojects.com/en/3.1.x/) expression is allowed to be used inside these braces. These expressions
-are also used to specify conditions for loops and conditionals. See for example this [file](https://github.com/IBM/prompt-declaration-language//blob/main/examples/tutorial/programs/chatbot.pdl).
+In this program, we explicitly indicate the top-level `user` role, which would otherwise be added automatically by the interpreter.
+This role is inherited by the `"Hello\n"` block and overridden in the next block to define a system prompt.
+So the context provided as input to the LLM is `[{"role": "user", "content": "Hello\n"}, {"role": "system", "content": "You are a polite assistant that likes to answer very formally."}]`. The answer produced by the model block has the `assistant` role.
-When we execute this program, we obtain:
+The execution of this program produces:
```
Hello
+You are a polite assistant that likes to answer very formally.
+Greetings! I trust this message finds you in good health and high spirits. How may I be of assistance today? Please feel free to pose your query or request, knowing that I am here to serve with diligence and precision.
+```
+
+If we want to add the `system` message to the background context without having it present in the result, we can use a `lastOf` block.
+A `lastOf` block is associated with a list of blocks that are executed in sequence.
+Each sub-block contributes messages to the background context, but the result of the block is the result of the last one.
+The following program provides the same input to the LLM, but the system prompt is not part of the result ([file](https://github.com/IBM/prompt-declaration-language//blob/main/examples/tutorial/lastOf.pdl)):
+
+```yaml
+--8<-- "./examples/tutorial/lastOf.pdl"
+```
+
+Therefore, the result of the program is:
+
+```
Hello
-Hello
-GEN is equal to: Hello
+
+Greetings! I trust this message finds you in good health and high spirits. How may I be of assistance today? Please feel free to pose your query or request, knowing that I am here to serve with diligence and precision.
```
## Model Chaining
@@ -118,77 +135,97 @@ In PDL, we can declaratively chain models together as in the following example (
--8<-- "./examples/tutorial/calling_llm_chaining.pdl"
```
-In this program, the first call is to a Granite model with the prompt `"Hello\n"`. The following block in the program prints out the sentence: `"\nDid you just say Hello?\n"`. The final line of the program takes the entire context produced so far and passes it as input to the Granite model. Notice that the input passed to this model is the context up to that point, represented as a conversation. This makes it easy to chain models together and continue building on previous interactions. Notice how the conversational context is accumulated implicitly without requiring the user to explicitly manage messages.
+In this program, the result of the first block is `Hello\n`, and a message with this value is added to the background context.
The second block calls Granite on the background context containing the `Hello` message and adds the model's response to the result and to the context. The following block contributes the sentence: `\nTranslate the above to French\n`. The final line of the program takes the entire context produced so far and passes it as input to the Granite model. Notice that the input passed to this model is the context up to that point, represented as a conversation. This makes it easy to chain models together and continue building on previous interactions. Notice how the conversational context is accumulated implicitly without requiring the user to explicitly manage messages.
When we execute this program, we obtain:
```
Hello
Hello
-Did you just say Hello?
-Yes, I did. That's how I greet people in this conversation. It's a common way to start a dialogue. How can I assist you today?
+Translate the above to French
+Bonjour
```
-## Function Definition
+## Variable Definition and Use
-PDL also supports function definitions to make it easier to reuse code.
-Suppose we want to define a translation function that takes a string and calls a Granite model for the translation. This would be written in PDL as follows ([file](https://github.com/IBM/prompt-declaration-language//blob/main/examples/tutorial/function_definition.pdl)):
+Any block can define a variable using a `def: <var>` field. This means that the result of that block is assigned to the variable `<var>`, which may be reused at a later point in the document.
+
+Consider the following example ([file](https://github.com/IBM/prompt-declaration-language//blob/main/examples/tutorial/variable_def_use.pdl)):
```yaml
---8<-- "./examples/tutorial/function_definition.pdl"
+--8<-- "./examples/tutorial/variable_def_use.pdl"
```
-In this program, the first block defines a function `translate` that takes as parameters `sentence` and `language`, both of which are of type string. The body of the function is defined by its `return` field. In this case, we formulate a translation prompt using the parameters and send it to a Granite model.
+Here we assign the response of the model to the variable `GEN` using the `def` field. The last line of the program prints out the value of `GEN`. Notice the notation `${ }` for accessing the value of a variable. Any [Jinja](https://jinja.palletsprojects.com/en/3.1.x/) expression is allowed to be used inside these braces. These expressions
+are also used to specify conditions for loops and conditionals. See for example this [file](https://github.com/IBM/prompt-declaration-language//blob/main/examples/tutorial/programs/chatbot.pdl).
-The last two blocks are calls to this function, as indicated by `call: ${ translate }`. This block specifies the arguments to be passed. When we execute this program, we obtain:
+When we execute this program, we obtain:
```
-'J'aime Paris !'
-'Me encanta Madrid.'
+Hello
+Hello
+The variable GEN is equal to: Hello
```
-A function only contributes to the result when it is called. So the definition itself results in `""`. When we call a function, we implicitly pass the current background context, and this is used as input to model calls inside the function body. In the above example, since the `input` field is omitted, the entire document produced at that point is passed as input to the Granite model.
-
-To reset the context when calling a function, we can pass the special argument: `pdl_context: []`.
-
-Notice that the arguments of function calls are expressions and cannot be arbitrary PDL blocks.
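+
+As an illustration of using a Jinja expression as a condition, the variable `GEN` defined above could be tested with an `if` block (a minimal sketch, not one of the tutorial files):
+
+```yaml
+- if: ${ GEN == "Hello" }
+  then: "\nThe model answered with a greeting."
+  else: "\nThe model answered with something else."
+```
+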
+## Local Computation Using `defs`
-A function name can be aliased (see [example](https://github.com/IBM/prompt-declaration-language//blob/main/examples/tutorial/function_alias.pdl)).
+In the previous example, the value of the variable `GEN` computed by the `model` block is part of the result and is added to the background context. To define the variable `GEN` without contributing to the result and context, the `model` block can be moved into a `defs` field ([file](https://github.com/IBM/prompt-declaration-language//blob/main/examples/tutorial/local_computation.pdl)):
-The context inherited by a function can be reset at the call site (see [example](https://github.com/IBM/prompt-declaration-language//blob/main/examples/tutorial/function_empty_context.pdl)).
+```yaml
+--8<-- "./examples/tutorial/local_computation.pdl"
+```
-Functions can be declared with optional parameters (see [example](https://github.com/IBM/prompt-declaration-language//blob/main/examples/tutorial/function_optional_params.pdl)).
+The execution of this program produces:
-## Grouping Variable Definitions in Defs
+```
+Hello
+The variable GEN is equal to: Hello
+```
-In PDL, the above program can be written more neatly by grouping certain variable definitions into a `defs` section, as follows ([file](https://github.com/IBM/prompt-declaration-language//blob/main/examples/tutorial/defs.pdl)):
+The `defs` field can be added to any block and can introduce multiple variables.
+The following program defines two variables, `fr` and `es`, associated with a `text` block that uses them ([file](https://github.com/IBM/prompt-declaration-language//blob/main/examples/tutorial/defs.pdl)):
```yaml
--8<-- "./examples/tutorial/defs.pdl"
```
-This program has the same output has the one from the previous section.
+This program first outputs `Hello` and adds it to the context.
+Then, the blocks defining the `fr` and `es` variables are both executed in a context containing only the `Hello` message. These blocks use a `lastOf` that adds the value of each sub-block to the context and returns the value of the last block. Finally, the values of the variables are used in the `text` block.
-Any block can have a `defs` field defining variables used in that block. Notice it's different than the `def` field which stores the
-result of the block after execution.
+The output of this program is:
-For another example, see [file](https://github.com/IBM/prompt-declaration-language//blob/main/examples/tutorial/defs-hello.pdl).
+```
+Hello
-## Muting Block Output with contribute
+In French: Bonjour!
-By default, when a PDL block is executed it produces a result that is contributed to the overall result, and it also contributes to the background context. It is possible to mute both contributions by setting `contribute` to `[]` for any block. This feature allows the computation of intermediate values that are not necessarily output as a result. The value of the variable specified in `def` is still set to the result of the block.
+Translation of "Hello" in French is "Bonjour".
-Consider the similar example as above, but with `contribute` set to `[]` ([file](https://github.com/IBM/prompt-declaration-language//blob/main/examples/tutorial/muting_block_output.pdl)):
+In Spanish: Hola!
+
+La traducción de "Hello" al español es "Hola".
+```
+
+## Control Block Outputs with `contribute`
+
+By default, when a PDL block is executed, it produces a result that is contributed to the overall result, and it also contributes to the background context. We saw that `defs` and `lastOf` give some control over these contributions: `defs` executes a block without contributing to the result or the context and names the result so that it can be used later, while `lastOf` contributes only to the context for all of its sub-blocks except the last one. It is also possible to control the contribution of each block using the `contribute` field.
+
+Consider an example similar to the one above, but that uses `contribute` instead of `defs` and `lastOf` ([file](https://github.com/IBM/prompt-declaration-language//blob/main/examples/tutorial/muting_block_output.pdl)):
```yaml
--8<-- "./examples/tutorial/muting_block_output.pdl"
```
-The call to the translator with French as language does not produce an output. However, we save the result in variable `FRENCH` and use it in the last sentence of the document. When we execute this program, we obtain:
+Instead of a `lastOf`, we set `contribute` to `[context]` for the block that produces `"\nTranslate to French\n"`. That way, it only contributes to the context and not to the result.
+We set `contribute` to `[]` for the call to the LLM so that it does not produce an output but only saves the result in the `fr` variable, which is used in the last block of the program. When we execute this program, we obtain:
```
-The french sentence was: 'J'aime Paris !'
-```
+Hello
+
+In French: Bonjour!
+
+Translation of "Hello" in French is "Bonjour".
+```
In general, `contribute` can be used to set how the result of the block contribute to the final result and the background context. Here are its possible values:
@@ -201,7 +238,43 @@ Here are its possible values:
- `[result, context]`: contribute to both, which is also the default setting.
-## Specifying Data
+
+
+## Function Definition
+
+PDL supports function definitions to make it easier to reuse code.
+Suppose we want to define a translation function that takes a string and calls a Granite model for the translation. This would be written in PDL as follows ([file](https://github.com/IBM/prompt-declaration-language//blob/main/examples/tutorial/function_definition.pdl)):
+
+```yaml
+--8<-- "./examples/tutorial/function_definition.pdl"
+```
+
+In this program, the `defs` field defines a function `translate` that takes as parameters `sentence` and `language`, both of which are of type string. The body of the function is defined by its `return` field. In this case, we formulate a translation prompt using the parameters and send it to a Granite model.
+
+The body of the program is a `text` block that calls this function twice, as indicated by `call: ${ translate }`. The `call` block specifies the arguments to be passed. When we execute this program, we obtain:
+
+```
+J'aime Paris !
+Me encanta Madrid.
+```
+
+When we call a function, we implicitly pass the current background context, and this is used as input to model calls inside the function body. In the above example, since the `input` field is omitted, the entire document produced at that point is passed as input to the Granite model.
+
+To reset the context when calling a function, we can pass the special argument: `pdl_context: []` (see [example](https://github.com/IBM/prompt-declaration-language//blob/main/examples/tutorial/function_empty_context.pdl)).
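+
+For instance, the calls above can be made independent of the surrounding document as follows (a short sketch taken from the linked example):
+
+```yaml
+- call: ${ translate }
+  args:
+    sentence: I love Paris!
+    language: French
+    pdl_context: []
+```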
+
+Functions can be declared with optional parameters (see [example](https://github.com/IBM/prompt-declaration-language//blob/main/examples/tutorial/function_optional_params.pdl)).
+
+PDL is a language with higher-order functions, meaning that functions are values. So, for example, a function can be aliased (see [example](https://github.com/IBM/prompt-declaration-language//blob/main/examples/tutorial/function_alias.pdl)).
+
+PDL functions can also be called from Jinja expressions, as in the following example ([file](https://github.com/IBM/prompt-declaration-language/blob/main/examples/tutorial/function_call_in_jinja.pdl)):
+
+```yaml
+--8<-- "./examples/tutorial/function_call_in_jinja.pdl"
+```
+
+Notice that arguments can be positional or named.
+
+
+## Building Data Structures
In PDL, the user specifies step by step the shape of data they wish to generate. A `text` block takes a list of blocks, stringifies the result of each block, and concatenates them.
@@ -975,10 +1048,24 @@ What is the color of the sky?
## Python SDK
-See examples of PDL being called programmatically in Python
-[here](https://github.com/IBM/prompt-declaration-language/blob/main/examples/sdk).
+PDL programs can also be defined and called programmatically from Python.
+In the following example, the PDL program is defined as a string and then parsed and executed using the [`exec_str`](https://ibm.github.io/prompt-declaration-language/api_reference/#src.pdl.pdl.exec_str) function ([file](https://github.com/IBM/prompt-declaration-language/blob/main/examples/tutorial/sdk/hello_str.py)).
+
+```python
+--8<-- "./examples/tutorial/sdk/hello_str.py"
+```
+
+The SDK also provides functions to execute programs defined in a file (see [example](https://github.com/IBM/prompt-declaration-language//blob/main/examples/tutorial/sdk/hello_file.py); a short sketch is also given at the end of this section), as a Python dictionary (see [example](https://github.com/IBM/prompt-declaration-language//blob/main/examples/tutorial/sdk/hello_dict.py)), or as a PDL abstract syntax tree defined by a [Pydantic](https://docs.pydantic.dev) data structure (see [example](https://github.com/IBM/prompt-declaration-language//blob/main/examples/tutorial/sdk/hello_prog.py)). The documentation of the API is available [here](https://ibm.github.io/prompt-declaration-language/api_reference/).
+
+
+A way to handle the processing of large datasets with PDL is to use Python's multiprocessing capabilities to launch multiple instances of the PDL interpreter. In the example below, we use Python's `concurrent.futures.ProcessPoolExecutor` to execute in parallel multiple instances of the PDL program `HELLO`, where the free variable `name` is instantiated with a different value for each instance ([file](https://github.com/IBM/prompt-declaration-language/blob/main/examples/tutorial/sdk/hello_parallel.py)).
+
+```python
+--8<-- "./examples/tutorial/sdk/hello_parallel.py"
+```
-For a more sophisticated example, see [here](https://github.com/IBM/prompt-declaration-language/blob/main/examples/callback).
+Finally, it is possible to interleave the use of Python and PDL.
+You can find an example [here](https://github.com/IBM/prompt-declaration-language/blob/main/examples/callback) of a Python application that uses a function defined in PDL, which itself depends on the Python application.
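+
+For instance, the file-based entry point mentioned above can be used in a few lines (a minimal sketch; it assumes the `exec_file` function used in the linked `hello_file.py` example):
+
+```python
+from pdl.pdl import exec_file
+
+# Parse and execute a PDL program stored on disk, then print its result.
+result = exec_file("examples/tutorial/sdk/hello.pdl")
+print(result)
+```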
## Debugging PDL Programs
@@ -1034,9 +1121,9 @@ export OPENAI_ORGANIZATION=ollama # not required
pdl <...>
```
-## Strings In Yaml
+## Strings in Yaml
-Multiline strings are commonly used when writing PDL programs. There are two types of formats that YAML supports for strings: block scalar and flow scalar formats. Scalars are what YAML calls basic values like numbers or strings, as opposed to complex types like arrays or objects. Block scalars have more control over how they are interpreted, whereas flow scalars have more limited escaping support. (Explanation here is thanks to [Wolfgang Faust](https://yaml-multiline.info/))
+Multiline strings are commonly used when writing PDL programs. There are two types of formats that YAML supports for strings: block scalar and flow scalar formats. Scalars are what YAML calls basic values like numbers or strings, as opposed to complex types like arrays or objects. Block scalars have more control over how they are interpreted, whereas flow scalars have more limited escaping support. (The explanations in this section are based on [yaml-multiline.info](https://yaml-multiline.info/) by Wolfgang Faust.)
### Block Scalars
diff --git a/examples/sdk/hello_dict.py b/examples/sdk/hello_dict.py
deleted file mode 100644
index 4313898ce..000000000
--- a/examples/sdk/hello_dict.py
+++ /dev/null
@@ -1,22 +0,0 @@
-from pdl.pdl import exec_dict
-
-hello = {
-    "text": [
-        "Hello\n",
-        {
-            "model": "ollama_chat/granite3.2:2b",
-            "parameters": {
-                "stop": ["!"],
-            },
-        },
-    ]
-}
-
-
-def main():
-    result = exec_dict(hello)
-    print(result)
-
-
-if __name__ == "__main__":
-    main()
diff --git a/examples/sdk/hello_prog.py b/examples/sdk/hello_prog.py
deleted file mode 100644
index de0c8b8e8..000000000
--- a/examples/sdk/hello_prog.py
+++ /dev/null
@@ -1,23 +0,0 @@
-from pdl.pdl import exec_program
-from pdl.pdl_ast import LitellmModelBlock, LitellmParameters, Program, TextBlock
-
-hello = Program(
-    TextBlock(
-        text=[
-            "Hello\n",
-            LitellmModelBlock(
-                model="ollama_chat/granite3.2:2b",
-                parameters=LitellmParameters(stop=["!"]),  # pyright: ignore
-            ),
-        ]
-    )
-)
-
-
-def main():
-    result = exec_program(hello)
-    print(result)
-
-
-if __name__ == "__main__":
-    main()
diff --git a/examples/tutorial/calling_llm.pdl b/examples/tutorial/calling_llm.pdl
index 00d09ccdd..49209043b 100644
--- a/examples/tutorial/calling_llm.pdl
+++ b/examples/tutorial/calling_llm.pdl
@@ -1,6 +1,6 @@
-description: Hello world calling a model
+description: Calling a model on the implicit background context
text:
- "Hello\n"
- model: ollama_chat/granite3.2:2b
  parameters:
-    stop: ['!']
\ No newline at end of file
+    stop: ['!']
diff --git a/examples/tutorial/calling_llm_with_input.pdl b/examples/tutorial/calling_llm_with_input.pdl
index f16648849..33064cf68 100644
--- a/examples/tutorial/calling_llm_with_input.pdl
+++ b/examples/tutorial/calling_llm_with_input.pdl
@@ -1,4 +1,4 @@
-description: Hello world calling a model
+description: Calling a model with an input text
text:
- "Hello\n"
- model: ollama_chat/granite3.2:2b
diff --git a/examples/tutorial/calling_llm_with_input_messages.pdl b/examples/tutorial/calling_llm_with_input_messages.pdl
index c994d7788..146a9e28f 100644
--- a/examples/tutorial/calling_llm_with_input_messages.pdl
+++ b/examples/tutorial/calling_llm_with_input_messages.pdl
@@ -1,4 +1,4 @@
-description: Hello world calling a model
+description: Calling a model with an explicit list of messages
text:
- "Hello\n"
- model: ollama_chat/granite3.2:2b
diff --git
a/examples/tutorial/calling_llm_with_input_messages_var.pdl b/examples/tutorial/calling_llm_with_input_messages_var.pdl
index 831467442..774950391 100644
--- a/examples/tutorial/calling_llm_with_input_messages_var.pdl
+++ b/examples/tutorial/calling_llm_with_input_messages_var.pdl
@@ -3,8 +3,10 @@ defs:
  prompt:
    array:
    - role: system
-      content: You are a helpful software engineer. You write clear, concise, well-commented code.
+      content: You are a helpful assistant that is fluent in French.
    - role: user
-      content: Write a Python function that implement merge sort.
-model: ollama_chat/granite3.2:2b
-input: ${ prompt }
+      content: Translate the word 'Hello' to French
+text:
+- "Hello\n"
+- model: ollama_chat/granite3.2:2b
+  input: ${ prompt }
diff --git a/examples/tutorial/defs-hello.pdl b/examples/tutorial/defs-hello.pdl
deleted file mode 100644
index 477d58822..000000000
--- a/examples/tutorial/defs-hello.pdl
+++ /dev/null
@@ -1,14 +0,0 @@
-description: Hello world with defs
-defs:
-  hello:
-    function:
-      name: string
-    return: Hello ${ name }!
-  bye:
-    "Good bye"
-text:
-- call: ${ hello }
-  args:
-    name: World
-- "\n"
-- ${ bye }
diff --git a/examples/tutorial/defs.pdl b/examples/tutorial/defs.pdl
index e2e49afd0..2bed2a974 100644
--- a/examples/tutorial/defs.pdl
+++ b/examples/tutorial/defs.pdl
@@ -1,22 +1,16 @@
-description: Function def and call
-defs:
-  translate:
-    function:
-      sentence: string
-      language: string
-    return:
+text:
+- "Hello\n"
+- defs:
+    fr:
      lastOf:
-      - "\nTranslate the sentence '${ sentence }' to ${ language }.\n"
+      - "\nTranslate to French\n"
      - model: ollama_chat/granite3.2:2b
-        parameters:
-          stop: ["\n"]
-text:
-- call: ${ translate }
-  args:
-    sentence: I love Paris!
-    language: French
-- "\n"
-- call: ${ translate }
-  args:
-    sentence: I love Madrid!
-    language: Spanish
\ No newline at end of file
+    es:
+      lastOf:
+      - "\nTranslate to Spanish\n"
+      - model: ollama_chat/granite3.2:2b
+  text: |
+
+    In French: ${ fr }
+
+    In Spanish: ${ es }
diff --git a/examples/tutorial/function_alias.pdl b/examples/tutorial/function_alias.pdl
index 23bf7c115..62883c787 100644
--- a/examples/tutorial/function_alias.pdl
+++ b/examples/tutorial/function_alias.pdl
@@ -1,4 +1,4 @@
-description: Hello function
+description: Use a function as a value
defs:
  hello:
    function:
diff --git a/examples/tutorial/function_call_in_jinja.pdl b/examples/tutorial/function_call_in_jinja.pdl
new file mode 100644
index 000000000..32a74aee8
--- /dev/null
+++ b/examples/tutorial/function_call_in_jinja.pdl
@@ -0,0 +1,14 @@
+description: Calling a PDL function from Jinja
+defs:
+  translate:
+    function:
+      sentence: string
+      language: string
+    return:
+      lastOf:
+      - |
+        Translate the sentence '${ sentence }' to ${ language }.
+        Only give the result of the translation.
+      - model: ollama_chat/granite3.2:2b
+text: |
+  The way to say hello in French is ${ translate("Hello", language="French") }.
diff --git a/examples/tutorial/function_definition.pdl b/examples/tutorial/function_definition.pdl
index dac318dc9..7124eb0c0 100644
--- a/examples/tutorial/function_definition.pdl
+++ b/examples/tutorial/function_definition.pdl
@@ -1,16 +1,16 @@
-description: Function def and call
+description: Function definition and call
+defs:
+  translate:
+    function:
+      sentence: string
+      language: string
+    return:
+      lastOf:
+      - |
+        Translate the sentence '${ sentence }' to ${ language }.
+        Only give the result of the translation.
+ - model: ollama_chat/granite3.2:2b text: -- def: translate - function: - sentence: string - language: string - return: - lastOf: - - "\nTranslate the sentence '${ sentence }' to ${ language }.\n" - - model: ollama_chat/granite3.2:2b - parameters: - stop: ["\n"] - temperature: 0 - call: ${ translate } args: sentence: I love Paris! diff --git a/examples/tutorial/function_empty_context.pdl b/examples/tutorial/function_empty_context.pdl index 54e3c6858..2f2d5dbf2 100644 --- a/examples/tutorial/function_empty_context.pdl +++ b/examples/tutorial/function_empty_context.pdl @@ -1,13 +1,24 @@ -description: Hello world with function definition and call +description: Function call with an empty context +defs: + translate: + function: + sentence: string + language: string + return: + lastOf: + - | + Translate the sentence '${ sentence }' to ${ language }. + Only give the result of the translation. + - model: ollama_chat/granite3.2:2b text: -- def: hello - function: - name: string - return: - text: - - Hello ${ name }! - - model: ollama_chat/granite3.2:8b -- call: ${ hello } +- call: ${ translate } args: - name: World + sentence: I love Paris! + language: French + pdl_context: [] +- "\n" +- call: ${ translate } + args: + sentence: I love Madrid! + language: Spanish pdl_context: [] diff --git a/examples/tutorial/function_optional_params.pdl b/examples/tutorial/function_optional_params.pdl index cde3af8ac..880c96632 100644 --- a/examples/tutorial/function_optional_params.pdl +++ b/examples/tutorial/function_optional_params.pdl @@ -1,14 +1,19 @@ -description: Hello world with function definition and call +description: Function with optional parameter +defs: + hello: + function: + name: string + lastName: {optional: string} # optional parameter + return: + if: ${ lastName is defined } + then: Hello ${ name } ${ lastName }! + else: Hello ${ name }! text: -- def: hello - function: - name: string - lastName: {optional: string} # optional parameter - return: - if: ${ lastName is defined } - then: Hello ${ name } ${ lastName }! - else: Hello ${ name }! - call: ${ hello } args: name: World - lastName: Universe +- "\n" +- call: ${ hello } + args: + name: Earth + lastName: Planet diff --git a/examples/tutorial/lastOf.pdl b/examples/tutorial/lastOf.pdl new file mode 100644 index 000000000..9eacbd655 --- /dev/null +++ b/examples/tutorial/lastOf.pdl @@ -0,0 +1,8 @@ +description: Explicit use of role +role: user +text: +- "Hello\n" +- lastOf: + - role: system + text: "You are a polite assistant that likes to answer very formally." 
+  - model: ollama_chat/granite3.2:2b
diff --git a/examples/tutorial/local_computation.pdl b/examples/tutorial/local_computation.pdl
new file mode 100644
index 000000000..2a22df33a
--- /dev/null
+++ b/examples/tutorial/local_computation.pdl
@@ -0,0 +1,9 @@
+description: Local computations using defs
+text:
+- "Hello\n"
+- defs:
+    GEN:
+      model: ollama_chat/granite3.2:2b
+      parameters:
+        stop: ['!']
+- "The variable GEN is equal to: ${ GEN }"
\ No newline at end of file
diff --git a/examples/tutorial/muting_block_output.pdl b/examples/tutorial/muting_block_output.pdl
index c75b416a0..987617eb1 100644
--- a/examples/tutorial/muting_block_output.pdl
+++ b/examples/tutorial/muting_block_output.pdl
@@ -1,21 +1,11 @@
-description: Function def and call
-defs:
-  translate:
-    function:
-      sentence: string
-      language: string
-    return:
-      text:
-      - text: "\nTranslate the sentence '${ sentence }' to ${ language }.\n"
-        contribute: [context]
-      - model: ollama_chat/granite3.2:2b
-        parameters:
-          stop: ["\n"]
+description: Control block outputs with `contribute`
text:
-- call: ${ translate }
+- "Hello\n"
+- text: "\nTranslate to French\n"
+  contribute: [context]
+- model: ollama_chat/granite3.2:2b
  contribute: []
-  def: FRENCH
-  args:
-    sentence: I love Paris!
-    language: French
-- "The french sentence was: ${ FRENCH }"
\ No newline at end of file
+  def: fr
+- |
+
+  In French: ${ fr }
diff --git a/examples/tutorial/role.pdl b/examples/tutorial/role.pdl
new file mode 100644
index 000000000..81907fd47
--- /dev/null
+++ b/examples/tutorial/role.pdl
@@ -0,0 +1,7 @@
+description: Explicit use of role
+role: user
+text:
+- "Hello\n"
+- role: system
+  text: "You are a polite assistant that likes to answer very formally."
+- model: ollama_chat/granite3.2:2b
diff --git a/examples/sdk/hello.pdl b/examples/tutorial/sdk/hello.pdl
similarity index 100%
rename from examples/sdk/hello.pdl
rename to examples/tutorial/sdk/hello.pdl
diff --git a/examples/tutorial/sdk/hello_dict.py b/examples/tutorial/sdk/hello_dict.py
new file mode 100644
index 000000000..06a8736dd
--- /dev/null
+++ b/examples/tutorial/sdk/hello_dict.py
@@ -0,0 +1,21 @@
+from pdl.pdl import exec_dict
+
+
+def main():
+    hello = {
+        "text": [
+            "Hello\n",
+            {
+                "model": "ollama_chat/granite3.2:2b",
+                "parameters": {
+                    "stop": ["!"],
+                },
+            },
+        ]
+    }
+    result = exec_dict(hello)
+    print(result)
+
+
+if __name__ == "__main__":
+    main()
diff --git a/examples/sdk/hello_file.py b/examples/tutorial/sdk/hello_file.py
similarity index 100%
rename from examples/sdk/hello_file.py
rename to examples/tutorial/sdk/hello_file.py
diff --git a/examples/tutorial/sdk/hello_parallel.py b/examples/tutorial/sdk/hello_parallel.py
new file mode 100644
index 000000000..6dd7b334c
--- /dev/null
+++ b/examples/tutorial/sdk/hello_parallel.py
@@ -0,0 +1,37 @@
+import concurrent.futures
+
+from pdl.pdl import exec_str
+
+HELLO = """
+text:
+- >+
+  Hello, my name is ${name}
+- model: ollama_chat/granite3.2:2b
+"""
+
+
+def _run_agent(name):
+    pdl_output = exec_str(
+        HELLO,
+        scope={"name": name},
+        config={
+            "yield_result": False,
+            "yield_background": False,
+            "batch": 1,  # disable streaming
+        },
+    )
+    return pdl_output
+
+
+if __name__ == "__main__":
+    data = ["Alice", "Nicolas", "Rosa", "Remi"]
+    with concurrent.futures.ProcessPoolExecutor() as executor:
+        # Submit one PDL interpreter run per name; as_completed collects the results below.
+        futures = {executor.submit(_run_agent, name) for name in data}
+        for future in concurrent.futures.as_completed(futures):
+            try:
+                result = future.result()
+            except Exception as e:
+                print(f"Task raised an 
exception: {e}") + else: + print(result) diff --git a/examples/tutorial/sdk/hello_prog.py b/examples/tutorial/sdk/hello_prog.py new file mode 100644 index 000000000..f6baa420c --- /dev/null +++ b/examples/tutorial/sdk/hello_prog.py @@ -0,0 +1,22 @@ +from pdl.pdl import exec_program +from pdl.pdl_ast import LitellmModelBlock, LitellmParameters, Program, TextBlock + + +def main(): + hello = Program( + TextBlock( + text=[ + "Hello\n", + LitellmModelBlock( + model="ollama_chat/granite3.2:2b", + parameters=LitellmParameters(stop=["!"]), + ), + ] + ) + ) + result = exec_program(hello) + print(result) + + +if __name__ == "__main__": + main() diff --git a/examples/sdk/hello_str.py b/examples/tutorial/sdk/hello_str.py similarity index 94% rename from examples/sdk/hello_str.py rename to examples/tutorial/sdk/hello_str.py index 8a6b52c90..40f3810ea 100644 --- a/examples/sdk/hello_str.py +++ b/examples/tutorial/sdk/hello_str.py @@ -2,7 +2,8 @@ HELLO = """ text: -- "Hello\n" +- >+ + Hello - model: ollama_chat/granite3.2:2b parameters: stop: ['!'] diff --git a/examples/tutorial/variable_def_use.pdl b/examples/tutorial/variable_def_use.pdl index fea5ac9e9..1a0a53f98 100644 --- a/examples/tutorial/variable_def_use.pdl +++ b/examples/tutorial/variable_def_use.pdl @@ -1,8 +1,8 @@ -description: Hello world with variable def and use +description: Variable def and use text: - "Hello\n" - model: ollama_chat/granite3.2:2b - def: GEN parameters: stop: ['!'] -- "\nGEN is equal to: ${ GEN }" \ No newline at end of file + def: GEN +- "\nThe variable GEN is equal to: ${ GEN }" \ No newline at end of file diff --git a/tests/results/examples/tutorial/calling_llm_chaining.0.result b/tests/results/examples/tutorial/calling_llm_chaining.0.result index 241d9a74f..ac37482be 100644 --- a/tests/results/examples/tutorial/calling_llm_chaining.0.result +++ b/tests/results/examples/tutorial/calling_llm_chaining.0.result @@ -1,4 +1,4 @@ Hello Hello -Did you just say Hello? -Yes, I did. It's a common greeting, similar to how humans might respond when they first interact with an artificial intelligence like me. How can I assist you today? 
\ No newline at end of file +Translate the above to French +Bonjour \ No newline at end of file diff --git a/tests/results/examples/tutorial/calling_llm_chaining.1.result b/tests/results/examples/tutorial/calling_llm_chaining.1.result deleted file mode 100644 index ac37482be..000000000 --- a/tests/results/examples/tutorial/calling_llm_chaining.1.result +++ /dev/null @@ -1,4 +0,0 @@ -Hello -Hello -Translate the above to French -Bonjour \ No newline at end of file diff --git a/tests/results/examples/tutorial/calling_llm_with_input_messages_var.0.result b/tests/results/examples/tutorial/calling_llm_with_input_messages_var.0.result index d2307e4f4..218c1c505 100644 --- a/tests/results/examples/tutorial/calling_llm_with_input_messages_var.0.result +++ b/tests/results/examples/tutorial/calling_llm_with_input_messages_var.0.result @@ -1,47 +1,2 @@ -Here is a Python implementation of the Merge Sort algorithm: - -```python -def merge_sort(arr): - # Base case: if array has 1 or no elements, it's already sorted - if len(arr) <= 1: - return arr - - # Divide the array into two halves - mid = len(arr) // 2 - left_half = arr[:mid] - right_half = arr[mid:] - - # Recursively sort both halves - left_sorted = merge_sort(left_half) - right_sorted = merge_sort(right_half) - - # Merge the sorted halves back together - return merge(left_sorted, right_sorted) - -def merge(left, right): - """ - Merge two sorted lists into a single sorted list. - """ - merged = [] # Initialize an empty list for the result - left_index = 0 # Index for the left list - right_index = 0 # Index for the right list - - # Continue until we've exhausted both lists - while left_index < len(left) and right_index < len(right): - if left[left_index] <= right[right_index]: - merged.append(left[left_index]) - left_index += 1 - else: - merged.append(right[right_index]) - right_index += 1 - - # If there are any remaining elements in either list, append them to the result - merged.extend(left[left_index:]) - merged.extend(right[right_index:]) - - return merged -``` - -This code first checks if the array is already sorted (i.e., has one or zero elements). If so, it returns the array as is. Otherwise, it divides the array into two halves and recursively sorts each half. The `merge` function then combines these sorted halves back together to produce a single sorted list. - -The time complexity of Merge Sort is O(n log n) for all cases (best, average, worst), making it efficient even for large lists. \ No newline at end of file +Hello +The translation of 'Hello' into French is 'Bonjour'. \ No newline at end of file diff --git a/tests/results/examples/tutorial/calling_llm_with_input_messages_var.1.result b/tests/results/examples/tutorial/calling_llm_with_input_messages_var.1.result deleted file mode 100644 index 150ac5d58..000000000 --- a/tests/results/examples/tutorial/calling_llm_with_input_messages_var.1.result +++ /dev/null @@ -1,53 +0,0 @@ -Here is a Python implementation of the Merge Sort algorithm: - -```python -def merge_sort(arr): - # Base case: if array has 1 or no elements, it's already sorted - if len(arr) <= 1: - return arr - - # Divide the array into two halves - mid = len(arr) // 2 - left_half = arr[:mid] - right_half = arr[mid:] - - # Recursively sort both halves - left_sorted = merge_sort(left_half) - right_sorted = merge_sort(right_half) - - # Merge the sorted halves back together - return merge(left_sorted, right_sorted) - -def merge(left, right): - """ - Merge two sorted arrays into one sorted array. 
- """ - merged = [] # Initialize an empty list for the result - left_index = 0 # Index for left array - right_index = 0 # Index for right array - - # Continue until we've processed all elements in both lists - while left_index < len(left) and right_index < len(right): - if left[left_index] <= right[right_index]: - merged.append(left[left_index]) - left_index += 1 - else: - merged.append(right[right_index]) - right_index += 1 - - # If there are any remaining elements in either list, append them to the result - merged.extend(left[left_index:]) - merged.extend(right[right_index:]) - - return merged -``` - -This code first checks if the array is already sorted (i.e., has one or no elements). If not, it divides the array into two halves and recursively sorts them. Then, it merges these sorted halves back together using a helper function `merge()`. The merging process compares elements from both halves and adds the smaller element to the result list until all elements are processed. - -You can use this function like so: - -```python -arr = [38, 27, 43, 3, 9, 82, 10] -sorted_arr = merge_sort(arr) -print(sorted_arr) # Outputs: [3, 9, 10, 27, 38, 43, 82] -``` \ No newline at end of file diff --git a/tests/results/examples/tutorial/calling_llm_with_input_messages_var.2.result b/tests/results/examples/tutorial/calling_llm_with_input_messages_var.2.result deleted file mode 100644 index d2307e4f4..000000000 --- a/tests/results/examples/tutorial/calling_llm_with_input_messages_var.2.result +++ /dev/null @@ -1,47 +0,0 @@ -Here is a Python implementation of the Merge Sort algorithm: - -```python -def merge_sort(arr): - # Base case: if array has 1 or no elements, it's already sorted - if len(arr) <= 1: - return arr - - # Divide the array into two halves - mid = len(arr) // 2 - left_half = arr[:mid] - right_half = arr[mid:] - - # Recursively sort both halves - left_sorted = merge_sort(left_half) - right_sorted = merge_sort(right_half) - - # Merge the sorted halves back together - return merge(left_sorted, right_sorted) - -def merge(left, right): - """ - Merge two sorted lists into a single sorted list. - """ - merged = [] # Initialize an empty list for the result - left_index = 0 # Index for the left list - right_index = 0 # Index for the right list - - # Continue until we've exhausted both lists - while left_index < len(left) and right_index < len(right): - if left[left_index] <= right[right_index]: - merged.append(left[left_index]) - left_index += 1 - else: - merged.append(right[right_index]) - right_index += 1 - - # If there are any remaining elements in either list, append them to the result - merged.extend(left[left_index:]) - merged.extend(right[right_index:]) - - return merged -``` - -This code first checks if the array is already sorted (i.e., has one or zero elements). If so, it returns the array as is. Otherwise, it divides the array into two halves and recursively sorts each half. The `merge` function then combines these sorted halves back together to produce a single sorted list. - -The time complexity of Merge Sort is O(n log n) for all cases (best, average, worst), making it efficient even for large lists. 
\ No newline at end of file diff --git a/tests/results/examples/tutorial/calling_llm_with_input_messages_var.3.result b/tests/results/examples/tutorial/calling_llm_with_input_messages_var.3.result deleted file mode 100644 index ca5666e7a..000000000 --- a/tests/results/examples/tutorial/calling_llm_with_input_messages_var.3.result +++ /dev/null @@ -1,47 +0,0 @@ -Here is a Python implementation of the Merge Sort algorithm: - -```python -def merge_sort(arr): - # Base case: if array has 1 or no elements, it's already sorted - if len(arr) <= 1: - return arr - - # Divide the array into two halves - mid = len(arr) // 2 - left_half = arr[:mid] - right_half = arr[mid:] - - # Recursively sort both halves - left_sorted = merge_sort(left_half) - right_sorted = merge_sort(right_half) - - # Merge the sorted halves back together - return merge(left_sorted, right_sorted) - -def merge(left, right): - """ - Merge two sorted arrays into one sorted array. - """ - merged = [] # Initialize an empty list for the merged result - left_index = 0 # Index for left array - right_index = 0 # Index for right array - - # Continue until we've processed all elements in both lists - while left_index < len(left) and right_index < len(right): - if left[left_index] <= right[right_index]: - merged.append(left[left_index]) - left_index += 1 - else: - merged.append(right[right_index]) - right_index += 1 - - # If there are any remaining elements in either list, append them to the result - merged.extend(left[left_index:]) - merged.extend(right[right_index:]) - - return merged -``` - -This code first checks if the array is already sorted (i.e., has one or zero elements). If so, it returns the array as is. Otherwise, it divides the array into two halves and recursively sorts each half. The `merge` function then combines these sorted halves back together in a single sorted list. - -The time complexity of merge sort is O(n log n) for all cases (best, average, worst), making it efficient even for large lists. diff --git a/tests/results/examples/tutorial/calling_llm_with_input_messages_var.4.result b/tests/results/examples/tutorial/calling_llm_with_input_messages_var.4.result deleted file mode 100644 index 9482c3618..000000000 --- a/tests/results/examples/tutorial/calling_llm_with_input_messages_var.4.result +++ /dev/null @@ -1,47 +0,0 @@ -Here is a Python implementation of the Merge Sort algorithm: - -```python -def merge_sort(arr): - # Base case: if array has 1 or no elements, it's already sorted - if len(arr) <= 1: - return arr - - # Divide the array into two halves - mid = len(arr) // 2 - left_half = arr[:mid] - right_half = arr[mid:] - - # Recursively sort both halves - left_sorted = merge_sort(left_half) - right_sorted = merge_sort(right_half) - - # Merge the sorted halves back together - return merge(left_sorted, right_sorted) - -def merge(left, right): - """ - Merge two sorted arrays into one sorted array. 
- """ - merged = [] # Initialize an empty list for the result - left_index = 0 # Index for the left array - right_index = 0 # Index for the right array - - # Continue until we've iterated through both lists - while left_index < len(left) and right_index < len(right): - if left[left_index] <= right[right_index]: - merged.append(left[left_index]) - left_index += 1 - else: - merged.append(right[right_index]) - right_index += 1 - - # If there are any remaining elements in either list, append them to the result - merged.extend(left[left_index:]) - merged.extend(right[right_index:]) - - return merged -``` - -This code first checks if the array is already sorted (i.e., has one or zero elements). If so, it returns the array as is. Otherwise, it divides the array into two halves and recursively sorts each half. The `merge` function then combines these sorted halves back together to produce a single sorted array. - -The time complexity of Merge Sort is O(n log n) for all cases (best, average, worst), making it efficient even for large lists. diff --git a/tests/results/examples/tutorial/defs-hello.0.result b/tests/results/examples/tutorial/defs-hello.0.result deleted file mode 100644 index ffd7606c1..000000000 --- a/tests/results/examples/tutorial/defs-hello.0.result +++ /dev/null @@ -1,2 +0,0 @@ -Hello World! -Good bye \ No newline at end of file diff --git a/tests/results/examples/tutorial/defs.0.result b/tests/results/examples/tutorial/defs.0.result index 2d4182036..680f268e2 100644 --- a/tests/results/examples/tutorial/defs.0.result +++ b/tests/results/examples/tutorial/defs.0.result @@ -1,2 +1,9 @@ -'J'aime Paris !' -The translation of "I love Madrid!" into Spanish is: "Me encanta Madrid!" \ No newline at end of file +Hello + +In Fench: Bonjour! + +Translation of "Hello" in French is "Bonjour". + +In Spanish: Hola! + +La traducción de "Hello" al español es "Hola". diff --git a/tests/results/examples/tutorial/defs.1.result b/tests/results/examples/tutorial/defs.1.result deleted file mode 100644 index d5ba84ddc..000000000 --- a/tests/results/examples/tutorial/defs.1.result +++ /dev/null @@ -1,2 +0,0 @@ -'J'aime Paris !' -The translation of "I love Madrid!" into Spanish is: "Me encanta Madrid." diff --git a/tests/results/examples/tutorial/function_call_in_jinja.0.result b/tests/results/examples/tutorial/function_call_in_jinja.0.result new file mode 100644 index 000000000..1a43b5431 --- /dev/null +++ b/tests/results/examples/tutorial/function_call_in_jinja.0.result @@ -0,0 +1 @@ +The way to say hello in French is 'Bonjour'. diff --git a/tests/results/examples/tutorial/function_definition.0.result b/tests/results/examples/tutorial/function_definition.0.result index 2d4182036..e7122f0e9 100644 --- a/tests/results/examples/tutorial/function_definition.0.result +++ b/tests/results/examples/tutorial/function_definition.0.result @@ -1,2 +1,2 @@ -'J'aime Paris !' -The translation of "I love Madrid!" into Spanish is: "Me encanta Madrid!" \ No newline at end of file +J'adore Paris ! +Me encanta Madrid! \ No newline at end of file diff --git a/tests/results/examples/tutorial/function_definition.3.result b/tests/results/examples/tutorial/function_definition.3.result new file mode 100644 index 000000000..80159edaa --- /dev/null +++ b/tests/results/examples/tutorial/function_definition.3.result @@ -0,0 +1,2 @@ +J'aime Paris ! +Amo Madrid! 
\ No newline at end of file diff --git a/tests/results/examples/tutorial/function_empty_context.0.result b/tests/results/examples/tutorial/function_empty_context.0.result index 4901d530d..b80d21d4d 100644 --- a/tests/results/examples/tutorial/function_empty_context.0.result +++ b/tests/results/examples/tutorial/function_empty_context.0.result @@ -1 +1,2 @@ -Hello World!Greetings! I am Granite, a language model developed by IBM in 2024. How may I assist you today? \ No newline at end of file +J'aime Paris ! +Me encanta Madrid! \ No newline at end of file diff --git a/tests/results/examples/tutorial/function_empty_context.1.result b/tests/results/examples/tutorial/function_empty_context.1.result index 348e9bbe6..e7122f0e9 100644 --- a/tests/results/examples/tutorial/function_empty_context.1.result +++ b/tests/results/examples/tutorial/function_empty_context.1.result @@ -1 +1,2 @@ -Hello World!Hello there! How can I assist you today? If you have any questions or need information on a particular topic, feel free to ask. I'm here to help. \ No newline at end of file +J'adore Paris ! +Me encanta Madrid! \ No newline at end of file diff --git a/tests/results/examples/tutorial/function_optional_params.0.result b/tests/results/examples/tutorial/function_optional_params.0.result index f0f021368..000144d17 100644 --- a/tests/results/examples/tutorial/function_optional_params.0.result +++ b/tests/results/examples/tutorial/function_optional_params.0.result @@ -1 +1,2 @@ -Hello World Universe! \ No newline at end of file +Hello World! +Hello Earth Planet! \ No newline at end of file diff --git a/tests/results/examples/tutorial/lastOf.0.result b/tests/results/examples/tutorial/lastOf.0.result new file mode 100644 index 000000000..072eed5a2 --- /dev/null +++ b/tests/results/examples/tutorial/lastOf.0.result @@ -0,0 +1,3 @@ +Hello + +Greetings, I trust this message finds you in good health and high spirits. How may I be of assistance today? Please feel free to pose your query or request, knowing that I am here to serve with diligence and precision. \ No newline at end of file diff --git a/tests/results/examples/tutorial/lastOf.1.result b/tests/results/examples/tutorial/lastOf.1.result new file mode 100644 index 000000000..a5acb19ab --- /dev/null +++ b/tests/results/examples/tutorial/lastOf.1.result @@ -0,0 +1,3 @@ +Hello + +Greetings! I trust this message finds you in good health and high spirits. How may I be of assistance today? Please feel free to pose your query or request, knowing that I am here to serve with diligence and precision. \ No newline at end of file diff --git a/tests/results/examples/tutorial/local_computation.0.result b/tests/results/examples/tutorial/local_computation.0.result new file mode 100644 index 000000000..956e25b97 --- /dev/null +++ b/tests/results/examples/tutorial/local_computation.0.result @@ -0,0 +1,2 @@ +Hello +The variable GEN is equal to: Hello \ No newline at end of file diff --git a/tests/results/examples/tutorial/muting_block_output.0.result b/tests/results/examples/tutorial/muting_block_output.0.result index 0701474bb..b015fe531 100644 --- a/tests/results/examples/tutorial/muting_block_output.0.result +++ b/tests/results/examples/tutorial/muting_block_output.0.result @@ -1 +1,5 @@ -The french sentence was: 'J'aime Paris !' \ No newline at end of file +Hello + +In Fench: Bonjour! + +Translation of "Hello" in French is "Bonjour". 
diff --git a/tests/results/examples/tutorial/muting_block_output.1.result b/tests/results/examples/tutorial/muting_block_output.1.result deleted file mode 100644 index 3d6699e99..000000000 --- a/tests/results/examples/tutorial/muting_block_output.1.result +++ /dev/null @@ -1 +0,0 @@ -The french sentence was: 'J'adore Paris !' diff --git a/tests/results/examples/tutorial/role.0.result b/tests/results/examples/tutorial/role.0.result new file mode 100644 index 000000000..a71426c43 --- /dev/null +++ b/tests/results/examples/tutorial/role.0.result @@ -0,0 +1,3 @@ +Hello +You are a polite assistant that likes to answer very formally. +Greetings! I trust this message finds you in good health and high spirits. How may I be of assistance today? Please feel free to pose your query or request, knowing that I am here to serve with diligence and precision. \ No newline at end of file diff --git a/tests/results/examples/sdk/hello.0.result b/tests/results/examples/tutorial/sdk/hello.0.result similarity index 100% rename from tests/results/examples/sdk/hello.0.result rename to tests/results/examples/tutorial/sdk/hello.0.result diff --git a/tests/results/examples/tutorial/variable_def_use.0.result b/tests/results/examples/tutorial/variable_def_use.0.result index 72d411660..36f7aa6b0 100644 --- a/tests/results/examples/tutorial/variable_def_use.0.result +++ b/tests/results/examples/tutorial/variable_def_use.0.result @@ -1,3 +1,3 @@ Hello Hello -GEN is equal to: Hello \ No newline at end of file +The variable GEN is equal to: Hello \ No newline at end of file diff --git a/tests/test_examples_run.yaml b/tests/test_examples_run.yaml index 78ab19b4e..e299c0980 100644 --- a/tests/test_examples_run.yaml +++ b/tests/test_examples_run.yaml @@ -1,5 +1,15 @@ -update_results: false -check: [] +update_results: true +check: + - examples/tutorial/defs.pdl + - examples/tutorial/function_definition.pdl + - examples/tutorial/function_empty_context.pdl + - examples/tutorial/calling_llm_with_input_messages_var.pdl + - examples/tutorial/local_computation.pdl + - examples/tutorial/role.pdl + - examples/tutorial/lastOf.pdl + - examples/tutorial/muting_block_output.pdl + - examples/tutorial/function_optional_params.pdl + - examples/tutorial/function_call_in_jinja.pdl skip: - examples/demos/react.pdl - examples/cldk/cldk-assistant.pdl