`dsl-reference.md` — 6 additions & 6 deletions
@@ -1519,7 +1519,7 @@ from: .order.pet

### Output

-Documents the structure - and optionally configures the filtering of - workflow/task output data.
+Documents the structure - and optionally configures the transformations of - workflow/task output data.

It's crucial for authors to document the schema of output data whenever feasible. This documentation empowers consuming applications to provide contextual auto-suggestions when handling runtime expressions.

@@ -1544,11 +1544,11 @@ output:
        petId:
          type: string
      required: [ petId ]
-  as:
-    petId: '${ .pet.id }'
+  as:
+    petId: '${ .pet.id }'
export:
  as:
-    '.petList += [ . ]'
+    '.petList += [ $task.output ]'
```
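As a hedged illustration of the `export.as` change above (the pet fields are invented for this sketch, jq assumed as the expression language): with `$task.output`, the context accumulates the task's *raw* output, independently of what `output.as` passes on to the next task.

```yaml
# Hypothetical raw task output: { "pet": { "id": "42", "name": "Rex" } }
output:
  as:
    petId: '${ .pet.id }'                # next task receives { "petId": "42" }
export:
  as: '.petList += [ $task.output ]'     # context petList gains the full raw output
```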
### Export

@@ -1566,13 +1566,13 @@ Optionally, the context might have an associated schema.
`dsl.md` — 96 additions & 24 deletions
@@ -164,39 +164,97 @@ Once the task has been executed, different things can happen:

In Serverless Workflow DSL, data flow management is crucial to ensure that the right data is passed between tasks and to the workflow itself.

-Here's how data flows through a workflow based on various filtering stages:
+Here's how data flows through a workflow based on various transformation stages:

-1. **Filter Workflow Input**
-Before the workflow starts, the input data provided to the workflow can be filtered to ensure only relevant data is passed into the workflow context. This step allows the workflow to start with a clean and focused dataset, reducing potential overhead and complexity in subsequent tasks.
+1. **Transform Workflow Input**
+Before the workflow starts, the input data provided to the workflow can be transformed to ensure only relevant data in the expected format is passed into the workflow context. This can be done using the top-level `input.from` expression. It evaluates on the raw workflow input and defaults to the identity expression, which leaves the input unchanged. This step allows the workflow to start with a clean and focused dataset, reducing potential overhead and complexity in subsequent tasks. The result of this expression will be set as the initial value of the `$context` runtime expression argument and be passed to the first task.

-*Example: If the workflow receives a JSON object as input, a filter can be applied to remove unnecessary fields and retain only those that are required for the workflow's execution.*
+*Example: If the workflow receives a JSON object as input, a transformation can be applied to remove unnecessary fields and retain only those that are required for the workflow's execution.*

-2. **Filter First Task Input**
-The input data for the first task can be filtered to match the specific requirements of that task. This ensures that the first task receives only the necessary data it needs to perform its operations.
+2. **Transform First Task Input**
+The input data for the first task can be transformed to match the specific requirements of that task. This ensures that the first task receives only the data required to perform its operations. This can be done using the task's `input.from` expression. It evaluates on the transformed workflow input and defaults to the identity expression, which leaves the input unchanged. The result of this expression will be set as the `$input` runtime expression argument and be passed to the task. This transformed input will be evaluated against any runtime expressions used within the task definition.

-*Example: If the first task is a function call that only needs a subset of the workflow input, a filter can be applied to provide only those fields needed for the function to execute.*
+*Example: If the first task is a function call that only needs a subset of the workflow input, a transformation can be applied to provide only those fields needed for the function to execute.*

-3. **Filter First Task Output**
-After the first task completes, its output can be filtered before passing it to the next task or storing it in the workflow context. This helps in managing the data flow and keeping the context clean by removing any unnecessary data produced by the task.
+3. **Transform First Task Output**
+After the first task completes, its output can be transformed before passing it to the next task or storing it in the workflow context. Transformations are applied using the `output.as` runtime expression. It evaluates on the raw task output and defaults to the identity expression, which leaves the output unchanged. Its result becomes the input for the next task. To update the context, one uses the `export.as` runtime expression. It evaluates on the raw task output and defaults to the expression that returns the existing context. The result of this expression replaces the workflow's current context and the content of the `$context` runtime expression argument. This helps manage the data flow and keep the context clean by removing any unnecessary data produced by the task.

-*Example: If the first task returns a large dataset, a filter can be applied to retain only the relevant results needed for subsequent tasks.*
+*Example: If the first task returns a large dataset, a transformation can be applied to retain only the relevant results needed for subsequent tasks.*

-4. **Filter Last Task Input**
-Before the last task in the workflow executes, its input data can be filtered to ensure it receives only the necessary information. This step is crucial for ensuring that the final task has all the required data to complete the workflow successfully.
+4. **Transform Last Task Input**
+Before the last task in the workflow executes, its input data can be transformed to ensure it receives only the necessary information. This can be done using the task's `input.from` expression. It evaluates on the transformed output of the previous task and defaults to the identity expression, which leaves the input unchanged. The result of this expression will be set as the `$input` runtime expression argument and be passed to the task. This transformed input will be evaluated against any runtime expressions used within the task definition. This step is crucial for ensuring that the final task has all the required data to complete the workflow successfully.

-*Example: If the last task involves generating a report, the input filter can ensure that only the data required for the report generation is passed to the task.*
+*Example: If the last task involves generating a report, the input transformation can ensure that only the data required for the report generation is passed to the task.*

-5. **Filter Last Task Output**
-After the last task completes, its output can be filtered before it is considered as the workflow output. This ensures that the workflow produces a clean and relevant output, free from any extraneous data that might have been generated during the task execution.
+5. **Transform Last Task Output**
+After the last task completes, its output can be transformed before it is considered the workflow output. Transformations are applied using the `output.as` runtime expression. It evaluates on the raw task output and defaults to the identity expression, which leaves the output unchanged. Its result is passed to the workflow's `output.as` runtime expression. This ensures that the workflow produces a clean and relevant output, free from any extraneous data that might have been generated during the task execution.

-*Example: If the last task outputs various statistics, a filter can be applied to retain only the key metrics that are relevant to the stakeholders.*
+*Example: If the last task outputs various statistics, a transformation can be applied to retain only the key metrics that are relevant to the stakeholders.*

-6. **Filter Workflow Output**
-Finally, the overall workflow output can be filtered before it is returned to the caller or stored. This step ensures that the final output of the workflow is concise and relevant, containing only the necessary information that needs to be communicated or recorded.
+6. **Transform Workflow Output**
+Finally, the overall workflow output can be transformed before it is returned to the caller or stored. Transformations are applied using the workflow's `output.as` runtime expression. It evaluates on the last task's transformed output and defaults to the identity expression, which leaves the output unchanged. This step ensures that the final output of the workflow is concise and relevant, containing only the necessary information that needs to be communicated or recorded.

-*Example: If the workflow's final output is a summary report, a filter can ensure that the report contains only the most important summaries and conclusions, excluding any intermediate data.*
+*Example: If the workflow's final output is a summary report, a transformation can ensure that the report contains only the most important summaries and conclusions, excluding any intermediate data.*

-By applying filters at these strategic points, Serverless Workflow DSL ensures that data flows through the workflow in a controlled and efficient manner, maintaining clarity and relevance at each stage of execution. This approach helps in managing complex workflows and ensures that each task operates with the precise data it requires, leading to more predictable and reliable workflow outcomes.
+By applying transformations at these strategic points, Serverless Workflow DSL ensures that data flows through the workflow in a controlled and efficient manner, maintaining clarity and relevance at each execution stage. This approach helps manage complex workflows and ensures that each task operates with the precise data required, leading to more predictable and reliable workflow outcomes.
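For illustration only (the task name, endpoint, and field names are invented for this sketch, and jq is assumed as the expression language), the six stages above could appear in a single workflow definition roughly as follows:

```yaml
document:
  dsl: '1.0.0'
  namespace: examples
  name: data-flow-sketch
  version: '0.1.0'
input:
  from: '${ { order: .order } }'             # 1. transform workflow input
do:
  - getPet:
      call: http
      with:
        method: get
        endpoint: https://petstore.example.com/pets/1   # hypothetical endpoint
      input:
        from: '${ { petId: .order.petId } }'            # 2./4. transform task input
      output:
        as: '${ { pet: .pet } }'                        # 3./5. transform task output
      export:
        as: '${ $context + { lastPet: $task.output } }' # update the shared context
output:
  as: '${ { petName: .pet.name } }'                     # 6. transform workflow output
```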
@@ -220,8 +278,9 @@ When the evaluation of an expression fails, runtimes **must** raise an error wit

| Name | Type | Description |
|:-----|:----:|:------------|
-| context | `any` | The task's context data. |
-| input | `any` | The task's filtered input. |
+| context | `map` | The task's context data. |
+| input | `any` | The task's transformed input. |
+| output | `any` | The task's transformed output. |
| secrets | `map` | A key/value map of the workflow secrets.<br>To avoid unintentional bleeding, secrets can only be used in the `input.from` runtime expression. |
| task | [`taskDescriptor`](#task-descriptor) | Describes the current task. |
| workflow | [`workflowDescriptor`](#workflow-descriptor) | Describes the current workflow. |
@@ -243,7 +302,8 @@ This argument contains information about the runtime executing the workflow.

|:-----|:----:|:------------|:--------|
| name | `string` | The task's name. | `getPet` |
| definition | `map` | The task's definition (specified under its name) as a parsed object | `{ "call": "http", "with": { ... } }` |
-| input | `any` | The task's input *BEFORE* the `input.from` expression. For the result of the `input.from` expression, use the context of the runtime expression (for jq, `.`) | - |
+| input | `any` | The task's *raw* input (i.e. *BEFORE* the `input.from` expression). For the result of the `input.from` expression, use the context of the runtime expression (for jq, `.`) | - |
+| output | `any` | The task's *raw* output (i.e. *BEFORE* the `output.as` expression). | - |
| startedAt.iso8601 | `string` | The start time of the task as an ISO 8601 date-time string. It uses `T` as the date-time delimiter, with either UTC (`Z`) or a time zone offset (`+01:00`). The precision can be seconds, milliseconds or nanoseconds | `2022-01-01T12:00:00Z`, `2022-01-01T12:00:00.123456Z`, `2022-01-01T12:00:00.123+01:00` |
| startedAt.epochMillis | `integer` | The start time of the task as an integer value of milliseconds since midnight of 1970-01-01 UTC | `1641024000123` (= "2022-01-01T08:00:00.123Z") |
| startedAt.epochNanos | `integer` | The start time of the task as an integer value of nanoseconds since midnight of 1970-01-01 UTC | `1641024000123456` (= "2022-01-01T08:00:00.123456Z") |
@@ -254,11 +314,23 @@ This argument contains information about the runtime executing the workflow.

|:-----|:----:|:------------|:--------|
| id | `string` | A unique id of the workflow execution. No specific format is imposed | UUIDv4: `4a5c8422-5868-4e12-8dd9-220810d2b9ee`, ULID: `0000004JFGDSW1H037G7J7SFB9` |
| definition | `map` | The workflow's definition as a parsed object | `{ "document": { ... }, "do": [...] }` |
-| input | `any` | The workflow's input *BEFORE* the `input.from` expression. For the result of the `input.from` expression, use the `$input` argument | - |
+| input | `any` | The workflow's *raw* input (i.e. *BEFORE* the `input.from` expression). For the result of the `input.from` expression, use the `$input` argument | - |
| startedAt.iso8601 | `string` | The start time of the execution as an ISO 8601 date-time string. It uses `T` as the date-time delimiter, with either UTC (`Z`) or a time zone offset (`+01:00`). The precision can be seconds, milliseconds or nanoseconds | `2022-01-01T12:00:00Z`, `2022-01-01T12:00:00.123456Z`, `2022-01-01T12:00:00.123+01:00` |
| startedAt.epochMillis | `integer` | The start time of the execution as an integer value of milliseconds since midnight of 1970-01-01 UTC | `1641024000123` (= "2022-01-01T08:00:00.123Z") |
| startedAt.epochNanos | `integer` | The start time of the execution as an integer value of nanoseconds since midnight of 1970-01-01 UTC | `1641024000123456` (= "2022-01-01T08:00:00.123456Z") |

+The following table shows which arguments are available for each runtime expression:
+
+| Runtime Expression | Evaluated on | Produces | `$context` | `$input` | `$output` | `$secrets` | `$task` | `$workflow` |
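As a hedged illustration of how these arguments can be combined (the field names are assumed, jq syntax), a task's `output.as` expression might annotate its raw result with descriptor data:

```yaml
output:
  as: '${ { result: ., task: $task.name, workflow: $workflow.id, startedAt: $workflow.startedAt.iso8601 } }'
```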
Serverless Workflow is designed with resilience in mind, acknowledging that errors are an inevitable part of any system. The DSL provides robust mechanisms to identify, describe, and handle errors effectively, ensuring the workflow can recover gracefully from failures.