diff --git a/README.md b/README.md
index 8cdabce1..557822f6 100644
--- a/README.md
+++ b/README.md
@@ -4,92 +4,111 @@
 [](https://serverlessworkflow.io/)
 [](https://twitter.com/CNCFWorkflow)
+## Table of Contents
+
+- [About](#about)
+- [Ecosystem](#ecosystem)
+  + [DSL](dsl.md)
+  + [CTK](/ctk/readme.md)
+  + [SDKs](#sdks)
+  + [Runtimes](#runtimes)
+  + [Tooling](#tooling)
+  + [Landscape](#cncf-landscape)
+- [Documentation](#documentation)
+- [Community](#community)
+  + [Communication](#communication)
+  + [Governance](#governance)
+  + [Code of Conduct](#code-of-conduct)
+  + [Weekly Meetings](#weekly-meetings)
+- [Support](#support)
+  + [Adoption](#adoption)
+  + [Sponsoring](#sponsoring)
+
 ## About
-CNCF Serverless Workflow defines a vendor-neutral, open-source, and fully community-driven
-ecosystem for defining and running DSL-based workflows that target the Serverless technology domain.
+Serverless Workflow presents a vendor-neutral, open-source, and entirely community-driven ecosystem tailored for defining and executing DSL-based workflows in the realm of Serverless technology.
-This project is composed of:
+The Serverless Workflow DSL is a high-level language that reshapes the terrain of workflow creation, boasting a design that is ubiquitous, intuitive, imperative, and fluent.
-* [Specification](specification.md) for defining DSL-based workflows
-* [Developer SDKs](#sdks) for different programming languages
-* [Workflow runtimes](#runtimes) supporting the specification
-* Developer [tooling support](#tooling) for writing DSL-based workflows
+Bid farewell to convoluted coding and platform dependencies—now, crafting powerful workflows is effortlessly within reach for everyone!
-CNCF Serverless Workflow is hosted by the [Cloud Native Computing Foundation (CNCF)](https://www.cncf.io/) and was approved as a
-Cloud Native Sandbox level project on July 14, 2020.
+Key features:
-## Table of Contents
+- **Easy to Use**: Designed for universal understanding, Serverless Workflow DSL enables users to quickly grasp workflow concepts and create complex workflows effortlessly.
+- **Event Driven**: Seamlessly integrate events into workflows with support for various formats, including CloudEvents, allowing for event-driven workflow architectures.
+- **Service Oriented**: The Serverless Workflow DSL empowers developers to seamlessly integrate with service-oriented architectures, allowing them to define workflows that interact with various services over standard application protocols like HTTP, gRPC, OpenAPI, AsyncAPI, and more.
+- **FaaS Centric**: Seamlessly invoke functions hosted on various platforms within workflows, promoting a function-as-a-service (FaaS) paradigm and enabling microservices architectures.
+- **Timely**: Define timeouts for workflows and tasks to manage execution duration effectively.
+- **Fault Tolerant**: Easily define error handling strategies to manage and recover from errors that may occur during workflow execution, ensuring robustness and reliability.
+- **Schedulable**: Schedule workflows using CRON expressions or trigger them based on events, providing control over workflow execution timing.
+- **Interoperable**: Integrates seamlessly with different services and resources.
+- **Robust**: Offers features such as conditional branching, event handling, and looping constructs.
+- **Scalable**: Promotes code reusability, maintainability, and scalability across different environments.
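+
+To give a first taste of the DSL, here is a minimal workflow that performs an HTTP call. This is only an illustrative sketch (the endpoint is a placeholder, and keywords may evolve between DSL versions); see [dsl.md](dsl.md) for the authoritative syntax:
+
+```yaml
+document:
+  dsl: '1.0.0'        # DSL version this workflow targets
+  namespace: examples # placeholder namespace
+  name: greet-users
+  version: '0.1.0'
+do:
+  - greetUser:
+      call: http
+      with:
+        method: get
+        endpoint: https://example.com/greetings # hypothetical service
+```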
-- [CNCF Landscape](#CNCF-Landscape) -- [Releases](#Releases) -- [Runtimes](#Runtimes) -- [SDKs](#SDKs) -- [Tooling](#Tooling) -- [Community](#Community) - - [Communication](#Communication) - - [Code of Conduct](#Code-of-Conduct) - - [Weekly Meetings](#Weekly-Meetings) -- [Repository Structure](#Repository-Structure) -- [Support](#Support) +## Ecosystem -## CNCF Landscape +Serverless Workflow ecosystem is hosted by the [Cloud Native Computing Foundation (CNCF)](https://www.cncf.io/) and was approved as a +Cloud Native Sandbox level project on July 14, 2020. -Serverless Workflow project falls under the [CNCF "App Definition and Development"](https://landscape.cncf.io/card-mode?category=app-definition-and-development&grouping=category) category. +It encompasses a comprehensive suite of components and tools designed to facilitate the creation, management, and execution of serverless workflows. -It is a member project of the [CNCF Serverless Working Group](https://github.com/cncf/wg-serverless). +1. **[DSL](dsl.md) (Domain Specific Language)**: The core of the ecosystem, defining the fundamental syntax and semantics of Serverless Workflow specifications. -

-CNCF Landscape -

+2. **[CTK](/ctk/readme.md) (Conformance Test Kit)**: A set of Gherkin features utilized for both conformance testing and Behavior Driven Design (BDD), ensuring compliance and facilitating testing across implementations.
-Check out our project DevStats [here](https://serverlessworkflow.devstats.cncf.io).
-## Releases
-| | Latest release | Latest release branch | Working branch |
-| --- | :---: | :---: | :---: |
-| **Core Specification** | |
-| [Serverless Workflow](https://github.com/serverlessworkflow/specification) | [v0.8](https://github.com/serverlessworkflow/specification/releases) | [0.8.x](https://github.com/serverlessworkflow/specification/tree/0.8.x) | [main](https://github.com/serverlessworkflow/specification) |
-| **Additional Components** | |
-| [Synapse](https://github.com/serverlessworkflow/synapse) | [0.1.0-alpha1](https://github.com/serverlessworkflow/synapse/releases) | | [main](https://github.com/serverlessworkflow/synapse) |
-| [GO SDK](https://github.com/serverlessworkflow/sdk-go) | [v2.0.0](https://github.com/serverlessworkflow/sdk-go/releases) | [1.0.x](https://github.com/serverlessworkflow/sdk-go/tree/1.0.x) | [main](https://github.com/serverlessworkflow/sdk-go) |
-| [Java SDK](https://github.com/serverlessworkflow/sdk-java) | [4.0.2.Final](https://github.com/serverlessworkflow/sdk-java/releases) | [4.0.x](https://github.com/serverlessworkflow/sdk-java/tree/4.0.x) | [main](https://github.com/serverlessworkflow/sdk-java) |
-| [.NET SDK](https://github.com/serverlessworkflow/sdk-net) | 
[v0.7.4.4](https://github.com/serverlessworkflow/sdk-net/releases) | | [main](https://github.com/serverlessworkflow/sdk-net) | -| [TypeScript SDK](https://github.com/serverlessworkflow/sdk-typescript) | [v3.0.0](https://github.com/serverlessworkflow/sdk-typescript/releases) | [3.0.x](https://github.com/serverlessworkflow/sdk-typescript/tree/3.0.x) | [main](https://github.com/serverlessworkflow/sdk-typescript) | -| [Python SDK](https://github.com/serverlessworkflow/sdk-python) | [v1.0.0](https://github.com/serverlessworkflow/sdk-python/releases) | [1.0.x](https://github.com/serverlessworkflow/sdk-python/tree/1.0.x) | [main](https://github.com/serverlessworkflow/sdk-python) | -| [VSCode Extension](https://github.com/serverlessworkflow/vscode-extension) | [1.6.0](https://marketplace.visualstudio.com/items?itemName=serverlessworkflow.serverless-workflow-vscode-extension) | | [main](https://github.com/serverlessworkflow/vscode-extension) | +5. **[Tooling](#tooling)**: Additional utilities and resources tailored to enhance the development, debugging, and management of serverless workflows, streamlining the workflow lifecycle from creation to deployment and maintenance. -## Runtimes -- [Synapse](https://github.com/serverlessworkflow/synapse) +### SDKs -Serverless Workflow is open to host open-source runtime implementations that would like to -be part and grow alongside the core specification. +The Serverless Workflow SDKs are essential tools designed to assist developers in consuming, parsing, validating, and testing their workflows utilizing the Serverless Workflow DSL. -[Synapse](https://github.com/serverlessworkflow/synapse) is a Kubernetes-native workflow runtime which supports and is part of the Serverless -Workflow eco-system. +These SDKs empower developers to seamlessly integrate serverless workflows into their applications, providing robust support for various programming languages. 
By offering comprehensive functionality, they streamline the development process and enhance workflow management.
-## SDKs
+Explore our SDKs for different programming languages:
 - [Go](https://github.com/serverlessworkflow/sdk-go)
 - [Java](https://github.com/serverlessworkflow/sdk-java)
 - [.NET](https://github.com/serverlessworkflow/sdk-net)
-- [TypeScript](https://github.com/serverlessworkflow/sdk-typescript)
 - [Python](https://github.com/serverlessworkflow/sdk-python)
+- [TypeScript](https://github.com/serverlessworkflow/sdk-typescript)
+
+Don't see your favorite implementation on the list? Shout out to the community about it or, even better, contribute to the ecosystem with a new SDK!
+
+No matter your preferred language, our SDKs provide the tools you need to leverage the power of serverless workflows effectively.
-Serverless Workflow encourages development of SDKs dedicated to help developers with
-consuming, parsing, validating and testing their workflows that use the Serverless Workflow DSL.
-## Tooling
+### Runtimes
+
+| Name | About |
+| --- | --- |
+| [Apache KIE SonataFlow](https://sonataflow.org) | Apache KIE SonataFlow is a tool for building cloud-native workflow applications. You can use it to orchestrate and choreograph services and events. |
+| [Synapse](https://github.com/serverlessworkflow/synapse) | Synapse is a scalable, cross-platform, fully customizable platform for managing and running Serverless Workflows. |
-In order to enhance developer experience with the specification, we also provide a [Visual Studio Code extension](https://marketplace.visualstudio.com/items?itemName=serverlessworkflow.serverless-workflow-vscode-extension).
-The sources of the extension are found [here](https://github.com/serverlessworkflow/vscode-extension).
+### Tooling
-## Requirements
+In order to enhance developer experience with the Serverless Workflow DSL, we provide a [Visual Studio Code extension](https://marketplace.visualstudio.com/items?itemName=serverlessworkflow.serverless-workflow-vscode-extension).
-To generate the SVG diagram from the YAML or JSON file, you need to have the following tools installed:
-- https://www.graphviz.org/download/source/
+The sources of the extension can be found [here](https://github.com/serverlessworkflow/vscode-extension).
+
+### CNCF Landscape
+
+The Serverless Workflow project falls under the [CNCF "App Definition and Development"](https://landscape.cncf.io/card-mode?category=app-definition-and-development&grouping=category) category.
+
+It is a member project of the [CNCF Serverless Working Group](https://github.com/cncf/wg-serverless).
+
+

+CNCF Landscape +

+
+## Documentation
+
+The documentation for Serverless Workflow includes:
+- [**DSL**](dsl.md): Documents the fundamental aspects and concepts of the Serverless Workflow DSL
+- [**DSL Reference**](dsl-reference.md): References all the definitions used by the Serverless Workflow DSL
 ## Community
@@ -98,10 +117,10 @@ workflow ecosystem. Community contributions are welcome and much needed to foste
 See [here](community/contributors.md) for the list of community members that have contributed to the specification.
-To learn how to contribute to the specification reference the ['how to contribute'](contributing.md) doc.
+To learn how to contribute to the specification please refer to ['how to contribute'](contributing.md).
 If you have any copyright questions when contributing to a CNCF project like this one,
-reference the [Ownership of Copyrights in CNCF Project Contributions](https://github.com/cncf/foundation/blob/master/copyright-notices.md) doc.
+reference the [Ownership of Copyrights in CNCF Project Contributions](https://github.com/cncf/foundation/blob/master/copyright-notices.md).
 ### Communication
@@ -111,6 +130,18 @@ reference the [Ownership of Copyrights in CNCF Project Contributions](https://gi
 - Serverless WG Email: [cncf-wg-serverless](mailto:cncf-wg-serverless@lists.cncf.io)
 - Serverless WG Subscription: [https://lists.cncf.io/g/cncf-wg-serverless](https://lists.cncf.io/g/cncf-wg-serverless)
+### Governance
+
+The Serverless Workflow Project Governance [document](governance.md) delineates the roles, procedures, and principles guiding the collaborative development and maintenance of the project.
+
+It emphasizes adherence to the CNCF Code of Conduct, defines the responsibilities of maintainers, reviewers, and emeritus maintainers, outlines procedures for their addition and removal, and establishes guidelines for subprojects' inclusion and compliance.
+ +Decision-making processes are consensus-driven, facilitated through structured proposal and discussion mechanisms, with conflict resolution procedures prioritizing amicable resolution. + +Overall, the document reflects the project's commitment to transparency, accountability, and inclusive collaboration, fostering an environment conducive to sustained growth and innovation. + +See the project's Governance Model [here](governance.md). + ### Code of Conduct As contributors and maintainers of this project, and in the interest of fostering @@ -123,7 +154,7 @@ everyone, regardless of level of experience, gender, gender identity and express sexual orientation, disability, personal appearance, body size, race, ethnicity, age, religion, or nationality. -See our full project Code of Conduct information [here](code-of-conduct.md). +See the project's Code of Conduct [here](code-of-conduct.md). ### Weekly Meetings @@ -133,36 +164,18 @@ To register for meetings please visit the [CNCF Community Calendar](https://tock You can register for individual meetings or for the entire series. -The meeting minutes can be accessed in [the discussions tab](https://github.com/serverlessworkflow/specification/discussions). - -[World Time Zone Converter](http://www.thetimezoneconverter.com/?t=9:00%20am&tz=San%20Francisco&). 
- -## Repository Structure - -Here is the outline of the repository to help navigate the specification -documents: - -| File/folder | Description | -| --- | --- | -| [specification.md](specification.md) | The main specification document | -| [OWNERS](OWNERS) | Defines the current specification maintainers and approvers | -| [LICENSE](LICENSE) | Specification License doc | -| [MAINTAINERS.md](MAINTAINERS.md) | Project Maintainers Info | -| [GOVERNANCE.md](GOVERNANCE.md) | Project Governance Info | -| [contributing.md](contributing.md) | Documentation on how to contribute to the spec | -| [code-of-conduct.md](code-of-conduct.md) | Defines the spec Code of Conduct | -| [usecases](usecases/README.md) | Specification Use Cases | -| [schema](schema) | Contains all specification JSON Schemas | -| [roadmap](roadmap/README.md) | Specification Roadmap | -| [references](references/README.md) | References used for specification docs | -| [media](media) | Includes all images used in spec docs | -| [extensions](extensions/README.md) | Information on spec extensions | -| [examples](examples) | Specification examples | -| [comparisons](comparisons) | Comparisons of Serverless Workflow with other workflow DSLs | -| [community](community) | Contains info on the spec community | -| [annualreviews](annualreviews) | Contains the project annual reviews presented to the CNCF TOC | -| [meetingminutes](meetingminutes) | Contains the projects community meeting minutes | - ## Support -Support our project by [becoming a Sponsor](https://crowdfunding.lfx.linuxfoundation.org/projects/serverless-workflow). +### Adoption + +If you're using Serverless Workflow in your projects and would like to showcase your adoption, become an Adopter! By joining our community of adopters, you'll have the opportunity to share your experiences, contribute feedback, and collaborate with like-minded individuals and organizations leveraging Serverless Workflow to power their workflows. 
+ +### Sponsoring + +As an open-source project, Serverless Workflow relies on the support of sponsors to sustain its development and growth. + +By becoming a sponsor, you'll not only demonstrate your commitment to advancing serverless technologies but also gain visibility within our vibrant community. + +Sponsorship opportunities range from financial contributions to in-kind support, and every sponsorship makes a meaningful impact on the project's success and sustainability. + +Support our project by [becoming a Sponsor](https://crowdfunding.lfx.linuxfoundation.org/projects/serverless-workflow). \ No newline at end of file diff --git a/annualreviews/2021-ServerlessWorkflowSpecification-annual.md b/annualreviews/2021-ServerlessWorkflowSpecification-annual.md deleted file mode 100644 index ac100023..00000000 --- a/annualreviews/2021-ServerlessWorkflowSpecification-annual.md +++ /dev/null @@ -1,136 +0,0 @@ -# Serverless Workflow Specification 2021 Annual Review - -- [Background](#background) -- [Alignment with Cloud Native](#alignment-with-cloud-native) -- [Year in Review](#year-in-review) -- [Annual Review Contents](#annual-review-contents) -- [Project Links](#project-links) - -## Background - -Serverless Workflow is a vendor-neutral, open-source, and fully community-driven ecosystem -for defining and running DSL-based workflows that target the serverless technology domain. - -This project is composed of: - -* [Specification](https://github.com/serverlessworkflow/specification/blob/main/specification.md) for defining DSL-based workflows. -* [Developer SDKs](https://github.com/serverlessworkflow/specification#sdks) that provide support for many programming languages. -* [Workflow runtimes](https://github.com/serverlessworkflow/specification#runtime) part of the project ecosystem, and support the execution of specification DSL. -* [Developer tooling](https://github.com/serverlessworkflow/specification#Tooling) support for writing DSL-based workflows. 
- -Serverless Workflow was approved as a Cloud Native Sandbox level project on July 14, 2020. -* [TOC review PDF](https://github.com/serverlessworkflow/specification/blob/main/community/presentations/2020-4-15-toc-pres.pdf). -* [TOC sandbox proposal PR](https://github.com/cncf/toc/pull/376) - -## Alignment with Cloud Native - -Serverless Workflow project falls under the [CNCF "App Definition and Development"](https://landscape.cncf.io/card-mode?category=app-definition-and-development&grouping=category) category. - -Serverless Workflow is a member project of the [CNCF Serverless Working Group](https://github.com/cncf/wg-serverless). - -Serverless Workflow includes [Synapse](https://github.com/serverlessworkflow/synapse), a Kubernetes-native runtime engine for executing workflows that follows the -specification DSL definition. - -In addition, Serverless Workflow provides support for several other open-source projects and specifications in the cloud-native -space: -* CloudEvents -* OpenAPI -* AsyncAPI -* GraphQL -* OData -* OAuth2 - -## Year in Review - -This year was very exciting for the project. 
Some of the most notable accomplishments include: -* Released specification version [0.6](https://github.com/serverlessworkflow/specification/releases/tag/v0.6) -* Released specification version [0.7](https://github.com/serverlessworkflow/specification/releases/tag/v0.7) -* Released specification version [0.8](https://github.com/serverlessworkflow/specification/releases/tag/v0.8) -* Added [Synapse](https://github.com/serverlessworkflow/synapse), a Kubernetes-native runtime into our ecosystem -* Added [sdk-net](https://github.com/serverlessworkflow/sdk-net), a .NET SDK into our ecosystem -* Added [sdk-typescript](https://github.com/serverlessworkflow/sdk-typescript), a TypeScript SDK into our ecosystem -* Added [two new project maintainers](https://github.com/serverlessworkflow/specification/blob/main/MAINTAINERS.md) - -From the community perspective we also had a good year: -* Over 100 new followers on our [twitter channel](https://twitter.com/CNCFWorkflow). -* Over 200 new stars on our [specification github repo](https://github.com/serverlessworkflow/specification). -* Over 300 people attending our project office hours at 2021 KubeCon EU. -* Presented at KubeCon NA 2021, KubeCon EU 2021, KubeCon NA 2020, and KubeCon EU 2020 -* Mentioned as a key component of open-source microservices architectures at [InfoQ](https://www.infoq.com/articles/microservices-inside-out/) -* Over 40 unique visitors per day on our [website](https://serverlessworkflow.io/) -* Participated at the KubeCon EU 2021 BugBash - -## Annual Review Contents - -- **Include a link to your project’s devstats page. We will be looking for signs of consistent or increasing contribution activity.** - -Project [DevStats page](https://serverlessworkflow.devstats.cncf.io). 
-The info for the span of one year shows: -* Over 500 merged PRs -* ~2000 commits by 30+ contributors -* Community contributions from 10+ different companies -* Over 100% increase to github stars compared to last year - -- **How many maintainers do you have, and which organizations are they from?** - -Serverless Workflow currently has [5 project maintainers](https://github.com/serverlessworkflow/specification/blob/main/MAINTAINERS.md) - - Tihomir Surdilovic, Temporal Technologies - - Manuel Stein, Nokia Bell Labs - - Ricardo Zanini, Red Hat - - Charles d'Avernas , Neuroglia - - Antonio Mendoza Pérez, Independent - - -- **What do you know about adoption, and how has this changed since your last review / since you joined Sandbox?** - -Both the adoption and community interest has been steadily increasing over the course of the year. -Most notable adoptions have been by: - - [Kogito](https://kogito.kie.org/), a Red Hat project automation runtime - - [Automatiko](https://automatiko.io/), a workflow automation runtime - - [Synapse](https://github.com/serverlessworkflow/synapse), a Kubernetes-based workflow runtime which has joined the Serverless Workflow ecosystem - -We also have a number of integrations that are currently work-in-progress which include -integrations with [Temporal](https://temporal.io/). - -- **How has the project performed against its goals since the last review?** - -This is our first annual review since becoming a Sandbox project. -We have surpassed all of our goals that we have set for this year. -We have been able to not only surpass the goals of the main specifications, -but also to go from just hosting a specification to creating -an entire workflow ecosystem around it. This includes SDKs, Tooling (VSCode, IntelliJ), -rumtimes (Synapse), etc. - -Over this year we were able to create a workflow DSL which is in our opinion -at this time the most feature-rich and most powerful workflow DSL that exists. 
- -- **What are the current goals of the project?** - -Specification roadmap: https://github.com/serverlessworkflow/specification/tree/main/roadmap - -Our main goals for the project include: - - Release specification version 1.0 by April of 2022 - - Add more SDKs in different languages - - Create a specification TCK - - Add integrations with different workflow DSLs - - Improve our community tooling support - - Add at least 2 more integrations with existing workflow runtimes by middle of 2022 - - -- **How can the CNCF help you achieve your upcoming goals?** - - Help us in promoting the project (Blogs, Twitter, KubeCon, etc) - - If feasible help our project via crowdfunding [here](https://crowdfunding.lfx.linuxfoundation.org/projects/serverless-workflow) - - Keep providing us with opportunities to have project office hours and talks at KubeCons - - -- **Do you think that your project meets the criteria for incubation?** - -We believe we have made significant progress toward this goal and that the project is ready for incubation. -We would like however to release Serverless Workflow specification v1.0 before starting this process -(scheduled for April 2022). 
- -## Project Links -* [Website](https://serverlessworkflow.io/) -* [GitHub](https://github.com/serverlessworkflow) -* Slack:[CNCF](http://slack.cncf.io) / #serverless-workflow -* [Twitter](https://twitter.com/CNCFWorkflow) \ No newline at end of file diff --git a/annualreviews/2022-ServerlessWorkflowSpecification-annual.md b/annualreviews/2022-ServerlessWorkflowSpecification-annual.md deleted file mode 100644 index 9e802e32..00000000 --- a/annualreviews/2022-ServerlessWorkflowSpecification-annual.md +++ /dev/null @@ -1,157 +0,0 @@ -# Serverless Workflow Specification 2022 Annual Review - -- [Background](#background) -- [Alignment with Cloud Native](#alignment-with-cloud-native) -- [Year in Review](#year-in-review) -- [Annual Review Contents](#annual-review-contents) -- [Project Links](#project-links) - -## Background - -Serverless Workflow is a vendor-neutral, open-source, and fully community-driven ecosystem -for defining and running DSL-based workflows that target the serverless technology domain. - -This project is composed of: - -* [Specification](https://github.com/serverlessworkflow/specification/blob/main/specification.md) for defining DSL-based workflows. -* [Developer SDKs](https://github.com/serverlessworkflow/specification#sdks) that provide support for many programming languages. -* [Workflow runtimes](https://github.com/serverlessworkflow/specification#runtime) part of the project ecosystem, and support the execution of specification DSL. -* [Developer tooling](https://github.com/serverlessworkflow/specification#Tooling) support for writing DSL-based workflows. - -Serverless Workflow was approved as a Cloud Native Sandbox level project on July 14, 2020. - -* [TOC review PDF](https://github.com/serverlessworkflow/specification/blob/main/community/presentations/2020-4-15-toc-pres.pdf). 
-* [TOC sandbox proposal PR](https://github.com/cncf/toc/pull/376) - -## Alignment with Cloud Native - -Serverless Workflow project falls under the [CNCF "App Definition and Development"](https://landscape.cncf.io/card-mode?category=app-definition-and-development&grouping=category) category. - -Serverless Workflow is a member project of the [CNCF Serverless Working Group](https://github.com/cncf/wg-serverless). - -Serverless Workflow includes [Synapse](https://github.com/serverlessworkflow/synapse), a Kubernetes-native runtime engine for executing workflows that follows the specification DSL definition. - -In addition, Serverless Workflow provides support for several other open-source projects and specifications in the cloud-native -space: - -* CloudEvents -* OpenAPI -* AsyncAPI -* GraphQL -* OData -* OAuth2 - -## Year in Review - -This year was for stabilization and discussions around the upcoming 1.0 release. The ecosystem kept growing: - -* Two new releases for the [sdk-java](https://github.com/serverlessworkflow/sdk-java/releases) -* One new release for the [sdk-typescript](https://github.com/serverlessworkflow/sdk-typescript/releases) -* Four new releses for the [sdk-go](https://github.com/serverlessworkflow/sdk-typescript/releases) -* Seven minor releases for the [sdk-net](https://github.com/serverlessworkflow/sdk-net/tags) -* Four minor releases for the [Synapse](https://github.com/serverlessworkflow/synapse/releases) runtime -* Added [sdk-python](https://github.com/serverlessworkflow/sdk-python), a Python SDK into our ecosystem - -From the community perspective we also had a good year: - - -* Over 210 new stars on our [specification github repo](https://github.com/serverlessworkflow/specification). -* Over 200 people attending our project kiosk at KubeCon NA 2022 -* Presented in-person at KubeCon EU 2022 for more than 100 people. -* Over 200 people attending our project office hours at 2022 KubeCon EU. 
-* Over 50 people attending our project office hours at 2022 KubeCon China. - -## Annual Review Contents - -- **Include a link to your project’s devstats page. We will be looking for signs of consistent or increasing contribution activity.** - -Project [DevStats page](https://serverlessworkflow.devstats.cncf.io). -The info for the span of one year shows: - -* Over 50 new forks -* Over 80% increase to Synapse (runtime implementation) github stars compared to last year -* Over 101% increase to github stars compared to last year - -- **How many maintainers do you have, and which organizations are they from?** - -Serverless Workflow currently has [3 project maintainers](https://github.com/serverlessworkflow/specification/blob/main/MAINTAINERS.md) - - -- Charles d'Avernas, Neuroglia -- Ricardo Zanini, Red Hat -- Tihomir Surdilovic, Temporal Technologies - -- **What do you know about adoption, and how has this changed since your last review / since you joined Sandbox?** - -Both the adoption and community interest has been steadily increasing over the course of the year. -Most notable adoptions have been by: - - -- [Apache EventMesh](https://eventmesh.apache.org/), a new generation serverless event middleware for building distributed event-driven applications -- [Automatiko](https://automatiko.io/), a workflow automation runtime -- [FaasNet](https://github.com/simpleidserver/FaasNet), FaasNet makes it easy to deploy functions and API to Kubernetes without repetitive, boiler-plate coding. 
-- [OpenShift Serverless Logic](https://developers.redhat.com/articles/2022/08/15/how-openshift-serverless-logic-evolved-improve-workflows), a Red Hat product under Tech Preview integrated with their flagship product, OpenShift -- [Synapse](https://github.com/serverlessworkflow/synapse), a Kubernetes-based workflow runtime which has joined the Serverless Workflow ecosystem - -These are the companies that have adopeted the Serverless Workflow Specification: - - -- [CAF](https://caf.io), Serverless Workflow is the core technology behind every KYC/KYB solution allowing them to customize it for their clients seamlessly. -- [Huawei](https://www.huaweicloud.com/intl/en-us/product/functiongraph.html), Huawei FunctionGraph hosts event-driven functions in a serverless context while ensuring high availability, high scalability, and zero maintenance. -- [IBM](https://www.ibm.com/) As active members of the open-source KIE community, the BAMOE team from IBM's Digital Business Automation division is highly committed to standards within the business automation domain, and CNCF Serverless Workflow is no different. Eventually, IBM plans to incorporate the CNCF Serverless Workflow format into its product offering, providing more choices to customers to take advantage of BAMOE workflow capabilities. -- [Neuroglia](https://neuroglia.io/), Neuroglia is a consultancy and solution design company for the digital transformation of companies and their services. -- [OpenEnterprise](https://automatiko.io/), OpenEnterprise Automatiko helps you build better services and functions based on workflows expressed with well known standards. -- [Red Hat](https://redhat.com/), Red Hat sponsors the development of Kogito Serverless Workflow, which is a tool for building cloud-native workflow applications. -- [Tantl](https://www.tantl.com/), Tantl is making it easy for developers to build internal workflows, such as allowing customer support reps to quickly process refunds. 
-- [Temporal](https://temporal.io/), Temporal is the open source microservice orchestration platform for writing durable workflows as code. - -There are a few other companies that are in touch with us and using the specification, but can't disclose at the moment. - -- **How has the project performed against its goals since the last review?** - -This is our second annual review since becoming a Sandbox project. -This year we looked for stabilization of the specification by having more -discussions with current implementation project leaders to achieve a good -balance between the standards and production-level use cases. - -We decided to grow slowly and now we are reaching to a point to fix most of -the open issues and discussions before releasing 0.9 version, and then the final -1.0 by the end of the year. - -One of the goals we achieved, was to set a new governance model to balance the -responsibilities amongst all the maintainers. - -We had to do a little detour duo to the progress of projects implementing and using -the specification in many production-level use cases. - -- **What are the current goals of the project?** - -Specification [roadmap](https://github.com/serverlessworkflow/specification/tree/main/roadmap) and [progress tracker](https://github.com/orgs/serverlessworkflow/projects/1/views/2). 
- -Our main goals for the project include: - -- Release specification version 1.0 by late 2023 -- Create a specification TCK -- Add integrations with different workflow DSLs -- Improve our community tooling support - -- **How can the CNCF help you achieve your upcoming goals?** - -- Help us in promoting the project (blogs, Twitter, KubeCon, etc.) -- If feasible, help our project via crowdfunding [here](https://crowdfunding.lfx.linuxfoundation.org/projects/serverless-workflow) -- Keep providing us with opportunities to have project office hours and talks at KubeCons - -- **Do you think that your project meets the criteria for incubation?** - -We believe we have made significant progress toward this goal and that the project is ready for incubation. -However, we would like to release Serverless Workflow specification v1.0 before starting this process -(scheduled for late 2023). - -## Project Links - -* [Website](https://serverlessworkflow.io/) -* [GitHub](https://github.com/serverlessworkflow) -* Slack: [CNCF](http://slack.cncf.io) / #serverless-workflow -* [Twitter](https://twitter.com/CNCFWorkflow) diff --git a/community/contributors.md b/community/contributors.md index d3aa8ff5..b05beb99 100644 --- a/community/contributors.md +++ b/community/contributors.md @@ -13,7 +13,6 @@ us know in chat or team meeting. * **Independent** * Louis Fourie * Achilleas Tzenetopoulos - * Antonio Mendoza Pérez * Richard Gibson * Lucas Stocksmeier @@ -38,6 +37,7 @@ us know in chat or team meeting. * Chathura Ekanayake * **Temporal Technologies** + * Antonio Mendoza Pérez * Tihomir Surdilovic * **Red Hat** @@ -82,5 +82,4 @@ us know in chat or team meeting. 
* Manickavasagam Sundaram - -* **OpenEnterprise** - * Maciek Swiderski - + * Maciek Swiderski \ No newline at end of file diff --git a/comparisons/README.md b/comparisons/README.md deleted file mode 100644 index 4c067f4b..00000000 --- a/comparisons/README.md +++ /dev/null @@ -1,9 +0,0 @@ -# Comparisons with other workflow languages - -The following comparison documents are available: - -* [Argo comparison examples](comparison-argo.md) -* [Brigade comparison examples](comparison-brigade.md) -* [Google Cloud Workflow comparison examples](comparison-google-cloud-workflows.md) -* [Temporal comparison examples](comparison-temporal.md) -* [BPMN2 comparison examples](comparison-bpmn.md) diff --git a/comparisons/comparison-argo.md b/comparisons/comparison-argo.md deleted file mode 100644 index 9898f5d7..00000000 --- a/comparisons/comparison-argo.md +++ /dev/null @@ -1,928 +0,0 @@ -# Comparisons - Argo Workflows - -[Argo Workflows](https://github.com/argoproj/argo) is an open-source container-native workflow engine for -orchestrating parallel jobs on Kubernetes. -The Argo markup is YAML-based, and workflows are implemented as a Kubernetes CRD (Custom Resource Definition). -Argo is also a [CNCF](https://www.cncf.io/) Incubating project. - -Argo has a number of [examples](https://github.com/argoproj/argo-workflows/tree/master/examples) which display -different Argo templates. - -The purpose of this document is to show side-by-side the Argo markup and the equivalent markup of the -Serverless Workflow Specification. This can hopefully help compare and contrast the two markups and -give a better understanding of both. - -## Preface - -Argo YAML is defined inside a Kubernetes CRD (Custom Resource Definition). The resource definition contains a "spec" -parameter which contains the entrypoint of the workflow and the template parameter which defines one or more -workflow definitions. 
When comparing the examples below, please note that the Serverless Workflow specification YAML -pertains to the content of the "spec" parameter. Other parameters in the Kubernetes CRD are not considered and can -remain the same. Note that the Serverless Workflow YAML could also be embedded inside the CRD. - -For the sake of comparing the two models, we use the YAML representation for the -Serverless Workflow specification part as well. - -## Table of Contents - -- [Hello World with Parameters](#Hello-World-With-Parameters) -- [Multi Step Workflow](#Multi-Step-Workflow) -- [Directed Acyclic Graph](#Directed-Acyclic-Graph) -- [Scripts and Results](#Scripts-And-Results) -- [Loops](#Loops) -- [Conditionals](#Conditionals) -- [Retrying Failed Steps](#Retrying-Failed-Steps) -- [Recursion](#Recursion) -- [Exit Handlers](#Exit-Handlers) - -### Hello World With Parameters - -[Argo Example](https://github.com/argoproj/argo-workflows/tree/master/examples#parameters) - - - - - - - - - - -
ArgoServerless Workflow
- -```yaml -apiVersion: argoproj.io/v1alpha1 -kind: Workflow -metadata: - generateName: hello-world-parameters- -spec: - entrypoint: whalesay - arguments: - parameters: - - name: message - value: hello world - - templates: - - name: whalesay - inputs: - parameters: - - name: message - container: - image: docker/whalesay - command: [cowsay] - args: ["{{inputs.parameters.message}}"] -``` - - - -```yaml -id: hello-world-parameters -name: Hello World with parameters -version: '1.0.0' -specVersion: '0.8' -start: whalesay -functions: -- name: whalesayimage - metadata: - image: docker/whalesay - command: cowsay -states: -- name: whalesay - type: operation - actions: - - functionRef: - refName: whalesayimage - arguments: - message: "${ .message }" - end: true -``` - -
- -### Multi Step Workflow - -[Argo Example](https://github.com/argoproj/argo-workflows/tree/master/examples#steps) - - - - - - - - - - -
ArgoServerless Workflow
- -```yaml -apiVersion: argoproj.io/v1alpha1 -kind: Workflow -metadata: - generateName: steps- -spec: - entrypoint: hello-hello-hello - templates: - - name: hello-hello-hello - steps: - - - name: hello1 # hello1 is run before the following steps - template: whalesay - arguments: - parameters: - - name: message - value: "hello1" - - - name: hello2a # double dash => run after previous step - template: whalesay - arguments: - parameters: - - name: message - value: "hello2a" - - name: hello2b # single dash => run in parallel with previous step - template: whalesay - arguments: - parameters: - - name: message - value: "hello2b" - - name: whalesay - inputs: - parameters: - - name: message - container: - image: docker/whalesay - command: [cowsay] - args: ["{{inputs.parameters.message}}"] -``` - - - -```yaml -id: hello-hello-hello -name: Multi Step Hello -version: '1.0.0' -specVersion: '0.8' -start: hello1 -functions: -- name: whalesayimage - metadata: - image: docker/whalesay - command: cowsay -states: -- name: hello1 - type: operation - actions: - - functionRef: - refName: whalesayimage - arguments: - message: hello1 - transition: parallelhello -- name: parallelhello - type: parallel - completionType: allOf - branches: - - name: hello2a-branch - actions: - - functionRef: - refName: whalesayimage - arguments: - message: hello2a - - name: hello2b-branch - actions: - - functionRef: - refName: whalesayimage - arguments: - message: hello2b - end: true -``` - -
- -### Directed Acyclic Graph - -[Argo Example](https://github.com/argoproj/argo-workflows/tree/master/examples#dag) - -*Note*: Even though this example can be described using the specification (it has a single -starting task), the spec does not currently support multiple -start events. Argo workflows that have multiple starting -DAG tasks cannot be described using the specification at this time. - - - - - - - - - - -
ArgoServerless Workflow
- -```yaml -apiVersion: argoproj.io/v1alpha1 -kind: Workflow -metadata: - generateName: dag-diamond- -spec: - entrypoint: diamond - templates: - - name: echo - inputs: - parameters: - - name: message - container: - image: alpine:3.7 - command: [echo, "{{inputs.parameters.message}}"] - - name: diamond - dag: - tasks: - - name: A - template: echo - arguments: - parameters: [{name: message, value: A}] - - name: B - dependencies: [A] - template: echo - arguments: - parameters: [{name: message, value: B}] - - name: C - dependencies: [A] - template: echo - arguments: - parameters: [{name: message, value: C}] - - name: D - dependencies: [B, C] - template: echo - arguments: - parameters: [{name: message, value: D}] -``` - - - -```yaml -id: dag-diamond- -name: DAG Diamond Example -version: '1.0.0' -specVersion: '0.8' -start: A -functions: -- name: echo - metadata: - image: alpine:3.7 - command: '[echo, "{{inputs.parameters.message}}"]' -states: -- name: A - type: operation - actions: - - functionRef: - refName: echo - arguments: - message: A - transition: parallelecho -- name: parallelecho - type: parallel - completionType: allOf - branches: - - name: B-branch - actions: - - functionRef: - refName: echo - arguments: - message: B - - name: C-branch - actions: - - functionRef: - refName: echo - arguments: - message: C - transition: D -- name: D - type: operation - actions: - - functionRef: - refName: echo - arguments: - message: D - end: true -``` - -
- -### Scripts And Results - -[Argo Example](https://github.com/argoproj/argo-workflows/tree/master/examples#scripts--results) - - - - - - - - - - -
ArgoServerless Workflow
- -```yaml -apiVersion: argoproj.io/v1alpha1 -kind: Workflow -metadata: - generateName: scripts-bash- -spec: - entrypoint: bash-script-example - templates: - - name: bash-script-example - steps: - - - name: generate - template: gen-random-int-bash - - - name: print - template: print-message - arguments: - parameters: - - name: message - value: "{{steps.generate.outputs.result}}" # The result of the here-script - - - name: gen-random-int-bash - script: - image: debian:9.4 - command: [bash] - source: | # Contents of the here-script - cat /dev/urandom | od -N2 -An -i | awk -v f=1 -v r=100 '{printf "%i\n", f + r * $1 / 65536}' - - - name: gen-random-int-python - script: - image: python:alpine3.6 - command: [python] - source: | - import random - i = random.randint(1, 100) - print(i) - - - name: gen-random-int-javascript - script: - image: node:9.1-alpine - command: [node] - source: | - var rand = Math.floor(Math.random() * 100); - console.log(rand); - - - name: print-message - inputs: - parameters: - - name: message - container: - image: alpine:latest - command: [sh, -c] - args: ["echo result was: {{inputs.parameters.message}}"] -``` - - - -```yaml -id: scripts-bash- -name: Scripts and Results Example -version: '1.0.0' -specVersion: '0.8' -start: generate -functions: -- name: gen-random-int-bash - metadata: - image: debian:9.4 - command: bash - source: |- - cat /dev/urandom | od -N2 -An -i | awk -v f=1 -v r=100 '{printf "%i - ", f + r * $1 / 65536}' -- name: gen-random-int-python - metadata: - image: python:alpine3.6 - command: python - source: "import random \ni = random.randint(1, 100) \nprint(i)\n" -- name: gen-random-int-javascript - metadata: - image: node:9.1-alpine - command: node - source: "var rand = Math.floor(Math.random() * 100); \nconsole.log(rand);\n" -- name: printmessagefunc - metadata: - image: alpine:latest - command: sh, -c - source: 'echo result was: ${ .inputs.parameters.message }' -states: -- name: generate - type: operation - actions: - - 
functionRef: gen-random-int-bash - actionDataFilter: - results: "${ .results }" - transition: print-message -- name: print-message - type: operation - actions: - - functionRef: - refName: printmessagefunc - arguments: - message: "${ .results }" - end: true -``` - -
- -### Loops - -[Argo Example](https://github.com/argoproj/argo-workflows/tree/master/examples#loops) - - - - - - - - - - -
ArgoServerless Workflow
- -```yaml -apiVersion: argoproj.io/v1alpha1 -kind: Workflow -metadata: - generateName: loops- -spec: - entrypoint: loop-example - templates: - - name: loop-example - steps: - - - name: print-message - template: whalesay - arguments: - parameters: - - name: message - value: "{{item}}" - withItems: # invoke whalesay once for each item in parallel - - hello world # item 1 - - goodbye world # item 2 - - - name: whalesay - inputs: - parameters: - - name: message - container: - image: docker/whalesay:latest - command: [cowsay] - args: ["{{inputs.parameters.message}}"] -``` - - - -```yaml -id: loops- -name: Loop over data example -version: '1.0.0' -specVersion: '0.8' -start: injectdata -functions: -- name: whalesay - metadata: - image: docker/whalesay:latest - command: cowsay -states: -- name: injectdata - type: inject - data: - greetings: - - hello world - - goodbye world - transition: printgreetings -- name: printgreetings - type: foreach - inputCollection: "${ .greetings }" - iterationParam: greeting - actions: - - name: print-message - functionRef: - refName: whalesay - arguments: - message: "${ .greeting }" - end: true -``` - -
- -### Conditionals - -[Argo Example](https://github.com/argoproj/argo-workflows/tree/master/examples#conditionals) - - - - - - - - - - -
ArgoServerless Workflow
- -```yaml -apiVersion: argoproj.io/v1alpha1 -kind: Workflow -metadata: - generateName: coinflip- -spec: - entrypoint: coinflip - templates: - - name: coinflip - steps: - # flip a coin - - - name: flip-coin - template: flip-coin - # evaluate the result in parallel - - - name: heads - template: heads # call heads template if "heads" - when: "{{steps.flip-coin.outputs.result}} == heads" - - name: tails - template: tails # call tails template if "tails" - when: "{{steps.flip-coin.outputs.result}} == tails" - - # Return heads or tails based on a random number - - name: flip-coin - script: - image: python:alpine3.6 - command: [python] - source: | - import random - result = "heads" if random.randint(0,1) == 0 else "tails" - print(result) - - - name: heads - container: - image: alpine:3.6 - command: [sh, -c] - args: ["echo \"it was heads\""] - - - name: tails - container: - image: alpine:3.6 - command: [sh, -c] - args: ["echo \"it was tails\""] -``` - - - -```yaml -id: coinflip- -name: Conditionals Example -version: '1.0.0' -specVersion: '0.8' -start: flip-coin -functions: -- name: flip-coin-function - metadata: - image: python:alpine3.6 - command: python - source: import random result = "heads" if random.randint(0,1) == 0 else "tails" - print(result) -- name: echo - metadata: - image: alpine:3.6 - command: sh, -c -states: -- name: flip-coin - type: operation - actions: - - functionRef: flip-coin-function - actionDataFilter: - results: "${ .flip.result }" - transition: show-flip-results -- name: show-flip-results - type: switch - dataConditions: - - condition: "${ .flip | .result == \"heads\" }" - transition: show-results-heads - - condition: "${ .flip | .result == \"tails\" }" - transition: show-results-tails -- name: show-results-heads - type: operation - actions: - - functionRef: echo - actionDataFilter: - results: it was heads - end: true -- name: show-results-tails - type: operation - actions: - - functionRef: echo - actionDataFilter: - results: it was tails - end: 
true -``` - -
- - -### Retrying Failed Steps - -[Argo Example](https://github.com/argoproj/argo-workflows/tree/master/examples#retrying-failed-or-errored-steps) - - - - - - - - - - -
ArgoServerless Workflow
- -```yaml -apiVersion: argoproj.io/v1alpha1 -kind: Workflow -metadata: - generateName: retry-backoff- -spec: - entrypoint: retry-backoff - templates: - - name: retry-backoff - retryStrategy: - limit: 10 - retryPolicy: "Always" - backoff: - duration: "1" # Must be a string. Default unit is seconds. Could also be a Duration, e.g.: "2m", "6h", "1d" - factor: 2 - maxDuration: "1m" # Must be a string. Default unit is seconds. Could also be a Duration, e.g.: "2m", "6h", "1d" - affinity: - nodeAntiAffinity: {} - container: - image: python:alpine3.6 - command: ["python", -c] - # fail with a 66% probability - args: ["import random; import sys; exit_code = random.choice([0, 1, 1]); sys.exit(exit_code)"] -``` - - - -```yaml -id: retry-backoff- -name: Retry Example -version: '1.0.0' -specVersion: '0.8' -start: retry-backoff -functions: -- name: fail-function - metadata: - image: python:alpine3.6 - command: python -retries: -- name: All workflow errors retry strategy - maxAttempts: 10 - delay: PT1S - maxDelay: PT1M - multiplier: 2 -states: -- name: retry-backoff - type: operation - actions: - - functionRef: - refName: fail-function - arguments: - args: - - import random; import sys; exit_code = random.choice([0, 1, 1]); sys.exit(exit_code) - end: true -``` - -
- - -### Recursion - -[Argo Example](https://github.com/argoproj/argo-workflows/tree/master/examples#recursion) - - - - - - - - - - -
ArgoServerless Workflow
- -```yaml -apiVersion: argoproj.io/v1alpha1 -kind: Workflow -metadata: - generateName: coinflip-recursive- -spec: - entrypoint: coinflip - templates: - - name: coinflip - steps: - # flip a coin - - - name: flip-coin - template: flip-coin - # evaluate the result in parallel - - - name: heads - template: heads # call heads template if "heads" - when: "{{steps.flip-coin.outputs.result}} == heads" - - name: tails # keep flipping coins if "tails" - template: coinflip - when: "{{steps.flip-coin.outputs.result}} == tails" - - - name: flip-coin - script: - image: python:alpine3.6 - command: [python] - source: | - import random - result = "heads" if random.randint(0,1) == 0 else "tails" - print(result) - - - name: heads - container: - image: alpine:3.6 - command: [sh, -c] - args: ["echo \"it was heads\""] -``` - - - -```yaml -id: coinflip-recursive- -name: Recursion Example -version: '1.0.0' -specVersion: '0.8' -start: flip-coin-state -functions: -- name: heads-function - metadata: - image: alpine:3.6 - command: echo "it was heads" -- name: flip-coin-function - metadata: - image: python:alpine3.6 - command: python - source: import random result = "heads" if random.randint(0,1) == 0 else "tails" print(result) -states: -- name: flip-coin-state - type: operation - actions: - - functionRef: flip-coin-function - actionDataFilter: - results: "${ .steps.flip-coin.outputs.result }" - transition: flip-coin-check -- name: flip-coin-check - type: switch - dataConditions: - - condition: "${ .steps.flip-coin.outputs | .result == \"tails\" }" - transition: flip-coin-state - - condition: "${ .steps.flip-coin.outputs | .result == \"heads\" }" - transition: heads-state -- name: heads-state - type: operation - actions: - - functionRef: - refName: heads-function - arguments: - args: echo "it was heads" - end: true -``` - -
- - -### Exit Handlers - -[Argo Example](https://github.com/argoproj/argo-workflows/tree/master/examples#exit-handlers) - -*Note*: With the Serverless Workflow specification, we can handle Argo's "onExit" functionality -in a couple of ways. One is the "onErrors" functionality, which defines errors and transitions to the parts -of the workflow that are capable of handling them. -Another is to send an event at the end of workflow execution -which includes the workflow status. This event can then trigger the execution of other workflows -that can handle each status. For this example, we use the "onErrors" definition. - - - - - - - - - - -
ArgoServerless Workflow
- -```yaml -apiVersion: argoproj.io/v1alpha1 -kind: Workflow -metadata: - generateName: exit-handlers- -spec: - entrypoint: intentional-fail - onExit: exit-handler # invoke exit-handler template at end of the workflow - templates: - # primary workflow template - - name: intentional-fail - container: - image: alpine:latest - command: [sh, -c] - args: ["echo intentional failure; exit 1"] - - name: exit-handler - steps: - - - name: notify - template: send-email - - name: celebrate - template: celebrate - when: "{{workflow.status}} == Succeeded" - - name: cry - template: cry - when: "{{workflow.status}} != Succeeded" - - name: send-email - container: - image: alpine:latest - command: [sh, -c] - args: ["echo send e-mail: {{workflow.name}} {{workflow.status}}"] - - name: celebrate - container: - image: alpine:latest - command: [sh, -c] - args: ["echo hooray!"] - - name: cry - container: - image: alpine:latest - command: [sh, -c] - args: ["echo boohoo!"] -``` - - - -```yaml -id: exit-handlers- -name: Exit/Error Handling Example -version: '1.0.0' -specVersion: '0.8' -autoRetries: true -start: intentional-fail-state -functions: - - name: intentional-fail-function - metadata: - image: alpine:latest - command: "[sh, -c]" - - name: send-email-function - metadata: - image: alpine:latest - command: "[sh, -c]" - - name: celebrate-cry-function - metadata: - image: alpine:latest - command: "[sh, -c]" -errors: - - name: IntentionalError - code: '404' -states: - - name: intentional-fail-state - type: operation - actions: - - functionRef: - refName: intentional-fail-function - arguments: - args: echo intentional failure; exit 1 - nonRetryableErrors: - - IntentionalError - onErrors: - - errorRef: IntentionalError - transition: send-email-state - end: true - - name: send-email-state - type: operation - actions: - - functionRef: - refName: send-email-function - arguments: - args: 'echo send e-mail: ${ .workflow.name } ${ .workflow.status }' - transition: emo-state - - name: emo-state - 
type: switch - dataConditions: - - condition: ${ .workflow| .status == "Succeeded" } - transition: celebrate-state - - condition: ${ .workflow| .status != "Succeeded" } - transition: cry-state - - name: celebrate-state - type: operation - actions: - - functionRef: - refName: celebrate-cry-function - arguments: - args: echo hooray! - end: true - - name: cry-state - type: operation - actions: - - functionRef: - refName: celebrate-cry-function - arguments: - args: echo boohoo! - end: true -``` - -
diff --git a/comparisons/comparison-bpmn.md b/comparisons/comparison-bpmn.md deleted file mode 100644 index 6b935f92..00000000 --- a/comparisons/comparison-bpmn.md +++ /dev/null @@ -1,518 +0,0 @@ -# Comparisons - BPMN2 - -The [Business Process Model and Notation (BPMN)](https://www.omg.org/spec/BPMN/2.0/PDF) defines a flowchart-based -DSL for workflows. It is maintained by the [Object Management Group (OMG)](https://www.omg.org/). -The latest BPMN version is [2.0.2](https://www.omg.org/spec/BPMN/2.0.2/), published in 2014. - -BPMN2 defines a graphical notation to specify workflows. This notation can then be shared between tooling and organizations. -The graphical notation is translated into XML, which then can be used for runtime execution. - -For this comparison, we will compare the Serverless Workflow language with the graphical representation of BPMN2, -and not its underlying XML representation. The BPMN2 XML is very difficult to understand, quite large for even the smallest workflows, and often not portable between runtimes. -It makes more sense to use its portable graphical notation for comparisons. - -Serverless Workflow is a declarative workflow language, represented with JSON or YAML. It currently does not define a graphical notation. However, it can be graphically represented using different flowcharting techniques such as -UML activity diagrams. The [Serverless Workflow Java SDK](https://github.com/serverlessworkflow/sdk-java#building-workflow-diagram) as well as its [VSCode Extension](https://github.com/serverlessworkflow/vscode-extension) provide means to generate SVG diagrams based on the workflow JSON/YAML. - -## Note when reading provided examples - -The BPMN2 graphical notation does not provide details about data inputs/outputs, mapping, and transformation. -BPMN2 does provide graphical representation for things such as Data Objects. However, most of the examples -available do not use them. 
Execution semantics such as task and event properties are also not visually represented. -For this reason, the event, function, retry, and data mapping definitions in the associated Serverless Workflow YAML are assumed. - -## Table of Contents - -- [Simple File Processor](#Simple-File-Processor) -- [Process Application](#Process-Application) -- [Compensation](#Compensation) -- [Error Handling with Retries](#Error-Handling-With-Retries) -- [Process Execution Timeout](#Process-Execution-Timeout) -- [Multiple Instance Subprocess](#Multiple-Instance-Subprocess) -- [Loop Subprocess](#Loop-Subprocess) -- [Approve Report (User Task)](#Approve-Report) -- [Event Based Decision](#Event-Based-Decision) - -### Simple File Processor - - - - - - - - - - -
BPMN2 DiagramServerless Workflow
-

-BPMN2 Simple File Processing Workflow -

-
- -```yaml -id: processfile -name: Process File Workflow -version: '1.0.0' -specVersion: '0.8' -start: Process File -states: -- name: Process File - type: operation - actions: - - functionRef: processFile - end: true -functions: -- name: processFile - operation: file://myservice.json#process -``` - -
- -### Process Application - - - - - - - - - - -
BPMN2 DiagramServerless Workflow
-

-BPMN2 Process Applicant Workflow -

-
- -```yaml -id: processapplication -name: Process Application -version: '1.0.0' -specVersion: '0.8' -start: ProcessNewApplication -states: -- name: ProcessNewApplication - type: event - onEvents: - - eventRefs: - - ApplicationReceivedEvent - actions: - - functionRef: processApplicationFunction - - functionRef: acceptApplicantFunction - - functionRef: depositFeesFunction - end: - produceEvents: - - eventRef: NotifyApplicantEvent -functions: -- name: processApplicationFunction - operation: file://myservice.json#process -- name: acceptApplicantFunction - operation: file://myservice.json#accept -- name: depositFeesFunction - operation: file://myservice.json#deposit -events: -- name: ApplicationReceivedEvent - type: application - source: "/applications/new" -- name: NotifyApplicantEvent - type: notifications - source: "/applicants/notify" -``` - -
- -### Compensation - - - - - - - - - - -
BPMN2 DiagramServerless Workflow
-

-BPMN2 Simple Compensation Workflow -

-
- -```yaml -id: simplecompensation -name: Simple Compensation -version: '1.0.0' -specVersion: '0.8' -start: Step 1 -states: -- name: Step 1 - type: operation - actions: - - functionRef: step1function - compensatedBy: Cancel Step 1 - transition: Step 2 -- name: Step 2 - type: operation - actions: - - functionRef: step2function - transition: OK? -- name: OK? - type: switch - dataConditions: - - name: 'yes' - condition: ${ .outcome | .ok == "yes" } - end: true - - name: 'no' - condition: ${ .outcome | .ok == "no" } - end: - compensate: true -- name: Cancel Step 1 - type: operation - usedForCompensation: true - actions: - - functionRef: undostep1 -functions: -- name: step1function - operation: file://myservice.json#step1 -- name: step2function - operation: file://myservice.json#step2 -- name: undostep1function - operation: file://myservice.json#undostep1 -``` - -
- -### Error Handling With Retries - - - - - - - - - - -
BPMN2 DiagramServerless Workflow
-

-BPMN2 Error Handling With Retries Workflow -

-
- -```yaml ---- -id: errorwithretries -name: Error Handling With Retries Workflow -version: '1.0.0' -specVersion: '0.8' -start: Make Coffee -states: - - name: Make Coffee - type: operation - actions: - - functionRef: makeCoffee - transition: Add Milk - - name: Add Milk - type: operation - actions: - - functionRef: addMilk - retryRef: noMilkRetries - retryableErrors: - - D'oh! No more Milk! - onErrors: - - errorRef: D'oh! No more Milk! - end: true - transition: Drink Coffee - - name: Drink Coffee - type: operation - actions: - - functionRef: drinkCoffee - end: true -retries: - - name: noMilkRetries - delay: PT1M - maxAttempts: 10 -errors: - - name: D'oh! No more Milk! - code: '123' -functions: - - name: makeCoffee - operation: file://myservice.json#make - - name: addMilk - operation: file://myservice.json#add - - name: drinkCoffee - operation: file://myservice.json#drink - -``` - -
- -### Process Execution Timeout - - - - - - - - - - -
BPMN2 DiagramServerless Workflow
-

-BPMN2 Execution Timeout Workflow -

-
- -```yaml -id: executiontimeout -name: Execution Timeout Workflow -version: '1.0.0' -specVersion: '0.8' -start: Purchase Parts -timeouts: - workflowExecTimeout: - duration: PT7D - interrupt: true - runBefore: Handle timeout -states: -- name: Purchase Parts - type: operation - actions: - - functionRef: purchasePartsFunction - transition: Unpack Parts -- name: Unpack Parts - type: operation - actions: - - functionRef: unpackPartsFunction - end: true -- name: Handle timeout - type: operation - actions: - - functionRef: handleTimeoutFunction -functions: -- name: purchasePartsFunction - operation: file://myservice.json#purchase -- name: unpackPartsFunction - operation: file://myservice.json#unpack -- name: handleTimeoutFunction - operation: file://myservice.json#handle -``` - -
- -### Multiple Instance Subprocess - - - - - - - - - - -
BPMN2 DiagramServerless Workflow
-

-BPMN2 Multi-Instance Subprocess Workflow -

-
- -```yaml -id: foreachWorkflow -name: ForEach State Workflow -version: '1.0.0' -specVersion: '0.8' -start: ForEachItem -states: -- name: ForEachItem - type: foreach - inputCollection: "${ .inputsArray }" - iterationParam: "${ .inputItem }" - outputCollection: "${ .outputsArray }" - actions: - - subFlowRef: doSomethingAndWaitForMessage - end: true -``` - -
- -* Note: We did not include the `dosomethingandwaitformessage` workflow in this example, which would just include -a starting "operation" state transitioning to an "event" state which waits for the needed event. - -### Loop Subprocess - - - - - - - - - - -
BPMN2 DiagramServerless Workflow
-

-BPMN2 Loop Subprocess Workflow -

-
- -```yaml -id: subflowloop -name: SubFlow Loop Workflow -version: '1.0.0' -specVersion: '0.8' -start: SubflowRepeat -states: -- name: SubflowRepeat - type: operation - actions: - - functionRef: checkAndReplyToEmail - actionDataFilter: - fromStateData: ${ .someInput } - toStateData: ${ .someInput } - stateDataFilter: - output: ${ .maxChecks -= 1 } - transition: CheckCount -- name: CheckCount - type: switch - dataConditions: - - condition: ${ .maxChecks > 0 } - transition: SubflowRepeat - defaultCondition: - end: true -``` - -
- -This workflow assumes that the input to the workflow includes a maxChecks attribute set to an integer value. - -* Note: We did not include the `checkAndReplyToEmail` workflow in this example, which would include the -control-flow logic to check email and make a decision to reply to it or wait an hour. - -### Approve Report - - - - - - - - - - -
BPMN2 DiagramServerless Workflow
-

-BPMN2 Multi-Instance Subprocess Workflow -

-
- -```yaml -id: approvereport -name: Approve Report Workflow -version: '1.0.0' -specVersion: '0.8' -start: Approve Report -states: -- name: Approve Report - type: callback - action: - functionRef: managerDecideOnReport - eventRef: ReportDecisionMadeEvent - transition: Evaluate Report Decision -- name: Evaluate Report Decision - type: switch - dataConditions: - - name: Approve - condition: "${ .decision | .approved == true }" - end: true - - name: Reject - condition: "${ .decision | .approved != true }" - transition: Update Report -- name: Update Report - type: callback - action: - functionRef: workerUpdateReport - eventRef: ReportUpdatedEvent - transition: Approve Report -events: -- name: ReportDecisionMadeEvent - type: report.decisions - source: reports/decision -- name: ReportUpdatedEvent - type: report.updated - source: reports/updated -functions: -- name: managerDecideOnReport - operation: file://myservice.json#managerapproval -- name: workerUpdateReport - operation: file://myservice.json#workerupdate -``` - -
- -* Note: Human interactions during workflow execution in Serverless Workflow is handled via its [Callback state](../specification.md#Callback-State). - -### Event Based Decision - - - - - - - - - - -
BPMN2 DiagramServerless Workflow
-

-BPMN2 Event Decision Workflow -

-
- -```yaml -id: eventdecision -name: Event Decision workflow -version: '1.0.0' -specVersion: '0.8' -start: A -states: -- name: A - type: operation - actions: - - subFlowRef: asubflowid - transition: Event Decision -- name: Event Decision - type: switch - eventConditions: - - eventRef: EventB - transition: B - - eventRef: EventC - transition: C -- name: B - type: operation - actions: - - name: doSomething - functionRef: doSomethingFunction - end: true -- name: C - type: operation - actions: - - name: doSomething - functionRef: doSomethingFunction - end: true -events: -- name: EventB - type: my.events.b - source: "/events/+" -- name: EventC - type: my.events.c - source: "/events/+" -functions: -- name: doSomethingFunction - operation: file://myservice.json#dosomething -``` - -
diff --git a/comparisons/comparison-brigade.md b/comparisons/comparison-brigade.md
deleted file mode 100644
index e5b8ccde..00000000
--- a/comparisons/comparison-brigade.md
+++ /dev/null
@@ -1,626 +0,0 @@
-# Comparisons - Brigade
-
-[Brigade](https://github.com/brigadecore/brigade) is an open-source, Kubernetes-native tool for event-driven
-scripting. Brigade allows you to:
-* Script simple and complex workflows using JavaScript.
-* Chain containers together, running them in parallel or serially.
-* Fire scripts based on times, GitHub events, Docker pushes, or any other trigger.
-* Create pipelines for Kubernetes.
-* And much more.
-
-Brigade has a number of [examples](https://github.com/brigadecore/brigade/tree/master/docs/content/examples) showing
-JavaScript code that exercises different features of the project.
-
-The purpose of this document is to show, side by side, Brigade JavaScript code and the equivalent markup in the
-Serverless Workflow Specification. This comparison should help contrast the two and
-give a better understanding of both.
-
-You can find much more information on Brigade on its [website](https://brigade.sh/).
-
-## Table of Contents
-
-- [Greeting With Parameters](#Greeting-With-Parameters)
-- [Greeting With Error Checking](#Greeting-With-Error-Checking)
-- [Handling Multiple Events](#Handling-Multiple-Events)
-- [Grouping Actions](#Grouping-Actions)
-- [Event Data](#Event-Data)
-- [Action Results](#Action-Results)
-- [Emit Events](#Emit-Events)
-
-### Greeting With Parameters
-
-[Brigade Example](https://github.com/brigadecore/brigade/blob/master/docs/content/examples/advanced-01.js)
-
-
-
-
-
-
-
-
-
-
BrigadeServerless Workflow
-
-```javascript
-const { events, Job } = require("brigadier");
-
-events.on("exec", exec);
-
-function exec(e, p) {
-  let j1 = new Job("j1", "alpine:3.7", ["echo hello"]);
-  let j2 = new Job("j2", "alpine:3.7", ["echo goodbye"]);
-
-  j1.run()
-    .then(() => {
-      return j2.run()
-    })
-    .then(() => {
-      console.log("done");
-    });
-};
-```
-
-
-
-```yaml
-id: greeting
-name: Greeting Workflow
-version: '1.0.0'
-specVersion: '0.8'
-start: GreetingState
-events:
-- name: execEvent
-  type: exec
-functions:
-- name: greetingFunction
-  metadata:
-    image: alpine:3.7
-    command: echo
-- name: consoleLogFunction
-  type: console
-states:
-- name: GreetingState
-  type: event
-  onEvents:
-  - eventRefs:
-    - execEvent
-    actions:
-    - name: sayHelloAction
-      functionRef:
-        refName: greetingFunction
-        arguments:
-          greeting: hello
-    - name: sayGoodbyeAction
-      functionRef:
-        refName: greetingFunction
-        arguments:
-          greeting: goodbye
-    - name: logDoneAction
-      functionRef:
-        refName: consoleLogFunction
-        arguments:
-          log: done
-  end: true
-```
-
-
- -### Greeting With Error Checking - -[Brigade Example](https://github.com/brigadecore/brigade/blob/master/docs/content/examples/advanced-03.js) - - - - - - - - - - -
BrigadeServerless Workflow
- -```javascript -const { events, Job } = require("brigadier"); - -events.on("exec", exec); - -async function exec(e, p) { - let j1 = new Job("j1", "alpine:3.7", ["echo hello"]); - // This will fail - let j2 = new Job("j2", "alpine:3.7", ["exit 1"]); - - try { - await j1.run(); - await j2.run(); - console.log("done"); - } catch (e) { - console.log(`Caught Exception ${e}`); - } -}; -``` - - - -```yaml -id: greetingwitherrorcheck -name: Greeting Workflow With Error Check -version: '1.0.0' -specVersion: '0.8' -autoRetries: true -start: GreetingState -events: -- name: execEvent - type: exec -errors: -- name: CommonError - code: '123' -functions: -- name: greetingFunction - metadata: - image: alpine:3.7 - command: echo -- name: consoleLogFunction - metadata: - type: console -states: -- name: GreetingState - type: event - onEvents: - - eventRefs: - - execEvent - actions: - - name: sayHelloAction - functionRef: - refName: greetingFunction - arguments: - greeting: hello - nonRetryableErrors: - - CommonError - - name: sayGoodbyeAction - functionRef: - refName: greetingFunction - arguments: - greeting: hello - nonRetryableErrors: - - CommonError - - name: logDoneAction - functionRef: - refName: consoleLogFunction - arguments: - log: done - nonRetryableErrors: - - CommonError - onErrors: - - errorRef: CommonError - transition: HandleErrorState - end: true -- name: HandleErrorState - type: operation - actions: - - name: logErrorAction - functionRef: - refName: consoleLogFunction - arguments: - log: Caught Exception ${ .exception } - end: true -``` - -
- -### Handling Multiple Events - -[Brigade Example](https://github.com/brigadecore/brigade/blob/master/docs/content/examples/brigade-03.js) - - - - - - - - - - -
BrigadeServerless Workflow
- -```javascript -const { events } = require("brigadier") - -events.on("exec", () => { - console.log("==> handling an 'exec' event") -}) - -events.on("push", () => { - console.log(" **** I'm a GitHub 'push' handler") -}) -``` - - - -```yaml -id: multieventworkflow -name: Multiple Events Workflow -version: '1.0.0' -specVersion: '0.8' -start: GreetingState -events: -- name: execEvent - type: exec -- name: pushEvent - type: push -functions: -- name: consoleLogFunction - type: console -states: -- name: GreetingState - type: event - onEvents: - - eventRefs: - - execEvent - actions: - - name: logExecEventAction - functionRef: - refName: consoleLogFunction - arguments: - log: "==> handling an 'exec' event" - - eventRefs: - - pushEvent - actions: - - name: logPushEventAction - functionRef: - refName: consoleLogFunction - arguments: - log: "**** I'm a GitHub 'push' handler" - end: true -``` - -
-
-### Grouping Actions
-
-[Brigade Example](https://github.com/brigadecore/brigade/blob/master/docs/content/examples/brigade-12.js)
-
-* Note: The Serverless Workflow specification does not currently support grouping of actions. Grouping would be
-beneficial, as errors and retries could then be handled per group rather than across all defined actions as a
-whole, and it is a feature that could be added to the specification. Until then, grouping needs to be handled by
-separate states, each of which can perform its own error checking.
-
-
-
-
-
-
-
-
-
BrigadeServerless Workflow
- -```javascript -const { events, Job, Group } = require("brigadier") - -events.on("exec", () => { - var hello = new Job("hello", "alpine:3.4", ["echo hello"]) - var goodbye = new Job("goodbye", "alpine:3.4", ["echo goodbye"]) - - var helloAgain = new Job("hello-again", "alpine:3.4", ["echo hello again"]) - var goodbyeAgain = new Job("bye-again", "alpine:3.4", ["echo bye again"]) - - - var first = new Group() - first.add(hello) - first.add(goodbye) - - var second = new Group() - second.add(helloAgain) - second.add(goodbyeAgain) - - first.runAll().then( () => second.runAll() ) -}) -``` - - - -```yaml -id: groupActionsWorkflow -name: Group Actions Workflow -version: '1.0.0' -specVersion: '0.8' -start: FirstGreetGroup -events: -- name: execEvent - type: exec -functions: -- name: echoFunction - metadata: - image: alpine:3.7 - command: echo -states: -- name: FirstGreetGroup - type: event - onEvents: - - eventRefs: - - execEvent - actions: - - name: firstHelloAction - functionRef: - refName: echoFunction - arguments: - message: hello - - name: firstGoodbyeAction - functionRef: - refName: echoFunction - arguments: - message: goodbye - transition: SecondGreetGroup -- name: SecondGreetGroup - type: operation - actions: - - name: secondHelloAction - functionRef: - refName: echoFunction - arguments: - message: hello-again - - name: secondGoodbyeAction - functionRef: - refName: echoFunction - arguments: - message: bye-again - end: true -``` - -
-
-### Event Data
-
-[Brigade Example](https://github.com/brigadecore/brigade/blob/master/docs/content/examples/brigade-13.js)
-
-* Note: Events within the Serverless Workflow specification are required to be in the CloudEvents format. The
-CloudEvents specification defines a "data" context attribute containing the event payload, which is what we use
-in this example.
-
-
-
-
-
-
-
-
-
BrigadeServerless Workflow
-
-```javascript
-const { events } = require("brigadier")
-
-events.on("exec", (e, p) => {
-  console.log(">>> event " + e.type + " caused by " + e.provider)
-  console.log(">>> project " + p.name + " clones the repo at " + p.repo.cloneURL)
-})
-```
-
-
-
-```yaml
-id: eventDataWorkflow
-name: Event Data Workflow
-version: '1.0.0'
-specVersion: '0.8'
-start: LogEventData
-events:
-- name: execEvent
-  type: exec
-  dataOnly: false
-functions:
-- name: consoleFunction
-  type: console
-states:
-- name: LogEventData
-  type: event
-  onEvents:
-  - eventRefs:
-    - execEvent
-    eventDataFilter:
-      toStateData: "${ .event }"
-    actions:
-    - name: eventInfoAction
-      functionRef:
-        refName: consoleFunction
-        arguments:
-          log: ">>> event ${ .event.type } caused by ${ .event.data.provider }"
-    - name: projectInfoAction
-      functionRef:
-        refName: consoleFunction
-        arguments:
-          log: ">>> project ${ .event.data.project.name } clones the repo at ${ .event.data.repo.cloneURL }"
-  end: true
-
-```
-
-
-
-### Action Results
-
-[Brigade Example](https://github.com/brigadecore/brigade/blob/master/docs/content/examples/brigade-15.js)
-
-* Note: The Serverless Workflow specification does not have built-in storage options or custom built-in functions;
-storing data must be exposed as a function that can be invoked during workflow execution.
-
-* Note: It is assumed that the "dest" variable is part of the event payload, rather than hard-coded, injected, or set.
-
-
-
-
-
-
-
-
-
BrigadeServerless Workflow
-
-```javascript
-const { events, Job, Group } = require("brigadier")
-
-events.on("exec", (e, p) => {
-  var dest = "/mnt/brigade/share/hello.txt"
-  var one = new Job("one", "alpine:3.4", ["echo hello > " + dest])
-  var two = new Job("two", "alpine:3.4", ["echo world >> " + dest])
-  var three = new Job("three", "alpine:3.4", ["cat " + dest])
-
-  one.storage.enabled = true
-  two.storage.enabled = true
-  three.storage.enabled = true
-
-  Group.runEach([one, two, three])
-})
-```
-
-
-
-```yaml
-id: actionResultsWorkflow
-name: Action Results Workflow
-version: '1.0.0'
-specVersion: '0.8'
-start: ExecActionsAndStoreResults
-events:
-- name: execEvent
-  type: exec
-functions:
-- name: greetingFunction
-  metadata:
-    image: alpine:3.7
-    command: echo
-- name: storeToFileFunction
-  metadata:
-    image: alpine:3.7
-    command: filestore
-states:
-- name: ExecActionsAndStoreResults
-  type: event
-  onEvents:
-  - eventRefs:
-    - execEvent
-    eventDataFilter:
-      toStateData: "${ .event }"
-    actions:
-    - name: helloAction
-      actionDataFilter:
-        results: "${ .helloResult }"
-      functionRef:
-        refName: greetingFunction
-        arguments:
-          message: hello
-    - name: worldAction
-      actionDataFilter:
-        results: "${ .worldResults }"
-      functionRef:
-        refName: greetingFunction
-        arguments:
-          message: world
-    - name: storeToFileAction
-      functionRef:
-        refName: storeToFileFunction
-        arguments:
-          destination: "${ .event.destination }"
-          value: "${ .helloResult } ${ .worldResults }"
-  end: true
-
-```
-
-
-
-### Emit Events
-
-[Brigade Example](https://github.com/brigadecore/brigade/blob/master/docs/content/examples/brigade-19.js)
-
-* Note: In the Serverless Workflow specification, events can be produced on state transitions. This example also
-shows that an `onEvents` definition can be declared without any actions.
-
-
-
-
-
-
-
-
-
BrigadeServerless Workflow
- -```javascript -const {events} = require("brigadier") - -events.on("exec", function(e, project) { - const e2 = { - type: "next", - provider: "exec-handler", - buildID: e.buildID, - workerID: e.workerID, - cause: {event: e} - } - events.fire(e2, project) -}) - -events.on("next", (e) => { - console.log(`fired ${e.type} caused by ${e.cause.event.type}`) -}) -``` - - - -```yaml -id: eventDataWorkflow -name: Event Data Workflow -version: '1.0.0' -specVersion: '0.8' -start: ExecEventState -events: -- name: execEvent - type: exec - dataOnly: false -- name: nextEvent - type: next - kind: produced -functions: -- name: consoleLogFunction - type: console -states: -- name: ExecEventState - type: event - onEvents: - - eventRefs: - - execEvent - actions: [] - eventDataFilter: - toStateData: "${ .execEvent }" - transition: - nextState: NextEventState - produceEvents: - - eventRef: nextEvent - data: - type: next - provider: exec-handler - buildID: "${ .execEvent.data.buildID }" - workerID: "${ .execEvent.data.workerID }" - cause: - event: "${ .execEvent }" -- name: NextEventState - type: event - onEvents: - - eventRefs: - - nextEvent - eventDataFilter: - toStateData: "${ .nextEvent }" - actions: - - name: consoleLogAction - functionRef: - refName: consoleLogFunction - arguments: - log: "fired ${ .nextEvent.data.type } caused by ${ .nextEvent.data.cause.event }" - end: true -``` - -
-
diff --git a/comparisons/comparison-google-cloud-workflows.md b/comparisons/comparison-google-cloud-workflows.md
deleted file mode 100644
index 1c6c0c65..00000000
--- a/comparisons/comparison-google-cloud-workflows.md
+++ /dev/null
@@ -1,886 +0,0 @@
-# Comparisons - Google Cloud Workflows
-
-[Google Cloud Workflows](https://cloud.google.com/workflows) is a proprietary Google serverless workflow language
-and runtime service. Its main features, as described on its website, include orchestration of Google Cloud
-and HTTP-based API services, automation of complex processes, no infrastructure or capacity planning, scalability,
-and a pay-per-use pricing model.
-
-We focus here only on the Google Cloud Workflows DSL (the language definition). The purpose of this document
-is to show side-by-side comparisons between the equivalent markup of the Serverless Workflow Specification language
-and that of Google Cloud Workflows.
-This should help compare and contrast the two workflow languages and give a better understanding of both.
-
-The Google Cloud Workflows examples used in this document are all available on the
-[GoogleCloudPlatform Workflows examples GitHub page](https://github.com/GoogleCloudPlatform/workflows-samples/tree/main/src).
-
-
-## Preface
-
-Both Serverless Workflow and Google Cloud Workflows can describe workflows
-in both JSON and YAML formats. For the sake of this document, our comparison
-examples are shown in JSON format, but the same comparisons can be made with
-the YAML formats as well.
-
-Overall, as of the time of writing, the Serverless Workflow language is a
-super-set of the Google Cloud Workflows language
-in terms of functionality. It also focuses more on domain-specific aspects, whereas
-the Google Workflows language is more code-like and appears focused on easy integration
-with its runtime implementation.
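To illustrate the point about formats: the first example below ("Greeting With Args"), shown later in JSON, could equally be written in YAML. This is an illustrative sketch only, derived directly from that JSON example:

```yaml
# YAML form of the "Greeting With Args" JSON example shown below.
id: greetingwithargs
name: Greeting With Args
specVersion: '0.8'
start: Set Output
states:
- name: Set Output
  type: inject
  data:
    outputVar: Hello ${ .firstname + " " + .lastname }
  stateDataFilter:
    output: "${ .outputVar }"
  end: true
```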
- -We hope that these examples will give you a good start for comparing and contrasting the two serverless workflow -languages. - -## Table of Contents - -- [Greeting with Arguments](#Greeting-With-Arguments) -- [Concatenating array values](#Concatenating-Array-Values) -- [Connect Compute engine](#Connect-Compute-Engine) -- [Error Handling for REST service invocation](#Error-Handling-For-REST-Service-Invocation) -- [Retrying on errors](#Retrying-On-Errors) -- [Sub Workflows](#Sub-Workflows) -- [Data based condition](#Data-Based-Conditions) - - -### Greeting With Arguments - -[Google Cloud Workflows Example](https://github.com/GoogleCloudPlatform/workflows-samples/blob/main/src/args.workflows.json) - - - - - - - - - - -
GoogleServerless Workflow
- -```json -{ - "main": { - "params": [ - "args" - ], - "steps": [ - { - "step1": { - "assign": [ - { - "outputVar": "${\"Hello \" + args.firstName + \" \" + args.lastName}" - } - ] - } - }, - { - "step2": { - "return": "${outputVar}" - } - } - ] - } -} -``` - - - -```json -{ - "id": "greetingwithargs", - "name": "Greeting With Args", - "specVersion": "0.8", - "start": "Set Output", - "states": [ - { - "name": "Set Output", - "type": "inject", - "data": { - "outputVar": "Hello ${ .firstname + \" \" + .lastname }" - }, - "stateDataFilter": { - "output": "${ .outputVar }" - }, - "end": true - } - ] -} -``` - -
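Conceptually, the inject state above seeds the state data and the `stateDataFilter` selects what becomes the workflow output. A rough Python sketch of that flow, using hypothetical input values:

```python
# Rough sketch of the inject state plus stateDataFilter above.
# The input names and values are hypothetical, for illustration only.
workflow_input = {"firstname": "John", "lastname": "Doe"}

# The inject state merges literal data into the state data
# (here built from the workflow input, as the expression does).
state_data = {
    "outputVar": f"Hello {workflow_input['firstname']} {workflow_input['lastname']}"
}

# The stateDataFilter "${ .outputVar }" selects the workflow output.
workflow_output = state_data["outputVar"]
print(workflow_output)  # Hello John Doe
```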
-
-#### Notes
-
-Both languages allow initializing JSON data to be defined within the markup.
-Google Workflow uses the "assign" keyword to set a specific data property, whereas
-Serverless Workflow has a dedicated state for this. Google Workflow uses
-a second step with a "return" keyword to set the workflow output, whereas
-in Serverless Workflow each state can define data filters to select the state
-data which should be passed to the next state or become the workflow data output.
-It is important to mention that the inject state is not needed in Serverless Workflow,
-as this data can also be passed dynamically to the workflow when a workflow
-instance is created. See the Serverless Workflow ["Workflow Data"](../specification.md#Workflow-Data) section for more info on this.
-
-### Concatenating Array Values
-
-[Google Cloud Workflows Example](https://github.com/GoogleCloudPlatform/workflows-samples/blob/main/src/array.workflows.json)
-
-
-
-
-
-
-
-
-
GoogleServerless Workflow
- -```json -[ - { - "define": { - "assign": [ - { - "array": [ - "foo", - "ba", - "r" - ] - }, - { - "result": "" - }, - { - "i": 0 - } - ] - } - }, - { - "check_condition": { - "switch": [ - { - "condition": "${len(array) > i}", - "next": "iterate" - } - ], - "next": "exit_loop" - } - }, - { - "iterate": { - "assign": [ - { - "result": "${result + array[i]}" - }, - { - "i": "${i+1}" - } - ], - "next": "check_condition" - } - }, - { - "exit_loop": { - "return": { - "concat_result": "${result}" - } - } - } -] -``` - - - -```json -{ - "id": "concatarray", - "name": "Concatenating array values", - "start": "DoConcat", - "specVersion": "0.8", - "states": [ - { - "name": "DoConcat", - "type": "inject", - "data": { - "array": [ - "foo", - "ba", - "r" - ] - }, - "stateDataFilter": { - "output": "${ .array | join(\"\") }" - }, - "end": true - } - ] -} -``` - -
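The jq `join` filter used in the Serverless Workflow version concatenates the array elements directly; a rough Python equivalent of the `.array | join("")` filter (illustrative only):

```python
# Python equivalent of the jq filter `.array | join("")` applied
# to the injected state data. Illustrative sketch only.
state_data = {"array": ["foo", "ba", "r"]}

concat_result = "".join(state_data["array"])
print(concat_result)  # foobar
```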
-
-#### Notes
-
-The Google Workflow language takes a programmatic approach here, iterating over the array values
-with the "switch" directive.
-It uses the "+" symbol, which we assume is how the underlying programming language
-used in its runtime implementation concatenates strings.
-The second step, "exit_loop", is then used alongside the "return" keyword to specify the
-workflow results.
-With Serverless Workflow we can again inject the array data via the ["inject" state](../specification.md#inject-state), or
-it can simply be passed as workflow data input. There is no need for looping here, as
-we can just utilize the [jq "join" function](https://stedolan.github.io/jq/manual/#join(str)) as shown in the state's data filter.
-We could use the [ForEach state](../specification.md#ForEach-State) to iterate over the
-array values, but that would just unnecessarily complicate things.
-
-### Connect Compute Engine
-
-[Google Cloud Workflows Example](https://github.com/GoogleCloudPlatform/workflows-samples/blob/main/src/connect_compute_engine.workflows.json)
-
-
-
-
-
-
-
-
-
GoogleServerless Workflow
- -```json -[ - { - "initialize": { - "assign": [ - { - "project": "${sys.get_env(\"GOOGLE_CLOUD_PROJECT_NUMBER\")}" - }, - { - "zone": "us-central1-a" - }, - { - "vmToStop": "examplevm" - } - ] - } - }, - { - "stopInstance": { - "call": "http.post", - "args": { - "url": "${\"https://compute.googleapis.com/compute/v1/projects/\"+project+\"/zones/\"+zone+\"/instances/\"+vmToStop+\"/stop\"}", - "auth": { - "type": "OAuth2" - } - }, - "result": "stopResult" - } - } -] -``` - - - -```json -{ - "id": "stopcomputeengine", - "name": "Stop Compute Engine", - "specVersion": "0.8", - "start": "DoStop", - "states": [ - { - "name": "DoStop", - "type": "operation", - "actions": [ - { - "functionRef": { - "refName": "StopComputeEngine", - "arguments": { - "project": "${ .project }", - "zone": "${ .zone }", - "vmToStop": "${ .vmToStop }" - } - } - } - ], - "end": true - } - ], - "functions": [ - { - "name": "StopComputeEngine", - "operation": "computeengineopenapi.json#stopengine" - } - ] -} -``` - -
-
-#### Notes
-
-Google Workflow defines its own REST service invocations inside the workflow language,
-whereas Serverless Workflow utilizes the OpenAPI specification for REST service invocations.
-The "operation" parameter in Serverless Workflow is a URI to an OpenAPI definition file which
-contains all the information needed to invoke this service.
-We assume that the values are passed to the workflow as workflow data input.
-Serverless Workflow has a designated "operation" state to perform operations such
-as service invocations, whereas Google Workflow uses the "call" keyword.
-
-
-### Error Handling For REST Service Invocation
-
-[Google Cloud Workflows Example](https://github.com/GoogleCloudPlatform/workflows-samples/blob/main/src/connector_publish_pubsub.workflows.json)
-
-
-
-
-
-
-
-
-
GoogleServerless Workflow
- -```json -[ - { - "initVariables": { - "assign": [ - { - "project": "${sys.get_env(\"GOOGLE_CLOUD_PROJECT_ID\")}" - }, - { - "topic": "mytopic1" - }, - { - "message": "Hello world!" - } - ] - } - }, - { - "publish": { - "try": { - "call": "googleapis.pubsub.v1.projects.topics.publish", - "args": { - "topic": "${\"projects/\" + project + \"/topics/\" + topic}", - "body": { - "messages": [ - { - "data": "${base64.encode(text.encode(message))}" - } - ] - } - }, - "result": "publishResult" - }, - "except": { - "as": "e", - "steps": [ - { - "handlePubSubError": { - "switch": [ - { - "condition": "${e.code == 404}", - "raise": "PubSub Topic not found" - }, - { - "condition": "${e.code == 403}", - "raise": "Error authenticating to PubSub" - } - ] - } - }, - { - "unhandledException": { - "raise": "${e}" - } - } - ] - } - } - }, - { - "last": { - "return": "${publishResult}" - } - } -] -``` - - - -```json -{ - "id": "publishtotopicwitherrorhandling", - "name": "Publish To Topic With Error Handling", - "specVersion": "0.8", - "start": "DoPublish", - "errors": [ - { - "name": "PubSub Topic not found", - "code": "404" - }, - { - "name": "Error authenticating to PubSub", - "code": "403" - } - ], - "states": [ - { - "name": "DoPublish", - "type": "operation", - "actions": [ - { - "functionRef": { - "refName": "PublishToTopic", - "arguments": { - "project": "${ .project }", - "topic": "${ .topic }", - "message": "${ .message }" - } - } - } - ], - "onErrors": [ - { - "errorRef": "PubSub Topic not found", - "end": { - "produceEvents": [ - { - "eventRef": "TopicError", - "data": { "message": "PubSub Topic not found"} - } - ] - } - }, - { - "errorRef": "Error authenticating to PubSub", - "end": { - "produceEvents": [ - { - "eventRef": "TopicError", - "data": { "message": "Error authenticating to PubSub"} - } - ] - } - } - ], - "end": true - } - ], - "functions": [ - { - "name": "PublishToTopic", - "operation": "pubsubapi.json#publish" - } - ], - "events": [ - { - "name": 
"TopicError", - "source": "pubsub.topic.events", - "type": "pubsub/events" - } - ] -} -``` - -
-
-#### Notes
-
-This example shows the differences in error handling approaches between the two languages.
-We assumed here that the "raise" keyword used in the Google Workflow language completes workflow execution.
-The biggest difference is that with Serverless Workflow there is no specific way
-of "raising" or "throwing" a caught exception. [Error handling in Serverless Workflow](../specification.md#Workflow-Error-Handling) is explicit,
-meaning that handling an error has to be defined within the workflow execution logic.
-Another difference is that with Serverless Workflow you can notify interested parties of the
-occurrence of an error via events (in the CloudEvents format), as shown in this example.
-
-### Retrying On Errors
-
-[Google Cloud Workflows Example](https://github.com/GoogleCloudPlatform/workflows-samples/blob/main/src/error_retry_500.workflows.json)
-
-
-
-
-
-
-
-
-
GoogleServerless Workflow
- -```json -{ - "main": { - "steps": [ - { - "read_item": { - "try": { - "call": "http.get", - "args": { - "url": "https://host.com/api" - }, - "result": "api_response" - }, - "retry": { - "predicate": "${custom_predicate}", - "max_retries": 5, - "backoff": { - "initial_delay": 2, - "max_delay": 60, - "multiplier": 2 - } - } - } - }, - { - "last_step": { - "return": "OK" - } - } - ] - }, - "custom_predicate": { - "params": [ - "e" - ], - "steps": [ - { - "what_to_repeat": { - "switch": [ - { - "condition": "${e.code == 500}", - "return": true - } - ] - } - }, - { - "otherwise": { - "return": false - } - } - ] - } -} -``` - - - -```json -{ - "id": "errorhandlingwithretries", - "name": "Error Handling with Retries", - "start": "ReadItem", - "specVersion": "0.8", - "states": [ - { - "name": "ReadItem", - "type": "operation", - "actions": [ - { - "functionRef": "ReadItemFromApi", - "retryRef": "ServiceNotAvailableRetryPolicy", - "retryableErrors": ["Service Not Available"] - } - ], - "onErrors": [ - { - "errorRef": "Service Not Available", - "end": true - } - ], - "end": true - } - ], - "functions": [ - { - "name": "ReadItemFromApi", - "operation": "someapi.json#read" - } - ], - "errors": [ - { - "name": "Service Not Available", - "code": "500" - } - ], - "retries": [ - { - "name": "ServiceNotAvailableRetryPolicy", - "maxAttempts": 5, - "delay": "PT2S", - "maxDelay": "PT60S", - "multiplier": 2 - } - ] -} -``` - -
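Both retry definitions above describe the same exponential backoff: an initial two-second delay doubling up to a sixty-second cap across five attempts. A small sketch of the delay schedule that policy implies (assuming each retry waits before re-invoking the action):

```python
# Backoff schedule implied by ServiceNotAvailableRetryPolicy:
# delay PT2S, multiplier 2, maxDelay PT60S, maxAttempts 5.
def backoff_delays(initial_s, multiplier, max_delay_s, max_attempts):
    delays, delay = [], initial_s
    for _ in range(max_attempts):
        delays.append(min(delay, max_delay_s))  # cap each delay at maxDelay
        delay *= multiplier
    return delays

print(backoff_delays(2, 2, 60, 5))  # [2, 4, 8, 16, 32]
```

With more attempts, the schedule plateaus at the sixty-second cap rather than growing unbounded.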
-
-#### Notes
-
-Serverless Workflow defines [reusable retry definitions](../specification.md#retry-definition) which can be referenced by
-state actions. By default in Serverless Workflow all actions are retried. You can, however, reference a defined
-retry policy to perform specific retries on actions. If the error persists after the defined retry attempts,
-the workflow state can handle the error with its onErrors property.
-
-Google Workflow, by contrast, seems to reference the
-retry predicate in its "retry" statement as an expression/variable.
-
-### Sub Workflows
-
-[Google Cloud Workflows Example](https://github.com/GoogleCloudPlatform/workflows-samples/blob/main/src/subworkflow.workflows.json)
-
-
-
-
-
-
-
-
-
GoogleServerless Workflow
- -```json -{ - "main": { - "steps": [ - { - "first": { - "call": "hello", - "args": { - "input": "Kristof" - }, - "result": "someOutput" - } - }, - { - "second": { - "return": "${someOutput}" - } - } - ] - }, - "hello": { - "params": [ - "input" - ], - "steps": [ - { - "first": { - "return": "${\"Hello \"+input}" - } - } - ] - } -} -``` - - - -```json -{ - "id": "callsubflow", - "name": "Call SubFlow", - "start": "CallSub", - "states": [ - { - "name": "CallSub", - "type":"operation", - "actions": [ - { - "subFlowRef": "calledsubflow" - } - ], - "end": true - } - ] -} -``` - -
-
-#### Notes
-
-Serverless Workflow has a specific [SubFlow action](../specification.md#SubFlow-Action). By default, the current workflow data
-is passed to it, so there is no need to define specific arguments.
-We have omitted the definition of "calledsubflow" as it is straightforward: it would be
-a separate workflow definition with its "id" parameter set to "calledsubflow".
-
-### Data Based Conditions
-
-[Google Cloud Workflows Example](https://github.com/GoogleCloudPlatform/workflows-samples/blob/main/src/step_conditional_jump.workflows.json)
-
-
-
-
-
-
-
-
-
GoogleServerless Workflow
-
-```json
-[
-  {
-    "firstStep": {
-      "call": "http.get",
-      "args": {
-        "url": "https://www.example.com/callA"
-      },
-      "result": "firstResult"
-    }
-  },
-  {
-    "whereToJump": {
-      "switch": [
-        {
-          "condition": "${firstResult.body.SomeField < 10}",
-          "next": "small"
-        },
-        {
-          "condition": "${firstResult.body.SomeField < 100}",
-          "next": "medium"
-        }
-      ],
-      "next": "large"
-    }
-  },
-  {
-    "small": {
-      "call": "http.get",
-      "args": {
-        "url": "https://www.example.com/SmallFunc"
-      },
-      "next": "end"
-    }
-  },
-  {
-    "medium": {
-      "call": "http.get",
-      "args": {
-        "url": "https://www.example.com/MediumFunc"
-      },
-      "next": "end"
-    }
-  },
-  {
-    "large": {
-      "call": "http.get",
-      "args": {
-        "url": "https://www.example.com/LargeFunc"
-      },
-      "next": "end"
-    }
-  }
-]
-```
-
-
-
-```json
-{
-  "id": "databasedconditions",
-  "name": "Data Based Conditions",
-  "specVersion": "0.8",
-  "start": "CallA",
-  "states": [
-    {
-      "name": "CallA",
-      "type":"operation",
-      "actions": [
-        {
-          "functionRef": "callFunctionA"
-        }
-      ],
-      "transition": "EvaluateAResults"
-    },
-    {
-      "name": "EvaluateAResults",
-      "type": "switch",
-      "dataConditions": [
-        {
-          "name": "Less than 10",
-          "condition": "${ .body | .SomeField < 10 }",
-          "transition": "CallSmall"
-        },
-        {
-          "name": "Less than 100",
-          "condition": "${ .body | .SomeField < 100 }",
-          "transition": "CallMedium"
-        }
-      ],
-      "defaultCondition": {
-        "transition": "CallLarge"
-      }
-    },
-    {
-      "name": "CallSmall",
-      "type":"operation",
-      "actions": [
-        {
-          "functionRef": "callFunctionSmall"
-        }
-      ],
-      "end": true
-    },
-    {
-      "name": "CallMedium",
-      "type":"operation",
-      "actions": [
-        {
-          "functionRef": "callFunctionMedium"
-        }
-      ],
-      "end": true
-    },
-    {
-      "name": "CallLarge",
-      "type":"operation",
-      "actions": [
-        {
-          "functionRef": "callFunctionLarge"
-        }
-      ],
-      "end": true
-    }
-  ],
-  "functions": [
-    {
-      "name": "callFunctionA",
-      "operation": "myapi.json#calla"
-    },
-    {
-      "name": "callFunctionSmall",
-      "operation": "myapi.json#callsmall"
-    },
-    {
-      "name": 
"callFunctionMedium", - "operation": "myapi.json#callmedium" - }, - { - "name": "callFunctionLarge", - "operation": "myapi.json#calllarge" - } - ] -} -``` - -
-
-#### Notes
-
-Serverless Workflow has a specific [Switch state](../specification.md#Switch-State) which can handle both data-based and event-based
-conditions. Instead of hard-coding the REST invocation info in states, it has reusable
-function definitions which can be referenced by one or many states.
diff --git a/comparisons/comparison-temporal.md b/comparisons/comparison-temporal.md
deleted file mode 100644
index 22791f6b..00000000
--- a/comparisons/comparison-temporal.md
+++ /dev/null
@@ -1,453 +0,0 @@
-# Comparisons - Temporal
-
-[Temporal](https://temporal.io/) is an open-source microservice orchestration platform. Temporal apps are written
-in code, and SDKs are currently available for Go, Java, PHP, and TypeScript. It provides two special kinds of
-functions, namely Workflow and Activity functions.
-
-Workflows in Temporal are cohesive functions with added support for retries, Saga-pattern rollbacks,
-and human-intervention steps in case of failure. Overall, Temporal promotes the "Workflows as Code"
-paradigm, which might feel natural to developers. Workflows in Temporal cannot call external APIs directly, but
-rather orchestrate executions of Activities.
-
-Activities are object methods written in one of the supported languages. They can contain any code without restrictions,
-meaning they can be used to communicate with databases, call external APIs, etc.
-
-The purpose of this document is to compare and contrast Temporal workflow code and the equivalent
-Serverless Workflow DSL.
-This should help contrast the two workflow languages and give a better understanding of both.
-
-Given that Temporal provides SDKs in multiple languages, in this document we will focus only on Temporal workflows
-written in Java.
-
-All Temporal examples used in this document are available in their [samples-java](https://github.com/temporalio/samples-java)
-GitHub repository. 
Note that in this document we only show the Temporal workflow Java code that is relevant to
-the actual workflow implementation. Language constructs like imports are not included, but full examples can be found
-in the repository mentioned above. The latest version of Temporal as of the time of writing this
-document is [1.13.1](https://github.com/temporalio/temporal/releases/tag/v1.13.1).
-
-We hope that these examples will give you a good start for comparing and contrasting the two workflow
-languages.
-
-## Note when reading provided examples
-
-[Activities](https://docs.temporal.io/docs/concept-activities) in Temporal are comparable to
-[actions](https://github.com/serverlessworkflow/specification/blob/master/specification.md#Action-Definition) in the Serverless Workflow
-language, namely actions that reference [function](https://github.com/serverlessworkflow/specification/blob/master/specification.md#function-definition)
-definitions. Serverless Workflow action execution involves invoking distributed functions via REST or via events.
-
-When looking at the examples below, please note that the code defined in Temporal activities is assumed
-to be implemented as stand-alone distributed functions/services accessible over a REST API,
-and it is these that are used in the compared Serverless Workflow DSL.
-
-
-## Table of Contents
-
-- [Single Activity](#Single-Activity)
-- [Periodical Execution (Cron)](#Periodical-Execution)
-- [Compensation Logic (SAGA)](#Compensation-Logic)
-- [Error Handling and Retries](#Error-Handling-and-Retries)
-
-### Single Activity
-
-[Full Temporal Example](https://github.com/temporalio/samples-java/blob/master/src/main/java/io/temporal/samples/hello/HelloActivity.java)
-
-
-
-
-
-
-
-
-
-
TemporalServerless Workflow
- -```java -// Workflow implementation -public static class GreetingWorkflowImpl implements GreetingWorkflow { - private final GreetingActivities activities = - Workflow.newActivityStub( - GreetingActivities.class, - ActivityOptions.newBuilder().setScheduleToCloseTimeout(Duration.ofSeconds(2)).build()); - - // Workflow method - @Override - public String getGreeting(String name) { - return activities.composeGreeting("Hello", name); - } -} -``` - - - -```json -{ - "id": "greetingworkflow", - "name": "Greeting Workflow", - "version": "1.0.0", - "specVersion": "0.8", - "autoRetries": true, - "states": [ - { - "name": "Greet", - "type": "operation", - "actions": [ - { - "name": "Greet Action", - "functionRef": { - "refName": "GreetingFunction", - "arguments": { - "prefix": "Hello", - "name": "${ .name }" - } - } - } - ], - "timeouts": { - "actionExecTimeout": "PT2S" - }, - "end": true - } - ], - "functions": [ - { - "name": "GreetingFunction", - "operation": "myactionsapi.json#composeGreeting" - } - ] -} -``` - -
- -### Periodical Execution - -[Full Temporal Example](https://github.com/temporalio/samples-java/blob/master/src/main/java/io/temporal/samples/hello/HelloCron.java) - - - - - - - - - - -
TemporalServerless Workflow
- -```java -// Workflow implementation -public static class GreetingWorkflowImpl implements GreetingWorkflow { - private final GreetingActivities activities = - Workflow.newActivityStub( - GreetingActivities.class, - ActivityOptions.newBuilder().setScheduleToCloseTimeout(Duration.ofSeconds(10)).build()); - - @Override - public String greet(String name) { - activities.greet("Hello " + name + "!"); - } -} - -// Client code Workflow Options (cron) -WorkflowOptions workflowOptions = - WorkflowOptions.newBuilder() - .setWorkflowId(CRON_WORKFLOW_ID) - .setTaskQueue(TASK_QUEUE) - .setCronSchedule("* * * * *") - .setWorkflowExecutionTimeout(Duration.ofMinutes(10)) - .build(); - -``` - - - -```json -{ - "id": "greetingworkflow", - "name": "Greeting Workflow", - "version": "1.0.0", - "specVersion": "0.8", - "autoRetries": true, - "timeouts": { - "workflowExecTimeout": "PT10M" - } - "start": { - "stateName": "GreetingState", - "schedule": { - "cron": { - "expression": "* * * * *" - } - } - }, - "states": [ - { - "name": "GreetingState", - "type": "operation", - "actions": [ - { - "name": "Greet", - "functionRef": { - "refName": "GreetingFunction", - "arguments": { - "prefix": "Hello", - "name": "${ .name }" - } - } - } - ], - "timeouts": { - "actionExecTimeout": "PT2S" - }, - "end": true - } - ], - "functions": [ - { - "name": "GreetingFunction", - "operation": "myactionsapi.json#greet" - } - ] -} -``` - -
- -### Compensation Logic - -[Full Temporal Example](https://github.com/temporalio/samples-java/blob/8218f4114e52417f8d04175b67027ff0af4fb73c/src/main/java/io/temporal/samples/hello/HelloSaga.java) - - - - - - - - - - -
TemporalServerless Workflow
- -```java -// Workflow implementation - public static class SagaWorkflowImpl implements SagaWorkflow { - ActivityOperation activity = - Workflow.newActivityStub( - ActivityOperation.class, - ActivityOptions.newBuilder().setScheduleToCloseTimeout(Duration.ofSeconds(2)).build()); - - // Workflow Method - @Override - public void execute() { - Saga saga = new Saga(new Saga.Options.Builder().setParallelCompensation(false).build()); - try { - // The following demonstrate how to compensate sync invocations. - ChildWorkflowOperation op1 = Workflow.newChildWorkflowStub(ChildWorkflowOperation.class); - op1.execute(10); - ChildWorkflowCompensation c1 = - Workflow.newChildWorkflowStub(ChildWorkflowCompensation.class); - saga.addCompensation(c1::compensate, -10); - - // The following demonstrate how to compensate async invocations. - Promise result = Async.procedure(activity::execute, 20); - saga.addCompensation(activity::compensate, -20); - result.get(); - - saga.addCompensation( - () -> System.out.println("Other compensation logic in main workflow.")); - throw new RuntimeException("some error"); - - } catch (Exception e) { - saga.compensate(); - } - } -} -``` - - - -```json -{ - "id": "HelloSaga", - "name": "Hello SAGA compensation Workflow", - "version": "1.0.0", - "specVersion": "0.8", - "states": [ - { - "name": "ExecuteState", - "type": "operation", - "compensatedBy": "CompensateState", - "actions": [ - { - "name": "Execute", - "functionRef": { - "refName": "ExecuteFunction", - "arguments": { - "amount": 10 - } - } - } - ], - "end": { - "compensate": true - } - }, - { - "name": "CompensateState", - "type": "operation", - "usedForCompensation": true, - "actions": [ - { - "name": "Compensate", - "functionRef": { - "refName": "CompensateFunction", - "arguments": { - "amount": -10 - } - } - } - ] - } - ], - "functions": [ - { - "name": "ExecuteFunction", - "operation": "myactionsapi.json#execute" - }, - { - "name": "CompensateFunction", - "operation": 
"myactionsapi.json#compensate" - } - ] -} -``` - -
- -#### Note - -Serverless Workflow defines explicit compensation, meaning it has to be explicitly invoked -as part of the workflow control flow logic. For more information see the -[Workflow Compensation](../specification.md#Workflow-Compensation) section. - -### Error Handling and Retries - -[Full Temporal Example](https://github.com/temporalio/samples-java/blob/master/src/main/java/io/temporal/samples/hello/HelloActivityRetry.java) - - - - - - - - - - -
TemporalServerless Workflow
- -```java -// Workflow Implementation -public static class GreetingWorkflowImpl implements GreetingWorkflow { - private final GreetingActivities activities = - Workflow.newActivityStub( - GreetingActivities.class, - ActivityOptions.newBuilder() - .setScheduleToCloseTimeout(Duration.ofSeconds(10)) - .setRetryOptions( - RetryOptions.newBuilder() - .setInitialInterval(Duration.ofSeconds(1)) - .setDoNotRetry(IllegalArgumentException.class.getName()) - .build()) - .build()); - @Override - public String getGreeting(String name) { - // This is a blocking call that returns only after activity is completed. - return activities.composeGreeting("Hello", name); - } -} - -// Activity Implementation -static class GreetingActivitiesImpl implements GreetingActivities { - private int callCount; - private long lastInvocationTime; - - @Override - public synchronized String composeGreeting(String greeting, String name) { - if (lastInvocationTime != 0) { - long timeSinceLastInvocation = System.currentTimeMillis() - lastInvocationTime; - System.out.print(timeSinceLastInvocation + " milliseconds since last invocation. 
"); - } - lastInvocationTime = System.currentTimeMillis(); - if (++callCount < 4) { - System.out.println("composeGreeting activity is going to fail"); - throw new IllegalStateException("not yet"); - } - System.out.println("composeGreeting activity is going to complete"); - return greeting + " " + name + "!"; - } -} -``` - - - -```json -{ - "id": "HelloActivityRetry", - "name": "Hello Activity with Retries Workflow", - "version": "1.0.0", - "specVersion": "0.8", - "autoRetries": true, - "start": "GreetingState", - "states": [ - { - "name": "GreetingState", - "type": "operation", - "actions": [ - { - "name": "Greet", - "functionRef": { - "refName": "GreetingFunction", - "arguments": { - "name": "World" - }, - "retryRef": "GreetingRetry", - "nonRetryableErrors": ["IllegalArgumentException"] - } - } - ], - "timeouts": { - "actionExecTimeout": "PT10S" - }, - "onErrors": [ - { - "errorRefs": ["IllegalStateException", "IllegalArgumentException"], - "end": true - } - ], - "end": true - } - ], - "functions": [ - { - "name": "GreetingFunction", - "operation": "myactionsapi.json#composeGreeting" - } - ], - "errors": [ - { - "name": "IllegalStateException" - }, - { - "name": "IllegalArgumentException" - } - ], - "retries": [ - { - "name": "GreetingRetry", - "delay": "PT1S" - } - ] -} -``` - -
diff --git a/ctk/features/call.feature b/ctk/features/call.feature new file mode 100644 index 00000000..6ebcfbe2 --- /dev/null +++ b/ctk/features/call.feature @@ -0,0 +1,147 @@ +Feature: Call Task + As an implementer of the workflow DSL + I want to ensure that call tasks can be executed within the workflow + So that my implementation conforms to the expected behavior + + # Tests HTTP call using `content` output + # Tests interpolated path parameters + # Tests auto-deserialization when reading response with 'application/json' content type + # Tests output filtering + Scenario: Call HTTP With Content Output + Given a workflow with definition: + """yaml + document: + dsl: 1.0.0-alpha1 + namespace: default + name: http-call-with-content-output + do: + getFirstAvailablePet: + call: http + with: + method: get + endpoint: + uri: https://petstore.swagger.io/v2/pet/findByStatus?status={status} + output: + from: .[0] + """ + And given the workflow input is: + """yaml + status: available + """ + When the workflow is executed + Then the workflow should complete + And the workflow output should have properties 'id', 'name', 'status' + + # Tests HTTP call using `response` output + # Tests interpolated path parameters + # Tests auto-deserialization when reading response with 'application/json' content type + Scenario: Call HTTP With Response Output + Given a workflow with definition: + """yaml + document: + dsl: 1.0.0-alpha1 + namespace: default + name: http-call-with-response-output + do: + getPetById: + call: http + with: + method: get + endpoint: + uri: https://petstore.swagger.io/v2/pet/{petId} + output: response + """ + And given the workflow input is: + """yaml + petId: 1 + """ + When the workflow is executed + Then the workflow should complete + And the workflow output should have properties 'request', 'request.method', 'request.uri', 'request.headers', 'headers', 'statusCode', 'content' + And the workflow output should have properties 'content.id', 'content.name', 
'content.status' + + # Tests HTTP call using `basic` authentication + # Tests interpolated path parameters + Scenario: Call HTTP Using Basic Authentication + Given a workflow with definition: + """yaml + document: + dsl: 1.0.0-alpha1 + namespace: default + name: http-call-with-basic-auth + do: + getSecuredEndpoint: + call: http + with: + method: get + endpoint: + uri: https://httpbin.org/basic-auth/{username}/{password} + authentication: + basic: + username: ${ .username } + password: ${ .password } + """ + And given the workflow input is: + """yaml + username: serverless-workflow + password: conformance-test + """ + When the workflow is executed + Then the workflow should complete + + # Tests OpenAPI call using `content` output + # Tests output filtering + Scenario: Call OpenAPI With Content Output + Given a workflow with definition: + """yaml + document: + dsl: 1.0.0-alpha1 + namespace: default + name: openapi-call-with-content-output + do: + getPetsByStatus: + call: openapi + with: + document: + uri: https://petstore.swagger.io/v2/swagger.json + operation: findPetsByStatus + parameters: + status: ${ .status } + output: + from: . 
| length + """ + And given the workflow input is: + """yaml + status: available + """ + When the workflow is executed + Then the workflow should complete + + # Tests OpenAPI call using `response` output + # Tests output filtering + Scenario: Call OpenAPI With Response Output + Given a workflow with definition: + """yaml + document: + dsl: 1.0.0-alpha1 + namespace: default + name: openapi-call-with-response-output + do: + getPetById: + call: openapi + with: + document: + uri: https://petstore.swagger.io/v2/swagger.json + operation: getPetById + parameters: + petId: ${ .petId } + output: response + """ + And given the workflow input is: + """yaml + petId: 1 + """ + When the workflow is executed + Then the workflow should complete + And the workflow output should have properties 'request', 'request.method', 'request.uri', 'request.headers', 'headers', 'statusCode', 'content' + And the workflow output should have properties 'content.id', 'content.name', 'content.status' \ No newline at end of file diff --git a/ctk/features/composite.feature b/ctk/features/composite.feature new file mode 100644 index 00000000..aad61fc0 --- /dev/null +++ b/ctk/features/composite.feature @@ -0,0 +1,59 @@ +Feature: Composite Task + As an implementer of the workflow DSL + I want to ensure that composite tasks can be executed within the workflow + So that my implementation conforms to the expected behavior + + # Tests composite tasks with sequential sub tasks + Scenario: Composite Task With Sequential Sub Tasks + Given a workflow with definition: + """yaml + document: + dsl: 1.0.0-alpha1 + namespace: default + name: composite-sequential + do: + setRGB: + execute: + sequentially: + setRed: + set: + colors: ${ .colors + ["red"] } + setGreen: + set: + colors: ${ .colors + ["green"] } + setBlue: + set: + colors: ${ .colors + ["blue"] } + """ + When the workflow is executed + Then the workflow should complete with output: + """yaml + colors: [ red, green, blue ] + """ + + # Tests composite tasks 
With competing concurrent sub tasks + Scenario: Composite Task With Competing Concurrent Sub Tasks + Given a workflow with definition: + """yaml + document: + dsl: 1.0.0-alpha1 + namespace: default + name: composite-concurrent + do: + setRGB: + execute: + concurrently: + setRed: + set: + colors: ${ .colors + ["red"] } + setGreen: + set: + colors: ${ .colors + ["green"] } + setBlue: + set: + colors: ${ .colors + ["blue"] } + compete: true + """ + When the workflow is executed + Then the workflow should complete + And the workflow output should have a 'colors' property containing 1 items \ No newline at end of file diff --git a/ctk/features/data-flow.feature b/ctk/features/data-flow.feature new file mode 100644 index 00000000..36e1b998 --- /dev/null +++ b/ctk/features/data-flow.feature @@ -0,0 +1,91 @@ +Feature: Data Flow + As an implementer of the workflow DSL + I want to ensure that data flows correctly through the workflow + So that my implementation conforms to the expected behavior + + # Tests task input filtering + Scenario: Input Filtering + Given a workflow with definition: + """yaml + document: + dsl: 1.0.0-alpha1 + namespace: default + name: input-filtering + do: + setUsername: + input: + from: .user.claims.subject #filters the input of the task, using only the user's subject + set: + playerId: ${ . 
} + """ + And given the workflow input is: + """yaml + user: + claims: + subject: 6AsnRgGEB0q2O7ux9JXFAw + """ + When the workflow is executed + Then the workflow should complete with output: + """yaml + playerId: 6AsnRgGEB0q2O7ux9JXFAw + """ + + # Tests task output filtering + Scenario: Output Filteing + Given a workflow with definition: + """yaml + document: + dsl: 1.0.0-alpha1 + namespace: default + name: output-filtering + do: + getPetById: + call: http + with: + method: get + endpoint: + uri: https://petstore.swagger.io/v2/pet/{petId} #simple interpolation, only possible with top level variables + output: + from: .id #filters the output of the http call, using only the id of the returned object + """ + And given the workflow input is: + """yaml + petId: 1 + """ + When the workflow is executed + Then the workflow should complete with output: + """yaml + 1 + """ + + # Tests using non-object output + Scenario: Use Non-object Output + Given a workflow with definition: + """yaml + document: + dsl: 1.0.0-alpha1 + namespace: default + name: non-object-output + do: + getPetById1: + call: http + with: + method: get + endpoint: + uri: https://petstore.swagger.io/v2/pet/{petId} #simple interpolation, only possible with top level variables + output: + from: .id + getPetById2: + call: http + with: + method: get + endpoint: + uri: https://petstore.swagger.io/v2/pet/2 + output: + from: '{ ids: [ $input, .id ] }' + """ + When the workflow is executed + Then the workflow should complete with output: + """yaml + ids: [ 1, 2 ] + """ \ No newline at end of file diff --git a/ctk/features/emit.feature b/ctk/features/emit.feature new file mode 100644 index 00000000..8954a4c2 --- /dev/null +++ b/ctk/features/emit.feature @@ -0,0 +1,44 @@ +Feature: Emit Task + As an implementer of the workflow DSL + I want to ensure that emit tasks can be executed within the workflow + So that my implementation conforms to the expected behavior + + # Tests emit tasks + Scenario: Emit Task + Given a 
workflow with definition: + """yaml + document: + dsl: 1.0.0-alpha1 + namespace: default + name: emit + do: + emitUserGreeted: + emit: + event: + with: + source: https://fake-source.com + type: com.fake-source.user.greeted.v1 + data: + greetings: ${ "Hello \(.user.firstName) \(.user.lastName)!" } + """ + And given the workflow input is: + """yaml + user: + firstName: John + lastName: Doe + """ + When the workflow is executed + Then the workflow should complete + And the workflow output should have properties 'id', 'specversion', 'time', 'source', 'type', 'data' + And the workflow output should have a 'source' property with value: + """yaml + https://fake-source.com + """ + And the workflow output should have a 'type' property with value: + """yaml + com.fake-source.user.greeted.v1 + """ + And the workflow output should have a 'data' property with value: + """yaml + greetings: Hello John Doe! + """ \ No newline at end of file diff --git a/ctk/features/flow.feature b/ctk/features/flow.feature new file mode 100644 index 00000000..705a34b3 --- /dev/null +++ b/ctk/features/flow.feature @@ -0,0 +1,61 @@ +Feature: Flow Directive + As an implementer of the workflow DSL + I want to ensure that tasks are executed in the correct order + So that my implementation conforms to the expected behavior + + Scenario: Implicit Sequence Flow + Given a workflow with definition: + """yaml + document: + dsl: 1.0.0-alpha1 + namespace: default + name: implicit-sequence + do: + setRed: + set: + colors: '${ .colors + [ "red" ] }' + setGreen: + set: + colors: '${ .colors + [ "green" ] }' + setBlue: + set: + colors: '${ .colors + [ "blue" ] }' + """ + When the workflow is executed + Then the workflow should complete with output: + """yaml + colors: [ red, green, blue ] + """ + And setRed should run first + And setGreen should run after setRed + And setBlue should run after setGreen + + Scenario: Explicit Sequence Flow + Given a workflow with definition: + """yaml + document: + dsl: 1.0.0-alpha1 
+ namespace: default + name: explicit-sequence + do: + setRed: + set: + colors: '${ .colors + [ "red" ] }' + then: setGreen + setBlue: + set: + colors: '${ .colors + [ "blue" ] }' + then: end + setGreen: + set: + colors: '${ .colors + [ "green" ] }' + then: setBlue + """ + When the workflow is executed + Then the workflow should complete with output: + """yaml + colors: [ red, green, blue ] + """ + And setRed should run first + And setGreen should run after setRed + And setBlue should run after setGreen \ No newline at end of file diff --git a/ctk/features/for.feature b/ctk/features/for.feature new file mode 100644 index 00000000..43c17e04 --- /dev/null +++ b/ctk/features/for.feature @@ -0,0 +1,35 @@ +Feature: For Task + As an implementer of the workflow DSL + I want to ensure that for tasks can be executed within the workflow + So that my implementation conforms to the expected behavior + + # Tests for tasks + # Tests named iteration item (i.e.: `color`) + # Tests default iteration index (`index`) + Scenario: For Task + Given a workflow with definition: + """yaml + document: + dsl: 1.0.0-alpha1 + namespace: default + name: for + do: + forEachColor: + for: + each: color + in: '.colors' + do: + set: + processed: '${ { colors: (.processed.colors + [ $color ]), indexes: (.processed.indexes + [ $index ])} }' + """ + And given the workflow input is: + """yaml + colors: [ red, green, blue ] + """ + When the workflow is executed + Then the workflow should complete with output: + """yaml + processed: + colors: [ red, green, blue ] + indexes: [ 0, 1, 2 ] + """ \ No newline at end of file diff --git a/ctk/features/raise.feature b/ctk/features/raise.feature new file mode 100644 index 00000000..58567530 --- /dev/null +++ b/ctk/features/raise.feature @@ -0,0 +1,28 @@ +Feature: Raise Task + As an implementer of the workflow DSL + I want to ensure that the Raise task behaves correctly + So that my implementation conforms to the expected behavior + + Scenario: Raise task with 
inline error + Given a workflow with definition: + """yaml + document: + dsl: 1.0.0-alpha1 + namespace: default + name: raise-custom-error + do: + raiseComplianceError: + raise: + error: + status: 400 + type: https://serverlessworkflow.io/errors/types/compliance + title: Compliance Error + """ + When the workflow is executed + Then the workflow should fault with error: + """yaml + status: 400 + type: https://serverlessworkflow.io/errors/types/compliance + title: Compliance Error + instance: /do/raiseComplianceError + """ diff --git a/ctk/features/set.feature b/ctk/features/set.feature new file mode 100644 index 00000000..6d748ed9 --- /dev/null +++ b/ctk/features/set.feature @@ -0,0 +1,43 @@ +Feature: Set Task + As an implementer of the workflow DSL + I want to ensure that set tasks can be executed within the workflow + So that my implementation conforms to the expected behavior + + # Tests emit tasks + Scenario: Set Task + Given a workflow with definition: + """yaml + document: + dsl: 1.0.0-alpha1 + namespace: default + name: set + do: + initialize: + set: + shape: circle + size: ${ .configuration.size } + fill: ${ .configuration.fill } + """ + And given the workflow input is: + """yaml + configuration: + size: + width: 6 + height: 6 + fill: + red: 69 + green: 69 + blue: 69 + """ + When the workflow is executed + Then the workflow should complete with output: + """yaml + shape: circle + size: + width: 6 + height: 6 + fill: + red: 69 + green: 69 + blue: 69 + """ \ No newline at end of file diff --git a/ctk/features/switch.feature b/ctk/features/switch.feature new file mode 100644 index 00000000..01471cc0 --- /dev/null +++ b/ctk/features/switch.feature @@ -0,0 +1,137 @@ +Feature: Switch Task + As an implementer of the workflow DSL + I want to ensure that the Switch task behaves correctly + So that my implementation conforms to the expected behavior + + Scenario: Switch task with matching case + Given a workflow with definition: + """yaml + document: + dsl: 
1.0.0-alpha1 + namespace: default + name: switch-match + do: + switchColor: + switch: + red: + when: '.color == "red"' + then: setRed + green: + when: '.color == "green"' + then: setGreen + blue: + when: '.color == "blue"' + then: setBlue + setRed: + set: + colors: '${ .colors + [ "red" ] }' + then: end + setGreen: + set: + colors: '${ .colors + [ "green" ] }' + then: end + setBlue: + set: + colors: '${ .colors + [ "blue" ] }' + then: end + """ + And given the workflow input is: + """yaml + color: red + """ + When the workflow is executed + Then the workflow should complete with output: + """yaml + colors: [ red ] + """ + And switchColor should run first + And setRed should run last + + Scenario: Switch task with implicit default case + Given a workflow with definition: + """yaml + document: + dsl: 1.0.0-alpha1 + namespace: default + name: switch-default-implicit + do: + switchColor: + switch: + red: + when: '.color == "red"' + then: setRed + green: + when: '.color == "green"' + then: setGreen + blue: + when: '.color == "blue"' + then: setBlue + then: end + setRed: + set: + colors: '${ .colors + [ "red" ] }' + setGreen: + set: + colors: '${ .colors + [ "green" ] }' + setBlue: + set: + colors: '${ .colors + [ "blue" ] }' + """ + And given the workflow input is: + """yaml + color: yellow + """ + When the workflow is executed + Then the workflow should complete with output: + """yaml + color: yellow + """ + And switchColor should run first + And switchColor should run last + + Scenario: Switch task with explicit default case + Given a workflow with definition: + """yaml + document: + dsl: 1.0.0-alpha1 + namespace: default + name: switch-default-explicit + do: + switchColor: + switch: + red: + when: '.color == "red"' + then: setRed + green: + when: '.color == "green"' + then: setGreen + blue: + when: '.color == "blue"' + then: setBlue + anyOtherColor: + then: setCustomColor + setRed: + set: + colors: '${ .colors + [ "red" ] }' + setGreen: + set: + colors: '${ .colors + 
[ "green" ] }' + setBlue: + set: + colors: '${ .colors + [ "blue" ] }' + setCustomColor: + set: + colors: '${ .colors + [ $input.color ] }' + """ + And given the workflow input is: + """yaml + color: yellow + """ + When the workflow is executed + Then the workflow should complete with output: + """yaml + colors: [ yellow ] + """ + And switchColor should run first + And setCustomColor should run last + \ No newline at end of file diff --git a/ctk/features/try.feature b/ctk/features/try.feature new file mode 100644 index 00000000..cd5fb272 --- /dev/null +++ b/ctk/features/try.feature @@ -0,0 +1,81 @@ +Feature: Try Task + As an implementer of the workflow DSL + I want to ensure that try tasks can be executed within the workflow + So that my implementation conforms to the expected behavior + + # Tests that try tasks complete when catching errors, and execute the defined handler task, if any + # Tests simple uri interpolation + # Tests custom error variable name + # Tests error instance path + Scenario: Try Handle Caught Error + Given a workflow with definition: + """yaml + document: + dsl: 1.0.0-alpha1 + namespace: default + name: try-catch-404 + do: + tryGetPet: + try: + call: http + with: + method: get + endpoint: + uri: https://petstore.swagger.io/v2/pet/getPetByName/{petName} + catch: + errors: + with: + type: https://serverlessworkflow.io/dsl/errors/types/communication + status: 404 + as: err + do: + set: + error: ${ $err } + """ + And given the workflow input is: + """yaml + petName: Milou + """ + When the workflow is executed + Then the workflow should complete + And the workflow output should have properties 'error', 'error.type', 'error.status', 'error.title' + And the workflow output should have a 'error.instance' property with value: + """yaml + /do/tryGetPet/try + """ + + # Tests that try tasks fault when an uncaught error is raised + # Tests simple uri interpolation + # Tests custom error variable name + # Tests error instance path + Scenario: Try Raise 
Uncaught Error + Given a workflow with definition: + """yaml + document: + dsl: 1.0.0-alpha1 + namespace: default + name: try-catch-503 + do: + tryGetPet: + try: + call: http + with: + method: get + endpoint: + uri: https://petstore.swagger.io/v2/pet/getPetByName/{petName} + catch: + errors: + with: + type: https://serverlessworkflow.io/dsl/errors/types/communication + status: 503 + as: err + do: + set: + error: ${ $err } + """ + And given the workflow input is: + """yaml + petName: Milou + """ + When the workflow is executed + Then the workflow should fault \ No newline at end of file diff --git a/ctk/readme.md b/ctk/readme.md new file mode 100644 index 00000000..245d70d2 --- /dev/null +++ b/ctk/readme.md @@ -0,0 +1,297 @@ +# Serverless Workflow CTK + +## Table of Contents + +- [Introduction](#introduction) +- [Using the CTK](#using-the-ctk) + + [Conformance testing](#conformance-testing) + + [Behavior Driven Development](#behavior-driven-development-bdd) +- [Writing Features and Scenarios](#writing-features-and-scenarios) + + [Feature File Structure](#feature-file-structure) + + [Steps](#steps) + - [Arrange](#arrange) + + [Define workflow](#define-workflow) + + [Set workflow input](#set-workflow-input) + - [Act](#act) + + [Execute Workflow](#execute-workflow) + - [Assert](#assert) + - [Workflow has been cancelled](#workflow-has-been-cancelled) + - [Workflow ran to completion](#workflow-ran-to-completion) + - [Workflow has faulted](#workflow-has-faulted) + - [Workflow output should have properties](#workflow-output-should-have-properties) + - [Task ran first](#task-ran-first) + - [Task ran last](#task-ran-last) + - [Task ran before](#task-ran-before) + - [Task ran after](#task-ran-after) + - [Task has been cancelled](#task-has-been-cancelled) + - [Task ran to completion](#task-ran-to-completion) + - [Task has faulted](#task-has-faulted) + +## Introduction + +The Serverless Workflow Conformance Test Kit (CTK) is a suite of automated tests designed to ensure that 
implementations of the Serverless Workflow specification conform to the standard. The CTK is composed of multiple Gherkin features, each representing various aspects of the Serverless Workflow DSL. + +Gherkin is a human-readable language used for writing structured tests, which can be understood by both non-technical stakeholders and automated test frameworks. It uses a Given-When-Then syntax to describe the preconditions, actions, and expected outcomes of a test scenario. + +## Using the CTK + +The Serverless Workflow CTK serves two primary purposes: conformance testing and Behavior-Driven Development (BDD). + +### Conformance Testing + +Conformance testing is the process of verifying that an implementation adheres to a given specification. By running the CTK, developers can ensure that their implementations of the Serverless Workflow DSL behave as expected and meet the defined standards. This is crucial for maintaining interoperability and consistency across different implementations of the Serverless Workflow specification. + +1. **Clone the Repository**: Start by cloning the Serverless Workflow CTK repository to your local machine. + +```sh +git clone https://github.com/serverlessworkflow/specification.git +``` + +2. **Install Dependencies**: Ensure that you have all the necessary dependencies installed. This typically involves setting up a testing framework that can execute Gherkin tests. + +3. **Run the Tests**: Execute the Gherkin features using your preferred test runner. + +4. **Review Results**: After running the tests, review the results to ensure that your implementation passes all the scenarios. Any failures indicate deviations from the Serverless Workflow specification. + +### Behavior-Driven Development (BDD) + +Behavior-Driven Development (BDD) is an agile software development process that encourages collaboration among developers, testers, and business stakeholders. 
BDD focuses on defining the behavior of a system through examples in a shared language, which in this case is Gherkin. + +By using the CTK for BDD, teams can: + +**Define Behavior**: Write Gherkin scenarios that describe the expected behavior of the Serverless Workflow DSL. This helps in clearly specifying requirements and expected outcomes. + +**Facilitate Collaboration**: Use the Gherkin scenarios to facilitate discussions and collaboration between technical and non-technical team members. This ensures that everyone has a shared understanding of the system's behavior. + +**Automate Testing**: Implement automated tests based on the Gherkin scenarios to continuously verify that the system behaves as expected. This helps in catching regressions early and maintaining high quality. + +To use the CTK for BDD: + +**Write Scenarios**: Collaborate with stakeholders to write Gherkin scenarios that capture the desired behavior of the workflow. + +**Implement Steps**: Implement the steps in the Gherkin scenarios to interact with your workflow system. + +**Run and Validate**: Execute the scenarios and validate that the system's behavior matches the expectations defined in the Gherkin scenarios. + +## Writing Features and Scenarios + +To contribute new features or scenarios to the Serverless Workflow CTK, follow these guidelines: + +### Feature File Structure + +Each feature file should be placed in the [`/ctk/features`] directory and follow this structure: + +```gherkin +Feature: <feature name> + As a <role> + I want <goal> + So that <benefit> + + Scenario: <scenario name> + Given <precondition> + When <action> + Then <expected outcome> + And <additional outcome> +``` + +### Steps + +For clarity, we've categorized the Gherkin steps used in the Serverless Workflow CTK into three main groups: Arrange, Act, and Assert. + +These divisions help clarify the purpose of each step and streamline scenario comprehension. + +The Arrange section sets up the initial state or context, the Act section describes the action, and the Assert section verifies the outcome. 
This structure enhances readability, aiding stakeholders in understanding the scenario flow and step intent.
+
+#### Arrange
+
+Sets up the initial conditions for the test scenario.
+
+It includes steps to define the workflow, set the input data for the workflow, and prepare any necessary resources or configurations required for executing the workflow.
+
+The arrange section of the test ensures that the environment is properly configured before the workflow execution begins.
+
+##### Define workflow
+
+Sets up the scenario by providing the definition of a workflow in either JSON or YAML format. It defines the structure and behavior of the workflow to be tested.
+
+```gherkin
+Given a workflow with definition:
+"""yaml
+<workflow definition>
+"""
+```
+
+##### Set workflow input
+
+Specifies the input data for the workflow being tested. It provides the necessary data required for the workflow to execute and produce the expected output.
+
+```gherkin
+And given the workflow input is:
+"""yaml
+<workflow input>
+"""
+```
+
+#### Act
+
+Represents the action or event that triggers the execution of the workflow.
+
+The act section focuses on performing the specific action that the test scenario aims to verify or validate.
+
+##### Execute workflow
+
+Triggers the execution of the workflow. It initiates the processing of the workflow based on the provided definition and input data.
+
+```gherkin
+When the workflow is executed
+```
+
+#### Assert
+
+Contains assertions that verify the outcome of the workflow execution.
+
+It includes steps to check various conditions such as whether the workflow was canceled, completed successfully, or encountered any faults or errors during execution.
+
+The assert section ensures that the workflow behaves as expected and meets the specified criteria for correctness and reliability.
+
+##### Workflow has been cancelled
+
+Asserts that the workflow was canceled during execution. It checks if the workflow terminated prematurely without completing its intended process.
+
+```gherkin
+Then the workflow should cancel
+```
+
+##### Workflow ran to completion
+
+Asserts that the workflow execution completed successfully without any errors or faults. It ensures that the workflow ran through its entire process as expected.
+
+```gherkin
+Then the workflow should complete
+```
+
+Expecting a specific output:
+
+```gherkin
+Then the workflow should complete with output:
+"""yaml
+<expected output>
+"""
+```
+
+##### Workflow has faulted
+
+Asserts that the workflow encountered an error during execution. It verifies that the workflow did not complete successfully and identifies the presence of any faults in its execution.
+
+```gherkin
+Then the workflow should fault
+```
+
+Expecting a specific error:
+
+```gherkin
+Then the workflow should fault with error:
+"""yaml
+<expected error>
+"""
+```
+
+##### Workflow output should have properties
+
+Asserts that the workflow ran to completion and outputs a map that contains the specified single-quoted, comma-separated properties.
+
+```gherkin
+And the workflow output should have properties '<property1>', '<property2>', '<property3>'
+```
+
+##### Workflow output should have property
+
+Asserts that the workflow ran to completion and outputs a map that contains the specified single-quoted property with the specified value.
+
+```gherkin
+And the workflow output should have a '<property>' property with value:
+"""yaml
+<expected value>
+"""
+```
+
+##### Workflow output should have a property with item count
+
+Asserts that the workflow ran to completion and outputs a map whose specified property contains the expected number of items.
+
+```gherkin
+And the workflow output should have a '<property>' property containing <count> items
+```
+
+
+##### Task ran first
+
+Asserts that a specific task within the workflow executed first during workflow execution. It ensures the correct sequence of task execution based on the provided workflow definition.
+
+```gherkin
+And <task> should run first
+```
+
+##### Task ran last
+
+Asserts that a specific task within the workflow executed last during workflow execution. It ensures the correct sequence of task execution based on the provided workflow definition.
+
+```gherkin
+And <task> should run last
+```
+
+##### Task ran before
+
+Asserts that `TASK1` executed before `TASK2` during workflow execution. It ensures the correct order of task execution based on the provided workflow definition.
+
+```gherkin
+And <TASK1> should run before <TASK2>
+```
+
+##### Task ran after
+
+Asserts that `TASK2` executed after `TASK1` during workflow execution. It ensures the correct order of task execution based on the provided workflow definition.
+
+```gherkin
+And <TASK2> should run after <TASK1>
+```
+
+##### Task has been cancelled
+
+Asserts that a specific task within the workflow was canceled during execution. It verifies that the task did not complete its execution due to cancellation.
+
+```gherkin
+And <task> should cancel
+```
+
+##### Task ran to completion
+
+Asserts that a specific task within the workflow completed its execution successfully. It ensures that the task executed without any errors or faults.
+
+```gherkin
+And <task> should complete
+```
+
+Expecting a specific output:
+
+```gherkin
+And <task> should complete with output:
+"""yaml
+<expected output>
+"""
+```
+
+##### Task has faulted
+
+Asserts that a specific task within the workflow encountered an error or fault during execution. It verifies that the task did not complete successfully and identifies any faults in its execution.
+
+```gherkin
+And <task> should fault
+```
+
+Expecting a specific error:
+
+```gherkin
+And <task> should fault with error:
+"""yaml
+<expected error>
+"""
+```
\ No newline at end of file
diff --git a/dsl-reference.md b/dsl-reference.md
new file mode 100644
index 00000000..3b736b7a
--- /dev/null
+++ b/dsl-reference.md
@@ -0,0 +1,1384 @@
+# Serverless Workflow DSL - Reference
+
+## Table of Contents
+
+- [Abstract](#abstract)
+- [Definitions](#definitions)
+  + [Workflow](#workflow)
+  + [Task](#task)
+    - [Call](#call)
+      + [AsyncAPI](#asyncapi-call)
+      + [gRPC](#grpc-call)
+      + [HTTP](#http-call)
+      + [OpenAPI](#openapi-call)
+    - [Composite](#composite)
+    - [Emit](#emit)
+    - [For](#for)
+    - [Listen](#listen)
+    - [Raise](#raise)
+    - [Run](#run)
+      + [Container](#container-process)
+      + [Shell](#shell-process)
+      + [Script](#script-process)
+      + [Workflow](#workflow-process)
+    - [Switch](#switch)
+    - [Set](#set)
+    - [Try](#try)
+    - [Wait](#wait)
+  + [Flow Directive](#flow-directive)
+  + [External Resource](#external-resource)
+  + [Authentication Policy](#authentication-policy)
+    - [Basic](#basic-authentication)
+    - [Bearer](#bearer-authentication)
+    - [Certificate](#certificate-authentication)
+    - [Digest](#digest-authentication)
+    - [OAUTH2](#oauth2-authentication)
+  + [Extension](#extension)
+  + [Error](#error)
+  + [Error Filter](#error-filter)
+  + [Retry Policy](#retry-policy)
+  + [Input Data Model](#input-data-model)
+  + [Output Data Model](#output-data-model)
+  + [Timeout](#timeout)
+  + [Duration](#duration)
+  + [HTTP Response](#http-response)
+  + [HTTP Request](#http-request)
+
+## Abstract
+
+This document provides comprehensive definitions and detailed property tables for all the concepts discussed in the Serverless Workflow DSL. It serves as a reference guide, explaining the structure, components, and configurations available within the DSL.
By exploring this document, users will gain a thorough understanding of how to define, configure, and manage workflows, including task definitions, flow directives, and state transitions. This foundational knowledge will enable users to effectively utilize the DSL for orchestrating serverless functions and automating processes. + +## Definitions + +### Workflow + +A [workflow](#workflow) serves as a blueprint outlining the series of [tasks](#task) required to execute a specific business operation. It details the sequence in which [tasks](#task) must be completed, guiding users through the process from start to finish, and helps streamline operations, ensure consistency, and optimize efficiency within an organization. + +#### Properties + +| Name | Type | Required | Description| +|:--|:---:|:---:|:---| +| document.dsl | `string` | `yes` | The version of the DSL used by the workflow. | +| document.namespace | `string` | `yes` | The workflow's namespace.
| +| document.name | `string` | `yes` | The workflow's name.
 |
+| document.version | `string` | `yes` | The workflow's [semantic version](#semantic-version). |
+| document.title | `string` | `no` | The workflow's title. |
+| document.summary | `string` | `no` | The workflow's Markdown summary. |
+| document.tags | `map[string, string]` | `no` | A key/value mapping of the workflow's tags, if any. |
+| use | [`componentCollection`](#component-collection) | `no` | A collection containing the workflow's reusable components. |
+| do | [`map[string, task]`](#task) | `yes` | The [task(s)](#task) that must be performed by the [workflow](#workflow). |
+
+#### Examples
+
+```yaml
+document:
+  dsl: '0.10'
+  name: order-pet
+  version: '1.0.0'
+  title: Order Pet - 1.0.0
+  summary: >
+    # Order Pet - 1.0.0
+    ## Table of Contents
+    - [Description](#description)
+    - [Requirements](#requirements)
+    ## Description
+    A sample workflow used to process a hypothetical pet order using the [PetStore API](https://petstore.swagger.io/)
+    ## Requirements
+    ### Secrets
+    - my-oauth2-secret
+use:
+  authentications:
+    petStoreOAuth2:
+      oauth2: my-oauth2-secret
+  extensions:
+    externalLogging:
+      extend: all
+      before:
+        call: http
+        with:
+          method: post
+          uri: https://fake.log.collector.com
+          body:
+            message: "${ \"Executing task '\($task.reference)'...\" }"
+      after:
+        call: http
+        with:
+          method: post
+          uri: https://fake.log.collector.com
+          body:
+            message: "${ \"Executed task '\($task.reference)'...\" }"
+  functions:
+    getAvailablePets:
+      call: openapi
+      with:
+        document:
+          uri: https://petstore.swagger.io/v2/swagger.json
+        operation: findByStatus
+        parameters:
+          status: available
+  secrets:
+    - my-oauth2-secret
+do:
+  getAvailablePets:
+    call: getAvailablePets
+    output:
+      from: "$input + { availablePets: [.[] | select(.category.name == \"dog\" and (.tags[] | .breed == $input.order.breed))] }"
+  submitMatchesByMail:
+    call: http
+    with:
+      method: post
+      endpoint:
+        uri: https://fake.smtp.service.com/email/send
+        authentication: petStoreOAuth2
+      body:
+        from:
noreply@fake.petstore.com
+        to: ${ .order.client.email }
+        subject: Candidates for Adoption
+        body: >
+          Hello ${ .order.client.preferredDisplayName }!
+
+          Following your interest in adopting a dog, here is a list of candidates that you might be interested in:
+
+          ${ .pets | map("-\(.name)") | join("\n") }
+
+          Please do not hesitate to contact us at info@fake.petstore.com if you have questions.
+
+          Hope to hear from you soon!
+
+          ----------------------------------------------------------------------------------------------
+          DO NOT REPLY
+          ----------------------------------------------------------------------------------------------
+```
+
+### Task
+
+A task within a [workflow](#workflow) represents a discrete unit of work that contributes to achieving the overall objectives defined by the [workflow](#workflow).
+
+It encapsulates a specific action or set of actions that need to be executed in a predefined order to advance the workflow towards its completion.
+
+[Tasks](#task) are designed to be modular and focused, each serving a distinct purpose within the broader context of the [workflow](#workflow).
+
+By breaking down the [workflow](#workflow) into manageable [tasks](#task), organizations can effectively coordinate and track progress, enabling efficient collaboration and ensuring that work is completed in a structured and organized manner.
+
+The Serverless Workflow DSL defines a list of [tasks](#task) that **must be** supported by all runtimes:
+
+- [Call](#call), used to call services and/or functions.
+- [Composite](#composite), used to define a minimum of two subtasks to perform.
+- [Emit](#emit), used to emit [events](#event).
+- [For](#for), used to iterate over a collection of items, and conditionally perform a task for each of them.
+- [Listen](#listen), used to listen for one or more [events](#event).
+- [Raise](#raise), used to raise an [error](#error) and potentially fault the [workflow](#workflow).
+- [Run](#run), used to run a [container](#container), a [script](#script) or even a [shell](#shell) command.
+- [Switch](#switch), used to dynamically select and execute one of multiple alternative paths based on specified conditions.
+- [Set](#set), used to dynamically set or update the [workflow](#workflow)'s data during its execution.
+- [Try](#try), used to attempt executing a specified [task](#task), and to handle any resulting [errors](#error) gracefully, allowing the [workflow](#workflow) to continue without interruption.
+- [Wait](#wait), used to pause or wait for a specified duration before proceeding to the next task.
+
+#### Properties
+
+| Name | Type | Required | Description|
+|:--|:---:|:---:|:---|
+| input | [`inputDataModel`](#input-data-model) | `no` | An object used to customize the task's input and to document its schema, if any. |
+| output | [`outputDataModel`](#output-data-model) | `no` | An object used to customize the task's output and to document its schema, if any. |
+| timeout | [`timeout`](#timeout) | `no` | The configuration of the task's timeout, if any. |
+| then | [`flowDirective`](#flow-directive) | `no` | The flow directive to execute next.<br>*If not set, defaults to `continue`.* |
+
+#### Call
+
+Enables the execution of a specified function within a workflow, allowing seamless integration with custom business logic or external services.
+
+##### Properties
+
+| Name | Type | Required | Description|
+|:--|:---:|:---:|:---|
+| call | `string` | `yes` | The name of the function to call. |
+| with | `map` | `no` | A name/value mapping of the parameters to call the function with. |
+
+##### Examples
+
+```yaml
+document:
+  dsl: 0.10
+  name: sample-workflow
+  version: '0.1.0'
+do:
+  getPetById:
+    call: http
+    with:
+      method: get
+      uri: https://petstore.swagger.io/v2/pet/{petId}
+```
+
+Serverless Workflow defines several default functions that **MUST** be supported by all implementations and runtimes:
+
+- [AsyncAPI](#asyncapi-call)
+- [gRPC](#grpc-call)
+- [HTTP](#http-call)
+- [OpenAPI](#openapi-call)
+
+##### AsyncAPI Call
+
+The [AsyncAPI Call](#asyncapi-call) enables workflows to interact with external services described by [AsyncAPI](https://www.asyncapi.com/).
+
+###### Properties
+
+| Name | Type | Required | Description|
+|:--|:---:|:---:|:---|
+| document | [`externalResource`](#external-resource) | `yes` | The AsyncAPI document that defines the operation to call. |
+| operation | `string` | `yes` | The id of the AsyncAPI operation to call. |
+| server | `string` | `no` | A reference to the server to call the specified AsyncAPI operation on.<br>If not set, defaults to the first server matching the operation's channel. |
+| message | `string` | `no` | The name of the message to use.<br>If not set, defaults to the first message defined by the operation. |
+| binding | `string` | `no` | The name of the binding to use.<br>If not set, defaults to the first binding defined by the operation. |
+| payload | `any` | `no` | The operation's payload, as defined by the configured message. |
+| authentication | `string`
[`authenticationPolicy`](#authentication-policy) | `no` | The authentication policy, or the name of the authentication policy, to use when calling the AsyncAPI operation. | + +###### Examples + +```yaml +document: + dsl: 0.10 + name: sample-workflow + version: '0.1.0' +do: + greetUser: + call: asyncapi + with: + document: https://fake.com/docs/asyncapi.json + operation: findPetsByStatus + server: staging + message: getPetByStatusQuery + binding: http + payload: + petId: ${ .pet.id } +``` + +##### gRPC Call + +The [gRPC Call](#grpc-call) enables communication with external systems via the gRPC protocol, enabling efficient and reliable communication between distributed components. + +###### Properties + +| Name | Type | Required | Description| +|:--|:---:|:---:|:---| +| proto | [`externalResource`](#external-resource) | `yes` | The proto resource that describes the GRPC service to call. | +| service.name | `string` | `yes` | The name of the GRPC service to call. | +| service.host | `string` | `yes` | The hostname of the GRPC service to call. | +| service.port | `integer` | `no` | The port number of the GRPC service to call. | +| service.authentication | [`authenticationPolicy`](#authentication-policy) | `no` | The authentication policy, or the name of the authentication policy, to use when calling the GRPC service. | +| method | `string` | `yes` | The name of the GRPC service method to call. | +| arguments | `map` | `no` | A name/value mapping of the method call's arguments, if any. | + +###### Examples + +```yaml +document: + dsl: 0.10 + name: sample-workflow + version: '0.1.0' +do: + greetUser: + call: grpc + with: + proto: file://app/greet.proto + service: + name: GreeterApi.Greeter + host: localhost + port: 5011 + method: SayHello + arguments: + name: ${ .user.preferredDisplayName } +``` + +##### HTTP Call + +The [HTTP Call](#http-call) enables workflows to interact with external services over HTTP. 
+
+###### Properties
+
+| Name | Type | Required | Description|
+|:--|:---:|:---:|:---|
+| method | `string` | `yes` | The HTTP request method. |
+| endpoint | [`endpoint`](#endpoint) | `yes` | A URI or an object that describes the HTTP endpoint to call. |
+| headers | `map` | `no` | A name/value mapping of the HTTP headers to use, if any. |
+| body | `any` | `no` | The HTTP request body, if any. |
+| output | `string` | `no` | The http call's output format.<br>*Supported values are:*<br>*- `raw`, which outputs the base-64 encoded [http response](#http-response) content, if any.*<br>*- `content`, which outputs the content of the [http response](#http-response), possibly deserialized.*<br>*- `response`, which outputs the [http response](#http-response).*
*Defaults to `content`.* | + +###### Examples + +```yaml +document: + dsl: 0.10 + name: sample-workflow + version: '0.1.0' +do: + getPetById: + call: http + with: + method: get + uri: https://petstore.swagger.io/v2/pet/{petId} +``` + +##### OpenAPI Call + +The [OpenAPI Call](#openapi-call) enables workflows to interact with external services described by [OpenAPI](https://www.openapis.org). + +###### Properties + +| Name | Type | Required | Description| +|:--|:---:|:---:|:---| +| document | [`externalResource`](#external-resource) | `yes` | The OpenAPI document that defines the operation to call. | +| operation | `string` | `yes` | The id of the OpenAPI operation to call. | +| arguments | `map` | `no` | A name/value mapping of the parameters, if any, of the OpenAPI operation to call. | +| authentication | [`authenticationPolicy`](#authentication-policy) | `no` | The authentication policy, or the name of the authentication policy, to use when calling the OpenAPI operation. | +| output | `string` | `no` | The OpenAPI call's output format.
*Supported values are:*<br>*- `raw`, which outputs the base-64 encoded [http response](#http-response) content, if any.*<br>*- `content`, which outputs the content of the [http response](#http-response), possibly deserialized.*<br>*- `response`, which outputs the [http response](#http-response).*
*Defaults to `content`.* | + +###### Examples + +```yaml +document: + dsl: 0.10 + name: sample-workflow + version: '0.1.0' +do: + getPets: + call: openapi + with: + document: https://petstore.swagger.io/v2/swagger.json + operation: findPetsByStatus + parameters: + status: available +``` + +#### Composite + + Serves as a pivotal orchestrator within workflow systems, enabling the seamless integration and execution of multiple subtasks to accomplish complex operations. By encapsulating and coordinating various subtasks, this task type facilitates the efficient execution of intricate workflows. + +##### Properties + +| Name | Type | Required | Description| +|:--|:---:|:---:|:---| +| execute.sequentially | [`map(string, task)`](#task) | `no` | The tasks to perform sequentially.
*Required if `execute.concurrently` has not been set, otherwise ignored.*<br>*If set, must contain **at least** two [`tasks`](#task).* |
+| execute.concurrently | [`map(string, task)`](#task) | `no` | The tasks to perform concurrently.<br>*Required if `execute.sequentially` has not been set, otherwise ignored.*<br>*If set, must contain **at least** two [`tasks`](#task).* |
+| execute.compete | `boolean` | `no` | Indicates whether or not the concurrent [`tasks`](#task) are racing against each other, with a single possible winner, which sets the composite task's output.<br>*If not set, defaults to `false`.*
*Must **not** be set if the [`tasks`](#task) are executed sequentially.* | + +##### Examples + +*Executing tasks sequentially:* +```yaml +document: + dsl: 0.10 + name: sample-workflow + version: '0.1.0' +do: + bookTrip: + execute: + sequentially: + bookHotel: + call: http + with: + method: post + endpoint: + uri: https://fake-booking-agency.com/hotels/book + authentication: fake-booking-agency-oauth2 + body: + name: Four Seasons + city: Antwerp + country: Belgium + bookFlight: + call: http + with: + method: post + endpoint: + uri: https://fake-booking-agency.com/flights/book + authentication: fake-booking-agency-oauth2 + body: + departure: + date: '01/01/26' + time: '07:25:00' + from: + airport: BRU + city: Zaventem + country: Belgium + arrival: + date: '01/01/26' + time: '11:12:00' + to: + airport: LIS + city: Lisbon + country: Portugal +``` + +*Executing tasks concurrently:* +```yaml +document: + dsl: 0.10 + name: sample-workflow + version: '0.1.0' +do: + raiseAlarm: + execute: + concurrently: + callNurse: + call: http + with: + method: put + uri: https://fake-hospital.com/api/v3/alert/nurses + body: + patientId: ${ .patient.fullName } + room: ${ .room.number } + callDoctor: + call: http + with: + method: put + uri: https://fake-hospital.com/api/v3/alert/doctor + body: + patientId: ${ .patient.fullName } + room: ${ .room.number } +``` + +#### Emit + +Allows workflows to publish events to event brokers or messaging systems, facilitating communication and coordination between different components and services. With the Emit task, workflows can seamlessly integrate with event-driven architectures, enabling real-time processing, event-driven decision-making, and reactive behavior based on incoming events. + +##### Properties + +| Name | Type | Required | Description | +|:--|:---:|:---:|:---| +| emit.event | [`event`](#event) | `yes` | Defines the event to emit. 
| + +##### Examples + +```yaml +document: + dsl: 0.10 + name: sample-workflow + version: '0.1.0' +do: + placeOrder: + emit: + event: + with: + source: https://petstore.com + type: com.petstore.order.placed.v1 + data: + client: + firstName: Cruella + lastName: de Vil + items: + - breed: dalmatian + quantity: 101 +``` + +#### For + +Allows workflows to iterate over a collection of items, executing a defined set of subtasks for each item in the collection. This task type is instrumental in handling scenarios such as batch processing, data transformation, and repetitive operations across datasets. + +##### Properties + +| Name | Type | Required | Description| +|:--|:---:|:---:|:---| +| for.each | `string` | `no` | The name of the variable used to store the current item being enumerated.
Defaults to `item`. | +| for.in | `string` | `yes` | A [runtime expression](#runtime-expressions) used to get the collection to enumerate. | +| for.at | `string` | `no` | The name of the variable used to store the index of the current item being enumerated.
Defaults to `index`. | +| while | `string` | `no` | A [runtime expression](#runtime-expressions) that represents the condition, if any, that must be met for the iteration to continue. | +| do | [`task`](#task) | `yes` | The task to perform for each item in the collection. | + +##### Examples + +```yaml +document: + dsl: 0.10 + name: sample-workflow + version: '0.1.0' +do: + checkup: + for: + each: pet + in: .pets + at: index + while: .vet != null + do: + checkForFleas: + if: $pet.lastCheckup == null + listen: + to: + one: + with: + type: com.fake.petclinic.pets.checkup.completed.v2 + output: + to: '.pets + [{ "id": $pet.id }]' +``` + +#### Listen + +Provides a mechanism for workflows to await and react to external events, enabling event-driven behavior within workflow systems. + +##### Properties + +| Name | Type | Required | Description| +|:--|:---:|:---:|:---| +| listen.to.all | [`eventFilter[]`](#event-filter) | `no` | Configures the workflow to wait for all defined events before resuming execution.
*Required if `any` and `one` have not been set.* |
+| listen.to.any | [`eventFilter[]`](#event-filter) | `no` | Configures the workflow to wait for any of the defined events before resuming execution.<br>*Required if `all` and `one` have not been set.* |
+| listen.to.one | [`eventFilter`](#event-filter) | `no` | Configures the workflow to wait for the defined event before resuming execution.
*Required if `all` and `any` have not been set.* | + +##### Examples + +```yaml +document: + dsl: 0.10 + name: sample-workflow + version: '0.1.0' +do: + callDoctor: + listen: + to: + any: + - with: + type: com.fake-hospital.vitals.measurements.temperature + data: + temperature: ${ .temperature > 38 } + - with: + type: com.fake-hospital.vitals.measurements.bpm + data: + temperature: ${ .bpm < 60 or .bpm > 100 } +``` + +#### Raise + +Intentionally triggers and propagates errors. By employing the "Raise" task, workflows can deliberately generate error conditions, allowing for explicit error handling and fault management strategies to be implemented. + +##### Properties + +| Name | Type | Required | Description | +|:--|:---:|:---:|:---| +| raise.error | [`error`](#error) | `yes` | Defines the error to raise. | + +##### Examples + +```yaml +document: + dsl: 0.10 + name: sample-workflow + version: '0.1.0' +do: + processTicket: + switch: + highPriority: + when: .ticket.priority == "high" + then: escalateToManager + mediumPriority: + when: .ticket.priority == "medium" + then: assignToSpecialist + lowPriority: + when: .ticket.priority == "low" + then: resolveTicket + default: + then: raiseUndefinedPriorityError + raiseUndefinedPriorityError: + raise: + error: + type: https://fake.com/errors/tickets/undefined-priority + status: 400 + title: Undefined Priority + escalateToManager: {...} + assignToSpecialist: {...} + resolveTicket: {...} +``` + +#### Run + +Provides the capability to execute external [containers](#container-process), [shell commands](#shell-process), [scripts](#script-process), or [workflows](#workflow-process). + +##### Properties + +| Name | Type | Required | Description| +|:--|:---:|:---:|:---| +| run.container | [`container`](#container-process) | `no` | The definition of the container to run.
*Required if `script`, `shell` and `workflow` have not been set.* |
+| run.script | [`script`](#script-process) | `no` | The definition of the script to run.<br>*Required if `container`, `shell` and `workflow` have not been set.* |
+| run.shell | [`shell`](#shell-process) | `no` | The definition of the shell command to run.<br>*Required if `container`, `script` and `workflow` have not been set.* |
+| run.workflow | [`workflow`](#workflow-process) | `no` | The definition of the workflow to run.
*Required if `container`, `script` and `shell` have not been set.* | + +##### Examples + +```yaml +document: + dsl: 0.10 + name: sample-workflow + version: '0.1.0' +do: + + runContainer: + run: + container: + image: fake-image + + runScript: + run: + script: + language: js + code: > + Some cool multiline script + + runShell: + run: + shell: + command: 'echo "Hello, ${ .user.name }"' + + runWorkflow: + run: + workflow: + reference: another-one:0.1.0 + input: {} +``` + +##### Container Process + +Enables the execution of external processes encapsulated within a containerized environment, allowing workflows to interact with and execute complex operations using containerized applications, scripts, or commands. + +###### Properties + +| Name | Type | Required | Description | +|:--|:---:|:---:|:---| +| image | `string` | `yes` | The name of the container image to run | +| command | `string` | `no` | The command, if any, to execute on the container | +| ports | `map` | `no` | The container's port mappings, if any | +| volumes | `map` | `no` | The container's volume mappings, if any | +| environment | `map` | `no` | A key/value mapping of the environment variables, if any, to use when running the configured process | + +###### Examples + +```yaml +document: + dsl: 0.10 + name: sample-workflow + version: '0.1.0' +do: + runContainer: + run: + container: + image: fake-image +``` + +##### Script Process + +Enables the execution of custom scripts or code within a workflow, empowering workflows to perform specialized logic, data processing, or integration tasks by executing user-defined scripts written in various programming languages. + +###### Properties + +| Name | Type | Required | Description | +|:--|:---:|:---:|:---| +| language | `string` | `yes` | The language of the script to run | +| code | `string` | `no` | The script's code.
*Required if `source` has not been set.* | +| source | [externalResource](#external-resource) | `no` | The script's resource.
*Required if `code` has not been set.* | +| environment | `map` | `no` | A key/value mapping of the environment variables, if any, to use when running the configured process | + +###### Examples + +```yaml +document: + dsl: 0.10 + name: sample-workflow + version: '0.1.0' +do: + runScript: + run: + script: + language: js + code: > + Some cool multiline script +``` + +##### Shell Process + +Enables the execution of shell commands within a workflow, enabling workflows to interact with the underlying operating system and perform system-level operations, such as file manipulation, environment configuration, or system administration tasks. + +###### Properties + +| Name | Type | Required | Description | +|:--|:---:|:---:|:---| +| command | `string` | `yes` | The shell command to run | +| arguments | `map` | `no` | A list of the arguments of the shell command to run | +| environment | `map` | `no` | A key/value mapping of the environment variables, if any, to use when running the configured process | + +###### Examples + +```yaml +document: + dsl: 0.10 + name: sample-workflow + version: '0.1.0' +do: + runShell: + run: + shell: + command: 'echo "Hello, ${ .user.name }"' +``` + +##### Workflow Process + +Enables the invocation and execution of nested workflows within a parent workflow, facilitating modularization, reusability, and abstraction of complex logic or business processes by encapsulating them into standalone workflow units. + +###### Properties + +| Name | Type | Required | Description | +|:--|:---:|:---:|:---| +| name | `string` | `yes` | The name of the workflow to run | +| version | `string` | `yes` | The version of the workflow to run. Defaults to `latest` | +| input | `any` | `yes` | The data, if any, to pass as input to the workflow to execute. 
The value should be validated against the target workflow's input schema, if specified. |
+
+###### Examples
+
+```yaml
+document:
+  dsl: 0.10
+  name: sample-workflow
+  version: '0.1.0'
+do:
+  runWorkflow:
+    run:
+      workflow:
+        reference: another-one:0.1.0
+        input:
+          foo: bar
+```
+
+#### Switch
+
+Enables conditional branching within workflows, allowing them to dynamically select different paths based on specified conditions or criteria.
+
+##### Properties
+
+| Name | Type | Required | Description |
+|:--|:---:|:---:|:---|
+| switch | [`map[string, case]`](#switch-case) | `yes` | A name/value map of the cases to switch on. |
+
+##### Examples
+
+```yaml
+document:
+  dsl: 0.10
+  name: sample-workflow
+  version: '0.1.0'
+do:
+  processOrder:
+    switch:
+      case1:
+        when: .orderType == "electronic"
+        then: processElectronicOrder
+      case2:
+        when: .orderType == "physical"
+        then: processPhysicalOrder
+      default:
+        then: handleUnknownOrderType
+  processElectronicOrder:
+    execute:
+      sequentially:
+        validatePayment: {...}
+        fulfillOrder: {...}
+    then: exit
+  processPhysicalOrder:
+    execute:
+      sequentially:
+        checkInventory: {...}
+        packItems: {...}
+        scheduleShipping: {...}
+    then: exit
+  handleUnknownOrderType:
+    execute:
+      sequentially:
+        logWarning: {...}
+        notifyAdmin: {...}
+```
+
+##### Switch Case
+
+Defines a switch case, encompassing a condition for matching and an associated action to execute upon a match.
+
+| Name | Type | Required | Description |
+|:--|:---:|:---:|:---|
+| when | `string` | `no` | A runtime expression used to determine whether or not the case matches.
*If not set, the case will be matched by default if no other case matches.*
*Note that there can be only one default case; all others **MUST** set a condition.*
+| then | [`flowDirective`](#flow-directive) | `yes` | The flow directive to execute when the case matches. |
+
+#### Try
+
+Serves as a mechanism within workflows to handle errors gracefully, potentially retrying failed tasks before proceeding with alternate ones.
+
+##### Properties
+
+| Name | Type | Required | Description|
+|:--|:---:|:---:|:---|
+| try | [`task`](#task) | `yes` | The task to perform. |
+| catch | [`errorCatcher`](#error-catcher) | `yes` | The object used to define the errors to catch and how to handle them. |
+
+##### Examples
+
+```yaml
+document:
+  dsl: 0.10
+  name: sample-workflow
+  version: '0.1.0'
+do:
+  processOrder:
+    try:
+      call: http
+      with:
+        method: get
+        uri: https://
+    catch:
+      errors:
+        with:
+          type: https://serverlessworkflow.io/dsl/errors/types/communication
+          status: 503
+      as: error
+      retry:
+        delay:
+          seconds: 3
+        backoff:
+          exponential: {}
+        limit:
+          attempt:
+            count: 5
+```
+
+##### Error Catcher
+
+Defines the configuration used to catch errors.
+
+###### Properties
+
+| Name | Type | Required | Description |
+|:--|:---:|:---:|:---|
+| errors | `errorFilter` | `no` | The definition of the errors to catch |
+| as | `string` | `no` | The name of the runtime expression variable to save the error as. Defaults to 'error'. |
+| when | `string`| `no` | A runtime expression used to determine whether or not to catch the filtered error |
+| exceptWhen | `string` | `no` | A runtime expression used to determine whether or not to ignore the filtered error |
+| retry | [`retryPolicy`](#retry-policy) | `no` | The retry policy to use, if any, when catching errors |
+| do | [`task`](#task) | `no` | The definition of the task to run when catching an error |
+
+#### Wait
+
+Allows workflows to pause or delay their execution for a specified period of time.
+
+##### Properties
+
+| Name | Type | Required | Description|
+|:--|:---:|:---:|:---|
+| wait | `string`
[`duration`](#duration) | `yes` | The amount of time to wait.
If a `string`, must be a valid [ISO 8601](https://en.wikipedia.org/wiki/ISO_8601) duration expression. |
+
+##### Examples
+
+```yaml
+document:
+  dsl: 0.10
+  name: sample-workflow
+  version: '0.1.0'
+do:
+  delay10Seconds:
+    wait:
+      seconds: 10
+```
+
+### Flow Directive
+
+Flow Directives are commands within a workflow that dictate its progression.
+
+| Directive | Description |
+| --------- | ----------- |
+| `continue` | Instructs the workflow to proceed with the next task in line. This action may conclude the execution of a particular workflow or branch if there is no task defined after the `continue` one. |
+| `exit` | Halts the current branch's execution, potentially terminating the entire workflow if the current task resides within the main branch. |
+| `end` | Provides a graceful conclusion to the workflow execution, signaling its completion explicitly. |
+
+### External Resource
+
+Defines an external resource.
+
+#### Properties
+
+| Property | Type | Required | Description |
+|----------|:----:|:--------:|-------------|
+| name | `string` | `no` | The name, if any, of the defined resource. |
+| uri | `string` | `yes` | The URI at which to get the defined resource. |
+| authentication | [`authenticationPolicy`](#authentication-policy) | `no` | The authentication policy, or the name of the authentication policy, to use when fetching the resource. |
+
+##### Examples
+
+```yaml
+name: sample-resource
+uri: https://fake.com/resource/0123
+authentication:
+  basic:
+    username: admin
+    password: 1234
+```
+
+### Authentication Policy
+
+Defines the mechanism used to authenticate users and workflows attempting to access a service or a resource.
+
+#### Properties
+
+| Property | Type | Required | Description |
+|----------|:----:|:--------:|-------------|
+| basic | [`basicAuthentication`](#basic-authentication) | `no` | The `basic` authentication scheme to use, if any.
Required if no other property has been set, otherwise ignored. |
+| bearer | [`bearerAuthentication`](#bearer-authentication) | `no` | The `bearer` authentication scheme to use, if any.
Required if no other property has been set, otherwise ignored. | +| certificate | [`certificateAuthentication`](#certificate-authentication) | `no` | The `certificate` authentication scheme to use, if any.
Required if no other property has been set, otherwise ignored. | +| digest | [`digestAuthentication`](#digest-authentication) | `no` | The `digest` authentication scheme to use, if any.
Required if no other property has been set, otherwise ignored. |
+| oauth2 | [`oauth2`](#oauth2-authentication) | `no` | The `oauth2` authentication scheme to use, if any.
Required if no other property has been set, otherwise ignored. |
+
+##### Examples
+
+```yaml
+document:
+  dsl: 0.10
+  name: sample-workflow
+  version: '0.1.0'
+use:
+  secrets:
+    usernamePasswordSecret: {}
+  authentication:
+    sampleBasicFromSecret:
+      basic: usernamePasswordSecret
+do:
+  getMessages:
+    call: http
+    with:
+      method: get
+      endpoint:
+        uri: https://secured.fake.com/sample
+        authentication: sampleBasicFromSecret
+```
+
+#### Basic Authentication
+
+Defines the fundamentals of a 'basic' authentication.
+
+##### Properties
+
+| Property | Type | Required | Description |
+|----------|:----:|:--------:|-------------|
+| username | `string` | `yes` | The username to use. |
+| password | `string` | `yes` | The password to use. |
+
+##### Examples
+
+```yaml
+document:
+  dsl: 0.10
+  name: sample-workflow
+  version: '0.1.0'
+use:
+  authentication:
+    sampleBasic:
+      basic:
+        username: admin
+        password: 123
+do:
+  getMessages:
+    call: http
+    with:
+      method: get
+      endpoint:
+        uri: https://secured.fake.com/sample
+        authentication: sampleBasic
+```
+
+#### Bearer Authentication
+
+Defines the fundamentals of a 'bearer' authentication.
+
+##### Properties
+
+| Property | Type | Required | Description |
+|----------|:----:|:--------:|-------------|
+| token | `string` | `yes` | The bearer token to use. |
+
+##### Examples
+
+```yaml
+document:
+  dsl: 0.10
+  name: sample-workflow
+  version: '0.1.0'
+do:
+  getMessages:
+    call: http
+    with:
+      method: get
+      endpoint:
+        uri: https://secured.fake.com/sample
+      authentication:
+        bearer:
+          token: ${ .user.token }
+```
+
+#### Certificate Authentication
+
+
+#### Digest Authentication
+
+
+#### OAUTH2 Authentication
+
+Defines the fundamentals of an 'oauth2' authentication.
+
+##### Properties
+
+| Property | Type | Required | Description |
+|----------|:----:|:--------:|-------------|
+| authority | `string` | `yes` | The URI that references the OAuth2 authority to use. |
+| grant | `string` | `yes` | The grant type to use.
+| client.id | `string` | `yes` | The client id to use. |
+| client.secret | `string` | `no` | The client secret to use, if any. |
+| scopes | `string[]` | `no` | The scopes, if any, to request the token for. |
+| audiences | `string[]` | `no` | The audiences, if any, to request the token for. |
+| username | `string` | `no` | The username to use. Used only if the grant type is `Password`. |
+| password | `string` | `no` | The password to use. Used only if the grant type is `Password`. |
+| subject | [`oauth2Token`](#oauth2-token) | `no` | The security token that represents the identity of the party on behalf of whom the request is being made. |
+| actor | [`oauth2Token`](#oauth2-token) | `no` | The security token that represents the identity of the acting party. |
+
+##### Examples
+
+```yaml
+document:
+  dsl: 0.10
+  name: sample-workflow
+  version: '0.1.0'
+do:
+  getMessages:
+    call: http
+    with:
+      method: get
+      endpoint:
+        uri: https://secured.fake.com/sample
+      authentication:
+        oauth2:
+          authority: http://keycloak/realms/fake-authority/.well-known/openid-configuration
+          grant: client-credentials
+          client:
+            id: workflow-runtime
+            secret: **********
+          scopes: [ api ]
+          audiences: [ runtime ]
+```
+
+##### OAUTH2 Token
+
+Represents the definition of an OAUTH2 token.
+
+###### Properties
+
+| Property | Type | Required | Description |
+|----------|:----:|:--------:|-------------|
+| token | `string` | `yes` | The security token to use. |
+| type | `string` | `yes` | The type of security token to use. |
+
+### Extension
+
+Holds the definition for extending functionality, providing configuration options for how an extension extends and interacts with other components.
+
+Extensions enable the execution of tasks prior to those they extend, offering the flexibility to potentially bypass the extended task entirely using an [`exit` workflow directive](#flow-directive).
+ +#### Properties + +| Property | Type | Required | Description | +|----------|:----:|:--------:|-------------| +| extend | `string` | `yes` | The type of task to extend
Supported values are: `call`, `composite`, `emit`, `extension`, `for`, `listen`, `raise`, `run`, `set`, `switch`, `try`, `wait` and `all` | +| when | `string` | `no` | A runtime expression used to determine whether or not the extension should apply in the specified context | +| before | [`task`](#task) | `no` | The task to execute, if any, before the extended task | +| after | [`task`](#task) | `no` | The task to execute, if any, after the extended task | + +#### Examples + +*Perform logging before and after any non-extension task is run:* +```yaml +document: + name: sample-workflow + version: '0.1.0' +use: + extensions: + logging: + extend: all + before: + call: http + with: + method: post + uri: https://fake.log.collector.com + body: + message: "${ \"Executing task '\($task.reference)'...\" }" + after: + call: http + with: + method: post + uri: https://fake.log.collector.com + body: + message: "${ \"Executed task '\($task.reference)'...\" }" +do: + get: + call: http + with: + method: get + uri: https://fake.com/sample +``` + +*Intercept HTTP calls to 'https://mocked.service.com' and mock its response:* +```yaml +document: + name: sample-workflow + version: '0.1.0' +use: + extensions: + mockService: + extend: http + when: ($task.with.uri != null and ($task.with.uri | startswith("https://mocked.service.com"))) or ($task.with.endpoint.uri != null and ($task.with.endpoint.uri | startswith("https://mocked.service.com"))) + before: + set: + statusCode: 200 + headers: + Content-Type: application/json + content: + foo: + bar: baz + then: exit #using this, we indicate to the workflow we want to exit the extended task, thus just returning what we injected +do: + get: + call: http + with: + method: get + uri: https://fake.com/sample +``` + +### Error + +Defines the [Problem Details RFC](https://datatracker.ietf.org/doc/html/rfc7807) compliant description of an error. 
+ +#### Properties + +| Property | Type | Required | Description | +|----------|:----:|:--------:|-------------| +| type | `string` | `yes` | A URI reference that identifies the [`error`](#error) type.
For cross-compatibility concerns, it is strongly recommended to use [Standard Error Types](#standard-error-types) whenever possible.
Runtimes **MUST** ensure that the property has been set when raising or escalating the [`error`](#error). | +| status | `integer` | `yes` | The status code generated by the origin for this occurrence of the [`error`](#error).
For cross-compatibility concerns, it is strongly recommended to use [HTTP Status Codes](https://datatracker.ietf.org/doc/html/rfc7231#section-6) whenever possible.
Runtimes **MUST** ensure that the property has been set when raising or escalating the [`error`](#error). | +| instance | `string` | `yes` | A [JSON Pointer](https://datatracker.ietf.org/doc/html/rfc6901) used to reference the component the [`error`](#error) originates from.
Runtimes **MUST** set the property when raising or escalating the [`error`](#error); otherwise, it is ignored. |
+| title | `string` | `no` | A short, human-readable summary of the [`error`](#error). |
+| detail | `string` | `no` | A human-readable explanation specific to this occurrence of the [`error`](#error). |
+
+#### Examples
+
+```yaml
+type: https://serverlessworkflow.io/spec/1.0.0/errors/communication
+title: Service Not Available
+status: 503
+```
+
+#### Standard Error Types
+
+Standard error types serve the purpose of categorizing errors consistently across different runtimes, facilitating seamless migration from one runtime environment to another.
+
+| Type | Status¹ | Description |
+|------|:-------:|-------------|
+| [https://serverlessworkflow.io/spec/1.0.0/errors/configuration](#) | `400` | Errors resulting from incorrect or invalid configuration settings, such as missing or misconfigured environment variables, incorrect parameter values, or configuration file errors. |
+| [https://serverlessworkflow.io/spec/1.0.0/errors/validation](#) | `400` | Errors arising from validation processes, such as validation of input data, schema validation failures, or validation constraints not being met. These errors indicate that the provided data or configuration does not adhere to the expected format or requirements specified by the workflow. |
+| [https://serverlessworkflow.io/spec/1.0.0/errors/expression](#) | `400` | Errors occurring during the evaluation of runtime expressions, such as invalid syntax or unsupported operations. |
+| [https://serverlessworkflow.io/spec/1.0.0/errors/authentication](#) | `401` | Errors related to authentication failures. |
+| [https://serverlessworkflow.io/spec/1.0.0/errors/authorization](#) | `403` | Errors related to unauthorized access attempts or insufficient permissions to perform certain actions within the workflow.
|
+| [https://serverlessworkflow.io/spec/1.0.0/errors/timeout](#) | `408` | Errors caused by timeouts during the execution of tasks or during interactions with external services. |
+| [https://serverlessworkflow.io/spec/1.0.0/errors/communication](#) | `500` | Errors encountered while communicating with external services, including network errors, service unavailable, or invalid responses. |
+| [https://serverlessworkflow.io/spec/1.0.0/errors/runtime](#) | `500` | Errors occurring during the runtime execution of a workflow, including unexpected exceptions, errors related to resource allocation, or failures in handling workflow tasks. These errors typically occur during the actual execution of workflow components and may require runtime-specific handling and resolution strategies. |
+
+¹ *Default value. The `status code` that best describes the error should always be used.*
+
+### Input Data Model
+
+Documents the structure - and optionally configures the filtering of - workflow/task input data.
+
+It's crucial for authors to document the schema of input data whenever feasible. This documentation empowers consuming applications to provide contextual auto-suggestions when handling runtime expressions.
+
+When set, runtimes must validate input data against the defined schema, unless defined otherwise.
+
+#### Properties
+
+| Property | Type | Required | Description |
+|----------|:----:|:--------:|-------------|
+| schema | [`schema`](#schema) | `no` | The [`schema`](#schema) used to describe and validate input data.
*Even though the schema is not required, it is strongly encouraged to document it, whenever feasible.* | +| from | `string`
`object` | `no` | A [runtime expression](#runtime-expressions), if any, used to filter and/or mutate the workflow/task input. | + +#### Examples + +```yaml +schema: + format: json + document: + type: object + properties: + petId: + type: string + required: [ petId ] +from: .order.pet +``` + +### Output Data Model + +Documents the structure - and optionally configures the filtering of - workflow/task output data. + +It's crucial for authors to document the schema of output data whenever feasible. This documentation empowers consuming applications to provide contextual auto-suggestions when handling runtime expressions. + +When set, runtimes must validate output data against the defined schema, unless defined otherwise. + +#### Properties + +| Property | Type | Required | Description | +|----------|:----:|:--------:|-------------| +| schema | [`schema`](#schema) | `no` | The [`schema`](#schema) used to describe and validate output data.
*Even though the schema is not required, it is strongly encouraged to document it, whenever feasible.* | +| from | `string`
`object` | `no` | A [runtime expression](#runtime-expressions), if any, used to filter and/or mutate the workflow/task output. | +| to | `string`
`object` | `no` | A [runtime expression](#runtime-expressions), if any, used to update the context, using both output and context data. | + +#### Examples + +```yaml +schema: + format: json + document: + type: object + properties: + petId: + type: string + required: [ petId ] +from: + petId: '${ .pet.id }' +to: '.petList += [ . ]' +``` + +### Schema + +Describes a data schema. + +#### Properties + +| Property | Type | Required | Description | +|----------|:----:|:--------:|-------------| +| format | `string` | `yes` | The schema format.
*Supported values are:*
*- `json`, which indicates the [JsonSchema](https://json-schema.org/) format.* | +| document | `object` | `no` | The inline schema document.
*Required if `resource` has not been set, otherwise ignored.* | +| resource | [`externalResource`](#external-resource) | `no` | The schema external resource.
*Required if `document` has not been set, otherwise ignored.* |
+
+#### Examples
+
+*Example of an inline JsonSchema:*
+```yaml
+format: json
+document:
+  type: object
+  properties:
+    id:
+      type: string
+    firstName:
+      type: string
+    lastName:
+      type: string
+  required: [ id, firstName, lastName ]
+```
+
+*Example of a JsonSchema based on an external resource:*
+```yaml
+format: json
+resource:
+  uri: https://test.com/fake/schema/json/document.json
+```
+
+### Timeout
+
+Defines a workflow or task timeout.
+
+#### Properties
+
+| Property | Type | Required | Description |
+|----------|:----:|:--------:|-------------|
+| after | [`duration`](#duration) | `yes` | The duration after which the workflow or task times out. |
+
+#### Examples
+
+```yaml
+document:
+  dsl: 0.10
+  namespace: default
+  name: sample
+do:
+  waitFor60Seconds:
+    wait:
+      seconds: 60
+timeout:
+  after:
+    seconds: 30
+```
+
+### Duration
+
+Defines a duration.
+
+#### Properties
+
+| Property | Type | Required | Description |
+|----------|:----:|:--------:|-------------|
+| days | `integer` | `no` | Number of days, if any. |
+| hours | `integer` | `no` | Number of hours, if any. |
+| minutes | `integer` | `no`| Number of minutes, if any. |
+| seconds | `integer` | `no`| Number of seconds, if any. |
+| milliseconds | `integer` | `no`| Number of milliseconds, if any. |
+
+#### Examples
+
+*Example of a duration of 2 hours, 15 minutes and 30 seconds:*
+```yaml
+hours: 2
+minutes: 15
+seconds: 30
+```
+
+### HTTP Response
+
+Describes an HTTP response.
+
+#### Properties
+
+| Property | Type | Required | Description |
+|----------|:----:|:--------:|-------------|
+| request | [`request`](#http-request) | `yes` | The HTTP request associated with the HTTP response. |
+| statusCode | `integer` | `yes` | The HTTP response status code. |
+| headers | `map[string, string]` | `no` | The HTTP response headers, if any. |
+| content | `any` | `no` | The HTTP response content, if any.
*If the response's content type supports deserialization (e.g. `application/json`), should contain the deserialized response content. Otherwise, should contain the base-64 encoded response content, if any.*|
+
+#### Examples
+
+```yaml
+request:
+  method: get
+  uri: https://petstore.swagger.io/v2/pet/1
+  headers:
+    Content-Type: application/json
+headers:
+  Content-Type: application/json
+statusCode: 200
+content:
+  id: 1
+  name: milou
+  status: pending
+```
+
+### HTTP Request
+
+Describes an HTTP request.
+
+#### Properties
+
+| Property | Type | Required | Description |
+|----------|:----:|:--------:|-------------|
+| method | `string` | `yes` | The request's method. |
+| uri | `uri` | `yes` | The request's URI. |
+| headers | `map[string, string]` | `no` | The HTTP request headers, if any. |
+
+#### Examples
+
+```yaml
+method: get
+uri: https://petstore.swagger.io/v2/pet/1
+headers:
+  Content-Type: application/json
+```
\ No newline at end of file
diff --git a/dsl.md b/dsl.md
new file mode 100644
index 00000000..c1513405
--- /dev/null
+++ b/dsl.md
@@ -0,0 +1,130 @@
+# Serverless Workflow DSL
+
+## Table of Contents
+
+- [Abstract](#abstract)
+- [Motivation](#motivation)
+- [Design](#design)
+- [Versioning](#versioning)
+
+## Abstract
+
+This document proposes the creation of a Domain Specific Language (DSL) called Serverless Workflow, tailored for building platform agnostic workflows.
+
+Serverless Workflow aims to simplify the orchestration of complex processes across diverse environments, providing developers with a unified syntax and set of tools for defining and executing serverless workflows.
+
+## Motivation
+
+Serverless computing has gained popularity for its ability to abstract away infrastructure management tasks, enabling developers to focus on application logic. However, orchestrating serverless workflows across multiple environments often involves dealing with disparate tools and platforms, leading to complexity and inefficiency.
+
+Serverless Workflow addresses this challenge by providing a DSL specifically designed for serverless workflow orchestration. By abstracting away the underlying infrastructure complexities and offering a modular and extensible framework, Serverless Workflow aims to streamline the development, deployment, and management of serverless workflows.
+
+## Design
+
+The Serverless Workflow DSL is crafted with a design philosophy that prioritizes clarity, expressiveness, and ease of use. Its foundation lies in linguistic fluency, emphasizing readability and comprehension. By adopting a fluent style, the DSL promotes intuitive understanding through natural language constructs. Verbs are employed in the imperative tense to denote actions, enhancing clarity and directness in expressing workflow logic. This imperative approach empowers developers to articulate their intentions succinctly and effectively.
+
+The DSL also embraces the principle of implicit default behaviors, sparing authors from unnecessary repetition and enhancing the conciseness of workflow definitions. For instance, default settings alleviate the burden of explicitly defining every detail, streamlining the workflow design process. Furthermore, the DSL allows both the inline declaration of components and the creation of reusable elements, granting flexibility in workflow composition. This flexibility allows developers to seamlessly integrate inline task definitions without imposing rigid structural requirements.
+
+Moreover, the DSL eschews strong-typed enumerations wherever feasible, fostering extensibility and adaptability across different runtime environments. While maintaining portability is crucial, the DSL prioritizes customization options for extensions and runtimes, enabling tailored implementations to suit diverse use cases. Additionally, the DSL favors universally understood terms over technical jargon, enhancing accessibility and comprehension for a broader audience.
+
+- Embrace linguistic fluency for enhanced readability and understanding.
+- Utilize imperative verbs to convey actions directly and clearly.
+- Employ implicit default behaviors to reduce redundancy and streamline workflow definitions.
+- Enable the declaration and effortless import of shared components by supporting external references.
+- Encourage the declaration of components inline for situations where reusability is unnecessary, prioritizing ease of use in such cases.
+- Prioritize flexibility over strong-typed enumerations for enhanced extensibility.
+- Opt for universally understood terms to improve accessibility and comprehension.
+
+## Versioning
+
+The versioning strategy for the Serverless Workflow DSL is structured to accommodate different types of changes introduced through pull requests (PRs).
+
+If a PR is labeled with `change: documentation`, indicating modifications related to improving or clarifying documentation, it does not trigger a version change.
+
+Conversely, if the PR addresses a fix, labeled as `change: fix`, it results in an increase in the patch version (0.0.x).
+A fix typically refers to corrections made to resolve bugs or errors in the DSL specification or its implementations, ensuring smoother functionality and reliability.
+
+Similarly, when a PR introduces a new feature, labeled as `change: feature`, it prompts an increase in the minor version (0.x.0).
+A feature denotes the addition of significant capabilities, enhancements, or functionalities that extend the DSL's capabilities or improve its usability.
+
+Lastly, if the PR is marked as `change: breaking`, indicating alterations that are incompatible with previous versions, it leads to an increase in the major version (x.0.0). A breaking change signifies modifications that necessitate adjustments in existing workflows or implementations, potentially impacting backward compatibility.
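+
+For instance, assuming a hypothetical current release of `1.2.3`, the label on a merged PR would map to the next version as follows:
+
+```yaml
+# Illustrative only; the version numbers are hypothetical
+- label: 'change: fix'       # patch bump:  1.2.3 -> 1.2.4
+- label: 'change: feature'   # minor bump:  1.2.3 -> 1.3.0
+- label: 'change: breaking'  # major bump:  1.2.3 -> 2.0.0
+```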
+ +This versioning strategy ensures clarity and transparency in tracking changes and communicating their impact on users and implementations. + +| Label | Version Change | Description | +|:-- |:---:|:---| +| `change: documentation` | - | Modifications related to documentation improvements or clarifications. | +| `change: fix` | `0.0.x` | Corrections made to resolve bugs or errors in the DSL specification or its implementations. | +| `change: feature` | `0.x.0` | Addition of significant capabilities, enhancements, or functionalities that extend the DSL's capabilities or improve its usability. | +| `change: breaking` | `x.0.0` | Alterations that are incompatible with previous versions, necessitating adjustments in existing workflows or implementations. | + +In addition to versioning changes denoted by labels in pull requests, pre-release versions will be suffixed with either `alphaX`, `betaX`, or `rcX` where `X` represents the pre-release version number (ex: `1.0.0-alpha1`). These pre-release versions are designated to indicate different stages of development and testing before the final release. + +- **Alpha versions** are the earliest stages of testing and development. They typically contain incomplete features and may have known issues. They are intended for a limited audience, such as internal testers or early adopters, for initial feedback and testing. + +- **Beta versions** represent a more stable state compared to alpha versions. They are released to a broader audience, allowing for wider testing and feedback collection. Beta versions may still contain bugs or issues, but they are generally considered to be closer to the final release state. + +- **Release Candidate (RC)** versions are the versions that are considered to be potentially ready for final release. They undergo rigorous testing to identify and resolve any remaining critical issues. RC versions are released to a wider audience for final validation before the official release. 
+ +These pre-release versions with appropriate suffixes provide transparency about the development stage and help users and testers understand the level of stability and readiness of each release candidate. + +## Concepts + +### Workflow + +A Serverless Workflow is a sequence of specific [tasks](#tasks) that are executed in a predetermined order. By default, this order follows the declaration sequence within the workflow definition. Workflows are designed to automate processes and orchestrate various serverless functions and services. + +Workflows can be triggered in different ways: upon request, scheduled using CRON expressions, or initiated upon correlation with specific events. + +Additionally, workflows may optionally accept inputs and produce outputs, allowing for data processing and transformation within the workflow execution. + +Workflows in the Serverless Workflow DSL can exist in several phases, each indicating the current state of the workflow execution. These phases include: + +| Phase | Description | +| --- | --- | +| `Pending` | The workflow has been initiated and is pending execution. | +| `Running` | The workflow is currently in progress. | +| `Suspended` | The workflow execution has been paused or halted temporarily. | +| `Cancelled` | The workflow execution has been terminated before completion. | +| `Faulted` | The workflow execution has encountered an error. | +| `Completed` | The workflow execution has successfully finished all tasks. | + +Additionally, the flow of execution within a workflow can be controlled using [directives*](#flow-directives), which provide instructions to the workflow engine on how to manage and handle specific aspects of workflow execution. 
**To learn more about flow directives and how they can be utilized to control the execution and behavior of workflows, please refer to [Flow Directives](#flow-directives).*
+
+
+#TODO:
++ Describe how workflow flows
++ Explain how data flows
++ Explain flow directives
++ Explain runtime expressions
++ Explain referenceable components
++ Explain tasks
+  - Call
+  - Execute
+  - Emit
+  - For
+  - Listen
+    + Listen to one
+    + Listen to all
+    + Listen to any
+    + Listen to any until
+  - Raise
+  - Run
+    + container
+    + script
+    + shell
+    + workflow
+  - Set
+  - Try
+  - Wait
++ Explain errors
++ Explain fault tolerance, retries and timeouts
++ Explain service interoperability
++ Explain custom functionality/processes
++ Explain extensions, with before/after
++ Explain authentication
++ Explain endpoints and external resources
\ No newline at end of file
diff --git a/examples/README.md b/examples/README.md
deleted file mode 100644
index 1d815d42..00000000
--- a/examples/README.md
+++ /dev/null
@@ -1,4946 +0,0 @@
-
-# Examples
-
-Provides Serverless Workflow language examples
-
-## Table of Contents
-
-- [Hello World](#Hello-World-Example)
-- [Greeting](#Greeting-Example)
-- [Event-based greeting (Event State)](#Event-Based-Greeting-Example)
-- [Solving Math Problems (ForEach state)](#Solving-Math-Problems-Example)
-- [Parallel Execution](#Parallel-Execution-Example)
-- [Async Function Invocation](#Async-Function-Invocation-Example)
-- [Async SubFlow Invocation](#Async-SubFlow-Invocation-Example)
-- [Event Based Transitions (Event-based Switch)](#Event-Based-Transitions-Example)
-- [Applicant Request Decision (Data-based Switch + SubFlows)](#Applicant-Request-Decision-Example)
-- [Provision Orders (Error Handling)](#Provision-Orders-Example)
-- [Monitor Job for completion (Polling)](#Monitor-Job-Example)
-- [Send CloudEvent on Workflow Completion](#Send-CloudEvent-On-Workflow-Completion-Example)
-- [Monitor Patient Vital Signs (Event
state)](#Monitor-Patient-Vital-Signs-Example) -- [Finalize College Application (Event state)](#Finalize-College-Application-Example) -- [Perform Customer Credit Check (Callback state)](#Perform-Customer-Credit-Check-Example) -- [Handle Car Auction Bids (Scheduled start Event state)](#Handle-Car-Auction-Bids-Example) -- [Check Inbox Periodically (Cron-based Workflow start)](#Check-Inbox-Periodically) -- [Event-based service invocation (Event triggered actions)](#Event-Based-Service-Invocation) -- [Reusing Function and Event Definitions](#Reusing-Function-And-Event-Definitions) -- [New Patient Onboarding (Error checking and Retries)](#New-Patient-Onboarding) -- [Purchase order deadline (ExecTimeout)](#Purchase-order-deadline) -- [Accumulate room readings and create timely reports (ExecTimeout and KeepActive)](#Accumulate-room-readings) -- [Car vitals checks (SubFlow Repeat)](#Car-Vitals-Checks) -- [Book Lending Workflow](#Book-Lending) -- [Filling a glass of water (Expression functions)](#Filling-a-glass-of-water) -- [Online Food Ordering](#Online-Food-Ordering) -- [Continuing as a new Execution](#Continuing-as-a-new-Execution) -- [Process Transactions (Foreach State with conditions)](#Process-Transactions) - -### Hello World Example - -#### Description - -In this simple example we use an [Inject State](../specification.md#Inject-State) to inject -`Hello World` in the states data (as the value of the 'result' property). -After the state execution completes, since it is an end state, its data output becomes the workflow -data output, which is: - -```json -{ - "result": "Hello World" -} -``` - -#### Workflow Diagram - -

-Hello World Example -

- -#### Workflow Definition - - - - - - - - - - -
JSONYAML
- -```json -{ - "id": "helloworld", - "version": "1.0.0", - "specVersion": "0.8", - "name": "Hello World Workflow", - "description": "Inject Hello World", - "start": "hello-state", - "states": [ - { - "name": "hello-state", - "type": "inject", - "data": { - "result": "Hello World!" - }, - "end": true - } - ] -}``` - - - -```yaml -id: helloworld -version: 1.0.0 -specVersion: "0.8" -name: Hello World Workflow -description: Inject Hello World -start: hello-state -states: - - name: hello-state - type: inject - data: - result: Hello World! - end: true -``` - -
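An inject state simply merges its static `data` object into the incoming state data. As a rough illustration of that semantic (the function name and merge behavior here are illustrative assumptions, not part of the specification):

```python
def run_inject_state(state_data: dict, inject: dict) -> dict:
    # An inject state merges its static "data" object into the state data;
    # on duplicate keys, the injected values win.
    return {**state_data, **inject}

# With empty workflow input, the injected value becomes the workflow output,
# since the inject state is also the end state.
output = run_inject_state({}, {"result": "Hello World!"})
```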
- -### Greeting Example - -#### Description - -This example shows a single [Operation State](../specification.md#operation-state) with one action that calls the "greeting" function. -The workflow data input is assumed to be the name of the person to greet: - -```json -{ - "person": { - "name": "John" - } -} -``` - -The result of the action is assumed to be the greeting for the provided person's name: - -```json -{ - "greeting": "Welcome to Serverless Workflow, John!" -} -``` - -This is added to the state's data and becomes the workflow data output. - -#### Workflow Diagram - -

-Greeting Example -

- -#### Workflow Definition - - - - - - - - - - -
JSONYAML
- -```json -{ - "id": "greeting", - "version": "1.0.0", - "specVersion": "0.8", - "name": "greeting-workflow", - "description": "Greet Someone", - "start": "greet", - "functions": [ - { - "name": "greeting-function", - "type": "openapi", - "operation": "file://myapis/greetingapis.json#greeting" - } - ], - "states": [ - { - "name": "greet", - "type": "operation", - "actions": [ - { - "name": "greet-action", - "functionRef": { - "refName": "greeting-function", - "arguments": { - "name": "${ .person.name }" - } - }, - "actionDataFilter": { - "results": "${ {greeting: .greeting} }" - } - } - ], - "end": true - } - ] -}``` - - - -```yaml -id: greeting -version: 1.0.0 -specVersion: "0.8" -name: greeting-workflow -description: Greet Someone -start: greet -functions: - - name: greeting-function - type: openapi - operation: file://myapis/greetingapis.json#greeting -states: - - name: greet - type: operation - actions: - - name: greet-action - functionRef: - refName: greeting-function - arguments: - name: ${ .person.name } - actionDataFilter: - results: "${ {greeting: .greeting} }" - end: true -``` - -
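Argument expressions such as `${ .person.name }` are jq expressions evaluated against the state data. A toy Python evaluator for the simple dotted-path case, purely to illustrate how the `arguments` object above is resolved (real runtimes delegate to jq; `eval_expr` is a hypothetical helper):

```python
def eval_expr(expr: str, data: dict):
    # Handles only plain "${ .a.b }" selection expressions, as an illustration;
    # real runtimes evaluate the full jq language here.
    path = expr.strip("${} ").lstrip(".")
    value = data
    for part in path.split("."):
        value = value[part]
    return value

# Resolve the greet-action arguments against the workflow data input.
args = {"name": eval_expr("${ .person.name }", {"person": {"name": "John"}})}
```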
- -### Event Based Greeting Example - -#### Description - -This example shows a single [Event State](../specification.md#event-state) with one action that calls the "greeting" function. -The event state consumes cloud events of type "greetingEventType". When an event with this type -is consumed, the Event state performs a single action that calls the defined "greeting" function. - -For the sake of the example we assume that the cloud event we will consume has the format: - -```json -{ - "specversion" : "1.0", - "type" : "greetingEventType", - "source" : "greetingEventSource", - "data" : { - "greet": { - "name": "John" - } - } -} -``` - -The result of the action is assumed to be the full greeting for the provided person's name: - -```json -{ - "payload": { - "greeting": "Welcome to Serverless Workflow, John!" - } -} -``` - -Note that the workflow definition contains two filters. The first, an event data filter, is defined inside the consume element: - -```json -{ - "eventDataFilter": { - "data": "${ .data.greet }" - } -} -``` - -It is applied when the greeting event is consumed: it extracts the "data.greet" property of the event data (payload) and -merges it with the state data. - -The second, a state data filter, is defined on the event state itself: - -```json -{ - "stateDataFilter": { - "output": "${ .payload.greeting }" - } -} -``` - -It filters what is selected as the state data output, which then becomes the workflow data output (as it is an end state): - -```text - "Welcome to Serverless Workflow, John!" -``` - -#### Workflow Diagram - -

-Event Based Greeting Example -

- -#### Workflow Definition - - - - - - - - - - -
JSONYAML
- -```json -{ - "id": "eventbasedgreeting", - "version": "1.0.0", - "specVersion": "0.8", - "name": "Event Based Greeting Workflow", - "description": "Event Based Greeting", - "start": "greet", - "events": [ - { - "name": "greeting-event", - "type": "greetingEventType", - "source": "greetingEventSource" - } - ], - "functions": [ - { - "name": "greeting-function", - "operation": "file://myapis/greetingapis.json#greeting" - } - ], - "states": [ - { - "name": "greet", - "type": "event", - "onEvents": [ - { - "eventRefs": [ - "greeting-event" - ], - "eventDataFilter": { - "data": "${ .greet }", - "toStateData": "${ .greet }" - }, - "actions": [ - { - "name": "greet-action", - "functionRef": { - "refName": "greeting-function", - "arguments": { - "name": "${ .greet.name }" - } - } - } - ] - } - ], - "stateDataFilter": { - "output": "${ .payload.greeting }" - }, - "end": true - } - ] -}``` - - - -```yaml -id: eventbasedgreeting -version: 1.0.0 -specVersion: "0.8" -name: Event Based Greeting Workflow -description: Event Based Greeting -start: greet -events: - - name: greeting-event - type: greetingEventType - source: greetingEventSource -functions: - - name: greeting-function - operation: file://myapis/greetingapis.json#greeting -states: - - name: greet - type: event - onEvents: - - eventRefs: - - greeting-event - eventDataFilter: - data: ${ .greet } - toStateData: ${ .greet } - actions: - - name: greet-action - functionRef: - refName: greeting-function - arguments: - name: ${ .greet.name } - stateDataFilter: - output: ${ .payload.greeting } - end: true -``` - -
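The two filters form a small pipeline: event payload in, filtered state data through the action, filtered output out. A minimal Python sketch of that pipeline, assuming a `greeting_fn` stand-in for the greeting function (the helper names are illustrative):

```python
def apply_event_state(event_payload: dict, greeting_fn) -> str:
    # eventDataFilter: select ".greet" from the event data and merge it
    # into the (initially empty) state data.
    state_data = {"greet": event_payload["data"]["greet"]}
    # The action calls the greeting function with the filtered data.
    state_data["payload"] = {"greeting": greeting_fn(state_data["greet"]["name"])}
    # stateDataFilter output ".payload.greeting" becomes the workflow output.
    return state_data["payload"]["greeting"]

out = apply_event_state(
    {"data": {"greet": {"name": "John"}}},
    lambda name: f"Welcome to Serverless Workflow, {name}!",
)
```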
- -### Solving Math Problems Example - -#### Description - -In this example we show how to iterate over data using the [ForEach State](../specification.md#foreach-state). -The state will iterate over a collection of simple math expressions which are -passed in as the workflow data input: - -```json - { - "expressions": ["2+2", "4-1", "10x3", "20/2"] - } -``` - -The ForEach state will execute its single defined action for each math expression. The action -calls a serverless function that solves the expression -and returns its result. - -The results of all math expressions are accumulated into the data output of the ForEach state, which becomes the final -result of the workflow execution. - -#### Workflow Diagram - -

-Looping Example -

- -#### Workflow Definition - - - - - - - - - - -
JSONYAML
- -```json -{ - "id": "solvemathproblems", - "version": "1.0.0", - "specVersion": "0.8", - "name": "solve-math-problems-workflow", - "description": "Solve math problems", - "start": "solve", - "functions": [ - { - "name": "solve-math-exp-func", - "operation": "http://myapis.org/mapthapis.json#solveExpression" - } - ], - "states": [ - { - "name": "solve", - "type": "foreach", - "inputCollection": "${ .expressions }", - "iterationParam": "singleexpression", - "outputCollection": "${ .results }", - "actions": [ - { - "name": "solve-action", - "functionRef": { - "refName": "solve-math-exp-func", - "arguments": { - "expression": "${ .singleexpression }" - } - } - } - ], - "stateDataFilter": { - "output": "${ .results }" - }, - "end": true - } - ] -}``` - - - -```yaml -id: solvemathproblems -version: 1.0.0 -specVersion: "0.8" -name: solve-math-problems-workflow -description: Solve math problems -start: solve -functions: - - name: solve-math-exp-func - operation: http://myapis.org/mapthapis.json#solveExpression -states: - - name: solve - type: foreach - inputCollection: ${ .expressions } - iterationParam: singleexpression - outputCollection: ${ .results } - actions: - - name: solve-action - functionRef: - refName: solve-math-exp-func - arguments: - expression: ${ .singleexpression } - stateDataFilter: - output: ${ .results } - end: true -``` - -
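The foreach semantics boil down to: select the items with `inputCollection`, run the action per item, and accumulate the results into `outputCollection`. A rough sketch of that loop, with a toy solver standing in for the serverless function (the `eval`-based solver is only for illustration):

```python
def run_foreach(state_data: dict, solve) -> list:
    # inputCollection "${ .expressions }" selects the items to iterate over.
    results = [solve(expr) for expr in state_data["expressions"]]
    # outputCollection "${ .results }" accumulates each iteration's result;
    # the stateDataFilter then selects ".results" as the workflow output.
    return results

# Toy solver: treat "x" as multiplication and evaluate the arithmetic.
out = run_foreach({"expressions": ["2+2", "4-1"]},
                  lambda e: eval(e.replace("x", "*")))
```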
- -### Parallel Execution Example - -#### Description - -This example uses a [Parallel State](../specification.md#parallel-state) to execute two branches (simple wait states) at the same time. -The completionType property is set to "allOf", which means the parallel state has to wait for both branches -to finish execution before it can transition (in this case, end workflow execution, as it is an end state). - -#### Workflow Diagram - -

-Parallel Example -

- -#### Workflow Definition - - - - - - - - - - -
JSONYAML
- -```json -{ - "id": "parallelexec", - "version": "1.0.0", - "specVersion": "0.8", - "name": "parallel-execution", - "description": "Executes two branches in parallel", - "start": "parallelexec", - "states": [ - { - "name": "parallelexec", - "type": "parallel", - "completionType": "allOf", - "branches": [ - { - "name": "short-delay-branch", - "actions": [ - { - "name": "short-delay-action", - "subFlowRef": "shortdelayworkflowid" - } - ] - }, - { - "name": "long-delay-branch", - "actions": [ - { - "name": "long-delay-action", - "subFlowRef": "longdelayworkflowid" - } - ] - } - ], - "end": true - } - ] -}``` - - - -```yaml -id: parallelexec -version: 1.0.0 -specVersion: "0.8" -name: parallel-execution -description: Executes two branches in parallel -start: parallelexec -states: - - name: parallelexec - type: parallel - completionType: allOf - branches: - - name: short-delay-branch - actions: - - name: short-delay-action - subFlowRef: shortdelayworkflowid - - name: long-delay-branch - actions: - - name: long-delay-action - subFlowRef: longdelayworkflowid - end: true -``` - -
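The "allOf" completion type maps naturally onto waiting for every concurrent task to finish. A small Python sketch of that behavior using the standard library (the delay functions and durations are illustrative stand-ins for the two sub-workflows):

```python
import time
from concurrent.futures import ALL_COMPLETED, ThreadPoolExecutor, wait

def short_delay() -> str:
    time.sleep(0.01)   # stand-in for shortdelayworkflowid
    return "long done" if False else "short done"

def long_delay() -> str:
    time.sleep(0.05)   # stand-in for longdelayworkflowid
    return "long done"

# completionType "allOf": the parallel state only transitions once
# every branch has finished, regardless of how long each one takes.
with ThreadPoolExecutor() as pool:
    futures = [pool.submit(short_delay), pool.submit(long_delay)]
    done, pending = wait(futures, return_when=ALL_COMPLETED)

results = sorted(f.result() for f in done)
```

A "oneOf" completion type would instead use `FIRST_COMPLETED` and transition as soon as a single branch finishes.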
- -We assume that the two referenced workflows, namely `shortdelayworkflowid` and `longdelayworkflowid`, both include a single delay state, -with the `shortdelayworkflowid` workflow's delay state defining its `timeDelay` property to be shorter than that of the `longdelayworkflowid` workflow's -delay state. - -### Async Function Invocation Example - -#### Description - -This example uses an [Operation State](../specification.md#operation-state) to invoke a function asynchronously. -The function sends an email to a customer. -Async function execution is a "fire-and-forget" type of invocation: the function is invoked and workflow execution -does not wait for its results. - -#### Workflow Diagram - -

-Async Function Example -

- -#### Workflow Definition - - - - - - - - - - -
JSONYAML
- -```json -{ - "id": "sendcustomeremail", - "version": "1.0.0", - "specVersion": "0.8", - "name": "send-customer-email-workflow", - "description": "Send email to a customer", - "start": "send-email", - "functions": [ - { - "name": "email-function", - "operation": "file://myapis/emailapis.json#sendEmail" - } - ], - "states": [ - { - "name": "send-email", - "type": "operation", - "actions": [ - { - "name": "send-email-action", - "functionRef": { - "invoke": "async", - "refName": "email-function", - "arguments": { - "customer": "${ .customer }" - } - } - } - ], - "end": true - } - ] -}``` - - - -```yaml -id: sendcustomeremail -version: 1.0.0 -specVersion: "0.8" -name: send-customer-email-workflow -description: Send email to a customer -start: send-email -functions: - - name: email-function - operation: file://myapis/emailapis.json#sendEmail -states: - - name: send-email - type: operation - actions: - - name: send-email-action - functionRef: - invoke: async - refName: email-function - arguments: - customer: ${ .customer } - end: true -``` - -
- -### Async SubFlow Invocation Example - -#### Description - -This example uses an [Operation State](../specification.md#operation-state) to invoke a [SubFlow](../specification.md#Subflow-Action) asynchronously. -This SubFlow is responsible for performing some customer business logic. -Async SubFlow invocation is a "fire-and-forget" type of invocation: the SubFlow is invoked and workflow execution -does not wait for its results. In addition, we specify that the SubFlow should be allowed to continue its execution -even if the parent workflow completes its own execution. This is done by setting the action's `onParentComplete` -property to `continue`. - -#### Workflow Diagram - -

-Async SubFlow Example -

- -#### Workflow Definition - - - - - - - - - - -
JSONYAML
- -```json -{ - "id": "onboardcustomer", - "version": "1.0.0", - "specVersion": "0.8", - "name": "onboard-customer", - "description": "Onboard a Customer", - "start": "onboard", - "states": [ - { - "name": "onboard", - "type": "operation", - "actions": [ - { - "name": "onboard-action", - "subFlowRef": { - "invoke": "async", - "onParentComplete": "continue", - "workflowId": "customeronboardingworkflow", - "version": "1.0.0" - } - } - ], - "end": true - } - ] -}``` - - - -```yaml -id: onboardcustomer -version: 1.0.0 -specVersion: "0.8" -name: onboard-customer -description: Onboard a Customer -start: onboard -states: - - name: onboard - type: operation - actions: - - name: onboard-action - subFlowRef: - invoke: async - onParentComplete: continue - workflowId: customeronboardingworkflow - version: 1.0.0 - end: true -``` - -
- -For the sake of the example, the definition of the "customeronboardingworkflow" workflow invoked as a SubFlow -is not shown. - -### Event Based Transitions Example - -#### Description - -In this example we use an event-based [Switch State](../specification.md#switch-state) to wait for the arrival -of either the "VisaApproved" or the "VisaRejected" Cloud Event. Depending on which event arrives, -the workflow performs a different transition. If neither event arrives within the defined 1 hour timeout -period, the workflow transitions to the "handle-no-visa-decision" state. - -#### Workflow Diagram - -

-Event Based Switch Example -

- -#### Workflow Definition - - - - - - - - - - -
JSONYAML
- -```json -{ - "id": "eventbasedswitchstate", - "version": "1.0.0", - "specVersion": "0.8", - "name": "event-based-switch-transitions", - "description": "Event Based Switch Transitions", - "start": "checkvisastatus", - "events": [ - { - "name": "visa-approved-event", - "type": "VisaApproved", - "source": "visaCheckSource" - }, - { - "name": "visa-rejected-event", - "type": "VisaRejected", - "source": "visaCheckSource" - } - ], - "states": [ - { - "name": "checkvisastatus", - "type": "switch", - "eventConditions": [ - { - "eventRef": "visa-approved-event", - "transition": "handle-approved-visa", - "name": "approved-condition" - }, - { - "eventRef": "visa-rejected-event", - "transition": "handle-rejected-visa", - "name": "rejected-condition" - } - ], - "timeouts": { - "eventTimeout": "PT1H" - }, - "defaultCondition": { - "transition": "handle-no-visa-decision" - } - }, - { - "name": "handle-approved-visa", - "type": "operation", - "actions": [ - { - "name": "handle-approved-action", - "subFlowRef": "handleApprovedVisaWorkflowID" - } - ], - "end": true - }, - { - "name": "handle-rejected-visa", - "type": "operation", - "actions": [ - { - "name": "handle-rejected-action", - "subFlowRef": "handleRejectedVisaWorkflowID" - } - ], - "end": true - }, - { - "name": "handle-no-visa-decision", - "type": "operation", - "actions": [ - { - "name": "handle-novisa-action", - "subFlowRef": "handleNoVisaDecisionWorkflowId" - } - ], - "end": true - } - ] -}``` - - - -```yaml -id: eventbasedswitchstate -version: 1.0.0 -specVersion: "0.8" -name: event-based-switch-transitions -description: Event Based Switch Transitions -start: checkvisastatus -events: - - name: visa-approved-event - type: VisaApproved - source: visaCheckSource - - name: visa-rejected-event - type: VisaRejected - source: visaCheckSource -states: - - name: checkvisastatus - type: switch - eventConditions: - - eventRef: visa-approved-event - transition: handle-approved-visa - name: approved-condition - - eventRef: 
visa-rejected-event - transition: handle-rejected-visa - name: rejected-condition - timeouts: - eventTimeout: PT1H - defaultCondition: - transition: handle-no-visa-decision - - name: handle-approved-visa - type: operation - actions: - - name: handle-approved-action - subFlowRef: handleApprovedVisaWorkflowID - end: true - - name: handle-rejected-visa - type: operation - actions: - - name: handle-rejected-action - subFlowRef: handleRejectedVisaWorkflowID - end: true - - name: handle-no-visa-decision - type: operation - actions: - - name: handle-novisa-action - subFlowRef: handleNoVisaDecisionWorkflowId - end: true -``` - -
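The event-based switch above blocks until one of the referenced events arrives or the `eventTimeout` elapses, then picks a transition. A rough Python sketch of that selection logic, using a `queue.Queue` as a stand-in for the event source (the shortened timeout and helper name are illustrative):

```python
import queue

def check_visa_status(events: "queue.Queue[str]", timeout_s: float) -> str:
    # Wait for the first matching event; eventTimeout ("PT1H" in the
    # example, shortened here) bounds how long the switch state blocks.
    try:
        event_type = events.get(timeout=timeout_s)
    except queue.Empty:
        return "handle-no-visa-decision"   # defaultCondition transition
    return {
        "VisaApproved": "handle-approved-visa",
        "VisaRejected": "handle-rejected-visa",
    }.get(event_type, "handle-no-visa-decision")

q = queue.Queue()
q.put("VisaApproved")
next_state = check_visa_status(q, timeout_s=0.01)
# No event arrives: the default condition fires after the timeout.
empty_next = check_visa_status(queue.Queue(), timeout_s=0.01)
```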
- -### Applicant Request Decision Example - -#### Description - -This example shows off the [Switch State](../specification.md#switch-state) and the subflow action. The workflow is started with application information data as input: - -```json - { - "applicant": { - "fname": "John", - "lname": "Stockton", - "age": 22, - "email": "js@something.com" - } - } -``` - -We use the switch state with two conditions to determine whether the application should be made, based on the applicant's age. -If the applicant is 18 or older we start the application (subflow action). Otherwise the workflow notifies the - applicant of the rejection. - -#### Workflow Diagram - -

-Switch State Example -

- -#### Workflow Definition - - - - - - - - - - -
JSONYAML
- -```json -{ - "id": "applicantrequest", - "version": "1.0.0", - "specVersion": "0.8", - "name": "applicant-request-decision-workflow", - "description": "Determine if applicant request is valid", - "start": "check-application", - "functions": [ - { - "name": "send-rejection-email-function", - "operation": "http://myapis.org/applicationapi.json#emailRejection" - } - ], - "states": [ - { - "name": "check-application", - "type": "switch", - "dataConditions": [ - { - "condition": "${ .applicants | .age >= 18 }", - "transition": "start-application", - "name": "adult-condition" - }, - { - "condition": "${ .applicants | .age < 18 }", - "transition": "reject-application", - "name": "minor-condition" - } - ], - "defaultCondition": { - "transition": "reject-application" - } - }, - { - "name": "start-application", - "type": "operation", - "actions": [ - { - "name": "start-app-action", - "subFlowRef": "startApplicationWorkflowId" - } - ], - "end": true - }, - { - "name": "reject-application", - "type": "operation", - "actionMode": "sequential", - "actions": [ - { - "name": "send-reject-action", - "functionRef": { - "refName": "send-rejection-email-function", - "arguments": { - "applicant": "${ .applicant }" - } - } - } - ], - "end": true - } - ] -}``` - - - -```yaml -id: applicantrequest -version: 1.0.0 -specVersion: "0.8" -name: applicant-request-decision-workflow -description: Determine if applicant request is valid -start: check-application -functions: - - name: send-rejection-email-function - operation: http://myapis.org/applicationapi.json#emailRejection -states: - - name: check-application - type: switch - dataConditions: - - condition: ${ .applicants | .age >= 18 } - transition: start-application - name: adult-condition - - condition: ${ .applicants | .age < 18 } - transition: reject-application - name: minor-condition - defaultCondition: - transition: reject-application - - name: start-application - type: operation - actions: - - name: start-app-action - subFlowRef: 
startApplicationWorkflowId - end: true - - name: reject-application - type: operation - actionMode: sequential - actions: - - name: send-reject-action - functionRef: - refName: send-rejection-email-function - arguments: - applicant: ${ .applicant } - end: true -``` - -
- -### Provision Orders Example - -#### Description - -In this example we show off the state's error handling capabilities. The workflow data input that's passed in contains -missing order information, which causes the function in the "provision-order" state to throw a runtime exception. With the "onErrors" definition we -can transition the workflow to different error handling states. Each type of error -in this example is handled by its own state. If no errors are encountered the workflow can transition to the "apply-order" state. - -Workflow data is assumed to be: - -```json - { - "order": { - "id": "", - "item": "laptop", - "quantity": "10" - } - } -``` - -The data output of the workflow contains the information of the exception caught during workflow execution. - -#### Workflow Diagram - -

-Handle Errors Example -

- -#### Workflow Definition - - - - - - - - - - -
JSONYAML
- -```json -{ - "id": "provisionorders", - "version": "1.0.0", - "specVersion": "0.8", - "name": "provision-orders", - "description": "Provision Orders and handle errors thrown", - "start": "provision-order", - "functions": [ - { - "name": "provision-order-function", - "operation": "http://myapis.org/provisioningapi.json#doProvision" - } - ], - "errors": [ - { - "name": "missing-order-id" - }, - { - "name": "missing-order-item" - }, - { - "name": "missing-order-quantity" - } - ], - "states": [ - { - "name": "provision-order", - "type": "operation", - "actionMode": "sequential", - "actions": [ - { - "name": "provision-action", - "functionRef": { - "refName": "provision-order-function", - "arguments": { - "order": "${ .order }" - } - } - } - ], - "stateDataFilter": { - "output": "${ .exceptions }" - }, - "transition": "apply-order", - "onErrors": [ - { - "errorRef": "missing-order-id", - "transition": "missing-id" - }, - { - "errorRef": "missing-order-item", - "transition": "missing-item" - }, - { - "errorRef": "missing-order-quantity", - "transition": "missing-quantity" - } - ] - }, - { - "name": "missing-id", - "type": "operation", - "actions": [ - { - "name": "missing-action", - "subFlowRef": "handleMissingIdExceptionWorkflow" - } - ], - "end": true - }, - { - "name": "missing-item", - "type": "operation", - "actions": [ - { - "name": "missing-item", - "subFlowRef": "handleMissingItemExceptionWorkflow" - } - ], - "end": true - }, - { - "name": "missing-quantity", - "type": "operation", - "actions": [ - { - "name": "missing-quantity", - "subFlowRef": "handleMissingQuantityExceptionWorkflow" - } - ], - "end": true - }, - { - "name": "apply-order", - "type": "operation", - "actions": [ - { - "name": "apply-order", - "subFlowRef": "applyOrderWorkflowId" - } - ], - "end": true - } - ] -}``` - - - -```yaml -id: provisionorders -version: 1.0.0 -specVersion: "0.8" -name: provision-orders -description: Provision Orders and handle errors thrown -start: provision-order 
-functions: - - name: provision-order-function - operation: http://myapis.org/provisioningapi.json#doProvision -errors: - - name: missing-order-id - - name: missing-order-item - - name: missing-order-quantity -states: - - name: provision-order - type: operation - actionMode: sequential - actions: - - name: provision-action - functionRef: - refName: provision-order-function - arguments: - order: ${ .order } - stateDataFilter: - output: ${ .exceptions } - transition: apply-order - onErrors: - - errorRef: missing-order-id - transition: missing-id - - errorRef: missing-order-item - transition: missing-item - - errorRef: missing-order-quantity - transition: missing-quantity - - name: missing-id - type: operation - actions: - - name: missing-action - subFlowRef: handleMissingIdExceptionWorkflow - end: true - - name: missing-item - type: operation - actions: - - name: missing-item - subFlowRef: handleMissingItemExceptionWorkflow - end: true - - name: missing-quantity - type: operation - actions: - - name: missing-quantity - subFlowRef: handleMissingQuantityExceptionWorkflow - end: true - - name: apply-order - type: operation - actions: - - name: apply-order - subFlowRef: applyOrderWorkflowId - end: true -``` - -
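The `onErrors` list behaves much like a set of typed exception handlers: each `errorRef` maps a raised error to a transition target. A minimal Python sketch of that mapping, with hypothetical error classes and a toy `provision` function standing in for the provisioning API:

```python
class MissingOrderId(Exception): ...
class MissingOrderItem(Exception): ...

def provision(order: dict) -> str:
    # Hypothetical provisioning call that raises typed errors.
    if not order.get("id"):
        raise MissingOrderId
    if not order.get("item"):
        raise MissingOrderItem
    return "apply-order"           # normal "transition" target

def run_provision_state(order: dict) -> str:
    # onErrors: each errorRef maps a raised error to a transition target.
    handlers = {MissingOrderId: "missing-id", MissingOrderItem: "missing-item"}
    try:
        return provision(order)
    except tuple(handlers) as err:
        return handlers[type(err)]

next_state = run_provision_state({"id": "", "item": "laptop"})
```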
- -### Monitor Job Example - -#### Description - -In this example we submit a job via an operation state action (serverless function call). It is assumed that it takes some time for -the submitted job to complete and that its completion can be checked via another, separate serverless function call. - -To check for completion we first wait 5 seconds and then get the results of the "check-job-status" serverless function. -Depending on the results we either report them or transition back to waiting and checking for job completion. -This is repeated until the job status is "SUCCEEDED" or "FAILED", and the job results are reported before the workflow -finishes execution. - -In case the job submission raises a runtime error, we transition to an operation state which invokes - a sub-flow responsible for handling the job submission issue. - -#### Workflow Diagram - -

-Job Monitoring Example -

- -#### Workflow Definition - - - - - - - - - - -
JSONYAML
- -```json -{ - "id": "jobmonitoring", - "version": "1.0.0", - "specVersion": "0.8", - "name": "jobmonitoring", - "description": "Monitor finished execution of a submitted job", - "start": "submit-job", - "functions": [ - { - "name": "submit-job", - "operation": "http://myapis.org/monitorapi.json#doSubmit" - }, - { - "name": "check-job-status", - "operation": "http://myapis.org/monitorapi.json#checkStatus" - }, - { - "name": "report-job-suceeded", - "operation": "http://myapis.org/monitorapi.json#reportSucceeded" - }, - { - "name": "report-job-failed", - "operation": "http://myapis.org/monitorapi.json#reportFailure" - } - ], - "states": [ - { - "name": "submit-job", - "type": "operation", - "actionMode": "sequential", - "actions": [ - { - "name": "submit-job", - "functionRef": { - "refName": "submit-job", - "arguments": { - "name": "${ .job.name }" - } - }, - "actionDataFilter": { - "results": "${ .jobuid }" - } - } - ], - "stateDataFilter": { - "output": "${ .jobuid }" - }, - "transition": "wait-for-completion" - }, - { - "name": "wait-for-completion", - "type": "sleep", - "duration": "PT5S", - "transition": "get-job-status" - }, - { - "name": "get-job-status", - "type": "operation", - "actionMode": "sequential", - "actions": [ - { - "name": "get-job-status", - "functionRef": { - "refName": "check-job-status", - "arguments": { - "name": "${ .jobuid }" - } - }, - "actionDataFilter": { - "results": "${ .jobstatus }" - } - } - ], - "stateDataFilter": { - "output": "${ .jobstatus }" - }, - "transition": "determine-completion" - }, - { - "name": "determine-completion", - "type": "switch", - "dataConditions": [ - { - "condition": "${ .jobStatus == \"SUCCEEDED\" }", - "transition": "job-succeeded", - "name": "succeed" - }, - { - "condition": "${ .jobStatus == \"FAILED\" }", - "transition": "job-failed", - "name": "failed" - } - ], - "defaultCondition": { - "transition": "wait-for-completion" - } - }, - { - "name": "job-succeeded", - "type": "operation", - "actionMode": 
"sequential", - "actions": [ - { - "name": "job-succeeded", - "functionRef": { - "refName": "report-job-suceeded", - "arguments": { - "name": "${ .jobuid }" - } - } - } - ], - "end": true - }, - { - "name": "job-failed", - "type": "operation", - "actionMode": "sequential", - "actions": [ - { - "name": "job-failed", - "functionRef": { - "refName": "report-job-failed", - "arguments": { - "name": "${ .jobuid }" - } - } - } - ], - "end": true - } - ] -}``` - - - -```yaml -id: jobmonitoring -version: 1.0.0 -specVersion: "0.8" -name: jobmonitoring -description: Monitor finished execution of a submitted job -start: submit-job -functions: - - name: submit-job - operation: http://myapis.org/monitorapi.json#doSubmit - - name: check-job-status - operation: http://myapis.org/monitorapi.json#checkStatus - - name: report-job-suceeded - operation: http://myapis.org/monitorapi.json#reportSucceeded - - name: report-job-failed - operation: http://myapis.org/monitorapi.json#reportFailure -states: - - name: submit-job - type: operation - actionMode: sequential - actions: - - name: submit-job - functionRef: - refName: submit-job - arguments: - name: ${ .job.name } - actionDataFilter: - results: ${ .jobuid } - stateDataFilter: - output: ${ .jobuid } - transition: wait-for-completion - - name: wait-for-completion - type: sleep - duration: PT5S - transition: get-job-status - - name: get-job-status - type: operation - actionMode: sequential - actions: - - name: get-job-status - functionRef: - refName: check-job-status - arguments: - name: ${ .jobuid } - actionDataFilter: - results: ${ .jobstatus } - stateDataFilter: - output: ${ .jobstatus } - transition: determine-completion - - name: determine-completion - type: switch - dataConditions: - - condition: ${ .jobStatus == "SUCCEEDED" } - transition: job-succeeded - name: succeed - - condition: ${ .jobStatus == "FAILED" } - transition: job-failed - name: failed - defaultCondition: - transition: wait-for-completion - - name: job-succeeded - 
type: operation - actionMode: sequential - actions: - - name: job-succeeded - functionRef: - refName: report-job-suceeded - arguments: - name: ${ .jobuid } - end: true - - name: job-failed - type: operation - actionMode: sequential - actions: - - name: job-failed - functionRef: - refName: report-job-failed - arguments: - name: ${ .jobuid } - end: true -``` - -
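The submit/sleep/check/switch cycle above is a classic polling loop. A compact Python sketch of that control flow, with stub callables standing in for the submit and status-check functions (names and the shortened poll interval are illustrative):

```python
import time

def monitor_job(submit, check_status, poll_s: float = 0.01) -> str:
    # submit-job -> wait-for-completion (sleep) -> get-job-status -> switch,
    # looping back to the sleep state until a terminal status is reached.
    jobuid = submit()
    while True:
        time.sleep(poll_s)          # "PT5S" in the example, shortened here
        status = check_status(jobuid)
        if status in ("SUCCEEDED", "FAILED"):
            return status           # transition to job-succeeded / job-failed

# Stub job: still running on the first two checks, then done.
statuses = iter(["RUNNING", "RUNNING", "SUCCEEDED"])
result = monitor_job(lambda: "job-1", lambda uid: next(statuses))
```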
- -### Send CloudEvent On Workflow Completion Example - -#### Description - -This example shows how we can produce a CloudEvent on completion of a workflow. Let's say we have the following -workflow data containing orders that need to be provisioned by our workflow: - -```json -{ - "orders": [{ - "id": "123", - "item": "laptop", - "quantity": "10" - }, - { - "id": "456", - "item": "desktop", - "quantity": "4" - }] -} -``` - -Our workflow in this example uses a ForEach state to provision the orders in parallel. The "provision-order-function" function -used is assumed to have the following results: - -```json -{ - "id": "123", - "outcome": "SUCCESS" -} -``` - -After the orders have been provisioned, the ForEach state defines the end property, which stops workflow execution. -Its end definition is of type "event", in which case a CloudEvent will be produced that can be consumed -by other orchestration workflows or other interested consumers. - -Note that we define the event to be produced in the workflow's "events" property. - -The data attached to the event contains the information on the orders provisioned by this workflow. So the produced -CloudEvent upon completion of the workflow could look like: - -```json -{ - "specversion" : "1.0", - "type" : "provisionCompleteType", - "datacontenttype" : "application/json", - ... - "data": { - "provisionedOrders": [ - { - "id": "123", - "outcome": "SUCCESS" - }, - { - "id": "456", - "outcome": "FAILURE" - } - ] - } -} -``` - -#### Workflow Diagram - -

-Send CloudEvent on Workflow Completion Example -

- -#### Workflow Definition - - - - - - - - - - -
JSONYAML
- -```json -{ - "id": "sendcloudeventonprovision", - "version": "1.0.0", - "specVersion": "0.8", - "name": "sendcloudeventonprovision", - "start": "provision-orders-state", - "events": [ - { - "name": "provisioning-complete-event", - "type": "provisionCompleteType", - "kind": "produced" - } - ], - "functions": [ - { - "name": "provision-order-function", - "operation": "http://myapis.org/provisioning.json#doProvision" - } - ], - "states": [ - { - "name": "provision-orders-state", - "type": "foreach", - "inputCollection": "${ .orders }", - "iterationParam": "singleorder", - "outputCollection": "${ .provisionedOrders }", - "actions": [ - { - "name": "provision-order-function", - "functionRef": { - "refName": "provision-order-function", - "arguments": { - "order": "${ .singleorder }" - } - } - } - ], - "end": { - "produceEvents": [ - { - "eventRef": "provisioning-complete-event", - "data": "${ .provisionedOrders }" - } - ] - } - } - ] -}``` - - - -```yaml -id: sendcloudeventonprovision -version: 1.0.0 -specVersion: "0.8" -name: sendcloudeventonprovision -start: provision-orders-state -events: - - name: provisioning-complete-event - type: provisionCompleteType - kind: produced -functions: - - name: provision-order-function - operation: http://myapis.org/provisioning.json#doProvision -states: - - name: provision-orders-state - type: foreach - inputCollection: ${ .orders } - iterationParam: singleorder - outputCollection: ${ .provisionedOrders } - actions: - - name: provision-order-function - functionRef: - refName: provision-order-function - arguments: - order: ${ .singleorder } - end: - produceEvents: - - eventRef: provisioning-complete-event - data: ${ .provisionedOrders } -``` - -
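When the end definition's `produceEvents` fires, the runtime wraps the selected data (`${ .provisionedOrders }`) in a CloudEvent envelope. A small sketch of building that envelope, assuming standard CloudEvents attributes (the `source` value and function name are illustrative, not taken from the workflow definition):

```python
import uuid

def produce_completion_event(provisioned_orders: list) -> dict:
    # Attribute names follow the CloudEvents spec; "type" matches the
    # workflow's "provisioning-complete-event" definition.
    return {
        "specversion": "1.0",
        "type": "provisionCompleteType",
        "source": "sendcloudeventonprovision",  # illustrative source value
        "id": str(uuid.uuid4()),
        "datacontenttype": "application/json",
        # "data" carries the result selected by "${ .provisionedOrders }".
        "data": {"provisionedOrders": provisioned_orders},
    }

event = produce_completion_event([{"id": "123", "outcome": "SUCCESS"}])
```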
- -### Monitor Patient Vital Signs Example - -#### Description - -In this example a hospital patient is monitored by a Vital Sign Monitoring system. This device can produce three different Cloud Events, namely -"High Body Temperature", "High Blood Pressure", and "High Respiration Rate". -Our workflow must take the proper action depending on which event the Vital Sign Monitor produces, and must start -if any of these events occur. For each of these events a new instance of the workflow is started. - -Since the hospital may monitor many patients, it is assumed that all events include a patientId context attribute in the event - message. We can use the value of this context attribute to associate incoming events with the same patient, as well as - pass the patient id as a parameter to the functions called by event activities. Here is an example of such an event: - -```json -{ - "specversion" : "1.0", - "type" : "org.monitor.highBodyTemp", - "source" : "monitoringSource", - "subject" : "BodyTemperatureReading", - "id" : "A234-1234-1234", - "time" : "2020-01-05T17:31:00Z", - "patientId" : "PID-12345", - "data" : { - "value": "98.6F" - } -} -``` - -As you can see, the "patientId" context attribute of the event carries our correlation key, the unique -patient id. If we set it as the correlation key in our events definition, only events with a matching -patient id are considered. - -#### Workflow Diagram - -<p align="center">

-Monitor Patient Vital Signs Example -

- -#### Workflow Definition - - - - - - - - - - -
JSONYAML
- -```json -{ - "id": "patientVitalsWorkflow", - "name": "patientVitalsWorkflow", - "version": "1.0.0", - "specVersion": "0.8", - "start": "monitor-vitals", - "events": [ - { - "name": "high-body-temperature", - "type": "org.monitor.highBodyTemp", - "source": "monitoringSource", - "correlation": [ - { - "contextAttributeName": "patientId" - } - ] - }, - { - "name": "high-blood-pressure", - "type": "org.monitor.highBloodPressure", - "source": "monitoringSource", - "correlation": [ - { - "contextAttributeName": "patientId" - } - ] - }, - { - "name": "high-respiration-rate", - "type": "org.monitor.highRespirationRate", - "source": "monitoringSource", - "correlation": [ - { - "contextAttributeName": "patientId" - } - ] - } - ], - "functions": [ - { - "name": "call-pulmonologist", - "operation": "http://myapis.org/patientapis.json#callPulmonologist" - }, - { - "name": "send-tylenol-order", - "operation": "http://myapis.org/patientapis.json#tylenolOrder" - }, - { - "name": "call-nurse", - "operation": "http://myapis.org/patientapis.json#callNurse" - } - ], - "states": [ - { - "name": "monitor-vitals", - "type": "event", - "exclusive": true, - "onEvents": [ - { - "eventRefs": [ - "high-body-temperature" - ], - "actions": [ - { - "name": "send-tylenol-order", - "functionRef": { - "refName": "send-tylenol-order", - "arguments": { - "patientid": "${ .patientId }" - } - } - } - ] - }, - { - "eventRefs": [ - "high-blood-pressure" - ], - "actions": [ - { - "name": "call-nurse", - "functionRef": { - "refName": "call-nurse", - "arguments": { - "patientid": "${ .patientId }" - } - } - } - ] - }, - { - "eventRefs": [ - "high-respiration-rate" - ], - "actions": [ - { - "name": "call-pulmonologist", - "functionRef": { - "refName": "call-pulmonologist", - "arguments": { - "patientid": "${ .patientId }" - } - } - } - ] - } - ], - "end": { - "terminate": true - } - } - ] -}``` - - - -```yaml -id: patientVitalsWorkflow -name: patientVitalsWorkflow -version: 1.0.0 -specVersion: "0.8" 
-start: monitor-vitals -events: - - name: high-body-temperature - type: org.monitor.highBodyTemp - source: monitoringSource - correlation: - - contextAttributeName: patientId - - name: high-blood-pressure - type: org.monitor.highBloodPressure - source: monitoringSource - correlation: - - contextAttributeName: patientId - - name: high-respiration-rate - type: org.monitor.highRespirationRate - source: monitoringSource - correlation: - - contextAttributeName: patientId -functions: - - name: call-pulmonologist - operation: http://myapis.org/patientapis.json#callPulmonologist - - name: send-tylenol-order - operation: http://myapis.org/patientapis.json#tylenolOrder - - name: call-nurse - operation: http://myapis.org/patientapis.json#callNurse -states: - - name: monitor-vitals - type: event - exclusive: true - onEvents: - - eventRefs: - - high-body-temperature - actions: - - name: send-tylenol-order - functionRef: - refName: send-tylenol-order - arguments: - patientid: ${ .patientId } - - eventRefs: - - high-blood-pressure - actions: - - name: call-nurse - functionRef: - refName: call-nurse - arguments: - patientid: ${ .patientId } - - eventRefs: - - high-respiration-rate - actions: - - name: call-pulmonologist - functionRef: - refName: call-pulmonologist - arguments: - patientid: ${ .patientId } - end: - terminate: true -``` - -
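Correlation works by grouping consumed events on the declared context attributes. A small Python sketch of that routing logic (the runtime internals and helper names here are hypothetical):

```python
def correlation_key(event: dict, correlation_defs: list) -> tuple:
    # A runtime groups consumed events by the declared correlation
    # context attributes -- "patientId" in this example.
    return tuple(event[d["contextAttributeName"]] for d in correlation_defs)

correlation = [{"contextAttributeName": "patientId"}]
events = [
    {"type": "org.monitor.highBodyTemp", "patientId": "PID-12345"},
    {"type": "org.monitor.highBloodPressure", "patientId": "PID-12345"},
    {"type": "org.monitor.highBodyTemp", "patientId": "PID-99999"},
]

# Route each event to the bucket (workflow instance) for its patient.
instances = {}
for ev in events:
    instances.setdefault(correlation_key(ev, correlation), []).append(ev)

print(len(instances))  # 2 -- two distinct patients
```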
- -### Finalize College Application Example - -#### Description - -In this example our workflow is instantiated when all requirements of a college application are completed. -These requirements include a student submitting an application, the college receiving the student's SAT scores, as well -as a student recommendation letter from a former teacher. - -We assume three Cloud Events "ApplicationSubmitted", "SATScoresReceived" and "RecommendationLetterReceived". -Each includes the applicant id in its "applicantId" context attribute, so we can use it to associate these events with an individual applicant. - -Our workflow is instantiated and performs the actions to finalize the college application for a student only -once all three of these events have occurred (in no particular order). - -#### Workflow Diagram - -<p align="center">

-Finalize College Application Example -

- -#### Workflow Definition - - - - - - - - - - -
JSONYAML
- -```json -{ - "id": "finalize-college-application", - "name": "finalizeCollegeApplication", - "version": "1.0.0", - "specVersion": "0.8", - "start": "finalize-application", - "events": [ - { - "name": "application-submitted", - "type": "org.application.submitted", - "source": "applicationsource", - "correlation": [ - { - "contextAttributeName": "applicantId" - } - ] - }, - { - "name": "sat-scores-received", - "type": "org.application.satscores", - "source": "applicationsource", - "correlation": [ - { - "contextAttributeName": "applicantId" - } - ] - }, - { - "name": "recommendation-letter-received", - "type": "org.application.recommendationLetter", - "source": "applicationsource", - "correlation": [ - { - "contextAttributeName": "applicantId" - } - ] - } - ], - "functions": [ - { - "name": "finalize-application-function", - "operation": "http://myapis.org/collegeapplicationapi.json#finalize" - } - ], - "states": [ - { - "name": "finalize-application", - "type": "event", - "exclusive": false, - "onEvents": [ - { - "eventRefs": [ - "application-submitted", - "sat-scores-received", - "recommendation-letter-received" - ], - "actions": [ - { - "name": "finalize-application", - "functionRef": { - "refName": "finalize-application-function", - "arguments": { - "student": "${ .applicantId }" - } - } - } - ] - } - ], - "end": { - "terminate": true - } - } - ] -}``` - - - -```yaml -id: finalize-college-application -name: finalizeCollegeApplication -version: 1.0.0 -specVersion: "0.8" -start: finalize-application -events: - - name: application-submitted - type: org.application.submitted - source: applicationsource - correlation: - - contextAttributeName: applicantId - - name: sat-scores-received - type: org.application.satscores - source: applicationsource - correlation: - - contextAttributeName: applicantId - - name: recommendation-letter-received - type: org.application.recommendationLetter - source: applicationsource - correlation: - - contextAttributeName: applicantId 
-functions: - - name: finalize-application-function - operation: http://myapis.org/collegeapplicationapi.json#finalize -states: - - name: finalize-application - type: event - exclusive: false - onEvents: - - eventRefs: - - application-submitted - - sat-scores-received - - recommendation-letter-received - actions: - - name: finalize-application - functionRef: - refName: finalize-application-function - arguments: - student: ${ .applicantId } - end: - terminate: true -``` - -
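Because the event state sets `exclusive` to `false`, the runtime must wait until all three referenced events have arrived for the same applicant before running the actions. A sketch of that bookkeeping, with hypothetical helper names:

```python
REQUIRED = {"application-submitted", "sat-scores-received", "recommendation-letter-received"}
received = {}  # applicantId -> set of event names seen so far

def on_event(applicant_id: str, event_name: str) -> bool:
    # With "exclusive": false, the state's actions run only once every
    # referenced event has arrived for the same correlation key.
    seen = received.setdefault(applicant_id, set())
    seen.add(event_name)
    return seen == REQUIRED

print(on_event("applicant-1", "application-submitted"))           # False
print(on_event("applicant-1", "recommendation-letter-received"))  # False
print(on_event("applicant-1", "sat-scores-received"))             # True
```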
- -### Perform Customer Credit Check Example - -#### Description - -In this example our serverless workflow needs to integrate with an external microservice to perform -a credit check. We assume that this external microservice notifies a human actor who has to make -the approval decision based on customer information. Once this decision is made, the service emits a CloudEvent which -includes the decision information as part of its payload. -The workflow waits for this callback event and then triggers workflow transitions based on the -credit check decision results. - -The workflow data input is assumed to be: - -```json -{ - "customer": { - "id": "customer123", - "name": "John Doe", - "SSN": 123456, - "yearlyIncome": 50000, - "address": "123 MyLane, MyCity, MyCountry", - "employer": "MyCompany" - } -} -``` - -The callback event that our workflow waits on is assumed to have one of the following formats. -For an approved credit check, for example: - -```json -{ - "specversion" : "1.0", - "type" : "creditCheckCompleteType", - "datacontenttype" : "application/json", - ... - "data": { - "creditCheck": [ - { - "id": "customer123", - "score": 700, - "decision": "Approved", - "reason": "Good credit score" - } - ] - } -} -``` - -And for a denied credit check, for example: - -```json -{ - "specversion" : "1.0", - "type" : "creditCheckCompleteType", - "datacontenttype" : "application/json", - ... - "data": { - "creditCheck": [ - { - "id": "customer123", - "score": 580, - "decision": "Denied", - "reason": "Low credit score. Recent late payments" - } - ] - } -} -``` - -#### Workflow Diagram - -<p align="center">

-Perform Customer Credit Check Example -

- -#### Workflow Definition - - - - - - - - - - -
JSONYAML
- -```json -{ - "id": "customercreditcheck", - "version": "1.0.0", - "specVersion": "0.8", - "name": "customercreditcheck", - "description": "Perform Customer Credit Check", - "start": "check-credit", - "functions": [ - { - "name": "credit-check-function", - "operation": "http://myapis.org/creditcheckapi.json#doCreditCheck" - }, - { - "name": "send-rejection-email-function", - "operation": "http://myapis.org/creditcheckapi.json#rejectionEmail" - } - ], - "events": [ - { - "name": "credit-check-completed-event", - "type": "creditCheckCompleteType", - "source": "creditCheckSource", - "correlation": [ - { - "contextAttributeName": "customerId" - } - ] - } - ], - "states": [ - { - "name": "check-credit", - "type": "callback", - "action": { - "name": "check-credit", - "functionRef": { - "refName": "call-credit-check-microservice", - "arguments": { - "customer": "${ .customer }" - } - } - }, - "eventRef": "credit-check-completed-event", - "timeouts": { - "stateExecTimeout": "PT15M" - }, - "transition": "evaluate-decision" - }, - { - "name": "evaluate-decision", - "type": "switch", - "dataConditions": [ - { - "condition": "${ .creditCheck | .decision == \"Approved\" }", - "transition": "start-application", - "name": "start-application" - }, - { - "condition": "${ .creditCheck | .decision == \"Denied\" }", - "transition": "reject-application", - "name": "reject-application" - } - ], - "defaultCondition": { - "transition": "reject-application" - } - }, - { - "name": "start-application", - "type": "operation", - "actions": [ - { - "name": "start-application", - "subFlowRef": "startApplicationWorkflowId" - } - ], - "end": true - }, - { - "name": "reject-application", - "type": "operation", - "actionMode": "sequential", - "actions": [ - { - "name": "reject-application", - "functionRef": { - "refName": "send-rejection-email-function", - "arguments": { - "applicant": "${ .customer }" - } - } - } - ], - "end": true - } - ] -}``` - - - -```yaml -id: customercreditcheck -version: 
1.0.0 -specVersion: "0.8" -name: customercreditcheck -description: Perform Customer Credit Check -start: check-credit -functions: - - name: credit-check-function - operation: http://myapis.org/creditcheckapi.json#doCreditCheck - - name: send-rejection-email-function - operation: http://myapis.org/creditcheckapi.json#rejectionEmail -events: - - name: credit-check-completed-event - type: creditCheckCompleteType - source: creditCheckSource - correlation: - - contextAttributeName: customerId -states: - - name: check-credit - type: callback - action: - name: check-credit - functionRef: - refName: call-credit-check-microservice - arguments: - customer: ${ .customer } - eventRef: credit-check-completed-event - timeouts: - stateExecTimeout: PT15M - transition: evaluate-decision - - name: evaluate-decision - type: switch - dataConditions: - - condition: ${ .creditCheck | .decision == "Approved" } - transition: start-application - name: start-application - - condition: ${ .creditCheck | .decision == "Denied" } - transition: reject-application - name: reject-application - defaultCondition: - transition: reject-application - - name: start-application - type: operation - actions: - - name: start-application - subFlowRef: startApplicationWorkflowId - end: true - - name: reject-application - type: operation - actionMode: sequential - actions: - - name: reject-application - functionRef: - refName: send-rejection-email-function - arguments: - applicant: ${ .customer } - end: true -``` - -
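The switch state picks a transition by evaluating its jq `dataConditions` against the callback payload. A plain-Python stand-in for that evaluation (note the example's jq expressions pipe the `creditCheck` array, so this sketch simply reads its first entry):

```python
def evaluate_decision(state_data: dict) -> str:
    # Stand-in for the "evaluate-decision" switch state's dataConditions;
    # falls through to the defaultCondition transition otherwise.
    decision = state_data["creditCheck"][0]["decision"]
    if decision == "Approved":
        return "start-application"
    if decision == "Denied":
        return "reject-application"
    return "reject-application"  # defaultCondition

approved = {"creditCheck": [{"id": "customer123", "score": 700, "decision": "Approved"}]}
denied = {"creditCheck": [{"id": "customer123", "score": 580, "decision": "Denied"}]}
print(evaluate_decision(approved))  # start-application
print(evaluate_decision(denied))    # reject-application
```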
- -### Handle Car Auction Bids Example - -#### Description - -In this example our serverless workflow needs to handle bids for an online car auction. The car auction has a specific start -and end time. Bids may only be made during this period; bids received before or after it are not considered. -We assume that the car auction starts at 9am UTC on March 20th 2020 and ends at 3pm UTC on March 20th 2020. - -Bidding is done via an online application, and bids are received as events, which are assumed to have the following format: - -```json -{ - "specversion" : "1.0", - "type" : "carBidType", - "datacontenttype" : "application/json", - ... - "data": { - "bid": [ - { - "carid": "car123", - "amount": 3000, - "bidder": { - "id": "xyz", - "firstName": "John", - "lastName": "Wayne" - } - } - ] - } -} -``` - -#### Workflow Diagram - -<p align="center">

-Handle Car Auction Bid Example -

- -#### Workflow Definition - - - - - - - - - - -
JSONYAML
- -```json -{ - "id": "handle-car-auction-bid", - "version": "1.0.0", - "specVersion": "0.8", - "name": "handle-car-auction-bid", - "description": "Store a single bid while the car auction is active", - "start": { - "stateName": "store-car-auction-bid", - "schedule": "R/PT2H" - }, - "functions": [ - { - "name": "store-bid-function", - "operation": "http://myapis.org/carauctionapi.json#storeBid" - } - ], - "events": [ - { - "name": "car-bid-event", - "type": "carBidMadeType", - "source": "carBidEventSource" - } - ], - "states": [ - { - "name": "store-car-auction-bid", - "type": "event", - "exclusive": true, - "onEvents": [ - { - "eventRefs": [ - "car-bid-event" - ], - "actions": [ - { - "name": "car-bid-event", - "functionRef": { - "refName": "store-bid-function", - "arguments": { - "bid": "${ .bid }" - } - } - } - ] - } - ], - "end": true - } - ] -}``` - -</td> -<td valign="top"> - -```yaml -id: handle-car-auction-bid -version: 1.0.0 -specVersion: "0.8" -name: handle-car-auction-bid -description: Store a single bid while the car auction is active -start: - stateName: store-car-auction-bid - schedule: R/PT2H -functions: - - name: store-bid-function - operation: http://myapis.org/carauctionapi.json#storeBid -events: - - name: car-bid-event - type: carBidMadeType - source: carBidEventSource -states: - - name: store-car-auction-bid - type: event - exclusive: true - onEvents: - - eventRefs: - - car-bid-event - actions: - - name: car-bid-event - functionRef: - refName: store-bid-function - arguments: - bid: ${ .bid } - end: true -``` - -</td> -</tr> -</table>
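A runtime enforcing the auction window described above would drop events whose CloudEvent `time` falls outside it. A sketch, with the window hard-coded from the description and a helper name of our own choosing:

```python
from datetime import datetime, timezone

# Auction window from the description: 9am to 3pm UTC, March 20th 2020.
AUCTION_START = datetime(2020, 3, 20, 9, 0, tzinfo=timezone.utc)
AUCTION_END = datetime(2020, 3, 20, 15, 0, tzinfo=timezone.utc)

def bid_is_valid(event_time_iso: str) -> bool:
    # Bids received before or after the window are not considered.
    t = datetime.fromisoformat(event_time_iso.replace("Z", "+00:00"))
    return AUCTION_START <= t <= AUCTION_END

print(bid_is_valid("2020-03-20T10:30:00Z"))  # True
print(bid_is_valid("2020-03-20T16:00:00Z"))  # False
```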
- -### Check Inbox Periodically - -#### Description - -In this example we show the use of the scheduled, cron-based start property. The example workflow checks the user's inbox every 15 minutes -and sends them a text message when there are important emails. - -The result of the inbox service call is expected to be, for example: - -```json -{ - "messages": [ - { - "title": "Update your health benefits", - "from": "HR", - "priority": "high" - }, - { - "title": "New job candidate resume", - "from": "Recruiting", - "priority": "medium" - }, - ... - ] -} -``` - -#### Workflow Diagram - -<p align="center">

-Check Inbox Periodically Example -

- -#### Workflow Definition - - - - - - - - - - -
JSONYAML
- -```json -{ - "id": "check-inbox", - "name": "check-inbox", - "version": "1.0.0", - "specVersion": "0.8", - "description": "Periodically Check Inbox", - "start": { - "stateName": "check-inbox", - "schedule": { - "cron": "0 0/15 * * * ?" - } - }, - "functions": [ - { - "name": "check-inbox-function", - "operation": "http://myapis.org/inboxapi.json#checkNewMessages" - }, - { - "name": "send-text-function", - "operation": "http://myapis.org/inboxapi.json#sendText" - } - ], - "states": [ - { - "name": "check-inbox", - "type": "operation", - "actionMode": "sequential", - "actions": [ - { - "name":"check-inbox", - "functionRef": "check-inbox-function" - } - ], - "transition": "send-text-for-high-priority" - }, - { - "name": "send-text-for-high-priority", - "type": "foreach", - "inputCollection": "${ .messages }", - "iterationParam": "singlemessage", - "actions": [ - { - "name": "send-text-for-high-priority", - "functionRef": { - "refName": "send-text-function", - "arguments": { - "message": "${ .singlemessage }" - } - } - } - ], - "end": true - } - ] -}``` - - - -```yaml -id: check-inbox -name: check-inbox -version: 1.0.0 -specVersion: "0.8" -description: Periodically Check Inbox -start: - stateName: check-inbox - schedule: - cron: 0 0/15 * * * ? -functions: - - name: check-inbox-function - operation: http://myapis.org/inboxapi.json#checkNewMessages - - name: send-text-function - operation: http://myapis.org/inboxapi.json#sendText -states: - - name: check-inbox - type: operation - actionMode: sequential - actions: - - name: check-inbox - functionRef: check-inbox-function - transition: send-text-for-high-priority - - name: send-text-for-high-priority - type: foreach - inputCollection: ${ .messages } - iterationParam: singlemessage - actions: - - name: send-text-for-high-priority - functionRef: - refName: send-text-function - arguments: - message: ${ .singlemessage } - end: true -``` - -
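The `0 0/15 * * * ?` schedule is a Quartz-style cron expression (seconds field first); its minutes field `0/15` fires at minutes 0, 15, 30 and 45. A tiny matcher for just that field syntax, covering only the subset the example uses:

```python
def matches_minutes_field(field: str, minute: int) -> bool:
    # Handles "*", a literal minute, and the "start/step" form used here.
    if field == "*":
        return True
    if "/" in field:
        start, step = (int(p) for p in field.split("/"))
        return minute >= start and (minute - start) % step == 0
    return minute == int(field)

fires = [m for m in range(60) if matches_minutes_field("0/15", m)]
print(fires)  # [0, 15, 30, 45]
```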
- -### Event Based Service Invocation - -#### Description - -In this example we want to make a veterinary appointment for our dog Mia. The vet service can be invoked only -via an event, and its result, the appointment day and time, is returned via an event as well. - -This is a common scenario, especially inside container environments, where some services may not be exposed via -a resource URI but are only accessible by submitting an event to the underlying container events manager. - -For this example we assume that the payload of the vet service response event includes an "appointment" -object which contains our appointment info. - -This info is then filtered to become the workflow data output. It could also be used to, for example, send us an -appointment email, a text message reminder, etc. - -For this example we assume that the workflow instance is started given the following workflow data input: - -```json - { - "patientInfo": { - "name": "Mia", - "breed": "German Shepherd", - "age": 5, - "reason": "Bee sting", - "patientId": "Mia1" - } - } -``` - -#### Workflow Diagram - -<p align="center">

-Vet Appointment Example -

- -#### Workflow Definition - - - - - - - - - - -
JSONYAML
- -```json -{ - "id": "vet-appointment-workflow", - "name": "vet-appointment-workflow", - "description": "Vet service call via events", - "version": "1.0.0", - "specVersion": "0.8", - "start": "make-vet-appointment-state", - "events": [ - { - "name": "make-vet-appointment", - "source": "VetServiceSource", - "type": "events.vet.appointments", - "kind": "produced" - }, - { - "name": "vet-appointment-info", - "source": "VetServiceSource", - "type": "events.vet.appointments", - "kind": "consumed" - } - ], - "states": [ - { - "name": "make-vet-appointment-state", - "type": "operation", - "actions": [ - { - "name": "make-appointment-action", - "eventRef": { - "produceEventRef": "make-vet-appointment", - "data": "${ .patientInfo }", - "consumeEventRef": "vet-appointment-info" - }, - "actionDataFilter": { - "results": "${ .appointmentInfo }" - } - } - ], - "timeouts": { - "actionExecTimeout": "PT15M" - }, - "end": true - } - ] -}``` - - - -```yaml -id: vet-appointment-workflow -name: vet-appointment-workflow -description: Vet service call via events -version: 1.0.0 -specVersion: "0.8" -start: make-vet-appointment-state -events: - - name: make-vet-appointment - source: VetServiceSource - type: events.vet.appointments - kind: produced - - name: vet-appointment-info - source: VetServiceSource - type: events.vet.appointments - kind: consumed -states: - - name: make-vet-appointment-state - type: operation - actions: - - name: make-appointment-action - eventRef: - produceEventRef: make-vet-appointment - data: ${ .patientInfo } - consumeEventRef: vet-appointment-info - actionDataFilter: - results: ${ .appointmentInfo } - timeouts: - actionExecTimeout: PT15M - end: true -``` - -
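The `actionDataFilter` `results` expression selects which part of the consumed event's payload becomes the action result. A sketch of that selection (only `appointmentInfo` is prescribed by the example; the nested field names below are invented for illustration):

```python
def apply_results_filter(event_payload: dict, key: str) -> dict:
    # Only the selected part of the consumed event's payload becomes
    # the action (and here, workflow) output.
    return event_payload[key]

# Hypothetical response payload from the vet service event.
payload = {
    "appointmentInfo": {"appointmentTime": "2020-02-01T09:00:00Z", "vet": "Dr. Smith"},
    "requestId": "req-1",
}
print(apply_results_filter(payload, "appointmentInfo"))
```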
- -### Reusing Function And Event Definitions - -#### Description - -This example shows how [function](../specification.md#Function-Definition) and [event](../specification.md#Event-Definition) definitions -can be declared independently and referenced by workflow definitions. -This is useful when you would like to reuse event and function definitions across multiple workflows. In those scenarios it allows you to make -changes/updates to these definitions in a single place without having to modify multiple workflows. - -For the example we have two files, namely our "functiondefs.json" and "eventdefs.yml" (to show that they can be expressed in either JSON or YAML). -These hold our function and event definitions, which can then be referenced by multiple workflows. - -* functiondefs.json - -```json -{ - "functions": [ - { - "name": "check-funds-availability", - "operation": "file://myapis/billingapis.json#checkFunds" - }, - { - "name": "send-success-email", - "operation": "file://myapis/emailapis.json#paymentSuccess" - }, - { - "name": "send-insufficient-funds-email", - "operation": "file://myapis/emailapis.json#paymentInsufficientFunds" - } - ] -} -``` - -* eventdefs.yml - -```yaml -events: -- name: payment-received-event - type: payment.receive - source: paymentEventSource - correlation: - - contextAttributeName: accountId -- name: confirmation-completed-event - type: payment.confirmation - kind: produced - -``` - -In our workflow definition we can then reference these files rather than defining functions and events inline. - -#### Workflow Diagram - -<p align="center">

-Reusing Function and Event Definitions Example -

- -#### Workflow Definitions - - - - - - - - - - -
JSONYAML
- -```json -{ - "id": "paymentconfirmation", - "version": "1.0.0", - "specVersion": "0.8", - "name": "paymentconfirmation", - "description": "Performs Payment Confirmation", - "functions": "file://functiondefs.json", - "events": "file://eventdefs.yml", - "states": [ - { - "name": "payment-received", - "type": "event", - "onEvents": [ - { - "eventRefs": [ - "payment-received-event" - ], - "actions": [ - { - "name": "checkfunds", - "functionRef": { - "refName": "check-funds-availability", - "arguments": { - "account": "${ .accountId }", - "paymentamount": "${ .payment.amount }" - } - } - } - ] - } - ], - "transition": "confirm-based-on-funds" - }, - { - "name": "confirm-based-on-funds", - "type": "switch", - "dataConditions": [ - { - "condition": "${ .funds | .available == \"true\" }", - "transition": "send-payment-success", - "name": "success" - }, - { - "condition": "${ .funds | .available == \"false\" }", - "transition": "send-insufficient-results", - "name": "failed" - } - ], - "defaultCondition": { - "transition": "send-payment-success" - } - }, - { - "name": "send-payment-success", - "type": "operation", - "actions": [ - { - "name": "send-payment-success", - "functionRef": { - "refName": "send-success-email", - "arguments": { - "applicant": "${ .customer }" - } - } - } - ], - "end": { - "produceEvents": [ - { - "eventRef": "confirmation-completed-event", - "data": "${ .payment }" - } - ] - } - }, - { - "name": "send-insufficient-results", - "type": "operation", - "actions": [ - { - "name": "send-insufficient-results", - "functionRef": { - "refName": "send-insufficient-funds-email", - "arguments": { - "applicant": "${ .customer }" - } - } - } - ], - "end": { - "produceEvents": [ - { - "eventRef": "confirmation-completed-event", - "data": "${ .payment }" - } - ] - } - } - ] -}``` - - - -```yaml -id: paymentconfirmation -version: 1.0.0 -specVersion: "0.8" -name: paymentconfirmation -description: Performs Payment Confirmation -functions: file://functiondefs.json 
-events: file://eventdefs.yml -states: - - name: payment-received - type: event - onEvents: - - eventRefs: - - payment-received-event - actions: - - name: checkfunds - functionRef: - refName: check-funds-availability - arguments: - account: ${ .accountId } - paymentamount: ${ .payment.amount } - transition: confirm-based-on-funds - - name: confirm-based-on-funds - type: switch - dataConditions: - - condition: ${ .funds | .available == "true" } - transition: send-payment-success - name: success - - condition: ${ .funds | .available == "false" } - transition: send-insufficient-results - name: failed - defaultCondition: - transition: send-payment-success - - name: send-payment-success - type: operation - actions: - - name: send-payment-success - functionRef: - refName: send-success-email - arguments: - applicant: ${ .customer } - end: - produceEvents: - - eventRef: confirmation-completed-event - data: ${ .payment } - - name: send-insufficient-results - type: operation - actions: - - name: send-insufficient-results - functionRef: - refName: send-insufficient-funds-email - arguments: - applicant: ${ .customer } - end: - produceEvents: - - eventRef: confirmation-completed-event - data: ${ .payment } -``` - -
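When `functions` is a URI string instead of an inline array, a runtime must fetch and parse the referenced document before resolving `refName`s. A sketch for the JSON case, loading from an inline string rather than `file://functiondefs.json` (the definition names must match the `refName`s the workflow uses, e.g. `check-funds-availability`; `eventdefs.yml` would additionally need a YAML parser):

```python
import json

def load_function_defs(document: str) -> dict:
    # Index the external definitions by name so workflow refNames resolve.
    return {f["name"]: f["operation"] for f in json.loads(document)["functions"]}

# Stand-in for the fetched contents of file://functiondefs.json.
functiondefs = """
{
  "functions": [
    {"name": "check-funds-availability", "operation": "file://myapis/billingapis.json#checkFunds"},
    {"name": "send-success-email", "operation": "file://myapis/emailapis.json#paymentSuccess"}
  ]
}
"""

registry = load_function_defs(functiondefs)
print(registry["check-funds-availability"])  # file://myapis/billingapis.json#checkFunds
```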
- -### New Patient Onboarding - -#### Description - -In this example we want to use a workflow to onboard a new patient (at a hospital, for example). -To onboard a patient our workflow is invoked via a "NewPatientEvent" event. This event's payload contains the -patient information, for example: - -```json -{ - "name": "John", - "condition": "chest pains" -} -``` - -When this event is received we want to create a new workflow instance and invoke three services -sequentially. The first service stores the patient information, -the second assigns a doctor based on the patient's condition, and the third schedules a -new appointment with the patient and the assigned doctor. - -In addition, in this example we need to handle a possible situation where one or all of the needed -services are not available (the server returns an HTTP 503 (Service Unavailable) error). If our workflow -catches this error, we want to try to recover by retrying the particular -service invocation that caused the error up to 10 times, with three seconds in-between retries. -If the retries are not successful, we gracefully end workflow execution. - -#### Workflow Diagram - -<p align="center">

-Patient Onboarding Example -

- -#### Workflow Definition - - - - - - - - - - -
JSONYAML
- -```json -{ - "id": "patientonboarding", - "name": "patientonboarding", - "version": "1.0.0", - "specVersion": "0.8", - "start": "onboard", - "states": [ - { - "name": "onboard", - "type": "event", - "onEvents": [ - { - "eventRefs": [ - "new-patient-event" - ], - "actions": [ - { - "name": "store-patient", - "functionRef": "store-patient", - "retryRef": "services-not-available-retry-strategy", - "retryableErrors": [ - "service-not-available" - ] - }, - { - "name": "assign-doctor", - "functionRef": "assign-doctor", - "retryRef": "services-not-available-retry-strategy", - "retryableErrors": [ - "service-not-available" - ] - }, - { - "name": "schedule-appt", - "functionRef": "schedule-appt", - "retryRef": "services-not-available-retry-strategy", - "retryableErrors": [ - "service-not-available" - ] - } - ] - } - ], - "onErrors": [ - { - "errorRef": "service-not-available", - "end": true - } - ], - "end": true - } - ], - "events": [ - { - "name": "new-patient-event", - "type": "new.patients.event", - "source": "newpatient/+" - } - ], - "functions": [ - { - "name": "store-patient", - "operation": "api/services.json#addPatient" - }, - { - "name": "assign-doctor", - "operation": "api/services.json#assignDoctor" - }, - { - "name": "schedule-appt", - "operation": "api/services.json#scheduleAppointment" - } - ], - "errors": [ - { - "name": "service-not-available", - "code": "503" - } - ], - "retries": [ - { - "name": "services-not-available-retry-strategy", - "delay": "PT3S", - "maxAttempts": 10 - } - ] -}``` - -</td> -<td valign="top"> - -```yaml -id: patientonboarding -name: patientonboarding -version: 1.0.0 -specVersion: "0.8" -start: onboard -states: - - name: onboard - type: event - onEvents: - - eventRefs: - - new-patient-event - actions: - - name: store-patient - functionRef: store-patient - retryRef: services-not-available-retry-strategy -
retryableErrors: - - service-not-available - - name: schedule-appt - functionRef: schedule-appt - retryRef: services-not-available-retry-strategy - retryableErrors: - - service-not-available - onErrors: - - errorRef: service-not-available - end: true - end: true -events: - - name: new-patient-event - type: new.patients.event - source: newpatient/+ -functions: - - name: store-patient - operation: api/services.json#addPatient - - name: assign-doctor - operation: api/services.json#assignDoctor - - name: schedule-appt - operation: api/services.json#scheduleAppointment -errors: - - name: service-not-available - code: "503" -retries: - - name: services-not-available-retry-strategy - delay: PT3S - maxAttempts: 10 -``` - -</td> -</tr> -</table>
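The retry definition above amounts to: attempt the call, and on a retryable error wait the `delay` and try again, up to `maxAttempts` times. A Python sketch of that loop, with a simulated flaky service standing in for the real one:

```python
import time

def call_with_retries(fn, max_attempts=10, delay_seconds=3, retryable=(ConnectionError,)):
    # Mirrors "services-not-available-retry-strategy": fixed delay,
    # bounded attempts, rethrow once the retry budget is exhausted.
    for attempt in range(1, max_attempts + 1):
        try:
            return fn()
        except retryable:
            if attempt == max_attempts:
                raise
            time.sleep(delay_seconds)

calls = {"n": 0}
def flaky_store_patient():
    # Simulated service: fails twice with a 503, then succeeds.
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("503 Service Unavailable")
    return "stored"

print(call_with_retries(flaky_store_patient, delay_seconds=0))  # stored
```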
- -#### Workflow Demo - -This example is used in our Serverless Workflow Hands-on series videos [#1](https://www.youtube.com/watch?v=0gmpuGLP-_o) -and [#2](https://www.youtube.com/watch?v=6A6OYp5nygg). - -### Purchase Order Deadline - -#### Description - -In this example our workflow processes purchase orders. An order event triggers an instance of our workflow. -To complete the created order, our workflow must first wait for an order confirmation event (correlated to the -order id), and then wait for the shipment sent event (also correlated to the initial order id). -We do not want to place an exact timeout limit on waiting for the confirmation and shipment events, -as this might take a different amount of time depending on the size of the order. However, we do require -that an order be completed within 30 days of its creation. -If the created order is not completed within 30 days it needs to be automatically closed. - -This example shows the use of the workflow [execTimeout definition](../specification.md#ExecTimeout-Definition). - -#### Workflow Diagram - -<p align="center">

-Purchase Order Deadline Example -

- -#### Workflow Definition - - - - - - - - - - -
JSONYAML
- -```json -{ - "id": "order", - "name": "order", - "description": "Purchase Order Workflow", - "version": "1.0.0", - "specVersion": "0.8", - "start": "start-new-order", - "timeouts": { - "workflowExecTimeout": { - "duration": "PT30D", - "runBefore": "cancel-order" - } - }, - "states": [ - { - "name": "start-new-order", - "type": "event", - "onEvents": [ - { - "eventRefs": [ - "order-created-event" - ], - "actions": [ - { - "name": "log-new-order-created", - "functionRef": { - "refName": "log-new-order-created" - } - } - ] - } - ], - "transition": { - "nextState": "wait-for-order-confirmation" - } - }, - { - "name": "wait-for-order-confirmation", - "type": "event", - "onEvents": [ - { - "eventRefs": [ - "order-confirmed-event" - ], - "actions": [ - { - "name": "log-order-confirmed", - "functionRef": { - "refName": "log-order-confirmed" - } - } - ] - } - ], - "transition": { - "nextState": "wait-order-shipped" - } - }, - { - "name": "wait-order-shipped", - "type": "event", - "onEvents": [ - { - "eventRefs": [ - "shipment-sent-event" - ], - "actions": [ - { - "name": "log-order-shipped", - "functionRef": { - "refName": "log-order-shipped" - } - } - ] - } - ], - "end": { - "terminate": true, - "produceEvents": [ - { - "eventRef": "order-finished-event" - } - ] - } - }, - { - "name": "cancel-order", - "type": "operation", - "actions": [ - { - "name": "cancel-order", - "functionRef": { - "refName": "cancel-order" - } - } - ], - "end": { - "terminate": true, - "produceEvents": [ - { - "eventRef": "order-cancelled-event" - } - ] - } - } - ], - "events": [ - { - "name": "order-created-event", - "type": "my.company.orders", - "source": "/orders/new", - "correlation": [ - { - "contextAttributeName": "orderid" - } - ] - }, - { - "name": "order-confirmed-event", - "type": "my.company.orders", - "source": "/orders/confirmed", - "correlation": [ - { - "contextAttributeName": "orderid" - } - ] - }, - { - "name": "shipment-sent-event", - "type": "my.company.orders", - "source": "/orders/shipped", - "correlation": [ - { - "contextAttributeName": "orderid" - } - ] - }, - { - "name": "order-finished-event", - "type": "my.company.orders", - "kind": "produced" - }, - { - "name": "order-cancelled-event", - "type": "my.company.orders", - "kind": "produced" - } - ], - "functions": [ - { - "name": "log-new-order-created", - "operation": "http.myorg.io/ordersservices.json#logcreated" - }, - { - "name": "log-order-confirmed", - "operation": "http.myorg.io/ordersservices.json#logconfirmed" - }, - { - "name": "log-order-shipped", - "operation": "http.myorg.io/ordersservices.json#logshipped" - }, - { - "name": "cancel-order", - "operation": "http.myorg.io/ordersservices.json#cancelorder" - } - ] -}``` - - - -```yaml -id: order -name: order -description: Purchase Order Workflow -version: 1.0.0 -specVersion: "0.8" -start: start-new-order -timeouts: - workflowExecTimeout: - duration: PT30D - runBefore: cancel-order -states: - - name: start-new-order - type: event - onEvents: - - eventRefs: - - order-created-event - actions: - - name: log-new-order-created - functionRef: - refName: log-new-order-created - transition: - nextState: wait-for-order-confirmation - - name: wait-for-order-confirmation - type: event - onEvents: - - eventRefs: - - order-confirmed-event - actions: - - name: log-order-confirmed - functionRef: - refName: log-order-confirmed - transition: - nextState: wait-order-shipped - - name: wait-order-shipped - type: event - onEvents: - - eventRefs: - - shipment-sent-event - actions: - - name: log-order-shipped - functionRef: - refName: log-order-shipped - end: - terminate: true - produceEvents: - - eventRef: order-finished-event - - name: cancel-order - type: operation - actions: - - name: cancel-order - functionRef: - refName: cancel-order - end: - terminate: true - produceEvents: - - eventRef: order-cancelled-event -events: - - name: order-created-event - type: my.company.orders - source: /orders/new - correlation: - - contextAttributeName: orderid - - name: order-confirmed-event - type: my.company.orders - source: /orders/confirmed - correlation: - - contextAttributeName: orderid - - name: shipment-sent-event - type: my.company.orders - source: /orders/shipped - correlation: - - contextAttributeName: orderid - - name: order-finished-event - type: my.company.orders - kind: produced - - name: order-cancelled-event - type: my.company.orders - kind: produced -functions: - - name: log-new-order-created - operation: http.myorg.io/ordersservices.json#logcreated - - name: log-order-confirmed - operation: http.myorg.io/ordersservices.json#logconfirmed - - name: log-order-shipped - operation: http.myorg.io/ordersservices.json#logshipped - - name: cancel-order - operation: http.myorg.io/ordersservices.json#cancelorder -``` - -
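The purchase-order workflow above relies on `workflowExecTimeout` with a `runBefore` state: if the order does not finish within 30 days, the runtime executes the cancel state before terminating. A rough Python sketch of that timeout decision (the function name and day-based timing are illustrative, not part of the DSL):

```python
EXEC_TIMEOUT_DAYS = 30  # mirrors the workflow's PT30D workflowExecTimeout


def resolve_order(shipped_on_day):
    """Return the event the workflow would produce, given the day the
    shipment-sent-event arrives (None if it never arrives in time)."""
    if shipped_on_day is None or shipped_on_day > EXEC_TIMEOUT_DAYS:
        # Timeout hit: the runtime runs the runBefore state
        # (cancel-order) and produces order-cancelled-event.
        return "order-cancelled-event"
    # Happy path: the final event state terminates the workflow.
    return "order-finished-event"
```

For example, an order shipped on day 5 finishes normally, while one still unshipped after day 30 is cancelled.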
- -### Accumulate room readings - -#### Description - -In this example we have two IoT sensors in each room of our house. One reads temperature values -and the other humidity values for the room. We receive these measurements for each of our rooms -as CloudEvents, and can correlate the events sent by our sensors by the room they are in. - -For the example we want to accumulate the temperature and humidity values for each room and send hourly reports -to the home owner. - -**Note:** In this example each room's measurements are accumulated by a single workflow instance per room. -Once we have received events for one hour (per room), each of the room-based workflow instances creates the report. Events -consumed after the report is created trigger a new instance of our workflow (again, per room), which accumulates -the data for an hour, sends a report, and so on. - -#### Workflow Diagram - -

-Accumulate Room Readings Example -

- -#### Workflow Definition - - - - - - - - - - -
JSONYAML
- -```json -{ - "id": "roomreadings", - "name": "roomreadings", - "description": "Room Temp and Humidity Workflow", - "version": "1.0.0", - "specVersion": "0.8", - "start": "consume-reading", - "timeouts": { - "workflowExecTimeout": { - "duration": "PT1H", - "runBefore": "generate-report" - } - }, - "keepActive": true, - "states": [ - { - "name": "consume-reading", - "type": "event", - "onEvents": [ - { - "eventRefs": [ - "temperature-event", - "humidity-event" - ], - "actions": [ - { - "name": "log-reading", - "functionRef": { - "refName": "log-reading" - } - } - ], - "eventDataFilter": { - "toStateData": "${ .readings }" - } - } - ], - "end": true - }, - { - "name": "generate-report", - "type": "operation", - "actions": [ - { - "name": "generate-report", - "functionRef": { - "refName": "produce-report", - "arguments": { - "data": "${ .readings }" - } - } - } - ], - "end": { - "terminate": true - } - } - ], - "events": [ - { - "name": "temperature-event", - "type": "my.home.sensors", - "source": "/home/rooms/+", - "correlation": [ - { - "contextAttributeName": "roomId" - } - ] - }, - { - "name": "humidity-event", - "type": "my.home.sensors", - "source": "/home/rooms/+", - "correlation": [ - { - "contextAttributeName": "roomId" - } - ] - } - ], - "functions": [ - { - "name": "log-reading", - "operation": "http.myorg.io/ordersservices.json#logreading" - }, - { - "name": "produce-report", - "operation": "http.myorg.io/ordersservices.json#produceReport" - } - ] -}``` - - - -```yaml -id: roomreadings -name: roomreadings -description: Room Temp and Humidity Workflow -version: 1.0.0 -specVersion: "0.8" -start: consume-reading -timeouts: - workflowExecTimeout: - duration: PT1H - runBefore: generate-report -keepActive: true -states: - - name: consume-reading - type: event - onEvents: - - eventRefs: - - temperature-event - - humidity-event - actions: - - name: log-reading - functionRef: - refName: log-reading - eventDataFilter: - toStateData: ${ .readings } - end: true - - 
name: generate-report - type: operation - actions: - - name: generate-report - functionRef: - refName: produce-report - arguments: - data: ${ .readings } - end: - terminate: true -events: - - name: temperature-event - type: my.home.sensors - source: /home/rooms/+ - correlation: - - contextAttributeName: roomId - - name: humidity-event - type: my.home.sensors - source: /home/rooms/+ - correlation: - - contextAttributeName: roomId -functions: - - name: log-reading - operation: http.myorg.io/ordersservices.json#logreading - - name: produce-report - operation: http.myorg.io/ordersservices.json#produceReport -``` - -
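Correlation here means that every event carrying the same `roomId` context attribute is routed to the same workflow instance. A minimal Python sketch of that per-room accumulation (the event shape and field names are illustrative):

```python
from collections import defaultdict


def accumulate_readings(events):
    """Group sensor CloudEvents by their roomId context attribute, the way
    the runtime correlates them to one workflow instance per room."""
    readings = defaultdict(list)
    for event in events:
        # Correlation uses the CloudEvent context attribute, not the payload.
        readings[event["roomId"]].append(event["data"])
    return dict(readings)


report_input = accumulate_readings([
    {"type": "my.home.sensors", "roomId": "kitchen", "data": {"temperature": 21}},
    {"type": "my.home.sensors", "roomId": "kitchen", "data": {"humidity": 45}},
    {"type": "my.home.sensors", "roomId": "bedroom", "data": {"temperature": 19}},
])
```

Here the two kitchen events end up in the same accumulator, which is what the `toStateData: ${ .readings }` event data filter achieves in the workflow above.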
- -### Car Vitals Checks - -#### Description - -In this example we need to check a car's vital signs while the car is driving. -The workflow should start when the "CarTurnedOnEvent" event is received and stop when the "CarTurnedOffEvent" event is consumed. -While the car is driving, our workflow should repeatedly check the vitals every second. - -For this example we use [SubFlow](../specification.md#SubFlow-Action) actions to perform the vital checks. - -#### Workflow Diagram - -

-Check Car Vitals Example -

- -#### Workflow Definition - -We first define our top-level workflow for this example: - - - - - - - - - - -
JSONYAML
- -```json -{ - "id": "checkcarvitals", - "name": "checkcarvitals", - "description": "Check Car Vitals Workflow", - "version": "1.0.0", - "specVersion": "0.8", - "start": "when-car-is-on", - "states": [ - { - "name": "when-car-is-on", - "type": "event", - "onEvents": [ - { - "eventRefs": [ - "car-turned-on-event" - ] - } - ], - "transition": "do-car-vital-checks" - }, - { - "name": "do-car-vital-checks", - "type": "operation", - "actions": [ - { - "name": "do-car-vital-checks", - "subFlowRef": "vitalscheck", - "sleep": { - "after": "PT1S" - } - } - ], - "transition": "check-continue-vital-checks" - }, - { - "name": "check-continue-vital-checks", - "type": "switch", - "eventConditions": [ - { - "name": "car-turned-off-condition", - "eventRef": "car-turned-off-event", - "end": true - } - ], - "defaultCondition": { - "transition": "do-car-vital-checks" - } - } - ], - "events": [ - { - "name": "car-turned-on-event", - "type": "car.events", - "source": "my/car" - }, - { - "name": "car-turned-off-event", - "type": "car.events", - "source": "my/car" - } - ] -}``` - - - -```yaml -id: checkcarvitals -name: checkcarvitals -description: Check Car Vitals Workflow -version: 1.0.0 -specVersion: "0.8" -start: when-car-is-on -states: - - name: when-car-is-on - type: event - onEvents: - - eventRefs: - - car-turned-on-event - transition: do-car-vital-checks - - name: do-car-vital-checks - type: operation - actions: - - name: do-car-vital-checks - subFlowRef: vitalscheck - sleep: - after: PT1S - transition: check-continue-vital-checks - - name: check-continue-vital-checks - type: switch - eventConditions: - - name: car-turned-off-condition - eventRef: car-turned-off-event - end: true - defaultCondition: - transition: do-car-vital-checks -events: - - name: car-turned-on-event - type: car.events - source: my/car - - name: car-turned-off-event - type: car.events - source: my/car -``` - -
- -And then our reusable sub-workflow, which performs the actual car vitals checks: - - - - - - - - - - - -
JSONYAML
- -```json -{ - "id": "vitalscheck", - "name": "vitalscheck", - "description": "Car Vitals Check", - "version": "1.0.0", - "specVersion": "0.8", - "start": "check-vitals", - "states": [ - { - "name": "check-vitals", - "type": "operation", - "actions": [ - { - "name": "check-tire-pressure", - "functionRef": "check-tire-pressure" - }, - { - "name": "check-oil-pressure", - "functionRef": "check-oil-pressure" - }, - { - "name": "check-coolant-level", - "functionRef": "check-coolant-level" - }, - { - "name": "check-battery", - "functionRef": "check-battery" - } - ], - "end": { - "produceEvents": [ - { - "eventRef": "display-checks-on-dashboard", - "data": "${ .evaluations }" - } - ] - } - } - ], - "functions": [ - { - "name": "check-tire-pressure", - "operation": "mycarservices.json#checktirepressure" - }, - { - "name": "check-oil-pressure", - "operation": "mycarservices.json#checkoilpressure" - }, - { - "name": "check-coolant-level", - "operation": "mycarservices.json#checkcoolantlevel" - }, - { - "name": "check-battery", - "operation": "mycarservices.json#checkbattery" - } - ] -}``` - - - -```yaml -id: vitalscheck -name: vitalscheck -description: Car Vitals Check -version: 1.0.0 -specVersion: "0.8" -start: check-vitals -states: - - name: check-vitals - type: operation - actions: - - name: check-tire-pressure - functionRef: check-tire-pressure - - name: check-oil-pressure - functionRef: check-oil-pressure - - name: check-coolant-level - functionRef: check-coolant-level - - name: check-battery - functionRef: check-battery - end: - produceEvents: - - eventRef: display-checks-on-dashboard - data: ${ .evaluations } -functions: - - name: check-tire-pressure - operation: mycarservices.json#checktirepressure - - name: check-oil-pressure - operation: mycarservices.json#checkoilpressure - - name: check-coolant-level - operation: mycarservices.json#checkcoolantlevel - - name: check-battery - operation: mycarservices.json#checkbattery -``` - -
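The control loop above (run the vitals sub-workflow, sleep one second, repeat unless the car was turned off) can be sketched in plain Python; the event strings and the injected `check_vitals` callable are illustrative stand-ins for the runtime's behavior:

```python
def run_vital_checks(events, check_vitals):
    """Run the vitalscheck sub-workflow once per loop iteration until a
    car-turned-off-event is consumed; each non-off event stands in for
    one elapsed second of driving."""
    results = []
    events = iter(events)
    if next(events, None) != "car-turned-on-event":
        return results  # the starting event state never fires
    for event in events:
        if event == "car-turned-off-event":
            break  # the switch state's event condition ends the workflow
        results.append(check_vitals())  # the do-car-vital-checks SubFlow
    return results


checks = run_vital_checks(
    ["car-turned-on-event", "second-1", "second-2", "car-turned-off-event"],
    lambda: {"tires": "ok", "oil": "ok", "coolant": "ok", "battery": "ok"},
)
```

With two seconds of driving, the sub-workflow runs twice before the off event terminates the loop.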
- -### Book Lending - -#### Description - -In this example we want to create a book lending workflow. The workflow starts when a lender -submits a book lending request (via the "Book Lending Request Event" event). -The workflow describes our business logic around lending a book, from checking its current availability, -to waiting on the lender's response if the book is currently not available, to checking out the book and notifying -the lender. - -This example expects the "Book Lending Request Event" event to have a payload like the following: - -```json -{ - "book": { - "title": " ... ", - "id": " ... " - }, - "lender": { - "name": "John Doe", - "address": " ... ", - "phone": " ... " - } -} -``` - -where the "book" property defines the book to be lent out, and the "lender" property provides info -about the person wanting to lend the book. - -For the sake of the example we assume the functions and event definitions are defined in separate JSON files. - -#### Workflow Diagram - -

-Book Lending Example -

- -#### Workflow Definition - - - - - - - - - - -
JSONYAML
- -```json -{ - "id": "booklending", - "name": "booklending", - "description": "Book Lending Workflow", - "version": "1.0.0", - "specVersion": "0.8", - "start": "book-lending-request", - "states": [ - { - "name": "book-lending-request", - "type": "event", - "onEvents": [ - { - "eventRefs": [ - "book-lending-request-event" - ] - } - ], - "transition": "get-book-status" - }, - { - "name": "get-book-status", - "type": "operation", - "actions": [ - { - "name": "get-book-status", - "functionRef": { - "refName": "get-status-for-book", - "arguments": { - "bookid": "${ .book.id }" - } - } - } - ], - "transition": "book-status-decision" - }, - { - "name": "book-status-decision", - "type": "switch", - "dataConditions": [ - { - "name": "book-is-on-loan", - "condition": "${ .book.status == \"onloan\" }", - "transition": "report-status-to-lender" - }, - { - "name": "check-is-available", - "condition": "${ .book.status == \"available\" }", - "transition": "check-out-book" - } - ], - "defaultCondition": { - "end": true - } - }, - { - "name": "report-status-to-lender", - "type": "operation", - "actions": [ - { - "name": "report-status-to-lender", - "functionRef": { - "refName": "send-status-to-lender", - "arguments": { - "bookid": "${ .book.id }", - "message": "Book ${ .book.title } is already on loan" - } - } - } - ], - "transition": "wait-for-lender-response" - }, - { - "name": "wait-for-lender-response", - "type": "switch", - "eventConditions": [ - { - "name": "hold-book", - "eventRef": "hold-book-event", - "transition": "request-hold" - }, - { - "name": "decline-book-hold", - "eventRef": "decline-hold-event", - "transition": "cancel-request" - } - ], - "defaultCondition": { - "end": true - } - }, - { - "name": "request-hold", - "type": "operation", - "actions": [ - { - "name": "request-hold", - "functionRef": { - "refName": "request-hold-for-lender", - "arguments": { - "bookid": "${ .book.id }", - "lender": "${ .lender }" - } - } - } - ], - "transition": "sleep-two-weeks" - }, 
- { - "name": "cancel-request", - "type": "operation", - "actions": [ - { - "name": "cancel-request", - "functionRef": { - "refName": "cancel-hold-request-for-lender", - "arguments": { - "bookid": "${ .book.id }", - "lender": "${ .lender }" - } - } - } - ], - "transition": "sleep-two-weeks" - }, - { - "name": "sleep-two-weeks", - "type": "sleep", - "duration": "PT2W", - "transition": "get-book-status" - }, - { - "name": "check-out-book", - "type": "operation", - "actions": [ - { - "name": "check-out-book", - "functionRef": { - "refName": "check-out-book-with-id", - "arguments": { - "bookid": "${ .book.id }" - } - } - }, - { - "name": "notify-lender-for-checkout", - "functionRef": { - "refName": "notify-lender-for-checkout", - "arguments": { - "bookid": "${ .book.id }", - "lender": "${ .lender }" - } - } - } - ], - "end": true - } - ], - "functions": "file://books/lending/functions.json", - "events": "file://books/lending/events.json" -}``` - - - -```yaml -id: booklending -name: booklending -description: Book Lending Workflow -version: 1.0.0 -specVersion: "0.8" -start: book-lending-request -states: - - name: book-lending-request - type: event - onEvents: - - eventRefs: - - book-lending-request-event - transition: get-book-status - - name: get-book-status - type: operation - actions: - - name: get-book-status - functionRef: - refName: get-status-for-book - arguments: - bookid: ${ .book.id } - transition: book-status-decision - - name: book-status-decision - type: switch - dataConditions: - - name: book-is-on-loan - condition: ${ .book.status == "onloan" } - transition: report-status-to-lender - - name: check-is-available - condition: ${ .book.status == "available" } - transition: check-out-book - defaultCondition: - end: true - - name: report-status-to-lender - type: operation - actions: - - name: report-status-to-lender - functionRef: - refName: send-status-to-lender - arguments: - bookid: ${ .book.id } - message: Book ${ .book.title } is already on loan - 
transition: wait-for-lender-response - - name: wait-for-lender-response - type: switch - eventConditions: - - name: hold-book - eventRef: hold-book-event - transition: request-hold - - name: decline-book-hold - eventRef: decline-hold-event - transition: cancel-request - defaultCondition: - end: true - - name: request-hold - type: operation - actions: - - name: request-hold - functionRef: - refName: request-hold-for-lender - arguments: - bookid: ${ .book.id } - lender: ${ .lender } - transition: sleep-two-weeks - - name: cancel-request - type: operation - actions: - - name: cancel-request - functionRef: - refName: cancel-hold-request-for-lender - arguments: - bookid: ${ .book.id } - lender: ${ .lender } - transition: sleep-two-weeks - - name: sleep-two-weeks - type: sleep - duration: PT2W - transition: get-book-status - - name: check-out-book - type: operation - actions: - - name: check-out-book - functionRef: - refName: check-out-book-with-id - arguments: - bookid: ${ .book.id } - - name: notify-lender-for-checkout - functionRef: - refName: notify-lender-for-checkout - arguments: - bookid: ${ .book.id } - lender: ${ .lender } - end: true -functions: file://books/lending/functions.json -events: file://books/lending/events.json -``` - -
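The heart of this workflow is the book-status-decision switch state combined with the two-week sleep loop. A small Python sketch of that decision logic (the helper names are illustrative, and the hold/decline branch is collapsed into the loop):

```python
def next_state(book_status):
    """Mirror the book-status-decision switch state's data conditions."""
    if book_status == "onloan":
        return "report-status-to-lender"
    if book_status == "available":
        return "check-out-book"
    return "end"  # defaultCondition: any other status ends the workflow


def lend_book(statuses_per_check):
    """Walk the loop: re-check the book status every 'two weeks'
    (sleep-two-weeks) until it can be checked out or the workflow ends."""
    for status in statuses_per_check:  # one entry per get-book-status call
        state = next_state(status)
        if state != "report-status-to-lender":
            return state  # checked out, or ended via defaultCondition
        # Otherwise: notify the lender, wait for the hold/decline response,
        # sleep two weeks, then transition back to get-book-status.
    return "end"


outcome = lend_book(["onloan", "onloan", "available"])
```

A book that is on loan for two checks and then becomes available ends up in the check-out branch.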
- -### Filling a glass of water - -#### Description - -In this example we showcase the power of [expression functions](../specification.md#Using-Functions-For-Expression-Evaluation). -Our workflow definition is assumed to have the following data input: - -```json -{ - "counts": { - "current": 0, - "max": 10 - } -} -``` - -Our workflow simulates filling up a glass of water one "count" at a time until the "max" count is reached, which -means our glass is full. -Each time we increment the current count, the workflow checks if we need to keep refilling the glass. -If the current count reaches the max count, the workflow execution ends. -To increment the current count, the workflow invokes the "increment-current-count-function" expression function. -Its results are then merged back into the state data according to the "toStateData" property of the action data filter. - -#### Workflow Diagram - -

-Fill Glass of Water Example -

- -#### Workflow Definition - - - - - - - - - - -
JSONYAML
- -```json -{ - "id": "fillglassofwater", - "name": "fillglassofwater", - "description": "Fill glass of water workflow", - "version": "1.0.0", - "specVersion": "0.8", - "start": "check-if-full", - "functions": [ - { - "name": "increment-current-count-function", - "type": "expression", - "operation": ".counts.current += 1 | .counts.current" - } - ], - "states": [ - { - "name": "check-if-full", - "type": "switch", - "dataConditions": [ - { - "name": "need-to-fill-more", - "condition": "${ .counts.current < .counts.max }", - "transition": "add-water" - }, - { - "name": "glass-full", - "condition": ".counts.current >= .counts.max", - "end": true - } - ], - "defaultCondition": { - "end": true - } - }, - { - "name": "add-water", - "type": "operation", - "actions": [ - { - "name": "add-water", - "functionRef": "increment-current-count-function", - "actionDataFilter": { - "toStateData": ".counts.current" - } - } - ], - "transition": "check-if-full" - } - ] -}``` - - - -```yaml -id: fillglassofwater -name: fillglassofwater -description: Fill glass of water workflow -version: 1.0.0 -specVersion: "0.8" -start: check-if-full -functions: - - name: increment-current-count-function - type: expression - operation: .counts.current += 1 | .counts.current -states: - - name: check-if-full - type: switch - dataConditions: - - name: need-to-fill-more - condition: ${ .counts.current < .counts.max } - transition: add-water - - name: glass-full - condition: .counts.current >= .counts.max - end: true - defaultCondition: - end: true - - name: add-water - type: operation - actions: - - name: add-water - functionRef: increment-current-count-function - actionDataFilter: - toStateData: .counts.current - transition: check-if-full -``` - -
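The loop between check-if-full and add-water, driven by the jq expression `.counts.current += 1 | .counts.current`, can be simulated directly. This sketch models only the control flow, not a real jq evaluation:

```python
def fill_glass(state):
    """Loop between check-if-full and add-water until the glass is full."""
    pours = 0
    while state["counts"]["current"] < state["counts"]["max"]:
        # add-water invokes the expression function; the result is merged
        # back into state data via the action data filter's toStateData.
        state["counts"]["current"] += 1
        pours += 1
    return state, pours


state, pours = fill_glass({"counts": {"current": 0, "max": 10}})
```

Starting from the example data input, the add-water action runs exactly ten times before the glass-full condition ends the workflow.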
- -### Online Food Ordering - -#### Description - -In this example we want to create an online food ordering workflow. The image below outlines the workflow -structure and the available services: - -

-Online Food Ordering Structure -

- -Our workflow starts with the "Place Order" [Subflow](../specification.md#SubFlow-Action), which is responsible -for sending the received order to the requested restaurant and obtaining the estimated order ETA. -We then wait until the ETA deadline, at which point our workflow goes into the "Deliver Order" SubFlow, responsible -for dispatching a courier and sending them off to pick up the order. Once the order is picked up, the courier needs to deliver it to the customer. -After the order has been delivered to the customer, our workflow needs to charge the customer. - -Our workflow needs to communicate with three services during its execution: the Order, Delivery, and -Payment services. - -For the sake of the example, we assume that our workflow can communicate with the Order and Delivery services via REST and with the Payment service via gRPC. -Let's start by defining an example CloudEvent which triggers an instance of our workflow. -This event can be sent by a web UI, for example, or be pushed onto a Kafka/MQTT topic to start our order workflow. - -```json -{ - "specversion": "1.0", - "type": "org.orders", - "source": "/orders/", - "subject": "Food Order", - "id": "A234-1234-1234", - "time": "2021-03-05T17:31:00Z", - "orderid": "ORDER-12345", - "data": { - "id": "ORDER-12345", - "customerId": "CUSTOMER-12345", - "status": [], - "order": { - "restaurantId": "RESTAURANT-54321", - "items": [ - { - "itemId": "ITEM-8765", - "amount": 1, - "addons": "" - } - ] - }, - "delivery":{ - "address": "1234 MyStreet, MyCountry", - "type": "contactless", - "requestedTime": "ASAP", - "location": "Front door", - "instructions": "" - } - } -} -``` - -Note the `orderid` CloudEvent context attribute, which contains the unique ID of the order specified in this event. 
[Event correlation](../specification.md#Correlation-Definition) is done against CE context attributes, and as such, to be able -to correlate multiple order events to the same order id, it needs to be part of the CE context attributes and -not the event data (payload). - -Now let's start defining our workflow. For the sake of this example, let's define our function and event definitions -as separate YAML files (and then reference them inside our workflow definition). This is useful in cases -where you want to reuse them between multiple workflow definitions. - -#### Workflow Event Definition - -``` yaml -events: -- name: Food Order Event - source: "/orders/" - type: org.orders - correlation: - - contextAttributeName: orderid -- name: ETA Deadline Event - source: "/orderseta" - type: org.orders.eta - correlation: - - contextAttributeName: orderid -- name: Order Picked Up Event - source: "/orderspickup" - type: org.orders.delivery - correlation: - - contextAttributeName: orderid -- name: Order Delivered Event - source: "/orderdelivery" - type: org.orders.delivery - correlation: - - contextAttributeName: orderid -``` - -#### Workflow Function Definition - -``` yaml -functions: -- name: Submit Order Function - operation: http://myorderservice.org/orders.json#submit -- name: Get Order ETA Function - operation: http://myorderservice.org/orders.json#orderETA -- name: Dispatch Courier Function - operation: http://mydeliveryservice.org/deliveries.json#dispatch -- name: Deliver Order Function - operation: http://mydeliveryservice.org/deliveries.json#deliver -- name: Charge For Order Function - operation: http://mypaymentservice.org/payments.proto#PaymentService#ChargeUser -``` - -#### Main Workflow Definition - -With the function and event definitions in place we can now start writing our main workflow definition: - -```yaml -id: foodorderworkflow -name: Food Order Workflow -version: '1.0.0' -specVersion: '0.8' -start: Place Order -functions: file://orderfunctions.yml -events: file://orderevents.yml -states: -- name: Place Order - type: operation - actions: - - subFlowRef: placeorderworkflow - transition: Wait for ETA Deadline -- name: Wait for ETA Deadline - type: event - onEvents: - - eventRefs: - - ETA Deadline Event - eventDataFilter: - data: "${ .results.status }" - toStateData: "${ .status }" - transition: Deliver Order -- name: Deliver Order - type: operation - actions: - - subFlowRef: deliverorderworkflow - transition: Charge For Order -- name: Charge For Order - type: operation - actions: - - functionRef: - refName: Charge For Order Function - arguments: - order: "${ .order.id }" - actionDataFilter: - results: "${ .outcome.status }" - toStateData: "${ .status }" - stateDataFilter: - output: '${ . | {"orderid": .id, "orderstatus": .status} | .orderstatus += ["Order - Completed"] }' - end: true -``` - -With this in place we can start defining our sub-workflows: - -#### Place Order Sub-Workflow - -```yaml -id: placeorderworkflow -name: Place Order Workflow -version: '1.0.0' -specVersion: '0.8' -start: Submit Order -states: -- name: Submit Order - type: event - onEvents: - - eventRefs: - - Food Order Event - actions: - - functionRef: - refName: Submit Order Function - arguments: - order: "${ .order }" - actionDataFilter: - results: "${ .results.status }" - toStateData: "${ .status }" - - functionRef: - refName: Get Order ETA Function - arguments: - customer: "${ .customerId }" - restaurantid: "${ .order.restaurantId }" - delivery: "${ .delivery }" - actionDataFilter: - results: "${ .results.status }" - toStateData: "${ .status }" - end: true -``` - -#### Deliver Order Sub-Workflow - -```yaml -id: deliverorderworkflow -name: Deliver Order Workflow -version: '1.0.0' -specVersion: '0.8' -start: Dispatch Courier -states: -- name: Dispatch Courier - type: operation - actions: - - functionRef: Dispatch Courier Function - transition: Wait for Order Pickup -- name: Wait for Order Pickup - type: event - onEvents: - - eventRefs: - - Order Picked Up Event - eventDataFilter: - data: "${ .data.status }" - toStateData: "${ .status }" - actions: - - functionRef: Deliver Order Function - transition: Wait for Delivery Confirmation -- name: Wait for Delivery Confirmation - type: event - onEvents: - - eventRefs: - - Order Delivered Event - eventDataFilter: - data: "${ .data.status }" - toStateData: "${ .status }" - end: true -``` - -#### Workflow Results - -For the example order event, the workflow output for a successful completion might look like this: - -```json -{ - "orderid": "ORDER-12345", - "orderstatus": [ - "Order Submitted", - "Order ETA Received", - "Order Picked up", - "Order Delivered", - "Order Charged", - "Order Completed" - ] -} -``` - -### Continuing as a new Execution - -#### Description - -Some runtimes on which we run our workflows impose quotas, such as a maximum execution duration or a maximum number of consumed events. We can use the Serverless Workflow "continueAs" functionality to stop the current workflow execution and start another one (of the same or a different type). This is very useful in cases where we have to ensure we don't exceed the quotas imposed on a single workflow execution. - -This example assumes that the runtime we are using has a quota set to a maximum of one thousand consumed events per single workflow execution. -Our sample workflow consumes a single customer event at a time and invokes the `notify-customer-function` function. -Note that we do not set a workflow `workflowExecTimeout`, so we intend to have a long-running workflow. However, because of the runtime restriction, in this case we would run into the event consumption limit, and our workflow would be forced to terminate. We can fix this problem by using [`continueAs`](../specification.md#Continuing-as-a-new-Execution), which lets us stop once we reach the given limit and continue our workflow execution as a new run. 
- -We assume that our workflow input has the runtime-imposed quota: - -```json -{ - "quota": { - "maxConsumedEvents": 1000 - } -} -``` - -#### Workflow Diagram - -

-ContinueAs Example -

- -#### Workflow Definition - - - - - - - - - - -
JSONYAML
- -```json -{ - "id": "notifycustomerworkflow", - "name": "notifycustomerworkflow", - "description": "Notify Customer", - "version": "1.0.0", - "specVersion": "0.8", - "start": "wait-for-customer-event", - "states": [ - { - "name": "wait-for-customer-event", - "type": "event", - "onEvents": [ - { - "eventRefs": [ - "customer-event" - ], - "eventDataFilter": { - "data": "${ .customerId }", - "toStateData": "${ .eventCustomerId }" - }, - "actions": [ - { - "name": "notify-customer-function", - "functionRef": { - "refName": "notify-customer-function", - "arguments": { - "customerId": "${ .eventCustomerId }" - } - } - } - ] - } - ], - "stateDataFilter": { - "output": "${ .count = .count + 1 }" - }, - "transition": "check-event-quota" - }, - { - "name": "check-event-quota", - "type": "switch", - "dataConditions": [ - { - "name": "ready", - "condition": "${ try(.customerCount) != null and .customerCount > .quota.maxConsumedEvents }", - "end": { - "continueAs": { - "workflowId": "notifycustomerworkflow", - "version": "1.0.0", - "data": "${ del(.customerCount) }" - } - } - } - ], - "defaultCondition": { - "transition": "wait-for-customer-event" - } - } - ], - "events": [ - { - "name": "customer-event", - "type": "org.events.customerEvent", - "source": "customerSource" - } - ], - "functions": [ - { - "name": "notify-customer-function", - "operation": "http://myapis.org/customerapis.json#notifyCustomer" - } - ] -}``` - - - -```yaml -id: notifycustomerworkflow -name: notifycustomerworkflow -description: Notify Customer -version: 1.0.0 -specVersion: "0.8" -start: wait-for-customer-event -states: - - name: wait-for-customer-event - type: event - onEvents: - - eventRefs: - - customer-event - eventDataFilter: - data: ${ .customerId } - toStateData: ${ .eventCustomerId } - actions: - - name: notify-customer-function - functionRef: - refName: notify-customer-function - arguments: - customerId: ${ .eventCustomerId } - stateDataFilter: - output: ${ .count = .count + 1 } - transition: 
check-event-quota - - name: check-event-quota - type: switch - dataConditions: - - name: ready - condition: ${ try(.customerCount) != null and .customerCount > - .quota.maxConsumedEvents } - end: - continueAs: - workflowId: notifycustomerworkflow - version: 1.0.0 - data: ${ del(.customerCount) } - defaultCondition: - transition: wait-for-customer-event -events: - - name: customer-event - type: org.events.customerEvent - source: customerSource -functions: - - name: notify-customer-function - operation: http://myapis.org/customerapis.json#notifyCustomer -``` - -
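The effect of `continueAs` under a per-execution event quota can be illustrated with a short simulation; the run-splitting below is an illustration of the idea, not runtime behavior prescribed by the specification:

```python
def split_into_runs(events, max_consumed_events):
    """Partition an event stream into successive workflow executions,
    starting a fresh run (continueAs) whenever the quota is reached."""
    runs, current = [], []
    for event in events:
        current.append(event)  # wait-for-customer-event consumes one event
        if len(current) >= max_consumed_events:
            runs.append(current)  # end this execution via continueAs
            current = []          # the new execution starts with fresh state
    if current:
        runs.append(current)
    return runs


runs = split_into_runs(list(range(2500)), 1000)
```

With 2500 customer events and a quota of 1000, the work is spread over three executions instead of one workflow hitting the limit and terminating.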
- -### Process Transactions - -#### Description - -This example shows how we can loop through a data input array (in parallel) and decide which action to perform -depending on the value of each element in the input array. -We use the [action definition](../specification.md#Action-Definition) `condition` property to perform the action that -is best suited for the transaction value. -Note that in this example we set the "large transaction amount" as a [workflow constant](../specification.md#Workflow-Constants). -There are other ways to set -this value, for example passing it as [workflow data input](../specification.md#Workflow-Data-Input) -or, if this data is sensitive, using [workflow secrets](../specification.md#Workflow-Secrets). - -For the example, we assume the following workflow data input: - -```json -{ - "customer": { - "id": "abc123", - "name": "John Doe", - "transactions": [1000, 400, 60, 7000, 12000, 250] - } -} -``` - -We use the [ForEach workflow state](../specification.md#ForEach-State) to iterate through customer transactions (in parallel) and -decide which activity to perform based on the transaction value. - -#### Workflow Definition - - - - - - - - - - -
- -```json -{ - "id": "customerbankingtransactions", - "name": "customerbankingtransactions", - "description": "Customer Banking Transactions Workflow", - "version": "1.0.0", - "specVersion": "0.8", - "autoRetries": true, - "constants": { - "largetxamount": 5000 - }, - "states": [ - { - "name": "process-transactions", - "type": "foreach", - "inputCollection": "${ .customer.transactions }", - "iterationParam": "${ .tx }", - "actions": [ - { - "name": "process-larger-transaction", - "functionRef": "banking-service-larger-tx", - "condition": "${ .tx >= $CONST.largetxamount }" - }, - { - "name": "process-smaller-transaction", - "functionRef": "banking-service-smaller-tx", - "condition": "${ .tx < $CONST.largetxamount }" - } - ], - "end": true - } - ], - "functions": [ - { - "name": "banking-service-larger-tx", - "type": "asyncapi", - "operation": "banking.yaml#largerTransation" - }, - { - "name": "banking-service-smaller-tx", - "type": "asyncapi", - "operation": "banking.yaml#smallerTransation" - } - ] -}``` - - - -```yaml -id: customerbankingtransactions -name: customerbankingtransactions -description: Customer Banking Transactions Workflow -version: 1.0.0 -specVersion: "0.8" -autoRetries: true -constants: - largetxamount: 5000 -states: - - name: process-transactions - type: foreach - inputCollection: ${ .customer.transactions } - iterationParam: ${ .tx } - actions: - - name: process-larger-transaction - functionRef: banking-service-larger-tx - condition: ${ .tx >= $CONST.largetxamount } - - name: process-smaller-transaction - functionRef: banking-service-smaller-tx - condition: ${ .tx < $CONST.largetxamount } - end: true -functions: - - name: banking-service-larger-tx - type: asyncapi - operation: banking.yaml#largerTransation - - name: banking-service-smaller-tx - type: asyncapi - operation: banking.yaml#smallerTransation -``` - -
diff --git a/examples/README_TEMPLATE.md b/examples/README_TEMPLATE.md deleted file mode 100644 index cdacb187..00000000 --- a/examples/README_TEMPLATE.md +++ /dev/null @@ -1,1828 +0,0 @@ -# Examples - -Provides Serverless Workflow language examples - -## Table of Contents - -- [Hello World](#Hello-World-Example) -- [Greeting](#Greeting-Example) -- [Event-based greeting (Event State)](#Event-Based-Greeting-Example) -- [Solving Math Problems (ForEach state)](#Solving-Math-Problems-Example) -- [Parallel Execution](#Parallel-Execution-Example) -- [Async Function Invocation](#Async-Function-Invocation-Example) -- [Async SubFlow Invocation](#Async-SubFlow-Invocation-Example) -- [Event Based Transitions (Event-based Switch)](#Event-Based-Transitions-Example) -- [Applicant Request Decision (Data-based Switch + SubFlows)](#Applicant-Request-Decision-Example) -- [Provision Orders (Error Handling)](#Provision-Orders-Example) -- [Monitor Job for completion (Polling)](#Monitor-Job-Example) -- [Send CloudEvent on Workflow Completion](#Send-CloudEvent-On-Workflow-Completion-Example) -- [Monitor Patient Vital Signs (Event state)](#Monitor-Patient-Vital-Signs-Example) -- [Finalize College Application (Event state)](#Finalize-College-Application-Example) -- [Perform Customer Credit Check (Callback state)](#Perform-Customer-Credit-Check-Example) -- [Handle Car Auction Bids (Scheduled start Event state)](#Handle-Car-Auction-Bids-Example) -- [Check Inbox Periodically (Cron-based Workflow start)](#Check-Inbox-Periodically) -- [Event-based service invocation (Event triggered actions)](#Event-Based-Service-Invocation) -- [Reusing Function and Event Definitions](#Reusing-Function-And-Event-Definitions) -- [New Patient Onboarding (Error checking and Retries)](#New-Patient-Onboarding) -- [Purchase order deadline (ExecTimeout)](#Purchase-order-deadline) -- [Accumulate room readings and create timely reports (ExecTimeout and KeepActive)](#Accumulate-room-readings) -- [Car vitals checks 
(SubFlow Repeat)](#Car-Vitals-Checks) -- [Book Lending Workflow](#Book-Lending) -- [Filling a glass of water (Expression functions)](#Filling-a-glass-of-water) -- [Online Food Ordering](#Online-Food-Ordering) -- [Continuing as a new Execution](#Continuing-as-a-new-Execution) -- [Process Transactions (Foreach State with conditions)](#Process-Transactions) - -### Hello World Example - -#### Description - -In this simple example we use an [Inject State](../specification.md#Inject-State) to inject -`Hello World` in the states data (as the value of the 'result' property). -After the state execution completes, since it is an end state, its data output becomes the workflow -data output, which is: - -```json -{ - "result": "Hello World" -} -``` - -#### Workflow Diagram - -

-Hello World Example -

- -#### Workflow Definition - - - - - - - - - - -
- -```json -``` - - - -```yaml -``` - -
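The definition cells in this template are intentionally left blank. As an illustrative sketch only (all names and values assumed, following the v0.8 DSL), such an inject-state workflow could look like:

```yaml
id: helloworld
version: '1.0.0'
specVersion: '0.8'
name: Hello World Workflow
description: Inject Hello World
start: Hello State
states:
  - name: Hello State
    type: inject
    # The injected data becomes the state data; since this is an end state,
    # it also becomes the workflow data output.
    data:
      result: Hello World
    end: true
```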

### Greeting Example

#### Description

This example shows a single [Operation State](../specification.md#operation-state) with one action that calls the "greeting" function.
The workflow data input is assumed to be the name of the person to greet:

```json
{
  "person": {
    "name": "John"
  }
}
```

The result of the action is assumed to be the greeting for the provided person's name:

```json
{
  "greeting": "Welcome to Serverless Workflow, John!"
}
```

This is added to the state data and becomes the workflow data output.

#### Workflow Diagram


-Greeting Example -

- -#### Workflow Definition - - - - - - - - - - -
- -```json -``` - - - -```yaml -``` - -
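The definition cells above are blank in this template. A hedged sketch of what such an operation-state workflow might look like (the function name, operation URI, and arguments are all illustrative assumptions):

```yaml
id: greeting
version: '1.0.0'
specVersion: '0.8'
name: Greeting Workflow
start: Greet
functions:
  - name: greetingFunction
    # Illustrative OpenAPI operation reference
    operation: file://myapis/greetingapis.json#greeting
states:
  - name: Greet
    type: operation
    actions:
      - name: Greet Action
        functionRef:
          refName: greetingFunction
          arguments:
            name: "${ .person.name }"
        # Select only the greeting from the function results
        actionDataFilter:
          results: "${ .greeting }"
    end: true
```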

### Event Based Greeting Example

#### Description

This example shows a single [Event State](../specification.md#event-state) with one action that calls the "greeting" function.
The event state consumes cloud events of type "greetingEventType". When an event with this type
is consumed, the Event state performs a single action that calls the defined "greeting" function.

For the sake of the example we assume that the cloud event we will consume has the format:

```json
{
  "specversion" : "1.0",
  "type" : "greetingEventType",
  "source" : "greetingEventSource",
  "data" : {
    "greet": {
      "name": "John"
    }
  }
}
```

The result of the action is assumed to be the full greeting for the provided person's name:

```json
{
  "payload": {
    "greeting": "Welcome to Serverless Workflow, John!"
  }
}
```

Note that in the workflow definition you can see two filters defined. The event data filter defined inside the consume element:

```json
{
  "eventDataFilter": {
    "data": "${ .data.greet } "
  }
}
```

which is triggered when the greeting event is consumed. It extracts the "data.greet" property of the event data (payload) and
merges it with the state data.

The second, a state data filter, is defined on the event state itself:

```json
{
  "stateDataFilter": {
    "output": "${ .payload.greeting }"
  }
}
```

filters what is selected to be the state data output, which then becomes the workflow data output (as it is an end state):

```text
  "Welcome to Serverless Workflow, John!"
```

#### Workflow Diagram


-Event Based Greeting Example -

- -#### Workflow Definition - - - - - - - - - - -
- -```json -``` - - - -```yaml -``` - -
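Since the template's definition cells are empty, here is a hedged sketch of such an event-state workflow, wiring together the two filters described above (event name, function operation, and argument names are illustrative):

```yaml
id: eventbasedgreeting
version: '1.0.0'
specVersion: '0.8'
name: Event Based Greeting Workflow
start: Greet
events:
  - name: GreetingEvent
    type: greetingEventType
    source: greetingEventSource
functions:
  - name: greetingFunction
    operation: file://myapis/greetingapis.json#greeting
states:
  - name: Greet
    type: event
    onEvents:
      - eventRefs:
          - GreetingEvent
        # Merge only the "data.greet" portion of the event payload into state data
        eventDataFilter:
          data: "${ .data.greet }"
        actions:
          - functionRef:
              refName: greetingFunction
              arguments:
                name: "${ .greet.name }"
    # Select the greeting as the state (and workflow) data output
    stateDataFilter:
      output: "${ .payload.greeting }"
    end: true
```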

### Solving Math Problems Example

#### Description

In this example we show how to iterate over data using the [ForEach State](../specification.md#foreach-state).
The state will iterate over a collection of simple math expressions which are
passed in as the workflow data input:

```json
{
  "expressions": ["2+2", "4-1", "10x3", "20/2"]
}
```

The ForEach state will execute a single defined operation state for each math expression. The operation
state contains an action which calls a serverless function that actually solves the expression
and returns its result.

Results of all math expressions are accumulated into the data output of the ForEach state, which becomes the final
result of the workflow execution.

#### Workflow Diagram


-Looping Example -

- -#### Workflow Definition - - - - - - - - - - -
- -```json -``` - - - -```yaml -``` - -
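With the template cells empty, a minimal illustrative sketch of the ForEach iteration (the function name and operation URI are assumptions):

```yaml
id: solvemathproblems
version: '1.0.0'
specVersion: '0.8'
name: Solve Math Problems Workflow
start: Solve
functions:
  - name: solveMathExpressionFunction
    operation: http://myapis.org/mathapis.json#solveExpression
states:
  - name: Solve
    type: foreach
    # Iterate the input expressions in parallel
    inputCollection: "${ .expressions }"
    iterationParam: singleexpression
    outputCollection: "${ .results }"
    actions:
      - functionRef:
          refName: solveMathExpressionFunction
          arguments:
            expression: "${ .singleexpression }"
    # Accumulated results become the workflow data output
    stateDataFilter:
      output: "${ .results }"
    end: true
```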

### Parallel Execution Example

#### Description

This example uses a [Parallel State](../specification.md#parallel-state) to execute two branches (simple wait states) at the same time.
The completionType property is set to "allOf", which means the parallel state has to wait for both branches
to finish execution before it can transition (end workflow execution in this case, as it is an end state).

#### Workflow Diagram


-Parallel Example -

- -#### Workflow Definition - - - - - - - - - - -
- -```json -``` - - - -```yaml -``` - -
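As the definition cells are blank in this template, a state-definition sketch of the parallel state described above (workflow header omitted; branch and sub-workflow names follow the surrounding prose):

```yaml
states:
  - name: ParallelExec
    type: parallel
    # Wait for ALL branches to complete before transitioning
    completionType: allOf
    branches:
      - name: ShortDelayBranch
        actions:
          - subFlowRef: shortdelayworkflowid
      - name: LongDelayBranch
        actions:
          - subFlowRef: longdelayworkflowid
    end: true
```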

We assume that the two referenced workflows, namely `shortdelayworkflowid` and `longdelayworkflowid`, both include a single delay state,
with the `shortdelayworkflowid` workflow delay state defining its `timeDelay` property to be shorter than that of the `longdelayworkflowid` workflow's
delay state.

### Async Function Invocation Example

#### Description

This example uses an [Operation State](../specification.md#operation-state) to invoke a function asynchronously.
This function sends an email to a customer.
Async function execution is a "fire-and-forget" type of invocation. The function is invoked and workflow execution
does not wait for its results.

#### Workflow Diagram


-Async Function Example -

- -#### Workflow Definition - - - - - - - - - - -
- -```json -``` - - - -```yaml -``` - -
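The template leaves the definition blank; a hedged sketch of the async invocation, assuming v0.8's `invoke` property on the function reference (function name and operation are illustrative):

```yaml
id: sendcustomeremail
version: '1.0.0'
specVersion: '0.8'
name: Send Customer Email Workflow
start: Send Email
functions:
  - name: emailFunction
    operation: file://myapis/emailapis.json#sendEmail
states:
  - name: Send Email
    type: operation
    actions:
      - functionRef:
          refName: emailFunction
          # Fire-and-forget: do not wait for the function results
          invoke: async
          arguments:
            customer: "${ .customer }"
    end: true
```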

### Async SubFlow Invocation Example

#### Description

This example uses an [Operation State](../specification.md#operation-state) to invoke a [SubFlow](../specification.md#Subflow-Action) asynchronously.
This SubFlow is responsible for performing some customer business logic.
Async SubFlow invocation is a "fire-and-forget" type of invocation. The SubFlow is invoked and workflow execution
does not wait for its results. In addition, we specify that the SubFlow should be allowed to continue its execution
even if the parent workflow completes its own execution. This is done by setting the action's `onParentComplete`
property to `continue`.

#### Workflow Diagram


-Async SubFlow Example -

- -#### Workflow Definition - - - - - - - - - - -
- -```json -``` - - - -```yaml -``` - -
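With the definition cells empty, a state-definition sketch of the async SubFlow invocation (workflow header omitted; the invoked workflow id matches the prose below):

```yaml
states:
  - name: Onboard Customer
    type: operation
    actions:
      - name: Onboard
        subFlowRef:
          workflowId: customeronboardingworkflow
          version: '1.0.0'
          # Fire-and-forget invocation...
          invoke: async
          # ...and let the SubFlow keep running even after the parent ends
          onParentComplete: continue
    end: true
```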

For the sake of the example, the definition of the "customeronboardingworkflow" workflow invoked as a SubFlow
is not shown.

### Event Based Transitions Example

#### Description

In this example we use an Event-based [Switch State](../specification.md#switch-state) to wait for the arrival
of the "VisaApproved" or "VisaRejected" Cloud Events. Depending on which type of event happens,
the workflow performs a different transition. If none of the events arrive in the defined 1-hour timeout
period, the workflow transitions to the "HandleNoVisaDecision" state.

#### Workflow Diagram


-Event Based Switch Example -

- -#### Workflow Definition - - - - - - - - - - -
- -```json -``` - - - -```yaml -``` - -
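The definition cells are blank in this template; a fragment sketch of the event-based switch (event and target state definitions elided, names illustrative):

```yaml
states:
  - name: CheckVisaStatus
    type: switch
    eventConditions:
      - eventRef: visaApprovedEvent
        transition: HandleApprovedVisa
      - eventRef: visaRejectedEvent
        transition: HandleRejectedVisa
    # If neither event arrives within an hour, take the default path
    timeouts:
      eventTimeout: PT1H
    defaultCondition:
      transition: HandleNoVisaDecision
```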

### Applicant Request Decision Example

#### Description

This example shows off the [Switch State](../specification.md#switch-state) and the subflow action. The workflow is started with application information data as input:

```json
{
  "applicant": {
    "fname": "John",
    "lname": "Stockton",
    "age": 22,
    "email": "js@something.com"
  }
}
```

We use the switch state with two conditions to determine if the application should be made based on the applicant's age.
If the applicant's age is over 18, we start the application (subflow action). Otherwise, the workflow notifies the
applicant of the rejection.

#### Workflow Diagram


-Switch State Example -

- -#### Workflow Definition - - - - - - - - - - -
- -```json -``` - - - -```yaml -``` - -
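As the definition cells are empty, a fragment sketch of the data-based switch described above (workflow header and rejection state elided; condition expressions and workflow ids are assumptions):

```yaml
states:
  - name: CheckApplication
    type: switch
    dataConditions:
      - condition: "${ .applicant.age >= 18 }"
        transition: StartApplication
      - condition: "${ .applicant.age < 18 }"
        transition: RejectApplication
    defaultCondition:
      transition: RejectApplication
  - name: StartApplication
    type: operation
    actions:
      # Illustrative sub-workflow id
      - subFlowRef: startApplicationWorkflowId
    end: true
```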

### Provision Orders Example

#### Description

In this example we show off the state's error handling capability. The workflow data input that's passed in contains
missing order information that causes the function in the "ProvisionOrder" state to throw a runtime exception. With the "onErrors" definition we
can transition the workflow to different error handling states. Each type of error
in this example is handled by simple delay states. If no errors are encountered, the workflow can transition to the "ApplyOrder" state.

Workflow data is assumed to be:

```json
{
  "order": {
    "id": "",
    "item": "laptop",
    "quantity": "10"
  }
}
```

The data output of the workflow contains the information of the exception caught during workflow execution.

#### Workflow Diagram


-Handle Errors Example -

- -#### Workflow Definition - - - - - - - - - - -
- -```json -``` - - - -```yaml -``` - -
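With the definition cells blank, a fragment sketch of the `onErrors` handling (error names, function, and target states are illustrative assumptions):

```yaml
errors:
  - name: Missing order id
  - name: Missing order item
states:
  - name: ProvisionOrder
    type: operation
    actions:
      - functionRef:
          refName: provisionOrderFunction
          arguments:
            order: "${ .order }"
    # Route each caught error to its own handling state
    onErrors:
      - errorRef: Missing order id
        transition: MissingId
      - errorRef: Missing order item
        transition: MissingItem
    # No errors: continue normally
    transition: ApplyOrder
```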

### Monitor Job Example

#### Description

In this example we submit a job via an operation state action (serverless function call). It is assumed that it takes some time for
the submitted job to complete and that its completion can be checked via another separate serverless function call.

To check for completion, we first wait 5 seconds and then get the results of the "CheckJob" serverless function.
Depending on the results of this, we either return the results or transition back to waiting and checking the job completion.
This is done until the job submission returns "SUCCEEDED" or "FAILED" and the job submission results are reported before the workflow
finishes execution.

In case the job submission raises a runtime error, we transition to an Operation state which invokes
a sub-flow responsible for handling the job submission issue.

#### Workflow Diagram


-Job Monitoring Example -

- -#### Workflow Definition - - - - - - - - - - -
- -```json -``` - - - -```yaml -``` - -
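Since the definition cells are empty, a fragment sketch of the polling loop, assuming the v0.8 `sleep` state type (function and state names are illustrative):

```yaml
states:
  # Wait 5 seconds before each status check
  - name: WaitForCompletion
    type: sleep
    duration: PT5S
    transition: GetJobStatus
  - name: GetJobStatus
    type: operation
    actions:
      - functionRef:
          refName: checkJobStatusFunction
          arguments:
            name: "${ .jobuid }"
        actionDataFilter:
          toStateData: "${ .jobstatus }"
    transition: DetermineCompletion
  - name: DetermineCompletion
    type: switch
    dataConditions:
      - condition: '${ .jobstatus == "SUCCEEDED" }'
        transition: JobSucceeded
      - condition: '${ .jobstatus == "FAILED" }'
        transition: JobFailed
    # Still running: loop back and poll again
    defaultCondition:
      transition: WaitForCompletion
```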

### Send CloudEvent On Workflow Completion Example

#### Description

This example shows how we can produce a CloudEvent on completion of a workflow. Let's say we have the following
workflow data containing orders that need to be provisioned by our workflow:

```json
{
  "orders": [{
    "id": "123",
    "item": "laptop",
    "quantity": "10"
  },
  {
    "id": "456",
    "item": "desktop",
    "quantity": "4"
  }]
}
```

Our workflow in this example uses a ForEach state to provision the orders in parallel. The "provisionOrder" function
used is assumed to have the following results:

```json
{
  "id": "123",
  "outcome": "SUCCESS"
}
```

After orders have been provisioned, the ForEach state defines the end property, which stops workflow execution.
It defines its end definition to be of type "event", in which case a CloudEvent will be produced which can be consumed
by other orchestration workflows or other interested consumers.

Note that we define the event to be produced in the workflow's "events" property.

The data attached to the event contains the information on the orders provisioned by this workflow. So the produced
CloudEvent upon completion of the workflow could look like:

```json
{
  "specversion" : "1.0",
  "type" : "provisionCompleteType",
  "datacontenttype" : "application/json",
  ...
  "data": {
    "provisionedOrders": [
      {
        "id": "123",
        "outcome": "SUCCESS"
      },
      {
        "id": "456",
        "outcome": "FAILURE"
      }
    ]
  }
}
```

#### Workflow Diagram


-Send CloudEvent on Workflow Completion Example -

- -#### Workflow Definition - - - - - - - - - - -
- -```json -``` - - - -```yaml -``` - -
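The definition cells above are empty; a fragment sketch of the ForEach state producing a CloudEvent on completion (workflow header and function definition elided, names illustrative):

```yaml
states:
  - name: ProvisionOrdersState
    type: foreach
    inputCollection: "${ .orders }"
    iterationParam: singleorder
    outputCollection: "${ .provisionedOrders }"
    actions:
      - functionRef:
          refName: provisionOrderFunction
          arguments:
            order: "${ .singleorder }"
    # Produce a CloudEvent carrying the provisioning results when the workflow ends
    end:
      produceEvents:
        - eventRef: provisioningCompleteEvent
          data: "${ .provisionedOrders }"
events:
  - name: provisioningCompleteEvent
    type: provisionCompleteType
    kind: produced
```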

### Monitor Patient Vital Signs Example

#### Description

In this example a hospital patient is monitored by a Vital Sign Monitoring system. This device can produce three different Cloud Events, namely
"High Body Temperature", "High Blood Pressure", and "High Respiration Rate".
Our workflow, which needs to take proper actions depending on the event the Vital Sign Monitor produces, needs to start
if any of these events occur. For each of these events a new instance of the workflow is started.

Since the hospital may include many patients that are being monitored, it is assumed that all events include a patientId context attribute in the event
message. We can use the value of this context attribute to associate the incoming events with the same patient, as well as
pass the patient id as a parameter to the functions called by event activities. Here is an example of such an event:

```json
{
  "specversion" : "1.0",
  "type" : "org.monitor.highBodyTemp",
  "source" : "monitoringSource",
  "subject" : "BodyTemperatureReading",
  "id" : "A234-1234-1234",
  "time" : "2020-01-05T17:31:00Z",
  "patientId" : "PID-12345",
  "data" : {
    "value": "98.6F"
  }
}
```

As you can see, the "patientId" context attribute of the event includes our correlation key, which is the unique
patient id. If we set it to be the correlation key in our events definition, all events that are considered must
have the matching patient id.

#### Workflow Diagram


-Monitor Patient Vital Signs Example -

- -#### Workflow Definition - - - - - - - - - - -
- -```json -``` - - - -```yaml -``` - -
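With the definition cells left blank, a fragment sketch of the event correlation described above (event names follow the prose; only the events section is shown):

```yaml
events:
  - name: HighBodyTemperature
    type: org.monitor.highBodyTemp
    source: monitoringSource
    # Correlate all consumed events on the patientId context attribute
    correlation:
      - contextAttributeName: patientId
  - name: HighBloodPressure
    type: org.monitor.highBloodPressure
    source: monitoringSource
    correlation:
      - contextAttributeName: patientId
  - name: HighRespirationRate
    type: org.monitor.highRespirationRate
    source: monitoringSource
    correlation:
      - contextAttributeName: patientId
```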

### Finalize College Application Example

#### Description

In this example our workflow is instantiated when all requirements of a college application are completed.
These requirements include a student submitting an application, the college receiving the student's SAT scores, as well
as a student recommendation letter from a former teacher.

We assume three Cloud Events "ApplicationSubmitted", "SATScoresReceived" and "RecommendationLetterReceived".
Each includes the applicant id in its "applicantId" context attribute, so we can use it to associate these events with an individual applicant.

Our workflow is instantiated and performs the actions to finalize the college application for a student only
when all three of these events have happened (in no particular order).

#### Workflow Diagram


-Finalize College Application Example -

- -#### Workflow Definition - - - - - - - - - - -
- -```json -``` - - - -```yaml -``` - -

### Perform Customer Credit Check Example

#### Description

In this example our serverless workflow needs to integrate with an external microservice to perform
a credit check. We assume that this external microservice notifies a human actor who has to make
the approval decision based on customer information. Once this decision is made, the service emits a CloudEvent which
includes the decision information as part of its payload.
The workflow waits for this callback event and then triggers workflow transitions based on the
credit check decision results.

The workflow data input is assumed to be:

```json
{
  "customer": {
    "id": "customer123",
    "name": "John Doe",
    "SSN": 123456,
    "yearlyIncome": 50000,
    "address": "123 MyLane, MyCity, MyCountry",
    "employer": "MyCompany"
  }
}
```

The callback event that our workflow will wait on is assumed to have the following formats.
For an approved credit check, for example:

```json
{
  "specversion" : "1.0",
  "type" : "creditCheckCompleteType",
  "datacontenttype" : "application/json",
  ...
  "data": {
    "creditCheck": [
      {
        "id": "customer123",
        "score": 700,
        "decision": "Approved",
        "reason": "Good credit score"
      }
    ]
  }
}
```

And for a denied credit check, for example:

```json
{
  "specversion" : "1.0",
  "type" : "creditCheckCompleteType",
  "datacontenttype" : "application/json",
  ...
  "data": {
    "creditCheck": [
      {
        "id": "customer123",
        "score": 580,
        "decision": "Denied",
        "reason": "Low credit score. Recent late payments"
      }
    ]
  }
}
```

#### Workflow Diagram


-Perform Customer Credit Check Example -

- -#### Workflow Definition - - - - - - - - - - -
- -```json -``` - - - -```yaml -``` - -
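The definition cells above are empty in this template; a fragment sketch of the callback state described in this example (function, event, and state names are illustrative assumptions):

```yaml
states:
  - name: Check Credit
    type: callback
    # Invoke the external credit check microservice...
    action:
      functionRef:
        refName: callCreditCheckMicroservice
        arguments:
          customer: "${ .customer }"
    # ...then wait for the callback CloudEvent with the decision
    eventRef: CreditCheckCompletedEvent
    timeouts:
      stateExecTimeout: PT15M
    transition: Evaluate Decision
```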

### Handle Car Auction Bids Example

#### Description

In this example our serverless workflow needs to handle bids for an online car auction. The car auction has a specific start
and end time. Bids are only allowed to be made during this time period. All bids before or after this time should not be considered.
We assume that the car auction starts at 9am UTC on March 20th 2020 and ends at 3pm UTC on March 20th 2020.

Bidding is done via an online application, and bids are received as events, which are assumed to have the following format:

```json
{
  "specversion" : "1.0",
  "type" : "carBidType",
  "datacontenttype" : "application/json",
  ...
  "data": {
    "bid": [
      {
        "carid": "car123",
        "amount": 3000,
        "bidder": {
          "id": "xyz",
          "firstName": "John",
          "lastName": "Wayne"
        }
      }
    ]
  }
}
```

#### Workflow Diagram


-Handle Car Auction Bid Example -

- -#### Workflow Definition - - - - - - - - - - -
- -```json -``` - - - -```yaml -``` - -
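With the definition cells blank, a sketch of how the scheduled start could be expressed, assuming the v0.8 `start` definition where a string schedule denotes a time interval (workflow body elided):

```yaml
id: handleCarAuctionBid
version: '1.0.0'
specVersion: '0.8'
name: Car Auction Bidding Workflow
# Only accept bid events within the auction window
start:
  stateName: StoreCarAuctionBid
  schedule: 2020-03-20T09:00:00Z/2020-03-20T15:00:00Z
```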

### Check Inbox Periodically

#### Description

In this example we show the use of the scheduled, cron-based start property. The example workflow checks the user's inbox every 15 minutes
and sends them a text message when there are important emails.

The result of the inbox service call is expected to be, for example:

```json
{
  "messages": [
    {
      "title": "Update your health benefits",
      "from": "HR",
      "priority": "high"
    },
    {
      "title": "New job candidate resume",
      "from": "Recruiting",
      "priority": "medium"
    },
    ...
  ]
}
```

#### Workflow Diagram


-Check Inbox Periodically Example -

- -#### Workflow Definition - - - - - - - - - - -
- -```json -``` - - - -```yaml -``` - -
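The definition cells above are empty; a sketch of the cron-scheduled start (the cron expression shown assumes a Quartz-style format and is illustrative):

```yaml
id: checkInbox
version: '1.0.0'
specVersion: '0.8'
name: Check Inbox Workflow
# Start a new workflow instance every 15 minutes
start:
  stateName: CheckInbox
  schedule:
    cron: 0 0/15 * * * ?
```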

### Event Based Service Invocation

#### Description

In this example we want to make a veterinary appointment for our dog Mia. The vet service can be invoked only
via an event, and its completion result, with the appointment day and time, is returned via an event as well.

This shows a common scenario especially inside container environments where some services may not be exposed via
a resource URI, but only accessible by submitting an event to the underlying container events manager.

For this example we assume that the payload of the Vet service response event includes an "appointment"
object which contains our appointment info.

This info is then filtered to become the workflow data output. It could also be used to, for example, send us an
appointment email, a text message reminder, etc.

For this example we assume that the workflow instance is started given the following workflow data input:

```json
{
  "patientInfo": {
    "name": "Mia",
    "breed": "German Shepherd",
    "age": 5,
    "reason": "Bee sting",
    "patientId": "Mia1"
  }
}
```

#### Workflow Diagram


-Vet Appointment Example -

- -#### Workflow Definition - - - - - - - - - - -
- -```json -``` - - - -```yaml -``` - -

### Reusing Function And Event Definitions

#### Description

This example shows how [function](../specification.md#Function-Definition) and [event](../specification.md#Event-Definition) definitions
can be declared independently and referenced by workflow definitions.
This is useful when you would like to reuse event and function definitions across multiple workflows. In those scenarios it allows you to make
changes/updates to these definitions in a single place without having to modify multiple workflows.

For the example we have two files, namely our "functiondefs.json" and "eventdefs.yml" (to show that they can be expressed in either JSON or YAML).
These hold our function and event definitions which can then be referenced by multiple workflows.

* functiondefs.json

```json
{
  "functions": [
    {
      "name": "checkFundsAvailability",
      "operation": "file://myapis/billingapis.json#checkFunds"
    },
    {
      "name": "sendSuccessEmail",
      "operation": "file://myapis/emailapis.json#paymentSuccess"
    },
    {
      "name": "sendInsufficientFundsEmail",
      "operation": "file://myapis/emailapis.json#paymentInsufficientFunds"
    }
  ]
}
```

* eventdefs.yml

```yaml
events:
- name: PaymentReceivedEvent
  type: payment.receive
  source: paymentEventSource
  correlation:
  - contextAttributeName: accountId
- name: ConfirmationCompletedEvent
  type: payment.confirmation

```

In our workflow definition we can then reference these files rather than defining functions and events in-line.

#### Workflow Diagram


-Reusing Function and Event Definitions Example -

- -#### Workflow Definitions - - - - - - - - - - -
- -```json -``` - - - -```yaml -``` - -

### New Patient Onboarding

#### Description

In this example we want to use a workflow to onboard a new patient (at a hospital for example).
To onboard a patient our workflow is invoked via a "NewPatientEvent" event. This event's payload contains the
patient information, for example:

```json
{
  "name": "John",
  "condition": "chest pains"
}
```

When this event is received, we want to create a new workflow instance and invoke three services
sequentially. The first service we want to invoke is responsible for storing patient information,
the second assigns a doctor to the patient given the patient's condition, and the third assigns a
new appointment with the patient and the assigned doctor.

In addition, in this example we need to handle a possible situation where one or all of the needed
services are not available (the server returns an http 503 (Service Unavailable) error). If our workflow
catches this error, we want to try to recover from this by issuing retries for the particular
service invocation that caused the error up to 10 times with three seconds in-between retries.
If the retries are not successful, we want to just gracefully end workflow execution.

#### Workflow Diagram


-Patient Onboarding Example -

- -#### Workflow Definition - - - - - - - - - - -
- -```json -``` - - - -```yaml -``` - -
- -#### Workflow Demo - -This example is used in our Serverless Workflow Hands-on series videos [#1](https://www.youtube.com/watch?v=0gmpuGLP-_o) -and [#2](https://www.youtube.com/watch?v=6A6OYp5nygg). - -### Purchase order deadline - -#### Description - -In this example our workflow processes purchase orders. An order event triggers instance of our workflow. -To complete the created order, our workflow must first wait for an order confirmation event (correlated to the -order id), and then wait for the shipment sent event (also correlated to initial order id). -We do not want to place an exact timeout limit for waiting for the confirmation and shipment events, -as this might take a different amount of time depending on the size of the order. However we do have the requirement -that a total amount of time for the order to be confirmed, once its created, is 30 days. -If the created order is not completed within 30 days it needs to be automatically closed. - -This example shows the use of the workflow [execTimeout definition](../specification.md#ExecTimeout-Definition). - -#### Workflow Diagram - -

-Purchase Order Deadline Example -

- -#### Workflow Definition - - - - - - - - - - -
- -```json -``` - - - -```yaml -``` - -
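The definition cells are blank in this template. Note that in the v0.8 DSL the overall execution deadline described above is expressed via the `timeouts` property rather than the older `execTimeout` name the text links to; a hedged fragment sketch (the `CancelOrder` state name is an assumption):

```yaml
timeouts:
  workflowExecTimeout:
    # Close the order automatically if not completed within 30 days
    duration: P30D
    # State to run before the workflow instance is terminated
    runBefore: CancelOrder
```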

### Accumulate room readings

#### Description

In this example we have two IoT sensors for each room in our house. One reads temperature values
and the other humidity values of each room. We get these measurements for each of our rooms
as CloudEvents. We can correlate events sent by our sensors by the room they are in.

For the example we want to accumulate the temperature and humidity values for each room and send hourly reports
to the home owner.

**Note:** In this example each room's measurements will be accumulated by a single workflow instance per room.
Once we receive events for 1 hour (per room), each of the room-based workflow instances will create the report. Events
consumed after the report is created will trigger a new instance of our workflow (again, per room), accumulate
the data for an hour, send the report, and so on.

#### Workflow Diagram


-Accumulate Room Readings Example -

- -#### Workflow Definition - - - - - - - - - - -
- -```json -``` - - - -```yaml -``` - -
- -### Car Vitals Checks - -#### Description - -In this example we need to check car vital signs while our car is driving. -The workflow should start when we receive the "CarTurnedOnEvent" event and stop when the "CarTurnedOffEvent" event is consumed. -While the car is driving our workflow should repeatedly check the vitals every 1 second. - -For this example we use the workflow [SubFlow](../specification.md#SubFlow-Action) actions to perform the vital checks. - -#### Workflow Diagram - -

-Check Car Vitals Example -


#### Workflow Definition

We first define our top-level workflow for this example:

- -```json -``` - - - -```yaml -``` - -
- -And then our reusable sub-workflow which performs the checking of our car vitals: - - - - - - - - - - - -

```json
```

```yaml
```

```yaml
id: vitalscheck
name: Car Vitals Check
version: '1.0.0'
specVersion: '0.8'
start: CheckVitals
states:
  - name: CheckVitals
    type: operation
    actions:
      - functionRef: checkTirePressure
      - functionRef: checkOilPressure
      - functionRef: checkCoolantLevel
      - functionRef: checkBattery
    end:
      produceEvents:
        - eventRef: DisplayChecksOnDashboard
          data: "${ .evaluations }"
functions:
  - name: checkTirePressure
    operation: mycarservices.json#checktirepressure
  - name: checkOilPressure
    operation: mycarservices.json#checkoilpressure
  - name: checkCoolantLevel
    operation: mycarservices.json#checkcoolantlevel
  - name: checkBattery
    operation: mycarservices.json#checkbattery
```


### Book Lending

#### Description

In this example we want to create a book lending workflow. The workflow starts when a lender
submits a book lending request (via event "Book Lending Request Event").
The workflow describes our business logic around lending a book, from checking its current availability,
to waiting on the lender's response if the book is currently not available, to checking out the book and notifying
the lender.

This example expects the "Book Lending Request Event" event to have a payload such as:

```json
{
  "book": {
    "title": " ... ",
    "id": " ... "
  },
  "lender": {
    "name": "John Doe",
    "address": " ... ",
    "phone": " ... "
  }
}
```

where the "book" property defines the book to be lent out, and the "lender" property provides info
about the person wanting to lend the book.

For the sake of the example we assume the functions and event definitions are defined in separate JSON files.

#### Workflow Diagram


-Book Lending Example -

- -#### Workflow Definition - - - - - - - - - - -
- -```json -``` - - - -```yaml -``` - -

### Filling a glass of water

#### Description

In this example we showcase the power of [expression functions](../specification.md#Using-Functions-For-Expression-Evaluation).
Our workflow definition is assumed to have the following data input:

```json
{
  "counts": {
    "current": 0,
    "max": 10
  }
}
```

Our workflow simulates filling up a glass of water one "count" at a time until the "max" count is reached, which
represents a full glass.
Each time we increment the current count, the workflow checks if we need to keep refilling the glass.
If the current count reaches the max count, the workflow execution ends.
To increment the current count, the workflow invokes the "IncrementCurrent" expression function.
Its results are then merged back into the state data according to the "toStateData" property of the action data filter.

#### Workflow Diagram


-Fill Glass of Water Example -

- -#### Workflow Definition - - - - - - - - - - -
- -```json -``` - - - -```yaml -``` - -
- -### Online Food Ordering - -#### Description - -In this example we want to create an online food ordering workflow. The below image outlines the workflow -structure and the available services: - -

-Online Food Ordering Structure -

- -Our workflow starts with the "Place Order" [Subflow](../specification.md#SubFlow-Action), which is responsible -to send the received order to the requested restaurant and the estimated order ETA. -We then wait for the ETA time when our workflow should go into the "Deliver Order" SubFlow, responsible -for dispatching a Courier and sending her/him off to pick up the order. Once the order is picked up, the Courier needs to deliver the order to the customer. -After the order has been delivered to the customer, our workflow needs to charge the customer. - -Our workflow needs to communicate with three services during its execution, namely the Order, Delivery, and -the Payment services. - -For the sake of the example, we assume that our workflow can communicate to the Order and Delivery services via REST and the Payment service via gRPC. -Let's start by defining an example CloudEvent which triggers an instance of our workflow. -This event can be sent by a web UI, for example, or be pushed onto a Kafka/MQTT topic to start our order workflow. - -```json -{ - "specversion": "1.0", - "type": "org.orders", - "source": "/orders/", - "subject": "Food Order", - "id": "A234-1234-1234", - "time": "2021-03-05T17:31:00Z", - "orderid": "ORDER-12345", - "data": { - "id": "ORDER-12345", - "customerId": "CUSTOMER-12345", - "status": [], - "order": { - "restaurantId": "RESTAURANT-54321", - "items": [ - { - "itemId": "ITEM-8765", - "amount": 1, - "addons": "" - } - ] - }, - "delivery":{ - "address": "1234 MyStreet, MyCountry", - "type": "contactless", - "requestedTime": "ASAP", - "location": "Front door", - "instructions": "" - } - } -} -``` - -Note the `orderid` CloudEvent context attribute, which contains the unique ID of the order specified in this event. 
[Event correlation](../specification.md#Correlation-Definition) is done against CE context attributes, and as such, to be able
to correlate multiple order events to the same order id, it needs to be part of the CE context attributes and
not of its data (payload).

Now let's start defining our workflow. For the sake of this example, let's define our function and event definitions
as separate YAML files (and then reference them inside our workflow definition). This is useful in cases
when you want to reuse them between multiple workflow definitions.

#### Workflow Event Definition

```yaml
events:
- name: Food Order Event
  source: "/orders/"
  type: org.orders
  correlation:
  - contextAttributeName: orderid
- name: ETA Deadline Event
  source: "/orderseta"
  type: org.orders.eta
  correlation:
  - contextAttributeName: orderid
- name: Order Picked Up Event
  source: "/orderspickup"
  type: org.orders.delivery
  correlation:
  - contextAttributeName: orderid
- name: Order Delivered Event
  source: "/orderdelivery"
  type: org.orders.delivery
  correlation:
  - contextAttributeName: orderid
```

#### Workflow Function Definition

```yaml
functions:
- name: Submit Order Function
  operation: http://myorderservice.org/orders.json#submit
- name: Get Order ETA Function
  operation: http://myorderservice.org/orders.json#orderETA
- name: Dispatch Courier Function
  operation: http://mydeliveryservice.org/deliveries.json#dispatch
- name: Deliver Order Function
  operation: http://mydeliveryservice.org/deliveries.json#deliver
- name: Charge For Order Function
  operation: http://mypaymentservice.org/payments.proto#PaymentService#ChargeUser
```

#### Main Workflow Definition

With the function and event definitions in place, we can now start writing our main workflow definition:

```yaml
id: foodorderworkflow
name: Food Order Workflow
version: '1.0.0'
specVersion: '0.8'
start: Place Order
functions: file://orderfunctions.yml
events: file://orderevents.yml
states:
- name: Place Order
  type: operation
  actions:
  - subFlowRef: placeorderworkflow
  transition: Wait for ETA Deadline
- name: Wait for ETA Deadline
  type: event
  onEvents:
  - eventRefs:
    - ETA Deadline Event
    eventDataFilter:
      data: "${ .results.status }"
      toStateData: "${ .status }"
  transition: Deliver Order
- name: Deliver Order
  type: operation
  actions:
  - subFlowRef: deliverorderworkflow
  transition: Charge For Order
- name: Charge For Order
  type: operation
  actions:
  - functionRef:
      refName: Charge For Order Function
      arguments:
        order: "${ .order.id }"
    actionDataFilter:
      results: "${ .outcome.status }"
      toStateData: "${ .status }"
  stateDataFilter:
    output: '${ . | {"orderid": .id, "orderstatus": .status} | .orderstatus += ["Order Completed"] }'
  end: true
```

With this in place we can start defining our sub-workflows:

#### Place Order Sub-Workflow

```yaml
id: placeorderworkflow
name: Place Order Workflow
version: '1.0.0'
specVersion: '0.8'
start: Submit Order
states:
- name: Submit Order
  type: event
  onEvents:
  - eventRefs:
    - Food Order Event
    actions:
    - functionRef:
        refName: Submit Order Function
        arguments:
          order: "${ .order }"
      actionDataFilter:
        results: "${ .results.status }"
        toStateData: "${ .status }"
    - functionRef:
        refName: Get Order ETA Function
        arguments:
          customer: "${ .customerId }"
          restaurantid: "${ .order.restaurantId }"
          delivery: "${ .delivery }"
      actionDataFilter:
        results: "${ .results.status }"
        toStateData: "${ .status }"
  end: true
```

#### Deliver Order Sub-Workflow

```yaml
id: deliverorderworkflow
name: Deliver Order Workflow
version: '1.0.0'
specVersion: '0.8'
start: Dispatch Courier
states:
- name: Dispatch Courier
  type: operation
  actions:
  - functionRef: Dispatch Courier Function
  transition: Wait for Order Pickup
- name: Wait for Order Pickup
  type: event
  onEvents:
  - eventRefs:
    - Order Picked Up Event
    eventDataFilter:
      data: "${ .data.status }"
      toStateData: "${ .status }"
    actions:
    - functionRef: Deliver Order Function
  transition: Wait for Delivery Confirmation
- name: Wait for Delivery Confirmation
  type: event
  onEvents:
  - eventRefs:
    - Order Delivered Event
    eventDataFilter:
      data: "${ .data.status }"
      toStateData: "${ .status }"
  end: true
```

#### Workflow Results

For the example order event, the workflow output for a successful completion could look like this:

```json
{
  "orderid": "ORDER-12345",
  "orderstatus": [
    "Order Submitted",
    "Order ETA Received",
    "Order Picked up",
    "Order Delivered",
    "Order Charged",
    "Order Completed"
  ]
}
```

### Continuing as a new Execution

#### Description

Runtime implementations on which we run our workflows can have different quotas, such as maximum execution durations, maximum consumed events, etc. We can use the Serverless Workflow `continueAs` functionality to stop the current workflow execution and start another one (of the same or a different type). This is very useful in cases where we have to ensure we don't exceed the imposed quotas of a single workflow execution.

This example assumes that the runtime we are using has a quota set to a maximum of one thousand consumed events per single workflow execution.
Our sample workflow consumes a single customer event at a time and invokes the `emailCustomer` function.
Note that we do not set a `workflowExecTimeout`, so we intend this to be a long-running workflow. However, because of the runtime restriction, we would eventually run into the event consumption limit, and our workflow would have to terminate. We can fix this problem by using [`continueAs`](../specification.md#Continuing-as-a-new-Execution), which lets us stop at the given limit and continue our workflow execution as a new run.
We assume that our workflow input has the runtime-imposed quota:

```json
{
  "quota": {
    "maxConsumedEvents": 1000
  }
}
```

#### Workflow Diagram

-ContinueAs Example -

#### Workflow Definition

```json
```

```yaml
```
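The crux of this example is the `continueAs` property on the switch state's `end` definition, which stops the current execution and continues it as a fresh run with the carried-over data. A minimal YAML sketch of that quota-checking state (state, event, and workflow names follow the `examples/continuing-as-a-new-execution.json` definition removed elsewhere in this diff; the remaining states are elided):

```yaml
- name: check-event-quota
  type: switch
  dataConditions:
  - name: ready
    condition: "${ try(.customerCount) != null and .customerCount > .quota.maxConsumedEvents }"
    end:
      # Stop this execution and continue as a new run of the same workflow,
      # dropping the per-run event counter from the carried-over data.
      continueAs:
        workflowId: notifycustomerworkflow
        version: '1.0.0'
        data: "${ del(.customerCount) }"
  defaultCondition:
    transition: wait-for-customer-event
```

Until the quota is hit, the default condition simply loops back to the event-consuming state, so the workflow keeps running without ever exceeding the runtime's per-execution limit.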
### Process Transactions

#### Description

This example shows how we can loop through a data input array (in parallel) and decide which action to perform
depending on the value of each element in the input array.
We use the [action definition](../specification.md#Action-Definition) `condition` property to perform the action that
is best suited for the transaction value.
Note that in this example we set the "large transaction amount" as a [workflow constant](../specification.md#Workflow-Constants).
There are other ways to set this value: for example, passing it as [workflow data input](../specification.md#Workflow-Data-Input)
or, if this data is sensitive, using [workflow secrets](../specification.md#Workflow-Secrets).

For the example, we assume the following workflow data input:

```json
{
  "customer": {
    "id": "abc123",
    "name": "John Doe",
    "transactions": [1000, 400, 60, 7000, 12000, 250]
  }
}
```

We use the [ForEach workflow state](../specification.md#ForEach-State) to iterate through customer transactions (in parallel) and
decide which activity to perform based on the transaction value.

#### Workflow Definition

```json
```

```yaml
```
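The approach described above can be sketched as a `foreach` state whose actions carry mutually exclusive `condition` expressions. The function names and the `largetxamount` constant in this sketch are illustrative assumptions, not taken from this document:

```yaml
# Hypothetical constant holding the "large transaction amount" threshold.
constants:
  largetxamount: 5000
states:
- name: process-transactions
  type: foreach
  # Iterate the customer's transactions; each iteration sees one value
  # bound to the iteration parameter.
  inputCollection: "${ .customer.transactions }"
  iterationParam: singletransaction
  actions:
  # Only the action whose condition evaluates to true is performed.
  - name: process-larger-transaction
    functionRef: banking-service-larger-tx   # assumed function name
    condition: "${ .singletransaction >= $CONST.largetxamount }"
  - name: process-smaller-transaction
    functionRef: banking-service-smaller-tx  # assumed function name
    condition: "${ .singletransaction < $CONST.largetxamount }"
  end: true
```

For the sample input above, the 7000 and 12000 transactions would take the "larger transaction" action and the rest the "smaller transaction" action, with iterations executed in parallel by default.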
diff --git a/examples/accumulate-room-readings.json b/examples/accumulate-room-readings.json deleted file mode 100644 index 1fbbef24..00000000 --- a/examples/accumulate-room-readings.json +++ /dev/null @@ -1,90 +0,0 @@ -{ - "name": "room-readings", - "description": "Room Temp and Humidity Workflow", - "version": "1.0.0", - "specVersion": "0.8", - "start": "consume-reading", - "timeouts": { - "workflowExecTimeout": { - "duration": "PT1H", - "runBefore": "generate-report" - } - }, - "keepActive": true, - "states": [ - { - "name": "consume-reading", - "type": "event", - "onEvents": [ - { - "eventRefs": [ - "temperature-event", - "humidity-event" - ], - "actions": [ - { - "name": "log-reading", - "functionRef": { - "refName": "log-reading" - } - } - ], - "eventDataFilter": { - "toStateData": "${ .readings }" - } - } - ], - "end": true - }, - { - "name": "generate-report", - "type": "operation", - "actions": [ - { - "name": "generate-report", - "functionRef": { - "refName": "produce-report", - "arguments": { - "data": "${ .readings }" - } - } - } - ], - "end": { - "terminate": true - } - } - ], - "events": [ - { - "name": "temperature-event", - "type": "my.home.sensors", - "source": "/home/rooms/+", - "correlation": [ - { - "contextAttributeName": "roomId" - } - ] - }, - { - "name": "humidity-event", - "type": "my.home.sensors", - "source": "/home/rooms/+", - "correlation": [ - { - "contextAttributeName": "roomId" - } - ] - } - ], - "functions": [ - { - "name": "log-reading", - "operation": "http.myorg.io/ordersservices.json#logreading" - }, - { - "name": "produce-report", - "operation": "http.myorg.io/ordersservices.json#produceReport" - } - ] -} \ No newline at end of file diff --git a/examples/applicant-request-decision.json b/examples/applicant-request-decision.json deleted file mode 100644 index 4c356b29..00000000 --- a/examples/applicant-request-decision.json +++ /dev/null @@ -1,62 +0,0 @@ -{ - "version": "1.0.0", - "specVersion": "0.8", - "name": 
"applicant-request-decision-workflow", - "description": "Determine if applicant request is valid", - "start": "check-application", - "functions": [ - { - "name": "send-rejection-email-function", - "operation": "http://myapis.org/applicationapi.json#emailRejection" - } - ], - "states": [ - { - "name": "check-application", - "type": "switch", - "dataConditions": [ - { - "condition": "${ .applicants | .age >= 18 }", - "transition": "start-application", - "name": "adult-condition" - }, - { - "condition": "${ .applicants | .age < 18 }", - "transition": "reject-application", - "name": "minor-condition" - } - ], - "defaultCondition": { - "transition": "reject-application" - } - }, - { - "name": "start-application", - "type": "operation", - "actions": [ - { - "name": "start-app-action", - "subFlowRef": "startApplicationWorkflowId" - } - ], - "end": true - }, - { - "name": "reject-application", - "type": "operation", - "actionMode": "sequential", - "actions": [ - { - "name": "send-reject-action", - "functionRef": { - "refName": "send-rejection-email-function", - "arguments": { - "applicant": "${ .applicant }" - } - } - } - ], - "end": true - } - ] -} \ No newline at end of file diff --git a/examples/async-function-invocation.json b/examples/async-function-invocation.json deleted file mode 100644 index 9400d808..00000000 --- a/examples/async-function-invocation.json +++ /dev/null @@ -1,32 +0,0 @@ -{ - "version": "1.0.0", - "specVersion": "0.8", - "name": "send-customer-email-workflow", - "description": "Send email to a customer", - "start": "send-email", - "functions": [ - { - "name": "email-function", - "operation": "file://myapis/emailapis.json#sendEmail" - } - ], - "states": [ - { - "name": "send-email", - "type": "operation", - "actions": [ - { - "name": "send-email-action", - "functionRef": { - "invoke": "async", - "refName": "email-function", - "arguments": { - "customer": "${ .customer }" - } - } - } - ], - "end": true - } - ] -} \ No newline at end of file diff --git 
a/examples/async-subflow-invocation.json b/examples/async-subflow-invocation.json deleted file mode 100644 index f1ac0e13..00000000 --- a/examples/async-subflow-invocation.json +++ /dev/null @@ -1,25 +0,0 @@ -{ - "version": "1.0.0", - "specVersion": "0.8", - "name": "onboard-customer", - "description": "Onboard a Customer", - "start": "onboard", - "states": [ - { - "name": "onboard", - "type": "operation", - "actions": [ - { - "name": "onboard-action", - "subFlowRef": { - "invoke": "async", - "onParentComplete": "continue", - "workflowId": "customeronboardingworkflow", - "version": "1.0.0" - } - } - ], - "end": true - } - ] -} \ No newline at end of file diff --git a/examples/book-lending.json b/examples/book-lending.json deleted file mode 100644 index 298d6585..00000000 --- a/examples/book-lending.json +++ /dev/null @@ -1,161 +0,0 @@ -{ - "name": "book-lending", - "description": "Book Lending Workflow", - "version": "1.0.0", - "specVersion": "0.8", - "start": "book-lending-request", - "constants" : { "WAIT_BEFORE_POLL" : "PT2W"}, - "states": [ - { - "name": "book-lending-request", - "type": "event", - "onEvents": [ - { - "eventRefs": [ - "book-lending-request-event" - ] - } - ], - "transition": "get-book-status" - }, - { - "name": "get-book-status", - "type": "operation", - "actions": [ - { - "name": "get-book-status", - "functionRef": { - "refName": "get-status-for-book", - "arguments": { - "bookid": "${ .book.id }" - } - } - } - ], - "transition": "book-status-decision" - }, - { - "name": "book-status-decision", - "type": "switch", - "dataConditions": [ - { - "name": "book-is-on-loan", - "condition": "${ .book.status == \"onloan\" }", - "transition": "report-status-to-lender" - }, - { - "name": "check-is-available", - "condition": "${ .book.status == \"available\" }", - "transition": "check-out-book" - } - ], - "defaultCondition": { - "end": true - } - }, - { - "name": "report-status-to-lender", - "type": "operation", - "actions": [ - { - "name": 
"report-status-to-lender", - "functionRef": { - "refName": "send-status-to-lender", - "arguments": { - "bookid": "${ .book.id }", - "message": "Book ${ .book.title } is already on loan" - } - } - } - ], - "transition": "wait-for-lender-response" - }, - { - "name": "wait-for-lender-response", - "type": "switch", - "eventConditions": [ - { - "name": "hold-book", - "eventRef": "hold-book-event", - "transition": "request-hold" - }, - { - "name": "decline-book-hold", - "eventRef": "decline-hold-event", - "transition": "cancel-request" - } - ], - "defaultCondition": { - "end": true - } - }, - { - "name": "request-hold", - "type": "operation", - "actions": [ - { - "name": "request-hold", - "functionRef": { - "refName": "request-hold-for-lender", - "arguments": { - "bookid": "${ .book.id }", - "lender": "${ .lender }" - } - }, - "sleep" : { - "after" : "$CONST.WAIT_BEFORE_POLL" - } - } - ], - "transition": "get-book-status" - }, - { - "name": "cancel-request", - "type": "operation", - "actions": [ - { - "name": "cancel-request", - "functionRef": { - "refName": "cancel-hold-request-for-lender", - "arguments": { - "bookid": "${ .book.id }", - "lender": "${ .lender }" - } - }, - "sleep" : { - "after" : "$CONST.WAIT_BEFORE_POLL" - } - } - ], - "transition": "get-book-status" - }, - { - "name": "check-out-book", - "type": "operation", - "actions": [ - { - "name": "check-out-book", - "functionRef": { - "refName": "check-out-book-with-id", - "arguments": { - "bookid": "${ .book.id }" - } - } - }, - { - "name": "notify-lender-for-checkout", - "functionRef": { - "refName": "notify-lender-for-checkout", - "arguments": { - "bookid": "${ .book.id }", - "lender": "${ .lender }" - } - } - } - ], - "end": true - } - ], - "functions": "file://books/lending/functions.json", - "events": "file://books/lending/events.json" -} \ No newline at end of file diff --git a/examples/car-vitals-checks-subflow.json b/examples/car-vitals-checks-subflow.json deleted file mode 100644 index 
0f242c94..00000000 --- a/examples/car-vitals-checks-subflow.json +++ /dev/null @@ -1,57 +0,0 @@ -{ - "name": "vitals-check", - "description": "Car Vitals Check", - "version": "1.0.0", - "specVersion": "0.8", - "start": "check-vitals", - "states": [ - { - "name": "check-vitals", - "type": "operation", - "actions": [ - { - "name": "check-tire-pressure", - "functionRef": "check-tire-pressure" - }, - { - "name": "check-oil-pressure", - "functionRef": "check-oil-pressure" - }, - { - "name": "check-coolant-level", - "functionRef": "check-coolant-level" - }, - { - "name": "check-battery", - "functionRef": "check-battery" - } - ], - "end": { - "produceEvents": [ - { - "eventRef": "display-checks-on-dashboard", - "data": "${ .evaluations }" - } - ] - } - } - ], - "functions": [ - { - "name": "check-tire-pressure", - "operation": "mycarservices.json#checktirepressure" - }, - { - "name": "check-oil-pressure", - "operation": "mycarservices.json#checkoilpressure" - }, - { - "name": "check-coolant-level", - "operation": "mycarservices.json#checkcoolantlevel" - }, - { - "name": "check-battery", - "operation": "mycarservices.json#checkbattery" - } - ] -} \ No newline at end of file diff --git a/examples/car-vitals-checks.json b/examples/car-vitals-checks.json deleted file mode 100644 index b40ab74c..00000000 --- a/examples/car-vitals-checks.json +++ /dev/null @@ -1,61 +0,0 @@ -{ - "name": "check-car-vitals", - "description": "Check Car Vitals Workflow", - "version": "1.0.0", - "specVersion": "0.8", - "start": "when-car-is-on", - "states": [ - { - "name": "when-car-is-on", - "type": "event", - "onEvents": [ - { - "eventRefs": [ - "car-turned-on-event" - ] - } - ], - "transition": "do-car-vital-checks" - }, - { - "name": "do-car-vital-checks", - "type": "operation", - "actions": [ - { - "name": "do-car-vital-checks", - "subFlowRef": "vitalscheck", - "sleep": { - "after": "PT1S" - } - } - ], - "transition": "check-continue-vital-checks" - }, - { - "name": 
"check-continue-vital-checks", - "type": "switch", - "eventConditions": [ - { - "name": "car-turned-off-condition", - "eventRef": "car-turned-off-event", - "end": true - } - ], - "defaultCondition": { - "transition": "do-car-vital-checks" - } - } - ], - "events": [ - { - "name": "car-turned-on-event", - "type": "car.events", - "source": "my/car" - }, - { - "name": "car-turned-off-event", - "type": "car.events", - "source": "my/car" - } - ] -} \ No newline at end of file diff --git a/examples/check-inbox-periodically.json b/examples/check-inbox-periodically.json deleted file mode 100644 index 985f27e9..00000000 --- a/examples/check-inbox-periodically.json +++ /dev/null @@ -1,54 +0,0 @@ -{ - "name": "check-inbox", - "version": "1.0.0", - "specVersion": "0.8", - "description": "Periodically Check Inbox", - "start": { - "stateName": "check-inbox", - "schedule": { - "cron": "0 0/15 * * * ?" - } - }, - "functions": [ - { - "name": "check-inbox-function", - "operation": "http://myapis.org/inboxapi.json#checkNewMessages" - }, - { - "name": "send-text-function", - "operation": "http://myapis.org/inboxapi.json#sendText" - } - ], - "states": [ - { - "name": "check-inbox", - "type": "operation", - "actionMode": "sequential", - "actions": [ - { - "name":"check-inbox", - "functionRef": "check-inbox-function" - } - ], - "transition": "send-text-for-high-priority" - }, - { - "name": "send-text-for-high-priority", - "type": "foreach", - "inputCollection": "${ .messages }", - "iterationParam": "singlemessage", - "actions": [ - { - "name": "send-text-for-high-priority", - "functionRef": { - "refName": "send-text-function", - "arguments": { - "message": "${ .singlemessage }" - } - } - } - ], - "end": true - } - ] -} \ No newline at end of file diff --git a/examples/continuing-as-a-new-execution.json b/examples/continuing-as-a-new-execution.json deleted file mode 100644 index 0f42ac92..00000000 --- a/examples/continuing-as-a-new-execution.json +++ /dev/null @@ -1,72 +0,0 @@ -{ - 
"name": "notify-customer-workflow", - "description": "Notify Customer", - "version": "1.0.0", - "specVersion": "0.8", - "start": "wait-for-customer-event", - "states": [ - { - "name": "wait-for-customer-event", - "type": "event", - "onEvents": [ - { - "eventRefs": [ - "customer-event" - ], - "eventDataFilter": { - "data": "${ .customerId }", - "toStateData": "${ .eventCustomerId }" - }, - "actions": [ - { - "name": "notify-customer-function", - "functionRef": { - "refName": "notify-customer-function", - "arguments": { - "customerId": "${ .eventCustomerId }" - } - } - } - ] - } - ], - "stateDataFilter": { - "output": "${ .count = .count + 1 }" - }, - "transition": "check-event-quota" - }, - { - "name": "check-event-quota", - "type": "switch", - "dataConditions": [ - { - "name": "ready", - "condition": "${ try(.customerCount) != null and .customerCount > .quota.maxConsumedEvents }", - "end": { - "continueAs": { - "workflowId": "notifycustomerworkflow", - "version": "1.0.0", - "data": "${ del(.customerCount) }" - } - } - } - ], - "defaultCondition": { - "transition": "wait-for-customer-event" - } - } - ], - "events": [ - { - "name": "customer-event", - "type": "org.events.customerEvent", - "source": "customerSource" - } - ], - "functions": [ - { - "name": "notify-customer-function", - "operation": "http://myapis.org/customerapis.json#notifyCustomer" - } - ] -} \ No newline at end of file diff --git a/examples/curl.json b/examples/curl.json deleted file mode 100644 index 3a8a9b0d..00000000 --- a/examples/curl.json +++ /dev/null @@ -1,35 +0,0 @@ -{ - "name": "curlgoogle", - "version": "1.0.0", - "specVersion": "0.8", - "description": "Curl Google", - "start": "curl", - "functions": [ - { - "name": "curl-google", - "type": "http", - "operation": { - "method": "GET", - "uri": "https://www.google.com/search?q={query}" - } - } - ], - "states": [ - { - "name": "curl", - "type": "operation", - "actions": [ - { - "name": "do-curl", - "functionRef": { - "refName": 
"curl-google", - "arguments": { - "query": "${ .query }" - } - } - } - ], - "end": true - } - ] -} \ No newline at end of file diff --git a/examples/event-based-greeting.json b/examples/event-based-greeting.json deleted file mode 100644 index 9036726f..00000000 --- a/examples/event-based-greeting.json +++ /dev/null @@ -1,52 +0,0 @@ -{ - "version": "1.0.0", - "specVersion": "0.8", - "name": "event-based-greeting-workflow", - "description": "Event Based Greeting", - "start": "greet", - "events": [ - { - "name": "greeting-event", - "type": "greetingEventType", - "source": "greetingEventSource" - } - ], - "functions": [ - { - "name": "greeting-function", - "operation": "file://myapis/greetingapis.json#greeting" - } - ], - "states": [ - { - "name": "greet", - "type": "event", - "onEvents": [ - { - "eventRefs": [ - "greeting-event" - ], - "eventDataFilter": { - "data": "${ .greet }", - "toStateData": "${ .greet }" - }, - "actions": [ - { - "name": "greet-action", - "functionRef": { - "refName": "greeting-function", - "arguments": { - "name": "${ .greet.name }" - } - } - } - ] - } - ], - "stateDataFilter": { - "output": "${ .payload.greeting }" - }, - "end": true - } - ] -} \ No newline at end of file diff --git a/examples/event-based-service-invocation.json b/examples/event-based-service-invocation.json deleted file mode 100644 index 1a02e866..00000000 --- a/examples/event-based-service-invocation.json +++ /dev/null @@ -1,47 +0,0 @@ -{ - "name": "vet-appointment-workflow", - "description": "Vet service call via events", - "version": "1.0.0", - "specVersion": "0.8", - "start": "make-vet-appointment-state", - "events": [ - { - "name": "make-vet-appointment", - "source": "VetServiceSource", - "type": "events.vet.appointments" - }, - { - "name": "vet-appointment-info", - "source": "VetServiceSource", - "type": "events.vet.appointments" - } - ], - "states": [ - { - "name": "make-vet-appointment-state", - "type": "operation", - "actions": [ - { - "name": 
"make-appointment-action", - "publish": { - "event": "make-vet-appointment", - "data": "${ .patientInfo }" - } - }, - { - "name": "wait-appointement-confirmation", - "subscribe": { - "event": "vet-appointment-info" - }, - "actionDataFilter": { - "results": "${ .appointmentInfo }" - } - } - ], - "timeouts": { - "actionExecTimeout": "PT15M" - }, - "end": true - } - ] -} \ No newline at end of file diff --git a/examples/event-based-transitions.json b/examples/event-based-transitions.json deleted file mode 100644 index d094f540..00000000 --- a/examples/event-based-transitions.json +++ /dev/null @@ -1,76 +0,0 @@ -{ - "version": "1.0.0", - "specVersion": "0.8", - "name": "event-based-switch-transitions", - "description": "Event Based Switch Transitions", - "start": "checkvisastatus", - "events": [ - { - "name": "visa-approved-event", - "type": "VisaApproved", - "source": "visaCheckSource" - }, - { - "name": "visa-rejected-event", - "type": "VisaRejected", - "source": "visaCheckSource" - } - ], - "states": [ - { - "name": "checkvisastatus", - "type": "switch", - "eventConditions": [ - { - "eventRef": "visa-approved-event", - "transition": "handle-approved-visa", - "name": "approved-condition" - }, - { - "eventRef": "visa-rejected-event", - "transition": "handle-rejected-visa", - "name": "rejected-condition" - } - ], - "timeouts": { - "eventTimeout": "PT1H" - }, - "defaultCondition": { - "transition": "handle-no-visa-decision" - } - }, - { - "name": "handle-approved-visa", - "type": "operation", - "actions": [ - { - "name": "handle-approved-action", - "subFlowRef": "handleApprovedVisaWorkflowID" - } - ], - "end": true - }, - { - "name": "handle-rejected-visa", - "type": "operation", - "actions": [ - { - "name": "handle-rejected-action", - "subFlowRef": "handleRejectedVisaWorkflowID" - } - ], - "end": true - }, - { - "name": "handle-no-visa-decision", - "type": "operation", - "actions": [ - { - "name": "handle-novisa-action", - "subFlowRef": "handleNoVisaDecisionWorkflowId" 
- } - ], - "end": true - } - ] -} \ No newline at end of file diff --git a/examples/filling-a-glass-of-water.json b/examples/filling-a-glass-of-water.json deleted file mode 100644 index 98dc8d98..00000000 --- a/examples/filling-a-glass-of-water.json +++ /dev/null @@ -1,49 +0,0 @@ -{ - "name": "fill-glass-of-water", - "description": "Fill glass of water workflow", - "version": "1.0.0", - "specVersion": "0.8", - "start": "check-if-full", - "functions": [ - { - "name": "increment-current-count-function", - "type": "expression", - "operation": ".counts.current += 1 | .counts.current" - } - ], - "states": [ - { - "name": "check-if-full", - "type": "switch", - "dataConditions": [ - { - "name": "need-to-fill-more", - "condition": "${ .counts.current < .counts.max }", - "transition": "add-water" - }, - { - "name": "glass-full", - "condition": ".counts.current >= .counts.max", - "end": true - } - ], - "defaultCondition": { - "end": true - } - }, - { - "name": "add-water", - "type": "operation", - "actions": [ - { - "name": "add-water", - "functionRef": "increment-current-count-function", - "actionDataFilter": { - "toStateData": ".counts.current" - } - } - ], - "transition": "check-if-full" - } - ] -} \ No newline at end of file diff --git a/examples/finalize-college-application.json b/examples/finalize-college-application.json deleted file mode 100644 index d0468092..00000000 --- a/examples/finalize-college-application.json +++ /dev/null @@ -1,74 +0,0 @@ -{ - "name": "finalize-college-application", - "version": "1.0.0", - "specVersion": "0.8", - "start": "finalize-application", - "events": [ - { - "name": "application-submitted", - "type": "org.application.submitted", - "source": "applicationsource", - "correlation": [ - { - "contextAttributeName": "applicantId" - } - ] - }, - { - "name": "sat-scores-received", - "type": "org.application.satscores", - "source": "applicationsource", - "correlation": [ - { - "contextAttributeName": "applicantId" - } - ] - }, - { - "name": 
"recommendation-letter-received", - "type": "org.application.recommendationLetter", - "source": "applicationsource", - "correlation": [ - { - "contextAttributeName": "applicantId" - } - ] - } - ], - "functions": [ - { - "name": "finalize-application-function", - "operation": "http://myapis.org/collegeapplicationapi.json#finalize" - } - ], - "states": [ - { - "name": "finalize-application", - "type": "event", - "exclusive": false, - "onEvents": [ - { - "eventRefs": [ - "application-submitted", - "sat-scores-received", - "recommendation-letter-received" - ], - "actions": [ - { - "name": "finalize-application", - "functionRef": { - "refName": "finalize-application-function", - "arguments": { - "student": "${ .applicantId }" - } - } - } - ] - } - ], - "end": { - "terminate": true - } - } - ] -} \ No newline at end of file diff --git a/examples/greeting.json b/examples/greeting.json deleted file mode 100644 index 5ee1e67a..00000000 --- a/examples/greeting.json +++ /dev/null @@ -1,35 +0,0 @@ -{ - "version": "1.0.0", - "specVersion": "0.8", - "name": "greeting-workflow", - "description": "Greet Someone", - "start": "greet", - "functions": [ - { - "name": "greeting-function", - "type": "openapi", - "operation": "file://myapis/greetingapis.json#greeting" - } - ], - "states": [ - { - "name": "greet", - "type": "operation", - "actions": [ - { - "name": "greet-action", - "functionRef": { - "refName": "greeting-function", - "arguments": { - "name": "${ .person.name }" - } - }, - "actionDataFilter": { - "results": "${ {greeting: .greeting} }" - } - } - ], - "end": true - } - ] -} \ No newline at end of file diff --git a/examples/handle-car-auction-bids.json b/examples/handle-car-auction-bids.json deleted file mode 100644 index fb95d910..00000000 --- a/examples/handle-car-auction-bids.json +++ /dev/null @@ -1,49 +0,0 @@ -{ - "version": "1.0.0", - "specVersion": "0.8", - "name": "handle-car-auction-bid", - "description": "Store a single bid whole the car auction is active", - 
"start": { - "stateName": "store-car-auction-bid", - "schedule": "R/PT2H" - }, - "functions": [ - { - "name": "store-bid-function", - "operation": "http://myapis.org/carauctionapi.json#storeBid" - } - ], - "events": [ - { - "name": "car-bid-event", - "type": "carBidMadeType", - "source": "carBidEventSource" - } - ], - "states": [ - { - "name": "store-car-auction-bid", - "type": "event", - "exclusive": true, - "onEvents": [ - { - "eventRefs": [ - "car-bid-event" - ], - "actions": [ - { - "name": "car-bid-event", - "functionRef": { - "refName": "store-bid-function", - "arguments": { - "bid": "${ .bid }" - } - } - } - ] - } - ], - "end": true - } - ] -} \ No newline at end of file diff --git a/examples/hello-world.json b/examples/hello-world.json deleted file mode 100644 index 0231d304..00000000 --- a/examples/hello-world.json +++ /dev/null @@ -1,17 +0,0 @@ -{ - "version": "1.0.0", - "specVersion": "0.8", - "name": "hello-world-workflow", - "description": "Inject Hello World", - "start": "hello-state", - "states": [ - { - "name": "hello-state", - "type": "inject", - "data": { - "result": "Hello World!" 
- }, - "end": true - } - ] -} \ No newline at end of file diff --git a/examples/monitor-job.json b/examples/monitor-job.json deleted file mode 100644 index 38598fc6..00000000 --- a/examples/monitor-job.json +++ /dev/null @@ -1,129 +0,0 @@ -{ - "version": "1.0.0", - "specVersion": "0.8", - "name": "job-monitoring", - "description": "Monitor finished execution of a submitted job", - "start": "submit-job", - "functions": [ - { - "name": "submit-job", - "operation": "http://myapis.org/monitorapi.json#doSubmit" - }, - { - "name": "check-job-status", - "operation": "http://myapis.org/monitorapi.json#checkStatus" - }, - { - "name": "report-job-suceeded", - "operation": "http://myapis.org/monitorapi.json#reportSucceeded" - }, - { - "name": "report-job-failed", - "operation": "http://myapis.org/monitorapi.json#reportFailure" - } - ], - "states": [ - { - "name": "submit-job", - "type": "operation", - "actionMode": "sequential", - "actions": [ - { - "name": "submit-job", - "functionRef": { - "refName": "submit-job", - "arguments": { - "name": "${ .job.name }" - } - }, - "actionDataFilter": { - "results": "${ .jobuid }" - } - } - ], - "stateDataFilter": { - "output": "${ .jobuid }" - }, - "transition": "get-job-status" - }, - { - "name": "get-job-status", - "type": "operation", - "actionMode": "sequential", - "actions": [ - { - "name": "get-job-status", - "functionRef": { - "refName": "check-job-status", - "arguments": { - "name": "${ .jobuid }" - } - }, - "actionDataFilter": { - "results": "${ .jobstatus }" - }, - "sleep" : { - "before": "PT5S" - } - } - ], - "stateDataFilter": { - "output": "${ .jobstatus }" - }, - "transition": "determine-completion" - }, - { - "name": "determine-completion", - "type": "switch", - "dataConditions": [ - { - "condition": "${ .jobStatus == \"SUCCEEDED\" }", - "transition": "job-succeeded", - "name": "succeed" - }, - { - "condition": "${ .jobStatus == \"FAILED\" }", - "transition": "job-failed", - "name": "failed" - } - ], - "defaultCondition": 
{ - "transition": "get-job-status" - } - }, - { - "name": "job-succeeded", - "type": "operation", - "actionMode": "sequential", - "actions": [ - { - "name": "job-succeeded", - "functionRef": { - "refName": "report-job-suceeded", - "arguments": { - "name": "${ .jobuid }" - } - } - } - ], - "end": true - }, - { - "name": "job-failed", - "type": "operation", - "actionMode": "sequential", - "actions": [ - { - "name": "job-failed", - "functionRef": { - "refName": "report-job-failed", - "arguments": { - "name": "${ .jobuid }" - } - } - } - ], - "end": true - } - ] -} \ No newline at end of file diff --git a/examples/monitor-patient-vital-signs.json b/examples/monitor-patient-vital-signs.json deleted file mode 100644 index 1ae55caf..00000000 --- a/examples/monitor-patient-vital-signs.json +++ /dev/null @@ -1,112 +0,0 @@ -{ - "name": "patient-vitals-workflow", - "version": "1.0.0", - "specVersion": "0.8", - "start": "monitor-vitals", - "events": [ - { - "name": "high-body-temperature", - "type": "org.monitor.highBodyTemp", - "source": "monitoringSource", - "correlation": [ - { - "contextAttributeName": "patientId" - } - ] - }, - { - "name": "high-blood-pressure", - "type": "org.monitor.highBloodPressure", - "source": "monitoringSource", - "correlation": [ - { - "contextAttributeName": "patientId" - } - ] - }, - { - "name": "high-respiration-rate", - "type": "org.monitor.highRespirationRate", - "source": "monitoringSource", - "correlation": [ - { - "contextAttributeName": "patientId" - } - ] - } - ], - "functions": [ - { - "name": "call-pulmonologist", - "operation": "http://myapis.org/patientapis.json#callPulmonologist" - }, - { - "name": "send-tylenol-order", - "operation": "http://myapis.org/patientapis.json#tylenolOrder" - }, - { - "name": "call-nurse", - "operation": "http://myapis.org/patientapis.json#callNurse" - } - ], - "states": [ - { - "name": "monitor-vitals", - "type": "event", - "exclusive": true, - "onEvents": [ - { - "eventRefs": [ - "high-body-temperature" 
- ], - "actions": [ - { - "name": "send-tylenol-order", - "functionRef": { - "refName": "send-tylenol-order", - "arguments": { - "patientid": "${ .patientId }" - } - } - } - ] - }, - { - "eventRefs": [ - "high-blood-pressure" - ], - "actions": [ - { - "name": "call-nurse", - "functionRef": { - "refName": "call-nurse", - "arguments": { - "patientid": "${ .patientId }" - } - } - } - ] - }, - { - "eventRefs": [ - "high-respiration-rate" - ], - "actions": [ - { - "name": "call-pulmonologist", - "functionRef": { - "refName": "call-pulmonologist", - "arguments": { - "patientid": "${ .patientId }" - } - } - } - ] - } - ], - "end": { - "terminate": true - } - } - ] -} \ No newline at end of file diff --git a/examples/new-patient-onboarding.json b/examples/new-patient-onboarding.json deleted file mode 100644 index 1c988387..00000000 --- a/examples/new-patient-onboarding.json +++ /dev/null @@ -1,88 +0,0 @@ -{ - "name": "patient-onboarding", - "version": "1.0.0", - "specVersion": "0.8", - "start": "onboard", - "states": [ - { - "name": "onboard", - "type": "event", - "onEvents": [ - { - "eventRefs": [ - "new-patient-event" - ], - "actions": [ - { - "name": "store-patient", - "functionRef": "store-patient", - "onErrors": "fault-tolerance" - }, - { - "name": "assign-doctor", - "functionRef": "assign-doctor", - "onErrors": "fault-tolerance" - }, - { - "name": "schedule-appt", - "functionRef": "schedule-appt", - "onErrors": "fault-tolerance" - } - ] - } - ], - "end": true - } - ], - "events": [ - { - "name": "store-patient", - "type": "new.patients.event", - "source": "newpatient/+" - } - ], - "functions": [ - { - "name": "store-new-patient-info", - "operation": "api/services.json#addPatient" - }, - { - "name": "assign-doctor", - "operation": "api/services.json#assignDoctor" - }, - { - "name": "schedule-appt", - "operation": "api/services.json#scheduleAppointment" - } - ], - "errors": { - "handlers":[ - { - "name": "handle-503-errors", - "when":[ - { - "status": 503 - } - ], - 
"retry": "services-not-available-retry-strategy" - } - ], - "policies":[ - { - "name": "fault-tolerance", - "handlers":[ - { - "refName": "handle-503-errors" - } - ] - } - ] - }, - "retries": [ - { - "name": "services-not-available-retry-strategy", - "delay": "PT3S", - "maxAttempts": 10 - } - ] -} \ No newline at end of file diff --git a/examples/parallel-execution.json b/examples/parallel-execution.json deleted file mode 100644 index d562fc9b..00000000 --- a/examples/parallel-execution.json +++ /dev/null @@ -1,35 +0,0 @@ -{ - "version": "1.0.0", - "specVersion": "0.8", - "name": "parallel-execution", - "description": "Executes two branches in parallel", - "start": "parallelexec", - "states": [ - { - "name": "parallelexec", - "type": "parallel", - "completionType": "allOf", - "branches": [ - { - "name": "short-delay-branch", - "actions": [ - { - "name": "short-delay-action", - "subFlowRef": "shortdelayworkflowid" - } - ] - }, - { - "name": "long-delay-branch", - "actions": [ - { - "name": "short-delay-action", - "subFlowRef": "longdelayworkflowid" - } - ] - } - ], - "end": true - } - ] -} \ No newline at end of file diff --git a/examples/perform-customer-credit-check.json b/examples/perform-customer-credit-check.json deleted file mode 100644 index 10f24f26..00000000 --- a/examples/perform-customer-credit-check.json +++ /dev/null @@ -1,96 +0,0 @@ -{ - "version": "1.0.0", - "specVersion": "0.8", - "name": "customer-credit-check", - "description": "Perform Customer Credit Check", - "start": "check-credit", - "functions": [ - { - "name": "check-credit-function", - "operation": "http://myapis.org/creditcheckapi.json#doCreditCheck" - }, - { - "name": "send-rejection-email-function", - "operation": "http://myapis.org/creditcheckapi.json#rejectionEmail" - } - ], - "events": [ - { - "name": "credit-check-completed-event", - "type": "creditCheckCompleteType", - "source": "creditCheckSource", - "correlation": [ - { - "contextAttributeName": "customerId" - } - ] - } - ], - 
"states": [ - { - "name": "check-credit", - "type": "callback", - "action": { - "name": "check-credit", - "functionRef": { - "refName": "check-credit-function", - "arguments": { - "customer": "${ .customer }" - } - } - }, - "eventRef": "credit-check-completed-event", - "timeouts": { - "stateExecTimeout": "PT15M" - }, - "transition": "evaluate-decision" - }, - { - "name": "evaluate-decision", - "type": "switch", - "dataConditions": [ - { - "condition": "${ .creditCheck | .decision == \"Approved\" }", - "transition": "start-application", - "name": "start-application" - }, - { - "condition": "${ .creditCheck | .decision == \"Denied\" }", - "transition": "reject-application", - "name": "reject-application" - } - ], - "defaultCondition": { - "transition": "reject-application" - } - }, - { - "name": "start-application", - "type": "operation", - "actions": [ - { - "name": "start-application", - "subFlowRef": "startApplicationWorkflowId" - } - ], - "end": true - }, - { - "name": "reject-application", - "type": "operation", - "actionMode": "sequential", - "actions": [ - { - "name": "reject-application", - "functionRef": { - "refName": "send-rejection-email-function", - "arguments": { - "applicant": "${ .customer }" - } - } - } - ], - "end": true - } - ] -} \ No newline at end of file diff --git a/examples/process-transactions.json b/examples/process-transactions.json deleted file mode 100644 index 5cb15972..00000000 --- a/examples/process-transactions.json +++ /dev/null @@ -1,42 +0,0 @@ -{ - "name": "customer-banking-transactions", - "description": "Customer Banking Transactions Workflow", - "version": "1.0.0", - "specVersion": "0.8", - "constants": { - "largetxamount": 5000 - }, - "states": [ - { - "name": "process-transactions", - "type": "foreach", - "inputCollection": "${ .customer.transactions }", - "iterationParam": "${ .tx }", - "actions": [ - { - "name": "process-larger-transaction", - "functionRef": "banking-service-larger-tx", - "condition": "${ .tx >= 
$CONST.largetxamount }" - }, - { - "name": "process-smaller-transaction", - "functionRef": "banking-service-smaller-tx", - "condition": "${ .tx < $CONST.largetxamount }" - } - ], - "end": true - } - ], - "functions": [ - { - "name": "banking-service-larger-tx", - "type": "asyncapi", - "operation": "banking.yaml#largerTransation" - }, - { - "name": "banking-service-smaller-tx", - "type": "asyncapi", - "operation": "banking.yaml#smallerTransation" - } - ] -} \ No newline at end of file diff --git a/examples/provision-orders.json b/examples/provision-orders.json deleted file mode 100644 index 2f77cf78..00000000 --- a/examples/provision-orders.json +++ /dev/null @@ -1,111 +0,0 @@ -{ - "version": "1.0.0", - "specVersion": "0.8", - "name": "provision-orders", - "description": "Provision Orders and handle errors thrown", - "start": "provision-order", - "functions": [ - { - "name": "provision-order-function", - "operation": "http://myapis.org/provisioningapi.json#doProvision" - } - ], - "states": [ - { - "name": "provision-order", - "type": "operation", - "actionMode": "sequential", - "actions": [ - { - "name": "provision-action", - "functionRef": { - "refName": "provision-order-function", - "arguments": { - "order": "${ .order }" - } - } - } - ], - "stateDataFilter": { - "output": "${ .exceptions }" - }, - "transition": "apply-order", - "onErrors": [ - { - "when": [ - { - "type": "/samples/errors/missing-order-id" - } - ], - "then": { - "transition": "missing-id" - } - }, - { - "when": [ - { - "type": "/samples/errors/missing-order-item" - } - ], - "then": { - "transition": "missing-item" - } - }, - { - "when": [ - { - "type": "/samples/errors/missing-order-quantity" - } - ], - "then": { - "transition": "missing-quantity" - } - } - ] - }, - { - "name": "missing-id", - "type": "operation", - "actions": [ - { - "name": "missing-action", - "subFlowRef": "handleMissingIdExceptionWorkflow" - } - ], - "end": true - }, - { - "name": "missing-item", - "type": "operation", - 
"actions": [ - { - "name": "missing-item", - "subFlowRef": "handleMissingItemExceptionWorkflow" - } - ], - "end": true - }, - { - "name": "missing-quantity", - "type": "operation", - "actions": [ - { - "name": "missing-quantity", - "subFlowRef": "handleMissingQuantityExceptionWorkflow" - } - ], - "end": true - }, - { - "name": "apply-order", - "type": "operation", - "actions": [ - { - "name": "apply-order", - "subFlowRef": "applyOrderWorkflowId" - } - ], - "end": true - } - ] -} \ No newline at end of file diff --git a/examples/purchase-order-deadline.json b/examples/purchase-order-deadline.json deleted file mode 100644 index 5f31cefa..00000000 --- a/examples/purchase-order-deadline.json +++ /dev/null @@ -1,164 +0,0 @@ -{ - "name": "order", - "description": "Purchase Order Workflow", - "version": "1.0.0", - "specVersion": "0.8", - "start": "start-new-order", - "timeouts": { - "workflowExecTimeout": { - "duration": "PT30D", - "runBefore": "CancelOrder" - } - }, - "states": [ - { - "name": "start-new-order", - "type": "event", - "onEvents": [ - { - "eventRefs": [ - "order-created-event" - ], - "actions": [ - { - "name": "log-new-order-created", - "functionRef": { - "refName": "log-new-order-created" - } - } - ] - } - ], - "transition": { - "nextState": "wait-for-order-confirmation" - } - }, - { - "name": "wait-for-order-confirmation", - "type": "event", - "onEvents": [ - { - "eventRefs": [ - "order-confirmed-event" - ], - "actions": [ - { - "name": "log-order-confirmed", - "functionRef": { - "refName": "log-order-confirmed" - } - } - ] - } - ], - "transition": { - "nextState": "wait-order-shipped" - } - }, - { - "name": "wait-order-shipped", - "type": "event", - "onEvents": [ - { - "eventRefs": [ - "shipment-sent-event" - ], - "actions": [ - { - "name": "log-order-shipped", - "functionRef": { - "refName": "log-order-shipped" - } - } - ] - } - ], - "end": { - "terminate": true, - "produceEvents": [ - { - "eventRef": "order-finished-event" - } - ] - } - }, - { - 
"name": "cancel-order", - "type": "operation", - "actions": [ - { - "name": "cancel-order", - "functionRef": { - "refName": "cancel-order" - } - } - ], - "end": { - "terminate": true, - "produceEvents": [ - { - "eventRef": "order-cancelled-event" - } - ] - } - } - ], - "events": [ - { - "name": "order-created-event", - "type": "my.company.orders", - "source": "/orders/new", - "correlation": [ - { - "contextAttributeName": "orderid" - } - ] - }, - { - "name": "order-confirmed-event", - "type": "my.company.orders", - "source": "/orders/confirmed", - "correlation": [ - { - "contextAttributeName": "orderid" - } - ] - }, - { - "name": "shipment-sent-event", - "type": "my.company.orders", - "source": "/orders/shipped", - "correlation": [ - { - "contextAttributeName": "orderid" - } - ] - }, - { - "name": "order-finished-event", - "type": "my.company.orders" - }, - { - "name": "order-cancelled-event", - "type": "my.company.orders" - } - ], - "functions": [ - { - "name": "log-new-order-created", - "operation": "http.myorg.io/ordersservices.json#logcreated" - }, - { - "name": "log-order-confirmed", - "operation": "http.myorg.io/ordersservices.json#logconfirmed" - }, - { - "name": "log-order-shipped", - "operation": "http.myorg.io/ordersservices.json#logshipped" - }, - { - "name": "cancel-order", - "operation": "http.myorg.io/ordersservices.json#calcelorder" - } - ] -} \ No newline at end of file diff --git a/examples/reusing-function-and-event-definitions.json b/examples/reusing-function-and-event-definitions.json deleted file mode 100644 index 3a8f2c77..00000000 --- a/examples/reusing-function-and-event-definitions.json +++ /dev/null @@ -1,99 +0,0 @@ -{ - "version": "1.0.0", - "specVersion": "0.8", - "name": "payment-confirmation", - "description": "Performs Payment Confirmation", - "functions": "file://functiondefs.json", - "events": "file://eventdefs.yml", - "states": [ - { - "name": "payment-received", - "type": "event", - "onEvents": [ - { - "eventRefs": [ - 
"payment-received-event" - ], - "actions": [ - { - "name": "checkfunds", - "functionRef": { - "refName": "check-funds-availability", - "arguments": { - "account": "${ .accountId }", - "paymentamount": "${ .payment.amount }" - } - } - } - ] - } - ], - "transition": "confirm-based-on-funds" - }, - { - "name": "confirm-based-on-funds", - "type": "switch", - "dataConditions": [ - { - "condition": "${ .funds | .available == \"true\" }", - "transition": "send-payment-success", - "name": "success" - }, - { - "condition": "${ .funds | .available == \"false\" }", - "transition": "send-insufficient-results", - "name": "failed" - } - ], - "defaultCondition": { - "transition": "send-payment-success" - } - }, - { - "name": "send-payment-success", - "type": "operation", - "actions": [ - { - "name": "send-payment-success", - "functionRef": { - "refName": "send-success-email", - "arguments": { - "applicant": "${ .customer }" - } - } - } - ], - "end": { - "produceEvents": [ - { - "eventRef": "confirmation-completed-event", - "data": "${ .payment }" - } - ] - } - }, - { - "name": "send-insufficient-results", - "type": "operation", - "actions": [ - { - "name": "send-insufficient-results", - "functionRef": { - "refName": "send-insufficient-funds-email", - "arguments": { - "applicant": "${ .customer }" - } - } - } - ], - "end": { - "produceEvents": [ - { - "eventRef": "confirmation-completed-event", - "data": "${ .payment }" - } - ] - } - } - ] -} \ No newline at end of file diff --git a/examples/send-cloudevent-on-workflow-completion.json b/examples/send-cloudevent-on-workflow-completion.json deleted file mode 100644 index 783d0859..00000000 --- a/examples/send-cloudevent-on-workflow-completion.json +++ /dev/null @@ -1,46 +0,0 @@ -{ - "version": "1.0.0", - "specVersion": "0.8", - "name": "send-cloudevent-provision", - "start": "provision-orders-state", - "events": [ - { - "name": "provisioning-complete-event", - "type": "provisionCompleteType" - } - ], - "functions": [ - { - "name": 
"provision-order-function", - "operation": "http://myapis.org/provisioning.json#doProvision" - } - ], - "states": [ - { - "name": "provision-orders-state", - "type": "foreach", - "inputCollection": "${ .orders }", - "iterationParam": "singleorder", - "outputCollection": "${ .provisionedOrders }", - "actions": [ - { - "name": "provision-order-function", - "functionRef": { - "refName": "provision-order-function", - "arguments": { - "order": "${ .singleorder }" - } - } - } - ], - "end": { - "produceEvents": [ - { - "eventRef": "provisioning-complete-event", - "data": "${ .provisionedOrders }" - } - ] - } - } - ] -} \ No newline at end of file diff --git a/examples/solving-math-problems.json b/examples/solving-math-problems.json deleted file mode 100644 index 1e22820d..00000000 --- a/examples/solving-math-problems.json +++ /dev/null @@ -1,37 +0,0 @@ -{ - "version": "1.0.0", - "specVersion": "0.8", - "name": "solve-math-problems", - "description": "Solve math problems", - "start": "solve", - "functions": [ - { - "name": "solve-math-exp-func", - "operation": "http://myapis.org/mapthapis.json#solveExpression" - } - ], - "states": [ - { - "name": "solve", - "type": "foreach", - "inputCollection": "${ .expressions }", - "iterationParam": "singleexpression", - "outputCollection": "${ .results }", - "actions": [ - { - "name": "solve-action", - "functionRef": { - "refName": "solve-math-exp-func", - "arguments": { - "expression": "${ .singleexpression }" - } - } - } - ], - "stateDataFilter": { - "output": "${ .results }" - }, - "end": true - } - ] -} \ No newline at end of file diff --git a/extensions/README.md b/extensions/README.md deleted file mode 100644 index bd582adf..00000000 --- a/extensions/README.md +++ /dev/null @@ -1,17 +0,0 @@ -# Extensions - -If you have an idea for a new workflow extension, or would like to enhance an existing one, -please open an `New Extension Request` issue in this repository. 
- -If you would like to contribute a new workflow extension to the specification, please do so via a new -Pull Request in this repository. - -Here is a list of available workflow extensions hosted by the Serverless Workflow specification. - -Click on the `Extension Id` link to see a detailed explanation of the extension. - -| Extension Id | Description | JSON Schema | -| --- | --- | --- | -| [kpi](kpi.md) | Define workflow key performance indicators (KPIs) | [kpi.json](../schema/extensions/kpi.json) | -| [ratelimiting](ratelimiting.md) | Define numerous rate limiting options for a workflow per single or all instances | [ratelimiting.json](../schema/extensions/ratelimiting.json) | - diff --git a/extensions/kpi.md b/extensions/kpi.md deleted file mode 100644 index bea8055b..00000000 --- a/extensions/kpi.md +++ /dev/null @@ -1,234 +0,0 @@ -# Extensions - KPI - -## Table of Contents - -- [Introduction](#Introduction) -- [Extension Definition](#Extension-Definition) - - [Workflow KPIs Definition](#Workflow-KPIs-Definition) - - [Event KPIs Definition](#Event-KPIs-Definition) - - [Function KPIs Definition](#Function-KPIs-Definition) - - [State KPIs Definition](#State-KPIs-Definition) - - [Thresholds Definition](#Thresholds-Definition) -- [Example](#Example) - -## Introduction - -Key performance indicators (KPIs) are important metrics for analyzing statistical data about workflows. - -KPIs can be used to: -* Show workflow efficiencies and inefficiencies. -* Help improve specific workflow activities in terms of defined criteria (performance, cost, etc.) -* Help demonstrate the overall workflow effectiveness against defined criteria (performance, cost, etc.) -* Help show progress towards intended workflow objectives - -The KPI extension allows you to define `expected` key performance indicators for the workflow model it references.
-KPIs can be added for the model: -* [Workflow definition](../specification.md#Workflow-Definition) -* [Function (services) definition](../specification.md#Function-Definition) -* [Event definitions](../specification.md#Event-Definition) -* [State definitions](../specification.md#State-Definition) - -## Extension Definition - -| Parameter | Description | Type | Required | -| --- | --- | --- | --- | -| extensionId | Unique extension Id (default is 'workflow-kpi-extension') | string | yes | -| [workflow](#Workflow-KPIs-Definition) | Workflow definition KPIs | object | no | -| [events](#Event-KPIs-Definition) | Workflow event definitions KPIs | array | no | -| [functions](#Function-KPIs-Definition) | Workflow function definitions KPIs | array | no | -| [states](#State-KPIs-Definition) | Workflow states definitions KPIs | array | no | - -#### Workflow KPIs Definition - -| Parameter | Description | Type | Required | -| --- | --- | --- | --- | -| [per](#Thresholds-Definition) | Define the kpi thresholds in terms of time and/or num of workflow instances| object | yes | -| maxInvoked | Max number of workflow invocations | string | no | -| minInvoked | Min number of workflow invocations | string | no | -| avgInvoked | Average number of workflow invocations | string | no | -| maxDuration | ISO 8601. Max duration of workflow execution | string | no | -| minDuration | ISO 8601. Min duration of workflow execution | string | no | -| avgDuration | ISO 8601. 
Average duration of workflow execution | string | no | -| maxCost | Max workflow execution cost | string | no | -| minCost | Min workflow execution cost | string | no | -| avgCost | Average workflow execution cost | string | no | - -#### Event KPIs Definition - -| Parameter | Description | Type | Required | -| --- | --- | --- | --- | -| for | References a unique event name in the defined workflow events | string | yes | -| [per](#Thresholds-Definition) | Define the kpi thresholds in terms of time and/or num of workflow instances| object | yes | -| maxConsumed | If the referenced event kind is 'consumed', the max number of times this event is consumed | string | yes | -| minConsumed | If the referenced event kind is 'consumed', the min number of times this event is consumed | string | no | -| avgConsumed | If the referenced event kind is 'consumed', the average number of times this event is consumed | string | no | -| maxProduced | If the referenced event kind is 'produced', the max number of times this event is produced | string | no | -| minProduced | If the referenced event kind is 'produced', the min number of times this event is produced | string | no | -| avgProduced | If the referenced event kind is 'produced', the average number of times this event is produced | string | no | - -#### Function KPIs Definition - -| Parameter | Description | Type | Required | -| --- | --- | --- | --- | -| for | References a unique function name in the defined workflow functions | string | yes | -| [per](#Thresholds-Definition) | Define the kpi thresholds in terms of time and/or num of workflow instances| object | yes | -| maxErrors | Max number of errors during function invocation | string | yes | -| maxRetry | Max number of retries done for this function invocation | string | no | -| maxTimeout | Max number of times the function timeout was reached | string | no | -| maxInvoked | Max number of invocations for the referenced function | string | no | -| minInvoked | Min
number of invocations for the referenced function | string | no | -| avgInvoked | Average number of invocations for the referenced function | string | no | -| maxCost | Max function execution cost | string | no | -| minCost | Min function execution cost | string | no | -| avgCost | Average function execution cost | string | no | - -#### State KPIs Definition - -| Parameter | Description | Type | Required | -| --- | --- | --- | --- | -| for | References a unique state name in the defined workflow states | string | yes | -| [per](#Thresholds-Definition) | Define the kpi thresholds in terms of time and/or num of workflow instances| object | yes | -| maxTimeout | Max number of times the state timeout was reached | string | no | -| maxExec | Max executions of the referenced state | string | no | -| minExec | Min executions of the referenced state | string | no | -| avgExec | Average executions of the referenced state | string | no | -| maxDuration | ISO 8601. Max duration of state execution | string | no | -| minDuration | ISO 8601. Min duration of state execution | string | no | -| avgDuration | ISO 8601. Average duration of state execution | string | no | -| maxCost | Max state execution cost | string | no | -| minCost | Min state execution cost | string | no | -| avgCost | Average state execution cost | string | no | - -#### Thresholds Definition - -| Parameter | Description | Type | Required | -| --- | --- | --- | --- | -| time | ISO 8601 time. Threshold time. Default is 1 day | string | no | -| instances | Threshold number of workflow instances | integer | no | - -## Example - -The following example shows a workflow definition on the left and -an associated sample KPIs extension definition on the right. We assume that our -extensions definition yaml is located in a resource accessible via URI: -`file://myextensions/kpi.yml`. - - - - - - - - - -
Workflow | KPIs Extension
- -```yaml -id: patientVitalsWorkflow -name: Monitor Patient Vitals -version: '1.0.0' -specVersion: '0.8' -start: MonitorVitals -extensions: - - extensionId: workflow-kpi-extension - path: file://myextensions/kpi.yml -events: - - name: HighBodyTemperature - type: org.monitor.highBodyTemp - source: monitoringSource - correlation: - - contextAttributeName: patientId - - name: HighBloodPressure - type: org.monitor.highBloodPressure - source: monitoringSource - correlation: - - contextAttributeName: patientId - - name: HighRespirationRate - type: org.monitor.highRespirationRate - source: monitoringSource - correlation: - - contextAttributeName: patientId -functions: - - name: callPulmonologist - operation: http://myapi.org/patientapi.json#callPulmonologist - - name: sendTylenolOrder - operation: http://myapi.org/patientapi.json#sendTylenol - - name: callNurse - operation: http://myapi.org/patientapi.json#callNurse -states: - - name: MonitorVitals - type: event - exclusive: true - onEvents: - - eventRefs: - - HighBodyTemperature - actions: - - functionRef: - refName: sendTylenolOrder - arguments: - patientid: "${ .patientId }" - - eventRefs: - - HighBloodPressure - actions: - - functionRef: - refName: callNurse - arguments: - patientid: "${ .patientId }" - - eventRefs: - - HighRespirationRate - actions: - - functionRef: - refName: callPulmonologist - arguments: - patientid: "${ .patientId }" - end: true -``` - - - -```yaml -extensionid: workflow-kpi-extension -currency: USD -workflow: - per: - time: PT1D - maxCost: '1300' - maxInvoked: '500' - minInvoked: '100' -events: -- for: HighBodyTemperature - per: - time: PT1D - avgConsumed: '50' -- for: HighBloodPressure - per: - time: PT1D - avgConsumed: '30' -functions: -- for: callPulmonologist - per: - instances: 1000 - maxCost: '400' - maxErrors: '5' - maxRetry: '10' - maxTimeout: '15' - avgInvoked: '40' -- for: sendTylenolOrder - per: - instances: 1000 - maxCost: '200' - maxErrors: '5' - maxRetry: '10' - maxTimeout: '15' - 
avgInvoked: '400' -states: -- for: MonitorVitals - per: - time: PT1D - maxCost: '300' - maxExec: '1000' - minExec: '50' -``` - -
\ No newline at end of file diff --git a/extensions/ratelimiting.md b/extensions/ratelimiting.md deleted file mode 100644 index 359c9360..00000000 --- a/extensions/ratelimiting.md +++ /dev/null @@ -1,120 +0,0 @@ -# Extensions - Rate Limiting - -## Table of Contents - -- [Introduction](#Introduction) -- [Extension Definition](#Extension-Definition) - - [Single Instance Definition](#Single-Instance-Definition) - - [All Instances Definition](#All-Instances-Definition) -- [Example](#Example) - -## Introduction - -Our workflows can execute numerous downstream services. Rate limiting can be used to protect these -downstream services from flooding. In addition, rate limiting can help us keep our cost -at a desired rate in cases where downstream service invocations have an associated cost factor. - -## Extension Definition - -| Parameter | Description | Type | Required | -| --- | --- | --- | --- | -| extensionid | Unique extension Id (default is 'workflow-ratelimiting-extension') | string | yes | -| [singleInstance](#Single-Instance-Definition) | Rate limits per single workflow instance | object | no | -| [allInstances](#All-Instances-Definition) | Rate limits per all workflow instances | object | no | - -### Single Instance Definition - -| Parameter | Description | Type | Required | -| --- | --- | --- | --- | -| maxActionsPerSecond | Sets the rate limit on the number of actions that can be executed per second. Notice that the number is represented as number type, so you can set it to less than 1 if needed. For example, setting it to 0.1 means workflow actions are executed at most once every 10 seconds. Default zero value means 'unlimited'| number | no | -| maxConcurrentActions | Maximum number of actions that can be executed in parallel | string | no | -| maxProducedEventsPerSecond |Sets the rate limit on the number of events that can be produced per second. Notice that the number is represented as number type, so you can set it to less than 1 if needed.
For example, setting it to 0.1 means the workflow can produce events at most once every 10 seconds. Default zero value means 'unlimited' | string | no | -| maxStates | Maximum number of workflow states that should be executed. Default is zero, meaning unlimited. | string | no | -| maxTransitions | Maximum number of workflow transitions that should be executed. Default is zero, meaning unlimited. | string | no | - -### All Instances Definition - -| Parameter | Description | Type | Required | -| --- | --- | --- | --- | -| maxActionsPerSecond | Sets the rate limit on the number of actions that can be executed per second. Notice that the number is represented as number type, so you can set it to less than 1 if needed. For example, setting it to 0.1 means workflow actions are executed at most once every 10 seconds. Default zero value means 'unlimited'| number | no | -| maxConcurrentActions | Maximum number of actions that can be executed in parallel | string | no | -| maxProducedEventsPerSecond |Sets the rate limit on the number of events that can be produced per second. Notice that the number is represented as number type, so you can set it to less than 1 if needed. For example, setting it to 0.1 means the workflow can produce events at most once every 10 seconds. Default zero value means 'unlimited' | string | no | -| maxStates | Maximum number of workflow states that should be executed. Default is zero, meaning unlimited. | string | no | -| maxTransitions | Maximum number of workflow transitions that should be executed. Default is zero, meaning unlimited. | string | no | - -## Example - -The following example shows a workflow definition on the left and -an associated sample Rate Limiting extension definition on the right. -We assume that our -extensions definition yaml is located in a resource accessible via URI: -`file://myextensions/ratelimiting.yml`. - - - - - - - - - -
Workflow | Rate Limiting Extension
- -```yaml -id: processapplication -name: Process Application -version: '1.0.0' -specVersion: '0.8' -extensions: - - extensionId: workflow-ratelimiting-extension - path: file://myextensions/ratelimiting.yml -start: ProcessNewApplication -states: - - name: ProcessNewApplication - type: event - onEvents: - - eventRefs: - - ApplicationReceivedEvent - actions: - - functionRef: processApplicationFunction - - functionRef: acceptApplicantFunction - - functionRef: depositFeesFunction - end: - produceEvents: - - eventRef: NotifyApplicantEvent -functions: - - name: processApplicationFunction - operation: file://myservice.json#process - - name: acceptApplicantFunction - operation: file://myservice.json#accept - - name: depositFeesFunction - operation: file://myservice.json#deposit -events: - - name: ApplicationReceivedEvent - type: application - source: "/applications/new" - - name: NotifyApplicantEvent - type: notifications - source: "/applicants/notify" -``` - - - -```yaml -extensionid: workflow-ratelimiting-extension -singleInstance: - maxActionsPerSecond: 0.1 - maxConcurrentActions: 200 - maxProducedEventsPerSecond: 2 - maxStates: '1000' - maxTransitions: '1000' -allInstances: - maxActionsPerSecond: 1 - maxConcurrentActions: 500 - maxProducedEventsPerSecond: 20 - maxStates: '10000' - maxTransitions: '10000' - -``` - -
\ No newline at end of file diff --git a/media/comparisons/bpmn/approvereport.png b/media/comparisons/bpmn/approvereport.png deleted file mode 100644 index 99e81736..00000000 Binary files a/media/comparisons/bpmn/approvereport.png and /dev/null differ diff --git a/media/comparisons/bpmn/error-with-retries.png b/media/comparisons/bpmn/error-with-retries.png deleted file mode 100644 index 95ab32df..00000000 Binary files a/media/comparisons/bpmn/error-with-retries.png and /dev/null differ diff --git a/media/comparisons/bpmn/event-decisions.png b/media/comparisons/bpmn/event-decisions.png deleted file mode 100644 index 2e31ce02..00000000 Binary files a/media/comparisons/bpmn/event-decisions.png and /dev/null differ diff --git a/media/comparisons/bpmn/exec-timeout.png b/media/comparisons/bpmn/exec-timeout.png deleted file mode 100644 index 19767e99..00000000 Binary files a/media/comparisons/bpmn/exec-timeout.png and /dev/null differ diff --git a/media/comparisons/bpmn/loop-subprocess.png b/media/comparisons/bpmn/loop-subprocess.png deleted file mode 100644 index fd4086ba..00000000 Binary files a/media/comparisons/bpmn/loop-subprocess.png and /dev/null differ diff --git a/media/comparisons/bpmn/multiinstance-subprocess.png b/media/comparisons/bpmn/multiinstance-subprocess.png deleted file mode 100644 index 9dc7990c..00000000 Binary files a/media/comparisons/bpmn/multiinstance-subprocess.png and /dev/null differ diff --git a/media/comparisons/bpmn/process-applicant.png b/media/comparisons/bpmn/process-applicant.png deleted file mode 100644 index 80f7783b..00000000 Binary files a/media/comparisons/bpmn/process-applicant.png and /dev/null differ diff --git a/media/comparisons/bpmn/simple-compensation.png b/media/comparisons/bpmn/simple-compensation.png deleted file mode 100644 index 38acd0b8..00000000 Binary files a/media/comparisons/bpmn/simple-compensation.png and /dev/null differ diff --git a/media/comparisons/bpmn/simple-file-processing.png 
b/media/comparisons/bpmn/simple-file-processing.png deleted file mode 100644 index 0e091274..00000000 Binary files a/media/comparisons/bpmn/simple-file-processing.png and /dev/null differ diff --git a/media/examples/example-accumulateroomreadings.png b/media/examples/example-accumulateroomreadings.png deleted file mode 100644 index 329a6a71..00000000 Binary files a/media/examples/example-accumulateroomreadings.png and /dev/null differ diff --git a/media/examples/example-asyncfunction.png b/media/examples/example-asyncfunction.png deleted file mode 100644 index 7cbed66b..00000000 Binary files a/media/examples/example-asyncfunction.png and /dev/null differ diff --git a/media/examples/example-asyncsubflow.png b/media/examples/example-asyncsubflow.png deleted file mode 100644 index 21cade0b..00000000 Binary files a/media/examples/example-asyncsubflow.png and /dev/null differ diff --git a/media/examples/example-booklending.png b/media/examples/example-booklending.png deleted file mode 100644 index 7f08c21a..00000000 Binary files a/media/examples/example-booklending.png and /dev/null differ diff --git a/media/examples/example-carauctionbid.png b/media/examples/example-carauctionbid.png deleted file mode 100644 index c277cb44..00000000 Binary files a/media/examples/example-carauctionbid.png and /dev/null differ diff --git a/media/examples/example-checkcarvitals.png b/media/examples/example-checkcarvitals.png deleted file mode 100644 index 89071c2e..00000000 Binary files a/media/examples/example-checkcarvitals.png and /dev/null differ diff --git a/media/examples/example-continueas.png b/media/examples/example-continueas.png deleted file mode 100644 index 211b6b02..00000000 Binary files a/media/examples/example-continueas.png and /dev/null differ diff --git a/media/examples/example-customercreditcheck.png b/media/examples/example-customercreditcheck.png deleted file mode 100644 index 2ea80a8e..00000000 Binary files a/media/examples/example-customercreditcheck.png and 
/dev/null differ diff --git a/media/examples/example-eventbasedgreeting.png b/media/examples/example-eventbasedgreeting.png deleted file mode 100644 index 9e10fa54..00000000 Binary files a/media/examples/example-eventbasedgreeting.png and /dev/null differ diff --git a/media/examples/example-eventbasedswitch.png b/media/examples/example-eventbasedswitch.png deleted file mode 100644 index 8f42f7c6..00000000 Binary files a/media/examples/example-eventbasedswitch.png and /dev/null differ diff --git a/media/examples/example-finalizecollegeapplication.png b/media/examples/example-finalizecollegeapplication.png deleted file mode 100644 index 4bd335a9..00000000 Binary files a/media/examples/example-finalizecollegeapplication.png and /dev/null differ diff --git a/media/examples/example-foodorder-outline.png b/media/examples/example-foodorder-outline.png deleted file mode 100644 index a6dc0a3a..00000000 Binary files a/media/examples/example-foodorder-outline.png and /dev/null differ diff --git a/media/examples/example-greeting.png b/media/examples/example-greeting.png deleted file mode 100644 index 0389b0d9..00000000 Binary files a/media/examples/example-greeting.png and /dev/null differ diff --git a/media/examples/example-handlerrors.png b/media/examples/example-handlerrors.png deleted file mode 100644 index d04e5b3f..00000000 Binary files a/media/examples/example-handlerrors.png and /dev/null differ diff --git a/media/examples/example-helloworld.png b/media/examples/example-helloworld.png deleted file mode 100644 index 6dd66b0a..00000000 Binary files a/media/examples/example-helloworld.png and /dev/null differ diff --git a/media/examples/example-looping.png b/media/examples/example-looping.png deleted file mode 100644 index 3224ab98..00000000 Binary files a/media/examples/example-looping.png and /dev/null differ diff --git a/media/examples/example-monitorpatientvitalsigns.png b/media/examples/example-monitorpatientvitalsigns.png deleted file mode 100644 index 
a785adb6..00000000 Binary files a/media/examples/example-monitorpatientvitalsigns.png and /dev/null differ diff --git a/media/examples/example-parallel.png b/media/examples/example-parallel.png deleted file mode 100644 index 32fdf97c..00000000 Binary files a/media/examples/example-parallel.png and /dev/null differ diff --git a/media/examples/example-patientonboarding.png b/media/examples/example-patientonboarding.png deleted file mode 100644 index 58c26396..00000000 Binary files a/media/examples/example-patientonboarding.png and /dev/null differ diff --git a/media/examples/example-periodicalexec.png b/media/examples/example-periodicalexec.png deleted file mode 100644 index 107c3711..00000000 Binary files a/media/examples/example-periodicalexec.png and /dev/null differ diff --git a/media/examples/example-purchaseorderdeadline.png b/media/examples/example-purchaseorderdeadline.png deleted file mode 100644 index 788b8cce..00000000 Binary files a/media/examples/example-purchaseorderdeadline.png and /dev/null differ diff --git a/media/examples/example-reusefunceventdefs.png b/media/examples/example-reusefunceventdefs.png deleted file mode 100644 index b5a60f4a..00000000 Binary files a/media/examples/example-reusefunceventdefs.png and /dev/null differ diff --git a/media/examples/example-sendcloudeentonworkflowcompletion.png b/media/examples/example-sendcloudeentonworkflowcompletion.png deleted file mode 100644 index e82c8f2e..00000000 Binary files a/media/examples/example-sendcloudeentonworkflowcompletion.png and /dev/null differ diff --git a/media/examples/example-switchstate.png b/media/examples/example-switchstate.png deleted file mode 100644 index c3c8c535..00000000 Binary files a/media/examples/example-switchstate.png and /dev/null differ diff --git a/media/examples/example-vetappointment.png b/media/examples/example-vetappointment.png deleted file mode 100644 index 1da7e804..00000000 Binary files a/media/examples/example-vetappointment.png and /dev/null differ diff 
--git a/media/examples/examples-fill-glass.png b/media/examples/examples-fill-glass.png deleted file mode 100644 index bc061e2e..00000000 Binary files a/media/examples/examples-fill-glass.png and /dev/null differ diff --git a/media/examples/examples-jobmonitoring.png b/media/examples/examples-jobmonitoring.png deleted file mode 100644 index 07fac9da..00000000 Binary files a/media/examples/examples-jobmonitoring.png and /dev/null differ diff --git a/media/references/loan-approval-workflow.bpmn b/media/references/loan-approval-workflow.bpmn deleted file mode 100644 index 026b475f..00000000 --- a/media/references/loan-approval-workflow.bpmn +++ /dev/null diff --git a/media/references/loan-approval-workflow.png b/media/references/loan-approval-workflow.png deleted file mode 100644 index 450a3aa0..00000000 Binary files a/media/references/loan-approval-workflow.png and /dev/null differ diff --git a/media/references/travel-booking-workflow.bpmn b/media/references/travel-booking-workflow.bpmn deleted file mode 100644 index fd48bc9d..00000000 --- a/media/references/travel-booking-workflow.bpmn +++ /dev/null diff --git a/media/references/travel-booking-workflow.png b/media/references/travel-booking-workflow.png deleted file mode 100644 index 0cfffbe0..00000000 Binary files a/media/references/travel-booking-workflow.png and /dev/null differ diff --git a/media/usecases/usecase-app-payment.png b/media/usecases/usecase-app-payment.png deleted file mode 100644 index dc8ff4be..00000000 Binary files a/media/usecases/usecase-app-payment.png and /dev/null differ diff --git a/media/usecases/usecase-continuous-integration.png b/media/usecases/usecase-continuous-integration.png deleted file mode 100644 index 956390fc..00000000 Binary files a/media/usecases/usecase-continuous-integration.png and /dev/null differ diff --git a/media/usecases/usecase-data-analysis.png b/media/usecases/usecase-data-analysis.png deleted file mode 100644 index 72405390..00000000 Binary files a/media/usecases/usecase-data-analysis.png and /dev/null differ diff --git a/media/usecases/usecase-error-notifications.png b/media/usecases/usecase-error-notifications.png deleted file mode 100644 index 55d8efcf..00000000 Binary files a/media/usecases/usecase-error-notifications.png and /dev/null differ diff --git a/media/usecases/usecase-vehicle-auction.png b/media/usecases/usecase-vehicle-auction.png deleted file mode 100644 index 48db662d..00000000 Binary files a/media/usecases/usecase-vehicle-auction.png and /dev/null differ diff --git a/meetingminutes/9-13-21-meeting-minutes.md b/meetingminutes/9-13-21-meeting-minutes.md deleted file mode 100644 index 710a22d7..00000000 ---
a/meetingminutes/9-13-21-meeting-minutes.md +++ /dev/null @@ -1,32 +0,0 @@ -# Serverless Workflow Community Meeting Minutes - -## Meeting Information -**Meeting Date/Time:** 9/13/21, 1pm EST -**Meeting Purpose:** Weekly project meeting -**Meeting Location:** https://zoom.us/my/cncfserverlesswg?pwd=YjNqYzhOdjRRd01YWFkzS1lHbDZqUT09 - -## Attendees -People who attended: -- Yuri -- Antonio -- Tihomir - -## Agenda Items - -Item | Description ----- | ---- -Agenda Item 1 | • Updates <br> • Annual report <br> • Questions
- -## Discussion Items -Item | Who | Notes | ----- | ---- | ---- | -Project graduation status | all | Will wait for v1.0 spec release | -CNCF Incubation status | all | Discussed what benefits it brings to project | - - -## Action Items -| Done? | Item | Responsible | Due Date | -| ---- | ---- | ---- | ---- | -| | Keep up discussions around using workflow states inside branches | tihomir, yuri | | - -## Other Notes & Information diff --git a/meetingminutes/9-20-21-meeting-minutes.md b/meetingminutes/9-20-21-meeting-minutes.md deleted file mode 100644 index cf47132e..00000000 --- a/meetingminutes/9-20-21-meeting-minutes.md +++ /dev/null @@ -1,34 +0,0 @@ -# Serverless Workflow Community Meeting Minutes - -## Meeting Information -**Meeting Date/Time:** 9/20/21, 1pm EST -**Meeting Purpose:** Weekly project meeting -**Meeting Location:** https://zoom.us/my/cncfserverlesswg?pwd=YjNqYzhOdjRRd01YWFkzS1lHbDZqUT09 - -## Attendees -People who attended: -- Yuri -- Antonio -- Ricardo -- Charles - -## Agenda Items - -Item | Description ----- | ---- -Agenda Item 1 | • Updates
• SDKs
• Questions
- -## Discussion Items -Item | Who | Notes | ----- | ---- | ---- | -Start planning v0.8 release and its contents | all | Should be done sooner than later | -SDKs status | all | Update SDKs to v0.7 | -Work on case studies | all | Video or blog-based | - - -## Action Items -| Done? | Item | Responsible | Due Date | -| ---- | ---- | ---- | ---- | -| | Keep up discussions around using workflow states inside branches | tihomir, yuri | | - -## Other Notes & Information diff --git a/meetingminutes/template.md b/meetingminutes/template.md deleted file mode 100644 index d22fe677..00000000 --- a/meetingminutes/template.md +++ /dev/null @@ -1,33 +0,0 @@ -# Serverless Workflow Community Meeting Minutes - -## Meeting Information -**Meeting Date/Time:** meeting_date, meeting_time -**Meeting Purpose:** Project meeting -**Meeting Location:** https://zoom.us/my/cncfserverlesswg?pwd=YjNqYzhOdjRRd01YWFkzS1lHbDZqUT09 - -## Attendees -People who attended: -- Person A -- Person B -- Person C - -## Agenda Items - -Item | Description ----- | ---- -Agenda Item 1 | •
<br> • -Agenda Item 2 | • <br>
• - -## Discussion Items -Item | Who | Notes | ----- | ---- | ---- | -item | who | notes | - - -## Action Items -| Done? | Item | Responsible | Due Date | -| ---- | ---- | ---- | ---- | -| | item | who | due_date | - -## Other Notes & Information -N/A \ No newline at end of file diff --git a/references/README.md b/references/README.md deleted file mode 100644 index 07986644..00000000 --- a/references/README.md +++ /dev/null @@ -1,175 +0,0 @@ -# References - -State machines/Workflows have a long history in software design and development. The goal of this section is to list other tools and languages that had been used to define orchestration between different actors (software and human). The main intention here is to make sure that we cover as many use cases as possible leveraging existing approaches to avoid pitfalls from the past. - -- [Workflow Patterns](#Workflow-Patterns) -- [Business Process Model and Notation](#Business-Process-Model-and-Notation) -- [Mistral Workflow Language](#Mistral-Workflow-Language) -- [Amazon States Language](#Amazon-States-Language) -- [Huawei FunctionGraph workflow definition](#Huawei-FunctionGraph-workflow-definition) -- [Flogo](#Flogo) -- [Alibaba FunctionFlow](#Alibaba-FunctionFlow) -- [Common Workflow Language](#Common-Workflow-Language) - -## Workflow Patterns -The research of the [Workflow Patterns Initiative](http://www.workflowpatterns.com/) provides a thorough examination of the various perspectives (control flow, data, resource, and exception handling) that need to be supported by a workflow language. - -## Business Process Model and Notation - -[Business Process Modeling and Notation (BPMN)](https://www.omg.org/spec/BPMN/) was standardized by Object Management Group (OMG) in collaboration with companies such as: IBM, Red Hat, Oracle, SAP, TIBCO, Software AG, among others. 
The latest version of the specification has even been [adopted by the ISO](https://www.iso.org/standard/62652.html) and provides a rich set of constructs to define workflows in a technology-agnostic way. One of the main advantages of the BPMN spec is that it visually defines what a workflow should look like and, most importantly, it defines the execution semantics of such workflows. [This article](https://www.esentri.com/bpmn-and-serverless-workflows/) describes how BPMN can be used for serverless workflows. - -
- BPMN model - - BPMN provides the following entities to define workflows: - -- Tasks: Orchestrate interactions between systems and people -- User Task - - Service Task - - Business Rule Task -- Events: Emitting and catching events that are relevant to a workflow instance - - Condition Events - - Message Event: Enable communication between different workflow instances - - Throw - - Catch - - Timer Events -- Gateways: Enable fork/join behaviors based on certain conditions - - Exclusive - - Parallel - - Complex -- Aggregation: Provide a mechanism to deal with complexity when workflows become too large to understand - - Embedded Sub Process - - Call Activity - -The [BPMN specification](https://www.omg.org/spec/BPMN/) provides XML Schemas for defining and validating workflow definitions. -
- -
- BPMN examples - - Here are BPMN diagrams covering two examples listed in the [Serverless Workflow Specification - Use Cases](../usecases/README.md) section: - - **Loan Approval Workflow** - - ![Loan Approval Example](../media/references/loan-approval-workflow.png) - - You can find the BPMN XML which can be executed in a number of Open Source and Proprietary engines [here](../media/references/loan-approval-workflow.bpmn) - - **Travel Booking Workflow** - - ![Travel Booking Example](../media/references/travel-booking-workflow.png) - - You can find the BPMN XML which can be executed in a number of Open Source and Proprietary engines [here](../media/references/travel-booking-workflow.bpmn) -
- -## Mistral Workflow Language - -[Mistral Workflow Language](https://docs.openstack.org/mistral/latest/user/wf_lang_v2.html) is a YAML-based task graph description. Unless a dependency (link/transition/requirement) is expressed between two tasks, all unconnected tasks in the workflow description would be executed. If a transition links one task to another, its execution depends on the predecessor. With transitions, successors of a task can be identified to create workflow graphs (type: *direct*). With dependencies, required tasks can be identified to create a dependency graph (type: *reverse*). Each workflow activation maintains a context and completes one task at a time. However, at the end of each task, the task may define more than one task to continue with (fork concurrent branches) and at the beginning of a task, the task may wait on the completion of other tasks (join). The workflow concludes when all branches/tasks have completed. - -
- Workflow Language details - A workflow describes a task graph, i.e. it consists of tasks that can be linked with transitions. - -**Workflow:** - -- type (direct or reverse) -- description -- input (required input parameters and optional default values) -- output (construct an output from the final context content) -- output-on-error (same as output but when the workflow goes into error) -- task-defaults (defaults for all tasks, unless tasks overwrites) - - pause-before - - wait-before - - wait-after - - timeout - - retry - - concurrency - - *direct-only* - - on-error (list of tasks which will run if the task has completed with an error) - - on-success (list of tasks which will run if the task has completed successfully) - - on-complete (regardless if successful or not) - - *reverse-only* - - requires (for reverse workflows that express requires-dependencies instead of on-xxx forward control) -- tasks (dictionary of all tasks) - -**Task:** - -- name -- description -- action or workflow, otherwise it's a no-op -- input (constructs action/subworkflow input parameters from the context of the task) -- publish (decides which action/subworkflow outputs are put into the context) -- publish-on-error -- with-items (processes items of a collection, i.e. the action/workflow executes multiple times) -- keep-result (can be used to discard the action/subworkflow output) -- target (which worker should execute the task) -- pause-before -- wait-before -- wait-after -- fail-on -- timeout -- retry (with count, delay, break-on, continue-on) -- concurrency (max concurrent actions, see with-items) - -
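To make the shape of the language concrete, here is a minimal sketch of a Mistral v2 "direct" workflow; the workflow, task, and input names are hypothetical illustrations, while `std.echo` and `std.noop` are standard actions shipped with Mistral:

```yaml
# Hypothetical Mistral v2 workflow: two tasks linked by an on-success transition.
version: '2.0'

process_application:
  type: direct
  input:
    - applicant_name
  tasks:
    validate:
      # std.echo is a standard Mistral action; the YAQL expression reads from the context
      action: std.echo output=<% $.applicant_name %>
      on-success:
        - notify
    notify:
      # no-op placeholder for a real notification action
      action: std.noop
```

When `validate` completes successfully, the engine forks to the tasks listed under `on-success`; the workflow concludes once no successor tasks remain.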
- -## Amazon States Language - -[Amazon States Language](https://states-language.net/spec.html) is a JSON-based DSL to define AWS Step Functions, a workflow-like execution of AWS Lambda serverless functions. The language inspired the original draft of the Serverless Workflow specification. The copyright lies with Amazon.com Inc. or Affiliates, and the license excludes modification or merging of the specification. The workflow is orchestrated by the engine that invokes the serverless functions (resources) referenced by the workflow. - -## Huawei FunctionGraph workflow definition - -[Huawei FunctionGraph workflow definition](https://support.huaweicloud.com/en-us/productdesc-functiongraph/functiongraph_01_0100.html) served as the [initial draft](https://github.com/cncf/wg-serverless/commit/e42aaabb2c5dd78d0bd638b5cc8be0cd771101a4#diff-bc18ddd43c9fef122edf80ec220f04bb) of this specification and is very similar to the Amazon States Language. - -## Flogo - -[TIBCO's Project Flogo™](http://www.flogo.io) defines applications as triggers, handlers, and actions to create workflows. Its [current support for AWS Lambda](https://tibcosoftware.github.io/flogo/labs/flogo-lambda/) wraps the entire workflow as an embedded application into a single Lambda function. - -## Alibaba FunctionFlow - -FunctionFlow defines workflows as YAML arrays of steps with simple control logic: execution starts with the first step in the array, jumps are expressed with goto, and a flow ends either by reaching the last step in the array or by marking a step as the end of the flow. - -
- FunctionFlow FDL (Flow Definition Language) model - - The language is documented [here](https://help.aliyun.com/document_detail/122492.html). - The following entities have been extracted from [fnf examples](https://github.com/awesome-fnf). - - Each flow activation maintains a context addressable with XPath (JSONPath). The event that triggered the execution is provided in $.input, outputs of serverless functions are available in $.local, and unless outputMappings are specified, $.local is passed on. - -flow: - -- steps lists the steps to be executed (using goto) -- outputMappings to map the workflow output to a response - -step types: - -- task (invoke serverless function) - - resourceArn that points to the function - - inputMappings to map input data to parameters of the serverless function - - retry to retry on errors or specific outcomes, with back-off intervals and a maximum number of attempts - - catch to jump to a different state upon errors -- succeed (an end state) -- fail (an end state) -- wait -- pass (useful for mapping of data) -- choice - - inputMappings - - choices (condition + goto) - - default -- parallel -- foreach - - inputMappings - - iterationMapping (to define branching) - -
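The entities above can be sketched as a small FDL flow. This is an approximation for illustration only: the step names and the `resourceArn` are hypothetical, and exact field spellings should be checked against the linked FDL documentation:

```yaml
# Hypothetical FunctionFlow (FDL) flow: task -> choice -> succeed/fail
version: v1
type: flow
steps:
  - type: task
    name: checkOrder
    # points to the serverless function to invoke (hypothetical ARN)
    resourceArn: acs:fc:cn-hangzhou:123456789:services/orders/functions/check
  - type: choice
    name: routeOrder
    choices:
      # conditions are evaluated against the flow context; goto jumps to a named step
      - condition: $.local.approved == true
        goto: finish
    default:
      goto: rejectOrder
  - type: fail
    name: rejectOrder
    error: OrderRejected
  - type: succeed
    name: finish
```

Note how control flow is linear through the `steps` array unless a `goto` redirects it, and how `succeed`/`fail` mark explicit end states.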
- -## Common Workflow Language - -The [Common Workflow Language (CWL)](https://www.commonwl.org/) is an open standard for describing analysis workflows and tools in a way that makes them portable and scalable across a variety of software and hardware environments, from workstations to cluster, cloud, and high-performance computing (HPC) environments. CWL is designed to meet the needs of data-intensive science, such as bioinformatics, medical imaging, astronomy, high-energy physics, and machine learning. - -## Others - -If you know of other related tools, languages, or projects that can help make this specification better, more scoped, and able to serve a wider range of use cases, please feel free to get in touch or send a PR to this section. diff --git a/roadmap/README.md b/roadmap/README.md deleted file mode 100644 index f9add27d..00000000 --- a/roadmap/README.md +++ /dev/null @@ -1,177 +0,0 @@ -# Specification Roadmap - -_Note: Items in tables for each milestone do not imply an order of implementation._ - -_Note: Milestone entries include the most notable updates only.
For a list of all commits, see [link](https://github.com/cncf/wg-serverless/commits/master)_ - -_Status description:_ - -| Completed | In Progress | In Planning | On Hold | -| :--: | :--: | :--: | :--: | -| ✔ | ✏️ | 🚩 | ❗️| - -## Releases - -- [Roadmap for next planned release](#v09) -- [v0.8 released Nov 2021](#v08) -- [v0.7 released Aug 2021](#v07) -- [v0.6 released March 2021](#v06) -- [v0.5 released November 2020](#v05) -- [v0.1 released April 2020](#v01) - -## Next planned release - -| Status | Description | Comments | -| --- | --- | --- | -| ✔️| Fix support for workflow extensions | [spec doc](https://github.com/serverlessworkflow/specification/blob/main/specification.md) | -| ✔️| Fix state execution timeout | [spec doc](https://github.com/serverlessworkflow/specification/blob/main/specification.md) | -| ✔️| Update rules of retries increment and multiplier properties | [spec doc](https://github.com/serverlessworkflow/specification/blob/main/specification.md) | -| ✔️| Add clarification on mutually exclusive properties | [spec doc](https://github.com/serverlessworkflow/specification/blob/main/specification.md) | -| ✔️| Make the `resultEventRef` attribute in `EventRef` definition not required | [spec doc](https://github.com/serverlessworkflow/specification/blob/main/specification.md#EventRef-Definition) | -| ✔️| Make the `stateName` attribute in `start` definition not required | [spec doc](https://github.com/serverlessworkflow/specification/blob/main/specification.md#EventRef-Definition) | -| ✔️| Remove `id` attribute from `actions` and `states`.
Now, the names from both attributes must be unique within the workflow definition | [spec doc](https://github.com/serverlessworkflow/specification/blob/main/specification.md#transitions) -| ✔️| Update eventRef props to `produceEventRef` and `consumeEventRef` | [spec doc](https://github.com/serverlessworkflow/specification/blob/main/specification.md#EventRef-Definition) | -| ✔️| Update eventRef props to `resultEventTimeout` and `consumeEventTimeout` | [spec doc](https://github.com/serverlessworkflow/specification/blob/main/specification.md#EventRef-Definition) | -| ✔️| Apply fixes to auth spec schema | [workflow schema](https://github.com/serverlessworkflow/specification/tree/main/schema) | -| ✔️| Update the `dataInputSchema` top-level property by supporting the assignment of a JSON schema object | [workflow schema](https://github.com/serverlessworkflow/specification/tree/main/specification.md#workflow-definition-structure) | -| ✔️| Add the new `WORKFLOW` reserved keyword to workflow expressions | -| ✔️| Update `ForEach` state iteration parameter example.
This parameter is an expression variable, not a JSON property | -| ✔️| Add the new `rest` function type [spec doc](https://github.com/serverlessworkflow/specification/tree/main/specification.md#using-functions-for-restful-service-invocations) | -| ✔️| Refactor error handling and retries | -| ✏️️| Add inline state defs in branches | | -| ✏️️| Add "completedBy" functionality | | -| ✏️️| Define workflow context | | -| ✏️️| Start work on TCK | | -| ✏️️| Add integration with open-source runtimes | | -| ✏️️| Add SDKs for more languages (Python, PHP, Rust, etc) | | -| ✏️️| Add more samples | | -| ✏️️| Enforce SemVer `version` | | -| ✏️️| Add `dataOutputSchema` | | - -## Released version v0.8 - -| Status | Description | Comments | -| --- | --- | --- | -| ✔️| Support custom function `type` definition | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.8.x/specification.md) | -| ✔️| Workflow "name" no longer a required property | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.8.x/specification.md) | -| ✔️| Workflow "start" no longer a required property | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.8.x/specification.md) | -| ✔️| ForEach state "iterationParam" no longer a required property | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.8.x/specification.md) | -| ✔️| Added "useData" for eventDataFilter, and "useResults" for actionDataFilter | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.8.x/specification.md) | -| ✔️| Added "resultEventTimeout" for action eventref | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.8.x/specification.md) | -| ✔️| Added example for "continueAs" | [examples doc](https://github.com/serverlessworkflow/specification/blob/0.8.x/examples/README.md) | -| ✔️️| Support for async action invocation | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.8.x/specification.md) | -| ✔️️| Support for action 
condition | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.8.x/specification.md) | - - -## Released version v0.7 - -| Status | Description | Comments | -| --- | --- | --- | -| ✔️| Add workflow `key` and `annotations` properties | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.7.x/specification.md) | -| ✔️| Replaced SubFlow state with subflow action type | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.7.x/specification.md) | -| ✔️| Add workflow `dataInputSchema` property | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.7.x/specification.md) | -| ✔️| Rename switch state `default` to `defaultCondition` to avoid keyword conflicts for SDK's | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.7.x/specification.md) | -| ✔️| Add description of additional properties | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.7.x/specification.md) | -| ✔️| Rename Parallel `completionType` values | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.7.x/specification.md) | -| ✔️| Removed `workflowId` from ParallelState and ForEach states (use subFlow action instead) | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.7.x/specification.md) | -| ✔️| Add subflow actions `version` property | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.7.x/specification.md) | -| ✔️| Renamed `schemaVersion` to `specVersion` and it is now a required parameter | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.7.x/specification.md) | -| ✔️| Add GraphQL support for function definitions | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.7.x/specification.md) | -| ✔️| Added "dataOnly" property to Event Definitions (allow event data filters to access entire event) | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.7.x/specification.md) | -| ✔️| Added 
support for Secrets and Constants | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.7.x/specification.md) | -| ✔️| Changed default value of execution timeout `interrupt` property. This is a non-backwards compatible changes. | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.7.x/specification.md) | -| ✔️| Updated workflow timeouts | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.7.x/specification.md) | -| ✔️| Added Workflow Auth definitions | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.7.x/specification.md) | -| ✔️| Added State execution timeouts | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.7.x/specification.md) | -| ✔️| Temporarily removed `waitForCompletion` for subflows | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.7.x/specification.md) | -| ✔️| Added function definition support for OData | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.7.x/specification.md) | -| ✔️| Added function definition support for AsyncAPI | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.7.x/specification.md) | -| ✔️| Rename Delay state to Sleep state | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.7.x/specification.md) | -| ✔️| Added 'sleep' property to action definition | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.7.x/specification.md) | -| ✔️| Added Rate Limiting extension | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.7.x/specification.md) | -| ✔️| Update ForEach state - adding sequential exec option and batch size for parallel option | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.7.x/specification.md) | -| ✔️| Update to error handling and retries. Retries are now per action rather than per state. 
Added option of automatic retries for actions | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.7.x/specification.md) |
-| ✔️| Added "continueAs" property to end definitions | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.7.x/specification.md) |
-
-## Released version v0.6
-
-| Status | Description | Comments |
-| --- | --- | --- |
-| ✔️| Adding Workflow Compensation capabilities (cf. [Compensating Transaction](https://docs.microsoft.com/en-us/azure/architecture/patterns/compensating-transaction), [SAGA pattern](https://microservices.io/patterns/data/saga.html)) | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.6.x/specification.md) |
-| ✔️| Adding comparison examples with Google Cloud Workflow language| [comparisons doc](https://github.com/serverlessworkflow/specification/blob/0.6.x/comparisons/README.md) |
-| ✔️| Updates to retry functionality | [retries: exponential backoff & max backoff](https://github.com/serverlessworkflow/specification/issues/137) [retries: max-attempts & interval](https://github.com/serverlessworkflow/specification/issues/136)|
-| ✔️| Update "directInvoke" property type | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.6.x/specification.md) |
-| ✔️| Data schema input/output update | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.6.x/specification.md) |
-| ✔️| Updating start and end state definitions| [spec doc](https://github.com/serverlessworkflow/specification/blob/0.6.x/specification.md) |
-| ✔️| Update cron definition (adding validUntil parameter)| [spec doc](https://github.com/serverlessworkflow/specification/blob/0.6.x/specification.md) |
-| ✔️| Adding comparison examples with Temporal | [comparison doc](https://github.com/serverlessworkflow/specification/blob/0.6.x/comparisons/README.md) |
-| ✔️| Simplified functionRef and transition properties | [spec 
doc](https://github.com/serverlessworkflow/specification/blob/0.6.x/specification.md) |
-| ✔️| Adding comparison examples with Cadence | [comparison doc](https://github.com/serverlessworkflow/specification/blob/0.6.x/comparisons/README.md) |
-| ✔️| Adding workflow execTimeout and keepActive properties | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.6.x/specification.md) |
-| ✔️| Adding SubFlow state repeat (loop) ability | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.6.x/specification.md) |
-| ✔️| Adding comparison examples with BPMN | [comparison doc](https://github.com/serverlessworkflow/specification/blob/0.6.x/comparisons/README.md) |
-| ✔️| Adding RPC type to function definitions (gRPC) | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.6.x/specification.md) |
-| ✔️| Change function definition 'parameters' to 'arguments' | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.6.x/specification.md) |
-| ✔️| Replace JsonPath with jq | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.6.x/specification.md) |
-| ✔️| Update start definition (move to top-level workflow param) | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.6.x/specification.md) |
-| ✔️| Updated schedule definition | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.6.x/specification.md) |
-| ✔️| Update data filters | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.6.x/specification.md) |
-
-## Released version v0.5
-
-| Status | Description | Comments |
-| --- | --- | --- |
-| ✔ | Update Switch State | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.5.x/specification.md) |
-| ✔ | Rename Relay to Inject state | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.5.x/specification.md) |
-| ✔️| Update waitForCompletion property of Parallel State | [spec 
doc](https://github.com/serverlessworkflow/specification/blob/0.5.x/specification.md) | -| ✔️| Add timeout property to actions | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.5.x/specification.md) | -| ✔️| Add examples comparing Argo workflow and spec markups | [examples doc](https://github.com/serverlessworkflow/specification/blob/0.5.x/examples/examples-argo.md) | -| ✔️| Add ability to produce events during state transitions | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.5.x/specification.md) | -| ✔️| Add event-based condition capabilities to Switch State | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.5.x/specification.md) | -| ✔️| Add examples comparing Brigade workflow and spec markups | [examples doc](https://github.com/serverlessworkflow/specification/blob/0.5.x/examples/examples-brigade.md) | -| ✔️| Update produceEvent data property | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.5.x/specification.md) | -| ✔️| Change uppercase property and enum types to lowercase | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.5.x/specification.md) | -| ✔️| Add Parallel State Exception Handling section | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.5.x/specification.md) | -| ✔️| Add Go SDK | [sdk repo](https://github.com/serverlessworkflow/sdk-go) | -| ✔️| Add Java SDK | [sdk repo](https://github.com/serverlessworkflow/sdk-java) | -| ✔️| Allow to define events as produced or consumed | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.5.x/specification.md) | -| ✔️| Add "triggered" start definition | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.5.x/specification.md) | -| ✔️| Update scheduled start definition - adding cron def | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.5.x/specification.md) | -| ✔️| Add ability to reference trigger and result events in actions 
| [spec doc](https://github.com/serverlessworkflow/specification/blob/0.5.x/specification.md) |
-| ✔️| Expand event correlation capabilities | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.5.x/specification.md) |
-| ✔️| Only use JsonPath expressions (remove need for expression languages other than JsonPath) | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.5.x/specification.md) |
-| ✔️| Update workflow extensions | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.5.x/specification.md) |
-| ✔️| Add Workflow KPIs extension | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.5.x/specification.md) |
-| ✔️| Add Workflow Validation to Java SDK | [sdk repo](https://github.com/serverlessworkflow/sdk-java) |
-| ✔️| Update Switch state conditions and default definition | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.5.x/specification.md) |
-| ✔️| Update transitions and end definition 'produceEvents' definition | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.5.x/specification.md) |
-| ✔️| Events definition update - add convenience way to define multiple events that share properties | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.5.x/specification.md) |
-| ✔️| Update to function and events definitions - allow inline array def as well as uri reference to external resource | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.5.x/specification.md) |
-| ✔️| Enforce use of OpenAPI specification in function definitions for portability | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.5.x/specification.md) |
-| ✔️| Update workflow Error Handling | [spec doc](https://github.com/serverlessworkflow/specification/blob/0.5.x/specification.md) |
-
-## Released version v0.1
-
-| Status | Description | Comments |
-| :--: | --- | --- |
-| ✔ | Establish governance, contributing guidelines and initial stakeholders | [governance 
doc](https://github.com/cncf/wg-serverless/tree/v0.1/workflow/spec/governance) | -| ✔ | Define specification goals | [spec doc](https://github.com/cncf/wg-serverless/blob/v0.1/workflow/spec/spec.md) | -| ✔ | Define specification functional scope | [spec doc](https://github.com/cncf/wg-serverless/blob/v0.1/workflow/spec/spec.md) | -| ✔ | Include set of use-cases for Serverless Workflow | [usecases doc](https://github.com/cncf/wg-serverless/blob/v0.1/workflow/spec/usecases.md) | -| ✔ | Include set of examples for Serverless Workflow | [examples doc](https://github.com/cncf/wg-serverless/blob/v0.1/workflow/spec/examples.md) | -| ✔ | Define specification JSON Schema | [spec doc](https://github.com/cncf/wg-serverless/blob/v0.1/workflow/spec/spec.md) | -| ✔ | Add SubFlow state | [spec doc](https://github.com/cncf/wg-serverless/blob/v0.1/workflow/spec/spec.md) | -| ✔ | Add Relay state | [spec doc](https://github.com/cncf/wg-serverless/blob/v0.1/workflow/spec/spec.md) | -| ✔ | Add ForEach state | [spec doc](https://github.com/cncf/wg-serverless/blob/v0.1/workflow/spec/spec.md) | -| ✔ | Update Event state| [spec doc](https://github.com/cncf/wg-serverless/blob/v0.1/workflow/spec/spec.md) | -| ✔ | Define Workflow data input/output | [spec doc](https://github.com/cncf/wg-serverless/blob/v0.1/workflow/spec/spec.md) | -| ✔ | Update state data filtering | [spec doc](https://github.com/cncf/wg-serverless/blob/v0.1/workflow/spec/spec.md) | -| ✔ | Clearly define workflow info passing | [spec doc](https://github.com/cncf/wg-serverless/blob/v0.1/workflow/spec/spec.md) | -| ✔ | Add Workflow error handling | [spec doc](https://github.com/cncf/wg-serverless/blob/v0.1/workflow/spec/spec.md) | -| ✔ | Add reusable function definitions | [spec doc](https://github.com/cncf/wg-serverless/blob/v0.1/workflow/spec/spec.md) | -| ✔ | Add support for YAML definitions | [spec doc](https://github.com/cncf/wg-serverless/blob/v0.1/workflow/spec/spec.md) | -| ✔ | Update workflow end definition | [spec 
doc](https://github.com/cncf/wg-serverless/blob/v0.1/workflow/spec/spec.md) | -| ✔ | Add Callback state | [spec doc](https://github.com/cncf/wg-serverless/blob/v0.1/workflow/spec/spec.md) | -| ✔ | Add workflow metadata | [spec doc](https://github.com/cncf/wg-serverless/blob/v0.1/workflow/spec/spec.md) | -| ✔️| Update workflow start definition | [spec doc](https://github.com/cncf/wg-serverless/blob/v0.1/workflow/spec/spec.md) | -| ✔️| Prepare github branch and docs for v0.1 | [branch](https://github.com/cncf/wg-serverless/tree/v0.1/workflow/spec) | diff --git a/schema/auth.json b/schema/auth.json deleted file mode 100644 index d949e7c4..00000000 --- a/schema/auth.json +++ /dev/null @@ -1,195 +0,0 @@ -{ - "$id": "https://serverlessworkflow.io/schemas/0.9/auth.json", - "$schema": "http://json-schema.org/draft-07/schema#", - "description": "Serverless Workflow specification - auth schema", - "type": "object", - "auth": { - "oneOf": [ - { - "type": "string", - "format": "uri", - "description": "URI to a resource containing auth definitions (json or yaml)" - }, - { - "type": "array", - "description": "Workflow auth definitions", - "items": { - "type": "object", - "$ref": "#/definitions/authdef" - }, - "additionalItems": false, - "minItems": 1 - } - ] - }, - "required": [ - "auth" - ], - "definitions": { - "authdef": { - "type": "object", - "properties": { - "name": { - "type": "string", - "description": "Unique auth definition name", - "minLength": 1, - "pattern": "^[a-z0-9](-?[a-z0-9])*$" - }, - "scheme": { - "type": "string", - "description": "Defines the auth type", - "enum": [ - "basic", - "bearer", - "oauth2" - ], - "default": "basic" - }, - "properties": { - "oneOf": [ - { - "type": "string", - "description": "Expression referencing a workflow secret that contains all needed auth info" - }, - { - "title": "Basic Auth Info", - "$ref": "#/definitions/basicpropsdef" - }, - { - "title": "Bearer Auth Info State", - "$ref": "#/definitions/bearerpropsdef" - }, - { - 
"title": "OAuth2 Info", - "$ref": "#/definitions/oauth2propsdef" - } - ] - } - }, - "required": [ - "name", - "properties" - ] - }, - "basicpropsdef": { - "type": "object", - "description": "Basic auth information", - "properties": { - "username": { - "type": "string", - "description": "String or a workflow expression. Contains the user name", - "minLength": 1 - }, - "password": { - "type": "string", - "description": "String or a workflow expression. Contains the user password", - "minLength": 1 - }, - "metadata": { - "$ref": "common.json#/definitions/metadata" - } - }, - "required": [ - "username", - "password" - ], - "additionalProperties": false - }, - "bearerpropsdef": { - "type": "object", - "description": "Bearer auth information", - "properties": { - "token": { - "type": "string", - "description": "String or a workflow expression. Contains the token", - "minLength": 1 - }, - "metadata": { - "$ref": "common.json#/definitions/metadata" - } - }, - "required": [ - "token" - ], - "additionalProperties": false - }, - "oauth2propsdef": { - "type": "object", - "description": "OAuth2 information", - "properties": { - "authority": { - "type": "string", - "description": "String or a workflow expression. Contains the authority information", - "minLength": 1 - }, - "grantType": { - "type": "string", - "description": "Defines the grant type", - "enum": [ - "password", - "clientCredentials", - "tokenExchange" - ], - "additionalItems": false - }, - "clientId": { - "type": "string", - "description": "String or a workflow expression. Contains the client identifier", - "minLength": 1 - }, - "clientSecret": { - "type": "string", - "description": "Workflow secret or a workflow expression. Contains the client secret", - "minLength": 1 - }, - "scopes": { - "type": "array", - "description": "Array containing strings or workflow expressions. 
Contains the OAuth2 scopes", - "items": { - "type": "string" - }, - "minItems": 1, - "additionalItems": false - }, - "username": { - "type": "string", - "description": "String or a workflow expression. Contains the user name. Used only if grantType is 'resourceOwner'", - "minLength": 1 - }, - "password": { - "type": "string", - "description": "String or a workflow expression. Contains the user password. Used only if grantType is 'resourceOwner'", - "minLength": 1 - }, - "audiences": { - "type": "array", - "description": "Array containing strings or workflow expressions. Contains the OAuth2 audiences", - "items": { - "type": "string" - }, - "minItems": 1, - "additionalItems": false - }, - "subjectToken": { - "type": "string", - "description": "String or a workflow expression. Contains the subject token", - "minLength": 1 - }, - "requestedSubject": { - "type": "string", - "description": "String or a workflow expression. Contains the requested subject", - "minLength": 1 - }, - "requestedIssuer": { - "type": "string", - "description": "String or a workflow expression. 
Contains the requested issuer", - "minLength": 1 - }, - "metadata": { - "$ref": "common.json#/definitions/metadata" - } - }, - "required": ["grantType", "clientId"] - } - } -} \ No newline at end of file diff --git a/schema/common.json b/schema/common.json deleted file mode 100644 index 4322f834..00000000 --- a/schema/common.json +++ /dev/null @@ -1,15 +0,0 @@ -{ - "$id": "https://serverlessworkflow.io/schemas/0.9/common.json", - "$schema": "http://json-schema.org/draft-07/schema#", - "description": "Serverless Workflow specification - common schema", - "type": "object", - "definitions": { - "metadata": { - "type": "object", - "description": "Metadata information", - "additionalProperties": { - "type": "string" - } - } - } -} \ No newline at end of file diff --git a/schema/errors.json b/schema/errors.json deleted file mode 100644 index 2026d7b4..00000000 --- a/schema/errors.json +++ /dev/null @@ -1,334 +0,0 @@ -{ - "$id":"https://serverlessworkflow.io/schemas/0.9/errors.json", - "$schema":"http://json-schema.org/draft-07/schema#", - "description":"Serverless Workflow specification - errors schema", - "type":"object", - "errors":{ - "oneOf":[ - { - "type":"string", - "format":"uri", - "description":"URI to a resource containing workflow error strategy (json or yaml)" - }, - { - "type":"object", - "description":"The workflow's error handling configuration, including error definitions, error handlers and error policies", - "properties":{ - "definitions":{ - "type":"array", - "description":"Defines errors that can be explicitly handled and/or thrown during workflow execution", - "items":{ - "$ref":"#/definitions/errorDefinition" - }, - "additionalItems":false, - "minItems":1 - }, - "handlers":{ - "type":"array", - "description":"Defines error handlers used to configure what to do when catching specific errors", - "items":{ - "$ref":"#/definitions/errorHandler" - }, - "additionalItems":false, - "minItems":1 - }, - "policies":{ - "type":"array", - "description":"Defines 
groups of error handlers that define reusable error policies",
-            "items":{
-              "$ref":"#/definitions/errorPolicy"
-            },
-            "additionalItems":false,
-            "minItems":1
-          }
-        }
-      }
-    ]
-  },
-  "required":[
-    "errors"
-  ],
-  "definitions":{
-    "errorDefinition":{
-      "type":"object",
-      "properties":{
-        "name":{
-          "type":"string",
-          "description":"The name of the error. Must follow the Serverless Workflow Naming Convention.",
-          "pattern":"^[a-z0-9](-?[a-z0-9])*$",
-          "minLength": 1
-        },
-        "source":{
-          "type":"string",
-          "description":"An RFC6901 JSON pointer that precisely identifies the component within a workflow definition (ex: funcRef, subflowRef, ...) from which the described issue originates",
-          "pattern":"^(\/(([^\/~])|(~[01]))*)+$",
-          "minLength": 1
-        },
-        "type":{
-          "type":"string",
-          "description":"An RFC3986 URI reference that identifies the problem type. The RFC7807 Problem Details specification encourages that, when dereferenced, it provides human-readable documentation for the problem type (e.g., using HTML)",
-          "format":"uri",
-          "minLength": 1
-        },
-        "status":{
-          "type":"integer",
-          "description":"The status code generated by the origin for an occurrence of a problem. Status codes are extensible by nature and runtimes are not required to understand the meaning of all defined status codes. However, for cross-compatibility purposes, the specification encourages using RFC7231 HTTP Status Codes"
-        },
-        "title":{
-          "type":"string",
-          "description":"A short, human-readable summary of a problem type. 
It SHOULD NOT change from occurrence to occurrence of a problem, except for purposes of localization" - }, - "detail":{ - "type":"string", - "description":"A human-readable explanation specific to an occurrence of a problem" - } - }, - "additionalProperties":false, - "required":[ - "name", - "type", - "status", - "source" - ] - }, - "errorReference":{ - "oneOf":[ - { - "type":"object", - "properties":{ - "refName":{ - "type":"string", - "minLength": 1 - } - }, - "required":[ - "refName" - ], - "additionalProperties":false - }, - { - "type":"object", - "properties":{ - "instance":{ - "type":"string", - "minLength": 1 - }, - "type":{ - "type":"string", - "minLength": 1 - }, - "status":{ - "oneOf": [ - { - "type": "string", - "minLength": 1 - }, - { - "type": "integer" - } - ] - } - }, - "minProperties":1, - "additionalProperties":false - } - ] - }, - "throw":{ - "oneOf":[ - { - "type":"boolean", - "description":"If true, rethrows the caught error as is" - }, - { - "type":"object", - "anyOf":[ - { - "required":[ - "refName" - ], - "properties":{ - "refName":{ - "type":"string" - } - } - }, - { - "required":[ - "type", - "status" - ], - "properties":{ - "type":{ - "type":"string", - "minLength": 1 - }, - "status":{ - "anyOf":[ - { - "type":"integer" - }, - { - "type":"string", - "minLength": 1 - } - ] - }, - "title":{ - "type":"string", - "minLength": 1 - }, - "detail":{ - "type":"string", - "minLength": 1 - } - } - } - ] - } - ] - }, - "errorHandler":{ - "type":"object", - "properties":{ - "name":{ - "type":"string", - "description":"The unique name which is used to reference the handler.", - "pattern":"^[a-z0-9](-?[a-z0-9])*$", - "minLength": 1 - }, - "when":{ - "type":"array", - "items":{ - "$ref":"#/definitions/errorReference" - }, - "description":"References the errors to handle. 
If null, and if `exceptWhen` is null, all errors are caught.", - "additionalItems":false, - "minItems":1 - }, - "exceptWhen":{ - "type":"array", - "items":{ - "$ref":"#/definitions/errorReference" - }, - "description":"References the errors not to handle. If null, and if `when` is null, all errors are caught.", - "additionalItems":false, - "minItems": 1 - }, - "retry":{ - "oneOf":[ - { - "type":"string", - "description":"The unique name of the retry definition to use" - }, - { - "$ref":"retries.json#/definitions/retrydef", - "description":"The inline retry definition to use" - } - ] - }, - "then":{ - "$ref":"#/definitions/errorOutcomeDefinition" - } - }, - "required":[ - "name" - ] - }, - "errorOutcomeDefinition":{ - "type":"object", - "properties":{ - "compensate":{ - "type":"string", - "description":"Unique Name of a workflow state which is responsible for compensation" - }, - "end":{ - "$ref":"workflow.json#/definitions/end" - }, - "transition":{ - "$ref":"workflow.json#/definitions/transition" - }, - "throw":{ - "$ref":"#/definitions/throw" - } - }, - "minProperties":1, - "maxProperties":1 - }, - "errorHandlerReference":{ - "type":"object", - "oneOf":[ - { - "properties":{ - "refName":{ - "type":"string", - "minLength": 1 - } - }, - "required":[ - "refName" - ], - "additionalProperties": false - }, - { - "properties":{ - "when":{ - "type":"array", - "items":{ - "$ref":"#/definitions/errorReference" - }, - "additionalItems":false, - "minItems": 1 - }, - "exceptWhen":{ - "type":"array", - "items":{ - "$ref":"#/definitions/errorReference" - }, - "additionalItems":false, - "minItems": 1 - }, - "retry":{ - "oneOf":[ - { - "type":"string", - "minLength": 1 - }, - { - "$ref":"retries.json#/definitions/retrydef" - } - ] - }, - "then":{ - "$ref":"#/definitions/errorOutcomeDefinition" - } - }, - "additionalProperties": false - } - ] - }, - "errorPolicy":{ - "type":"object", - "properties":{ - "name":{ - "type":"string", - "description":"The unique name which is used to 
reference the policy.", - "pattern":"^[a-z0-9](-?[a-z0-9])*$", - "minLength": 1 - }, - "handlers":{ - "type":"array", - "description":"Defines the handlers the policy is made out of", - "items":{ - "$ref":"#/definitions/errorHandlerReference" - }, - "additionalItems":false, - "minItems": 1 - } - }, - "required":[ - "name" - ] - } - } -} \ No newline at end of file diff --git a/schema/events.json b/schema/events.json deleted file mode 100644 index 27fca014..00000000 --- a/schema/events.json +++ /dev/null @@ -1,108 +0,0 @@ -{ - "$id": "https://serverlessworkflow.io/schemas/0.9/events.json", - "$schema": "http://json-schema.org/draft-07/schema#", - "description": "Serverless Workflow specification - events schema", - "type": "object", - "events": { - "oneOf": [ - { - "type": "string", - "format": "uri", - "description": "URI to a resource containing event definitions (json or yaml)" - }, - { - "type": "array", - "description": "Workflow CloudEvent definitions. Defines CloudEvents that can be consumed or produced", - "items": { - "type": "object", - "$ref": "#/definitions/eventdef" - }, - "additionalItems": false, - "minItems": 1 - } - ] - }, - "required": [ - "events" - ], - "definitions": { - "eventdef": { - "type": "object", - "properties": { - "name": { - "type": "string", - "description": "Unique event name", - "minLength": 1, - "pattern": "^[a-z0-9](-?[a-z0-9])*$" - }, - "source": { - "type": "string", - "description": "CloudEvent source" - }, - "type": { - "type": "string", - "description": "CloudEvent type" - }, - "correlation": { - "type": "array", - "description": "CloudEvent correlation definitions", - "minItems": 1, - "items": { - "type": "object", - "$ref": "#/definitions/correlationDef" - }, - "additionalItems": false - }, - "dataOnly": { - "type": "boolean", - "default": true, - "description": "If `true`, only the Event payload is accessible to consuming Workflow states. 
If `false`, both event payload and context attributes should be accessible " - }, - "metadata": { - "$ref": "common.json#/definitions/metadata", - "description": "Metadata information" - } - }, - "additionalProperties": false, - "if": { - "properties": { - "source": { - "type": "null" - } - } - }, - "then": { - "required": [ - "name", - "type" - ] - }, - "else": { - "required": [ - "name", - "source" - ] - } - }, - "correlationDef": { - "type": "object", - "description": "CloudEvent correlation definition", - "properties": { - "contextAttributeName": { - "type": "string", - "description": "CloudEvent Extension Context Attribute name", - "minLength": 1 - }, - "contextAttributeValue": { - "type": "string", - "description": "CloudEvent Extension Context Attribute value", - "minLength": 1 - } - }, - "additionalProperties": false, - "required": [ - "contextAttributeName" - ] - } - } -} \ No newline at end of file diff --git a/schema/extensions.json b/schema/extensions.json deleted file mode 100644 index 364b543e..00000000 --- a/schema/extensions.json +++ /dev/null @@ -1,50 +0,0 @@ -{ - "$id": "https://serverlessworkflow.io/schemas/0.9/extensions.json", - "$schema": "http://json-schema.org/draft-07/schema#", - "description": "Serverless Workflow specification - extensions schema", - "type": "object", - "extensions": { - "oneOf": [ - { - "type": "string", - "format": "uri", - "description": "URI to a resource containing workflow extensions definitions (json or yaml)" - }, - { - "type": "array", - "description": "Workflow extensions definitions", - "items": { - "type": "object", - "$ref": "#/definitions/extension" - }, - "additionalItems": false, - "minItems": 1 - } - ] - }, - "required": [ - "extensions" - ], - "definitions": { - "extension": { - "type": "object", - "properties": { - "extensionId": { - "type": "string", - "description": "Unique extension id", - "minLength": 1 - }, - "resource": { - "type": "string", - "description": "URI to a resource containing this 
workflow extension definitions (json or yaml)", - "minLength": 1 - } - }, - "additionalProperties": false, - "required": [ - "extensionId", - "resource" - ] - } - } -} \ No newline at end of file diff --git a/schema/extensions/kpi.json b/schema/extensions/kpi.json deleted file mode 100644 index 08a81382..00000000 --- a/schema/extensions/kpi.json +++ /dev/null @@ -1,289 +0,0 @@ -{ - "$id": "https://serverlessworkflow.io/schemas/0.8/extensions/kpi.json", - "$schema": "http://json-schema.org/draft-07/schema#", - "description": "Serverless Workflow specification - KPIs Extension Schema", - "type": "object", - "definitions": { - "kpi": { - "type": "object", - "description": "Serverless Workflow KPI Extension", - "properties": { - "extensionid": { - "type": "string", - "default": "workflow-kpi-extension", - "description": "Extension unique identifier" - }, - "workflowid": { - "type": "string", - "minLength": 1, - "description": "Workflow definition unique identifier (workflow id property)" - }, - "workflowVersions": { - "type": "array", - "description": "Workflow versions. If not defined, applies to all workflow instances (regardless of their associated workflow version)", - "items": { - "type": "string" - } - }, - "currency": { - "type": "string", - "default": "USD", - "description": "Unit for all cost-based KPI parameters. 
Default 'USD'" - }, - "events": { - "type": "array", - "description": "Events KPIs", - "items": { - "type": "object", - "$ref": "#/definitions/eventskpi" - } - }, - "functions": { - "type": "array", - "description": "Functions KPIs", - "items": { - "type": "object", - "$ref": "#/definitions/functionskpi" - } - }, - "states": { - "type": "array", - "description": "States KPIs", - "items": { - "type": "object", - "$ref": "#/definitions/stateskpi" - } - }, - "workflow": { - "description": "Workflow KPIs", - "$ref": "#/definitions/workflowkpi" - } - }, - "required": [ - "extensionid", - "workflowid" - ] - }, - "eventskpi": { - "type": "object", - "properties": { - "for": { - "type": "string", - "description": "References an unique event name in the defined workflow events" - }, - "per": { - "description": "Define the kpi thresholds in terms of time and/or num of workflow instances", - "$ref": "#/definitions/thresholds" - }, - "maxConsumed": { - "type": "string", - "description": "If event kind is 'consumed', the max amount of times this event is consumed" - }, - "minConsumed": { - "type": "string", - "description": "If event kind is 'consumed', the min amount of times this event is consumed" - }, - "avgConsumed": { - "type": "string", - "description": "If event kind is 'consumed', the avg amount of times this event is consumed" - }, - "maxProduced": { - "type": "string", - "description": "If event kind is 'produced', the max amount of times this event is produced" - }, - "minProduced": { - "type": "string", - "description": "If event kind is 'produced', the min amount times this event is produced" - }, - "avgProduced": { - "type": "string", - "description": "If event kind is 'produced', the avg amount of times this event is produced" - } - }, - "required": [ - "for", - "per" - ] - }, - "functionskpi": { - "type": "object", - "allOf": [ - { - "properties": { - "for": { - "type": "string", - "description": "References an unique function name in the defined workflow 
functions"
-            },
-            "per": {
-              "description": "Define the kpi thresholds in terms of time and/or num of workflow instances",
-              "$ref": "#/definitions/thresholds"
-            },
-            "maxErrors": {
-              "type": "string",
-              "description": "Max number of errors during function invocation"
-            },
-            "maxRetry": {
-              "type": "string",
-              "description": "Max number of retries done for this function invocation"
-            },
-            "maxTimeout": {
-              "type": "string",
-              "description": "Max number of times the function timeout time was reached"
-            }
-          }
-        },
-        {
-          "$ref": "#/definitions/invocationkpis"
-        },
-        {
-          "$ref": "#/definitions/costkpis"
-        }
-      ],
-      "required": [
-        "for",
-        "per"
-      ]
-    },
-    "stateskpi": {
-      "type": "object",
-      "allOf": [
-        {
-          "properties": {
-            "for": {
-              "type": "string",
-              "description": "References a unique state name in the defined workflow states"
-            },
-            "per": {
-              "description": "Define the kpi thresholds in terms of time and/or num of workflow instances",
-              "$ref": "#/definitions/thresholds"
-            }
-          }
-        },
-        {
-          "$ref": "#/definitions/execkpis"
-        },
-        {
-          "$ref": "#/definitions/durationkpis"
-        },
-        {
-          "$ref": "#/definitions/costkpis"
-        }
-      ],
-      "required": [
-        "for",
-        "per"
-      ]
-    },
-    "workflowkpi": {
-      "type": "object",
-      "allOf": [
-        {
-          "properties": {
-            "per": {
-              "description": "Define the kpi thresholds in terms of time and/or num of workflow instances",
-              "$ref": "#/definitions/thresholds"
-            }
-          }
-        },
-        {
-          "$ref": "#/definitions/invocationkpis"
-        },
-        {
-          "$ref": "#/definitions/durationkpis"
-        },
-        {
-          "$ref": "#/definitions/costkpis"
-        }
-      ],
-      "required": [
-        "per"
-      ]
-    },
-    "thresholds": {
-      "type": "object",
-      "properties": {
-        "time": {
-          "type": "string",
-          "default": "P1D",
-          "description": "ISO 8601 time. 
1 day default" - }, - "instances": { - "type": "integer", - "minimum": 1, - "default": 1, - "description": "Number of workflow instances" - } - }, - "required": [ - ] - }, - "costkpis": { - "type": "object", - "properties": { - "maxCost": { - "type": "string", - "description": "Max cost" - }, - "minCost": { - "type": "string", - "description": "Min cost" - }, - "avgCost": { - "type": "string", - "description": "Avg cost" - } - } - }, - "invocationkpis": { - "type": "object", - "properties": { - "maxInvoked": { - "type": "string", - "description": "Max number of invocation times" - }, - "minInvoked": { - "type": "string", - "description": "Min number of invocation times" - }, - "avgInvoked": { - "type": "string", - "description": "Avg number of invocation times" - } - } - }, - "durationkpis": { - "type": "object", - "properties": { - "maxDuration": { - "type": "string", - "description": "ISO 8601. Max duration" - }, - "minDuration": { - "type": "string", - "description": "ISO 8601. Min duration" - }, - "avgDuration": { - "type": "string", - "description": "ISO 8601. 
Avg duration" - } - } - }, - "execkpis": { - "type": "object", - "properties": { - "maxExec": { - "type": "string", - "description": "Max exec number" - }, - "minExec": { - "type": "string", - "description": "Min exec number" - }, - "avgExec": { - "type": "string", - "description": "Avg exec number" - } - } - } - } -} \ No newline at end of file diff --git a/schema/extensions/ratelimiting.json b/schema/extensions/ratelimiting.json deleted file mode 100644 index a7bc6f2c..00000000 --- a/schema/extensions/ratelimiting.json +++ /dev/null @@ -1,73 +0,0 @@ -{ - "$id": "https://serverlessworkflow.io/schemas/0.8/extensions/ratelimiting.json", - "$schema": "http://json-schema.org/draft-07/schema#", - "description": "Serverless Workflow specification - Various workflow rate limiting settings", - "type": "object", - "definitions": { - "ratelimiting": { - "type": "object", - "description": "Serverless Workflow Rate Limiting Extension", - "properties": { - "extensionid": { - "type": "string", - "default": "workflow-ratelimiting-extension", - "description": "Extension unique identifier" - }, - "workflowid": { - "type": "string", - "minLength": 1, - "description": "Workflow definition unique identifier (workflow id property)" - }, - "workflowVersions": { - "type": "array", - "description": "Workflow versions. 
If not defined, applies to all workflow instances (regardless of their associated workflow version)", - "items": { - "type": "string" - } - }, - "singleInstance": { - "description": "Rate Limit settings per single instance of a workflow with provided workflowid", - "$ref": "#/definitions/ratelimits" - }, - "allInstances": { - "description": "Rate Limit settings across all instances of a workflow with provided workflowid", - "$ref": "#/definitions/ratelimits" - } - }, - "required": [ - "extensionid", - "workflowid" - ] - }, - "ratelimits": { - "type": "object", - "properties": { - "maxActionsPerSecond": { - "type": "number", - "default": 0, - "description": "Sets the rate limiting on the number of actions that can be executed per second. Notice that the number is represented as number type, so that you can set it to less than 1 if needed. For example, setting the number to 0.1 means workflow actions can be executed once every 10 seconds. Default zero value means 'unlimited'" - }, - "maxConcurrentActions": { - "type": "number", - "default": 100, - "description": "Maximum number of actions that can be executed in parallel" - }, - "maxProducedEventsPerSecond": { - "type": "number", - "default": 0, - "description": "Sets the rate limiting on the number of events that can be produced per second. Notice that the number is represented as number type, so that you can set it to less than 1 if needed. For example, setting the number to 0.1 means the workflow can produce events once every 10 seconds. Default zero value means 'unlimited'" - }, - "maxStates": { - "type": "integer", - "default": 0, - "description": "Maximum number of workflow states that should be executed. Default is zero, meaning unlimited." - }, - "maxTransitions": { - "type": "integer", - "default": 0, - "description": "Maximum number of workflow transitions that should be executed. Default is zero, meaning unlimited." 
- } - } - } - } -} \ No newline at end of file diff --git a/schema/functions.json b/schema/functions.json deleted file mode 100644 index 44906f25..00000000 --- a/schema/functions.json +++ /dev/null @@ -1,143 +0,0 @@ -{ - "$id": "https://serverlessworkflow.io/schemas/0.9/functions.json", - "$schema": "http://json-schema.org/draft-07/schema#", - "description": "Serverless Workflow specification - functions schema", - "type": "object", - "functions": { - "oneOf": [ - { - "type": "string", - "format": "uri", - "description": "URI to a resource containing function definitions (json or yaml)" - }, - { - "type": "array", - "description": "Workflow function definitions", - "items": { - "type": "object", - "$ref": "#/definitions/function" - }, - "additionalItems": false, - "minItems": 1 - } - ] - }, - "required": ["functions"], - "definitions": { - "function": { - "type": "object", - "properties": { - "name": { - "type": "string", - "description": "Unique function name", - "minLength": 1, - "pattern": "^[a-z0-9](-?[a-z0-9])*$" - }, - "operation": { - "$ref": "#/definitions/operation" - }, - "type": { - "type": "string", - "description": "Defines the function type. Is either `http`, `openapi`, `asyncapi`, `rpc`, `graphql`, `odata`, `expression`, or `custom`. 
Default is `openapi`.", - "enum": [ - "http", - "openapi", - "asyncapi", - "rpc", - "graphql", - "odata", - "expression", - "custom" - ], - "default": "openapi" - }, - "authRef": { - "oneOf": [ - { - "type": "string", - "description": "References the auth definition to be used to invoke the operation", - "minLength": 1, - "pattern": "^[a-z0-9](-?[a-z0-9])*$" - }, - { - "type": "object", - "description": "Configures both the auth definition used to retrieve the operation's resource and the auth definition used to invoke said operation", - "properties": { - "resource": { - "type": "string", - "description": "References an auth definition to be used to access the resource defined in the operation parameter", - "minLength": 1, - "pattern": "^[a-z0-9](-?[a-z0-9])*$" - }, - "invocation": { - "type": "string", - "description": "References an auth definition to be used to invoke the operation" - } - }, - "additionalProperties": false, - "required": ["resource"] - } - ] - }, - "metadata": { - "$ref": "common.json#/definitions/metadata" - } - }, - "additionalProperties": false, - "required": ["name", "operation"] - }, - "operation": { - "oneOf": [ - { - "type": "string", - "description": "If type is `openapi`, #. If type is `asyncapi`, #. If type is `rpc`, ##. If type is `graphql`, ##. If type is `odata`, #. 
If type is `expression`, defines the workflow expression.", - "minLength": 1 - }, - { - "type": "object", - "description": "OpenAPI Path Object definition", - "$comment": "https://spec.openapis.org/oas/v3.1.0#paths-object", - "patternProperties": { - "^/": { - "type": "object" - } - }, - "additionalProperties": false - }, - { - "type": "object", - "description": "HTTP Function operation definition", - "properties": { - "method": { - "type": "string", - "enum": [ - "GET", - "HEAD", - "POST", - "PUT", - "DELETE", - "CONNECT", - "OPTIONS", - "TRACE" - ], - "default": "GET" - }, - "uri": { - "type": "string" - }, - "headers": { - "type": "object", - "additionalProperties": { "type": "string" } - }, - "cookies": { - "type": "object", - "additionalProperties": { "type": "string" } - } - }, - "required": ["method", "uri"], - "additionalProperties": false - } - ] - } - } -} diff --git a/schema/odata.json b/schema/odata.json deleted file mode 100644 index 79df36ea..00000000 --- a/schema/odata.json +++ /dev/null @@ -1,81 +0,0 @@ -{ - "$id": "https://serverlessworkflow.io/schemas/0.9/odata.json", - "$schema": "http://json-schema.org/draft-07/schema#", - "description": "Serverless Workflow specification - OData command options schema", - "type": "object", - "key": { - "type": "string", - "description": "The unique identifier of the single entry to query", - "minLength": 1 - }, - "queryOptions":{ - "$ref": "#/definitions/queryoptions" - }, - "definitions": { - "queryoptions": { - "type": "object", - "properties": { - "filter": { - "type": "string", - "description": "The $filter system query option allows clients to filter the set of resources that are addressed by a request URL. 
$filter specifies conditions that MUST be met by a resource for it to be returned in the set of matching resources", - "minLength": 1 - }, - "expand": { - "type": "string", - "description": "The $expand system query option allows clients to request related resources when a resource that satisfies a particular request is retrieved", - "minLength": 1 - }, - "select": { - "type": "string", - "description": "The $select system query option allows clients to request a limited set of information for each entity or complex type identified by the ResourcePath and other System Query Options like $filter, $top, $skip etc. The $select query option is often used in conjunction with the $expand query option, to first increase the scope of the resource graph returned ($expand) and then selectively prune that resource graph ($select)", - "minLength": 1 - }, - "orderBy": { - "type": "string", - "description": "The $orderby system query option allows clients to request resources in a particular order", - "minLength": 1 - }, - "top": { - "type": "integer", - "description": "The $top system query option allows clients to request a specific number of resources. Usually used in conjunction with the $skip query option", - "minLength": 1 - }, - "skip": { - "type": "integer", - "description": "The $skip system query option allows clients to skip a given number of resources. 
Usually used in conjunction with the $top query option", - "minLength": 1 - }, - "count": { - "type": "boolean", - "description": "The $count system query option allows clients to request a count of the matching resources included with the resources in the response" - }, - "search": { - "type": "string", - "description": "The $search system query option allows clients to request items within a collection matching a free-text search expression", - "minLength": 1 - }, - "format": { - "type": "string", - "description": "The $format system query option, if supported, allows clients to request a response in a particular format", - "minLength": 1 - }, - "compute": { - "type": "string", - "description": "The $compute system query option allows clients to define computed properties that can be used in a $select or within a $filter or $orderby expression.", - "minLength": 1 - }, - "index": { - "type": "string", - "description": "The $index system query option allows clients to do a positional insert into a collection annotated with the Core.PositionalInsert term (see http://docs.oasis-open.org/odata/odata/v4.01/odata-v4.01-part2-url-conventions.html#VocCore)", - "minLength": 1 - }, - "schemaVersion": { - "type": "string", - "description": "The $schemaversion system query option allows clients to specify the version of the schema against which the request is made. 
The semantics of $schemaversion is covered in the OData-Protocol (http://docs.oasis-open.org/odata/odata/v4.01/odata-v4.01-part2-url-conventions.html#odata) document.", - "minLength": 1 - } - }, - "additionalProperties": false - } - } -} \ No newline at end of file diff --git a/schema/retries.json b/schema/retries.json deleted file mode 100644 index ee3c2fe3..00000000 --- a/schema/retries.json +++ /dev/null @@ -1,86 +0,0 @@ -{ - "$id": "https://serverlessworkflow.io/schemas/0.9/retries.json", - "$schema": "http://json-schema.org/draft-07/schema#", - "description": "Serverless Workflow specification - retries schema", - "type": "object", - "retries": { - "oneOf": [ - { - "type": "string", - "format": "uri", - "description": "URI to a resource containing retry definitions (json or yaml)" - }, - { - "type": "array", - "description": "Workflow Retry definitions. Define retry strategies that can be referenced in states onError definitions", - "items": { - "type": "object", - "$ref": "#/definitions/retrydef" - }, - "additionalItems": false, - "minItems": 1 - } - ] - }, - "required": [ - "retries" - ], - "definitions": { - "retrydef": { - "type": "object", - "properties": { - "name": { - "type": "string", - "description": "Unique retry strategy name", - "minLength": 1, - "pattern": "^[a-z0-9](-?[a-z0-9])*$" - }, - "delay": { - "type": "string", - "description": "Time delay between retry attempts (literal ISO 8601 duration format or expression which evaluation results in an ISO 8601 duration)" - }, - "maxDelay": { - "type": "string", - "description": "Maximum time delay between retry attempts (literal ISO 8601 duration format or expression which evaluation results in an ISO 8601 duration)" - }, - "increment": { - "type": "string", - "description": "Time delay increment during each attempt (literal ISO 8601 duration format or expression which evaluation results in an ISO 8601 duration)" - }, - "multiplier": { - "type": [ - "number", - "string" - ], - "minimum": 0, - 
"minLength": 1, - "multipleOf": 0.01, - "description": "Numeric value; if specified, the delay between retries is multiplied by this value." - }, - "maxAttempts": { - "type": [ - "number", - "string" - ], - "minimum": 1, - "minLength": 0, - "description": "Maximum number of retry attempts." - }, - "jitter": { - "type": [ - "number", - "string" - ], - "minimum": 0, - "maximum": 1, - "description": "If float type, maximum amount of random time added or subtracted from the delay between each retry relative to total delay (between 0 and 1). If string type, absolute maximum amount of random time added or subtracted from the delay between each retry (literal ISO 8601 duration format or expression which evaluation results in an ISO 8601 duration)" - } - }, - "additionalProperties": false, - "required": [ - "name", - "maxAttempts" - ] - } - } -} diff --git a/schema/secrets.json b/schema/secrets.json deleted file mode 100644 index dec1c0c4..00000000 --- a/schema/secrets.json +++ /dev/null @@ -1,26 +0,0 @@ -{ - "$id": "https://serverlessworkflow.io/schemas/0.9/secrets.json", - "$schema": "http://json-schema.org/draft-07/schema#", - "description": "Serverless Workflow specification - secrets schema", - "type": "object", - "secrets": { - "oneOf": [ - { - "type": "string", - "format": "uri", - "description": "URI to a resource containing secrets definitions (json or yaml)" - }, - { - "type": "array", - "description": "Workflow Secrets definitions", - "items": { - "type": "string" - }, - "minItems": 1 - } - ] - }, - "required": [ - "secrets" - ] -} \ No newline at end of file diff --git a/schema/timeouts.json b/schema/timeouts.json deleted file mode 100644 index 49f23ea9..00000000 --- a/schema/timeouts.json +++ /dev/null @@ -1,96 +0,0 @@ -{ - "$id": "https://serverlessworkflow.io/schemas/0.9/timeouts.json", - "$schema": "http://json-schema.org/draft-07/schema#", - "description": "Serverless Workflow specification - timeouts schema", - "type": "object", - "timeouts": { - "oneOf": 
[ - { - "type": "string", - "format": "uri", - "description": "URI to a resource containing timeouts definitions (json or yaml)" - }, - { - "type": "object", - "description": "Workflow default timeouts", - "properties": { - "workflowExecTimeout": { - "$ref": "#/definitions/workflowExecTimeout" - }, - "stateExecTimeout": { - "$ref": "#/definitions/stateExecTimeout" - }, - "actionExecTimeout": { - "$ref": "#/definitions/actionExecTimeout" - }, - "branchExecTimeout": { - "$ref": "#/definitions/branchExecTimeout" - }, - "eventTimeout": { - "$ref": "#/definitions/eventTimeout" - } - }, - "additionalProperties": false, - "required": [] - } - ] - }, - "required": [ - "timeouts" - ], - "definitions": { - "workflowExecTimeout": { - "oneOf": [ - { - "type": "string", - "description": "Workflow execution timeout duration (literal ISO 8601 duration format or expression which evaluation results in an ISO 8601 duration). If not specified should be 'unlimited'", - "minLength": 1 - }, - { - "type": "object", - "properties": { - "duration": { - "type": "string", - "description": "Workflow execution timeout duration (literal ISO 8601 duration format or expression which evaluation results in an ISO 8601 duration). If not specified should be 'unlimited'", - "minLength": 1 - }, - "interrupt": { - "type": "boolean", - "description": "If `false`, workflow instance is allowed to finish current execution. 
If `true`, current workflow execution is aborted.", - "default": true - }, - "runBefore": { - "type": "string", - "description": "Name of a workflow state to be executed before workflow instance is terminated", - "minLength": 1 - } - }, - "additionalProperties": false, - "required": [ - "duration" - ] - } - ] - }, - "stateExecTimeout": { - "type": "string", - "description": "Workflow state execution timeout duration (literal ISO 8601 duration format or expression which evaluation results in an ISO 8601 duration)", - "minLength": 1 - }, - "actionExecTimeout": { - "type": "string", - "description": "Action execution timeout duration (literal ISO 8601 duration format or expression which evaluation results in an ISO 8601 duration)", - "minLength": 1 - }, - "branchExecTimeout": { - "type": "string", - "description": "Branch execution timeout duration (literal ISO 8601 duration format or expression which evaluation results in an ISO 8601 duration)", - "minLength": 1 - }, - "eventTimeout": { - "type": "string", - "description": "Timeout duration to wait for consuming defined events (literal ISO 8601 duration format or expression which evaluation results in an ISO 8601 duration)", - "minLength": 1 - } - } -} \ No newline at end of file diff --git a/schema/workflow.json b/schema/workflow.json deleted file mode 100644 index 7c2b242c..00000000 --- a/schema/workflow.json +++ /dev/null @@ -1,1809 +0,0 @@ -{ - "$id": "https://serverlessworkflow.io/schemas/0.9/workflow.json", - "$schema": "http://json-schema.org/draft-07/schema#", - "description": "Serverless Workflow specification - workflow schema", - "type": "object", - "properties": { - "name": { - "type": "string", - "description": "The name that identifies the workflow definition, and which, when combined with its version, forms a unique identifier.", - "pattern": "^[a-z0-9](-?[a-z0-9])*$", - "minLength": 1 - }, - "version": { - "type": "string", - "description": "Workflow version", - "pattern": 
"^(0|[1-9]\\d*)\\.(0|[1-9]\\d*)\\.(0|[1-9]\\d*)(?:-((?:0|[1-9]\\d*|\\d*[a-zA-Z-][0-9a-zA-Z-]*)(?:\\.(?:0|[1-9]\\d*|\\d*[a-zA-Z-][0-9a-zA-Z-]*))*))?(?:\\+([0-9a-zA-Z-]+(?:\\.[0-9a-zA-Z-]+)*))?$" - }, - "description": { - "type": "string", - "description": "Workflow description" - }, - "key": { - "type": "string", - "description": "Optional expression that will be used to generate a domain-specific workflow instance identifier" - }, - "annotations": { - "type": "array", - "description": "List of helpful terms describing the workflow's intended purpose, subject areas, or other important qualities", - "minItems": 1, - "items": { - "type": "string" - }, - "additionalItems": false - }, - "dataInputSchema": { - "$ref": "#/definitions/validationSchema" - }, - "dataOutputSchema": { - "$ref": "#/definitions/validationSchema" - }, - "secrets": { - "$ref": "secrets.json#/secrets" - }, - "constants": { - "oneOf": [ - { - "type": "string", - "format": "uri", - "description": "URI to a resource containing constants data (json or yaml)" - }, - { - "type": "object", - "description": "Workflow constants data (object type)" - } - ] - }, - "start": { - "$ref": "#/definitions/startdef" - }, - "specVersion": { - "type": "string", - "description": "Serverless Workflow schema version", - "minLength": 1 - }, - "expressionLang": { - "type": "string", - "description": "Identifies the expression language used for workflow expressions. Default is 'jq'", - "default": "jq", - "minLength": 1 - }, - "timeouts": { - "$ref": "timeouts.json#/timeouts" - }, - "errors": { - "$ref": "errors.json#/errors" - }, - "keepActive": { - "type": "boolean", - "default": false, - "description": "If 'true', the workflow instance is not terminated when there are no active execution paths. 
Instance can be terminated via 'terminate end definition' or reaching defined 'workflowExecTimeout'" - }, - "metadata": { - "$ref": "common.json#/definitions/metadata" - }, - "events": { - "$ref": "events.json#/events" - }, - "functions": { - "$ref": "functions.json#/functions" - }, - "retries": { - "$ref": "retries.json#/retries" - }, - "auth": { - "$ref": "auth.json#/auth" - }, - "extensions": { - "$ref": "extensions.json#/extensions" - }, - "states": { - "type": "array", - "description": "State definitions", - "items": { - "anyOf": [ - { - "title": "Event State", - "$ref": "#/definitions/eventstate" - }, - { - "title": "Operation State", - "$ref": "#/definitions/operationstate" - }, - { - "title": "Parallel State", - "$ref": "#/definitions/parallelstate" - }, - { - "title": "Switch State", - "$ref": "#/definitions/switchstate" - }, - { - "title": "Inject State", - "$ref": "#/definitions/injectstate" - }, - { - "title": "ForEach State", - "$ref": "#/definitions/foreachstate" - }, - { - "title": "Callback State", - "$ref": "#/definitions/callbackstate" - } - ] - }, - "additionalItems": false, - "minItems": 1 - } - }, - "additionalProperties": false, - "required": [ - "name", - "specVersion", - "states" - ], - "definitions": { - "sleep": { - "type": "object", - "properties": { - "before": { - "type": "string", - "description": "Amount of time (literal ISO 8601 duration format or expression which evaluation results in an ISO 8601 duration) to sleep before function/subflow invocation. Does not apply if 'eventRef' is defined." - }, - "after": { - "type": "string", - "description": "Amount of time (literal ISO 8601 duration format or expression which evaluation results in an ISO 8601 duration) to sleep after function/subflow invocation. Does not apply if 'eventRef' is defined." 
- } - }, - "anyOf": [ - { - "required": [ - "before" - ] - }, - { - "required": [ - "after" - ] - }, - { - "required": [ - "before", - "after" - ] - } - ] - }, - "crondef": { - "oneOf": [ - { - "type": "string", - "description": "Cron expression defining when workflow instances should be created (automatically)", - "minLength": 1 - }, - { - "type": "object", - "properties": { - "expression": { - "type": "string", - "description": "Repeating interval (cron expression) describing when the workflow instance should be created", - "minLength": 1 - }, - "validUntil": { - "type": "string", - "description": "Specific date and time (ISO 8601 format) when the cron expression invocation is no longer valid" - } - }, - "additionalProperties": false, - "required": [ - "expression" - ] - } - ] - }, - "continueasdef": { - "oneOf": [ - { - "type": "string", - "description": "Unique id of the workflow to continue execution as. Entire state data is passed as data input to the next execution", - "minLength": 1 - }, - { - "type": "object", - "properties": { - "workflowId": { - "type": "string", - "description": "Unique id of the workflow to continue execution as" - }, - "version": { - "type": "string", - "description": "Version of the workflow to continue execution as", - "minLength": 1, - "pattern": "^(0|[1-9]\\d*)\\.(0|[1-9]\\d*)\\.(0|[1-9]\\d*)(?:-((?:0|[1-9]\\d*|\\d*[a-zA-Z-][0-9a-zA-Z-]*)(?:\\.(?:0|[1-9]\\d*|\\d*[a-zA-Z-][0-9a-zA-Z-]*))*))?(?:\\+([0-9a-zA-Z-]+(?:\\.[0-9a-zA-Z-]+)*))?$" - }, - "data": { - "type": [ - "string", - "object" - ], - "description": "If string type, an expression which selects parts of the states data output to become the workflow data input of continued execution. If object type, a custom object to become the workflow data input of the continued execution" - }, - "workflowExecTimeout": { - "$ref": "timeouts.json#/definitions/workflowExecTimeout", - "description": "Workflow execution timeout to be used by the workflow continuing execution. 
Overwrites any specific settings set by that workflow" - } - }, - "required": [ - "workflowId" - ] - } - ] - }, - "transition": { - "oneOf": [ - { - "type": "string", - "description": "Name of state to transition to", - "minLength": 1, - "pattern": "^[a-z0-9](-?[a-z0-9])*$" - }, - { - "type": "object", - "description": "Transition definition", - "properties": { - "nextState": { - "type": "string", - "description": "Name of state to transition to", - "minLength": 1, - "pattern": "^[a-z0-9](-?[a-z0-9])*$" - }, - "produceEvents": { - "type": "array", - "description": "Array of events to be produced before the transition happens", - "items": { - "type": "object", - "$ref": "#/definitions/produceeventdef" - }, - "additionalItems": false - }, - "compensate": { - "type": "boolean", - "default": false, - "description": "If set to true, triggers workflow compensation before this transition is taken. Default is false" - } - }, - "additionalProperties": false, - "required": [ - "nextState" - ] - } - ] - }, - "onerrors": { - "oneOf": [ - { - "type": "string", - "description": "References the error policy to use." - }, - { - "type": "array", - "items": { - "$ref": "errors.json#/definitions/errorHandlerReference" - }, - "description": "References multiple error handlers." 
- } - ] - }, - "onevents": { - "type": "object", - "properties": { - "eventRefs": { - "type": "array", - "description": "References one or more unique event names in the defined workflow events", - "minItems": 1, - "items": { - "type": "string", - "pattern": "^[a-z0-9](-?[a-z0-9])*$" - }, - "uniqueItems": true, - "additionalItems": false - }, - "actionMode": { - "type": "string", - "enum": [ - "sequential", - "parallel" - ], - "description": "Specifies how actions are to be performed (in sequence or in parallel)", - "default": "sequential" - }, - "actions": { - "type": "array", - "description": "Actions to be performed if expression matches", - "items": { - "type": "object", - "$ref": "#/definitions/action" - }, - "additionalItems": false - }, - "eventDataFilter": { - "description": "Event data filter", - "$ref": "#/definitions/eventdatafilter" - } - }, - "additionalProperties": false, - "required": [ - "eventRefs" - ] - }, - "action": { - "type": "object", - "properties": { - "name": { - "type": "string", - "description": "Unique action definition name", - "pattern": "^[a-z0-9](-?[a-z0-9])*$" - }, - "functionRef": { - "description": "References a function to be invoked", - "$ref": "#/definitions/functionref" - }, - "publish": { - "description": "Publish an event", - "$ref": "#/definitions/publish" - }, - "subscribe": { - "description": "Subscribe to an event channel", - "$ref": "#/definitions/subscribe" - }, - "subFlowRef": { - "description": "References a sub-workflow to invoke", - "$ref": "#/definitions/subflowref" - }, - "errorRef": { - "description": "References or defines the error to throw", - "$ref": "errors.json#/definitions/errorReference" - }, - "sleep": { - "description": "Defines time periods workflow execution should sleep before / after function execution", - "$ref": "#/definitions/sleep" - }, - "onErrors":{ - "$ref": "#/definitions/onerrors" - }, - "actionDataFilter": { - "description": "Action data filter", - "$ref": 
"#/definitions/actiondatafilter" - }, - "condition": { - "description": "Expression, if defined, must evaluate to true for this action to be performed. If false, action is disregarded", - "type": "string", - "minLength": 1 - } - }, - "additionalProperties": false, - "oneOf": [ - { - "required": [ - "name", - "functionRef" - ] - }, - { - "required": [ - "name", - "publish" - ] - }, - { - "required": [ - "name", - "subscribe" - ] - }, - { - "required": [ - "name", - "subFlowRef" - ] - }, - { - "required": [ - "name", - "errorRef" - ] - } - ] - }, - "functionref": { - "oneOf": [ - { - "type": "string", - "description": "Name of the referenced function", - "minLength": 1, - "pattern": "^[a-z0-9](-?[a-z0-9])*$" - }, - { - "type": "object", - "description": "Function Reference", - "properties": { - "refName": { - "type": "string", - "description": "Name of the referenced function", - "pattern": "^[a-z0-9](-?[a-z0-9])*$" - }, - "arguments": { - "type": "object", - "description": "Function arguments/inputs" - }, - "selectionSet": { - "type": "string", - "description": "Only used if function type is 'graphql'. A string containing a valid GraphQL selection set" - }, - "invoke": { - "type": "string", - "enum": [ - "sync", - "async" - ], - "description": "Specifies if the function should be invoked sync or async", - "default": "sync" - } - }, - "additionalProperties": false, - "required": [ - "refName" - ] - } - ] - }, - "publish": { - "type": "object", - "description": "Publish an event", - "properties": { - "event": { - "type": "string", - "description": "Reference to the unique name of a 'published' event definition", - "pattern": "^[a-z0-9](-?[a-z0-9])*$" - }, - "data": { - "type": [ - "string", - "object" - ], - "description": "If string type, an expression which selects parts of the states data output to become the data (payload) of the event referenced by 'publish'. If object type, a custom object to become the data (payload) of the event referenced by 'publish'." 
- }, - "contextAttributes": { - "type": "object", - "description": "Add additional extension context attributes to the produced event", - "additionalProperties": { - "type": "string" - } - } - }, - "additionalProperties": false, - "required": [ - "event", "data" - ] - }, - "subscribe": { - "type": "object", - "description": "Subscribe to an event channel", - "properties": { - "event": { - "type": "string", - "description": "Reference to the unique name of a 'subscribed' event definition", - "pattern": "^[a-z0-9](-?[a-z0-9])*$" - }, - "timeout": { - "type": "string", - "description": "Maximum amount of time (ISO 8601 format) to wait for the result event. If not defined it should default to the actionExecutionTimeout" - } - }, - "additionalProperties": false, - "required": [ - "event" - ] - }, - "subflowref": { - "oneOf": [ - { - "type": "string", - "description": "Unique id of the sub-workflow to be invoked", - "minLength": 1 - }, - { - "type": "object", - "description": "Specifies a sub-workflow to be invoked", - "properties": { - "workflowId": { - "type": "string", - "description": "Unique id of the sub-workflow to be invoked" - }, - "version": { - "type": "string", - "description": "Version of the sub-workflow to be invoked", - "minLength": 1, - "pattern": "^(0|[1-9]\\d*)\\.(0|[1-9]\\d*)\\.(0|[1-9]\\d*)(?:-((?:0|[1-9]\\d*|\\d*[a-zA-Z-][0-9a-zA-Z-]*)(?:\\.(?:0|[1-9]\\d*|\\d*[a-zA-Z-][0-9a-zA-Z-]*))*))?(?:\\+([0-9a-zA-Z-]+(?:\\.[0-9a-zA-Z-]+)*))?$" - }, - "onParentComplete": { - "type": "string", - "enum": [ - "continue", - "terminate" - ], - "description": "If invoke is 'async', specifies how subflow execution should behave when parent workflow completes. 
Default is 'terminate'", - "default": "terminate" - }, - "invoke": { - "type": "string", - "enum": [ - "sync", - "async" - ], - "description": "Specifies if the subflow should be invoked sync or async", - "default": "sync" - } - }, - "required": [ - "workflowId" - ] - } - ] - }, - "branch": { - "type": "object", - "description": "Branch Definition", - "properties": { - "name": { - "type": "string", - "description": "Branch name", - "pattern": "^[a-z0-9](-?[a-z0-9])*$" - }, - "timeouts": { - "type": "object", - "description": "State specific timeouts", - "properties": { - "actionExecTimeout": { - "$ref": "timeouts.json#/definitions/actionExecTimeout" - }, - "branchExecTimeout": { - "$ref": "timeouts.json#/definitions/branchExecTimeout" - } - }, - "required": [] - }, - "actions": { - "type": "array", - "description": "Actions to be executed in this branch", - "items": { - "type": "object", - "$ref": "#/definitions/action" - }, - "additionalItems": false - } - }, - "additionalProperties": false, - "required": [ - "name", - "actions" - ] - }, - "eventstate": { - "type": "object", - "description": "This state is used to wait for events from event sources, then consumes them and invokes one or more actions to run in sequence or parallel", - "properties": { - "name": { - "type": "string", - "description": "State name", - "pattern": "^[a-z0-9](-?[a-z0-9])*$" - }, - "type": { - "type": "string", - "const": "event", - "description": "State type" - }, - "exclusive": { - "type": "boolean", - "default": true, - "description": "If true, consuming one of the defined events causes its associated actions to be performed. 
If false, all of the defined events must be consumed in order for actions to be performed" - }, - "onEvents": { - "type": "array", - "description": "Define the events to be consumed and optional actions to be performed", - "items": { - "type": "object", - "$ref": "#/definitions/onevents" - }, - "additionalItems": false - }, - "timeouts": { - "type": "object", - "description": "State specific timeouts", - "properties": { - "stateExecTimeout": { - "$ref": "timeouts.json#/definitions/stateExecTimeout" - }, - "actionExecTimeout": { - "$ref": "timeouts.json#/definitions/actionExecTimeout" - }, - "eventTimeout": { - "$ref": "timeouts.json#/definitions/eventTimeout" - } - }, - "required": [] - }, - "stateDataFilter": { - "description": "State data filter", - "$ref": "#/definitions/statedatafilter" - }, - "onErrors": { - "$ref": "#/definitions/onerrors" - }, - "transition": { - "description": "Next transition of the workflow after all the actions have been performed", - "$ref": "#/definitions/transition" - }, - "end": { - "$ref": "#/definitions/end", - "description": "State end definition" - }, - "compensatedBy": { - "type": "string", - "minLength": 1, - "description": "Unique name of a workflow state which is responsible for compensation of this state", - "pattern": "^[a-z0-9](-?[a-z0-9])*$" - }, - "metadata": { - "$ref": "common.json#/definitions/metadata" - } - }, - "additionalProperties": false, - "oneOf": [ - { - "required": [ - "name", - "type", - "onEvents", - "end" - ] - }, - { - "required": [ - "name", - "type", - "onEvents", - "transition" - ] - } - ] - }, - "operationstate": { - "type": "object", - "description": "Defines actions to be performed. 
Does not wait for incoming events", - "properties": { - "name": { - "type": "string", - "description": "State name", - "pattern": "^[a-z0-9](-?[a-z0-9])*$" - }, - "type": { - "type": "string", - "const": "operation", - "description": "State type" - }, - "end": { - "$ref": "#/definitions/end", - "description": "State end definition" - }, - "stateDataFilter": { - "description": "State data filter", - "$ref": "#/definitions/statedatafilter" - }, - "actionMode": { - "type": "string", - "enum": [ - "sequential", - "parallel" - ], - "description": "Specifies whether actions are performed in sequence or in parallel", - "default": "sequential" - }, - "actions": { - "type": "array", - "description": "Actions to be performed", - "items": { - "type": "object", - "$ref": "#/definitions/action" - } - }, - "timeouts": { - "type": "object", - "description": "State specific timeouts", - "properties": { - "stateExecTimeout": { - "$ref": "timeouts.json#/definitions/stateExecTimeout" - }, - "actionExecTimeout": { - "$ref": "timeouts.json#/definitions/actionExecTimeout" - } - }, - "required": [] - }, - "onErrors": { - "$ref": "#/definitions/onerrors" - }, - "transition": { - "description": "Next transition of the workflow after all the actions have been performed", - "$ref": "#/definitions/transition" - }, - "compensatedBy": { - "type": "string", - "minLength": 1, - "description": "Unique Name of a workflow state which is responsible for compensation of this state", - "pattern": "^[a-z0-9](-?[a-z0-9])*$" - }, - "usedForCompensation": { - "type": "boolean", - "default": false, - "description": "If true, this state is used to compensate another state. 
Default is false" - }, - "metadata": { - "$ref": "common.json#/definitions/metadata" - } - }, - "additionalProperties": false, - "if": { - "properties": { - "usedForCompensation": { - "const": true - } - }, - "required": [ - "usedForCompensation" - ] - }, - "then": { - "required": [ - "name", - "type", - "actions" - ] - }, - "else": { - "oneOf": [ - { - "required": [ - "name", - "type", - "actions", - "end" - ] - }, - { - "required": [ - "name", - "type", - "actions", - "transition" - ] - } - ] - } - }, - "parallelstate": { - "type": "object", - "description": "Consists of a number of states that are executed in parallel", - "properties": { - "name": { - "type": "string", - "description": "State name", - "pattern": "^[a-z0-9](-?[a-z0-9])*$" - }, - "type": { - "type": "string", - "const": "parallel", - "description": "State type" - }, - "end": { - "$ref": "#/definitions/end", - "description": "State end definition" - }, - "stateDataFilter": { - "description": "State data filter", - "$ref": "#/definitions/statedatafilter" - }, - "timeouts": { - "type": "object", - "description": "State specific timeouts", - "properties": { - "stateExecTimeout": { - "$ref": "timeouts.json#/definitions/stateExecTimeout" - }, - "branchExecTimeout": { - "$ref": "timeouts.json#/definitions/branchExecTimeout" - } - }, - "required": [] - }, - "branches": { - "type": "array", - "description": "Branch Definitions", - "items": { - "type": "object", - "$ref": "#/definitions/branch" - }, - "additionalItems": false - }, - "completionType": { - "type": "string", - "enum": [ - "allOf", - "atLeast" - ], - "description": "Option types on how to complete branch execution.", - "default": "allOf" - }, - "numCompleted": { - "type": [ - "number", - "string" - ], - "minimum": 0, - "minLength": 0, - "description": "Used when completionType is set to 'atLeast' to specify the minimum number of branches that must complete before the state will transition." 
- }, - "onErrors": { - "$ref": "#/definitions/onerrors" - }, - "transition": { - "description": "Next transition of the workflow after all branches have completed execution", - "$ref": "#/definitions/transition" - }, - "compensatedBy": { - "type": "string", - "minLength": 1, - "description": "Unique Name of a workflow state which is responsible for compensation of this state", - "pattern": "^[a-z0-9](-?[a-z0-9])*$" - }, - "usedForCompensation": { - "type": "boolean", - "default": false, - "description": "If true, this state is used to compensate another state. Default is false" - }, - "metadata": { - "$ref": "common.json#/definitions/metadata" - } - }, - "additionalProperties": false, - "if": { - "properties": { - "usedForCompensation": { - "const": true - } - }, - "required": [ - "usedForCompensation" - ] - }, - "then": { - "required": [ - "name", - "type", - "branches" - ] - }, - "else": { - "oneOf": [ - { - "required": [ - "name", - "type", - "branches", - "end" - ] - }, - { - "required": [ - "name", - "type", - "branches", - "transition" - ] - } - ] - } - }, - "switchstate": { - "oneOf": [ - { - "$ref": "#/definitions/databasedswitchstate" - }, - { - "$ref": "#/definitions/eventbasedswitchstate" - } - ] - }, - "eventbasedswitchstate": { - "type": "object", - "description": "Permits transitions to other states based on events", - "properties": { - "name": { - "type": "string", - "description": "State name", - "pattern": "^[a-z0-9](-?[a-z0-9])*$" - }, - "type": { - "type": "string", - "const": "switch", - "description": "State type" - }, - "stateDataFilter": { - "description": "State data filter", - "$ref": "#/definitions/statedatafilter" - }, - "timeouts": { - "type": "object", - "description": "State specific timeouts", - "properties": { - "stateExecTimeout": { - "$ref": "timeouts.json#/definitions/stateExecTimeout" - }, - "eventTimeout": { - "$ref": "timeouts.json#/definitions/eventTimeout" - } - }, - "required": [] - }, - "eventConditions": { - "type": 
"array", - "description": "Defines conditions evaluated against events", - "items": { - "type": "object", - "$ref": "#/definitions/eventcondition" - }, - "additionalItems": false - }, - "onErrors": { - "$ref": "#/definitions/onerrors" - }, - "defaultCondition": { - "description": "Default transition of the workflow if there is no matching data conditions. Can include a transition or end definition", - "$ref": "#/definitions/defaultconditiondef" - }, - "compensatedBy": { - "type": "string", - "minLength": 1, - "description": "Unique Name of a workflow state which is responsible for compensation of this state", - "pattern": "^[a-z0-9](-?[a-z0-9])*$" - }, - "usedForCompensation": { - "type": "boolean", - "default": false, - "description": "If true, this state is used to compensate another state. Default is false" - }, - "metadata": { - "$ref": "common.json#/definitions/metadata" - } - }, - "additionalProperties": false, - "required": [ - "name", - "type", - "eventConditions", - "defaultCondition" - ] - }, - "databasedswitchstate": { - "type": "object", - "description": "Permits transitions to other states based on data conditions", - "properties": { - "name": { - "type": "string", - "description": "State name", - "pattern": "^[a-z0-9](-?[a-z0-9])*$" - }, - "type": { - "type": "string", - "const": "switch", - "description": "State type" - }, - "stateDataFilter": { - "description": "State data filter", - "$ref": "#/definitions/statedatafilter" - }, - "timeouts": { - "type": "object", - "description": "State specific timeouts", - "properties": { - "stateExecTimeout": { - "$ref": "timeouts.json#/definitions/stateExecTimeout" - } - }, - "required": [] - }, - "dataConditions": { - "type": "array", - "description": "Defines conditions evaluated against state data", - "items": { - "type": "object", - "$ref": "#/definitions/datacondition" - }, - "additionalItems": false - }, - "onErrors": { - "$ref": "#/definitions/onerrors" - }, - "defaultCondition": { - "description": 
"Default transition of the workflow if there is no matching data conditions. Can include a transition or end definition", - "$ref": "#/definitions/defaultconditiondef" - }, - "compensatedBy": { - "type": "string", - "minLength": 1, - "description": "Unique Name of a workflow state which is responsible for compensation of this state", - "pattern": "^[a-z0-9](-?[a-z0-9])*$" - }, - "usedForCompensation": { - "type": "boolean", - "default": false, - "description": "If true, this state is used to compensate another state. Default is false" - }, - "metadata": { - "$ref": "common.json#/definitions/metadata" - } - }, - "additionalProperties": false, - "required": [ - "name", - "type", - "dataConditions", - "defaultCondition" - ] - }, - "defaultconditiondef": { - "type": "object", - "description": "DefaultCondition definition. Can be either a transition or end definition", - "properties": { - "name": { - "type": "string", - "description": "The optional name of the default condition, used solely for display purposes", - "pattern": "^[a-z0-9](-?[a-z0-9])*$" - }, - "transition": { - "$ref": "#/definitions/transition" - }, - "end": { - "$ref": "#/definitions/end" - } - }, - "additionalProperties": false, - "oneOf": [ - { - "required": [ - "transition" - ] - }, - { - "required": [ - "end" - ] - } - ] - }, - "eventcondition": { - "oneOf": [ - { - "$ref": "#/definitions/transitioneventcondition" - }, - { - "$ref": "#/definitions/endeventcondition" - } - ] - }, - "transitioneventcondition": { - "type": "object", - "description": "Switch state data event condition", - "properties": { - "name": { - "type": "string", - "description": "Event condition name", - "pattern": "^[a-z0-9](-?[a-z0-9])*$" - }, - "eventRef": { - "type": "string", - "description": "References an unique event name in the defined workflow events", - "pattern": "^[a-z0-9](-?[a-z0-9])*$" - }, - "transition": { - "description": "Next transition of the workflow if there is valid matches", - "$ref": 
"#/definitions/transition" - }, - "eventDataFilter": { - "description": "Event data filter definition", - "$ref": "#/definitions/eventdatafilter" - }, - "metadata": { - "$ref": "common.json#/definitions/metadata" - } - }, - "additionalProperties": false, - "required": [ - "name", - "eventRef", - "transition" - ] - }, - "endeventcondition": { - "type": "object", - "description": "Switch state data event condition", - "properties": { - "name": { - "type": "string", - "description": "Event condition name", - "pattern": "^[a-z0-9](-?[a-z0-9])*$" - }, - "eventRef": { - "type": "string", - "description": "References an unique event name in the defined workflow events", - "pattern": "^[a-z0-9](-?[a-z0-9])*$" - }, - "end": { - "$ref": "#/definitions/end", - "description": "Explicit transition to end" - }, - "eventDataFilter": { - "description": "Event data filter definition", - "$ref": "#/definitions/eventdatafilter" - }, - "metadata": { - "$ref": "common.json#/definitions/metadata" - } - }, - "additionalProperties": false, - "required": [ - "name", - "eventRef", - "end" - ] - }, - "datacondition": { - "oneOf": [ - { - "$ref": "#/definitions/transitiondatacondition" - }, - { - "$ref": "#/definitions/enddatacondition" - } - ] - }, - "transitiondatacondition": { - "type": "object", - "description": "Switch state data based condition", - "properties": { - "name": { - "type": "string", - "description": "Data condition name", - "pattern": "^[a-z0-9](-?[a-z0-9])*$" - }, - "condition": { - "type": "string", - "description": "Workflow expression evaluated against state data. 
Must evaluate to true or false" - }, - "transition": { - "description": "Workflow transition if condition is evaluated to true", - "$ref": "#/definitions/transition" - }, - "metadata": { - "$ref": "common.json#/definitions/metadata" - } - }, - "additionalProperties": false, - "required": [ - "name", - "condition", - "transition" - ] - }, - "enddatacondition": { - "type": "object", - "description": "Switch state data based condition", - "properties": { - "name": { - "type": "string", - "description": "Data condition name", - "pattern": "^[a-z0-9](-?[a-z0-9])*$" - }, - "condition": { - "type": "string", - "description": "Workflow expression evaluated against state data. Must evaluate to true or false" - }, - "end": { - "$ref": "#/definitions/end", - "description": "Workflow end definition" - }, - "metadata": { - "$ref": "common.json#/definitions/metadata" - } - }, - "additionalProperties": false, - "required": [ - "name", - "condition", - "end" - ] - }, - "injectstate": { - "type": "object", - "description": "Inject static data into state data. 
Does not perform any actions", - "properties": { - "name": { - "type": "string", - "description": "State name", - "pattern": "^[a-z0-9](-?[a-z0-9])*$" - }, - "type": { - "type": "string", - "const": "inject", - "description": "State type" - }, - "end": { - "$ref": "#/definitions/end", - "description": "State end definition" - }, - "data": { - "type": "object", - "description": "JSON object which can be set as states data input and can be manipulated via filters" - }, - "stateDataFilter": { - "description": "State data filter", - "$ref": "#/definitions/statedatafilter" - }, - "transition": { - "description": "Next transition of the workflow after injection has completed", - "$ref": "#/definitions/transition" - }, - "compensatedBy": { - "type": "string", - "minLength": 1, - "description": "Unique Name of a workflow state which is responsible for compensation of this state", - "pattern": "^[a-z0-9](-?[a-z0-9])*$" - }, - "usedForCompensation": { - "type": "boolean", - "default": false, - "description": "If true, this state is used to compensate another state. 
Default is false" - }, - "metadata": { - "$ref": "common.json#/definitions/metadata" - } - }, - "additionalProperties": false, - "if": { - "properties": { - "usedForCompensation": { - "const": true - } - }, - "required": [ - "usedForCompensation" - ] - }, - "then": { - "required": [ - "name", - "type", - "data" - ] - }, - "else": { - "oneOf": [ - { - "required": [ - "name", - "type", - "data", - "end" - ] - }, - { - "required": [ - "name", - "type", - "data", - "transition" - ] - } - ] - } - }, - "foreachstate": { - "type": "object", - "description": "Execute a set of defined actions or workflows for each element of a data array", - "properties": { - "name": { - "type": "string", - "description": "State name", - "pattern": "^[a-z0-9](-?[a-z0-9])*$" - }, - "type": { - "type": "string", - "const": "foreach", - "description": "State type" - }, - "end": { - "$ref": "#/definitions/end", - "description": "State end definition" - }, - "inputCollection": { - "type": "string", - "description": "Workflow expression selecting an array element of the states data" - }, - "outputCollection": { - "type": "string", - "description": "Workflow expression specifying an array element of the states data to add the results of each iteration" - }, - "iterationParam": { - "type": "string", - "description": "Name of the iteration parameter that can be referenced in actions/workflow. For each parallel iteration, this param should contain an unique element of the inputCollection array" - }, - "batchSize": { - "type": [ - "number", - "string" - ], - "minimum": 0, - "minLength": 0, - "description": "Specifies how many iterations may run in parallel at the same time. 
Used if 'mode' property is set to 'parallel' (default)" - }, - "actions": { - "type": "array", - "description": "Actions to be executed for each of the elements of inputCollection", - "items": { - "type": "object", - "$ref": "#/definitions/action" - }, - "additionalItems": false - }, - "timeouts": { - "type": "object", - "description": "State specific timeouts", - "properties": { - "stateExecTimeout": { - "$ref": "timeouts.json#/definitions/stateExecTimeout" - }, - "actionExecTimeout": { - "$ref": "timeouts.json#/definitions/actionExecTimeout" - } - }, - "required": [] - }, - "stateDataFilter": { - "description": "State data filter", - "$ref": "#/definitions/statedatafilter" - }, - "onErrors": { - "$ref": "#/definitions/onerrors" - }, - "transition": { - "description": "Next transition of the workflow after state has completed", - "$ref": "#/definitions/transition" - }, - "compensatedBy": { - "type": "string", - "minLength": 1, - "description": "Unique Name of a workflow state which is responsible for compensation of this state", - "pattern": "^[a-z0-9](-?[a-z0-9])*$" - }, - "usedForCompensation": { - "type": "boolean", - "default": false, - "description": "If true, this state is used to compensate another state. 
Default is false" - }, - "mode": { - "type": "string", - "enum": [ - "sequential", - "parallel" - ], - "description": "Specifies how iterations are to be performed (sequentially or in parallel)", - "default": "parallel" - }, - "metadata": { - "$ref": "common.json#/definitions/metadata" - } - }, - "additionalProperties": false, - "if": { - "properties": { - "usedForCompensation": { - "const": true - } - }, - "required": [ - "usedForCompensation" - ] - }, - "then": { - "required": [ - "name", - "type", - "inputCollection", - "actions" - ] - }, - "else": { - "oneOf": [ - { - "required": [ - "name", - "type", - "inputCollection", - "actions", - "end" - ] - }, - { - "required": [ - "name", - "type", - "inputCollection", - "actions", - "transition" - ] - } - ] - } - }, - "callbackstate": { - "type": "object", - "description": "This state performs an action, then waits for the callback event that denotes completion of the action", - "properties": { - "name": { - "type": "string", - "description": "State name", - "pattern": "^[a-z0-9](-?[a-z0-9])*$" - }, - "type": { - "type": "string", - "const": "callback", - "description": "State type" - }, - "action": { - "description": "Defines the action to be executed", - "$ref": "#/definitions/action" - }, - "eventRef": { - "type": "string", - "description": "References an unique callback event name in the defined workflow events", - "pattern": "^[a-z0-9](-?[a-z0-9])*$" - }, - "timeouts": { - "type": "object", - "description": "State specific timeouts", - "properties": { - "stateExecTimeout": { - "$ref": "timeouts.json#/definitions/stateExecTimeout" - }, - "actionExecTimeout": { - "$ref": "timeouts.json#/definitions/actionExecTimeout" - }, - "eventTimeout": { - "$ref": "timeouts.json#/definitions/eventTimeout" - } - }, - "required": [] - }, - "eventDataFilter": { - "description": "Event data filter", - "$ref": "#/definitions/eventdatafilter" - }, - "stateDataFilter": { - "description": "State data filter", - "$ref": 
"#/definitions/statedatafilter" - }, - "onErrors": { - "$ref": "#/definitions/onerrors" - }, - "transition": { - "description": "Next transition of the workflow after all the actions have been performed", - "$ref": "#/definitions/transition" - }, - "end": { - "$ref": "#/definitions/end", - "description": "State end definition" - }, - "compensatedBy": { - "type": "string", - "minLength": 1, - "description": "Unique Name of a workflow state which is responsible for compensation of this state", - "pattern": "^[a-z0-9](-?[a-z0-9])*$" - }, - "usedForCompensation": { - "type": "boolean", - "default": false, - "description": "If true, this state is used to compensate another state. Default is false" - }, - "metadata": { - "$ref": "common.json#/definitions/metadata" - } - }, - "additionalProperties": false, - "if": { - "properties": { - "usedForCompensation": { - "const": true - } - }, - "required": [ - "usedForCompensation" - ] - }, - "then": { - "required": [ - "name", - "type", - "action", - "" - ] - }, - "else": { - "oneOf": [ - { - "required": [ - "name", - "type", - "action", - "eventRef", - "end" - ] - }, - { - "required": [ - "name", - "type", - "action", - "eventRef", - "transition" - ] - } - ] - } - }, - "startdef": { - "oneOf": [ - { - "type": "string", - "description": "Name of the starting workflow state", - "minLength": 1, - "pattern": "^[a-z0-9](-?[a-z0-9])*$" - }, - { - "type": "object", - "description": "Workflow start definition", - "properties": { - "stateName": { - "type": "string", - "description": "Name of the starting workflow state", - "minLength": 1, - "pattern": "^[a-z0-9](-?[a-z0-9])*$" - }, - "schedule": { - "description": "Define the time/repeating intervals or cron at which workflow instances should be automatically started.", - "$ref": "#/definitions/schedule" - } - }, - "additionalProperties": false, - "required": [ - "schedule" - ] - } - ] - }, - "schedule": { - "oneOf": [ - { - "type": "string", - "description": "Time interval (must be 
repeating interval) described with ISO 8601 format. Declares when workflow instances will be automatically created. (UTC timezone is assumed)", - "minLength": 1 - }, - { - "type": "object", - "description": "Start state schedule definition", - "properties": { - "interval": { - "type": "string", - "description": "Time interval (must be repeating interval) described with ISO 8601 format. Declares when workflow instances will be automatically created.", - "minLength": 1 - }, - "cron": { - "$ref": "#/definitions/crondef" - }, - "timezone": { - "type": "string", - "description": "Timezone name used to evaluate the interval & cron-expression. (default: UTC)" - } - }, - "additionalProperties": false, - "oneOf": [ - { - "required": [ - "interval" - ] - }, - { - "required": [ - "cron" - ] - } - ] - } - ] - }, - "end": { - "oneOf": [ - { - "type": "boolean", - "description": "State end definition", - "default": true - }, - { - "type": "object", - "description": "State end definition", - "properties": { - "terminate": { - "type": "boolean", - "default": false, - "description": "If true, completes all execution flows in the given workflow instance" - }, - "produceEvents": { - "type": "array", - "description": "Defines events that should be produced", - "items": { - "type": "object", - "$ref": "#/definitions/produceeventdef" - }, - "additionalItems": false - }, - "compensate": { - "type": "boolean", - "default": false, - "description": "If set to true, triggers workflow compensation. 
Default is false" - }, - "continueAs": { - "$ref": "#/definitions/continueasdef" - } - }, - "additionalProperties": false, - "required": [] - } - ] - }, - "produceeventdef": { - "type": "object", - "description": "Produce an event and set its data", - "properties": { - "eventRef": { - "type": "string", - "description": "References a name of a defined event", - "pattern": "^[a-z0-9](-?[a-z0-9])*$" - }, - "data": { - "type": [ - "string", - "object" - ], - "description": "If String, expression which selects parts of the states data output to become the data of the produced event. If object a custom object to become the data of produced event." - }, - "contextAttributes": { - "type": "object", - "description": "Add additional event extension context attributes", - "additionalProperties": { - "type": "string" - } - } - }, - "additionalProperties": false, - "required": [ - "eventRef" - ] - }, - "statedatafilter": { - "type": "object", - "properties": { - "input": { - "type": "string", - "description": "Workflow expression to filter the state data input" - }, - "output": { - "type": "string", - "description": "Workflow expression that filters the state data output" - } - }, - "additionalProperties": false, - "required": [] - }, - "eventdatafilter": { - "type": "object", - "properties": { - "useData": { - "type": "boolean", - "description": "If set to false, event payload is not added/merged to state data. In this case 'data' and 'toStateData' should be ignored. Default is true.", - "default": true - }, - "data": { - "type": "string", - "description": "Workflow expression that filters the received event payload (default: '${ . }')" - }, - "toStateData": { - "type": "string", - "description": " Workflow expression that selects a state data element to which the filtered event should be added/merged into. If not specified, denotes, the top-level state data element." 
- } - }, - "additionalProperties": false, - "required": [] - }, - "actiondatafilter": { - "type": "object", - "properties": { - "fromStateData": { - "type": "string", - "description": "Workflow expression that selects state data that the state action can use" - }, - "useResults": { - "type": "boolean", - "description": "If set to false, action data results are not added/merged to state data. In this case 'results' and 'toStateData' should be ignored. Default is true.", - "default": true - }, - "results": { - "type": "string", - "description": "Workflow expression that filters the actions data results" - }, - "toStateData": { - "type": "string", - "description": "Workflow expression that selects a state data element to which the action results should be added/merged into. If not specified, denote, the top-level state data element" - } - }, - "additionalProperties": false, - "required": [] - }, - "validationSchema" : { - "oneOf": [ - { - "type": "string", - "description": "URI of the JSON Schema used to validate the workflow", - "minLength": 1 - }, - { - "type": "object", - "description": "Workflow data input schema definition", - "properties": { - "schema": { - "oneOf":[ - { - "type": "string", - "description": "URI of the JSON Schema used to validate the workflow", - "minLength": 1 - }, - { - "type": "object", - "description": "The JSON Schema object used to validate the workflow", - "$schema": "http://json-schema.org/draft-07/schema#" - } - ] - }, - "failOnValidationErrors": { - "type": "boolean", - "default": true, - "description": "Determines if error should be thrown if there are validation errors" - } - }, - "additionalProperties": false, - "required": [ - "schema" - ] - } - ] - } - } -} diff --git a/schema/workflow.yaml b/schema/workflow.yaml new file mode 100644 index 00000000..91cb64dc --- /dev/null +++ b/schema/workflow.yaml @@ -0,0 +1,733 @@ +id: https://serverlessworkflow.io/schemas/1.0.0-alpha1/workflow.json +$schema: 
http://json-schema.org/draft-07/schema +description: Serverless Workflow specification - Workflow Schema +type: object +properties: + document: + type: object + properties: + dsl: + type: string + pattern: ^(0|[1-9]\d*)\.(0|[1-9]\d*)\.(0|[1-9]\d*)(?:-((?:0|[1-9]\d*|\d*[a-zA-Z-][0-9a-zA-Z-]*)(?:\.(?:0|[1-9]\d*|\d*[a-zA-Z-][0-9a-zA-Z-]*))*))?(?:\+([0-9a-zA-Z-]+(?:\.[0-9a-zA-Z-]+)*))?$ + description: The version of the DSL used by the workflow. + namespace: + type: string + pattern: ^[a-zA-Z0-9](?:[a-zA-Z0-9-]{0,61}[a-zA-Z0-9])?$ + description: The workflow's namespace. + name: + type: string + pattern: ^[a-zA-Z0-9](?:[a-zA-Z0-9-]{0,61}[a-zA-Z0-9])?$ + description: The workflow's name.
+ version: + type: string + pattern: ^(0|[1-9]\d*)\.(0|[1-9]\d*)\.(0|[1-9]\d*)(?:-((?:0|[1-9]\d*|\d*[a-zA-Z-][0-9a-zA-Z-]*)(?:\.(?:0|[1-9]\d*|\d*[a-zA-Z-][0-9a-zA-Z-]*))*))?(?:\+([0-9a-zA-Z-]+(?:\.[0-9a-zA-Z-]+)*))?$ + description: The workflow's semantic version. + title: + type: string + description: The workflow's title. + summary: + type: string + description: The workflow's Markdown summary. + tags: + type: object + description: A key/value mapping of the workflow's tags, if any. + additionalProperties: true + required: [ dsl, namespace, name, version ] + description: Documents the workflow. + use: + type: object + properties: + authentications: + type: object + additionalProperties: + $ref: '#/$defs/authenticationPolicy' + description: The workflow's reusable authentication policies. + errors: + type: object + additionalProperties: + $ref: '#/$defs/error' + description: The workflow's reusable errors. + extensions: + type: object + additionalProperties: + $ref: '#/$defs/extension' + description: The workflow's extensions. + functions: + type: object + additionalProperties: + $ref: '#/$defs/function' + description: The workflow's reusable functions. + retries: + type: object + additionalProperties: + $ref: '#/$defs/retryPolicy' + description: The workflow's reusable retry policies. + secrets: + type: array + items: + type: string + description: The workflow's secrets. + description: Defines the workflow's reusable components. + do: + type: object + minProperties: 1 + additionalProperties: + $ref: '#/$defs/task' + description: Defines the tasks the workflow must perform. 
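
As a concrete illustration of the `document`, `use`, and `do` sections defined above, a minimal workflow header could look like the sketch below. The namespace, workflow name, secret name, and function name are invented for the example and are not taken from the specification:

```yaml
document:
  dsl: 1.0.0-alpha1
  namespace: example            # must match ^[a-zA-Z0-9](?:[a-zA-Z0-9-]{0,61}[a-zA-Z0-9])?$
  name: greet-users
  version: 0.1.0                # semantic version, per the pattern above
use:
  secrets:
    - apiToken                  # names of secrets the tasks may reference
do:
  greet:
    call: greet                 # hypothetical function, assumed declared under use/functions
    with:
      user: jane-doe
```
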
+$defs: + task: + type: object + oneOf: + - $ref: '#/$defs/callTask' + - $ref: '#/$defs/compositeTask' + - $ref: '#/$defs/emitTask' + - $ref: '#/$defs/forTask' + - $ref: '#/$defs/listenTask' + - $ref: '#/$defs/raiseTask' + - $ref: '#/$defs/runTask' + - $ref: '#/$defs/setTask' + - $ref: '#/$defs/switchTask' + - $ref: '#/$defs/tryTask' + - $ref: '#/$defs/waitTask' + callTask: + type: object + properties: + call: + type: string + description: The name of the function to call. + with: + type: object + additionalProperties: true + description: name/value mapping of the parameters, if any, to call the function with + required: [ call ] + compositeTask: + properties: + execute: + type: object + oneOf: + - properties: + concurrently: + type: object + minProperties: 2 + additionalProperties: + $ref: '#/$defs/task' + description: A name/definition mapping of the tasks to perform concurrently. + compete: + type: boolean + description: Indicates whether or not the concurrent tasks are racing against each other, with a single possible winner, which sets the composite task's output. + required: [ concurrently ] + - properties: + sequentially: + type: object + minProperties: 2 + additionalProperties: + $ref: '#/$defs/task' + description: A name/definition mapping of the tasks to perform sequentially. 
+ required: [ sequentially ] + description: Configures the task execution strategy to use + required: [ execute ] + description: Serves as a pivotal orchestrator within workflow systems, enabling the seamless integration and execution of multiple subtasks to accomplish complex operations + emitTask: + properties: + emit: + type: object + properties: + event: + type: object + properties: + id: + type: string + description: The event's unique identifier + source: + type: string + format: uri + description: Identifies the context in which an event happened + type: + type: string + description: This attribute contains a value describing the type of event related to the originating occurrence. + time: + type: string + format: date-time + subject: + type: string + datacontenttype: + type: string + description: Content type of data value. This attribute enables data to carry any type of content, whereby format and encoding might differ from that of the chosen event format. + dataschema: + type: string + format: uri + required: [ source, type ] + additionalProperties: true + required: [ event ] + required: [ emit ] + description: Allows workflows to publish events to event brokers or messaging systems, facilitating communication and coordination between different components and services. + forTask: + properties: + for: + type: object + properties: + each: + type: string + description: The name of the variable used to store the current item being enumerated. + default: item + in: + type: string + description: A runtime expression used to get the collection to enumerate. + at: + type: string + description: The name of the variable used to store the index of the current item being enumerated. + default: index + required: [ in ] + while: + type: string + description: A runtime expression that represents the condition, if any, that must be met for the iteration to continue. 
+ do: + $ref: '#/$defs/task' + description: Allows workflows to iterate over a collection of items, executing a defined set of subtasks for each item in the collection. This task type is instrumental in handling scenarios such as batch processing, data transformation, and repetitive operations across datasets. + required: [ for, do ] + listenTask: + type: object + properties: + listen: + type: object + properties: + to: + type: object + oneOf: + - properties: + all: + type: array + items: + $ref: '#/$defs/eventFilter' + required: [ all ] + - properties: + any: + type: array + items: + $ref: '#/$defs/eventFilter' + required: [ any ] + - properties: + one: + $ref: '#/$defs/eventFilter' + required: [ one ] + required: [ to ] + required: [ listen ] + description: Provides a mechanism for workflows to await and react to external events, enabling event-driven behavior within workflow systems. + raiseTask: + type: object + properties: + raise: + type: object + properties: + error: + $ref: '#/$defs/error' + description: Defines the error to raise. + required: [ error ] + required: [ raise ] + description: Intentionally triggers and propagates errors. + runTask: + type: object + properties: + run: + type: object + oneOf: + - properties: + container: + type: object + properties: + image: + type: string + description: The name of the container image to run. + command: + type: string + description: The command, if any, to execute on the container + ports: + type: object + description: The container's port mappings, if any. + volumes: + type: object + description: The container's volume mappings, if any. + environment: + type: object + description: A key/value mapping of the environment variables, if any, to use when running the configured process. + required: [ image ] + required: [ container ] + description: Enables the execution of external processes encapsulated within a containerized environment. 
+ - properties: + script: + type: object + properties: + language: + type: string + description: The language of the script to run. + environment: + type: object + additionalProperties: true + description: A key/value mapping of the environment variables, if any, to use when running the configured process. + oneOf: + - properties: + code: + type: string + required: [ code ] + description: The script's code. + - properties: + source: + $ref: '#/$defs/externalResource' + description: The script's resource. + required: [ source ] + required: [ language ] + required: [ script ] + description: Enables the execution of custom scripts or code within a workflow, empowering workflows to perform specialized logic, data processing, or integration tasks by executing user-defined scripts written in various programming languages. + - properties: + shell: + type: object + properties: + command: + type: string + description: The shell command to run. + arguments: + type: object + additionalProperties: true + description: A name/value mapping of the arguments, if any, of the shell command to run. + environment: + type: object + additionalProperties: true + description: A key/value mapping of the environment variables, if any, to use when running the configured process. + required: [ command ] + required: [ shell ] + description: Enables the execution of shell commands within a workflow, allowing workflows to interact with the underlying operating system and perform system-level operations, such as file manipulation, environment configuration, or system administration tasks. + - properties: + workflow: + type: object + properties: + namespace: + type: string + description: The namespace the workflow to run belongs to. + name: + type: string + description: The name of the workflow to run. + version: + type: string + default: latest + description: The version of the workflow to run.
Defaults to latest + input: + type: object + additionalProperties: true + description: The data, if any, to pass as input to the workflow to execute. The value should be validated against the target workflow's input schema, if specified. + required: [ namespace, name, version ] + required: [ workflow ] + description: Enables the invocation and execution of nested workflows within a parent workflow, facilitating modularization, reusability, and abstraction of complex logic or business processes by encapsulating them into standalone workflow units. + required: [ run ] + description: Provides the capability to execute external containers, shell commands, scripts, or workflows. + setTask: + type: object + properties: + set: + type: object + minProperties: 1 + additionalProperties: true + description: The data to set + required: [ set ] + description: A task used to set data + switchTask: + type: object + properties: + switch: + type: object + minProperties: 1 + additionalProperties: + type: object + properties: + when: + type: string + description: A runtime expression used to determine whether or not the case matches. + then: + type: string + enum: [ continue, exit, end ] + default: continue + description: The flow directive to execute when the case matches. + required: [ switch ] + description: Enables conditional branching within workflows, allowing them to dynamically select different paths based on specified conditions or criteria + tryTask: + type: object + properties: + try: + $ref: '#/$defs/task' + description: The task to perform. + catch: + type: object + properties: + errors: + type: object + as: + type: string + description: The name of the runtime expression variable to save the error as. Defaults to 'error'. 
+ when: + type: string + description: A runtime expression used to determine whether or not to catch the filtered error + exceptWhen: + type: string + description: A runtime expression used to determine whether or not to skip catching the filtered error + retry: + $ref: '#/$defs/retryPolicy' + description: The retry policy to use, if any, when catching errors. + do: + $ref: '#/$defs/task' + description: The definition of the task to run when catching an error. + required: [ try, catch ] + description: Serves as a mechanism within workflows to handle errors gracefully, potentially retrying failed tasks before proceeding with alternate ones. + waitTask: + type: object + properties: + wait: + $ref: '#/$defs/duration' + description: The amount of time to wait. + required: [ wait ] + description: Allows workflows to pause or delay their execution for a specified period of time. + authenticationPolicy: + type: object + oneOf: + - properties: + basic: + type: object + properties: + username: + type: string + description: The username to use. + password: + type: string + description: The password to use. + required: [ username, password ] + required: [ basic ] + description: Use basic authentication. + - properties: + bearer: + type: object + properties: + token: + type: string + description: The bearer token to use. + required: [ token ] + required: [ bearer ] + description: Use bearer authentication. + - properties: + oauth2: + type: object + properties: + authority: + type: string + format: uri + description: The URI that references the OAuth2 authority to use. + grant: + type: string + description: The grant type to use. + client: + type: object + properties: + id: + type: string + description: The client id to use. + secret: + type: string + description: The client secret to use, if any. + required: [ id ] + scopes: + type: array + items: + type: string + description: The scopes, if any, to request the token for.
+ audiences: + type: array + items: + type: string + description: The audiences, if any, to request the token for. + username: + type: string + description: The username to use. Used only if the grant type is Password. + password: + type: string + description: The password to use. Used only if the grant type is Password. + subject: + $ref: '#/$defs/oauth2Token' + description: The security token that represents the identity of the party on behalf of whom the request is being made. + actor: + $ref: '#/$defs/oauth2Token' + description: The security token that represents the identity of the acting party. + required: [ authority, grant, client ] + required: [ oauth2 ] + description: Use OAuth2 authentication. + description: Defines an authentication policy. + oauth2Token: + type: object + properties: + token: + type: string + description: The security token to use. + type: + type: string + description: The type of the security token to use. + required: [ token, type ] + duration: + type: object + minProperties: 1 + properties: + days: + type: integer + description: Number of days, if any. + hours: + type: integer + description: Number of hours, if any. + minutes: + type: integer + description: Number of minutes, if any. + seconds: + type: integer + description: Number of seconds, if any. + milliseconds: + type: integer + description: Number of milliseconds, if any. + description: The definition of a duration. + error: + type: object + properties: + type: + type: string + format: uri + description: A URI reference that identifies the error type. + status: + type: integer + description: The status code generated by the origin for this occurrence of the error. + instance: + type: string + format: uri + description: A JSON Pointer used to reference the component the error originates from. + title: + type: string + description: A short, human-readable summary of the error.
+ detail: + type: string + description: A human-readable explanation specific to this occurrence of the error. + required: [ type, status, instance ] + endpoint: + type: object + properties: + uri: + type: string + format: uri + description: The endpoint's URI. + authentication: + $ref: '#/$defs/authenticationPolicy' + description: The authentication policy to use. + required: [ uri ] + eventFilter: + type: object + properties: + with: + type: object + minProperties: 1 + properties: + id: + type: string + description: The event's unique identifier. + source: + type: string + description: Identifies the context in which an event happened. + type: + type: string + description: This attribute contains a value describing the type of event related to the originating occurrence. + time: + type: string + subject: + type: string + datacontenttype: + type: string + description: Content type of data value. This attribute enables data to carry any type of content, whereby format and encoding might differ from that of the chosen event format. + dataschema: + type: string + additionalProperties: true + required: [ with ] + extension: + type: object + properties: + extend: + type: string + enum: [ call, composite, emit, for, listen, raise, run, set, switch, try, wait, all ] + description: The type of task to extend. + when: + type: string + description: A runtime expression, if any, used to determine whether or not the extension should apply in the specified context. + before: + $ref: '#/$defs/task' + description: The task to execute before the extended task, if any. + after: + $ref: '#/$defs/task' + description: The task to execute after the extended task, if any. + required: [ extend ] + description: The definition of an extension. + externalResource: + type: object + properties: + uri: + type: string + format: uri + description: The external resource's URI. + authentication: + $ref: '#/$defs/authenticationPolicy' + description: The authentication policy to use.
+ name: + type: string + description: The external resource's name, if any. + required: [ uri ] + retryPolicy: + type: object + properties: + when: + type: string + description: A runtime expression, if any, used to determine whether or not to retry running the task, in a given context. + exceptWhen: + type: string + description: A runtime expression, if any, used to determine whether or not to skip retrying the task, in a given context. + delay: + $ref: '#/$defs/duration' + description: The duration to wait between retry attempts. + backoff: + type: object + oneOf: + - properties: + constant: + type: object + description: The definition of the constant backoff to use, if any. + required: [ constant ] + - properties: + exponential: + type: object + description: The definition of the exponential backoff to use, if any. + required: [ exponential ] + - properties: + linear: + type: object + description: The definition of the linear backoff to use, if any. + required: [ linear ] + description: The retry duration backoff. + limit: + type: object + properties: + attempt: + type: object + properties: + count: + type: integer + description: The maximum number of retry attempts, if any. + duration: + $ref: '#/$defs/duration' + description: The maximum duration for each retry attempt. + duration: + $ref: '#/$defs/duration' + description: The duration limit, if any, for all retry attempts. + description: The retry limit, if any. + jitter: + type: object + properties: + from: + $ref: '#/$defs/duration' + description: The minimum duration of the jitter range. + to: + $ref: '#/$defs/duration' + description: The maximum duration of the jitter range. + required: [ from, to ] + description: The parameters, if any, that control the randomness or variability of the delay between retry attempts. + description: Defines a retry policy.
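To make the `tryTask` and `retryPolicy` shapes defined above concrete, here is a hedged YAML sketch of a named task. The task name, the `http` call, the endpoint, and the fallback data are all hypothetical illustrations, not values taken from the schema itself:

```yaml
# Hypothetical example of a tryTask with a retry policy (all names are illustrative)
tryGetData:
  try:
    call: http                  # assumes an 'http' function is available to the workflow
    with:
      method: get
      endpoint: https://example.com/data
  catch:
    as: error                   # expose the caught error under the 'error' variable
    retry:
      delay:
        seconds: 2              # base delay between attempts
      backoff:
        exponential: {}         # grow the delay exponentially
      limit:
        attempt:
          count: 3              # give up after three attempts
    do:
      set:                      # fallback task run when an error is caught
        status: failed
```

Per the schema, `try` and `catch` are the only required properties of a `tryTask`; `errors`, `when`, `exceptWhen`, `retry`, and `do` within `catch` are all optional.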
+required: [ document, do ] \ No newline at end of file diff --git a/specification.md b/specification.md deleted file mode 100644 index 37a31837..00000000 --- a/specification.md +++ /dev/null @@ -1,6943 +0,0 @@ -# Serverless Workflow Specification - -## Table of Contents - -- [Abstract](#abstract) -- [Status of this document](#status-of-this-document) -- [Overview](#overview) - * [Why we need a specification?](#why-we-need-a-specification) - * [Focus on standards](#focus-on-standards) -- [Project Components](#project-components) -- [Specification Details](#specification-details) - * [Core Concepts](#core-concepts) - * [Workflow Definition](#workflow-definition) - * [Workflow Instance](#workflow-instance) - * [Workflow Model](#workflow-model) - * [Workflow Data](#workflow-data) - + [Workflow Data Input](#workflow-data-input) - + [Information Passing Between States](#information-passing-between-states) - + [Workflow data output](#workflow-data-output) - + [State data filters](#state-data-filters) - + [Action data filters](#action-data-filters) - + [Event data filters](#event-data-filters) - + [Using multiple data filters](#using-multiple-data-filters) - + [Data Merging](#data-merging) - * [Workflow Functions](#workflow-functions) - + [Using Functions for OpenAPI Service Invocations](#using-functions-for-openapi-service-invocations) - + [Using Functions for HTTP Service Invocations](#using-functions-for-http-service-invocations) - + [Using Functions for Async API Service Invocations](#using-functions-for-async-api-service-invocations) - + [Using Functions for RPC Service Invocations](#using-functions-for-rpc-service-invocations) - + [Using Functions for GraphQL Service Invocations](#using-functions-for-graphql-service-invocations) - - [Invoking a GraphQL `Query`](#invoking-a-graphql-query) - - [Invoking a GraphQL `Mutation`](#invoking-a-graphql-mutation) - + [Using Functions for OData Service Invocations](#using-functions-for-odata-service-invocations) - - [Creating 
an OData Function Definition](#creating-an-odata-function-definition) - - [Invoking an OData Function Definition](#invoking-an-odata-function-definition) - + [Using Functions for Expression Evaluation](#using-functions-for-expression-evaluation) - + [Defining custom function types](#defining-custom-function-types) - * [Workflow Expressions](#workflow-expressions) - * [Workflow Definition Structure](#workflow-definition-structure) - + [Workflow States](#workflow-states) - - [Event State](#event-state) - - [Operation State](#operation-state) - - [Switch State](#switch-state) - - [Parallel State](#parallel-state) - - [Inject State](#inject-state) - - [ForEach State](#foreach-state) - - [Callback State](#callback-state) - + [Related State Definitions](#related-state-definitions) - - [Function Definition](#function-definition) - - [Event Definition](#event-definition) - - [Auth Definition](#auth-definition) - - [Basic Properties Definition](#basic-properties-definition) - - [Bearer Properties Definition](#bearer-properties-definition) - - [OAuth2 Properties Definition](#oauth2-properties-definition) - - [Correlation Definition](#correlation-definition) - - [OnEvents Definition](#onevents-definition) - - [Action Definition](#action-definition) - - [Subflow Action](#subflow-action) - - [FunctionRef Definition](#functionref-definition) - - [EventRef Definition](#eventref-definition) - - [SubFlowRef Definition](#subflowref-definition) - - [Error Handling Configuration](#error-handling-configuration) - - [Error Definition](#error-definition) - - [Error Types](#error-types) - - [Error Reference](#error-reference) - - [Error Handler Definition](#error-handler-definition) - - [Error Handler Reference](#error-handler-reference) - - [Error Policy Definition](#error-policy-definition) - - [Error Outcome Definition](#error-outcome-definition) - - [Error Throw Definition](#error-throw-definition) - - [Retry Definition](#retry-definition) - - [Transition 
Definition](#transition-definition) - - [Switch State Data Conditions](#switch-state-data-conditions) - - [Switch State Event Conditions](#switch-state-event-conditions) - - [Parallel State Branch](#parallel-state-branch) - - [Parallel State Handling Exceptions](#parallel-state-handling-exceptions) - - [Start Definition](#start-definition) - - [Schedule Definition](#schedule-definition) - - [Cron Definition](#cron-definition) - - [End Definition](#end-definition) - - [ProducedEvent Definition](#producedevent-definition) - - [Transitions](#transitions) - - [Additional Properties](#additional-properties) - * [Workflow Error Handling](#workflow-error-handling) - + [Error Definitions](#error-definitions) - + [Error Types](#error-types) - + [Error Source](#error-source) - + [Error Handling Strategies](#error-handling-strategies) - - [Error Handlers](#error-handlers) - - [Error Policies](#error-policies) - + [Error Retries](#error-retries) - - [Retry Policy Execution](#retry-policy-execution) - - [Retry Behavior](#retry-behavior) - - [Retry Exhaustion](#retry-exhaustion) - + [Error Outcomes](#error-outcomes) - + [Error Bubbling](#error-bubbling) - + [Error Handling Best Practices](#error-handling-best-practices) - * [Workflow Timeouts](#workflow-timeouts) - + [Workflow Timeout Definition](#workflow-timeout-definition) - - [WorkflowExecTimeout Definition](#workflowexectimeout-definition) - + [States Timeout Definition](#states-timeout-definition) - + [Branch Timeout Definition](#branch-timeout-definition) - + [Event Timeout Definition](#event-timeout-definition) - * [Workflow Compensation](#workflow-compensation) - + [Defining Compensation](#defining-compensation) - + [Triggering Compensation](#triggering-compensation) - + [Compensation Execution Details](#compensation-execution-details) - + [Compensation and Active States](#compensation-and-active-states) - + [Unrecoverable errors during compensation](#unrecoverable-errors-during-compensation) - * [Continuing as a new 
Execution](#continuing-as-a-new-execution) - + [ContinueAs in sub workflows](#continueas-in-sub-workflows) - * [Workflow Versioning](#workflow-versioning) - * [Workflow Constants](#workflow-constants) - * [Workflow Secrets](#workflow-secrets) - * [Workflow Metadata](#workflow-metadata) - * [Workflow Context](#workflow-context) - * [Naming Convention](#naming-convention) -- [Extensions](#extensions) -- [Use Cases](#use-cases) -- [Examples](#examples) -- [Comparison to other workflow languages](#comparison-to-other-workflow-languages) -- [References](#references) -- [License](#license) - -## Abstract - -The Serverless Workflow project defines a vendor-neutral and declarative workflow language, -targeting the Serverless computing technology domain. - -## Status of this document - -This document represents the current state of the specification. -It includes all features so far released -as well as all features planned to be added in the next release. - -You can find all specification releases [here](https://github.com/serverlessworkflow/specification/releases). -You can find the specification roadmap [here](roadmap/README.md). - -## Overview - -Workflows allow us to capture and organize business requirements in a unified manner. -They can bridge the gap between how we express and model business logic. - -A key component of workflows is the domain-specific language (DSL) we use to model our -business logic and solutions. Selecting the appropriate workflow language for our business and technology domains is -a very important decision to be considered. - -Serverless Workflow focuses on defining a **vendor-neutral**, **platform-independent**, and **declarative** workflow -language that targets the serverless computing technology domain. -It can be used to significantly bridge the gap between your unique business domain and the target technology domain. - -### Why we need a specification? 
-
-The lack of a common way to define and model workflows means that we must constantly re-learn
-how to write them. This also limits the potential for common libraries, tooling, and
-infrastructure to aid workflow modeling and execution across different platforms.
-Overall, this hinders both the portability and the productivity that workflow orchestration can deliver.
-
-Serverless Workflow addresses the need for a community-driven, vendor-neutral, and platform-independent
-workflow language specification that targets the serverless computing technology domain.
-
-Using a specification-based workflow language allows us to model our workflows once and deploy them
-onto many different container/cloud platforms, expecting the same execution results.
-
-*Serverless Workflow Specification Goals*
- -For more information on the history, development and design rationale behind the specification, see the [Serverless Workflow Wiki](https://github.com/serverlessworkflow/specification/wiki). - -### Focus on standards - -
-*Serverless Workflow Specification Focus On Standards*
- -Serverless Workflow language takes advantage of well-established and known standards such as [CloudEvents](https://cloudevents.io/), [OpenAPI](https://www.openapis.org/) specifications, -[gRPC](https://grpc.io/) and [GraphQL](https://graphql.org/). - -## Project Components - -
-*Serverless Workflow Specification Overview*
-
-The specification has multiple components:
-
-* Definitions of the workflow language. This is defined via the [Workflow JSON Schema](schema/workflow.json). You can use both
-  [JSON](https://www.json.org/json-en.html) and [YAML](https://yaml.org/) formats to model your workflows.
-* Software Development Kits (SDKs) for [Go](https://github.com/serverlessworkflow/sdk-go), [Java](https://github.com/serverlessworkflow/sdk-java), [.NET](https://github.com/serverlessworkflow/sdk-net), [Typescript](https://github.com/serverlessworkflow/sdk-typescript) and [Python](https://github.com/serverlessworkflow/sdk-python), and we plan to add them for more languages in the future.
-* A set of [Workflow Extensions](extensions/README.md) which
-  allow users to define additional, non-execution-related workflow information. This information can be used to improve
-  workflow performance.
-  Some example workflow extensions include Key Performance Indicators (KPIs), Rate Limiting, Simulation, Tracing, etc.
-* A Technology Compatibility Kit (TCK) to be used as a specification conformance tool for runtime implementations.
-
-## Specification Details
-
-The following sections provide detailed descriptions of all parts of the Serverless Workflow language.
-
-### Core Concepts
-
-This section describes some of the core Serverless Workflow concepts:
-
-### Workflow Definition
-
-A workflow definition is a JSON or YAML file that conforms to the Serverless Workflow specification DSL.
-It consists of the core [Workflow Definition Structure](#Workflow-Definition-Structure)
-and the [Workflow Model](#Workflow-Model). It defines a blueprint used by runtimes for its execution.
-
-A business solution can be composed of any number of related workflow definitions.
-Their relationships are explicitly modeled with the Serverless Workflow language (for example
-by using [SubFlowRef Definition](#SubFlowRef-Definition) in actions).
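As a quick illustration of what such a definition looks like, here is a hedged minimal sketch. The workflow name, state, data values, and the `specVersion` are illustrative assumptions, not taken from this document:

```yaml
# A hypothetical minimal workflow definition (illustrative values)
id: greeting
version: '1.0.0'
specVersion: '0.8'        # assumed specification version
name: Greeting Workflow
start: Greet
states:
  - name: Greet
    type: inject          # an Inject State, as listed under Workflow States
    data:
      greeting: Hello, Serverless Workflow!
    end: true
```

The same definition could equivalently be written in JSON, since both formats are supported.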
-
-Runtimes can initialize workflow definitions for some particular set of data inputs or events.
-
-### Workflow Instance
-
-A workflow instance represents a single workflow execution corresponding to the instructions provided by a
-workflow definition. A workflow instance can be short or long-running. A single workflow instance
-should be isolated, meaning it should not share state and data with other workflow instances.
-Workflow instances should be able to communicate with each other via events.
-
-Depending on their workflow definition, workflow instances can be short-lived or
-can execute for days, weeks, or years.
-
-Each workflow instance should have a unique identifier, which should remain
-unchanged throughout its execution.
-
-Workflow instances can be started by providing some data input. This is described in detail in the
-[workflow data input](#Workflow-Data-Input) section.
-Workflow instances can also wait for events to start their execution, which is the case
-where a workflow definition contains an [EventState](#Event-State) starting workflow state.
-
-The workflow definition also explicitly defines when a workflow instance should be completed.
-By default, instances should be completed once there are no active workflow paths (all active
-paths reach a state containing the default [end definition](#End-Definition)),
-or if the defined [`workflowExecTimeout`](#Workflow-Timeouts) time is reached.
-Other ways, such as using the `terminate` property of the [end definition](#End-Definition) to terminate instance execution,
-or defining a [`workflowExecTimeout`](#Workflow-Timeouts) property, are also possible.
-
-For long-running workflow executions, you can utilize the `keepActive` workflow property, which
-provides more control as to when exactly to terminate workflow execution.
In cases where a workflow execution should be continued as a new one, the DSL also provides the `continueAs` property, which is described
-in detail in the [Continuing as a new Execution](#Continuing-as-a-new-Execution) section.
-
-### Workflow Model
-
-The Serverless Workflow language is composed of:
-
-* [Function definitions](#Function-Definition) - Reusable functions that can declare services that need to be invoked, or expressions to be evaluated.
-* [Event definitions](#Event-Definition) - Reusable declarations of events that need to be consumed to start or continue workflow instances, trigger function/service execution, or be produced during workflow execution.
-* [Retry definitions](#Retry-Definition) - Reusable retry definitions. Can specify retry strategies for service invocations during workflow execution.
-* [Timeout definitions](#Workflow-Timeouts) - Reusable timeout definitions. Can specify default workflow execution timeout, as well as workflow state, action, and branch execution timeouts.
-* [Error definitions](#Defining-Errors) - Reusable error definitions. Provide domain-specific error definitions which can be referenced in workflow states' error handling.
-* [State definitions](#Workflow-States) - Definition of states, the building blocks of workflow `control flow logic`. States can reference the reusable function, event, and retry definitions.
-
-### Workflow Data
-
-Serverless Workflow data is represented in [JSON](https://www.json.org/json-en.html) format.
-Data flow and execution logic go hand in hand, meaning as workflow execution follows the workflow definition
-logic, so does the workflow data:
-
-*Serverless Workflow Data Flow*
-
-The initial [Workflow data input](#Workflow-data-input) is passed to the workflow starting state as its data input.
-When a state finishes its execution, [its data output is passed as data input to the next state](#Information-passing-Between-States) that should be executed.
-
-When workflow execution ends, the last executed workflow state's data output becomes the final [Workflow data output](#Workflow-data-output).
-
-States can filter their data inputs and outputs using [State Data filters](#State-data-filters).
-
-States can also consume events as well as invoke services. These event payloads and service invocation results
-can be filtered using [Event data filters](#Event-data-filters) and [Action data filters](#Action-data-filters).
-
-Data filters use [workflow expressions](#Workflow-Expressions) for selecting and manipulating state data
-input and output, action inputs and results, and event payloads.
-
-Multiple filters can be combined to gain a high level of control over your workflow state data. You can find an example of that in
-[this](#Using-multiple-data-filters) section.
-
-Data from consumed events and action execution results are added/merged
-to state data. Reference the [data merging section](#Data-Merging) to learn about the merging rules that should be applied.
-
-#### Workflow Data Input
-
-The initial data input into a workflow instance must be a valid [JSON object](https://tools.ietf.org/html/rfc7159#section-4).
-If no input is provided, the default data input should be an empty JSON object:
-
-```json
-{ }
-```
-
-Workflow data input is passed to the workflow starting state as its data input.
-
-*Workflow data input*
- -#### Information Passing Between States - -States in a workflow can receive data (data input) and produce a data result (data output). The state's data input is typically the previous state's data output. -When a state completes its execution, its data output is passed to the state's data input it transitions to. -There are two rules to consider here: - -- If the state is the workflow starting state, its data input is the [workflow data input](#Workflow-data-input). -- When workflow execution ends, the data output of the last executed state becomes the [workflow data output](#Workflow-data-output). - -
-*Basic state data passing*
-
-#### Workflow data output
-
-Each workflow execution should produce a data output.
-The workflow data output is the data output of the last executed workflow state.
-
-#### State data filters
-
-| Parameter | Description | Type | Required |
-| --- | --- | --- | --- |
-| input | Workflow expression to filter the state's data input | string | no |
-| output | Workflow expression that filters the state's data output | string | no |
-
Click to view example definition -

- -```json -{ - "stateDataFilter": { - "input": "${ .orders }", - "output": "${ .provisionedOrders }" - } -} -``` - - - -```yaml -stateDataFilter: - input: "${ .orders }" - output: "${ .provisionedOrders }" -``` - -
- -

- -State data filters can be used to filter the state's data input and output. - -The state data filters `input` property expression is applied when the workflow transitions to the current state and receives its data input. -It can be used to select only data that is needed and disregard what is not needed. -If `input` is not defined or does not select any parts of the state's data input, its data input is not filtered. - -The state data filter `output` property expression is applied right before the state transitions to the next state defined. -It filters the state's data output to be passed as data input to the transitioning state. -If the current state is the workflow end state, the filtered state's data output becomes the workflow data output. -If `output` is not defined or does not select any parts of the state's data output, its data output is not filtered. - -Results of the `input` expression should become the state data input. -Results of the `output` expression should become the state data output. - -For more information on this you can reference the [data merging](#Data-Merging) section. - -Let's take a look at some examples of state filters. For our examples let's say the data input to our state is as follows: - -```json -{ - "fruits": [ "apple", "orange", "pear" ], - "vegetables": [ - { - "veggieName": "potato", - "veggieLike": true - }, - { - "veggieName": "broccoli", - "veggieLike": false - } - ] -} -``` - -For the first example, our state only cares about fruits data, and we want to disregard the vegetables. To do this -we can define a state filter: - -```json -{ - "stateDataFilter": { - "input": "${ {fruits: .fruits} }" - } -} -``` - -The state data output then would include only the fruits data: - -```json -{ - "fruits": [ "apple", "orange", "pear"] -} -``` - -

-State Data Filter Example -

-
-For our second example, let's say that we are only interested in the vegetables that are "veggie-like".
-Here we have two ways of filtering our data, depending on whether the actions within our state need access to all
-vegetables, or only to the ones that are "veggie-like".
-
-The first way would be to use both "input" and "output":
-
-```json
-{
-  "stateDataFilter": {
-    "input": "${ {vegetables: .vegetables} }",
-    "output": "${ {vegetables: [.vegetables[] | select(.veggieLike == true)]} }"
-  }
-}
-```
-
-The state's data input filter selects all the vegetables from the main data input. Once all actions have been performed, before the state transition
-or workflow execution completion (if this is an end state), the "output" of the state filter selects only the vegetables which are "veggie-like".
-

-State Data Filter Example -

- -The second way would be to directly filter only the "veggie like" vegetables with just the data input path: - -```json -{ - "stateDataFilter": { - "input": "${ {vegetables: [.vegetables[] | select(.veggieLike == true)]} }" - } -} -``` - -#### Action data filters - -| Parameter | Description | Type | Required | -| --- | --- | --- | --- | -| fromStateData | Workflow expression that filters state data that can be used by the action | string | no | -| useResults | If set to `false`, action data results are not added/merged to state data. In this case 'results' and 'toStateData' should be ignored. Default is `true`. | boolean | no | -| results | Workflow expression that filters the actions data results | string | no | -| toStateData | Workflow expression that selects a state data element to which the action results should be added/merged. If not specified denotes the top-level state data element. In case it is not specified and the result of the action is not an object, that result should be merged as the value of an automatically generated key. That key name will be the result of concatenating the action name with `-output` suffix. | string | no | - -
Click to view example definition -

- - - - - - - - - - -
JSONYAML
- -```json -{ - "actionDataFilter": { - "fromStateData": "${ .language }", - "results": "${ .results.greeting }", - "toStateData": "${ .finalgreeting }" - } -} -``` - - - -```yaml -actionDataFilter: - fromStateData: "${ .language }" - results: "${ .results.greeting }" - toStateData: "${ .finalgreeting }" -``` - -
- -

- -Action data filters can be used inside [Action definitions.](#Action-Definition) -Each action can define this filter which can: - -* Filter the state data to select only the data that can be used within function definition arguments using its `fromStateData` property. -* Filter the action results to select only the result data that should be added/merged back into the state data - using its `results` property. -* Select the part of state data which the action data results should be added/merged to - using the `toStateData` property. - -To give an example, let's say we have an action which returns a list of breads and pasta types. -For our workflow, we are only interested into breads and not the pasta. - -Action results: - -```json -{ - "breads": ["baguette", "brioche", "rye"], - "pasta": [ "penne", "spaghetti", "ravioli"] -} -``` - -We can use an action data filter to filter only the breads data: - -```json -{ -"actions":[ - { - "functionRef": "breadAndPastaTypesFunction", - "actionDataFilter": { - "results": "${ {breads: .breads} }" - } - } - ] -} -``` - -The `results` will filter the action results, which would then be: - -```json -{ - "breads": [ - "baguette", - "brioche", - "rye" - ] -} -``` - -Now let's take a look at a similar example (same expected action results) and assume our current state data is: - -```json -{ - "itemsToBuyAtStore": [ - ] -} -``` - -and have the following action definition: - -```json -{ -"actions":[ - { - "name": "fetch-items-to-buy", - "functionRef": "breadAndPastaTypesFunction", - "actionDataFilter": { - "results": "${ [ .breads[0], .pasta[1] ] }", - "toStateData": "${ .itemsToBuyAtStore }" - } - } - ] -} -``` - -In this case, our `results` select the first bread and the second element of the pasta array. -The `toStateData` expression then selects the `itemsToBuyAtStore` array of the state data to add/merge these results -into. 
With this, after our action executes the state data would be:
-
-```json
-{
-  "itemsToBuyAtStore": [
-    "baguette",
-    "spaghetti"
-  ]
-}
-```
-
-To illustrate the merging of a result that is not a JSON object, let's assume that, in the previous example, the action definition is as follows:
-
-```json
-"actions":[
-  {
-    "name": "fetch-only-pasta",
-    "functionRef": "breadAndPastaTypesFunction",
-    "actionDataFilter": {
-      "results": "${ .pasta[1] }"
-    }
-  }
-]
-```
-Since there is no `toStateData` attribute and the result is not a JSON object but a string, the resulting state data would be:
-
-```json
-{
-  "fetch-only-pasta-output": "spaghetti"
-}
-```
-In case action results should not be added/merged to state data, we can set the `useResults` property to `false`.
-In this case, the `results` and `toStateData` properties should be ignored, and nothing is added/merged to state data.
-If `useResults` is not specified (or its value is set to `true`), action results, if available, should be added/merged to state data.
-
-#### Event data filters
-
-| Parameter | Description | Type | Required |
-| --- | --- | --- | --- |
-| useData | If set to `false`, event payload is not added/merged to state data. In this case 'data' and 'toStateData' should be ignored. Default is `true`. | boolean | no |
-| data | Workflow expression that filters the event data (payload) | string | no |
-| toStateData | Workflow expression that selects a state data element to which the event payload should be added/merged into. If not specified denotes the top-level state data element | string | no |
-
-
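The result-merging behaviour described above can be sketched in a few lines of Python. This is illustrative only: it simulates `useResults`, `toStateData`, and the auto-generated `<action-name>-output` key, using plain assignment for the selected element rather than the full merging rules:

```python
def merge_action_results(state_data, action_name, results,
                         use_results=True, to_state_data_key=None):
    """Illustrative sketch of how actionDataFilter results reach state data.

    - use_results=False: nothing is merged ('results'/'toStateData' ignored)
    - to_state_data_key: the element selected by a `toStateData` expression
      (simplified here to plain assignment instead of the full merging rules)
    - a non-object result with no `toStateData` lands under "<action-name>-output"
    """
    if not use_results:
        return state_data
    if to_state_data_key is not None:
        state_data[to_state_data_key] = results
    elif isinstance(results, dict):
        state_data.update(results)  # merge at the top-level state data element
    else:
        state_data[action_name + "-output"] = results
    return state_data

state = {"itemsToBuyAtStore": []}
# "results": "${ [ .breads[0], .pasta[1] ] }", "toStateData": "${ .itemsToBuyAtStore }"
merge_action_results(state, "fetch-items-to-buy", ["baguette", "spaghetti"],
                     to_state_data_key="itemsToBuyAtStore")
print(state)  # {'itemsToBuyAtStore': ['baguette', 'spaghetti']}

# A plain string result with no toStateData: auto-generated "<action-name>-output" key
print(merge_action_results({}, "fetch-only-pasta", "spaghetti"))
# {'fetch-only-pasta-output': 'spaghetti'}
```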
Click to view example definition -

- - - - - - - - - - -
JSONYAML
- -```json -{ - "eventDataFilter": { - "data": "${ .data.results }" - } -} -``` - - - -```yaml -eventDataFilter: - data: "${ .data.results }" -``` - -
- -

- -Event data filters can be used to filter consumed event payloads. -They can be used to: - -* Filter the event payload to select only the data that should be added/merged into the state data - using its `data` property. -* Select the part of state data into which the event payload should be added/merged into - using the `toStateData` property. - -Allows event data to be filtered and added to or merged with the state data. All events have to be in the CloudEvents format -and event data filters can filter both context attributes and the event payload (data) using the `data` property. - -Here is an example using an event filter: - -

-Event Data Filter Example -

-
-Note that the data input to the Event data filters depends on the `dataOnly` property of the associated [Event definition](#Event-Definition).
-If this property is not defined (it has a default value of `true`), Event data filter expressions are evaluated against the event payload (the CloudEvents `data` attribute only). If it is set to
-`false`, the expressions should be evaluated against the entire CloudEvent (including its context attributes).
-
-In case event data/payload should not be added/merged to state data, we can set the `useData` property to `false`.
-In this case, the `data` and `toStateData` properties should be ignored, and nothing is added/merged to state data.
-If `useData` is not specified (or its value is set to `true`), event payload, if available, should be added/merged to state data.
-
-
-#### Using multiple data filters
-
-[Event states](#Event-State) can take advantage of all defined data filters. In the example below, we define
-a workflow with a single event state and show how data filters can be combined.
-
-```json
-{
-  "name": "greet-customers-workflow",
-  "description": "Greet Customers when they arrive",
-  "version": "1.0.0",
-  "specVersion": "0.8",
-  "start": "wait-for-customer-to-arrive",
-  "states":[
-    {
-      "name": "wait-for-customer-to-arrive",
-      "type": "event",
-      "onEvents": [{
-        "eventRefs": ["customer-arrives-event"],
-        "eventDataFilter": {
-          "data": "${ .customer }",
-          "toStateData": "${ .customerInfo }"
-        },
-        "actions":[
-          {
-            "name": "greet-customer",
-            "functionRef": {
-              "refName": "greeting-function",
-              "arguments": {
-                "greeting": "${ .hello.spanish } ",
-                "customerName": "${ .customerInfo.name } "
-              }
-            },
-            "actionDataFilter": {
-              "fromStateData": "${ { hello, customerInfo } }",
-              "results": "${ .greetingMessageResult }",
-              "toStateData": "${ .finalCustomerGreeting }"
-            }
-          }
-        ]
-      }],
-      "stateDataFilter": {
-        "input": "${ .greetings } ",
-        "output": "${ { finalCustomerGreeting } }"
-      },
-      "end": true
-    }
-  ],
-  "events": [{
-    "name": "customer-arrives-event",
-    "type": "customer-arrival-type",
-    "source": "customer-arrival-event-source"
-  }],
-  "functions": [{
-    "name": "greeting-function",
-    "operation": "http://my.api.org/myapi.json#greeting"
-  }]
-}
-```
-
-The workflow data input when starting workflow execution is assumed to include greetings in different languages:
-
-```json
-{
-  "greetings": {
-    "hello": {
-      "english": "Hello",
-      "spanish": "Hola",
-      "german": "Hallo",
-      "russian": "Здравствуйте"
-    },
-    "goodbye": {
-      "english": "Goodbye",
-      "spanish": "Adiós",
-      "german": "Auf Wiedersehen",
-      "russian": "Прощай"
-    }
-  }
-}
-```
-
-The workflow data input then becomes the data input of the starting workflow state.
-
-We also assume for this example that the CloudEvent that our event state consumes includes the data (payload):
-
-```json
-{
-  "customer": {
-    "name": "John Michaels",
-    "address": "111 Some Street, SomeCity, SomeCountry",
-    "age": 40
-  }
-}
-```
-
-Here is a sample diagram showing our workflow; each numbered step on this diagram shows a certain defined point during
-workflow execution at which data filters are invoked, and corresponds to the numbered items below.
-

-Using Multiple Filters Example
-

-
-**(1) Workflow execution starts**: Workflow data is passed to our "wait-for-customer-to-arrive" event state as data input.
-Workflow executes its starting state, namely the "wait-for-customer-to-arrive" event state.
-
-The event state **stateDataFilter** is invoked to filter its data input. The filter's "input" expression is evaluated and
-selects only the "greetings" data. The rest of the state data input should be disregarded.
-
-At this point our state data should be:
-
-```json
-{
-  "hello": {
-    "english": "Hello",
-    "spanish": "Hola",
-    "german": "Hallo",
-    "russian": "Здравствуйте"
-  },
-  "goodbye": {
-    "english": "Goodbye",
-    "spanish": "Adiós",
-    "german": "Auf Wiedersehen",
-    "russian": "Прощай"
-  }
-}
-```
-
-**(2) CloudEvent of type "customer-arrival-type" is consumed**: Once the event is consumed, the "eventDataFilter" is triggered.
-Its "data" expression selects the "customer" object from the event's data. The "toStateData" expression
-says that we should add/merge this selected event data to the state data in its "customerInfo" property. If this property
-exists it should be merged; if it does not exist, it should be created.
-
-At this point our state data contains:
-
-```json
-{
-  "hello": {
-    "english": "Hello",
-    "spanish": "Hola",
-    "german": "Hallo",
-    "russian": "Здравствуйте"
-  },
-  "goodbye": {
-    "english": "Goodbye",
-    "spanish": "Adiós",
-    "german": "Auf Wiedersehen",
-    "russian": "Прощай"
-  },
-  "customerInfo": {
-    "name": "John Michaels",
-    "address": "111 Some Street, SomeCity, SomeCountry",
-    "age": 40
-  }
-}
-```
-
-**(3) Event state performs its actions**:
-Before the first action is executed, its actionDataFilter is invoked. Its "fromStateData" expression filters
-the current state data to select the data that should be available to action arguments. In this example
-it selects the "hello" and "customerInfo" properties from the current state data.
-At this point the action is executed.
-We assume that for this example "greeting-function" returns:
-
-```json
-{
-  "execInfo": {
-    "execTime": "10ms",
-    "failures": false
-  },
-  "greetingMessageResult": "Hola John Michaels!"
-}
-```
-
-After the action is executed, the actionDataFilter "results" expression is evaluated to filter the results returned from the action execution. In this case, we select only the "greetingMessageResult" element from the results.
-
-The action filter's "toStateData" expression then defines that we want to add/merge this action result to
-state data under the "finalCustomerGreeting" element.
-
-At this point, our state data contains:
-
-```json
-{
-  "hello": {
-    "english": "Hello",
-    "spanish": "Hola",
-    "german": "Hallo",
-    "russian": "Здравствуйте"
-  },
-  "goodbye": {
-    "english": "Goodbye",
-    "spanish": "Adiós",
-    "german": "Auf Wiedersehen",
-    "russian": "Прощай"
-  },
-  "customerInfo": {
-    "name": "John Michaels",
-    "address": "111 Some Street, SomeCity, SomeCountry",
-    "age": 40
-  },
-  "finalCustomerGreeting": "Hola John Michaels!"
-}
-```
-
-**(4) Event State Completes Execution**:
-
-When our event state finishes its execution, the state's "stateDataFilter" "output" filter expression is executed
-to filter the state data and create the final state data output.
-
-Because our event state is also an end state, its data output becomes the final [workflow data output](#Workflow-data-output). Namely:
-
-```json
-{
-  "finalCustomerGreeting": "Hola John Michaels!"
-}
-```
-
-#### Data Merging
-
-Consumed event data (payload) and action execution results should be merged into the state data.
-Event and action data filters can be used to give more details about this operation.
-
-By default, with no data filters specified, when an event is consumed, its entire data section (payload) should be merged
-into the state data. Merging should be applied to the entire state data JSON element.
-
-In case of event and action filters, their "toStateData" property can be defined to select a specific element
-of the state data against which merging should be done. If this element does not exist, a new one should
-be created first.
-
-When merging, the state data element and the data (payload)/action result should have the same type, meaning
-that you should not merge arrays with objects, or objects with arrays, etc.
-
-Merging elements of type object should be done by inserting all the key-value pairs from both objects into
-a single combined object. If both objects contain a value for the same key, the value from the event data/action results
-should "win". To give an example, let's say we have the following state data:
-
-```json
-{
-  "customer": {
-    "name": "John",
-    "address": "1234 street",
-    "zip": "12345"
-  }
-}
-```
-
-and we have the following event payload that needs to be merged into the state data:
-
-```json
-{
-  "customer": {
-    "name": "John",
-    "zip": "54321"
-  }
-}
-```
-
-After merging, the state data should be:
-
-```json
-{
-  "customer": {
-    "name": "John",
-    "address": "1234 street",
-    "zip": "54321"
-  }
-}
-```
-
-Merging array types should be done by concatenating them into a larger array including unique elements of both arrays.
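These merging rules (objects combined key by key with the incoming data winning, arrays concatenated keeping unique elements, scalars overwritten) can be sketched in Python. This is an illustrative, non-normative helper; in particular, descending recursively into same-key objects is an assumption consistent with the customer example:

```python
def merge(state_element, incoming):
    """Illustrative sketch of the merging rules (assumes both sides have the
    same type, as the text above requires).

    - objects: combined key by key, the event data / action results winning;
      the recursive descent into same-key objects is an assumption consistent
      with the customer example
    - arrays: concatenated, keeping unique elements of both
    - numbers, strings: the incoming data overwrites the state data
    """
    if isinstance(state_element, dict) and isinstance(incoming, dict):
        combined = dict(state_element)
        for key, value in incoming.items():
            combined[key] = merge(combined[key], value) if key in combined else value
        return combined
    if isinstance(state_element, list) and isinstance(incoming, list):
        return state_element + [item for item in incoming if item not in state_element]
    return incoming

state = {"customer": {"name": "John", "address": "1234 street", "zip": "12345"}}
event = {"customer": {"name": "John", "zip": "54321"}}
print(merge(state, event))
# {'customer': {'name': 'John', 'address': '1234 street', 'zip': '54321'}}
```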
-To give an example, merging: - -```json -{ - "customers": [ - { - "name": "John", - "address": "1234 street", - "zip": "12345" - }, - { - "name": "Jane", - "address": "4321 street", - "zip": "54321" - } - ] -} -``` - -into state data: - -```json -{ - "customers": [ - { - "name": "Michael", - "address": "6789 street", - "zip": "6789" - } - ] -} -``` - -should produce state data: - -```json -{ - "customers": [ - { - "name": "Michael", - "address": "6789 street", - "zip": "6789" - }, - { - "name": "John", - "address": "1234 street", - "zip": "12345" - }, - { - "name": "Jane", - "address": "4321 street", - "zip": "54321" - } - ] -} -``` - -Merging number types should be done by overwriting the data from events data/action results into the merging element of the state data. -For example merging action results: - -```json -{ - "age": 30 -} -``` - -into state data: - -```json -{ - "age": 20 -} -``` - -would produce state data: - -```json -{ - "age": 30 -} -``` - -Merging string types should be done by overwriting the data from events data/action results into the merging element of the state data. - -### Workflow Functions - -Workflow [functions](#Function-Definition) are reusable definitions for service invocations and/or expression evaluation. -They can be referenced by their domain-specific names inside workflow [states](#Workflow-States). 
-
-Reference the following sections to learn more about workflow functions:
-
-* [Using functions for OpenAPI Service invocations](#using-functions-for-openapi-service-invocations)
-* [Using functions for HTTP Service Invocations](#using-functions-for-http-service-invocations)
-* [Using functions for Async API Service Invocations](#Using-Functions-for-Async-API-Service-Invocations)
-* [Using functions for gRPC service invocation](#Using-Functions-For-RPC-Service-Invocations)
-* [Using functions for GraphQL service invocation](#Using-Functions-For-GraphQL-Service-Invocations)
-* [Using Functions for OData Service Invocations](#Using-Functions-for-OData-Service-Invocations)
-* [Using functions for expression evaluations](#Using-Functions-For-Expression-Evaluation)
-* [Defining custom function types](#defining-custom-function-types)
-
-We can define whether functions are invoked synchronously or asynchronously. Reference
-the [functionRef](#FunctionRef-Definition) to learn more on how to do this.
-
-#### Using Functions for OpenAPI Service Invocations
-
-[Functions](#Function-Definition) can be used to describe services and their operations that need to be invoked during
-workflow execution. They can be referenced by states' [action definitions](#Action-Definition) to clearly
-define when the service operations should be invoked during workflow execution, as well as the data parameters
-passed to them if needed.
-
-Note that with Serverless Workflow, we can also define invocation of services which are triggered via an event.
-To learn more about that, please reference the [event definitions](#Event-Definition) section,
-as well as the [actions definitions](#Action-Definition) [eventRef](#EventRef-Definition) property.
-
-Because of an overall lack of a common way to describe different services and their operations,
-many workflow languages typically choose to define custom function definitions.
-This approach, however, often runs into issues such as lack of portability and limited capabilities, as well as
-forcing non-workflow-specific information, such as service authentication, to be added inside the workflow language.
-
-To avoid these issues, the Serverless Workflow specification mandates that details about
-RESTful services and their operations be described using the [OpenAPI Specification](https://www.openapis.org/).
-OpenAPI is a language-agnostic standard that describes RESTful services and enables their discovery.
-This allows the Serverless Workflow language to describe RESTful services in a portable
-way, as well as workflow runtimes to utilize OpenAPI tooling and APIs to invoke service operations.
-
-Here is an example function definition for a RESTful service operation.
-
-```json
-{
-"functions": [
-  {
-    "name": "send-order-confirmation",
-    "operation": "file://confirmationapi.json#sendOrderConfirmation"
-  }
-]
-}
-```
-
-It can, as previously mentioned, be referenced during workflow execution when the invocation of this service is desired.
-For example:
-
-```json
-{
-"states": [
-  {
-    "name":"send-confirm-state",
-    "type":"operation",
-    "actions":[
-      {
-        "functionRef": "send-order-confirmation"
-      }],
-    "end": true
-  }]
-}
-```
-
-Note that the referenced function definition type in this case must be `openapi` (the default type).
-
-The specification also supports describing OpenAPI for REST invocations inline in the [functions definition](#Function-Definition) using the [OpenAPI Paths Object](https://spec.openapis.org/oas/v3.1.0#paths-object).
- -Here is an example function definition for REST requests with method `GET` and request target corresponding with [URI Template](https://www.rfc-editor.org/rfc/rfc6570.html) `/users/{id}`: - -```json -{ - "functions":[ - { - "name":"queryUserById", - "operation": { - "/users": { - "get": { - "parameters": [{ - "name": "id", - "in": "path", - "required": true - }] - } - } - }, - "type":"openapi" - } - ] -} -``` - -Note that the [Function Definition](#Function-Definition)'s `operation` property must follow the [OpenAPI Paths Object](https://spec.openapis.org/oas/v3.1.0#paths-object) specification definition. - -The function can be referenced during workflow execution when the invocation of the REST service is desired. For example: - -```json -{ - "states":[ - { - "name":"QueryUserInfo", - "type":"operation", - "actions":[ - { - "functionRef":"queryUserById", - "arguments":{ - "id":"${ .user.id }" - } - } - ], - "end":true - } - ] -} -``` - -Example of the `POST` request sending the state data as part of the body: - -```json -{ - "functions":[ - { - "name": "createUser", - "type": "openapi", - "operation": { - "/users": { - "post": { - "requestBody": { - "content": { - "application/json": { - "schema": { - "type": "object", - "properties": { - "id": { - "type": "string" - }, - "name": { - "type": "string" - }, - "email": { - "type": "string" - } - }, - "required": ["name", "email"] - } - } - } - } - } - } - } - } - ] -} -``` - -Note that the `requestBody` [`content` attribute](https://spec.openapis.org/oas/v3.1.0#fixed-fields-10) is described inline rather than a reference to an external document. - -You can reference the `createUser` function and filter the input data to invoke it. 
Given the workflow input data:
-
-```json
-{
-  "order":{
-    "id":"1234N",
-    "products":[
-      {
-        "name":"Product 1"
-      }
-    ]
-  },
-  "user":{
-    "name":"John Doe",
-    "email":"john@doe.com"
-  }
-}
-```
-
-Function invocation example:
-
-```json
-{
-  "states":[
-    {
-      "name":"CreateNewUser",
-      "type":"operation",
-      "actions":[
-        {
-          "functionRef":"createUser",
-          "actionDataFilter":{
-            "fromStateData":"${ .user }",
-            "toStateData":"${ .user.id }"
-          }
-        }
-      ],
-      "end":true
-    }
-  ]
-}
-```
-
-In this case, only the contents of the `user` attribute will be passed to the function. The user ID returned in the REST response body will then be added to the state data:
-
-```json
-{
-  "order":{
-    "id":"1234N",
-    "products":[
-      {
-        "name":"Product 1"
-      }
-    ]
-  },
-  "user":{
-    "id":"5678U",
-    "name":"John Doe",
-    "email":"john@doe.com"
-  }
-}
-```
-
-When inlining the OpenAPI operation, the specification does not support the [Security Requirement Object](https://spec.openapis.org/oas/v3.1.0#security-requirement-object), since it is redundant with the function's [Auth Definition](#Auth-Definition). If provided, this field is ignored.
-
-For more information about functions, reference the [Functions definitions](#Function-Definition) section.
-
-#### Using functions for HTTP Service Invocations
-
-The HTTP function can make HTTP requests to a given endpoint. It can be used in cases where a service doesn't have an OpenAPI definition, or where users require a simple, curl-style HTTP invocation.
-
-The table below lists the `operation` properties for the `http` function type.
-
-| Property | Description | Type | Required |
-| --- | --- | --- | --- |
-| uri | The URI where to send the request | String | yes |
-| method | The HTTP method according to [RFC 2616](https://datatracker.ietf.org/doc/html/rfc2616#page-36) | String | yes |
-| headers | Headers to send in the HTTP call. The `Content-Type` header mandates the body conversion. | Map | no |
-| cookies | Cookies to send in the HTTP call. | Map | no |
-
-Note that in the function definition, these values are static. When invoking the function in the `actions` definition, `jq` can be used to set the attribute values.
-
-Here is a function definition example for an HTTP service operation.
-
-```json
-{
-"functions": [
-  {
-    "name": "getPetById",
-    "type": "http",
-    "operation": {
-      "method": "GET",
-      "uri": "https://petstore.swagger.io/v2/pet/{petId}"
-    }
-  }
-]
-}
-```
-
-This function can be used later in the workflow definition:
-
-```json
-{
-  "states":[
-    {
-      "name": "getpet",
-      "type": "operation",
-      "actions":[
-        {
-          "functionRef": "getPetById",
-          "arguments":{
-            "petId": "${ .pet.id }"
-          }
-        }
-      ],
-      "end":true
-    }
-  ]
-}
-```
-
-Note that the `arguments` attribute must match the template in the `uri` definition so the underlying engine can map the arguments correctly.
-
-The `arguments` attribute accepts the following reserved properties when calling an HTTP function type:
-
-| Property | Description | Type | Required |
-| --- | --- | --- | --- |
-| body | The HTTP body. If an object, it will be sent as a JSON payload by default if the `Content-Type` header is missing. Otherwise, it will be converted based on the `Content-Type` header definition | Object or String | no |
-| headers | Headers to send in the HTTP call. The `Content-Type` header mandates the body conversion. | Map | no |
-| cookies | Cookies to send in the HTTP call. | Map | no |
-
-These attributes are merged with the ones in the function definition.
-
-The listing below exemplifies how to define and call an HTTP POST endpoint.
-
-```json
-{
-  "functions": [
-    {
-      "name": "createPet",
-      "type": "http",
-      "operation": {
-        "method": "POST",
-        "uri": "https://petstore.swagger.io/v2/pet/",
-        "headers": {
-          "Content-Type": "application/json"
-        }
-      }
-    }
-  ],
-  "states":[
-    {
-      "name":"create-pet",
-      "type":"operation",
-      "actions":[
-        {
-          "functionRef":"createPet",
-          "arguments":{
-            "body": {
-              "name": "Lulu"
-            },
-            "headers": {
-              "my-header": "my-value"
-            }
-          }
-        }
-      ],
-      "end":true
-    }
-  ]
-}
-```
-
-#### Using Functions for Async API Service Invocations
-
-[Functions](#Function-Definition) can be used to invoke PUBLISH and SUBSCRIBE operations on a message broker documented by the [Async API Specification](https://www.asyncapi.com/docs/specifications/v2.1.0).
-[Async API operations](https://www.asyncapi.com/docs/specifications/v2.1.0#operationObject) are bound to a [channel](https://www.asyncapi.com/docs/specifications/v2.1.0#definitionsChannel) which describes the technology, security mechanisms, input and validation to be used for their execution.
-
-Let's take a look at a hypothetical Async API document (assume it is stored locally with the file name `streetlightsapi.yaml`) that defines a single publish operation:
-
-```yaml
-asyncapi: 2.1.0
-info:
-  title: Streetlights API
-  version: 1.0.0
-  description: |
-    The Smartylighting Streetlights API allows you
-    to remotely manage the city lights.
-  license:
-    name: Apache 2.0
-    url: https://www.apache.org/licenses/LICENSE-2.0
-servers:
-  mosquitto:
-    url: mqtt://test.mosquitto.org
-    protocol: mqtt
-channels:
-  light/measured:
-    publish:
-      summary: Inform about environmental lighting conditions for a particular streetlight.
-      operationId: onLightMeasured
-      message:
-        name: LightMeasured
-        payload:
-          type: object
-          properties:
-            id:
-              type: integer
-              minimum: 0
-              description: Id of the streetlight.
-            lumens:
-              type: integer
-              minimum: 0
-              description: Light intensity measured in lumens.
-            sentAt:
-              type: string
-              format: date-time
-              description: Date and time when the message was sent.
-```
-
-To define a workflow action invocation, we can then use the following workflow [Function Definition](#Function-Definition) and set the `operation` to `onLightMeasured`:
-
-```json
-{
-  "functions": [
-    {
-      "name": "publish-light-measurements",
-      "operation": "file://streetlightsapi.yaml#onLightMeasured",
-      "type": "asyncapi"
-    }]
-}
-```
-
-Note that the [Function Definition](#Function-Definition)'s `operation` property must have the following format:
-
-```text
-<path_to_asyncapi_document>#<operationId>
-```
-
-Also note that the referenced function definition type in this case must have the value `asyncapi`.
-
-Our defined function definition can then be referenced in a workflow [action](#Action-Definition), for example:
-
-```json
-{
-  "name": "publish-measurements",
-  "type": "operation",
-  "actions":[
-    {
-      "name": "publish-light-measurements",
-      "functionRef":{
-        "refName": "publish-light-measurements",
-        "arguments":{
-          "id": "${ .currentLight.id }",
-          "lumens": "${ .currentLight.lumens }",
-          "sentAt": "${ now }"
-        }
-      }
-    }
-  ]
-}
-```
-
-#### Using Functions for RPC Service Invocations
-
-Similar to defining invocations of operations on RESTful services, you can also use the workflow
-[functions definitions](#Function-Definition) that follow the remote procedure call (RPC) protocol.
-For RPC invocations, the Serverless Workflow specification mandates that they are described using [gRPC](https://grpc.io/),
-a widely used RPC system.
-gRPC uses [Protocol Buffers](https://developers.google.com/protocol-buffers/docs/overview) to define messages, services,
-and the methods on those services that can be invoked.
-
-Let's look at an example of invoking a service method using RPC.
For this example let's say we have the following
-gRPC protocol buffer definition in a myuserservice.proto file:
-
-```text
-service UserService {
-    rpc AddUser(User) returns (google.protobuf.Empty) {
-        option (google.api.http) = {
-            post: "/api/v1/users"
-            body: "*"
-        };
-    }
-    rpc ListUsers(ListUsersRequest) returns (stream User) {
-        option (google.api.http) = {
-            get: "/api/v1/users"
-        };
-    }
-    rpc ListUsersByRole(UserRole) returns (stream User) {
-        option (google.api.http) = {
-            get: "/api/v1/users/role"
-        };
-    }
-    rpc UpdateUser(UpdateUserRequest) returns (User) {
-        option (google.api.http) = {
-            patch: "/api/v1/users/{user.id}"
-            body: "*"
-        };
-    }
-}
-```
-
-In our workflow definition, we can then use function definitions:
-
-```json
-{
-"functions": [
-  {
-    "name": "list-users",
-    "operation": "file://myuserservice.proto#UserService#ListUsers",
-    "type": "rpc"
-  }
-]
-}
-```
-
-Note that the `operation` property has the following format:
-
-```text
-<path_to_proto_file>#<service_name>#<service_method>
-```
-
-Note that the referenced function definition type in this case must be `rpc`.
-
-For more information about functions, reference the [Functions definitions](#Function-Definition) section.
-
-#### Using Functions for GraphQL Service Invocations
-
-If you want to use GraphQL services, you can also invoke them using a similar syntax to the above methods.
-
-We'll use the following [GraphQL schema definition](https://graphql.org/learn/schema/) to show how that would work with both a query and a mutation:
-
-```graphql
-type Query {
-    pets: [Pet]
-    pet(id: Int!): Pet
-}
-
-type Mutation {
-    createPet(pet: PetInput!): Pet
-}
-
-type Treat {
-    id: Int!
-}
-
-type Pet {
-    id: Int!
-    name: String!
-    favoriteTreat: Treat
-}
-
-input PetInput {
-    id: Int!
-    name: String!
-    favoriteTreatId: Int
-}
-```
-
-##### Invoking a GraphQL Query
-
-In our workflow definition, we can then use a function definition for the `pet` query field as such:
-
-```json
-{
-  "functions": [
-    {
-      "name": "get-one-pet",
-      "operation": "https://example.com/pets/graphql#query#pet",
-      "type": "graphql"
-    }
-  ]
-}
-```
-
-Note that the `operation` property has the following format for the `graphql` type:
-
-```text
-<url_to_graphql_endpoint>#<literal "mutation" or "query">#<query_or_mutation_name>
-```
-
-In order to invoke this query, we would use the following `functionRef` parameters:
-
-```json
-{
-  "refName": "get-one-pet",
-  "arguments": {
-    "id": 42
-  },
-  "selectionSet": "{ id, name, favoriteTreat { id } }"
-}
-```
-
-Which would return the following result:
-
-```json
-{
-  "pet": {
-    "id": 42,
-    "name": "Snuffles",
-    "favoriteTreat": {
-      "id": 9001
-    }
-  }
-}
-```
-
-##### Invoking a GraphQL Mutation
-
-Likewise, we would use the following function definition:
-
-```json
-{
-  "functions": [
-    {
-      "name": "create-pet",
-      "operation": "https://example.com/pets/graphql#mutation#createPet",
-      "type": "graphql"
-    }
-  ]
-}
-```
-
-With the parameters for the `functionRef`:
-
-```json
-{
-  "refName": "create-pet",
-  "arguments": {
-    "pet": {
-      "id": 43,
-      "name":"Sadaharu",
-      "favoriteTreatId": 9001
-    }
-  },
-  "selectionSet": "{ id, name, favoriteTreat { id } }"
-}
-```
-
-Which would execute the mutation, creating the object and returning the following data:
-
-```json
-{
-  "pet": {
-    "id": 43,
-    "name": "Sadaharu",
-    "favoriteTreat": {
-      "id": 9001
-    }
-  }
-}
-```
-
-Note that you can include [expressions](#Workflow-Expressions) in both `arguments` and `selectionSet`:
-
-```json
-{
-  "refName": "get-one-pet",
-  "arguments": {
-    "id": "${ .petId }"
-  },
-  "selectionSet": "{ id, name, age(useDogYears: ${ .isPetADog }) { dateOfBirth, years } }"
-}
-```
-
-Expressions must be evaluated before executing the operation.
-
-Note that GraphQL Subscriptions are not supported at this time.
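To make the relationship between `operation`, `arguments`, and `selectionSet` concrete, here is a hypothetical Python helper (`build_graphql_operation` is not part of the specification, and a real runtime would use a proper GraphQL client) that composes the GraphQL document a runtime could send for the query above; it only handles scalar arguments:

```python
def build_graphql_operation(kind, field, arguments, selection_set):
    """Compose a GraphQL document from functionRef-style parameters.

    Hypothetical, illustrative helper: `kind` comes from the second segment of
    the `operation` property ("query" or "mutation"), `field` from the third;
    nested object arguments (like the mutation's PetInput) are not handled.
    """
    def fmt(value):
        # Quote strings; render numbers and booleans as-is (simplified).
        return f'"{value}"' if isinstance(value, str) else str(value)

    rendered = ", ".join(f"{name}: {fmt(value)}" for name, value in arguments.items())
    return f"{kind} {{ {field}({rendered}) {selection_set} }}"

document = build_graphql_operation("query", "pet", {"id": 42},
                                   "{ id, name, favoriteTreat { id } }")
print(document)  # query { pet(id: 42) { id, name, favoriteTreat { id } } }
```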

For more information about functions, reference the [Functions definitions](#Function-Definition) section.

#### Using Functions for OData Service Invocations

Similar to defining invocations of operations on GraphQL services, you can also use workflow
[Functions Definitions](#Function-Definition) to execute complex queries on an [OData](https://www.odata.org/documentation/) service.

##### Creating an OData Function Definition

We start off by creating a workflow [Function Definition](#Function-Definition). For example:

```json
{
"functions": [
  {
    "name": "query-persons",
    "operation": "https://services.odata.org/V3/OData/OData.svc#Persons",
    "type": "odata"
  }
]
}
```

Note that the `operation` property must follow the following format:

```text
<URI_to_odata_service>#<Entity_Set_Name>
```

##### Invoking an OData Function Definition

In order to invoke the defined [OData](https://www.odata.org/documentation/) function,
simply reference it in a workflow [Action Definition](#Action-Definition) and set its function arguments. For example:

```json
{
  "refName": "query-persons",
  "arguments": {
    "queryOptions": {
      "expand": "PersonDetail/Person",
      "select": "Id, PersonDetail/Person/Name",
      "top": 5,
      "orderby": "PersonDetail/Person/Name"
    }
  }
}
```

In order to ensure compatibility of OData support across runtimes,
the `arguments` property of an [OData](https://www.odata.org/documentation/) function reference
should follow the Serverless Workflow [OData JSON schema](https://github.com/serverlessworkflow/specification/tree/main/schema/odata.json).

#### Using Functions for Expression Evaluation

In addition to defining RESTful, AsyncAPI, RPC, GraphQL and OData services and their operations, workflow [functions definitions](#Function-Definition)
can also be used to define expressions that should be evaluated during workflow execution.

Defining expressions as part of function definitions has the benefit of being able to reference
them by their logical name through workflow states where expression evaluation is required.

Expression functions must declare their `type` parameter to be `expression`.

Let's take a look at an example of such definitions:

```json
{
"functions": [
  {
    "name": "is-adult",
    "operation": ".applicant | .age >= 18",
    "type": "expression"
  },
  {
    "name": "is-minor",
    "operation": ".applicant | .age < 18",
    "type": "expression"
  }
]
}
```

Here we define two reusable expression functions. Expressions in Serverless Workflow
can be evaluated against the workflow, or workflow state data. Note that different data filters play a big role as to which parts of the
workflow data are being evaluated by the expressions. Reference the
[State Data Filters](#State-data-filters) section for more information on this.

Our expression function definitions can now be referenced by workflow states when they need to be evaluated. For example:

```json
{
"states": [
  {
    "name": "check-applicant",
    "type": "switch",
    "dataConditions": [
      {
        "name": "applicant-is-adult",
        "condition": "${ fn:is-adult }",
        "transition": "approve-application"
      },
      {
        "name": "applicant-is-minor",
        "condition": "${ fn:is-minor }",
        "transition": "reject-application"
      }
    ],
    "defaultCondition": {
      "transition": "reject-application"
    }
  }
]
}
```

Our expression functions can also be referenced and executed as part of state [action](#Action-Definition) execution.
Let's say we have the following workflow definition:

```json
{
  "name": "simpleadd",
  "functions": [
    {
      "name": "increment-count-function",
      "type": "expression",
      "operation": ".count += 1 | .count"
    }
  ],
  "start": "initialize-count",
  "states": [
    {
      "name": "initialize-count",
      "type": "inject",
      "data": {
        "count": 0
      },
      "transition": "increment-count"
    },
    {
      "name": "increment-count",
      "type": "operation",
      "actions": [
        {
          "functionRef": "increment-count-function",
          "actionDataFilter": {
            "toStateData": "${ .count }"
          }
        }
      ],
      "end": true
    }
  ]
}
```

The starting [inject state](#Inject-State) "initialize-count" injects the count element into our state data,
which then becomes the state data input of our "increment-count" [operation state](#Operation-State).
This state defines an invocation of the "increment-count-function" expression function defined in our workflow definition.

This triggers the evaluation of the defined expression. The input of this expression is by default the current state data.
Just like "rest" and "rpc" type functions, expression functions also produce a result. In this case,
the result of the expression is just the number 1.
The actions filter then assigns this result to the state data element "count" and the state data becomes:

```json
{
  "count": 1
}
```

Note that the used function definition type in this case must be `expression`.

For more information about functions, reference the [Functions definitions](#Function-Definition) section.

For more information about workflow expressions, reference the [Workflow Expressions](#Workflow-Expressions) section.

#### Defining custom function types

The [function definitions](#function-definition) `type` property defines the list of function types that are set by
the specification.

Some runtime implementations might support additional function types that extend the ones
defined in the specification.
In those cases, you can define a custom function type, for example:

```json
{
"functions": [
  {
    "name": "send-order-confirmation",
    "operation": "/path/to/my/script/order.ts#myFunction",
    "type": "custom"
  }
]
}
```

In this example we define a custom function type that is meant to execute an external [TypeScript](https://www.typescriptlang.org/) script.

When a custom function type is specified, the `operation` property value has a **custom format**, meaning that
its format is controlled by the runtime which provides the custom function type.

Later, the function can be used in an action just like any other function supported by the specification:

```json
{
  "states": [{
    "name": "handle-order",
    "type": "operation",
    "actions": [
      {
        "name": "send-order-confirmation",
        "functionRef": {
          "refName": "send-order-confirmation",
          "arguments": {
            "order": "${ .order }"
          }
        }
      }
    ],
    "transition": "email-customer"
  }]
}
```

Note that custom function types are not portable across runtimes.

### Workflow Expressions

Workflow model parameters can use expressions to select/manipulate workflow and/or state data.

Note that different data filters play a big role as to which parts of the state's data are to be used when the expression is
evaluated. Reference the
[State Data Filtering](#State-data-filters) section for more information about state data filters.

By default, all workflow expressions should be defined using the [jq](https://stedolan.github.io/jq/) [version 1.6](https://github.com/stedolan/jq/releases/tag/jq-1.6) syntax.
You can find more information on jq in its [manual](https://stedolan.github.io/jq/manual/).

Serverless Workflow does not mandate the use of jq, and it's possible to use an expression language
of your choice, with the restriction that a single one must be used for all expressions
in a workflow definition.
If a different expression language needs to be used, make sure to set the workflow
`expressionLang` property to identify it to runtime implementations.

Note that using a non-default expression language could lower the portability of your workflow definitions
across multiple container/cloud platforms.

All workflow expressions in this document, the [specification examples](examples/README.md), as well as the [comparisons examples](comparisons/README.md)
are written using the default jq syntax.

Workflow expressions have the following format:

```text
${ expression }
```

Where `expression` can be either an in-line expression, or a reference to a
defined [expression function definition](#Using-Functions-For-Expression-Evaluation).

To reference a defined [expression function definition](#Using-Functions-For-Expression-Evaluation),
the expression must have the following format, for example:

```text
${ fn:myExprFuncName }
```

Where `fn` is the namespace of the defined expression functions and
`myExprFuncName` is the unique expression function name.
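A runtime that encounters this format might first decide whether the expression body is an in-line expression or an `fn:` reference, for example (a hypothetical sketch; a real implementation would hand the resulting jq expression to a jq engine):

```python
import re

# Matches the "${ expression }" workflow expression format.
EXPR_FORMAT = re.compile(r"^\$\{\s*(.+?)\s*\}$")

def resolve_expression(text: str, expression_functions: dict) -> str:
    """Hypothetical helper: unwrap '${ ... }' and, for 'fn:' references,
    look up the named expression function's 'operation' string."""
    match = EXPR_FORMAT.match(text)
    if match is None:
        raise ValueError(f"not a workflow expression: {text!r}")
    body = match.group(1)
    if body.startswith("fn:"):
        # Reference to a reusable expression function definition.
        return expression_functions[body[len("fn:"):]]
    return body  # in-line expression, used as-is

functions = {"is-adult": ".applicant | .age >= 18"}
resolve_expression("${ fn:is-adult }", functions)      # ".applicant | .age >= 18"
resolve_expression("${ .applicant.name }", functions)  # ".applicant.name"
```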

To show some expression examples, let's say we have the following state data:

```json
{
  "applicant": {
    "name": "John Doe",
    "age": 26,
    "email": "johndoe@something.com",
    "address": {
      "streetAddress": "Naist street",
      "city": "Nara",
      "postalCode": "630-0192"
    },
    "phoneNumbers": [
      {
        "type": "iPhone",
        "number": "0123-4567-8888"
      },
      {
        "type": "home",
        "number": "0123-4567-8910"
      }
    ]
  }
}
```

In our workflow model we can define our reusable expression function:

```json
{
"functions": [
  {
    "name": "is-adult-applicant",
    "operation": ".applicant | .age > 18",
    "type": "expression"
  }
]
}
```

We will get back to this function definition in just a bit, but now let's take a look at using
an inline expression that sets an input parameter inside an action, for example:

```json
{
"actions": [
  {
    "functionRef": {
      "refName": "confirm-applicant",
      "arguments": {
        "applicantName": "${ .applicant.name }"
      }
    }
  }
]
}
```

In this case our input parameter `applicantName` would be set to "John Doe".

Expressions can also be used to select and manipulate state data; this is particularly useful for
state data filters.

For example, let's use another inline expression:

```json
{
  "stateDataFilter": {
    "output": "${ .applicant | {applicant: .name, contactInfo: { email: .email, phone: .phoneNumbers }} }"
  }
}
```

This would set the data output of the particular state to:

```json
{
  "applicant": "John Doe",
  "contactInfo": {
    "email": "johndoe@something.com",
    "phone": [
      {
        "type": "iPhone",
        "number": "0123-4567-8888"
      },
      {
        "type": "home",
        "number": "0123-4567-8910"
      }
    ]
  }
}
```

[Switch state](#Switch-State) [conditions](#Switch-State-Data-Conditions) require expressions to be resolved to a boolean value (`true`/`false`).
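For illustration, here is how that boolean resolution could look if the "is-adult-applicant" function above were evaluated against this state data. Plain Python stands in for a jq engine, so the expression `.applicant | .age > 18` is modeled by hand (a sketch, not a conformant evaluator):

```python
state_data = {
    "applicant": {
        "name": "John Doe",
        "age": 26,
        "email": "johndoe@something.com",
    }
}

def is_adult_applicant(data: dict) -> bool:
    # ".applicant" selects the applicant object; "| .age > 18" pipes it
    # into a boolean comparison, which is what a switch state condition needs.
    applicant = data["applicant"]
    return applicant["age"] > 18

is_adult_applicant(state_data)  # True for the state data above
```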

We can now get back to our previously defined "is-adult-applicant" expression function and reference it:

```json
{
  "dataConditions": [ {
    "condition": "${ fn:is-adult-applicant }",
    "transition": "start-application"
  }]
}
```

As previously mentioned, expressions are evaluated against certain subsets of data. For example,
the `arguments` property of the [functionRef definition](#FunctionRef-Definition) can evaluate expressions
only against the data that is available to the [action](#Action-Definition) it belongs to.
Also note the top-level [workflow definition](#Workflow-Definition-Structure) parameters: expressions defined
in them can only be evaluated against the initial [workflow data input](#Workflow-Data-Input).

For example, let's say that we have a workflow data input of:

```json
{
  "inputVersion": "1.0.0"
}
```

we can use this expression in the workflow "version" parameter:

```json
{
  "name": "my-sample-workflow",
  "description": "Sample Workflow",
  "version": "${ .inputVersion }",
  "specVersion": "0.8"
}
```

which would set the workflow version to "1.0.0".
Note that the workflow "name" property value is not allowed to use an expression. The workflow
definition "name" must be a constant value.

### Workflow Definition Structure

| Parameter | Description | Type | Required |
| --- | --- | --- | --- |
| name | The name that identifies the workflow definition, and which, when combined with its version, forms a unique identifier. | string | yes |
| version | Workflow version. MUST respect the [semantic versioning](https://semver.org/) format.
If not provided, `latest` is assumed | string | no |
| description | Workflow description | string | no |
| key | Optional expression that will be used to generate a domain-specific workflow instance identifier | string | no |
| annotations | List of helpful terms describing the workflow's intended purpose, subject areas, or other important qualities | array | no |
| dataInputSchema | Used to validate the workflow data input against a defined JSON Schema | string or object | no |
| dataOutputSchema | Used to validate the workflow data output against a defined JSON Schema | string or object | no |
| [constants](#Workflow-Constants) | Workflow constants | string or object | no |
| [secrets](#Workflow-Secrets) | Workflow secrets | string or array | no |
| [start](#Start-Definition) | Workflow start definition | string or object | no |
| specVersion | Serverless Workflow specification release version | string | yes |
| expressionLang | Identifies the expression language used for workflow expressions. Default value is "jq" | string | no |
| [timeouts](#Workflow-Timeouts) | Defines the workflow default timeout settings | string or object | no |
| [errors](#error-definitions) | Defines the workflow's error handling configuration, including error definitions, error handlers, and error policies | string or [error handling configuration](#error-handling-configuration) | no |
| keepActive | If `true`, a workflow instance is not terminated when there are no active execution paths. The instance can be terminated with a "terminate" end definition or by reaching the defined "workflowExecTimeout" | boolean | no |
| [auth](#Auth-Definition) | Workflow authentication definitions | array or string | no |
| [events](#Event-Definition) | Workflow event definitions | array or string | no |
| [functions](#Function-Definition) | Workflow function definitions.
Can be either inline function definitions (if array) or URI pointing to a resource containing json/yaml function definitions (if string) | array or string| no | -| [retries](#Retry-Definition) | Workflow retries definitions. Can be either inline retries definitions (if array) or URI pointing to a resource containing json/yaml retry definitions (if string) | array or string| no | -| [states](#Workflow-States) | Workflow states | array | yes | -| [extensions](#Extensions) | Workflow extensions definitions | array or string | no | -| [metadata](#Workflow-Metadata) | Metadata information | object | no | - -
Click to view example definition -

- - - - - - - - - - -
JSONYAML
- -```json -{ - "name": "sample-workflow", - "version": "1.0.0", - "specVersion": "0.8", - "description": "Sample Workflow", - "start": "MyStartingState", - "states": [], - "functions": [], - "events": [], - "errors": [], - "retries":[] -} -``` - - - -```yaml -name: sample-workflow -version: '1.0.0' -specVersion: '0.8' -description: Sample Workflow -start: MyStartingState -states: [] -functions: [] -events: [] -errors: [] -retries: [] -``` - -
- -


Defines the top-level structure of a serverless workflow model.
The following figure describes the main workflow definition blocks.

-Serverless Workflow Definitions Blocks -


The required `name` property defines the unique workflow definition identifier, for example "orders", "payment", etc.

The optional `key` property is an expression that evaluates to a domain-related, unique running workflow instance identifier, for example "orders-1", "orders-2"...

The `description` property might be used to give further information about the workflow.

The `version` property can be used to provide a specific workflow version. It must use the [semantic versioning](https://semver.org/) format. If not specified, "latest" is assumed.

The `annotations` property defines a list of helpful terms describing the workflow's intended purpose, subject areas, or other important qualities,
for example "machine learning", "monitoring", "networking", etc.

The `dataInputSchema` and `dataOutputSchema` properties can be used to validate input and output data against a defined JSON Schema.

The `dataInputSchema` property validates the [workflow data input](#Workflow-Data-Input). Validation should be performed before any states are executed. In case of
a start [Event state](#Event-state) the input schema is ignored, if present. The `failOnValidationErrors` property determines if workflow execution should continue in case of validation errors.

The `dataOutputSchema` property validates the [Workflow data output](#workflow-data-output). Validation is performed on the output of the workflow execution.
The `failOnValidationErrors` property determines what should be done when the workflow output does not match the provided schema.
If `failOnValidationErrors` is true, an error should be thrown. If executed within a subprocess, that error can be handled by the parent workflow.
If `failOnValidationErrors` is false, the error should not be propagated. It is up to the implementor to warn the user about that fact, for example by printing a log message.

Both properties can be expressed as object or string type.

If using object type, their `schema` property might be a URI, which points to the JSON schema used to validate the workflow data input, or it might be the JSON schema object. `failOnValidationErrors` is optional, default value is `true`.

Example of a JSON schema reference:

```json
"dataInputSchema": {
  "schema": "URI to json schema",
  "failOnValidationErrors": false
}
```

Example of a JSON schema included in the workflow file:

```json
"dataOutputSchema": {
  "schema": {
    "title": "MyJSONSchema",
    "properties": {
      "firstName": {
        "type": "string"
      },
      "lastName": {
        "type": "string"
      }
    }
  },
  "failOnValidationErrors": true
}
```

If using string type, then the string value is the external schema URI, and the default `failOnValidationErrors` value of `true` is assumed.

Example using string type:

```json
"dataInputSchema": "URI_to_json_schema"
```

The `secrets` property allows you to use sensitive information such as passwords, OAuth tokens, SSH keys, etc. inside your
Workflow expressions.

It has two possible types, `string` or `array`.
If `string` type, it is a URI pointing to a JSON or YAML document
which contains an array of names of the secrets, for example:

```json
"secrets": "file://workflowsecrets.json"
```

If `array` type, it defines an array (of string types) which contains the names of the secrets, for example:

```json
"secrets": ["MY_PASSWORD", "MY_STORAGE_KEY", "MY_ACCOUNT"]
```

For more information about Workflow secrets, reference the [Workflow Secrets section](#Workflow-Secrets).

The `constants` property can be used to define Workflow constant values
which are accessible in [Workflow Expressions](#Workflow-Expressions).

It has two possible types, `string` or `object`.
If `string` type, it is a URI pointing to a JSON or YAML document
which contains an object of global definitions, for example:

```json
"constants": "file://workflowconstants.json"
```

If `object` type, it defines a JSON object which contains the constants definitions, for example:

```json
{
  "AGE": {
    "MIN_ADULT": 18
  }
}
```

For more information see the [Workflow Constants](#Workflow-Constants) section.

The `start` property defines the workflow starting information. For more information see the [start definition](#Start-Definition) section.
This property is not required. If not defined, the workflow starting state has to be
the very first state defined in the [workflow states array](#Workflow-States).

The `specVersion` property is used to set the Serverless Workflow specification release version
the workflow markup adheres to.
It has to follow the specification release versions (excluding the leading "v"), meaning that for
the [release version v0.8](https://github.com/serverlessworkflow/specification/releases/tag/v0.8)
its value should be set to `"0.8"`.

The `expressionLang` property can be used to identify the expression language used for all expressions in
the workflow definition. The default value of this property is ["jq"](https://stedolan.github.io/jq/).
You should set this property if you choose to define [workflow expressions](#Workflow-Expressions)
with an expression language / syntax other than the default.

The `timeouts` property is used to define the default workflow timeouts for workflow, state, action, and branch
execution. For more information about timeouts and their use cases see the [Workflow Timeouts](#Workflow-Timeouts) section.

The `errors` property is used to define checked errors that can be explicitly handled during workflow execution.
For more information about workflow error handling see [this section](#Defining-Errors).
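Several top-level properties described here and below (`secrets`, `constants`, `auth`, `functions`, `events`, `retries`) share this "inline value or URI string" pattern. A runtime might normalize them with a helper along these lines (a hypothetical sketch that only handles local `file://` JSON resources; real implementations would also need to support HTTP URIs and YAML):

```python
import json
from pathlib import Path

def resolve_property(value, key: str):
    """Hypothetical helper: return inline definitions as-is, or load them
    from an external resource when the property is given as a URI string."""
    if isinstance(value, str):
        path = value.removeprefix("file://")
        document = json.loads(Path(path).read_text())
        # External resources wrap the definitions under the property's own key,
        # e.g. {"functions": [...]} or {"secrets": [...]}.
        return document[key]
    return value  # already inline (array or object)

resolve_property(["MY_PASSWORD", "MY_STORAGE_KEY"], "secrets")  # returned unchanged
```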

The `auth` property can be either an inline [auth](#Auth-Definition) definition array, or a URI reference to
a resource containing an array of [auth](#Auth-Definition) definitions.
If defined in a separate resource file (JSON or YAML), `auth` definitions can be re-used by multiple workflow definitions.
Auth definitions can be used to define authentication that should be used to access
the resource defined in the `operation` property of the [function](#Function-Definition) definitions.
Say we have the following function definition:

```json
{
  "functions": [
    {
      "name": "hello-world-function",
      "operation": "https://secure.resources.com/myapi.json#helloWorld",
      "authRef": "my-basic-auth"
    }
  ]
}
```

The `authRef` property is used to reference an authentication definition in
the `auth` property and should be applied when invoking the `helloWorld` function. An [AuthRef](#AuthRef-Definition) object can alternatively be used to configure the authentication definition to use when accessing the function's resource and/or when invoking the function.

The `functions` property can be either an in-line [function](#Function-Definition) definition array, or a URI reference to
a resource containing an array of [function](#Function-Definition) definitions.
The referenced resource can be used by multiple workflow definitions.

Here is an example of using an external resource for function definitions:

1. Workflow definition:

```json
{
  "name": "sample-workflow",
  "version": "1.0.0",
  "specVersion": "0.8",
  "description": "Sample Workflow",
  "start": "MyStartingState",
  "functions": "http://myhost:8080/functiondefs.json",
  "states":[
    ...
  ]
}
```

2. Function definitions resource:

```json
{
  "functions": [
    {
      "name": "hello-world-function",
      "operation": "file://myapi.json#helloWorld"
    }
  ]
}
```

The referenced resource must conform to the specification's [Workflow Functions JSON Schema](schema/functions.json).
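When a function definition carries an `authRef`, a runtime needs to locate the matching entry in the workflow's `auth` array before invoking the function. A minimal sketch of that lookup (a hypothetical helper; the auth entry fields shown follow the [Auth Definition](#Auth-Definition) structure):

```python
def resolve_auth(workflow: dict, function: dict):
    """Hypothetical helper: find the auth definition named by a function's
    'authRef', or None when the function needs no authentication."""
    ref = function.get("authRef")
    if ref is None:
        return None
    for auth_def in workflow.get("auth", []):
        if auth_def["name"] == ref:
            return auth_def
    raise KeyError(f"no auth definition named {ref!r}")

workflow = {
    "auth": [
        {"name": "my-basic-auth", "scheme": "basic",
         "properties": {"username": "user", "password": "pwd"}}
    ]
}
function = {
    "name": "hello-world-function",
    "operation": "https://secure.resources.com/myapi.json#helloWorld",
    "authRef": "my-basic-auth",
}
resolve_auth(workflow, function)  # the "my-basic-auth" definition
```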

The `events` property can be either an in-line [event](#Event-Definition) definition array, or a [URI](https://en.wikipedia.org/wiki/Uniform_Resource_Identifier) reference to
a resource containing an array of [event](#Event-Definition) definitions. The referenced resource can be used by multiple workflow definitions.

Here is an example of using an external resource for event definitions:

1. Workflow definition:

```json
{
  "name": "sample-workflow",
  "version": "1.0.0",
  "specVersion": "0.8",
  "description": "Sample Workflow",
  "start": "MyStartingState",
  "events": "http://myhost:8080/eventsdefs.json",
  "states":[
    ...
  ]
}
```

2. Event definitions resource:

```json
{
  "events": [
    {
      "name": "applicant-info",
      "type": "org.application.info",
      "source": "applicationssource",
      "correlation": [
        {
          "contextAttributeName": "applicantId"
        }
      ]
    }
  ]
}
```

The referenced resource must conform to the specification's [Workflow Events JSON Schema](schema/events.json).

The `retries` property can be either an in-line [retry](#Retry-Definition) definition array, or a URI reference to
a resource containing an array of [retry](#Retry-Definition) definitions.
The referenced resource can be used by multiple workflow definitions. For more information about
using and referencing retry definitions see the [Workflow Error Handling](#Workflow-Error-Handling) section.

The `keepActive` property allows you to change the default behavior of workflow instances.
By default, as described in the [Core Concepts](#Core-Concepts) section, a workflow instance is terminated once there are no more
active execution paths, one of its active paths ends in a "terminate" [end definition](#End-Definition), or when
its [`workflowExecTimeout`](#Workflow-Timeouts) time is reached.
- -Setting the `keepActive` property to `true` allows you to change this default behavior in that a workflow instance -created from this workflow definition can only be terminated if one of its active paths ends in a "terminate" [end definition](#End-Definition), or when -its [`workflowExecTimeout`](#Workflow-Timeouts) time is reached. -This allows you to explicitly model workflows where an instance should be kept alive, to collect (event) data for example. - -You can reference the [specification examples](#Examples) to see the `keepActive` property in action. - -The `extensions` property can be used to define extensions for this workflow definition. -You can learn more about workflow extensions in the [Extensions](#extensions) section. -Sample `extensions` property definition could look like this for example: - -```json -{ - "extensions": [ - { - "extensionId": "workflow-ratelimiting-extension", - "path": "file://myextensions/ratelimiting.yml" - }, - { - "extensionId": "workflow-kpi-extension", - "path": "file://myextensions/kpi.yml" - } - ] -} -``` - -Here we define two workflow extensions, namely the [rate limiting](extensions/ratelimiting.md) and [kpi](extensions/kpi.md) extensions for our workflow definition. - -#### Workflow States - -Workflow states define building blocks of the workflow execution instructions. They define the -control flow logic instructions on what the workflow is supposed to do. -Serverless Workflow defines the following Workflow States: - -| Name | Description | Consumes events? | Produces events? | Executes actions? | Handles errors/retries? | Allows parallel execution? | Makes data-based transitions? | Can be workflow start state? | Can be workflow end state? 
| -| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | -| **[Event](#Event-State)** | Define events that trigger action execution | yes | yes | yes | yes | yes | no | yes | yes | -| **[Operation](#Operation-State)** | Execute one or more actions | no | yes | yes | yes | yes | no | yes | yes | -| **[Switch](#Switch-State)** | Define data-based or event-based workflow transitions | no | yes | no | yes | no | yes | yes | no | -| **[Parallel](#Parallel-State)** | Causes parallel execution of branches (set of states) | no | yes | no | yes | yes | no | yes | yes | -| **[Inject](#Inject-State)** | Inject static data into state data | no | yes | no | yes | no | no | yes | yes | -| **[ForEach](#ForEach-State)** | Parallel execution of states for each element of a data array | no | yes | no | yes | yes | no | yes | yes | -| **[Callback](#Callback-State)** | Manual decision step. Executes a function and waits for callback event that indicates completion of the manual decision | yes | yes | yes | yes | no | no | yes | yes | - -##### Event State - -| Parameter | Description | Type | Required | -| --- | --- | --- | --- | -| name | Unique State name | string | yes | -| type | State type | string | yes | -| exclusive | If `true`, consuming one of the defined events causes its associated actions to be performed. If `false`, all of the defined events must be consumed in order for actions to be performed. 
Default is `true` | boolean | no | -| [onEvents](#OnEvents-Definition) | Define the events to be consumed and optional actions to be performed | array | yes | -| [timeouts](#Workflow-Timeouts) | State specific timeout settings | object | no | -| [stateDataFilter](#State-data-filters) | State data filter definition| object | no | -| [transition](#Transitions) | Next transition of the workflow after all the actions have been performed | string or object | yes (if `end` is not defined) | -| [onErrors](#Error-Definition) | States error handling definitions | array | no | -| [end](#End-Definition) | Is this state an end state | boolean or object | yes (if `transition` is not defined) | -| [compensatedBy](#Workflow-Compensation) | Unique name of a workflow state which is responsible for compensation of this state | string | no | -| [metadata](#Workflow-Metadata) | Metadata information| object | no | - -
Click to view example definition -

- - - - - - - - - - -
JSONYAML
- -```json -{ -"name": "monitor-vitals", -"type": "event", -"exclusive": true, -"onEvents": [{ - "eventRefs": ["high-body-temperature"], - "actions": [{ - "functionRef": { - "refName": "send-tylenol-order", - "arguments": { - "patientid": "${ .patientId }" - } - } - }] - }, - { - "eventRefs": ["high-blood-pressure"], - "actions": [{ - "functionRef": { - "refName": "call-nurse", - "arguments": { - "patientid": "${ .patientId }" - } - } - }] - }, - { - "eventRefs": ["high-respiration-rate"], - "actions": [{ - "functionRef": { - "refName": "call-pulmonologist", - "arguments": { - "patientid": "${ .patientId }" - } - } - }] - } -], -"end": { - "terminate": true -} -} -``` - - - -```yaml -name: monitor-vitals -type: event -exclusive: true -onEvents: -- eventRefs: - - high-body-temperature - actions: - - functionRef: - refName: send-tylenol-order - arguments: - patientid: "${ .patientId }" -- eventRefs: - - high-blood-pressure - actions: - - functionRef: - refName: call-nurse - arguments: - patientid: "${ .patientId }" -- eventRefs: - - high-respiration-rate - actions: - - functionRef: - refName: call-pulmonologist - arguments: - patientid: "${ .patientId }" -end: - terminate: true -``` - -
- -


Event states await one or more events and perform actions when they are received.
If defined as the workflow starting state, the event state definition controls when workflow
instances should be created.

The `exclusive` property determines if the state should wait for any of the defined events in the `onEvents` array, or
if all defined events must be present for their associated actions to be performed.

The following two figures illustrate the `exclusive` property:

-Event state with exclusive set to true -

- -If the Event state in this case is a workflow starting state, the occurrence of *any* of the defined events would start a new workflow instance. - -

-Event state with exclusive set to false -


If the Event state in this case is a workflow starting state, the occurrence of *all* defined events would start a new
workflow instance.

In order to consider only events that are related to each other, we need to set the `correlation` property in the workflow
[events definitions](#Event-Definition). This allows us to set up event correlation rules against the events'
extension context attributes.

If the Event state is not a workflow starting state, the `timeout` property can be used to define the time duration from the
invocation of the event state. If the defined event or events have not been received during this time,
the state should transition to the next state, or end the workflow execution (if it is an end state).

The `timeouts` property can be used to define state specific timeout settings. Event states can define the
`stateExecTimeout`, `actionExecTimeout`, and `eventTimeout` properties.
For more information about Event state specific event timeout settings, reference [this section](#Event-Timeout-Definition).
For more information about workflow timeouts, reference the [Workflow Timeouts](#Workflow-Timeouts) section.

Note that the `transition` and `end` properties are mutually exclusive, meaning that you cannot define both of them at the same time.

##### Operation State

| Parameter | Description | Type | Required |
| --- | --- | --- | --- |
| name | Unique State name. Must follow the [Serverless Workflow Naming Convention](#naming-convention) | string | yes |
| type | State type | string | yes |
| actionMode | Should actions be performed sequentially or in parallel.
Default is `sequential` | enum | no |
-| [actions](#Action-Definition) | Actions to be performed | array | yes |
-| [timeouts](#Workflow-Timeouts) | State specific timeout settings | object | no |
-| [stateDataFilter](#State-data-filters) | State data filter | object | no |
-| [onErrors](#Error-Definition) | States error handling and retries definitions | array | no |
-| [transition](#Transitions) | Next transition of the workflow after all the actions have been performed | string or object | yes (if `end` is not defined) |
-| [compensatedBy](#Workflow-Compensation) | Unique name of a workflow state which is responsible for compensation of this state | string | no |
-| [usedForCompensation](#Workflow-Compensation) | If `true`, this state is used to compensate another state. Default is `false` | boolean | no |
-| [metadata](#Workflow-Metadata) | Metadata information | object | no |
-| [end](#End-Definition) | Is this state an end state | boolean or object | yes (if `transition` is not defined) |
-
Click to view example definition -

- - - - - - - - - - -
JSONYAML
- -```json -{ - "name": "reject-application", - "type": "operation", - "actionMode": "sequential", - "actions": [ - { - "functionRef": { - "refName": "send-rejection-email-function", - "arguments": { - "customer": "${ .customer }" - } - } - } - ], - "end": true -} -``` - - - -```yaml -name: reject-application -type: operation -actionMode: sequential -actions: -- functionRef: - refName: send-rejection-email-function - arguments: - customer: "${ .customer }" -end: true -``` - -
- -

-
-Operation state defines a set of actions to be performed in sequence or in parallel.
-Once all actions have been performed, a transition to another state can occur.
-
-The `timeouts` property can be used to define state specific timeout settings. Operation states can define
-the `stateExecTimeout` and `actionExecTimeout` settings. For more information on workflow timeouts, reference
-the [Workflow Timeouts](#Workflow-Timeouts) section.
-
-##### Switch State
-
-| Parameter | Description | Type | Required |
-| --- | --- | --- | --- |
-| name | Unique State name. Must follow the [Serverless Workflow Naming Convention](#naming-convention) | string | yes |
-| type | State type | string | yes |
-| [dataConditions](#Switch-state-Data-Conditions) | Defined if the Switch state evaluates conditions and transitions based on state data | array | yes (if `eventConditions` is not defined) |
-| [eventConditions](#Switch-State-Event-Conditions) | Defined if the Switch state evaluates conditions and transitions based on arrival of events | array | yes (if `dataConditions` is not defined) |
-| [stateDataFilter](#State-data-filters) | State data filter | object | no |
-| [onErrors](#Error-Definition) | States error handling and retries definitions | array | no |
-| [timeouts](#Workflow-Timeouts) | State specific timeout settings | object | no |
-| defaultCondition | Default transition of the workflow if there are no matching data conditions or the event timeout is reached. Can be a transition or end definition | object | yes |
-| [compensatedBy](#Workflow-Compensation) | Unique name of a workflow state which is responsible for compensation of this state | string | no |
-| [usedForCompensation](#Workflow-Compensation) | If `true`, this state is used to compensate another state. Default is `false` | boolean | no |
-| [metadata](#Workflow-Metadata) | Metadata information | object | no |
-
Click to view example definition -

- - - - - - - - - - -
JSONYAML
- -```json -{ - "name":"check-visa-status", - "type":"switch", - "eventConditions": [ - { - "eventRef": "visa-approved-event", - "transition": "handle-approved-visa" - }, - { - "eventRef": "visa-rejected-event", - "transition": "handle-rejected-visa" - } - ], - "timeouts": { - "eventTimeout": "PT1H" - }, - "defaultCondition": { - "transition": "handle-no-visa-decision" - } -} -``` - - - -```yaml -name: check-visa-status -type: switch -eventConditions: -- eventRef: visa-approved-event - transition: handle-approved-visa -- eventRef: visa-rejected-event - transition: handle-rejected-visa -timeouts: - eventTimeout: PT1H -defaultCondition: - transition: handle-no-visa-decision -``` - -
- -

-
-Switch states can be viewed as workflow gateways: they can direct transitions of a workflow based on certain conditions.
-There are two types of conditions for switch states:
-
-* [Data-based conditions](#Switch-State-Data-Conditions)
-* [Event-based conditions](#Switch-State-Event-Conditions)
-
-These are exclusive, meaning that a switch state can define one or the other condition type, but not both.
-
-At times multiple defined conditions can be evaluated to `true` by runtime implementations.
-Conditions defined first take precedence over conditions defined later. This is backed by the fact that arrays/sequences
-are ordered in both JSON and YAML. For example, let's say there are two `true` conditions: A and B, defined in that order.
-Because A was defined first, its transition will be executed, not B's.
-
-In the case of data-based conditions, the switch state controls workflow transitions based on the state's data.
-If no defined conditions can be matched, the state transition is taken based on the `defaultCondition` property.
-This property can be either a `transition` to another workflow state, or an `end` definition, meaning a workflow end.
-
-For event-based conditions, a switch state acts as a workflow wait state. It halts workflow execution
-until one of the referenced events arrives, then makes a transition depending on that event's definition.
-If the events defined in event-based conditions do not arrive before the state's `eventTimeout` property expires,
-state transitions are based on the defined `defaultCondition` property.
-
-The `timeouts` property can be used to define state specific timeout settings. Switch states can define the
-`stateExecTimeout` setting. If `eventConditions` is defined, the switch state can also define the
-`eventTimeout` property. For more information on workflow timeouts, reference the [Workflow Timeouts](#Workflow-Timeouts) section.
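The example above uses event-based conditions; a data-based switch is defined the same way, replacing `eventConditions` with `dataConditions`. The following is an illustrative sketch only — the state name, condition expression, and transition targets are hypothetical:

```yaml
name: check-application
type: switch
dataConditions:
- condition: "${ .applicant | .age >= 18 }"
  transition: start-application
defaultCondition:
  transition: reject-application
```

Here the first matching condition wins, and `defaultCondition` covers the case where no condition matches.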
-
-##### Parallel State
-
-| Parameter | Description | Type | Required |
-| --- | --- | --- | --- |
-| name | Unique State name. Must follow the [Serverless Workflow Naming Convention](#naming-convention) | string | yes |
-| type | State type | string | yes |
-| [branches](#Parallel-State-Branch) | List of branches for this parallel state | array | yes |
-| completionType | Option types on how to complete branch execution. Default is `allOf` | enum | no |
-| numCompleted | Used when `completionType` is set to `atLeast` to specify the minimum number of branches that must complete in order for the state to transition/end | string or number | yes (if `completionType` is `atLeast`) |
-| [timeouts](#Workflow-Timeouts) | State specific timeout settings | object | no |
-| [stateDataFilter](#State-data-filters) | State data filter | object | no |
-| [onErrors](#Error-Definition) | States error handling and retries definitions | array | no |
-| [transition](#Transitions) | Next transition of the workflow after all branches have completed execution | string or object | yes (if `end` is not defined) |
-| [compensatedBy](#Workflow-Compensation) | Unique name of a workflow state which is responsible for compensation of this state | string | no |
-| [usedForCompensation](#Workflow-Compensation) | If `true`, this state is used to compensate another state. Default is `false` | boolean | no |
-| [metadata](#Workflow-Metadata) | Metadata information | object | no |
-| [end](#End-Definition) | Is this state an end state | boolean or object | yes (if `transition` is not defined) |
-
Click to view example definition -

- - - - - - - - - - -
JSONYAML
- -```json - { - "name":"parallel-exec", - "type":"parallel", - "completionType": "allOf", - "branches": [ - { - "name": "branch-1", - "actions": [ - { - "functionRef": { - "refName": "function-name-one", - "arguments": { - "order": "${ .someParam }" - } - } - } - ] - }, - { - "name": "branch-2", - "actions": [ - { - "functionRef": { - "refName": "function-name-two", - "arguments": { - "order": "${ .someParam }" - } - } - } - ] - } - ], - "end": true -} -``` - - - -```yaml -name: parallel-exec -type: parallel -completionType: allOf -branches: -- name: branch-1 - actions: - - functionRef: - refName: function-name-one - arguments: - order: "${ .someParam }" -- name: branch-2 - actions: - - functionRef: - refName: function-name-two - arguments: - order: "${ .someParam }" -end: true -``` - -
- -

-
-Parallel state defines a collection of `branches` that are executed in parallel.
-A parallel state can be seen as a state that splits up the current workflow instance execution path
-into multiple ones, one for each branch. These execution paths are performed in parallel
-and are joined back into the current execution path depending on the defined `completionType` parameter value.
-
-The `completionType` enum specifies the different ways of completing branch execution:
-
-* `allOf`: All branches must complete execution before the state can transition/end. This is the default value in case this parameter is not defined in the parallel state definition.
-* `atLeast`: State can transition/end once at least the specified number of branches have completed execution. In this case you must also
-  specify the `numCompleted` property to define this number.
-
-Exceptions may occur during the execution of Parallel state branches; this is described in detail in [this section](#Parallel-State-Handling-Exceptions).
-
-The `timeouts` property can be used to set state specific timeout settings. Parallel states can define the
-`stateExecTimeout` and `branchExecTimeout` timeout settings. For more information on workflow timeouts,
-reference the [Workflow Timeouts](#Workflow-Timeouts) section.
-
-Note that the `transition` and `end` properties are mutually exclusive, meaning that you cannot define both of them at the same time.
-
-##### Inject State
-
-| Parameter | Description | Type | Required |
-| --- | --- | --- | --- |
-| name | Unique State name.
Must follow the [Serverless Workflow Naming Convention](#naming-convention) | string | yes |
-| type | State type | string | yes |
-| data | JSON object which can be set as the state's data input and can be manipulated via filter | object | yes |
-| [stateDataFilter](#state-data-filters) | State data filter | object | no |
-| [transition](#Transitions) | Next transition of the workflow after injection has completed | string or object | yes (if `end` is not defined) |
-| [compensatedBy](#Workflow-Compensation) | Unique name of a workflow state which is responsible for compensation of this state | string | no |
-| [usedForCompensation](#Workflow-Compensation) | If `true`, this state is used to compensate another state. Default is `false` | boolean | no |
-| [metadata](#Workflow-Metadata) | Metadata information | object | no |
-| [end](#End-Definition) | Is this state an end state | boolean or object | yes (if `transition` is not defined) |
-
Click to view example definition -

- - - - - - - - - - -
JSONYAML
- -```json -{ - "name":"hello", - "type":"inject", - "data": { - "result": "Hello" - }, - "transition": "world" -} -``` - - - -```yaml -name: hello -type: inject -data: - result: Hello -transition: world -``` - -
- -

-
-Inject state can be used to inject static data into the state's data input. Inject state does not perform any actions.
-It is very useful for debugging, for example, as you can test/simulate workflow execution with pre-set data that would typically
-be dynamic in nature (e.g., function calls, events).
-
-The inject state `data` property allows you to statically define a JSON object which gets added to the state's data input.
-You can use the filter property to control the state's data output to the transition state.
-
-Here is a typical example of how to use the inject state to add static data into its state's data input, which is then passed
-as data output to the transition state:
-
-
-
-
-
-
-
-
-
-
-
JSONYAML
- - ```json - { - "name":"simple-inject-state", - "type":"inject", - "data": { - "person": { - "fname": "John", - "lname": "Doe", - "address": "1234 SomeStreet", - "age": 40 - } - }, - "transition": "greet-person-state" - } - ``` - - - -```yaml - name: simple-inject-state - type: inject - data: - person: - fname: John - lname: Doe - address: 1234 SomeStreet - age: 40 - transition: greet-person-state -``` - -
-
-The data output of "simple-inject-state", which is then passed as input to the transition state, would be:
-
-```json
-{
-  "person": {
-    "fname": "John",
-    "lname": "Doe",
-    "address": "1234 SomeStreet",
-    "age": 40
-  }
-}
-```
-
-If the inject state already receives a data input from the previous transition state, the inject data should be merged
-with its data input.
-
-You can also use the filter property to filter the state data after data is injected. Let's say we have:
-
-
-
-
-
-
-
-
-
-
-
JSONYAML
- -```json - { - "name":"simple-inject-state", - "type":"inject", - "data": { - "people": [ - { - "fname": "John", - "lname": "Doe", - "address": "1234 SomeStreet", - "age": 40 - }, - { - "fname": "Marry", - "lname": "Allice", - "address": "1234 SomeStreet", - "age": 25 - }, - { - "fname": "Kelly", - "lname": "Mill", - "address": "1234 SomeStreet", - "age": 30 - } - ] - }, - "stateDataFilter": { - "output": "${ {people: [.people[] | select(.age < 40)]} }" - }, - "transition": "greet-person-state" - } -``` - - - -```yaml - name: simple-inject-state - type: inject - data: - people: - - fname: John - lname: Doe - address: 1234 SomeStreet - age: 40 - - fname: Marry - lname: Allice - address: 1234 SomeStreet - age: 25 - - fname: Kelly - lname: Mill - address: 1234 SomeStreet - age: 30 - stateDataFilter: - output: "${ {people: [.people[] | select(.age < 40)]} }" - transition: greet-person-state -``` - -
-
-In this case the state's data output would include only people whose age is less than 40:
-
-```json
-{
-  "people": [
-    {
-      "fname": "Marry",
-      "lname": "Allice",
-      "address": "1234 SomeStreet",
-      "age": 25
-    },
-    {
-      "fname": "Kelly",
-      "lname": "Mill",
-      "address": "1234 SomeStreet",
-      "age": 30
-    }
-  ]
-}
-```
-
-You can change your output path easily during testing, for example changing the expression to:
-
-```text
-${ {people: [.people[] | select(.age >= 40)]} }
-```
-
-This allows you to test if your workflow behaves properly for cases when there are people whose age is greater than or equal to 40.
-
-Note that the `transition` and `end` properties are mutually exclusive, meaning that you cannot define both of them at the same time.
-
-##### ForEach State
-
-| Parameter | Description | Type | Required |
-| --- | --- | --- | --- |
-| name | Unique State name. Must follow the [Serverless Workflow Naming Convention](#naming-convention) | string | yes |
-| type | State type | string | yes |
-| inputCollection | Workflow expression selecting an array element of the state's data | string | yes |
-| outputCollection | Workflow expression specifying an array element of the state's data to add the results of each iteration | string | no |
-| iterationParam | Name of the iteration parameter that can be referenced in actions/workflow. For each parallel iteration, this param should contain a unique element of the `inputCollection` array | string | no |
-| batchSize | Specifies how many iterations may run in parallel at the same time. Used if `mode` property is set to `parallel` (default). If not specified, its value should be the size of the `inputCollection` | string or number | no |
-| mode | Specifies how iterations are to be performed (sequentially or in parallel).
Default is `parallel` | enum | no |
-| [actions](#Action-Definition) | Actions to be executed for each of the elements of `inputCollection` | array | yes |
-| [timeouts](#Workflow-Timeouts) | State specific timeout settings | object | no |
-| [stateDataFilter](#State-data-filters) | State data filter definition | object | no |
-| [onErrors](#Error-Definition) | States error handling and retries definitions | array | no |
-| [transition](#Transitions) | Next transition of the workflow after state has completed | string or object | yes (if `end` is not defined) |
-| [compensatedBy](#Workflow-Compensation) | Unique name of a workflow state which is responsible for compensation of this state | string | no |
-| [usedForCompensation](#Workflow-Compensation) | If `true`, this state is used to compensate another state. Default is `false` | boolean | no |
-| [metadata](#Workflow-Metadata) | Metadata information | object | no |
-| [end](#End-Definition) | Is this state an end state | boolean or object | yes (if `transition` is not defined) |
-
Click to view example definition -

- - - - - - - - - - -
JSONYAML
- -```json -{ - "name": "provision-orders-state", - "type": "foreach", - "inputCollection": "${ .orders }", - "iterationParam": "singleorder", - "outputCollection": "${ .provisionresults }", - "actions": [ - { - "functionRef": { - "refName": "provision-order-function", - "arguments": { - "order": "${ $singleorder }" - } - } - } - ] -} -``` - - - -```yaml -name: provision-orders-state -type: foreach -inputCollection: "${ .orders }" -iterationParam: "singleorder" -outputCollection: "${ .provisionresults }" -actions: -- functionRef: - refName: provision-order-function - arguments: - order: "${ $singleorder }" -``` - -
- -

-
-ForEach states can be used to execute [actions](#Action-Definition) for each element of a data set.
-
-Each iteration of the ForEach state is executed in parallel by default.
-However, executing iterations sequentially is also possible by setting the value of the `mode` property to
-`sequential`.
-
-If the default `parallel` iteration mode is used, the `batchSize` property sets the number of iterations (the batch)
-that can be executed at a time. To give an example, if the number of iterations is 55 and `batchSize`
-is set to `10`, 10 iterations are to be executed at a time, meaning that the state would execute 10 iterations in parallel,
-then execute the next batch of 10 iterations. After 5 such executions, the remaining 5 iterations are to be executed in the last batch.
-The batch size value must be greater than 1. If not specified, its value should be the size of the `inputCollection` (all iterations).
-
-The `inputCollection` property is a workflow expression which selects an array in the state's data. All iterations
-are performed against data elements of this array. If this array does not exist, the runtime should throw
-an error. This error can be handled inside the state's [`onErrors`](#Error-Definition) definition.
-
-The `outputCollection` property is a workflow expression which selects an array in the state's data to which the results
-of each iteration should be added. If this array does not exist, it should be created.
-
-The `iterationParam` property defines the name of the iteration parameter passed to each iteration of the ForEach state.
-It should contain a unique element of the `inputCollection` array and be made available to the actions of the ForEach state.
-`iterationParam` can be accessed as an expression variable.
[In JQ, expression variables are prefixed by $](https://stedolan.github.io/jq/manual/#Variable/SymbolicBindingOperator:...as$identifier|...). -If `iterationParam` is not explicitly defined, runtimes should create one and populate it with the value of the unique -iteration parameter for each iteration of the ForEach state. - -The `actions` property defines actions to be executed in each state iteration. - -Let's take a look at an example: - -In this example the data input to our workflow is an array of orders: - -```json -{ - "orders": [ - { - "orderNumber": "1234", - "completed": true, - "email": "firstBuyer@buyer.com" - }, - { - "orderNumber": "5678", - "completed": true, - "email": "secondBuyer@buyer.com" - }, - { - "orderNumber": "9910", - "completed": false, - "email": "thirdBuyer@buyer.com" - } - ] -} -``` - -and our workflow is defined as: - - - - - - - - - - -
JSONYAML
- -```json -{ - "name": "send-confirmation-for-completed-orders", - "version": "1.0.0", - "specVersion": "0.8", - "start": "send-confirm-state", - "functions": [ - { - "name": "send-confirmation-function", - "operation": "file://confirmationapi.json#sendOrderConfirmation" - } - ], - "states": [ - { - "name":"send-confirm-state", - "type":"foreach", - "inputCollection": "${ [.orders[] | select(.completed == true)] }", - "iterationParam": "completedorder", - "outputCollection": "${ .confirmationresults }", - "actions":[ - { - "functionRef": { - "refName": "send-confirmation-function", - "arguments": { - "orderNumber": "${ $completedorder.orderNumber }", - "email": "${ $completedorder.email }" - } - } - }], - "end": true - }] -} -``` - - - -```yaml -name: send-confirmation-for-completed-orders -version: '1.0.0' -specVersion: '0.8' -start: send-confirm-state -functions: -- name: send-confirmation-function - operation: file://confirmationapi.json#sendOrderConfirmation -states: -- name: send-confirm-state - type: foreach - inputCollection: "${ [.orders[] | select(.completed == true)] }" - iterationParam: completedorder - outputCollection: "${ .confirmationresults }" - actions: - - functionRef: - refName: send-confirmation-function - arguments: - orderNumber: "${ $completedorder.orderNumber }" - email: "${ $completedorder.email }" - end: true -``` - -
-
-The workflow data input containing order information is passed to the `send-confirm-state` [ForEach](#ForEach-State) state.
-The ForEach state defines an `inputCollection` property which selects all orders that have the `completed` property set to `true`.
-
-For each element of the array selected by `inputCollection`, a JSON object defined by `iterationParam` should be
-created containing a unique element of `inputCollection` and passed as the data input to the actions executed in parallel.
-
-So for this example, we would have two parallel executions of the `send-confirmation-function`, the first one having data:
-
-```json
-{
-  "completedorder": {
-    "orderNumber": "1234",
-    "completed": true,
-    "email": "firstBuyer@buyer.com"
-  }
-}
-```
-
-and the second:
-
-```json
-{
-  "completedorder": {
-    "orderNumber": "5678",
-    "completed": true,
-    "email": "secondBuyer@buyer.com"
-  }
-}
-```
-
-The results of each parallel action execution are stored as elements in the state data array defined by the `outputCollection` property.
-
-The `timeouts` property can be used to set state specific timeout settings. ForEach states can define the
-`stateExecTimeout` and `actionExecTimeout` settings. For more information on workflow timeouts, reference the [Workflow Timeouts](#Workflow-Timeouts)
-section.
-
-Note that the `transition` and `end` properties are mutually exclusive, meaning that you cannot define both of them at the same time.
-
-##### Callback State
-
-| Parameter | Description | Type | Required |
-| --- | --- | --- | --- |
-| name | Unique State name.
Must follow the [Serverless Workflow Naming Convention](#naming-convention) | string | yes |
-| type | State type | string | yes |
-| [action](#Action-Definition) | Defines the action to be executed | object | yes |
-| eventRef | References a unique callback event name in the defined workflow [events](#Event-Definition) | string | yes |
-| [timeouts](#Workflow-Timeouts) | State specific timeout settings | object | no |
-| [eventDataFilter](#Event-data-filters) | Callback event data filter definition | object | no |
-| [stateDataFilter](#State-data-filters) | State data filter definition | object | no |
-| [onErrors](#Error-Definition) | States error handling and retries definitions | array | no |
-| [transition](#Transitions) | Next transition of the workflow after callback event has been received | string or object | yes (if `end` is not defined) |
-| [end](#End-Definition) | Is this state an end state | boolean or object | yes (if `transition` is not defined) |
-| [compensatedBy](#Workflow-Compensation) | Unique name of a workflow state which is responsible for compensation of this state | string | no |
-| [usedForCompensation](#Workflow-Compensation) | If `true`, this state is used to compensate another state. Default is `false` | boolean | no |
-| [metadata](#Workflow-Metadata) | Metadata information | object | no |
-
-
Click to view example definition -

- - - - - - - - - - -
JSONYAML
- -```json -{ - "name": "check-credit", - "type": "callback", - "action": { - "functionRef": { - "refName": "call-credit-check-microservice", - "arguments": { - "customer": "${ .customer }" - } - } - }, - "eventRef": "credit-check-completed-event", - "timeouts": { - "stateExecTimeout": "PT15M" - }, - "transition": "evaluate-decision" -} -``` - - - -```yaml -name: check-credit -type: callback -action: - functionRef: - refName: call-credit-check-microservice - arguments: - customer: "${ .customer }" -eventRef: credit-check-completed-event -timeouts: - stateExecTimeout: PT15M -transition: evaluate-decision -``` - -
- -

-
-Serverless orchestration can at times require manual steps/decisions to be made. While some work performed
-in a serverless workflow can be executed automatically, some decisions must involve manual steps (e.g., human decisions).
-The Callback state allows you to explicitly model manual decision steps during workflow execution.
-
-The `action` property defines a function call that triggers an external activity/service. Once the action executes,
-the callback state will wait for a CloudEvent (defined via the `eventRef` property), which indicates the completion
-of the manual decision by the called service.
-
-Note that the called decision service is responsible for emitting the callback CloudEvent indicating the completion of the
-decision and including the decision results as part of the event payload. This event must be correlated to the
-workflow instance using the callback event's context attribute defined in the `correlation` property of the
-referenced [Event Definition](#Event-Definition).
-
-Once the completion (callback) event is received, the Callback state completes its execution and transitions to the next
-defined workflow state or completes workflow execution in case it is an end state.
-
-The callback event payload is merged with the Callback state data and can be filtered via the `eventDataFilter` definition.
-
-If the defined callback event is not received within the configured `eventTimeout`, the state should transition to the next state, or end workflow execution if it is an end state.
-
-The `timeouts` property defines state specific timeout settings. Callback states can define the
-`stateExecTimeout`, `actionExecTimeout`, and `eventTimeout` properties.
-For more information on workflow timeouts, reference the [Workflow Timeouts](#Workflow-Timeouts)
-section.
-
-Note that the `transition` and `end` properties are mutually exclusive, meaning that you cannot define both of them at the same time.
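As a hedged sketch, the `credit-check-completed-event` referenced in the example above could be declared with a correlation rule so the callback event can be matched to the right workflow instance. The event `type`, `source`, and the `customerid` context attribute below are assumptions for illustration only:

```yaml
events:
- name: credit-check-completed-event
  type: creditCheckCompleteType
  source: creditCheckSource
  correlation:
  - contextAttributeName: customerid
```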
-
-#### Related State Definitions
-
-##### Function Definition
-
-| Parameter | Description | Type | Required |
-| --- | --- | --- | --- |
-| name | Unique function name. Must follow the [Serverless Workflow Naming Convention](#naming-convention) | string | yes |
-| operation | See the table "Function Operation description by type" below | string or object | yes |
-| type | Defines the function type. Can be either `http`, `openapi`, `asyncapi`, `rpc`, `graphql`, `odata`, `expression`, or [`custom`](#defining-custom-function-types). Default is `openapi` | enum | no |
-| authRef | References an [auth definition](#Auth-Definition) name to be used to access the resource defined in the operation parameter | string | no |
-| [metadata](#Workflow-Metadata) | Metadata information. Can be used to define custom function information | object | no |
-
-Function Operation description by type:
-
-| Type | Operation Description |
-| ---- | --------- |
-| `openapi` | `<URI_to_OpenAPI_definition>#<operation_id>` |
-| `rest` | [OpenAPI Paths Object](https://spec.openapis.org/oas/v3.1.0#paths-object) definition |
-| `asyncapi` | `<URI_to_AsyncAPI_definition>#<operation_id>` |
-| `rpc` | `<URI_to_gRPC_proto_file>#<service_name>#<service_method_name>` |
-| `graphql` | `<URI_to_GraphQL_schema>#<service_name>#<service_method_name>` |
-| `odata` | `<URI_to_OData_service>#<Entity_Set_Name>` |
-| `expression` | defines the workflow expression |
-| `custom` | see [Defining custom function types](#defining-custom-function-types) |
-
Click to view example definition -

- - - - - - - - - - -
JSONYAML
- -```json -{ - "name": "hello-world-function", - "operation": "https://hellworldservice.api.com/api.json#helloWorld" -} -``` - - - -```yaml -name: hello-world-function -operation: https://hellworldservice.api.com/api.json#helloWorld -``` - -
- -

-
-The `name` property defines a unique name of the function definition.
-
-The `type` enum property defines the function type. Its value can be any of the types listed in the function definition table above. Default value is `openapi`.
-
-Depending on the function `type`, the `operation` property can be:
-
-* If `type` is `openapi`, a combination of the function/service OpenAPI definition document URI and the particular service operation that needs to be invoked, separated by a '#'.
-  For example `https://petstore.swagger.io/v2/swagger.json#getPetById`.
-* If `type` is `rest`, an object definition of the [OpenAPI Paths Object](https://spec.openapis.org/oas/v3.1.0#paths-object).
-  For example, see [Using Functions for RESTful Service Invocations](#using-functions-for-rest-service-invocations).
-* If `type` is `asyncapi`, a combination of the AsyncApi definition document URI and the particular service operation that needs to be invoked, separated by a '#'.
-  For example `file://streetlightsapi.yaml#onLightMeasured`.
-* If `type` is `rpc`, a combination of the gRPC proto document URI and the particular service name and service method name that needs to be invoked, separated by a '#'.
-  For example `file://myuserservice.proto#UserService#ListUsers`.
-* If `type` is `graphql`, a combination of the GraphQL schema definition URI and the particular service name and service method name that needs to be invoked, separated by a '#'.
-  For example `file://myuserservice.graphql#UserService#ListUsers`.
-* If `type` is `odata`, a combination of the OData service URI and the particular Entity Set name that needs to be invoked, separated by a '#'.
-  For example `https://services.odata.org/V3/OData/OData.svc#Products`.
-* If `type` is `expression`, defines the expression syntax. Take a look at the [workflow expressions section](#Workflow-Expressions) for more information on this.
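For instance, reusing the gRPC operation format described above, a complete function definition could look like the following sketch (the proto file, service, and method names come from the example above and are illustrative only):

```yaml
functions:
- name: list-users-function
  type: rpc
  operation: file://myuserservice.proto#UserService#ListUsers
```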
-
-Defining custom function types is also possible; for more information, refer to the [Defining custom function types](#defining-custom-function-types) section.
-
-The `authRef` property references the name of a defined workflow [auth definition](#Auth-Definition).
-It is used to provide authentication info to access the resource defined in the `operation` property and/or to invoke the function.
-
-The [`metadata`](#Workflow-Metadata) property allows users to define custom information in function definitions.
-This allows you, for example, to define functions that describe command executions on a Docker image:
-
-```yaml
-functions:
-- name: whalesayimage
-  metadata:
-    image: docker/whalesay
-    command: cowsay
-```
-
-Note that using metadata for cases such as the above heavily reduces the portability of your workflow markup.
-
-Function definitions themselves do not define data input parameters. Parameters can be
-defined via the `parameters` property in [functionRef definitions](#FunctionRef-Definition) inside [actions](#Action-Definition).
-
-###### AuthRef Definition
-
-| Parameter | Description | Type | Required |
-| --- | --- | --- | --- |
-| resource | References an auth definition to be used to access the resource defined in the operation parameter | string | yes |
-| invocation | References an auth definition to be used to invoke the operation | string | no |
-
-The `authRef` property references the name of a defined workflow [auth definition](#Auth-Definition). It can be a string or an object.
-
-If it's a string, the referenced [auth definition](#Auth-Definition) is used solely for the function's invocation.
-
-If it's an object, it is possible to specify an [auth definition](#Auth-Definition) to use for the function's resource retrieval (as defined by the `operation` property) and another for its invocation.
-
-Example of a function definition configured to use an [auth definition](#Auth-Definition) called "My Basic Auth" upon invocation:
-
-```yaml
-functions:
-- name: secured-function-invocation
-  operation: https://test.com/swagger.json#HelloWorld
-  authRef: My Basic Auth
-```
-
-Example of a function definition configured to use an [auth definition](#Auth-Definition) called "My Basic Auth" to retrieve the resource defined by the `operation` property, and an [auth definition](#Auth-Definition) called "My OIDC Auth" upon invocation:
-
-```yaml
-functions:
-- name: secured-function-invocation
-  operation: https://test.com/swagger.json#HelloWorld
-  authRef:
-    resource: My Basic Auth
-    invocation: My OIDC Auth
-```
-
-Note that if multiple functions share the same `operation` path (*which is the first component of the operation value, located before the first '#' character*), and if one of them defines an [auth definition](#Auth-Definition) for resource access, then it should always be used to access said resource.
-In other words, when retrieving the resource of the function "secured-function-2" defined in the following example, the "My ApiKey Auth" [auth definition](#Auth-Definition) should be used, because "secured-function-1" defines it for resource access.
-This is done to avoid unnecessary repetition of [auth definition](#Auth-Definition) configuration when using the same resource for multiple defined functions.
-
-```yaml
-functions:
-  - name: secured-function-1
-    operation: https://secure.resources.com/myapi.json#helloWorld
-    authRef:
-      resource: My ApiKey Auth
-  - name: secured-function-2
-    operation: https://secure.resources.com/myapi.json#holaMundo
-```
-
-Note that if an [auth definition](#Auth-Definition) has been defined for an OpenAPI function whose resource declares its own authentication mechanism, the latter should be used instead, entirely ignoring the [auth definition](#Auth-Definition). 
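
For completeness, the auth definition names referenced above ("My Basic Auth", "My OIDC Auth") must exist in the workflow's top-level `auth` property. A minimal sketch, assuming the [auth definition](#Auth-Definition) schema described later in this document (all property values are placeholders):

```yaml
auth:
- name: My Basic Auth
  scheme: basic
  properties:
    username: admin
    password: "${ $SECRETS.basicPassword }"
- name: My OIDC Auth
  scheme: oauth2
  properties:
    authority: https://auth.example.com
    grantType: clientCredentials
    clientId: workflow-client
    clientSecret: "${ $SECRETS.clientSecret }"
```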
-
-##### Event Definition
-
-| Parameter | Description | Type | Required |
-| --- | --- | --- | --- |
-| name | Unique event name. Must follow the [Serverless Workflow Naming Convention](#naming-convention) | string | yes |
-| source | CloudEvent source. If not set when producing an event, runtimes are expected to use a default value, such as https://serverlessworkflow.io, in order to comply with the [CloudEvent spec constraints](https://github.com/cloudevents/spec/blob/v1.0.2/cloudevents/spec.md#source-1) | string | yes (if `type` is not defined) |
-| type | CloudEvent type | string | yes (if `source` is not defined) |
-| [correlation](#Correlation-Definition) | Define event correlation rules for this event. Only used for consumed events | array | no |
-| dataOnly | If `true` (default value), only the Event payload is accessible to consuming Workflow states. If `false`, both event payload and context attributes should be accessible | boolean | no |
-| [metadata](#Workflow-Metadata) | Metadata information | object | no |
-
Click to view example definition -

- - - - - - - - - - -
JSONYAML
- -```json -{ - "name": "applicant-info", - "type": "org.application.info", - "source": "applicationssource", - "correlation": [ - { - "contextAttributeName": "applicantId" - } - ] -} -``` - - - -```yaml -name: applicant-info -type: org.application.info -source: applicationssource -correlation: -- contextAttributeName: applicantId -``` - -
- -

-
-Used to define events and their correlations. These events can be consumed or produced during workflow execution,
-and can also be used to [trigger function/service invocations](#EventRef-Definition).
-
-The Serverless Workflow specification mandates that all events conform to the [CloudEvents](https://github.com/cloudevents/spec) specification.
-This ensures consistency and portability of the event format used.
-
-The `name` property defines a name for the event that is unique within the workflow definition. This event name can then be
-referenced within [function](#Function-Definition) and [state](#Workflow-States) definitions.
-
-The `source` property matches this event definition with the [source](https://github.com/cloudevents/spec/blob/main/cloudevents/spec.md#source-1)
-property of the CloudEvent required attributes.
-
-The `type` property matches this event definition with the [type](https://github.com/cloudevents/spec/blob/main/cloudevents/spec.md#type) property of the CloudEvent required attributes.
-
-Event correlation plays a big role in large event-driven applications. Correlating one or more events with a particular workflow instance
-can be done by defining the event correlation rules within the `correlation` property.
-This property is an array of [correlation](#Correlation-Definition) definitions.
-The CloudEvents specification allows users to add [Extension Context Attributes](https://github.com/cloudevents/spec/blob/main/cloudevents/spec.md#extension-context-attributes)
-and the correlation definitions can use these attributes to define clear matching event correlation rules.
-Extension context attributes are not part of the event payload, so they are serialized the same way as other standard required attributes.
-This means that the event payload does not have to be inspected by implementations in order to read and evaluate the defined correlation rules.
-
-Let's take a look at an example. 
Here we have two events that have an extension context attribute called "patientId" (as well as "department", which
-will be used in further examples below):
-
-```json
-{
-    "specversion" : "1.0",
-    "type" : "com.hospital.patient.heartRateMonitor",
-    "source" : "hospitalMonitorSystem",
-    "subject" : "HeartRateReading",
-    "id" : "A234-1234-1234",
-    "time" : "2020-01-05T17:31:00Z",
-    "patientId" : "PID-12345",
-    "department": "UrgentCare",
-    "data" : {
-      "value": "80bpm"
-    }
-}
-```
-
-and
-
-```json
-{
-    "specversion" : "1.0",
-    "type" : "com.hospital.patient.bloodPressureMonitor",
-    "source" : "hospitalMonitorSystem",
-    "subject" : "BloodPressureReading",
-    "id" : "B234-1234-1234",
-    "time" : "2020-02-05T17:31:00Z",
-    "patientId" : "PID-12345",
-    "department": "UrgentCare",
-    "data" : {
-      "value": "110/70"
-    }
-}
-```
-
-We can then define a correlation rule requiring that all consumed events from the "hospitalMonitorSystem" source and with the "com.hospital.patient.heartRateMonitor"
-type have the **same** value of the `patientId` attribute in order to be correlated to the created workflow instance:
-
-```json
-{
-"events": [
- {
-  "name": "heart-rate-reading-event",
-  "source": "hospitalMonitorSystem",
-  "type": "com.hospital.patient.heartRateMonitor",
-  "correlation": [
-    {
-      "contextAttributeName": "patientId"
-    }
-  ]
- }
-]
-}
-```
-
-If a workflow instance is created (e.g., via Event state) by consuming a "heart-rate-reading-event" event, all other consumed events
-from the defined source and with the defined type that have the same "patientId" as the event that triggered the workflow instance
-should then also be associated with the same instance.
-
-You can also correlate multiple events together. 
In the following example, we assume that the workflow consumes two different event types, -and we want to make sure that both are correlated, as in the above example, with the same "patientId": - -```json -{ -"events": [ - { - "name": "heart-rate-reading-event", - "source": "hospitalMonitorSystem", - "type": "com.hospital.patient.heartRateMonitor", - "correlation": [ - { - "contextAttributeName": "patientId" - } - ] - }, - { - "name": "blood-pressure-reading-event", - "source": "hospitalMonitorSystem", - "type": "com.hospital.patient.bloodPressureMonitor", - "correlation": [ - { - "contextAttributeName": "patientId" - } - ] - } -] -} -``` - -Event correlation can be based on equality (values of the defined "contextAttributeName" must be equal), but it can also be based -on comparing it to custom defined values (string, or expression). For example: - -```json -{ -"events": [ - { - "name": "heart-rate-reading-event", - "source": "hospitalMonitorSystem", - "type": "com.hospital.patient.heartRateMonitor", - "correlation": [ - { - "contextAttributeName": "patientId" - }, - { - "contextAttributeName": "department", - "contextAttributeValue" : "UrgentCare" - } - ] - } -] -} -``` - -In this example, we have two correlation rules defined: The first one is on the "patientId" CloudEvent context attribute, meaning again that -all consumed events from this source and type must have the same "patientId" to be considered. The second rule -says that these events must all have a context attribute named "department" with the value of "UrgentCare". - -This allows developers to write orchestration workflows that are specifically targeted to patients that are in the hospital urgent care unit, -for example. - -The `dataOnly` property deals with what Event data is accessible by the consuming Workflow states. -If its value is `true` (default value), only the Event payload is accessible to consuming Workflow states. 
-If `false`, both Event payload and context attributes should be accessible.
-
-##### Auth Definition
-
-Auth definitions can be used to define authentication information that should be applied to [function definitions](#Function-Definition).
-They can be used both for the retrieval of the function's resource (as defined by the `operation` property) and for the function's invocation.
-
-| Parameter | Description | Type | Required |
-| --- | --- | --- | --- |
-| name | Unique auth definition name. Must follow the [Serverless Workflow Naming Convention](#naming-convention) | string | yes |
-| scheme | Auth scheme, can be "basic", "bearer", or "oauth2". Default is "basic" | enum | no |
-| properties | Auth scheme properties. Can be one of ["Basic properties definition"](#basic-properties-definition), ["Bearer properties definition"](#bearer-properties-definition), or ["OAuth2 properties definition"](#oauth2-properties-definition) | object | yes |
-
-The `name` property defines the unique auth definition name.
-The `scheme` property defines the auth scheme to be used. Can be "basic", "bearer" or "oauth2".
-The `properties` property defines the auth scheme properties information.
-Can be one of ["Basic properties definition"](#basic-properties-definition), ["Bearer properties definition"](#bearer-properties-definition), or ["OAuth2 properties definition"](#oauth2-properties-definition)
-
-###### Basic Properties Definition
-
-See [here](https://developer.mozilla.org/en-US/docs/Web/HTTP/Authentication#basic_authentication_scheme) for more information about the Basic Authentication scheme.
-
-The Basic properties definition can have two types, either `string` or `object`. 
-If `string` type, it defines a [workflow expression](#workflow-expressions) that contains all needed Basic auth information.
-If `object` type, it has the following properties:
-
-| Parameter | Description | Type | Required |
-| --- | --- | --- | --- |
-| username | String or a workflow expression. Contains the user name | string | yes |
-| password | String or a workflow expression. Contains the user password | string | yes |
-| [metadata](#Workflow-Metadata) | Metadata information| object | no |
-
-###### Bearer Properties Definition
-
-See [here](https://datatracker.ietf.org/doc/html/rfc6750) for more information about the Bearer Authentication scheme.
-
-| Parameter | Description | Type | Required |
-| --- | --- | --- | --- |
-| token | String or a workflow expression. Contains the token information | string | yes |
-| [metadata](#Workflow-Metadata) | Metadata information| object | no |
-
-###### OAuth2 Properties Definition
-
-See [here](https://oauth.net/2/) for more information about the OAuth2 Authentication scheme.
-
-| Parameter | Description | Type | Required |
-| --- | --- | --- | --- |
-| authority | String or a workflow expression. Contains the authority information | string | no |
-| grantType | Defines the grant type. Can be "password", "clientCredentials", or "tokenExchange" | enum | yes |
-| clientId | String or a workflow expression. Contains the client identifier | string | yes |
-| clientSecret | Workflow secret or a workflow expression. Contains the client secret | string | no |
-| scopes | Array containing strings or workflow expressions. Contains the OAuth2 scopes | array | no |
-| username | String or a workflow expression. Contains the user name. Used only if grantType is 'password' | string | no |
-| password | String or a workflow expression. Contains the user password. Used only if grantType is 'password' | string | no |
-| audiences | Array containing strings or workflow expressions. 
Contains the OAuth2 audiences | array | no | -| subjectToken | String or a workflow expression. Contains the subject token | string | no | -| requestedSubject | String or a workflow expression. Contains the requested subject | string | no | -| requestedIssuer | String or a workflow expression. Contains the requested issuer | string | no | -| [metadata](#Workflow-Metadata) | Metadata information| object | no | - -##### Correlation Definition - -| Parameter | Description | Type | Required | -| --- | --- | --- | --- | -| contextAttributeName | CloudEvent Extension Context Attribute name | string | yes | -| contextAttributeValue | CloudEvent Extension Context Attribute value | string | no | - -
Click to view example definition -

- - - - - - - - - - -
JSONYAML
- -```json -{ - "correlation": [ - { - "contextAttributeName": "patientId" - }, - { - "contextAttributeName": "department", - "contextAttributeValue" : "UrgentCare" - } - ] -} -``` - - - -```yaml -correlation: -- contextAttributeName: patientId -- contextAttributeName: department - contextAttributeValue: UrgentCare -``` - -
- -

- -Used to define event correlation rules. - -The `contextAttributeName` property defines the name of the CloudEvent [extension context attribute](https://github.com/cloudevents/spec/blob/main/cloudevents/spec.md#extension-context-attributes). -The `contextAttributeValue` property defines the value of the defined CloudEvent [extension context attribute](https://github.com/cloudevents/spec/blob/main/cloudevents/spec.md#extension-context-attributes). - -##### OnEvents Definition - -| Parameter | Description | Type | Required | -| --- | --- | --- | --- | -| eventRefs | References one or more unique event names in the defined workflow [events](#Event-Definition) | array | yes | -| actionMode | Specifies how actions are to be performed (in sequence or in parallel). Default is `sequential` | enum | no | -| [actions](#Action-Definition) | Actions to be performed | array | no | -| [eventDataFilter](#Event-data-filters) | Event data filter definition | object | no | - -
Click to view example definition -

- - - - - - - - - - -
JSONYAML
- -```json -{ - "eventRefs": ["high-body-temperature"], - "actions": [{ - "functionRef": { - "refName": "send-tylenol-order", - "arguments": { - "patientid": "${ .patientId }" - } - } - }] -} -``` - - - -```yaml -eventRefs: -- high-body-temperature -actions: -- functionRef: - refName: send-tylenol-order - arguments: - patientid: "${ .patientId }" -``` - -
- -

-
-The onEvents definition allows you to define which [actions](#Action-Definition) are to be performed
-for one or more [event definitions](#Event-Definition) referenced in the `eventRefs` array.
-Note that the values of the `eventRefs` array must be unique.
-
-The `actionMode` property defines if the defined actions need to be performed sequentially or in parallel.
-
-The `actions` property defines a list of actions to be performed.
-
-When specifying the `onEvents` definition it is important to consider the Event state's `exclusive` property,
-because it determines how `onEvents` is interpreted.
-Let's look at the following JSON definition of `onEvents` to show this:
-
-```json
-{
-  "onEvents": [{
-    "eventRefs": ["high-body-temperature", "high-blood-pressure"],
-    "actions": [{
-      "functionRef": {
-        "refName": "send-tylenol-order",
-        "arguments": {
-          "patient": "${ .patientId }"
-        }
-      }
-    },
-    {
-      "functionRef": {
-        "refName": "call-nurse",
-        "arguments": {
-          "patient": "${ .patientId }"
-        }
-      }
-    }
-    ]
-  }]
-}
-```
-
-Depending on the value of the Event state's `exclusive` property, this definition can mean two different things:
-
-1. If `exclusive` is set to `true`, the consumption of **either** the `high-body-temperature` or `high-blood-pressure` event will trigger action execution.
-
-2. If `exclusive` is set to `false`, the consumption of **both** the `high-body-temperature` and `high-blood-pressure` events will trigger action execution.
-
-This is visualized in the diagram below:
-

-Event onEvents example -

-
-##### Action Definition
-
-| Parameter | Description | Type | Required |
-| --- | --- | --- | --- |
-| name | Unique Action name. Must follow the [Serverless Workflow Naming Convention](#naming-convention) | string | yes |
-| [functionRef](#FunctionRef-Definition) | References a reusable function definition | object or string | yes (if `eventRef` & `subFlowRef` are not defined) |
-| [eventRef](#EventRef-Definition) | References reusable `produce` and `consume` event definitions | object | yes (if `functionRef` & `subFlowRef` are not defined) |
-| [subFlowRef](#SubFlowRef-Definition) | References a workflow to be invoked | object or string | yes (if `eventRef` & `functionRef` are not defined) |
-| onErrors | Defines the error handling policy to use | string or array of [error handler references](#error-handler-reference) | no |
-| [actionDataFilter](#Action-data-filters) | Action data filter definition | object | no |
-| sleep | Defines time periods workflow execution should sleep before / after function execution | object | no |
-| [condition](#Workflow-Expressions) | Expression, if defined, must evaluate to `true` for this action to be performed. If `false`, action is disregarded | string | no |
-
Click to view example definition -

- - - - - - - - - - -
JSONYAML
- -```json -{ - "name": "finalize-application-action", - "functionRef": { - "refName": "finalize-application-function", - "arguments": { - "applicantid": "${ .applicantId }" - } - } -} -``` - - - -```yaml -name: finalize-application-action -functionRef: - refName: finalize-application-function - arguments: - applicantid: "${ .applicantId }" -``` - -
- -

-
-An action can:
-
-* Reference a [function definition](#Function-Definition) by its unique name using the `functionRef` property.
-* Publish an [event](#Event-Definition) via the `publish` property.
-* Subscribe to an [event](#Event-Definition) via the `subscribe` property.
-* Reference a sub-workflow invocation via the `subFlowRef` property.
-
-Note that `functionRef`, `publish`, `subscribe` and `subFlowRef` are mutually exclusive, meaning that only one of them can be
-specified in a single action definition.
-
-The `name` property specifies the action name.
-
-In event-based scenarios, a service (or a set of services) we want to invoke may not be exposed via a specific resource URI, but can only be invoked via an event.
-In that case, an event definition might be referenced via its `publish` property.
-Also, if there is a need to consume an event within a set of actions (for example, to wait for the result of a previous action invocation), an event definition might be referenced via its `subscribe` property.
-
-The `sleep` property can be used to define time periods that workflow execution should sleep
-before and/or after function execution. It can have two properties:
-* `before` - defines the amount of time (literal ISO 8601 duration format or an expression which evaluates to an ISO 8601 duration) to sleep before function invocation.
-* `after` - defines the amount of time (literal ISO 8601 duration format or an expression which evaluates to an ISO 8601 duration) to sleep after function invocation.
-
-Function invocation timeouts should be handled via the state's [timeouts](#Workflow-Timeouts) definition.
-
-The `onErrors` property can be used to define the error handling policy to use. If a string, it references the [error policy definition](#error-policy-definition) to use. Otherwise, it defines an array of the [error handlers](#error-handler-reference) to use. 
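
Putting these properties together, a single action that sleeps before invocation, is guarded by a condition, and delegates errors to a named policy could be sketched as follows (the function name and policy name are hypothetical):

```yaml
actions:
- name: notify-applicant-action
  condition: "${ .applicant.email != null }"
  sleep:
    before: PT5S
  functionRef:
    refName: send-notification-function
    arguments:
      email: "${ .applicant.email }"
  onErrors: fault-tolerance-policy
```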
- -The `condition` property is a [workflow expression](#Workflow-Expressions). If defined, it must evaluate to `true` -for this action to be performed. If it evaluates to `false` the action is skipped. -If the `condition` property is not defined, the action is always performed. - -##### Subflow Action - -Often you want to group your workflows into small logical units that solve a particular business problem and can be reused in -multiple other workflow definitions. - -

-Referencing reusable workflow via SubFlow actions -

- -Reusable workflows are referenced by their `name` property via the SubFlow action `workflowId` parameter. - -For the simple case, `subFlowRef` can be a string containing the `name` of the sub-workflow to invoke. -If you want to specify other parameters then a [subFlowRef](#SubFlowRef-Definition) should be provided instead. - -Each referenced workflow receives the SubFlow actions data as workflow data input. - -Referenced sub-workflows must declare their own [function](#Function-Definition) and [event](#Event-Definition) definitions. - -##### FunctionRef Definition - -`FunctionRef` definition can have two types, either `string` or `object`. -If `string` type, it defines the name of the referenced [function](#Function-Definition). -This can be used as a short-cut definition when you don't need to define any other parameters, for example: - -```json -"functionRef": "my-function" -``` - -Note that if used with `string` type, the invocation of the function is synchronous. - -If you need to define parameters in your `functionRef` definition, you can define -it with its `object` type which has the following properties: - -| Parameter | Description | Type | Required | -| --- | --- | --- | --- | -| refName | Name of the referenced [function](#Function-Definition). Must follow the [Serverless Workflow Naming Convention](#naming-convention) | string | yes | -| arguments | Arguments (inputs) to be passed to the referenced function | object | yes (if function type is `graphql`, otherwise no) | -| selectionSet | Used if function type is `graphql`. String containing a valid GraphQL [selection set](https://spec.graphql.org/June2018/#sec-Selection-Sets) | string | yes (if function type is `graphql`, otherwise no) | -| invoke | Specifies if the function should be invoked `sync` or `async`. Default is `sync` | enum | no | - -
Click to view example definition -

- - - - - - - - - - -
JSONYAML
- -```json -{ - "refName": "finalize-application-function", - "arguments": { - "applicantid": "${ .applicantId }" - } -} -``` - - - -```yaml -refName: finalize-application-function -arguments: - applicantid: "${ .applicantId }" -``` - -
- -

-
-The `refName` property is the name of the referenced [function](#Function-Definition).
-
-The `arguments` property defines the arguments that are to be passed to the referenced function.
-Here is an example of using the `arguments` property:
-
-```json
-{
-  "refName": "check-funds-available",
-  "arguments": {
-    "account": {
-      "id": "${ .accountId }"
-    },
-    "forAmount": "${ .payment.amount }",
-    "insufficientMessage": "The requested amount is not available."
-  }
-}
-```
-
-The `invoke` property defines how the function is invoked (sync or async). The default value of this property is
-`sync`, meaning that workflow execution should wait until the function completes.
-If set to `async`, workflow execution should just invoke the function and should not wait until its completion.
-Note that in this case the action does not produce any results, and the associated action's actionDataFilter, as well as
-its retry definition, if defined, should be ignored.
-In addition, functions that are invoked async do not propagate their errors to the associated action definition and the
-workflow state, meaning that any errors that happen during their execution cannot be handled in the workflow state's
-onErrors definition. Note that errors raised during functions that are invoked async should not fail workflow execution.
-
-##### Publish Definition
-
-Publish an event.
-
-| Parameter | Description | Type | Required |
-| --- | --- | --- | --- |
-| [event](#Event-Definition) | Reference to the unique name of an event definition. Must follow the [Serverless Workflow Naming Convention](#naming-convention) | string | yes |
-| data | If string type, an expression which selects parts of the state's data output to become the data (payload) of the event referenced by `publish`. If object type, a custom object to become the data (payload) of the event referenced by `publish`. 
| string or object | yes | -| contextAttributes | Add additional event extension context attributes to the trigger/produced event | object | no | - -
Click to view example definition -

- - - - - - - - - - -
JSONYAML
-
-```json
-{
-  "publish": {
-     "event": "make-vet-appointment",
-     "data": "${ .patientInfo }"
-  }
-}
-```
-
-
-
-```yaml
-publish:
-  event: make-vet-appointment
-  data: "${ .patientInfo }"
-```
-
- -

-
-Publishes an [event definition](#Event-Definition) referenced via the `event` property.
-
-The `data` property can have two types: string or object. If it is of string type, it is an expression that can select parts of state data
-to be used as the payload of the event referenced by `publish`. If it is of object type, you can define a custom object to be the event payload.
-
-The `contextAttributes` property allows you to add one or more [extension context attributes](https://github.com/cloudevents/spec/blob/main/cloudevents/spec.md#extension-context-attributes)
-to the trigger/produced event.
-
-##### Subscribe Definition
-
-Wait for an event to arrive.
-
-| Parameter | Description | Type | Required |
-| --- | --- | --- | --- |
-| [event](#Event-Definition) | Reference to the unique name of an event definition. Must follow the [Serverless Workflow Naming Convention](#naming-convention) | string | yes |
-| timeout | Maximum amount of time (ISO 8601 format literal or expression) to wait for the consumed event. If not defined, it should be set to the [actionExecutionTimeout](#Workflow-Timeout-Definition) | string | no |
-
Click to view example definition -

- - - - - - - - - - -
JSONYAML
-
-```json
-{
-  "subscribe": {
-    "event": "approved-appointment"
-  }
-}
-```
-
-
-
-```yaml
-subscribe:
-  event: approved-appointment
-```
-
- -

-
-Consumes an [event definition](#Event-Definition) referenced via the `event` property.
-
-The `timeout` property defines the maximum amount of time (ISO 8601 format literal or expression) to wait for the result event. If not defined, it should default to the [actionExecutionTimeout](#Workflow-Timeout-Definition).
-If the event defined by the `event` property is not received in that set time, action invocation should raise an error that can be handled in the state's `onErrors` definition.
-
-##### SubFlowRef Definition
-
-`SubFlowRef` definition can have two types, namely `string` or `object`.
-
-If `string` type, it defines the unique name of the sub-workflow to be invoked.
-This short-hand definition can be used if sub-workflow lookup is done only by its `name`
-property and not its `version` property.
-
-```json
-"subFlowRef": "my-subflow-id"
-```
-
-If you need to define the `version` property, you can use its `object` type:
-
-| Parameter | Description | Type | Required |
-| --- | --- | --- | --- |
-| workflowId | Sub-workflow unique name | string | yes |
-| version | Sub-workflow version | string | no |
-| invoke | Specifies if the subflow should be invoked `sync` or `async`. Default is `sync` | enum | no |
-| onParentComplete | If `invoke` is `async`, specifies if subflow execution should `terminate` or `continue` when parent workflow completes. Default is `terminate` | enum | no |
-
Click to view example definition -

- - - - - - - - - - -
JSONYAML
- -```json -{ - "workflowId": "handle-approved-visa", - "version": "2.0.0" -} -``` - - - -```yaml -workflowId: handle-approved-visa -version: '2.0.0' -``` - -
- -

-
-The `workflowId` property defines the unique name of the sub-workflow to be invoked.
-The workflowId should not be the same as the name of the workflow in which the action is defined. Otherwise, undesired recursive calls to the same workflow may occur.
-
-The `version` property defines the unique version of the sub-workflow to be invoked.
-If this property is defined, runtimes should match both the `workflowId` and the `version` properties
-defined in the sub-workflow definition.
-
-The `invoke` property defines how the subflow is invoked (sync or async). The default value of this property is
-`sync`, meaning that workflow execution should wait until the subflow completes.
-If set to `async`, workflow execution should just invoke the subflow and not wait for its results.
-Note that in this case the action does not produce any results, and the associated action's actionDataFilter, as well as
-its retry definition, if defined, should be ignored.
-Subflows that are invoked async do not propagate their errors to the associated action definition and the
-workflow state, meaning that any errors that happen during their execution cannot be handled in the workflow state's
-onErrors definition. Note that errors raised during subflows that are invoked async
-should not fail workflow execution.
-
-The `onParentComplete` property defines how subflow execution that is invoked async should behave if the parent workflow
-completes execution before the subflow completes its own execution.
-The default value of this property is `terminate`, meaning that if the parent workflow (the workflow that invoked the subflow)
-completes, execution of the subflow should be terminated.
-If it is set to `continue`, the subflow execution is allowed to continue when the parent workflow completes. 
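
As an illustration, an action that invokes a sub-workflow asynchronously and allows it to outlive its parent could be sketched as follows (the action and workflow names are hypothetical):

```yaml
actions:
- name: archive-application-action
  subFlowRef:
    workflowId: archive-application
    version: '1.0.0'
    invoke: async
    onParentComplete: continue
```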
-
-##### Error Handling Configuration
-
-| Parameter | Description | Type | Required |
-| --- | --- | --- | --- |
-| definitions | An array containing reusable definitions of errors to throw and/or to handle. | array of [error definitions](#error-definition) | no |
-| handlers | An array containing reusable error handlers, which are used to configure what to do when catching specific errors. | array of [error handler definitions](#error-handler-definition) | no |
-| policies | An array containing named groups of error handlers that define reusable error policies | array of [error handling policies](#error-policy-definition) | no |
-
Click to view example definition -

- - - - - - - - - - -
JSONYAML
- -```json -{ - "retries": [ - { - "name": "retry-five-times", - "maxAttempts": 5 - } - ], - "errors": { - "definitions": [ - { - "name": "service-not-available-error", - "type": "https://serverlessworkflow.io/spec/errors/communication", - "status": 503, - "title": "Service Not Available", - "detail": "Failed to contact service, even after multiple retries" - } - ], - "handlers": [ - { - "name": "handle-503", - "when": [ - { - "status": 503 - } - ], - "retry": "retry-five-times", - "then": { - "throw": { - "refName": "service-not-available-error" - } - } - } - ], - "policies": [ - { - "name": "fault-tolerance-policy", - "handlers": [ - { - "refName": "handle-503" - } - ] - } - ] - } -} -``` - - - -```yaml -retries: - - name: retry-five-times - maxAttempts: 5 -errors: - definitions: - - name: service-not-available-error - type: https://serverlessworkflow.io/spec/errors/communication - status: 503 - title: Service Not Available - detail: Failed to contact service, even after multiple retries - handlers: - - name: handle-503 - when: - - status: 503 - retry: retry-five-times - then: - throw: - refName: service-not-available-error - policies: - - name: fault-tolerance-policy - handlers: - - refName: handle-503 -``` - -
- -

- -Represents the workflow's error handling configuration, including error definitions, error handlers and error policies. - -##### Error Definition - -| Parameter | Description | Type | Required | -| --- | --- | --- | --- | -| name | The name of the error. Must follow the [Serverless Workflow Naming Convention](#naming-convention) | string | yes | -| instance | An [RFC 6901 JSON pointer](https://datatracker.ietf.org/doc/html/rfc6901) that precisely identifies the component within a workflow definition (ex: funcRef, subflowRef, ...) from which the described error originates. | string | yes, but is added by runtime when throwing an error | -| type | An [RFC 3986](https://datatracker.ietf.org/doc/html/rfc3986) URI reference that identifies the error type. The [RFC 7807 Problem Details specification](https://datatracker.ietf.org/doc/html/rfc7807) encourages that, when dereferenced, it provides human-readable documentation for the error type (e.g., using HTML). The specification strongly recommends using [default error types](#error-types) for cross-compatibility concerns. | string | yes | -| status | The status code generated by the origin for the occurrence of an error. Status codes are extensible by nature and runtimes are not required to understand the meaning of all defined status codes. However, for cross-compatibility concerns, the specification encourages using [RFC 7231 HTTP Status Codes](https://datatracker.ietf.org/doc/html/rfc7231). | string | yes | -| title | A short, human-readable summary of an error type. It SHOULD NOT change from occurrence to occurrence of an error, except for purposes of localization. | string | no | -| detail | A human-readable explanation specific to the occurrence of an error. | string | no | - -
Click to view example definition -

- - - - - - - - - - -
JSONYAML
- -```json -{ - "instance": "/states/0/actions/0", - "type": "https://example.com/errors#timeout", - "status": 504, - "title": "Function Timeout", - "detail": "The function 'my-function' timed out." -} -``` - - - -```yaml -instance: "/states/0/actions/0" -type: "https://example.com/errors#timeout" -status: 504 -title: "Function Timeout" -detail: "The function 'my-function' timed out." -``` - -
- -

-
-Error definitions are [RFC 7807](https://datatracker.ietf.org/doc/html/rfc7807) compliant descriptions of errors that are produced by, or originate from, the execution of a workflow. Runtimes use them to describe workflow-related errors in a user-friendly, technology-agnostic, and cross-platform way.
-
-Property `instance` identifies the component within a workflow definition from which the described error originates. It is set by runtimes when throwing an error.
-
-For example, in the above definition, the source `/states/0/actions/0` indicates that the error originates from the execution of the first action of the first state of the workflow definition.
-This helps both users and implementers to describe and communicate the origins of errors without technical, technology/platform-specific knowledge or understanding.
-
-Property `type` is a URI used to identify the type of the error.
-**For cross-compatibility concerns, the specification strongly encourages using the [default types](#default-error-types).**
-
-Property `status` identifies the error's status code.
-**For cross-compatibility concerns, the specification strongly encourages using [HTTP Status Codes](https://datatracker.ietf.org/doc/html/rfc7231#section-6.1).**
-
-Properties `title` and `detail` are used to provide additional information about the error.
-
-Note that an error definition should **NOT** carry any implementation-specific information such as stack traces or code references: its purpose is to provide users with a consistent, human-readable description of an error.
-
-##### Error Types
-
-| Type | Status | Description |
-| --- | --- | --- |
-| [https://serverlessworkflow.io/spec/errors/configuration](#) | 400 | Errors resulting from incorrect or invalid configuration settings, such as missing or misconfigured environment variables, incorrect parameter values, or configuration file errors. 
| -| [https://serverlessworkflow.io/spec/errors/validation](#) | 400 | Errors arising from validation processes, such as validation of input data, schema validation failures, or validation constraints not being met. These errors indicate that the provided data or configuration does not adhere to the expected format or requirements specified by the workflow. | -| [https://serverlessworkflow.io/spec/errors/expression](#) | 400 | Errors occurring during the evaluation of runtime expressions, such as invalid syntax or unsupported operations. | -| [https://serverlessworkflow.io/spec/errors/authentication](#) | 401 | Errors related to authentication failures. | -| [https://serverlessworkflow.io/spec/errors/authorization](#) | 403 | Errors related to unauthorized access attempts or insufficient permissions to perform certain actions within the workflow. | -| [https://serverlessworkflow.io/spec/errors/timeout](#) | 408 | Errors caused by timeouts during the execution of tasks or during interactions with external services. | -| [https://serverlessworkflow.io/spec/errors/communication](#) | 500 | Errors encountered while communicating with external services, including network errors, service unavailable, or invalid responses. | -| [https://serverlessworkflow.io/spec/errors/runtime](#) | 500 | Errors occurring during the runtime execution of a workflow, including unexpected exceptions, errors related to resource allocation, or failures in handling workflow tasks. These errors typically occur during the actual execution of workflow components and may require runtime-specific handling and resolution strategies. | - -The specification promotes the use of default error types by runtimes and workflow authors for describing thrown [errors](#error-definition). This approach ensures consistent identification, handling, and behavior across various platforms and implementations. 
- -##### Error Reference - -| Parameter | Description | Type | Required | -| --- | --- | --- | --- | -| refName | The name of the error definition to reference. If set, all other properties are ignored. | string | no | -| instance | An RFC6901 JSON pointer that precisely identifies the component within a workflow definition from which the error to reference originates | string | no | -| type | A RFC3986 URI reference that identifies the type of error(s) to reference | string | no | -| status | The status code of the error(s) to reference | string | no | - -
Click to view example definition -

- - - - - - - - - - -
JSONYAML
- -```json -{ - "type": "https://example.com/errors#timeout", - "status": 504 -} -``` - - - -```yaml -type: "https://example.com/errors#timeout" -status: 504 -``` - -
- -

- -An Error Reference in a Serverless Workflow provides a means to point to specific error instances or types within the workflow definition. It serves as a convenient way to refer to errors without duplicating their definitions. - -If multiple properties are set, they are considered cumulative conditions to match an error. - -For example, the above definition is the same as saying "match errors with `type` 'https://example.com/errors#timeout' AND with `status` '504'". - -##### Error Handler Definition - -| Parameter | Description | Type | Required | -| --- | --- | --- | --- | -| name | The unique name which is used to reference the defined handler. | string | yes | -| when | References the errors to handle. If null or empty, and if `exceptWhen` is null or empty, all errors are caught. | array of [error references](#error-reference) | no | -| exceptWhen | References the errors not to handle. If null or empty, and if `when` is null or empty, all errors are caught. | array of [error references](#error-reference) | no | -| retry | The retry policy to use, if any. If a string, references an existing [retry definition](#retry-definition). | string or [retry definition](#retry-definition) | no | -| then | Defines the outcome, if any, when handling errors | [error outcome definition](#error-outcome-definition) | no | - -
Click to view example definition -

- - - - - - - - - - -
JSONYAML
- -```json -{ - "errors": { - "handlers": [ - { - "name": "handle-invalid-error", - "when": [ - { "error": "invalid" }, - { "status": 404 }, - { "status": 403 } - ], - "then": { - "transition": "my-state" - } - }, - { - "name": "handle-timeout-error", - "when": [ - { "status": 503 } - ], - "retry": "my-retry-policy", - "then": { - "transition": "my-state" - } - } - ] - } -} -``` - - - -```yaml -errors: - handlers: - - name: 'handle-invalid-error' - when: - - type: 'invalid' - - status: 404 - - status: 403 - then: - transition: 'my-state' - - name: 'handle-timeout-error' - when: - - status: 503 - retry: 'my-retry-policy' - then: - transition: 'my-state' -``` - -
- -

-
-Error handler definitions specify which errors to handle and how they should be handled within a workflow.
-
-The `name` property specifies the unique identifier used to reference the error handler.
-
-The `when` property defines the specific errors to handle, allowing a handler to target only particular errors.
-
-The `exceptWhen` property defines the errors NOT to handle, allowing a handler to catch all errors except the specified ones.
-
-The `retry` property serves to either reference an existing retry policy or define a new one to be employed when handling the specified errors within the workflow. If a retry policy is designated, the component identified as the [error source](#error-source) will undergo retries according to the guidelines outlined in the associated [policy](#retry-definition). If a retry attempt is successful, the workflow seamlessly proceeds as though the error had not occurred. However, if the maximum number of configured retry attempts is exhausted without success, the workflow proceeds to execute the error outcome stipulated by the `then` property.
-
-The `then` property defines caught error outcomes, if any. If not defined, caught errors are considered handled, and the execution of the workflow continues as if the error never occurred. Handled errors that are not [rethrown](#error-outcome-definition) do NOT [bubble up](#error-bubbling).
-
-For more information, see the [Workflow Error Handling](#Workflow-Error-Handling) section.
-
-##### Error Handler Reference
-
-| Parameter | Description | Type | Required |
-| --- | --- | --- | --- |
-| refName | The name of the error handler definition to reference. If set, all other properties are ignored. | string | no |
-| when | References the errors to handle. If null or empty, and if `exceptWhen` is null or empty, all errors are caught. | array of [error references](#error-reference) | no |
-| exceptWhen | References the errors not to handle. 
If null or empty, and if `when` is null or empty, all errors are caught. | array of [error references](#error-reference) | no | -| retry | The retry policy to use, if any. If a string, references an existing [retry definition](#retry-definition). | string or [retry definition](#retry-definition) | no | -| then | Defines the outcome, if any, when handling errors | [outcome definition](#error-outcome-definition) | no | - - -
Click to view example definition -

- - - - - - - - - - -
JSONYAML
- -```json -{ - "errors": { - "policies": [ - { - "name": "my-retry-policy", - "handlers": [ - { - "refName": "handle-timeout-error" - }, - { - "when": [ - { "status": 503 } - ], - "retry": "my-retry-policy" - } - ] - } - ] - } -} -``` - - - -```yaml -errors: - policies: - - name: 'my-retry-policy' - handlers: - - refName: 'handle-timeout-error' - - when: - - status: 503 - retry: 'my-retry-policy' -``` - -
- -

- -Error Handler References streamline the error handling process by enabling workflows to leverage established error handling logic. - -By referencing pre-defined error handler definitions, workflows can ensure consistency and reusability of error handling strategies, promoting maintainability and clarity within the workflow definition. - -##### Error Policy Definition - -| Parameter | Description | Type | Required | -| --- | --- | --- | --- | -| name | The name of the error handler | string | yes | -| handlers | A list of the error handlers to use | array of [error handler references](#error-handler-reference) | yes | - -
Click to view example definition -

- - - - - - - - - - -
JSONYAML
- -```json -{ - "errors": { - "policies": [ - { - "name": "my-retry-policy", - "handlers": [ - { - "refName": "handle-timeout-error" - }, - { - "when": [ - { "status": 503 } - ], - "retry": "my-retry-policy" - } - ] - } - ] - } -} -``` - - - -```yaml -errors: - policies: - - name: 'my-retry-policy' - handlers: - - refName: 'handle-timeout-error' - - when: - - status: 503 - retry: 'my-retry-policy' -``` - -
- -

- -Error Policy Definition in a Serverless Workflow specifies a named collection of error handlers to be applied for error handling within the workflow. They are used to streamline error handling by organizing and grouping error handlers into reusable sets. - -By defining error policies, workflows can easily apply consistent error handling strategies across multiple components or states within the workflow, promoting modularity and maintainability of the workflow definition. - -##### Error Outcome Definition - -| Parameter | Description | Type | Required | -| --- | --- | --- | --- | -| end | If `true`, ends the workflow. | boolean or [end definition](#end-definition) | yes if `transition` and `throw` are null, otherwise no. | -| transition | Indicates that the workflow should transition to the specified state when the error is handled. All potential other activities are terminated. | string or [transition](#transition-definition). | yes if `end` and `throw` are null, otherwise no. | -| throw | Indicates that the handled error should be rethrown. If true, the error is re-thrown as is. Otherwise, configures the error to throw, potentially using runtime expressions. | boolean or [error throw definition](#error-throw-definition). | yes if `end` and `transition` are null, otherwise no. | - -
Click to view example definition -

- - - - - - - - - - -
JSONYAML
- -```json -{ - "when": [ - { - "status": 503 - } - ], - "then": { - "transition": "my-state" - } -} -``` - - - -```yaml -when: - - status: 503 -then: - transition: 'my-state' -``` - -
- -

- -Error Outcome Definitions provide a flexible mechanism for defining the behavior of the workflow after handling errors. - -By specifying actions such as compensation, ending the workflow, retrying failed actions, transitioning to specific states, or rethrowing errors, Error Outcome Definitions enable precise error handling strategies tailored to the workflow's requirements. - -##### Error Throw Definition - -| Parameter | Description | Type | Required | -| --- | --- | --- | --- | -| refName | The name of the [error definition](#error-definition) to throw. If set, all other properties are ignored. | string | yes, if no other property has been set, otherwise no. | -| type | The URI reference that identifies the type of error to throw. Supports runtime expressions. | string | yes if `name` has not been set, otherwise no. | -| status | The status code generated by the origin for an occurrence of a problem. Supports runtime expressions. | integer or string | yes if `name` has not been set, otherwise no. | -| title | A short, human-readable summary of a problem type. Supports runtime expressions. | string | no | -| detail | A human-readable explanation specific to an occurrence of a problem. Supports runtime expressions. | string | no | - -
Click to view example definition -

- - - - - - - - - - -
JSONYAML
- -```json -{ - "throw": { - "type": "https://serverlessworkflow.io/spec/errors/runtime", - "status": 400, - "detail": "${ $CONST.localizedErrorDetail }" - } -} -``` - - - -```yaml -throw: - type: https://serverlessworkflow.io/spec/errors/runtime - status: 400 - detail: ${ $CONST.localizedErrorDetail } -``` - -
- -

- -Error Throw Definitions provide a mechanism for throwing custom errors within the workflow. - -By specifying the error to be thrown and optionally providing a runtime expression, Error Throw Definitions enable workflows to generate and throw errors dynamically, enhancing flexibility and adaptability in error handling strategies. - -##### Retry Definition - -| Parameter | Description | Type | Required | -| --- | --- | --- | --- | -| name | Unique retry strategy name | string | yes | -| delay | Time delay between retry attempts (literal ISO 8601 duration format or expression which evaluation results in an ISO 8601 duration) | string | no | -| maxAttempts | Maximum number of retry attempts. Value of 1 means no retries are performed | string or number | yes | -| maxDelay | Maximum amount of delay between retry attempts (literal ISO 8601 duration format or expression which evaluation results in an ISO 8601 duration) | string | no | -| increment | Static duration which will be added to the delay between successive retries (literal ISO 8601 duration format or expression which evaluation results in an ISO 8601 duration) | string | no | -| multiplier | Float value by which the delay is multiplied before each attempt. For example: "1.2" meaning that each successive delay is 20% longer than the previous delay. For example, if delay is 'PT10S', then the delay between the first and second attempts will be 10 seconds, and the delay before the third attempt will be 12 seconds. | float or string | no | -| jitter | If float type, maximum amount of random time added or subtracted from the delay between each retry relative to total delay (between 0.0 and 1.0). If string type, absolute maximum amount of random time added or subtracted from the delay between each retry (literal ISO 8601 duration format or expression which evaluation results in an ISO 8601 duration) | float or string | no | - -
Click to view example definition -

- - - - - - - - - - -
JSONYAML
- -```json -{ - "name": "timeout-retry-strat", - "delay": "PT2M", - "maxAttempts": 3, - "jitter": "PT0.001S" -} -``` - - - -```yaml -name: timeout-retry-strat -delay: PT2M -maxAttempts: 3 -jitter: PT0.001S -``` - -
- -

-
-Defines the state's retry policy (strategy). This is an explicit definition and can be reused across multiple
-defined state [actions](#Action-Definition).
-
-The `name` property specifies the unique name of the retry definition (strategy). This unique name
-can be referred to by workflow state [error definitions](#Error-Definition).
-
-The `delay` property specifies the initial time delay between retry attempts (literal ISO 8601 duration format or expression whose evaluation results in an ISO 8601 duration).
-
-The `increment` property specifies a duration (literal ISO 8601 duration format or expression whose evaluation results in an ISO 8601 duration) which will be added to the delay between successive retries.
-To explain this better, let's say we have the following retry definition:
-
-```json
-{
-  "name": "timeout-errors-strategy",
-  "delay": "PT10S",
-  "increment": "PT2S",
-  "maxAttempts": 4
-}
-```
-
-which means that we will retry up to 4 times after waiting with increasing delay between attempts;
-in this example 10, 12, 14, and 16 seconds between retries.
-
-The `multiplier` property specifies the value by which the delay is multiplied before each of the retry attempts.
-To explain this better, let's say we have the following retry definition:
-
-```json
-{
-  "name": "timeout-errors-strategy",
-  "delay": "PT10S",
-  "multiplier": 2,
-  "maxAttempts": 4
-}
-```
-
-which means that we will retry up to 4 times after waiting with increasing delay between attempts;
-in this example 10, 20, 40, and 80 seconds between retries.
-
-If both the `increment` and `multiplier` properties are defined, `increment` should be applied first and then
-the `multiplier` when determining the next retry time.
-
-The `maxAttempts` property determines the maximum number of retry attempts allowed and is a positive integer value.
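The interplay of `delay`, `increment`, and `multiplier` described above can be sketched as follows. Durations are shown as plain seconds for readability; a real implementation would parse ISO 8601 durations such as `PT10S`, and the function name is illustrative:

```python
def retry_delays(delay_s, max_attempts, increment_s=0, multiplier=1):
    """Illustrative sketch of the retry delay sequence, in seconds."""
    delays, current = [], delay_s
    for _ in range(max_attempts):
        delays.append(current)
        # `increment` is applied first, then `multiplier`.
        current = (current + increment_s) * multiplier
    return delays

print(retry_delays(10, 4, increment_s=2))  # → [10, 12, 14, 16]
print(retry_delays(10, 4, multiplier=2))   # → [10, 20, 40, 80]
```

The two calls reproduce the increment-only and multiplier-only examples above.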
- -The `jitter` property is important to prevent certain scenarios where clients -are retrying in sync, possibly causing or contributing to a transient failure -precisely because they're retrying at the same time. Adding a typically small, -bounded random amount of time to the period between retries serves the purpose -of attempting to prevent these retries from happening simultaneously, possibly -reducing total time to complete requests and overall congestion. How this value -is used in the exponential backoff algorithm is left up to implementations. - -`jitter` may be specified as a percentage relative to the total delay. -Once the next retry attempt delay is calculated, we can apply `jitter` as a percentage value relative to this -calculated delay. For example, if your calculated delay for the next retry is six seconds, and we specify -a `jitter` value of 0.3, a random amount of time between 0 and 1.8 (0.3 times 6) is to be added or subtracted -from the calculated delay. - -Alternatively, `jitter` may be defined as an absolute value specified as an ISO -8601 duration (literal or expression). This way, the maximum amount of random time added is fixed and -will not increase as new attempts are made. - -The `maxDelay` property determines the maximum amount of delay that is desired between retry attempts, and is applied -after `increment`, `multiplier`, and `jitter`. 
- -To explain this better, let's say we have the following retry definition: - -```json -{ - "name": "timeout-errors-strategy", - "delay": "PT10S", - "maxDelay": "PT100S", - "multiplier": 4, - "jitter": "PT1S", - "maxAttempts": 4 -} -``` - -which means that we will retry up to 4 times after waiting with increasing delay between attempts; -in this example we might observe the following series of delays: - -* 11s (min(`maxDelay`, (`delay` +/- rand(`jitter`)) => min(100, 10 + 1)) -* 43s (min(`maxDelay`, (11s * `multiplier`) +/- rand(`jitter`)) => min(100, (11 * 4) - 1)) -* 100s (min(`maxDelay`, (43s * `multiplier`) +/- rand(`jitter`)) => min(100, (43 * 4) + 0)) -* 100s (min(`maxDelay`, (100s * `multiplier`) +/- rand(`jitter`)) => min(100, (100 * 4) - 1)) - -##### Transition Definition - -`Transition` definition can have two types, either `string` or `object`. -If `string`, it defines the name of the state to transition to. -This can be used as a short-cut definition when you don't need to define any other parameters, for example: - -```json -"transition": "my-next-state" -``` - -If you need to define additional parameters in your `transition` definition, you can define -it with its `object` type which has the following properties: - -| Parameter | Description | Type | Required | -| --- | --- | --- | --- | -| [nextState](#Transitions) | Name of the state to transition to next | string | yes | -| [compensate](#Workflow-Compensation) | If set to `true`, triggers workflow compensation before this transition is taken. Default is `false` | boolean | no | -| produceEvents | Array of [producedEvent](#ProducedEvent-Definition) definitions. Events to be produced before the transition takes place | array | no | - -
Click to view example definition -

- - - - - - - - - - -
JSONYAML
- -```json -{ - "produceEvents": [{ - "eventRef": "produce-result-event", - "data": "${ .result.data }" - }], - "nextState": "eval-result-state" -} -``` - - - -```yaml -produceEvents: -- eventRef: produce-result-event - data: "${ .result.data }" -nextState: eval-result-state -``` - -
- -

- -The `nextState` property defines the name of the state to transition to next. -The `compensate` property allows you to trigger [compensation](#Workflow-Compensation) before the transition (if set to `true`). -The `produceEvents` property allows you to define a list of events to produce before the transition happens. - -Transitions allow you to move from one state (control-logic block) to another. For more information see the -[Transitions section](#Transitions) section. - -##### Switch State Data Conditions - -| Parameter | Description | Type | Required | -| --- | --- | --- | --- | -| name | Data condition name. Must follow the [Serverless Workflow Naming Convention](#naming-convention) | string | yes | -| [condition](#Workflow-Expressions) | Workflow expression evaluated against state data. Must evaluate to `true` or `false` | string | yes | -| [transition](#Transitions) | Transition to another state if condition is `true` | string or object | yes (if `end` is not defined) | -| [end](#End-Definition) | End workflow execution if condition is `true` | boolean or object | yes (if `transition` is not defined) | -| [metadata](#Workflow-Metadata) | Metadata information| object | no | - -
Click to view example definition -

- - - - - - - - - - -
JSONYAML
- -```json -{ - "name": "eighteen-or-older", - "condition": "${ .applicant | .age >= 18 }", - "transition": "start-application" -} -``` - - - -```yaml -name: eighteen-or-older -condition: "${ .applicant | .age >= 18 }" -transition: start-application -``` - -
- -

- -Switch state data conditions specify a data-based condition statement, which causes a transition to another -workflow state if evaluated to `true`. -The `condition` property of the condition defines an expression (e.g., `${ .applicant | .age > 18 }`), which selects -parts of the state data input. The condition must evaluate to `true` or `false`. - -If the condition is evaluated to `true`, you can specify either the `transition` or `end` definitions -to decide what to do, transition to another workflow state, or end workflow execution. Note that `transition` and `end` -definitions are mutually exclusive, meaning that you can specify either one or the other, but not both. - -##### Switch State Event Conditions - -| Parameter | Description | Type | Required | -| --- | --- | --- | --- | -| name | Event condition name. Must follow the [Serverless Workflow Naming Convention](#naming-convention) | string | yes | -| eventRef | References an unique event name in the defined workflow events | string | yes | -| [transition](#Transitions) | Transition to another state if condition is `true` | string or object | yes (if `end` is not defined) | -| [end](#End-Definition) | End workflow execution if condition is `true` | boolean or object | yes (if `transition` is not defined) | -| [eventDataFilter](#Event-data-filters) | Event data filter definition | object | no | -| [metadata](#Workflow-Metadata) | Metadata information| object | no | - -
Click to view example definition -

- - - - - - - - - - -
JSONYAML
- -```json -{ - "name": "visa-approved", - "eventRef": "visa-approved-event", - "transition": "handle-approved-visa" -} -``` - - - -```yaml -name: visa-approved -eventRef: visa-approved-event -transition: handle-approved-visa -``` - -
- -

- -Switch state event conditions specify events, which the switch state must wait for. Each condition -can reference one workflow-defined event. Upon arrival of this event, the associated transition is taken. -The `eventRef` property references a name of one of the defined workflow events. - -If the referenced event is received, you can specify either the `transition` or `end` definitions -to decide what to do, transition to another workflow state, or end workflow execution. - -The `eventDataFilter` property can be used to filter event data when it is received. - -Note that `transition` and `end` -definitions are mutually exclusive, meaning that you can specify either one or the other, but not both. - -##### Parallel State Branch - -| Parameter | Description | Type | Required | -| --- | --- | --- | --- | -| name | Branch name. Must follow the [Serverless Workflow Naming Convention](#naming-convention) | string | yes | -| [actions](#Action-Definition) | Actions to be executed in this branch | array | yes | -| [timeouts](#Workflow-Timeouts) | Branch specific timeout settings | object | no | - -
Click to view example definition -

- - - - - - - - - - -
JSONYAML
- -```json -{ - "name": "branch-1", - "actions": [ - { - "functionRef": { - "refName": "function-name-one", - "arguments": { - "order": "${ .someParam }" - } - } - }, - { - "functionRef": { - "refName": "function-name-two", - "arguments": { - "order": "${ .someParamTwo }" - } - } - } - ] -} -``` - - - -```yaml -name: branch-1 -actions: -- functionRef: - refName: function-name-one - arguments: - order: "${ .someParam }" -- functionRef: - refName: function-name-two - arguments: - order: "${ .someParamTwo }" -``` - -
- -

-
-Each branch receives the same copy of the Parallel state's data input.
-
-A branch can define actions that need to be executed. For the [`SubFlowRef`](#SubFlowRef-Definition) action, the referenced workflow should not be the same as the workflow in which the branch is defined; otherwise, undesired recursive calls to the same workflow may occur.
-
-
-The `timeouts` property can be used to set branch-specific timeout settings. Parallel state branches can set the
-`actionExecTimeout` and `branchExecTimeout` timeout properties. For more information on workflow timeouts, reference the
-[Workflow Timeouts](#Workflow-Timeouts) section.
-
-##### Parallel State Handling Exceptions
-
-Exceptions can occur during execution of Parallel state branches.
-
-By default, exceptions that are not handled within branches stop branch execution and are propagated
-to the Parallel state, where they should be handled with its `onErrors` definition.
-
-If a parallel state's branch defines actions, all exceptions that arise from executing these actions (after all
-allotted retries are exhausted)
-are propagated to the parallel state
-and can be handled with the parallel state's `onErrors` definition.
-
-If a parallel state's branch defines a subflow action, the called workflow
-can choose to handle exceptions on its own. All unhandled exceptions from the called workflow's
-execution, however, are propagated back to the parallel state and can be handled with the parallel state's
-`onErrors` definition.
-
-Note that once an error propagated to the parallel state from a branch is handled by the
-state's `onErrors` definition (its associated transition is taken), no further errors from branches of this
-parallel state should be considered, as the workflow control flow logic has already moved to a different state.
-
-For more information, see the [Workflow Error Handling](#Workflow-Error-Handling) section.
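As a rough illustration of this propagation rule, here is a minimal sketch; the asyncio-based execution model and all names are hypothetical and not prescribed by the specification:

```python
import asyncio

# Illustrative sketch: the first unhandled branch error reaches the
# parallel state's `onErrors` handling, and further branch errors are
# not considered once control flow has moved on.
async def branch(name: str, fail: bool):
    await asyncio.sleep(0)
    if fail:
        raise RuntimeError(f"unhandled error in {name}")
    return name

async def parallel_state(branches):
    tasks = [asyncio.create_task(branch(n, f)) for n, f in branches]
    try:
        return await asyncio.gather(*tasks)
    except RuntimeError as err:
        # Stands in for the state's `onErrors` handling; remaining
        # branches are terminated and their errors are ignored.
        for t in tasks:
            t.cancel()
        return f"onErrors handled: {err}"

print(asyncio.run(parallel_state([("branch-1", False), ("branch-2", True)])))
# → onErrors handled: unhandled error in branch-2
```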
- -##### Start Definition - -Can be either `string` or `object` type. If type string, it defines the name of the workflow starting state. - -```json -"start": "my-starting-state" -``` - -In this case it's assumed that the `schedule` property is not defined. - -If the start definition is of type `object`, it has the following structure: - -| Parameter | Description | Type | Required | -| --- | --- | --- | --- | -| stateName | Name of the starting workflow state. Must follow the [Serverless Workflow Naming Convention](#naming-convention) | string | no | -| [schedule](#Schedule-Definition) | Define the recurring time intervals or cron expressions at which workflow instances should be automatically started. | string or object | yes | - -
Click to view example definition -

- - - - - - - - - - -
JSONYAML
- -```json -{ - "stateName": "my-starting-state", - "schedule": "2020-03-20T09:00:00Z/PT2H" -} -``` - - - -```yaml -stateName: my-starting-state -schedule: 2020-03-20T09:00:00Z/PT2H -``` - -
- -

-
-The start definition explicitly defines how/when workflow instances should be created and what the workflow starting state is.
-
-The start definition can be either `string` or `object` type.
-
-If `string` type, it defines the name of the workflow starting state.
-
-If `object` type, it provides the ability to set the workflow starting state name, as well as the `schedule` property.
-
-The `stateName` property can be set to define the starting workflow state. If not specified, the first state
-in the [workflow states definition](#Workflow-States) should be used as the starting workflow state.
-
-The `schedule` property allows you to define scheduled workflow instance creation.
-Scheduled starts offer two choices: you can define a recurring time interval or a cron-based schedule at which a workflow
-instance **should** be created (automatically).
-
-Cron-based scheduled starts allow you to specify periodically started workflow instances based on a [cron](http://crontab.org/) definition.
-Cron-based scheduled starts can handle absolute time intervals (i.e., not calculated with respect to some particular point in time).
-One use case for cron-based scheduled starts is a workflow that performs periodic data batch processing.
-In this case we could use the cron definition
-
-``` text
-0 0/5 * * * ?
-```
-
-to define that a workflow instance from the workflow definition should be created every 5 minutes, starting at the top of the hour.
-
-Here are some more examples of cron expressions and their meanings:
-
-``` text
-* * * * * - Create workflow instance at the top of every minute
-0 * * * * - Create workflow instance at the top of every hour
-0 */2 * * * - Create workflow instance every 2 hours
-0 9 8 * * - Create workflow instance at 9:00:00AM on the eighth day of every month
-```
-
-[See here](http://crontab.org/) to get more information on defining cron expressions.
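As a small, hypothetical illustration of how a runtime might expand a `start/step` field such as the `0/5` minute field above (full cron parsing is left to dedicated libraries; the function name is illustrative):

```python
def expand_minute_field(field: str) -> list[int]:
    """Expand a cron minute field into the matching minutes of an hour."""
    if field == "*":
        return list(range(60))
    if "/" in field:  # "start/step", e.g. "0/5" = every 5 minutes from minute 0
        start_s, step_s = field.split("/")
        start = 0 if start_s == "*" else int(start_s)
        return list(range(start, 60, int(step_s)))
    return [int(field)]  # a single literal minute

print(expand_minute_field("0/5"))
# → [0, 5, 10, 15, 20, 25, 30, 35, 40, 45, 50, 55]
```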
One consideration when dealing with cron-based scheduled starts is the case where the workflow starting state is an [Event](#Event-State).
Event states define that workflow instances are triggered by the existence of the defined event(s).
Defining cron-based scheduled starts in this case means that runtime implementations need an event service that issues
the needed events at the defined times to trigger workflow instance creation.

Defining a start definition is not required. If it's not defined, the starting workflow
state has to be the very first state defined in the [workflow states array](#Workflow-States).

##### Schedule Definition

The `Schedule` definition can have two types, either `string` or `object`.
If `string` type, it defines a time interval describing when the workflow instance should be automatically created.
This can be used as a short-cut definition when you don't need to define any other parameters, for example:

```json
{
  "schedule": "R/PT2H"
}
```

If you need to define the `cron` or the `timezone` parameters in your `schedule` definition, you can define
it with its `object` type which has the following properties:

| Parameter | Description | Type | Required |
| --- | --- | --- | --- |
| interval | A recurring time interval expressed in the derivative of the ISO 8601 format specified below. Declares that workflow instances should be automatically created at the start of each time interval in the series. | string | yes (if `cron` is not defined) |
| [cron](#Cron-Definition) | Cron expression defining when workflow instances should be automatically created | object | yes (if `interval` is not defined) |
| timezone | Timezone name used to evaluate the interval and cron expression. If the interval specifies a date-time with a timezone, proper timezone conversion will be applied. (default: UTC). | string | no |
Click to view example definition -

- - - - - - - - - - -
JSONYAML
- -```json -{ - "cron": "0 0/15 * * * ?" -} -``` - - - -```yaml -cron: 0 0/15 * * * ? -``` - -
- -

The `interval` property uses a derivative of the ISO 8601 recurring time interval format to describe a series of consecutive time intervals, at the start of each of which a workflow instance is automatically created. Unlike full ISO 8601, this derivative format does not allow expressing an explicit number of recurrences, nor identifying a series by the date and time at the start and end of its first time interval.
There are three ways to express a recurring interval:

1. `R/<start>/<duration>`: Defines the start time and a duration, for example: "R/2020-03-20T13:00:00Z/PT2H", meaning workflow
   instances will be automatically created every 2 hours starting from March 20th 2020 at 1pm UTC.
2. `R/<duration>/<end>`: Defines a duration and an end, for example: "R/PT2H/2020-05-11T15:30:00Z", meaning that workflow instances will be
   automatically created every 2 hours until May 11th 2020 at 3:30pm UTC (i.e., the last instance will be created 2 hours prior to that, at 1:30pm UTC).
3. `R/<duration>`: Defines a duration only, for example: "R/PT2H", meaning workflow instances will be automatically created every 2 hours. The start time of the first interval may be indeterminate, but should be delayed by no more than the specified duration and must repeat on schedule after that (this is effectively supplying the start time "out-of-band" as permitted by ISO 8601-1:2019 section 5.6.1 NOTE 1). Each runtime implementation should document how the start time for a duration-only interval is established.

The `cron` property uses a [cron expression](http://crontab.org/)
to describe a repeating interval upon which a workflow instance should be created automatically.
For more information see the [cron definition](#Cron-Definition) section.

The `timezone` property is used to define a time zone name against which to evaluate the cron or interval expression. If not specified, it should default
to the UTC time zone. See [here](https://en.wikipedia.org/wiki/List_of_tz_database_time_zones) for a list of timezone names.
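Putting the `interval` and `timezone` properties together, a schedule using the start-and-duration recurring form might look like this (a sketch; the date and timezone values are illustrative):

```json
{
  "schedule": {
    "interval": "R/2020-03-20T13:00:00Z/PT2H",
    "timezone": "America/New_York"
  }
}
```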
For ISO 8601 date-time values in `interval` or `cron.validUntil`, runtimes should treat `timezone` as the 'local time' (UTC if `timezone` is not defined by the user).

Note that when the workflow starting state is an [Event](#Event-State),
defining cron-based scheduled starts means that runtime implementations need an event service that issues
the needed events at the defined times to trigger workflow instance creation.

##### Cron Definition

The `Cron` definition can have two types, either `string` or `object`.
If `string` type, it defines the cron expression describing when the workflow instance should be created (automatically).
This can be used as a short-cut definition when you don't need to define any other parameters, for example:

```json
{
  "cron": "0 15,30,45 * ? * *"
}
```

If you need to define the `validUntil` parameter in your `cron` definition, you can define
it with its `object` type which has the following properties:

| Parameter | Description | Type | Required |
| --- | --- | --- | --- |
| expression | Cron expression describing when the workflow instance should be created (automatically) | string | yes |
| validUntil | Specific date and time (ISO 8601 format, literal or expression producing it) when the cron expression is no longer valid | string | no |
Click to view example definition -

- - - - - - - - - - -
JSONYAML
- -```json -{ - "expression": "0 15,30,45 * ? * *", - "validUntil": "2021-11-05T08:15:30-05:00" -} -``` - - - -```yaml -expression: 0 15,30,45 * ? * * -validUntil: '2021-11-05T08:15:30-05:00' -``` - -
- -

The `expression` property is a [cron expression](http://crontab.org/) which defines
when workflow instances should be created (automatically).

The `validUntil` property defines a date and time (using ISO 8601 format, literal or expression). When the
`validUntil` time is reached, the cron expression for instance creation of this workflow
should no longer be valid.

For example, let's say we have the following cron definition:

```json
{
  "expression": "0 15,30,45 * ? * *",
  "validUntil": "2021-11-05T08:15:30-05:00"
}
```

This tells the runtime engine to create an instance of this workflow every hour
at minutes 15, 30, and 45. This is to be done until November 5, 2021, 8:15:30am US Eastern Standard Time,
as defined by the `validUntil` property value.

##### End Definition

The end definition can be either `boolean` or `object` type. If `boolean` type, it must be set to `true`, for example:

```json
"end": true
```

In this case it's assumed that the `terminate` property has its default value of `false`, and the `produceEvents`,
`compensate`, and `continueAs` properties are not defined.

If the end definition is of type `object`, it has the following structure:

| Parameter | Description | Type | Required |
| --- | --- | --- | --- |
| terminate | If `true`, terminates workflow instance execution | boolean | no |
| produceEvents | Array of [producedEvent](#ProducedEvent-Definition) definitions. Defines events that should be produced. | array | no |
| [compensate](#Workflow-Compensation) | If set to `true`, triggers workflow compensation before workflow execution completes. Default is `false` | boolean | no |
| [continueAs](#continuing-as-a-new-execution) | Defines that current workflow execution should stop, and execution should continue as a new workflow instance of the provided name | string or object | no |
Click to view example definition -

- - - - - - - - - - -
JSONYAML
- -```json -{ - "terminate": true, - "produceEvents": [{ - "eventRef": "provisioning-complete-event", - "data": "${ .provisionedOrders }" - }] -} -``` - - - -```yaml -terminate: true -produceEvents: -- eventRef: provisioning-complete-event - data: "${ .provisionedOrders }" - -``` - -
- -

End definitions are used to explicitly define execution completion of a workflow instance or workflow execution path.
A workflow definition must include at least one [workflow state](#Workflow-States).
Note that [Switch states](#Switch-State) cannot be declared as workflow end states. Their conditions, however, can
define a stop of workflow execution.

The `terminate` property, if set to `true`, completes the workflow instance execution, along with any other active
execution paths.
If a terminate end is reached inside a ForEach or Parallel state, the entire workflow instance is terminated.

The [`produceEvents`](#ProducedEvent-Definition) property allows defining events which should be produced
by the workflow instance before the workflow stops its execution.

It's important to mention that if the workflow `keepActive` property is set to `true`,
the only way to complete execution of the workflow instance
is if workflow execution reaches a state that defines an end definition with the `terminate` property set to `true`,
or, if the [`workflowExecTimeout`](#Workflow-Timeouts) property is defined, when the time defined in its `interval`
is reached.

The [compensate](#Workflow-Compensation) property defines that workflow compensation should be performed before the workflow
execution is completed.

The [continueAs](#Continuing-as-a-new-Execution) property defines that the current workflow instance should stop its execution,
and workflow execution should continue as a new instance of a new workflow.
When defined, it should be assumed that `terminate` is `true`. If `continueAs` is defined, and `terminate` is explicitly
set to `false`, runtimes should report this to users. Producing events and compensation should still be performed (if defined)
before the workflow execution is stopped and continued as a new workflow instance with the defined workflow name.
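For example, an end definition combining event production with `continueAs` might be sketched as follows (the workflow and event names are hypothetical):

```json
{
  "end": {
    "produceEvents": [{
      "eventRef": "order-archived-event"
    }],
    "continueAs": "process-next-order"
  }
}
```

Here, termination is implied by `continueAs`, the event is produced first, and execution then continues as a new instance of the named workflow.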
- -##### ProducedEvent Definition - -| Parameter | Description | Type | Required | -| --- | --- | --- | --- | -| eventRef | Reference to a defined unique event name in the [events](#Event-Definition) definition | string | yes | -| data | If string type, an expression which selects parts of the states data output to become the data (payload) of the produced event. If object type, a custom object to become the data (payload) of produced event. | string or object | no | -| contextAttributes | Add additional event extension context attributes | object | no | - -
Click to view example definition -

- - - - - - - - - - -
JSONYAML
- -```json -{ - "eventRef": "provisioning-complete-event", - "data": "${ .provisionedOrders }", - "contextAttributes": { - "buyerId": "${ .buyerId }" - } - } -``` - - - -```yaml -eventRef: provisioning-complete-event -data: "${ .provisionedOrders }" -contextAttributes: - buyerId: "${ .buyerId }" -``` - -
- -

Defines the event (CloudEvent format) to be produced when workflow execution completes or during workflow [transitions](#Transitions).
The `eventRef` property must match the name of
one of the defined events in the [events](#Event-Definition) definition.

The `data` property can have two types: object or string. If of string type, it is an expression that can select parts of state data
to be used as the event payload. If of object type, you can define a custom object to be the event payload.

The `contextAttributes` property allows you to add one or more [extension context attributes](https://github.com/cloudevents/spec/blob/main/cloudevents/spec.md#extension-context-attributes)
to the generated event.

Being able to produce events when workflow execution completes or during state transitions
allows for event-based orchestration communication.
For example, completion of an orchestration workflow can notify other orchestration workflows to decide if they need to act upon
the produced event, or notify monitoring services of the current state of workflow execution, etc.
It can be used to create very dynamic orchestration scenarios.

##### Transitions

Serverless Workflow states can have one or more incoming and outgoing transitions (from/to other states).
Each state can define a `transition` definition that is used to determine which
state to transition to next.

Implementers **must** use the unique state `name` property for determining the transition.

Events can be produced during state transitions. The `produceEvents` property of the `transition` definition allows you
to reference one or more defined events in the workflow [events definitions](#Event-Definition).
For each of the produced events you can select which parts of state data should become the event payload.

Transitions can trigger compensation via their `compensate` property. See the [Workflow Compensation](#Workflow-Compensation)
section for more information.
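A transition that produces an event on its way to the next state might be sketched as follows (assuming the `nextState`, `produceEvents`, and `compensate` transition properties described above; state and event names are illustrative):

```json
{
  "transition": {
    "nextState": "handle-payment",
    "produceEvents": [{
      "eventRef": "order-validated-event",
      "data": "${ .validatedOrder }"
    }],
    "compensate": false
  }
}
```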
##### Additional Properties

Specifying additional properties, namely properties which are not defined by the specification,
is only allowed in the [Workflow Definition](#Workflow-Definition-Structure).
Additional properties serve the same purpose as [Workflow Metadata](#Workflow-Metadata).
They allow you to enrich the workflow definition with custom information.

Additional properties, just like workflow metadata, should not affect workflow execution.
Implementations may choose to use additional properties or ignore them.

It is recommended to use workflow metadata instead of additional properties in the workflow definition.

Let's take a look at an example of additional properties:

```json
{
  "name": "my-workflow",
  "version": "1.0.0",
  "specVersion": "0.8",
  "description": "My Test Workflow",
  "start": "My First State",
  "loglevel": "Info",
  "environment": "Production",
  "category": "Sales",
  "states": [ ... ]
}
```

In this example, we specify the `loglevel`, `environment`, and `category` additional properties.

Note the same can also be specified using workflow metadata, which is the preferred approach:

```json
{
  "name": "my-workflow",
  "version": "1.0.0",
  "specVersion": "0.8",
  "description": "My Test Workflow",
  "start": "My First State",
  "metadata": {
    "loglevel": "Info",
    "environment": "Production",
    "category": "Sales"
  },
  "states": [ ... ]
}
```

### Workflow Error Handling

Error handling is a crucial aspect of any workflow system, ensuring that the workflow can gracefully handle unexpected situations or errors that may occur during its execution. In Serverless Workflow, error handling is a well-defined and structured process aimed at providing developers with the tools and mechanisms necessary to manage errors effectively within their workflows.
#### Error Definitions

[Error definitions](#error-definition) in Serverless Workflow follow the [RFC7807 Problem Details specification](https://datatracker.ietf.org/doc/html/rfc7807), providing a standardized format for describing errors that may occur during workflow execution. These definitions include parameters such as name, instance, type, status, title, and detail, which collectively provide a comprehensive description of the error. By adhering to this standard, errors can be described in a consistent, technology-agnostic, and human-readable manner, facilitating effective communication and resolution.

#### Error Types

Serverless Workflow defines a set of [default error types](#error-types), each identified by a unique URI reference and associated with specific status code(s). These error types cover common scenarios such as configuration errors, validation failures, authentication issues, timeouts, and runtime exceptions. By utilizing these predefined error types, workflows can maintain cross-compatibility and ensure consistent error identification and handling across different platforms and implementations.

#### Error Source

In Serverless Workflow, the concept of "Error Source" refers to the precise origin or location within the workflow definition where an error occurs during its execution. It is pinpointed using the [error definition](#error-definition)'s `instance` property, which is defined as an [RFC6901 JSON pointer](https://datatracker.ietf.org/doc/html/rfc6901).

When an error arises during the execution of a workflow, whether at the level of an action or a state, the Error Source becomes instrumental in identifying the specific component within the workflow where the error originated. This granular identification is essential for efficient debugging and troubleshooting, as it allows developers to swiftly locate and address the root cause of the error.
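To make this concrete, a raised error's `instance` property might point at the failing action within the workflow definition, for example (a sketch; the pointer and property values are illustrative):

```json
{
  "name": "service-unavailable",
  "type": "https://serverlessworkflow.io/spec/errors/communication",
  "status": 503,
  "title": "Service Not Available",
  "instance": "/states/0/actions/1"
}
```

Here the JSON pointer `/states/0/actions/1` identifies the second action of the first state as the Error Source.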
- -By leveraging the Error Source, developers can streamline the error-handling process, facilitating quicker resolution of issues and enhancing the overall reliability and robustness of the workflow. - -#### Error Handling Strategies - -In Serverless Workflow, you have the flexibility to define error handling strategies using error handlers, policies, and outcome definitions. - -Errors can be configured at both the state and action levels, allowing you to tailor error handling to specific components within your workflow. - -When choosing an error handling strategy, consider your workflow requirements and strike a balance between simplicity, maintainability, and flexibility. Choose the approach that best fits the needs of your workflow to ensure effective error management. - -##### Inline Error Handling - -The most basic method involves configuring the `onErrors` property directly within a state or an action and adding an inline handler. While suitable for specific scenarios, this approach should be used sparingly as it may lead to code duplication and reduced maintainability. - - - - - - - - - - -
JSONYAML
- -```json -{ - "actions": [ - { - "name": "my-action", - "functionRef": "my-function", - "onErrors": [ - { - "when": [ - { - "status": 503 - } - ], - "retry": "retry-five-times" - } - ] - } - ] -} - -``` - - - -```yaml -actions: - - name: my-action - functionRef: my-function - onErrors: - - when: - - status: 503 - retry: retry-five-times -``` - -
- -##### Error Handler Reference - -A more structured approach is to reference a pre-configured, reusable error handler. However, in most cases, it's recommended to reference an error policy instead, for improved maintainability and consistency. - - - - - - - - - - -
JSONYAML
- -```json -{ - "errors": { - "definitions": [ - { - "name": "service-not-available-error", - "type": "https://serverlessworkflow.io/spec/errors/communication", - "status": 503, - "title": "Service Not Available", - "detail": "Failed to contact service, even after multiple retries" - } - ], - "handlers": [ - { - "name": "handle-503", - "when": [ - { - "status": 503 - } - ], - "retry": "retry-five-times", - "then": { - "throw": { - "refName": "service-not-available-error" - } - } - } - ] - }, - "states": [ - { - "name": "my-state", - "type": "operation", - "actions": [ - { - "name": "my-action", - "functionRef": "my-function", - "onErrors": [ - { - "refName": "handle-503" - } - ] - } - ] - } - ] -} - -``` - - - -```yaml -errors: - definitions: - - name: service-not-available-error - type: https://serverlessworkflow.io/spec/errors/communication - status: 503 - title: Service Not Available - detail: Failed to contact service, even after multiple retries - handlers: - - name: handle-503 - when: - - status: 503 - retry: retry-five-times - then: - throw: - refName: service-not-available-error -states: - - name: my-state - type: operation - actions: - - name: my-action - functionRef: my-function - onErrors: - - refName: handle-503 -``` - -
- -##### Error Policy Reference - -The optimal approach for addressing most error handling scenarios is to reference a configurable, reusable error policy. This promotes consistency, simplifies maintenance, and enhances workflow readability. - - - - - - - - - - -
JSONYAML
- -```json -{ - "errors": { - "definitions": [ - { - "name": "service-not-available-error", - "type": "https://serverlessworkflow.io/spec/errors/communication", - "status": 503, - "title": "Service Not Available", - "detail": "Failed to contact service, even after multiple retries" - } - ], - "handlers": [ - { - "name": "handle-503", - "when": [ - { - "status": 503 - } - ], - "retry": "retry-five-times", - "then": { - "throw": { - "refName": "service-not-available-error" - } - } - } - ] - }, - "states": [ - { - "name": "my-state", - "type": "operation", - "actions": [ - { - "name": "my-action", - "functionRef": "my-function", - "onErrors": [ - { - "refName": "handle-503" - } - ] - } - ] - } - ] -} - -``` - - - -```yaml -errors: - definitions: - - name: service-not-available-error - type: https://serverlessworkflow.io/spec/errors/communication - status: 503 - title: Service Not Available - detail: Failed to contact service, even after multiple retries - handlers: - - name: handle-503 - when: - - status: 503 - retry: retry-five-times - then: - throw: - refName: service-not-available-error - policy: - - name: fault-tolerance - handlers: - - refName: handle-503 -states: - - name: my-state - type: operation - actions: - - name: my-action - functionRef: my-function - onErrors: fault-tolerance -``` - -
#### Error Retries

Serverless Workflow offers a robust error retry mechanism designed to enhance the reliability and resilience of workflows by automatically attempting to execute failed operations again under specific conditions. When an error is caught within a workflow, the retry mechanism is activated, providing an opportunity to retry the failed operation. This retry behavior is configured using the `retry` property within the [error handling definition](#error-handler-definition).

The retry mechanism provides several benefits to workflow developers. Firstly, it improves reliability by automatically retrying failed operations, thereby reducing the likelihood of transient errors causing workflow failures. Additionally, it enhances the resilience of workflows by enabling them to recover from temporary issues or transient faults in the underlying systems, ensuring continuous execution even in the face of occasional errors. Moreover, the built-in retry capabilities simplify error handling logic, eliminating the need for manual implementation of complex retry mechanisms. This streamlines workflow development and maintenance, making it easier for developers to manage and troubleshoot error scenarios effectively.

In summary, Serverless Workflow's error retry mechanism offers a comprehensive solution for handling errors during workflow execution, providing improved reliability, enhanced resilience, and simplified error handling logic. By automatically retrying failed operations under specific conditions, it ensures smoother workflow execution and minimizes the impact of errors on overall system performance.
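The earlier examples reference a `retry-five-times` policy by name. Such a policy might be sketched as follows, assuming a top-level `retries` list with `maxAttempts`, `delay`, and `multiplier` properties (a hypothetical sketch; consult the retry definition for the authoritative schema):

```json
{
  "retries": [
    {
      "name": "retry-five-times",
      "maxAttempts": 5,
      "delay": "PT2S",
      "multiplier": 2
    }
  ]
}
```

With these illustrative values, a failed operation would be retried up to five times, starting with a two-second delay that doubles on each attempt.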
- -##### Retry Policy Execution - -Upon encountering a defined error, if a retry policy is defined, the workflow runtime will initiate a retry attempt according to the specified policy. The [error source](#error-source), whether it be an action or a state, will be retried based on the configured policy. - -##### Retry Behavior - -During each retry attempt, the workflow runtime will make another attempt to execute the operation that resulted in the error. If the retry attempt is successful, the workflow will continue execution as if the error never occurred, seamlessly progressing through the workflow. - -##### Retry Exhaustion - -If the maximum configured number of retry attempts is reached without success, the workflow runtime will execute the error outcome defined by the `then` property within the error handling definition. This outcome could involve transitioning to a specific state, triggering compensation logic, or terminating the workflow, depending on the defined error handling strategy. - -#### Error Outcomes - -Error outcomes in Serverless Workflow provide a flexible mechanism for defining the behavior of the workflow after handling errors. They enable precise error handling strategies tailored to the workflow's requirements, ensuring that errors are managed effectively and workflows can gracefully recover from unexpected situations. - -The `compensate` outcome triggers workflow compensation. This outcome allows workflows to execute compensation logic to undo any previously completed actions and restore the system to a consistent state before proceeding to the current state's outcome. It ensures that workflows can recover from errors and maintain data integrity. - -The `end` outcome ends the workflow immediately after handling the error. This outcome is useful when errors indicate unrecoverable situations or when workflows should terminate gracefully after encountering specific errors. 
- -The `transition` outcome instructs the workflow to transition to the specified state when the error is handled. This outcome is particularly useful for redirecting the workflow to alternative paths or recovery mechanisms based on the encountered error. - -Finally, the `throw` outcome allows workflows to rethrow the [handled error](#error-definition) or throw a new [error](#error-definition). When set to `true`, the error is rethrown as is, propagating it up the workflow hierarchy. Alternatively, the outcome can define or reference a new error to throw, potentially using runtime expressions to customize error details dynamically. - -Overall, error outcomes in Serverless Workflow offer a comprehensive set of options for managing errors within workflows. By defining precise error handling strategies using these outcomes, workflows can effectively handle errors, recover from failures, and maintain robustness and resilience in various execution scenarios. - -#### Error Bubbling - -Error bubbling within Serverless Workflow describes the process by which an unhandled or rethrown error propagates or "bubbles up" from its current location to its parent component, typically the state in which it originated. This mechanism ensures that errors are managed and handled effectively within the workflow hierarchy, maintaining consistent error handling and workflow behavior. - -When an error arises within a workflow, it initially occurs at the lowest level of execution, such as within an action. If the error remains unhandled or uncaught at this level, it ascends through the workflow's structure until it reaches the parent component of the location where the error originated. If the error persists and is not addressed at the state level, it ultimately terminates the workflow. 
The termination of the workflow due to an unhandled error at the state level serves as a means of ensuring that errors are appropriately dealt with and do not result in erroneous or inconsistent workflow behavior. By halting the workflow's execution at the point of error occurrence, Serverless Workflow promotes resilience and reliability, averting potential cascading failures and ensuring predictable error handling behavior.

In essence, the error handling mechanism within Serverless Workflow is designed to guarantee that errors are managed and resolved effectively within workflows, thereby preventing unexpected outcomes and fostering reliability and consistency in workflow execution.

#### Error Handling Best Practices

When designing error handling logic in Serverless Workflow, it's essential to adhere to best practices to ensure robustness and reliability:

- Define Clear Error Definitions: Clearly define error types and their corresponding definitions to provide meaningful information about encountered errors.
- Use Default Error Types: Whenever possible, use the predefined default error types provided by Serverless Workflow to ensure consistency and compatibility.
- Group Error Handlers: Group related error handlers into error policies to promote code reuse and maintainability.
- Handle Errors Gracefully: Handle errors gracefully within workflows by defining appropriate error handlers and outcome definitions to mitigate the impact of errors on workflow execution.

### Workflow Timeouts

Workflow timeouts define the maximum times for:

1. Workflow execution
2. State execution
3. Action execution
4. Branch execution
5. Event consumption time

The specification allows for timeouts to be defined on the top-level workflow definition, as well as
in each of the workflow state definitions. Note that the timeout settings defined in states and state branches overwrite the top-level
workflow definition for state, action, and branch execution.
If they are not defined, then the top-level
timeout settings should take effect.

To give an example, let's say that in our workflow definition we define the timeout for state execution:

```json
{
  "name": "test-workflow",
  ...
  "timeouts": {
    ...
    "stateExecTimeout": "PT2S"
  }
  ...
}
```

This top-level workflow timeout setting defines that the maximum execution time of each defined workflow state
is two seconds.

Now let's say that we have workflow states "A" and "B". State "A" does not define a timeout definition, but state
"B" does:

```json
{
  "name": "b",
  "type": "operation",
  ...
  "timeouts": {
    ...
    "stateExecTimeout": "PT10S"
  }
  ...
}
```

Since state "A" does not overwrite the top-level `stateExecTimeout`, its execution timeout should be inherited from
the top-level timeout definition.
On the other hand, state "B" does define its own `stateExecTimeout`, in which case it overwrites the default
setting, meaning that its execution time has a maximum limit of ten seconds.

Defining timeouts is not mandatory, meaning that if not defined, all the timeout settings should be assumed to
be "unlimited".

Note that the defined workflow execution timeout has precedence over all other defined timeouts.
Just to give an extreme example, let's say we define the workflow execution timeout as ten seconds,
and the state execution timeout as twenty seconds. In this case, if the workflow execution timeout is reached,
it should follow the rules of the workflow execution timeout and end workflow execution, no matter what the
state execution timeout has been set to.

Let's take a look at all possible timeout definitions:

#### Workflow Timeout Definition

Workflow timeouts are defined with the top-level `timeouts` property. It can have two types, `string` and `object`.
If `string` type, it defines a URI that points to a JSON or YAML file containing the workflow timeout definitions.
If `object` type, it is used to define the timeout definitions in-line and has the following properties:

| Parameter | Description | Type | Required |
| --- | --- | --- | --- |
| [workflowExecTimeout](#workflowexectimeout-definition) | Workflow execution timeout (literal ISO 8601 duration format or expression whose evaluation results in an ISO 8601 duration) | string or object | no |
| [stateExecTimeout](#states-timeout-definition) | Workflow state execution timeout (literal ISO 8601 duration format or expression whose evaluation results in an ISO 8601 duration) | string | no |
| actionExecTimeout | Actions execution timeout (literal ISO 8601 duration format or expression whose evaluation results in an ISO 8601 duration) | string | no |
| [branchExecTimeout](#branch-timeout-definition) | Branch execution timeout (literal ISO 8601 duration format or expression whose evaluation results in an ISO 8601 duration) | string | no |
| [eventTimeout](#event-timeout-definition) | Default timeout for consuming defined events (literal ISO 8601 duration format or expression whose evaluation results in an ISO 8601 duration) | string | no |

The `eventTimeout` property defines the maximum amount of time to wait to consume defined events. If not specified, it should default to
"unlimited".

The `branchExecTimeout` property defines the maximum execution time for a single branch. If not specified, it should default to
"unlimited".

The `actionExecTimeout` property defines the maximum execution time for a single actions definition. If not specified, it should default to
"unlimited". Note that an actions definition can include multiple actions.

The `stateExecTimeout` property defines the maximum execution time for a single workflow state. If not specified, it should default to
"unlimited".

The `workflowExecTimeout` property defines the workflow execution timeout.
It is defined using the ISO 8601 duration format.
If not defined, the workflow execution should be given an "unlimited"
amount of time to complete.
`workflowExecTimeout` can have two possible types, either `string` or `object`.
If `string` type, it defines the maximum workflow execution time.
If `object` type, it has the following format:

##### WorkflowExecTimeout Definition

| Parameter | Description | Type | Required |
| --- | --- | --- | --- |
| duration | Timeout duration (literal ISO 8601 duration format or expression which evaluation results in an ISO 8601 duration) | string | yes |
| interrupt | If `false`, the workflow instance is allowed to finish its current execution. If `true`, the current workflow execution is stopped immediately. Default is `false` | boolean | no |
| runBefore | Name of a workflow state to be executed before the workflow instance is terminated | string | no |
<details><summary>Click to view example definition</summary>

<table>
<tr>
    <th>JSON</th>
    <th>YAML</th>
</tr>
<tr>
<td valign="top">

```json
{
  "duration": "PT2M",
  "runBefore": "createandsendreport"
}
```

</td>
<td valign="top">

```yaml
duration: PT2M
runBefore: createandsendreport
```

</td>
</tr>
</table>

</details>
The `duration` property defines the time duration of the execution timeout. Once a workflow instance is created,
and the amount of the defined time is reached, the workflow instance should be terminated.

The `interrupt` property defines whether the currently running instance should be allowed to finish its current
execution flow before it is terminated. If set to `true`, the current instance execution should stop immediately.

The `runBefore` property defines the name of a workflow state to be executed before the workflow instance is terminated.
States referenced by `runBefore` (as well as any other states that they transition to) must obey the following rules:

* They should not have any incoming transitions (should not be part of the main workflow control-flow logic)
* They cannot be states marked for compensation (have their `usedForCompensation` property set to `true`)
* If it is a single state, it must define an [end definition](#End-Definition); if it transitions to other states,
  at least one of them must define it.
* They can transition only to states that are also not part of the main control flow logic (and are not marked
  for compensation).

Runtime implementations should raise compile time / parsing exceptions if any of the rules mentioned above are
not obeyed in the workflow definition.

#### States Timeout Definition

All workflow states except the Inject State can define the `timeouts` property and can define different timeout
settings depending on their state type.
Please reference each [workflow state definition](#Workflow-States) for more information on which
timeout settings are available for each state type.

Workflow states timeouts cannot define the `workflowExecTimeout` property.

Workflow states can set their `stateExecTimeout` property inside the `timeouts` definition.
The value of this property is a time duration (literal ISO 8601 duration format or expression which evaluation results in an ISO 8601 duration).
It must be a duration that's greater than zero and defines the total state execution timeout.
When this timeout is reached, state execution should be stopped and can be handled as a timeout error
in the state's `onErrors` definition.

#### Branch Timeout Definition

[Parallel states](#Parallel-State) can define the `branchExecTimeout` property. If defined on the state
level, it applies to each [branch](#Parallel-State-Branch) of the Parallel state. Note that each parallel state branch
can overwrite this setting to define its own branch execution timeout.
If a branch does not define this timeout property, it should be inherited from its state definition's branch timeout setting.
If its state does not define it either, it should be inherited from the top-level workflow branch timeout settings.

#### Event Timeout Definition

The Event state `timeouts` property can be used to
specify state-specific timeout settings. For the Event state it can contain the `eventTimeout` property,
which is defined using the ISO 8601 duration format.
You can specify for example "PT15M" to represent 15 minutes or "P2DT3H4M" to represent 2 days, 3 hours and 4 minutes.
`eventTimeout` values should always be represented as durations and not as specific time intervals.

The `eventTimeout` property needs to be described in detail for Event states as its behavior depends on whether or
not the Event state is a workflow starting state.

If the Event state is a workflow starting state, incoming events may trigger workflow instances. In this case,
if the `exclusive` property is set to `true`, the `eventTimeout` property should be ignored.

If the `exclusive` property is set to `false`, the defined `eventTimeout` represents the time
between the arrival of the specified events.
To give an example, consider the following:

```json
{
  "states": [
    {
      "name": "example-event-state",
      "type": "event",
      "exclusive": false,
      "timeouts": {
        "eventTimeout": "PT2M"
      },
      "onEvents": [
        {
          "eventRefs": [
            "example-event-1",
            "example-event-2"
          ],
          "actions": [
            ...
          ]
        }
      ],
      "end": {
        "terminate": true
      }
    }
  ]
}
```

The `eventTimeout` would start once any of the referenced events is consumed. If the second event does not occur within
the defined `eventTimeout`, no workflow instance should be created.

If the Event state is not a workflow starting state, the `eventTimeout` property is relative to the time when the
state becomes active. If the defined event conditions (regardless of the value of the `exclusive` property)
are not satisfied within the defined timeout period, the Event state should transition to the next state or end the workflow
instance (in case it is an end state) without performing any actions.

### Workflow Compensation

Compensation deals with undoing or reversing the work of one or more states which have
already successfully completed. For example, let's say that we have charged a customer $100 for an item
purchase. In case the customer later on decides to cancel this purchase, we need to undo it. One way of
doing that is to credit the customer $100.

It's important to understand that compensation with workflows is not the same as, for example, rolling back
a transaction (a strict undo). Compensating a workflow state which has successfully completed
might involve multiple logical steps and thus is part of the overall business logic that must be
defined within the workflow itself. To explain this, let's use our previous example and say that when our
customer made the item purchase, our workflow sent them a confirmation email. In this case, to
compensate this purchase, we cannot just "undo" the confirmation email that was sent.
Instead, we want to
send a second email to the customer which includes purchase cancellation information.

Compensation in Serverless Workflow must be explicitly defined by the workflow control-flow logic.
It cannot be dynamically triggered by initial workflow data, event payloads, results of service invocations, or
errors.

#### Defining Compensation

Each workflow state can define how it should be compensated via its `compensatedBy` property.
This property references another workflow state (by its unique name) which is responsible for the actual compensation.

States referenced by `compensatedBy` (as well as any other states that they transition to) must obey the following rules:

* They should not have any incoming transitions (should not be part of the main workflow control-flow logic)
* They cannot be an [event state](#Event-State)
* They cannot define an [end definition](#End-definition). If they do, it should be ignored
* They must define the `usedForCompensation` property and set it to `true`
* They can transition only to states which also have their `usedForCompensation` property set to `true`
* They cannot themselves set their `compensatedBy` property to any state (compensation is not recursive)

Runtime implementations should raise compile time / parsing exceptions if any of the rules mentioned above are
not obeyed in the workflow definition.

Let's take a look at an example workflow state which defines its `compensatedBy` property, and the compensation
state it references:

<table>
<tr>
    <th>JSON</th>
    <th>YAML</th>
</tr>
<tr>
<td valign="top">

```json
{
  "states": [
    {
      "name": "new-item-purchase",
      "type": "event",
      "onEvents": [
        {
          "eventRefs": [
            "new-purchase"
          ],
          "actions": [
            {
              "functionRef": {
                "refName": "debit-customer-function",
                "arguments": {
                  "customerid": "${ .purchase.customerid }",
                  "amount": "${ .purchase.amount }"
                }
              }
            },
            {
              "functionRef": {
                "refName": "send-purchase-confirmation-email-function",
                "arguments": {
                  "customerid": "${ .purchase.customerid }"
                }
              }
            }
          ]
        }
      ],
      "compensatedBy": "cancel-purchase",
      "transition": "some-next-workflow-state"
    },
    {
      "name": "cancel-purchase",
      "type": "operation",
      "usedForCompensation": true,
      "actions": [
        {
          "functionRef": {
            "refName": "credit-customer-function",
            "arguments": {
              "customerid": "${ .purchase.customerid }",
              "amount": "${ .purchase.amount }"
            }
          }
        },
        {
          "functionRef": {
            "refName": "send-purchase-cancellation-email-function",
            "arguments": {
              "customerid": "${ .purchase.customerid }"
            }
          }
        }
      ]
    }
  ]
}
```

</td>
<td valign="top">

```yaml
states:
- name: new-item-purchase
  type: event
  onEvents:
  - eventRefs:
    - new-purchase
    actions:
    - functionRef:
        refName: debit-customer-function
        arguments:
          customerid: "${ .purchase.customerid }"
          amount: "${ .purchase.amount }"
    - functionRef:
        refName: send-purchase-confirmation-email-function
        arguments:
          customerid: "${ .purchase.customerid }"
  compensatedBy: cancel-purchase
  transition: some-next-workflow-state
- name: cancel-purchase
  type: operation
  usedForCompensation: true
  actions:
  - functionRef:
      refName: credit-customer-function
      arguments:
        customerid: "${ .purchase.customerid }"
        amount: "${ .purchase.amount }"
  - functionRef:
      refName: send-purchase-cancellation-email-function
      arguments:
        customerid: "${ .purchase.customerid }"
```

</td>
</tr>
</table>

In this example our "new-item-purchase" [event state](#Event-state) waits for a "new-purchase" event and then
debits the customer and sends them a purchase confirmation email. It defines that it is compensated by the
"cancel-purchase" [operation state](#Operation-state) which performs two actions, namely it credits the
purchase amount back to the customer and sends them a purchase cancellation email.

#### Triggering Compensation

As previously mentioned, compensation must be explicitly triggered by the workflow's control-flow logic.
This can be done via [transition](#Transition-definition) and [end](#End-definition) definitions.

Let's take a look at each:

1. Compensation triggered on transition:

<table>
<tr>
    <th>JSON</th>
    <th>YAML</th>
</tr>
<tr>
<td valign="top">

```json
{
  "transition": {
    "compensate": true,
    "nextState": "next-workflow-state"
  }
}
```

</td>
<td valign="top">

```yaml
transition:
  compensate: true
  nextState: next-workflow-state
```

</td>
</tr>
</table>

Transitions can trigger compensation by specifying the `compensate` property and setting it to `true`.
This means that before the transition is executed (the workflow continues its execution to the "next-workflow-state" in this example),
workflow compensation must be performed.

2. Compensation triggered by end definition:

<table>
<tr>
    <th>JSON</th>
    <th>YAML</th>
</tr>
<tr>
<td valign="top">

```json
{
  "end": {
    "compensate": true
  }
}
```

</td>
<td valign="top">

```yaml
end:
  compensate: true
```

</td>
</tr>
</table>

End definitions can trigger compensation by specifying the `compensate` property and setting it to `true`.
This means that before the workflow finishes its execution, workflow compensation must be performed. Note that
in the case when the end definition has its `produceEvents` property set, compensation must be performed before
producing the specified events and ending workflow execution.
In the case the end definition has a `continueAs` property defined, compensation must be performed before
workflow execution continues as a new workflow invocation.
In the case where the end definition has both `produceEvents` and `continueAs`, compensation is performed first,
then the events should be produced, and then the workflow should continue its execution as a new workflow invocation.

#### Compensation Execution Details

Now that we have seen how to define and trigger compensation, we need to go into details on how compensation should be executed.
Compensation is performed on all already successfully completed states (that define `compensatedBy`) in **reverse** order.
Compensation is always done in sequential order, and should not be executed in parallel.

Let's take a look at the following workflow image:

-Compensation Execution Example -

In this example let's say our workflow execution is at the "End" state, which defines the `compensate` property as `true`,
as shown in the previous section. States with a red border, namely "A", "B", "D" and "E", are states which have so far
been executed successfully. State "C" has not been executed during workflow execution in our example.

When workflow execution encounters our "End" state, compensation has to be performed. This is done in **reverse** order:

1. State "E" is not compensated as it does not define a `compensatedBy` state
2. State "D" is compensated by executing compensation "D1"
3. State "B" is compensated by executing "B1" and then "B1-2"
4. State "C" is not compensated as it was never active during workflow execution
5. State "A" is not compensated as it does not define a `compensatedBy` state

So if we look just at the workflow execution flow, the same workflow could be seen as:

-Compensation Execution Example 2 -

In our example, when compensation triggers,
the current workflow data is passed as input to the "D1" state, the first compensation state for our example.
The state's data output is then passed as the state data input to "B1", and so on.

#### Compensation and Active States

In some cases when compensation is triggered, some states, such as [Parallel](#Parallel-State) and [ForEach](#ForEach-State)
states, can still be "active", meaning they still might have some async executions that are being performed.

If compensation needs to be performed on such still-active states, the state execution must first be cancelled.
After it is cancelled, compensation should be performed.

#### Unrecoverable errors during compensation

States that are marked as `usedForCompensation` can define [error handling](#Workflow-Error-Handling) via their
`onErrors` property just like any other workflow states. In case of unrecoverable errors during their execution
(errors not explicitly handled),
workflow execution should be stopped, which is the same behavior as when compensation is not used.

### Continuing as a new Execution

In some cases our workflows are deployed and executed on runtimes and/or cloud platforms that expose
execution limitations such as a finite execution duration, a finite number of workflow transitions, etc.
Some runtimes, especially when dealing with stateful workflow orchestrations, have a finite limit on
execution history log sizes, meaning that once a long-running workflow reaches these limits, workflow execution is
likely to be forced to stop before reaching its completion. This can result in unexpected issues, especially with
mission-critical workflows.

For those cases, the Serverless Workflow DSL provides a way to explicitly define stopping the current workflow
instance execution, and starting a new one (for the same workflow name or a different one).
This can be done via the [end definition's](#end-definition) `continueAs` property.
The end definition's `continueAs` property can be either of type `string` or `object`.
If `string` type, it contains the unique workflow name of the workflow that the execution should continue as, for example:

```json
{
  "end": {
    "continueAs": "my-workflow-name"
  }
}
```

Defining this should stop the current workflow execution, and continue execution as a new workflow instance of the
workflow which defines the workflow name of "my-workflow-name". The state data at the point where this is defined should
become the workflow data input of the workflow that is continuing the current workflow execution.

Note that any defined `produceEvents` and `compensate` definitions should be honored before `continueAs` is applied.

If `object` type, the `continueAs` property has the following properties:

| Parameter | Description | Type | Required |
| --- | --- | --- | --- |
| workflowId | Unique name of the workflow to continue execution as. | string | yes |
| version | Version of the workflow to continue execution as. | string | no |
| data | If string type, a workflow expression which selects parts of the state's data output to become the workflow data input of the continued execution. If object type, a custom object to become the workflow data input of the continued execution. | string or object | no |
| [`workflowExecTimeout`](#Workflow-Timeouts) | Workflow execution timeout to be used by the workflow continuing execution. Overwrites any specific settings set by that workflow. | string or object | no |

Continuing execution with `continueAs` can also be used inside sub-workflow executions, which brings us to its next use case.

#### ContinueAs in sub workflows

Workflows can invoke sub-workflows during their execution. In the Serverless Workflow DSL, sub-workflows are invoked
similarly to other function types via the [SubFlowRef Definition](#SubFlowRef-Definition)
in workflow states' [Action](#Action-Definition) definitions.
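For reference, an action invoking a sub-workflow might be sketched as follows (the workflow and action names are illustrative, not part of the specification):

```json
{
  "actions": [
    {
      "name": "run-fraud-check",
      "subFlowRef": {
        "workflowId": "fraud-check-workflow",
        "version": "1.0.0"
      }
    }
  ]
}
```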
Just like "parent" workflows, sub-workflows can also be long-running, and can run into the same type of runtime/serverless platform
limitations as previously discussed. As such, they can also use `continueAs` to stop their current execution and continue it as
a new one of the same or a different workflow name.

Note that when a sub-workflow is invoked it can produce a result that is then merged into the parent workflow's state data.
This may bring up a question as to what happens when a sub-workflow calls `continueAs`, in terms of what is returned as
the result of its invocation to the parent workflow.

No matter how many times a sub-workflow may use `continueAs`, to the parent workflow it should appear as though a single
invocation was performed, meaning that the results of the last sub-workflow invocation (triggered by `continueAs`) should be used as the
data returned by the invocation of the sub-workflow to the parent workflow.

### Workflow Versioning

In any application, regardless of size or type, one thing is for sure: changes happen.
Versioning your workflow definitions is an important task to consider. Versions indicate
changes or updates of your workflow definitions to the associated execution runtimes.

There are two places in the [workflow definition](#Workflow-Definition-Structure) where versioning can be applied:

1. The top-level workflow definition `version` property.
2. The Actions [subflowRef](#SubFlowRef-Definition) `version` property.

The `version` property must respect the [semantic versioning](https://semver.org/) guidelines.

### Workflow Constants

Workflow constants are used to define static and immutable data which is available to [Workflow Expressions](#Workflow-Expressions).
Constants can be defined via the [Workflow top-level "constants" property](#Workflow-Definition-Structure),
for example:

```json
"constants": {
  "Translations": {
    "Dog": {
      "Serbian": "pas",
      "Spanish": "perro",
      "French": "chien"
    }
  }
}
```

Constants can only be accessed inside Workflow expressions via the `$CONST` variable.
Runtimes must make `$CONST` available to expressions as a predefined variable.

Here is an example of using constants in Workflow expressions:

```json
{
  ...,
  "constants": {
    "AGE": {
      "MIN_ADULT": 18
    }
  },
  ...
  "states": [
    {
      "name": "check-applicant",
      "type": "switch",
      "dataConditions": [
        {
          "name": "applicant-is-adult",
          "condition": "${ .applicant | .age >= $CONST.AGE.MIN_ADULT }",
          "transition": "approve-application"
        },
        {
          "name": "applicant-is-minor",
          "condition": "${ .applicant | .age < $CONST.AGE.MIN_ADULT }",
          "transition": "reject-application"
        }
      ],
      ...
    },
    ...
  ]
}
```

Note that constants can also be used in [expression functions](#Using-Functions-for-Expression-Evaluation),
for example:

```json
{
  "functions": [
    {
      "name": "is-adult",
      "operation": ".applicant | .age >= $CONST.AGE.MIN_ADULT",
      "type": "expression"
    },
    {
      "name": "is-minor",
      "operation": ".applicant | .age < $CONST.AGE.MIN_ADULT",
      "type": "expression"
    }
  ]
}
```

Workflow constants values should only contain static data, meaning that their value should not
contain Workflow expressions.
Workflow constants data must be immutable.
Workflow constants should not have access to [Workflow secrets definitions](#Workflow-Secrets).

### Workflow Secrets

Secrets allow you to access sensitive information, such as passwords, OAuth tokens, SSH keys, etc.,
inside your [Workflow Expressions](#Workflow-Expressions).
You can define the names of secrets via the [Workflow top-level "secrets" property](#Workflow-Definition-Structure),
for example:

```json
"secrets": ["MY_PASSWORD", "MY_STORAGE_KEY", "MY_ACCOUNT"]
```

If secrets are defined in a Workflow definition, runtimes must ensure that their values are made available
during Workflow execution.

Secrets can be used only in [Workflow expressions](#Workflow-Expressions) by referencing them via the `$SECRETS` variable.
Runtimes must make `$SECRETS` available to expressions as a predefined variable.

Here is an example of how to use secrets and pass them as arguments to a function invocation:

```json
"secrets": ["AZURE_STORAGE_ACCOUNT", "AZURE_STORAGE_KEY"],

...

{
  "refName": "upload-to-azure",
  "arguments": {
    "account": "${ $SECRETS.AZURE_STORAGE_ACCOUNT }",
    "account-key": "${ $SECRETS.AZURE_STORAGE_KEY }",
    ...
  }
}
```

Note that secrets can also be used in [expression functions](#Using-Functions-for-Expression-Evaluation).

Secrets are immutable, meaning that workflow expressions are not allowed to change their values.

### Workflow Metadata

Metadata enables you to enrich the serverless workflow model with information beyond its core definitions.
It is intended to be used by clients, such as tools and libraries, as well as users that find this information relevant.

Metadata should not affect workflow execution. Implementations may choose to use metadata information or ignore it.
Note, however, that using metadata to control workflow execution can lead to vendor-locked implementations that do not comply
with the main goal of this specification, which is to be completely vendor-neutral.

Metadata includes key/value pairs (string types). Both keys and values are completely arbitrary and non-identifying.
Metadata can be added to:

- [Workflow Definition](#Workflow-Definition-Structure)
- [Function definitions](#Function-Definition)
- [Event definitions](#Event-Definition)
- [State definitions](#Workflow-States)
- [Switch state](#Switch-State) [data](#Switch-State-Data-Conditions) and [event](#Switch-State-Event-Conditions) conditions.

Here is an example of metadata attached to the core workflow definition:

```json
{
  "name": "process-sales-orders",
  "description": "Process Sales Orders",
  "version": "1.0.0",
  "specVersion": "0.8",
  "start": "my-starting-state",
  "metadata": {
    "loglevel": "Info",
    "environment": "Production",
    "category": "Sales",
    "giturl": "github.com/myproject",
    "author": "Author Name",
    "team": "Team Name",
    ...
  },
  "states": [
    ...
  ]
}
```

Some other examples of information that could be recorded in metadata are:

- UI tooling information such as sizing or scaling factors.
- Build, release, or image information such as timestamps, release ids, git branches, PR numbers, etc.
- Logging, monitoring, analytics, or audit repository information.
- Labels used for organizing/indexing purposes, such as "release", "stable", "track", "daily", etc.

### Workflow Context

Similar to [Constants](https://github.com/serverlessworkflow/specification/blob/main/specification.md#workflow-constants) and [Secrets](https://github.com/serverlessworkflow/specification/blob/main/specification.md#workflow-secrets), workflow expressions can have access to the context information of a running instance via the `$WORKFLOW` keyword.

Implementations may use this keyword to give access to any relevant information of the running instance within an expression.
For example:

```json
{
  "name": "process-sales-orders",
  "description": "Process Sales Orders",
  "version": "1.0.0",
  "specVersion": "0.8",
  "start": "my-starting-state",
  "functions": [{
    "name": "my-function",
    "operation": "myopenapi.json#myFunction"
  }],
  "states": [
    {
      "name": "my-starting-state",
      "type": "operation",
      "actions": [{
        "functionRef": "my-function",
        "arguments": {
          "order": "${ .orderId }",
          "callerId": "${ $WORKFLOW.instanceId }"
        }
      }],
      "end": true
    }
  ]
}
```

In this use case, a third-party service may require information from the caller for traceability purposes.

The specification doesn't define any specific variable within the `$WORKFLOW` bucket, but it's considered a reserved keyword.

### Naming Convention

Identifiable components of a workflow definition, such as states, actions, branches, events and functions, define a required non-null `name` property which is based on DNS label names as defined by [RFC 1123](https://datatracker.ietf.org/doc/html/rfc1123#page-13) with further restrictions.

Specifically, `name` values must be lowercase, start and end with an alphanumeric character, and consist entirely of alphanumeric characters with optional isolated medial dashes '-' (i.e., dashes must not be adjacent to each other).

The regular expression used in [schemas](/schema/workflow.json) is: `^[a-z0-9](-?[a-z0-9])*$`.

## Extensions

The workflow extension mechanism allows you to enhance your model definitions with additional information useful for
things like analytics, rate limiting, logging, simulation, debugging, tracing, etc.

Model extensions do not influence control-flow logic (workflow execution semantics).
They enhance it with extra information that can be consumed by runtime systems or tooling and
evaluated with the end goal being overall workflow improvements in terms of time, cost, efficiency, etc.
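As an illustration, extensions are referenced from a workflow definition via its top-level `extensions` property; a hedged sketch, in which the extension id and file path are illustrative:

```json
{
  "extensions": [
    {
      "extensionId": "workflow-kpi-extension",
      "path": "file://myextensions/kpi.yml"
    }
  ]
}
```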
The Serverless Workflow specification provides extensions which can be found [here](extensions/README.md).

You can define extensions in your workflow definition using its top-level `extensions` property.
For more information about this property, see the `extensions` property in the
[Workflow Definition Structure section](#Workflow-Definition-Structure).

Even though users can define their own extensions, it is encouraged to use the ones provided by the specification.
We also encourage users to contribute their extensions to the specification. That way they can be shared
with the rest of the community.

If you have an idea for a new workflow extension, or would like to enhance an existing one,
please open a `New Extension Request` issue in this repository.

## Use Cases

You can find different Serverless Workflow use cases [here](usecases/README.md).

## Examples

You can find many Serverless Workflow examples [here](examples/README.md).

## Comparison to other workflow languages

You can find information on how the Serverless Workflow language compares with
other workflow languages [here](comparisons/README.md).

## References

You can find a list of other languages, technologies and specifications related to workflows [here](references/README.md).

## License

The Serverless Workflow specification operates under the
[Apache License version 2.0](LICENSE).

diff --git a/usecases/README.md b/usecases/README.md
deleted file mode 100644
index cfdb707a..00000000
--- a/usecases/README.md
+++ /dev/null
@@ -1,63 +0,0 @@

# Use Cases

Use cases for the Serverless Workflow Specification highly depend on the reference implementations
and the ecosystem available during workflow execution (available functions/services/events, etc.).

As mentioned in the [main specification document](../README.md), one of the main benefits of Serverless Workflows
is that they provide clear separation of business and orchestration logic in your serverless apps.
Developers can focus on solving business logic inside functions and utilize workflows to define function invocations,
react to events, and provide data management for different microservices.

So what can you automate with Serverless Workflows? You can get some ideas from the use cases below.

## Table of Contents

- [Online Vehicle Auction](#Online-Vehicle-Auction)
- [Payment Processing](#Payment-Processing)
- [Data Analysis](#Data-Analysis)
- [Error Notifications](#Error-Notifications)
- [Continuous Integration And Deployment](#Continuous-Integration-And-Deployment)

## Online Vehicle Auction

You can use Serverless Workflows to coordinate all of the steps of an Online Vehicle Auction.
These can include:

- Authentication of users making bids.
- Communication with Bidding and Inventory services.
- Making decisions to start/end the auction under certain conditions.

## Payment Processing

Serverless Workflows are ideal for coordinating session-based apps such as e-commerce sites. You can
use Serverless Workflows to coordinate all steps of the checkout process, allowing, for example, users to take a picture
of their credit card rather than having to type in the numbers and information.

## Data Analysis

You can use Serverless Workflows to coordinate data analysis of Marketing and Sales information.
Analysis can be scheduled on a regular basis to trigger workflow coordination of different ETL services.

## Error Notifications

You can design Serverless Workflows that trigger notifications regarding their success or failure.
In conjunction with available messaging services, you can notify developers on different platforms of such possible failures,
including error information and the exact point in the execution where the failure happened.
At the same time, you can log the workflow execution status to cloud storage services for further analysis.

## Continuous Integration And Deployment

Serverless Workflows can help you build solid continuous integration and deployment solutions.
Code check-ins can trigger website builds and automatic redeploys. Pull requests can trigger
automated test runs to make sure code is well-tested before human reviews.