The Spring Cloud Data Flow team is pleased to announce the release of 1.7 M1. Follow the Getting Started guides for Local Server, Cloud Foundry, and Kubernetes.
Highlights of the release include:

New Dashboard UI
Stream Application DSL
Audit trail
Concurrent Task Launch Limiting
Stream and Task validation
Force upgrade for Streams
The UI has a completely new look. The navigation has moved from tabs to a left-side navigation system, which frees up screen real estate for creating streams with the Flo designer; you can reclaim even more by minimizing the left-side navigation. A quick search feature searches across all the Data Flow categories. Additional colors and overall theme changes make the UI look more lively. Deeper in the core, route management has been improved, and we have increased our end-to-end testing coverage using BrowserStack/SauceLabs.
Not all use cases can be solved by a linear pipeline in which data flows from source to processor to sink, with a single destination connecting each application. Some use cases require a collection of applications with multiple inputs and outputs. This topology is supported in Spring Cloud Stream through user-defined binding interfaces, but was not supported in Data Flow. It is also common for Kafka Streams applications to have multiple inputs and outputs.
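As an illustrative sketch (not taken from the release itself), a user-defined binding interface with one input and two outputs might look like the following; the interface and channel names are hypothetical:

```java
import org.springframework.cloud.stream.annotation.Input;
import org.springframework.cloud.stream.annotation.Output;
import org.springframework.messaging.MessageChannel;
import org.springframework.messaging.SubscribableChannel;

// Hypothetical binding interface for an application that consumes orders
// and routes drinks to one of two named output channels.
public interface Barista {

    @Input("orders")
    SubscribableChannel orders();

    @Output("hotDrinks")
    MessageChannel hotDrinks();

    @Output("coldDrinks")
    MessageChannel coldDrinks();
}
```

An application would activate these bindings with @EnableBinding(Barista.class), and each named channel's destination would then be set through binding properties at deployment time.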
In addition, not all use cases are solved using Spring Cloud Stream applications. An HTTP gateway application that sends a synchronous request/reply message to a Kafka or RabbitMQ application can be written using only Spring Integration.
In these cases, Data Flow can’t make assumptions about the flow of data from one application to another and therefore can’t set the application’s destination properties as is done when using the Stream Pipeline DSL.
To address these use cases we have introduced the Stream Application DSL. This DSL uses a comma, instead of the pipe symbol, to indicate that Data Flow should not configure the binding properties of the applications. Instead, the developer needs to set the appropriate deployment properties to 'wire up' the applications. An example of the DSL, using the domain of the EIP Cafe example, is:
dataflow:> stream create --definition "orderGeneratorApp, baristaApp, hotDrinkDeliveryApp, coldDrinkDeliveryApp" --name myCafeStream
The applications listed in the DSL need to be registered using the app application type, rather than as a source, processor, or sink.
In this stream, the baristaApp has two output destinations, hotDrinks and coldDrinks, intended to be consumed by the hotDrinkDeliveryApp and coldDrinkDeliveryApp respectively. When deploying the stream, set the destination properties so that the destinations match up for the desired flow of data, for example:
app.baristaApp.spring.cloud.stream.bindings.hotDrinks.destination=hotDrinksDest
app.baristaApp.spring.cloud.stream.bindings.coldDrinks.destination=coldDrinksDest
app.hotDrinkDeliveryApp.spring.cloud.stream.bindings.input.destination=hotDrinksDest
app.coldDrinkDeliveryApp.spring.cloud.stream.bindings.input.destination=coldDrinksDest
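Putting it together, and assuming the shell's --properties option, the deployment properties above could be passed when deploying the stream (a sketch, not verbatim from the release notes):

```
dataflow:> stream deploy --name myCafeStream --properties "app.baristaApp.spring.cloud.stream.bindings.hotDrinks.destination=hotDrinksDest,app.baristaApp.spring.cloud.stream.bindings.coldDrinks.destination=coldDrinksDest,app.hotDrinkDeliveryApp.spring.cloud.stream.bindings.input.destination=hotDrinksDest,app.coldDrinkDeliveryApp.spring.cloud.stream.bindings.input.destination=coldDrinksDest"
```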
To help answer the question "Who did what and when?", an audit trail has been introduced to record the actions taken involving app registration, schedules, streams, and tasks. For apps and schedules, the creation and deletion actions are audited. For streams, the creation, deletion, deployment, undeployment, update, and rollback actions are audited. For tasks, the creation, launch, and destroy actions are audited. The audit information is available to query in the UI. Accessing audit information in the shell is forthcoming.
Spring Cloud Data Flow allows you to enforce a maximum number of concurrently running tasks to prevent the saturation of compute resources. This limit can be configured by setting the spring.cloud.dataflow.task.maximum-concurrent-tasks property. The default value is 20. You can also retrieve the current number of concurrently executing tasks through the REST endpoint /tasks/executions/current. A new tasklauncher-dataflow application makes use of this feature to launch tasks only if the number of concurrent tasks is below the maximum. The feature is also at the center of a new FTP ingest sample application that is under development. A sneak peek is available in the Cloud-native patterns for Data-intensive applications webinar.
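As a sketch, assuming the local server jar and its default port of 9393, the limit can be raised at server startup and the current count queried over REST (the value 50 is just an example):

```
# Set a custom limit when starting the Data Flow server (example value)
java -jar spring-cloud-dataflow-server-local-1.7.0.M1.jar \
  --spring.cloud.dataflow.task.maximum-concurrent-tasks=50

# Query the current number of running task executions
curl http://localhost:9393/tasks/executions/current
```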
The new shell commands stream validate and task validate will validate that the stream or task application resources are valid and accessible, avoiding exceptions at deployment time. Validation using the UI is forthcoming.
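For example, assuming the --name option and the hypothetical stream and task names below, validation from the shell looks like:

```
dataflow:> stream validate --name myCafeStream
dataflow:> task validate --name myTaskDefinition
```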
Use the force! When upgrading a stream, you can now use the --force option to deploy new instances of currently deployed applications even if no application or deployment properties have changed. This behavior is needed when configuration information is obtained by the application itself at startup time, for example from Spring Cloud Config Server. You can specify which applications to force upgrade with the --app-names option. If you do not specify any application names, all the applications are force upgraded. The --app-names option can be specified together with --force.
As always, we welcome feedback and contributions, so please reach out to us on Stack Overflow, GitHub, or Gitter.