Stream Processing with Spring Cloud Stream and Apache Kafka Streams. Part 6 - State Stores and Interactive Queries

Engineering | Soby Chacko | December 09, 2019 | ...

Part 1 - Programming Model | Part 2 - Programming Model Continued | Part 3 - Data deserialization and serialization | Part 4 - Error Handling | Part 5 - Application Customizations

In this part (the sixth and final one of this series), we are going to look at the ways in which the Spring Cloud Stream binder for Kafka Streams supports state stores and interactive queries in Kafka Streams.

Named State Stores

When you need to maintain state in the application, Kafka Streams lets you materialize that state information into a named state store. There are several operations in Kafka Streams that require it to keep track of state, such as count, aggregate, reduce, various windowing operations, and others. In most cases, Kafka Streams uses RocksDB, an embedded key-value store, for maintaining this state store (unless you explicitly change…
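
As a minimal sketch of that idea, the functional-style processor below counts records per key and materializes the running count into a named state store. The store name ("word-counts"), class name, bean name, and String/Long types are illustrative assumptions, not anything prescribed by the binder.

```java
import java.util.function.Function;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.kstream.Grouped;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.Materialized;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class WordCountProcessor {

    // Counts records per key and materializes the running counts into a named
    // state store ("word-counts") that can later be queried interactively.
    @Bean
    public Function<KStream<String, String>, KStream<String, Long>> process() {
        return input -> input
                .groupByKey(Grouped.with(Serdes.String(), Serdes.String()))
                .count(Materialized.as("word-counts"))
                .toStream();
    }
}
```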

Stream Processing with Spring Cloud Stream and Apache Kafka Streams. Part 5 - Application Customizations

Engineering | Soby Chacko | December 06, 2019 | ...

Part 1 - Programming Model | Part 2 - Programming Model Continued | Part 3 - Data deserialization and serialization | Part 4 - Error Handling

In this blog post, we continue our discussion on the support for Kafka Streams in Spring Cloud Stream. We are going to elaborate on the ways in which you can customize a Kafka Streams application.

Customizing the StreamsBuilderFactoryBean

The Kafka Streams binder uses the StreamsBuilderFactoryBean, provided by the Spring for Apache Kafka project, to build the StreamsBuilder object that is the foundation for a Kafka Streams application. This factory bean is a Spring lifecycle bean. Oftentimes, this factory bean must be customized before it is started, for various reasons. As described in the previous blog post on error handling, you need to customize the StreamsBuilderFactoryBean
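
As a rough sketch of such a customization, assuming the StreamsBuilderFactoryBeanCustomizer callback that the binder exposes, the bean below registers a state listener on the factory bean before it starts; the class name, bean name, and the particular customization are made up for the example.

```java
import org.springframework.cloud.stream.binder.kafka.streams.StreamsBuilderFactoryBeanCustomizer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class FactoryBeanCustomization {

    // Customizes the StreamsBuilderFactoryBean before it is started, here by
    // registering a KafkaStreams.StateListener that logs state transitions.
    @Bean
    public StreamsBuilderFactoryBeanCustomizer customizer() {
        return factoryBean -> factoryBean.setStateListener(
                (newState, oldState) ->
                        System.out.println("Kafka Streams state: " + oldState + " -> " + newState));
    }
}
```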

Spring Data R2DBC goes GA

Releases | Mark Paluch | December 06, 2019 | ...

On behalf of the team and everyone who contributed, I am delighted to announce that Spring Data R2DBC 1.0 is generally available from repo.spring.io as well as Maven Central!

Spring Data R2DBC 1.0 is a non-blocking database client library for the just-released R2DBC specification that lets you build reactive applications that use SQL databases. The most notable features of Spring Data R2DBC are (a brief repository sketch follows the list):

  • Functional-reactive declaration of data access
  • Fluent API
  • Support for Transactions
  • Named parameter support (Dialect-aware)
  • Repositories
  • Kotlin Coroutines extensions
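
As a non-authoritative sketch of the repository and named-parameter support, assuming a hypothetical Person table and repository (names and SQL are illustrative, not taken from the release):

```java
import org.springframework.data.annotation.Id;
import org.springframework.data.r2dbc.repository.Query;
import org.springframework.data.repository.reactive.ReactiveCrudRepository;

import reactor.core.publisher.Flux;

// Hypothetical entity mapped to a "person" table.
class Person {

    @Id
    private Long id;
    private String firstname;
    private String lastname;

    // getters and setters omitted for brevity
}

// A reactive repository: CRUD methods plus a query using a named parameter,
// all returning non-blocking Reactor types.
interface PersonRepository extends ReactiveCrudRepository<Person, Long> {

    @Query("SELECT * FROM person WHERE lastname = :lastname")
    Flux<Person> findByLastname(String lastname);
}
```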

Spring Data R2DBC 1.0 requires JDK 8 or higher and any R2DBC driver. Head over to start.spring.io and add R2DBC to configure your dependencies or, if you're already using the Spring Boot R2DBC starter, upgrade your spring-boot-bom-r2dbc to 0.1.0.M3.

Spring Boot 2.2.2 is now available

Releases | Brian Clozel | December 06, 2019 | ...

On behalf of the team and everyone who has contributed, I'm happy to announce that Spring Boot 2.2.2 has been released and is now available from repo.spring.io and Maven Central.

This release includes 88 fixes, improvements, and dependency upgrades. Thanks to all those who have contributed with issue reports and pull requests.

How can you help?

If you're interested in helping out, check out the "ideal for contribution" tag in the issue repository. If you have general questions, please ask on stackoverflow.com using the spring-boot tag or chat with the community on Gitter.

Project Page | GitHub | Issues | Documentation | Stack Overflow

Spring Boot 2.1.11 is now available

Releases | Madhura Bhave | December 06, 2019 | ...

On behalf of the team and everyone who has contributed, I'm happy to announce that Spring Boot 2.1.11 has been released and is now available from repo.spring.io and Maven Central.

This release includes 53 fixes, improvements, and dependency upgrades. Thanks to all those who have contributed with issue reports and pull requests.

How can you help?

If you're interested in helping out, check out the "ideal for contribution" tag in the issue repository. If you have general questions, please ask on stackoverflow.com using the spring-boot tag or chat with the community on Gitter.

Project Page | GitHub | Issues | Documentation | Stack Overflow

Stream Processing with Spring Cloud Stream and Apache Kafka Streams. Part 4 - Error Handling

Engineering | Soby Chacko | December 05, 2019 | ...

Part 1 - Programming Model | Part 2 - Programming Model Continued | Part 3 - Data deserialization and serialization

Continuing our series on the Spring Cloud Stream binder for Kafka Streams, in this blog post we look at the various error-handling strategies available in the Kafka Streams binder.

The error handling in Kafka Streams is largely centered around errors that occur during deserialization on the inbound and during production on the outbound.

Handling Deserialization Exceptions

Kafka Streams lets you register deserialization exception handlers. By default, when a deserialization exception occurs, it logs the error and fails the application (LogAndFailExceptionHandler). Alternatively, you can log the error, skip the offending record, and continue processing (LogAndContinueExceptionHandler). Normally, you provide the corresponding…
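
To ground this in the plain Kafka Streams API (the binder itself exposes equivalent configuration properties, as the post goes on to describe), the handler is registered through the streams configuration; the application id and bootstrap servers below are placeholder values.

```java
import java.util.Properties;

import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.errors.LogAndContinueExceptionHandler;

public class DeserializationErrorConfig {

    // Log records that fail deserialization and keep processing, instead of the
    // default LogAndFailExceptionHandler behavior of shutting the application down.
    static Properties streamsProperties() {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "error-handling-demo");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_DESERIALIZATION_EXCEPTION_HANDLER_CLASS_CONFIG,
                LogAndContinueExceptionHandler.class);
        return props;
    }
}
```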

Spring Batch 4.0.4, 4.1.3 and 4.2.1 available now!

Releases | Mahmoud Ben Hassine | December 04, 2019 | ...

I am pleased to announce the release of Spring Batch 4.0.4, 4.1.3 and 4.2.1 with bug fixes as well as documentation and dependencies updates. Please find the complete list of changes in the release notes: 4.0.4, 4.1.3, 4.2.1.

As we announced earlier this year, version 4.0.4 is the last release of the 4.0 line. The 4.1.x line will get another bug fix release next year and 4.1.4 will be the last release for this line. Please upgrade to v4.2+ at your earliest convenience as this is the primary active branch for the moment.

The next feature release will be 4.3, with a GA planned for October 202…

Stream Processing with Spring Cloud Stream and Apache Kafka Streams. Part 3 - Data deserialization and serialization

Engineering | Soby Chacko | December 04, 2019 | ...

Part 1 - Programming Model | Part 2 - Programming Model Continued

Continuing from the previous two blog posts in this series on writing stream processing applications with Spring Cloud Stream and Kafka Streams, we will now look at the details of how these applications handle deserialization on the inbound and serialization on the outbound.

All three major higher-level types in Kafka Streams - KStream<K,V>, KTable<K,V> and GlobalKTable<K,V> - work with a key and a value.

With Spring Cloud Stream Kafka Streams support, keys are always deserialized and serialized by using the native Serde mechanism. A Serde is a container object that provides a deserializer and a…
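
For illustration, here is how a Serde pairs a serializer with a deserializer in plain Kafka Streams; the topic names and types are made up, and with the binder much of this wiring is typically handled through configuration rather than written by hand.

```java
import org.apache.kafka.common.serialization.Serde;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.Produced;

public class SerdeExample {

    static void buildTopology(StreamsBuilder builder) {
        // A Serde is simply a pairing of a serializer and a deserializer for one type.
        Serde<String> stringSerde =
                Serdes.serdeFrom(new StringSerializer(), new StringDeserializer());
        Serde<Long> longSerde = Serdes.Long();

        // Wire the Serdes explicitly when consuming from and producing to topics.
        KStream<String, Long> scores =
                builder.stream("scores", Consumed.with(stringSerde, longSerde));
        scores.to("scores-out", Produced.with(stringSerde, longSerde));
    }
}
```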

Spring Data Moore SR3 and Lovelace SR14 released

Releases | Jens Schauder | December 04, 2019 | ...

On behalf of the community, we are pleased to announce that Spring Data Moore SR3 and Lovelace SR14 are now available from Maven Central. Both releases ship with almost 70 tickets in total, mostly bugfixes and dependency upgrades.

Moore SR3 is built on top of the recently released Spring Framework 5.2.2 and will be picked up by Spring Boot 2.2.2 for easier consumption. Lovelace SR14 is built on top of the recently released Spring Framework 5.1.12 and will be picked up by Spring Boot 2.1.11 for easier consumption.

Here are links to the reference documentation, changelogs, and artifacts of…
