The Spring Blog

Stream Processing with Spring Cloud Stream and Apache Kafka Streams. Part 5 - Application Customizations

Part 1 - Programming Model
Part 2 - Programming Model Continued
Part 3 - Data deserialization and serialization
Part 4 - Error Handling

In this blog post, we continue our discussion on the support for Kafka Streams in Spring Cloud Stream. We are going to elaborate on the ways in which you can customize a Kafka Streams application.

Customizing the StreamsBuilderFactoryBean

The Kafka Streams binder uses the StreamsBuilderFactoryBean, provided by the Spring for Apache Kafka project, to build the StreamsBuilder object that is the foundation of a Kafka Streams application. This factory bean is a Spring lifecycle bean. Oftentimes, it must be customized before it is started, for various reasons. As described in the previous blog post on error handling, for example, you need to customize the StreamsBuilderFactoryBean if you want to register a production exception handler. Let's say that you have a production exception handler along these lines.
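A minimal sketch of such a handler, written against Apache Kafka's ProductionExceptionHandler contract, might look like the following (the class name and the skip-records-that-are-too-large policy are illustrative):

    import java.util.Map;

    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.errors.RecordTooLargeException;
    import org.apache.kafka.streams.errors.ProductionExceptionHandler;

    // Illustrative handler: skip records that the broker rejects as too
    // large, and fail the application on any other production error.
    public class IgnoreRecordTooLargeHandler implements ProductionExceptionHandler {

        @Override
        public void configure(Map<String, ?> configs) {
            // no configuration needed for this sketch
        }

        @Override
        public ProductionExceptionHandlerResponse handle(ProducerRecord<byte[], byte[]> record,
                                                         Exception exception) {
            if (exception instanceof RecordTooLargeException) {
                return ProductionExceptionHandlerResponse.CONTINUE;
            }
            return ProductionExceptionHandlerResponse.FAIL;
        }
    }

To register it, one option (assuming the binder's StreamsBuilderFactoryBeanCustomizer callback) is to set it on the factory bean's streams configuration before the bean starts:

    import org.apache.kafka.streams.StreamsConfig;
    import org.springframework.cloud.stream.binder.kafka.streams.StreamsBuilderFactoryBeanCustomizer;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;

    @Configuration
    public class KafkaStreamsCustomization {

        // Customize the StreamsBuilderFactoryBean before it is started
        @Bean
        public StreamsBuilderFactoryBeanCustomizer customizer() {
            return factoryBean -> factoryBean.getStreamsConfiguration().put(
                    StreamsConfig.DEFAULT_PRODUCTION_EXCEPTION_HANDLER_CLASS_CONFIG,
                    IgnoreRecordTooLargeHandler.class);
        }
    }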

Read more...

Spring Data R2DBC goes GA

On behalf of the team and everyone who contributed, I am delighted to announce that Spring Data R2DBC 1.0 is generally available from repo.spring.io as well as Maven Central!

Spring Data R2DBC 1.0 is a non-blocking database client library for the just-released R2DBC specification that lets you build reactive applications that use SQL databases. The most notable features of Spring Data R2DBC are listed below, with a brief sketch of the fluent API after the list:

  • Functional-reactive declaration of data access
  • Fluent API
  • Support for Transactions
  • Named parameter support (Dialect-aware)
  • Repositories
  • Kotlin Coroutines extensions
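
As a quick illustration of the fluent API and the named parameter support, here is a minimal sketch; the H2 connection URL and the person table are assumptions made for the example:

    import java.util.Map;

    import io.r2dbc.spi.ConnectionFactories;
    import io.r2dbc.spi.ConnectionFactory;
    import org.springframework.data.r2dbc.core.DatabaseClient;
    import reactor.core.publisher.Flux;

    public class R2dbcExample {

        public static void main(String[] args) {
            // In-memory H2 over R2DBC; any database with an R2DBC driver works
            ConnectionFactory connectionFactory =
                    ConnectionFactories.get("r2dbc:h2:mem:///testdb");
            DatabaseClient client = DatabaseClient.create(connectionFactory);

            // The named parameter :lastname is bound in a dialect-aware way
            Flux<Map<String, Object>> rows = client
                    .execute("SELECT * FROM person WHERE lastname = :lastname")
                    .bind("lastname", "Doe")
                    .fetch()
                    .all();

            rows.subscribe(row -> System.out.println(row));
        }
    }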
Read more...

Spring Boot 2.2.2 is now available

On behalf of the team and everyone who has contributed, I’m happy to announce that Spring Boot 2.2.2 has been released and is now available from repo.spring.io and Maven Central.

This release includes 88 fixes, improvements, and dependency upgrades. Thanks to all those who contributed issue reports and pull requests.

How can you help?

If you’re interested in helping out, check out the “ideal for contribution” tag in the issue repository. If you have general questions, please ask on stackoverflow.com using the spring-boot tag or chat with the community on Gitter.

Read more...

Spring Boot 2.1.11 is now available

On behalf of the team and everyone who has contributed, I’m happy to announce that Spring Boot 2.1.11 has been released and is now available from repo.spring.io and Maven Central.

This release includes 53 fixes, improvements, and dependency upgrades. Thanks to all those who contributed issue reports and pull requests.

How can you help?

If you’re interested in helping out, check out the “ideal for contribution” tag in the issue repository. If you have general questions, please ask on stackoverflow.com using the spring-boot tag or chat with the community on Gitter.

Read more...

Stream Processing with Spring Cloud Stream and Apache Kafka Streams. Part 4 - Error Handling

Part 1 - Programming Model
Part 2 - Programming Model Continued
Part 3 - Data deserialization and serialization

Continuing our series on the Spring Cloud Stream binder for Kafka Streams, in this blog post we look at the various error-handling strategies that the Kafka Streams binder provides.

Error handling in Kafka Streams is largely centered on errors that occur during deserialization on the inbound side and during production on the outbound side.
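
As one example of the inbound side, Apache Kafka Streams ships a log-and-continue handler for deserialization errors. A minimal sketch of enabling it through plain Kafka Streams configuration (the property name and handler class are from Apache Kafka itself) follows; the binder builds on this same mechanism:

    import java.util.Properties;

    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.errors.LogAndContinueExceptionHandler;

    public class StreamsErrorHandlingConfig {

        public static Properties streamsProperties() {
            Properties props = new Properties();
            // Log and skip records that fail to deserialize instead of
            // shutting down the topology
            props.put(StreamsConfig.DEFAULT_DESERIALIZATION_EXCEPTION_HANDLER_CLASS_CONFIG,
                    LogAndContinueExceptionHandler.class);
            return props;
        }
    }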

Handling Deserialization Exceptions

Read more...

Spring Batch 4.0.4, 4.1.3 and 4.2.1 available now!

I am pleased to announce the release of Spring Batch 4.0.4, 4.1.3, and 4.2.1 with bug fixes as well as documentation and dependency updates. Please find the complete list of changes in the release notes: 4.0.4, 4.1.3, 4.2.1.

As we announced earlier this year, version 4.0.4 is the last release of the 4.0 line. The 4.1.x line will get another bug-fix release next year, and 4.1.4 will be the last release for that line. Please upgrade to v4.2+ at your earliest convenience, as this is currently the primary active branch and will be supported until the end of 2020.

Read more...

Stream Processing with Spring Cloud Stream and Apache Kafka Streams. Part 3 - Data deserialization and serialization

Part 1 - Programming Model
Part 2 - Programming Model Continued

Continuing from the previous two blog posts in this series on writing stream processing applications with Spring Cloud Stream and Kafka Streams, we now look at the details of how these applications handle deserialization on the inbound and serialization on the outbound.

All three major higher-level types in Kafka Streams - KStream<K,V>, KTable<K,V> and GlobalKTable<K,V> - work with a key and a value.

With Spring Cloud Stream's Kafka Streams support, keys are always deserialized and serialized by using the native Serde mechanism. A Serde is a container object that provides both a deserializer and a serializer.
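
As a small illustration of that contract, consider Kafka's built-in String Serde (the topic name below is arbitrary):

    import org.apache.kafka.common.serialization.Serde;
    import org.apache.kafka.common.serialization.Serdes;

    public class SerdeDemo {

        public static void main(String[] args) {
            Serde<String> serde = Serdes.String();

            // A Serde bundles a serializer and its matching deserializer
            byte[] bytes = serde.serializer().serialize("some-topic", "hello");
            String value = serde.deserializer().deserialize("some-topic", bytes);

            System.out.println(value); // prints "hello"
        }
    }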

Read more...

Spring Data Moore SR3 and Lovelace SR14 released

On behalf of the community, we are pleased to announce that Spring Data Moore SR3 and Lovelace SR14 are now available from Maven Central. Both releases ship with almost 70 tickets in total, mostly bug fixes and dependency upgrades.

Moore SR3 is built on top of the recently released Spring Framework 5.2.2 and will be picked up by Spring Boot 2.2.2 for easier consumption. Lovelace SR14 is built on top of the recently released Spring Framework 5.1.12 and will be picked up by Spring Boot 2.1.11.

Read more...

Stream Processing with Spring Cloud Stream and Apache Kafka Streams. Part 2 - Programming Model Continued

On the heels of the previous blog post, in which we introduced the basic functional programming model for writing streaming applications with Spring Cloud Stream and Kafka Streams, in this part we are going to explore that programming model further.

Let’s look at a few scenarios.

Scenario 1: Single input and output binding

If your application consumes data from a single input binding and produces data to a single output binding, you can use Java's Function interface to do that. Keep in mind that a binding in this sense is not necessarily mapped to a single input Kafka topic: multiple topics can be multiplexed onto a single input binding by configuring them as a comma-separated list on that binding (see below for an example). On the outbound side, the binding maps to a single topic.
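
Here is a minimal sketch of this scenario; the bean name and the uppercasing logic are illustrative:

    import java.util.function.Function;

    import org.apache.kafka.streams.kstream.KStream;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;

    @Configuration
    public class UppercaseProcessor {

        // Consumes from the input binding (process-in-0, by convention) and
        // produces to the output binding (process-out-0)
        @Bean
        public Function<KStream<String, String>, KStream<String, String>> process() {
            return input -> input.mapValues(value -> value.toUpperCase());
        }
    }

To multiplex several topics onto the single input binding, they can be configured as a comma-separated destination, for example spring.cloud.stream.bindings.process-in-0.destination=topic-1,topic-2 (the topic names here are illustrative).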

Read more...