Stream Processing with Spring Cloud Stream and Apache Kafka Streams. Part 6 - State Stores and Interactive Queries

Engineering | Soby Chacko | December 09, 2019 | ...

Part 1 - Programming Model
Part 2 - Programming Model Continued
Part 3 - Data deserialization and serialization
Part 4 - Error Handling
Part 5 - Application Customizations

In this part (the sixth and final one of this series), we are going to look at the ways in which the Spring Cloud Stream binder for Kafka Streams supports state stores and interactive queries in Kafka Streams.

Named State Stores

When you need to maintain state in the application, Kafka Streams lets you materialize that state information into a named state store. Several operations in Kafka Streams require it to keep track of state, such as count, aggregate, reduce, various windowing operations, and others. In most cases, Kafka Streams uses an embedded key-value store called RocksDB to maintain this state store (unless you explicitly change…
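As a quick illustration, here is a minimal sketch of what materializing state into a named store can look like with the functional model; the bean name and the "word-counts" store name are illustrative assumptions, not taken from the post:

```java
// Illustrative word-count processor; the bean name and the "word-counts" store
// name are assumptions for this sketch, not taken from the post.
import java.util.Arrays;
import java.util.function.Function;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.common.utils.Bytes;
import org.apache.kafka.streams.kstream.Grouped;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.Materialized;
import org.apache.kafka.streams.state.KeyValueStore;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class WordCountProcessor {

    @Bean
    public Function<KStream<String, String>, KStream<String, Long>> process() {
        return input -> input
                // split each incoming value into individual words
                .flatMapValues(value -> Arrays.asList(value.toLowerCase().split("\\W+")))
                // re-key the stream by the word itself
                .groupBy((key, word) -> word, Grouped.with(Serdes.String(), Serdes.String()))
                // count occurrences and materialize the counts into a named state store
                .count(Materialized.<String, Long, KeyValueStore<Bytes, byte[]>>as("word-counts")
                        .withKeySerde(Serdes.String())
                        .withValueSerde(Serdes.Long()))
                // stream the updated counts downstream
                .toStream();
    }
}
```

Once the counts are materialized into a named store like this, the store becomes addressable for interactive queries, which is the topic the post goes on to cover.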

Stream Processing with Spring Cloud Stream and Apache Kafka Streams. Part 5 - Application Customizations

Engineering | Soby Chacko | December 06, 2019 | ...

Part 1 - Programming Model
Part 2 - Programming Model Continued
Part 3 - Data deserialization and serialization
Part 4 - Error Handling

In this blog post, we continue our discussion on the support for Kafka Streams in Spring Cloud Stream. We are going to elaborate on the ways in which you can customize a Kafka Streams application.

Customizing the StreamsBuilderFactoryBean

The Kafka Streams binder uses the StreamsBuilderFactoryBean, provided by the Spring for Apache Kafka project, to build the StreamsBuilder object that is the foundation of a Kafka Streams application. This factory bean is a Spring lifecycle bean. Oftentimes, this factory bean must be customized before it is started, for various reasons. As described in the previous blog post on error handling, you need to customize the StreamsBuilderFactoryBean…
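As a quick illustration, here is a minimal sketch of such a customization, assuming the StreamsBuilderFactoryBeanCustomizer callback exposed by the 3.0 line of the binder; the binder applies the customizer to the factory bean before it is started, and the state listener shown here is purely illustrative:

```java
// Illustrative customization; it assumes the StreamsBuilderFactoryBeanCustomizer
// callback exposed by the 3.0 line of the Kafka Streams binder.
import org.springframework.cloud.stream.binder.kafka.streams.StreamsBuilderFactoryBeanCustomizer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class FactoryBeanCustomization {

    @Bean
    public StreamsBuilderFactoryBeanCustomizer customizer() {
        // the binder invokes this callback before the factory bean is started
        return factoryBean -> factoryBean.setStateListener((newState, oldState) ->
                // react to KafkaStreams state transitions (logging is illustrative)
                System.out.println("State transition: " + oldState + " -> " + newState));
    }
}
```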

Stream Processing with Spring Cloud Stream and Apache Kafka Streams. Part 4 - Error Handling

Engineering | Soby Chacko | December 05, 2019 | ...

Part 1 - Programming Model
Part 2 - Programming Model Continued
Part 3 - Data deserialization and serialization

Continuing our series on the Spring Cloud Stream binder for Kafka Streams, in this blog post we look at the various error-handling strategies that are available in the Kafka Streams binder.

The error handling in Kafka Streams is largely centered around errors that occur during deserialization on the inbound and during production on the outbound.

Handling Deserialization Exceptions

Kafka Streams lets you register deserialization exception handlers. The default behavior (LogAndFailExceptionHandler) is to log the error and fail the application when a deserialization exception occurs. Kafka Streams also lets you log the error, skip the record, and continue processing (LogAndContinueExceptionHandler). Normally, you provide the corresponding…
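To make the mechanism concrete, here is a minimal sketch of the plain Kafka Streams configuration these handlers plug into; the application id and bootstrap servers are illustrative values, and with the binder you would normally set an equivalent binder property rather than build the Properties by hand:

```java
// Plain Kafka Streams configuration showing where the handler plugs in; the
// application id and bootstrap servers are illustrative values.
import java.util.Properties;

import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.errors.LogAndContinueExceptionHandler;

public class DeserializationHandlerConfig {

    public static Properties streamsProperties() {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "error-handling-sample");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        // log the error, skip the offending record, and keep the application running
        props.put(StreamsConfig.DEFAULT_DESERIALIZATION_EXCEPTION_HANDLER_CLASS_CONFIG,
                LogAndContinueExceptionHandler.class);
        return props;
    }
}
```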

Stream Processing with Spring Cloud Stream and Apache Kafka Streams. Part 3 - Data deserialization and serialization

Engineering | Soby Chacko | December 04, 2019 | ...

Part 1 - Programming Model
Part 2 - Programming Model Continued

Continuing from the previous two blog posts in this series on writing stream processing applications with Spring Cloud Stream and Kafka Streams, we now look at the details of how these applications handle deserialization on the inbound and serialization on the outbound.

All three major higher-level types in Kafka Streams - KStream<K,V>, KTable<K,V> and GlobalKTable<K,V> - work with a key and a value.

With the Spring Cloud Stream Kafka Streams support, keys are always deserialized and serialized by using the native Serde mechanism. A Serde is a container object that provides a deserializer and a…
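As a small illustration of what a Serde bundles, the following sketch round-trips a value through the built-in String Serde; the topic name used here is purely illustrative:

```java
// Round-trip a value through the built-in String Serde; the topic name is illustrative.
import org.apache.kafka.common.serialization.Serde;
import org.apache.kafka.common.serialization.Serdes;

public class SerdeExample {

    public static void main(String[] args) {
        Serde<String> serde = Serdes.String();

        // the serializer half turns a value into bytes for a given topic
        byte[] bytes = serde.serializer().serialize("orders", "hello kafka");

        // the deserializer half turns those bytes back into the original value
        String roundTripped = serde.deserializer().deserialize("orders", bytes);

        System.out.println(roundTripped); // prints "hello kafka"
    }
}
```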

Stream Processing with Spring Cloud Stream and Apache Kafka Streams. Part 2 - Programming Model Continued

Engineering | Soby Chacko | December 03, 2019 | ...

On the heels of the previous blog in which we introduced the basic functional programming model for writing streaming applications with Spring Cloud Stream and Kafka Streams, in this part, we are going to further explore that programming model.

Let’s look at a few scenarios.

Scenario 1: Single input and output binding

If your application consumes data from a single input binding and produces data into an output binding, you can use Java’s Function interface to do that. Keep in mind that binding in this sense is not necessarily mapped to a single input Kafka topic, because topics could be…
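Here is a minimal sketch of that shape, using an illustrative uppercase transformation; the bean name and the record types are assumptions for this sketch:

```java
// Illustrative single input/output processor; the uppercase transformation and
// bean name are assumptions for this sketch.
import java.util.function.Function;

import org.apache.kafka.streams.kstream.KStream;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class UppercaseProcessor {

    @Bean
    public Function<KStream<String, String>, KStream<String, String>> uppercase() {
        // the binder wires the function's input to one binding and its output to another
        return input -> input.mapValues(value -> value.toUpperCase());
    }
}
```

With the functional binding conventions, the bindings for this bean are named uppercase-in-0 and uppercase-out-0, and each of them is mapped to a Kafka topic through the usual spring.cloud.stream.bindings destination properties.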

This Week in Spring - December 3, 2019

Engineering | Josh Long | December 03, 2019 | ...

Hi, Spring fans! Can you believe - and I can't, by the way - that we're already in December 2019? The last month before the new year? The last month of this decade? It defies belief! I can't even imagine how we got this far so quickly, but it's great that we did. I started writing This Week in Spring in the first week of January 2011, so we're fast approaching 9 years of This Week in Spring!

As I write this I am in Toronto, Canada, for the last stop on the SpringOne Tour train for 2019. I enjoyed giving a two-hour talk introducing all sorts of stuff in the wide world of Reactive Spring…

Stream Processing with Spring Cloud Stream and Apache Kafka Streams. Part 1 - Programming Model

Engineering | Soby Chacko | December 02, 2019 | ...

This is the first in a series of blog posts in which we will look at how stream processing applications are written using Spring Cloud Stream and Kafka Streams.

The Spring Cloud Stream Horsham release (3.0.0) introduces several changes to the way applications can leverage Apache Kafka using the binders for Kafka and Kafka Streams. One of the major enhancements that this release brings to the table is first class support for writing apps by using a fully functional programming paradigm. This blog post gives an introduction to how this functional programming model can be used to develop stream…
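As a small taste of that functional style, here is a minimal sketch of a processor expressed as a java.util.function.Consumer, which the binder binds to a single input topic with no outbound binding; the bean name and record types are illustrative assumptions:

```java
// Illustrative Consumer-style processor; the bean name and record types are
// assumptions for this sketch.
import java.util.function.Consumer;

import org.apache.kafka.streams.kstream.KStream;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class OrderListener {

    @Bean
    public Consumer<KStream<String, String>> logOrders() {
        // consume a single input binding; a real application would apply stream operations here
        return input -> input.foreach((key, value) -> System.out.println(key + " -> " + value));
    }
}
```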

A Bootiful Podcast: Spring Tools lead Martin Lippert

Engineering | Josh Long | November 29, 2019 | ...

Hi, Spring fans! In today's episode Josh Long (@starbuxman) talks to Spring Tools lead Martin Lippert (@martinlippert) about his time at Pivotal, and on the Spring team, his work on Spring Tools, and his work on language servers that now serve as the foundational integration for Spring users using Microsoft's Visual Studio Code, emacs and Atom, among other things.

Thanks, dear listener, and Happy Thanksgiving!

This Week in Spring - November 26th, 2019

Engineering | Josh Long | November 26, 2019 | ...

Hi, Spring fans! Welcome to yet another installment of This Week in Spring! This week, I'm in Tokyo, Japan, for the Pivotal Summit Japan event. I've regretfully had to miss the China and Korea events because of a family emergency, so it's nice to be able to make this, the last stop on the tour, before returning to California to celebrate Thanksgiving with the family.

And, on that note... it's almost Thanksgiving in the US. Thanksgiving is a time for us in the US to reflect on that for which we're thankful. I think I speak for the entire Spring team when I say that we are very grateful for you…
