Integration Testing for Kafka

We’re creating more and more complicated data pipelines and systems with Kafka. The interactions between these systems are becoming even more complex as we create microservices.

As we create these complex systems, we often aren’t thinking about how to test, debug, or fix them. Those three things are the defining factors of a project’s ongoing success.

What Are Integration Tests?

Integration tests are longer-running tests that exercise several different parts of a system. For example, if you were writing an integration test for a microservice, you’d test the very first stage or input to the microservice, then the second input, and so on, all the way to the very last interaction with the microservice. In a microservice, this is often the output that gets sent back to the user.

These tests can take several seconds to run. They may not be part of the test suite that developers run before checking in code. They’re part of the tests run on the continuous integration server.

Integration tests stand in contrast to unit tests. A unit test exercises a much smaller piece of the code. Unit tests shouldn’t cross microservice boundaries and should only test very specific functionality. They should finish in milliseconds.

Why Do Integration Testing?

I think integration tests have two important jobs during the development cycle.

The first is during the initial development. You’ll need to run your various microservices, but during the early development phases some of the code may not be written yet. You’ll need a way to stub out the parts that aren’t there yet. As you finish each part, you can swap it into your integration test and verify that it works. In theory, the integration test is already passing against the stubs, so when you add the newly written piece and the test still passes, you’ll know the new piece is working correctly.

The second, and perhaps most important, is testing your code after deployment. This breaks down into two parts: testing new features/changes and fixing bugs in production.

As you add new features or make any changes to your data pipeline, you’ll need to make sure you didn’t break anything. Remember that data pipelines aren’t something that you write and walk away from — they’re constantly evolving. You need to verify that your addition didn’t break something down the line. Also, you may not remember every interaction one microservice has with another. The integration test should validate all of these interactions.

When I mentor a team, I evaluate them on several different levels. One of them is whether they take their testing seriously. If the team has a problem in production and they can’t reproduce that bug at a breakpoint in their IDE within 10 minutes, they have two problems: a production problem and a lack of unit or integration tests. Scrambling to fix a production problem is not the time to be writing an integration test; that should have already been done. The developer should just have to copy/paste the unit test, change the inputs, and rerun it.

If you’re new to Kafka or Big Data, this level of testing might sound like overkill. I’ve seen microservices make 50 calls. If you think you can simulate those calls easily, think again. Overall, these integration tests will save you a great deal of time because you’re not sitting there waiting for a breakpoint to hit. The integration tests will run faster than manually staging the environment.

How Do You Do Integration Testing With Kafka?

First off, I don’t think you need to do integration tests against Kafka itself. You don’t need to run a Kafka broker and make network calls to it. That adds extra moving parts and will slow your tests down. IMHO, that level of testing should be done in a QA phase, not during the development cycle. You’ll want it to be as easy as possible to write, run, and debug the tests.

In two other posts, I cover the specific mechanics of unit testing Kafka with mock objects: one on unit testing producers and one on unit testing consumers. Please refer to those posts for the exact mechanics of writing the unit tests.
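
Instead of a broker, the tests can be built around the MockProducer and MockConsumer classes that ship with the kafka-clients library. Here is a minimal sketch of that setup, assuming JUnit 4, String keys and values, and a hypothetical topic named first-topic (the class name, topic name, and serializer choices are just illustrations):

    import java.util.Collections;

    import org.apache.kafka.clients.consumer.MockConsumer;
    import org.apache.kafka.clients.consumer.OffsetResetStrategy;
    import org.apache.kafka.clients.producer.MockProducer;
    import org.apache.kafka.common.TopicPartition;
    import org.apache.kafka.common.serialization.StringSerializer;
    import org.junit.Before;

    public class PipelineIntegrationTest {
        private MockProducer<String, String> producer;
        private MockConsumer<String, String> consumer;

        @Before
        public void setUp() {
            // Auto-complete each send() so the test doesn't have to complete futures manually
            producer = new MockProducer<>(true, new StringSerializer(), new StringSerializer());

            // MockConsumer only accepts records for partitions it has been assigned
            consumer = new MockConsumer<>(OffsetResetStrategy.EARLIEST);
            TopicPartition partition = new TopicPartition("first-topic", 0);
            consumer.assign(Collections.singletonList(partition));
            consumer.updateBeginningOffsets(Collections.singletonMap(partition, 0L));
        }
    }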

The very first class in a microservice or data pipeline will be a producer. This producer will create output based on some kind of input. That could be a log file, a proprietary input, user interaction, etc. The first part of the integration test will look something like:

    firstPhase.setInput("my input");

    // Get the output from the Producer
    producer.history();

    // Run your assertions on the output of the Producer
    Assert.assertEquals(...);
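
As a slightly more concrete sketch, assuming the first phase sends exactly one record to the hypothetical first-topic from the setup above and the expected value is known, those assertions might look like:

    // history() returns every record the MockProducer has sent so far
    List<ProducerRecord<String, String>> output = producer.history();
    Assert.assertEquals(1, output.size());
    Assert.assertEquals("first-topic", output.get(0).topic());
    Assert.assertEquals("expected first-phase output", output.get(0).value());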

Next, you will need to run the second phase of your microservice or data pipeline. This phase will have both a consumer and a producer. The second phase will consume the output produced by the first phase, do whatever processing it needs to, and then produce its own output. The second part will look something like:

    // Put the first phase's output into a mocked consumer
    consumer.addRecord(new ConsumerRecord<String, String>(...));

    // Get the output from the second phase's producer
    producer.history();

    // Run your assertions on the output of the second phase's producer
    Assert.assertEquals(...);
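
Putting that together, a sketch of the second phase might look like the following, where secondPhase is a hypothetical class that reads from the mocked consumer and writes to its own MockProducer, and the topics, partition, offset, key, and values are all placeholders:

    // Hand the first phase's output to the mocked consumer
    consumer.addRecord(new ConsumerRecord<>("first-topic", 0, 0L, "key", "first-phase output"));

    // Let the second phase consume, process, and produce
    secondPhase.process();

    // Check what the second phase sent downstream
    List<ProducerRecord<String, String>> sent = producer.history();
    Assert.assertEquals(1, sent.size());
    Assert.assertEquals("second-topic", sent.get(0).topic());
    Assert.assertEquals("expected second-phase output", sent.get(0).value());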

You’ll keep on following this pattern until you reach the final phase of the microservice or data pipeline. This phase will have a consumer, but may or may not have a producer. The final phase could return an object, send an AJAX response, make an RDBMS call, etc. You’ll need to get ahold of that result and run your assertions on it. The final part will look something like:

    // Put the next-to-last phase's output into a mocked consumer
    consumer.addRecord(new ConsumerRecord<String, String>(...));

    // Get the output from the last phase
    MyObject lastObject = lastPhase.getLastObject();

    // Run your assertions
    Assert.assertEquals(...);

Refactoring

If you’ve already written your code, you may need to refactor it. It may not have been written with testing in mind, or in the way that you’ll need to test with Kafka. See the posts about unit testing producers and consumers for my recommendations on refactoring.
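
For example, one common refactoring is to pass the Producer in through the constructor instead of creating a KafkaProducer inside the class, so the test can inject a MockProducer. Here is a sketch, where FirstPhase and the topic name are hypothetical:

    import org.apache.kafka.clients.producer.Producer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class FirstPhase {
        private final Producer<String, String> producer;

        // Tests pass in a MockProducer; production code passes in a KafkaProducer
        public FirstPhase(Producer<String, String> producer) {
            this.producer = producer;
        }

        public void setInput(String input) {
            // Real processing would happen here before sending the result downstream
            producer.send(new ProducerRecord<>("first-topic", input));
        }
    }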

Using Integration Tests to Debug

When an exception happens in your Kafka code, you should log several things (a sketch of this logging follows the list):

  • The topic name
  • The partition
  • The offset
  • The entire key (base 64 encoded if binary)
  • The entire value (base 64 encoded if binary)
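
A minimal sketch of that logging, assuming byte[] keys and values, an SLF4J logger named log, java.util.Base64, and a hypothetical helper method:

    // Hypothetical helper: logs everything needed to replay this record in a test
    void logFailedRecord(ConsumerRecord<byte[], byte[]> record, Exception e) {
        log.error("Failed to process record: topic={} partition={} offset={} key={} value={}",
                record.topic(),
                record.partition(),
                record.offset(),
                Base64.getEncoder().encodeToString(record.key()),
                Base64.getEncoder().encodeToString(record.value()),
                e);
    }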

Using this information, you can either copy/paste the key/value into the integration test or use the topic/partition/offset as the starting point for a consumer. From there, you will (see the sketch after these steps):

  1. Copy and paste your existing integration test
  2. Change the method name and plug in the key and value from the production log
  3. Run the test and verify it fails in the same way as production
  4. Step through the code in your IDE to see why the exception happened
  5. Fix the problem
  6. Rerun the test to verify that the issue was fixed
  7. Check in the code along with the new integration test, and possibly unit test, to make sure this bug doesn’t happen again
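
A sketch of what steps 1 and 2 might produce, reusing the hypothetical setup and secondPhase from above; the test name, topic, partition, offset, and base64 strings are placeholders for the values copied out of the production log (assumes java.util.Base64, java.nio.charset.StandardCharsets, and JUnit's @Test are imported):

    @Test
    public void reproducesProductionFailure() {
        // Paste the base64-encoded key/value straight from the production log
        String key = new String(Base64.getDecoder().decode("bXkta2V5"), StandardCharsets.UTF_8);
        String value = new String(Base64.getDecoder().decode("bXktdmFsdWU="), StandardCharsets.UTF_8);

        // Replay the exact production record through the mocked consumer
        consumer.addRecord(new ConsumerRecord<>("first-topic", 0, 1048L, key, value));

        // Run the phase that threw in production; step through here to find the cause
        secondPhase.process();
        Assert.assertEquals(1, producer.history().size());
    }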
