
v3.15.0

@github-actions released this 20 Jun 16:27

Summary

We're excited to announce the Kafka Consumer utility, which transparently handles message deserialization, provides an intuitive developer experience, and integrates seamlessly with the rest of the Powertools for AWS Lambda ecosystem.

Key features

  • Automatic deserialization of Kafka messages (JSON, Avro, and Protocol Buffers)
  • Simplified event record handling with an intuitive interface
  • Support for key and value deserialization
  • Support for Pydantic model and dataclass output
  • Support for Event Source Mapping (ESM) with and without Schema Registry integration
  • Out-of-the-box error handling for deserialization issues

Getting Started

To get started, install the library along with the optional dependencies for the schema types you want to use:

For JSON schemas:

pip install aws-lambda-powertools

For Avro schemas:

pip install 'aws-lambda-powertools[kafka-consumer-avro]'

For Protobuf schemas:

pip install 'aws-lambda-powertools[kafka-consumer-protobuf]'

Additionally, if you want to use output serialization with Pydantic models, make sure Pydantic is installed; dataclasses are part of the Python standard library and need no extra dependency.

Processing Kafka events

Docs

You can use the Kafka Consumer utility to transform raw Kafka events into an intuitive format for processing.

The @kafka_consumer decorator can deserialize both keys and values independently based on your schema configuration. This flexibility allows you to work with different data formats in the same message.

Working with Avro
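A minimal sketch of Avro deserialization; the `User` record schema and its fields are illustrative:

```python
from aws_lambda_powertools import Logger
from aws_lambda_powertools.utilities.kafka import ConsumerRecords, SchemaConfig, kafka_consumer
from aws_lambda_powertools.utilities.typing import LambdaContext

logger = Logger()

# Illustrative Avro schema; replace with your own record definition
avro_schema = """
{
    "type": "record",
    "name": "User",
    "namespace": "com.example",
    "fields": [
        {"name": "name", "type": "string"},
        {"name": "age", "type": "int"}
    ]
}
"""

schema_config = SchemaConfig(value_schema_type="AVRO", value_schema=avro_schema)

@kafka_consumer(schema_config=schema_config)
def lambda_handler(event: ConsumerRecords, context: LambdaContext):
    for record in event.records:
        user = record.value  # already deserialized from the Avro payload
        logger.info("Processing user %s", user["name"])
```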

Working with Protobuf
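A similar sketch for Protobuf, assuming `user_pb2.User` is a message class you generated from your `.proto` file with `protoc`:

```python
from aws_lambda_powertools import Logger
from aws_lambda_powertools.utilities.kafka import ConsumerRecords, SchemaConfig, kafka_consumer
from aws_lambda_powertools.utilities.typing import LambdaContext

# Hypothetical module generated from your .proto file
from user_pb2 import User

logger = Logger()

# For Protobuf, pass the generated message class instead of a schema string
schema_config = SchemaConfig(value_schema_type="PROTOBUF", value_schema=User)

@kafka_consumer(schema_config=schema_config)
def lambda_handler(event: ConsumerRecords, context: LambdaContext):
    for record in event.records:
        user = record.value  # decoded from the Protobuf payload
        logger.info("Processing user %s", user["name"])
```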

Working with JSON
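For JSON, no schema definition is required. This sketch also shows deserializing the record key independently of the value:

```python
from aws_lambda_powertools import Logger
from aws_lambda_powertools.utilities.kafka import ConsumerRecords, SchemaConfig, kafka_consumer
from aws_lambda_powertools.utilities.typing import LambdaContext

logger = Logger()

# JSON needs no schema; keys and values can be configured independently
schema_config = SchemaConfig(value_schema_type="JSON", key_schema_type="JSON")

@kafka_consumer(schema_config=schema_config)
def lambda_handler(event: ConsumerRecords, context: LambdaContext):
    for record in event.records:
        logger.info("Key: %s", record.key)      # deserialized key
        logger.info("Value: %s", record.value)  # deserialized value
```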

Custom output serializers

Docs

You can transform deserialized data into your preferred object types using output serializers. This can help you integrate Kafka data with your domain models and application architecture, providing type hints, validation, and structured data access.

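As an illustration, you can pass a dataclass (or Pydantic model) as the value output serializer to get typed objects instead of plain dicts; the `User` model here is an assumption for the example:

```python
from dataclasses import dataclass

from aws_lambda_powertools import Logger
from aws_lambda_powertools.utilities.kafka import ConsumerRecords, SchemaConfig, kafka_consumer
from aws_lambda_powertools.utilities.typing import LambdaContext

logger = Logger()

@dataclass
class User:  # illustrative domain model
    name: str
    age: int

schema_config = SchemaConfig(
    value_schema_type="JSON",
    value_output_serializer=User,  # deserialized values become User instances
)

@kafka_consumer(schema_config=schema_config)
def lambda_handler(event: ConsumerRecords, context: LambdaContext):
    for record in event.records:
        user: User = record.value
        logger.info("Processing user %s (age %d)", user.name, user.age)
```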

Error handling

Docs

You can handle errors when processing Kafka messages to ensure your application maintains resilience and provides clear diagnostic information.

We lazily decode fields like value, key, and headers only when accessed. This allows you to handle deserialization errors at the point of access rather than when the record is first processed.

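Because decoding is lazy, you can catch deserialization failures per record at the point of access. A sketch; the exception import follows the utility's exceptions module, though exact names may vary by version:

```python
from aws_lambda_powertools import Logger
from aws_lambda_powertools.utilities.kafka import ConsumerRecords, SchemaConfig, kafka_consumer
from aws_lambda_powertools.utilities.kafka.exceptions import KafkaConsumerDeserializationError
from aws_lambda_powertools.utilities.typing import LambdaContext

logger = Logger()

@kafka_consumer(schema_config=SchemaConfig(value_schema_type="JSON"))
def lambda_handler(event: ConsumerRecords, context: LambdaContext):
    for record in event.records:
        try:
            value = record.value  # lazy: decoding happens on first access
        except KafkaConsumerDeserializationError:
            # Log and skip malformed records instead of failing the whole batch
            logger.exception("Skipping record at offset %s", record.offset)
            continue
        logger.info("Processed value: %s", value)
```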

Changes

🌟 New features and non-breaking changes

📜 Documentation updates

🐛 Bug and hot fixes

🔧 Maintenance

This release was made possible by the following contributors:

@dependabot[bot], @github-actions[bot], @leandrodamascena and @matteofigus