Migrate Kafka Payload from JSON to Avro
Background
Our system currently publishes Kafka messages using raw JSON serialization of domain DTOs (e.g., Report). This causes several issues:
No version-safe schema definition
Consumers are tightly coupled to backend DTOs
Schema evolution risks breaking downstream systems
Larger message size and slower serialization
Goal
Introduce schema-based message publishing for events produced when a new report is created.
Requirements
Define a schema for a “report created” event
Use Avro for serialization
Stop sending raw JSON representations of DTOs
Ensure the schema contains the correct data (see the example expected payload fields below)
Add a Schema Registry
Register schemas automatically via the producer (see the sketch below)
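A minimal producer-side sketch, assuming Confluent's KafkaAvroSerializer with a Schema Registry reachable at http://localhost:8081; the topic name report-created and the bootstrap address are placeholders, not decided yet:

```java
import java.util.Properties;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class ReportCreatedProducer {
    public static void send(GenericRecord report) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");          // placeholder
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        // The Confluent Avro serializer registers the value schema with the
        // Schema Registry on first send (auto.register.schemas defaults to true),
        // so no separate registration step is needed.
        props.put("value.serializer",
                "io.confluent.kafka.serializers.KafkaAvroSerializer");
        props.put("schema.registry.url", "http://localhost:8081"); // placeholder

        try (KafkaProducer<String, GenericRecord> producer = new KafkaProducer<>(props)) {
            // Topic name is an assumption for this sketch
            producer.send(new ProducerRecord<>("report-created", report));
        }
    }
}
```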
Consumer Compatibility
Ensure the existing consumer(s) can deserialize the schema-based payloads
The consumer must be able to read events without requiring database lookups (see the sketch below)
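A matching consumer sketch under the same assumptions (Confluent KafkaAvroDeserializer, same registry URL). Every field the consumer needs is read directly from the deserialized record, so no database lookup is involved:

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class ReportCreatedConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");          // placeholder
        props.put("group.id", "report-consumers");                 // placeholder
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        // Looks up the writer schema in the registry via the id embedded in the payload
        props.put("value.deserializer",
                "io.confluent.kafka.serializers.KafkaAvroDeserializer");
        props.put("schema.registry.url", "http://localhost:8081"); // placeholder

        try (KafkaConsumer<String, GenericRecord> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("report-created"));
            while (true) {
                ConsumerRecords<String, GenericRecord> records =
                        consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, GenericRecord> record : records) {
                    GenericRecord report = record.value();
                    // Everything needed is in the event itself; no DB lookup required
                    System.out.printf("report by user %s: %s%n",
                            report.get("submittedByUserId"), report.get("description"));
                }
            }
        }
    }
}
```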
Example expected payload fields:

{
  "type": "record",
  "name": "Report",
  "namespace": "org.fungover.zipp.dto",
  "fields": [
    { "name": "submittedByUserId", "type": "long" },
    { "name": "description", "type": "string" },
    { "name": "eventType",
      "type": { "type": "enum", "name": "ReportType",
                "symbols": ["ACCIDENT", "DEBRIS", "OTHER"] }
    },
    { "name": "latitude", "type": "double" },
    { "name": "longitude", "type": "double" },
    { "name": "submittedAt",
      "type": ["null", { "type": "long", "logicalType": "timestamp-millis" }],
      "default": null
    },
    { "name": "status",
      "type": ["null", { "type": "enum", "name": "ReportStatus",
                         "symbols": ["ACTIVE", "RESOLVED", "EXPIRED"] }],
      "default": null
    },
    { "name": "imageUrls",
      "type": ["null", { "type": "array", "items": "string" }],
      "default": null
    }
  ]
}
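For illustration only, a record conforming to the schema above could be assembled with Avro's GenericRecord API; the field values and the report-created.avsc file name are made up:

```java
import java.io.File;
import java.io.IOException;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;

public class ReportCreatedExample {
    public static GenericRecord sample() throws IOException {
        // "report-created.avsc" is an assumed file holding the schema above
        Schema schema = new Schema.Parser().parse(new File("report-created.avsc"));
        GenericRecord report = new GenericData.Record(schema);
        report.put("submittedByUserId", 42L);
        report.put("description", "Debris blocking the right lane");
        report.put("eventType",
                new GenericData.EnumSymbol(schema.getField("eventType").schema(), "DEBRIS"));
        report.put("latitude", 59.3293);
        report.put("longitude", 18.0686);
        report.put("submittedAt", System.currentTimeMillis());
        // status and imageUrls are nullable unions with default null, so
        // leaving them unset serializes them as null
        return report;
    }
}
```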