In Kafka, can I create a single topic and have multiple producers write to it?
I have the following use case: log files come from a single data source and are pushed to a Kafka topic (say topic 1). A consumer reads from it, converts the records to JSON format, and writes them to another topic (topic 2). A second consumer that expects JSON reads from topic 2, makes further modifications, and writes to a third topic (topic 3).
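For reference, the middle stage of that pipeline is a consume-transform-produce loop along these lines (a minimal sketch only, assuming a localhost broker, String serializers, and a made-up toJson() helper; none of this comes from the original post):

```java
import org.apache.kafka.clients.consumer.*;
import org.apache.kafka.clients.producer.*;
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

// Stage 2: read raw log lines from "topic1", convert each to JSON, write to "topic2".
public class JsonConverterStage {
    public static void main(String[] args) {
        Properties cProps = new Properties();
        cProps.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        cProps.put(ConsumerConfig.GROUP_ID_CONFIG, "json-converter");
        cProps.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.StringDeserializer");
        cProps.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.StringDeserializer");

        Properties pProps = new Properties();
        pProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        pProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.StringSerializer");
        pProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.StringSerializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(cProps);
             KafkaProducer<String, String> producer = new KafkaProducer<>(pProps)) {
            consumer.subscribe(Collections.singletonList("topic1"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    String json = toJson(record.value());   // hypothetical conversion step
                    producer.send(new ProducerRecord<>("topic2", record.key(), json));
                }
            }
        }
    }

    // Hypothetical helper: wrap a raw log line in a trivial JSON envelope.
    private static String toJson(String rawLine) {
        return "{\"log\": \"" + rawLine.replace("\"", "\\\"") + "\"}";
    }
}
```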
My question: instead of creating 3 different topics, can I create a single topic and have these multiple producers write to that same topic? And how would a consumer know which partition to read from, since a group id cannot be set on a producer? One solution I learnt is to create partitions and make each producer write to a particular partition alone, as sketched below. The problem with that approach is that the number of producers and consumers might change, and modifying the topic is not desired. Please advise.
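The partition-pinning idea mentioned above would look roughly like this (a sketch with made-up topic and partition numbers; it works, but every new producer or consumer means re-partitioning the topic, which is exactly the objection raised):

```java
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.TopicPartition;
import java.util.Collections;
import java.util.Properties;

public class PartitionPinningSketch {
    public static void main(String[] args) {
        Properties pProps = new Properties();
        pProps.put("bootstrap.servers", "localhost:9092");
        pProps.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        pProps.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        // Each producer hard-codes "its" partition number on the shared topic.
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(pProps)) {
            producer.send(new ProducerRecord<>("shared_topic", 0, null, "raw log line"));
        }

        Properties cProps = new Properties();
        cProps.put("bootstrap.servers", "localhost:9092");
        cProps.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        cProps.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        // The matching consumer bypasses group management and assigns itself that partition.
        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(cProps)) {
            consumer.assign(Collections.singletonList(new TopicPartition("shared_topic", 0)));
        }
    }
}
```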
As one commenter said, you should not push different types of schemas into a single topic. The number of topics in Kafka is not an issue; you can use a nomenclature to manage them, e.g. "topic1", "topic1_json", "topic1_modification".
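Creating the topics with that nomenclature can be done with the AdminClient, for example like this (a sketch; the partition count and replication factor here are placeholders, not recommendations):

```java
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;
import java.util.Arrays;
import java.util.Properties;

public class CreatePipelineTopics {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        try (AdminClient admin = AdminClient.create(props)) {
            // One topic per pipeline stage, named so the flow is obvious at a glance.
            admin.createTopics(Arrays.asList(
                    new NewTopic("topic1", 3, (short) 1),
                    new NewTopic("topic1_json", 3, (short) 1),
                    new NewTopic("topic1_modification", 3, (short) 1)
            )).all().get();   // block until the broker confirms creation
        }
    }
}
```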
If your use case would end up with an unmanageable list of topics, the same consumer can read all the JSON topics, and you don't want batching of same-schema events at the destination file system, you can follow the approach below.
Create an object with a generic schema, or set up a schema registry (check out the Confluent Schema Registry). The schema holds each payload as a sub-record, or each record carries its schema information. Create a single topic for the JSON responses (for example: topic_json_generic). After reading data from "topic1", push it to "topic_json_generic", and do the same for the further topics. At the consumer level you can then handle whatever needs to be done based on the type of the object.
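A sketch of the generic-envelope variant without a schema registry: each record on the single topic carries its source and a schema name alongside the payload, and the consumer dispatches on that. The class, field, and schema names here are illustrative only.

```java
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.node.ObjectNode;

// Hypothetical envelope for records on "topic_json_generic": every message carries
// enough information for the consumer to decide how to handle the payload.
public class GenericEnvelope {
    private static final ObjectMapper MAPPER = new ObjectMapper();

    // Producer side: wrap a JSON payload together with its origin and schema name.
    public static String wrap(String sourceTopic, String schemaName, String jsonPayload) throws Exception {
        ObjectNode envelope = MAPPER.createObjectNode();
        envelope.put("source", sourceTopic);
        envelope.put("schema", schemaName);
        envelope.set("payload", MAPPER.readTree(jsonPayload));
        return MAPPER.writeValueAsString(envelope);
    }

    // Consumer side: dispatch on the schema carried inside the record itself.
    public static void handle(String envelopeJson) throws Exception {
        JsonNode envelope = MAPPER.readTree(envelopeJson);
        switch (envelope.get("schema").asText()) {
            case "raw_log":
                // handle a payload that was converted from "topic1"
                break;
            default:
                // unknown schema: log and skip, or route to a dead-letter topic
                break;
        }
    }
}
```

With the Confluent Schema Registry instead, the schema id travels with each message and the registry enforces compatibility, so you would not need a hand-rolled envelope like this.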