The Kafka input reads several Kafka topics simultaneously and can monitor the broker for new topics that match the configured patterns.
Example: subscription to the 'logline__.*' pattern. New topics matching the pattern will be detected and read automatically.
```json
"inputs" : {
	"kafka" : {
		"type" : "kafka",
		"config" : {
			"url" : ["kafka://server1:9092","kafka://server2:9092"],
			"topics" : ["/logline__.*/"],
			"format" : "json",
			"offset" : "earliest",
			"group" : "nsyslog",
			"watch" : true,
			"debug" : false,
			"options" : {
				"sessionTimeout" : 15000,
				"requestTimeout" : 10000
			}
		}
	}
}
```

- url : String or array of strings. List of Kafka hosts to connect to (kafka://host:port, or kafkas://host:port for a TLS connection).
- topics : String or array. List of Kafka topics to subscribe to. If a topic is enclosed between '/' characters, it is interpreted as a regular expression to match topic names against.
- format : Either raw or json. With raw, the raw content of the message is placed in the 'originalMessage' field of the entry; with json, the content is parsed as a JSON object and placed in 'originalMessage'.
- offset : Can be one of earliest or latest. Initial offset when starting to read a new topic.
- group : Consumer group ID (to keep track of the topics' offsets).
- watch : If true, the Kafka input will search periodically for new topics that match the patterns and start reading from them.
- tls : TLS/SSL options, as described in the Node.js TLS documentation.
- debug : Boolean. If true, enables debug logging for Kafka input.
- options : Object. Additional Kafka configuration options, such as sessionTimeout and requestTimeout.
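To illustrate the '/pattern/' convention used by the topics option, here is a minimal sketch: entries wrapped in '/' are treated as regular expressions, anything else as a literal topic name. The helper name matchesTopic is illustrative, not part of nsyslog's API.

```javascript
// Hypothetical helper: decide whether a topic name satisfies a topic spec.
// A spec enclosed in '/' is a regular expression; otherwise it is a literal.
function matchesTopic(spec, topic) {
	if (spec.startsWith('/') && spec.endsWith('/')) {
		return new RegExp(spec.slice(1, -1)).test(topic);
	}
	return spec === topic;
}

console.log(matchesTopic('/logline__.*/', 'logline__app1')); // true
console.log(matchesTopic('logline__app1', 'logline__app2')); // false
```

With watch enabled, the input would periodically apply this kind of matching to the broker's topic list and start reading any new matches.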
Each read will generate an object with the following schema:
```javascript
{
	id : '<input ID>',
	type : 'kafka',
	topic : '<topic.name>',
	originalMessage : '<String value or JSON object>'
}
```
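A sketch of how each Kafka message could be mapped to an entry with the schema above, depending on the configured format. buildEntry is a hypothetical name for illustration; the actual mapping happens inside the input.

```javascript
// Hypothetical mapping from a raw Kafka message to an nsyslog entry.
// With format 'json' the payload is parsed; with 'raw' it is kept as a string.
function buildEntry(id, topic, message, format) {
	return {
		id : id,
		type : 'kafka',
		topic : topic,
		originalMessage : format === 'json' ? JSON.parse(message) : message
	};
}

const entry = buildEntry('kafka', 'logline__app1', '{"level":"info"}', 'json');
console.log(entry.originalMessage.level); // "info"
```

Note that with format json a malformed payload would make JSON.parse throw, so raw is the safer choice when topics may carry non-JSON messages.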