Configure Flume by writing a kafka.conf file. The agent collects data from port 44444 and sends it to Kafka's first topic.
# Name the components on this agent
a1.sources = r1
a1.sinks = k1
a1.channels = c1

# Describe/configure the source
a1.sources.r1.type = netcat
a1.sources.r1.bind = localhost
a1.sources.r1.port = 44444

# Describe the sink (it acts as a Kafka producer)
a1.sinks.k1.type = org.apache.flume.sink.kafka.KafkaSink
a1.sinks.k1.kafka.topic = first
a1.sinks.k1.kafka.bootstrap.servers = hadoop102:9092,hadoop103:9092,hadoop104:9092
a1.sinks.k1.kafka.flumeBatchSize = 20
a1.sinks.k1.kafka.producer.acks = 1
a1.sinks.k1.kafka.producer.linger.ms = 1

# Use a channel which buffers events in memory
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000
a1.channels.c1.transactionCapacity = 100

# Bind the source and sink to the channel
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
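To make the channel settings concrete, here is a toy Python sketch (all class and variable names are hypothetical, not Flume code): capacity bounds how many events the memory channel can hold, while transactionCapacity bounds how many events the sink drains per transaction, which is roughly the batch the Kafka sink then produces.

```python
from collections import deque

class MemoryChannel:
    """Toy model of Flume's memory channel (hypothetical, for illustration).
    'capacity' caps the total events buffered; 'transaction_capacity' caps
    how many events a sink may take in one batch (one transaction)."""
    def __init__(self, capacity=1000, transaction_capacity=100):
        self.capacity = capacity
        self.transaction_capacity = transaction_capacity
        self.events = deque()

    def put(self, event):
        # A full channel makes the real source back off; here we just raise.
        if len(self.events) >= self.capacity:
            raise RuntimeError("channel full")
        self.events.append(event)

    def take_batch(self):
        # Drain at most transaction_capacity events, FIFO order.
        batch = []
        while self.events and len(batch) < self.transaction_capacity:
            batch.append(self.events.popleft())
        return batch

# Source side: 250 events arriving from the netcat source
ch = MemoryChannel(capacity=1000, transaction_capacity=100)
for i in range(250):
    ch.put(f"event-{i}")

# Sink side: each take is one transaction of up to transactionCapacity events
batches = []
while True:
    b = ch.take_batch()
    if not b:
        break
    batches.append(b)

print([len(b) for b in batches])  # → [100, 100, 50]
```

The 250 events drain as two full transactions of 100 and a final one of 50, which is why transactionCapacity must never exceed capacity.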
Start the Flume agent to begin collecting data:
bin/flume-ng agent -c conf/ -n a1 -f job/kafka.conf
Simulate producing data with netcat:
[atguigu@hadoop102 ~]$ nc localhost 44444
helloworld
OK
123
OK
Consume the data with the console consumer. This completes the pipeline: Flume collects the data and forwards it to Kafka.
[atguigu@hadoop102 kafka]$ bin/kafka-console-consumer.sh --bootstrap-server hadoop102:9092 --topic first
helloworld
123
Next: use a Flume interceptor to send data to different Kafka topics. To be continued...
Idea: if Kafka serves as the channel, an interceptor combined with a ChannelSelector can route events to different Kafka Channels.
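The idea above can be sketched as a Flume config fragment. This is an assumption-laden sketch, not a tested configuration: the interceptor class com.example.TypeInterceptor and the header values number/letter are hypothetical; the selector and Kafka Channel property names are standard Flume ones. A custom interceptor stamps a header on each event, and a multiplexing selector routes it to one of two Kafka Channels, each writing to its own topic, so no sink is required.

```properties
a1.sources = r1
a1.channels = c1 c2

a1.sources.r1.type = netcat
a1.sources.r1.bind = localhost
a1.sources.r1.port = 44444

# Hypothetical custom interceptor that sets a "type" header on each event
a1.sources.r1.interceptors = i1
a1.sources.r1.interceptors.i1.type = com.example.TypeInterceptor$Builder

# Multiplexing selector: route on the "type" header
a1.sources.r1.selector.type = multiplexing
a1.sources.r1.selector.header = type
a1.sources.r1.selector.mapping.number = c1
a1.sources.r1.selector.mapping.letter = c2
a1.sources.r1.selector.default = c2

# Kafka Channels double as producers; each writes to its own topic
a1.channels.c1.type = org.apache.flume.channel.kafka.KafkaChannel
a1.channels.c1.kafka.bootstrap.servers = hadoop102:9092,hadoop103:9092,hadoop104:9092
a1.channels.c1.kafka.topic = first
a1.channels.c1.parseAsFlumeEvent = false

a1.channels.c2.type = org.apache.flume.channel.kafka.KafkaChannel
a1.channels.c2.kafka.bootstrap.servers = hadoop102:9092,hadoop103:9092,hadoop104:9092
a1.channels.c2.kafka.topic = second
a1.channels.c2.parseAsFlumeEvent = false

a1.sources.r1.channels = c1 c2
```

Setting parseAsFlumeEvent = false makes the channel write the raw event body to Kafka, so a plain console consumer can read it directly.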