  • Kafka Streams usage example

    1. Import the Maven dependencies

    <dependencies>
            <dependency>
                <groupId>org.apache.kafka</groupId>
                <artifactId>kafka-streams</artifactId>
                <version>2.3.0</version>
            </dependency>
            <dependency>
                <groupId>org.apache.kafka</groupId>
                <artifactId>kafka-clients</artifactId>
                <version>2.3.0</version>
            </dependency>
    </dependencies>

    2. Code implementation

    package com.atguigu.kafkastream;
    
    import org.apache.kafka.common.serialization.Serde;
    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.Consumed;
    import org.apache.kafka.streams.kstream.KStream;
    import org.apache.kafka.streams.kstream.Produced;
    
    import java.util.Properties;
    
    public class Application {
        public static void main(String[] args) {
            String input = "abc";   // input topic
            String output = "recommender";  // output topic
    
            Properties properties = new Properties();
            properties.put(StreamsConfig.APPLICATION_ID_CONFIG,"logProcessor");
            properties.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG,"hadoop1:9092");
            // The Serdes class provides ready-made Serde (serializer/deserializer) instances for String, byte[], Long, Integer, and Double.
            Serde<String> stringSerde = Serdes.String();
    
            StreamsBuilder builder = new StreamsBuilder();
            KStream<String, String> simpleFirstStream = builder.stream(input, Consumed.with(stringSerde, stringSerde));
            // Use KStream.mapValues to split each value on "abc:" and keep the part after the prefix (index 1)
            KStream<String, String> splitStream = simpleFirstStream.mapValues(line -> line.split("abc:")[1]);
            // Write the transformed records to the output topic
            splitStream.to(output, Produced.with(stringSerde, stringSerde));
    
            // Build and start the Kafka Streams application
            KafkaStreams kafkaStreams = new KafkaStreams(builder.build(), properties);
            kafkaStreams.start();
        }
    }
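
    The mapValues call above assumes every record starts with the abc: prefix; a record without it makes line.split("abc:")[1] throw an ArrayIndexOutOfBoundsException and kills the stream thread. A minimal defensive variant (a sketch, not part of the original example; guardedStream is a hypothetical name) drops such records first:

            // Hypothetical guarded variant: skip records without the expected prefix instead of crashing
            KStream<String, String> guardedStream = simpleFirstStream
                    .filter((key, value) -> value != null && value.startsWith("abc:"))
                    .mapValues(value -> value.substring("abc:".length()));
            guardedStream.to(output, Produced.with(stringSerde, stringSerde));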

    3. Testing

      1) Start the program from step 2

      2) Start Kafka
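
        Assuming a local Kafka installation with the default config files (paths here are illustrative), ZooKeeper and the broker can be started with:

        bin/zookeeper-server-start.sh -daemon config/zookeeper.properties
        bin/kafka-server-start.sh -daemon config/server.properties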

      3) Create the input topic abc

        bin/kafka-topics.sh --create --zookeeper hadoop1:2181 --replication-factor 1 --partitions 1 --topic abc
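
        To type test messages into abc (used in step 5), attach a console producer to the topic; note that the Kafka 2.3 console producer still uses the --broker-list flag:

        bin/kafka-console-producer.sh --broker-list hadoop1:9092 --topic abc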

      4) Start a console consumer on the output topic recommender

        bin/kafka-console-consumer.sh --bootstrap-server hadoop1:9092 --topic recommender

      5) In the abc producer, type a string such as abc:22|33|44|55

      6) The recommender consumer then receives the stripped string 22|33|44|55
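
      The topology can also be checked without a running broker by using Kafka's test driver. This is a sketch, not part of the original post: it assumes the kafka-streams-test-utils artifact (version 2.3.0) is added to the project, and the class name is hypothetical.

    package com.atguigu.kafkastream;

    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.common.serialization.StringDeserializer;
    import org.apache.kafka.common.serialization.StringSerializer;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.TopologyTestDriver;
    import org.apache.kafka.streams.kstream.Consumed;
    import org.apache.kafka.streams.kstream.Produced;
    import org.apache.kafka.streams.test.ConsumerRecordFactory;

    import java.util.Properties;

    public class TopologySmokeTest {
        public static void main(String[] args) {
            // Rebuild the same topology as in Application
            StreamsBuilder builder = new StreamsBuilder();
            builder.stream("abc", Consumed.with(Serdes.String(), Serdes.String()))
                   .mapValues(line -> line.split("abc:")[1])
                   .to("recommender", Produced.with(Serdes.String(), Serdes.String()));

            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "logProcessorTest");
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "dummy:9092");

            TopologyTestDriver driver = new TopologyTestDriver(builder.build(), props);
            ConsumerRecordFactory<String, String> factory =
                    new ConsumerRecordFactory<>(new StringSerializer(), new StringSerializer());

            // Pipe one record into the input topic and read the transformed record from the output topic
            driver.pipeInput(factory.create("abc", null, "abc:22|33|44|55"));
            ProducerRecord<String, String> out =
                    driver.readOutput("recommender", new StringDeserializer(), new StringDeserializer());
            System.out.println(out.value());   // expected: 22|33|44|55

            driver.close();
        }
    }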
