  • Kafka: Enabling Kerberos Authentication

    CDK 2.0 and higher Powered By Apache Kafka supports Kerberos authentication, but only for the new Kafka Producer and Consumer APIs. If you already have a Kerberos server, you can add Kafka to your current configuration. If you do not have a Kerberos server, install one before proceeding. See Enabling Kerberos Authentication Using the Wizard.

    If you have already configured the mapping from Kerberos principals to short names using the hadoop.security.auth_to_local HDFS configuration property, configure the same rules for Kafka by adding the sasl.kerberos.principal.to.local.rules property to the Kafka Broker Advanced Configuration Snippet using Cloudera Manager. Specify the rules as a comma-separated list.
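    As an illustration only (the EXAMPLE.COM realm and the rules below are placeholders, not taken from any particular cluster), a comma-separated rule list mirroring a typical auth_to_local mapping might look like:

    ```properties
    # Hypothetical example: strip the @EXAMPLE.COM realm from one- and
    # two-component principals, and fall back to the default mapping otherwise.
    sasl.kerberos.principal.to.local.rules=RULE:[1:$1@$0](.*@EXAMPLE\.COM)s/@.*//,RULE:[2:$1@$0](.*@EXAMPLE\.COM)s/@.*//,DEFAULT
    ```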

    To enable Kerberos authentication for Kafka:

    1. From Cloudera Manager, navigate to Kafka > Configuration. Set SSL client authentication to none. Set Inter Broker Protocol to SASL_PLAINTEXT.
    2. Click Save Changes.
    3. Restart the Kafka service.
    4. Make sure that listeners = SASL_PLAINTEXT is present in the Kafka broker log, /var/log/kafka/server.log.
    5. Create a jaas.conf file with the following contents to use with cached Kerberos credentials. (You can modify this to use keytab files instead of cached credentials; to generate keytabs, see Step 6: Get or Create a Kerberos Principal for Each User Account.)

      If you use kinit first, use this configuration:

      KafkaClient {
        com.sun.security.auth.module.Krb5LoginModule required
        useTicketCache=true;
      };

      If you use a keytab, use this configuration:

      KafkaClient {
        com.sun.security.auth.module.Krb5LoginModule required
        useKeyTab=true
        keyTab="/etc/security/keytabs/kafka_server.keytab"
        principal="kafka/kafka1.hostname.com@EXAMPLE.COM";
      };
    6. Create the client.properties file containing the following properties.
      security.protocol=SASL_PLAINTEXT
      sasl.kerberos.service.name=kafka
    7. Test with the Kafka console producer and consumer. To obtain a Kerberos ticket-granting ticket (TGT):
      $ kinit <user>
    8. Verify that your topic exists. (This does not use security features, but it is a best practice.)
      $ kafka-topics --list --zookeeper <zkhost>:2181
    9. Verify that the jaas.conf file is used by setting the environment.
      $ export KAFKA_OPTS="-Djava.security.auth.login.config=/home/user/jaas.conf"
    10. Run a Kafka console producer:
      $ kafka-console-producer --broker-list <anybroker>:9092 --topic test1 \
        --producer.config client.properties
    11. Run a Kafka console consumer:
      $ kafka-console-consumer --new-consumer --topic test1 --from-beginning \
        --bootstrap-server <anybroker>:9092 --consumer.config client.properties
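    Steps 5, 6, and 9 above can be consolidated into a small shell sketch. The file locations under /tmp are illustrative; the JAAS and properties contents come directly from the steps above:

    ```shell
    # Write the JAAS file for cached (kinit) credentials, as in step 5.
    cat > /tmp/jaas.conf <<'EOF'
    KafkaClient {
      com.sun.security.auth.module.Krb5LoginModule required
      useTicketCache=true;
    };
    EOF

    # Write the client properties from step 6.
    cat > /tmp/client.properties <<'EOF'
    security.protocol=SASL_PLAINTEXT
    sasl.kerberos.service.name=kafka
    EOF

    # Point the Kafka command-line tools at the JAAS file (step 9).
    export KAFKA_OPTS="-Djava.security.auth.login.config=/tmp/jaas.conf"
    echo "$KAFKA_OPTS"
    ```

    With KAFKA_OPTS exported this way, the kafka-console-producer and kafka-console-consumer commands in steps 10 and 11 pick up the JAAS configuration automatically.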
  • Original source: https://www.cnblogs.com/felixzh/p/11508383.html