  • IBM developer: Setting up the Kafka plugin for Ranger

    Follow these steps to enable and configure the Kafka plugin for Ranger.

    Before you begin

    The default policy user (ambari-qa) used by a plugin must be an existing, valid user on the system that is configured for Ranger.
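    A quick way to verify this prerequisite is to check that the account exists in the local user database on the relevant node. The helper below is a sketch (the function name is just illustrative):

    ```shell
    # check_policy_user: succeeds (exit 0) if the given account exists
    # in the system's user database, fails otherwise.
    check_policy_user() {
        getent passwd "$1" >/dev/null
    }

    # Example: check_policy_user ambari-qa && echo "policy user exists"
    ```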

    Procedure

    1. From the Ambari web interface, select the Ranger service and then open the Configs tab. Select the Ranger Plugin tab.
      [Screen capture of the Ambari Ranger Configs dialog.]
    2. In the Ranger Plugin section, enable the Kafka Ranger Plugin, and then click Save.
      Note
      1. The Kafka Ranger plugin requires Kerberos. You will see a warning if you try to enable it on a non-Kerberized cluster. For details, see the Kafka Plugin section of the Ranger FAQ.
      2. Topic creation can be authorized via Ranger, but only if the topic is being auto-created by consumers or producers. The recommended policy setup to authorize topic auto-creation for producers or consumers is as follows:
        1. Create a policy whose resource is all topics, that is, *.
        2. For producers, create a policy item under this policy that grants both Produce and Configure permissions to the relevant users or user groups.
        3. For consumers, create a policy item under this policy that grants both Consume and Configure permissions to the relevant users or user groups.
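      The note above describes the policy in UI terms. As a sketch, the same policy expressed in Ranger's REST policy JSON might look like the following (the service name cl1_kafka and the user kafka are placeholder assumptions; the UI labels Produce and Consume correspond to the publish and consume access types):

      ```json
      {
        "service": "cl1_kafka",
        "name": "topic-autocreate",
        "resources": { "topic": { "values": ["*"] } },
        "policyItems": [
          {
            "users": ["kafka"],
            "accesses": [
              { "type": "publish",   "isAllowed": true },
              { "type": "configure", "isAllowed": true }
            ]
          },
          {
            "users": ["kafka"],
            "accesses": [
              { "type": "consume",   "isAllowed": true },
              { "type": "configure", "isAllowed": true }
            ]
          }
        ]
      }
      ```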

    Example

    The following is an example of how to use the Kafka Ranger plugin for authorization:
    1. Ensure that the default policy, which is created when the plugin is enabled, is itself enabled and synced.
    2. Ensure that your Kerberos ticket has not expired; run the kinit command as the kafka user to obtain a fresh one.
    3. Run the following command to create a topic in Kafka. Run the command as the kafka user and from the /usr/iop/current/kafka-broker/ directory:
      bin/kafka-topics.sh --create --zookeeper hostname.fyre.ibm.com:2181 --replication-factor 1 
      --partitions 1 --topic test-topic
    4. Create files named producer.properties and consumer.properties, each containing the single line security.protocol=SASL_PLAINTEXT.
    5. Run the following command to start the producer. Run the command as the kafka user and from the /usr/iop/current/kafka-broker/ directory:
      bin/kafka-console-producer.sh --broker-list <cluster url>:6667 --topic test-topic 
      --producer.config <path>/producer.properties
    6. In another window, run the following command to start the consumer. Run the command as the root user and from the /usr/iop/current/kafka-broker/ directory:
      bin/kafka-console-consumer.sh --topic test-topic --from-beginning --bootstrap-server <cluster url>:6667 
      --consumer.config <path>/consumer.properties
    7. In the producer window, write some test messages and observe that they appear in the consumer window.
    8. Disable the policy and observe that error messages appear in both windows, indicating that the clients can no longer connect.
    9. Re-enable the policy and observe that messages can be sent and received properly again.
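    The steps above can be consolidated into a short script sketch. The hostnames and the keytab path are assumptions; substitute your cluster's own values. The Kafka commands are left commented out because they require a live, Kerberized cluster:

    ```shell
    #!/bin/sh
    # Sketch of the example walkthrough above (placeholder hostnames).
    BROKER="hostname.fyre.ibm.com:6667"
    ZK="hostname.fyre.ibm.com:2181"
    KAFKA_HOME=/usr/iop/current/kafka-broker

    # 1. Authenticate as the kafka user (keytab path is an assumption):
    # kinit -kt /etc/security/keytabs/kafka.service.keytab kafka/$(hostname -f)

    # 2. Client configs: both console clients must use SASL over plaintext.
    printf 'security.protocol=SASL_PLAINTEXT\n' > producer.properties
    printf 'security.protocol=SASL_PLAINTEXT\n' > consumer.properties

    # 3. Create the topic, then run the producer and consumer
    #    (uncomment to use against a real cluster):
    # "$KAFKA_HOME"/bin/kafka-topics.sh --create --zookeeper "$ZK" \
    #     --replication-factor 1 --partitions 1 --topic test-topic
    # "$KAFKA_HOME"/bin/kafka-console-producer.sh --broker-list "$BROKER" \
    #     --topic test-topic --producer.config ./producer.properties
    # "$KAFKA_HOME"/bin/kafka-console-consumer.sh --topic test-topic \
    #     --from-beginning --bootstrap-server "$BROKER" \
    #     --consumer.config ./consumer.properties
    ```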
  • Original source: https://www.cnblogs.com/felixzh/p/10474815.html