- Create a docker-compose.yml file:
```yaml
version: '2'
services:
  fluentd:
    build: ./fluentd
    volumes:
      - ./fluentd/conf:/fluentd/etc
    links:
      - "elasticsearch"
    ports:
      - "24224:24224"
      - "24224:24224/udp"
  elasticsearch:
    image: elasticsearch
    volumes:
      - ./es/data:/usr/share/elasticsearch/data
    expose:
      - 9200
    ports:
      - "9200:9200"
  kibana:
    image: kibana
    volumes:
      - ./kibana/plugins/:/usr/share/kibana/plugins/
    links:
      - "elasticsearch"
    ports:
      - "5601:5601"
```
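The compose file above bind-mounts several host directories into the containers, so it assumes a layout like the following next to docker-compose.yml (the fluentd files are created in the next steps):

```
.
├── docker-compose.yml
├── fluentd/
│   ├── Dockerfile
│   └── conf/
│       └── fluent.conf
├── es/
│   └── data/
└── kibana/
    └── plugins/
```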
- Create a fluentd directory, and in it create a Dockerfile:
```dockerfile
FROM fluent/fluentd:v0.12-debian
RUN ["gem", "install", "fluent-plugin-elasticsearch", "--no-rdoc", "--no-ri", "--version", "1.9.2"]
```
- In the fluentd directory, create conf/fluent.conf:
```
<source>
  @type forward
  port 24224
  bind 0.0.0.0
</source>
<match *.**>
  @type copy
  <store>
    @type elasticsearch
    host elasticsearch
    port 9200
    logstash_format true
    logstash_prefix fluentd
    logstash_dateformat %Y%m%d
    include_tag_key true
    type_name access_log
    tag_key @log_name
    flush_interval 1s
  </store>
  <store>
    @type stdout
  </store>
</match>
```
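With the files in place and the stack started, you can check the pipeline end to end by routing a throwaway container's stdout through the fluentd driver. This is a sketch assuming the stack runs on the local host; the `docker.test` tag is just an illustrative label:

```shell
# Send one log line through the fluentd log driver listening on 24224.
# The copy store in fluent.conf forwards it to both Elasticsearch and stdout.
docker run --rm \
  --log-driver=fluentd \
  --log-opt fluentd-address=localhost:24224 \
  --log-opt tag=docker.test \
  alpine echo "hello fluentd"
```

The message should then show up in `docker-compose logs fluentd` (via the stdout store) as well as in a daily `fluentd-YYYYMMDD` Elasticsearch index (via the elasticsearch store).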
- Run docker-compose up -d to start the stack. Then configure the Docker (or Swarm) daemon to use the fluentd log driver by editing daemon.json:
"log-driver":"fluentd", "log-opts":{ "fluentd-address":"192.168.0.133:24224" },
Note: with this configuration, if the fluentd service is down, containers will fail to start. To avoid that, add --log-opt fluentd-async-connect=true when starting a container.
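For example, starting a container against the daemon configured above might look like this (the nginx image is just a placeholder; log-driver and fluentd-address are already inherited from daemon.json, so only the async option needs to be passed):

```shell
# Connect to fluentd in the background so the container can start
# even while fluentd is unreachable; logs are buffered and retried.
docker run -d \
  --log-opt fluentd-async-connect=true \
  nginx
```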
- For full details, see the official documentation: https://docs.fluentd.org/v0.12/articles/docker-logging-efk-compose
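To confirm that logs are actually landing in Elasticsearch, you can query the daily indices produced by the logstash_prefix/logstash_dateformat settings above. A sketch, assuming Elasticsearch is reachable on the host's port 9200 as mapped in the compose file:

```shell
# List a few documents from any fluentd-YYYYMMDD index.
curl -s 'http://localhost:9200/fluentd-*/_search?q=*&size=3'
```

The same data can be browsed in Kibana at http://localhost:5601 by creating an index pattern for fluentd-*.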