  • logstash codec plugins

    Codec: encodes and decodes event data formats
    
    Examples: json, msgpack, edn
    
    
    The logstash processing flow:

    input -> decode -> filter -> encode -> output
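
    A minimal sketch of where the decode and encode steps appear in a pipeline
    configuration (the codec choices here are just for illustration):

    input {
      stdin {
        codec => json        # decode side: raw input is parsed into an event
      }
    }

    filter {
      # filters operate on the already-decoded event
    }

    output {
      stdout {
        codec => json_lines  # encode side: the event is serialized back out
      }
    }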
    
    
    plain is an empty (pass-through) codec; it lets the user specify the output format themselves (a sketch of a custom format follows the example run below).
    
    [elk@db01 0204]$ cat plain01.conf 
    input {
      stdin {
      }
    }

    output {
      stdout {
        codec => plain
      }
    }
    
    
    
    [elk@db01 0204]$ logstash -f plain01.conf 
    Settings: Default pipeline workers: 4
    Pipeline main started
    333333
    2017-01-17T21:16:27.548Z db01 33333344444
    2017-01-17T21:16:34.774Z db01 44444
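
    Since plain lets you define the layout yourself, here is a hedged sketch of
    that idea (the field layout below is only an example) using the codec's
    format option with sprintf-style field references:

    output {
      stdout {
        codec => plain {
          format => "%{host} %{@timestamp} %{message}"   # example layout
        }
      }
    }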
    
    
    [elk@db01 0204]$ cat plain02.conf 
    input {
      stdin {
      }
    }

    output {
      stdout {
        codec => json
      }
    }
    
    
    
    [elk@db01 0204]$ logstash -f plain02.conf 
    Settings: Default pipeline workers: 4
    Pipeline main started
    aaaa
    {"message":"aaaa","@version":"1","@timestamp":"2017-01-17T21:18:22.160Z","host":"db01"}
    
    
    json codec:

    If the event data is in JSON format, you can add codec => json to parse it.
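
    A minimal sketch of codec => json on the input side (the file path is
    hypothetical, just for illustration): each line of the file is expected to
    be a JSON object, and its keys become fields on the event.

    input {
      file {
        path  => "/var/log/app/events.json"   # hypothetical path
        codec => json                          # parse each line as JSON
      }
    }

    output {
      stdout {
        codec => rubydebug
      }
    }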
    
    
    
    
    json_lines codec:
    
    input {
      tcp {
        port  => 12388
        host  => "127.0.0.1"
        codec => json_lines
      }
    }

    output {
      stdout {}
    }
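
    A quick way to exercise this config (assuming netcat is available) is to
    send newline-delimited JSON to the TCP port; each line becomes one event:

    [elk@db01 0204]$ echo '{"bookname":"elk","price":12}' | nc 127.0.0.1 12388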
    
    
    
    rubydebug

    Uses a Ruby library to format (pretty-print) the parsed event, which is handy for debugging output.
    
    [elk@db01 0204]$ cat ruby.conf 
    input {
      stdin {
        codec => json
      }
    }

    output {
      stdout {
        codec => rubydebug
      }
    }
    
    [elk@db01 0204]$ logstash -f ruby.conf 
    Settings: Default pipeline workers: 4
    Pipeline main started
    {"bookname":"elk","price":12}  
    {
          "bookname" => "elk",
             "price" => 12,
          "@version" => "1",
        "@timestamp" => "2017-01-17T21:40:28.601Z",
              "host" => "db01"
    }
    
    
    
    
    
    multiline: multi-line events

    Sometimes a single logical event is spread across several lines of a log; all of those lines really belong to one event.

    A typical example is a Java exception stack trace.
    
    
    
    what => "previous"  merges lines that do not match the pattern into the previous event
    [elk@db01 0204]$ cat mulit.conf 
    input {
      stdin {
        codec => multiline {
          pattern => "^\["
          negate  => true
          what    => "previous"
        }
      }
    }

    output {
      stdout {}
    }
    
    
    [elk@db01 0204]$ logstash -f mulit.conf 
    Settings: Default pipeline workers: 4
    Pipeline main started
    [03-Jun-2014 13:34:13:] PHP err01:aaaaaaaaa
    111111111111111
    222222222222222
    [09-Aug-2015 44:33:22] PHP 9999
    2017-01-17T21:59:39.654Z db01 [03-Jun-2014 13:34:13:] PHP err01:aaaaaaaaa
    111111111111111
    222222222222222
    
    
    Why was the line [09-Aug-2015 44:33:22] PHP 9999 not output? Because the codec keeps buffering it until a following line matches pattern => "^\[" and starts a new event, which is what flushes the previous one.
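
    For the Java exception case mentioned above, a similar hedged sketch (the
    timestamp layout is an assumption about the log format) merges every line
    that does not begin with a date into the previous event:

    input {
      stdin {
        codec => multiline {
          # assumes log lines start with a date such as 2017-01-17 ...;
          # stack-trace lines ("at com.example...") do not, so they are
          # merged into the previous event
          pattern => "^\d{4}-\d{2}-\d{2}"
          negate  => true
          what    => "previous"
        }
      }
    }

    output {
      stdout { codec => rubydebug }
    }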
    
    
    
    
    
