Input configuration:
file: read from files
input {
    file {
        path => ["/var/log/*.log", "/var/log/message"]
        type => "system"
        start_position => "beginning"
    }
}
start_position: where Logstash starts reading file data. The default is the end of the file, so Logstash behaves much like tail -f. To import existing data, set it to "beginning" and Logstash will read the file from the start.
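Note that Logstash records its read offset per file in a sincedb, so start_position => "beginning" only takes effect the first time a file is seen. For a one-off full re-import, a common trick (a sketch; the path value is illustrative) is to point the sincedb at /dev/null so no offset is ever kept:
input {
    file {
        path => ["/var/log/message"]
        type => "system"
        start_position => "beginning"
        # discard the recorded offset so the file is always read from the top
        sincedb_path => "/dev/null"
    }
}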
stdin: standard input
input {
    stdin {
        add_field => {"key" => "value"}
        codec => "plain"
        tags => ["add"]
        type => "std"
    }
}
input {
    stdin {
        type => "web"
    }
}
filter {
    if [type] == "web" {
        grok {
            match => ["message", "%{COMBINEDAPACHELOG}"]
        }
    }
}
output {
    if "_grokparsefailure" in [tags] {
        nagios_nsca {
            nagios_status => "1"
        }
    } else {
        elasticsearch {
        }
    }
}
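When grok fails to match, it adds the _grokparsefailure tag, which the conditional above keys on. If you would rather silently discard unparsable events than alert on them, a minimal sketch:
filter {
    if "_grokparsefailure" in [tags] {
        # drop events that grok could not parse
        drop { }
    }
}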
Filter configuration:
date: time handling
The %{+YYYY.MM.dd} syntax must read the @timestamp field, so never delete that field and keep only your own timestamp field; instead, use filter/date to convert your field into @timestamp, and then delete your own field.
filter {
    grok {
        match => ["message", "%{HTTPDATE:logdate}"]
    }
    date {
        match => ["logdate", "dd/MMM/yyyy:HH:mm:ss Z"]
    }
}
The timezone offset is matched with Z.
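Following the advice above, the raw source field can be dropped in the same date filter once @timestamp has been populated (the field name logdate carries over from the grok example; remove_field is a standard common filter option):
filter {
    date {
        match => ["logdate", "dd/MMM/yyyy:HH:mm:ss Z"]
        # once @timestamp is set, the raw field is redundant
        remove_field => ["logdate"]
    }
}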
Output configuration:
elasticsearch:
output {
    elasticsearch {
        hosts => ["192.168.0.2:9200"]
        index => "logstash-%{type}-%{+YYYY.MM.dd}"
        document_type => "%{type}"
        flush_size => 20000
        idle_flush_time => 10
        sniffing => true
        template_overwrite => true
    }
}
? When Logstash has multiple conf files, it concatenates them into one pipeline, so the data entering ES is duplicated: with N conf files, each event is written N times.
! The output section runs its plugins in order, and every plugin configuration that does not test the event's type executes once for each event. Guard each output with a type conditional:
output {
    if [type] == "nginxaccess" {
        elasticsearch { }
    }
}
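When several log types flow through the same pipeline, each output can be guarded the same way so no event is written twice (the type names and index patterns here are illustrative assumptions):
output {
    if [type] == "nginxaccess" {
        elasticsearch {
            index => "nginx-%{+YYYY.MM.dd}"
        }
    } else if [type] == "system" {
        elasticsearch {
            index => "syslog-%{+YYYY.MM.dd}"
        }
    }
}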
email: send mail
output {
    email {
        to => "admin@website.com,root@website.com"
        cc => "other@website.com"
        via => "smtp"
        subject => "Warning: %{title}"
        options => {
            smtpIporHost => "localhost",
            port => 25,
            domain => 'localhost.localdomain',
            userName => nil,
            password => nil,
            authenticationType => nil, # (plain, login and cram_md5)
            starttls => true
        }
        htmlbody => ""
        body => ""
        attachments => ["/path/to/filename"]
    }
}
Note: the options parameter was removed in Logstash 2.0 and later; its settings became top-level options of the email output:
output {
    email {
        port => "25"
        address => "smtp.126.com"
        username => "test@126.com"
        password => ""
        authentication => "plain"
        use_tls => true
        from => "test@126.com"
        subject => "Warning: %{title}"
        to => "test@qq.com"
        via => "smtp"
        body => "%{message}"
    }
}
file: save to a file
output {
    file {
        path => "/path/to/%{+yyyy}/%{+MM}/%{+dd}/%{host}.log.gz"
        message_format => "%{message}"
        gzip => true
    }
}
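In newer versions of the file output plugin, message_format was removed; to my understanding the equivalent is a line codec with a format option, sketched as:
output {
    file {
        path => "/path/to/%{+yyyy}/%{+MM}/%{+dd}/%{host}.log.gz"
        gzip => true
        # replaces the removed message_format option
        codec => line { format => "%{message}" }
    }
}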