input configuration:
file: read data from files
input {
    file {
        path => ["/var/log/*.log", "/var/log/message"]
        type => "system"
        start_position => "beginning"
    }
}
start_position: where Logstash starts reading the file. The default is the end of the file, so Logstash behaves much like tail -f.
To import existing data, set it to "beginning" and Logstash will read the file from the start.
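For contrast, a minimal sketch of the default tail-style behaviour (the path and type here are only illustrative):
input {
    file {
        path => ["/var/log/nginx/access.log"]
        type => "nginx-access"
        # start_position defaults to "end": only lines appended after startup are read
    }
}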
stdin: standard input
input {
    stdin {
        add_field => { "key" => "value" }
        codec => "plain"
        tags => ["add"]
        type => "std"
    }
}
A complete test pipeline using stdin: events of type "web" are parsed with grok as Apache combined-format logs, parse failures are reported to Nagios, and everything else goes to Elasticsearch:
input {
    stdin {
        type => "web"
    }
}
filter {
    if [type] == "web" {
        grok {
            match => ["message", "%{COMBINEDAPACHELOG}"]
        }
    }
}
output {
    if "_grokparsefailure" in [tags] {
        nagios_nsca {
            nagios_status => "1"
        }
    } else {
        elasticsearch { }
    }
}
filter configuration:
date: time handling
The %{+YYYY.MM.dd} syntax reads the @timestamp field, so never delete that field and keep only your own time field; instead, use filter/date to convert your field into @timestamp and then delete your own field.
filter {
    grok {
        match => ["message", "%{HTTPDATE:logdate}"]
    }
    date {
        match => ["logdate", "dd/MMM/yyyy:HH:mm:ss Z"]
    }
}
Z in the pattern matches the time-zone offset (e.g. +0800).
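Building on the note above, a sketch of converting the temporary field and then dropping it; remove_field is a common filter option and only takes effect when the date parse succeeds (logdate is the field extracted by grok above):
filter {
    grok {
        match => ["message", "%{HTTPDATE:logdate}"]
    }
    date {
        match => ["logdate", "dd/MMM/yyyy:HH:mm:ss Z"]
        # drop the temporary field once @timestamp has been set
        remove_field => ["logdate"]
    }
}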
output configuration:
elasticsearch:
output {
    elasticsearch {
        hosts => ["192.168.0.2:9200"]
        index => "logstash-%{type}-%{+YYYY.MM.dd}"
        document_type => "%{type}"
        flush_size => 20000          # number of events to buffer before a bulk request
        idle_flush_time => 10        # flush at least every 10 seconds even if the buffer is not full
        sniffing => true             # discover the other nodes in the cluster
        template_overwrite => true   # overwrite the index template already stored in ES
    }
}
? When Logstash is started with multiple conf files, the data arriving in ES is duplicated: with N conf files every event is indexed N times.
! All conf files are concatenated into a single pipeline, and the output section runs its plugins in order; any output plugin that does not test the log type is executed once for every event.
output {
    if [type] == "nginxaccess" {
        elasticsearch { }
    }
}
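Put differently, when several conf files are loaded, each output should carry its own type test; a sketch assuming two illustrative types and index names:
output {
    if [type] == "nginxaccess" {
        elasticsearch {
            index => "logstash-nginxaccess-%{+YYYY.MM.dd}"
        }
    }
    if [type] == "system" {
        elasticsearch {
            index => "logstash-system-%{+YYYY.MM.dd}"
        }
    }
}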
email: send mail
output {
    email {
        to => "admin@website.com,root@website.com"
        cc => "other@website.com"
        via => "smtp"
        subject => "Warning: %{title}"
        options => {
            smtpIporHost => "localhost",
            port => 25,
            domain => 'localhost.localdomain',
            userName => nil,
            password => nil,
            authenticationType => nil, # (plain, login and cram_md5)
            starttls => true
        }
        htmlbody => ""
        body => ""
        attachments => ["/path/to/filename"]
    }
}
Note: the options parameter was removed in Logstash 2.0 and later.
output {
    email {
        port => "25"
        address => "smtp.126.com"
        username => "test@126.com"
        password => ""
        authentication => "plain"
        use_tls => true
        from => "test@126.com"
        subject => "Warning: %{title}"
        to => "test@qq.com"
        via => "smtp"
        body => "%{message}"
    }
}
file: save output to a file
output {
    file {
        path => "/path/to/%{+yyyy}/%{+MM}/%{+dd}/%{host}.log.gz"
        message_format => "%{message}"
        gzip => true
    }
}
