When handling logs, besides access logs we also have to process runtime logs, which are usually written by the application itself through a logging framework such as log4j. The biggest difference between the two is that a runtime log entry can span multiple lines: several consecutive lines together express a single event.
Add the following to the filter section (note: in current Logstash releases, multiline is usually configured as a codec on the input instead, which is the form the example below uses):

filter {
    multiline { }
}
Once consecutive lines are combined into a single multiline event, splitting the event into fields becomes straightforward.
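As a sketch of that field-splitting step, a grok filter could pull the timestamp, level, and class out of the first line of each merged event. The pattern below is an assumption written for the `[16-04-12 03:40:01 DEBUG] model.MappingNode:- ...` format used later in this article, not something from the original configuration; the field names log_timestamp, level, class, and log_message are illustrative:

```
filter {
    grok {
        # Assumed pattern for lines like:
        # [16-04-12 03:40:01 DEBUG] model.MappingNode:- ['/store/shopclass'] matched over.
        match => { "message" => "\[%{DATA:log_timestamp} %{LOGLEVEL:level}\] %{JAVACLASS:class}:- %{GREEDYDATA:log_message}" }
    }
}
```

Because the multiline codec joins continuation lines with \n, GREEDYDATA here captures everything after ":- " on the first line; a multiline-aware pattern would need (?m) or a different anchor.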
Field properties:

For the multiline plugin, three settings matter most: negate, pattern, and what.

negate: boolean, defaults to false. When true, the merge rule applies to lines that do NOT match pattern.
pattern: required, with no default. A string containing the regular expression that each line is matched against.
what: required, with no default. Either "previous" or "next", indicating whether a line selected for merging belongs to the previous event or to the next one.
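To make the interplay of the three settings concrete, here is a minimal Python sketch of the grouping logic (this is an illustration of the behavior described above, not Logstash's actual implementation; the function name merge_multiline is made up):

```python
import re

def merge_multiline(lines, pattern=r"^\[", negate=True, what="previous"):
    """Group raw lines into events the way the multiline plugin does.

    A line "belongs" to a neighboring event when it matches the pattern
    (negate=False) or when it does NOT match (negate=True); `what` says
    whether it attaches to the previous event or the next one.
    """
    regex = re.compile(pattern)
    events, buf = [], []
    for line in lines:
        matched = bool(regex.match(line))
        belongs = (not matched) if negate else matched
        if what == "previous":
            if belongs and buf:
                buf.append(line)            # continuation of the previous event
            else:
                if buf:
                    events.append("\n".join(buf))
                buf = [line]                # this line starts a new event
        else:  # what == "next": a "belonging" line attaches to what follows
            buf.append(line)
            if not belongs:
                events.append("\n".join(buf))
                buf = []
    if buf:
        events.append("\n".join(buf))       # flush the last event
    return events
```

With pattern "^\[", negate true, and what "previous" (the combination used in the example below), any line that does not start with "[" is appended to the event before it.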
Let's look at an example:
# cat logstash_multiline_shipper.conf
input {
    file {
        path => "/apps/logstash/conf/test/c.out"
        type => "runtimelog"
        codec => multiline {
            pattern => "^\["
            negate => true
            what => "previous"
        }
        start_position => "beginning"
        sincedb_path => "/apps/logstash/logs/sincedb-access"
        ignore_older => 0
    }
}
output {
    stdout { codec => rubydebug }
}
Explanation: match lines that begin with "["; any line that does not must belong to the previous line.
The test data is as follows:
[16-04-12 03:40:01 DEBUG] model.MappingNode:- ['/store/shopclass'] matched over.
[16-04-12 03:40:02 DEBUG] impl.JdbcEntityInserter:- from product_category product_category where product_category.PARENT_ID is null and product_category.STATUS = ? and product_category.DEALER_ID is null order by product_category.ORDERS asc
[16-04-12 03:40:03 DEBUG] model.MappingNode:- ['/store/shopclass'] matched over.
[16-04-12 03:40:04 DEBUG] model.MappingNode:- ['/store/shopclass'] matched over.
[16-04-12 03:40:05 DEBUG] impl.JdbcEntityInserter:- from product_category product_category where product_category.PARENT_ID is null and product_category.STATUS = ? and product_category.DEALER_ID is null order by product_category.ORDERS desc
[16-04-12 03:40:06 DEBUG] impl.JdbcEntityInserter:- from product_category product_category where product_category.PARENT_ID is null and product_category.STATUS = ? and product_category.DEALER_ID is null order by product_category.ORDERS asc
[16-04-12 03:40:07 DEBUG] model.MappingNode:- ['/store/shopclass'] matched over.
Start Logstash:
# ./../bin/logstash -f logstash_multiline_shipper.conf
Sending Logstash's logs to /apps/logstash/logs which is now configured via log4j2.properties
[2016-12-09T15:16:59,173][INFO ][logstash.pipeline ] Starting pipeline {"id"=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>500}
[2016-12-09T15:16:59,192][INFO ][logstash.pipeline ] Pipeline main started
[2016-12-09T15:16:59,263][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9601}
After appending the test data to the monitored log file, check the output:
{
          "path" => "/apps/logstash/conf/test/c.out",
    "@timestamp" => 2016-12-09T07:21:15.403Z,
      "@version" => "1",
          "host" => "ofs1",
       "message" => "# ./../bin/logstash -f logstash_multiline_shipper.conf \nSending Logstash's logs to /apps/logstash/logs which is now configured via log4j2.properties",
          "type" => "runtimelog",
          "tags" => [
        [0] "multiline"
    ]
}
{
          "path" => "/apps/logstash/conf/test/c.out",
    "@timestamp" => 2016-12-09T07:21:15.409Z,
      "@version" => "1",
          "host" => "ofs1",
       "message" => "[2016-12-09T15:16:59,173][INFO ][logstash.pipeline ] Starting pipeline {\"id\"=>\"main\", \"pipeline.workers\"=>4, \"pipeline.batch.size\"=>125, \"pipeline.batch.delay\"=>5, \"pipeline.max_inflight\"=>500}",
          "type" => "runtimelog",
          "tags" => []
}
{
          "path" => "/apps/logstash/conf/test/c.out",
    "@timestamp" => 2016-12-09T07:21:15.410Z,
      "@version" => "1",
          "host" => "ofs1",
       "message" => "[2016-12-09T15:16:59,192][INFO ][logstash.pipeline ] Pipeline main started",
          "type" => "runtimelog",
          "tags" => []
}