Put together by a colleague; sharing it here. Logback, log4j2, and the other SLF4J logging implementations can all emit logs in JSON format; logback is used here. You could also emit plain text lines and parse them with grok in logstash, but emitting JSON directly is somewhat more efficient on the logstash side.
Writing JSON log files with Logback
To make logback write JSON log files, add the following dependency to pom.xml:
```xml
<dependency>
    <groupId>net.logstash.logback</groupId>
    <artifactId>logstash-logback-encoder</artifactId>
    <version>4.8</version>
    <scope>runtime</scope>
</dependency>
```
Example logback appender configuration:
```xml
<appender name="errorFile" class="ch.qos.logback.core.rolling.RollingFileAppender">
    <filter class="ch.qos.logback.classic.filter.LevelFilter">
        <level>ERROR</level>
        <onMatch>ACCEPT</onMatch>
        <onMismatch>DENY</onMismatch>
    </filter>
    <!-- The active log file lives under the elk directory; filebeat ships its contents to ES -->
    <file>${log.dir}/elk/error.log</file>
    <!-- Rolled files go to the bak directory: at most 7 days of history, capped at 1 GB -->
    <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
        <fileNamePattern>${log.dir}/bak/error.%d{yyyy-MM-dd}.log</fileNamePattern>
        <maxHistory>7</maxHistory>
        <totalSizeCap>1GB</totalSizeCap>
    </rollingPolicy>
    <encoder class="net.logstash.logback.encoder.LoggingEventCompositeJsonEncoder">
        <providers>
            <pattern>
                <pattern>
                    {
                    "tags": ["errorlog"],
                    "project": "myproject",
                    "timestamp": "%date{\"yyyy-MM-dd'T'HH:mm:ss,SSSZ\"}",
                    "log_level": "%level",
                    "thread": "%thread",
                    "class_name": "%class",
                    "line_number": "%line",
                    "message": "%message",
                    "stack_trace": "%exception{5}",
                    "req_id": "%X{reqId}",
                    "elapsed_time": "#asLong{%X{elapsedTime}}"
                    }
                </pattern>
            </pattern>
        </providers>
    </encoder>
</appender>
```
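For reference, a single error event produced by this encoder looks roughly like the following (one line per event in the real file; all field values here are made up for illustration). Note that `#asLong{...}` makes elapsed_time a JSON number rather than a string:

```json
{
  "tags": ["errorlog"],
  "project": "myproject",
  "timestamp": "2017-03-02T15:04:05,123+0800",
  "log_level": "ERROR",
  "thread": "http-nio-8080-exec-1",
  "class_name": "com.example.demo.UserService",
  "line_number": "42",
  "message": "failed to load user",
  "stack_trace": "java.lang.IllegalStateException: user cache not ready\n\tat com.example.demo.UserService.load(UserService.java:42)",
  "req_id": "7f8c2d3a1b9e4f60",
  "elapsed_time": 13
}
```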
JSON field reference:

| Name | Description | Notes |
|---|---|---|
| tags | Which category of log the entry belongs to | |
| timestamp | Time the entry was recorded | |
| project | Name of the system the entry came from | |
| log_level | Level of the log entry | |
| thread | Name of the thread that produced the entry | |
| class_name | Fully qualified class name of the caller issuing the logging request | |
| line_number | Line number of the logging request | |
| message | The message supplied by the application | |
| stack_trace | Exception stack trace | |
| req_id | Request ID, used to trace a request | Requires aop-logging |
| elapsed_time | Execution time of the method, in milliseconds | Requires aop-logging |
Each %X{key} conversion reads its value from the SLF4J MDC; populating reqId and elapsedTime requires the aop-logging dependency:
```xml
<dependency>
    <groupId>com.cloud</groupId>
    <artifactId>xspring-aop-logging</artifactId>
    <version>0.7.1</version>
</dependency>
```
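For context, %X{reqId} in the encoder pattern simply reads whatever value is stored in the SLF4J MDC under that key. A minimal sketch of the mechanism (illustrative only, not the library's actual source):

```java
import org.slf4j.MDC;

import javax.servlet.*;
import java.io.IOException;
import java.util.UUID;

// Illustrative stand-in for what a request-ID filter does: bind an ID to the
// current thread's MDC so %X{reqId} resolves for every log call in the request.
public class MdcReqIdFilter implements Filter {
    @Override
    public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
            throws IOException, ServletException {
        try {
            MDC.put("reqId", UUID.randomUUID().toString());
            chain.doFilter(req, res);
        } finally {
            MDC.remove("reqId"); // server threads are pooled: always clear the key
        }
    }

    @Override public void init(FilterConfig filterConfig) { }
    @Override public void destroy() { }
}
```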
For web applications, add ReqIdFilter to web.xml; this filter puts a reqId into the MDC:
```xml
<filter>
    <filter-name>aopLogReqIdFilter</filter-name>
    <filter-class>com.github.nickvl.xspring.core.log.aop.ReqIdFilter</filter-class>
</filter>
<filter-mapping>
    <filter-name>aopLogReqIdFilter</filter-name>
    <url-pattern>/*</url-pattern>
</filter-mapping>
```
Or register it in Spring Boot like this:

```java
@Bean
public FilterRegistrationBean getDemoFilter() {
    ReqIdFilter reqIdFilter = new ReqIdFilter();
    FilterRegistrationBean registrationBean = new FilterRegistrationBean();
    registrationBean.setFilter(reqIdFilter);
    List<String> urlPatterns = new ArrayList<String>();
    urlPatterns.add("/*");
    registrationBean.setUrlPatterns(urlPatterns);
    registrationBean.setOrder(100); // lower values run earlier; 100 leaves room for framework filters
    return registrationBean;
}
```
To record a method's execution time (elapsed_time), add the following annotations to the class or method:
```java
import com.github.nickvl.xspring.core.log.aop.annotation.Exc;          // added: used below, assumed same package
import com.github.nickvl.xspring.core.log.aop.annotation.LogException; // added: used below, assumed same package
import com.github.nickvl.xspring.core.log.aop.annotation.LogInfo;

@LogInfo // logged when the logger is set to level=INFO
@LogException(value = {@Exc(value = Exception.class, stacktrace = false)},
              warn = {@Exc({IllegalArgumentException.class})}) // logged when the logger is set to level=ERROR
```
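As a usage sketch, the annotations go on a service class or method (the class below is hypothetical); per the table above, aop-logging then logs the invocation and exposes elapsedTime through the MDC, which the encoder picks up via #asLong{%X{elapsedTime}}:

```java
import com.github.nickvl.xspring.core.log.aop.annotation.LogInfo;

@LogInfo // calls to this class's methods are logged with arguments and return value
public class UserService {
    public String findUserName(String userId) {
        return "user-" + userId; // placeholder business logic
    }
}
```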
For logging dubbo consumers: a dubbo consumer is a dynamic type generated via javassist, so to capture an interface's input parameters, return value, and call time you need aop-logging, and the corresponding classes or methods on the interfaces in the eye-rpc package must carry the annotations above. Then configure a dedicated logger for dubbo consumers:
```xml
<logger name="com.alibaba.dubbo.common.bytecode" level="INFO" additivity="false">
    <appender-ref ref="dubboApiFile"/>
</logger>
```
Elasticsearch template setup

Two index templates are used: a generic one for all log-* indices (order 0) and a Java-specific one for log-java-* indices (order 1). When an index name matches both patterns, Elasticsearch merges the templates and the higher order wins on conflicts.
```bash
curl -XPUT http://localhost:9200/_template/log -d '{
  "mappings": {
    "_default_": {
      "_all": { "enabled": false },
      "_meta": { "version": "5.1.1" },
      "dynamic_templates": [
        {
          "strings_as_keyword": {
            "mapping": { "ignore_above": 1024, "type": "keyword" },
            "match_mapping_type": "string"
          }
        }
      ],
      "properties": {
        "@timestamp": { "type": "date" },
        "beat": {
          "properties": {
            "hostname": { "ignore_above": 1024, "type": "keyword" },
            "name": { "ignore_above": 1024, "type": "keyword" },
            "version": { "ignore_above": 1024, "type": "keyword" }
          }
        },
        "input_type": { "ignore_above": 1024, "type": "keyword" },
        "message": { "norms": false, "type": "text" },
        "offset": { "type": "long" },
        "source": { "ignore_above": 1024, "type": "keyword" },
        "tags": { "ignore_above": 1024, "type": "keyword" },
        "type": { "ignore_above": 1024, "type": "keyword" }
      }
    }
  },
  "order": 0,
  "settings": { "index.refresh_interval": "5s" },
  "template": "log-*"
}'
```
```bash
curl -XPUT http://localhost:9200/_template/log-java -d '{
  "mappings": {
    "_default_": {
      "properties": {
        "log_level": { "ignore_above": 1024, "type": "keyword" },
        "project": { "ignore_above": 1024, "type": "keyword" },
        "thread": { "ignore_above": 1024, "type": "keyword" },
        "req_id": { "ignore_above": 1024, "type": "keyword" },
        "class_name": { "ignore_above": 1024, "type": "keyword" },
        "line_number": { "type": "long" },
        "exception_class": { "ignore_above": 1024, "type": "keyword" },
        "elapsed_time": { "type": "long" },
        "stack_trace": { "type": "keyword" }
      }
    }
  },
  "order": 1,
  "settings": { "index.refresh_interval": "5s" },
  "template": "log-java-*"
}'
```
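To confirm that both templates are installed, they can be read back; wildcards in template names are supported by the _template API:

```bash
curl -XGET 'http://localhost:9200/_template/log*?pretty'
```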
Logstash setup

Filter configuration for Java logs (logstash-java-log):
```
if [fields][logType] == "java" {
  json {
    source => "message"            # the filebeat "message" field holds the JSON event from logback
    remove_field => ["offset"]
  }
  date {
    match => ["timestamp", "yyyy-MM-dd'T'HH:mm:ss,SSSZ"]
    remove_field => ["timestamp"]  # promoted to @timestamp, so drop the original field
  }
  if [stack_trace] {
    mutate {
      add_field => { "exception_class" => "%{stack_trace}" }
    }
  }
  if [exception_class] {
    mutate {
      # keep only the exception class name: strip newlines, then everything from the first colon
      gsub => [
        "exception_class", "\n", "",
        "exception_class", ":.*", ""
      ]
    }
  }
}
```
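This snippet is only the filter section and has to sit inside a filter { } block. For completeness, a sketch of the surrounding pipeline; the port and index naming are assumptions (derived from the filebeat docType field below), not part of the original setup:

```
input {
  beats { port => 5044 }  # assumed: filebeat ships to the default beats port
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    # assumed naming: the docType set in filebeat (e.g. log-java-dev) prefixes daily
    # indices such as log-java-dev-2017.03.02, which match the log-* and log-java-* templates
    index => "%{[fields][docType]}-%{+YYYY.MM.dd}"
  }
}
```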
Filebeat setup

filebeat.yml:
```yaml
filebeat.prospectors:
- input_type: log
  paths:
    - /eyebiz/logs/eyebiz-service/elk/*.log   # eyebiz-service logs
    - /eyebiz/logs/eyebiz-web/elk/*.log       # eyebiz-web logs
  fields:
    logType: "java"
    docType: "log-java-dev"
```
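The prospectors section only declares what to read; shipping also needs an output section. A minimal sketch, assuming logstash listens on the beats port from the pipeline sketch above:

```yaml
output.logstash:
  hosts: ["localhost:5044"]
```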