ASP.NET Core + log4net + ELK: Building a Log Center


One-command ELK installation with Docker

For tooling like this, my first step was to search Docker Hub. Luckily, there are not only standalone images for Elasticsearch, Kibana and Logstash, but also a ready-made ELK image.

sudo docker run -p 5601:5601 -p 9200:9200 -p 5044:5044 -d --name log-platform --restart always  sebp/elk

This saves a lot of configuration, so I picked the ELK image without hesitation and started it. If nothing goes wrong it should come up easily (the most likely problem is insufficient virtual memory; on Linux this is usually fixed by raising vm.max_map_count to at least 262144, so I won't go into more detail here).

Adding a log4net Elasticsearch appender to the project

I had built a log center before .NET Core came along, so I remembered an appender DLL called log4net.Elasticsearch, but after checking I found it had not been updated in a long time and does not support .NET Standard. Before deciding to write an appender myself, I searched around hoping to get lucky, and sure enough found an open-source project that supports .NET Core: log4stash. Great, no need to reinvent the wheel. log4stash is simple to use, but it has quite a lot of configuration options and the author does not provide much documentation, so let's get it working first and worry about the rest later.

  • Add log4stash to the project
Install-Package log4stash -Version 2.2.1
  • Modify log4net.config
    Add the appender to log4net.config:
<appender name="ElasticSearchAppender" type="log4stash.ElasticSearchAppender, log4stash">
    <Server>localhost</Server>
    <Port>9200</Port>
    <IndexName>log_test_%{+yyyy-MM-dd}</IndexName>
    <IndexType>LogEvent</IndexType>
    <Bulksize>2000</Bulksize>
    <BulkIdleTimeout>10000</BulkIdleTimeout>
    <IndexAsync>True</IndexAsync>
</appender>
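
The article assumes the project already loads log4net.config. If yours does not, here is a minimal sketch of wiring it up at startup in an ASP.NET Core app; the file name and location are assumptions, so adjust them to your project layout.

using System.IO;
using System.Reflection;
using log4net;
using log4net.Config;
using log4net.Repository;

public class Program
{
    public static void Main(string[] args)
    {
        // On .NET Standard, log4net needs an explicit repository when configuring.
        ILoggerRepository repository = LogManager.GetRepository(Assembly.GetEntryAssembly());

        // Load the log4net.config shown above from the application base directory
        // (assumed location - adjust the path to match your project).
        XmlConfigurator.Configure(repository, new FileInfo("log4net.config"));

        // ... build and run the ASP.NET Core host as usual ...
    }
}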

For reference, here is the full configuration with all of the available options:

<appender name="ElasticSearchAppender" type="log4stash.ElasticSearchAppender, log4stash">
    <Server>localhost</Server>
    <Port>9200</Port>
    <!-- optional: if Elasticsearch sits behind a reverse proxy, the URL takes the form http://Server:Port/Path; default = empty string -->
    <Path>/es5</Path>
    <IndexName>log_test_%{+yyyy-MM-dd}</IndexName>
    <IndexType>LogEvent</IndexType>
    <Bulksize>2000</Bulksize>
    <BulkIdleTimeout>10000</BulkIdleTimeout>
    <IndexAsync>False</IndexAsync>
    <DocumentIdSource>IdSource</DocumentIdSource> <!-- obsolete! use IndexOperationParams -->
    
    <!-- Serialize the logged object as JSON (default is true).
         This applies when you log an object, e.g. `logger.Debug(obj);`, rather than a string: `logger.Debug("string");` -->
    <SerializeObjects>True</SerializeObjects>

    <!-- optional: elasticsearch timeout for the request, default = 10000 -->
    <ElasticSearchTimeout>10000</ElasticSearchTimeout>

    <!--You can add parameters to the indexing request sent to Elasticsearch.
    For example, the entries below add a routing specification to the appender.
    The Key is the key added to the request, and the Value is the name of the log event property
    that supplies it (see the sketch after this config for one way to set such properties from code).-->
    <IndexOperationParams>
      <Parameter>
        <Key>_routing</Key>
        <Value>%{RoutingSource}</Value>
      </Parameter>
      <Parameter>
        <Key>_id</Key>
        <Value>%{IdSource}</Value>
      </Parameter>
      <Parameter>
        <Key>key</Key>
        <Value>value</Value>
      </Parameter>
    </IndexOperationParams>

    <!-- for more information read about log4net.Core.FixFlags -->
    <FixedFields>Partial</FixedFields>
    
    <Template>
      <Name>templateName</Name>
      <FileName>path2template.json</FileName>
    </Template>

    <!--Only one credential type can be used at a time.-->
    <!--All possible types are listed here.-->
    <AuthenticationMethod>
      <!--For basic authentication purposes-->
      <Basic>
          <Username>Username</Username>
          <Password>Password</Password>
      </Basic>
      <!--For AWS ElasticSearch service-->
      <Aws>
          <Aws4SignerSecretKey>Secret</Aws4SignerSecretKey>
          <Aws4SignerAccessKey>AccessKey</Aws4SignerAccessKey>
          <Aws4SignerRegion>Region</Aws4SignerRegion>
      </Aws>
    </AuthenticationMethod>
    
    <!-- all filters go inside the ElasticFilters tag -->
    <ElasticFilters>
      <Add>
        <Key>@type</Key>
        <Value>Special</Value>
      </Add>

      <!-- using the @type value from the previous filter -->
      <Add>
        <Key>SmartValue</Key>
        <Value>the type is %{@type}</Value>
      </Add>

      <Remove>
        <Key>@type</Key>
      </Remove>

      <!-- you can load custom filters like I do here -->
      <Filter type="log4stash.Filters.RenameKeyFilter, log4stash">
        <Key>SmartValue</Key>
        <RenameTo>SmartValue2</RenameTo>
      </Filter>
    
      <!-- converts a json object to fields in the document -->
      <Json>
        <SourceKey>JsonRaw</SourceKey>
        <FlattenJson>false</FlattenJson>
        <!-- the separator property is only relevant when setting the FlattenJson property to 'true' -->
        <Separator>_</Separator> 
      </Json>

      <!-- converts an xml object to fields in the document -->
      <Xml>
        <SourceKey>XmlRaw</SourceKey>
        <FlattenXml>false</FlattenXml>
      </Xml>
      
      <!-- kv and grok filters similar to logstash's filters -->
      <Kv>
        <SourceKey>Message</SourceKey>
        <ValueSplit>:=</ValueSplit>
        <FieldSplit> ,</FieldSplit>
      </Kv>

      <Grok>
        <SourceKey>Message</SourceKey>
        <Pattern>the message is %{WORD:Message} and guid %{UUID:the_guid}</Pattern>
        <Overwrite>true</Overwrite>
      </Grok>

      <!-- Converts a string like "1,2, 45 9" into an array of numbers [1,2,45,9] -->
      <ConvertToArray>
        <SourceKey>someIds</SourceKey>
        <!-- The separators (space and comma) -->
        <Seperators>, </Seperators> 
      </ConvertToArray>

      <Convert>
        <!-- convert given key to string -->
        <ToString>shouldBeString</ToString>

        <!-- same as ConvertToArray. Just for convenience -->
        <ToArray>
           <SourceKey>anotherIds</SourceKey>
        </ToArray>
      </Convert>
    </ElasticFilters>
</appender>
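
The %{RoutingSource} and %{IdSource} tokens above are resolved from the log event's properties. Below is a rough sketch, assuming the appender picks up standard log4net context properties; the property names simply mirror the sample config, and the class and method are made up for illustration.

using log4net;

public static class OrderLogger
{
    private static readonly ILog Log = LogManager.GetLogger(typeof(OrderLogger));

    public static void LogOrderFailure(string orderId, string customerId)
    {
        // Values placed in ThreadContext become properties on log events written
        // on this thread, so %{IdSource} and %{RoutingSource} in IndexOperationParams
        // can resolve them (assumption: the appender reads these context properties).
        ThreadContext.Properties["IdSource"] = orderId;         // used as the document _id
        ThreadContext.Properties["RoutingSource"] = customerId; // used as the _routing value

        Log.Error($"Order {orderId} failed for customer {customerId}");
    }
}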

Finally, don't forget to reference the appender under the root element:

  <root>
    <level value="WARN" />
    <appender-ref ref="ElasticSearchAppender" />
  </root>

OK, that's it for the project configuration; you can now run the project and write a few test log entries.
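
For instance, a throwaway controller along these lines will do; the controller name and route are only for illustration, and remember that the root level above is WARN, so anything below Warn will be filtered out.

using log4net;
using Microsoft.AspNetCore.Mvc;

public class LogTestController : Controller
{
    private static readonly ILog Log = LogManager.GetLogger(typeof(LogTestController));

    [HttpGet("/logtest")]
    public IActionResult Get()
    {
        // WARN and above pass the root <level value="WARN" /> configured earlier.
        Log.Warn("Test warning written to Elasticsearch via log4stash");

        // With <SerializeObjects>True</SerializeObjects>, logging an object serializes it
        // to JSON so its fields show up as fields on the indexed document.
        Log.Error(new { Action = "Get", Status = 500, Reason = "sample error object" });

        return Ok("logged");
    }
}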

Creating an Index Pattern in Kibana

An index in Elasticsearch is quite similar to a database in a relational database. Once the project has written some test data, Elasticsearch already contains the index, but to browse and query that data in Kibana we still need to create an Index Pattern.
Go to Management → Index Patterns → Create Index Pattern and enter the index name from our log configuration, log_test_* (if data has already been indexed, Kibana should suggest it), then create the pattern. After that we can browse and query our log entries in Kibana.

