Configuring Filebeat to collect logs from multiple directories


Because of business requirements, a single node on our servers runs multiple services, covering both the front end and the back end, so we need Filebeat to collect these logs and write them into separate indices. The configuration is as follows (for reference only):

input:

filebeat.inputs:

# Each - is an input. Most options can be set at the input level, so
# you can use different inputs for various configurations.
# Below are the input specific configurations.

- type: log

  # Change to true to enable this input configuration.
  enabled: true

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - /var/log/nginx/*.log
  fields:
    log_type: "nginx"

  json.keys_under_root: true
  json.overwrite_keys: true
- type: log
  enabled: true
  paths:
    - /var/log/elasticsearch/elasticsearch.log
  fields:
    log_type: "es"

  # Append indented continuation lines (e.g. stack-trace frames) to the preceding event
  multiline.pattern: '^\s'
  multiline.negate: false
  multiline.match: after

- type: log
  enabled: true
  paths:
    - /data/ruoyi/*.log
  fields:
    log_type: "ruoyi"

  # Append indented continuation lines to the preceding event
  multiline.pattern: '^\s'
  multiline.negate: false
  multiline.match: after
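The multiline settings above are intended to merge whitespace-indented continuation lines (such as Java stack-trace frames) into the preceding log event. As an illustrative sketch only (not Filebeat's actual implementation), the grouping logic behaves roughly like this:

```python
import re

# Pattern from the config above: a continuation line starts with whitespace
CONTINUATION = re.compile(r'^\s')

def group_multiline(lines):
    """Append each whitespace-indented line to the previous event,
    mirroring multiline.match: after for the pattern above."""
    events = []
    for line in lines:
        if CONTINUATION.match(line) and events:
            events[-1] += "\n" + line   # continuation of the previous event
        else:
            events.append(line)         # a new log event
    return events

raw = [
    "2024-05-01 10:00:00 ERROR failed to index document",
    "    at org.elasticsearch.index.IndexService.doWork",
    "    at java.base/java.lang.Thread.run",
    "2024-05-01 10:00:01 INFO recovered",
]
print(group_multiline(raw))
```

Here the four raw lines collapse into two events: the error line with its two stack frames attached, and the info line on its own.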

output:

output.elasticsearch:
  # Array of hosts to connect to.
  hosts: ["192.168.53.21:9200","192.168.53.22:9200"]
  index: "nginx-%{+yyyy.MM}"
  indices:
    - index: "es-log"
      when.contains:
        fields:
          log_type: "es"
    - index: "ruoyi-log"
      when.contains:
        fields:
          log_type: "ruoyi"
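Note: in recent Filebeat versions (7.x and later), overriding `index` in the Elasticsearch output also requires setting a matching template name and pattern, and disabling index lifecycle management, or the custom index name is ignored. A minimal sketch, assuming Filebeat 7.x (check the option names against your version's documentation):

```yaml
# Assumption: Filebeat 7.x — with ILM enabled, the custom `index` above is ignored
setup.ilm.enabled: false
setup.template.name: "nginx"
setup.template.pattern: "nginx-*"
```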

To summarize: events are distinguished by field, and a different index is created per `log_type`. The top-level `index` setting under `output.elasticsearch` is the fallback: any event that matches neither of the two conditions below it goes into the nginx index.
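The routing rule can be illustrated with a small sketch (illustrative only; the names and structure here are assumptions, not Filebeat internals): each entry under `indices` is checked in order, the first `when.contains` match wins, and otherwise the fallback `index` applies.

```python
# Illustrative sketch of the `indices` / `when.contains` routing above
ROUTES = [
    ("es-log", {"log_type": "es"}),
    ("ruoyi-log", {"log_type": "ruoyi"}),
]
FALLBACK = "nginx-%{+yyyy.MM}"  # the top-level `index` setting

def pick_index(fields):
    """Return the target index name for an event's `fields` dict."""
    for index_name, conditions in ROUTES:
        # when.contains: every listed field must contain the given substring
        if all(value in str(fields.get(key, "")) for key, value in conditions.items()):
            return index_name
    return FALLBACK

print(pick_index({"log_type": "es"}))     # es-log
print(pick_index({"log_type": "nginx"}))  # falls back to the nginx index
```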

