Filebeat configuration for collecting logs from multiple directories


For business reasons, a single node in our environment runs several services, both front-end and back-end, so we use Filebeat to collect their logs and write them into separate indices. The configuration is as follows (for reference only):

input:

filebeat.inputs:

# Each - is an input. Most options can be set at the input level, so
# you can use different inputs for various configurations.
# Below are the input specific configurations.

- type: log

  # Change to true to enable this input configuration.
  enabled: true

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - /var/log/nginx/*.log
  fields:
    log_type: "nginx"

  json.keys_under_root: true
  json.overwrite_keys: true
- type: log
  enabled: true
  paths:
    - /var/log/elasticsearch/elasticsearch.log
  fields:
    log_type: "es"

  # Append indented lines (e.g. stack-trace continuations) to the
  # preceding event line.
  multiline.pattern: '^\s'
  multiline.negate: false
  multiline.match: after

- type: log
  enabled: true
  paths:
    - /data/ruoyi/*.log
  fields:
    log_type: "ruoyi"

  # Append indented lines (e.g. stack-trace continuations) to the
  # preceding event line.
  multiline.pattern: '^\s'
  multiline.negate: false
  multiline.match: after
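Note that values declared under `fields` are nested beneath a top-level `fields` key in each event, which is why the output conditions later match on `fields` / `log_type`. If you would rather have the field at the event root, Filebeat also supports `fields_under_root` (a sketch of an alternative, not part of the original config):

```yaml
- type: log
  enabled: true
  paths:
    - /var/log/nginx/*.log
  fields:
    log_type: "nginx"
  # Lift log_type to the event root instead of fields.log_type.
  # If enabled, output conditions must match on log_type directly.
  fields_under_root: true
```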

output:

output.elasticsearch:
  # Array of hosts to connect to.
  hosts: ["192.168.53.21:9200","192.168.53.22:9200"]
  index: "nginx-%{+yyyy.MM}"
  indices:
    - index: "es-log"
      when.contains:
        fields:
          log_type: "es"
    - index: "ruoyi-log"
      when.contains:
        fields:
          log_type: "ruoyi"
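One caveat worth checking against your Filebeat version: when you override `output.elasticsearch.index`, Filebeat requires a matching `setup.template.name` and `setup.template.pattern`, and on 7.x index lifecycle management usually has to be disabled, otherwise the custom index names are ignored. A hedged sketch:

```yaml
setup.ilm.enabled: false          # 7.x: ILM would otherwise override custom index names
setup.template.name: "nginx"
setup.template.pattern: "nginx-*"
```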

To explain briefly: events are distinguished by field, and a separate index is created per field value. The `index` setting directly under `hosts` in the output is the fallback: any event that matches neither of the two conditions below it is written to the nginx index.
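The routing logic described above can be sketched in Python. This is a simplified model of how the `indices` list selects a target index, not Filebeat's actual implementation:

```python
def route_index(event, default_index="nginx-%{+yyyy.MM}"):
    """Model of the output routing: the first `indices` entry whose
    when.contains condition matches wins; otherwise the event falls
    back to the top-level `index` setting (which Filebeat expands)."""
    rules = [
        ("es-log", {"log_type": "es"}),
        ("ruoyi-log", {"log_type": "ruoyi"}),
    ]
    fields = event.get("fields", {})
    for index, condition in rules:
        # when.contains is a substring match on each listed field
        if all(value in str(fields.get(key, "")) for key, value in condition.items()):
            return index
    return default_index

print(route_index({"fields": {"log_type": "es"}}))     # es-log
print(route_index({"fields": {"log_type": "nginx"}}))  # falls back to the nginx index
```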

