
Local file -> Logstash -> Elasticsearch

logstash.conf

input {
    file {
        path => "/home/ubuntu/data/stock.csv"
        start_position => "beginning"
        sincedb_path => "/dev/null"
    }
}

filter {
    csv {
        separator => ","
        columns => [ "Date", "High", "Low" ]
    }
    date {
        match => ["Date", "yyyy-MM-dd"]
    }
    mutate {convert => ["High", "float"]}
    mutate {convert => ["Low", "float"]}
}

output {
    elasticsearch {
        hosts => "192.168.56.101"
        index => "stock"
    }
    stdout  {}
}
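The csv, date, and mutate filters above can be sketched in Python to show what each step does to an event. This is only an illustrative sketch of the equivalent transformation (the sample row is hypothetical), not how Logstash itself runs:

```python
import csv
from datetime import datetime, timezone

def parse_stock_row(line):
    """Mimic the csv + date + mutate filters from the config above."""
    # csv filter: split on "," and map values to the configured columns
    row = next(csv.reader([line], delimiter=","))
    event = dict(zip(["Date", "High", "Low"], row))
    # date filter: parse "yyyy-MM-dd" into the event timestamp (@timestamp)
    event["@timestamp"] = datetime.strptime(event["Date"], "%Y-%m-%d").replace(
        tzinfo=timezone.utc
    )
    # mutate convert: string -> float
    event["High"] = float(event["High"])
    event["Low"] = float(event["Low"])
    return event

event = parse_stock_row("2024-03-21,71500.0,70100.0")
print(event["High"], event["Low"])  # 71500.0 70100.0
```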

Storing nginx logs

Example log configuration

log_format  main  '"$time_local" "$remote_addr" "$request_method" "$request_uri" "$status" "$http_referer" "$http_user_agent" "$http_x_forwarded_for"';

Example log - /var/logs/nginx-access.log

"11/Feb/2024:15:54:07 +0900" "220.79.108.60" "GET" "/favicon.svg?v=20240210193906" "200" "https://unbelidictor.com/" "Mozilla/5.0 (iPhone; CPU iPhone OS 16_6 like Mac OS X) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/16.6 Mobile/15E148 Safari/604.1" "-"

logstash.conf

  • For the example below, fields.type must be set in Filebeat

input {
  beats {
    port => 5044
  }
}

filter {
  if [fields][type] == "nginx" {
    grok {
      match => ["message", '"%{DATA:_datetime}" "%{DATA:clientIp}" "%{DATA:method}" "%{DATA:url}" "%{DATA:status}" "%{DATA:referrer}" "%{DATA:userAgent}" "%{DATA:_xForwardedFor}"']
    }

    grok {
      match => ["[log][file][path]", "%{GREEDYDATA:_filepath}/%{DATA:filename}.log"]
    }

    date {
      match => ["_datetime", "dd/MMM/yyyy:HH:mm:ss Z"]
      target => "@timestamp"
      timezone => "Asia/Seoul"
    }

    if [_xForwardedFor] != "-" {
      mutate {
        replace => {
          "clientIp" => "%{[_xForwardedFor]}"
        }
      }
    }

    mutate {
      remove_field => [ "@version", "tags", "input", "host", "log", "fields", "ecs", "agent", "message", "_filepath", "_xForwardedFor", "_datetime" ]
    }
  }
}

output {
  elasticsearch {
    hosts => ["http://elasticsearch:9200"]
    index => "%{filename}-%{+YYYYMMdd}" # YYYYMMdd is formatted from the @timestamp value
  }
}
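The grok, date, and X-Forwarded-For handling above can be reproduced in Python; `%{DATA}` is roughly a non-greedy `.*?`. This is a sketch to make the pattern concrete (the user agent is shortened from the sample log), not the grok implementation itself:

```python
import re
from datetime import datetime

# Regex equivalent of the grok pattern above (%{DATA} ~ non-greedy .*?)
NGINX_PATTERN = re.compile(
    r'"(?P<datetime>.*?)" "(?P<clientIp>.*?)" "(?P<method>.*?)" "(?P<url>.*?)" '
    r'"(?P<status>.*?)" "(?P<referrer>.*?)" "(?P<userAgent>.*?)" "(?P<xForwardedFor>.*?)"'
)

line = ('"11/Feb/2024:15:54:07 +0900" "220.79.108.60" "GET" '
        '"/favicon.svg?v=20240210193906" "200" "https://unbelidictor.com/" '
        '"Mozilla/5.0" "-"')

event = NGINX_PATTERN.match(line).groupdict()
# date filter: "dd/MMM/yyyy:HH:mm:ss Z" corresponds to this strptime format
timestamp = datetime.strptime(event["datetime"], "%d/%b/%Y:%H:%M:%S %z")
# conditional mutate: prefer the forwarded address when the header is present
if event["xForwardedFor"] != "-":
    event["clientIp"] = event["xForwardedFor"]
print(event["clientIp"], event["status"])  # 220.79.108.60 200
```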

Storing Spring logs

Example log - /var/logs/demo-api-was-20240321.log

[2024-01-07 19:44:19:998] [ INFO] [http-nio-8080-exec-1] [o.s.web.servlet.DispatcherServlet       ] [local01] [] Completed initialization in 9 ms
[2024-01-07 19:44:20:001] [ INFO] [http-nio-8080-exec-1] [c.r.api.logging.AccessLoggingFilter     ] [local01] [TX_c32ae74e-3faf-4dc9-abe7-5d7361fd1109] [ACCESS_LOG:START]
[2024-01-07 19:44:20:024] [ERROR] [http-nio-8080-exec-1] [o.a.c.c.C.[.[.[/].[dispatcherServlet]   ] [local01] [TX_c32ae74e-3faf-4dc9-abe7-5d7361fd1109] Servlet.service() for servlet [dispatcherServlet] in context with path [] threw exception [Request processing failed: java.lang.IllegalStateException: for test] with root cause
java.lang.IllegalStateException: for test
	at com.demo.api.controller.CodeController.getCodes(CodeController.kt:14)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)

logstash.conf

  • For the example below, fields.type must be set in Filebeat
  • If the timezone is not set correctly, the times shown in Kibana may differ from the values in the log

input {
  beats {
    port => 5044
  }
}

filter {
  if [fields][type] == "was" {
    grok {
      match => ["message", "\[%{DATA:_logTime}\] \[%{DATA:logLevel}\] \[%{DATA:thread}\] \[%{DATA:className}\] \[%{DATA:machineId}\] \[%{DATA:txId}\] %{GREEDYDATA:logMessage}"]
    }

    grok {
      match => ["[log][file][path]", "%{GREEDYDATA:_filepath}/%{GREEDYDATA:filename}-%{DATA:_filedate}.log"]
    }

    date {
      match => ["_logTime", "yyyy-MM-dd HH:mm:ss:SSS"]
      target => "@timestamp"
      timezone => "Asia/Seoul"
    }

    mutate {
      strip => ["logLevel", "className"]
      remove_field => [ "@version", "tags", "input", "host", "log", "fields", "ecs", "agent", "message", "_filepath", "_filedate", "_logTime" ]
    }
  }
}

output {
  elasticsearch {
    hosts => ["http://elasticsearch:9200"]
    index => "%{filename}-%{+YYYYMMdd}" # YYYYMMdd is formatted from the @timestamp value
  }
}
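The Spring log grok, strip, and date steps can also be sketched in Python. Note the fixed-width padding inside the brackets (which `mutate { strip }` removes) and the unusual colon before milliseconds in the timestamp; this is an illustrative equivalent, not the actual filter code:

```python
import re
from datetime import datetime

# Regex equivalent of the grok pattern above
WAS_PATTERN = re.compile(
    r"\[(?P<logTime>.*?)\] \[(?P<logLevel>.*?)\] \[(?P<thread>.*?)\] "
    r"\[(?P<className>.*?)\] \[(?P<machineId>.*?)\] \[(?P<txId>.*?)\] "
    r"(?P<logMessage>.*)"
)

line = ("[2024-01-07 19:44:19:998] [ INFO] [http-nio-8080-exec-1] "
        "[o.s.web.servlet.DispatcherServlet       ] [local01] [] "
        "Completed initialization in 9 ms")

event = WAS_PATTERN.match(line).groupdict()
# mutate strip: trim the padding added by the fixed-width log layout
event["logLevel"] = event["logLevel"].strip()
event["className"] = event["className"].strip()
# date filter: "yyyy-MM-dd HH:mm:ss:SSS" -- note the colon before milliseconds
timestamp = datetime.strptime(event["logTime"], "%Y-%m-%d %H:%M:%S:%f")
print(event["logLevel"], event["className"])
```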

Extracting JSON data and storing it in Elasticsearch

Example log - /var/logs/demo-api-access-20240321.log

[2024-01-07 19:50:12:507] [ INFO] [http-nio-8080-exec-1] [access.logger                           ] [local01] [TX_68bb7182-e242-44bd-9b9f-ab2acbdc5716] {"status":null,"resTime":null,"method":"GET","url":"/api/v1/codes","parameters":{},"headers":{"host":"localhost:8080","connection":"keep-alive","cache-control":"max-age=0","sec-ch-ua":"\"Not_A Brand\";v=\"8\", \"Chromium\";v=\"120\", \"Google Chrome\";v=\"120\"","sec-ch-ua-mobile":"?0","sec-ch-ua-platform":"\"macOS\"","upgrade-insecure-requests":"1","user-agent":"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36","accept":"text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.7","sec-fetch-site":"none","sec-fetch-mode":"navigate","sec-fetch-user":"?1","sec-fetch-dest":"document","accept-encoding":"gzip, deflate, br","accept-language":"ko-KR,ko;q=0.9,en-US;q=0.8,en;q=0.7,ja;q=0.6,th;q=0.5","cookie":"Idea-21be4b45=7c67ac5c-0279-400a-bc66-5e9a1b5e2b06"},"body":""}
[2024-01-07 19:50:12:555] [ INFO] [http-nio-8080-exec-1] [access.logger                           ] [local01] [TX_68bb7182-e242-44bd-9b9f-ab2acbdc5716] {"status":200,"resTime":18,"method":"GET","url":"/api/v1/codes","parameters":{},"headers":{},"body":""}

logstash.conf

input {
  beats {
    port => 5044
  }
}

filter {
  if [fields][type] == "access" {
    grok {
      match => ["message", "\[%{DATA:_logTime}\] \[%{DATA:logLevel}\] \[%{DATA:thread}\] \[%{DATA:className}\] \[%{DATA:machineId}\] \[%{DATA:txId}\] %{GREEDYDATA:_logMessage}"]
    }

    grok {
      match => ["[log][file][path]", "%{GREEDYDATA:_filepath}/%{GREEDYDATA:filename}-%{DATA:_filedate}.log"]
    }

    json {
      source => "_logMessage" # parse _logMessage into an object
    }

    mutate {
      rename => {
        "headers" => "_headers"
        "parameters" => "_parameters"
      }

      add_field => {
        "headers" => "%{_headers}" # _headers (object) -> headers (string)
        "parameters" => "%{_parameters}" # _parameters (object) -> parameters (string)
      }
    }

    date {
      match => ["_logTime", "yyyy-MM-dd HH:mm:ss:SSS"]
      target => "@timestamp"
      timezone => "Asia/Seoul"
    }

    mutate {
      strip => ["logLevel", "className"]
      remove_field => [ "@version", "tags", "input", "host", "log", "fields", "ecs", "agent", "message", "_filepath", "_filedate", "_logTime", "_headers", "_parameters", "_logMessage" ]
    }
  }
}

output {
  elasticsearch {
    hosts => ["http://elasticsearch:9200"]
    index => "%{filename}-%{+YYYYMMdd}" # YYYYMMdd is formatted from the @timestamp value
  }
}
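The grok + json + rename/add_field combination above can be sketched in Python: the JSON payload is the GREEDYDATA tail after the bracketed fields, and the headers/parameters objects are flattened back to strings (which avoids indexing one ES field per arbitrary header key). A sketch of the equivalent transformation, using the second sample line:

```python
import json
import re

# Regex equivalent of the access-log grok pattern above
ACCESS_PATTERN = re.compile(
    r"\[(?P<logTime>.*?)\] \[(?P<logLevel>.*?)\] \[(?P<thread>.*?)\] "
    r"\[(?P<className>.*?)\] \[(?P<machineId>.*?)\] \[(?P<txId>.*?)\] "
    r"(?P<logMessage>.*)"
)

line = ('[2024-01-07 19:50:12:555] [ INFO] [http-nio-8080-exec-1] '
        '[access.logger                           ] [local01] '
        '[TX_68bb7182-e242-44bd-9b9f-ab2acbdc5716] '
        '{"status":200,"resTime":18,"method":"GET","url":"/api/v1/codes",'
        '"parameters":{},"headers":{},"body":""}')

event = ACCESS_PATTERN.match(line).groupdict()
# json filter: parse the message tail into an object and merge it into the event
payload = json.loads(event.pop("logMessage"))
event.update(payload)
# rename + add_field: keep headers/parameters as strings instead of objects
event["headers"] = json.dumps(payload["headers"])
event["parameters"] = json.dumps(payload["parameters"])
print(event["status"], event["resTime"], event["url"])  # 200 18 /api/v1/codes
```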

How to find a grok match pattern for the filter

  • https://grokdebug.herokuapp.com (Kibana's Dev Tools also includes a built-in Grok Debugger)
  • Input: paste a log line
  • Pattern: enter the pattern to test
    Example:
    \[%{DATA:logTime}\]\[%{DATA:logLevel}\]\[%{DATA:thread}\]\[%{DATA:className}:%{DATA:classLine}\] %{GREEDYDATA:logMessage}
