Docker logging drivers
Driver | Description |
---|---|
`none` | No logs are available for the container and `docker logs` does not return any output. |
`local` | Logs are stored in a custom format designed for minimal overhead. |
`json-file` | The logs are formatted as JSON. The default logging driver for Docker. |
`syslog` | Writes logging messages to the syslog facility. The syslog daemon must be running on the host machine. |
`journald` | Writes log messages to journald. The journald daemon must be running on the host machine. |
`gelf` | Writes log messages to a Graylog Extended Log Format (GELF) endpoint such as Graylog or Logstash. |
`fluentd` | Writes log messages to fluentd (forward input). The fluentd daemon must be running on the host machine. |
`awslogs` | Writes log messages to Amazon CloudWatch Logs. |
`splunk` | Writes log messages to Splunk using the HTTP Event Collector. |
`etwlogs` | Writes log messages as Event Tracing for Windows (ETW) events. Only available on Windows platforms. |
`gcplogs` | Writes log messages to Google Cloud Platform (GCP) Logging. |
`logentries` | Writes log messages to Rapid7 Logentries. |
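The default `json-file` driver writes one JSON object per line, with `log`, `stream`, and `time` fields, to a file under `/var/lib/docker/containers/`. A minimal sketch of parsing such a line (the sample record below is illustrative, not captured output):

```python
import json

# One line as written by the json-file driver; the content is a made-up sample.
sample = '{"log":"GET / HTTP/1.1\\n","stream":"stdout","time":"2020-11-20T11:26:46.000000000Z"}'

def parse_json_file_line(line: str) -> dict:
    """Split a single json-file driver log line into its three fields."""
    entry = json.loads(line)
    return {
        "message": entry["log"].rstrip("\n"),  # raw container output (newline included on disk)
        "stream": entry["stream"],             # "stdout" or "stderr"
        "time": entry["time"],                 # RFC 3339 timestamp with nanoseconds
    }

parsed = parse_json_file_line(sample)
print(parsed["message"], parsed["stream"])  # GET / HTTP/1.1 stdout
```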
Collecting logs with Fluentd

Create the configuration file test.conf:

```
<source>
  @type forward
</source>
<match *>
  @type stdout
</match>
```

Start Fluentd:

```shell
docker run -it -p 24224:24224 -v /path/to/conf/test.conf:/fluentd/etc/test.conf -e FLUENTD_CONF=test.conf fluent/fluentd:latest
```

Start a container that sends its logs to the default address, localhost:24224:

```shell
docker run --log-driver=fluentd -p 8998:80 nginx
```

Or send logs to a remote Fluentd host:

```shell
docker run --log-driver=fluentd --log-opt fluentd-address=fluentdhost:24224 nginx
```
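Under the hood, the `fluentd` driver speaks Fluentd's forward protocol: each event travels over TCP port 24224 as a MessagePack-encoded `[tag, time, record]` array. A hand-rolled encoder sketch covering only the short strings and small maps used here (`build_forward_event` is a hypothetical helper for illustration; a real client would use the msgpack library):

```python
import struct

def _pack_str(s: str) -> bytes:
    """MessagePack fixstr: only valid for strings shorter than 32 bytes."""
    data = s.encode("utf-8")
    assert len(data) < 32
    return bytes([0xA0 | len(data)]) + data

def build_forward_event(tag: str, timestamp: int, record: dict) -> bytes:
    """Encode one [tag, time, record] event as a MessagePack fixarray of 3."""
    assert len(record) < 16            # fixmap holds at most 15 pairs
    out = b"\x93"                      # fixarray header, 3 elements
    out += _pack_str(tag)              # event tag, e.g. set via --log-opt tag
    out += b"\xce" + struct.pack(">I", timestamp)  # uint32 epoch seconds
    out += bytes([0x80 | len(record)])  # fixmap header
    for key, value in record.items():
        out += _pack_str(key) + _pack_str(value)
    return out

msg = build_forward_event("httpd.access", 1605871606, {"log": "GET /"})
print(msg.hex())
```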
Running ElasticSearch + Kibana + Fluentd together

Step 1: Create docker-compose.yml

```yaml
version: '3'
services:
  web:
    image: httpd
    ports:
      - "8080:80"
    links:
      - fluentd
    logging:
      driver: "fluentd"
      options:
        fluentd-address: localhost:24224
        tag: httpd.access
  fluentd:
    build: ./fluentd
    volumes:
      - ./fluentd/conf:/fluentd/etc
    links:
      - "elasticsearch"
    ports:
      - "24224:24224"
      - "24224:24224/udp"
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.2.0
    environment:
      - "discovery.type=single-node"
    expose:
      - "9200"
    ports:
      - "9200:9200"
  kibana:
    image: kibana:7.2.0
    links:
      - "elasticsearch"
    ports:
      - "5601:5601"
```
Step 2: Build a Fluentd image with the Elasticsearch plugin

```dockerfile
# fluentd/Dockerfile
FROM fluent/fluentd:v1.6-debian-1
USER root
RUN ["gem", "install", "fluent-plugin-elasticsearch", "--no-document", "--version", "3.5.2"]
USER fluent
```

```
# fluentd/conf/fluent.conf
<source>
  @type forward
  port 24224
  bind 0.0.0.0
</source>
<match *.**>
  @type copy
  <store>
    @type elasticsearch
    host elasticsearch
    port 9200
    logstash_format true
    logstash_prefix fluentd
    logstash_dateformat %Y%m%d
    include_tag_key true
    type_name access_log
    tag_key @log_name
    flush_interval 1s
  </store>
  <store>
    @type stdout
  </store>
</match>
```
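With `logstash_format true`, the Elasticsearch plugin writes each event into an index named `<logstash_prefix>-<date>`, where the date part follows `logstash_dateformat`, so the configuration above produces daily indices such as `fluentd-20201120`. A sketch of that naming rule (`index_name` is a hypothetical helper, not part of the plugin):

```python
from datetime import datetime, timezone

def index_name(prefix: str, dateformat: str, event_time: datetime) -> str:
    """Mimic logstash_format index naming: <prefix>-<formatted event date>."""
    return f"{prefix}-{event_time.strftime(dateformat)}"

# Values taken from the fluent.conf above; the timestamp is illustrative.
ts = datetime(2020, 11, 20, 19, 26, 46, tzinfo=timezone.utc)
print(index_name("fluentd", "%Y%m%d", ts))  # fluentd-20201120
```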
Step 3: Start the service containers

```shell
docker-compose up
```

```
wayne@MacBook-Pro ~ % docker ps
CONTAINER ID   IMAGE                                                  COMMAND                  CREATED       STATUS       PORTS                                                          NAMES
26cb6f300fc2   httpd                                                  "httpd-foreground"       3 hours ago   Up 3 hours   0.0.0.0:8080->80/tcp                                           efk_web_1
17d790f12bfc   efk_fluentd                                            "tini -- /bin/entryp…"   3 hours ago   Up 3 hours   5140/tcp, 0.0.0.0:24224->24224/tcp, 0.0.0.0:24224->24224/udp   efk_fluentd_1
7e000240c5f7   kibana:7.2.0                                           "/usr/local/bin/kiba…"   3 hours ago   Up 3 hours   0.0.0.0:5601->5601/tcp                                         efk_kibana_1
a6fc9de04f6a   docker.elastic.co/elasticsearch/elasticsearch:7.2.0   "/usr/local/bin/dock…"   3 hours ago   Up 3 hours   0.0.0.0:9200->9200/tcp, 9300/tcp                               efk_elasticsearch_1
```
Step 4: View the logs in Kibana

Visit http://127.0.0.1:8080/ a few times to generate access logs from the httpd container, then open Kibana, create an index pattern matching fluentd-*, and view the logs.