Add the docker elk tutorial
parent
c551446301
commit
4aad7286fe
@@ -2,3 +2,5 @@
 output.txt
 terraform.tfstate
 terraform.tfstate.backup
+docker-elk/elasticsearch/data-sandbox/
+docker-elk/elasticsearch/data-full-stack/
@@ -0,0 +1,2 @@
+/elasticsearch/data-sandbox/
+/elasticsearch/data-full-stack/
@@ -0,0 +1,76 @@
Run kibana and elasticsearch, sending data coming from nginx logs
=================================================================

Setup
-----

1. Clone the repository and move into the tutorial directory:

    ```bash
    git clone https://github.com/Ovski4/tutorials.git
    cd tutorials/docker-elk
    ```

2. Create the data volume with the right permissions:

    ```bash
    docker-compose -f docker-compose-full-stack.yml run elasticsearch chown elasticsearch -R /usr/share/elasticsearch/data
    ```

3. Launch all containers:

    ```bash
    docker-compose -f docker-compose-full-stack.yml up -d
    ```

4. Browse `http://localhost:5601/`. You might have to wait a few minutes while Kibana sets things up. You can then click on '**Explore on my own**'.

5. Then browse `http://localhost:8085/`. The HTTP request will trigger some logs to be sent to Elasticsearch.

    ![Screenshot nginx page](nginx-page.png "Screenshot nginx page")

6. Come back to Kibana at `http://localhost:5601/`. In the left panel, click on **Discover** under the **Kibana** section and create a new **index pattern**. You should see the filebeat index appear in the select box. In the **Index pattern name** text field, type **filebeat-***.

    ![First step of the kibana index creation](kibana-index-creation-step-1.png "First step of the kibana index creation")

7. In the following page, select `@timestamp` and click `Create index pattern`.

    ![Second step of the kibana index creation](kibana-index-creation-step-2.png "Second step of the kibana index creation")
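As a quick sanity check, you can also ask Elasticsearch directly whether Filebeat has started shipping data. This is a sketch assuming the default ports from the compose file and the default `filebeat-*` index naming:

```bash
# Is Elasticsearch healthy?
curl -s 'http://localhost:9200/_cluster/health?pretty'

# Has Filebeat created an index yet? (run after hitting http://localhost:8085/)
curl -s 'http://localhost:9200/_cat/indices/filebeat-*?v'
```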

Go to the discover page at `http://localhost:5601/app/discover#/`.

That's it, you should see some data. You might have to update the date filter located at the top right of the page if nothing shows up.

You can now create visualizations with Kibana.

> Follow the next instructions to have a look at a Kibana dashboard and some visualizations.

Run kibana and elasticsearch with existing data
===============================================

This quick setup can be very useful as a sandbox. It comes with data fetched from my personal blog.

Setup
-----

Stop the containers from the first part if needed:

```bash
docker-compose -f docker-compose-full-stack.yml down
```

Extract the data into the directory that will be bind-mounted:

```bash
unzip data-sandbox.zip -d ./elasticsearch/
```

Launch the containers:

```bash
docker-compose -f docker-compose-sandbox.yml up -d
```

Browse `http://localhost:5601/` and have a look at the dashboard at `http://localhost:5601/app/kibana#/dashboards`.

![Kibana dashboard](kibana-dashboard.png "Kibana dashboard")
Binary file not shown.
@@ -0,0 +1,36 @@
version: '3.7'

volumes:

    elasticsearch_data:

services:

    nginx:
        image: nginx:1.18.0-alpine
        ports:
            - 8085:80

    elasticsearch:
        image: docker.elastic.co/elasticsearch/elasticsearch:7.10.1
        environment:
            ES_JAVA_OPTS: "-Xmx256m -Xms256m"
            discovery.type: single-node
        volumes:
            - ./elasticsearch/data-full-stack:/usr/share/elasticsearch/data
            - ./elasticsearch/config/elasticsearch.yml:/usr/share/elasticsearch/config/elasticsearch.yml

    kibana:
        image: docker.elastic.co/kibana/kibana:7.10.1
        volumes:
            - ./kibana/config/kibana.yml:/usr/share/kibana/config/kibana.yml
        ports:
            - 5601:5601

    filebeat:
        image: docker.elastic.co/beats/filebeat:7.10.1
        user: root
        volumes:
            - /var/run/docker.sock:/var/run/docker.sock:ro
            - /var/lib/docker/containers:/var/lib/docker/containers:ro
            - ./filebeat/filebeat.docker.yml:/usr/share/filebeat/filebeat.yml
@@ -0,0 +1,25 @@
version: '3.7'

volumes:

    elasticsearch_data:

services:

    elasticsearch:
        image: docker.elastic.co/elasticsearch/elasticsearch:7.10.1
        environment:
            ES_JAVA_OPTS: "-Xmx256m -Xms256m"
            discovery.type: single-node
        volumes:
            - ./elasticsearch/data-sandbox:/usr/share/elasticsearch/data
            - ./elasticsearch/config/elasticsearch.yml:/usr/share/elasticsearch/config/elasticsearch.yml

    kibana:
        image: docker.elastic.co/kibana/kibana:7.10.1
        volumes:
            - ./kibana/config/kibana.yml:/usr/share/kibana/config/kibana.yml
        ports:
            - 5601:5601
        depends_on:
            - elasticsearch
@@ -0,0 +1,5 @@
---

cluster.name: "docker-cluster"
network.host: 0.0.0.0
xpack.license.self_generated.type: basic
@@ -0,0 +1,26 @@
filebeat.autodiscover:
  providers:
    - type: docker
      templates:
        - condition:
            contains:
              docker.container.image: nginx
          config:
            - module: nginx
              access:
                enabled: true
                input:
                  type: container
                  stream: stdout
                  paths:
                    - '/var/lib/docker/containers/${data.docker.container.id}/*.log'
              error:
                enabled: true
                input:
                  type: container
                  stream: stderr
                  paths:
                    - '/var/lib/docker/containers/${data.docker.container.id}/*.log'

output.elasticsearch:
  hosts: 'elasticsearch:9200'
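To see what the container input above actually reads: Docker writes each container's stdout/stderr as JSON lines under `/var/lib/docker/containers/<id>/`. A minimal sketch (the log line below is a made-up sample, not real output) of pulling out the `stream` field that the access/error split keys on:

```bash
# A Docker JSON log line (illustrative sample):
line='{"log":"172.18.0.1 - - [01/Jan/2021:00:00:00 +0000] \"GET / HTTP/1.1\" 200 612","stream":"stdout","time":"2021-01-01T00:00:00Z"}'

# Extract the "stream" field, which the config above uses to route
# stdout lines to the access fileset and stderr lines to the error fileset:
echo "$line" | sed -n 's/.*"stream":"\([^"]*\)".*/\1/p'   # prints: stdout
```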
Binary file not shown. Size: 257 KiB
Binary file not shown. Size: 90 KiB
Binary file not shown. Size: 85 KiB
@@ -0,0 +1,5 @@
---

server.name: kibana
server.host: "0"
elasticsearch.hosts: [ "http://elasticsearch:9200" ]
Binary file not shown. Size: 21 KiB