
Set up Kubernetes logging environment


Introduction

When running multiple services and applications on a Kubernetes cluster, a centralized, cluster-level logging stack helps you quickly sort through and analyze the heavy volume of log data produced by your Pods. We chose one popular centralized logging solution: the Elasticsearch, Fluentd, and Kibana (EFK) stack.

Architecture

  • Elasticsearch: Elasticsearch is a distributed, RESTful search and analytics engine. In the EFK stack, we use Elasticsearch to store and search the logs forwarded by Fluentd.
  • Fluentd: Fluentd is an open source data collector that lets you unify data collection. It can run directly on the host or in a Docker container. Here we use a DaemonSet to ensure that Fluentd runs on every node (see the sketch after this list).
  • Kibana: Once some logs are in Elasticsearch, you can add a tool such as Kibana for exploring and analyzing them. Kibana lets you visualize your Elasticsearch data and navigate the Elastic Stack.
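
For reference, a Fluentd DaemonSet is typically declared along the lines below. This is a minimal sketch only: the image, namespace, and Elasticsearch service name are assumptions for illustration, not the values used by the manifests shipped with this repository.

$ kubectl apply -f - <<EOF
apiVersion: apps/v1
kind: DaemonSet
metadata:
  name: fluentd            # hypothetical name, for illustration only
  namespace: kube-system   # assumed namespace
spec:
  selector:
    matchLabels:
      app: fluentd
  template:
    metadata:
      labels:
        app: fluentd
    spec:
      containers:
      - name: fluentd
        image: fluent/fluentd-kubernetes-daemonset:v1-debian-elasticsearch  # assumed image
        env:
        - name: FLUENT_ELASTICSEARCH_HOST   # points Fluentd at the Elasticsearch service
          value: "elasticsearch"            # assumed service name
        - name: FLUENT_ELASTICSEARCH_PORT
          value: "9200"
        volumeMounts:
        - name: varlog
          mountPath: /var/log               # host log files that Fluentd tails
      volumes:
      - name: varlog
        hostPath:
          path: /var/log
EOF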

Starting Kubernetes logging

On the Kubernetes master node, run the commands below to start Kubernetes logging:

$ cd deployment/kubernetes/logging
$ ./start_logging.sh

Note: This script must be run as root
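
After the script finishes, you can verify that the Elasticsearch, Fluentd, and Kibana Pods are up. The commands below are a quick sanity check; depending on the deployment, the Pods may live in a different namespace.

$ kubectl get pods --all-namespaces | grep -E 'elasticsearch|fluentd|kibana'
$ kubectl get daemonset --all-namespaces | grep fluentd

All Pods should reach the Running state before you open Kibana.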

Accessing Kibana and Exploring Log Data

Visit https://<CDN-Transcode Server IP address>:5601 in your web browser. You should see the Kibana login page.

1. View logs: Management --> Create index pattern --> Discover

Select the "Management" in the left menu bar, click "Index patterns" button, start to create index patterns.

Select the "Discover" in the left menu bar, see the logs.

Select the "Filters", input "kubernetes.pod_name : live-transcode-service-xxxx", click "Update" to filter logs of "live-transcode-service-xxxx" pod.

You can inspect the detailed log entries in the filtered list.
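
The same Pod's output can also be cross-checked from the command line. The Pod name below is a placeholder; substitute the real name reported by kubectl get pods.

$ kubectl get pods | grep live-transcode-service
$ kubectl logs <live-transcode-service Pod name>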


2. Save logs: Discover --> Save --> Share

On the Discover page, click the "Save" button to save the search, then click the "Share" button to create a report.

You can download the generated reports from Management --> Kibana --> Reporting.


3. Create visualizations: Visualize --> Create new visualization --> Area --> Source

Select the "Visualize" in the menu bar, start to create your visualization. Click "Area", choose a source, configure the coordinate axis parameters and run it.


4. Elasticsearch

You can use "Dev Tools" to interact with Elasticsearch directly.
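
For example, the requests below (shown as curl equivalents of what you would type into the Dev Tools console) check cluster health and search one Pod's logs. The index pattern assumes Fluentd's default naming; kubernetes.pod_name is the same field used for filtering above.

$ curl -XGET 'http://<Elasticsearch host IP address>:9200/_cluster/health?pretty'
$ curl -XGET 'http://<Elasticsearch host IP address>:9200/logstash-*/_search?pretty' \
    -H 'Content-Type: application/json' -d '
{
  "query": {
    "match": { "kubernetes.pod_name": "live-transcode-service" }
  }
}'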

