KrMayank02/Publish-App-Logs_ELK-Stack


Publishing Application Logs Directly to ELK Stack (Elasticsearch, Logstash, and Kibana) & Filebeat

Objective: Automate application log monitoring across environments using the ELK stack and a Node.js application. This streamlines log collection, processing, and visualization with Elasticsearch, Logstash, and Kibana. The goal is to create dashboards and alerts that enable faster, higher-quality software delivery.

Real-time Scenario: General Insurance, a leading global insurance provider based in the US, offers products such as home, health, car, and life insurance. The company is transitioning to a DevOps architecture and aims to automate continuous monitoring across its environments. To achieve this, it has adopted the ELK stack as its application monitoring tool, with Logstash collecting and processing application logs. To support the microservices architecture running on Docker containers, logs are sent directly to the ELK stack. By combining the ELK stack with Docker, the company aims to provide continuous feedback to developers, speeding up software delivery, improving quality, and shortening the feedback loop between developers and testers.


Major Tools and Environment Used in This Project:

  • Node.js App
  • Elasticsearch
  • Kibana
  • Logstash
  • Filebeat
  • Docker
  • Docker Compose
  • AWS EC2 Instance: Ubuntu 22.04, 8 GB RAM, 2 vCPU (as host machine)
  • Java: OpenJDK 11
  • GitHub repo for the app source code

High Level Project Diagram:

ELK Architecture with the Node.js App

(architecture diagram screenshot)

High Level Tasks/Steps:

  • Install prerequisites: Java (JDK), Docker, Docker Compose
  • Install and configure Elasticsearch
  • Install and configure Logstash
  • Install and configure Kibana
  • Install and configure Filebeat
  • Clone the GitHub repo with the Node.js app source code
  • Prepare a Dockerfile for the Node.js app
  • Prepare a docker-compose.yaml file to configure and run the app
  • Verify the Node.js app logs at:
    • the container's local logs path
    • the host/server volume logs path
    • the Kibana GUI
  • Create a Kibana dashboard for log data visualization and analysis

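The prerequisite installs on Ubuntu 22.04 can be sketched roughly as follows (package names assumed from the stock Ubuntu repositories; adjust to your setup):

```shell
# Install Java (OpenJDK 11), Docker, and Docker Compose on Ubuntu 22.04.
sudo apt-get update
sudo apt-get install -y openjdk-11-jdk docker.io docker-compose

# Start Docker and enable it on boot.
sudo systemctl enable --now docker

# Sanity checks.
java -version
docker --version
docker-compose --version
```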
Output Result Screenshots:

Verify whether the Elasticsearch service is running by sending an HTTP request:

(screenshot)
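A typical check, assuming Elasticsearch is listening on its default port 9200 on the host:

```shell
# A JSON response with the cluster name and version number
# indicates the Elasticsearch service is up.
curl -X GET "http://localhost:9200/?pretty"
```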

Logstash:

(screenshot)
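For reference, a minimal pipeline along these lines would fit this setup: a Beats input feeding Elasticsearch. The port, hosts, and index pattern here are assumptions, not copied from the repo:

```conf
input {
  beats {
    port => 5044          # Filebeat ships events to this port
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "filebeat-%{+YYYY.MM.dd}"   # daily filebeat-* indices
  }
}
```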

Access the Kibana GUI in a browser at http://100.24.35.222:5601/

(screenshot)

Filebeat inputs:

(screenshots)
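A Filebeat input sketch for reference (the log path mirrors the host volume path used later in this walkthrough; the output target is an assumption):

```yaml
# filebeat.yml (excerpt) -- hypothetical values
filebeat.inputs:
  - type: log
    enabled: true
    paths:
      - /home/ubuntu/elk-project/nodejs-logs/*.log

# Ship events to Logstash on its Beats port.
output.logstash:
  hosts: ["localhost:5044"]
```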

The command below shows that Elasticsearch is loading multiple logs under the index pattern "filebeat-*":

(screenshot)
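The same check can be run with the _cat API, assuming the default host and port:

```shell
# List all filebeat-* indices with document counts and sizes.
curl -X GET "http://localhost:9200/_cat/indices/filebeat-*?v"
```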

Check the instructions in the Dockerfile:

```shell
vi Dockerfile
```

(screenshot)
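A Dockerfile along these lines would do the job (the base image, file names, and port are assumptions, not copied from the repo):

```dockerfile
FROM node:18-alpine
WORKDIR /usr/src/app

# Install dependencies first so Docker can cache this layer.
COPY package*.json ./
RUN npm install

# Copy the application source and expose the app port.
COPY . .
EXPOSE 8080
CMD ["node", "app.js"]
```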

Check the instructions in docker-compose.yml:

```shell
vi docker-compose.yml
```

(screenshot)
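A docker-compose.yml sketch (the service name and container log path are assumptions; the host path matches the volume path verified later in this walkthrough):

```yaml
version: "3.8"
services:
  nodejs-app:
    build: .
    ports:
      - "8080:8080"
    volumes:
      # Map the container's log directory to the host so Filebeat
      # can pick up app.log from the host path.
      - /home/ubuntu/elk-project/nodejs-logs:/usr/src/app/logs
```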

Let's hit the app URLs in the browser to generate application-specific logs:

http://100.24.35.222:8080

http://100.24.35.222:8080/post

(screenshots)

Both of these URLs have been hit 9 times each, so logs will be generated for those 9 hits.
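The same traffic can also be generated from the command line, e.g.:

```shell
# Hit both endpoints 9 times each to produce log entries.
for i in $(seq 1 9); do
  curl -s http://100.24.35.222:8080/ > /dev/null
  curl -s http://100.24.35.222:8080/post > /dev/null
done
```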


Let's verify the Node.js app logs inside the Docker container:

(screenshot)
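One way to run that check, assuming the container is named nodejs-app and logs to /usr/src/app/logs/app.log (both assumptions):

```shell
# Find the running container, then read the log file inside it.
docker ps
docker exec -it nodejs-app cat /usr/src/app/logs/app.log
```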

Hence, for 9 hits on the app URL in the browser, 9 log entries have been generated with the correct message inside the container.


Let's check the app logs on the host volume path mapped to the container path:

```shell
cd /home/ubuntu/elk-project/nodejs-logs/
ls
cat app.log
```

(screenshot)

A long list of system logs and application logs is displayed in the Kibana GUI under the index pattern "filebeat-*":

(screenshots)
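The document count behind those Kibana results can be spot-checked from the command line (default host and port assumed):

```shell
# Count all documents across the filebeat-* indices.
curl -X GET "http://localhost:9200/filebeat-*/_count?pretty"
```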

Create a dashboard and modify the visualization type as required; here we have chosen three visualization types: Pie, Horizontal Bar, and Metric.

(screenshots)

Now the objective of the project has been met: log collection, processing, and visualization are streamlined with Elasticsearch, Logstash, Kibana, and Filebeat for the Node.js application. Hence, the project is complete!

About

This project automates application log monitoring across environments using the ELK stack and a Node.js application in a microservices architecture running on Docker containers. It streamlines log collection, processing, and visualization with Elasticsearch, Logstash, Kibana, and Filebeat, resulting in higher-quality software delivery.
