Configure Unified Logging for Flow Production Tracking

Warning:

Local installations of Flow Production Tracking are no longer offered. This documentation is intended only for those with existing instances of Flow Production Tracking Enterprise Docker. Click here for a list of our current offerings.

This solution uses Fluentd as the data collector between the Flow Production Tracking application and the Elasticsearch database.

Getting started

First, you will need to build the Fluentd image and the Kibana image using the following commands.

Enterprise Unified Logging GitHub

# Clone the example repository found on GitHub
git clone https://github.com/shotgunsoftware/enterprise-unified-logging.git /opt/shotgun/enterprise-unified-logging  

# Build the container  
cd /opt/shotgun/enterprise-unified-logging/  
sudo docker-compose build

Then start Fluentd, Elasticsearch, and Kibana:

 sudo docker-compose up -d  
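If you want to confirm that the logging containers came up, you can list them with Docker Compose. The service names shown will match whatever is defined in the repository's docker-compose.yml:

# Check that the Fluentd, Elasticsearch, and Kibana containers are running
sudo docker-compose ps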

Note that some servers with low memory may have an issue with starting up Elasticsearch. If that is the case, you may see this message:

max virtual memory areas vm.max_map_count [65530] is too low, increase to at least [262144]

Use the following to increase the limit:

 sudo sysctl -w vm.max_map_count=262144 
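Note that sysctl -w only changes the value until the next reboot. If you want the setting to persist, one common approach is to append it to /etc/sysctl.conf (or a file under /etc/sysctl.d/) and reload:

# Persist the setting across reboots
echo "vm.max_map_count=262144" | sudo tee -a /etc/sysctl.conf
sudo sysctl -p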

Next, you will need to change the Flow Production Tracking application logging driver in its docker-compose.yml file from json-file to fluentd. Locate the logging section in the file and replace it with:

# fluentd
logging:
  driver: "fluentd"
  options:
    fluentd-address: "127.0.0.1:24224"
    tag: "sg.app.{{.ID}}"

Finally, you will need to restart the Flow Production Tracking application container:

sudo docker-compose up -d app
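To verify that logs are flowing from the application container through Fluentd into Elasticsearch, you can list the indices over the Elasticsearch HTTP API. This assumes Elasticsearch is reachable on its default port 9200 on the host; adjust the host and port to match your docker-compose.yml.

# A shotgun_logs-* index should appear once log events start arriving
curl -s 'http://localhost:9200/_cat/indices?v'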

How to access logs

Kibana

Kibana GitHub

Logs can be accessed via Kibana at http://localhost:5601/

From there you can create your indexes (for example, shotgun_logs-*, which is already created by default) and then query Elasticsearch.
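You can also query Elasticsearch directly, outside of Kibana. For example, a simple URI search against the shotgun_logs-* indices (again assuming Elasticsearch is exposed on localhost:9200; the log field is the one populated by Docker's fluentd logging driver):

# Return up to five events whose log field contains "error"
curl -s 'http://localhost:9200/shotgun_logs-*/_search?q=log:error&size=5&pretty'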

Saved Objects

Saved objects can be provisioned by default by modifying the appropriate JSON file in the kibana/files_docker/provisioning/ directory.

Log files

Logs are also available (though not enabled by default) as JSON files in the logs/ directory.

Fluentd

Fluentd GitHub

Config

Fluentd Config GitHub

All the configuration takes place in the fluentd/files_docker/fluent.conf file.
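The shape of that file follows standard Fluentd configuration syntax: a forward source that receives events from the Docker fluentd logging driver, and a match block that routes them to Elasticsearch. The sketch below is only illustrative of that structure (the host name, index prefix, and other parameters are assumptions); the fluent.conf in the repository is the source of truth.

# Receive events from the Docker fluentd logging driver
<source>
  @type forward
  port 24224
  bind 0.0.0.0
</source>

# Route application events (tagged sg.app.*) to Elasticsearch
<match sg.app.**>
  @type elasticsearch
  host elasticsearch
  port 9200
  logstash_format true
  logstash_prefix shotgun_logs
</match>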

Documentation

Plugin

For Elasticsearch, we use the fluent-plugin-elasticsearch plugin.

To install additional plugins, see the Dockerfile at fluentd/Dockerfile.
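For reference, installing a Fluentd plugin generally comes down to a gem install step in that Dockerfile. The snippet below is only a sketch; the base image tag is a placeholder and may not match what the repository actually uses.

# Example only: add plugins on top of an official Fluentd base image
FROM fluent/fluentd:v1.16-1
USER root
# fluent-plugin-elasticsearch provides the Elasticsearch output;
# install any additional plugins the same way
RUN gem install fluent-plugin-elasticsearch --no-document
USER fluent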

Access for support team

The ELK stack contains a lot of information that is useful to the Flow Production Tracking Support team when troubleshooting your site, so it is helpful to give the Support team access to it. To do this, make sure that the firewall rules of the server hosting the enterprise-elk stack allow incoming HTTP connections on port 5601.

Allow this traffic only from your company network and the Flow Production Tracking Client VPN network.
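How you enforce that restriction depends on your environment. With plain iptables it could look like the following, where the two source ranges are placeholders for your actual company network and the Client VPN range:

# Allow Kibana (port 5601) only from trusted networks, drop everything else
sudo iptables -A INPUT -p tcp -s 10.0.0.0/8 --dport 5601 -j ACCEPT
sudo iptables -A INPUT -p tcp -s 203.0.113.0/24 --dport 5601 -j ACCEPT
sudo iptables -A INPUT -p tcp --dport 5601 -j DROP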