Configure AWS S3

Warning:

Local installations of Flow Production Tracking are no longer offered. This documentation is intended only for those with existing instances of Shotgun Enterprise Docker.

This guide is intended to help Flow Production Tracking Administrators set up Flow Production Tracking to use an AWS S3 bucket to store media.

Overview

Your Flow Production Tracking users need direct access to AWS S3. They will access S3 through a URL of the form https://sg-bucket.s3.us-west-2.amazonaws.com
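
You can verify that a workstation can reach the bucket endpoint with a quick request. This is a minimal sketch; sg-bucket and us-west-2 are the example values from the URL above, not values from your installation.

# Quick reachability check from a user workstation. Expect an HTTP status
# line back (typically 403 for an unauthenticated request); a DNS or
# connection error means S3 is not reachable from that network.
curl -I https://sg-bucket.s3.us-west-2.amazonaws.com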

Minimum versions for S3 support

Docker Containers

  • flow production tracking-app: 8.1.2.6
  • transcoder-worker: 10.3.0
  • transcoder-server: 6.2.0

Flow Production Tracking Integration / Apps

  • Flow Production Tracking Python API version: 3.0.37
  • RV Review: 7.3.1
  • iOS App: 2.0.0 

AWS S3 bucket requirements

Disclaimer:

You are solely responsible for securing your S3 bucket; without proper security, the integrity of your data is at risk. We very strongly recommend securing your S3 bucket properly.

See https://aws.amazon.com/premiumsupport/knowledge-center/secure-s3-resources/

CORS configuration

You need to configure a CORS policy on your new bucket. Replace flow-production-tracking.mystudio.test with your local Flow Production Tracking server URL.

 [  
    {  
        "AllowedHeaders": [  
            "*"  
        ],  
        "AllowedMethods": [  
            "GET",  
            "PUT",  
            "HEAD"  
        ],  
        "AllowedOrigins": [  
            "https://flow production tracking.mystudio.test"  
        ],  
        "ExposeHeaders": [  
            "ETag"  
        ],  
        "MaxAgeSeconds": 3000  
    }  
] 
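
If you manage the bucket with the AWS CLI rather than the console, a sketch along these lines can apply the policy above. The bucket name sg-bucket is a placeholder. Note that the CLI expects the rules wrapped in a top-level "CORSRules" key, unlike the console, which takes the bare array shown above.

# Write the CORS rules in the CLI's expected format, then apply them.
cat > cors.json <<'EOF'
{
  "CORSRules": [
    {
      "AllowedHeaders": ["*"],
      "AllowedMethods": ["GET", "PUT", "HEAD"],
      "AllowedOrigins": ["https://flow-production-tracking.mystudio.test"],
      "ExposeHeaders": ["ETag"],
      "MaxAgeSeconds": 3000
    }
  ]
}
EOF
aws s3api put-bucket-cors --bucket sg-bucket --cors-configuration file://cors.json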

Required AWS permissions on your bucket objects

  • s3:AbortMultipartUpload
  • s3:GetObject
  • s3:GetObjectAcl
  • s3:ListMultipartUploadParts
  • s3:PutObject
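
As a hedged sketch, these permissions could be granted with an inline IAM policy on the role or user the app uses. The role name flow-production-tracking-app-role, the policy name, and the bucket name sg-bucket below are placeholders, not values from your installation.

# Write a policy document granting the five permissions above on the
# bucket's objects, then attach it to the role the app container assumes.
cat > s3-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:AbortMultipartUpload",
        "s3:GetObject",
        "s3:GetObjectAcl",
        "s3:ListMultipartUploadParts",
        "s3:PutObject"
      ],
      "Resource": "arn:aws:s3:::sg-bucket/*"
    }
  ]
}
EOF
aws iam put-role-policy \
    --role-name flow-production-tracking-app-role \
    --policy-name flow-production-tracking-s3-access \
    --policy-document file://s3-policy.json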

Customizing docker-compose.yml

Add the following entries to the environment section of the app component. Replace s3-bucket with your bucket name and us-west-2 with your bucket's region.

GENERATE_SERVICES_YML: 'yes'  
S3_BUCKET_LIST: "s3-bucket:us-west-2" 

AWS Credentials

The Flow Production Tracking app container will automatically load the IAM instance role if it runs on an EC2 instance. You can also set the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables on the container.
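
Two quick ways to check which credentials the container will pick up; these are hedged sketches, assuming a standard EC2 metadata endpoint and the docker-compose setup described above.

# On the EC2 instance, list the IAM instance role available to containers.
curl http://169.254.169.254/latest/meta-data/iam/security-credentials/

# If you set the variables explicitly, confirm the container sees them.
sudo docker-compose run --rm app env | grep '^AWS_'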

Configuring the Flow Production Tracking site

Replace s3-bucket with your bucket name and s3-prefix with an S3 prefix where you want to store Flow Production Tracking media in your bucket.

 sudo docker-compose run --rm app rake 'admin:configure_s3[<s3-bucket>,<s3-prefix>]' 

Media will then be stored under s3://<s3-bucket>/<s3-prefix>/
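
For example, with a bucket named sg-bucket and a prefix of flow-media (both hypothetical values), media would be stored under s3://sg-bucket/flow-media/:

# Example invocation with the placeholders filled in.
sudo docker-compose run --rm app rake 'admin:configure_s3[sg-bucket,flow-media]'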

How to test

Go to your site and try to upload a new Version from the UI. We recommend opening the browser developer console, as all the interesting errors will appear there.
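
If uploads fail, one thing worth checking outside the browser is the CORS preflight response. A sketch with curl, using the example origin and bucket URL from earlier; the object key does not need to exist for the preflight to succeed:

# Simulate the browser's CORS preflight for an upload. The response should
# include an Access-Control-Allow-Origin header matching your site URL.
curl -i -X OPTIONS \
    -H 'Origin: https://flow-production-tracking.mystudio.test' \
    -H 'Access-Control-Request-Method: PUT' \
    https://sg-bucket.s3.us-west-2.amazonaws.com/test-object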

Uploading existing media to AWS S3

Since Flow Production Tracking v8.3.2.1, you can upload existing media from a local instance to an AWS S3 bucket.

The following settings must be set in the docker-compose.yml file for this script to work:

GENERATE_SERVICES_YML: 'yes'  
S3_BUCKET_LIST: "<bucket_name>:<aws_region_name>"  
AWS_ACCESS_KEY_ID: <aws_access_key>  
AWS_SECRET_ACCESS_KEY: <aws_secret_key> 

Example:

GENERATE_SERVICES_YML: 'yes'  
S3_BUCKET_LIST: "my_studio_name_bucket:us-west-2"  
AWS_ACCESS_KEY_ID: ABCdefgHijKlMnOpq  
AWS_SECRET_ACCESS_KEY: ZYXWVUTSRQPONMLK  

Upload thumbnails:

sudo docker-compose run --rm app script/admin/s3/upload_files_to_s3.rb -d <bucket_name> --entity thumbnail 

Upload attachments:

sudo docker-compose run --rm app script/admin/s3/upload_files_to_s3.rb -d <bucket_name> --entity attachment 

This job will slowly upload the files to the AWS S3 bucket; your Flow Production Tracking site media will remain available while it's running. The job can run for a few hours and can be interrupted and resumed later.
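
Because the job can run for hours over a remote session, one hedged approach is to run it inside a terminal multiplexer such as tmux so it survives a dropped connection. The session name s3upload is arbitrary.

# Start the upload in a detached tmux session; reattach later with
# `tmux attach -t s3upload`. Run as root (or with passwordless sudo),
# since a detached session cannot answer a sudo password prompt.
tmux new-session -d -s s3upload \
    "docker-compose run --rm app script/admin/s3/upload_files_to_s3.rb -d <bucket_name> --entity thumbnail"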

FAQ

Can media stored on an NFS or AWS EFS coexist with media stored on AWS S3?

Yes, as long as EFS and S3 are properly configured so that Flow Production Tracking can read media from them. The path to each media file is stored in the database, which allows reading media and attachments from multiple locations at a time.

If AWS S3 is configured, new media will be stored on S3 only.
