Our Server: TDC Tower

Sometimes filmmaking is VERY nerdy.

Meet our post-production server, TDC Tower. For us camera dorks, having 70 terabytes of online, always-available storage is critical when managing multiple post-production projects.

  • Parity-protected storage keeps projects safe. Any project in the works has at least three backups at any given time, and the array is built to survive losing a drive.
  • We also have a ZFS pool for ultra-fast editing, available to anyone in our office. It doubles as another backup if needed.
  • Both of these sets of storage are disconnected from the internet, ensuring secure, safe data. Bad actors on the outside can't even reach your footage.
  • On a completely different set of drives, we have our own managed online storage. We can upload files securely over our fiber connection and share them anywhere in the world with just a link.
  • All deliverables are not only backed up twice at our office, they also live on Dropbox's servers for the rest of our lives, ensuring easy access and redundancy.
  • Last, after delivery, every project is backed up to two drives, ready to be shipped to the client or held in archive at our office.

That’s the general overview, let’s nerd out.

We use software called Unraid to run our server, and below are the shares we use. Let's walk through the ones pertinent to post-production:

  • FCPX – this lives on an SSD and is central storage for our FCPX libraries. Yes, we use FCPX; here's a post on why and how we use it.
  • Project Assets – we call this our "warm storage." It holds all of the digital negatives from a production. The online, working project files may sit on a RAID 5 drive or on ZFS, but this share is our second backup while we work through projects. It's purely an array share, so it's slow to write to, but in exchange it's protected by double parity.
  • Server Cache – this lives on an SSD; it's where we export versions and store temporary files.
  • TDC-Backups – part of the array, and the place where we keep backups.
  • TDC-Stock – all of our stock footage, on the array and always available.
  • time-machine-backups – all of our computers are hooked into the server and upload their Time Machine backups here.
  • Tributaries-Public – another array share. It's where LUTs, logos, brand assets, and other post-production assets are stored.
  • ZFS – we love the new ZFS features in Unraid. This is basically one big, fast hard drive accessible by all editors. We can share footage and work on projects together in real time.
  • Shared Project Files – last, we have an unassigned 2TB drive that holds all project files such as Premiere files, music, graphics, VO, etc. It's an unassigned drive because of the overhead of the Unraid array system; keeping it outside the array makes it fast and always available.
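To make the layout concrete: in a typical Unraid setup each of these shares is exposed over the network via SMB, so an editor's machine mounts it like any other volume. Here's a sketch of mounting the ZFS share from a Mac workstation; the hostname `tdc-tower` and user `editor` are placeholders, not our actual names:

```shell
# Mount the ZFS share from the server on a Mac editing workstation.
# Hostname "tdc-tower" and user "editor" are illustrative placeholders.
mkdir -p /Volumes/ZFS
mount_smbfs //editor@tdc-tower/ZFS /Volumes/ZFS

# Confirm the share is reachable before opening a project from it
ls /Volumes/ZFS
```

In day-to-day use, editors would just connect through Finder (Cmd-K), which does the same thing under the hood.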

Automated Backups

Unraid is powerful software, and this is one of our favorite features: being able to run rclone commands on scheduled intervals via scripts is an incredible way to keep footage safe and keep projects efficiently backed up. Let's walk through our user scripts.

  • Every morning we back up our FCPX and Shared Project Files shares to Backblaze (an incredibly easy-to-use cloud storage provider and backup solution).
  • Every 4 hours we make a copy of FCPX and Shared Project Files to the Unraid array, just in case an SSD fails during editing. 10+ years of editing has taught us that no amount of paranoia is too much.
  • We use the server to make a monthly backup of this very website.
  • Last, weekly, we do a couple of things:
    • We copy all of our client delivery files to Backblaze. With this, those files exist in three places.
    • And then we back up a slew of important files to Backblaze to ensure they last forever.
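For a sense of what one of these user scripts can look like, here's a sketch of the morning backup. The rclone remote name `b2`, the bucket names, and the share paths are illustrative assumptions, not our literal config:

```shell
#!/bin/bash
# Daily user script: push the FCPX libraries and shared project files
# to Backblaze B2 with rclone. The "b2" remote is assumed to have been
# set up beforehand with `rclone config`; bucket names and share paths
# here are placeholders, not our actual ones.

rclone sync /mnt/user/FCPX b2:tdc-fcpx-backup \
  --fast-list --log-file /var/log/rclone-fcpx.log

rclone sync /mnt/disks/shared-projects b2:tdc-project-files \
  --fast-list --log-file /var/log/rclone-projects.log
```

In the User Scripts plugin, each script gets a cron-style schedule: something like `0 6 * * *` would run it every morning at 6 a.m., and `0 */4 * * *` would cover an every-4-hours copy.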