"Conveyoring" files for backup (CloudBerry Backup with pre/post scripts)

Working with customers is always a great experience and one of the best ways to learn new things. One of the cases I worked on as a CloudBerry Lab solutions architect had interesting requirements. The customer wanted to offload all of their video data (surveillance content exported from their cameras) to Amazon S3 with the Infrequent Access storage class. That part is simple; the interesting part is that the amount of data depended on the time of day, and all of the files were exported by a third-party tool to a certain directory on the server running the cloud backup tool. You never knew their total size, and while you could see the file names, you couldn't touch the files until they grew over a certain size. That last observation suggested a good strategy, and the following iteration emerged as an acceptable solution:

  • Files are generated by the video exporter tool (on another server) and copied over the network to a specific directory (Export) on the backup server;
  • A Windows script, triggered by a pre-plan action in CBL Backup, scans the Export folder and moves any file larger than a certain size (controlled by the script) to another folder (Backup);
  • The backup tool (CBL Backup) offloads the data to an S3 bucket and sets the object class to IA (Infrequent Access), roughly halving the storage price compared to the Standard class;
  • After a successful upload, another script (triggered by a post-plan action) cleans up the source folder (Backup) and waits for its next run, where the carousel above starts its circle again.
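The whole carousel above can be sketched end to end. This is a minimal illustration, not the actual implementation: Python stands in for the Windows scripts, the `run_cycle` name and the `upload` callable (representing the CBL Backup plan and S3 offload) are my own, and the 1 GB threshold mirrors the script shown later.

```python
import shutil
from pathlib import Path

THRESHOLD = 1_000_000_000  # ~1 GB, same cutoff as the batch script below

def run_cycle(export_dir, backup_dir, upload, threshold=THRESHOLD):
    """One backup 'carousel' iteration: stage large files, upload, clean up."""
    backup_dir.mkdir(parents=True, exist_ok=True)
    # Pre-plan: move finished (large enough) files into the backup folder.
    for f in export_dir.rglob("*"):
        if f.is_file() and f.stat().st_size >= threshold:
            shutil.move(str(f), str(backup_dir / f.name))
    # Backup plan: hand the staged files to the uploader (CBL Backup -> S3 IA).
    ok = upload(list(backup_dir.iterdir()))
    # Post-plan: on success, clean up so the next run starts fresh.
    if ok:
        for f in backup_dir.iterdir():
            f.unlink()
    return ok
```

Files still below the threshold stay in Export, so a later run picks them up once the exporter finishes writing them.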

Let's go through each of these steps in detail.

Windows script to move files larger than a certain size

This script applies a file-size condition and performs a simple action: it moves files from Export to Backup. Let's create pre-plan.bat with the following content:

FOR /R "C:\POC\Export" %%F in (*) do if %%~zF geq 1000000000 move "%%F" "C:\POC\Backup"

We have C:\POC\Export, where the data arrives from the video software and the files gradually fill up with content, and C:\POC\Backup, the folder CloudBerry Backup uses as its source. The script takes every file under the video temp folder, compares its size against roughly 1 GB (1000000000 bytes), and moves it if it is larger. This uses only standard Windows commands; it can also be done in PowerShell if you prefer.
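If a batch one-liner feels too terse, the same size-threshold move can be sketched in Python. This is an illustrative alternative, not part of the original setup: the `move_large_files` name is my own, and the paths and 1 GB cutoff simply mirror the batch example.

```python
import shutil
from pathlib import Path

EXPORT_DIR = Path(r"C:\POC\Export")   # where the video exporter drops files
BACKUP_DIR = Path(r"C:\POC\Backup")   # source folder for CloudBerry Backup
THRESHOLD = 1_000_000_000             # ~1 GB, same as the batch script

def move_large_files(export_dir=EXPORT_DIR, backup_dir=BACKUP_DIR,
                     threshold=THRESHOLD):
    """Move every file at or above the size threshold into the backup folder."""
    backup_dir.mkdir(parents=True, exist_ok=True)
    moved = []
    for f in export_dir.rglob("*"):      # recurse, like FOR /R
        if f.is_file() and f.stat().st_size >= threshold:
            shutil.move(str(f), str(backup_dir / f.name))
            moved.append(f.name)
    return moved
```

Returning the list of moved names makes it easy to log what each pre-plan run staged for upload.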

Clean up script

The next thing we need is a cleanup script that simply deletes all files in the folder (we can safely delete them all) before the next iteration of the backup. The overall plan is to offload everything to cloud storage and delete it locally.

So the script is super simple; let's create post-plan.bat with the following content:

REM /Q suppresses the "Are you sure (Y/N)?" prompt that a bare "del *"
REM would show, which would hang an unattended post-plan run
cd /D C:\POC\Backup\
del /Q *
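For completeness, the same cleanup can be sketched in Python as well. Again this is an illustrative alternative: the `clean_backup_folder` name is my own, and the path mirrors the article's example.

```python
from pathlib import Path

BACKUP_DIR = Path(r"C:\POC\Backup")  # folder CloudBerry Backup uploads from

def clean_backup_folder(backup_dir=BACKUP_DIR):
    """Delete every file in the backup folder after a successful upload."""
    deleted = 0
    for f in backup_dir.iterdir():
        if f.is_file():
            f.unlink()
            deleted += 1
    return deleted
```

Since this runs as a post-plan action, it only fires after the backup plan finishes, so nothing is deleted before it has been offloaded.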

Backup Pre / Post actions in action

The final step is to configure the backup tool with the two scripts added in the wizard so it runs these custom actions.

Lastly, we need to choose a backup schedule. We set it to hourly, and in this particular case it worked fine!

That's really it! I haven't found a better way to do this, but if there is one, I'd be happy to hear about it!
