Backup procedure for Mailinabox

Hi

Can you help me get the backups more streamlined?

We have a 30 GB server that now has only 2 GB of free space left.
The last full backup was 3.3 GB, and the incremental backups since that full backup total 2.7 GB. When does the full backup occur?

  1. Can we delete the files in the duplicity and encrypted folders, except the encryption key?
  2. Is there any way to stop backups for a while and create a manual full backup?

Reading this may help you in the meantime, until someone provides a better answer: How Do I Delete Backups

I had quite a similar issue where my backups filled up all the space.

I went in and deleted all my backups, including the secret key. Then, on one of the following nights, a full backup was generated.

Now, I had made sure to delete all the excess data from the server, so my backups won’t fill up the space as they incrementally back up stuff.

Did you delete data in both the duplicity and encrypted folders?

After deleting the encryption key, was it generated automatically at the next backup?

I went in and deleted the encrypted data. That was all I think I needed to do.

And that freed up all the extra space.

…and yes, it did create a new secret key.

@sagaryellina @francoischevel if you wish to schedule some maintenance, here is a description of what I do:

IMPORTANT: Don’t forget to schedule automated backup(s) or snapshot(s) at DO or Vultr at least once every 30 days (this guarantees you have a full backup of your cloud server when needed).

You may use this command through SSH:

find /home/user-data/backup/encrypted/*.gpg -type f -mtime +1 #-exec rm {} +

Note that there must be spaces between rm, {}, and the terminating + (or ;, if you use that form).

Explanation:

  • The first argument is the path to the files (this can be a path, a directory, or a wildcard, as in the example above). I recommend using the full path, and make sure that you first run the command without the -exec rm part to confirm you are getting the right results. When you are sure, remove the # from #-exec (see the example after this list).
  • The second argument, -mtime, specifies the age of the files in days. If you enter +1, it will find files older than 1 day. Don’t forget to change +1 to the value that meets your particular needs.
  • The third argument, -exec, allows you to pass in a command such as rm. The {} + (or {} ;) at the end is required to terminate the command.
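
For example, a cautious way to apply this (a minimal sketch, assuming the default Mail-in-a-Box path /home/user-data/backup/encrypted and a 30-day retention; adjust -mtime to your needs):

# 1. Dry run: only list the encrypted backup files older than 30 days
find /home/user-data/backup/encrypted/ -name '*.gpg' -type f -mtime +30

# 2. Once the listed files look right, run the same command again with -exec rm to delete them
find /home/user-data/backup/encrypted/ -name '*.gpg' -type f -mtime +30 -exec rm {} +

Using -name '*.gpg' instead of a wildcard in the path lets find do the matching itself, which avoids shell problems if the folder contains a very large number of files.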

When you are sure it does what you need, you may wish to create a cron job like the following, to be executed at midnight on day 1 of each month:

0 0 1 * *  find /home/user-data/backup/encrypted/*.gpg -type f -mtime +30 -exec rm {} +
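
One way to install it (just a sketch, assuming the backup files are owned by root so the job belongs in root’s crontab; the log file name is only an example) is to run sudo crontab -e and add the line with its output redirected to a log, so any errors are kept:

# run monthly cleanup at 00:00 on the 1st and keep any error output in a log
0 0 1 * *  find /home/user-data/backup/encrypted/ -name '*.gpg' -type f -mtime +30 -exec rm {} + >> /var/log/backup-cleanup.log 2>&1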

BTW, this article may help you understand cron in Linux if needed: https://www.digitalocean.com/community/tutorials/how-to-use-cron-to-automate-tasks-on-a-vps

Hope this helps!

Please don’t make awkward changes to your box like this. You do so at your peril.

@JoshData Please, could you explain why you think that’s a peril?

  • I have scheduled weekly backups and/or snapshots of my cloud server (the most recent one taken just a day before), and I only schedule monthly maintenance (once a month) to get back the large amount of disk space taken by the incremental backups (deleting the encrypted backup files only)
  • I would love to learn from you why you consider this a peril

Thanks in advance for your answer.

I can’t provide support in these cases, and I can’t guarantee that upgrades will work. That’s all.

Can you let me know: along with the encrypted data, can we also delete all the data in the duplicity folder?

I had deleted the files in the encrypted folder, but at the next backup it created all the files again and used up my space completely; in the end the application went down.