In an earlier article, How to Backup WordPress Site Manually via SSH, we described how to back up a WordPress site by hand and store the backup on a dedicated backup server. It is natural to ask certain questions about WordPress backups, such as "Why not use a WordPress plugin to back up?", "Why use a backup server?", or "Can we keep the backup on cloud storage or a CDN?" This article will help you judge each WordPress backup option.
Why We Avoid WordPress Backup Plugins
WordPress backup plugins work well for small sites with uncomplicated structures and no extra files on the server. There are a good number of WordPress plugins (both paid and free) that can help users back up their sites almost anywhere. VaultPress, UpdraftPlus, and BlogVault are noteworthy plugins that perform scheduled backups and also provide storage space. However, installing more WordPress plugins increases the load on the server, so these plugins are best suited to smaller sites on shared hosting.
How We Can Automate the WordPress Backup (via SSH)
You can create bash scripts to run the steps described in the article How to Backup WordPress Site Manually via SSH. You can check out simple bash scripts in these two blog posts:
There are many bash scripts, both basic and advanced, available on GitHub that perform the desired steps. You need to set up a cron job to run the script automatically. Do not forget to check the results regularly and delete old backups.
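As a starting point, here is a minimal sketch of such a backup function. All paths are placeholders you must adapt, and the database dump is shown only as a comment because MySQL credentials vary per host:

```shell
#!/bin/bash
set -euo pipefail

# backup_site SITE_DIR BACKUP_DIR: archive a WordPress directory into
# BACKUP_DIR with a date stamp, then prune archives older than 30 days.
# A minimal sketch -- adjust paths, retention, and the database dump.
backup_site() {
  local site_dir=$1 backup_dir=$2
  local stamp
  stamp=$(date +%Y-%m-%d)
  mkdir -p "$backup_dir"

  # Dump the database alongside the files (credentials are site-specific):
  # mysqldump --single-transaction wordpress | gzip > "$backup_dir/db-$stamp.sql.gz"

  # Archive the whole WordPress directory.
  tar -czf "$backup_dir/files-$stamp.tar.gz" \
    -C "$(dirname "$site_dir")" "$(basename "$site_dir")"

  # Delete backups older than 30 days so the disk does not fill up.
  find "$backup_dir" -name '*.gz' -mtime +30 -delete
}
```

A nightly cron entry such as `0 3 * * * /usr/local/bin/wp-backup.sh >> /var/log/wp-backup.log 2>&1` (path hypothetical) would then automate the run.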
What Are Our Options for WordPress Backup Storage?
The default and easiest option is a remote backup server. You can pull backups from that server manually, or use rsync to push/pull them automatically. This method is simple because a backup server with root access avoids the complicated scripting that other options may require, and you can customize your bash scripts freely. You can also install ownCloud on the backup server; ownCloud provides an option to integrate Amazon S3 as external storage and has an Android app too. This is a robust backup option, and the total monthly cost need not be huge. If the backup server has enough RAM, you can even run a staging/development copy of the site on it.
The second option is uploading the backups to freemium cloud storage such as Dropbox or Google Drive. Pushing to Dropbox is not difficult; we showed one method to our readers some years back – Copy Backup Files From Server to DropBox Cloud (HP Cloud).
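One modern way to do this is the third-party tool rclone, which supports Dropbox, Google Drive, and many other backends. The remote name `dropbox:` below is an assumption; it must first be configured with `rclone config`:

```shell
# upload_to_cloud SRC REMOTE: copy SRC to an rclone remote, e.g. a
# Dropbox remote set up via `rclone config`. Setting DRY_RUN=1 prints
# the command instead of running it, which is handy for checking paths
# before the first real upload.
upload_to_cloud() {
  local cmd=(rclone copy "$1" "$2")
  if [ "${DRY_RUN:-0}" = 1 ]; then
    echo "${cmd[*]}"
  else
    "${cmd[@]}"
  fi
}

# Example: upload_to_cloud /var/backups/wordpress dropbox:wp-backups
```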
The third option is pushing to Amazon S3. This is also a good choice: there are mature command-line tools for S3, and Amazon Glacier is reliable for long-term archiving.
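A sketch of this workflow using the official AWS CLI; the bucket name is a placeholder, and the helper below simply picks the newest archive so a cron job does not re-upload everything:

```shell
# latest_backup DIR: print the most recently modified .tar.gz archive
# in DIR, so a scheduled job can upload just the newest backup.
latest_backup() {
  ls -t "$1"/*.tar.gz 2>/dev/null | head -n 1
}

# Usage (requires the AWS CLI configured with credentials; the bucket
# name s3://my-wp-backups is hypothetical):
#   aws s3 cp "$(latest_backup /var/backups/wordpress)" s3://my-wp-backups/
# A bucket lifecycle rule can then transition objects to Glacier
# automatically after a chosen number of days.
```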
If you do not want to complicate your life with automatic backups, take at least one manual backup every week, and do not forget to upload a backup to Amazon Glacier every month. If you have a money-making site, you should not skip backups.
You cannot realistically use GitHub for backup purposes: GitHub rejects individual files larger than 100 MB and strongly recommends keeping repositories under 5 GB.
It is good to download one backup to your local computer every six months. If the full file archive is too big to download, download at least the database.