{{{
#!/bin/bash
# Dump a dated copy of the database into the web directory
mysqldump [database-name] > ~/[domain-directory]/web/[database-name]_$(date +%y%m%d).sql
# Archive the entire web directory (including the dump) into ~/backups
tar cvzf ~/backups/[site]_backup_$(date +%y%m%d).tgz ~/[domain-directory]/web
# Remove the dump now that it is stored inside the tarball
rm -f ~/[domain-directory]/web/[database-name]_$(date +%y%m%d).sql
}}}

Change the information inside the brackets to match your own site. Save the file as "backup.sh". SSH into your server and create a new directory called "backups". Put the script in that directory and make it executable (chmod 755 backup.sh).
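For example, assuming you have uploaded backup.sh to your home directory, the setup might look like this:
{{{
mkdir ~/backups                  # directory that will hold the tarballs
mv backup.sh ~/backups/          # put the script in place
chmod 755 ~/backups/backup.sh    # make it executable
}}}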
The first command (after the #!/bin/bash line) dumps a dated copy of your database right into your web directory (maybe not best practice, but the file is deleted again almost immediately).
The second command creates a compressed tarball of everything in the "web" directory (including the database dump) and saves it in your "backups" directory.
The third command deletes the database dump from the "web" directory.

I would suggest first (after creating the "backups" directory) running the three commands one at a time from your command line. The result should be a new dated *.tgz file in the "backups" directory; also check that there is no *.sql file left in your "web" directory. If you get access errors during the database dump, see [https://support.mayfirst.org/wiki/mysql_command_line_access]. If this works, try running the script:
{{{
./backups/backup.sh
}}}
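Whether you run the commands by hand or via the script, you can confirm the results from the command line; a quick check (using the same bracketed placeholders as above):
{{{
ls -lh ~/backups/                    # should list a new dated *.tgz file
ls ~/[domain-directory]/web/*.sql    # should report no matches -- the dump was deleted
}}}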
After I run this, I SFTP into the server and download the new backup to my local machine. I keep the latest version on the server and delete the previous one. I do this periodically (every two weeks). I'm sure this could easily be improved, but it works. I am looking at better options for automated backup to cloud storage.
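One simple improvement is scheduling the script with cron so the tarball is created automatically; a minimal sketch (the 3:00 a.m. twice-monthly schedule and the 30-day cleanup are my own suggestions, not part of the routine above):
{{{
# min hour day month weekday  command
0 3 1,15 * *  ~/backups/backup.sh                                # build a backup on the 1st and 15th
5 3 1,15 * *  find ~/backups -name '*.tgz' -mtime +30 -delete    # prune tarballs older than 30 days
}}}
Note that this only automates creating the tarball on the server; you would still need to download it (or sync it to cloud storage) yourself.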