1. Pick a simple backup strategy you can live with. Document it.
  2. Make incremental backups a part of your daily routine.
  3. Include an off-site backup in your strategy (use two disks).

@see RSYNC

Compression

# compress folder
tar -czf big.tgz /var/www/big/
 
# to store files without the leading /var/www/big/ path in the archive
# (note: -C only affects the file arguments that come after it):
tar -czf big.tgz -C /var/www/big .
 
# test archive
# get list
tar -tvzf my_tar.tar.gz
# test gzip
gunzip -t file.tar.gz
# To test the tar file inside is not corrupt:
gunzip -c file.tar.gz | tar t > /dev/null
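The same pipe can be wrapped in a loop to sanity-check a whole directory of backups; a small sketch (check_archives is a hypothetical helper name):

```shell
# report OK/BAD for every .tar.gz in the given directory
check_archives() {
    for f in "$1"/*.tar.gz; do
        [ -e "$f" ] || continue
        if gunzip -c "$f" | tar tf - > /dev/null 2>&1; then
            echo "OK  $f"
        else
            echo "BAD $f"
        fi
    done
}
```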
 
# extract (-f must come last, immediately before the archive name)
tar -xzvf big.tgz
# zip a folder
zip -9 -r <zip file> <folder name>
# --junk-paths strips the original file paths from the archive
 
 
# To zip a single file:
zip -9 <zip file> <filename>
 
# extract
unzip file.zip
unzip file.zip -d $destination_folder
# archive files in <target> not accessed for 5+ days, then delete them
# (tar -v lists the archived files on stdout, which is piped to rm;
#  breaks on filenames containing spaces)
tar -zcvpf backup_`date +"%Y%m%d_%H%M%S"`.tar.gz `find <target> -atime +5` 2> /dev/null | xargs rm -fr
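A safer sketch of the same idea uses NUL-separated names so filenames with spaces survive (GNU find/tar; archive_stale and the target path are hypothetical):

```shell
# archive files not accessed for 5+ days, then delete exactly those files
archive_stale() {
    list=$(mktemp)
    # -print0 / --null / -0 keep odd filenames intact end to end
    find "$1" -type f -atime +5 -print0 > "$list"
    tar -zcpf "backup_$(date +%Y%m%d_%H%M%S).tar.gz" --null -T "$list"
    xargs -0 rm -f < "$list"
    rm -f "$list"
}

# usage: archive_stale /some/dir
```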
 
# exclude version-control metadata (.svn, .git, CVS, ...) from a tarball
tar --exclude-vcs -cf src.tar src/

delete files not modified in the last $x days

#!/bin/sh
DIR=/your/backups/dir
find "$DIR" -maxdepth 1 -type f -mtime +30 -exec rm {} \;
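Before wiring this into cron it is worth previewing what will go; a sketch that prints first and then deletes, using GNU find's -delete (purge_old is a hypothetical helper name):

```shell
# list, then delete, files in $1 older than $2 days (by mtime)
purge_old() {
    find "$1" -maxdepth 1 -type f -mtime "+$2" -print
    # GNU find's -delete avoids spawning one rm per file
    find "$1" -maxdepth 1 -type f -mtime "+$2" -delete
}

# usage: purge_old /your/backups/dir 30
```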

remote copy

curl -u user:passwd -T /home/dir/local_file_to_upload ftp://your_host.com/subdir/
 
rdiff-backup /some/local-dir hostname.net::/whatever/remote-dir
  • rdiff-backup backs up one directory to another, possibly over a network. The target directory ends up a copy of the source directory, but reverse diffs are stored in a special subdirectory of the target, so you can still recover files as they were some time ago.
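Those reverse diffs are what make restores useful; for example, to get a file back as it was ten days ago (a sketch, assuming rdiff-backup is installed and the repository above exists; the file names are placeholders):

```shell
# restore a single file as of 10 days ago (-r = --restore-as-of)
rdiff-backup -r 10D hostname.net::/whatever/remote-dir/somefile /tmp/somefile.10d
```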

back up a folder

rsync -ax --delete --force /testdir /backup/

compress each entry in a directory and delete the sources (failsafe: a source is removed only if its tar succeeds)

for i in *; do tar cvjf "$i.tar.bz2" "$i" && rm -r "$i"; done

Crontab

 
# run a PHP script every 30 minutes
0,30 * * * * php -q /address/to/script.php
 
# logging
* * * * * /myhome/myscript >/myhome/myscript.log 2>&1
 
# automatic copy every morning
0 5 * * * rsync -vaxE --delete --ignore-errors /adir /mnt/backup/
# data backups (system crontab format: the sixth field is the user)
0 23 * * * root  rsync -a -v --delete /home /backup_giornaliero
0 2 * * 0  root  rsync -a -v --delete /home /backup_settimanale
0 5 1 * *  root  rsync -a -v --delete /home /backup_mensile
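For reference, the five time fields are minute, hour, day of month, month, and day of week; the entries above decode as:

```shell
# m  h  dom mon dow   meaning
# 0  23  *   *   *    every day at 23:00
# 0  2   *   *   0    every Sunday at 02:00
# 0  5   1   *   *    the 1st of every month at 05:00
```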

Mail Attachments

mail unfortunately can't send attachments. You can use uuencode to embed the file in the mail:

15 2 * * * root mysqldump -uroot -pPASSWORD --all-databases | gzip > /database_`date +'%m-%d-%Y'`.sql.gz ; uuencode /database_`date +'%m-%d-%Y'`.sql.gz /dev/stdout | mail -s "Report 05/06/07" test@gmail.com
 
# Or, if you want a proper MIME attachment, use metasend (requires metamail):
 
15 2 * * * root mysqldump -uroot -pPASSWORD --all-databases | gzip > /database_`date +'%m-%d-%Y'`.sql.gz ; metasend -b -t test@gmail.com -s "Report 05/06/07" -m application/gzip -f /database_`date +'%m-%d-%Y'`.sql.gz

Using SCP

# copy local to remote
scp SourceFile user@host:DestDirectory/filename
 
# copy a local directory recursively (-r must come before the operands)
scp -r SourceDirectory user@host:DestDirectory
 
# download
scp user@host:SourceDir/SourceFile .
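One gotcha worth noting: scp takes the port as a capital -P (lowercase -p preserves times and modes instead), unlike ssh. For example, if sshd listens on 2222 (hypothetical port):

```shell
# copy over a non-standard port; -p additionally preserves mtime and modes
scp -P 2222 -p SourceFile user@host:DestDirectory/
```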

To receive (a) file(s) from a remote server:

pscp [options] [user@]host:source target

to copy the file /etc/hosts from a remote server to the local file c:\temp\example-hosts.txt, you would type:

pscp test@gmail.com:/etc/hosts c:\temp\example-hosts.txt

To send (a) file(s) to a remote server:

pscp [options] source [source...] [user@]host:target

to copy the local file c:\documents\foo.txt to the remote file /tmp/foo, you would type:

pscp c:\documents\foo.txt test@gmail.com:/tmp/foo

You can use wildcards to transfer multiple files in either direction, like this:

pscp c:\documents\*.doc test@gmail.com:docfiles
pscp test@gmail.com:source/*.c c:\source

scp unattended transfer

sshpass

apt-get install sshpass
sshpass -p "password" scp -r user@example.com:/some/remote/path /some/local/path
# reading the password from prompt
read -s -p "SSH password : " PASSWORD_SSH;
# use it
sshpass -p $PASSWORD_SSH scp file.tar.gz <username>@<host>:/path/to/backup
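Passing -p puts the password in the process list (visible via ps); sshpass also accepts it from the SSHPASS environment variable with -e, which avoids that:

```shell
# read the password once, then hand it to sshpass via the environment
read -s -p "SSH password : " SSHPASS; export SSHPASS
sshpass -e scp file.tar.gz <username>@<host>:/path/to/backup
```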

scp without a password prompt (batch mode: scp -B). If a script or cron job transfers data via scp from $server_A to $server_B, you don't want a password prompt every time.

Log into $server_A as $username_A and enter:

ssh-keygen -d

Do not enter a passphrase; this generates a DSA key pair (id_dsa, id_dsa.pub) in ~/.ssh/. (On modern OpenSSH, DSA is deprecated; use e.g. ssh-keygen -t ed25519 and adjust the file names below accordingly.)

SCP the file id_dsa.pub ($username_A's public key) to $server_B:

scp ~/.ssh/id_dsa.pub $username_B@$server_B:/home/$username_B/.ssh/authorized_keys2

ATTENTION: if authorized_keys2 already exists on server $server_B, append the content of id_dsa.pub to the existing file!

If you want access in both directions, repeat the procedure the other way round.

example: this copies all files in folders and subfolders whose inode change time (ctime) is within the last 10 days into the same folder structure on server $server_B, without prompting for passwords.

$username_A@$server_A:/my/project/folder$ find . -type f -ctime -10 -print -exec scp -B '{}' $username_B@$server_B:/my/project/folder/'{}' ';'
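With the key in place, the same transfer can run unattended from cron; a sketch with the same placeholder hosts, using rsync over ssh, which only sends changed files and usually beats re-copying with scp:

```shell
# crontab: nightly sync at 03:00, key-based auth, no password prompt
0 3 * * * rsync -az -e ssh /my/project/folder/ $username_B@$server_B:/my/project/folder/
```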

Mysql

db_mysql5

Remote copy over FTP

linux_ftp

Gdrive

# GDRIVE
0,30 * * * *  cd /home/taz/GDrive && grive

Windows

to use SCP and SFTP on Windows, use WinSCP scripting; a simple backup:

rem note: no spaces around "=" (Set X = Y would create a variable named "X ");
rem the date substrings assume a locale format like "Thu 06/19/2014"
Set CurrentDate=%Date%
Set CurrentTime=%Time%
Set YYYY=%CurrentDate:~10,4%
Set MM=%CurrentDate:~4,2%
Set DD=%CurrentDate:~7,2%
Set Source=D:\Misc\DailyLog.txt
rem Set dest=D:\Misc\DailyLog_%YYYY%%MM%%DD%.txt
rem Copy %Source% %dest% /Y
Set dest=D:\Misc\copy_%YYYY%%MM%%DD%
xcopy %Source% %dest%\
Pause

only the differences, based on Unison

"e:\man\bin\unison.exe" -batch e:\work c:\__ro_backup\work -force "e:\work"
#!/bin/bash
unison -batch "/media/data/bin" "/media/data_copy/__ro_backup/bin" -force "/media/data/bin" 2>/var/log/backup/bin.log &