Favorite backup solution?

I use SugarSync myself; I pay for the 30 GB plan, 50 bucks a year for as many PCs as I need.

This is double what I actually need, and I'm in the process of getting some friends hooked up with it, which will increase my storage massively.

However, I have wanted a local alternative, so I made a .bat file with the xcopy commands in it to copy to my external hard drive.

I will include a delete-directory command in it so that once a week it deletes all the files and then copies them all over again, keeping the backup in sync.
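
Something along these lines is the idea (the folder names are just examples, not my actual paths):

Code:
@echo off
rem example paths -- wipe the old copy so deletions don't linger, then copy everything over fresh
rmdir /s /q "E:\Backup\Documents"
xcopy "C:\Users\Me\Documents" "E:\Backup\Documents" /e /i /h /y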
 
Testing this currently.

Code:
#!/bin/bash

# Mirror / to the NAS, skipping anything listed in BackupExclude.txt.
# -rltDvu copies symlinks, times and devices but not ownership/permissions (NTFS can't store them);
# --modify-window=1 tolerates 1s timestamp drift; --delete/--delete-excluded keep the mirror exact.
sudo rsync -rltDvu --modify-window=1 --progress --delete --delete-excluded --exclude-from=/home/dan/bash/BackupExclude.txt / /mnt/NAS/backup/ubuntu

Will keep you posted.

Any suggestions on how to make it better, I'm listening. Bear in mind I am using an NTFS share, though.
 
Make it a cron job? Lol - I just realized you wanted help with the command line itself, not with how to make it more efficient. Sorry.
 
Don't know what cron is? I want it to be a bash script for the same reasons I use batch files in Windows: to keep my hand in.
 
cron is a scheduler that automates when the script runs, which means you can edit the script whenever you want and it will still run at the specified time interval.
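
For example, if the rsync script above were saved as /home/dan/bash/backup.sh (just a guess at a path), a crontab entry like this would run it every night at 2am:

Code:
# m  h  dom  mon  dow   command   (the path below is only an example)
0 2 * * * /home/dan/bash/backup.sh

Edit it with sudo crontab -e; if it runs from root's crontab, the sudo inside the script isn't strictly needed.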
 
Aye, cron is the Unix flavor of Task Scheduler -- though very Unix-y.

Anyway, it's interesting this thread comes up again, as I am having issues with DirSyncPro -- I think it has problems resolving conflicts with in-use files.

I need to do some more testing. It also has problems with the MacBook Pro going to sleep/screensaver, which causes a disconnect from the SMB share... though I suspect this is more of a Mac/SMB bug.
 
Thanks guys, cron seems to work well. I can see the NAS light flashing now, although I have no idea what it's actually doing specifically ;)
 
I actually wrote my own batch script to delete the directory and copy over all the necessary directories. I then jumped into Task Scheduler and set it up to run every Monday and Friday at noon.

I also have SugarSync, which auto-updates everything, but at least this way I have a secondary backup.
 
So I finally got around to giving CrashPlan a try. Going to see how it does on my Mac backing up to the NAS first.

Sadly, I think DirSyncPro isn't really cut out for heavy-lifting backups. I think I may use it to sync video directories and whatnot, but we'll see. SyncToy 2.0 does OK for that as well.
 
Well, I'm a Linux user and use FSArchiver to do a whole partition backup to an external hard drive.

FSArchiver is a system tool that allows you to save the contents of a file-system to a compressed archive file. The file-system can be restored on a partition which has a different size and it can be restored on a different file-system.
Some important files, such as photos, docs, etc., I save manually to an external drive.
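
For reference, saving and restoring a partition looks something like this (the device names and archive path are just examples):

Code:
# save /dev/sda1 into a compressed archive (-z7 = compression level, -j2 = two threads)
fsarchiver savefs /mnt/backup/root.fsa /dev/sda1 -z7 -j2

# restore the first filesystem in the archive onto /dev/sdb1
fsarchiver restfs /mnt/backup/root.fsa id=0,dest=/dev/sdb1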

Thnx.
 
See, that's no good for me. I don't want 250 GB going over my network every time I want to back up. Incremental backup is an absolute must for me.
 
OK, so CrashPlan worked swimmingly on the Mac; I have yet to "restore" anything, though.

A few more tests are in order, then I may see if I can make this more of a main backup solution.
 
OK, testing a new one now; I'm kinda giving up on DirSyncPro... it just worried me that it was corrupting files, and the warning messages when it encountered errors were kinda useless (they didn't state the name of the file that caused the error).

So here goes with Synkron

Synkron – Folder synchronisation

It's also cross-platform and open source.
 
rsync is the best for copy/backup. It makes perfect copies down to the attributes, dates, and ownership permissions.

Many apps use rsync as the core with a pretty GUI, but you can extract more power straight from the command line: you can synchronize deletions, archives, update only newer files, etc. After the initial clone, minor changes are mirrored almost instantly.

You can make live running clones across the network, so if your machine goes down, a ready-to-go clone is up and running in minutes. Literally, up in minutes with the latest updates/file changes. I have set up fail-over web servers that switch instantly, and rsync makes it possible.
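
A rough sketch of that kind of network mirror (the host name and paths are placeholders, not my actual setup):

Code:
# keep a standby box's web root identical to this one's over ssh;
# -a preserves permissions/ownership/times, --delete removes anything gone from the source
rsync -a --delete -e ssh /var/www/ backup@standby-host:/var/www/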

Rsync comes built into pretty much every *nix OS, like Linux, BSD, Solaris, and OS X. There is a Windows port, I believe.

All you need to do is set up a cron job or a live file-system-detection bash script. Even with millions of files, rsync only needs to copy whatever isn't on the destination path or has recently been updated.
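
A minimal sketch of the live-detection idea, using inotifywait (from inotify-tools) to kick off rsync whenever something changes -- the paths are just examples:

Code:
#!/bin/bash
# example paths: watch a data directory and re-sync it to the NAS after every change
while inotifywait -r -e modify,create,delete,move /home/dan/data; do
    rsync -au --delete /home/dan/data/ /mnt/NAS/backup/data/
done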

I see people copy-n-pasting folders/drives all the time and it is really inefficient. Same for the backup apps that do whole backups/clones.
 
Aye, rsync is nice -- I think it's the core of the app I am trying now.

I think what I'm going to end up with is CrashPlan for encrypted full backups and Synkron for minor sync jobs, like music and video to a specialized folder on the NAS.
 
Since my personal files are not really big, I'm saving them directly on Dropbox (pictures, documents, music). For example, my Documents Library saves my documents to Dropbox by default. (I'm using Windows 7.)

The same personal files are also backed up daily to a 2TB external drive via the Backup solution that comes with Windows. The main use of this external drive is to manually store the bigger files, but I've got no backup of those since I delete them from my laptop. I wouldn't be happy losing this drive's content, but it's nothing that I couldn't find again.

Seems like you can't get help from me, but I thought I'd post to inspire others. ;)
 