Here's an update:
I have my computer do a backup of the documents folder every Friday at 3 AM. It backs up to cloud storage, and it happens automatically. Getting the Task Scheduler to do this was a bear, because of all the gotchas. Some of the reasons it didn't work: the task was set to run only when plugged in, the exe file had to be set to run as administrator, and wake timers had to be allowed in the power options.
Each of those things had to be solved individually (via Googling), and there was no indication of why it wouldn't trigger.
But it will be nice to have the backups done unattended.
Something that would be far simpler would be to just set a pop-up reminder on Fridays. If you had this down to a one-button operation while the machine is powered on, that would be almost as good, and it would eliminate all the complexities/gotchas of waking up and running a script unattended.
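Either way - kicked off by the Task Scheduler at 3 AM, or clicked by hand after a Friday pop-up reminder - the actual copy step could be a small script along these lines. This is just a sketch, not my exact setup: it assumes the cloud storage shows up as a locally synced folder, and the folder names are placeholders.
[code]
import shutil
from pathlib import Path

SOURCE = Path.home() / "Documents"                 # folder being backed up
DEST = Path.home() / "CloudDrive" / "DocsBackup"   # placeholder: any folder the cloud client syncs

def backup():
    copied = 0
    for src in SOURCE.rglob("*"):
        if not src.is_file():
            continue
        dst = DEST / src.relative_to(SOURCE)
        # copy only files that are missing or newer than the existing copy
        if not dst.exists() or src.stat().st_mtime > dst.stat().st_mtime:
            dst.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, dst)
            copied += 1
    print(f"Copied {copied} file(s) to {DEST}")

if __name__ == "__main__":
    backup()
[/code]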
Now, how do you verify that it worked? Is that verification automated, and if so, how?
My motto is -
If you have not successfully restored from the backup, you do not have a backup.
I do it manually (and not often enough, so it is far from perfect), but I do open at least a few files on the backup for a sanity check.
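In my case that spot check is manual, but it could be scripted - something like this, which picks a random handful of files and confirms the backup copy matches the original byte for byte. It's a sketch with placeholder paths, and note it only proves the copies match, not that a full restore works, so it doesn't fully satisfy my motto.
[code]
import hashlib
import random
from pathlib import Path

SOURCE = Path.home() / "Documents"                 # original data
DEST = Path.home() / "CloudDrive" / "DocsBackup"   # placeholder backup location
SAMPLE = 10                                        # number of files to spot-check

def sha256(path):
    # hash a file in chunks so large files don't blow up memory
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

files = [p for p in SOURCE.rglob("*") if p.is_file()]
for src in random.sample(files, min(SAMPLE, len(files))):
    dst = DEST / src.relative_to(SOURCE)
    if not dst.exists():
        print(f"MISSING from backup: {src}")
    elif sha256(src) != sha256(dst):
        print(f"MISMATCH: {src}")
    else:
        print(f"OK: {src.relative_to(SOURCE)}")
[/code]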
So here's a question for the techies:
I see that others have said they have too much to put in the cloud. Here is what I was thinking - I do several full backups of my 'stuff' occasionally (I treat music, photos and videos separately), then I do incremental updates to those bases, until I decide to delete one and start a new base. Now, here is what could be handy for 'the cloud':
A) Make a full baseline back-up to several portable drives.
B) Do an incremental backup to the cloud, using the directory listing on the portable drive as the reference for what has changed (see the sketch below).
This way, only the incremental stuff would be in the cloud.
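Something along these lines might work for B) - read the file listing off the baseline drive, then copy to the cloud folder only what is missing from that listing or newer than it. A rough Python sketch; the drive letter and folder names are placeholders, not a real setup.
[code]
import shutil
from pathlib import Path

LIVE = Path.home() / "Documents"                     # live data on the internal drive
BASELINE = Path("E:/FullBackup/Documents")           # placeholder: baseline on the portable drive
CLOUD = Path.home() / "CloudDrive" / "Incremental"   # placeholder: cloud-synced folder

# 1) The "directory listing" of the baseline: relative path -> modification time
baseline_mtime = {
    p.relative_to(BASELINE): p.stat().st_mtime
    for p in BASELINE.rglob("*") if p.is_file()
}

# 2) Anything in the live data that is missing from the baseline, or newer than
#    the baseline copy, goes to the cloud folder with the same directory structure.
sent = 0
for src in LIVE.rglob("*"):
    if not src.is_file():
        continue
    rel = src.relative_to(LIVE)
    # the +2 seconds allows for coarser timestamp granularity on some drives
    if rel not in baseline_mtime or src.stat().st_mtime > baseline_mtime[rel] + 2:
        dst = CLOUD / rel
        dst.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src, dst)
        sent += 1
print(f"{sent} new/changed file(s) copied to {CLOUD}")
[/code]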
I kind of did this a while back - I wrote some scripts that would back up files changed within the last X days (prompted for X), and they retained the directory structure. I had that set up to zip and store to USB, but I could redirect that to the cloud. But if you do this once per day, you end up with a bunch of zip files, and you would need to do some searching to find anything.
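Roughly, the logic of that script is something like this - a simplified Python sketch, not the exact script, with placeholder paths:
[code]
import time
import zipfile
from pathlib import Path

SOURCE = Path.home() / "Documents"      # placeholder source tree
OUTPUT = Path("F:/changed_files.zip")   # placeholder: USB stick (or point it at a cloud-synced folder)

days = float(input("Back up files changed within how many days? "))
cutoff = time.time() - days * 86400

with zipfile.ZipFile(OUTPUT, "w", zipfile.ZIP_DEFLATED) as zf:
    for p in SOURCE.rglob("*"):
        if p.is_file() and p.stat().st_mtime >= cutoff:
            # arcname keeps the original directory structure inside the zip
            zf.write(p, arcname=str(p.relative_to(SOURCE)))
print(f"Wrote {OUTPUT}")
[/code]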
Hmmm .... OK, picture this as a daily routine for simplicity:
A) First, run my 'recently changed' script to capture all the files changed in the one day since I backed up to the portable drives, and store them to the cloud unzipped.
B) Run my 'recently changed' script every day after that, capturing only files changed within the last day (plus a few hours for overlap) - copy these to a temporary directory on my hard drive.
C) Now do an incremental update to the cloud from that temp directory, delete the temp directory, and repeat from 'B'. (A rough script for B and C follows below.)
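Steps B and C could be one short script along these lines - again a rough Python sketch with placeholder folder names, where the 28-hour cutoff is the "day plus a few hours of overlap":
[code]
import shutil
import time
from pathlib import Path

SOURCE = Path.home() / "Documents"                   # live data
TEMP = Path.home() / "backup_temp"                   # scratch directory on the hard drive
CLOUD = Path.home() / "CloudDrive" / "Incremental"   # placeholder: cloud-synced folder
cutoff = time.time() - 28 * 3600                     # last day, plus a few hours of overlap

# B) capture files changed since the cutoff into the temp directory, keeping structure
TEMP.mkdir(parents=True, exist_ok=True)
for src in SOURCE.rglob("*"):
    if src.is_file() and src.stat().st_mtime >= cutoff:
        dst = TEMP / src.relative_to(SOURCE)
        dst.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src, dst)

# C) incremental update to the cloud folder from temp, then delete the temp directory
for src in TEMP.rglob("*"):
    if src.is_file():
        dst = CLOUD / src.relative_to(TEMP)
        if not dst.exists() or src.stat().st_mtime > dst.stat().st_mtime:
            dst.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, dst)
shutil.rmtree(TEMP, ignore_errors=True)
print("Daily incremental pushed to", CLOUD)
[/code]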
That should build up all the changed files in one neat directory structure in the cloud. Baseline would be missing, but I have that on my backups. And I could sync from the cloud to my backups if my internal HD crashed, and have everything in one place. Assuming rsync works to the cloud, but I'm pretty sure it does.
Since only the daily changes go to the cloud, this should be fast.
Anyone see any problems? This should be easy to turn into a one or two button script.
-ERD50