Where Have All the Decent File Backup Programs Gone?

RJM62

I use Macrium Reflect for imaging and cloning, and I'm very happy with it. But I was looking for software to copy and update certain files and folders to a folder on my new "disaster-proof" ioSafe external hard drive in unencrypted, uncompressed form, so it can be quickly accessed without any special software if the need arises.

I can do that from the clone drive, of course, but that drive isn't disaster-proof. I wanted these files on the disaster-proof drive, as well. I'm kind of a backup nut.

I must have downloaded a dozen trial programs, none of which could do this simple thing properly. The biggest problem was that they all seemed designed for idiots. Some, for example, had buttons or check boxes for "mail," but only if you used Outlook or Windows Mail. There was no way to manually select the Thunderbird data folder. And most wouldn't let me select a folder on a drive as the backup destination. They could only use the root, and I didn't want to repartition the drive just to do a simple file copy.

I finally gave up and used ShadowSpawn and RoboCopy. It took me about 15 minutes to write a batch file that works just fine. I haven't decided whether to let it monitor for changes or just run as a scheduled task a few times a day. I work with a lot of video and image files that may be changed a dozen or more times before they're done. But I may enable monitoring anyway. I haven't decided yet.

If anyone's interested, this is the batch file I'm currently using:

Code:
shadowspawn "C:\Users\[User Name]\AppData" Q: robocopy Q:\ "D:\Files and Folders\AppData" /s /xj /xo /tee /log:my_backup_log.txt
shadowspawn "C:\Users\[User Name]\Downloads" R: robocopy R:\ "D:\Files and Folders\Downloads" /s /xo /xj /tee /log:my_backup_log.txt
shadowspawn "C:\Users\[User Name]\Desktop" S:\ robocopy S:\ "D:\Files and Folders\Desktop" /s /xo /xj /tee /log:my_backup_log.txt
shadowspawn "C:\Users\[User Name]\Documents" T:\ robocopy T:\ "D:\Files and Folders\Documents" /s /xo /xj /tee /log:my_backup_log.txt
pause

It assumes that you have ShadowSpawn installed, which just means copying the executable into the System32 folder. It's old, alpha code that's not maintained anymore, but it still works just fine. All it does is shadow the selected folder so RoboCopy can copy it, even if the files are in use. It's pretty standard, old-school stuff. It's also quick-and-dirty. There's no error handling in there. I wasn't even sure that ShadowSpawn (or RoboCopy, for that matter) would work on Win10. But they do.

There was one hiccup, though: I forgot that Win7 and later use "junctions," which are more like symlinks than ordinary Windows shortcuts. They did this for backward compatibility. The problem is that RoboCopy ****s the bed with gusto on certain junctions and goes into an infinite recursive loop, copying the same folders inside of themselves until the drive is full.

The /xj switch takes care of that problem quite handily. Of course, that also means that it ignores data that might be behind a junction that won't cause a recursive loop. But that's not an issue in my case.
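
If you want to see which junctions you're dealing with before turning RoboCopy loose, listing the reparse points from a command prompt works. This is just an example; the /aL attribute filter is what shows them:

Code:
REM /aL limits the listing to reparse points (junctions and symlinks)
dir "C:\Users\%USERNAME%" /aL
dir "C:\Users\%USERNAME%\AppData\Local" /aL

The self-referencing "Application Data" junction under AppData\Local is the classic one that sends RoboCopy into that infinite loop.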

Rich
 
I use Crash Plan.

Set it and forget it. Free version backs up to any local drive while the paid version will do it to Crash Plan's servers.
 
If you do anything with Linux, Ubuntu comes with a pretty functional back-up program... Does everything you mention above. I love working in an OS that I control, rather than vice-versa. :)

Jim
 
SyncBack, and it's free.
 
SyncBack, and it's free.

I actually got as far as attempting to download that one, but ESET popped up an adware warning, so I aborted. I do understand that an adware-free version is available if you make a donation, but I'm not going to make a donation just to try someone's software. They need to offer at least a seven-day trial before cramming the spyware down my throat.

Rich
 
If you do anything with Linux, Ubuntu comes with a pretty functional back-up program... Does everything you mention above. I love working in an OS that I control, rather than vice-versa. :)

Jim

I have CentOS and Mint machines, but no Ubuntu at the moment. And yeah, Linux backup is a breeze. I have multiple backups set up on my CentOS machines both to other machines at the datacenter and to Amazon S3. I think they took me about six minutes total to set up.

Rich
 
My inclination at the moment is to just keep using ShadowSpawn and RoboCopy (with some additional switches to exclude system files and temp files, cache folders, log files, and all that other useless **** that Windows generates), and then to sync that data to one of the bazillion online locations available to me.

In all honesty, had I known that RoboCopy was still included with Win10 and that ShadowSpawn still worked, I probably would have started there instead of looking for other solutions. It's a free and easy solution that affords exquisite granularity.

Rich
 
My inclination at the moment is to just keep using ShadowSpawn and RoboCopy (with some additional switches to exclude system files and temp files, cache folders, log files, and all that other useless **** that Windows generates), and then to sync that data to one of the bazillion online locations available to me.

In all honesty, had I known that RoboCopy was still included with Win10 and that ShadowSpawn still worked, I probably would have started there instead of looking for other solutions. It's a free and easy solution that affords exquisite granularity.

Rich

Your approach is awesome in my opinion. Robocopy never fails. Rsync for GNU systems is its equivalent and is also awesome. These days people are buying appliances, not computers, so it's great to see you take the basic tools and get the job done quickly.
 
BTW, I also use google drive and dropbox. They are both great services in my opinion.
 
Your approach is awesome in my opinion. Robocopy never fails. Rsync for GNU systems is its equivalent and is also awesome. These days people are buying appliances, not computers, so it's great to see you take the basic tools and get the job done quickly.

Thanks. Sometimes the basic tools are the best.

I've tweaked the batch file a bit:

Code:
shadowspawn "C:\Users\[Profile Name]\AppData" Q: robocopy Q:\ "D:\Files and Folders\AppData" /s /xa:st /xj /xd "*cache*" "temporary internet files" "temp" "tmp" /xo /purge /tee /log:Q_backup_log.txt
shadowspawn "C:\Users\[Profile Name]\Downloads" R: robocopy R:\ "D:\Files and Folders\Downloads" /s /xo /xj /purge /tee /log:R_backup_log.txt
shadowspawn "C:\Users\[Profile Name]\Desktop" S:\ robocopy S:\ "D:\Files and Folders\Desktop" /s /xo /xj /purge /tee /log:S_backup_log.txt
shadowspawn "C:\Users\[Profile Name]\Documents" T:\ robocopy T:\ "D:\Files and Folders\Documents" /s /xo /xj /purge /tee /log:T_backup_log.txt
pause

with an eye toward shrinking the backup set, and then syncing the files on D:\ to online backup. Part of what I'm trying to do is reduce my bandwidth use. I figure I really don't need to retain the deleted files in this backup because I also have the clone and the image. If I accidentally delete something I can get it from the clone, and if I want an older version I can get it from the image.

In other words, this backup is intended as an absolute last resort if everything else fails. That would mean the place burns to the ground, both the external clone and the Macrium image on the ioSafe are destroyed, and the company can't recover the data. If all those things happen, my most important files would still reside somewhere online in their most recent states.

The reason I don't use real-time online backup, by the way, is because I edit a lot of pictures and videos from clients, and real-time backup backs them up every time I touch them. When you're talking about multiple files in the hundreds of MB (plus the resulting projects) being re-uploaded every time they're touched, the bandwidth use starts getting ridiculous. Even with an 800 GB cap, I've come close to busting it a few times.

Doing it this way gives me some control. Almost all my projects are completed in one day, so I can download the source files as I need them, delete them once I'm done with them, delete the resulting projects once they're uploaded, and then run the online sync at night once all that stuff has been deleted. The source files would still be on the FTP server if they're ever needed, and the finished files would be on the Web server and backed up from there.
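
Scheduling the nightly run is just a Task Scheduler job pointed at the batch file. Something along these lines works from an elevated prompt (the task name and script path are placeholders; /rl HIGHEST matters because creating the shadow copies needs admin rights):

Code:
REM task name and script path are placeholders for whatever you actually use
schtasks /create /tn "Nightly File Sync" /tr "C:\Scripts\file_backup.bat" /sc daily /st 23:30 /rl HIGHEST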

Rich
 
I'm glad that you like Google and that Google Drive is working out for you. :)

Rich

I really just use it for photos, but it works well... unlimited and free.

I have tons of experience with robocopy scripts too. If you need compression, you can do it with PowerShell to create zip archives.
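
For example, something like this from a batch file or a command prompt will zip a folder into an archive. Compress-Archive needs PowerShell 5.0 or later, and the paths here are just examples:

Code:
REM paths are examples only; -Force overwrites an existing archive
powershell -NoProfile -Command "Compress-Archive -Path 'D:\Files and Folders\Documents' -DestinationPath 'D:\Archives\Documents.zip' -Force"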

Also check out SyncToy. It's a free, lightweight Microsoft utility for syncing folders with decent filters and scheduling.
 
We've been a "we don't back up desktops" shop for a long time. If you want it backed up, put it on the NAS. (It backs up to Amazon S3 nightly.)

Some of us run local stuff to local USB drives for speed of recovery, but we're the exception.
 
We've been a "we don't back up desktops" shop for a long time. If you want it backed up, put it on the NAS. (It backs up to Amazon S3 nightly.)

Some of us run local stuff to local USB drives for speed of recovery, but we're the exception.

I never really could understand that. Even if the user runs everything on the server, it still takes the better part of a day to set up a Windows workstation in the event of a hard drive crash or hosed system by the time you factor in installing the OS, installing the bazillion updates to the OS, joining to the network, configuring whatever user settings the machine needs, and installing whatever software has to run locally.

Daily cloning, on the other hand, is cheap, easy, and reduces downtime to minutes. I wouldn't recommend it as a sole backup solution, but having fresh clones has made my life and my clients' lives easier many, many times.

Rich
 
I never really could understand that. Even if the user runs everything on the server, it still takes the better part of a day to set up a Windows workstation in the event of a hard drive crash or hosed system by the time you factor in installing the OS, installing the bazillion updates to the OS, joining to the network, configuring whatever user settings the machine needs, and installing whatever software has to run locally.

Daily cloning, on the other hand, is cheap, easy, and reduces downtime to minutes. I wouldn't recommend it as a sole backup solution, but having fresh clones has made my life and my clients' lives easier many, many times.

Rich

Nah... images. Restore to base company image, change a small list of settings... we keep the images up to date. Plus, we can (when we are ahead of the hardware ordering curve) have a couple of spare laptops ready to go. About 15 min to turn it into "their" machine and AD integration reconnects all the server/network stuff at login.

We would do full desktop images, but the local storage requirements are massive with most machines having a 1TB drive in them these days.

The vast majority of our users can plop down at any machine and log into the domain and fire up their fave mail client and be working, as long as they keep their "stuff" on either the group shares on the NAS or in their own personal space there.

What we essentially did was model the internal off of "cloud" -- that way as we actually move more and more toward "cloud" it really doesn't change workflow much. Grab any machine and hook it to the Domain and you'll be 90% "there" without doing anything to it. If you can remember all of your passwords... which is another thing that's slowly creeping across all of our services, single sign on. Done for 80% of those also.

DEVELOPERS are a different story. We'll buy them pretty much whatever they want for local backup because them taking three days to re-set up their own personal dev environment is a timeline disaster for the business whenever it happens.

They also get new laptops in "pair" mode ... whenever we have new hardware for them, they get about a month with both the old and the new so they can slowly set up their new one before having to return the old one to our hardware stock, where it usually gets reused immediately.
 
SyncBack, Beyond Compare or Second Copy. SyncBack can run as a service, and has MANY options.
 
[snip]
DEVELOPERS are a different story. We'll buy them pretty much whatever they want for local backup because them taking three days to re-set up their own personal dev environment is a timeline disaster for the business whenever it happens.

They also get new laptops in "pair" mode ... whenever we have new hardware for them, they get about a month with both the old and the new so they can slowly set up their new one before having to return the old one to our hardware stock, where it usually gets reused immediately.

This is a good way to handle developers. They do have unique requirements. And it's not just because they're prima donnas (although some are).

John
 
SSD drives are inexpensive and fast today. I don't use any "formal" backup software. Every once in a while I'll just image an SSD drive, in minutes.

It is super easy to just swap in the new drive, if ever needed.
 
SSD drives are inexpensive and fast today. I don't use any "formal" backup software. Every once in a while I'll just image an SSD drive, in minutes.

It is super easy to just swap in the new drive, if ever needed.

A decent strategy. Do be aware of what you have and what you don't with this approach, however. I spent a decade in the computer storage and backup business (architecting storage systems and writing shrink-wrapped backup code). Backup systems are sold to handle catastrophic hardware failure (what I used to call the "small meteor through the server room" scenario), but what they're actually USED for most often is the "Dang! I wish I hadn't deleted that." You're not getting the history with the images. Create a file a week ago, delete it yesterday and then decide you need it? You're toast.

John
 
SSD drives are inexpensive and fast today. I don't use any "formal" backup software. Every once in a while I'll just image an SSD drive, in minutes.

It is super easy to just swap in the new drive, if ever needed.

Granted my system is for home use, but when I upgraded to a SSD I put the old HD in the safe just in case. It might need some updating, but all the basics are still there.
 
Nah... images. Restore to base company image, change a small list of settings... we keep the images up to date. Plus, we can (when we are ahead of the hardware ordering curve) have a couple of spare laptops ready to go. About 15 min to turn it into "their" machine and AD integration reconnects all the server/network stuff at login.

We would do full desktop images, but the local storage requirements are massive with most machines having a 1TB drive in them these days.

The vast majority of our users can plop down at any machine and log into the domain and fire up their fave mail client and be working, as long as they keep their "stuff" on either the group shares on the NAS or in their own personal space there.

What we essentially did was model the internal off of "cloud" -- that way as we actually move more and more toward "cloud" it really doesn't change workflow much. Grab any machine and hook it to the Domain and you'll be 90% "there" without doing anything to it. If you can remember all of your passwords... which is another thing that's slowly creeping across all of our services, single sign on. Done for 80% of those also.

DEVELOPERS are a different story. We'll buy them pretty much whatever they want for local backup because them taking three days to re-set up their own personal dev environment is a timeline disaster for the business whenever it happens.

They also get new laptops in "pair" mode ... whenever we have new hardware for them, they get about a month with both the old and the new so they can slowly set up their new one before having to return the old one to our hardware stock, where it usually gets reused immediately.

Ah, okay. That makes sense.

Rich
 
A decent strategy. Do be aware of what you have and what you don't with this approach, however. I spent a decade in the computer storage and backup business (architecting storage systems and writing shrink-wrapped backup code). Backup systems are sold to handle catastrophic hardware failure (what I used to call the "small meteor through the server room" scenario), but what they're actually USED for most often is the "Dang! I wish I hadn't deleted that." You're not getting the history with the images. Create a file a week ago, delete it yesterday and then decide you need it? You're toast.

John

That's actually one thing I like about Macrium Reflect: Using a typical Grandfather / Father / Son scheme, the images are easily browsable in Explorer at any point for as far back as the retention time of the full backups. It just mounts the selected image as a drive. For imaging and cloning, it's the best backup software I've used.

The clone, on the other hand, has no history unless the source drive itself had some sort of versioning enabled. Its purpose is to reduce downtime, not for backup per se. And frankly, I'm not even sure how much time it saves these days with SSD drives and fast interfaces like SATA and USB 3. I could restore my hard drive from a Macrium image in about half an hour nowadays.

The file backup function in Macrium also works well, but it's not exactly what I'm looking for with this backup. This one is the Doomsday Backup. Once it's online, I want to be able to access it from anywhere, using any device, without installing any software. So zipping and encrypting would be okay for the version that gets stored online, but not much more than that.

I admit to being a little obsessive about backups. I used to tell my clients that when I was helping them make disaster plans, my job was to be a professional pessimist with a doomsday mentality and more than a touch of OCD.

Rich
 
This is a good way to handle developers. They do have unique requirements. And it's not just because they're prima donnas (although some are).

John

That they do.

I must admit, they're the happiest about maybe moving the company to GSuite for mail, though, and they'll require the least handholding.

The only part they'll hate is the password sync from the domain. Whenever it forces a change to the password they all have like five devices that'll all start screaming in unison to have their passwords updated. LOL. Domain, Mail/GSuite, VPN, you name it.

Maybe we can get a better SSO system with keys working after this. Ha.
 
Amazon S3/Glacier and Cloudberry backup.
 
Amazon S3/Glacier and Cloudberry backup.

S3 will more than likely be where they go since I already use it for the CentOS machines and it's been trouble-free. How they get there is still under consideration. I'll look into Cloudberry, along with the others people have mentioned. Thanks.

I had been using BackBlaze, which has been splendidly reliable, to back up the desktop machine; but their client won't do what I want it to do, which is to exclude C:\ and just back up D:\. But they're looking into a possible workaround. Or so they told me.

Using BackBlaze, I actually could exclude all the files on C:\, but not the drive itself. But unless I renamed the files on D:\ to something other than what they're called on C:\ (I suppose zipping them would also work), they wouldn't be backed up from D:\ either. It's really not designed with granularity in mind.

Rich
 
Amazon S3/Glacier and Cloudberry backup.

Actually, BackBlaze just got back to me with a recommendation that I try their B2 cloud backup service rather than the Personal Backup service. They're much cheaper than AWS, Google, etc., and in my experience they're a solid company. I've used and sold their Personal Backup service for years.

If I were a brighter man, I would have checked their site first to see if they offered an alternative service, rather than an alternative client. Sigh...

Rich
 
I use Crash Plan.

Set it and forget it. Free version backs up to any local drive while the paid version will do it to Crash Plan's servers.

The paid version failed for me about three years ago, and then I quit them. Their servers crashed the one night that I needed my backup files. I called them, and they were worthless -- a pure Mickey-Mouse operation.

I then switched to Carbonite for cloud backups, which hasn't failed me yet. And I can access my files easily, from anywhere. I just hope that they are secure -- one never knows.
 
The paid version failed for me about three years ago, and then I quit them. Their servers crashed the one night that I needed my backup files. I called them, and they were worthless -- a pure Mickey-Mouse operation.

I then switched to Carbonite for cloud backups, which hasn't failed me yet. And I can access my files easily, from anywhere. I just hope that they are secure -- one never knows.

I use them to back up to my own RAID server and a mirrored drive. I have triple copies and can survive a double drive failure. I don't use their servers, but good to know.
 
S3 will more than likely be where they go since I already use it for the CentOS machines and it's been trouble-free. How they get there is still under consideration. I'll look into Cloudberry, along with the others people have mentioned. Thanks.

I had been using BackBlaze, which has been splendidly reliable, to back up the desktop machine; but their client won't do what I want it to do, which is to exclude C:\ and just back up D:\. But they're looking into a possible workaround. Or so they told me.

Using BackBlaze, I actually could exclude all the files on C:\, but not the drive itself. But unless I renamed the files on D:\ to something other than what they're called on C:\ (I suppose zipping them would also work), they wouldn't be backed up from D:\ either. It's really not designed with granularity in mind.

Rich

I'm amazed in 2016 Windows is even still stuck with the whole "C:" thing.

The rest of us have been naming our disks whatever we like for decades.
 
I'm amazed in 2016 Windows is even still stuck with the whole "C:" thing.

The rest of us have been naming our disks whatever we like for decades.

Meh. Of all the complaints I could come up with about Microsoft and Windows, that's pretty low on the list.

Rich
 
Meh. Of all the complaints I could come up with about Microsoft and Windows, that's pretty low on the list.

Certainly not a show-stopper until you run into some doof who hard-coded it into their backup software. :)

I just find it odd that with all the new tech in BIOS and bootstrapping, even in PeeCees, that we're still pretending the hard disk is the third drive we bought and added to our spiffy new dual-floppy PC-XT from the CompuHut.

When even back then, the Unixy and Vaxen and other bigger toys weren't doing that.

It's like a vestigial tail on that bottom vertebra. Ha... one of these days a PeeCee will be born without it. :)
 
I use Crash Plan.

Set it and forget it. Free version backs up to any local drive while the paid version will do it to Crash Plan's servers.

Same here. Paid version covers up to 10 computers iirc. I'm happy with it.
 
So here's what I have working as of now. It may be permanent.

1. ShadowSpawn creates shadows.
2. Robocopy copies the new or changed files to the ioSafe.
3. rclone uploads them to a bucket on BackBlaze's B2 service.

I tried some more graphical clients, and I was happy with none of them. One of them especially annoyed me because four hours after I started it, it was still "preparing" to upload. Preparing what? It's a file sync. There's nothing to prepare.

Somewhere in the course of my frustration, I came across rclone, which I'd never heard of before. It's a command-line tool along the lines of rsync, but purpose-built for cloud storage. It's a very nice little tool. After about 10 minutes learning how to use it, I plugged it into the batch file; and it's so far uploaded about 30 GBytes of data with nary a fart, belch, or hiccup.

The only problem was that the default of four simultaneous connections was too few and too slow, so I increased it to 24, which caused some socket errors. So I backed it down to 16. That seems to be the sweet spot for my particular connection.
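
For anyone curious, the upload step in the batch file boils down to one line (the remote name and bucket are placeholders; you define the remote ahead of time with rclone config):

Code:
REM b2backup = remote name from rclone config, my-bucket = B2 bucket (both placeholders)
rclone sync "D:\Files and Folders" b2backup:my-bucket --transfers 16 --log-file=rclone_backup_log.txt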

I also tweaked the batch file again to further reduce the stuff being backed up. All the application data I need is in Roaming, for example. The rest is just useless ******** that Windows puts in there whether you need it or not. Most of the directories outside of Roaming were empty, but excluding them speeds things up because they don't have to be read and parsed every time the batch file runs.
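
One way to do that is just to point the AppData line at Roaming rather than the whole AppData tree (same [Profile Name] placeholder as before); excluding Local and LocalLow with /xd would work just as well:

Code:
REM same placeholders and switches as the earlier blocks, narrowed to Roaming
shadowspawn "C:\Users\[Profile Name]\AppData\Roaming" Q: robocopy Q:\ "D:\Files and Folders\AppData\Roaming" /s /xa:st /xj /xd "*cache*" "temporary internet files" "temp" "tmp" /xo /purge /tee /log:Q_backup_log.txt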

The script has been running flawlessly for almost 16 hours, but it's the first backup. I'm curious how long the subsequent ones will take.

As an aside, I actually was willing to pay for software to do this. But everything I tried sucked, in my opinion. This is an extremely simple task, and none of them did it efficiently. So I wound up using three free command-line programs, and so far, it's doing exactly what I want it to.

Rich
 
An update: After 24 hours and ~40 GB of data, rclone stopped uploading. The cause was that Backblaze's token had expired, and rclone didn't have the presence of mind to request a new one. It didn't crash. It just sat there providing the error message for me, which was helpful. So I stopped the process and restarted it without running the preceding scripts (ShadowSpawn and Robocopy). It scanned the source and destination for about five minutes, then picked up where it left off.

When I have a moment I'll check to see if that's a known issue with rclone and Backblaze, and if not, I'll file a bug report. But it's not a deal killer for me because this should be the last time I have to upload so much data that it will need to be connected that long. I thought I'd mention it, though.

EDIT: I suppose I could also script an instruction to request a new token (or whatever it is that Backblaze calls it) and resume if rclone throws that particular error. But I'd have to remember, or otherwise dig up, the exact wording. It was before my morning coffee and I neglected to write it down. Grrr.
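
For now, a cruder approach that doesn't need the exact wording is to loop on rclone's exit code, since it exits non-zero whenever the sync doesn't finish. A rough sketch, with the same placeholder remote, bucket, and paths as above:

Code:
:retry
rclone sync "D:\Files and Folders" b2backup:my-bucket --transfers 16 --log-file=rclone_backup_log.txt
REM any non-zero exit code: wait a minute, then try the sync again
if errorlevel 1 (
    timeout /t 60
    goto retry
)

It would spin forever on a persistent failure, so a retry counter would be smarter, but it covers the expired-token case.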

Rich
 
EDIT: I suppose I could also script an instruction to request a new token (or whatever it is that Backblaze calls it) and resume if rclone throws that particular error. But I'd have to remember, or otherwise dig up, the exact wording. It was before my morning coffee and I neglected to write it down. Grrr.

Heh. Trap for young players. You know better than to go to the computer room before the coffee pot! ;)
 