Good backups being the operative phrase here, Rich. Explain to me the point of backing up 1 TB of information twice if nothing has changed. Are you doing it just for the sake of doing it?
Now, if the OP is doing a full backup at the beginning of the week, for example, and an incremental the next day that backs up the 1 KB file that changed, awesome. Backing up everything daily for no reason is just a waste of time. Having an external hard drive that starts the chain and a cloud that houses the incrementals is also a bad idea: lose the EHD, lose the whole chain.
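For what it's worth, here's a rough sketch of the kind of schedule I mean, assuming a Windows box with robocopy and made-up paths (D:\Data and E:\Backups are placeholders, not anyone's actual setup):

    rem Sunday: full backup. /E copies every subfolder, including empty ones.
    robocopy "D:\Data" "E:\Backups\Full" /E /R:2 /W:5

    rem Monday through Saturday: incremental. /M copies only files whose
    rem Archive attribute is set (i.e., changed since the last run) and then
    rem clears it, so tomorrow's run only picks up tomorrow's changes.
    robocopy "D:\Data" "E:\Backups\Incremental-%DATE:/=-%" /E /M /R:2 /W:5

The point stands either way: the full set and the incrementals have to survive together, or the chain is worthless.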
I think it's also important to note that the "external hard drive" or file-based backups that most everyone has been talking about only really apply to exactly that... files. And if that's all that matters, why are you even using a server? You shouldn't need a full-blown server for file storage. There are about a dozen other ways to do that better these days, with less power, overhead, etc. So I'll bet that you're using the server for more than just file storage. And if that's the case, file-based backups only go so far.
If you are backing up application files with an external hard drive, you are doing it wrong. Image-based backups are better, but what are you going to do with those if the ###$ hits the fan? You still need a host, even if that host lives in the cloud (Cloudberry or Datto, for example).
Also, I'd be wary of any company out there promising a fast disaster-recovery SLA without any proof. It's easy for a company to say, "Oh yeah, we'll have your data restored on a server and back to you in 48 hours." OK... show me. For those of you that use those companies, have you ever TESTED that? I used to run through annual (or more frequent) fire drills where we would actually take the data we had and see how fast we could make it available for the client. Some applications are super complex, and even after a restore they can require support from third parties before they're fully usable again. Do they factor that in? What if a fire knocks out three of their clients at the same time? Are they going to have you up and running in 48 hours as originally promised? Are they staffed for that?
I know the OP's question was really simple, but it's a bit scary reading what some of you are doing for backups...
I've experienced, either directly or as a consultant to someone else, every kind of data-loss event short of a nuclear blast. Yet I have never permanently lost any of my own data, and no client who was already a client of mine when their sundry disasters struck has ever permanently lost theirs. That's because of my scary, admittedly OCD backup strategies. Planning for impossible things to happen, and for fail-safe devices to fail, has bailed many asses out of many fires for many years.
Your apparent dislike of EHDs and NASes is mystifying to me, especially in Dave's case. He has a crappy, unstable DSL connection that rules out any form of online backup. So what's left as a destination for Dave's data? Only an EHD or a NAS, unless you want to get into old-school tape drives or network backup to another building that's close enough to run cable to or reach over WiFi.
An EHD or NAS is a perfectly suitable destination for both file backups and image backups. (A dedicated EHD would also be a perfectly good destination for a clone, if desired.) If it's a file backup, nothing special is needed to retrieve it. If it's an image and you're using the right software, that software can simply be installed on any available PC if all that's desired is to recover the data as opposed to the entire HDD image. Yes, you need a host. But that host can be any computer if all you want is the data.
But why limit yourself to one or the other? The EHD can store both image and file backups, so just do both.
Because of the lack of a viable online option in Dave's case, some degree of protection of the EHD/NAS against fires and floods would also be desirable. Hence my suggestion of the ioSafe, which is as fire- and flood-resistant as you're going to get for a reasonable price. Both file backups and images can be stored on the ioSafe drive and be immediately, or almost immediately, available in a disaster. The odds are overwhelmingly good that the data the ioSafe houses will survive a fire or flood, and that backup will be the most current one.
In a normal situation, I would also use some software to back up the backups on the ioSafe drive to an online backup service. I don't mind command-line tools, so I just use Robocopy and Rclone for that part, with flags to copy only the changes. But practically any GUI-based backup software can do the same thing.
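To make that concrete, here's roughly what that part can look like. The drive letter, folder names, and the rclone remote called "offsite" are placeholders for illustration, not my or Dave's actual setup:

    rem Keep the local copy on the ioSafe current. /MIR transfers only what
    rem changed and removes files that were deleted from the source.
    robocopy "D:\Data" "E:\Backups\Data" /MIR /R:2 /W:5 /LOG+:"E:\Backups\robocopy.log"

    rem Push the ioSafe copy to the online host. rclone sync likewise only
    rem transfers files that are new or have changed.
    rclone sync "E:\Backups\Data" offsite:backups/data --log-file "E:\Backups\rclone.log"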
The reason for backing up the ioSafe is to create the doomsday backup in case the ioSafe, against all odds, does **** the bed along with its drives and the company can't recover the data, or in case Dave's computer and the ioSafe are both stolen. With versioning on the remote host, the online copy also provides additional protection should the local machine become infected with ransomware that's smart enough to encrypt the files on the ioSafe.
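If the remote host doesn't offer versioning natively, rclone can approximate it. Again, the remote name and paths below are placeholders, just a sketch:

    rem Before overwriting or deleting anything on the remote, move the old
    rem copy into a dated archive folder instead of destroying it, so a file
    rem that ransomware has encrypted locally can't silently replace the only
    rem good copy online.
    rclone sync "E:\Backups\Data" offsite:backups/current --backup-dir offsite:backups/archive/%DATE:/=-% --log-file "E:\Backups\rclone.log"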
Dave, however, has a ****ty Internet connection, so online backup is out. That limits his doomsday backup options to either EHDs or some LAN scheme to another building for fire and flood protection.
I'm not sure what you find scary about that, so please advise. I can think of other options that are as good, but I fail to get the scary part. Thanks.
Rich