Network Attached Storage components

You should be able to give it a static IP in your router… I’ve had to do that with my Mac Mini because I run a Calibre server on it, and it’s easier than trying to work out what its address has changed to each time.

1 Like

Indeed they do, sometimes more than one, sometimes any story one wants - sometimes they even suggest/reveal the truth about someone’s claims. All I said is “I take no comfort in statistics” …

3 Likes

“There are three kinds of lies: lies, damned lies, and statistics.” [attributed to Disraeli]

“Then I’ll get on my knees and pray
We don’t get fooled again” [Townshend]

“I will reduce your taxes, see, here are the figures …” [any politician you run into this month]

3 Likes

As stated by @SueW, you can fix the address in the router/modem for the device. This will stop the changing IP address issue. It is likely that the router is reissuing a new IP address when the “lease” expires, typically about 24 hours after allocation via DHCP (the router dynamically assigns addresses to connected devices as needed).
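
If you point backup scripts at the NAS by address, a quick sanity check can save some head-scratching when a lease does roll over. A minimal Python sketch, assuming a hypothetical hostname and reserved address (both made up here) and that the name actually resolves on your LAN:

```python
# Minimal sketch: confirm the NAS is still at the address reserved for it in
# the router. Hostname and IP below are examples, not real values.
import socket

NAS_HOSTNAME = "mynas.local"   # hypothetical NAS hostname
EXPECTED_IP = "192.168.1.50"   # the address reserved for it in the router

def check_nas_address():
    try:
        current_ip = socket.gethostbyname(NAS_HOSTNAME)
    except socket.gaierror:
        print(f"Could not resolve {NAS_HOSTNAME} - is the NAS on the network?")
        return
    if current_ip == EXPECTED_IP:
        print(f"{NAS_HOSTNAME} is still at {EXPECTED_IP}; the reservation is holding.")
    else:
        print(f"{NAS_HOSTNAME} is now at {current_ip}; check the DHCP reservation in the router.")

if __name__ == "__main__":
    check_nas_address()
```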

When the power fails or the network fails, the NAS box will work hard to ensure the data it is storing is correct. It will run diagnostics, compare data, run error correction and update changed data, and all of this can take a very long time. If you want to keep that to a minimum, buy a reasonable UPS (Uninterruptible Power Supply) and put your modem/router and NAS device on it.

Maybe. Some dredging through the details shows different failure rates for different HDD models; even if the volume smooths out scatter from individual events, I would expect to see more shelves and discontinuities in the curve as different components are layered over each other. As it is, the shape is just so textbook. The solution would be to download the data and process it yourself. I am not that keen.
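
For anyone who is keen, here is roughly what “process it yourself” might look like in Python, assuming the data comes as daily per-drive CSV records with model and failure columns (the layout the public drive-stats downloads use); the filename is a placeholder:

```python
# Rough sketch: annualised failure rate (AFR) per drive model from daily
# per-drive records. Assumes a CSV with at least "model" and "failure"
# columns (failure is 0 normally, 1 on the day a drive dies), one row per
# drive per day.
import csv
from collections import defaultdict

def afr_by_model(csv_path):
    drive_days = defaultdict(int)   # rows seen per model = drive-days in service
    failures = defaultdict(int)     # failure events per model
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            model = row["model"]
            drive_days[model] += 1
            failures[model] += int(row["failure"])
    # AFR = failures / (drive-days / 365), expressed as a percentage
    return {m: 100 * failures[m] / (drive_days[m] / 365.0)
            for m in drive_days if drive_days[m] > 0}

if __name__ == "__main__":
    for model, afr in sorted(afr_by_model("drive_stats.csv").items(),
                             key=lambda kv: kv[1], reverse=True):
        print(f"{model:30s} {afr:5.2f}% AFR")
```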

Touch wood, I have never had a hardware failure of any kind - not a disk, not the server, no fires. (I do however monitor SMART data, and upgrade/replace hardware from time to time - and I do backups anyway.)

My point, perhaps flippantly made, is to get people to think about what scenarios mirroring will protect you against and what scenarios it won’t protect you against.

Plenty of houses have burnt to the ground - no examples needed. With mirroring that will toast both copies of your files.

The other general point is that disk redundancy will faithfully propagate various problems to both (all) disks. If you delete a file accidentally, it is gone from both disks. If an application error causes the contents of a file to become corrupted internally, that will be faithfully propagated to both disks. If ransomware encrypts all your files, that will be faithfully propagated to both disks.

So if you can only afford to buy 2 disks, then buy 1 disk to use for storage and 1 disk to use as an external backup - rather than buying 2 disks to use as a mirrored storage pair.

1 Like

The risk exists but is minimal as I live in a steel/concrete/hardiplank house with functional fire alarm and extinguisher.

I will have two copies on the NAS, one on the working PC and one on DVD or portable HDD (although this may lag the rest at times) outside the building. For a photo collection (not the heart of a business) I think that is enough redundancy. The point about the mirrored NAS backup is that it adds a layer of redundancy and can be automated, so I will not suffer a different class of failure, that is, failure to do my backups at all. In my experience the warmware fails more often than the hardware.

As I will not be working on the NAS backup copies, I cannot see how an accidental deletion or a file corrupted by an application error can affect them. I thought that one reason for having backups was to protect you from those kinds of loss. Please correct me if I am wrong.

My comments were of a general nature i.e. applying to all types of dwellings (including those with plastic cladding, legal or otherwise, etc.).

In agreement there. This is a genuine issue. One of the best ways of ensuring that backups actually get done is ensuring that they are fast and easy.

If external backups must be automated then “cloud” may be the best option - provided that privacy risks are understood and, where desired, managed.

In addition, “cloud” has the vulnerability that malicious software may automatically target the cloud copy.

Corruption by application error can be an issue if it is not detected before the NAS copy is overwritten. That could apply to a cloud / external copy too but can be mitigated with “time capsule”-like functionality. A basic disk mirror (what I was talking about) is unlikely to provide this richness of functionality. With your specific arrangement (master copy is on PC?, NAS is backup and NAS has mirroring) you could presumably arrange that functionality - since it depends on what the PC is capable of, not what the NAS is capable of.
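
To make the “time capsule” idea concrete, here is a minimal Python sketch of versioned snapshots: each run writes a new dated folder, copies changed files, and hard-links unchanged ones to the previous snapshot so an accidental overwrite or corruption upstream does not destroy older versions. The paths are assumptions, hard links need a filesystem that supports them, and real tools (the NAS vendor’s snapshot apps, rsync --link-dest and the like) do this far better:

```python
# Minimal "time capsule" style snapshots: each run makes a new dated folder;
# unchanged files are hard-linked to the previous snapshot (near-zero extra
# space), changed/new files are copied. Paths below are examples only.
import filecmp, os, shutil, time

SOURCE = "/home/me/photos"          # assumed source folder
BACKUP_ROOT = "/mnt/nas/snapshots"  # assumed destination (a mounted NAS share)

def snapshot(source=SOURCE, root=BACKUP_ROOT):
    snaps = sorted(os.listdir(root)) if os.path.isdir(root) else []
    previous = os.path.join(root, snaps[-1]) if snaps else None
    target = os.path.join(root, time.strftime("%Y-%m-%d_%H%M%S"))
    for dirpath, _dirs, files in os.walk(source):
        rel = os.path.relpath(dirpath, source)
        os.makedirs(os.path.join(target, rel), exist_ok=True)
        for name in files:
            src = os.path.join(dirpath, name)
            dst = os.path.join(target, rel, name)
            old = os.path.join(previous, rel, name) if previous else None
            if old and os.path.isfile(old) and filecmp.cmp(src, old, shallow=False):
                os.link(old, dst)       # unchanged: share the bytes with the last snapshot
            else:
                shutil.copy2(src, dst)  # new or changed: store a fresh copy
    return target

if __name__ == "__main__":
    print("Snapshot written to", snapshot())
```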

By ‘time machine’-like, do you mean using the incremental backup options that offer an appended-file strategy? These can provide the ability to recover earlier versions, rather than overwriting a backup. I’ve used that strategy effectively, with the added use of separate external removable drives holding complete system disk images and duplicate copies of all data at regular intervals.

The idea of using the cloud for data storage may be out of practical reach for many? Thank you NBN Co and the LNP.

I have made the effort with copies of some key documents held in a cloud system - Dropbox and Norton. It takes time. We have never had much more than 500+ kbps upload speed to rely on, i.e. about 200 MB per hour of upload! It is something to leave running overnight, after you remember to leave the laptop in ‘always on’ power mode. It is really only practical for adding new files as they are created.
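
For anyone wanting to run the same sums on their own link, the arithmetic in Python (the 500 kbps figure is from above; the 80% efficiency factor and the 200 GB collection size are assumptions):

```python
# Back-of-envelope upload times for a slow uplink.
def upload_hours(size_gb, link_kbps, efficiency=0.8):
    """Hours to push size_gb up a link_kbps uplink at the given efficiency."""
    bytes_per_second = link_kbps * 1000 / 8 * efficiency
    return size_gb * 1_000_000_000 / bytes_per_second / 3600

mb_per_hour = 500 * 1000 / 8 * 3600 / 1_000_000        # ~225 MB/h raw at 500 kbps
print(f"Roughly {mb_per_hour:.0f} MB per hour raw, ~{mb_per_hour * 0.8:.0f} MB/h with overheads")

hours = upload_hours(200, 500)                          # assumed 200 GB collection
print(f"A 200 GB photo collection: about {hours:.0f} hours (~{hours / 24:.0f} days)")
```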

That is what I am suggesting. Most people wouldn’t have enough media and wouldn’t want to put in the time and effort to create a sufficiently large number of complete disk images on external drives - hence the desirability of “incremental” backup.

Regardless of that, it is essential to have a minimum of two complete disk images on external drives (so that one is safe while the other is being written).

Regrettably true. The initial upload will take ‘forever’. It isn’t going to be viable to back up a video collection to the cloud, but then videos aren’t likely to be changed after initial creation - so other options are available. For example, you might use an external disk for the complete disk image and the cloud for the changes. The backup process and the restore process are more complex.

(If you’re doing real video editing work then you probably need FTTP.)

1 Like

Perhaps differential would be better than incremental in the scenario of a recovery need on the parent machine, if using “at hand” backup media. A bit more space is required than for incremental, but recovery is faster when it is needed. Incremental from the cloud might work better in the case of slow internet, as each incremental part would be smaller than a differential, but complete backups, even over what may seem like fast FTTP internet, can still be very time consuming. Not having a complete backup off site is still not best practice, though, so splitting the two pieces is still not really good security.
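
A small sketch of the trade-off, if it helps: restoring from differentials only ever needs the last full backup plus the newest differential, whereas restoring from incrementals needs the last full plus every incremental taken since. The backup names below are made up:

```python
# Illustration of the restore-chain difference between differential and
# incremental schemes. "full", "diff" and "inc" entries are examples, newest last.
def restore_chain(backups, scheme):
    """Return which backup sets must be read to restore the latest state."""
    last_full = max(i for i, (kind, _) in enumerate(backups) if kind == "full")
    chain = [backups[last_full]]
    if scheme == "differential":
        diffs = [b for b in backups[last_full + 1:] if b[0] == "diff"]
        if diffs:
            chain.append(diffs[-1])   # only the newest differential is needed
    else:  # incremental
        chain += [b for b in backups[last_full + 1:] if b[0] == "inc"]  # every one since the full
    return chain

weekly_diff = [("full", "Sun"), ("diff", "Mon"), ("diff", "Tue"), ("diff", "Wed")]
weekly_inc  = [("full", "Sun"), ("inc", "Mon"), ("inc", "Tue"), ("inc", "Wed")]
print("Differential restore reads:", restore_chain(weekly_diff, "differential"))  # 2 sets
print("Incremental restore reads: ", restore_chain(weekly_inc, "incremental"))    # 4 sets
```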

As to needing FTTP, that’s a problem for the many who land in FTTN, some in FTTC and anyone in Satellite or Wireless NBN terrortory (deliberate misspell) - not because of their choice but a political one. It could also be extremely costly to obtain FTTP afterwards, either by moving residence or by paying for the upgrade.

A complete backup to external disk can be off site - but see my comment about having a minimum of two of them.

This was what I was referring to… the more complex you make the chain that is needed to fix an issue, the greater the risk of that chain failing.

It is simpler and more secure to keep each backup set in a single place/disk/tape, e.g. Cloud and at least 2 disks; 2 different cloud backups (different providers) and at least 1 disk; at least 3 disks; a tape backup or 2 or 3 or more plus at least a disk; or 3 or more tape backups. Keep one remote, one nearby (e.g. with a good friend or neighbour) and one at home.

If someone lacks room to save to, or has access issues such as slow internet for Cloud backup, then at least saving the important files - docs, photos, favourites, emails (if stored locally), music, keys for software - on smaller media may be a cheaper way out. Then it is a matter of restoring the operating system, installing the programs/apps and copying the important data back. Win 10 these days is pretty forgiving when it comes to re-installing it after a failure, even with hardware changes (as long as the licence has been registered to a Microsoft account). If someone is using Office 365 the same goes for that: just de-register the old machine/user and re-register the new one. I don’t know the Apple process but I am sure it is a similar process of re-authorisation.

Some people swear by a clean re-install to freshen their machines on a regular basis. Not my favourite way of cleaning, but if it works for the user it is not my place to tell them different.

2 Likes

This is something I did until Windows 10. Past versions tended to get a little… old and saggy after a while, especially if you installed and uninstalled a wide range of software, as most of it leaves traces scattered about. Windows 10 seems (so far) to be a lot more competent at coping with these traces without grinding to a halt.

The install traces are still there, but they don’t seem to impact performance as much as in previous Windows iterations.

3 Likes

I think some of you need an Ultrium 8 library for true happiness, or at least a couple of standalone drives - either way, with an offsite copy or three. Good luck finding media at the moment though, especially if you want to avoid Chinese media - you could back off to LTO-7 and go Type-M, but where’s the fun in that?

1 Like

Lol, I have used a second-hand HP LTO-6 drive (with compression on); recovery is slow in comparison to a 6 TB disk, but it really is long-lived storage.

1 Like

Here is a summary of where I wanted to go and where I went. The gear I ordered (see my post 9 days ago) has arrived and is installed.

I am a happy camper.

Computer Alliance supplied all the gear at a good price and delivered it when they said with no hassle.

The Synology NAS box was easy to set up and configure, and it all worked OOTB. One oddity is that it defaults to setting you up with their proprietary RAID system (SHR), and you have no choice of any other RAID format until after the OS is downloaded and installed. In my case it doesn’t matter, as with two identical drives it is equivalent to mirroring. I am satisfied that the software is tried and true and I am not going to be left with nothing. There are advantages in this format should I want to install larger or different drives later.
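
On the SHR point, a rough rule-of-thumb comparison in Python (the “total minus largest drive” approximation for single-redundancy SHR is an assumption for illustration only; Synology’s own capacity calculator is the authority):

```python
# Rule-of-thumb capacity comparison: single-redundancy SHR vs a plain mirror.
def shr1_usable_tb(drive_sizes_tb):
    """Approximate usable space for single-redundancy SHR: total minus the largest drive."""
    return sum(drive_sizes_tb) - max(drive_sizes_tb)

def mirror_usable_tb(drive_sizes_tb):
    """Classic mirror (RAID1): limited to the smallest drive."""
    return min(drive_sizes_tb)

print(shr1_usable_tb([3, 3]), mirror_usable_tb([3, 3]))   # 3 3 - two identical drives: same as a mirror
print(shr1_usable_tb([3, 6]), mirror_usable_tb([3, 6]))   # 3 3 - still equal with only two mixed drives
print(shr1_usable_tb([3, 3, 6]))                          # 6   - SHR pulls ahead once a third drive is added
```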

The NAS OS is fine and the apps for it (that I have seen so far - there are many) are likewise. Similarly the EaseUs backup software is good.

As to the speed question, I am getting about 90 MB/s transfer speed, a little more or less depending on file size. Whether the NAS box, HDDs or network is the limiting factor I neither know nor care. It means I can set up my backup regime to run (say during dinner) and come back and check that it completed satisfactorily before the washing up. The NAS box has a utility to back up the whole logical drive to an external HDD via its own USB 3.0 port. I am yet to buy a 3TB drive for the purpose and I guess it will be slower, but that’s OK - it will be simple enough and I can remove the external drive to another location.
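
For a sense of what ~90 MB/s means in practice (the data set sizes below are assumed examples, not figures from my setup):

```python
# How long a backup run takes at a sustained 90 MB/s over the LAN.
def transfer_minutes(size_gb, mb_per_s):
    """Minutes to move size_gb at a sustained mb_per_s."""
    return size_gb * 1000 / mb_per_s / 60

for size_gb in (100, 500, 1000):   # assumed example data set sizes
    print(f"{size_gb:5d} GB at 90 MB/s: about {transfer_minutes(size_gb, 90):.0f} minutes")
```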

So for a total cost of about $750 I have met my objective of getting the photo collection (and work PC primary partition) backed up with a fairly good level of security and little effort, meaning the backups will actually get done regularly. It sure beats feeding DVDs into the slot!

Once again, thanks to all who helped. Unless there is more to say I propose to leave it there.

2 Likes