Advice on massive file copies...
chris at tincreek.com
Tue Dec 31 09:14:57 MST 2019
On Tue, Dec 31, 2019 at 8:22 AM John Von Essen <john at essenz.com> wrote:
> So I have been tinkering with this for a few days and keep hitting issues.
> Because my wife takes an absurd amount of photos, over the years we have
> built up an insanely large amount of data, about 800GB. It's mostly photos
> and movies, but also docs and old backups of old computers. New data gets
> into the QNAP NAS by way of my iMac, i.e. I attach my wife's phone or Canon
> camera or USB stick, then copy the new files into the QNAP mount on the
> iMac. That copy is manual, by click and drag, or if it's a lot of stuff I
> just use cp in a terminal window.
> The QNAP is 2-Bay NAS with 2x2TB SATA drives in Raid 1.
> I don't feel super confident with the QNAP as the sole source, so I want a
> backup of the backup.
> To do this, I have a home linux server with a single 2TB drive. On that
> linux server I have mounted the QNAP (SMB/CIFS) at /nas and /data is my
> local 2TB drive on that server.
It's been a long time since I've done anything serious with rsync, but it
should do what you want. Have you looked into running rsync as a
service/daemon on one of the devices and then using the rsync client on the
other side to sync to it? I believe that lets each end hash its files
locally, so no file data has to be read across the network just to
determine whether anything has changed.
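A minimal sketch of that setup, assuming the daemon runs on the Linux server exposing /data (the module name "backup" and the QNAP-side source path are illustrative choices, not from this thread):

```shell
# --- on the Linux server ---
# /etc/rsyncd.conf: expose /data as a module named "backup"
# (hypothetical module name; pick whatever you like)
#
#   [backup]
#       path = /data
#       read only = false
#
# Start the daemon (listens on TCP 873 by default):
rsync --daemon

# --- on the QNAP (or wherever the source data lives) ---
# Push the data to the server's module. Each side checksums its own
# files locally, so unchanged files are skipped without their contents
# crossing the network. Replace the source path with your actual share.
rsync -av --delete /path/to/share/ rsync://server/backup/
```

The trailing slash on the source matters: with it, rsync copies the *contents* of the directory into the module; without it, it creates the directory itself inside the destination.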
More information about the PLUG