[NTLUG:Discuss] "Undo" hard links?

Lance Simmons simmons.lance at gmail.com
Wed Nov 25 12:56:24 PST 2020


Cornelius, thank you very much! I've been so busy I didn't get to try
this until now, but it worked just fine.

I already do local backups on two different internal drives and one
external drive. I use Google Drive so that I can also have backups at work.
(I do almost all my work at home.) I also keep incremental backups at home,
but those take too much space for my free Google account. At a bare
minimum, I want backups of my documents in two different physical
locations, with at least one of them incremental. (We're talking about a
whopping 4 GB, but it's about 25 years of my life.)

Is there a good alternative to Google Drive? GNOME has built-in support for
Nextcloud, but a cursory look for Nextcloud providers didn't turn up any
cheap hosting. I'd rather not spend the time and effort to set up dynamic
DNS and run an outward-facing server.

Thank you again! This was exactly what I was looking for.

On Tue, Nov 10, 2020 at 6:29 PM Cornelius Keck <dfwuug at keck.us> wrote:

> Yes... brute force might do the trick. Make sure you have enough disk
> space. Given this scenario:
>
> pi at pi1b00:~/tmp $ ls -ls
> total 12
> 4 -rwxr-xr-x 1 pi pi  139 Nov 10 18:12 breakLinks.sh
> 4 drwxr-xr-x 2 pi pi 4096 Nov 10 18:11 D1
> 4 drwxr-xr-x 2 pi pi 4096 Nov 10 18:11 D2
> pi at pi1b00:~/tmp $ ls -ls D*
> D1:
> total 0
> 0 -rw-r--r-- 2 pi pi 0 Nov 10 18:11 f1
>
> D2:
> total 0
> 0 -rw-r--r-- 2 pi pi 0 Nov 10 18:11 f2
> pi at pi1b00:~/tmp $
>
> of two subdirectories, D1 containing one file and D2 a hard link to that
> same file (the link count of f2 is the same as that of f1, and both are
> larger than 1 (one)), run this script:
>
> pi at pi1b00:~/tmp $ cat breakLinks.sh
> #!/bin/sh
> find . -type f -links +1 -print | while read line
> do
>      cp "$line" "$line.tmp"
>      rm -f "$line"
>      mv "$line.tmp" "$line"
> done
> pi at pi1b00:~/tmp $
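>
> (For reference, the D1/D2 layout above could be recreated with something
> like: mkdir D1 D2; touch D1/f1; ln D1/f1 D2/f2.)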
>
> This looks for files (type f) under the current directory with more than
> one reference, i.e. an inode link count larger than 1 (one) (-links +1).
> Then, for each file found, it copies the file aside, removes the original
> (reducing that inode's link count), and moves the temporary copy back
> into place under the original name. The quotes around $line are there to
> keep the script from acting up when running into file names containing
> spaces.
>
> Running that results in:
>
> pi at pi1b00:~/tmp $ ./breakLinks.sh
> pi at pi1b00:~/tmp $ ls -ls D*
> D1:
> total 0
> 0 -rw-r--r-- 1 pi pi 0 Nov 10 18:23 f1
>
> D2:
> total 0
> 0 -rw-r--r-- 1 pi pi 0 Nov 10 18:23 f2
> pi at pi1b00:~/tmp $
>
> Same as before, but the link count is 1 (one).
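> To double-check, "ls -i D1/f1 D2/f2" should now print two different inode
> numbers, and "find . -type f -links +1" should come back empty.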
>
> If the files' timestamps are important, use "cp -p" instead of plain "cp".
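>
> If file names might also contain leading whitespace or backslashes, a
> slightly more defensive variant of the same loop could look like this,
> with "IFS=" and "read -r" added, the steps chained with && so a failed
> copy never removes the original, and -p folded in:
>
> #!/bin/sh
> find . -type f -links +1 -print | while IFS= read -r line
> do
>      cp -p "$line" "$line.tmp" &&
>      rm -f "$line" &&
>      mv "$line.tmp" "$line"
> done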
>
> Sure, one could do this in a one-liner using find's -exec option, but
> that makes for a less readable explanation.
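>
> For completeness, that one-liner might look roughly like this (the same
> cp/rm/mv steps, just wrapped in an inline sh invocation):
>
> find . -type f -links +1 -exec sh -c \
>      'cp -p "$1" "$1.tmp" && rm -f "$1" && mv "$1.tmp" "$1"' sh {} \;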
>
> On a personal note, Google Drive is hosted on somebody else's computer,
> so there is always the chance that somebody else gets to your data, that
> Google turns off access on short notice, or that it changes its terms
> and conditions. While it is convenient, from a privacy and security
> perspective a local backup is preferable. Besides, external disk drives
> are fairly cheap, Black Friday is coming up, and chances are that your
> uplink is not as fast as a USB 3 cable.
>
>
> Lance Simmons wrote:
> > I recently created a lot of hard links, thereby reducing the disk space
> > used by my Documents directory by a chunk. I was feeling pretty good
> > about this, but then discovered that Google Drive does not acknowledge
> > hard links. Every time the folder is synchronized (via a cron job), a
> > few hundred bogus "conflicts" are declared and then resolved. Then, the
> > next time the folder is synchronized, the same "conflicts" are found
> > and resolved again.
> >
> > I love the idea of hard links, but now I want to return to having
> > separate files. Is there an easy way to (1) delete hard links, then (2)
> > copy the files (inodes), giving them the filesystem names of the
> > original hard links?
> >
> > I should switch away from Google Drive; maybe this annoyance will
> > motivate me.
> >
> > Lance
> >
>


-- 
Lance Simmons

