
I have a remote machine (a VM) where disk space is very limited (~15 GB free). An NFS share of ~600 GB is mounted on this VM, and access to the share from this machine is lightning fast.

I want to bring the 600 GB of files in the above share to my local system (over the internet) for offline use, chunked into roughly 2 GB archives that can be downloaded over HTTP or with off-the-shelf tools (Dropbox/OneDrive). My local system has slow access to this share. I have tried rsync-ing it to a local folder after mounting it as NFS, but that runs forever, probably because the share contains millions of files.
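
For illustration, the simplest thing I could do is stream a single tar over SSH and split it locally (a rough sketch only; user@remote-vm and /mnt/nfs are placeholders for my actual host and mount point):

    # Run on the local Mac: stream an uncompressed tar of the share over SSH
    # and split it into 2 GB pieces locally, so no free space is needed on the VM.
    ssh user@remote-vm 'tar -C /mnt/nfs -cf - .' | split -b 2g - share-backup.tar.part-

    # Restoring later is just concatenation:
    cat share-backup.tar.part-* | tar -xf -

But over a slow link an interrupted stream has to be restarted from the beginning, which is why I am looking for something incremental/resumable.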

Since the remote machine has little free space, is there a way to create an incremental archive of the NFS share's files in multiple sessions (i.e., archive creation can be restarted after freeing up space)?
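
The custom script I would otherwise end up writing looks roughly like this (a sketch only, run on the VM; it assumes the data is organized into top-level directories, none bigger than the free space, and the actual upload step is left as a placeholder):

    #!/usr/bin/env bash
    # Rough sketch (run on the Ubuntu VM): archive the share one top-level
    # directory at a time so only a few GB of scratch space is needed, and
    # keep a done-list so the job can be resumed after an interruption.
    # /mnt/nfs and /tmp/staging are placeholders.
    set -euo pipefail

    SRC=/mnt/nfs
    STAGE=/tmp/staging
    DONE_LIST="$HOME/archived-dirs.txt"
    mkdir -p "$STAGE"
    touch "$DONE_LIST"

    for dir in "$SRC"/*/; do
        name=$(basename "$dir")

        # Skip directories already archived in a previous session.
        if grep -qxF "$name" "$DONE_LIST"; then
            continue
        fi

        # Uncompressed tar of one directory, split into 2 GB parts.
        tar -C "$SRC" -cf - "$name" | split -b 2G - "$STAGE/$name.tar.part-"

        # <upload the parts in "$STAGE" here (OneDrive / HTTP / whatever), then free the space>
        rm -f "$STAGE/$name.tar.part-"*

        echo "$name" >> "$DONE_LIST"
    done

It works in principle, but it feels like reinventing the wheel.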

Environment:

  1. Local system is Mac
  2. The remote VM is Ubuntu

The local machine has ~1 TB of free space.

I looked at tar and zpaq, but they don't seem to have the necessary features. Is there a faster solution than writing a custom script like the one sketched above?

Thanks in advance!

  • Is sshfs available on Mac? I found this: macFUSE. With sshfs you can use cp or whatever. I believe the fact NFS is involved is irrelevant. I believe if the files are available after you ssh to the remote VM then you can reach them with sshfs instead. – Kamil Maciorowski Mar 06 '21 at 15:31
  • This again depends on the not-so-fast network. The reason I am looking for archives is primarily transfer efficiency: individual files over SSH/NFS/rsync etc. are prohibitively slow. – NS Gopikrishnan Mar 06 '21 at 15:52
  • Is the 600GB in one or more files? If in several files, how large are they? Do they contain text? – harrymc Mar 06 '21 at 16:05
  • @harrymc mainly executables, libraries, build tools. Large no. of individual files. – NS Gopikrishnan Mar 06 '21 at 16:08
  • How will you be bringing the files over? (Network from remote to local? Sneakernet?) Does the archive have to be compressed, or does it just need to hold many files? – u1686_grawity Mar 06 '21 at 16:14
  • @NSGopikrishnan: Zipping binary files is not efficient. If both machines have rsync or anything similar, you could do the transfer at about the same speed. Zipping won't get you a big improvement. (Add to your comment @harrymc for me to be notified.) – harrymc Mar 06 '21 at 16:24
  • Zipping/any archive format is just to get the files into equal-sized chunks, maybe 2 GB each; no compression needed, just chunking for the download. I will be moving them through OneDrive. @harrymc – NS Gopikrishnan Mar 06 '21 at 16:41
  • You may use rclone to sync to and from OneDrive (a rough example is sketched after these comments). You can't do chunks since you don't have the space to compress the 600 GB into one archive. – harrymc Mar 06 '21 at 16:46
  • Is it faster than mounting the NFS locally and rsync-ing it from the mounted folder to a local one? If the number of files remains huge (a million+), I think it will be too much overhead. I am coming across rclone for the first time. @harrymc – NS Gopikrishnan Mar 06 '21 at 16:57
  • rclone mount mounts the remote as file system on a mountpoint. This will allow you to zip the whole thing in one go to OneDrive, but with any interruption you will have to restart from the beginning. I don't recommend this as the gains will be minor. – harrymc Mar 06 '21 at 17:07
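
For reference, the rclone transfer suggested above would look roughly like this (a sketch only; the onedrive: remote is something that would first have to be set up with rclone config, and the paths are placeholders):

    # Run on the Ubuntu VM: copy the mounted share straight to OneDrive.
    # "onedrive:" must already be configured via `rclone config`.
    rclone copy /mnt/nfs onedrive:share-backup --transfers 8 --progress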

0 Answers