
I've already tried to follow How to automatically archive a directory? with no success. Anyway, here is my problem: I would like to know if there is a service or program that can run on a server to keep a folder updated with all the files that are created on another server. I don't need a real sync service, because I don't need both servers to have the same content.

Let me explain in more detail:

  1. Server A -> there is a root folder with some files and sub-folders (the files can be very large)

  2. Server B -> once a day it connects to Server A and downloads all the files and sub-folders in the root folder of Server A

  3. (would be nice, but not mandatory) Server B -> after it finishes downloading the files and folders from Server A, it deletes all the items on Server A to free up space

Both servers are running Ubuntu. I've tried the rsync command, but it doesn't seem to work very well (or maybe I'm doing something wrong; I'm not very experienced in the Linux world): every time I have to stay connected to the server with my user, and when the connection drops it starts from the beginning.
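For the "starts from the beginning" problem, a sketch of a restartable transfer (the hostname, user, and paths here are placeholders, not from the original post): `--partial` keeps partially transferred files so an interrupted run resumes instead of restarting, and `nohup` plus backgrounding lets the transfer continue after the login session is closed.

```shell
# Pull Server A's root folder onto this machine (Server B), keeping
# partial files so a dropped connection does not restart from zero.
# Run detached from the terminal so logging out does not kill it.
nohup rsync -az --partial username@serverA:/root_folder/ /local/destination/ \
    > rsync.log 2>&1 &
```

Re-running the same command after an interruption picks up where it left off, since rsync skips files that already match on the destination.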

  • In the other post, everything is local. You need to add username@serverB:/path/to/store to your command. To make it run every day, use a cron job. Here is a how-to: http://www.selflinux.org/selflinux/html/cron01.html – Kev Inski Jan 13 '16 at 10:45
  • @KevInski thanks! I'll try it immediately! And what about deleting the files after they have been downloaded? Is that possible? – lollo64 Jan 13 '16 at 12:09

1 Answer


Say you have a script like this example on Server A:

#!/bin/sh
# Copy the folder to Server B, comparing files by checksum (-c),
# preserving attributes (-a), and compressing in transit (-z)
rsync -acz /folder_to_copy username@serverB:/path/to/store
# Then delete the copied folder to free up space (note: this runs
# even if the transfer above failed -- see the comments below)
rm -rf /folder_to_copy

Keep in mind: always use -c to compare files by checksum, especially when copying remotely, even though it is slower. If you have spaces in the path name, escape them with a \ (backslash), like this: /path/with/a/space\ here. But try to avoid spaces in the first place. ;-D
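Instead of backslash-escaping, quoting the whole path also works and is often easier to read. A small local demonstration (the /tmp paths are made up for the example, and plain cp is used so it runs anywhere):

```shell
# Create a directory whose name contains a space, plus a destination
mkdir -p "/tmp/space demo/src" "/tmp/space demo/dst"
echo "data" > "/tmp/space demo/src/file.txt"

# The quotes keep the space from splitting the path into two arguments
cp "/tmp/space demo/src/file.txt" "/tmp/space demo/dst/"
```

The same quoting works for the rsync arguments in the script above.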

Make it executable with chmod u+x /path/to/script/with/name.sh

Now, if you run this script, it will first upload all files in /folder_to_copy to Server B under /path/to/store, and then recursively remove the source folder and everything in it.

Now you can create a cron job for this script so it runs every night, at 3 a.m. for example.
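A crontab entry for that schedule might look like this (the script path is the placeholder from above; add the line with `crontab -e`):

```shell
# minute hour day-of-month month day-of-week  command
0 3 * * * /path/to/script/with/name.sh
```

The five time fields mean: minute 0 of hour 3, every day of the month, every month, every day of the week.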

Always think about what you do, try to understand the given commands, and test everything. The man pages are your friends.

Kev Inski
  • Ok, thanks for your support! I'm going to create the script and do a lot of testing! – lollo64 Jan 13 '16 at 15:18
  • There was a ` (backtick) after `sh` in line one. I changed it in my post – Kev Inski Jan 20 '16 at 08:36
  • You could just use the --remove-source-files option with rsync to delete files. That way, if some files failed to transfer correctly, they won't be deleted. – mcchots Jan 20 '16 at 08:54