Have you ever found a website or page with dozens, hundreds, or even thousands of files you need to download, but no time to grab them all by hand?
wget's recursive mode, enabled with -r, does exactly that, but it comes with a few quirks to be warned about.
If you're downloading from a standard web server directory listing, you will notice the index page still contains a link to the parent directory (..), and wget will follow it unless you also pass -np (--no-parent).
E.g. let's say you have files in http://serverip/documen........
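A minimal, self-contained sketch of the fix: -np (--no-parent) keeps wget from ascending past the starting directory. The local python3 web server, the port 8793, and the file names here are made up so the example can run anywhere; in practice you would point wget at the real server instead.

```shell
set -e
tmp=$(mktemp -d)
mkdir -p "$tmp/site/documents"
echo hello  > "$tmp/site/documents/file.txt"
echo parent > "$tmp/site/outside.txt"
# Serve the tree locally just for the demo.
( cd "$tmp/site" && exec python3 -m http.server 8793 ) >/dev/null 2>&1 &
srv=$!
sleep 1
# -r: recurse, -np: never follow the ".." link up to the parent,
# -nH: don't create a hostname directory, -P: where to put the download
wget -q -r -np -nH -P "$tmp/out" http://127.0.0.1:8793/documents/
kill "$srv"
ls "$tmp/out/documents"
```

Without -np, a parent-directory link in the index can pull in everything above the directory you asked for (here, outside.txt).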
I was shocked that options like --preserve and --archive made no difference! The catch is that the options aren't the problem: when you copy /source/dir/*, the shell expands the * glob before the command ever runs, and * does not match names starting with a dot. This is a big deal and will catch people off guard.
The rsync solution:
You need to use something like: rsync -Pha /source/dir/. /dest/dir
*Notice the "." at the end of the source path: it makes rsync copy the directory's contents, dotfiles included, without the shell glob getting in the way.
cp -a has the same problem when you copy /source/* (again, the glob skips the dotfiles), and the solution is the same:
cp -a /source/directory/. /destination........