
Wednesday, February 29, 2012

How to download recursively from an FTP site in the Command Line

This article discusses mget and wget. Wget is the one that interests me. There's more on the page, though; see the link below...

Don


Linuxaria – Everything about GNU/Linux and Open Source

How to download recursively from an FTP site (in the Command Line)

Skipping on Down...

Wget

GNU Wget is a free software package for retrieving files using HTTP, HTTPS and FTP, the most widely used Internet protocols. It is a non-interactive command-line tool, so it can easily be called from scripts, cron jobs, terminals without X Window support, etc.
This makes it seem like the perfect tool to use on a server. As a plus, wget is available in virtually any Linux distribution's repository, which makes installing it trivial.
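For example, on a Debian- or Ubuntu-based system it can be installed from the standard repositories (other distributions use their own package managers, such as yum or pacman):

sudo apt-get install wget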

The basic syntax for wget is

wget ftp://myusername:mypassword@ftp.yoursite.com/yourfile 

With a command like this one, you use the FTP protocol with the account myusername and the password mypassword to download the file yourfile from ftp.yoursite.com.
But we need some extra options to get a recursive download from that FTP site.

Extra Options

-r, --recursive: Turn on recursive retrieving.

-l depth, --level=depth: Specify the maximum recursion depth. The default maximum depth is 5.

So our command becomes:

wget -r --level=99 ftp://myusername:mypassword@ftp.yoursite.com/ 

In this way, starting from the root directory, wget downloads recursively down to 99 levels (or you can use inf for an infinite depth).


Or you can use the -m option (which stands for mirror).
The -m option turns on mirroring, i.e. it turns on recursion and time-stamping, sets infinite recursion depth, and keeps FTP directory listings:

wget -m ftp://myusername:mypassword@ftp.yoursite.com/ 

If, like me, you have a really big site, I suggest running it with nohup in front of the command so it runs in the background and survives you logging out.
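For example (the trailing & sends the job to the background; by default nohup writes the transfer log to nohup.out):

nohup wget -m ftp://myusername:mypassword@ftp.yoursite.com/ &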

A final tip for wget: if you have to re-run it against the same site, you can also use the -nc option, so that files already downloaded will not be fetched a second time.

-nc, --no-clobber
If a file is downloaded more than once in the same directory, Wget’s behavior depends on a few options, including -nc. In certain cases, the local file will be clobbered, or overwritten, upon repeated download. In other cases it will be preserved.

When running Wget with -r or -p, but without -N, -nd, or -nc, re-downloading a file will result in the new copy simply overwriting the old. Adding -nc will prevent this behavior, instead causing the original version to be preserved and any newer copies on the server to be ignored.
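So a re-run of the earlier recursive download would look like this (same placeholder credentials as before). Note that -nc cannot be combined with -m, because mirroring turns on time-stamping (-N), and wget refuses to use -N and -nc at the same time:

wget -r --level=99 -nc ftp://myusername:mypassword@ftp.yoursite.com/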

Read More...
http://linuxaria.com/howto/how-to-download-recursively-from-an-ftp-site?lang=en

Related posts:

  1. Wget for fun
  2. PAC Manager: All your Connection are belong to us
  3. Synergy! as many PCs as you like with just one keyboard and one mouse!

