112

I want to copy all of the files and folders from one host to another. The files on the old host sit at /var/www/html, I only have FTP access to that server, and I can't tar all the files. A regular FTP connection to the old host lands me in the /home/admin folder.

I tried running the following command from my new server:

wget -r ftp://username:password@ip.of.old.host

But all I get is a generated index.html file.

What's the right syntax for using wget recursively over FTP?

  • 4
    Note that, by default, wget -r has a maximum recursion depth of 5; if you have any deeper subdirectories, they will be ignored unless you change this (see -l in the man page). – mikewaters May 23 '11 at 19:21
  • What wget does also depends on the wget version (or build). I came across one, _GNU Wget 1.10.2 (Red Hat modified)_, where if the path ends without '/' wget considers it a file and the download fails, but when `--timestamping` (-N) is used it works. When the path ended in '/' it would always create an index.html instead of downloading files; ending in * would also work for a directory. In all cases `--recursive` was also used. – papo Feb 18 '22 at 21:22

13 Answers

148

Try -m for --mirror

wget -m ftp://username:password@ip.of.old.host
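Per the wget manual, -m is shorthand for several recursive options, which avoids the default depth limit of plain -r. A sketch with placeholder host and credentials:

```shell
# -m ("mirror") turns on recursion with infinite depth plus timestamping.
# Per the wget manual it is equivalent to: -r -N -l inf --no-remove-listing
wget -m ftp://username:password@ip.of.old.host

# Spelled out explicitly:
wget -r -N -l inf --no-remove-listing ftp://username:password@ip.of.old.host
```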
Dave Cheney
  • 18,307
  • 7
  • 48
  • 56
47

You have it right, you just need a trailing * on the end:

wget -r ftp://username:password@1.2.3.4/dir/*

For shared servers, you can use like this:

wget -r ftp://1.2.3.4/dir/* --ftp-user=username --ftp-password=password

Because most shared servers use an FTP username of the form username@hostname, the first wget command fails while the second works fine.
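Another way to handle a username@hostname login is to percent-encode the @ in the username itself; a minimal sketch (the username below is a placeholder):

```shell
# Shared hosts often issue usernames of the form username@hostname.
# Percent-encoding the "@" as %40 lets the credential survive inside an FTP URL.
user='username@hostname'
encoded=$(printf '%s' "$user" | sed 's/@/%40/g')
echo "$encoded"
# which could then be used as:
#   wget -r "ftp://${encoded}:password@1.2.3.4/dir/*"
```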

  • 2
    This would only go one directory deep. Better to use the -m flag – SvennD Aug 13 '15 at 12:18
  • In some cases the second command does not work because of the space. You can use the first command for shared hosting like this: `wget -r ftp://username%40host:password@host/dir/` – Wasim A. Apr 24 '17 at 07:11
  • 1
    Note: you should use either `-m` or `-r -l inf` because -r has a default recursion depth of 5. See https://www.gnu.org/software/wget/manual/wget.html#Recursive-Retrieval-Options – ndemou Oct 10 '17 at 20:45
10

Check the below wget command to download data from FTP recursively.

wget --user="<user-name>" --password="<password>" -r -np -nH --cut-dirs=1 --reject "index.html*" "<URL to download files>"

-r: recursive download.

-np: do not ascend to the parent directory.

-nH: do not create a directory named after the host, i.e. abc.xyz.com.

--cut-dirs=1: ignore the given number of leading remote directories. The right value will differ for your URL.

You can check by executing the above command.
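To illustrate what -nH and --cut-dirs=1 do to the saved path (the URL components here are hypothetical), the same stripping can be mimicked with shell parameter expansion:

```shell
# A file fetched from ftp://abc.xyz.com/parent/dir/file.txt would, with
# -r alone, be saved locally as abc.xyz.com/parent/dir/file.txt.
remote='abc.xyz.com/parent/dir/file.txt'

no_host=${remote#*/}   # -nH drops the host directory  -> parent/dir/file.txt
saved=${no_host#*/}    # --cut-dirs=1 drops one level  -> dir/file.txt
echo "$saved"
```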

Peter Mortensen
AVJ
8

Besides wget, you may also use lftp in script mode. The following command will mirror the content of a given remote FTP directory into the given local directory, and it can be put into a cron job:

lftp -c 'open <hostname>; user <username> <password>; mirror -e <remote-src-path> <local-dest-path>; quit'

It automatically recurses into directories and lets you specify the remote starting directory to download from.
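For repeated runs, mirror also accepts flags for incremental and parallel transfers; a sketch with placeholder host, credentials, and paths:

```shell
# Hypothetical host/paths: --only-newer skips files already present locally,
# --parallel=4 downloads four files at a time.
lftp -c 'open ftp.example.com; user admin secret; mirror --only-newer --parallel=4 /var/www/html /backup/html; quit'
```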

Peter Mortensen
zappan
5

You can use curlftpfs, which mounts an FTP host as a local directory; once mounted, you can use normal file tools like 'cp -r'.
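A sketch of the full round trip, assuming curlftpfs and FUSE are installed (host, credentials, and mount point are placeholders). Note that the mount exposes whatever directory the FTP login lands in (here /home/admin), not necessarily the filesystem root:

```shell
mkdir -p /mnt/oldhost
curlftpfs ftp://username:password@ip.of.old.host /mnt/oldhost  # mount FTP as a filesystem
cp -r /mnt/oldhost/. /var/www/html/                            # ordinary file tools now work
fusermount -u /mnt/oldhost                                     # unmount when done
```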

chmeee
  • 7,270
  • 3
  • 29
  • 43
3

It should work. Try:

wget -r ftp://ftp:ftp@ftp.sunet.se/tst/
rkthkr
  • 8,503
  • 26
  • 38
3

Use -m rather than -r, because -r's recursion depth is limited (5 by default).

http://www.editcorp.com/Personal/Lars_Appel/wget/wget_2.html#SEC11

Bojan Hrnkas
  • 115
  • 11
2

Use:

wget -m ftp://192.168.0.1

and it will mirror all the files and folders.

kenorb
Rajat
  • As I said, logging in through FTP takes me to the /home/admin folder, and the files I need are in /var/www/html. So when I run the command you suggested, I only get the contents of /home/admin. I tried running it with ftp://192.168.0.1/var/www/html, but then it tries to CWD to /home/admin/var/www/html. How do I make it go to this folder from the root? –  Jun 13 '09 at 17:23
  • I have no experience with this particular problem, but you could try making a symlink to /var/www/html under your home. Then you could use an address like ftp://192.168.0.1/html – prestomation Jun 13 '09 at 17:29
  • I tried creating a symlink, but it resulted with wget creating a similar symlink on my local. –  Jun 13 '09 at 17:32
1

That's the right syntax. Not sure why you aren't getting the expected results.

As ever there is more than one way to do it. Try ncftp, in particular ncftpget

goo
  • 2,838
  • 18
  • 15
  • I can't install more software on my new server. Shouldn't I be telling wget to download all files from /var/www/html? I tried wget -r ftp://username:password@ip.of.old.host/var/www/html but I got a directory not found error. –  Jun 13 '09 at 09:37
1

I can understand if you're trying to dump this into cron or something, but why not simply ftp into the server with your normal client and mget *? This might be a quicker path to success.

dr.pooter
  • 399
  • 5
  • 10
  • I tried mget * but it didn't work with sub-folders, saying the local sub-folder doesn't exist. Is there a way to make it create the local folders automatically? –  Jun 13 '09 at 17:16
  • Depending on your client, the -r switch usually does the trick. IE: mget -r * – dr.pooter Jun 15 '09 at 06:17
  • Hmm. AFAIK the standard ftp client on Linux is not designed to retrieve directories recursively. I mean, there is no -r option. Other clients like ncftp or lftp support recursive retrieval, but they are usually not available by default. – Stann Feb 13 '11 at 05:42
1

As I said, logging in through FTP takes me to the /home/admin folder, and the files I need are in /var/www/html

I think this will work in your case:

wget -r ftp://192.168.0.1/../../var/www/html
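Per RFC 1738, another way to request an absolute path from an FTP server is to encode a leading slash as %2F; whether a given server (or wget build) honors this, like the ../.. trick, depends on its configuration:

```shell
# %2F encodes "/", so the path below means the absolute /var/www/html
# rather than a path relative to the login directory (/home/admin).
wget -r 'ftp://username:password@192.168.0.1/%2Fvar/www/html/'
```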
kubanczyk
  • 13,502
  • 5
  • 40
  • 55
0

I came across Windows hosting with a username of the form brinkster/username, so wget throws an error if you use the syntax:

wget -m ftp://brinkster/username:password@ip.of.the.host

To get past this, use

wget -m ftp://brinkster%2Fusername:password@ip.of.old.host


Peter Mortensen
  • 2,319
  • 5
  • 23
  • 24
-3

wget --user username --password yourpassword ftp://example.com/ftpfiles/filename will do the job, or you can use * instead of a filename. Note that entering a colon in the URL makes wget treat what follows as a port number, which is invalid in your case.

  • This answer to a four year old question with multiple upvoted answers doesn't offer anything new and even omits the requirement for a recursive copy. – Sven Jun 20 '13 at 00:10