3
How can I download the latest file available on an FTP server using wget? When I use the command below, it downloads all the files.
wget ftp://id:password@ftpserver/dir/*
Regards, Ram.
6
The answer is that you probably can't do this using just wget or any other single tool that I am aware of.
What you probably need to do is write a script that uses wget/curl/whatever to request a directory listing from the FTP server. The script can then select and retrieve the right file based on the file's name, or some other criterion.
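A minimal sketch of that "list, pick, fetch" approach. The server URL and credentials are placeholders, and a canned listing stands in for the real curl call so the selection logic is visible; since the asker's files embed a YYYYMMDD stamp, a plain reverse lexical sort puts the newest name first:

```shell
#!/bin/sh
# Placeholder server and credentials -- substitute your own.
URL='ftp://id:password@ftpserver/dir/'

# In real use the listing would come from the server, e.g.:
#   listing=$(curl --silent --list-only "$URL")
# A canned listing stands in here:
listing='file-name_20091018.sql.gz
file-name_20091020.sql.gz
file-name_20091019.sql.gz'

# Names embed YYYYMMDD, so a reverse lexical sort puts the newest first.
latest=$(printf '%s\n' "$listing" | sort -r | head -n 1)
echo "$latest"   # file-name_20091020.sql.gz

# The real fetch would then be:
#   wget "$URL$latest"
```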
Using wget I am able to get all the files; the issue is that if I already have a file, it is downloaded again and stored as file_name.1. I want to download only the files that are not already there. – None – 2009-10-20T23:19:14.857
6
Just add the "-N" option to make wget skip remote files that are no newer than the copy you have locally. You could also add "-nc" to skip a file entirely if it already exists, even if the one on the FTP server is newer.
wget -N ftp://id:password@ftpserver/dir/*
2
My solution is this:
curl 'ftp://server.de/dir/'$(curl 'ftp://server.de/dir/' 2>/dev/null | tail -1 | awk '{print $(NF)}')
I also needed the latest items*.csv file; however, curl will not allow wildcard patterns, so I solved it with a grep command piped after curl. Thanks for sharing your solution as I like it! – Bob Siefkes – 2016-04-19T13:02:14.740
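A sketch of the grep-filtered variant that comment describes (server name and file pattern are made up): curl rejects wildcards in FTP URLs, so the listing is filtered after the fact instead. A canned listing stands in for the real curl call:

```shell
#!/bin/sh
# In real use:
#   curl --silent --list-only 'ftp://server.de/dir/' \
#       | grep '^items.*\.csv$' | tail -1
# Demonstrated on a canned listing:
printf 'items_2016-04-18.csv\nitems_2016-04-19.csv\nreadme.txt\n' \
    | grep '^items.*\.csv$' | tail -1   # items_2016-04-19.csv
```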
1
Try to specify the file you want to download. Using '*' will download every file in the directory 'dir'.
Example:
wget ftp://gnjilux.cc.fer.hr/welcome.msg
...will download exactly the file 'welcome.msg' from the mentioned server.
Check the wget manual.
Update: I'm not sure if I get your problem. Are you trying to sync the content of the remote server (machine running the FTP daemon) with your local server? Are you looking for something like rsync functionality over FTP? If yes, you could try ftpsync (wget alone won't help in this case).
Hi, the file is named like file-name_YYYYMMDD.sql.gz, where YYYY = year, MM = month, DD = day. I can't specify the exact file name. – None – 2009-10-20T17:18:05.707
Seems like a database backup / schema script – Sathyajith Bhat – 2009-10-21T12:10:05.323
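Since the comment above says the name embeds the current date, one possibility (a sketch, with the file prefix, server, and credentials as placeholders) is to construct today's expected name with date(1) instead of listing the directory at all. This assumes the backup for today has already been uploaded:

```shell
#!/bin/sh
# Build today's expected backup name from the file-name_YYYYMMDD.sql.gz pattern.
file="file-name_$(date +%Y%m%d).sql.gz"
echo "$file"

# The fetch would then be (placeholder credentials):
#   wget "ftp://id:password@ftpserver/dir/$file"
```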
0
Maybe my simple (if non-ideal) solution will be useful to somebody.
First, I installed proftpd. Here is an example /ftp directory tree:
/ftp/
/ftp/file1
/ftp/file1/subfile1
/ftp/file1/subfile2
/ftp/file2
Then I added a crontab task to execute this script every minute:
PACKAGES=( $(find /ftp/* -type d -not -name 'lost+found') )
for PACKAGE in "${PACKAGES[@]}"; do
echo $(ls -I 'latest' -Atp "${PACKAGE}" | grep -v '/' | head -n 1) > "${PACKAGE}"/latest
done
So for every file (or package, in my case) you need to create an individual directory, and it will contain a file 'latest' holding the name of the newest file in that directory. If you want to get the latest version of some file, you can just execute:
wget ftp://"${FTP_SRV}"/"${FILE}"/$(curl -s ftp://"${FTP_SRV}"/"${FILE}"/latest)
You should add some quotes to that script — and, ideally, not loop over find’s output in the first place. – Scott – 2017-10-02T07:52:55.117
@Scott, I understood your remark about quotes, but I don't understand your point about looping over find's output. Could you clarify? – HeroFromEarth – 2017-10-02T18:13:10.763
Why is looping over find's output bad practice? – Scott – 2017-10-04T04:06:55.467
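One way to address those remarks (a sketch): let find hand the directories to a subshell via -exec instead of word-splitting its output, so names with spaces survive. Demonstrated here on a throwaway tree built with mktemp; in real use FTP_ROOT would be /ftp:

```shell
#!/bin/sh
# Throwaway demo tree standing in for /ftp; subfile2 is made newer.
FTP_ROOT=$(mktemp -d)
mkdir "$FTP_ROOT/file1"
touch -d '2020-01-01' "$FTP_ROOT/file1/subfile1"
touch -d '2020-01-02' "$FTP_ROOT/file1/subfile2"

# find passes each package directory to the subshell intact,
# so no word splitting and no unquoted expansions.
find "$FTP_ROOT"/* -type d -not -name 'lost+found' -exec sh -c '
    for dir do
        ls -I latest -Atp "$dir" | grep -v / | head -n 1 > "$dir/latest"
    done' sh {} +

cat "$FTP_ROOT/file1/latest"   # subfile2
rm -rf "$FTP_ROOT"
```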
Why don't you give us a little bit less information to work with? – None – 2009-10-20T16:25:53.363