I do not know of any software that can do it out of the box.
I doubt you will find any, as using an HTTP URL as an SFTP download specification is not common, and there is not even a direct mapping from the URL to an SFTP path.
If you have a URL like
https://www.example.com/sample1/image1.jpg
the file, as presented over SFTP, will be at /sample1/image1.jpg
only if the website's SFTP account is chrooted to the web root. If not, the file can be at a path like /home/user/httpdocs/sample1/image1.jpg
or anywhere else.
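To make that mapping concrete, here is a minimal sketch in Python (used here purely for illustration; the paths and the document-root values are assumptions, not something the URL itself tells you):

```python
from urllib.parse import urlparse

def url_to_sftp_path(url, remote_root):
    """Guess the SFTP path for a web URL by prefixing the URL's
    path with the SFTP account's document root."""
    path = urlparse(url).path          # e.g. "/sample1/image1.jpg"
    return remote_root.rstrip("/") + path

# Chrooted account: the web root is "/"
print(url_to_sftp_path("https://www.example.com/sample1/image1.jpg", "/"))
# Non-chrooted account: files live under a per-user folder (assumed path)
print(url_to_sftp_path("https://www.example.com/sample1/image1.jpg",
                       "/home/user/httpdocs"))
```

The point is that `remote_root` cannot be derived from the URL; you have to know it for each server.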
So I believe you have to script it somehow.
Also, you did not specify whether the hostname in the URL changes or is constant. If it changes, where do you get the host credentials from? Or are they included in the URLs too?
Below is an example PowerShell script using the WinSCP .NET assembly. Configure $remoteRoot accordingly.
try
{
    # Load WinSCP .NET assembly
    Add-Type -Path "WinSCPnet.dll"

    # Set up session options
    $sessionOptions = New-Object WinSCP.SessionOptions
    $sessionOptions.Protocol = [WinSCP.Protocol]::Sftp
    $sessionOptions.HostName = "example.com"
    $sessionOptions.UserName = "user"
    $sessionOptions.Password = "mypassword"
    $sessionOptions.SshHostKeyFingerprint = "ssh-rsa 2048 xxxxxxxxxxx...="

    $session = New-Object WinSCP.Session

    # Document root of the web server, as seen by the SFTP account
    $remoteRoot = "/home/user"

    try
    {
        # Connect
        $session.Open($sessionOptions)

        foreach ($line in [System.IO.File]::ReadLines("list.txt"))
        {
            # Capture the server-relative path (group 1) and its
            # parent directory (group 2); allow both http and https
            if ($line -Match "https?://[a-z.]+(/(.*)/[a-z0-9.]+)$")
            {
                $remotePath = $matches[1]
                $remoteDir = $matches[2]
                $localDir = $remoteDir -Replace "/", "\"

                if (!(Test-Path $localDir))
                {
                    Write-Host "Creating directory $localDir"
                    New-Item $localDir -Type directory | Out-Null
                }

                Write-Host "Downloading $remotePath"
                $session.GetFiles(($remoteRoot + $remotePath), ($localDir + "\")).Check()
            }
            else
            {
                Write-Host "$line does not have expected URL format"
            }
        }
    }
    finally
    {
        # Disconnect, clean up
        $session.Dispose()
    }

    exit 0
}
catch [Exception]
{
    Write-Host $_.Exception.Message
    exit 1
}
(I'm the author of WinSCP)
I believe the questioner is referring to an sftp:// URL and not an http:// URL. I'm kind of amazed that there is no software that can do this out of the box. I bet if an SFTP client were to add this to their feature list, it would make the author cool and popular. – cowlinator – 2019-04-08T21:12:42.647

@cowlinator It would be quite inefficient to mass-download files using SFTP URLs. Using a URL as a download specification kind of implies a new connection for every file, and an SFTP/SSH connection is expensive to open for each file separately. SFTP/SSH is designed to stay open continuously, not to work per-request (as HTTP does). That's why URLs are rarely used for SFTP. There's not even a standard for SFTP URL syntax. – Martin Prikryl – 2019-04-09T05:51:28.563

Whether or not a new connection is created for every file is up to the implementation of the client, if the input is in the form of a list of URLs. I'm surprised there is no standard for SFTP URLs; this protocol has been around for decades... – cowlinator – 2019-04-09T20:45:59.420

In the absence of an official standard, this RFC draft seems to be the closest thing to a standard that we have: https://tools.ietf.org/html/draft-ietf-secsh-scp-sftp-ssh-uri-04#section-4 And I think this is what most people would use if they created an SFTP URL. – cowlinator – 2019-04-09T20:52:05.927

Anyway, you may consider asking a new question that is explicitly about SFTP URLs. – Martin Prikryl – 2019-04-10T06:07:55.313