How to recursively upload a directory to a WebDAV server through HTTPS from the command line?


I'm facing a rather simple situation: I have to upload, as-is, a big tree of files to a WebDAV server that is reachable over HTTPS. I must start the upload from a Linux box with command-line access only. I can install programs on the box.

I've tried Cadaver but it does not support recursive directory upload.

Do you know of simple tools/scripts to achieve that?


Ok, I found something that did it.

I started from the davpush.pl script that can be found here: https://github.com/ptillemans/davpush

Some changes were needed:

  • replace all "dav://" with "https://"
  • add "print POUT "open";" before "print POUT $script;"

Damn, having to hack a Perl script just to upload a directory: that's rough. I'm still looking for simpler tools/scripts.
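For the record, plain curl can also do this, since it speaks WebDAV's MKCOL and PUT over HTTPS. Below is a minimal, untested sketch: a function that prints one curl command per collection and per file, which you can inspect and then pipe to sh. The function name, DAV_CRED variable, URL, and paths are all placeholders of my own, not part of any existing tool.

```shell
# gen_dav_cmds: print the curl commands needed to mirror a local tree
# onto a WebDAV base URL. $1 = directory (relative path), $2 = base URL
# without a trailing slash. Credentials are read from $DAV_CRED at run
# time, so they never appear in the generated command list.
gen_dav_cmds() {
    # Create the remote collections first (MKCOL on an existing
    # collection fails with 405, which is harmless here).
    find "$1" -type d | while read -r d; do
        printf 'curl -fsS --user "$DAV_CRED" -X MKCOL "%s/%s"\n' "$2" "$d"
    done
    # Then upload each file with PUT (-T).
    find "$1" -type f | while read -r f; do
        printf 'curl -fsS --user "$DAV_CRED" -T "%s" "%s/%s"\n' "$f" "$2" "$f"
    done
}

# Usage: run from the parent of the tree so paths stay relative,
# inspect the output, then pipe it to sh:
#   cd /parent/of/mytree
#   DAV_CRED=user:password
#   gen_dav_cmds mytree https://server.example/dav | sh
```

This keeps weirdly named files safe on the local side, though file names containing double quotes would still confuse the generated commands.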

eskatos

Posted 2012-12-12T13:44:45.303

Reputation: 221

Answers

2

Try gnomevfs-copy:

Edit: gvfs-copy is not recursive. I patched it but have yet to publish the code. In the meantime, check out dave from PerlDAV. It does recursive transfers.

If you don't have FUSE disabled, you can try davfs2.

If you are not averse to coding your own tool, you could use gvfs and take inspiration from the source code of gvfs-copy.

I'm having a similar issue, so I may come back with a better solution.

user36520

Posted 2012-12-12T13:44:45.303

Reputation: 1 795

kio-client could have done it too. Unfortunately it's a quite restricted box and I don't have gnomevfs-copy nor kio-client installed. – eskatos – 2013-04-02T11:30:21.387

Try dave if you can. It works recursively (but unfortunately to me, it doesn't understand multistatus response from the server) – user36520 – 2013-04-05T12:07:39.057

dave did it with the target server, thanks! In fact it's not so far from what I did based on the davpush script, which drives cadaver, whereas dave uses the Perl HTTP::DAV API directly. But with dave, one cannot easily write a shell script with a bunch of commands, because it is interactive only. Answer accepted :) – eskatos – 2013-04-05T12:13:28.150

8

Here is a quickly hacked shell script that does this using cadaver:

#!/bin/sh

usage () { echo "$0 <src> <cadaver-args>*" >&2; }
error () { echo "$1" >&2; usage; exit 1; }

test $# -ge 2 || \
    error "Source and cadaver arguments expected!";

src="$1"; shift;
test -r "$src" || \
    error "Source argument should be a readable file or directory!";

cd "$(dirname "$src")";
src="$(basename "$src")";
root="$(pwd)";
rc="$(mktemp)";
{
    find "$src" '(' -type d -a -readable ')' \
    -printf 'mkcol "%p"\n';
    find "$src" '(' -type f -a -readable ')' \
    -printf 'cd "%h"\nlcd "%h"\n'            \
    -printf 'mput "%f"\n'                    \
    -printf 'cd -\nlcd "'"$root"'"\n';
    echo "quit";
} > "$rc";

cadaver -r "$rc" "$@";
rm -f "$rc";

If it is named davcpy.sh, then a command like

davcpy.sh "<local-directories>/<dirname>" "https://<target-website>/<some-directories>/"

allows a recursive copy from

<local-directories>/<dirname>

into a remote one named

<some-directories>/<dirname>

Note that it uses the scripting facility of cadaver to still permit interactive typing of login/passwords. I think it is also robust enough to handle weird file and directory names containing spaces, but I did not test any case like that.
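To illustrate what the script does, for a hypothetical tree dir/sub/file.txt the temporary command file fed to cadaver -r would contain something like this (where /local/root stands for the absolute path captured in $root):

```
mkcol "dir"
mkcol "dir/sub"
cd "dir/sub"
lcd "dir/sub"
mput "file.txt"
cd -
lcd "/local/root"
quit
```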

nberth

Posted 2012-12-12T13:44:45.303

Reputation: 191

1

A solution could be Rclone. It is a one-way command-line sync program, similar to rsync, that supports WebDAV (amongst other backends). It can recursively copy a directory, skipping files that already exist on the destination. It has command-line options to control sync behaviour, for instance whether target files should be deleted when they are gone from the source. Packages are available for many distros, but you can also install and run the plain binary. The first time, you'll need to define a "remote":

rclone config create my-remote webdav \
    url https://my-webdav-server/my-dir/ \
    vendor other \
    user 'onno'  pass 'mypasswd'

After that, you can copy or sync files and dirs:

rclone copy /home/onno/mydir my-remote:
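Before letting it loose on the server, rclone can preview the transfer; --dry-run and --progress are standard rclone flags, and the remote name and paths here reuse the example above:

```shell
# Show what would be transferred without touching the server
rclone sync --dry-run /home/onno/mydir my-remote:mydir

# Run the sync for real, with progress output
rclone sync --progress /home/onno/mydir my-remote:mydir
```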

Onnonymous

Posted 2012-12-12T13:44:45.303

Reputation: 111

1

A modification of nberth's answer that works on OSX:

#!/bin/sh

usage () { echo "$0 <src> <cadaver-args>*" >&2; }
error () { echo "$1" >&2; usage; exit 1; }

test $# -ge 2 || \
    error "Source and cadaver arguments expected!";

src="$1"; shift;
test -r "$src" || \
    error "Source argument should be a readable file or directory!";

cd "$(dirname "$src")";
src="$(basename "$src")";
root="$(pwd)";
rc="$(mktemp -t davcopy)";

{
    # BSD find has no -printf, so build the cadaver script with
    # a sed substitution and a read loop instead.
    find "$src" -type d | sed 's|.*|mkcol "&"|';
    find "$src" -type f | while read -r f; do
        d="$(dirname "$f")";
        echo "cd \"$d\"";
        echo "lcd \"$d\"";
        echo "mput \"$(basename "$f")\"";
        echo "cd -";
        echo "lcd \"$root\"";
    done;
    echo "quit";
} > "$rc";

cadaver -r "$rc" "$@";
rm -f "$rc";

The usage is the same. Quoting from nberth's answer:

If [the above] is named davcpy.sh, then a command like

davcpy.sh "<local-directories>/<dirname>" "https://<target-website>/<some-directories>/"

allows a recursive copy from

<local-directories>/<dirname>

into a remote one named

<some-directories>/<dirname>

JayQuerie.com

Posted 2012-12-12T13:44:45.303

Reputation: 113

0

I'm on Ubuntu Linux. With the help of FUSE (Filesystem in Userspace) and mount.davfs (davfs2), you can mount a subdirectory of the WebDAV server as a local folder.

Open your terminal and proceed as follows:

  • Install davfs2 if it does not already exist: sudo apt-get install davfs2

  • Connect to the WebDAV server:

    sudo mount.davfs -o user=knb,rw https://webdav.site.de/data /mnt/somedir

(The owner of the mounted file system must be given via the user option; otherwise you have no write permission.)

Additionally, I had to enter these lines into /etc/davfs2/davfs2.conf:

use_proxy       0         
use_locks       0
if_match_bug    1

See https://bugs.launchpad.net/ubuntu/+source/davfs2/+bug/466960

Running the mount command then shows (last line of output):

https://webdav.mysite.de/icdp on /media/somedir type fuse (rw,nosuid,nodev,relatime,user_id=1000,group_id=0,allow_other,max_read=16384,uid=1000,gid=0,helper=davfs)

Also check whether the file ~/.davfs2/secrets exists; it may be necessary for permanent mounts that survive reboots.
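For such a permanent mount, ~/.davfs2/secrets (or /etc/davfs2/secrets) holds the credentials, one mount per line, and /etc/fstab gets a matching davfs entry. The host, user, password, and mount point below are placeholders:

```
# ~/.davfs2/secrets (must be chmod 600)
https://webdav.site.de/data  knb  secretpassword

# /etc/fstab
https://webdav.site.de/data  /mnt/somedir  davfs  user,rw,noauto  0  0
```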

Now you can issue commands like cp -vr /data/myphotos /media/somedir and the files will be copied recursively and uploaded to the WebDAV site.

knb

Posted 2012-12-12T13:44:45.303

Reputation: 143