View all files in a website's directory?

11

8

Is it possible to list all files and directories in a given website directory from the Linux shell?

Something similar to:

ls -l some_directory

but instead of some_directory, it would be ls -l http://www.some_site.com/some_directory/. Obviously, the latter will not work.

user1032531

Posted 2013-09-08T13:55:32.227

Reputation: 1 331

Answers

13

I was just wondering the same thing. The following is probably not the most efficient solution, but it seems to work. It recreates the directory structure of the web server locally. (Found the first command via Stack Overflow.)

wget --spider -r --no-parent http://some.served.dir.ca/
ls -l some.served.dir.ca
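
If you only want a listing rather than a local copy of the directory tree, a minimal variation (a sketch, assuming wget prints each probed URL in its log output on stderr) is to grep the URLs out of the spider run instead of inspecting the mirrored directories:

wget --spider -r --no-parent http://some.served.dir.ca/ 2>&1 \
  | grep -o 'http://some\.served\.dir\.ca/[^ ]*' \
  | sort -u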

Brian Z

Posted 2013-09-08T13:55:32.227

Reputation: 856

+1 for the `--spider` option. I did not know that one, though I have used `wget` and `curl` before. – Hennes – 2013-09-08T14:25:14.387

1

Yes, it is possible. Sometimes.

When you browse to a webpage (say, to http://demo.domain.tld/testdir/index.html), it will open the file you specified (in this case `index.html`).

If you do not specify a file and a default is present (e.g. the web server is configured to serve index.html, index.php, ...), then browsing to http://demo.domain.tld/testdir/ will automagically present you with the right file.

If that file is not present, the server can do other things, such as listing the directory contents. This is quite useful when building a site; however, it is also considered unwise from a security standpoint.
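
You can check from the shell whether such a listing is exposed. A quick sketch (reusing the hypothetical URL from above; the exact page text depends on the server) is to fetch the directory URL and look for the auto-index page that common web servers generate:

# Apache/nginx auto-generated index pages usually contain "Index of"
curl -s http://demo.domain.tld/testdir/ | grep -i "Index of"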

TL;DR: Yes, sometimes it is possible.

However, the more practical approach is simply to SSH, [s]FTP or RDP to the web server and issue a local directory listing command.
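
For example, with shell access the listing is a one-liner (hypothetical host and path):

ssh user@demo.domain.tld 'ls -l /var/www/testdir'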

Hennes

Posted 2013-09-08T13:55:32.227

Reputation: 60 739

Thanks Hennes. Not my site, so I suppose SSH, etc. will not work, correct? I was thinking of a shell command like wget, not using the browser. – user1032531 – 2013-09-08T14:20:43.667

wget will get you all linked files. I guess you could also use something similar to list rather than fetch the files. However, that still misses files which are not linked. – Hennes – 2013-09-08T14:24:24.010

0

Without recursion

lftp -e "cls -1 > /tmp/list; exit" "https://cloud-images.ubuntu.com/xenial/current/"
cat /tmp/list
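
The temporary file is not strictly needed; cls prints to stdout, so the same command can list straight to the terminal:

lftp -e "cls -1; exit" "https://cloud-images.ubuntu.com/xenial/current/"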

Parag Doke

Posted 2013-09-08T13:55:32.227

Reputation: 115

0

I think URL fuzzing is what you are looking for. Pentest-Tools offers an easy solution, but they do ask that you have the right to scan the target, probably to reduce abuse. Here is an online solution:

https://pentest-tools.com/website-vulnerability-scanning/discover-hidden-directories-and-files

Otherwise, download and install Kali Linux. Everyone thinks it is for hackers, but if you are a professional website builder I think it is good to have. Essentially, this question asks how to create something like a sitemap, which most domains provide anyway.

Alternatively, try the fuzzers packaged for BlackArch (an Arch Linux-based distribution): https://blackarch.org/fuzzer.html
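
As a sketch of what such a tool looks like in practice (dirb must be installed, you need permission to scan the target, and the wordlist path may differ per distribution):

# Probe the site for common directory and file names from a wordlist
dirb http://www.some_site.com/ /usr/share/dirb/wordlists/common.txt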

Phume

Posted 2013-09-08T13:55:32.227

Reputation: 1