I am using an application called Xenu Link Sleuth to try and find broken links on a site we host. When I go to the site through a browser it pops right open.

When I try to run it through Xenu, it immediately throws a 404 Not Found error. I checked the robots.txt file, thinking that maybe Xenu was using it as a basis for its requests, but I determined that was not the cause.

Does anyone know of anything that could cause this? Could there be a security setup somewhere that I don't know about that throws a 404 for crawlers?
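
In case it helps, here is a quick check along those lines: a minimal Python sketch that requests the same page with a browser-like User-Agent and a crawler-like one, to see whether the server treats them differently. The URL and both User-Agent strings are placeholders; for the crawler one, you would copy whatever Xenu actually shows in the web server's logs.

    import urllib.request, urllib.error

    URL = "http://intranet.example.local/"  # placeholder: a page that 404s in Xenu

    # Both strings are placeholders; for "crawler", substitute the exact
    # User-Agent that Xenu reports in the server logs.
    agents = {
        "browser": "Mozilla/5.0 (Windows NT; Win64; x64)",
        "crawler": "Xenu Link Sleuth",
    }

    for name, ua in agents.items():
        req = urllib.request.Request(URL, headers={"User-Agent": ua})
        try:
            with urllib.request.urlopen(req) as resp:
                print(name, "->", "HTTP", resp.status)
        except urllib.error.HTTPError as e:
            print(name, "->", "HTTP", e.code)

If the two requests come back with different status codes, that would point at User-Agent filtering somewhere on the server.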

Any help appreciated; I'm stumped.

Also, this is an intranet site. Not sure if that makes a difference.

Abe Miessler

2 Answers


A 404 error is thrown when a page is not found. Please make sure that you have set the default document for the site.
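
One rough way to test that (both URLs are placeholders; substitute your intranet address and the name of a page you know exists): if the bare site URL returns a 404 but an explicit page loads fine, a missing default document is the likely culprit.

    import urllib.request, urllib.error

    def status(url):
        # Return the HTTP status code, including error statuses like 404.
        try:
            with urllib.request.urlopen(url) as resp:
                return resp.status
        except urllib.error.HTTPError as e:
            return e.code

    # Placeholder URLs: the bare site root vs. an explicit document name.
    print(status("http://intranet.example.local/"))
    print(status("http://intranet.example.local/default.aspx"))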

Gaurav Singh

I would check a couple of things… First, make sure the URL itself isn't breaking it; there were all sorts of articles earlier this week about hashbangs (#!) breaking things. Also, I would compare what your request looks like in the logs with what this software's request looks like in the logs.
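
For the log comparison, a rough sketch like this can pull out just the 404 lines, assuming W3C-format IIS logs (the path is a placeholder, and the field names come from the #Fields: header in your own log, so adjust to match):

    # Print the URL and User-Agent for every 404 in a W3C-format IIS log,
    # so the crawler's requests can be compared side by side with yours.
    # IIS encodes spaces in logged values as '+', so a plain split() works.
    LOG = r"C:\inetpub\logs\LogFiles\W3SVC1\u_ex110101.log"  # placeholder path

    fields = []
    with open(LOG) as f:
        for line in f:
            if line.startswith("#Fields:"):
                fields = line.split()[1:]   # column names from the header
            elif line.startswith("#") or not fields:
                continue                    # skip other comment lines
            else:
                row = dict(zip(fields, line.split()))
                if row.get("sc-status") == "404":
                    print(row.get("cs-uri-stem"), row.get("cs(User-Agent)"))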

cwebber