I'm building a fairly large html site which relies on a lot of links between sections that need to be correct.
Is there any way I can check each link on a page and make sure that it doesn't return a 404?
One thing you might want to try is a sitemap-building tool: these tools crawl your site to discover all of its links and will give you a list of any that point to a 404 page. I use Google Sitemap Automater on the Mac, but I'm sure there are many Windows tools that do the same thing.
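If you'd rather script the check yourself, here is a minimal sketch using only Python's standard library. It checks the links on a single page rather than crawling the whole site the way the sitemap tools do, and the URL at the bottom is a placeholder you'd replace with your own page:

```python
from html.parser import HTMLParser
from urllib.error import HTTPError, URLError
from urllib.parse import urljoin
from urllib.request import Request, urlopen

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag on the page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def check_links(page_url):
    html = urlopen(page_url).read().decode("utf-8", errors="replace")
    parser = LinkExtractor()
    parser.feed(html)
    for href in parser.links:
        target = urljoin(page_url, href)  # resolve relative links
        if not target.startswith(("http://", "https://")):
            continue  # skip mailto:, javascript:, anchors, etc.
        try:
            # HEAD keeps the check cheap; some servers reject it,
            # in which case you'd fall back to a plain GET.
            urlopen(Request(target, method="HEAD"))
        except HTTPError as e:
            if e.code == 404:
                print(f"404: {target}")
        except URLError as e:
            print(f"unreachable: {target} ({e.reason})")

# Placeholder URL -- point this at one of your own pages.
check_links("http://example.com/index.html")
```

To cover a whole site you'd extend this to queue up internal links it finds and visit each page once, which is essentially what the crawling tools below do for you.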
I use Xenu's Link Sleuth (http://home.snafu.de/tilman/xenulink.html).
Despite its dated icon, it does this job very well: it crawls an entire site with multiple threads, has options for what to include and exclude, and finishes with a complete report.