Way to bulk check page urls for robots/htaccess access?

Written by  on September 14, 2017 

I'm using ScrapeBox to search for "noindex" in the page source so I can filter non-indexing domains out of my tiers.

But directories can also be blocked by .htaccess and robots.txt, and I'm not sure how to check for that >.>
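One thing worth noting: .htaccess lives on the server and is never served to visitors, so there is no file to download and parse. The only signal you get is the HTTP response itself, so a bulk check amounts to requesting each URL and flagging deny-style status codes. A minimal sketch (the status set and user-agent header are my assumptions, not anything ScrapeBox does):

```python
from urllib.error import HTTPError, URLError
from urllib.request import Request, urlopen

# Status codes that .htaccess denials/rewrites typically produce (assumption).
BLOCKED_STATUSES = {401, 403, 410}

def classify(status: int) -> str:
    """Map an HTTP status code to a coarse accessibility label."""
    if status in BLOCKED_STATUSES:
        return "blocked"
    if 200 <= status < 300:
        return "ok"
    if 300 <= status < 400:
        return "redirect"
    return "other"

def check_url(url: str, timeout: float = 10.0) -> str:
    """Issue a HEAD request and classify the result (network call)."""
    req = Request(url, method="HEAD", headers={"User-Agent": "Mozilla/5.0"})
    try:
        with urlopen(req, timeout=timeout) as resp:
            return classify(resp.status)
    except HTTPError as e:
        return classify(e.code)  # 4xx/5xx arrive here as exceptions
    except URLError:
        return "unreachable"
```

Looping `check_url` over an exported URL list would give a quick blocked/ok report; note that `urlopen` follows redirects by default, so you mostly see the final status.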

Maybe I could use some regex for robots.txt, but what about .htaccess? Is there a tool for that?
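For the robots.txt side you don't actually need regex: Python's standard library ships a parser. A minimal sketch (URL list, user agent, and the one-robots.txt-per-host caching are my own choices) that checks whether each URL is disallowed for a given crawler:

```python
from urllib.parse import urlparse
from urllib.request import urlopen
from urllib.robotparser import RobotFileParser

def parse_robots(robots_txt: str) -> RobotFileParser:
    """Build a parser from robots.txt text (kept separate so it can run offline)."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp

def check_urls(urls, user_agent="*"):
    """Fetch each host's robots.txt once and report crawlability per URL."""
    cache = {}
    results = {}
    for url in urls:
        parts = urlparse(url)
        host = f"{parts.scheme}://{parts.netloc}"
        if host not in cache:
            try:
                with urlopen(host + "/robots.txt", timeout=10) as resp:
                    cache[host] = parse_robots(resp.read().decode("utf-8", "replace"))
            except OSError:
                cache[host] = None  # no robots.txt reachable: treat as allowed
        rp = cache[host]
        results[url] = True if rp is None else rp.can_fetch(user_agent, url)
    return results
```

This only covers robots.txt `Disallow` rules; a "noindex" meta tag or an .htaccess deny still needs the source-code and status-code checks respectively.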

