Google's John Mueller said on Twitter that the URL Parameters tool is no replacement for a robots.txt file when it comes to blocking content from being crawled. John was asked "how reliable is it" when the tool is set to crawl no URLs of a certain URL pattern. John said "it's not a replacement for the robots.txt -- if you need to be sure that something's not crawled, then block it properly."
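For context, "blocking it properly" means adding a Disallow rule to your robots.txt file rather than relying on the parameter tool's hints. Here is a minimal sketch, using a hypothetical sessionid query parameter as the pattern to block (Google supports the * wildcard in robots.txt rules):

```
User-agent: *
# Block crawling of any URL whose query string contains the
# (hypothetical) sessionid parameter, e.g. /page?sessionid=abc123
Disallow: /*?*sessionid=
```

Note that a robots.txt Disallow only prevents crawling; a blocked URL can still appear in search results if it is linked from elsewhere.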