Removed pages from sitemap, but Googlebot is still crawling them
I have a site with some 100,000 dynamic URLs. When I originally
built my sitemap, I used a custom script to generate it, and I
mistakenly included a bunch of URLs for which our site has no
valid data. My site returns valid HTML for these URLs, but the
page basically just says 'Error! No valid data.' We consider them
errors and would like to keep Googlebot away from them so our
error log doesn't fill up.
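
For reference, here is a minimal sketch of how I imagine the fix on
our end might look, assuming a Flask-style app (the route, the
lookup_record function, and the whole handler are hypothetical, not
our actual code): return a real 404 status for these URLs instead of
a 200 page that merely says there is an error, so crawlers can tell
the page is invalid.

    # Hypothetical Flask handler -- a sketch, not our real code.
    # The idea: answer with an actual 404 status when there is no
    # valid data, instead of a 200 page that just says 'Error!'.
    from flask import Flask, abort

    app = Flask(__name__)

    def lookup_record(item_id):
        # Stand-in for our real data lookup; returns None when
        # the URL has no valid data behind it.
        return None

    @app.route("/item/<item_id>")
    def item(item_id):
        record = lookup_record(item_id)
        if record is None:
            abort(404)  # crawlers treat this as a dead URL
        return "Data for %s" % record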
I regenerated my sitemap without the erroneous URLs, but Googlebot
has since visited these error URLs again. How do I get them removed?
Does this mean I have to put them in a robots.txt file? I really
hope not, because there are quite a few of them.
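
If robots.txt does turn out to be the answer, I'm hoping a single
pattern rule could cover them rather than listing every URL, assuming
the bad URLs share some recognizable prefix (the /item/invalid/ path
below is made up for illustration; I don't know yet whether ours
actually share a pattern):

    # Hypothetical robots.txt -- the path is illustrative only.
    User-agent: Googlebot
    Disallow: /item/invalid/

Googlebot also supports * wildcards in Disallow paths, which might
help if the bad URLs are distinguished by a query parameter rather
than a path prefix.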