Re: Dead (archived) pages
Hi Phil,
A simple solution would be to disallow those URLs for bots like Googlebot in your robots.txt file, and to leave them out of your Google Sitemap. At the moment there is no way to use the Sitemap itself to tell Google which pages to remove from its index.
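For example, if the archived pages all sat under a single directory (here a hypothetical /archive/ path; substitute your actual URLs), the robots.txt rules might look like:

```text
# Keep all crawlers out of the archived pages
# (/archive/ is a placeholder path, not from Phil's site)
User-agent: *
Disallow: /archive/

# Or, to block only Googlebot:
User-agent: Googlebot
Disallow: /archive/
```

One caveat: a robots.txt Disallow only stops crawling. If Googlebot can't fetch a page, it also can't see the "noindex" meta tag on it, so pick one mechanism or the other rather than combining both on the same URLs.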
Phil Payne wrote:
> I have a number of pages - quite a few, actually - that are no longer
> any use to most people who might be searching on my most useful
> keywords. I keep them for reference and as an audit trail.
>
> I don't really want to move them about, because quite a few are linked
> to from sites that aren't that well or frequently maintained.
>
> What I've done so far is set their sitemap priority to 0.0 and add
> robot metatags saying "noindex, follow".
>
> I'm wondering whether it would be a better idea for Google to implement
> the noindex decision in the sitemap - perhaps as a frequency keyword of
> "IGNORE" - or does "NEVER" combined with the robots "noindex" achieve
> the same result?
>
> There's absolutely no need for the Googlebot to download these pages,
> or even reference them at all.
0 Comments: