
Monday, December 12, 2005

Re: printer friendly version of webpages getting crawled


Yes, the robots.txt file is the best way to prevent crawling. Google, and
the other leading search engines, observe robots.txt rules. If you're neat
and tidy, you may already have all the printer-friendly pages in a single
/print directory, in which case you can just disallow that entire
directory. If not, you'd need to add each printer-friendly URL to the
robots.txt file as you create it.
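As a minimal sketch, assuming the printer-friendly pages live under a
/print/ directory (a hypothetical path), the robots.txt would look like
this:

    User-agent: *
    Disallow: /print/

The wildcard user-agent applies the rule to all compliant crawlers; you
could also target Googlebot specifically with "User-agent: Googlebot".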

In addition, it goes without saying that you shouldn't list the
printer-friendly pages in your sitemap file. Only include things you want
crawled.
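For instance, a minimal sitemap.xml (with hypothetical URLs) would list
only the canonical page and omit its printer-friendly twin:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/article.html</loc>
      </url>
      <!-- no entry for http://www.example.com/print/article.html -->
    </urlset>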
