#****************************************************************************
# robots.txt
# : Robots, spiders, and search engines use this file to determine which
#   content they should *not* crawl while indexing your website.
# : This system is called "The Robots Exclusion Standard."
# : You are strongly encouraged to run this file through a robots.txt
#   validator to check for valid syntax before any robots read it!
#
# Examples:
#
# Instruct all robots to stay out of the admin area.
# : User-agent: *
# : Disallow: /admin/
#
# Restrict Google and MSN from indexing your images.
# : User-agent: Googlebot
# : Disallow: /images/
# : User-agent: MSNBot
# : Disallow: /images/
#****************************************************************************

User-agent: *
Disallow: /wishlist/
Disallow: /catalogsearch/result/
Disallow: /customer/
Disallow: /review/
Disallow: /control/
Disallow: /admin/

User-agent: AhrefsBot
Disallow: /

User-agent: ia_archiver
Disallow: /

User-agent: Baiduspider
Disallow: /

User-agent: baiduspider+
Disallow: /

User-agent: Exabot
Disallow: /

User-agent: Ezooms
Disallow: /

User-agent: Mail.RU_Bot
Disallow: /

User-agent: Monster
Disallow: /

User-agent: Phantom
Disallow: /

User-agent: Sosospider
Disallow: /

User-agent: Sosospider+(+http://help.soso.com/webspider.htm)
Disallow: /

User-agent: sogou
Disallow: /

User-agent: sogou spider
Disallow: /

User-agent: Sogou web spider/4.0(+http://www.sogou.com/docs/help/webmasters.htm#07)
Disallow: /

User-agent: Yandex
Disallow: /

User-agent: Mozilla/5.0 (compatible; YandexBot/3.0; +http://yandex.com/bots)
Disallow: /