# Begin robots.txt file

User-agent: Amazonbot
Disallow: /

User-agent: MS Search 4.0 Robot
Disallow: /

User-agent: Mozilla/4.0 (compatible; MSIE 4.01; Windows NT; MS Search 4.0 Robot) Microsoft
Disallow: /

User-agent: *
Crawl-delay: 30
Disallow: /Activity-Feed/
Disallow: /admin/
Disallow: /App_Browsers/
Disallow: /App_Code/
Disallow: /App_Data/
Disallow: /App_GlobalResources/
Disallow: /bin/
Disallow: /Components/
Disallow: /Config/
Disallow: /contest/
Disallow: /controls/
Disallow: /DesktopModules/*.aspx
Disallow: /Documentation/
Disallow: /HttpModules/
Disallow: /images/
Disallow: /Install/
Disallow: /Providers/
Disallow: /home/ctl/
Disallow: /default.aspx
Disallow: /home.aspx
Disallow: /login.aspx
Disallow: /privacy.aspx
Disallow: /stats.aspx
Disallow: /thank-you.aspx
Disallow: /*/ctl/
Disallow: /*ctl=login
Disallow: /*ctl=sendpassword*
Disallow: /*ctl=license*
Disallow: /*fileticket=*
Disallow: /*scheduleitem=
Disallow: /*&Sched=*
Disallow: /*submitreview*
Disallow: /vlb.aspx
Disallow: /VLB.aspx
Disallow: /*.asmx

# End of robots.txt file