# robots.txt for https://ticketly.co.ke
# Last updated: 2025-10-08
# Purpose: Allow search engines to crawl public content while blocking sensitive or duplicate URLs.

User-agent: *
# --- Allow all main public routes ---
Allow: /
Allow: /about/
Allow: /contact/
Allow: /events/
Allow: /terms/
Allow: /privacy/

# --- Disallow private and irrelevant routes ---
Disallow: /auth/
Disallow: /login/
Disallow: /register/
Disallow: /forgot-password/
Disallow: /dashboard/
Disallow: /profile/
Disallow: /admin/
Disallow: /api/
Disallow: /*?token=
Disallow: /*?session=

# --- Sitemap & Crawl Hints ---
Sitemap: https://ticketly.co.ke/sitemap.xml

# --- Crawl-delay for non-Google bots to avoid overloading ---
User-agent: Bingbot
Crawl-delay: 5

User-agent: Yandex
Crawl-delay: 10

# --- Special instructions for Googlebot ---
# Googlebot obeys only the most specific matching group, so the global
# Disallow rules must be repeated here or they would be ignored.
# Note: Googlebot does not support Crawl-delay; crawl rate is managed
# via Google Search Console instead.
User-agent: Googlebot
Allow: /about/
Allow: /terms/
Allow: /privacy/
Allow: /events/
Allow: /blog/
Disallow: /auth/
Disallow: /login/
Disallow: /register/
Disallow: /forgot-password/
Disallow: /dashboard/
Disallow: /profile/
Disallow: /admin/
Disallow: /api/
Disallow: /*?token=
Disallow: /*?session=

# --- Social media bots (for better link previews) ---
User-agent: Twitterbot
Allow: /

User-agent: facebookexternalhit
Allow: /

User-agent: LinkedInBot
Allow: /

# --- End of File ---