# robots.txt file
# All robots may index the site, except for the paths disallowed below
User-agent: *
Disallow: /shop/orders
Disallow: /admin
Disallow: /shop/orders/
Disallow: /admin/
Disallow: /zoek/
Disallow: /tuinwinkel/zoek/
Disallow: /tuinwinkelzoek/
Disallow: /ajax/tuinvrienden/

# The crawl delay keeps bots from returning to the site in rapid succession, which reduces the load on the server.
# 24 hours * 60 minutes * 60 seconds = 86400 seconds/day; divided by 2 => at most 43200 pages can be crawled per day, one every two seconds.
Crawl-delay: 2

Sitemap: https://www.tuinadvies.be/sitemap.xml
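
For reference, below is a minimal sketch of how a crawler could honour these rules using Python's standard-library urllib.robotparser. The user-agent string "ExampleBot" and the test URLs are illustrative assumptions, not part of the file above.

    from urllib.robotparser import RobotFileParser

    # Fetch and parse the live robots.txt for the site
    parser = RobotFileParser()
    parser.set_url("https://www.tuinadvies.be/robots.txt")
    parser.read()

    # Paths under /shop/orders and /admin are disallowed for every user agent
    print(parser.can_fetch("ExampleBot", "https://www.tuinadvies.be/shop/orders/123"))  # False
    print(parser.can_fetch("ExampleBot", "https://www.tuinadvies.be/"))                 # True

    # Crawl-delay: 2 asks compliant bots to wait two seconds between requests
    print(parser.crawl_delay("ExampleBot"))  # 2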