# Specify the user-agent (the bot) you are giving instructions to.
# The asterisk (*) means these rules apply to ALL robots (Googlebot, Bingbot, etc.).
User-agent: *

# The "Allow" directive explicitly tells bots they can crawl this path.
# Allowing the root directory is the default, but it's good practice to be explicit.
Allow: /

# The "Disallow" directive tells bots NOT to crawl these paths.
# Block sensitive areas, temporary folders, or resource-heavy directories.
# Since the site uses PHP, you could also block included files, but usually
# blocking admin or private areas is enough.
Disallow: /wp-admin/   # Example if using a WordPress/CMS admin area
Disallow: /temp/       # Example for temporary directories
Disallow: /private/    # Example for private client files

# Location of your Sitemap (CRITICAL for indexing!)
# This is the single most important line for guiding robots.
Sitemap: https://ads-special-events-websites.ca/sitemap.xml
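
# You can sanity-check rules like these before deploying them. The sketch below
# uses Python's standard-library urllib.robotparser; the URLs are this site's,
# but the snippet is illustrative, not part of the live configuration. Note that
# Python's parser applies rules in order (first match wins) and allows by
# default, so the broad "Allow: /" line is omitted here to keep the check simple.
#
#     from urllib import robotparser
#
#     # Rules mirroring the Disallow section above; parse() accepts the
#     # file's lines directly, so no network fetch is needed.
#     rules = """\
#     User-agent: *
#     Disallow: /wp-admin/
#     Disallow: /temp/
#     Disallow: /private/
#     """.splitlines()
#
#     rp = robotparser.RobotFileParser()
#     rp.parse(rules)
#
#     # The homepage is crawlable; the blocked paths are not.
#     print(rp.can_fetch("*", "https://ads-special-events-websites.ca/"))
#     print(rp.can_fetch("*", "https://ads-special-events-websites.ca/wp-admin/"))

```python
from urllib import robotparser

# Rules mirroring the Disallow section above; parse() accepts the
# file's lines directly, so no network fetch is needed. Unmatched
# paths are allowed by default, matching robots.txt semantics.
rules = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /temp/
Disallow: /private/
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# The homepage is crawlable; the blocked paths are not.
print(rp.can_fetch("*", "https://ads-special-events-websites.ca/"))
print(rp.can_fetch("*", "https://ads-special-events-websites.ca/wp-admin/"))
```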