# robots.txt for https://clarity.dexcom.eu
# Note: legacy crawlers apply the first matching rule; modern crawlers
# (RFC 9309) apply the most specific match. Listing the Disallow rules
# before the catch-all Allow works correctly under both behaviors:
# block the listed paths, allow everything else.

User-agent: *
Disallow: /api
Disallow: /upload/
Allow: /

Sitemap: https://clarity.dexcom.eu/sitemap.txt