I want to achieve this behavior:
Allow: /plans and Disallow: /plans/*
Crawl: www.example.com/plans
Do not crawl: www.example.com/plans/* (anything under /plans/)
It would be:
Allow: /plans$
Disallow: /plans/
Entries are assumed to have a trailing wildcard, so /plans/ and /plans/* are the same thing. However, this also means that /plans will also match /plansandstuff. This can be dealt with by using $, which matches "end of path".
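The matching logic above can be sketched as a small helper. This is an illustration of the wildcard semantics described in the spec, not a real parser; the function name and test paths are made up for this example:

```python
import re

def rule_matches(rule: str, path: str) -> bool:
    """Check whether a robots.txt path rule matches a URL path.

    Rules are prefix matches; '*' matches any sequence of
    characters and a trailing '$' anchors at end of path.
    """
    pattern = re.escape(rule).replace(r"\*", ".*")
    if pattern.endswith(r"\$"):
        # Trailing '$' means "end of path": drop the escaped
        # literal and use a real regex anchor instead.
        pattern = pattern[:-2] + "$"
    return re.match(pattern, path) is not None

# Allow: /plans$  -> matches only the exact path
print(rule_matches("/plans$", "/plans"))         # True
print(rule_matches("/plans$", "/plans/basic"))   # False

# Disallow: /plans/  -> matches everything under it
print(rule_matches("/plans/", "/plans/basic"))   # True

# Bare /plans has an implicit trailing wildcard,
# so it also matches /plansandstuff.
print(rule_matches("/plans", "/plansandstuff"))  # True
```

Note that Python's built-in urllib.robotparser follows the original prefix-only convention, so if you need * and $ support you have to test against a crawler (or library) that implements the extended syntax.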
See also: Robots.txt Specification
Keep in mind that robots.txt is advisory: well-behaved crawlers honor it, but not all crawlers pay attention to it.