Note that Crawl-delay is not part of the original robots.txt specification, but it is safe to include it for the parsers that understand it: the spec says that unrecognised lines are ignored, so older robots.txt parsers will simply skip your Crawl-delay lines.
A group consists of one or more User-agent lines followed by one or more rules. The group is terminated by the next User-agent line or by the end of the file. The last group may have no rules, in which case it implicitly allows everything.
For example:
    User-agent: a
    Crawl-delay: 5
    User-agent: b
    Disallow: /
If the parser for User-agent: a does not understand Crawl-delay, the line is ignored and the file is effectively:
    User-agent: a
    User-agent: b
    Disallow: /
Now a and b form a single group, and both are blocked from everything. Therefore, write Allow or Disallow explicitly:
    User-agent: a
    Disallow:
    Crawl-delay: 5
    User-agent: b
    Disallow: /
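As a quick sanity check, Python's standard-library urllib.robotparser (which does understand Crawl-delay) can parse the corrected file. The agent names a and b and the path /page are just the placeholders from the example above:

```python
from urllib import robotparser

ROBOTS_TXT = """\
User-agent: a
Disallow:
Crawl-delay: 5
User-agent: b
Disallow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())
rp.modified()  # mark the file as "fetched" so queries return real results

print(rp.crawl_delay("a"))         # 5    (a's group: allow all, delay 5)
print(rp.can_fetch("a", "/page"))  # True
print(rp.can_fetch("b", "/page"))  # False (b's group disallows everything)
```

The explicit empty Disallow terminates a's group, so the Crawl-delay applies only to a and the Disallow: / applies only to b.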