robots.txt

Note that Crawl-delay is not part of the original robots.txt specification, but it's no problem to include it for parsers that understand it, since the spec defines that unrecognised headers are ignored.
So older robots.txt parsers will simply ignore your Crawl-delay lines.

web crawler - Robots.txt - What is the proper format for a Crawl Delay for multiple user agents? - Stack Overflow
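
As a quick check of that behaviour, Python's standard-library urllib.robotparser simply skips lines it does not recognize. This is only a sketch; "Some-future-directive" is a made-up directive used purely for illustration:

from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.parse("""\
User-agent: *
Some-future-directive: whatever
Disallow: /private/
""".splitlines())

# The unknown line is ignored; the surrounding rules still apply.
print(rp.can_fetch("mybot", "https://example.com/private/x"))  # False
print(rp.can_fetch("mybot", "https://example.com/public"))     # True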

One or more user-agent lines, followed by one or more rules. A group is terminated by a user-agent line or end of file. The last group may have no rules, which means it implicitly allows everything.

https://developers.google.com/search/reference/robots_txt
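
As a sketch of this grouping rule, Python's standard-library urllib.robotparser also treats consecutive User-agent lines as one group sharing the rules that follow (blank-line handling varies between parsers, so the two User-agent lines are kept adjacent here):

from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.parse("""\
User-agent: a
User-agent: b
Disallow: /
""".splitlines())

# Both agents belong to the same group, so both are blocked.
print(rp.can_fetch("a", "https://example.com/"))  # False
print(rp.can_fetch("b", "https://example.com/"))  # False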

For example:

User-agent: a
Crawl-delay: 5

User-agent: b
Disallow: /

If the crawler matching User-agent: a does not interpret Crawl-delay, the file effectively reduces to:

User-agent: a

User-agent: b
Disallow: /

Since consecutive User-agent lines with no rules between them form a single group, crawler a could then inherit b's Disallow: /. To avoid this, write Allow or Disallow explicitly:

User-agent: a
Disallow:
Crawl-delay: 5

User-agent: b
Disallow: /
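
A sketch checking this explicit version with Python's standard-library urllib.robotparser (other parsers may behave differently; the URL is just a placeholder):

from urllib import robotparser

ROBOTS_TXT = """\
User-agent: a
Disallow:
Crawl-delay: 5

User-agent: b
Disallow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# The empty Disallow makes agent a's group explicit: everything is allowed.
print(rp.can_fetch("a", "https://example.com/page"))  # True
print(rp.crawl_delay("a"))                            # 5
# Agent b is blocked from the whole site.
print(rp.can_fetch("b", "https://example.com/page"))  # False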

References
https://www.robotstxt.org/robotstxt.html

A robots.txt standardization spec has (finally) been submitted - Qiita