I was thinking of something similar to robots.txt, which well-behaved bots consult when crawling a website. In robots.txt I can define the User-agent, Allow and Disallow directives.
My goal is to also communicate a request rate limit to the bots, telling them, for example, that they are not allowed to exceed xxx requests per second, minute, etc.
I know how to put a hard limit in place, but the goal here is not to block them.
You need to check each bot's documentation for mechanisms to "throttle crawling" (a useful search term).
For example, https://developers.google.com/search/docs/crawling-indexing/reduce-crawl-rate is Google's guide on how to control Googlebot's crawl rate.
There is also the unofficial Crawl-Delay directive in robots.txt that some bots understand. More details can be found at https://websiteseochecker.com/blog/robots-txt-crawl-delay-why-we-use-crawl-delay-getting-started/.
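As a rough illustration, a robots.txt along these lines asks compliant crawlers to wait between requests. The Bingbot group and the 10-second value are just example choices; support for Crawl-Delay and its exact interpretation (seconds between requests vs. a general pacing hint) vary by bot, and Googlebot ignores the directive entirely.

    # Ask Bingbot to wait roughly 10 seconds between requests
    User-agent: Bingbot
    Crawl-Delay: 10

    # Default rules for all other crawlers
    User-agent: *
    Crawl-Delay: 5
    Disallow: /private/

Because it is unofficial, treat Crawl-Delay as a polite request rather than an enforced limit, and keep your hard rate limiting in place as the backstop.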