SEO penalty for preventing "good bots" from making billable API calls?

The pages on my website generate 8 to 10 times more bot traffic than real users. I see about 90 distinct bots in my logs, and they are all "good bots" by definition: mostly search engine crawlers, SEO indexers (backlink checkers and the like), and digital ad networks. My problem is cost. I use the Google Maps and Places APIs, which charge a unit cost per API call. I know how to keep bots from crawling pages, which works 100% of the time for me, as does my ability to keep them from making billable API calls.

My question is: what is the SEO impact of denying a bot the content supplied by those APIs? I display restaurant location text data obtained from Google Places and Yelp Fusion, and plot the locations on an embedded Google Map.

I plan to implement a strategy of allowing only certain search engine bots to access the APIs. Should I do that, or can I simply serve bots a generic map image and placeholder restaurant location data without an SEO penalty? I show a page for each restaurant with the address, city, state, ZIP code, and phone number obtained from Google Places or the Yelp Fusion API. Yelp is free, but I will exceed their quota if I fulfill every bot request.
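
For concreteness, here is a rough sketch of the gating I have in mind, assuming a Python/Flask backend. The route, template name, helper functions, and allowed-bot list are simplified placeholders rather than my production code, and a real implementation would verify crawlers via reverse DNS instead of trusting the User-Agent header.

```python
# Sketch only: gate billable API calls by user agent (assumed Flask app).
import os
import requests
from flask import Flask, request, render_template

app = Flask(__name__)

# Bots I would still allow to trigger billable lookups (assumption).
ALLOWED_BOT_TOKENS = ("googlebot", "bingbot")

def is_bot(user_agent: str) -> bool:
    # Crude heuristic; a production check would verify via reverse DNS.
    ua = (user_agent or "").lower()
    return any(token in ua for token in ("bot", "crawler", "spider"))

def is_allowed_bot(user_agent: str) -> bool:
    ua = (user_agent or "").lower()
    return any(token in ua for token in ALLOWED_BOT_TOKENS)

@app.route("/restaurant/<place_id>")
def restaurant_page(place_id):
    ua = request.headers.get("User-Agent", "")

    if is_bot(ua) and not is_allowed_bot(ua):
        # Unapproved bots get a static map image and placeholder data;
        # no billable Google Places or Yelp Fusion call is made.
        return render_template(
            "restaurant.html",          # hypothetical template
            details=None,
            map_image="/static/generic-map.png",
        )

    # Real users and allowed bots trigger the billable Places Details call.
    resp = requests.get(
        "https://maps.googleapis.com/maps/api/place/details/json",
        params={
            "place_id": place_id,
            "fields": "name,formatted_address,formatted_phone_number",
            "key": os.environ["GOOGLE_MAPS_API_KEY"],
        },
        timeout=5,
    )
    details = resp.json().get("result", {})
    return render_template("restaurant.html", details=details, map_image=None)
```

With something like this, real users and the approved crawlers see the live address, phone number, and map, while every other bot gets a page with the generic map image and no API call, which is exactly the behavior I am unsure about from an SEO standpoint.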