With Google there are no problems, but Bing does seem to cache the page; all the bot has to do is reload it.
Whitelisting crawler IPs seems like a tedious and not particularly good solution to me.
I’ve added this header to my nginx config: `add_header Cache-Control "max-age=0, no-store, no-cache, must-revalidate";`. Do you think it will help?
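For context, here is a minimal sketch of where such a directive might sit in my config, assuming a typical reverse-proxy setup (the server name and upstream are hypothetical):

```nginx
server {
    listen 80;
    server_name example.com;  # hypothetical

    location / {
        # Ask clients and intermediaries not to cache or store responses.
        # "always" makes nginx emit the header on error responses too,
        # not just 2xx/3xx (supported since nginx 1.7.5).
        add_header Cache-Control "max-age=0, no-store, no-cache, must-revalidate" always;

        proxy_pass http://backend;  # hypothetical upstream
    }
}
```

One caveat I ran into while reading the docs: without the `always` flag, `add_header` only applies to 2xx and 3xx responses, and headers set in a `location` block replace (rather than merge with) any `add_header` directives inherited from the `server` level.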
The protection is Lua-based, in case that’s relevant. Thank you.