seo – What is the effect of changing the robots.txt file's access permissions?

I want to configure my server to return a 403 for robots.txt so that only my static IP can access the file directly, and no one else.
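(For context, what's described would look something like this hypothetical Apache 2.4 `.htaccess` sketch, with 203.0.113.10 standing in for the static IP; every other client would receive a 403:)

```
# Hypothetical sketch: serve robots.txt only to one assumed static IP
# (203.0.113.10); all other requests are denied with a 403 Forbidden.
<Files "robots.txt">
    Require ip 203.0.113.10
</Files>
```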

This doesn’t make sense. The only reason to have a robots.txt file in the first place is for it to be publicly accessible, so that search engines can determine which URLs, if any, should not be crawled.

Blocking robots.txt with a 403 HTTP response is effectively the same as not having a robots.txt file at all, i.e. you are permitting unrestricted crawl access to your site. Google, for example, documents that it treats any 4xx response for robots.txt as if the file does not exist.

If you don’t want a robots.txt file (i.e. you want robots to crawl everything) then simply don’t include one and let requests for it return 404. Alternatively, create an empty (or minimal) robots.txt file to prevent a plethora of 404s in your access logs.
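For instance, a minimal robots.txt that permits everything could look like this (an empty Disallow directive blocks nothing):

```
# Minimal robots.txt: applies to all crawlers and blocks nothing.
User-agent: *
Disallow:
```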

If you want to prevent crawling of certain URLs on your site, then you need a robots.txt file and it needs to be publicly accessible. Access to this file should not be restricted; rules like the ones below only take effect if crawlers can actually fetch them.
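For example (the `/private/` and `/tmp/` paths are hypothetical placeholders):

```
# Example robots.txt: ask all crawlers not to crawl two hypothetical paths.
User-agent: *
Disallow: /private/
Disallow: /tmp/
```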

Why do some people do that?

I’m curious: where did you get this idea from?