The CSAM scanning is not being done by a government agency. Apple is working with the National Center for Missing & Exploited Children (NCMEC), a private non-profit organization.
In its recently released "Expanded Protections for Children: Frequently Asked Questions" document, Apple stated:
Q: Could governments force Apple to add non-CSAM images to the hash list?
A: No. Apple would refuse such demands and our system has been designed to prevent that from happening. We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future. Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government’s request to expand it.