python 3.x – Passing queue handles between worker processes, preferably so that only involved workers have access

I am planning to use mptools in a project. This is the architecture of the processes and queues as I envision them right now:

[Architecture diagram of the processes and queues]

This is somewhat similar to what Pamela (the author of mptools) implemented in the example in the mptools GitHub project and explains in her talk at a Python conference in 2019.

In my project, I have more queues that are written to by several other processes. However, Pamela’s example has only one, the event_q, which she creates in the base ProcWorker class in line 132.

I am new to Python OOP and wonder: how would I share the handles to those queues between the processes? For example, can I extend the ProcWorker class in my code so that all other workers inherit not event_q, but rather status_q, io_q, calc_q, and plot_q? raw_q and calc_q are only written to by io-worker and calc-worker, respectively, and could perhaps (how?) be passed only between the two workers involved.
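To make clearer what I mean by "passed just between the two involved workers", here is a minimal sketch with plain multiprocessing (not mptools); the worker functions and queue names are just stand-ins for my io-worker and calc-worker:

```python
import multiprocessing as mp

def io_worker(raw_q, io_q):
    # io_worker is the only writer to raw_q
    raw_q.put("sample")
    io_q.put("io done")

def calc_worker(raw_q, calc_q):
    # calc_worker reads raw data and is the only writer to calc_q
    data = raw_q.get()
    calc_q.put(f"calculated {data}")

if __name__ == "__main__":
    raw_q = mp.Queue()   # handed only to io_worker and calc_worker
    io_q = mp.Queue()
    calc_q = mp.Queue()

    # each process receives only the queue handles it actually needs
    p1 = mp.Process(target=io_worker, args=(raw_q, io_q))
    p2 = mp.Process(target=calc_worker, args=(raw_q, calc_q))
    p1.start(); p2.start()
    p1.join(); p2.join()

    print(io_q.get())
    print(calc_q.get())
```

Here no worker ever sees a queue it is not involved with, because the handles travel only through each Process's args. I would like to achieve the same selectivity within the mptools class hierarchy.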

The only solution I see with my limited Python skills is to edit the file and add all those queues in the __init__() of ProcWorker:

    def __init__(self, name, startup_event, shutdown_event,
                 status_q, plot_q, io_q, calc_q, raw_q, *args):
        self.name = name
        self.log = functools.partial(_logger, f'{name} Worker')
        self.startup_event = startup_event
        self.shutdown_event = shutdown_event
        self.status_q = status_q
        self.plot_q = plot_q
        self.io_q = io_q
        self.calc_q = calc_q
        self.raw_q = raw_q
        self.terminate_called = 0

That is a bad solution, though. I actually hope there is a better way to share the queue handles between the processes than dumping them all into that __init__(). Please advise!
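Is subclassing roughly what I should be doing instead? A rough sketch of what I imagine, with a stand-in base class since I don't know mptools' internals well enough; the init_args hook routing the extra positional arguments is my assumption, not necessarily how the library actually works:

```python
# Stand-in for mptools' ProcWorker (an assumption about its shape,
# not the real class): extra positional args go to init_args(),
# which subclasses override to grab exactly the queues they need.
class ProcWorker:
    def __init__(self, name, *args):
        self.name = name
        self.init_args(args)

    def init_args(self, args):
        # base class expects no extra args
        if args:
            raise ValueError(f"unexpected args: {args}")

class CalcWorker(ProcWorker):
    # this worker is handed only raw_q and calc_q
    def init_args(self, args):
        self.raw_q, self.calc_q = args

# each worker type declares which queue handles it takes
w = CalcWorker("CALC", "raw_q_handle", "calc_q_handle")
```

That way the base class would stay untouched and each subclass would receive only the queues it is involved with. Is this the idiomatic pattern, or is there a better one?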