Logistic regression – derive the conditional distribution $P(X \mid Y)$ from a logit model

Let's say we have two random variables $X_1 \sim N(\mu, \sigma^2)$ and $X_2 \sim \mathrm{Bern}(0.5)$. The binary outcome variable $Y$ is generated from
\begin{align}
P(Y = 1 \mid X_1, X_2) = \frac{e^{\beta_0 + \beta_1 X_1 + \beta_2 X_2}}{1 + e^{\beta_0 + \beta_1 X_1 + \beta_2 X_2}}
\end{align}

$Y \sim \mathrm{Bern}\big(P(Y = 1 \mid X_1, X_2)\big)$

Is there a simple way to compute the conditional distributions $P(X_1 \mid Y = 1)$, $P(X_1 \mid Y = 0)$, $P(X_2 \mid Y = 1)$, and $P(X_2 \mid Y = 0)$? I think both $X_1$ and $X_2$ will keep their original distributional families (normal stays normal, Bernoulli stays Bernoulli).

We can make the conditional independence assumption if necessary, i.e. $P(X_1, X_2 \mid Y) = P(X_1 \mid Y)\,P(X_2 \mid Y)$.
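One way to probe these conditionals is Bayes' rule plus simulation. Since $X_2$ is binary, $X_2 \mid Y$ is automatically Bernoulli, and its parameter follows from $P(X_2 = 1 \mid Y = 1) = P(Y = 1 \mid X_2 = 1)\,P(X_2 = 1)/P(Y = 1)$; for $X_1$, the same simulation lets you inspect the conditional density directly. A minimal Monte Carlo sketch in Python (the values of $\mu$, $\sigma$, and the $\beta$'s are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
mu, sigma = 0.0, 1.0          # assumed parameters of X1 (illustration only)
b0, b1, b2 = -0.5, 1.0, 0.8   # assumed logit coefficients (illustration only)

x1 = rng.normal(mu, sigma, n)
x2 = rng.binomial(1, 0.5, n)
p = 1.0 / (1.0 + np.exp(-(b0 + b1 * x1 + b2 * x2)))  # P(Y=1 | X1, X2)
y = rng.binomial(1, p)

# Bayes' rule for the Bernoulli parameter of X2 | Y=1:
#   P(X2=1 | Y=1) = P(Y=1 | X2=1) * P(X2=1) / P(Y=1)
p_y1_given_x2 = p[x2 == 1].mean()   # Monte Carlo estimate of P(Y=1 | X2=1)
p_y1 = p.mean()                     # Monte Carlo estimate of P(Y=1)
bayes = p_y1_given_x2 * 0.5 / p_y1

empirical = x2[y == 1].mean()       # direct empirical P(X2=1 | Y=1)
print(bayes, empirical)             # the two estimates agree closely
```

The same conditioning trick applied to `x1[y == 1]` gives a sample from $P(X_1 \mid Y = 1)$, whose histogram can be compared against a fitted normal to test the conjecture above.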

Composing the logit function with its inverse does not return the input at full precision

I have defined the following two functions:

logit[x_] := Log[x/(1 - x)];
invLogit[x_] := E^x/(1 + E^x);

Each is the inverse of the other. However,

In[37]:= logit[invLogit[34.55555]]

Out[37]= 34.6574
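This is not Mathematica-specific: for $x \approx 34.6$, $\mathrm{invLogit}(x)$ is within a few ulps of $1.0$, so the complement $1 - p$ that logit needs survives with only about one decimal digit of accuracy in any double-precision arithmetic. The same drift is reproducible in Python:

```python
import math

# Double-precision transcriptions of the two Mathematica definitions.
def logit(p):
    return math.log(p / (1.0 - p))

def inv_logit(x):
    return math.exp(x) / (1.0 + math.exp(x))

x = 34.55555
p = inv_logit(x)   # p rounds to within a few ulps of 1.0
print(1.0 - p)     # true complement ~9.8e-16; only ~1 digit survives rounding
print(logit(p))    # the round trip no longer returns 34.55555
```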

Is it possible to increase the accuracy of calculations?
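Yes, by avoiding machine precision for the intermediate value. In Mathematica the usual remedy is to give the input explicit arbitrary precision (e.g. ``34.55555`50``, a 50-digit number) so that the complement of `invLogit[x]` keeps enough digits. The same idea sketched in Python with the standard-library `decimal` module:

```python
from decimal import Decimal, getcontext

getcontext().prec = 50  # 50 significant digits instead of double's ~16

def logit(p):
    return (p / (1 - p)).ln()

def inv_logit(x):
    return 1 / (1 + (-x).exp())

x = Decimal("34.55555")
roundtrip = logit(inv_logit(x))
print(roundtrip)  # agrees with 34.55555 to well over 30 digits
```

With 50 working digits, `inv_logit(x)` stores the complement $1 - p \approx 9.8 \times 10^{-16}$ to roughly 35 significant digits, so the round trip recovers $x$ essentially exactly; the precision of the result degrades gracefully as `getcontext().prec` is lowered toward 16.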