Trying to show that $\mathbb{E}(1/X) = 1/\mathbb{E}(X)$ is not true in general.

I know that one can use Jensen's inequality and the like, but I am wondering whether there is a more elementary proof, one that does not involve measure theory or Lebesgue integration. For example, something like the following:

If we use LOTUS for the discrete case,
\begin{align}
\mathbb{E}(g(X)) = \sum_{x} g(x) f_X(x),
\end{align}

then for $g(X) = 1/X$ (assuming $X$ never takes the value $0$) we obtain
\begin{align}
\mathbb{E}\left(\frac{1}{X}\right) &= \sum_{x} \frac{f_X(x)}{x}, \\
\frac{1}{\mathbb{E}(X)} &= \frac{1}{\sum_{x} x f_X(x)}.
\end{align}
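
As a quick sanity check with a concrete pmf of my own choosing: take $X$ uniform on $\{1, 2\}$, so that $f_X(1) = f_X(2) = \tfrac{1}{2}$. Then
\begin{align}
\mathbb{E}\left(\frac{1}{X}\right) = \frac{1}{2}\cdot 1 + \frac{1}{2}\cdot\frac{1}{2} = \frac{3}{4},
\qquad
\frac{1}{\mathbb{E}(X)} = \frac{1}{\frac{1}{2}\cdot 1 + \frac{1}{2}\cdot 2} = \frac{2}{3},
\end{align}
so the two expressions already differ for this simple distribution.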

If we nonetheless suppose the two general expressions are equal, we obtain the equation
\begin{align}
\left(\sum_x \frac{f_X(x)}{x}\right)\left(\sum_x x f_X(x)\right) = 1.
\end{align}

So equality would force the product of those two sums to be exactly $1$. Is there any conclusion we can draw from here?
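
For what it's worth, here is a quick numerical check of the sums above (just a rough sketch; the two-point pmf and the variable names are my own arbitrary choices):

```python
import numpy as np

# Arbitrary two-point pmf for X (illustrative values only):
# P(X = 1) = P(X = 2) = 1/2.
support = np.array([1.0, 2.0])
pmf = np.array([0.5, 0.5])

# E[X] and, via LOTUS, E[1/X] are plain weighted sums in the finite discrete case.
e_X = np.sum(support * pmf)        # E[X]   = 1.5
e_inv_X = np.sum(pmf / support)    # E[1/X] = 0.75

print("E[1/X] =", e_inv_X)         # 0.75
print("1/E[X] =", 1.0 / e_X)       # 0.666...

# The product that would have to equal 1 if E[1/X] = 1/E[X]:
print("E[1/X] * E[X] =", e_inv_X * e_X)   # 1.125, i.e. > 1 here

# For a degenerate pmf (all mass on one point) the product is exactly 1.
deg = np.array([1.0, 0.0])
print(np.sum(deg / support) * np.sum(support * deg))  # 1.0
```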