lo.logic – Disjunction in weakened Robinson arithmetic

Let $T$ denote the theory obtained from Robinson arithmetic by removing the axiom $\forall x\,(x = 0 \lor \exists y\, S y = x)$ and restricting double negation elimination to disjunction-free formulas. In other words, $T$ is axiomatized over intuitionistic logic by the arithmetical axioms of Robinson arithmetic other than $\forall x\,(x = 0 \lor \exists y\, S y = x)$, plus double negation elimination for disjunction-free formulas.

I’ve been playing around with $T$ for quite a while (months, on and off), and it seems that no genuine theorem containing a positive disjunction can be proven in $T$. By genuine I mean theorems that are not proven by mere logic, such as those of the form $A \rightarrow A \lor B$, or $A \lor B \rightarrow C \lor D$ where $A$ implies $C$ and $B$ implies $D$. For example, I couldn’t find any way of proving $\forall x\,(x < 2 \rightarrow x = 0 \lor x = 1)$. Since neither induction nor classical logic is at hand, none of the usual lines of attack seems to work, at least as far as I could check. I’m really not sure about the unprovability of this sentence either, as I couldn’t find any useful method for showing that it is not provable.
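For concreteness (this is just standard intuitionistic logic, not specific to $T$), the two "mere logic" patterns above are derivable by disjunction introduction and elimination alone:

```latex
\[
A \vdash A \lor B \qquad (\lor\text{-introduction})
\]
\[
A \rightarrow C,\;\; B \rightarrow D,\;\; A \lor B \;\vdash\; C \lor D
\qquad (\lor\text{-elimination: case split on } A \lor B)
\]
```

The question is about disjunctions that, unlike these, would need the arithmetical axioms of $T$ to be proved.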

So, to avoid any complexity that could come from considering more general theorems containing positive disjunctions, I ask my question in the following concrete form:

Is there a method for showing that $\forall x\,(\exists y\, x + S y = 2 \rightarrow x = 0 \lor x = 1)$ is provable/unprovable in $T$?

Verifying disjointness between subsets of a poset

Given a poset $(P, \le)$ and two sets $X \subseteq P$ and $Y \subseteq P$, together with a function $f: P^2 \to 2$ that efficiently computes, for every $(x, y) \in P^2$, whether there is a $z \in P$ such that $(x \le z) \wedge (y \le z)$, we want to return $\mathbf{T}$ if there is a pair $(x, y) \in X \times Y$ such that $f(x, y) = 1$, and $\mathbf{F}$ otherwise, using as few calls to $f$ and $\le$ as possible.
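As a baseline, here is a minimal brute-force sketch in Python (the names `compatible_pair_exists` and the divisibility example are my own, not from the question); it makes $|X| \cdot |Y|$ calls to $f$, and the question is precisely whether the structure of $\le$ lets one do better:

```python
from itertools import product

def compatible_pair_exists(X, Y, f):
    """Return True iff some (x, y) in X x Y has f(x, y) == 1,
    i.e. x and y share an upper bound in P.
    Naive strategy: up to |X| * |Y| calls to f."""
    return any(f(x, y) == 1 for x, y in product(X, Y))

# Toy example: P ordered by divisibility; z is an upper bound
# of x and y iff both x and y divide z.
P = [1, 2, 3, 6]

def f(x, y):
    return 1 if any(z % x == 0 and z % y == 0 for z in P) else 0

print(compatible_pair_exists([2], [3], f))  # True: 6 is above both 2 and 3
```

Any improvement over this baseline would have to prune pairs using comparisons under $\le$ (e.g. if $x \le x'$, then $f(x', y) = 1$ implies $f(x, y) = 1$).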

information theory – Upper bound for set disjointness on product distributions

I came across this draft of a textbook, where Exercise 6.8 states that two-party set disjointness can be solved with an expected number of $O(n^{2/3} \log^2 n)$ bits of communication if Alice's and Bob's inputs are sampled independently.

Consider the following protocol. If there is a coordinate $j \in [n]$ such that $H(X_j)$ and $H(Y_j)$ are both at least $\epsilon$, then Alice and Bob communicate $X_j$, $Y_j$. They condition on the values they see and repeat this step until no such coordinate can be found. At this point Alice and Bob use Shannon's coding theorem to encode $X$, $Y$. Show how to adjust $\epsilon$ so that the expected communication can be bounded by $n^{2/3} \log^2 n$. Hint: use the fact that $H(X_j) \ge \epsilon$ implies that $\Pr[X_j = 1] \ge \Omega(\epsilon / \log(1/\epsilon))$.

I guess the idea is to first communicate all the coordinates where $X$ and $Y$ have large entropy, and then use the fact that the remaining coordinates should have small entropy. However, the details of the protocol, and where the independence of $X$ and $Y$ comes in, are not clear to me.
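As a numerical sanity check on the hint (a sketch of mine, not part of the exercise): for $p \le 1/2$, the binary entropy $H$ is increasing, so the smallest $p$ with $H(p) \ge \epsilon$ can be found by bisection and compared against $\epsilon / \log_2(1/\epsilon)$:

```python
import math

def H(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def min_p_with_entropy(eps, lo=1e-12, hi=0.5, iters=100):
    """Smallest p in (0, 1/2] with H(p) >= eps, by bisection
    (valid since H is increasing on [0, 1/2])."""
    for _ in range(iters):
        mid = (lo + hi) / 2
        if H(mid) >= eps:
            hi = mid
        else:
            lo = mid
    return hi

# Hint: H(X_j) >= eps forces Pr[X_j = 1] >= Omega(eps / log(1/eps)).
for eps in (0.1, 0.01, 0.001):
    p = min_p_with_entropy(eps)
    print(eps, p, eps / math.log2(1 / eps))
```

Empirically the ratio between the bisection answer and $\epsilon / \log_2(1/\epsilon)$ stays bounded below by a constant, which is exactly the $\Omega(\cdot)$ claim in the hint; the protocol then uses this to argue that the low-entropy coordinates that remain are mostly zero, so Shannon coding of them is cheap.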