protocol – (How/Why) – Work vs Block Size Limit and Leading Zeros

TL;DR
How did Satoshi/the community decide an appropriate block-size-limit?
Why does the protocol increase mining difficulty by adding zeros?

I have some limited experience with the cryptographic concepts underlying Bitcoin's protocols.
I have two questions.
Given that there is an upper limit on block size, perhaps the easiest way to breach it would be to arrive at some exceedingly large nonce that becomes necessary due to the other block contents. The question is: how do you set an appropriate limit for arbitrary blocks?
Would you need to know how to invert the hashing algorithm in order to prove the efficacy of a given limit for an arbitrary ledger/header/nonce set?

My second question is about the relevance of increasing the number of leading zeros over time. Would it not be equivalent (with respect to work done) to set a total number of zeros? That is, would you need to know how to invert the hash algorithm in order to prove that difficulty increases by adding leading zeros, as opposed to any other “pattern” of hex characters of the same cardinality?
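For intuition on the pattern question, here is a toy sketch (not Bitcoin's actual validity check, which numerically compares the block hash against a target value; leading zeros are a byproduct of a small target). Over a small 16-bit "hash" space, requiring a fixed hex digit at a fixed position admits exactly the same number of valid values whether that digit is `0` or any other digit, so the expected search work is identical:

```java
public class PatternWork {
    // Count 16-bit values (4 hex digits) whose top hex digit equals `digit`.
    // Each choice of digit carves out the same fraction (1/16) of the space.
    static int countMatching(char digit) {
        int count = 0;
        for (int v = 0; v < 0x10000; v++) {
            String hex = String.format("%04x", v);
            if (hex.charAt(0) == digit) {
                count++;
            }
        }
        return count;
    }

    public static void main(String[] args) {
        // Both constraints leave 16^3 = 4096 valid values out of 16^4 = 65536.
        System.out.println(countMatching('0')); // leading zero
        System.out.println(countMatching('a')); // arbitrary fixed digit
    }
}
```

Since a cryptographic hash is modeled as uniform over its output space, the same counting argument applies per constrained digit; no inversion of the hash is needed, only the uniformity assumption.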

Please forgive and correct my understanding of the issues in question.

java – Typecasting an annotation – how/why does that work?

Can you help me understand how/why the typecast at (1) works?
I could not find anything indicating that Java annotation types can be typecast to an Interface type.

org.junit.Test //this is an annotation
junit.framework.Test //this is an interface
import org.junit.Test;
import junit.framework.AssertionFailedError;
import junit.framework.TestResult;
public class TestJunit3 extends TestResult {
   // add the error
   public synchronized void addError(Test test, Throwable t) {
      super.addError((junit.framework.Test) test, t); //how does this typecast work (1)
   }
   // add the failure
   public synchronized void addFailure(Test test, AssertionFailedError t) {
      super.addFailure((junit.framework.Test) test, t);
   }
   @Test
   public void testAdd() {
      // add any test
   }
   // Marks that the test run should stop.
   public synchronized void stop() {
      //stop the test here
   }
}
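For reference, the general Java rule at play can be sketched with two hypothetical interfaces (`A` and `B` below are illustrative stand-ins, not the JUnit types): a cast between two unrelated, non-final interface types always compiles, because some class could implement both; whether it succeeds is decided only at runtime.

```java
interface A {}
interface B {}

// A class may implement both interfaces, which is why the compiler
// cannot reject a cast from A to B outright.
class Both implements A, B {}

public class CastDemo {
    public static void main(String[] args) {
        A ok = new Both();
        B fine = (B) ok; // compiles and succeeds: the object implements B

        A bad = new A() {}; // anonymous class implementing only A
        try {
            B fails = (B) bad; // also compiles, but throws at runtime
        } catch (ClassCastException e) {
            System.out.println("ClassCastException at runtime");
        }
    }
}
```

An annotation type is itself a form of interface in Java, so a cast from it to another interface type falls under this same compile-time rule.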

How/why is the switch made from an equation to an inequality in the certificate of optimality for the dual problem in linear programming?

If the required conditions are met, then you have

$$ x_1 + 6x_2 \leq (y_1+y_3)x_1 + (y_2+y_3)x_2 = y_1 x_1 + y_2 x_2 + y_3(x_1 + x_2) \leq 200y_1 + 300y_2 + 400y_3$$

By looking at the leftmost and rightmost expressions you get the inequality you want.
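The underlying primal/dual pair is not shown in this excerpt, but the coefficients suggest (this is an assumption) the standard setup

$$\text{Primal: } \max\; x_1 + 6x_2 \quad \text{s.t. } x_1 \leq 200,\; x_2 \leq 300,\; x_1 + x_2 \leq 400,\; x_1, x_2 \geq 0$$

$$\text{Dual: } \min\; 200y_1 + 300y_2 + 400y_3 \quad \text{s.t. } y_1 + y_3 \geq 1,\; y_2 + y_3 \geq 6,\; y_1, y_2, y_3 \geq 0$$

Under that reading, the first $\leq$ follows from dual feasibility ($y_1+y_3 \geq 1$, $y_2+y_3 \geq 6$) together with $x_1, x_2 \geq 0$, and the second $\leq$ follows from primal feasibility ($x_1 \leq 200$, $x_2 \leq 300$, $x_1+x_2 \leq 400$) together with $y_1, y_2, y_3 \geq 0$. The middle step is an equality because it merely regroups terms.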

google search – How/why did non-www version of page on my site get indexed?

For some reason there are 3 or 4 pages on my site for which the non-www version is the one that Google has indexed. I 301 redirect all requests to https and www in .htaccess, but it's very odd to me that when I search for My Company Name demo, the only result for the corresponding page on my site does not have the www (https://my-company-name.com/request-demo). However, if I search for My Company Name free trial, the result for the corresponding page does include the www (https://www.my-company-name.com/free-trial). The pages are almost identical in code; in fact, the free trial page was created by cloning the demo page in WordPress.

Now that I have the redirect to www in place, this is no longer problematic, but for the sake of consistency, curiosity and avoiding introducing potential problems down the line (for SEO, will the www version of this page be considered “duplicate content”?), I’d love to “un-index” the non-www version and have the search results show (and link to) the www version.
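The actual rules aren't shown here, but the kind of .htaccess 301 redirect described above is typically written along these lines (a sketch, assuming Apache with mod_rewrite enabled; adjust to your own host setup):

```apache
RewriteEngine On
# If the request is plain http OR the host lacks the www prefix...
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
# ...capture the bare host and 301 to the canonical https://www. URL.
RewriteCond %{HTTP_HOST} ^(?:www\.)?(.+)$ [NC]
RewriteRule ^ https://www.%1%{REQUEST_URI} [R=301,L]
```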

Any ideas on how I can go about achieving this? And perhaps more importantly, does anyone have an explanation as to why/how this is happening?