1) If all predictions of a classifier have the same confidence, then the cross-entropy loss is minimized when that confidence equals the accuracy (proof).

This seems to reward calibration: with a fixed shared confidence, the loss cannot be lowered by reporting a confidence different from the actual accuracy.
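A quick numerical check of (1), in the binary case: fix an accuracy a (the value 0.8 below is arbitrary), let every prediction carry confidence c, and scan the average loss -(a log c + (1-a) log(1-c)) over a grid of confidences. The minimizer lands at c = a.

```python
import numpy as np

a = 0.8  # accuracy (arbitrary value for the check)
c = np.linspace(0.01, 0.99, 9801)  # grid of candidate shared confidences

# Average cross entropy when every prediction has confidence c:
# correct predictions contribute -log(c), incorrect ones -log(1 - c).
loss = -(a * np.log(c) + (1 - a) * np.log(1 - c))

c_star = c[np.argmin(loss)]
print(c_star)  # minimizing confidence equals the accuracy a = 0.8
```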


2) There is a symmetry that explains why the normalization constants of the Multinomial distribution and its conjugate Dirichlet prior have similar forms.

To see it, look at the integral of a product of exponentials when the integral is taken over the space of bases versus when it is taken over the space of powers.
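One way to make the symmetry concrete (notation mine): the Dirichlet normalizer arises from integrating the product \(\prod_i p_i^{\alpha_i - 1}\) over the bases \(p\) (powers fixed), while the Multinomial normalizer makes the sum of \(\prod_i p_i^{x_i}\) over the powers \(x\) (bases fixed) equal one:

```latex
\int_{\Delta_{k-1}} \prod_{i=1}^{k} p_i^{\alpha_i - 1}\, dp
  = \frac{\prod_{i=1}^{k} \Gamma(\alpha_i)}{\Gamma\!\left(\sum_{i=1}^{k} \alpha_i\right)},
\qquad
\sum_{x_1 + \dots + x_k = n} \frac{\Gamma(n+1)}{\prod_{i=1}^{k} \Gamma(x_i + 1)}
  \prod_{i=1}^{k} p_i^{x_i} = 1 .
```

Both normalizing constants are ratios of Gamma functions of the powers: one comes from letting the bases vary over the simplex, the other from letting the powers vary over the integer compositions of n.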


3) The statement A->B does not have a negation.

This contradicts classical electrical engineering theory, where truth tables give NOT(A -> B) = A AND NOT B.


4) The nth Catalan number C_n

n prime -> C_n - 2 is divisible by n
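A sketch verifying (4) for the first few primes, using the closed form C_n = (2n choose n) / (n + 1):

```python
from math import comb

def catalan(n):
    """nth Catalan number via the closed form C_n = C(2n, n) / (n + 1)."""
    return comb(2 * n, n) // (n + 1)

# For prime n, C_n - 2 should be divisible by n.
primes = [2, 3, 5, 7, 11, 13, 17, 19, 23]
residues = [(catalan(p) - 2) % p for p in primes]
print(residues)  # all zeros
```

For example, C_5 = 42 and 42 - 2 = 40 is divisible by 5; the claim fails for composite n in general (e.g. C_4 = 14, and 14 - 2 = 12 is divisible by 4, but C_6 = 132 gives 130, which is not divisible by 6).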