For example, one may formulate the axioms of probability theory as an extension of ordinary two-valued logic. From a handful of elementary axioms capturing what it means for degrees of belief in combinations of propositions (on some evidence) to be consistent, one can derive a differential equation whose solution yields what is often taken as the definition of conditional probability in mathematical probability theory:
c * f(X and Y | Z) = f(X | Y and Z) * f(Y | Z)
Here c is the constant of integration in the solution to the differential equation, and it turns out also to be the sum of the probabilities of a set of mutually exclusive, exhaustive events. It is not determined by any side conditions, so it is set to 1 by convention; but any other value would do. [0] Negative probabilities are a different beast, best viewed as algebraic extensions to probability theory in the manner that, e.g., the integers are algebraic extensions of the natural numbers (i.e., by including additive inverses). But I am not very knowledgeable on the subject, so I will say no more.
0. For details, see Cox (1961), /The Algebra of Probable Inference/ (recently back in print) or Jaynes (2003), /Probability Theory: The Logic of Science/. Both are excellent books, the former covering probability theory and entropy as extensions of logical reasoning, and the latter covering all that and much else of Bayesian probability and statistics.
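A quick numerical sketch of the point about c (the joint distribution, the helper names, and the choice c = 2.5 are all my own toy assumptions, not anything from Cox or Jaynes): if you rescale ordinary probabilities by any nonzero constant, f = c * P, the product rule in the form above still holds.

```python
# Toy check that c * f(XY|Z) = f(X|YZ) * f(Y|Z) holds when
# f = c * P for an arbitrary nonzero c. The joint table below
# is an arbitrary normalized distribution over three binary
# propositions; all names here are illustrative.

joint = {
    (0, 0, 0): 0.05, (0, 0, 1): 0.10,
    (0, 1, 0): 0.15, (0, 1, 1): 0.20,
    (1, 0, 0): 0.05, (1, 0, 1): 0.15,
    (1, 1, 0): 0.10, (1, 1, 1): 0.20,
}

def P(pred):
    """Probability of the event picked out by pred(x, y, z)."""
    return sum(p for k, p in joint.items() if pred(*k))

def cond(a, b):
    """Ordinary conditional probability P(A|B) = P(A and B) / P(B)."""
    return P(lambda x, y, z: a(x, y, z) and b(x, y, z)) / P(b)

c = 2.5                       # any nonzero constant works here
f = lambda a, b: c * cond(a, b)

X  = lambda x, y, z: x == 1
Y  = lambda x, y, z: y == 1
Z  = lambda x, y, z: z == 1
XY = lambda x, y, z: x == 1 and y == 1
YZ = lambda x, y, z: y == 1 and z == 1

lhs = c * f(XY, Z)            # c * f(X and Y | Z)
rhs = f(X, YZ) * f(Y, Z)      # f(X | Y and Z) * f(Y | Z)
assert abs(lhs - rhs) < 1e-12
```

Setting c = 1 recovers the usual convention, but nothing in the rule itself forces that choice.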
Well, you have basically two choices -- one and zero. Any nonzero real number is trivially equivalent to 1.
It's not obvious to me, though, that 0 would work just as well?
I think it's especially important to be able to make a distinction between truth and falsehood, however. If they are both 0, this is difficult. :-)
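For what it's worth, a tiny sketch of that point (toy values of my own choosing): with c = 0, the rescaling f = c * P sends every plausibility to 0, so the product rule holds only vacuously (0 = 0) and a certain proposition becomes indistinguishable from an impossible one.

```python
# With c = 0, f = c * P collapses: certainty (P = 1) and
# impossibility (P = 0) map to the same plausibility, so the
# scale carries no information.
c = 0.0
P_true, P_false = 1.0, 0.0        # certainty and impossibility in the c = 1 convention
f_true, f_false = c * P_true, c * P_false
print(f_true == f_false)          # True: truth and falsehood collapse
```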