Why does 1+1=2?
The proof starts from the Peano Postulates, which define the natural numbers N. N is the smallest set satisfying these postulates:
P1. 1 is in N.
P2. If x is in N, then its “successor” x’ is in N.
P3. There is no x such that x’ = 1.
P4. If x isn’t 1, then there is a y in N such that y’ = x.
P5. If S is a subset of N, 1 is in S, and the implication
(x in S => x’ in S) holds, then S = N.
Then you have to define addition recursively:
Def: Let a and b be in N. If b = 1, then define a + b = a’
(using P1 and P2). If b isn’t 1, then let c’ = b, with c in N
(using P4), and define a + b = (a + c)’.
Then you have to define 2:
Def: 2 = 1′
2 is in N by P1, P2, and the definition of 2.
Theorem: 1 + 1 = 2
Proof: Use the first part of the definition of + with a = b = 1.
Then 1 + 1 = 1′ = 2 Q.E.D.
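The recursive definition above can also be run. Here is a minimal Python sketch (the names `One`, `Succ`, `add`, etc. are my own modeling choices, not part of the proof): each natural is built from 1 by successor applications, and + follows the two clauses of the definition exactly.

```python
from dataclasses import dataclass

class Nat:
    pass

@dataclass(frozen=True)
class One(Nat):          # P1: 1 is in N
    pass

@dataclass(frozen=True)
class Succ(Nat):         # P2: if x is in N, so is its successor x'
    pred: Nat

ONE = One()
TWO = Succ(ONE)          # Def: 2 = 1'

def add(a: Nat, b: Nat) -> Nat:
    """The recursive definition of + given above."""
    if b == ONE:                 # if b = 1, then a + b = a'
        return Succ(a)
    return Succ(add(a, b.pred))  # otherwise b = c', and a + b = (a + c)'

assert add(ONE, ONE) == TWO      # Theorem: 1 + 1 = 2
```

The assertion is exactly the theorem: with a = b = 1, the first clause fires and returns the successor of 1, which is 2 by definition.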
Hope this answer helps you.
As an aside, if you like the idea of proving theorems by programming, you can try a dependently typed language, like the Haskell-based language Agda:
To make this concrete, here is the same idea in Agda:
* First let us define natural numbers
data Nat : Set where
  zero : Nat         -- zero is a natural number
  suc  : Nat -> Nat  -- every natural number has a successor
one = suc zero       -- definition of 1
two = suc one        -- definition of 2
* Addition gets defined as follows
_+_ : Nat -> Nat -> Nat
zero + y = y
(suc x) + y = suc (x + y)
* Now let us define equality
data _==_ {A : Set}(x : A) : A -> Set where
  refl : x == x  -- everything is equal to itself
* Finally, let us prove a theorem
* Types are propositions
* Objects of a type are proofs of the proposition
thm : (one + one) == two  -- thm is a proof of 1+1=2
thm = refl                -- and here is the proof
Of course, this is not much different from what we can already do at runtime in Haskell. However, Agda’s type-checker allows us to check proofs statically (that is, if your proof compiles then it is correct), and it can also prove more powerful theorems, like:
* first let us prove a simple lemma
suc-subs : (x y : Nat) -> x == y -> suc x == suc y
suc-subs x .x refl = refl
* This can of course be generalized to any type
* and any function from that type
* then we can prove a less trivial theorem
plus-assoc : (x y z : Nat) -> ((x + y) + z) == (x + (y + z))
plus-assoc zero y z = refl
plus-assoc (suc x) y z = suc-subs ((x + y) + z) (x + (y + z)) (plus-assoc x y z)
* this is a proof by induction on x,
* using our lemma for the inductive step
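To see why the static proof matters, here is a hedged runtime counterpart in Python (my own modeling choices, not the Agda code): it mirrors the zero-based Nat and _+_ above, but can only spot-check associativity on finitely many cases, whereas plus-assoc proves it for all naturals at once.

```python
# Zero-based Peano naturals, modeled as nested tuples.

ZERO = "zero"

def suc(n):          # mirrors: suc : Nat -> Nat
    return ("suc", n)

def add(x, y):       # mirrors: zero + y = y ; (suc x) + y = suc (x + y)
    if x == ZERO:
        return y
    return suc(add(x[1], y))

def from_int(n):     # helper to build test values
    return ZERO if n == 0 else suc(from_int(n - 1))

# A finite spot-check of associativity; a proof it is not.
for a in range(5):
    for b in range(5):
        for c in range(5):
            x, y, z = from_int(a), from_int(b), from_int(c)
            assert add(add(x, y), z) == add(x, add(y, z))
```

Passing this loop gives confidence, but only Agda's type-checker turns "checked for these cases" into "true for every natural number".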
Thanks
I think the best way to capture the intuitive necessity of 1+1=2 is to say that whatever is 2 is also 1+1, and whatever is 1+1 is also 2. So, to avoid mapping “+” to some empirical act, the whole ‘1+1’ is instead to be taken as a concept.
I think of it this way. If there is a pair of things in front of us (or a pair of things imagined, assumed, etc.), we can either a) put attention on the pair qua pair, or b) put attention on each of the individuals as distinct from the other. I think that a) corresponds to something being determined as 2, and b) corresponds to something being determined as 1+1. While the pair is in front of me, I can switch my determination of it between a) and b), in a similar way to what happens with the Necker cube. Somewhere in there, I think, the necessity is “seen” (or intuited/comprehended).
So, I think that what happens in 1+1=2 is not identifying the left and right sides as something separate, but rather the necessity that if something is determined as 2, it will also be possible to determine it as 1+1 (and the other way around).
BTW, I don’t think this is a truth based on the definitions of the terms, or on some formalism. I think a person can know what ‘1’ is and what ‘2’ is, and not know whether 1+1=2. (Of course, that doesn’t mean that those symbols cannot be used in a formalism that would map in some way onto the intuitive understanding, which I think is the case with the idea of the axiomatization of math.)
1+1=2 because of language. Maths was invented by humans. Humans decided that a single object would be called one, and that if you had two single objects, you could put them together and that would be called two. Without humans there would be no human languages and therefore no maths.
“Actually, I am trying to ask a third question ‘what is the proof that 1+1=2; is it empirical or not?”
Does it aid our discussion to note that an “empirical” proof (which, presumably, refers to a thing known by way of experience) itself requires the proposition (i.e., 1 + 1 = 2) that you want to have proven?
What I mean to say is this: At the very least, our notion of experience requires two things: a subject/experiencer and an object to be experienced. (I will disregard the possibility of other necessary factors for an experience, e.g., other people, more than one distinct object of experience, more than one moment of experience, etc.) This much seems necessary: experience requires two distinct entities. Together they form the experience. If either of these two did not exist, an experience would not occur. Therefore, you have built into your notion of experience the concepts of a thing and another distinct thing which, taken together, form a whole.
So, if we wished, we may substitute our factors here with symbols. The “subject,” taken as a distinct entity, may be referred to as “1.” The same would go for the distinct “object.” Finally, the experience at large could be symbolized as “2” (i.e., the presence of both factors).
Now, even if it were necessary to prove empirically the relationships of contingent/factual objects to one another, we seem to have the assumption of 1 + 1 = 2 already built into our concept of experience. Empirical proof of anything requires the assumption of this proposition.
So, can we prove something which is already assumed within our method of proof?
“One” and “two” are just words to define specific quantities. Two literally means two of one. I know that breaks every rule of defining a word, but it can’t be stated better. When you have two ones, you have two.
There’s a pretty great book called “The Number Devil” or something that talks about Roman numerals and a time before there was a “zero.” Roman numerals really can explain 1+1=2.
If you have |, and add another |, you have ||. | became known as 1, and || became known as 2. It’s that simple.
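That tally picture can even be run. Here is a toy Python sketch (the string representation is my own choice): a number is a string of strokes, and addition is just putting the strokes side by side.

```python
# A number n is written as n tally strokes; "+" is concatenation.
one = "|"
two = "||"

def tally_add(a: str, b: str) -> str:
    return a + b  # putting the two groups of strokes together

assert tally_add(one, one) == two  # | plus | is ||
```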