The first time I heard the idea that .999… = 1, it was explained to me like this:

1/3 = .333…

(1/3) × 3 = (.333…) × 3

1 = .999…

One divided by 3 equals point 3 repeating. If you multiply both sides by 3, you get 1 equals point 9 repeating.
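If you want to check the fraction side of this algebra (the fractions only, not the repeating decimal, which is the part in dispute below), Python's `fractions` module does exact rational arithmetic:

```python
from fractions import Fraction

# Exact rational arithmetic: one third, multiplied by 3, is exactly 1.
one_third = Fraction(1, 3)
product = one_third * 3
print(product)  # prints 1
```

This confirms only that 3 × (1/3) = 1 as fractions; whether 1/3 equals .333… is a separate question.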

When I first saw this proof, my mind was blown. I had heard that .999 repeating was exactly equal to 1 rather than almost, but I had never seen it proven. This was proof. I shared this with several people, and I was so proud to be “in the know”. But upon further examination, I see that there are a couple of problems with this seemingly simple proof, as I will demonstrate in this post.

The first is that 1 divided by 3 does not equal point 3 repeating. It’s *almost* equal. The explanation is long but educational. Bear with me on this one.

When we were kids first learning about division, we were taught a mathematical model that I’ll refer to as the Remainder Model. When we divided 10 by 3, we got “3 with a remainder of 1” (written as 3R1). If you wanted to check the math, you’d do it in reverse by multiplying by the number you divided by and then adding the remainder, so 3(3) + 1 gives you 10. This allowed us to divide numbers that don’t split evenly and still get an exact answer.
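The Remainder Model and its reverse check are exactly what Python's built-in `divmod` gives you; here is a short sketch:

```python
# The Remainder Model: integer division keeps an exact remainder.
quotient, remainder = divmod(10, 3)
print(quotient, remainder)  # prints 3 1, i.e. "3R1"

# Check the math in reverse: multiply by the divisor, then add the remainder.
assert 3 * quotient + remainder == 10
```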

When we got older we replaced the Remainder Model with the Decimal Model. Instead of stopping at 3R1, we’d drop a zero from behind our original 3 (because it’s actually 3.000…) and then divide our remainder of 1, with a zero dropped behind it (making 10), by 3 again, giving us 3.3. Now while we didn’t do this in school (at least I know *I* didn’t), we *could have* used both models together to get an exact answer: all you’d have to do is carry the remainder as many times as you drop zeroes from behind the 3. Dividing 10 by 3 just a couple of times gave us 3.3R.1 (checked by multiplying 3 by 3.3 to get 9.9 and then adding the remainder of .1 to get 10), doing it three times gave us 3.33R.01, and doing it ten times gave us 3.333333333R.000000001. Through inductive reasoning, we can see that if you divided 10 by 3 an infinite number of times you’d get 3.333…R.000…01. That remainder may get infinitely small, but it never disappears: no matter how many times you do the division, it will always carry a remainder.
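The carried-remainder procedure described above can be sketched in Python. Here `steps` counts digits past the decimal point (so `steps=3` corresponds to the 3.333R.001 stage), and the remainder is kept as an exact fraction so the reverse check from the Remainder Model still holds at every stage:

```python
from fractions import Fraction

def divide(dividend, divisor, steps):
    """Long division carried `steps` digits past the decimal point,
    keeping the exact leftover remainder at that scale."""
    whole, rem = divmod(dividend, divisor)
    digits = ""
    for _ in range(steps):
        rem *= 10                        # drop a zero behind the dividend
        digit, rem = divmod(rem, divisor)
        digits += str(digit)
    quotient = Fraction(whole) + Fraction(int(digits or "0"), 10 ** steps)
    remainder = Fraction(rem, 10 ** steps)
    # Reverse check, as in the Remainder Model:
    assert divisor * quotient + remainder == dividend
    return quotient, remainder

q, r = divide(10, 3, 3)
print(q, r)  # prints 3333/1000 1/1000, i.e. 3.333 with a remainder of .001
```

At every finite stage the remainder shrinks by a factor of ten but stays nonzero, which is the pattern the paragraph above describes.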

Thus, 10 divided by 3 does not equal 3.333… but rather *almost* equals 3.333… because there’s a remainder. It’s easy to see why this is also true of dividing 1 by 3. In our proof above, we don’t actually get 1 = .999… but rather 1 = .999… with a remainder of .000…01.

I said above that there were “a couple of problems”, and this rounding error is just the first. The second one is with multiplying .333… by 3. “Why would that be a problem?” you may ask. Well, I don’t see it as a personal problem, but rather one for my opponents. There are those who would “reify” .333… (see my last post) and suggest that this problem is unsolvable because multiplication requires that both numbers have a finite end. After all, how else would you even start such a math problem? The answer is easy — through inductive reasoning. We know that any finite string of 3s multiplied by 3 will give us an equally long string of 9s, and there’s no reason to think this will change just because the string of 3s is infinitely long. But this is a problem for some people, and I’ve seen this proof less and less over time because mathematicians have realized that allowing this multiplication creates an obvious double standard. While this may seem a trivial problem compared to the first one, it is major enough to prevent many mathematicians from even offering this proof any more.
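The finite version of the inductive pattern appealed to above (a run of n 3s, multiplied by 3, gives a run of n 9s) is easy to check mechanically for as many finite lengths as you like:

```python
# A finite run of n 3s, times 3, is a run of n 9s.
# Checked here for the first 50 lengths.
for n in range(1, 51):
    threes = int("3" * n)
    nines = int("9" * n)
    assert threes * 3 == nines
print("pattern holds for n = 1 through 50")
```

This, of course, only verifies finite cases; the step from every finite length to an infinitely long string is exactly the inductive leap under discussion.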

One final note: While I welcome criticism, please don’t tell me that my math is wrong unless you can suggest how it could have been done right. Don’t tell me that there is no remainder when you divide 1/3 unless you can demonstrate how it is done evenly. If you can’t offer an alternative explanation, there’s a good chance that there isn’t one.