-1 points

Eh, if you need special rules for 0.999… because the special rules for all other repeating decimals failed, I think we should just accept that the system doesn’t work here. We can keep using the workaround, but stop telling people they’re wrong for using the system correctly.

The deeper understanding of numbers where 0.999… = 1 is obvious needs a foundation of much more advanced math than just decimals, at which point decimals stop being a system and are just a quirky representation.

Saying decimals are a perfect system is the issue I have here, and I don’t think this will go away any time soon. Mathematicians like to speak in absolute terms where everything is either perfect or discarded, yet decimals seem to be too simple and basic to get that treatment. No one seems to be willing to admit the limitations of the system.

1 point

The system works perfectly; it just looks wonky in base 10. In base 3, 0.333… is written 0.1, exactly 0.1.
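The base-3 claim is easy to check mechanically: repeatedly multiplying the fractional part by the base reads off one digit at a time, and for 1/3 in base 3 the expansion stops after a single digit. A quick sketch (the `digits` helper is mine, not from the thread):

```python
from fractions import Fraction

def digits(x, base, n):
    """Read off the first n fractional digits of x in the given base."""
    out = []
    for _ in range(n):
        x *= base
        d = int(x)      # the integer part is the next digit
        out.append(d)
        x -= d          # keep only the fractional part
    return out

print(digits(Fraction(1, 3), 10, 6))  # [3, 3, 3, 3, 3, 3] — repeats forever
print(digits(Fraction(1, 3), 3, 6))   # [1, 0, 0, 0, 0, 0] — exact after one digit
```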

0 points

Oh, the fundamental math works fine; it’s the imperfect representation, the infinite decimal, that is flawed. Every base has at least one.

2 points

No one in their right mind uses decimals as a formalisation of numbers, or as a representation when doing arithmetic.

But the way I learned decimal division and multiplication in primary school actually supported periods. Spotting whether the thing will repeat forever can be done in finite time. Constant time, actually.
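For what it’s worth, the “will it repeat forever” test comes down to the reduced denominator: the expansion terminates exactly when every prime factor of the denominator also divides the base. A sketch of that check (the `terminates` helper is mine):

```python
from math import gcd

def terminates(num, den, base=10):
    """True if num/den has a finite expansion in the given base."""
    den //= gcd(num, den)           # reduce the fraction first
    g = gcd(den, base)
    while g > 1:                    # strip out every prime factor shared with the base
        while den % g == 0:
            den //= g
        g = gcd(den, base)
    return den == 1

print(terminates(1, 8))    # True:  0.125
print(terminates(1, 3))    # False: 0.333…
print(terminates(1, 3, base=3))  # True: 0.1 in base 3
```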

The deeper understanding of numbers where 0.999… = 1 is obvious needs a foundation of much more advanced math than just decimals

No. If you can accept that 1/3 is 0.333…, then you can multiply both sides by three and accept that 1 is 0.99999… Primary school kids understand that. It’s a bit odd, but it’s a necessary consequence of restricting your notation from supporting arbitrary divisions to only divisions by powers of ten. And that doesn’t make decimal notation worse than rational notation, or better; it makes it different. Rational notation has its own issues, like also not having unique forms (2/6 = 1/3) and comparisons (larger/smaller) not being obvious. Various arithmetic on them is also more complicated.
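The primary-school long-division procedure really does detect periods: once a remainder repeats, the digits cycle from that point on. A rough Python version of that idea (names are mine, “(…)” marks the repeating block):

```python
def decimal_expansion(num, den):
    """Long division with remainder tracking; the repeating part goes in parentheses."""
    digits, seen = [], {}
    rem = num % den
    while rem and rem not in seen:
        seen[rem] = len(digits)          # remember where this remainder first appeared
        rem *= 10
        digits.append(str(rem // den))
        rem %= den
    head = str(num // den) + "."
    if rem == 0:
        return head + ("".join(digits) or "0")
    i = seen[rem]                        # the cycle starts where the remainder repeated
    return head + "".join(digits[:i]) + "(" + "".join(digits[i:]) + ")"

print(decimal_expansion(1, 3))   # 0.(3)
print(decimal_expansion(1, 7))   # 0.(142857)
print(decimal_expansion(1, 8))   # 0.125
```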

The real take-away is that depending on what you do, one is more convenient than the other. And that’s literally all that notation is judged by in maths: Is it convenient, or not.

0 points

I never commented on the convenience or usefulness of any method, just tried to explain why so many people get stuck on 0.999… = 1 and are so recalcitrant about it.

If you can accept that 1/3 is 0.333… then you can multiply both sides by three and accept that 1 is 0.99999…

This is a workaround of the decimal flaw using algebraic logic. Trying to hold both systems as fully correct leads to a conflict, and reiterating the algebraic logic (or any other proof) is just restating the problem.

The problem goes away easily once we understand the limits of the decimal system, but we need to state that the system is limited! Otherwise we get conflicting answers and nothing makes sense.

3 points

The problem goes away easily once we understand the limits of the decimal system, but we need to state that the system is limited!

But the system is not limited: It has a representation for any rational number. Subjectively you may consider it inelegant, you may consider its use in some area inconvenient, but it is formally correct and complete.

I bet there are systems where rational numbers have unique representations (never looked into it), and I also bet that they’re awkward AF to use in practice.

This is a workaround of the decimal flaw using algebraic logic.

The representation has to reflect algebraic logic, otherwise it would indeed be flawed. It’s the algebraic relationships that are primary to numbers, not the way in which you happen to put numbers onto paper.

And, honestly, if you can accept that 1/3 == 2/6, what’s so surprising about decimal notation having more than one valid representation for one and the same number? If we want our results to look “clean” with rational notation, we have to normalise the fraction from 2/6 to 1/3, and if we want them to look “clean” with decimal notation, we, well, have to normalise the notation, from 0.999… to 1. Exact same issue in a different system, and no one complains about it.
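The analogy can be made concrete: reducing 2/6 to 1/3 and reading 0.(9) as exactly 1 are both just normalisation. A sketch using Python’s `fractions` (the `from_repeating` helper is mine; the standard formula puts the repeating block over 10^p − 1):

```python
from fractions import Fraction

def from_repeating(intpart, nonrep, rep):
    """Exact value of intpart.nonrep(rep), e.g. 0.1(6) = from_repeating(0, "1", "6")."""
    val = Fraction(intpart)
    if nonrep:
        val += Fraction(int(nonrep), 10 ** len(nonrep))
    if rep:
        # A repeating block of length p contributes rep / (10^p - 1), shifted past nonrep
        val += Fraction(int(rep), (10 ** len(rep) - 1) * 10 ** len(nonrep))
    return val

print(from_repeating(0, "", "9"))        # 1 — 0.(9) normalises to exactly 1
print(from_repeating(0, "", "3"))        # 1/3
print(Fraction(2, 6) == Fraction(1, 3))  # True — the same normalisation story for fractions
```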


Science Memes

!science_memes@mander.xyz
