Well, you can naturally have zero of something. In fact, you have zero of most things right now.
0 is not a natural number. 0 is a whole number.
The set of whole numbers is the union of the set of natural numbers and 0.
Does the set of whole numbers not include negatives now? I swear it used to.
Integer == whole
I just found out about this debate and it’s patently absurd. The ISO 80000-2 standard defines ℕ as including 0 and it’s foundational in basically all of mathematics and computer science. Excluding 0 is a fringe position and shouldn’t be taken seriously.
I could be completely wrong, but I doubt any of my (US) professors would reference an ISO definition, and they may not even know it exists. Mathematicians in my experience are far less concerned about the terminology or symbols used to describe something, as long as they're clearly defined. In fact, they'll probably make up their own symbology just because it's slightly more convenient for their proof.
From what I understand, you can pay ISO to standardise anything. So it's only useful for interoperability.
Yeah, interoperability. Like every software implementation of natural numbers that include 0.
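For what it's worth, here's a quick sketch of what that convention looks like in practice (Python, whose built-in counting constructs happen to start at zero; this illustrates the convention rather than proving anything about "every implementation"):

```python
# Counting primitives in Python start at 0, matching the
# zero-inclusive convention for the natural numbers.
from itertools import count, islice

# count() begins at 0 by default
first_five = list(islice(count(), 5))
print(first_five)  # [0, 1, 2, 3, 4]

# Likewise, range(n) enumerates the n smallest naturals
# under the zero-inclusive convention.
assert list(range(3)) == [0, 1, 2]
```

Haskell's `Numeric.Natural` type and the unsigned integer types in C, Rust, etc. make the same choice: their minimum value is 0.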
Ehh, among American academic mathematicians, including 0 is the fringe position. It’s not a “debate,” it’s just a different convention. There are numerous ISO standards which would be highly unusual in American academia.
FWIW I was taught that the inclusion of 0 is a French tradition.
Counterpoint: if you say you have a number of things, you have at least two things, so maybe 1 is not a number either. (I’m going to run away and hide now)
I'd learned somewhere along the line that the Natural numbers (that is, the set ℕ) are all the positive integers and zero. Without zero, I was told, these were the Whole numbers. I see on Wikipedia (as I was digging up that Unicode symbol) that this is contested now. Seems very silly.