The cardinalities of the two intervals [0,1] and [0,2] are equal; i.e., every number in the former can be paired with a unique number in the latter and vice versa (multiply or divide by two).
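To make that pairing concrete, here's a minimal Python sketch of the bijection (the names `f` and `g` are just for illustration):

```python
# f(x) = 2x maps [0, 1] onto [0, 2]; g(y) = y / 2 inverts it,
# so every point of one interval pairs with exactly one point of the other.

def f(x: float) -> float:
    """Map a point of [0, 1] to a point of [0, 2]."""
    return 2 * x

def g(y: float) -> float:
    """Map a point of [0, 2] back to its partner in [0, 1]."""
    return y / 2

for x in [0.0, 0.25, 0.5, 1.0]:
    y = f(x)
    assert g(y) == x  # the round trip recovers the original point
    print(f"{x} <-> {y}")
```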
However, in statistics, if you have a continuous variable uniformly distributed on the interval [0, 2] and you want the probability that its value falls in [0, 1], you divide 1 by 2 — not because there are "twice as many" points in the whole interval (there aren't; the cardinalities match), but because the whole interval is twice as long as the sub-interval. Probability here goes by length, not by point count.
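As a quick sanity check, here's a small Monte Carlo sketch (the sample size `N` is arbitrary); the empirical fraction should come out near 1/2:

```python
import random

# Estimate P(X in [0, 1]) for X uniform on [0, 2] by sampling.
N = 1_000_000
hits = sum(1 for _ in range(N) if random.uniform(0.0, 2.0) <= 1.0)
print(hits / N)  # ~0.5, matching length([0,1]) / length([0,2])
```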
In other words, in the weird theoretical (cardinality) sense, the reals contain just as many numbers as any interval of the reals; but outside those cases, for things like probability and length, you should treat the intervals as different sizes.
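To put the two senses of "size" side by side (the symbols 𝔠 for the cardinality of the continuum and λ for length are my notation, not the commenters'):

```latex
% Same cardinality (via the bijection x \mapsto 2x), different length:
|[0,1]| = |[0,2]| = \mathfrak{c}
\lambda([0,1]) = 1 \neq 2 = \lambda([0,2])
% Probability goes by length, so for X \sim \mathrm{Uniform}[0,2]:
P\bigl(X \in [0,1]\bigr) = \frac{\lambda([0,1])}{\lambda([0,2])} = \frac{1}{2}
```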
If between 0 and 1 there are an infinite number of real numbers, then between 0 and 2 there are twice as many real numbers, IIRC my college math. I probably don't.
Isn’t it a bit like saying “there’s obviously more real numbers between 0 and 2 than between 0 and 1”? Which, to my knowledge, is a false statement.
In math they'd both be equal (the same cardinality).