Dear Uncle Colin,

I’m trying to work out $0.03 \div 0.1$. The answer is apparently 0.3, but I don’t understand why a division would make something bigger!

- Division Isn’t Very Intuitive, Decimals Especially

Hi, DIVIDE, and thanks for your message! There are several ways to tackle this. I’ll start with a mechanical method, then a more explanatory one.


Remember that when you’re dividing, you can multiply the first and second number by the same (non-zero) thing without changing the result: $6 \div 2$ is the same as $60 \div 20$ and as $18 \div 6$ - they all give you 3.

You’re working out $\frac{3}{100} \div \frac{1}{10}$ - and I might multiply both numbers by 100 to make this nicer. That gives $3 \div 10$, which is 0.3.
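If you like, you can check the scaling trick with a quick Python sketch - I’m using exact fractions rather than floats, since 0.03 and 0.1 can’t be represented exactly in binary:

```python
from fractions import Fraction

a = Fraction(3, 100)  # 0.03
b = Fraction(1, 10)   # 0.1

# Multiplying both numbers by 100 doesn't change the quotient...
assert a / b == (a * 100) / (b * 100)

# ...and the quotient is 3/10, i.e. 0.3
print(a / b)
```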


Imagine you’re doing a subtraction problem. When you subtract a positive number, your original number gets smaller. When you subtract 0, it doesn’t change; when you subtract a negative number, it gets bigger.

There’s a similar thing going on with division: if you divide by something bigger than 1, your original number gets smaller. When you divide by 1, it doesn’t change; when you divide by something between 0 and 1, it gets bigger - and that’s what’s happening here! You’re dividing by a number between 0 and 1.
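A rough sketch of that pattern, dividing 6 by divisors on either side of 1 (the exact printed decimals are down to floating point, but the shrink/stay/grow pattern is the point):

```python
# Divisors bigger than 1 shrink the result, 1 leaves it alone,
# and divisors between 0 and 1 make it bigger.
for d in (2, 1, 0.5, 0.1):
    print(f"6 / {d} = {6 / d}")
```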

Another way to ask the question is to say “what would I multiply by 0.1 (the bottom) to get 0.03 (the top)?” Multiplying by 0.1 is the same as dividing by 10, so we’re asking what we’d divide by 10 to get 0.03, and the answer to that is 0.3.
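That chain of reasoning can be verified the same way (again with exact fractions, to sidestep floating-point rounding):

```python
from fractions import Fraction

answer = Fraction(3, 10)  # the claimed answer, 0.3

# Multiplying the answer by 0.1 gives back 0.03 (the top)...
assert answer * Fraction(1, 10) == Fraction(3, 100)

# ...and multiplying by 0.1 is the same as dividing by 10
assert answer / 10 == Fraction(3, 100)
```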

I hope that helps!

- Uncle Colin