Why Is Dividing by Zero Undefined?
In mathematics, every operation and its result has a rational explanation, because the subject is built on a foundation of simple, essentially bulletproof principles. If we change some of those principles, that rock-solid foundation shifts in intriguing ways.
Many students relearn the same principles over and over during their mathematical education. One thing they hear constantly is that “you can never divide a number by zero.” Once they enter high school, they hear almost daily that “a number divided by zero yields an undefined result.”
Though dividing a number by zero seems very basic at the outset, it is not. If we ponder the concept for a moment, we notice that it becomes increasingly difficult to wrap one’s mind around. Because mathematicians have found no consistent way to define the outcome of dividing a number by zero, it remains undefined. And for many, that lack of a definition is still quite difficult to process.
But how can a discipline as rigorous as mathematics fail to define an operation as simple as this?
First, before tackling division by zero, let’s assume we already know how to add and subtract. To better understand division, we can approach the matter in the following way:
For example, let’s divide the number 1 by numbers that get closer and closer to zero.
1/0.1 equals 10.
1/0.01 equals 100.
1/0.001 equals 1000.
1/0.0001 equals 10000.
1/0.00001 equals 100000.
1/0.000001 equals 1000000.
1/0.0000001 equals 10000000.
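The pattern above can be sketched in a few lines of Python. This is just an illustration of the same computation, not part of the original argument; I use the standard library’s `fractions.Fraction` so the arithmetic stays exact rather than floating-point.

```python
from fractions import Fraction

# Divide 1 by values shrinking toward zero; the quotient grows without bound.
for n in range(1, 8):
    divisor = Fraction(1, 10 ** n)  # 1/10, 1/100, ..., 1/10000000
    print(f"1 / {float(divisor)} equals {1 / divisor}.")

# At zero itself, no quotient is defined; Python simply raises an error.
try:
    1 / Fraction(0)
except ZeroDivisionError as error:
    print("1 / 0 ->", error)
```

Each step multiplies the result by ten, which is exactly the behavior the list above shows: as the divisor approaches zero, the quotient does not settle on any value at all.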