I'm not qualified to enter the debate about what should happen on divide by zero. I'll take whatever the language offers and go with that.
But, just as a curiosity, I remember a debate in my first programming language (APL) about what 0/0 should be:
1. Error - since division by zero is an error (or however the language defines it)
2. Zero - since zero divided by anything is zero
3. One - since anything divided by itself is one
It's been a long time, but I think the language architects decided on option 3.
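For what it's worth, that choice survives in at least one modern implementation. A minimal session sketch, assuming Dyalog APL (not necessarily the implementation from the original debate), where the ⎕DIV system variable controls division-by-zero behavior:

      0÷0        ⍝ default (⎕DIV←0): 0÷0 is 1, per "anything divided by itself"
1
      1÷0        ⍝ ...but a nonzero value divided by zero is still an error
DOMAIN ERROR
      ⎕DIV←1     ⍝ alternate rule: anything divided by zero is 0
      0÷0
0
      1÷0
0

So under the default setting you get option 3 for 0÷0 and option 1 for everything else, and flipping ⎕DIV gives you option 2 across the board.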