Posted on 12/08/2006 12:20:06 PM PST by LibWhacker
Schoolchildren from Caversham have become the first to learn a brand new theory that dividing by zero is possible using a new number - 'nullity'. But the suggestion has left many mathematicians cold.
Dr James Anderson, from the University of Reading's computer science department, says his new theorem solves an extremely important problem - the problem of nothing.
"Imagine you're landing on an aeroplane and the automatic pilot's working," he suggests. "If it divides by zero and the computer stops working - you're in big trouble. If your heart pacemaker divides by zero, you're dead."
Computers simply cannot divide by zero. Try it on your calculator and you'll get an error message.
But Dr Anderson has come up with a theory that proposes a new number - 'nullity' - which sits outside the conventional number line (stretching from negative infinity, through zero, to positive infinity).
'Quite cool'
The theory of nullity is set to make all kinds of sums possible that, previously, scientists and computers couldn't work around.
"We've just solved a problem that hasn't been solved for twelve hundred years - and it's that easy," proclaims Dr Anderson having demonstrated his solution on a whiteboard at Highdown School, in Emmer Green.
"It was confusing at first, but I think I've got it. Just about," said one pupil.
"We're the first schoolkids to be able to do it - that's quite cool," added another.
Despite being a problem tackled by the famous mathematicians Newton and Pythagoras without success, it seems the Year 10 children at Highdown now know their nullity.
S/he didn't think s/he could get your undivided attention.
Yup. But airplanes can't navigate to "NaN". I had forgotten that IEEE floating point has a representation for infinity. If divide-by-zero returned that, and it worked OK in the following calculations, that would be jake. But NaN doesn't compute.
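The distinction the post is drawing can be sketched in a few lines of Python, whose floats are IEEE 754 doubles (note one caveat: plain `1.0/0.0` in Python raises `ZeroDivisionError` rather than quietly returning infinity, so the special values are built directly here):

```python
import math

# IEEE 754 defines special values for infinity and "not a number".
inf = float("inf")
nan = float("nan")

assert inf + 1.0 == inf       # infinity propagates through ordinary arithmetic
assert math.isnan(inf - inf)  # ...until an indeterminate form produces NaN
assert nan != nan             # NaN compares unequal to everything, itself included
```

The last line is what "NaN doesn't compute" means in practice: once a NaN appears, every comparison against it is false, so downstream logic silently goes down unexpected branches.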
Stubborn math professors who don't comprehend that there is a real world out there drive how these things operate, and it makes things difficult. You've got to put in special code to pre-test for this stuff just to make them feel good about the universe.
I am a programmer with a fairly extensive mathematical background. However, the schooling has had a fair number of years (not yet approaching infinity) to wear off. Your answer did not at all offend me.
Actually, because "i" stands for current, the EE jocks have to use "j".
Yeah, and explain it to a guy with missing fingers . . .
Is nullity divided by nullity zero, or nullity?
Hi, Tinman!... See my comment at post #70. It's the standard explanation you get in algebra class for not allowing division by zero. In short, no matter which numerator you try to divide zero into, you'll either run into a problem of nonexistence (of an answer) or a problem of non-uniqueness (of an answer). So division by zero is left undefined for all numerators.
Like this.
a = b
a^2 = a*b
a^2-b^2 = a*b-b^2
(a+b)(a-b) = b(a-b)
(a+b) = b [dividing both sides by (a-b), which is zero, since a = b]
a+a = a
2a = a
2 = 1
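The nonexistence/non-uniqueness argument from post #70 can also be checked mechanically. A minimal Python sketch (the finite range of candidates is just for illustration):

```python
# "a/b = x" is shorthand for "b*x = a". Checking candidate answers
# over a finite range of integers illustrates the two failure modes:

# Nonexistence: no x satisfies 0*x == 1, so 1/0 has no answer.
assert not any(0 * x == 1 for x in range(-1000, 1001))

# Non-uniqueness: every x satisfies 0*x == 0, so 0/0 has every answer.
assert all(0 * x == 0 for x in range(-1000, 1001))
```

The bogus proof above exploits the second case: dividing by (a-b) quietly assumes the answer is unique when (a-b) is zero.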
Of course not. But the point is for the program to be able to catch the NaN before applying it, and do something appropriate instead.
I have yet to see anyone on this thread offer any compelling reason that "nullity" would be useful in a computer program.
It'd have some applications in discrete math, which is what computer scientists gravitate towards instead of high-level calculus.
I have the feeling that 99.9% of all college professors teaching discrete math would be violently opposed to the concept of "nullity".
I do not think that crossing the equator would be any reason for concern. I suspect this has been "worked around" many times and that there are many redundant systems in aircraft to take care of the problem.
"Loved those TRS-80's. Give me!"
I still have my TRS-80 Color Computer! It has 64k RAM AND Extended Basic!!!
Yeah. But if I thought I might get that result, I don't think I'd pick such a limited data type as INT. But that's just me.
I don't see how the concept of "nullity" does that.
By giving hardware manufacturers an excuse. Mathematics may not need the concept of a "nullity", but humans do.
But, nevermind. I'm sure the mathematics geeks will get all offended by this and make such a stink that the concept will die and the hardware changes won't be made.
I'm sure centuries from now, programmers will be pre-testing for zero and handling the issue manually, and dealing with computer crashes from nowhere when they forget.
No, he calls it a "theorem." That means he claims to have a proof.
I'll go away now. <)8-(
;-)
Regards
for small values of 5
exactly!