Well, now we know where the French surrender to the Muslims has gone - it's in Ottawa surrendering to the EVIL genius of Dr. Computer Math!!!
I'll bet the ancient Greeks, Mayans, and Aztecs would dispute the premise that human brains can't handle math.
I'll also bet that medical science will dispute the argument that our brains have shrunk and can't handle today's math problems.
Finally, I'll bet that this idiot has simply become lazy and grown too reliant on his computer and calculator for the work that solving math problems by hand used to require.
A few? Most mathematics papers are still single-author. 75% or 80% if memory serves. A lot of highly significant results in mathematics are still created by lone wolf thinkers.
iThink, therefore, iAm.
Human computer ping ;o)
"We are animals..."
Well, there's your first mistake, Professor Nimrod.
Math is not a form of knowledge. It is an intellectual construct, useful in obtaining knowledge.
You can't prove the sun will rise tomorrow, but you can prove two plus two equals four, always and everywhere.
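For what it's worth, that half of the claim is even machine-checkable. A one-line proof in Lean (my own aside, not the commenter's):

```lean
-- 2 + 2 = 4 holds by pure computation on the natural-number literals.
example : 2 + 2 = 4 := rfl
```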
Apparently the reporter has never heard of Gödel's Incompleteness Theorem. To wit: "In any consistent axiomatic system rich enough to express arithmetic, there are propositions that cannot be proved or disproved within the axioms of the system."
Mathematics is a subset of statistics where the variance equals zero.
Heh, What's your mathematician opinion about this?
"Or rather, math problems have grown too big to fit inside our heads"
Paging Seymour Cray!
Joke (sort of):
A couple of guys are sitting around bored.
"What's the biggest number you can think of, Joe?"
"Um...er...two. You?"
"Uh...three!"
Likewise, there are legends of African tribes whose math system amounts to "1, 2, 3, many".
Sounds dumb, right? But have you ever seriously thought about how you count things at a glance? Given a bunch of stuff to count, aside from going "1, 2, 3, ...", the best most people can do is visually lump the items into groups of 2 or 3 and then add those numbers up. Go ahead: drop 5 pennies on the desk in front of you and count them, paying strict attention to how you do it. You might go "1, 2, 3, 4, 5", effectively perceiving only one at a time, or you might mentally split the group into subgroups of 2 and 3 and add 2 + 3 = 5. However you do it, you're isolating subsets no larger than the size your mind can grasp in one go. Strictly speaking, the biggest number you can really think of is 3!
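To make that concrete, here's a toy Python sketch (my own illustration; the function name and chunk size are made up) of counting a pile by grasping chunks of at most 3 at a glance and adding them up:

```python
# Toy illustration: count a pile by "grasping" chunks of at most 3 at a glance,
# then adding the chunk sizes together - the divide-isolate-combine trick above.
def count_by_chunks(items, chunk_size=3):
    total = 0
    for i in range(0, len(items), chunk_size):
        chunk = items[i:i + chunk_size]  # a group small enough to take in at once
        total += len(chunk)              # add the small, directly perceived count
    return total

pennies = ["penny"] * 5
print(count_by_chunks(pennies))  # 5, counted as 3 + 2
```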
So if the human mind can REALLY only perceive "3" as the largest number, how do we get to pi, infinity, sqrt(-1), and a proof of the four-color theorem? The same way we get to "5": divide, isolate, symbolize, combine, repeat - and use tools.
Once we have a mental set of symbols representing whatever it is we are mathematically doing, we again rapidly run out of mental processing space. It wasn't long in human history before Og the caveman went from counting stuff to marking rocks so he could be reminded of what he had counted and what the number was. From mud on cave walls to impressions on clay tablets to abacuses to ink on parchment to pencil on paper to magnetic field directions to transistor states to holes on DVDs, humans have used external tools to store expressions of mental symbolism. And it's not cheating, and it's not considered the end of mathematics.
So on we go to the next phase: tool-assisted computation - and it's not new either. When a mathematician decided that proving something required checking every case of a problem, he would work through an algorithmic process to cover all the possibilities. Given a big enough problem, he might hire other people - professionally called "computers" - to work as a team to solve it. Then Charles Babbage and Lady Ada concocted a machine to do what those people were doing (at which point a politician, clueless as ever in history, asked "if you put the wrong figures in, will you get the right figures out?" - but I digress). Von Neumann formalized the idea into the stored-program architecture, and soon the modern electronic digital computer was born, followed by huge networks of incredibly fast sub-nanosecond-cycle computing devices solving enormously complicated mathematical problems in shockingly short periods of time (fractal sets that early-20th-century mathematicians could only plot laboriously by hand are now rendered in milliseconds).
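As a concrete example of that kind of exhaustive, machine-assisted checking (my own illustration; the comment doesn't name a specific problem), here are a few lines of Python that verify Goldbach's conjecture for every even number below 10,000 - the sort of case-by-case grind that once took a room full of human "computers":

```python
# Illustration: brute-force verification of Goldbach's conjecture for small even numbers.
def is_prime(n):
    if n < 2:
        return False
    for d in range(2, int(n ** 0.5) + 1):
        if n % d == 0:
            return False
    return True

def goldbach_holds(n):
    # Is there at least one pair of primes p and n - p that sums to n?
    return any(is_prime(p) and is_prime(n - p) for p in range(2, n // 2 + 1))

assert all(goldbach_holds(n) for n in range(4, 10_000, 2))
print("Goldbach holds for every even number from 4 up to 10,000")
```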
So what's the upshot? Advanced math has not escaped the limits of the human mind. We have simply figured out, once again, how to divide up a problem, assign symbols to the pieces and their relationships, create an algorithm to process them, and, rather than doing the processing by hand, turn it over to a human-built machine for rapid processing.
I'm a software engineer with an affinity for mathematics. Trust me (it's my profession): it's all simple. It's just a matter of breaking the problem down into manageable chunks - ultimately into rearranging 1s and 0s - and telling a machine in detail what process to follow. It's all still understandable by humans; we just get a machine to do the hard, tedious work for us. When someone claims we've hit the human limit, that's usually the time to get ready for a whole new way of doing things that goes far beyond that limit.
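For instance (a minimal sketch of my own, not anything from the article), here is integer addition rebuilt from nothing but bitwise operations - literally rearranging 1s and 0s and telling the machine exactly what process to follow:

```python
# Addition from raw bit operations: XOR gives the carry-less sum,
# AND-then-shift gives the carries; repeat until no carries remain.
def add(a, b):
    while b != 0:
        carry = (a & b) << 1  # bits that spill over into the next column
        a = a ^ b             # column-by-column sum, ignoring carries
        b = carry             # feed the carries back in
    return a

print(add(2, 2))      # 4
print(add(123, 456))  # 579
```

(Works for non-negative integers; it's just the schoolbook carry algorithm in binary.)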
2 + 2 = 5, for very big 2.
Sir Francis Bacon died as a lonely and severely disappointed man.
When I discovered that pie are not square and corn bread was, I realized that I didn't have the capability for math.
bump