The trouble with common core math is that only those with an IQ of about 80 or below don’t intuitively understand that numbers can be broken down into units of tens, hundreds, etc.—and they are exactly the ones who would have no hope of mastering the stupid common core “long form” math exercises.
The ones who can do it don’t need it. Those who need it can’t do it.
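For what it's worth, here is a rough sketch in Python of the kind of place-value breakdown those exercises drill; the function name and the example number are just made up for illustration, not anything lifted from an actual curriculum.

```python
def place_value_parts(n):
    """Break a whole number into its place-value parts, e.g. 347 -> [300, 40, 7]."""
    parts = []
    place = 1
    while n > 0:
        digit = n % 10
        if digit:
            parts.append(digit * place)
        n //= 10
        place *= 10
    return list(reversed(parts))

# 347 = 300 + 40 + 7, which is the whole idea behind the "long form" exercises.
print(place_value_parts(347))  # [300, 40, 7]
```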
Just to confuse things, I will relate my own interaction with math theory. A friend who doesn't have much of a math background likes to teach the use of computers. He tried to research what analog-to-digital conversion was, and he couldn't figure out what "analog" meant. He knew me for an engineer, and asked via email what it was about. And he was asking the right person, because whenever I got a problem wrong in college it bugged the heck out of me, and I really wanted to know what was what. And I had gone through exactly that confusion at the start of an analog computation class. The trouble with the term "analog" is that it derives from "analogy," and yet "analogy" doesn't easily carry over into computer/information thinking.
The first thing you learn in a course on analog computing is how to make a circuit to do addition. So your lab work on that lesson is to feed input voltages into an addition circuit and measure the output voltage. I did that, and the circuit worked as advertised. But I saw a problem. It was not easy to set the dial accurately enough to create exactly the input voltages you wanted, and it was not possible to read the output voltage to better than about 1% accuracy. And, in the mindset I then had, that seemed to be a show stopper.

And the reason it seemed like a dead end was the very fact that I was not distinguishing between addition as a concept and addition as the algorithm (recipe, in the lingo of the video that started this thread) which yields a digital result. The confusion is that we use digital numbers as an analog of real quantities. It is not the real thing - the length of a board, for example - which is the analog; it is the number we create when we measure the length of the board which is the analog of that physical thing.
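To put a number on that precision problem, here is a rough simulation of the lab exercise; the idealized V_out = V1 + V2 circuit and the 1% error figures are just illustrative assumptions drawn from my recollection above, not the actual lab setup.

```python
import random

def analog_add(v1, v2, set_error=0.01, read_error=0.01):
    """Simulate an ideal summing circuit with ~1% error in setting the
    input voltages and ~1% error in reading the output voltage."""
    v1_actual = v1 * (1 + random.uniform(-set_error, set_error))
    v2_actual = v2 * (1 + random.uniform(-set_error, set_error))
    v_out = v1_actual + v2_actual   # the circuit itself adds exactly
    return v_out * (1 + random.uniform(-read_error, read_error))

# The concept "2 + 3 = 5" is exact; the measured analog result is only close.
print(analog_add(2.0, 3.0))  # e.g. 4.97 or 5.04, never reliably 5.000
```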
So in that sense the term "analog-to-digital conversion" is a misnomer, complete gibberish. What "A-D conversion" actually is, of course, is automated voltage measurement.
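"Automated voltage measurement" is easy to picture in code. The toy converter below quantizes a voltage into an integer count; the 8-bit resolution and 5 V reference are made-up example values, not any particular chip.

```python
def adc_read(voltage, v_ref=5.0, bits=8):
    """Quantize a voltage into an integer code - automated voltage measurement."""
    levels = 2 ** bits
    code = int(voltage / v_ref * levels)
    return max(0, min(levels - 1, code))   # clamp to the converter's range

# 3.3 V measured by a hypothetical 8-bit converter with a 5 V reference:
print(adc_read(3.3))               # 168 - the digital stand-in for the voltage
print(adc_read(3.3) * 5.0 / 256)   # ~3.28 V: the number is an analog of the real thing
```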
And I put it to you that some analogous form of sloppy thinking is going on when we take a low-IQ child who has trouble with the recipe and the rote-learning regime, and expect him to become more adept at the recipe after we befuddle him with abstractions. The result being that not only the dullard but also the smart child fails to understand the abstract concept (no wonder, because the teacher herself probably doesn't understand it) and doesn't get the recipe, either.
And if indeed the teacher does understand what she's teaching, her understanding is probably at a sophomoric level which motivates pride more than wisdom and compassion. If you want analogies and graphical demonstrations of math principles, good - find someone who is at least as good at that as I am, and put them to work making illustrations of math thinking simple enough for a parent to follow, before you go inflicting it on children and their parents via a game of telephone.
Children won't learn from teachers unless the teachers have parental support. And teachers cannot expect, and do not deserve, parental support unless teachers respect parents.
And that, to me as a grandparent, is the real bottom line. Note that I did not qualify that bolded statement above with the phrase, "if the parent has a PhD in every subject the child is being taught."