Maybe it's because some Christians are very poor representatives of their religion.
As an example: my child is diagnosed with a life-shortening disease, then my other child is diagnosed (a year later) with a crippling disease. I was devastated, trying to reconcile a loving, just God with the whole *wrongness* of little kids being struck down so young in life. My Christian acquaintance stops by to see if we need anything and offers me this comfort: the reason both my children have been hit with this is because of the sins of our Fathers. Since we're *all* sinners, I, my husband, or even the kids themselves must have done *something* to deserve this, and once we're all convinced and repentant, then the children will either be healed "by the Stripes of Christ" or we'll be at peace carrying the cross of our sins through these diseases.
Now, I wasn't offended by all this. She has a good heart and I know that she meant well, but I will *not* convince my kids that it's somehow their fault or their parents' fault that their bodies have betrayed them. They did *nothing* wrong. My husband and I have both screwed up in our lives, and we have both repented and straightened ourselves out. (And we repent every day that we continue to make new mistakes or fall back on our all-too-human faults, and we work very hard to do better next time.)
I adore Christ, but Christians, on the other hand...
Christ was the Redeemer. Christ taught us that we can change and be forgiven. Christ was the sacrifice. And yet many, many Christians still believe that *everyone* needs a good whuppin'.
Your experience with this misguided soul has led you to a false conviction about these things.
A pity. Both for you, and for Christians.
I know of nothing in the teachings of Jesus in which children are "hit" for the sins of our Fathers. Maybe you could ask her for some verses or background on this.
Instead, she should have told you to find strength in Christ for the tough battles ahead.