Christians have always described the gospel as offensive, but in the formerly Christian West this is sadly becoming an existential reality.