The pop “music” industry hasn’t been about music for a long time. It sells sex, lewdness, and degenerate lifestyles to the feral, unparented youth out there.
Some of these performers have talent, but rather than use it, they sell whatever Hollywood finds easier to market.
Real musicians don’t need to openly exhibit moral turpitude, and their art is so much better.
I’m sure my parents felt the same way about the rock bands of the ’60s and ’70s.