I always found it odd that drug stores sold tobacco products, candy, soda, etc. since the health industry is all about people cutting down on consuming that stuff.
I think back before America became consumed with our looks, and before everyone was on some sort of prescribed drug, drug stores were a sort of local convenience store. Now beauty "aids" and prescriptions make up so much of their business that selling cigarettes isn't worth the trouble.