The US Constitution exists to limit the powers of the federal government. Those powers not assigned to the federal government reside in the states.
Decency laws have existed for our entire history. The framers of the Constitution would be shocked to learn that people are interpreting it as granting individuals the 'right' to be naked or in any other state of undress.
Do people have the right to walk nude in public? Maybe - if the states permit it (most obviously do not). Do teachers have the right to wear bikinis to work? If not, then why would students have the right to show up with various parts of their anatomy sticking out?
Obviously, they do not. It is a matter for state or local law - as it has been our entire history.
Do you honestly contend that the government has the power to fine people if a portion of their underwear is showing?
Let me rephrase that. The legitimate power.