Long before there was an NFL there was college football.
College ball goes back to the 1880s. The NFL was formed in the 1920s.
College football will survive unless it imitates the NFL and alienates its fans. Or until it gets busted up for massive corruption, which may well get exposed in the wake of the DOJ prosecution of college basketball.
The NFL is a dead man walking; it may just take a while for the zombie to fully collapse. But fans will not continue to put up with having leftwing SJW garbage dumped on them. And the NFL doesn't know how to free itself from the monster it has been feeding for years.
Do you really think college football, or ANY college sport, or other pro sports (such as baseball) will never have these kinds of anti-American behaviors? That they won't be susceptible to political correctness and activism?
Do you really think that?