Well, what you'd do isn't what most employers are doing, so there's that.
We can say whatever we want about college degrees/diplomas, but they still give people an advantage in the working world, regardless of what those without them want to imagine.
Not so much anymore. Most degrees are practically worthless in the fields they're awarded in. Most degree holders are now working in positions outside their field of study, and most of the time in jobs that don't require a degree at all. I used to work in the medical field and was VERY selective about whom I hired. Their place of training was EXTREMELY important.