As someone who has graduated from three colleges, I couldn’t agree more. Far-left ideology is taught almost as if it’s fact, and any attitudes or opinions to the contrary are shunned. I got my bachelor’s from a Catholic university in 2010, and even there, leftism lurked. I had one professor tell all the white people in his class that we were inherently racist simply because we were white… and this was back in 2009, long before all the DEI crap became more mainstream.
Colleges can teach you things, but what they can’t teach is common sense.