Comments (3)
Non_ducor_duco 2 points ago +2 / -0

No... Because academia was infiltrated, as was Hollywood/entertainment. We are all taught, from grade school right on up to college, about social and environmental injustices and that government is the answer. I was in grade school in the '80s and '90s, and it has only gotten worse.

TrumpWonByAlot 2 points ago +2 / -0

Thomas Sowell has said almost exactly the same thing. This is the correct answer, and I believe it was either a strategy to destroy the U.S. or a power grab by the left. It became socially frowned upon not to go to college, so as many people as possible tried to go, which means more communists. If you spend your formative years (18-22) in a social bubble on campus being fed pro-communist bullshit, instead of working in the real world and gaining common sense and resilience, your worldview can be fucked forever. You think the world is unfair and that business owners' sole goal is to take advantage of people, because your virgin nerd professor hated the dumb frat guy from college who's more successful than him. In college you sit around thinking of theoretical answers to theoretical problems, whereas at work you're forced to create value and find practical solutions to problems or you're fired. Can you see how those thought processes correlate with the left and the right?

Anon6992374 1 point ago +1 / -0

Huh. I feel a little less shame about patronizing the WF butcher now.