Universities shouldn't be able to force students to take nonsense liberal arts courses that have no relation to their major. For example, a STEM or Business major shouldn't have to take a feminist literature course in order to graduate.
As a recent college graduate who had to deal with this, I found it infuriating to have to spend money on these nonsense liberal arts courses. The only reason universities force students to take these courses is to indoctrinate them, and because they realize they can milk Pell Grants and Federal loans. These pointless courses could be replaced with courses that actually relate to the student's major, and you could probably eliminate an entire year's worth of courses just by getting rid of them (thus turning a 4-year degree into a 3-year degree).
I also fail to understand why universities are able to offer students useless majors like "Women's studies", sociology, anthropology, philosophy, etc. Universities profit immensely by taking advantage of clueless students, and then these same students wonder why they can't land a job after graduating.
I don't know what to think on this one. A foundation in history and Western philosophy is key to maintaining our society... the problem is that those liberal arts have been replaced with nonsense.
Ideally, students would be able to take a history or philosophy course without being indoctrinated, but that just isn't possible today. The left has a complete stranglehold on the liberal arts departments at the vast majority of universities. Any conservative professor who attempts to legitimately teach history or philosophy is quickly ousted by their "colleagues" and reprimanded by the university.
I really wish students were able to take such courses, but I think the liberal arts field is just too far gone to save at this point. It's best just to prevent federal funds from being used to pay for liberal arts courses. It's the only way to force universities to make a change.