Most American elementary schools and high schools, and nearly all colleges and universities, teach everything that is significant from a liberal/Left perspective.
The only difference between your local college and a Christian seminary is that the latter is more honest.
One of the great mind-destroyers of college education is the belief that if it's very complex, it's very profound.
Most women are not programmed to prefer a great career to a great man and a family. They feel they were sold a bill of goods at college and by the media.
In colleges throughout America, students are taught to have disdain for the white race. I know this sounds incredible, or at least exaggerated. It is neither.