America is supposed to pride itself on freedom of religion, race, and class, but the hostility toward whoever's coming in is carried on from ignorance and territorialism. I don't know if it's some transcendental pre-colonial mentality, but I don't see too many other races of people trippin' about who is coming in and coming out. I think it's primarily racism disguised as class warfare.