Historically, it's been something of a problem. The South's racism brought on the Civil War, Hitler's racism brought on WW2, and historic racism here in America created an impoverished group of people in the wealthiest nation on earth.
But none of this means anything to you. I take it you're a bitter white person who couldn't make it in a nation that gave you all the opportunities anyone with even a bit of ambition would love?