Nikki Haley: America Has ‘Never Been a Racist Country’

Haley's declaration that the United States has never been a racist country comes just weeks after she omitted slavery as a cause of the Civil War.