This is getting irritating. The South was never America's exclusively racist segment, just its most egregious and obvious example. Otherwise, how do you explain how things like the Indian Wars (perpetrated largely by non-southerners in the post-1830 era) or the Philippine Insurrection were fought? Having a seceding South doesn't change the fact that racism was the morality du jour of ALL Western nations, including the "progressive" Northern US. You could argue that makes it "less racist" in a strictly literal sense, but I don't see the lot of northern African-Americans (Jim Crow was pretty heavily enforced throughout most of the Midwest, mind), or Natives, or Asians, getting any better outside the South than OTL (unless you consider anti-Chinese acts, originating in California, to not be racist?).
Moreover - contrary to popular belief - abolitionists themselves were generally racists. Many hated slavery precisely because it meant having black people in their country. ("Free soil" proponents often wanted their territories to be free of black people, period.) The American Colonization Society was founded for the express purpose of shipping as many freed slaves back to Africa as possible, and while it didn't send away as many as it wanted, it did succeed in establishing the country of Liberia.
Those who actually believed that African Americans should be treated as regular citizens were a fringe minority. John Brown was one abolitionist who genuinely believed in racial equality - and was shunned and ridiculed by his peers for it.