A lot of you are talking as if, had the South won, they would have taken over the whole country. But actually no, the South was not fighting to take over the US. They were fighting for their independence FROM the US, just like the US fought for independence from the British. If the South had won, they would have become their own country, the CSA (Confederate States of America), just south of the USA.
But if the South had won and gained their independence, I'm fairly certain the following things would have happened:
- There would be no KKK.
- There would be MANY more Black people in the North, including Canada.
- Abraham Lincoln would not have been assassinated.
- The Underground Railroad would have stayed in use for much longer, if not still be in use today.
- The CSA could/might have been an ally of the Axis powers in WWII, as they shared very similar ideals with the Nazis.
Not trying to derail this, but now I have a question.
What do you guys think would've happened if the Allies had lost WWI?
I personally think WWII would've happened anyway, but it would have been Britain or France starting it, and the ideologies and reasons for fighting would have been completely different.
But hey, at least the Holocaust probably wouldn't have happened.
As for this, I disagree. The main reason Germany entered WWII was that they felt they had been screwed over by the Allies after WWI. If they had won WWI, not only would they have received reparations, they would not have been hit with harsh punishments for the war. Thus, there would have been no need for Hitler to "save" the Germans.
However, Japan would still have started a war with China and America, as Japan was not hit that hard by the Great Depression anyway. But because America would not have needed to send troops to Europe, the war with Japan would have ended much quicker, and most likely no atomic bomb would have been used.