Monday, October 13, 2014
Who Really Won the War?
Recently, we finished the documentary we had been watching in class for the past few weeks. The ending greatly surprised me. When the Civil War ended, I expected the documentary to come to a close, but this documentary was about slavery, not the Civil War, and discrimination against black people endured long past the war. It got so bad that the North eventually stopped caring what the South did to freed black Americans. Organizations emerged that many people wanted to shut down.

I now have an explanation for how there was still so much discrimination in the sixties, long after the Civil War. I always wondered: if slavery ended during the Civil War, how do we find so much discrimination so long after it? What also interests me is that the North was just as guilty in the sixties as the South, even though in the years right after the war the North was the one trying to stop the discrimination. Did the North actually win the war? The South ended up brainwashing the whole country into thinking that black people were inferior, and in doing so created discrimination far worse than anything seen before in our country's history. Congratulations to General Robert E. Lee, the true victor of the Civil War.
Alex, I think that your questions and concerns really get to the heart of history: the problem of interpretation. We bring a lot of assumptions to the phrase "the end of slavery," and often those assumptions encompass an identity between slavery and racism. We figure, "Slavery is over, problem solved." Just a little bit of digging into the thoughts of those who were most responsible for the end of slavery reveals that the motivations for abolition are not what we might expect or hope, though; in fact, the gulf between expectation and reality is wide enough that the question "Who won?" can become meaningful today.