Thursday, February 25, 2010

Treaty of Versailles


The Treaty of Versailles was the treaty that ended World War One. It was signed on June 28, 1919. Basically, the treaty blamed Germany for causing the war. In the end, Germany took the blame and had to pay heavy reparations for the war. Before the war, Germany held territory in Europe as well as colonies overseas. After the Treaty of Versailles was signed, Germany had lost a lot of its land to the Allies, which included the United States, Britain, and France.
So after the treaty was signed and in place, Germany had lost a lot of land, fallen into an economic depression, and been badly beaten down by the war and the treaty. Since Germany was in such bad shape at that time, Hitler came into the picture. Hitler promised that he would be the one to put Germany back into the shape it was once in: that Germany would be powerful again, with the strongest military, and that it would regain all the land it had lost. Germany was desperate to be strong again, so people gave in to his ways, and soon enough Hitler became a dictator and the rest went down in history. This is why the Treaty of Versailles is so controversial nowadays. If the treaty had never been signed, or if Germany had never been blamed for the war, then Germany wouldn't have lost all its land and so much money, and it wouldn't have given in to Hitler. Then Hitler wouldn't have come to power, the Holocaust would never have happened, and World War Two would never have existed.
It seems that way, but is it really true? Did blaming Germany for the war really cause this chain of events? The treaty was not fair, in my opinion. The Allies shouldn't have blamed Germany alone for the war. Germany lost a lot in the war and took forever to regain what it had lost. Without the blame in this treaty, World War Two might never have happened, and maybe the world would be different today, both economically and politically.
