A Brief History of the German Empire in the 20th Century
The German Empire had expanded during the 19th century, and by the start of the 20th it held colonies in Africa, such as Togoland. The empire would not survive much longer, however. During the first two decades of the century, Germany militarized in preparation for a war, the First World War, that would ultimately bring about the empire's demise in 1918.