
Deutschland English

Meaning

What does Deutschland mean?

Deutschland

(= Germany) a republic in central Europe; split into East Germany and West Germany after World War II and reunited in 1990

Synonyms

What other words have the same or similar meaning as Deutschland?

Deutschland (English » English)

Germany, Federal Republic of Germany, FRG
