
West Germany (English)

Meaning

What does West Germany mean?

West Germany

a republic in north central Europe on the North Sea; established in 1949 from the zones of Germany occupied by the British, French, and Americans after the German defeat; reunified with East Germany in 1990

Synonyms

What other words have the same or similar meaning as West Germany?

West Germany (English » English)

Federal Republic of Germany
