
western United States (English)

Meaning

(= West) the region of the United States lying to the west of the Mississippi River

Synonyms

West
