Washington D.C.
What does Washington D.C. mean?
Washington D.C.
—
noun
(= Washington, capital of the United States)
the capital of the United States, located in the District of Columbia, and a major tourist destination; George Washington commissioned Charles L'Enfant to lay out the city in 1791