dominionism
English
What does dominionism mean?
dominionism
noun
1. A tendency among some conservative Christians, especially in the USA, to seek influence or control over secular civil government through political action.
2. The belief that human beings should be free to dominate and exploit nature, including plants and animals.