AMERICAN FEMINISM
This article describes the theme of American feminism. Feminism is defined as the belief in women's full social, economic, and political equality. It arose largely in response to Western traditions that limited women's rights, but feminist thought has many forms and variations around the world. In short, this article examines American feminism and its phases in the United States.