Sexism in the U.S.: Questions and Answers
Sexism is the belief, institutionalized in laws and customs, that females are inferior to males.
Sexism in the U.S. is a series of short books that define and explore sexism and intersecting forms of discrimination in the United States. Each book poses a series of questions (such as How Does Sexist Humor Work? or What Would a Non-Sexist Masculinity Require?) and then presents the cultural analysis and data necessary to understand the answers.
The answers to our questions about sexism are firmly rooted in the research and analysis of brilliant scholars and activists such as bell hooks, Audre Lorde, and Riane Eisler. Sexism in the U.S. brings the academic study of sexism and its intersection with other forms of discrimination into our mainstream discussions about sex, gender, race, and class.
There are eight books in the series:
Defining Sexism in the U.S.
Defining Sexism in the U.S. allows readers to explore the relationship between sexism, intersecting forms of discrimination (such as racism and homophobia), and power. Questions such as “Does Sexism Affect All Women Identically?”, “How is Sexism Connected to Beauty?”, and “Does Sexism Affect Men?” lay the groundwork for understanding how and why sexism functions within our society. This knowledge can lead to empowerment and healing—for individuals, local communities, and our nation as a whole.
Sexism and U.S. History
Sexism and U.S. History fills in some important blanks for readers who want to know how American women got where we are today. This is women’s history as it is rarely taught, for it covers the resistance to sexism that has always been a part of our national story. The U.S. was founded as a patriarchy, in which some men had more rights and freedoms than other men and all women. Through questions like “How Did U.S. Women Gain the Right to Vote?”, “How Did the Civil Rights Movement Further U.S. Women’s Labor Rights?”, “What Shaped Pornography in the U.S.?”, and “What is Title IX?” Sexism and U.S. History gives a brief overview of how sexism has affected groups of American women—and how American women and men have worked to reduce sexism and intersecting forms of discrimination.
Sexism and the U.S. Media
Much of American media relies on sexism, racism, and a violent masculinity to portray images and tell stories about who we are and who we should be. These images and stories are extremely harmful, setting us up to invest in sexism and other forms of discrimination as the basis for perceiving ourselves and others. Sexism and the U.S. Media discusses the impact that sexism and intersecting forms of discrimination have on us as individuals and as a culture through questions such as “Does the Media Impact Human Thought and Behavior?” and “What is the Difference Between Pornography and Erotica?”
Sexism and Relationships in the U.S.
Romantic relationships of all kinds, friendships, even parenting—all of our relationships are affected by sexism. Sexism in relationships can be invisible or seem almost natural, as if “that’s just the way things are.” But it’s anything but natural—sexism and other forms of discrimination restrict us, limiting our ability to value and love one another and to express our full humanity. Through questions like “How Does Sexism Affect Our Perception of Gender and Sexuality?” and “Can We Enjoy Some Aspects of Traditional Gender Roles without Being Sexist?” Sexism and Relationships in the U.S. examines the impact that sexism has on all human relationships.
Sexism and Work in the U.S.
Sexism has a huge impact on the way we work, what work we value both economically and intellectually, and how we perceive work. From sexual harassment laws to parental leave, U.S. legal and cultural decisions about work are historically connected with sexist ideas about who works and what work counts as “real.” Sexism intersects with other forms of discrimination (such as classism) to shape the professional and personal lives of all Americans. Through questions like “Is There Really a Wage Gap?”, “Who Receives Welfare, and Why?”, and “Does Sexism in the Workplace Affect Men?” Sexism and Work in the U.S. considers the impact that sexism has on all Americans.
Sexism and Violence in the U.S.
Sexism and violence are closely linked in our culture, with devastating effects for American women and men. Through questions such as “Did the Violence Against Women Act Help All Women Equally?”, “How is Sexism on College Campuses Related to Sexual Violence?”, and “How Can We Better Address Violence Against Marginalized Women?”, Sexism and Violence in the U.S. takes a closer look at some topics in earlier books and places the discussion within the context of pornography, sex trafficking, prostitution, and our cultural norms about sexual assault and sexual violence.
Sexism, Christianity, and U.S. Policy
Those who are most outspoken about the combination of religion and political power have made it very clear: they don’t think women should have sex for pleasure, use birth control, prosecute rapists, or have legal abortions. While many of us would disagree with the most obviously sexist of their statements, we might also struggle with our own understanding of our faith, and how it should be connected to policies that deeply affect American women. Through questions such as “How Did Abortion Become THE Issue?” and “Are There Christian Groups That Support Reproductive Justice?”, Sexism, Christianity, and U.S. Policy examines how and why policies shaped by a conservative Christianity are sexist, and how they intersect with other forms of discrimination in the lives of many American women.
Feminism in the U.S.
According to a 2008 poll by The Daily Beast, most American women understand that sexism negatively affects our lives and limits our opportunities: 63% said the press treated them unfairly, 68% reported workplace discrimination, and 72% said they were treated unequally in politics. Yet only 20% of women were willing to identify as “feminist.” Through questions such as “Can We Confront Sexism Without Feminism?” and “Do All People Who Believe in Equality Consider Themselves Feminists?”, Feminism in the U.S. defines feminism, discusses its history in the U.S., and gives an overview of the current feminist movement.