Gender Equality, Data Science, and Faith in Democratic Societies

Join me for talks in Portland, Oregon (Feb 28), Santa Cruz (March 2), and Stanford (March 3), 2017.

J. Nathan Matias
5 min read · Feb 26, 2017
(1) Charlie Chaplin in Modern Times (1936). (2) Lady Bird Johnson reads in a Head Start class (1966). (3) Wikipedia mental health support illustration by Laurent Hrybyk.

This coming week, I’ll be sharing my research with several academic and religious communities in the Pacific Northwest and Bay Area. I would love to see you there. Keep reading for summaries of each talk.

  • FollowBias: Supporting Behavior Change Toward Gender Equality and Online Information Diversity, Tues Feb 28 (9am) at CSCW 2017 (Portland, OR)
  • Authoritarian and Democratic Directions for Data Science, Thurs March 2 (1:30pm) at UC Santa Cruz
  • Reducing Systemic Injustices by Humans and AI, Fri March 3 (7pm) at the Stanford Graduate Christian Fellowship

FollowBias: Supporting Behavior Change Toward Gender Equality and Online Information Diversity

(CSCW 2017, Tues, Feb 28, 9am in Ross Island/Morrison) (read it here)

Gatekeepers on social media often influence which people and groups receive media attention. Many of these people unknowingly discriminate, directing greater attention to men than to women. In our study of over 3,600 US and UK journalists, women made up only 21% of the Twitter accounts that the median journalist in our sample followed. When we asked notable bloggers and journalists about their Twitter behavior, they consistently overestimated the percentage of women they followed, by 1.6 times on average. Many were shocked at what they learned about their own behavior.
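The arithmetic behind those two numbers can be sketched in a few lines. This is a hypothetical illustration, not code from the paper: the function names, the gender labels, and the example user (100 follows, 21 women, a 34% self-estimate) are all invented for clarity.

```python
# Hypothetical sketch of a FollowBias-style calculation: given inferred
# genders for the accounts a user follows, compute the percentage of women
# followed and how far the user's self-estimate overshoots it.
# All names and data here are illustrative, not from the paper.

def percent_women(followed_genders):
    """Share of followed accounts inferred to be women, as a percentage."""
    if not followed_genders:
        return 0.0
    women = sum(1 for g in followed_genders if g == "woman")
    return 100.0 * women / len(followed_genders)

def overestimation_ratio(self_estimate_pct, actual_pct):
    """How many times larger the self-estimate is than the measured value."""
    return self_estimate_pct / actual_pct

# Example: a user who follows 100 accounts, 21 inferred to be women,
# but who guesses they follow about 34% women.
follows = ["woman"] * 21 + ["man"] * 79
actual = percent_women(follows)            # 21.0
ratio = overestimation_ratio(34.0, actual)
print(round(actual, 1), round(ratio, 2))   # 21.0 1.62
```

A self-estimate of 34% against a measured 21% gives a ratio of about 1.6, the average gap the study reports.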

Can design and data support us to follow our own values of equality, helping us broaden the diversity of our social information? In the paper I’m sharing at CSCW (read it here), I report the results of research in partnership with Sarah Szalavitz and Ethan Zuckerman. Together, we built FollowBias, software that gives feedback on the percentage of women people follow on Twitter and tracks any behavior change.

FollowBias also tested theories of “value consistency” in social psychology. Drawing from classic research in the late 1960s by Milton Rokeach, we hoped that showing people data on the difference between their values and behavior might cause them to pay attention to more women online.

In two pilot tests of FollowBias with over 130 people, we surveyed people on their views toward women and compared those views to their behavior. We asked them to explain the differences and observed changes in their behavior over time. In this systems paper, our main goal was to report early findings on the working system and our pilot experiments. In the first pilot study, people did follow more women on average after seeing data on their behavior. We failed to find an effect in our follow-up study. Our early findings will guide a much larger experiment on broadening the gender diversity of who people pay attention to online.

In the talk at CSCW, I’ll (a) describe the history of data-driven activism on women’s equality, (b) report the results of our pilot studies, and (c) describe the political, artistic, and ethical trade-offs in using design for behavior change toward equality.

Can design and data support us to follow our own values of equality, helping us broaden the diversity of our social information sources? The answer is complicated.

The Experimenting Society: Authoritarian and Democratic Directions for Data Science

(UC Santa Cruz Tech4Good, Thurs, March 2nd, 1:30pm)


How will the role of social experiments in democracy be transformed as large-scale social experiments become commonplace? In the mid-twentieth century, debates over authoritarian uses of statistics led to new paradigms in social psychology, management theory, and policy evaluation. In our own time, large-scale social experimentation and predictive modeling are reviving debates over the authoritarian or democratic roles of data science in society. As even children learn to do data science with their own and their friends’ data, we need ways to reconcile these capacities with democratic values.

In the mid-twentieth century, debates over authoritarian uses of statistics led to new paradigms in social psychology, management theory, and policy evaluation.

In the first part of this public talk within a class at UC Santa Cruz, you’ll hear about the history and future of democratic social experimentation, from Kurt Lewin and Karl Popper to Donald Campbell. In the second part, you’ll hear about my work on CivilServant, software that supports communities in conducting their own field experiments on the governance of algorithms and social behavior online.

Time: Thursday, 1:30pm to 3:05pm
Location: UC Santa Cruz Engineering 2, Room 506

Reducing Systemic Injustices by Humans and AI

(Stanford Graduate Christian Fellowship, Friday March 3rd, 7pm)

How can religious communities use data to live our faith more fully, especially where we and our technologies become part of the world’s injustices? On Friday, March 3rd, at 7pm, the Stanford University Graduate Christian Fellowship has invited me to give a public talk on “Reducing Systemic Injustices by Humans and AI.”

Institutional discrimination was the first major internal problem faced by early Christianity. Communities responded with accountability initiatives, institutional reform, and a generation of debate that gave us many of our most enduring ideas about the nature of the Christian faith. In this talk, I will connect those first-century debates with urgent contemporary questions about systemic injustice in the use of digital technologies.

Debates in early Christianity about discrimination, accountability, reform, and the theology of equality gave us many of our most enduring ideas about the Christian faith.

Today, advocates of digital media argue that it has broadened access to diverse voices and opportunities. Yet platform design and AI systems sometimes combine with common human tendencies to cause systematic discrimination, harassment, and other harms. In this talk, I’ll briefly describe my large-scale studies of online discrimination and my research helping people and AI systems change their behavior for the common good. I will also summarize work within Christianity to use data to understand and change our collective behavior to better follow our shared beliefs.

In this talk and in facilitated group discussions, I hope that attendees of all faiths and disciplines will find new ways to link their values with their research. Among Christians, I am excited to brainstorm ways that we can monitor our collective behavior, working together to reduce injustices that we and our communities contribute to.


J. Nathan Matias

Citizen social science to improve digital life & hold tech accountable. Assistant Prof, Cornell. citizensandtech.org Prev: Princeton, MIT. Guatemalan-American