Dear NSF: Part 1

In what will surely not be the last time I ask the U.S. government for money, I made the plea to be a fellow of theirs today. The National Science Foundation Graduate Research Fellowship Program (NSF GRFP) supports new PhD students in their research aims for 3 years. And I am told, and also suppose, that I want that. The way in which they ask you to prostrate yourself is a standard multi-essay-plus-recommendations mode. The essays, when viewed as “papers for which you haven’t done the work,” were useful writing and thought exercises. I am quite energized by firming up a proposal about what I might do in the next three years, even if it seems incomprehensibly difficult and moon-shot-ish right now.

The climate code suggested that I ought to show how my research could have “multiple returns on investment.” I would like it if my application did as well. Perhaps publishing them in full here will have some good consequence in the future.



Graduate Research Plan Statement

Blindspot: A Passive Implicit Bias Test From Digital Footprint

Gender, race, sexuality, nationality, social class, native language, weight, etc. are causes of implicit and explicit social biases that affect human relationships, and in the worst case make life difficult for many. For example, identical academic resumés with men’s names get offered more jobs with higher starting salaries than those with women’s names (Moss-Racusin et al.). Similarly, physicians show a bias toward attributing patients’ symptoms to coronary disease for black patients more than white ones (Green et al.). Inevitably, social technologies transfer these biases (Friedman et al.), but their design can mitigate or exacerbate the transfer. In a negative example, photography tools have been optimized for taking pictures of white people (Dyer). Meanwhile the internet has been a boon to the Gay Liberation movement by connecting people without fear of homophobia (Weinrich). However, the Human-Computer Interaction literature does not address these issues much (Kannabiran et al.). That is a problem because social technology is an ever-growing component of our lives, and implicit biases are subtext to every transaction. Implicit bias is difficult to recognize because it cannot be seen with introspection (Kang et al.). Without taking an implicit bias test one may remain unaware of the problem, and currently those tests require time and effort. I propose to build easier, more integrated, passive implicit bias tests by utilizing a person’s digital footprint. This proposal involves three parts: to operationalize bias from passive data, to create tools using that operationalization, and finally to evaluate those tools with a field study.

Step 1: Operationalize bias from passive data

The first issue to tackle with this general approach is computing the implicit bias of a behaviour. The state of the art is the Implicit Association Test (IAT) (Greenwald et al. 1998), an isolated test-taking activity. My plan is to improve on the test by applying the notion to a person’s passive behaviour. I ask: can we treat the time spent reading combined topics as a passive test? For instance, if we know the content of a browsing history, and the time spent on each article, perhaps we already have the results of an implicit bias test. To evaluate the merit of our passive test we will correlate its results with the IAT.
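To make the evaluation step concrete, here is a minimal sketch of how a passive reading-time score might be computed and correlated against IAT results. The scoring scheme, the topic tags, and every number below are invented for illustration only; they are assumptions, not data or methods from the proposal.

```python
# Hypothetical sketch: a passive bias score from reading time, checked
# against IAT scores by Pearson correlation. All data are invented.

def passive_bias_score(history):
    """history: list of (topic_tags, seconds_spent) tuples.
    Score = share of reading time on 'group_a'-tagged pages
    minus the share on 'group_b'-tagged pages (illustrative scheme)."""
    total = sum(sec for _, sec in history)
    a = sum(sec for tags, sec in history if "group_a" in tags)
    b = sum(sec for tags, sec in history if "group_b" in tags)
    return (a - b) / total if total else 0.0

def pearson_r(xs, ys):
    """Plain Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Toy evaluation: one browsing history and one IAT score per participant.
histories = [
    [({"group_a"}, 300), ({"group_b"}, 100)],
    [({"group_a"}, 120), ({"group_b"}, 110)],
    [({"group_a"}, 50),  ({"group_b"}, 200)],
]
iat_scores = [0.6, 0.1, -0.4]  # invented IAT D-scores

passive = [passive_bias_score(h) for h in histories]
print(pearson_r(passive, iat_scores))
```

A high correlation on real participants would be evidence that the passive measure captures something like what the IAT captures; a low one would send the operationalization back to the drawing board.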

I have been making inroads in comparing user and global biases. In a recent project I built gender inequality indexes, by nationality, from the biographies of all Wikipedias. One result is a high correlation between this nationality index and the United Nations’ inequality index. Since the UN measures inequality by positions of power and education levels in countries, this tells us that Wikipedia editors’ coverage tracks those same disparities. I presented this finding at OpenSym ’15 (Klein), and continuation of the work is now supported by a Wikimedia Foundation grant.

I am also experienced in finding novel measures for human behaviour. Borrowing from the economics literature, I have suggested a measure of the “collaborativeness” of a group of Wikipedia editors based on the articles they edit (Klein et al.). This is also an implicit bias check of sorts: it looks at users’ underlying propensity to collaborate, based on how different they are from a global norm. My familiarity with social bias data and with repurposing methods will be key in building a theory of passive implicit bias.

Step 2: Create tools using that operationalization

Once I develop a way to compute an implicit bias from a set of web pages, I will build a tool that allows any web user to monitor themselves. I envision this taking the form of a browser plug-in that will, with permission, read and track your browser history. For “ground truth” I will use the linked open data of the web – Wikidata, Freebase, etc. – which provides semantic information on the internet at large. For instance, if you are reading the New York Times Opinion Pages, we will infer the amount of time spent reading about U.S. prison reform (domestic-interest & Police Chief Garry McCarthy, male, age 56) vs. Canada’s Muslims in the upcoming election (foreign-interest, Muslim-interest, & political elections). Over the scale of an entire browsing history, we can sum the time spent on and between different dimensions of bias.
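The aggregation step the plug-in would perform can be sketched as follows. The URL-to-tag table below is a hypothetical stand-in for a real Wikidata/Freebase lookup (which is assumed, not implemented), and the URLs, tags, and timings are invented for illustration.

```python
# Illustrative sketch of the plug-in's aggregation step: summing reading
# time per bias dimension over a browsing history. The tag table stands
# in for a real linked-open-data (Wikidata/Freebase) lookup.
from collections import defaultdict

# Hypothetical semantic annotations, as a linked-data lookup might return them.
PAGE_TAGS = {
    "nyt.com/us-prison-reform": {"domestic-interest", "male-subject"},
    "nyt.com/canada-muslims":   {"foreign-interest", "Muslim-interest",
                                 "political-elections"},
}

def aggregate_history(visits):
    """visits: list of (url, seconds_spent).
    Returns total seconds of reading time per bias dimension."""
    totals = defaultdict(int)
    for url, seconds in visits:
        for tag in PAGE_TAGS.get(url, set()):
            totals[tag] += seconds
    return dict(totals)

history = [("nyt.com/us-prison-reform", 240), ("nyt.com/canada-muslims", 60)]
print(aggregate_history(history))
```

Comparing the relative weights of these dimension totals against a population-wide baseline is where the passive bias score from Step 1 would come in.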

To address potential privacy concerns, this tool will be entirely open source, something with which I’m well acquainted. For instance, I created a monitoring service which watches every edit on Wikipedia in realtime for citations, pings the author of the citation, and uploads any open access articles for link reliability. Seeing the benefit in this, Crossref offered financial support to expand and maintain the service (Bilder). Thus my skills in building realtime tools for social good are already advanced.

Step 3: Evaluate those tools with a field study

The last stage of this proposal is a field study of how well the implicit bias tool works. We would seek a stratified sample of participants to install and use the browser plug-in. Next, we would monitor participants at intervals, determining whether being aware of their bias scores affects their future bias scores. We must control for factors such as having one’s browsing history monitored, and the frequency of seeing implicit bias scores. To test the method’s efficacy we would correlate results with standard implicit bias scores. I am at the beginning of my training in conducting field studies. This is why I chose my advisor Dr. Haiyi Zhu, a computer scientist with ample experience in the area. At the moment we are embarking on an interview study of Couchsurfing and AirBnB, through which I am learning the interview skills and interactive methods crucial to field studies.

Intellectual merit

This work put together – successful or not – would advance the fields of human-centered computer science and implicit social bias research. Our first step, creating a passive implicit bias test based on a digital footprint, will contribute by improving implicit bias measurement methods. Our tool-building will provide another return on investment by enriching the linked-open-data community with an open bias dataset, a dimension which is at the moment entirely missing. In total, successful completion of the project would mean a novel, unprecedentedly easy implicit bias check, usable by anyone.

Broader impact

This proposed work can help unlock technology’s potential to reduce implicit social bias (e.g., gender, race, sexuality, nationality) and equalize society. We are attempting to build a “one-click solution,” so that the time and technical barriers to testing one’s own implicit bias are as low as possible. Thus the broadest swathe of society will be able to look into a mirror of its own bias. Yet another aspect of creating a more equal society lies in access to research. As someone who was an unaffiliated researcher before re-entering the academy, I know the value of Open Access work, as it was all I had for years. I have a data management plan in place to publish all papers and non-privacy-sensitive documents (including this one) under open licenses. I am already an attendee of the Open Knowledge Festival, and was a Featured Speaker at Wikimania 2015. By advocating for open access research I’m promoting the spread of knowledge above prestige and profit, and eventually a more egalitarian world.

References:

Moss-Racusin, Corinne A., John F. Dovidio, Victoria L. Brescoll, Mark J. Graham, and Jo Handelsman. “Science Faculty’s Subtle Gender Biases Favor Male Students.” Proceedings of the National Academy of Sciences 109, no. 41 (October 9, 2012): 16474–79.

Steinpreis, Rhea E., Katie A. Anders, and Dawn Ritzke. “The Impact of Gender on the Review of the Curricula Vitae of Job Applicants and Tenure Candidates: A National Empirical Study.” Sex Roles 41, no. 7–8 (October 1999).

Friedman, Batya, and Helen Nissenbaum. “Bias in Computer Systems.” ACM Trans. Inf. Syst. 14, no. 3 (July 1996).

Kannabiran, Gopinaath, Jeffrey Bardzell, and Shaowen Bardzell. “How HCI Talks About Sexuality: Discursive Strategies, Blind Spots, and Opportunities for Future Research.” In CHI ’11. New York, NY, USA: ACM, 2011.

Reagle, Joseph. “‘Free as in Sexist?’ Free Culture and the Gender Gap.” First Monday 18, no. 1.

Dyer, Richard. “Making ‘white’ people white.” The social shaping of technology (1999): 134-140.

Greenwald, Anthony G., Debbie E. McGhee, and Jordan L. K. Schwartz. “Measuring Individual Differences in Implicit Cognition: The Implicit Association Test.” Journal of Personality and Social Psychology 74, no. 6 (1998): 1464–80.

Klein, Maximilian. “Wikipedia in the World of Global Gender Inequality Indices: What The Biography Gender Gap Is Measuring.” In Proceedings of the 11th International Symposium on Open Collaboration. San Francisco: ACM, 2015.

Klein, Maximilian, Thomas Maillart, and John Chuang. “The Virtuous Circle of Wikipedia: Recursive Measures of Collaboration Structures.” In Proceedings of the 18th ACM Conference on Computer Supported Cooperative Work & Social Computing, 1106–15. CSCW ’15. New York, NY, USA: ACM, 2015.

Geoffrey Bilder. “Citation Needed | Crossref Blog.” http://crosstech.crossref.org/2014/08/citation-needed.html.

Green, Alexander R., Dana R. Carney, Daniel J. Pallin, Long H. Ngo, et al. “Implicit Bias among Physicians and Its Prediction of Thrombolysis Decisions for Black and White Patients.” Journal of General Internal Medicine 22, no. 9.



Personal, Relevant Background and Future Goals Statement

An Attack On My Belief-System

The discovery of the reality of social biases was a turning point in my life. The first time I awoke to my implicit, internalised racism, a mix of discomfort and amazement overcame me. It was during a protest on the civil-rights-famous steps of Sproul Hall in Berkeley, where, after a few rabble-rousing speeches, a black woman came to the stage and started delivering activist poetry. I had been brought up to think of myself as not racist, and yet in a rare moment of self-awareness I saw myself dismissing her content because of how she spoke.


After the world-shattering realization of how we can hide prejudices from ourselves, subsequent prejudices came to light more rapidly. My own misogyny became very real upon reading favourite academic Joseph Reagle’s “Free as in Sexist” (Reagle), a deconstruction of sexism in Open Culture. (Of course it took a man to show me that.) I lost religious dogma at the holocaust memorials in Berlin and Auschwitz, when I saw that accepting any unquestioned message is dangerous. Only last month the blog of a woman I met at a wedding introduced me to the “fat stigma” I had unwittingly been harbouring (“Talkin’ Reckless”). These continuous epiphanies fuel my wonder at just how many unidentified stigmas I am still holding.


The feeling I get from a solid attack on my belief-system is so powerful that chasing after it has become the driving force in my life. Now I want it to be my career. I want to use research methods to promote a more equitable society by uncovering and addressing implicit stigmas. I foresee my future working with an organization or think-tank, likely non-profit, that also focuses on equality issues. Whether by grant or employment, it is of paramount importance that I work with a social-justice-oriented team. My past experience working in that sector has been overwhelmingly positive. Yet because this line of work is more self-directed and typically sits outside large corporate structures, advancing my career will mean becoming more independent as a researcher.


Having worked as a self-employed developer and researcher for the last two years, I see graduate studies as the best path to my personal development. Past experience has shown me that I need to learn a larger swathe of research methods. For my particular project – building a tool to unearth implicit bias from browsing habits – I need to learn the ways in which people are best convinced. I am also looking to graduate school to partner me with others doing similar things. Together, mastering technical methods, delving into psychology, and working within this landscape will give my goal of exposing implicit bias the broadest possible impact.


During my work on “The Virtuous Circle of Wikipedia” (Klein et al.), I learned how to borrow and re-use theory from other fields, and to take initiative on projects. While I was volunteering at the Wikipedia booth during a poster session at UC Berkeley, a conversation quickly turned into a collaboration. An economist asked me to provide Wikipedia data for testing economic complexity theory in that domain. Being mentored by my coauthor Maillart – who would describe himself as an “econo-physicist” – I learned what it means to generalize findings and theories across domains. That has been an inspiration, as I now see the rich potential of exploring and reading outside computer science. Along the way I also came to understand more about research collaboration dynamics. When the project started slipping behind schedule, I took the initiative to create a plan to finish on time by managing my superiors, assigning them task-lists, as well as plowing on with the analysis myself. For this they awarded me first authorship on the paper. That has brought me to the next challenge, completely leading a research effort, which I think graduate school can teach me.


I am prepared to manage a team, but saw my relative paucity in methods of persuasion during “Wikipedia Indicators of Gender Inequality (WIGI)” (Klein and Konieczny), a grant I won from the Wikimedia Foundation. The aim of WIGI is to provide a series of inequality indexes, like the United Nations Gender Inequality Index, based on Wikipedia and including time, ethnicity, and occupation dimensions. When correlated with other indexes it can tell us about both the world and Wikipedia (Klein). On the back of that poster and paper, I won a grant from the Wikimedia Foundation to make the dataset available as a service. As the principal grantee, I manage a paid team of four, which has taught me how to lead without pushing. At first I erred in being too laissez-faire, trusting employees to be self-directed, and later settled on a more accountable weekly-homework model.

We have produced a prototype, which will make it much easier for other researchers to include inequality as a dimension in their projects. Yet, even though we have made this dataset, I am still at quite a loss as to how to announce it and persuasively “sell” the research – another skill that graduate school can teach me.


Having been engaged in exemplary models of collaboration during my time as a Research Assistant at OCLC, and as a grantee of Creative Commons, I know the value of community.

“If you want to go fast, go alone; if you want to go far, go together,” I recall the CEO of the Online Computer Library Center (OCLC) saying in a company address. Founded in 1967, OCLC is ancient by tech standards, but its long-term vision taught me what it means to be a community member. During my time there as a research assistant I found an opportunity to use their bibliographic data to enrich Wikipedia’s articles. Having racked up over 2 million edits, I published about the process with help from the tight-knit librarian community (Klein and Kyrios). My experience at OCLC thinking about multiple stakeholders and industry standards is how I came to win a grant from Creative Commons to create a plug-in that brings the technology of the Learning Resource Metadata Initiative (LRMI) to MediaWiki sites. Working closely with the LRMI board and MediaWiki developers, we came to a win-win implementation that will make Open Educational Resources highly searchable in Google results (Campbell). It was a big lesson that working with many people can be slower, but that the slower pace pays off in the longevity of results. With my graduate studies I hope to find a network of colleagues with whom “to go far.”


Keeping sight of my overall goal to spread awareness of implicit bias, it is easy to explain how the trajectory of my experience will make a broad impact on society. Being privy to your own blind spots is difficult, if not paradoxical by definition. Yet it is that difficulty which makes it such a worthy project. In fact, to the Western rationalist mind, paradox is often an indicator of faulty underpinnings. Those who are least affected by implicit bias are, not by coincidence, the most privileged in society. And the most privileged in society have an opportunity to change it. As someone who, more often than not, is privileged, I feel it my responsibility to use my position, potential, and energy to continue bringing the issue of inequality to light. There is no need for me to translate how my work and history will broadly impact society, because my work and history are about the broad unequal impacts in society.


References

Reagle, Joseph. “‘Free as in Sexist?’ Free Culture and the Gender Gap.” First Monday 18, no. 1 (December 30, 2012). http://journals.uic.edu/ojs/index.php/fm/article/view/4291.

“Talkin’ Reckless.” Accessed October 23, 2015. http://talkinreckless.com/.

Klein, Maximilian, Thomas Maillart, and John Chuang. “The Virtuous Circle of Wikipedia: Recursive Measures of Collaboration Structures.” In Proceedings of the 18th ACM Conference on Computer Supported Cooperative Work & Social Computing, 1106–15. CSCW ’15. New York, NY, USA: ACM, 2015. doi:10.1145/2675133.2675286.

Klein, Maximilian. “Wikipedia in the World of Global Gender Inequality Indices: What The Biography Gender Gap Is Measuring.” In Proceedings of the 11th International Symposium on Open Collaboration. San Francisco: ACM, 2015. http://www.opensym.org/os2015/proceedings-files/p404-klein.pdf.

Klein, Maximilian, and Piotr Konieczny. “Gender Gap Through Time and Space: A Journey Through Wikipedia Biographies and the ‘WIGI’ Index.” arXiv:1502.03086 [cs], February 10, 2015. http://arxiv.org/abs/1502.03086.

Klein, Maximilian, and Alex Kyrios. “VIAFbot and the Integration of Library Data on Wikipedia.” The Code4Lib Journal, no. 22 (October 14, 2013). http://journal.code4lib.org/articles/8964?utm_source=rss&utm_medium=rss&utm_campaign=viafbot-and-the-integration-of-library-data-on-wikipedia.

Campbell, Lorna. “LRMI Implementation Cases Study: Untrikiwiki | Open World.” Accessed October 23, 2015. https://lornamcampbell.wordpress.com/2014/09/04/lrmi-implementation-cases-study-untrikiwiki/.