Searching through my draft blogs, I find that the earliest is an un-annotated but prophetic collection of quotes I compiled in 2011. They are from Walter Isaacson's Steve Jobs biography, that too-strong dose of propaganda that turned me off of the mystique of supposedly messianic technology. (Perhaps I don't give enough weight to the fact that it was the first and last book that I read entirely on a laptop.) I admit I was - and still am - wooed by the allure of technolust, but at that moment I stopped seeing computational progress as a deliverance and started seeing it as a blunt tool, often overdressed.
Ironically, it was Bill Gates's quotes in this book that dislodged my assumptions and set the changing of paradigms in motion. (Of course, at that time it still took a white man's voice to get me to hear it.) When asked what he thought about the iPad 2, Gates graciously quipped, “Here I am, merely saving the world from malaria and that sort of thing, and Steve is still coming up with amazing new products.” The bite of this sarcasm shattered the coordinates of my reality. We still have malaria to eradicate! A marketing-driven tech trend seems insignificant in this context.
I couldn't have foreseen it then, but 5 years later it feels inevitable that this summer I am working at Data Science for Social Good (DSSG), the "University of Chicago summer program to train aspiring data scientists to work on data mining, machine learning, big data, and data science projects with social impact." The fit is perfect: I am shoulder to shoulder with challenging colleagues who have as much of a penchant for the command line as a desire to address human problems in non-paternalistic ways.
Specifically, my team's charge these 12 weeks is to partner with Tulsa Public Schools to better predict which 3rd graders are likely not to pass reading proficiency tests. Armed with the school data of Oklahomans aged 4-9, we aim to help struggling students as early as Kindergarten by suggesting which programs will likely improve their reading the most.
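In machine learning terms, that charge boils down to a binary classification problem: given what the district already knows about a young student, predict whether they will pass the 3rd grade reading test. Here is a minimal sketch of that framing; the column names, the CSV file, and the choice of model are placeholders of my own for illustration, not our actual pipeline or the Tulsa schema.

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Hypothetical per-student table; all column names are placeholders.
students = pd.read_csv("students.csv")
features = students[["attendance_rate", "days_enrolled", "num_schools_attended"]]
passed = students["passed_3rd_grade_reading"]  # 1 = passed, 0 = did not pass

# Hold out a slice of students to check how well the model generalizes.
X_train, X_test, y_train, y_test = train_test_split(
    features, passed, test_size=0.25, random_state=0)

# Any off-the-shelf classifier serves for a first pass.
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

# Rank students by predicted risk of not passing, so that reading
# interventions can be suggested as early as Kindergarten.
risk_of_not_passing = 1 - model.predict_proba(X_test)[:, 1]
```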
Perhaps what's most impressive to me about the DSSG organization is their healthy skepticism and introspection. There are required weekly discussion sections around the ethics of data science. Here the questions top of mind are in the vein of whether "approaching the world as a software problem is a category error that has led us into some terrible habits of mind" and whether "machine learning is like money laundering for bias."
The unresolved dilemma I dealt with this week was how to handle students' grades as data in our prediction problem. I asked a colleague of mine who is working on another education project how she was treating grade data. The obvious thing to do, she responded, was to calculate GPA, plus numeric counts of each letter grade earned. She was working at the high school level, but our early primary school students are so young that I take issue with extracting meaning out of their academic "performance". This issue stems from the way that grades personally (dis)served me growing up. I felt that they were artificial switches I was meant to manipulate, completely disconnected from education. In some way I don't want to reify the power of the grade algorithmically. On the other hand, we want the most accurate models to best help our students. I've decided to take a cue from the team working to identify police officers likely to have adverse incidents with the public, who have chosen not to include race as an attribute in their model, and eschew GPA for the time being.
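For concreteness, here is a minimal sketch of what my colleague's grade features would look like in pandas. The tiny table and its column names are made up for illustration, not our actual data; the point of the final comment is that, for now, we simply leave these columns out of our feature matrix.

```python
import pandas as pd

# Made-up grade records: one row per student per course.
grades = pd.DataFrame({
    "student_id": [1, 1, 1, 2, 2],
    "letter": ["A", "B", "A", "C", "B"],
})

# Map letters onto the usual 4-point scale.
points = {"A": 4.0, "B": 3.0, "C": 2.0, "D": 1.0, "F": 0.0}
grades["points"] = grades["letter"].map(points)

# My colleague's approach: GPA plus a count of each letter earned.
gpa = grades.groupby("student_id")["points"].mean().rename("gpa")
letter_counts = (
    grades.pivot_table(index="student_id", columns="letter",
                       values="points", aggfunc="count", fill_value=0)
    .add_prefix("count_")
)
grade_features = pd.concat([gpa, letter_counts], axis=1)

# Our choice for now: build the model's feature matrix without
# grade_features at all, much as the police project leaves out race.
```

Whether leaving them out costs us predictive accuracy is exactly the tension above, and it's a choice we can revisit later in the summer.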
Still, I'll acknowledge my privilege to be able to join a Gates-style tech-for-good crusade this summer. I read more now into his 2011 Forbes interview: "The metric of success is lives saved, kids who aren’t crippled,” says Gates. “Which is slightly different than units sold, profits achieved. But it’s all very measurable, and you can set ambitious goals and see how you do.” The ambitious goals have been set at DSSG; now there are 5 weeks left to see how we do.