Yesterday, Cathy O’Neil, author of the well-regarded book, Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy, published a New York Times op-ed that was complete bullshit. It was a bummer because many people like and respect O’Neil’s work, but now they are left wondering what gives.
O’Neil’s op-ed was titled “The Ivory Tower Can’t Keep Ignoring Tech.” It fits a well-worn genre where writers decry how out-of-touch professors are. Wouldn’t it be great if those bookworms caught up? Shouldn’t they be creating the student of the future? Yada, yada, yada. The genre is especially beloved by certain kinds of conservatives who seek to delegitimize higher education, cut its funding, and destroy tenure. Oh, and by Nicholas Kristof.
O’Neil begins her piece by explaining that there are things called algorithms that “choose the information we see when we go online, the jobs we get, the colleges to which we’re admitted,” etc. “It goes without saying,” she writes, “that when computers are making decisions, a lot can go wrong.” It goes without saying in other ways, too. If you are a researcher who examines technology and society — and believe me, there are literally thousands of people around the world who do this — O’Neil’s sentences are laughably obvious. Indeed, “algorithm” has become such an overused buzzword in academia that colleagues and I have been poking fun at it for at least five years. Thus, this really funny tweet.
In Cathy O’Neil’s world, none of these conversations have been happening. She makes this strong (untrue) claim: “Academics have been asleep at the wheel, leaving the responsibility for this education to well-paid lobbyists and employees who’ve abandoned the academy.” Later she doubles down: “There is essentially no distinct field of academic study that takes seriously the responsibility of understanding and critiquing the role of technology — and specifically, the algorithms that are responsible for so many decisions — in our lives.”
Let’s parse this double-down for a second: “There is essentially no distinct field of academic study that takes seriously the responsibility of understanding and critiquing the role of technology . . . in our lives.” Hogwash. There is a distinct academic field dedicated precisely to this cause. It’s called Science and Technology Studies, or Science, Technology, and Society (both STS). It emerged in the late 1960s and 1970s as scientists, engineers, activists, and others began to ask tough questions about environmental degradation, the Vietnam War, nuclear arms and power, and such.
There are STS programs, centers, and whole departments in countries around the world. I helped found an STS program at Stevens Institute of Technology, and this summer started working in the STS department at Virginia Tech, which has been graduating PhDs working on these kinds of topics for over twenty years. The issue of technology and responsibility is tackled at the Society for the History of Technology, the History of Science Society, the Society for Social Studies of Science, and several other international professional gatherings and conferences on this planet. (Below, I’ll point out that there are lots of people working on these issues in other kinds of programs/departments, too.)
“And specifically, the algorithms that are responsible for so many decisions.” Again, so much horseshit. I mean, really. There are so many people and academic organizations working on algorithms that I’m sitting here staring at the screen not knowing where to begin. I guess you could start with well-funded and fairly famous institutions, like Harvard’s Berkman Klein Center for Internet & Society, the private research institute Data & Society, and other ampersand lovers. There’s Berkeley’s Algorithms in Culture group and the Algorithm Studies Network. Media studies departments, library schools, and the computer history organization SIGCIS have loads of thinkers researching algorithms. Personally, I have most closely followed the work of Frank Pasquale, a law professor at the University of Maryland, who published The Black Box Society: The Secret Algorithms that Control Money and Information over two years ago. Just SO MANY PEOPLE.
Is Cathy O’Neil such a bad researcher that she is incapable of using the cutting-edge research tool called Google? All of the organizations and individuals above would have come up through simple web searches. Here’s a fun game: Take the faddish and kind of hollow academic formulae “Critical X Studies” and “Critical Studies of X” and pump in your favorite buzzword. What do we get with “algorithm”?
Oh, shnap. People! Research! Work! Stuff!
For sure, algorithms have become such an obvious topic that scholars in mainstream academic disciplines have been studying them . . . also for years. Like, here is a video of the perfectly ordinary American historian Louis Hyman, Director of Cornell University’s Institute for Workplace Studies and Future of Work program, asking, “Is My Employer An Algorithm?” By the way, researchers have now put up an editable list of people working on Algorithmic Fairness, Accountability, and Transparency on Twitter. There are lots of them.
The saddest part of all, to my mind, is that O’Neil could have used this moment to highlight the work of junior scholars, especially that of younger women. (Like, hey, you should really check out anything the terrifyingly-bright-oh-my-God-I-wish-I-was-half-as-smart-as-she-is Stephanie Dick even touches.) danah boyd [sic], head of Data & Society, has famously done just an incredible job fostering the careers of women who study digital topics and who, sadly, are often ignored on the job market. Instead of helping such junior scholars, Cathy O’Neil used her soap box to pretend they don’t exist.
You can also approach this issue from another angle. It’s not like engineers and computer scientists have been waiting around for other people to come and inject them with morality. STEM majors throughout the United States typically require students to get some education in ethics. I taught a course called Computers & Society, which always covered algorithms and how we build politics and morality into technology, for five years at Stevens Institute of Technology. It continues to be taught there, and smart people teach similar classes at hundreds of universities around the globe.
Similarly, Costa Samaras, an assistant professor of Civil and Environmental Engineering at Carnegie Mellon University, points out that the Department of Engineering and Public Policy at CMU describes its work this way: “Technology can help us build a happier, freer, and more fulfilling life, while keeping risks and undesirable impacts at acceptable levels. However, this process isn’t automatic; it takes careful hard work by people who understand both technology and society.” Folks there work on digital topics, including algorithms. Perhaps more to the point, CMU’s Department of Computer Science put up this post on Facebook.
Cathy O’Neil’s claims are nonsense.
Now, you might say that one big problem is that many academics suck at communicating, write tortured, pain-inducing sentences, and use garbage, meaningless jargon, like “sociotechnical imaginaries” or whatever, and I would agree with you. But I would also point out that there are many, many scholars who excel at clear communication and who write for popular outlets, like the New Yorker, Slate, Aeon, the Los Angeles Review of Books, Harper’s, Pacific Standard, the Atlantic, and . . . the New York Times! And besides, this is a different conversation than the one Cathy O’Neil chose to have.
When O’Neil realized things were going badly for her, she put up this half-assed apology and explanation on Twitter.
The only problem is that’s not what she argued. One of her earlier tweets repeated the “asleep at the wheel” line.
So what gives? Why did Cathy O’Neil make all these false claims? Well, we can’t tell yet. But typically when we see bogus statements about how “NO ONE IS DOING THIS,” they are followed quickly by “BUT LUCKY FOR YOU, HERE I AM!” This is why O’Neil’s op-ed probably would have been better titled, “You Have Been Waiting Your Whole Life for Me.” As one colleague wrote, “My theory is she’s launching an institute and this op-ed was trying to drum up interest/curiosity, and it just backfired.” Don’t be surprised if Cathy O’Neil soon announces she has the solutions for all our algorithmic problems — which you would do well to support financially.
In the end, either it is Cathy O’Neil who’s asleep at the wheel or she chose to misrepresent reality for self-serving ends. Neither way is good, and either way she should apologize.