There are only two kinds of people in the world. Are you left-brained or right-brained? Type A or Type B? Are you a 1 or a 0? Quantitative or creative? Creator or builder? Art or algorithms? Numbers or words? Shakespeare or Einstein?
I recently spoke to a Bowdoin student who claimed, “Bowdoin students can’t do math.” Arguably, the student’s standard was biased by an academic schedule heavy in the quantitative fields, but the remark raises a question—in today’s data-driven world, is the ability to code or perform calculus fundamental to being a productive member of society? Are liberal arts students who focus on the humanities or social sciences setting themselves up for failure in the Digital Age?
To answer these questions, I spent two hours with Associate Professor of Digital Humanities Crystal Hall and Director of Quantitative Reasoning Eric Gaze on the third floor of the Visual Arts Center, the cozy home of the Digital and Computational Studies department.
To ground our conversation, I shared two articles with them beforehand: “Google and Waze Must Stop Sharing Drunken-Driving Checkpoints, New York Police Demand,” by Michael Gold in the New York Times and “San Francisco Wants to Ban Government Face Recognition,” by Sidney Fussell in The Atlantic. I encourage you to read these articles yourself.
During our talk, the two professors focused on the competing interests of the state, the market and the private citizen or consumer with respect to issues concerning surveillance.
Gaze commented first. “The Waze app is a great way to counter the idea that the consumer is always right. For example, I want to use an EZ-Pass. And yet, do we want the idea that the government can give us tickets because of its access to your EZ-Pass? Most of us would call that invasive, and I agree. Surveillance is this issue that, when it benefits the consumer in terms of convenience, is great, but when it infringes upon their liberties, is an issue.”
I asked if there is a way to define this line. Hall provided a possible solution to the diverging interests.
“Let’s start over with the data. And let’s get the specialists in the room. The people who are impacted by the data. The people who create the data,” she said. “Let’s have an equal conversation about these situations, so the data actually captures the reality on the ground as opposed to relying on proxies we assume are going to accurately capture the reality. Let’s actually talk to the people that are living through these things, then see what we should be measuring.”
Hall pointed to a grassroots initiative in Chicago that uses sentiment analysis (automated tracking of emotional tone in text), in combination with machine learning, to estimate the likelihood of violence from the tweets of known gang members. The initiative partners with, and provides protection to, incarcerated or recently released gang members to gauge sentiment.
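At its simplest, sentiment analysis assigns text a score based on the emotional weight of its words. The toy scorer below is only a sketch of the idea; the word lists and weights are invented for illustration, and real systems like the one Hall describes rely on trained models and far richer features.

```python
# Invented toy lexicons for illustration only.
POSITIVE = {"peace", "love", "calm", "happy"}
NEGATIVE = {"angry", "fight", "revenge", "hate"}

def sentiment_score(tweet: str) -> int:
    """Crude sentiment: +1 per positive word, -1 per negative word."""
    score = 0
    for word in tweet.lower().split():
        word = word.strip(".,!?#@")  # drop simple punctuation
        if word in POSITIVE:
            score += 1
        elif word in NEGATIVE:
            score -= 1
    return score

print(sentiment_score("So much love and peace today"))  # 2
print(sentiment_score("angry and ready to fight"))      # -2
```

Even this crude version shows why Hall insists on involving the people who create and live with the data: the lexicon itself encodes assumptions about what words mean in context.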
“Computer scientists can write code that can perform a task, but without knowing the meaning behind the data that’s going into it,” she said. “We need to start having a more holistic discussion about data management. That’s where I’m increasingly feeling we need to go.”
I asked if this necessarily means involving more people with a humanities background.
Hall chuckled. “As a Galileo scholar, everything I default to is in the 16th century. And I mean to point to this [century] as this watershed moment when the sciences and the humanities part ways. It is a false divide at that moment, but it is a belief that will persist until the 20th century: the perceived ‘two cultures,’ a division between ways of thinking and analyzing human experience. I think some of that comes to bear now. But I think the silos have been constructed so strongly that the people that should be cooperating are not.”
My repository of Galileo informatica is sparse, so I asked Hall to expand upon her reference to the mathematician.
“Galileo was a mathematician who strayed into philosophy, because he was using the telescope to look at the moon,” she said. “He was trying to bring all of the information available at the time to understand why the surface of the moon was rough when everyone said it was supposed to be a perfect celestial body. No one had done that before—he was accused of academic trespassing. This moment is seen as when the rational mind comes to bear on science and knowledge and the creative or poetic mind is diminished in terms of what people understand as capacity to understand the world.”
“The quantitative versus the qualitative,” Gaze said. “I would say the quantitative is the low-hanging fruit—it is easy to quantify things. Mark behavior with a number and categorize according to that parameter. The numbers help, but don’t tell the full story.”
“However,” Gaze continued, “this isn’t to suggest that algorithms are entirely useless. They are really great and help us in so many ways. Take health, for example. Machine learning can arguably analyze symptoms and give diagnoses at a much faster pace, and more accurately, than leaving the number crunching to humans. You still want that doctor! By offloading the calculations, it allows doctors to spend more time with their patients. Almost become more human, in a way.”
I checked my pulse. Yep, still human.
Quantitative or creative? Maybe the answer should be both.