
Cathy O’Neil Visits Bowdoin

November 5, 2021

The Bowdoin Department of Mathematics welcomed Cathy O’Neil to campus for the annual Cecil and Marion Holmes Lecture on Monday. An accomplished author and Ph.D. graduate, O’Neil had an extensive career in finance and academia before founding O’Neil Risk Consulting & Algorithmic Auditing (ORCAA), an algorithmic auditing company. Her book and lecture, both entitled “Weapons of Math Destruction,” addressed algorithms and their societal implications.

O’Neil began the evening by refuting the widely held misconception that algorithms are perfectly rational and unquestionably fact-based.

“What [algorithms] really end up doing is, very carefully, embedding opinions and political agendas of their makers into something that is really, really hard to understand,” she said.

This opacity is part of what makes algorithms so subtly dangerous. According to O’Neil, an algorithm rises to the level of a ‘Weapon of Math Destruction,’ or WMD, when it is widespread, mysterious (secret, often shielded from the public eye) and dangerous (capable of negatively affecting large swaths of people). Often, these algorithms serve as the perfect mechanism for avoiding accountability.

O’Neil highlighted three situations in which WMDs exacerbated already large-scale problems, beginning with public school teaching quality. Value-added models (VAMs), algorithms that attempt to measure a teacher’s impact on their students, were implemented by the Bush and Obama administrations. Ultimately, VAMs proved grossly unable to fulfill their intended purpose and were responsible for the unjust firing of teachers.

WMDs have similarly become a problem in the hiring industry. Specifically, O’Neil shared the story of a young man applying to work at Kroger who was denied an interview because an algorithm determined that he had failed a mental health assessment. Such an assessment was a flagrant violation of the Americans with Disabilities Act (ADA), but because the algorithm was embedded in an otherwise innocuous survey, it went undetected.

“WMDs are not just destructive to the individual, they’re destructive to society. They do more than just idiosyncratically deny people things they deserve,” O’Neil said. “They typically set out to solve a big messy problem and not only do they fail to solve that problem, they make it worse.”
