On How Computers See Race

Does technology covertly hold the biases of its creators? Alexis Stevens, writing in Cluster Magazine, examines an unintended dimension of facial-recognition-based surveillance software:

“HP Computers are Racist” is a 2009 YouTube video in which two electronics store employees demonstrate how face recognition and video tracking technology on Hewlett-Packard computers works more accurately for people of whiter skin tones. “I think,” one of the employees remarks with biting accuracy, “my blackness is interfering with the computer’s ability to—to follow me.”

The company issued an apology after the clip went viral, suggesting that face-detection algorithms have more difficulty identifying the contrast that helps discern facial structure in low lighting. An ironic outcome of this corporate oversight is that while black people are more likely to be eyed as suspicious and tracked in real life (e.g. stop-and-frisk), the engineering of webcams for a presumptively white target audience renders people of color more invisible to technology.

Whether it’s via Photo Booth, surveillance cameras, drones, or Instagram […] we are now beginning to see ourselves through the screen and think of ourselves as perpetually watched beings.

The chicken-vs.-egg question […] is: Are computers racist or are the humans who advance computer technology racist? The “Object-Oriented-Ontology” approach to technology favored by New Aesthetic theorists suggests that the answer is both. OOO is about our gaining empathy with our digital tech, and privileging the relationship that we have with these objects. By imparting agency to them, we can begin to imagine their inner lives and how they might relate to our condition. As Meg Jayanth writes, “phones know their location, algorithms read the news, the camera-mounted car is an organ of sight for the diffuse Google Street View body.” Some (arguably-Ellisonian) computers just don’t see black people.
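The contrast problem the quoted piece describes can be made concrete. Classic face detectors of that era (Viola-Jones style) score Haar-like features: differences between pixel sums of adjacent rectangles, compared against thresholds learned from training images. This is a minimal sketch, not HP's actual code; the patch values and threshold are hypothetical, chosen only to illustrate how the same facial structure under dimmer, lower-contrast lighting can fall below a cutoff tuned on well-lit faces:

```python
def haar_two_rect(patch):
    """Two-rectangle Haar-like feature: mean of bottom half minus mean of top.

    `patch` is a list of rows of grayscale values (0-255). For a face-like
    patch, the top half models a darker eye band, the bottom brighter cheeks.
    """
    half = len(patch) // 2
    top = [v for row in patch[:half] for v in row]
    bottom = [v for row in patch[half:] for v in row]
    return sum(bottom) / len(bottom) - sum(top) / len(top)

THRESHOLD = 40  # hypothetical cutoff, as if learned on well-lit training faces

# Well-lit "face": dark eye band (60) over bright cheeks (180) -> feature 120.
bright = [[60] * 4] * 2 + [[180] * 4] * 2
# Same structure, dim and low-contrast: 30 over 60 -> feature only 30.
dim = [[30] * 4] * 2 + [[60] * 4] * 2

print(haar_two_rect(bright) >= THRESHOLD)  # True  -> detected
print(haar_two_rect(dim) >= THRESHOLD)     # False -> missed, same face shape
```

The geometry of the face is identical in both patches; only the illumination and contrast differ, yet the fixed threshold sees a face in one and nothing in the other.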

4 Comments on "On How Computers See Race"

  1. Earth Star | Aug 23, 2012 at 4:40 pm |

    Let’s be clear on this. The technology has trouble identifying people of darker complexion. But they have NO trouble seeing them and labeling dark skins as unidentifiable threats still.
    Just like cops and airport folks already do.

  2. Big Homelander is watching Hue

  3. Taan Maat | Aug 23, 2012 at 10:51 pm |

     Good Evening in this New World

  4. I don’t see being easy to track as an advantage.

Comments are closed.