Artist's Statement: We interact with our computers constantly, touching them more than we touch any person in our lives, and grooming them inside and out. For a month, I recorded all interactions with my phone and fed them into a machine learning system, which then output new, learned gestures. These "hallucinated" movements are awkward yet eerily accurate swipes, taps, and typing based on what my computer has learned from my interactions with it. Presented as an interactive sculpture, the gestures are enacted by a small robotic arm on the visitor's palm as they sit at a low, altar-like table. Notions of "you," "me," and "I" are doubled: the piece enacts the machine's understanding of me and, at the same time, a self-portrait of my interaction with it.
Developed while artist-in-residence at Bell Labs, this project explores the personal relationships we have with computers and technological systems. Following an approach laid out by Object-Oriented Ontology, rather than seeing human-centric metaphors as a failing when trying to understand an object, my work attempts to find poetic ways to increase the agency of, and empathy for, computers as a way of unpacking that relationship. Mind/body dualism is then not only about the human self; this piece extends the conversation to include the bodies and minds of our technological devices, and to complicate the notion that we are separate from them at all. By using a machine-learning system trained on the intimate gestures of interacting with my phone, the piece's output is not canned but a summary of my phone's understanding of our relationship.
Artist(s): Jeff Thompson, Stevens Institute of Technology