Fly AI: crucial system mistakes


Computer software with increasing learning and problem-solving capabilities is being rapidly integrated into the background of our lives, monitoring many of our daily moves to make them easier and safer, from managing our emails to driving our cars. In this rapid process, it becomes easy to overlook or minimize the challenges that may come with these benefits. Some artists are quite attentive to this kind of intimate entanglement and try to call into question the fast, and perhaps uncritical, integration of AI technologies, while seeking to unpack their potential and challenges. Fly AI, by David Bowen, is a piece that shows us that we don’t necessarily have to imagine sci-fi scenarios populated with superintelligent beings in order to reflect critically upon the ways in which we are endowing computers with the power to manage many aspects of our lives. Instead, the piece shifts our view towards more plausible current applications and implications of AI capabilities, such as machine perception, in a somewhat ironic and thought-provoking manner.

Simply put, an AI system controls the fate of a colony of houseflies. This apparently bizarre experiment around a contained world where living creatures are sustained by machines might evoke a somewhat futuristic scenario, yet it relies on a simple and commonly accessible build. The system uses a camera linked to a Raspberry Pi running the TensorFlow machine-learning image-recognition library (open-sourced by Google) to “guess” whether or not what it “sees” is a fly. When the camera captures an image, the image-recognition software classifies it and produces a ranked list of up to five guesses as to what the observed item might be. The narrative is straightforward: whenever the system correctly identifies a fly, ranking “fly” first on the list, the colony is fed with water and nutrients in proportion to the confidence of that guess.
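The feeding rule described above can be sketched in a few lines of Python. This is a hedged illustration only, not Bowen's actual code: the function name `decide_feeding`, the label string `"fly"`, and the stubbed top-5 list are all assumptions standing in for the output of a real TensorFlow image classifier.

```python
# Illustrative sketch of the decision logic, under assumed names.
# A real deployment would obtain `top5` from a TensorFlow model's
# ranked predictions; here it is supplied directly as a stub.

def decide_feeding(top5):
    """top5: list of (label, confidence) pairs, ranked best-first.

    Returns the fraction of a full ration to dispense: if "fly" is
    the top-ranked guess, feed in proportion to its confidence;
    otherwise the colony receives nothing this cycle.
    """
    if not top5:
        return 0.0
    label, confidence = top5[0]
    return confidence if label == "fly" else 0.0

# A correct top-1 identification feeds the colony proportionally...
print(decide_feeding([("fly", 0.62), ("bee", 0.21), ("ant", 0.05)]))  # 0.62
# ...while a miss, even with "fly" ranked second, feeds nothing.
print(decide_feeding([("bee", 0.40), ("fly", 0.35), ("ant", 0.10)]))  # 0.0
```

The point the sketch makes concrete is how brittle the rule is: a single misranking, however narrow the margin, cuts off the colony's supply entirely.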
Otherwise, if it fails to rank “fly” as the first match, the colony is left to starve. In the process, an eerie voice slowly and systematically describes everything it believes it has identified, while at the same time voicing its doubts and errors. As Bowen reveals when portraying the piece, the system often makes mistakes due to limitations in its image-recognition training, thus explicitly exposing the imperfections and the actual lack of accuracy on which the fate of the flies depends. Provocatively enough, the artist specifies that “the system is setup to run indefinitely with an indeterminate outcome”. In the context of Bowen’s work, which explores the outcomes of and intersections between machines and the natural world, this piece creates a particular interdependency through a sort of perverse photogeny to which the unaware beings ultimately have to correspond in order to survive, regardless of the faults and limitations of the system. Without giving any absolute answers, the installation eloquently questions what might happen when we start endowing machines with the human ability to make (often mis)informed observations and decisions about the world. Luisa Ribas


David Bowen – Fly AI