Review: Invisibility, MOD Museum, Adelaide
Misinformation, algorithms, big data, care work, climate change, cultural knowledge: they can all be invisible.
In her New York Times bestseller Weapons of Math Destruction (2016), subtitled “How Big Data Increases Inequality and Threatens Democracy,” mathematician and data scientist Cathy O’Neil unpacks the elusive algorithms of our everyday lives and asks how accurate, fair or biased they might be.
Algorithms hide behind the supposed objectivity of mathematics, but they largely encode the biases, subjective decisions and cultural frameworks of those who design them. With few details released about how these algorithms are created, O’Neil describes them as “impenetrable black boxes”.
The opacity is intentional.
In one of MOD’s spacious upstairs galleries, we’re greeted by large text as we enter: “What do the algorithms think of you?”
Can an algorithm think, we ask? And, if so, what informs the decisions it makes about us?
Biometric Mirror was created by sci-fi artist Lucy McRae and computer scientists Niels Wouters and Frank Vetere. They built an algorithm to judge our personalities by asking 33,000 people to look at photographs of faces and propose possible personality traits.
We don’t see whose photos were used or who made the assessments – and so we don’t know what biases might be reproduced.
You are invited to look at yourself in a mirror that scans your face. From this analysis, the algorithm creates a profile of your age, gender and personality, which appears as statistical data overlaid on your reflection.
When I look in the mirror I am told that I am neither trustworthy nor emotionally stable. The algorithm misses my age by a few years, and I score highly for intelligence and for uncertainty – an unsettling combination.
Despite my doubts about the algorithm, I find that I focus on the most favorable data.
In this context, the data is benign. But facial recognition technology has been used to investigate and monitor activists and has been responsible for thousands of misidentifications by police in the UK.
Using data to inform cultural insights
In one of the most impressive works in the exhibition, contemporary data visualization is used to illustrate Indigenous forms of knowing and the intrinsic relationship between spatial awareness, country and kinship.
Ngapulara Ngarngarnyi Wirra (Our Family Tree) is a collaboration between Angie Abdilla of design agency Old Ways, New, Adnyamathanha and Narungga footballer Adam Goodes and contemporary artist Baden Pailthorpe.
In every AFL game Goodes played, his movements on the field were recorded via satellite, linked to a tracking device on the back of his jersey. Twenty million data points were then merged with data analyses of a river red gum, or Wirra, to form an impressive data visualization projected onto two large screens in a darkened gallery.
Here, Goodes’ data is sent back to Country to be part of the tree’s roots as well as the swirling north and south winds of his ancestors. The data is also translated into sound and amplified, inviting us to listen to what would otherwise be inaudible.
In a small room between the screens – or inside the tree – drone footage of Adnyamathanha Country (the Flinders Ranges) plays to a narration of the creation story in the Adnyamathanha language.
The result is a synthesis of traditional Indigenous knowledge with advanced technology, revealing different ways of perceiving space and time.
The power of the unseen
While it’s easy to focus on how technology is used and displayed in the works in Invisibility, in the hallways and hanging from MOD’s ceiling are a few other exhibits that flesh out the concept of invisibility.
Women’s Work celebrates the leadership of South Australian Aboriginal women with striking black and white photography. Tucked away in the hallway on the second level is Fostering Ties, a series of images drawing attention to children in foster care.
This exhibition presents invisibility as a way of confronting our own blind spots, knowledge systems, prejudices and cultural frameworks.
What is invisible to us may not be invisible to those who belong to demographic, cultural or linguistic groups different from ours.
Calling attention to the invisible encourages us to change perspective. If we don’t have the answer to a problem, maybe another cultural perspective – or way of life – does.
Invisibility is at MOD until November 2022.