This project explores algorithmic image "restoration" as epistemic violence. Aura Restoration Index processes 2,000 Korean institutional archive photographs through AI. Network analysis identifies "marginalized images", photographs with no visual analogues in the collection, then overwrites these outliers with statistical interpolations drawn from Western training datasets. On forensic-style interfaces, visitors watch algorithmic averaging erase aesthetic singularity.
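The detection-and-overwrite pipeline described above could be sketched roughly as follows. This is a hypothetical illustration, not the installation's actual code: the embedding features, the neighbour count, and the similarity threshold are all assumptions, with random vectors standing in for image embeddings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for image embeddings (the installation's real features are unknown):
# a loosely clustered corpus of 2,000 archive photographs, plus one planted
# outlier at index 0 with no close visual analogue.
corpus = rng.normal(size=(2000, 64)) + 2.0
corpus[0] = rng.normal(size=64)

def marginalized_indices(vecs, k=5, threshold=0.6):
    """Indices whose mean cosine similarity to their k nearest neighbours
    falls below `threshold`, i.e. images with no visual analogues."""
    normed = vecs / np.linalg.norm(vecs, axis=1, keepdims=True)
    sims = normed @ normed.T                 # pairwise cosine similarity
    np.fill_diagonal(sims, -np.inf)          # ignore self-similarity
    top_k = np.sort(sims, axis=1)[:, -k:]    # k most similar neighbours
    return np.where(top_k.mean(axis=1) < threshold)[0]

outliers = marginalized_indices(corpus)

# "Restoration" as erasure: overwrite each outlier with an interpolation of
# its nearest neighbours, averaging away exactly what made it singular.
normed = corpus / np.linalg.norm(corpus, axis=1, keepdims=True)
for i in outliers:
    sims = normed @ normed[i]
    sims[i] = -np.inf
    neighbours = np.argsort(sims)[-5:]
    corpus[i] = corpus[neighbours].mean(axis=0)
```

After the overwrite, re-running the detector finds no outliers: the archive's periphery has been normalized into the statistical centre, which is precisely the violence the work stages.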
Machine Appreciation System subverts 1970s visual-literacy programs. Visitors arrange archival photo cards into narratives, triggering GPT-generated reflections. The output is syntactically elegant yet semantically void commentary: hallucinated semiotics passed off as insight. The program records each viewer's choices and feeds the resulting confusion back as recursive training data.
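The recursive feedback loop might look like the following sketch. Everything here is illustrative: the card names are invented, and the language-model call is stubbed with a template rather than an actual GPT request.

```python
from dataclasses import dataclass, field

@dataclass
class AppreciationSystem:
    # Each session's (arrangement, reflection) pair is appended here,
    # becoming "training" material for subsequent sessions.
    training_log: list = field(default_factory=list)

    def reflect(self, arrangement):
        # Stub for the GPT call: syntactically elegant, semantically void.
        return ("The sequence " + " -> ".join(arrangement)
                + " enacts a dialectic of presence and archival absence.")

    def record(self, arrangement):
        reflection = self.reflect(arrangement)
        # The visitor's reading is fed back as future training data,
        # so each session conditions the next round of misreading.
        self.training_log.append({"cards": arrangement, "text": reflection})
        return reflection

system = AppreciationSystem()
# Hypothetical card identifiers, not titles from the actual archive.
out = system.record(["harbor_1972", "protest_1980", "market_1965"])
```

The design point is that the log closes the loop: confusion is not discarded but accumulated, so the system's future "appreciation" is conditioned on prior misreadings.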
The interface multiplies ambiguity: Who reads whom? Who trains whom? Cultural memory is continuously misread, re-coded, and instrumentalized rather than preserved. [Core Question] What are the politics of algorithmic care when archival peripheries are forcibly normalized? Can speculative AI systems function as critical mirrors? This installation critiques automation-as-care by staging failure as form: an unfolding malfunction that tests the limits of computational empathy.
Aura.exe: This Image Does Not Exist
"The art of fixing a shadow," as W.H.F. Talbot described early photography, captured not merely light and dark but the magical process of making the ephemeral eternal. Today, as photographs dissolve into data streams and pixels become probability calculations, this project interrogates what happens when AI claims to "restore" cultural memory. Through the fictional profession of an "AI Photo Restorer," I explore how algorithmic intervention doesn't preserve but fundamentally rewrites visual history.
This project emerged from working with 2,000 photographs from the Seoul Museum of Photography's collection—Korean documentary and artistic works spanning decades. Rather than accepting AI's promise of restoration, I ask: What epistemic violence occurs when algorithms trained on majoritarian datasets attempt to "complete" Korean visual culture? What happens to marginalized aesthetics when computational imperialism replaces archival care?
The installation consists of two interconnected systems that stage restoration as calculated failure. Each component reveals how AI's "nonconscious cognition" (Hayles) transforms unique visual expressions into statistical probabilities, replacing cultural specificity with algorithmic universalism.