Our relationship with the world is increasingly mediated by machines - watching what we do online and off, profiling our behaviour, choosing the content we see and interfering with our perception of reality.
Our behaviour is meticulously tracked and analysed by unseen computational forces. Our every click is monitored, and a significant proportion of those clicks relate to pornography. Estimates suggest that up to 30% of internet traffic is pornography-related - and that nearly 40% of that traffic is driven by bots. Despite this, little attention has been paid to what the machines might be learning about this aspect of human activity.
These works consider how machines interpret our seeming obsession with the sexual activities of strangers.
It is a far-from-controversial assertion that much of the history of the created image (certainly to the mid-20th century, if not beyond) has also been a history of the relationship between creator and subject - specifically, between the (male) artist and the (female) model - a relationship which, amongst other narratives, embodies the skewed power dynamic that has existed between men and women for centuries.
This is particularly and uniquely true of male-focused, heterosexual pornography (which, one might argue given the nature of the market, accounted for the vast majority of porn produced in the pre-internet era). The female model, the male photographer and the unseen eye of the eventual consumer combine to create imagery packed with hidden, coded messages about gender and power relations through history.
For the human viewer, images of women produced by men can be understood through a variety of lenses of social, political, economic and sexual history; we know to read a work in light of the circumstances of its creation and to treat it accordingly (or at least we like to think we do).
Now, though, it is not only humans who can see and ‘understand’ images. Technology colloquially referred to as ‘Artificial Intelligence’ is enabling machines to ‘see’ - neural networks can examine and interrogate vast quantities of imagery and derive their own interpretations of what is being ‘seen’. The source material, though, is whatever we choose to feed them - they ‘learn’ from our teachings, and, like all algorithmic ‘learning’, the process magnifies, amplifies and distorts any flaws, quirks or oddities in the training data.
As algorithmic processing becomes more sophisticated and complex, and as more of it is refined and improved upon by the machines themselves, we lose our ability to fully understand how inputs become outputs and how the ‘black box’ is ‘thinking’ - and where that might lead.
What do the machines see when they look at porn? What are we teaching them? What are they learning? And what does it teach us about what we have taught ourselves?
The Machine Gaze is an exhibition which explores these questions through cutting-edge technology, producing a body of work that is immediately recognisable yet utterly alien - a timely critique of what we’re teaching our machines, and of how this might play out.
Utilising a number of machine learning techniques, including Generative Adversarial Networks, neural style transfer and Google’s DeepDream, this collection presents machine re-interpretations of our hidden desires.
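For readers curious about the mechanics behind the first of these techniques, the sketch below shows a single Generative Adversarial Network training step in PyTorch. It is purely illustrative - a toy generator and discriminator with assumed dimensions and placeholder data, not the artist’s actual models or pipeline.

```python
# Illustrative sketch of one GAN training step (PyTorch).
# Toy sizes, placeholder data and hyperparameters are assumptions;
# this is not the exhibition's actual pipeline.
import torch
import torch.nn as nn

latent_dim, image_dim, batch = 64, 28 * 28, 32  # assumed toy dimensions

generator = nn.Sequential(
    nn.Linear(latent_dim, 256), nn.ReLU(),
    nn.Linear(256, image_dim), nn.Tanh(),
)
discriminator = nn.Sequential(
    nn.Linear(image_dim, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1), nn.Sigmoid(),
)

opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)
bce = nn.BCELoss()

real_images = torch.rand(batch, image_dim)  # stand-in for a real training batch

# Discriminator step: learn to tell real images from generated ones.
noise = torch.randn(batch, latent_dim)
fake_images = generator(noise)
d_loss = (bce(discriminator(real_images), torch.ones(batch, 1))
          + bce(discriminator(fake_images.detach()), torch.zeros(batch, 1)))
opt_d.zero_grad()
d_loss.backward()
opt_d.step()

# Generator step: learn to produce images the discriminator accepts as real.
g_loss = bce(discriminator(fake_images), torch.ones(batch, 1))
opt_g.zero_grad()
g_loss.backward()
opt_g.step()
```

In this adversarial loop the generator only ever ‘sees’ the world through the training corpus it is fed, which is why the resulting images inherit - and exaggerate - whatever biases that corpus contains.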
Featuring computer-generated artworks based on a corpus of pornography spanning printed magazines from the 1980s to user-generated clips uploaded to PornHub, Shardcore has reframed Philip K. Dick’s famous question - except that rather than electric sheep, these androids’ dreams are much fleshier in nature.
For press enquiries, please contact themachinegaze@shardcore.org