A photograph that looks like a photo of something or someone (and anything that isn't a photo but looks like one) mainly does one thing: it testifies to that-which-was.
This has been my thesis for a little while now, and it's recognizably lifted directly from Barthes,
so if I'm a crank, at least my crankery has a pedigree! What I mean is that a photo, or something that looks
like a photo, which also looks like it's of something (not an abstract, an obvious collage, or what have you)
mainly asserts that something existed, and it looked like that at a moment in time.
There's other shit these things do, of course. They're a mass of tone and color in pleasing, or less pleasing,
arrangements, and so on. Paintings do all those things, but paintings do not testify in the same way.
Let us now turn our attention to AI-generated photo-realistic imagery.
It functions in the same way a photo does, if it is sufficiently photo-realistic. It cannot do otherwise.
It testifies to that-which-was.
The point is not that it's functioning differently but that its testimony is false.
An unaltered actual photograph cannot be false in the same way. Within the limits of its capacity, its
testimony is completely, utterly, true.
The attentive reader might notice here that I am introducing the idea of index in a way that sidesteps the
traditional analysis of that concept (light particles physically induced a blah blah blah therefore it's a direct
blah blah index index) in order to include digital imagery or whatever. The point is that the testimony is 100%
truthful, within the extremely narrow limits of the medium.
To be clear, I am perfectly aware of the many ways a straight photo can misrepresent reality. My point here is that
there is a core of visual facts about which no straight photo lies. It looked like that. That thing was in that
visual relationship to that other thing. Those two forms overlapped thus. And so on. It is this core of truth
that is the testimony of the photo, no more, but also no less.
It is this core of truth that begins to erode the moment we modify the photo (yes, including burning and dodging, contrast
adjustments, etc., so yes, the truth of the testimony begins to erode immediately; I am also aware that
digital cameras do image processing, thank you).
An AI-generated "photo" testifies in the same way, but its testimony is a complete fabrication.
A perjurer and a priest testify in exactly the same way. The former, however, lies, and we like to imagine that the latter does not.
What is the value of any testimony? Most photos testify as indicated, but nobody cares. Oh, what a nice
bowl of tomatoes. The light falls just so. Who gives a shit? The aesthetics might be nice, and maybe you even
want to decorate your kitchen with a copy of it. But then it doesn't matter whether it's real, Photoshop, or AI.
So what if the tomatoes never existed? Or did? It simply doesn't matter.
Most real photos testify to facts that almost nobody cares about and that don't matter even slightly, to anyone.
If we're talking about aesthetics, and if aesthetics is all we care about, then it doesn't matter how the
dumb thing got made. Its nature as a piece of testimony doesn't matter a fig, although the fact that it adheres
to a photographic aesthetic may.
That said, most real photos testify to something that someone cares about, at least a little.
You and I don't care, but to whoever went to the trouble of hauling out her phone, it matters, at least
enough to take a photo. It's trivial, but it's real. The photo testifies, and to the photographer, that
is in fact what matters. My kid did a cute thing. What a pretty flower. Look at my latte. AI imagery
has no place here.
AI imagery only applies to circumstances where we either don't care about the testimony of the image
(i.e. Fine Art and Fucking Around, ok maybe Stock) or in places where we explicitly want false testimony.
Everyone is focused on the "where we explicitly want false testimony" case because they're worried
about deception. We'll come back to that in a moment.
The point though is that in almost all uses for photography it is the testimony which matters to whoever is
taking the photo, albeit to almost nobody else. Nobody
looking at an especially pretty flower wants an AI to make an even prettier one, they want to record
the one they're looking at. That's literally the point. It's my flower, my child, my
town, whatever. If you just want to make a pretty picture of a flower or a child, you could take up
painting, and nobody paints.
Almost all uses for AI image-generators that I observe today consist of fucking around and discovering the
limits of AI image-generators. The main use case is to post the result online and say "wow, check out what this
AI image generator did." This is already starting to get worn out.
As for the case where someone wants false testimony, well. The trouble with false testimony is that as a rule
it doesn't work. Nobody accepts any testimony of any kind by itself. Whether we mean to or not, we place
testimony in the context of our own world-view, we place it next to other testimony. Even photos, perhaps
especially photos: we don't believe testimony unless it supports a larger, more or less coherent, picture
of the world.
The only actual use cases for AI imagery that strike me as having any legs at all are basically variations
of "I wish I could paint, but I can't," which honestly seems a bit thin. Not sure there's a big market here.