Long time readers, as well as n00bs with some exposure to the right sorts of rather pointy-headed theory, will recall that straight photographs are considered as "indexes" of whatever the camera was pointed at. In general, you can take an arbitrary speck of tone or color, and trace it back to the original scene and say "that bit comes from that bit" for every bit. Slightly more generally, you could probably say something like this:
There is a breakdown of the index into a large number of very simple units, which breakdown has the property that each of the units can be uniquely identified as coming from a specific very simple, equivalent, aspect of the original scene.
So this covers raw files, but also things like spectral decompositions and so on.
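If it helps to see that spelled out, here is a toy sketch in Python of the "that bit comes from that bit" mapping, using a bare-bones pinhole model. Every name and number in it is made up for illustration; no real camera or raw format is implied.

```python
# Toy model of indexicality: one raw pixel value traces back to exactly one
# photosite, and hence to one narrow bundle of light from the scene.

def trace_raw_pixel(x, y, focal_length_px, cx, cy):
    """Map raw pixel (x, y) back to the scene direction it recorded,
    under a simple pinhole-camera model (all parameters hypothetical)."""
    dx = (x - cx) / focal_length_px   # horizontal component of the ray
    dy = (y - cy) / focal_length_px   # vertical component of the ray
    return (dx, dy, 1.0)              # one pixel, one ray, one bit of scene

# Every unit of the index has exactly one answer to "where did you come from?"
print(trace_raw_pixel(1024, 768, focal_length_px=4200.0, cx=2016.0, cy=1512.0))
```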
The point of the idea of "index" is that it exactly captures how much truth there is, notionally, in a photograph. A photograph faithfully and truthfully records certain tone and optionally color aspects of a scene at a moment in time, from a certain angle. A photograph is not isomorphic to the world, but it is isomorphic to that, and that is what makes photos photographs and not paintings, not drawings, not whatever else you might imagine.
It is from this extremely limited isomorphic correspondence that the so-called Truth Claim of photography follows, which in its most straightforward expression is that a photograph is what it manifestly is -- an index.
All this shit is basically tautological. An index is more or less defined to be whatever the hell it is that a photograph is, and the "truth claim" is that a photograph is an index. Wow. Deep. But the point is to pick apart what the hell it is that makes a photo not a painting. That picking apart actually does turn over a few interesting rocks. So, attend to the rocks we just turned over, above, rather than the circle of definitions.
Enter machine learning, neural networks, and contemporary high end phone cameras.
Google and Apple at least, and the rest cannot be far behind, have built cameras that use one or more physical camera modules which, depending on mode, each take one or more photos in the sense of indexes, and dump this pile of bits into a thoroughly trained neural network to produce an output JPEG (or raw) file.
Now, I ain't no expert on neural networks, but I am pretty confident of this: You shove a bunch of data in, and a bunch of data comes out. The stuff that comes out is strongly and... somewhat predictably... related to what went in. But the relationship is very holistic. There is no "this pixel came from..." in play here. Each individual pixel of output comes from all the pixels input, in ways that are often profoundly not obvious.
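If you want that in code rather than words, here is a toy numpy sketch, not any real camera pipeline, of the difference between a per-pixel merge (indexical) and the everything-mixes-into-everything merge a trained network effectively performs. The random weights stand in for whatever training baked into the network.

```python
import numpy as np

rng = np.random.default_rng(0)
burst = rng.random((4, 8, 8))   # pretend burst: 4 noisy frames of an 8x8 sensor

# Indexical processing: each output pixel depends only on the same pixel across
# the burst. You can point at any output value and name its exact sources.
indexical = burst.mean(axis=0)

# Network-style processing (stand-in for a trained model): every output pixel
# is a weighted mix of *all* input pixels in *all* frames.
weights = rng.random((8 * 8, 4 * 8 * 8))
weights /= weights.sum(axis=1, keepdims=True)
holistic = (weights @ burst.reshape(-1)).reshape(8, 8)

# indexical: each output pixel has 4 identifiable sources.
# holistic:  each output pixel has 256 sources, mixed in ways nobody can point at.
```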
What comes out of these modern AI-driven cameras is not an index.
It looks like a photograph, it looks like an index. But it's not. And there are tells. Girls look prettier, with fuller lips and larger eyes. The tonal range looks a little funky, because it's actually built from a shitload of pictures made in total darkness. The colors are almost psychedelic. Whatever. Occasionally there are hilarious fails where the AI gets confused and renders everything as heaps of kittens. Ok, not yet, but we all know they're coming, don't we?
Ok, so what? We're pretty used to doctored pictures these days, right?
Here's so what: In five years a device that actually produces a photographic index is going to be a special purpose device, purchased for that purpose, by retro weirdos who produce very few photographs. Virtually every photograph we see five years from now will not be an index in any meaningful fashion.
When photography was introduced, it was a bit of a surprise that the indexing properties turned out to be important. Sure, some people quickly (10-20-30 years) skipped past "it's an easy way to paint" to "this could be used to teach, or as evidence, or something." I'm not sure anyone saw the general public trust in photos coming, though.
Now we see that trust being thrown away not merely here and there as a conscious choice of the Photoshop-hero, but as a basic operating principle of the actual devices we call "cameras", and it's not clear what the long term effects are. Let's repeat that: Photographs, as cultural objects, are on the cusp of collectively ceasing to be indexes. The circular mess of definitions we started with is about to collapse.
We're already seeing people experiencing body issues, because not only are all the photos of everyone else online weirdly beautified, but the photos they see of themselves on Snapchat or whatever are beautified by the service too.
What happens when it becomes literally impossible to get a straight photograph of oneself?
What happens when journalists ("citizen" or standard-issue) are all using AI-powered phones? Is Putin really sneering or did the AI just try to make him handsome?
Will someone start building special "forensic" cameras to produce photographs admissible as evidence? What happens in the inevitable 20-year interval while the legal system catches up, and people are getting convicted on the basis of AI glitches (or freed on the basis of postulated AI glitches)?
What other social consequences will there be?
Me? I am perfectly aware that I am a fat, old, white man. There's basically no way to worsen my body image, but what about all the young pretty people? Why won't anyone think of the kids?
OMG. You are a scary guy, but since a long time ago someone predicted that NYC would be covered in some 12+ inches of horseshit because of the number of horses in NYC at the time, I am going with extra-prevailing intervening factors.
And if I'm wrong the social consequences won't change; it will still be right vs. wrong.
Actually, I *think* your index should properly be called an 'icon'. And as long as the image that comes out of these new-fangled devices is recognisably of Person X, it remains an icon, however untruthfully svelte X has become. At least we will still have mirrors, CCTV and passport photos to drag us back to reality.
I am also an old white man, but (like you, I am sure) I look better and less unkempt some days than I do others. Or so my wife says.
A recent piece in the New Yorker grapples with this too (Joshua Rothman's article "In the Age of A.I., Is Seeing Still Believing?"). Reading your post and thinking about Rothman's article initially sent me down the doom and gloom road. But I'm now thinking the loss of the index, as you put it, is really just taking us to paintings.
Nobody ever thought of paintings and drawings as being indexical. Even in the case of courtroom artists, we understand that the sketch of the accused is an artist's interpretation that is meant to be "faithful" but not absolutely realistic.
If you're right, this is where photography is going. I don't think it will be 5 years, but relatively soon we will be in a time when nobody except for the very clueless thinks a photograph is "real". Let's remember too that before photography nobody expected to see an image of anything that was indexical. So maybe there's some potential for understanding the future by looking at the past.
The future you've described will pose some practical challenges, because we've come to count on the indexical nature of photography for a lot of practical reasons (photojournalism, evidence in court, science, etc.). I don't think as a society we can just give that up. Therefore, I'm hopeful that some clever people will figure out reliable ways to distinguish "straight" (indexical) photographs from made-up, computational (or whatever we call it) photographs. Perhaps there will be a reliable certification system.
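For what it's worth, here is a toy sketch of what capture-time certification might look like: the camera signs the raw bytes with a key baked into the hardware, and anyone with the matching public key can later check that the file is bit-for-bit what the sensor produced. The key handling, the Python `cryptography` package, and the Ed25519 choice are just my assumptions for illustration, not anything any camera maker has announced.

```python
# Hypothetical capture-time certification. A real system would also need secure
# key storage, timestamps, and a chain of trust, none of which is shown here.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

device_key = Ed25519PrivateKey.generate()   # would live in secure hardware
public_key = device_key.public_key()        # published or registered somewhere

raw_capture = b"...raw sensor bytes..."     # placeholder for the actual file
signature = device_key.sign(raw_capture)    # produced at the moment of capture

# Later, a court, editor, or scientist checks the claim "this is an index":
try:
    public_key.verify(signature, raw_capture)
    print("verifies: unmodified since the sensor")
except InvalidSignature:
    print("does not verify: edited, regenerated, or from another device")
```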
I'm not sure what's going to happen in the non-evidentiary, non-scientific realms. Advertising will be in big trouble if nobody believes what they're seeing. We already don't believe what we're seeing in advertising, but some people still expect some degree of truthfulness. Good luck to advertisers when even that thin crust of truthfulness is gone. Photojournalism is in the exact same boat.
What's this all going to mean for art? Artists who use photography won't care; they already don't care about indexicality. But photographers who think of themselves as artists will be in a pickle. We're already seeing competitions that have explicit categories for "straight" photography and "manipulated" (with all the ambiguity that exists there). Probably the kinds of certification that will be needed in photography for evidentiary and scientific purposes will be mobilized there too.
Anyway, this is certainly an interesting time for photography.