Saturday, November 9, 2019

Representation and Computer Vision

Another stuffy theory thing, sorry.

To review, representation is roughly the opinion of the subject posited by a photograph. This man is cruel. This flower is beautiful. Etcetera and so forth. The "politics of representation" is a topic of a certain amount of interest in the photographic academy: these are the issues surrounding who gets to decide what those opinions are.

The usual material in play here is the idea that white, privileged, photographers (or their functionaries) represent less privileged people in negative ways. We get things like primarily white, male, photojournalists going to Africa and coming back with endless pictures of Africans As Victims (of war, of famine, of Various Strife.) Now, this point of view ain't wrong. It is manifestly correct to some degree or another.

This set of ideas is generally harnessed to the related idea that if only we had more women, more people of color, and so on, taking pictures, we would be getting different pictures with different representations out, and the world would be a better place. This is, well, again there is clearly some kind of truth to this, I think. Not that women and people of color necessarily take different pictures, because manifestly they do not; we have more or less infinite quantities of data which suggest that they take the same pictures. But there does anyways seem to be something there which we can't quite get hold of.

At least part of what's going on here is that this sort of auteur theory of photography is bankrupt, and obviously so. The communication of a photograph's representation is a string with two ends: the photographer, and the viewer. I do not wish to say anything so cheap as "if you see racism in the photo, then maybe it's you who's the racist!!!!!" but there's something like that in there.

A photograph derives meaning from the cultural milieu in which it is viewed. Also, and not entirely unrelatedly, it is made within a cultural milieu (sometimes the same one it is viewed in, sometimes not). It is not enough to send a black guy to Africa to cover the war. If that's all you do, you're likely to get much the same photos back, and you're likely to read them in the same way. If you want different pictures, with different representations, there is an entire surrounding culture that needs to change.

As a sort of interesting case study here, consider Computer Vision.

In this case, the seeing end of representation takes place in a completely different and alien world, the computer. The other end of the string isn't a person at all, but a computer program.

A friend of mine relates this tale: he was driving along, and got pulled over by some fairly tense cops. After a little while he learned that his car's license plates had been read by the cops' camera, digitized, and run through a database, and had popped up as "stolen car." This was very odd, because my friend had not stolen this car; it was his, and those plates had been on it for some time. After a long and fairly tense interaction, the cops relaxed and came back in some emotional state between sheepish and grouchy. The license number did indeed correspond to a stolen car, but the state was wrong. In the USA two cars registered in two different states may indeed have the same license number, because each state issues its own license plates.

So what? Well, what we have here is a photograph, which depicts some license plate with a number on it: 123 ABC. The system the photograph was "seen" by posited a representation, an opinion of that photo, namely that it was the license plate of a stolen car.

The system's method of representation was, quite literally, wrong. Some moron forgot to code the state identifier into the database (at all? properly? who knows), so the representation of the otherwise perfectly valid photograph was wrong, potentially dangerously so for all involved. There's nothing wrong with the photo; it does not itself inherently code "stolen," and the photographer did not intend to insert the meaning "stolen," but the seeing of the photograph produced such a representation.
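For the programmers in the audience, here is a minimal sketch of the failure as I understand it. The state codes, the plate number, and the shape of the "hotlist" are all invented for illustration; the point is only that a lookup keyed on the plate number alone, rather than on the (state, plate) pair, produces exactly this kind of false hit.

```python
# A toy version of the lookup bug described above. The hotlist, the
# state codes, and the plate number are all hypothetical; a real ALPR
# system is far more elaborate.

# The hotlist as it was apparently keyed: by plate number alone.
buggy_hotlist = {"123 ABC"}  # a stolen car registered in some *other* state

def check_plate_buggy(plate: str) -> bool:
    # Matches "123 ABC" from any state: false hits across state lines.
    return plate in buggy_hotlist

# Corrected: key on the (state, plate) pair, since each state issues
# its own plates and the same number can exist legitimately in two states.
fixed_hotlist = {("NV", "123 ABC")}

def check_plate_fixed(state: str, plate: str) -> bool:
    return (state, plate) in fixed_hotlist

print(check_plate_buggy("123 ABC"))        # True  -- my friend gets pulled over
print(check_plate_fixed("OR", "123 ABC"))  # False -- my friend's innocent car
print(check_plate_fixed("NV", "123 ABC"))  # True  -- the actual stolen car
```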

This is kind of an edge case, in which the relevant meaning arises after the photograph was taken; it arises entirely in the "mind" of the viewer, as it were. Obviously, a heavy-handed photographer can bash a meaning into a picture at the other end of the string simply by being extremely overt.

Normally, though, there is a kind of dance. The photographer tries something out, the viewer sees something, the two are related but not the same. Where the final meaning lands depends on both, and depends largely on the systems of meaning the making and viewing of the photo are embedded in.

Allow me to wrap up by reminding you of the work of Fabrice Monteiro, who I talked about in this essay. He's the fellow who photographs black models wearing horrendous instruments of slavery. Fabrice is trying something out, you're reading something, the meaning lands generally within a pretty narrow zone, but there's a fair bit of wiggle room even within that. Trying to guess what Fabrice is up to here is a bit fraught, and we're very likely to project our own ideas onto these pictures. Is he just messing with us? Is he trying to make a powerful statement about slavery? If so, is it simply "slavery bad" or is it more complicated? Is he talking about his own status as a guilty party, or as a victim by historical association? And what is his status, anyways? Is he descended from slavers, from slaves, neither, or both?

There is much we might read in these photos, and Fabrice strikes me as leaving the door pretty open.

Computer algorithms are perhaps special here in that they are far less likely to permit a nuanced reading. The car either pops up as stolen, or it doesn't, and consequences be damned. Still, they serve us handily in the role of illustrating that the string has two ends.

4 comments:

  1. Very interesting post! Reminds me of the street photography dilemma of making images of homeless people. Is there an ethical issue there or not? Is it exploitative in the absolute or just in the eye of the viewer?

    I’m not sure I have the answers. Ethics are hard!

  2. That story about the out-of-state license plate is hilarious. Sort of. If the courts treat AI as something that can be sued, maybe the creators will have a reason to fix the bugs, but if the AIs are above suspicion then it may be statistically convenient to just put that guy in jail. It all depends on how politically powerful the AI creators become. Remember how long it took to declare cigarette smoking to be dangerous, and those companies weren't nearly as big as Apple/Amazon/etc.
    There won't be many cases like that, it will only ever be a small stochastic error in the data, nothing to get excited over. So some poor sap ends up in jail for no reason or goes broke hiring lawyers, that's the price of freedom, I guess.

  3. Not THAT Ross Cameron, November 14, 2019 at 6:18 PM

    Thanks for that, an interesting perspective. Given your background, is it worth considering that code is designed and written by humans, and as such may reflect the biases of those humans (admittedly, a whole different topic)?
    Therefore, the reading of a photograph may well reflect biases inherent in the code. Or, there may be some unexpected edge cases whereby something new pops up, which may be interesting or irrelevant.
    Cheers

    1. Thanks! For my purposes, I felt it was instructive to kind of draw a line around the code itself.

      "Sure, it was made by someone, there were various forces that caused it to assign meaning in this way, much as there were forces that caused you to assign meaning in this way or that way, but let us set that aside and (for the moment) consider that meaning *is* assigned *in* *this* *way*.)

      Which is not to say you're wrong. The study of those shaping forces, be they a program's programmer, or your mom and dad, is also a worthwhile subject.
