Breaking: Google Street View Assassinates Photography [via Art Fag City]


One year ago, I started collecting screen captures of Google Street Views from a range of Street View blogs and through my own hunting. This essay illustrates how my Street View collections reflect the excitement of exploring this new, virtual world. The world captured by Google appears to be more truthful and more transparent because of the weight accorded to external reality, the perception of a neutral, unbiased recording, and even the vastness of the project. At the same time, I acknowledge that this way of photographing creates a cultural text like any other, a structured and structuring space whose codes and meaning the artist and the curator of the images can assist in constructing or deciphering. – Jon Rafman

I suppose this was inevitable, and it has been touched on already, but this essay and sequence of images should make most serious photographers run for the hills and think hard about their craft.

IMG MGMT: The Nine Eyes of Google Street View [Art Fag City]

  • ian aleksander adams

yeah, you can theoretically automate an edit based on certain aspects, say naked women (using tweaked human-recognition software) or maybe even fog.

But I’m guessing we’re a long way off from software that could produce a satisfying order given basic terms for a series, say:

    Naked girl photo, fog photo, tree photo, then naked man photo (preference: centered, desaturated, deadpan)

I think a lot of the art of actually making a zine or a book comes from making unexpected or harmonious choices that are very hard to program or even explain in words, let alone math (and pretty much everything has to be boiled down to math at some point if you’re talking about doing it with computers). You could do it via some sort of tagging or rating system, but then you’re back at the human step anyway.

  • Bryan Formhals

    Oh, I agree. I mean, we wouldn’t see any of these photographs unless he determined they were the images that were the most interesting.

However, don’t you think photographic aesthetic sensibility can be measured? Within a frame there are shapes that can be measured. Patterns can emerge about which proportions and shapes are most frequent in the canon of photography. Patterns about sequencing can be measured. All this data could eventually be put into a database and used by sophisticated software to analyze the images Google Street View is kicking out, and then this software could create an edit based on the patterns humans have determined to be significant or interesting photographic aesthetics. This may sound far-fetched, but I think math plays a large role in photographic aesthetics.
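As a toy illustration of the idea above, here is a minimal sketch of an "automated edit" driven by one measurable compositional pattern: how close an image's subject sits to a rule-of-thirds intersection. Everything here is a hypothetical assumption for illustration (the heuristic, the function names, the sample data), not a description of any real system.

```python
def thirds_score(cx, cy, width, height):
    """Score how close a subject centroid (cx, cy) lies to the nearest
    rule-of-thirds intersection; 1.0 means exactly on an intersection."""
    points = [(width * i / 3, height * j / 3)
              for i in (1, 2) for j in (1, 2)]
    # Normalize by the frame diagonal so frames of any size are comparable.
    diagonal = (width ** 2 + height ** 2) ** 0.5
    nearest = min(((cx - px) ** 2 + (cy - py) ** 2) ** 0.5
                  for px, py in points)
    return 1.0 - nearest / diagonal

def rank_images(images):
    """images: list of (name, centroid_x, centroid_y, width, height).
    Returns the list sorted by descending thirds score -- a crude
    machine 'edit' computed purely from measurable shape data."""
    return sorted(images,
                  key=lambda im: thirds_score(*im[1:]),
                  reverse=True)

# Hypothetical sample data: centroids of a detected subject in each frame.
candidates = [
    ("fog_photo.jpg", 640, 360, 1280, 720),   # subject dead center
    ("tree_photo.jpg", 427, 240, 1280, 720),  # subject near a thirds point
]
edit = rank_images(candidates)
print([name for name, *_ in edit])  # the off-center composition ranks first
```

A real system would of course need vision software to find the centroids and far richer features than one heuristic, which is exactly where the human tagging step tends to creep back in.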

  • ian aleksander adams

The only problem is that automated cameras don’t have an aesthetic sensibility when it comes to editing their own work.

Security cameras have been around for a while, but they don’t release tear-jerking epics very often.

I think the interesting part of making photographic work has always leaned heavily on editing, sequencing, and deciding what about the image is important (something Rafman is still playing a huge part in here). Stuff like this only drives that home for me.