Flickr Content “Filters”

March 24th, 2007

[Photo: “open gate”]

A new feature launched this week: content filters. (No, not those filtrs, alas.) And for a feature that quietly but seriously rocks, it got kind of a “meh” response from the blog community at large.

Partly the official blog post is a bit TL;DR (hey, it’s complicated!), but more to the point, I don’t think we’re culturally prepared for progress on issues like censorship; things are supposed to only get worse.

So let me put it this way:

Upload your screenshots!

Your SL avatars, your historical anthropology of web design, your worst-UI-evar samples.

Upload your art!

Your paintings, your drawings, your renderings, your Photoshop tennis.

Upload your photos!

Including your arty nudes and, um, your less-than-arty nudes.

Murky Light Beams of Clarity!

NIPSA (“Not In Public Site Areas”) was a tricky and subtle tool for managing the delicate balancing act that is community norms, but it was largely opaque. Content filters are transparent (translucent?) and designed for use by the community to self-negotiate those norms.

Tools like tags, groups, and sets already let people explore and classify the various vectors of their self-expression in all its crazy and complex permutations. Content filters take that landscape of expression and add new dimensions to it. (Hyper-cubic landscapes of expression for the Web 3.0 win! You heard it here first!)

Just do us a favor: tell us they’re screenshots, tell us they’re illustrations, tell us they’re arty.

Photo by justinlincoln

Update 2007/6/13: reconsidering.

3 responses to “Flickr Content ‘Filters’”

  1. Mike says:

    I think we’ll finally make progress on image categorization when we have suitable tools that do it for us. ‘Trusting’ the community to adequately categorize its content for global cultures is merely a stopgap. There’s nothing magic, or even approaching magic, that I can see in a casual read: they’re just user tags/flags/bits set by the content creator. And content creators are rarely qualified to answer such deeply pondered questions as cultural difference, where (for instance) a navel can be considered obscene to a particular viewer.

    There are (sadly) only a couple of people actually working on image recognition toward this goal, i.e., “this picture has a navel in it, but no uncovered bosoms.” It currently takes a helluva neural net to pull this off, but considering the huge amount of sample data available on the internet today, the day is not far off when we can make this distinction reliably. Some company is eventually going to offer this rating service and make a gazillion bucks doing it. Simply slapping another flag/tag/bit on a piece of data is hardly the stuff of Web 3.0. But what could lead us to Web 3.0 is using the internet audience to classify all of our sample data: what objects, etc. are in all these pictures? Then, when these are crunched through a neural net, its efficacy can easily be checked and adjusted if necessary. Not just “is it questionable or OK,” but literally, what is this a picture of…

  2. kellan says:

    Hey Mike,

    Have to say I’m 180° from you on this one. A computer’s ability to recognize that a photo contains a navel/bosoms is never going to be useful in navigating community norms; the only possible approach is to give your community (which is an inclusive but defined body, not a generic concept) tools for regulating and communicating. Trying to reduce something so complicated to pixel-recognition rules is doomed to failure.

    And any and all references to Web 3.0 on this blog should be taken as tongue in cheek :)

  3. Bob Hooker says:

    What is so great about Flickr filters? I fail to see a case made here.

    The way it works is this. First, everyone is set by default to the status of a child. When you join Flickr there is nothing to tell you that you have been set to see only safe material. If by some accident you find out that you can only see safe stuff (say, you see the boring pictures that are safe and not the cool pictures people want), you have to learn that it is a simple matter of going to You > Your Account > Permissions and Privacy (but of course, it’s so obvious), going to the bottom of the page, knowing the setting is called “safe search” (with no explanation of what it is), and changing it, logging in again along the way. Flickr does not ask you to confirm your age; anyone can see anything as long as they get a Yahoo email account and know this trick.

    And when you start an account you are blocked from public searches until a Flickr staff person takes the time to check your account and validate that it is good. The ground rules are not precisely defined: it seems that failing to think about children and grandparents counts against you as a rule, and complaints from users can be automatic grounds for restrictions. So what if someone is posting protests against Bush?

    It would have been interesting to read a formed idea or concept in this blog post.