So I've been fucking around, testing Google's purity filters some more—seeing what it rejects, what it accepts, and what changes it makes without instruction.
Every time I give it a photo with a lot of skin showing, it rejects it.
If I give it a photo with underwear or lingerie showing, it tends to cover me up more (e.g. it zips up jeans, buttons up shirts, makes fishnets opaque, etc.).
It almost always makes me look less like a tomboy, often enlarging my breasts (even while hiding them away).
This time it accepted four photos (out of well over a dozen submitted) from three different photoshoots: two with my face covered and two with my face showing (which is the closest I've come to a face reveal so far, I guess 😋). *More details in alt-text.
Conclusion: Google has baked a technology into its default Android photo gallery that (surprise) reinforces unrealistic ideals of beauty while simultaneously treating feminine bodies as inherently sexual and in need of censoring.
Follow-up: I'd like to see folx with other body types, gender presentations, and styles test the edges of what Google Photos' "Remix" and "AI Enhance" features will accept, and what unrequested changes they make to those images. Do they censor topless men? Do they lean into racist stereotypes? Do they make thinner femme folx more curvy? Do they make curvier folx thinner?
If this tech is going to be crammed into everything, with kids, friends, and corporations all using it, we should understand the potential psychological effects and built-in biases we're likely to encounter with increasing frequency.
#AlicePics #AI #ForcedPurity #Testing