A really quaint idea from the ‘90s
You’d think this was satire but apparently it isn’t. Hold the presses: facial recognition technology “misgenders” trans and nonbinary people. You don’t say! How very shocking!
Yet, for all its advances, facial recognition technology—created by training computer vision algorithms on massive datasets of photographs of faces—might have a critical shortcoming: only being able to “see” two genders.
Sexes, you mean. The average differences in male and female faces aren’t a matter of gender but of sex. They’re physical. Vision algorithms don’t know from people’s inner feelings.
New research by Jed Brubaker, Jacob Paul, and Morgan Klaus Scheuerman (lead author) in the University of Colorado Boulder’s Information Science department reveals that many major facial recognition services misclassify the gender of trans and non-binary people.
Because they can’t see the magic gender, they can see only the sex. If you asked them to classify people’s politics or taste in food they would get that wrong too. If you want to check people’s invisible attributes, you don’t send facial recognition technology to do the job.
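For anyone who wants to see how little magic is involved: the kind of classifier the article is describing is just a model fitted to face photographs that somebody labelled male or female. Here is a minimal sketch, with random placeholder arrays standing in for real photos; it’s a hypothetical illustration, not the paper’s or any vendor’s actual code.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Placeholder "dataset": 200 fake 32x32 grayscale faces flattened into vectors.
# Real services train on millions of labelled photographs; the principle is the same.
rng = np.random.default_rng(0)
X = rng.random((200, 32 * 32))

# The only labels the model ever sees are the two the dataset builders assigned:
# 0 = "female", 1 = "male" -- judged from physical appearance at photo time.
y = rng.integers(0, 2, size=200)

clf = LogisticRegression(max_iter=1000).fit(X, y)

# Whatever face you hand it, the answer can only be one of the two labels it
# was trained on. There is no input column for how anyone identifies.
new_face = rng.random((1, 32 * 32))
print(clf.predict(new_face))  # -> [0] or [1]
```

The only thing to notice is that the output space is fixed by the training labels. A model fitted to two classes of physical appearance can only ever answer with one of them; there is nowhere for a self-declared identity to come from.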
“We found that facial analysis services performed consistently worse on transgender individuals, and were universally unable to classify non-binary genders,” said Scheuerman, who is also a PhD student in Information Science at CU Boulder, in a statement.
Oh for god’s sake! How would they be able to? How would anyone? Do you feel confident that you can pick the non-binary people out of a crowd? Of course not! It’s a self-applied label, not a concrete recognizable specifiable category.
At a minimum, this miscategorization has the potential to result in social discomfort and exclusion, reinforcing stereotypes that serve to “other” those who do not identify with the traditional gender binary.
“When you walk down the street you might look at someone and presume that you know what their gender is, but that is a really quaint idea from the ‘90s and it is not what the world is like anymore,” Brubaker said. “As our vision and our cultural understanding of what gender is has evolved, the algorithms driving our technological future have not. That’s deeply problematic.”
Oh yes, so quaint, from those stupid ’90s, which might as well be the 4th century. It’s like corsets, or crucifying rebels, or living in a pleasant villa in Pompeii.
The article mentions that trans men were classified as women only 38% of the time, which supports the claim that testosterone does a good job masculinizing female features. There’s no statistic mentioned for “misgendered” trans women, probably because it’s going to be high. Estrogen is not as powerful.
Even if they add in “non-binary” categories, other than “undetermined” I’ve no idea what the facial recognition technology would look for. Though now that this disturbing new research is out (100% non-binary misgendered!) watch for the non-binary to start cutting off their noses to spite their faces. Literally.
Who cares what gender a robot thinks you are?
Maybe the systems could be modified to also scan for “my pronouns are…” buttons?
chokelaugh Sastra.
“those who do not identify with the traditional gender binary”
That implicit idea that the rest of us DO ‘identify with the traditional gender binary’. I don’t identify with the fact that I’m a man any more than I identify with the fact that I have blue eyes, that I’m of European descent, or that I’m about 6 ft tall. It just is, and I can either let that determine a set of stereotypes I need to follow or I can try to leave those stereotypes behind.
Facial recognition technology doesn’t have a good track record at all, throwing up far too many false positives and, by extension, presumably missing plenty of true positives as well.
As with any human-created system, it will contain human biases. I have seen reports of some systems being far better with the average white face than with Asian faces.
Anyway, as the system is designed to match a face against an existing database of photos, where the bloody hell does gender or sex enter into it? You have a match or you don’t have a match, with a face, not with genitals or feelz.
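For what it’s worth, identification in these systems usually comes down to comparing embedding vectors against a threshold. A rough sketch of that match/no-match step, with made-up names and numbers rather than any vendor’s code:

```python
import numpy as np

def match(probe_embedding, database, threshold=0.6):
    """Compare one face embedding against the enrolled database.

    Returns the best-matching identity, or None if nothing is close enough.
    Note that no sex or gender field appears anywhere in the comparison.
    """
    best_name, best_dist = None, float("inf")
    for name, enrolled in database.items():
        dist = np.linalg.norm(probe_embedding - enrolled)  # Euclidean distance
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist < threshold else None

# Toy database of 128-dimensional embeddings. In a real system these vectors
# come from a neural network that turns each photo into a string of numbers.
rng = np.random.default_rng(1)
db = {"alice": rng.random(128), "bob": rng.random(128)}

probe = db["alice"] + rng.normal(0, 0.01, 128)  # a slightly different photo of "alice"
print(match(probe, db))  # -> alice (a match) or None (no match); nothing else
```

The comparison returns an identity or nothing. There is no sex or gender field anywhere in it.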
Once again, the whole TRA movement seems to be more about narcissism than anything else.
Admittedly, I didn’t read the study. Maybe if I had read it, this would be clearer to me.
What did the researchers think they were investigating? How could they have imagined the AI systems would be able to read people’s internal states, as opposed to the sizes and shapes of their facial features? (What in the world would it have meant if the AI had actually done that?!) Do the researchers find their results surprising?
Why do individuals care whether a computer has classified them as male or female or none of the above? How would people ever know they had been so classified? (Is there a right to have your internal conception of your gender recognized by a computer?)
Slightly off topic: The whole idea that, for instance, someone’s non-binariness is anything more than an aspect of their (and actually everybody’s) personality seems so silly to me. If you could travel 100 years into the future and discovered that everyone took for granted the idea that people come in 11 different genders, would that mean our knowledge of human existence had grown? Or would it just mean that our culture had changed, and people now accepted this once-strange idea?
In short: Huh?
Roj Blake #5 “Facial recognition technology doesn’t have a good track record”
My favourite: China’s Social Credit facial recognition cited Dong Mingzhu, head of Gree Electric, for jaywalking after matching her face in an advert on the side of a bus, and displayed her name and mugshot publicly. The neural network extracts features from pixels, and pixels don’t have sex or gender! There was also a recent brouhaha in the U.K. over biases in the AI used to check passport application photos.
Ben #6 “people’s internal states”
AI is being used to read facial expressions, e.g. in China to gauge children’s responses to lessons in school. Facebook’s VR tracks biometric data, including facial data, to detect depression, addiction, etc., for the micro-targeting of ads.
Identifying people’s intangible attributes – personality traits and the like – from face structure is a return to phrenology. These people want quackery, known and long debunked quackery… and they want it now, dammit!
My god, this ninny lives inside a cocoon.
John Wasson @8:
I should have been clearer. When I said “internal states,” I meant things like beliefs, predispositions, prejudices, etc.
Surely if a computer can detect and identify a person’s emotions, it’s doing it on the basis of the configuration of their physical features. Happy people look like this, sad people look like that.
@ Ben 10 (oooh spooky)
Except, as many of us depressives know, our depression can be internalised while our exterior appears top of the world. I used to serve around 100 customers a day in my shop and none of them knew when I was having a depressive episode. Not even my intimate partner could detect it unless I dropped a few clues.
I guess I have more in common with trans people than I realised – it’s all on the inside, nothing outward for people to detect.
@1 I got a book of Ted Chiang’s short stories for Christmas, and the last one is an exploration of beauty bias (presuming a way to ‘short-circuit’ the area of the brain that responds to attractive faces). One of the characters responds by going ‘radically ugly’–it’s implied that in this world there’s a trend in some subcultures to have one’s nose removed. The interviewee says something like ‘if I’m standing next to a supermodel, who are people going to look at?’
Having a melancholic disposition myself, I am at one with Roj Blake’s remark at 11. Yes, one puts on an act. There’s a poem by Stevie Smith entitled ‘Not waving but drowning’; I recommend it. Oh, it’s short, so I’ll reproduce it:
Nobody heard him, the dead man,
But still he lay moaning:
I was much further out than you thought
And not waving but drowning.
Poor chap, he always loved larking
And now he’s dead
It must have been too cold for him his heart gave way,
They said.
Oh, no no no, it was too cold always
(Still the dead one lay moaning)
I was much too far out all my life
And not waving but drowning.
I will also ring in on Roj’s comment, because I, too, suffer from depression. Because I work as a college instructor, I cannot let that show most of the time, so people are unaware most of the time how serious it is. My husband is often unaware, and to a great degree, my current therapist has not sussed out the seriousness of my most recent episode, the one I am currently experiencing, because even in therapy I have trouble “letting my hair down”, as it were.
A computer recognition program might not understand the difference between a happy smile, a sad smile, a sarcastic smile, or a forced smile (like so many women wear because they are sick of being told “smile” all the time). To figure out someone’s internal state requires a lot of signals, not just a facial expression. I once saw a psychiatrist (only once; he was doing rounds on call for my doctor) when I was in the hospital, and he couldn’t comprehend my body position, which was slumped forward, arms wrapped around my pillow, eyes down, hair covering my face, etc. I saw his notes later when I requested, and received, my medical records, and he was so, so, so unaware that I was an acute depressive. Wow. If a trained psychiatrist can’t do it, how could a computer program that doesn’t understand irrational human thought?
Exactly.
Rebecca Reilly-Cooper used to underline this point often in her essays on the subject. I remember the first time I saw it as something of a lightbulb moment.
It’s worth checking out the research paper linked from the article: it really goes above and beyond to run with the correct ideological positions… biological sex is effectively non-existent / not important, while gender identity and affirmation of it is the prime good, etc.
Reading it prompted a thought: let’s assume a picture of a self-identified male (i.e. “AMAB”) at 18 years old is “correctly” identified as “male” by some software (i.e. this individual accepts the classification as accurate at the time). If this individual subsequently identifies as “female” at, say, 30 years old, and further asserts that “I was always female”, then suddenly this classification flips from accurate to inaccurate… without the picture or the software changing in any manner.
Sounds workable…
It has a very Schrodinger’s Cat vibe to it.
I expect that the facial recognition bods will have to come up with a term to indicate how “mannish” a face looks – without regard to the person’s actual sex and gender. After all, the main reason for trying to guess the sex of a face must be so you can describe it to somebody else – for ID it’s either a match or it isn’t.
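If they did go that route, the output would presumably be a continuous score used purely for description. Something like this hypothetical sketch, where the score and thresholds are invented for illustration:

```python
# Hypothetical "facial masculinity" score, a number in [0, 1] derived from
# measurable geometry (brow ridge, jaw width, and so on). Purely descriptive:
# it says nothing about the person's actual sex, let alone their gender.
def describe(face_masculinity_score: float) -> str:
    if face_masculinity_score > 0.7:
        return "masculine-looking face"
    if face_masculinity_score < 0.3:
        return "feminine-looking face"
    return "androgynous-looking face"

print(describe(0.82))  # -> "masculine-looking face"
```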