Zuckerberg torches girls’ egos
Instagram might have something to do with it.
According to internal research leaked to the Wall Street Journal (WSJ), the app has made body image issues worse for one in three girls, and in one Facebook study of teenagers in the UK and the US, more than 40% of Instagram users who said they felt “unattractive” said the feeling began while using the app.
Instagram has more than 1 billion users worldwide and an estimated 30 million in the UK, with Kim Kardashian, Selena Gomez and Ariana Grande among the accounts with hundreds of millions of followers between them….
Two in five girls (40%) aged 11 to 16 in the UK say they have seen images online that have made them feel insecure or less confident about themselves. This increases to half (50%) in girls aged 17 to 21, according to research by Girlguiding in its annual girls’ attitudes survey.
Why would anyone want to look like Kim Kardashian though? She looks like a sex doll. She looks like someone who never does anything other than buff her appearance. Why don’t girls want to look like girls and women who do things, who are interesting, who have something to say?
Facebook’s in-depth research into the photo-sharing app stated that Instagram had a deeper effect on teenage girls because it focused more on the body and lifestyle, compared with TikTok’s emphasis on performance videos such as dancing, and Snapchat’s jokey face features. “Social comparison is worse on Instagram,” said the Facebook study.
Pretty much what I’m saying. Dancing, jokes, soccer, poetry, running, astronomy, rock climbing, painting – a million things other than having a Barbie doll face and tits. Why can’t girls just be people?
Beeban Kidron, the crossbench peer who sits on the joint committee into the online safety bill and was behind the recent introduction of a children’s privacy code, says Ofcom, the UK communications watchdog, will have a vital role in scrutinising algorithms.
“The value in algorithmic oversight for regulators, is that the decisions that tech companies make will become transparent, including decisions like FB took to allow Instagram to target teenage girls with images and features that ended in anxiety, depression and suicidal thoughts. Algorithmic oversight is the key to society wrestling back some control.”
It makes perfect sense that it’s Facebook, of course. Facebook started out as literally face book – Zuckerberg’s judgments on his Harvard classmates who had the bad luck to be female.
For some reason, Instagram throws a lot of feeds of young models my way. I’m really only interested in pets, bands, flowers, and what my actual friends share. I’ve been scrolling past them, but after reading this I’ll choose to hit the “not interested” option.
As a technical point, algorithmic oversight is effectively impossible. The algorithms in question are most likely based on machine learning (what we used to call AI), which means that the harmful stuff isn’t coded in as a set of explicit rules; it’s an emergent result of loftier-sounding goals (maximising metrics like engagement and profit), people’s behaviour, and general rules of how self-organisation works (e.g. the ‘rich-get-richer’ dynamics of popularity).
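To see why there’s no single rule a regulator could point to, here is a minimal sketch of that rich-get-richer effect. This is a toy simulation, not any real platform’s code: a feed that ranks posts purely by past engagement. Nothing in it says “promote this kind of content”, yet tiny early differences in clicks compound until the shares of attention drift far from equal. All names (`clicks`, `recommend`) are hypothetical illustrations.

```python
import random

random.seed(0)

# Five posts, all starting with the same engagement.
clicks = {f"post_{i}": 1 for i in range(5)}

def recommend(clicks):
    """Pick a post with probability proportional to its click count."""
    posts = list(clicks)
    weights = [clicks[p] for p in posts]
    return random.choices(posts, weights=weights)[0]

# Each time a post is shown, it earns another click,
# which makes it more likely to be shown again.
for _ in range(10_000):
    shown = recommend(clicks)
    clicks[shown] += 1

total = sum(clicks.values())
shares = {p: c / total for p, c in clicks.items()}
print(sorted(shares.items(), key=lambda kv: -kv[1]))
```

The lopsided final shares come from the feedback loop alone, not from any explicit decision in the code, which is exactly what makes the outcome hard to regulate at the level of the algorithm’s rules.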
So what are the regulators going to do? They can point to bad outcomes, but can they dictate how the algorithm or the way data is collected should change to prevent them happening again? I couldn’t… At least, not in a way that someone couldn’t easily find a way around.
Let me be clear: I think oversight and transparency of algorithms are vital. They’ve become a serious problem. But I think that problem is a hell of a lot bigger and more complicated than Ofcom thinks.
latsot, I have to admit, every time I see “Ofcom”, I think of The Handmaid’s Tale.