Honestly we should probably regulate these algorithms in general. People like Andrew Tate are a problem, but not the only problem.
My mother went down a conspiracy rabbit hole and never came back out again. You’d be surprised how short the pipeline from gardening, to arts and crafts, to crunchiness, to antisemitism, homophobia, misogyny, new world orders, and all that bs is.
Covid was especially bad. It’s a very short online road from “Does the Covid vaccine have side effects?” to having opinions about Hunter Biden or that Ukrainians had it coming.
A lot of it seems aimed at American politics, and can infect anyone who speaks English. Starting to think Kojima’s wackier MGSV plotline has some merit to it.
Not just English, sadly. It’s being translated into German too, and so my mother ended up going the same way. It was honestly surprising to hear her rant about George Soros, the Clintons, and Biden like that when she never cared about American politics before.
It is bananas, and this pipeline isn’t even purely alt-right, even though it leads into a lot of bonkers alt-right rhetoric and talking points. My mother (65 years old, white, cis-het woman, Swedish) has always been pretty liberal. She’s a nurse, with an education, though when I was little she changed gears and went into botany instead. She was always a bit crunchy, and I think that was the “open door” necessary for all this ridiculous propaganda to be let in.
When I grew up, she always had LGBTQ+ friends, like her closest friend is a lesbian potter/bus driver. Wonderful woman. Now she posts homo- and transphobic propo videos on her Facebook page, spreads PragerU bullshit, and all sorts of other ridiculous things.
Don’t get me wrong, she was never an unproblematic person, and even without all this weird alt-right radical propo I’d still not be in touch with her. I just don’t understand how her personal moral values could have been subverted like this.
On the topic of Kojima, you might be pretty close to the mark, as there is a through line from MGS2 as well: themes of misinformation, fake news, and the problem of having too much information for any single person to parse. The guy was way ahead of his time.
That’s about 9 posts on a normal reddit thread…
There’s lots of women that have been misled into transphobia out of concern for their rights. I know a few people this happened to and now they’re full on TERFs.
It’s like these algorithms radicalise people.
It’s not just that it seems like they radicalise people; they are 100% designed to make people angry and radicalise them, because that drives more and more clicks.
The crunchy-to-alt-right pipeline isn’t a long one, and it shouldn’t be too big a surprise, considering the Völkisch movement that laid a lot of the groundwork for the Nazis had a lot of people involved in ideas of health, subsistence agriculture, and the occult.
From Wikipedia:

The movement combined sentimental patriotic interest in German folklore, local history and a “back-to-the-land” anti-urban populism with many parallels in the writings of William Morris. “In part this ideology was a revolt against modernity”, Nicholls remarked. As they sought to overcome what they felt was the malaise of a scientistic and rationalistic modernity, Völkisch authors imagined a spiritual solution in a Volk’s essence perceived as authentic, intuitive, even “primitive”, in the sense of an alignment with a primordial and cosmic order.
It’s absolutely bonkers to me that it always boils down to antisemitism at some point along the way. It’s always misogyny, LGBTQ+ phobia, racism, and antisemitism. The last one always tends to tag along on the tail-end of the others: it’ll start off as “trans people preying on children”, go into “building shadow governments to take over the world”, and then it’s always “funded by the Jews.”
It doesn’t make any sense to me. Why specifically Jewish people?
People promoting an us-vs-them narrative tap into a primal tribal undercurrent: migrants, Jews, various colours of skin over the years, the Irish, Gypsies, and on and on. Often it’s less important who, just as long as you have someone to blame.
The big question is how? The algorithms aren’t the root cause of the problem; they’re just amplifying natural human behaviour.
People have always fallen down these rabbit holes and any algorithm based on predicting what a person will be interested in will suffer a similar problem. How can you regulate what topics a person is interested in?
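The rabbit-hole dynamic described above has a well-known mechanical core: any recommender that reinforces whatever gets clicked behaves like a Pólya urn, where a little early engagement snowballs into a dominant topic. A toy sketch of that feedback loop (all topics and numbers invented for illustration, not any real platform’s system):

```python
import random

# Toy model of an interest-predicting recommender. Each view of a topic
# increases that topic's weight, so whatever the user happens to engage
# with early gets recommended more and more often.
random.seed(0)

topics = ["gardening", "crafts", "wellness", "politics"]
weights = {t: 1.0 for t in topics}  # no stated preference at the start

def recommend():
    # Sample a topic proportionally to its learned weight.
    return random.choices(topics, weights=[weights[t] for t in topics])[0]

for _ in range(1000):
    topic = recommend()
    weights[topic] += 1.0  # every view reinforces the same topic

# After many iterations, the feed is typically skewed far from uniform,
# even though the user never expressed a preference at all.
share = {t: weights[t] / sum(weights.values()) for t in topics}
print(share)
```

The point is that the skew is a property of the feedback loop itself, not of the user: run it again with a different seed and a different topic can come to dominate.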
Do we need algorithms that predict what we’re interested in though? At what point do we go “ah this is actually causing more trouble than it’s worth?”
I’d be perfectly fine browsing content by category rather than having it fed to me based on some black-box weighting system with no clear way for me to correct it. I mean, it works great here on Lemmy.
Lemmy literally has an algorithm to rank posts. Or do you sort your posts by new?
What would you propose for YouTube?
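Worth noting the difference in kind, though: Lemmy’s “hot” sort is an open score-plus-time-decay formula, not a personalised black box. A rough sketch of that style of ranking (the constants below are illustrative, not Lemmy’s exact ones, which vary by version):

```python
import math

def hot_rank(score: int, hours_old: float) -> float:
    """Score-plus-time-decay ranking, in the spirit of Lemmy's 'hot' sort.

    The key property is that it depends only on public votes and post age;
    there is no per-user behavioural profile anywhere in the formula.
    """
    return 10000 * math.log10(max(1, 3 + score)) / (hours_old + 2) ** 1.8

# Time decay means a fresh post with modest votes can outrank an older
# post with far more votes.
print(hot_rank(score=10, hours_old=1) > hot_rank(score=100, hours_old=24))
```

Anyone can read, audit, and predict a formula like this, which is the real contrast with an engagement-optimised recommender.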
My theory is society has a suppressing effect on these things… It’s not nice to be a Nazi, or to mistreat people you don’t like, so these things get hidden.
Algorithms do the opposite. Now someone with Nazi tendencies is surrounded by them and encouraged. Posts hating trans people get pushed by algorithms because they drive engagement (even if all the initial responses are negative, it’s still engagement to the algorithm, which will then boost the ‘popular’ post).
Things like Lemmy and Mastodon don’t do that and end up nicer places as a result.
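The “negative engagement still counts” point is worth making concrete. A toy engagement score (the weights are entirely made up) shows how a metric that can’t see sentiment ends up promoting outrage:

```python
# Toy illustration: an engagement-maximising ranker has no notion of
# sentiment, so a post that provokes a pile of angry replies can score
# higher than one that quietly pleases more readers.
def engagement_score(post: dict) -> float:
    # Replies weigh heavily because they predict further activity;
    # whether they agree or object is invisible to the metric.
    return post["likes"] + 2 * post["replies"] + 0.5 * post["shares"]

hateful = {"likes": 5, "replies": 50, "shares": 10}   # mostly outraged replies
pleasant = {"likes": 30, "replies": 5, "shares": 5}

print(engagement_score(hateful) > engagement_score(pleasant))  # prints True
```

Here the hateful post scores 110 against the pleasant post’s 42.5, purely because outrage generates replies.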
@dojan @Mr_Will
I’d recommend you read ‘Weapons of Math Destruction’.
Algorithms are usually developed with the best of intentions but no one really knows how they will behave out in the wild.
#algorithms
Thank you! I looked it up, and it sounds really interesting. Will have a deeper dive into it!
My dad went from militant anti-theist to parroting christo-fascist talking points about “wokeness” surprisingly quickly.
I guess the common through line is bigotry. Whether it’s directed at Christians, Muslims, women, gays or trans, it is all the same to him.
It still seems strange to me that he’ll hate on the church and then go carry its water in hate campaigns anyway.
My parents went on the hippy -> QAnon journey and never came out of it. It’s insane to me, but their generation just doesn’t really understand the internet and is very susceptible to being led.
I agree, anything online seems to be a potential gateway to some iffy content. I sometimes watch things on YouTube, and despite never watching anything even vaguely political, I regularly see alt-right videos pop up in the recommendations.
These platforms only care about increasing engagement, and that kind of stuff seems to hook people, whether it draws them in through sympathy or outrage. I’m not sure how well this can be effectively regulated however.
This happens to me on YouTube so frequently it’s, frankly, pathetic. The attempt at polarization is so heavy-handed it’s depressing that people are getting sucked in.
The difficult question is how to decide which opinions are acceptable and which ones should be banned.
I don’t think it’s safe for a government to be in charge of banning certain political opinions, even ones you personally disagree with.