> I would agree the Molly Russell case could be attributed to online words in a direct sense. But those directly to blame are the evil people who have posted the nonsense. Not Twitter itself.

Twitter published it, and Twitter's algorithms pushed the dangerous content onto her feed. They have responsibility for it.
> The radicals will just continue regardless of any laws anyway because they are fanatics who already don't care about law and order or society at large.

So you haven't grasped that people can become radicals by exposure to material that they wouldn't otherwise have been exposed to.
> It's clear from your comment that you're not a supporter of free speech.

In case you hadn't noticed, your post's still here ;-)
> Twitter published it and Twitter's algorithms pushed the dangerous content onto her feed. They have responsibility for it.

The algorithm just gives you what it thinks you want based on previous interactions. If you click on 10 videos about cats, it will keep sending you cat videos.
Better moderation would have prevented her death.
Similarly, other people have been driven to commit crimes of violence that they wouldn't have considered if they hadn't been deliberately exposed to lies and malicious content.
So you haven't grasped that people can become radicals by exposure to material that they wouldn't otherwise have been exposed to.
> The algorithm just gives you what it thinks you want from previous interactions.

No it doesn't, and that's the problem.
> No it doesn't, and that's the problem.

I've had a Facebook account for years and don't get any irrelevant or offensive content sent to me. It has got to be to do with where you have gone on Facebook, and the algorithms pick up on your likes.
Read up on the Molly Russell case and you'll find that the algorithms pushed far more, and far more extreme, dangerous content at her than anything she started out looking at.
If you've a Facebook account you'll find that you get pushed content that you've never searched for AT ALL. Maybe your 'friends' have looked at it, or maybe just people in groups you've joined have engaged with it. But you still get video content pushed at you that is irrelevant and offensive.
> I've had a Facebook account for years and don't get any irrelevant or offensive content sent to me. It has got to be to do with where you have gone on Facebook, and the algorithms pick up on your likes.

It's more insidious than that. It's driven by what the people or groups you have connected to have looked at or been interested in. So if you read a post about stained glass, others who read that post might also have looked at church windows; others who looked at church windows might have looked at religious iconography; and others who looked at iconography might have looked at crucifixes. The algorithm might toss you a recommendation of crucifixions on the off chance you'll click, so they can monetise an ad. It's all about the probability of getting you to click. Of course, once you've clicked the clickbait you'll get a lot more of the same, plus even more of the seemingly random clickbait offerings.
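The chaining effect described above can be illustrated with a toy item-to-item co-occurrence recommender. This is only a simplified sketch with invented data (the topic names and user histories are made up for illustration); real platforms use far more elaborate models, but the hop-by-hop drift from your actual interest to something you never looked for works the same way:

```python
from collections import Counter, defaultdict

# Toy interaction data: which users viewed which topics (entirely invented).
views = {
    "you":   ["stained_glass"],
    "user2": ["stained_glass", "church_windows"],
    "user3": ["church_windows", "religious_iconography"],
    "user4": ["religious_iconography", "crucifixes"],
}

# Build item -> co-viewed item counts from every user's history.
co_viewed = defaultdict(Counter)
for history in views.values():
    for a in history:
        for b in history:
            if a != b:
                co_viewed[a][b] += 1

def recommend(seed, hops):
    """Follow the co-view graph `hops` steps from a seed topic,
    picking the most co-viewed unseen neighbour at each step."""
    path = [seed]
    for _ in range(hops):
        neighbours = co_viewed[path[-1]]
        # Avoid recommending something already on the path.
        candidates = [(n, c) for n, c in neighbours.items() if n not in path]
        if not candidates:
            break
        path.append(max(candidates, key=lambda nc: nc[1])[0])
    return path

# Starting from one stained-glass click, three hops reach crucifixes,
# even though "you" never viewed anything beyond stained glass.
print(recommend("stained_glass", 3))
# ['stained_glass', 'church_windows', 'religious_iconography', 'crucifixes']
```

The point of the sketch is that no step requires you to have shown any interest in the final recommendation: each hop is justified only by what *other* people co-viewed.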
> I've had a Facebook account for years and don't get any irrelevant or offensive content sent to me. It has got to be to do with where you have gone on Facebook, and the algorithms pick up on your likes.

Lucky you. It's pretty obvious from my feed that content is being pushed at me that I've NEVER looked for or engaged with.