Online Safety Act

However, blanket control of everything "online" is probably not going to be very helpful, as people who are up to no good don't care about rules and regulations anyway.
No one's 'taking control'. It's really just a poke for organisations to run online services safely and diligently.
Most of the tricky aspects that will need serious intervention apply to sites with over 7 million users. With fines of up to 10% of turnover, most small hobby sites won't have much, if any, exposure to financial risk.
Making sites operate responsibly can only be a good thing.
 
The most challenging bit might be age verification, but arguably if there is no dodgy content then that might not be needed - and the solution might simply be to ask for a 4,000 word essay on the benefits and alternatives in sharpening ;)
There is a member who could do that in his sleep. ;)

Oops, have I just committed an offence under the section "Offences relating to abuse and insults", if my comment is taken as such? :ROFLMAO:
 
No one's 'taking control'. It's really just a poke for organisations to run online services safely and diligently.
Most of the tricky aspects that will need serious intervention apply to sites with over 7 million users. With fines of up to 10% of turnover, most small hobby sites won't have much, if any, exposure to financial risk.
Making sites operate responsibly can only be a good thing.
There is a measure of control here. Currently it's not very invasive, but there could certainly be gradual, small increases to that control which may not be desirable and will be harder to spot.
Sites should ideally act responsibly, and most want to anyway.
I saw something the other day saying that YouTube could not possibly check even a small fraction of what gets uploaded: around 500 hours of video every minute, which works out at roughly 720,000 hours per day. It must be similar for Instagram and others. Given that, the task is basically impossible no matter what the regulations say.
Who is responsible for any given bit of content? Is Instagram or Twitter responsible for videos or comments uploaded, or the person who made and/or uploaded them? I tend to think the latter. The "platforms" are just the blank page or canvas.
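The scale point above can be sanity-checked with simple arithmetic. This assumes the widely cited figure of roughly 500 hours of video uploaded to YouTube per minute; the exact number varies by year and is only illustrative:

```python
# Sanity check of the upload-rate claim, assuming ~500 hours of video
# uploaded per minute (the commonly cited public figure).
hours_per_minute = 500
hours_per_day = hours_per_minute * 60 * 24  # minutes/hour * hours/day
print(hours_per_day)  # 720000 hours of new video every day

# To watch one day's uploads in real time, around the clock, you would
# need this many reviewers viewing simultaneously:
reviewers_needed = hours_per_day // 24
print(reviewers_needed)  # 30000 people watching non-stop
```

Even with generous assumptions, human review of everything is clearly out of reach, which is why automated flagging plus reactive takedown is the model the regulations lean on.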
 
I think there is generally a bit of an over-reaction to this...
A forum like this will count as a small, low-risk scenario, which means the list below, which boils down to:

- Someone responsible (admin / moderators / owner) - already have
- Ability to moderate content - built in
- Reporting and complaints - already have the ability to report
- Terms of Service - might need tweaking
- Ability to block users - built in

Not seeing anything terribly exciting, novel or new in that...

Governance and accountability
Individual accountable for illegal content safety duties and reporting and complaints duties

Content Moderation:
Having a content moderation function to review and assess suspected illegal content
Having a content moderation function that allows for the swift take down of illegal content

Reporting & Complaints
Enabling complaints
Having an easy to find, easy to access, and easy to use complaints system and processes
Appropriate action for relevant complaints about suspected illegal content
Appropriate action for relevant complaints which are appeals – determination (services that are neither large nor multi risk)
Appropriate action for relevant complaints which are appeals – action following determination
Appropriate action for relevant complaints about proactive technology, which are not appeals
Appropriate action for all other relevant complaints
Exception: manifestly unfounded complaints

Terms of Service
Terms of service: substance (all services)
Terms of service: clarity and accessibility

User Access
Removing accounts of proscribed organisations


The aim is to block:
- child sexual abuse
- controlling or coercive behaviour
- extreme sexual violence
- extreme pornography
- fraud
- racially or religiously aggravated public order offences
- inciting violence
- illegal immigration and people smuggling
- promoting or facilitating suicide
- intimate image abuse
- selling illegal drugs or weapons
- sexual exploitation
- terrorism

Do you really think that there is any such issue here?
All of that is already moderated out - and you are not required to never have it - simply to act fast to remove it.

The most challenging bit might be age verification, but arguably if there is no dodgy content then that might not be needed - and the solution might simply be to ask for a 4,000 word essay on the benefits and alternatives in sharpening ;)
You didn't mention Jacob...
 
The problem is that the legislation is wide open to interpretation. Many Off Topic 2 threads could easily fall into the prohibited category. The issue with all these initiatives is that policing usually starts with a report from an ‘offended’ member of the public, rather than any kind of monitoring.
 
There is a measure of control here. Currently it's not very invasive, but there could certainly be gradual, small increases to that control which may not be desirable and will be harder to spot.
Couldn't you say the same for any law that prohibits something?
I saw something the other day saying that YouTube could not possibly check even a small fraction of what gets uploaded: around 500 hours of video every minute, which works out at roughly 720,000 hours per day. It must be similar for Instagram and others. Given that, the task is basically impossible no matter what the regulations say.
Who is responsible for any given bit of content? Is Instagram or Twitter responsible for videos or comments uploaded, or the person who made and/or uploaded them? I tend to think the latter. The "platforms" are just the blank page or canvas.
I think the point would be that the law could be enforced in certain, key cases, not that all cases will be identified and acted on.
 
As it’s only if you rock up here to promote …

- child sexual abuse
- controlling or coercive behaviour
- extreme sexual violence
- extreme pornography
- fraud
- racially or religiously aggravated public order offences
- inciting violence
- illegal immigration and people smuggling
- promoting or facilitating suicide
- intimate image abuse
- selling illegal drugs or weapons
- sexual exploitation
- terrorism

that you’ll be impacted, I’m not sure what the problem is?
 
I'm a supporter of free speech, but not without responsibility and consequences.
There is too much toxic content on the internet and those responsible are not being held to account for either creating it or publishing it.
Some providers give an unmoderated platform to individuals who would never traditionally have it because everyone's voice SHOULD NOT be equal, and those too young and too undiscerning are harmed by it.
I haven't read the new legislation but I expect that I will broadly support it because I feel the pendulum is currently too far the wrong way.
At some point in the future, I may feel the opposite. That's normal. The happy medium isn't easy to find or to maintain for long.

It’s clear from your comment that you’re not a supporter of free speech.
Anyone who claims they are and then says ‘but’ is then going to say what they really think.
The ‘I support free speech’ before that is nothing but a throat clearing exercise to then launch into all the reasons why they want to control speech.
And I am then supposed to trust you with identifying the ‘middle ground’, given you can’t grasp the fundamentals of what free speech is?

If someone has a good idea and the other a bad idea, the middle ground is not neutral. Politics is about good or bad ideas and through free speech, we can discuss, argue, disagree on what those are.
This IS what makes us successful and separates us from the world.
It’s not always easy and it’s not always enjoyable but it is a mechanism in which we can speak truth to power.

There is no greater harm done to a people or society than by those removing freedoms in the name of doing good.
 
And all of those will get your content deleted and you permanently banned from this forum, regardless of legislation.
Exactly - I wouldn't want to be a part of a community that had that form of content...
As to what is Extreme - if it is published in a newspaper without wrappers in the newsagent, then arguably it should be okay - otherwise, not...
The biggest issue is likely to be hate. With the proliferation of NCHIs (non-crime hate incidents) being recorded by the police, it is not difficult to see that spilling over onto the internet: your choice of chisel makes me feel genuinely assaulted / my lived experience is insulted / you clearly hate me by buying those screwdrivers ;) (or work out the likely places where keyboard warriors fight each other verbally). In fact, if it stops that kind of posting, the overall quality of forums should go up!
 
It’s clear from your comment that you’re not a supporter of free speech.
Anyone who claims they are and then says ‘but’ is then going to say what they really think.
I'm a supporter of free speech (or rather, freedom of expression, which is the actual human right you mean in a UK context) BUT not extending that to permitting paedophiles to share CSA images.
 
I saw something the other day that was saying currently youtube could not possibly check even a small number of the 5 billion hours of uploaded video per day,
Except they already do: they run automated checks for music copyright infringement.
As one of the biggest players in the world for AI, Google will have to apply that tech to ensure compliance with UK law.

Worth noting that there's a lot in the Ofcom documents about site owners needing to remove offending content briskly once they're made aware of its non-compliance.
In the case of TikTok/X/Instagram, where content can get tens of thousands of views in minutes, they'll need systems in place to take down flagged content very quickly to reduce risk.
For small, low-traffic sites like UKW, you could achieve even lower risk by just having moderators remove content within 8 hours.
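For a small forum, the "remove within a fixed window" approach could be tracked with something as simple as the sketch below. This is purely hypothetical: the `Report` structure, the 8-hour window, and the function names are illustrative, not taken from any Ofcom guidance or forum software:

```python
# Hypothetical sketch: tracking reported posts against a takedown
# deadline, as a small forum's moderators might. The 8-hour window
# mirrors the suggestion above and is an assumption, not a legal rule.
from dataclasses import dataclass
from datetime import datetime, timedelta

TAKEDOWN_WINDOW = timedelta(hours=8)

@dataclass
class Report:
    post_id: int
    reported_at: datetime
    resolved: bool = False

def overdue_reports(reports, now):
    """Return reports still unresolved past the takedown window."""
    return [r for r in reports
            if not r.resolved and now - r.reported_at > TAKEDOWN_WINDOW]

# Example: a 9-hour-old unresolved report is overdue; a 2-hour-old one is not.
now = datetime(2025, 1, 1, 12, 0)
reports = [
    Report(post_id=101, reported_at=now - timedelta(hours=9)),
    Report(post_id=102, reported_at=now - timedelta(hours=2)),
]
print([r.post_id for r in overdue_reports(reports, now)])  # [101]
```

The point is simply that a reactive "report then remove" duty is cheap to operate at this scale, unlike proactive scanning.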
 
