UK Government's Online Safety Bill - regulator Ofcom responds

source link: https://diginomica.com/online-safety-bill-ofcom-responds

The UK wants regulators to support innovation and economic growth.

Meanwhile, the post-Brexit government claims it spurns red tape, in favour of light-touch rules that foster creativity and nurture prosperity.

Yet at the same time, Britain is introducing complex, wide-ranging legislation that may make it a more restrictive market than the US or EU.

At least, that's according to the tech platforms that are consulting with Whitehall on its plans. But they would say that; they’ve made billions of dollars from the status quo. Some corporations have market caps to rival UK GDP.

Image of a keyboard with the UK flag as the Enter key (Image by www_slon_pics from Pixabay)

Even so, factor in enough Brexit red tape to stretch from Westminster to Dover and a cynic might observe that Britain says one thing about sweeping bureaucracy aside, but increasingly does the opposite. The draft Online Safety Bill is one example. With supporting documentation, it runs to hundreds of pages that seek to minimize online harms and rein in the power of Big Tech – just the sort of thing Eurosceptics used to criticize the EU for doing.

It imposes a duty of care on internet companies, and few would argue that is unnecessary. However, the promised light touch is not in evidence: the Bill is complex, unwieldy, and in some cases contradictory. So much so that tech platforms told a Westminster eForum this week that they have yet to plough through the whole thing and understand its implications.

That might be grandstanding or (whisper it) 'Fake News', but equally it may suggest the Bill is too complex. That’s not to say its big-picture aims are wrong, of course. And it remains a work in progress, though the clock is ticking. The government wants to be seen to act.

So, what does communications industry regulator Ofcom think of the Online Safety Bill, and its own role when it becomes law? Richard Wronka is Director of Online Safety Policy at Ofcom. Speaking at this week’s eForum, he said:

We've heard concerns that tech companies haven't always prioritized safety. There's sometimes been a lack of clarity around how platforms’ decisions impact on user safety. And concerns have also been raised about how tech companies’ business models might, in some circumstances, cut across user safety, or that safety hasn't been designed into services from the outset.

So, this new regime provides a welcome opportunity for greater transparency and accountability within the technology sector. And through that increased accountability comes the prospect of a safer life online for the UK public.

However, we recognize that improving user safety cannot happen in a vacuum. It's critical that regulation also protects the rights of users in relation to freedom of expression and privacy, while also supporting competition and innovation.

Thus far, Wronka has merely described the size of the problem and the government’s ambitions to solve it. What about Ofcom’s emergent responsibilities? Although the detail of the Bill is still being debated, he said:

We can already speak confidently to the key principles that will shape our regulatory approach. First, it will be centered on internet users, both adults and children, and their rights online.

It will pay particular attention to the impact of the regime on freedom of expression, building on our experience of protecting that in relation to the broadcast regulation responsibilities we've held for almost 20 years. 

Our regulation will also be proportionate, ensuring that our expectations or actions are calibrated to the nature and size of different services, and to the risk and severity of harm. This will be essential in supporting competition and innovation in the technology sector, which we absolutely want to do.

We'll look to be evidence based. We'll use our own research programmes and other robust evidence sources to understand internet users. And this will help define our priorities and inform our expectations over time.

And we'll also look to be consultative. We want to build on the expertise and experience of a range of organizations, including industry, civil society, and other digital regulators that have a stake in this regime.

Finally, we will act independently. There are, of course, ways in which government and other organizations can have an input to the regime, but we'll discharge our functions without fear or favour.

Proportionality

Throughout the passage of the Bill, Ofcom will continue to advise both government and the UK Parliament on the workability and effectiveness of proposals. And once all powers are live, said Wronka, it aims to publish draft codes of practice for consultation, covering legal harms, risk assessments, and other guidance.

He added:

Oversight of this new online safety regime is absolutely the critical role, but we're really excited by the opportunity to contribute to a safe online environment for the UK public.

Big Tech has consistently been in regulators’ crosshairs. But what about smaller organizations, communities, and websites, which appear to be in scope of the Bill?

Wronka responded:

I loop back to the proportionality principle. We will be trying incredibly hard to tie our regulatory expectations to the risk of harm and to the size and capacity of services. That is very much the way that Ofcom sees the nature of online safety regulation.

But we recognize, of course, that there will be challenges for many businesses, including smaller services, as the new regime is implemented. And there is a role for Ofcom in being as clear as possible about our expectations.

We'll do that primarily through our codes of practice, which are designed to provide appropriate measures that services can put in place. They will then be deemed to comply with the requirements of the regime.

One criticism of the Bill’s latest draft is that online media literacy seems to have fallen out of scope, though it is one of the most important ways to educate users – particularly vulnerable groups, such as children and minorities – about online risks.

Wronka said:

Our own media literacy duties are derived from the Communications Act, which is pushing on for 20 years old. However, we have found our powers and duties to be relatively elastic and flexible on this.

We are confident that our existing duties will allow us to engage in and undertake a range of media literacy initiatives in the future. This is a key area working in partnership with other organizations. However, the regulator isn't always going to be the right voice to reach different parts of the UK population.

Legal harms

Another question is the extent to which legal hate speech will be within scope of the Bill – hurtful or harmful content that does not cross the threshold of a crime. How much should providers see that as protected freedom of speech, when cyberbullying and identity-targeted hate may cause real damage to vulnerable people via their platforms?

How far so-called legal harms will be covered by the Bill, whose core principle is internet platforms’ duty of care, is far from clear, Wronka admitted.

He added:

The government has said we'll set those out through secondary legislation. Ultimately, and quite rightly, those will be decisions for government and Parliament.

On the basis that [tackling] hate speech is a priority […] there is discretion for platforms to decide, following their own risk assessments, on their own tolerance levels for legal but harmful speech.

This is one of the ways in which the Bill will protect freedom of expression. But we will expect platforms to be very clear with their users what their threshold is, and then to implement it in a consistent and understandable way.

My take

That’s certainly the pragmatic response. However, we live in an era in which the culture war being fought by some is increasingly incendiary, divisive, and damaging. Some social influencers have become commodity traders in cynicism and bile, citing freedom of speech as evidence that they are good people with robust points of view.

It’s not a crime to be self-centered, hectoring, vain, or arrogant, and – hey – it guarantees clicks and engagement. However, the challenge is that such people and viewpoints are freed from the real-world consequences of their actions by the platforms they use. It becomes the victim’s problem to suck it up and fight back.

It's an approach that makes money for online platforms and ramps up those all-important clicks and eyeballs – or appears to. But in some cases what it really flags is how badly designed those platforms are, and how easily they can be gamed or manipulated so that individuals and organisations can establish a power base.

The fact that, in 2022, it is still practically impossible to tell whether 10,000 Likes or retweets mean a post or tweet is genuinely popular, or that it has simply triggered 10,000 bots, fake accounts, and troll farms, is perhaps the biggest problem of our age.

This systemic flaw in social platforms encourages the types of behaviour that the Bill wants to tackle. But can it? Sadly, that is far from clear.

