How the draft Online Safety Bill would affect the development of Free / open source software

Screenshot of @revk's source code for voip-answer.c, licensed under GNU GPL 3.0

As readers of my personal blog, or people who know me, will probably be aware, I'm a proponent, and user, of Free and open source software.

And this got me thinking: how would the draft Online Safety Bill affect the development of Free / open source software, if it were passed in its current form?

Edit 20211023: in case it was insufficiently clear, the document is not yet law, nor even a bill. It is at the pre-bill stage. This means both that now is the time to engage with it, and also that it is not something with which you need to comply right now.

I'm thinking in particular of the common means for collaboratively developing Free / open source software:

  • a hosted, Internet-accessible, versioning system, to keep control of the source tree(s), and to enable other people interested in contributing code to submit patches
  • an issues list, to which people can post messages, and subscribe to receive other contributors' messages, as well as just viewing the message history online.
  • a wiki, for documentation, with some luck.

Let's say that, rather than using Github or one of its ilk, you are self-hosting this infrastructure.
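To make the self-hosting scenario concrete, here is a minimal sketch of standing up a bare git repository that contributors could clone from and push to. The paths and hostname are hypothetical examples, and a real deployment would sit an issue tracker and wiki alongside it.

```shell
# Hypothetical location for the repository; a real server might use /srv/git
REPO="${HOME}/git/myproject.git"

# Create a bare repository: no working tree, suitable as a shared push target
mkdir -p "$REPO"
git init --bare "$REPO"

# Contributors with SSH access to the host could then clone and push, e.g.:
#   git clone user@example.org:git/myproject.git
```

Nothing more than this is needed to be "hosting" a collaborative development service in the sense discussed below.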

Do you fall within the scope of the draft Online Safety Bill, and what does that mean?

tl;dr

  • yes, but it's not 100% clear
  • you'd have lots of obligations
  • you'd probably just use Github or similar instead, because hosting your own service will probably cease to be worth it

It's almost as if the draft Online Safety Bill is an attempt to use regulation to dissuade people from hosting things online, through the imposition of massive amounts of complex red tape…

Is your publicly available repository within the scope of the draft Online Safety Bill?

My view is "yes", but with a slight doubt.

The definition of "user-to-user service" is:

an internet service by means of which content that is generated by a user of the service, or uploaded to or shared on the service by a user of the service, may be encountered by another user, or other users, of the service.

"content" means:

anything communicated by means of an internet service, whether publicly or privately, including written material or messages, oral communications, photographs, videos, visual images, music and data of any description.

Source code patches, transmitted over the Internet, fall within this.

"internet service" means:

a service that is made available by means of the internet.

There is no doubt that, in this example, the repository is "made available by means of the internet".

Is it a "service"?

The aspect which gives me that slight doubt is whether a source code repository, to which others can contribute patches, and download source files or releases, is a "service".

The term "service" is not defined in the draft bill. Nor is it defined in the Interpretation Act 1978.

In a technical sense, the infrastructure in question is operating a server — it sends and receives in response to communications from clients — and so I suspect some would argue that that amounts to a "service".

In a less technical sense, if I operate as a handyman, putting up shelves for people, for example, I'd say I was providing a service.

But if I'm just willing to help friends out occasionally, even if I do exactly the same activity, I am sceptical I am providing a "service".

Is an online system, with which others can interact if they wish, necessarily a service? I think there's room to argue here, but I expect that proponents of the draft bill would consider a hosted source code repository to be a "service".

It would help if the term "service" were clearly defined on the face of the draft legislation. Given the significant breadth and scope of the obligations in the draft Bill, leaving something as key as this unclear is unreasonable, and places unacceptable risk on people who may or may not be service providers.

Is it a "regulated user-to-user service"?

If we assume that this infrastructure for hosting your own Free / open source software project is a "user-to-user service", the next question is whether it is a regulated user-to-user service.

A “regulated user-to-user service” means a user-to-user service that— has links with the United Kingdom, and is not exempt. There are some exemptions, in Schedule 1 to the draft Bill, but none apply here, so I'm not going to consider that bit further.

A service "has links with the United Kingdom" if it "has a significant number of United Kingdom users", or else the UK forms "one of the target markets for the service".

I am not sure that making a Free / open source project available in this manner has a "target market", so I am sceptical that this bit applies.

There is no definition of "significant number". Is that 10? 100? 1000? 10,000? If 20% of the total user base (of whatever size) are in the UK? 50%? Again, this is unacceptably vague.

It's perhaps a bit circular, but I'm going to suppose that a popular Free / open source project may well have a "significant number" of users in the UK.

A service also has links with the UK if it is capable of being used in the UK by individuals, and there are:

reasonable grounds to believe that there is a material risk of significant harm to individuals in the United Kingdom arising from— content present on the service.

Pretty much everything online is capable of being used in the UK by individuals, so the question is whether there are reasonable grounds to believe that there is a material risk of significant harm to individuals in the United Kingdom arising from— content present on the service.

Does that mean that the content itself — e.g. the source code, or what is written on an issues list — must be harmful?

Or might one argue that what is being distributed is a de-centralised end-to-end encrypted messaging system, and that, since criminals could download it and use it to further criminal goals (let's say the distribution of illegal images of child abuse), the repository could cause a material risk of harm to children?

To my mind, that sounds like regulation of the code itself, rather than regulation of the repository, but stranger things have happened.

As we cannot conclude that Free / open source projects are necessarily excluded from the scope of the draft bill, we must assume that they are included, or are at least capable of being included, depending on the outcome of vague, subjective tests.

Assuming it is a user-to-user service, what would you have to do?

The "you" here could be an entity, if there is one, but it could also be an individual running the server for their own hobby project.

I emphasise that because the numerous obligations which follow are not just the burden of large, well-lawyered, organisations.

You must comply with all of the following:

  • the illegal content risk assessment duty
  • each of the illegal content duties
  • the duty about rights to freedom of expression and privacy
  • the duties about reporting and redress
  • each of the record-keeping and review duties

If Ofcom decides that what you are doing is a "Category 1 service" — not set out in law; this is left to regulatory discretion — then you'd have even more obligations.

The "illegal content risk assessment" duty

The “illegal content risk assessment duty” requires you to:

  • carry out an illegal content risk assessment before United Kingdom users are able to access the service.
  • carry out an illegal content risk assessment before making any significant change to any aspect of the design or operation of a service to which such an assessment is relevant
  • keep an illegal content risk assessment up to date, including when OFCOM make any significant change to a risk profile that relates to services of the kind in question.

There is a whole list of factors which you have to take into account.

The "illegal content" duties

In addition to doing and maintaining your "illegal content risk assessment", you must:

  • take proportionate steps to mitigate and effectively manage the risks of harm to individuals, as identified in the most recent illegal content risk assessment of the service
    • you must take into account "all the findings of the most recent illegal content risk assessment"
    • but fortunately you can also take into account "the size and capacity of the provider of a service" — but this relates only to determining what is proportionate. You'd still need to assess what steps are possible, before then applying the proportionality filter.
  • operate the service using proportionate systems and processes designed to—:
    • minimise the presence of priority illegal content;
    • minimise the length of time for which priority illegal content is present;
    • minimise the dissemination of priority illegal content;
    • where you are alerted by a person to the presence of any illegal content, or become aware of it in any other way, swiftly take down such content.
  • specify in the terms of service how individuals are to be protected from illegal content, addressing each of the points above
    • what terms of service? Nothing in the draft bill compels you to have terms of service...
  • ensure that— the terms of service are clear and accessible, and that they are applied consistently

The duty about rights to freedom of expression and privacy

You must have regard to the importance of— protecting users’ right to freedom of expression within the law, and protecting users from unwarranted infringements of privacy, when deciding on, and implementing, safety policies and procedures.

Got a code of conduct for your project? Or a moderation policy for a mailing list? Then you'd better assess its compatibility with "freedom of expression within the law".

You don't have a code of conduct? That's a real shame, but it's unclear whether an unstated policy / a lack of a code of conduct is, in itself, a "safety policy or procedure" which triggers this requirement. It seems odd to suggest that it should be, given that a default of nothing could also inhibit speech (e.g. by discouraging everyone's lawful contributions through permitting hostile engagements).

The duties about reporting and redress

You must:

  • operate a service using systems and processes that allow users and affected persons to easily report content of the following kinds—
    • content which they consider to be illegal content
    • content, present on a part of a service that it is possible for children to access, which they consider to be content that is harmful to children
      • note that, even if your service is not likely to be accessed by children, it seems that you must permit someone to report something which they think could be harmful to children
    • content which they consider to be content that is harmful to adults
  • operate a complaints procedure that—:
    • allows for thirteen different kinds of complaints
    • provides for you to take appropriate action in response to such complaints, and
    • is easy to access, easy to use (including by children) and transparent
  • make the policies and procedures that govern the handling and resolution of complaints above publicly available and easily accessible (including to children)

The "record-keeping and review" duties

You must:

  • make and keep a written record of every illegal content risk assessment
  • make and keep a written record of any steps taken to comply with the duties above, unless the steps you take are described and recommended in a code of practice
  • review compliance with the duties above:
    • regularly, and
    • as soon as reasonably practicable after making any significant change to any aspect of the design or operation of the service

Wow, that's a lot of obligations

Yes, it is, isn't it!

I have no idea how smaller providers are going to cope with this.

I expect, if it is passed, I'll end up writing a downloadable toolkit, to try to help with most of it, but, even with the benefit of something like that, it's still a significant administrative overhead if you just want to run up a git server to let others contribute to your project.

I suspect that it will just drive people towards managed/centrally-hosted services, which deal with this significant compliance burden on your behalf. I'm far from certain that that's a good thing.

But I guess one way of dealing with "online harms" is using regulation to dissuade people from hosting things online...

But surely git repos and the like are not what the draft bill is aimed at, so there won't be any enforcement?

That's a brave statement.

I don't think that code repositories are even close to what the draft bill is aimed at. But they appear to fall within the scope of "user-to-user services", and they are not on the list of exemptions, so...

This is one of the many failings of this legislative land grab: it is so broad, that the unintended consequences (if they are indeed unintended) will be significant.

The correct thing to do is to go back to the drawing board, with clear, precise legislation, with powers debated and enacted by Parliament on the face of the legislation, to tackle clear, precise issues. Not lengthy and complex legislation which devolves significant power to codes of practice and regulators, and which snares in its net all manner of services which need not be there — but this is a byproduct of trying to rush something through without an agreed set of clear, precise objectives.

Relying on regulatory discretion — "don't worry, it's not intended to apply to you, even though it does" — is a risky substitute for doing it properly. And it places the risk on you, the people running these repos, not on the lobbyists pushing for this, not on the government, and not on the regulator. On you and your businesses.