As we enter 2018, we are only a few months away from the British government’s proposed deadline for imposing age verification obligations on porn sites. The measures were introduced as part of the Digital Economy Act 2017, and require commercial providers of pornography to ensure that their material cannot be accessed by anyone under 18. An appointed regulator, subsequently revealed to be the British Board of Film Classification (BBFC), will have the power to order Internet service providers to block sites which fail to comply. Unsurprisingly, the law has provoked a range of criticisms. Online freedom campaigners such as the Open Rights Group are concerned about yet another attempt to censor the web — there is no opt-out, even though most British households have no children — while some members of the adult entertainment industry worry about the implications for user privacy and the financial impact on smaller sites. Even Private Eye have questioned the commercial wisdom of allowing a company which operates several large porn sites to also provide the age verification service that its own competitors may be forced to rely on.
Private Eye on age verification, the BBFC and Mindgeek: “What could possibly go wrong?” pic.twitter.com/7gT1S6hSYZ
— Myles Jackman (@MylesJackman) January 4, 2018
It is not difficult to see why this legislation was waved through Parliament. On the face of it, the suggestion that providers of adult content should be required to verify the age of their viewers is entirely reasonable. After all, we don’t allow children to visit sex shops or employ the services of prostitutes, so it hardly seems overly draconian to expect vendors of online pornography to play by the same rules. The problem, as always, is that little word “online”, and the government’s continued failure to understand that the internet is a global communications network, not a broadcast TV channel whose content is subject to regulation by the British state or its delegated film classification authority. Most of us would instinctively disapprove of children watching people have sex (although evidence of actual harm seems to be rather scant), but whether or not a policy is well-intentioned has no bearing on whether it can actually be made to work.
At this point it remains to be seen how the blocking will work in practice, as the law is not particularly clear in this regard. There is no blanket obligation for Internet service providers to identify and block sites on their own, but they will have to implement a block if ordered to do so by the regulator. It does not appear possible for ISPs to challenge an order or receive compensation for the costs of installing filtering equipment, and there is no customer opt-out even for adults who live alone. The legislation is similarly vague about exactly how ISPs are expected to deny access to offending material. Web filtering is notoriously ineffective, with circumvention techniques ranging from the trivial (Googling for an alternative address for the blocked site) to the mildly inconvenient (using the Tor browser, signing up for a free secure browsing service), so it is arguably impossible to actually prevent a customer from accessing a non-compliant site. Will an ISP face legal consequences if it proves unable to deliver a magic blocking solution? Until one is dragged into court, we just don't know.
Perhaps the most striking aspect of this policy is how far ahead of the curve it is. The typical procedure for introducing censorship is to start with something that almost nobody would defend — child pornography is often a good choice — and then gradually expand the list, incorporating more and more material on the basis that it is similar to something already banned (the beauty of this approach is that it can be continued forever: there is always something just beyond the reach of existing law that can be declared a “loophole” in need of closing). However, with the Digital Economy Act the government have skipped several steps in this process of expansion. There is not yet any general obligation for ISPs to block access to illegal or copyright-infringing content hosted overseas, unless ordered to do so by a court, but there will soon be a requirement to prevent consenting adults from viewing legal pornography involving other consenting adults. Even if we accept the government’s view that regulating the internet is necessary, they seem to have picked a very odd place to start.
This weird, almost schizophrenic quality extends to other aspects of the law as well. Age verification is required on sites that provide pornography “on a commercial basis” to the UK — because apparently kids looking at porn is perfectly fine if nobody is getting paid — but ministers have suggested that a credit card is sufficient to verify a customer’s age. So the law is intended to target “commercial” vendors that don’t take credit cards? Is there a massive problem with children purchasing porn using Bitcoin or postal orders? Is the legislation aimed entirely at ad-supported sites? Presumably there will be no impact on the dozens of web forums, Reddit communities and social media accounts which share adult content, all of which can be found with a few search engine queries but clearly do not constitute commercial pornography vendors. The law seems to contain a curious mixture of ineffective half-measures and authoritarian overreach, managing to be both impotent and draconian at the same time — almost an achievement in itself, but it does not inspire much confidence that the policy will work.
With policies such as this it is often tempting to dismiss legislators as ignorant control freaks, but this would be an oversimplification of the issue. Certainly there are some who seem to behave more like conspiracy cranks than rational authority figures — one of the loudest proponents of porn filtering, Claire Perry MP, rather hilariously accused a journalist of hacking her website after he reported on the story in 2013. A more charitable interpretation, however, is that politicians just have incorrect mental models of what they are trying to regulate. The internet is not a TV station, a babysitter, a shopping mall, a kindergarten, or a support group. It is more like a global, electronic road system: a content-neutral network that carries data from place to place with little or no knowledge of what is being sent. This does not mean that it should be unregulated; we can reasonably ask that roads be appropriately paved or that broadband be available at a reasonable price. But just as road-builders are not expected to prevent drunk driving, it is not the responsibility of Internet service providers to supervise the online activities of children. That is what parents are for.
To those of us who do not purchase porn online, or who are tech-savvy enough to circumvent web blocking, this legislation may seem like an irrelevance. Clueless attempts to regulate technology are a dime a dozen in today’s political climate. Nevertheless, we should not underestimate the significance of this great leap forward for censorship in the UK. The introduction of a compulsory nationwide internet filter, which even adults have no right to disable, represents a considerable victory for the Mary Whitehouses of our age, and it would be extremely naive to imagine that they will stop here. Once the government has decided that every British citizen is a child in need of protection by a state-appointed regulator, it is open season on all manner of content that might be considered unsuitable for children: foul language, discussions of sex and drugs, tobacco or soft drink adverts, controversial political speech, adult-rated movies and games, information about avoiding web filters… the list will go on, and on, and on. If we don’t stand up for our rights as adults and reject this creeping infantilisation, we will soon find ourselves being told what we can see, what we can read — and ultimately, what we can think.
Matthew Mott is a writer and photographer with a background in technology, based in the UK. He can be found on Twitter @InfiniteDissent