Banning apps which could be used to share porn seems worrying, not least because no clear link between such bans and the countering of sexual abuse, particularly child abuse, appears to have been established, even though countering abuse is often cited as the rationale underlying proposed bans.
There definitely are problems with evidence-based policy, but if one argues in favour of banning specific apps or, indeed, entire technologies on the basis that doing so would help counter child porn (which is, not to put too fine a point on it, indecent imagery of children and, where real children are featured, evidence of child abuse), it is perhaps worth asking what makes a specific app or technology suitable for such a ban but not others, and to what extent the enthusiasm for a specific ban has roots which can be traced to class. One man's porn is another man's art, as they say.
The case of websites is somewhat easier to determine than that of apps since it is possible to look either at the content available through specific URLs or at the overwhelming use of entire websites, and to determine whether they deserve to be banned. This is the approach which the Delhi High Court adopted in its judgment in CS(COMM) 724/2017, issued on 10 April 2019. Although the case itself dealt with websites overwhelmingly used for copyright infringement, the court drew analogies with child porn, and, by inference, it should therefore be possible to argue that a website devoted to child porn, and perhaps other forms of non-consensual porn, can legitimately be blocked in its entirety through a judicial order even if it is hydra-headed (popping up as it may with minor alphanumeric variations).
However, the trouble with porn-related regulation is that although there are forms of porn that are clearly morally and legally indefensible, pushing them off easily-accessible platforms doesn't necessarily get rid of the explicit imagery itself. It merely makes it less visible: out of sight and out of mind.
Forcing indefensible, non-consensual 'porn' entirely out of sight is not necessarily always the best way to counter it. The worst abusers -- 'consumers', as they are often flippantly called despite their choosing to watch crimes for entertainment -- will find ways to access it. And non-consensual 'porn', whether it is child porn or trophy porn, is filmed sexual abuse; that is worth reiterating over and over again.
Making explicit imagery publicly invisible can have the presumably inadvertent effect of pushing those who feature in it further into the fringes of public consciousness: abused kids in the case of 'child porn', raped women in the case of trophy porn.
Maybe there is reason to ban or restrict technologies which could potentially be used to help non-consensual explicit imagery proliferate, just as there is definitely good reason to ban manifestations of technology which are, without doubt, solely devoted to the proliferation of non-consensual porn. However, the arguments in favour of such bans in cases of possible (and not actual) abuse seem to stem, in large part, from an evidence-free, visceral, and entirely understandable desire to make non-consensual porn such as 'child porn' disappear.
Unfortunately, making non-consensual porn disappear from an easily and publicly accessible app, or making the app itself disappear, especially if the app is not devoted solely to the proliferation of non-consensual porn, may not be enough to protect children or others victimised by such porn.
It has to be said that it is hard to protect the victims of sexual violence, and it is even harder to protect those victims who are not even visible to the public. It is possible that bans, unless they are carefully targeted, would achieve little apart from making those who are abused even more difficult to find.
Ultimately, deciding on whether or not a ban is appropriate requires balancing a number of priorities: protecting those who feature in non-consensual porn, restricting the proliferation of porn (since, whether or not one thinks it should be legal, the fact of the matter is that it is not currently legal), and protecting children who are able to access porn as consumers.
The argument that bans are appropriate because children are being or could be targeted as consumers of porn, of course, takes one down a rabbit-hole trying to negotiate what is and isn't appropriate. And legal. Here, too, it's worth asking: Who's affected? How? Where's the evidence? After all, so-called 'age locks' are likely to be easily circumvented by even mildly tech-savvy teenagers. And 'age-verification to access explicit imagery' gives rise to a whole host of privacy concerns for consumers and safety concerns for consenting sex workers.
We need to ask more questions about what the best course of action is, and whether bans are appropriate given that they may not only fail to protect those who are abused but also almost certainly have adverse free speech and safety implications.
We've gone down this road many, many times trying to ban porn. Claiming that we're protecting children. Claiming that we're countering violence against women. Remember the 857-site ban? We've asked questions. Just not always the right ones. And not enough of them.
Ultimately, this isn't about any specific case or app or technology in isolation. It's that it's high time we started interrogating who exactly we're trying to protect with porn bans, how we're going about it, and whether bans have their intended effect.
If there's anyone who deserves protection when it comes to explicit imagery, it's those who feature in it -- children and adults -- and those who are exposed to it as children. Discussions relating to specific apps and technologies ebb soon enough. The issues they raise, however, remain.
(Edited and cross-posted from Twitter.)