It seems as if the media – in the UK at least – would do well to moderate only reactively in future, according to HoldtheFrontPage.co.uk.
The written judgement in the case of Imran Karim v Newsquest has been made available, and it confirms that where newspapers and other online publishers are unaware of defamatory (or otherwise unlawful) UGC posts on their websites, they will have a defence to a claim for damages (e.g. for libel) if they act quickly to remove such UGC when notified of a complaint.
However, if they pre-moderate (and are thus aware of the UGC), they may then be liable for the content, as a knowing publisher.
The so-called ‘hosting’ defence – regulation 19 of the Electronic Commerce (EC Directive) Regulations 2002 – relies upon publishers, previously unaware of the offending UGC, taking it down reasonably quickly once notified.
A few large media publishers – MailOnline for one – have already taken the route of reactive moderation, but the legal position was uncertain until this ruling. Here, the Judge upheld the right of MailOnline to protect the identities of anonymous UGC contributors who had posted on a news story.
Online publishers can now confidently approach UGC in the same way that Internet Service Providers have dealt with UGC on the bulletin boards they host since the well-known ruling on ISPs’ liability in Godfrey v Demon Internet (1999).
This clearly strengthens the arm of those who support complete freedom of expression, anonymous posting and little moderation. But before we leap to the conclusion that this judgement has opened the gates on a complete free-for-all, publishers should be aware that different considerations may apply where a newspaper hosts a bulletin board or forum on a controversial topic which it knows has, in the past, repeatedly attracted defamatory or otherwise unlawful UGC.
In this scenario, moderating and actively weeding out dubious content might still be the best approach to minimise risk, since a case could be made that the publisher should have known from past experience that unlawful content was likely to appear. And the hosting defence applies only where damages are being sought (as opposed to, for example, an injunction).
Publishers also – I hope – shouldn’t lose sight of the end goal: they should be attempting to provide interesting content which adds to, rather than detracts from, their readers’ experience. No publisher wants their brand to be associated with bad material. The risk of being sued for defamation is not the only risk they are running: they are potentially hosting illegal content in the form of obscenity, terrorism, racial abuse, child endangerment… the list goes on. Add to that the tedious inevitability of spam and the predictable activity of trolls, and publishers have a difficult decision to make when it comes to moderation.