… but their moderation certainly needs a fix.
I was just having a little think about Facebook’s news yesterday that it won’t be putting a CEOP panic button on all of its pages. Instead Facebook says it will have links to organisations including the Child Exploitation and Online Protection (CEOP) centre on its reporting pages.
Although I’m a huge supporter of CEOP’s marvellous work, I have to say I can see Facebook’s point.
Grooming activity is not the only reason why people may want to report a post – think of bullying, copyright, hate crime, terrorist activity, inappropriate language or imagery – there are a multitude of reasons why one user may wish to report another. If a CEOP panic button were prominent on each post, the danger would be that all these issues would be reported to CEOP and not Facebook. This would potentially be extremely counterproductive. CEOP would be drowning in issues it cannot directly act on and potentially missing time-crucial grooming complaints.
There would probably be a significant delay in take-down time and user accounts being blocked if the wrong route was taken to report inappropriate material. (Interestingly though, the CEOP report button does take you to a page where children can get help with dealing with cyberbullying: it’s top of their list actually, well above sexual behaviour).
Although it’s part of the Virtual Global Taskforce, CEOP is a UK police organisation (though obviously with links to international forces). Facebook cannot offer the same facility on all its sites, but it could offer tailored local links to help centres on each national site.
If my understanding of the recent tragic murder of Ashleigh Hall is correct (she was a seventeen-year-old girl who was lured by a man using a false identity on Facebook), then a CEOP reporting button would probably not have helped. She didn’t think she was being groomed by a much older paedophile. She thought she was starting a relationship with a handsome young man her own age. And we have to be realistic about this: research shows that there are some vulnerable teenagers who welcome the attention, even when they know the relationship is not appropriate, and are unlikely to report it. What they don’t realise is the huge danger of emotional and physical damage they are courting, and it is education (both of the user and their parents/carers) which will be the key here.
I think therefore that linking the Ashleigh Hall case to the CEOP button is a bit of a false connection. What is true is that a lot of approaches made by adults to children online are actually very direct. This is in contrast to what I think is the general perception, that it’s mostly about long drawn-out relationship/trust building before eventually turning conversations towards sex (i.e. creating a situation where the child wouldn’t want to make a report). Where the button would make an impact is with kids who are approached with an overt sexual proposition straight away. Having a quick/easy way to report that behaviour (by whatever system) may result in more convictions of predators trying the direct approach.
Certainly, Facebook’s reporting systems need to be far more prominent. Personally, I’d like to see ‘report’ offered alongside ‘like’ and ‘comment’ against posts, or at least to be much more visible. Currently the path to report is this: Click through on the profile of the offending poster, then scroll right down (under their friends, photos etc.) on the left hand column, to find this rather recessive link:
If you didn’t know it was there, you’d have a hard time finding it. Why don’t they run a page from the main navigation telling you how to report someone? Run a ‘Report It to Facebook’ button on every page? It may not be the most efficient reporting system either, since it’s the user profile which is reported rather than the offending post, and so it may be hard for Facebook moderators to locate the offending material and view it in context.
When making a report there is a choice of 1. blocking this person from communicating with you, or 2. reporting the user to Facebook. The dropdown under reporting offers categories of offence: Nudity or pornography / Fake profile / Racist or hate speech / Cyberbullying / Threatens me or others / Unwanted contact. I’m assuming Facebook is proposing help links based upon the choice at this point.
Update 22 Mar 10: Malcolm Coles’ critique of Facebook reporting procedures reminded me that I hadn’t looked at the separate reporting procedure in Facebook for Fan Pages and Groups (versus personal profiles). Many thanks to Malcolm and to Blaise Grimes-Viort for the link. (Malcolm, I’ve reproduced your screengrabs, for which thanks).
Entire Fan Pages: It is possible to report the whole Fan Page (in the left hand column of these pages is a “Report page” link), but the options offered as to why you may be reporting it are very limited, and there is no free text option.
Note that you won’t get any communication from Facebook about their actions.
Fan Page Posts: It is not possible to report individual pieces of content created by the Fan Page owner (you’d have to report the whole page, see above). However, a ‘report’ link exists on each comment posted by Fan to that content. Again though, the reasons why you may report are very limited – there is no free text option to explain anything not completely obvious – for example, that a case is sub judice or an image copyright. You’d have to rely on Facebook moderators being very well informed about – well, pretty much everything.
Fan Page Discussions: As Malcolm Coles points out, there’s no way to report an entire discussion thread in Fan Pages in one go: you would either have to flag each comment or report the Fan Page (and have no way of telling the moderators what it is you are objecting to).
Groups: The reporting system for a group is a big improvement: here you are invited to categorise the nature of your objection and say where it was you found it.
So what do you think? Facebook are due to meet with CEOP again in Washington on 12th April to ‘discuss it further’. I hope the network can put some really good plans on the table which will satisfy CEOP that they really are doing all they can to make the site as safe a place as it can be. At the moment it seems sorely lacking.
Update 12/07/2010: An interesting compromise appears to have been reached today. Here’s the press release from CEOP:
“Today represents a huge step forward. By adding this app, Facebook users will have direct access to all the services that sit behind our ClickCEOP button and this should provide reassurance for the many parents whose teenage children use Facebook.

“We know from speaking to offenders that a visible deterrent could protect young people online. We urge all Facebook users not only to add the app, but also to bookmark it so that others can see that they’re in control online. Our dialogue with Facebook about adopting the ClickCEOP button is well documented – this is a good day for child protection.”
“Nothing is more important than the safety of our users, which is why we have invested so much in making Facebook one of the safest places on the internet. There is no single silver bullet to making the internet safer but by joining forces with CEOP, we have developed a comprehensive solution which marries our expertise in technology with CEOP’s expertise in online safety. Together we have developed a new way of helping young people stay safe online and backed this with an awareness campaign to publicise it to young users. It is only through the constant and concerted effort of the industry, police, parents and young people themselves that we can all keep safe online – whether on Facebook or elsewhere.”