We currently offer user generated content (UGC) moderation and brand protection services around the clock in some 50 languages. If you don’t see the language you need, just ask, and we’ll try to source it for you.
What do we moderate?
Any platform or medium. We regularly monitor and moderate user generated content on all the major social networks (Facebook, Twitter, Google+, YouTube and Pinterest), as well as news media, advertising campaigns and forums. Our online moderators judge competitions, triage live Twitter feeds, and moderate audio, video, text and images. We use technology and tools to help us, but our moderators are all highly trained humans with a deep cultural understanding of the projects they are working on.
We also work with specialist training company Moderation Gateway developing content for its new online moderation foundation training course, the first in the digital industry to provide a certified qualification for professional UGC moderators.
Brand protection. Brands are engaging with their consumers across social networks, micro-sites, live chat pages, outdoor billboards, blogs, forums and more. Engagement is great – but all user created content – whether text, audio, images or video – carries a potential risk to the brand it is associated with. It could contain:
- Defamatory or libellous material
- Breach of copyright
- Obscene content or abusive or intimidating comments
- Child abuse or safety issues
- Spam, or off-topic comments
Our highly trained, dedicated teams of moderators are there 24/7 to offer brand protection and give your online audience a better experience, free from spam, flame wars or unsuitable user generated content. We work with you to escalate issues either to your internal divisions (PR or customer service for example) or to the appropriate legal authorities.
How do we moderate?
Depending on your needs, we may pre-, post-, or reactively moderate user generated content. Online moderating isn’t all about deleting swear words either (although we do a lot of that!). Our multilingual moderators, who are familiar with the cultural nuances of the territories they work in, also strive to improve the experience for users: guiding and encouraging them to participate, warning them to follow community guidelines and helping contributors to revise postings where appropriate. We’ll monitor your communities and feed back any issues.
What moderation tools do we use?
We are completely platform neutral and will work with whatever our clients provide to moderate their UGC. That said, there are some platforms and filters we can recommend and have helped to develop, so we know they’re good! We’ll give you the benefit of our expertise, provide moderator feedback on your chosen solution, and help you with any technical issues as far as we can. While filters, including intelligent filters, are very important in helping us do our jobs, they will never take the place of human eyes and experience.
You’ll get regular comprehensive reports to keep you informed of what we are moderating and why; we’ll tell you what your community are saying and advise you if you need more or fewer hours. We’ll be there to react to changes, scaling up or down quickly to cope with the situation.
Who works on your project?
Our multilingual moderators are based all around the world and are trained and supervised by Project Team Leaders and our UK and US Production Directors. All staff are regularly police-checked, and staff working on children’s projects are interviewed face-to-face. Teams are assigned exclusively to a client based on their experience and aptitude for the project.
In the words of one of our clients:
“Most importantly, eModeration treats our community as if it were their own. They are as dedicated to our project as any member of our team and are a delight to work with.” (A&E Networks)