Removed messages on Messenger Kids
We take our role in keeping abuse off our services seriously. That's why we have developed a set of Community Standards that outline what is and isn't allowed on Facebook's family of products.
On Messenger Kids, we apply our Community Standards more strictly. We err on the side of caution when it comes to removing content on Messenger Kids because it is an app designed for children under 13. We take extra steps to help keep the product and its content safe for kids.
What kinds of content do you remove from Messenger Kids?
On Messenger Kids, we have a stricter application of our current policies — especially concerning the following areas:
- Bullying
- Nudity or sexual content
- Graphic and violent content, such as animal cruelty or fight videos involving children
- Celebrating crime
- Profanity
On Facebook, certain types of violent content may be placed behind a graphic warning label and hidden from people between the ages of 13 and 18. On Messenger Kids, we go a step further and remove this content entirely.
We know and appreciate that sometimes parents share nude images of their children with good intentions; however, we generally remove these types of images to help avoid the possibility of other people reusing them in inappropriate ways.
Can I appeal and get the content back if I don't think it should have been removed?
At this time, you can't appeal the removal of content, but we're continuing to gather feedback and may offer this option in the future.
Are you looking at every message sent on Messenger Kids?
We do not look at every message sent on Messenger Kids, but we do use proactive detection technology to identify content that may violate our policies and send it to our teams for further review. The safety of the community is our top priority.
If you see anything on Messenger Kids that may go against our Community Standards, you can report it.