TikTok, the short-video app owned by Chinese company ByteDance and known for its colorful filters and dancing animal emojis, has taken steps to address harmful content on its platform. The company recently launched a campaign against anorexia and has pledged to work with the NHS to create content that promotes a healthy body image. This is not the first time TikTok has moved to limit harmful content: several months ago it removed videos featuring inappropriate comments about underage girls and sexual violence. These efforts are part of a broader trend among social media platforms working to address mental health and well-being while promoting positive self-expression among users worldwide.

TikTok knows harmful content is an issue. The platform is working with the NHS, charities, and other organizations to ensure the app does not promote or encourage harmful behavior. It has also introduced a toolkit for people affected by mental health issues, including eating disorders and self-harm.

TikTok is also working with the National Eating Disorders Association (NEDA) to create educational content about eating disorders for its app.

The partnership comes one year after NEDA launched a campaign calling on TikTok and other social media platforms to be more accountable for their role in spreading harmful messages about body image.

NEDA CEO Mary Kate Bingham called the move an “important step” toward tackling the issue head-on, but stressed that companies like TikTok have a long way to go before they can be considered truly safe spaces for users.

Social media can be a great tool for connecting with people, following content you enjoy, and even discovering new clothes, makeup, and hairstyles. It is also a place where we constantly see how other people look. Many of us check our friends’ posts and photos (sometimes too often) without considering the impact that can have on our own body image.

The steps TikTok is taking are not unique: other social media platforms, including Facebook and Instagram, have policies in place to limit the spread of harmful content.

For example, Facebook has a policy banning hate speech, which includes language that attacks individuals based on characteristics such as race, ethnicity, religion, or gender identity. The policy also prohibits group pages that support violence or encourage others to commit violent acts against particular groups of people.

Instagram has similar policies in place for its users: images containing nudity must be marked as private (the only exception being artistic nudity), while photos depicting sex acts or sexually suggestive poses must be marked as mature content and hidden from users under 18.

TikTok is continuing to expand these efforts. In a statement released Thursday, the company said it has partnered with the National Health Service (NHS) in Britain to create educational videos addressing issues such as mental health and self-harm. TikTok also says it will launch a new feature called “Rapid Response,” which lets users report harmful posts and receive support from counselors within 15 minutes of filing a complaint. “We believe that working with the NHS to create content and collaborating with other platforms will help make TikTok a safer place for young people,” the company said.