TikTok to automatically remove videos that violate policy

The social networking application TikTok is set to introduce a new feature that uses automation to remove content that violates its community guidelines.

In the US and Canada, the company will begin using automated review systems to weed out videos containing nudity, violence, graphic content, illegal activities, and violations of its minor safety policy.

At present, every uploaded video passes through technology tools that recognize and flag potential violations, which are then reviewed by a member of the safety team.

If a violation is identified, the video is removed and the user is notified, TikTok said.

The ByteDance-owned company added that over the next few weeks it will begin automatically removing some types of content that violate its minor safety policy. This will be in addition to removals confirmed by the safety team.

The company said this will free its safety team to focus on highly contextual and nuanced areas, such as bullying and harassment.
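For readers curious how such a two-stage moderation flow can be organized, the sketch below is a minimal, purely illustrative Python example based on the description above. The category names, the triage function, and the notify_user helper are assumptions made for illustration; they do not describe TikTok's actual internal system or which exact categories it auto-removes.

from dataclasses import dataclass
from enum import Enum, auto


class Category(Enum):
    MINOR_SAFETY = auto()
    NUDITY = auto()
    VIOLENCE = auto()
    GRAPHIC_CONTENT = auto()
    ILLEGAL_ACTIVITY = auto()
    BULLYING = auto()      # nuanced case, kept for human review in this sketch
    NONE = auto()


# Categories assumed here to be eligible for automatic removal.
AUTO_REMOVE = {
    Category.MINOR_SAFETY,
    Category.NUDITY,
    Category.VIOLENCE,
    Category.GRAPHIC_CONTENT,
    Category.ILLEGAL_ACTIVITY,
}


@dataclass
class Video:
    video_id: str
    flagged_as: Category  # output of the automated flagging step


def notify_user(video_id: str) -> None:
    # Hypothetical stand-in for the in-app notification the article mentions.
    print(f"Video {video_id} was removed for a community guidelines violation.")


def triage(video: Video, review_queue: list) -> str:
    """Route a flagged upload: auto-remove clear-cut categories,
    queue nuanced cases (e.g. bullying) for a human safety reviewer."""
    if video.flagged_as is Category.NONE:
        return "published"
    if video.flagged_as in AUTO_REMOVE:
        notify_user(video.video_id)
        return "removed_automatically"
    review_queue.append(video)
    return "pending_human_review"

The key design point in the article is the split this sketch mirrors: clear-cut categories are handled automatically, while nuanced cases such as bullying remain with human reviewers.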

TikTok also said it will send an in-app warning upon a first violation; repeated violations will result in the user being notified and the account permanently removed.

Social media platforms, including Facebook and TikTok, have previously come under fire for amplifying hate speech and misinformation globally.
