YouTube said that it removed more than 30,000 videos last month that contained hate speech content.

In a blog post published on Tuesday, September 3, the video platform also announced it plans to update its current harassment policy, which will represent a “fundamental shift in our policies.”

“We’ve been removing harmful content since YouTube started, but our investment in this work has accelerated in recent years,” YouTube said. “Because of this ongoing work, over the last 18 months we’ve reduced views on videos that are later removed for violating our policies by 80%, and we’re continuously working to reduce this number further.”

YouTube’s latest updates — which it said are “coming soon” — focus on removing content, raising authoritative voices, rewarding trusted creators, and reducing the spread of material that violates its policies.

“We go to great lengths to ensure content that breaks our rules isn’t widely viewed, or even viewed at all, before it’s removed,” YouTube said in Tuesday’s post.

YouTube initially updated its anti-hate speech policy in June. Under the updated policy, YouTube now removes videos that promote supremacist views, as well as videos that deny the existence of “well-documented violent events, like the Holocaust or the shooting at Sandy Hook Elementary.”

According to YouTube, the hate speech videos removed over the past month accounted for only about 3% of the views that videos about knitting received during the same time frame.

The video platform has had a year of policy updates prompted by a variety of issues. In April, YouTube updated its harassment policy in response to creator-on-creator harassment occurring on the platform.

In June, the Wall Street Journal reported that YouTube was considering significant changes to its recommendations algorithm for videos aimed at children. According to reports, the Federal Trade Commission (FTC) was in the late stages of investigating the platform’s treatment of kids.

Then in August, a group of LGBTQ YouTube creators sued YouTube, alleging discrimination against LGBTQ creators and the broader community, according to The Verge.

Digital Trends reached out to YouTube to ask when the new updates will officially roll out to the platform, but we have not yet received a response.