
San Francisco, CA - Aug 16, 2025 - TikTok has announced a series of updates to its Community Guidelines, aiming to create a safer, more transparent, and user-friendly environment for its global community. The changes, which include clearer language, new policies on AI-generated content, and stricter rules for live creators, are set to take effect on September 13, 2025. These revisions come as part of TikTok’s ongoing efforts to comply with international regulations, such as the EU’s Digital Services Act and the U.S.’s TAKE IT DOWN Act, while addressing emerging challenges like misinformation and platform misuse.
The updates were informed by input from creators, safety experts, and organizations across more than 30 markets, including TikTok’s regional Advisory Councils. Sandeep Grover, TikTok’s Global Head of Trust and Safety, emphasized that the revisions use simpler language and include “rules-at-a-glance” summaries to make the guidelines easier to understand and follow.
Clearer Rules for AI-Generated Content
One of the most notable changes involves AI-generated content (AIGC) and edited media. TikTok now requires clear labeling, such as stickers or captions marked "synthetic," "AI-generated," or "altered," for any content that uses AI or editing tools to realistically depict people, scenes, or events. This builds on previous policies but adds explicit prohibitions against AIGC that misleads users on matters of public importance or causes harm to individuals. The platform has also simplified its wording, dropping specific references to fake endorsements by public figures in favor of broader harms, such as misleading depictions of crises or of authoritative sources.
These AI updates reflect TikTok’s response to the growing use of generative tools, which can blur the line between fact and fiction. For instance, content creators using AI for viral effects or avatars must now disclose alterations to maintain transparency and prevent deception.
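For illustration only, the sketch below shows how a disclosure rule of this kind could be encoded in a publishing tool. The DraftPost structure, field names, and label values are hypothetical assumptions for the example and do not reflect TikTok's actual API or enforcement systems.

    # Hypothetical sketch: deciding whether a draft needs an AI/edited-media label.
    # Field names and label values are illustrative, not TikTok's real data model.
    from dataclasses import dataclass, field

    REALISTIC_LABELS = {"synthetic", "AI-generated", "altered"}

    @dataclass
    class DraftPost:
        caption: str
        used_generative_ai: bool = False    # e.g. AI avatars or AI-generated scenes
        realistically_edited: bool = False  # edits that realistically depict people or events
        labels: set[str] = field(default_factory=set)

    def needs_aigc_label(post: DraftPost) -> bool:
        """Content that realistically depicts people, scenes, or events via AI or
        editing tools must carry a clear disclosure label."""
        return post.used_generative_ai or post.realistically_edited

    def ensure_disclosure(post: DraftPost) -> DraftPost:
        """Add a default disclosure label if one is required and none is present."""
        if needs_aigc_label(post) and not (post.labels & REALISTIC_LABELS):
            post.labels.add("AI-generated")
        return post

    # Example: an AI-avatar clip gets an "AI-generated" label before publishing.
    draft = DraftPost(caption="My AI avatar reacts to the news", used_generative_ai=True)
    print(ensure_disclosure(draft).labels)  # {'AI-generated'}

In this sketch, any realistically AI-altered draft that lacks a disclosure label is given a default "AI-generated" tag before it can be published, mirroring the labeling requirement described above.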
Enhanced Responsibilities for Live Creators
Live streaming features are also seeing subtle but significant tweaks. Creators are now explicitly responsible for monitoring everything that happens during their LIVE sessions, including third-party tools such as real-time translation or voice-to-text features that interact with viewer comments. If those tools surface content that breaks the rules, the creator can be held accountable for the violation.
Additionally, TikTok is cracking down on commercial content in LIVE broadcasts. Creators must disclose any promotional material, and content that directs users to purchase products off-platform—especially in regions where TikTok Shop is available—will face reduced visibility. This move aims to prioritize on-platform engagement and curb exploitative sales tactics, such as aggressive off-site promotions during streams. The expanded “Accounts and Features” section now provides clearer safety measures for LIVE, alongside direct messaging, comments, and TikTok Shop, to better protect users from emerging harms.
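As a rough illustration of how those two LIVE rules might be expressed, the sketch below treats undisclosed promotions as violations and reduces visibility for off-platform purchase prompts in regions where TikTok Shop is available. The function, enum values, and region list are assumptions made for the example, not TikTok's actual moderation logic.

    # Hypothetical sketch of the LIVE commerce rules described above.
    # The enum values, region list, and function are illustrative only.
    from enum import Enum

    class LiveAction(Enum):
        OK = "ok"
        REDUCED_VISIBILITY = "reduced_visibility"
        VIOLATION = "violation"

    # Assumed placeholder set of regions where TikTok Shop is available.
    TIKTOK_SHOP_REGIONS = {"US", "GB", "ID"}

    def evaluate_live_segment(region: str,
                              is_promotional: bool,
                              promotion_disclosed: bool,
                              directs_off_platform_purchase: bool) -> LiveAction:
        """Apply the two rules from the article: undisclosed promotions violate the
        guidelines, and off-platform purchase prompts lose visibility in regions
        where TikTok Shop exists."""
        if is_promotional and not promotion_disclosed:
            return LiveAction.VIOLATION
        if directs_off_platform_purchase and region in TIKTOK_SHOP_REGIONS:
            return LiveAction.REDUCED_VISIBILITY
        return LiveAction.OK

    # Example: a disclosed promotion that pushes viewers to an external checkout
    # in a TikTok Shop region would see reduced visibility under this sketch.
    print(evaluate_live_segment("US", True, True, True))  # LiveAction.REDUCED_VISIBILITY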
Tackling Misinformation and Regulated Goods
The guidelines refine policies on misinformation and harassment, with stronger language addressing subtler forms of harm such as climate-related falsehoods and bullying. TikTok continues to prohibit content that could mislead voters, interfere with elections, or cause societal damage, maintaining its focus on civic integrity without major overhauls in this area. Bullying policies have also been clarified so that creators better understand their obligation to avoid content that harms individuals.
A unified “Regulated Goods and Services” policy consolidates rules on gambling, alcohol, tobacco, drugs, firearms, and dangerous weapons, making it easier for users to navigate restrictions. Promotion of these items is strictly limited, with exceptions for verified TikTok Shop sellers under specific conditions.
Personalization and Enforcement Improvements
TikTok is also emphasizing personalized experiences, noting that search results, recommendations, and comments will vary based on user behavior such as past searches, watch history, likes, and reports. This customization aims to make the platform more relevant, but it also means that For You feed (FYF) eligibility standards are now distributed across the relevant guideline sections rather than listed in one place. Users can manage preferences and interactions through tools such as the safety toolkit and account settings.
On enforcement, TikTok has boosted transparency by explaining how it handles crises and respects local norms. Over 85% of violative content is now removed through automation, and 99% is taken down before users report it. Creators can easily appeal decisions, and the platform continues to train moderators for consistent application of the rules. The moderation language has also shifted, from emphasizing a "safe, trustworthy, and vibrant" space to a "safe, fun, and creative place for everyone," signaling a focus on positivity.
These updates underscore TikTok’s commitment to evolving with user feedback and regulatory demands, helping creators and viewers alike navigate the platform more safely.