TikTok has taken down the controversial “chubby filter” after facing widespread backlash. The filter, originally created on CapCut, a video editing app owned by TikTok’s parent company ByteDance, artificially made users appear heavier. It was often used in comparison videos, where people would show their real faces before applying the effect, usually set to Doechii’s song “Anxiety”.
Many saw the trend as nothing more than a new way to mock larger bodies, reinforcing the idea that being thin is the only way to be attractive. The backlash spread across social media, with users criticising the filter for making weight gain the punchline of a joke. One TikTok user said, “Imagine just existing and seeing thousands of people use an AI filter to look like you—only to ridicule it.”
Health professionals also weighed in, calling the filter damaging. Eating disorder specialists pointed out that it fuels fatphobia and contributes to low self-esteem, especially among younger users. Such trends only add to the discrimination that larger-bodied people already experience daily.
Following the criticism, TikTok confirmed that the filter had been removed from CapCut and that existing videos using it were being excluded from recommendation feeds. The platform is also reviewing whether these videos violate its community guidelines on harmful body image content.
This isn’t the first time TikTok has had to address concerns over beauty-related filters. In late 2024, the platform announced plans to block users under 18 from accessing extreme beauty filters that alter facial features, such as those that enhance jawlines or enlarge lips. While intended to protect young users, the effectiveness of these restrictions remains questionable, given how easily age settings can be manipulated.
The removal of the chubby filter is just another example of how social media trends can quickly spiral into harmful territory. While filters can be creative tools, they also have the power to reinforce outdated beauty standards and contribute to a culture of body shaming.