In an effort to promote healthier self-image among young users, YouTube is implementing changes to limit the recommendation of fitness and weight-related videos to kids and teenagers.
The platform’s Youth and Families Advisory Committee explained, “Certain content may seem harmless in isolation, but could potentially be problematic for some teens if viewed repeatedly. This includes videos that compare physical features, idealize specific body types or fitness levels, or showcase social aggression through non-contact confrontations.”
By restricting the frequency of these recommendations, YouTube aims to mitigate the risk of young viewers developing unrealistic expectations or negative self-perceptions due to constant exposure to unattainable ideals.
These modifications, already in place in the United States, are now being extended to Europe and the UK, as reported by Vice.
YouTube has faced criticism in the past for its algorithm’s tendency to lead viewers down potentially harmful content paths. For instance, a study last year revealed that the platform was inadvertently promoting gun-related content to young users.
While YouTube disputed the accuracy of this particular study, it’s encouraging to see the company taking steps to limit the spread of potentially harmful content.
As part of the upcoming update, YouTube is also introducing a “Family Center” feature to help parents monitor their children’s activity on the site more effectively.
“The Family Center will provide parents with insights into their teens’ YouTube activity, including the number of uploads, subscriptions, and comments,” YouTube stated, according to The Independent.
“Both parents and teens will receive proactive email notifications for significant events, such as when teens upload a video or start a livestream, offering opportunities for guidance on responsible content creation,” the platform added.
Familyguide previously reported on YouTube’s safety measures:
In late 2022, YouTube updated its creator guidelines to discourage content related to sexual themes, drug use, and dishonest behavior, alongside a controversial profanity policy change.
“Our policies are continuously evolving,” a Google representative explained. “We are committed to fostering a healthy digital advertising ecosystem.” While these changes primarily benefit advertisers, they also contribute to creating a safer environment for children on the platform.
The updated guidelines categorized topics related to drug trade organizations as “harmful or dangerous acts” and included drug usage as grounds for demonetization.
“We prohibit monetization of content featuring graphic sexual text, images, audio, or games, as well as non-consensual sexual themes, whether simulated or real,” a Google representative clarified. These additions expanded YouTube’s strict policies on sexual content, including nudity, sexual entertainment, and related merchandise.