The digital transition of television has enabled a variety of free streaming options for children's programming. YouTube is a vast video-sharing platform that now functions like its own television channel, and children ages 2-11 can easily navigate and access it on a smart TV, phone, or computer. YouTube hosts high-quality educational content, but it faces a regulatory problem: the overwhelming volume of content and advertising makes it an unsafe platform for children.
In January 2020, YouTube engaged in formal self-regulation to create a safer platform for kids: any content marked "made for kids" can no longer carry targeted advertising. This change stems from a settlement with the Federal Trade Commission and the New York attorney general over allegations that YouTube violated the U.S. Children's Online Privacy Protection Act (COPPA) by illegally collecting data from children under 13; YouTube agreed to pay $170 million. YouTube now requires content creators to declare whether their videos or channels are child-directed. If the FTC determines a channel has violated the law by mislabeling its content, penalties run up to $42,530 per violation. However, many creators do not understand what counts as child-directed content; the COPPA rules feel vague and hard to implement correctly. These new COPPA-compliance rules have projected massive advertising-revenue losses for small channels that produce children's educational programming, and some channels have stopped creating videos because they would lose enormous ad income under the new rules. However, companies whose models do not depend on targeted ad sales, and that instead sell context-based spots, will likely stay on the platform.
In the book Understanding Media Industries, I learned that formal self-regulation creates self-imposed rules limiting or categorizing content. I believe YouTube only started to self-regulate children's media because the government saw the company taking advantage of children's data. The platform has always had guidelines outlining what type of content is permitted for the public's safety, but children make up a massive percentage of YouTube's consumers, and the platform needed to adjust its regulations to ensure their safety on the site. Another term I learned from the book is "safe harbor." For television, "10:00 PM to 6:00 AM is considered the safe harbor in which indecent content is allowed because broadcasters can reasonably assume children will not be in the audience" (80). YouTube tried to create a safe harbor of its own with YouTube Kids; this extension of the platform was meant to funnel all the child-friendly videos into a single location. Unfortunately, parents complained that YouTube Kids still contained content that wasn't safe for children, even though the creators had marked their videos as "made for kids."
Children spend massive amounts of time on YouTube, and the company needs to understand its responsibility toward young, developing minds. During the pandemic, I worked as a first-grade reading teacher, and fortunately, the school provided each student with a laptop to learn virtually. After students completed their work, I would allow them free time, and as I monitored, I questioned some of the YouTube content they were watching. For example, one of my students was watching a video of plush toys in adult scenarios. The content wasn't vulgar, and the student had clicked the video only because they saw toys. I understand how these videos become misleading, and I think there should be a safer search tool for children navigating the platform.
Spangler, Todd. “What You Need to Know About YouTube’s New COPPA Child-Direct Content Rules.” Variety, 3 Jan. 2020.
Havens, Timothy, and Amanda D. Lotz. Understanding Media Industries. 2nd ed., Oxford University Press, 2017.