October 11, 2024

At just 17, Chris McCarty is taking matters into their own hands to protect children from being exploited for cash in family vlogs.
As part of their project for the Girl Scout Gold Award, the highest honor in the program, the Seattle teenager spent months researching child influencers: kids who rake in serious cash for their appearances in YouTube vlogs, which are often run by their parents. McCarty was so fascinated and appalled by the lack of regulation at the intersection of child labor and social media that they used their high school’s senior independent study program to phone-bank their neighbors and gauge community interest in the issue.
In January, McCarty cold-emailed a number of local lawmakers, including Washington State Representative Emily Wicks, who serves on the Children, Youth & Families Committee. McCarty presented their research and convinced the representative to work with a teenager to draft a new bill at the very end of the legislative session.
“I randomly got an email from Chris, and they said, ‘Here’s the problem, and here’s the potential solution,’” Wicks told TechCrunch. “I really wanted to provide the opportunity to help them understand exactly how the legislative process works, no matter how far we were able to get with the bill.”
With a 20-year age difference, McCarty and Wicks have distinctly different experiences when it comes to growing up on social media, but they worked together to propose House Bill 2023.
“Children are generating interest in and revenue for the content, but receive no financial compensation for their participation,” the bill about family vlogs reads. “Unlike in child acting, these children are not playing a part and lack legal protections.”
If passed by the Washington state legislature, the bill would apply to content that generates at least 10 cents per view and features an individual minor more than 30% of the time. In that case, a percentage of the family vlog’s income would be set aside in a trust to be given to the child when they turn 18. At that point, the individual could also request that the content they appear in be removed from the tech platform.
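To make the bill’s thresholds concrete, here is a minimal sketch of how its qualification test and trust set-aside could work. Only the 10-cents-per-view and 30% figures come from the bill as described above; the function names and the 15% rate (borrowed from the Coogan law discussed later in this article) are illustrative assumptions, since the bill’s exact percentage isn’t quoted here.

```python
# Hypothetical sketch of HB 2023's thresholds as summarized in this article.
# The 15% trust rate is an assumption modeled on the Coogan law, not the bill's text.

def qualifies_for_trust(revenue_per_view: float, minor_screen_time_share: float) -> bool:
    """Content qualifies if it earns at least $0.10 per view and an
    individual minor is featured more than 30% of the time."""
    return revenue_per_view >= 0.10 and minor_screen_time_share > 0.30

def trust_set_aside(vlog_income: float, trust_rate: float = 0.15) -> float:
    """Portion of the vlog's income to hold in trust until the child turns 18."""
    return vlog_income * trust_rate

# Example: a vlog earning $0.12 per view, featuring a child 40% of the time,
# that brought in $10,000 would set aside $1,500 at the assumed 15% rate.
if qualifies_for_trust(0.12, 0.40):
    print(trust_set_aside(10_000))  # 1500.0
```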
The bill was introduced late in the legislative session, so it has not yet been heard on the House floor.
“I’m actually not running for re-election … but I want to hand off this bill to the right people to continue working on this,” Wicks said. She highlighted the value of intergenerational collaboration on legislation, especially when it comes to topics like the impact of social media on children. “This is what being in the legislature is all about — teaching people, and helping and being the director of these great ideas people have.”
At any level of government, the journey from a proposed bill to a law is a tough one, especially when it comes to legislation like HB 2023, which would require direct cooperation from tech companies like YouTube or Instagram. But as social media stars continue to edge their way into the entertainment industry, the lack of regulation for children in this line of work is becoming more concerning.
Life was good for Myka Stauffer, a family vlogger who amassed a combined 1 million YouTube subscribers across two channels. She rose to prominence after producing a series of 27 videos that chronicled her family’s journey through the emotional and arduous process of adopting Huxley, a child from China. Stauffer was swimming in sponsorships from brands like Glossier and Fabletics, and she was high-profile enough that People magazine announced the birth of her next son, Onyx.
But in 2020, three years after the adoption of 5-year-old Huxley, fans realized that he suddenly no longer appeared in any of her videos. At their audience’s urging, Stauffer and her husband revealed that they had re-homed Huxley because they felt that they couldn’t properly care for a child with autism.
Of course, YouTube commenters and armchair Twitter experts alike erupted into discourse: Was this just a couple in over their heads, helping a child find caretakers who could better nurture him? Or did they build their fame and riches off the story of an oblivious toddler, making his life public to millions of viewers, only to discard him when things became too hard?
McCarty became interested in the ethics of family vlogging when they learned about Huxley’s story.
“If Myka Stauffer is doing it, there are probably a lot of other families doing it too,” McCarty told TechCrunch. “These children, often they’re not old enough to even understand what’s going on, much less fully consent to it.”
As early as 2010, amateur YouTubers realized that “cute kid does stuff” is a genre prone to virality. David DeVore, then 7, became an internet sensation when his father posted a YouTube video of his reaction to anesthesia called “David After Dentist.” David’s father turned the public’s interest in his son into a small business, earning around $150,000 within five months through ad revenue, merch sales and a licensing deal with Vizio. He told The Wall Street Journal at the time that he would save the money for his children’s college costs, as well as charitable donations. Meanwhile, the family behind the “Charlie bit my finger” video made enough money to buy a new house.
Over a decade later, some of YouTube’s biggest stars are children who are too young to understand the life-changing responsibility of being an internet celebrity with millions of subscribers. Seven-year-old Nastya, whose parents run her YouTube channel, was the sixth-highest-earning YouTube creator in 2022, earning $28 million. Ryan Kaji, a 10-year-old who has been playing with toys on YouTube since he was 4, earned $27 million from a variety of licensing and brand deals.
Just days ago, the controversial longtime YouTuber Trisha Paytas rebranded their channel to the “Paytas-Hacmon Family Channel,” which documents “the lives of Trish, Moses and Baby Paytas-Hacmon (coming sept 2022).” Already, Paytas has posted on their family vlog about ultrasounds, maternity clothing and what they eat while pregnant.
That’s not to say that influencer parents are inherently money-hungry and shortsighted. Kaji’s parents, for instance, said they set aside 100% of his income from his Nickelodeon series “Ryan’s Mystery Playdate” for when he comes of age. But there are almost no guardrails to prevent these children from being exploited.
Huxley was adopted by another family, and an investigation by local authorities concluded that he is “very happy and well taken care of.” But when he gets older, he will discover that private details about his intercontinental adoption and his disability were the subject of widespread online debate and news coverage while he was only a toddler. The Stauffer family profited from Huxley’s appearances in their family vlogs (though, for obvious reasons, they no longer work as social media personalities), but Huxley will never be compensated for his unwitting role in their videos.
“I really think the biggest issue here is a lack of advocacy, because so many people don’t see this as a problem, or they don’t think about it,” McCarty said. In addition to working on HB 2023, they started a website called Quit Clicking Kids to raise awareness about the problems facing child influencers. “But then once you get those really personal stories from specific instances of [child] influencers, I think that’s the thing that changes minds the most.”
In Hollywood, child actors are protected by the Coogan law, which requires parents to put 15% of a minor’s gross wages into a trust account, called a Coogan account. This protection, enacted in 1939 and updated in 1999, inspired a similar clause in the bill that Wicks introduced in Washington state.
“A lot of kids on social media accounts, there’s no protection for them right now,” McCarty told TechCrunch. “They don’t have any kind of regulated hours the way child actors do now or Coogan laws to protect them. So I really think that we need to offer the same considerations from child actors to child stars in vlogs.”
In 2018, California State Assembly member Kansen Chu attempted to pass legislation that would classify “social media advertising” as a form of child labor. This would have required minors to obtain work permits, which would offer them greater financial protections, since children in California can only get a work permit if they have a Coogan account. But before it was written into law, the legislation was considerably weakened. Now, these child labor classifications don’t apply if a minor’s online performance is unpaid and shorter than an hour, which exempts much of kid influencers’ work.
In France, a law that took effect in 2021 requires a child influencer’s earnings to be placed in a bank account that they can access only after they turn 16. Like the bill proposed in Washington state, the law also enforces a “right to be forgotten,” which allows the child to request that content they appear in be taken off the internet.
Since Frances Haugen’s landmark leaks of internal Facebook documents detailing the negative effects of social media on minors, several bills have been introduced (or reintroduced) in the U.S. Senate designed to protect ordinary kids using social media.
In October, representatives from YouTube, TikTok and Snap all agreed at a hearing that parents should have the ability to erase online data for their children or teens, a concept that appears in proposed updates to the historic Children’s Online Privacy Protection Act (COPPA).
Another way to regulate the monetization of family vlog and child influencer content would be for activists like McCarty to wage pressure campaigns that urge leaders at companies like YouTube or Instagram to put guardrails up themselves. But Wicks and McCarty believe that the legislative route is more likely to be effective.
“I say, work with [tech companies] to see if they can be good players,” Wicks said. “How could we potentially work with them to say, ‘This is what our intent is, this is what we’re trying to do. … How, based on what you have today, can we make this happen?’”
Even if lawmakers mandate that tech companies wipe a child’s digital footprint, for example, there’s no guarantee that the platforms themselves have the technology to honor these requests. U.S. senators have grappled with these challenges, questioning reticent executives from Snapchat, TikTok, YouTube and Meta about how the seemingly opposing forces of Big Tech and government might cooperate.
“There have been hearings, but there hasn’t really been any kind of meaningful, concrete action,” McCarty said. “So to me, going through legislation was a better way to get more people talking about it, to make concrete change.”