On Feb. 13, JoJo Siwa posted a YouTube video from a Target store, detailing her plan to buy “every single item of JoJo merch” inside. She starts with clothes, piling her own trademarked shirts and dresses into a cart. “I literally got one of everything I could find!” Siwa says to the camera. “Now, let’s go see if they’ve got toys.”
Siwa encapsulates many of the things that made YouTube the world’s most-watched video site. She dances, sings and screams excitedly into the camera, drawing millions of viewers, mostly young girls. The 15-year-old kidfluencer also highlights how YouTube’s success with children has created an ethical and perhaps even legal minefield for its owner, Alphabet Inc.’s Google.
In addition to shooting quirky videos, Siwa cuts endorsement deals and sells two branded apparel lines with Target Corp., the second-largest U.S. retailer. When she posts clips from the company’s stores, she’s creating content that is difficult to distinguish from advertising and will likely be watched by hundreds of thousands of impressionable kids.
Since it was founded in 2005, YouTube has operated beyond the reach of rules that govern advertising on traditional television. But the site has grown so large and influential that the days of light-touch regulation may soon be over. Kids’ programming is where the crackdown is most likely. The problem with sponsored content is that it’s not always clear what’s an ad. Kids are particularly vulnerable to being manipulated by paid clips that masquerade as legitimate content. On TV, the ground rules are clearer: Ads come when the show takes a break.
“The uptick in sponsored content and child influencers is very overwhelming,” said Dona Fraser, director of the Children’s Advertising Review Unit, an industry watchdog funded by companies including Google. “This has exploded in front of our eyes. How do you now wrangle every child influencer out there?”
The Federal Trade Commission warned dozens of Instagram influencers in 2017 that they weren’t disclosing properly when a company was paying them to peddle a product.
“YouTube content creators are responsible for ensuring their content complies with local laws, regulations and YouTube Community Guidelines, including paid product placements,” YouTube said in a statement. “If content is found to violate these policies, we take action, which can include removing content.”
YouTube removed one video featuring Siwa shopping at Target, after Bloomberg News asked about it on Tuesday.
Some video creators are loath to disclose clearly that their YouTube videos are sponsored. Kristine Pack runs “Family Fun Pack,” a channel with close to 8 million subscribers that posts sponsored clips. She says some major marketers demand such assertive disclosure – giant “paid for by” text that runs on the video – that it ends up turning off viewers. “I wouldn’t even want to watch that video,” she said about one clip she made with her kids. “It’s literally nothing but an ad.” Pack stresses that she always discloses paid content.
For viewers, deciphering when a video crosses the line into marketing is not always easy. Two weeks before JoJo Siwa’s February Target visit, she posted a video taking her younger brother on a shopping spree at the retailer. Siwa also posts shopping videos in other places where her clothes are sold, including Walmart stores. The clips aren’t tagged as ads.
Joe Poulos, a Target spokesman, said the company did not pay directly for either of Siwa’s videos shot at the retailer’s stores. He also said the company has never compensated Siwa for “creating or distributing any content for Target.” Walmart didn’t respond to a request for comment.
Siwa’s video-recorded splurge on her own clothes would never clear regulatory hurdles to appear on TV, even without sponsorship from the retailer, according to Josh Golin, executive director for the Campaign for a Commercial-Free Childhood.
At the end of one Target shopping spree, Siwa tells viewers to visit the retailer’s stores and website to check out her product line. On children’s TV, that portion would categorize the entire video as an ad in the eyes of the Federal Communications Commission, which oversees TV standards, according to a person familiar with the agency’s thinking. A Target spokesman directed questions about the video to Siwa. Emails sent to the address listed on her YouTube channel weren’t returned.
YouTube avoids rules governing TV for children in part by citing the age restriction in its terms of service. Kids under 13 cannot use the video site. But children often lie about their age when signing up, and the sheer volume of videos aimed at children, as well as testimonials from parents, suggests that they are heavy users. When Siwa’s Target shopping spree clip ends, YouTube recommends an unending series of other clips that also appeal to young kids, including nursery rhymes, cartoons and toy “unboxing.”
“If they really were honest brokers about whether kids were allowed on the platform, they wouldn’t have so much kids’ content,” said Colby Zintl, vice president for Common Sense Media, which is pushing Congress to strengthen oversight of how children use services from Google and Facebook.
“YouTube does not allow users under 13 to create or own accounts on YouTube, and when we identify an account of someone who is underage we terminate that account,” the company said.
YouTube tried to address this problem in 2015 when it launched YouTube Kids, a mobile app for viewers younger than 13 that requires parental consent. The initiative highlighted how difficult it is for YouTube to monitor and filter all the videos that are uploaded to its service. Child and consumer advocacy groups complained to the Federal Trade Commission that the YouTube Kids app contained inappropriate content, including explicit sexual language and jokes about pedophilia.
If anything, the problem has gotten worse since then. In late 2017, YouTube purged thousands of videos aimed at kids after finding creepy clips spreading through its supposedly family friendly online community. It also kicked a considerable amount of children’s content out of a program called Google Preferred, a premium package of videos that command higher ad prices, according to three people familiar with the move.
But problems persisted. Last month, advertisers including Walt Disney Co. and toy maker Hasbro Inc. paused advertising on YouTube after a blogger demonstrated how the video site’s comments section could be used by pedophiles to tag and share clips of young girls. Google suspended comments on some videos featuring minors and deleted hundreds of accounts that had left concerning comments.
The attempted cleanup has had an unintended impact and might even be exacerbating the problem of surreptitious ads aimed at underage users. Some video creators turned to sponsored clips after YouTube’s crackdown dented their income from advertising.
Dave Pickett, who makes Lego videos, was dropped from Google Preferred and saw his monthly income from YouTube drop to four figures, from five figures before. (He declined to share exact figures, citing non-disclosure agreements with Google.) Pickett started hunting for sponsored content deals instead. He found a couple of deals worth roughly $3,000 over four months — not enough to support a full-time career as a YouTube creator.
YouTube doesn’t allow paid promotional content on YouTube Kids. Video creators are supposed to check a box that they received money or free products when they upload their video to the main YouTube site. Those videos are then supposed to be blocked from running on the YouTube Kids app.
However, the Campaign for a Commercial-Free Childhood has found prominent influencers with sponsored videos on YouTube Kids, suggesting that they are either not checking the box or Google is being misleading when it says these videos won’t appear on the app, according to Golin. EvanTubeHD, an incredibly popular channel, had at least one clip sponsored by toy maker Mattel Inc. running on the app this week. “We apply rigorous standards to providing disclosures on paid promotions and following YouTube’s guidelines,” EvanTubeHD wrote in an emailed statement. “This one video which included all required disclosures inadvertently went live without the proper box checked making it available on YouTube Kids. As soon as it was brought to our attention it was immediately corrected and removed from YouTube Kids.”
A coalition of child advocacy and consumer groups said in April that YouTube is used by more than 80 percent of U.S. children aged 6 to 12. Google collects personal data about these kids and makes “significant profits” from ads targeting them without first providing direct notice to parents and obtaining consent as required by the Children’s Online Privacy Protection Act, or COPPA, the coalition claimed in asking the FTC to investigate.
“YouTube is pretending not to be a site for children when it suits them. And yet they are heavily profiting from children being on the site,” said Golin. The Campaign for a Commercial-Free Childhood was one of the groups calling on the FTC to probe Google.
YouTube has avoided removing obvious kids’ content from its main website and running it only on the Kids app. That could be because of viewership numbers. The app has roughly 18 million monthly visitors, according to a person familiar with the company’s numbers. YouTube declined to comment. Google recently reported almost two billion monthly logged-in users for YouTube as a whole.
CARU’s Fraser suggested to YouTube that it build a toggle button that lets parents switch to a mode for kids with different videos and age restrictions. YouTube hasn’t responded to the idea, she said.
“They built it. They can rebuild it,” she said. “It’s a matter of whether or not they want to make the investment to do so.”