A New York judge will hold a hearing Sunday on a landmark ruling that Instagram removed videos in which minors touched themselves in ways that violated the company’s policy on “bias-based violation,” meaning videos that used sexually explicit language and, in some cases, depicted actual sexual abuse. The decision extends a similar ruling this week against YouTube, in which a judge said the firm violated federal statutes by banning videos showing LGBTQ people.
Instagram blocked children from videos that encouraged masturbation, in the same way that it polices hateful speech and violent attacks, but did not block content that showed genitalia or sexual acts. A separate case involving TikTok, the short-video app owned by ByteDance that has more than 100 million users in the U.S., found that the company must allow children under 13 to use the app if their parents give consent.
A spokesperson for TikTok said, “We have strict policies against child endangerment and remove content that promotes sexual exploitation and child pornography. When any minors express their desire to use the app safely — many of our users are parents — we oblige. Unfortunately, when a family member or someone they know uses our app inappropriately or to the point of endangerment, this is where our focus turns. We want to be clear that if any minors feel uncomfortable on our platform and ask to be removed, we’re always happy to consider a review. And we work hard to provide our community with the tools they need, especially parents, to give their kids a safe and positive experience.”
It may be true that the TikTok app is not going to help teenagers get high, and that the 14-year-old boy whose evidence led to the ruling feels uncomfortable with the app, but preventing millions of young people from using a newly popular app is likely to be counterproductive. And in the case of the Instagram ruling, the company appears to have taken the injunction literally: it recently added a 3-D gimmick that allows commenters to “drill” into obscene and violent behavior.