Guide to parental controls on social networks

CNN Business

Just over a year ago, social media companies were put on notice about how they were protecting, or failing to protect, their youngest users.

In a series of congressional hearings, executives from Facebook (FB), TikTok, Snapchat and Instagram faced tough questions from lawmakers about how their platforms can steer young users toward harmful content, damage mental health and body image (particularly among teenage girls), and lack sufficient parental controls and safeguards to protect teens.

Those hearings, which followed revelations in what became known as whistleblower Frances Haugen’s “Facebook Papers” about Instagram’s impact on teens, prompted the companies to pledge to change. All four social networks have since introduced more parental control tools and options aimed at better protecting younger users. Some have also made changes to their algorithms, such as having teens see less sensitive content, and have stepped up their moderation efforts. But some lawmakers, social media experts and psychologists say these solutions remain limited and there is still a long way to go.

“More than a year after the Facebook Papers dramatically exposed Big Tech abuse, social media companies have taken only small, slow steps to clean up their act,” Senator Richard Blumenthal, who chairs the Senate Consumer Protection Subcommittee, told CNN Business. “Trust in Big Tech is long gone and we need real rules to keep kids safe online.”

Michela Menting, director of digital security at market research firm ABI Research, agreed that social media platforms “offer very little of substance to counter the ills their platforms incur.” Their solutions, she said, put the onus on guardians to enable various parental controls, such as those intended to filter, block and restrict access, along with more passive options, such as monitoring and surveillance tools that run in the background.

Alexandra Hamlet, a New York-based clinical psychologist, recalls being invited to a roundtable about 18 months ago to discuss ways to improve Instagram, especially for younger users. “I don’t see a lot of our ideas implemented,” she said. Social media platforms, she added, must work to “continue to improve parental controls, protect young people from targeted advertising and remove objectively harmful content”.

The social media companies featured in this article declined to comment or did not respond to a request for comment on criticism that more needs to be done to protect young users.

For now, guardians should learn to use parental controls while keeping in mind that teenagers can often bypass these tools. Here’s a look at what parents can do to keep their kids safe online.

After the fallout from the leaked documents, Meta-owned Instagram suspended its much-criticized plan to release a version of Instagram for kids under 13 and focused on making its main service safer for younger users.

It has since introduced an education hub for parents with resources, tips and expert articles on user safety, and rolled out a tool that allows guardians to see how much time their kids spend on Instagram and set time limits. Parents can also receive updates on which accounts their teen follows and which accounts follow them, and can be notified if their child updates their privacy and account settings. Parents can also see which accounts their teens have blocked. The company also offers video tutorials on using the new monitoring tools.

Another feature encourages users to take a break from the app after a predetermined amount of time, suggesting they take a deep breath, write something down, check a to-do list, or listen to a song. Instagram also said it’s taking a “stricter approach” to the content it recommends to teens and will actively nudge them toward different topics, such as architecture and travel destinations, if they dwell on any type of content for too long.

Meta’s Family Center provides monitoring tools and resources, such as articles and tips from leading experts. “Our vision for Family Center is to ultimately enable parents and guardians to help their teens manage experiences across Meta technologies, all from one place,” Liza Crenshaw, a Meta spokesperson, told CNN Business.

The hub also offers a guide to Meta’s VR parental supervision tools from ConnectSafely, a nonprofit aimed at helping kids stay safe online, to help parents discuss virtual reality with their teens. Guardians can see which accounts their teen has blocked and access supervision tools, as well as approve their teen’s download or purchase of an app that is blocked by default because of its rating, or block specific apps that may be inappropriate for their teen.

In August, Snapchat introduced a parents’ guide and hub aimed at giving guardians more insight into how their teens use the app, including who they’ve been talking to over the past week (without disclosing the content of those conversations). To use the feature, parents need to create their own Snapchat account, and teens have to opt in and give permission.

Although this is Snapchat’s first official foray into parental controls, it previously had some safety measures in place for younger users, such as requiring teens to be mutual friends before they can start communicating with each other and prohibiting them from having public profiles. Teen users have their Snap Map location-sharing tool turned off by default, but can also use it to share their real-time location with a friend or family member even while their app is closed as a safety measure. Meanwhile, a Friend Check Up tool encourages Snapchat users to review their friend lists and make sure they still want to be in touch with certain people.

Snap has previously said it is working on more features, such as letting parents see which new friends their teens have added and allowing them to confidentially report accounts that may be interacting with their child. It’s also working on a tool to give younger users the option of notifying their parents when they report an account or piece of content.

The company told CNN Business that it will continue to build on its safety features and consider feedback from the community, policymakers, safety and mental health advocates and other experts to improve the tools over time.

In July, TikTok announced new ways to filter out mature or “potentially problematic” videos. The new safeguards assign a “maturity score” to videos detected as potentially containing mature or complex themes. It also rolled out a tool that aims to help people decide how much time they want to spend on TikTok. The tool lets users set regular screen time breaks and provides a dashboard that details how many times they’ve opened the app, a breakdown of daytime and nighttime usage, and more.

The popular short video app currently offers a Family Pairing hub, which allows parents and teens to customize their safety settings. A parent can also link their TikTok account to their teen’s app and set parental controls, including limiting how much time their teen can spend on the app each day, restricting exposure to certain content, deciding whether teens can search for videos, hashtags, or live content, and choosing whether their account is private or public. TikTok also offers a Guardian’s Guide that explains how parents can best protect their kids on the platform.

In addition to parental controls, the app restricts access to certain features for younger users, like live and direct messaging. A pop-up also appears when teens under 16 are ready to post their first video, asking them to choose who can watch the video. Push notifications are blocked after 9 p.m. for account users ages 13-15 and 10 p.m. for users ages 16-17.

The company said it will do more to raise awareness of its parental control features in the days and months ahead.

Discord did not appear before the Senate last year, but the popular messaging platform has faced criticism over how difficult it can be to report problematic content and over the ease with which strangers can connect with younger users.

In response, the company recently updated its Safety Center, where parents can find guidance on how to turn on safety settings, FAQs about how Discord works, and tips on how to talk about online safety with teens. Some existing parental control tools include an option to prevent a minor from receiving a friend request or direct message from someone they don’t know.

Still, it is possible for minors to connect with strangers on public servers or in private chats if the person was invited by someone else in the room or if the channel link is dropped into a public group that the user has accessed. By default, all users, including those between the ages of 13 and 17, can receive friend invites from anyone in the same server, which then opens up the ability for them to send private messages.
