Children live on the web and benefit from it. But they also face hazards from toys, apps and even well-intentioned adults sharing their personal data. Here’s how to help toddlers and teens navigate the ever-changing online world.
A couple of years ago, around 6:00 a.m. on a school day, Gail Gould was making breakfast for her 12-year-old son when he walked into the kitchen, terrified. He handed Gould his smartphone, and on the screen, she saw a lewd photo. “I just couldn’t believe what I was seeing,” she says. “I was in shock!”
Her son—a sixth grader in Houston, Texas—had received the photo via text from an anonymous number, and Gould wanted to understand how and why. Was it someone they knew? Did a hacker break into their home network and access his information? To find out, Gould, a health educator, quickly alerted the boy’s school, reported the incident to police, and even hired a private investigator. But the number, the authorities told her, was untraceable.
Just two weeks before their son received the obscene message, Gould and her husband, a business litigation lawyer, had given him his first smartphone. They worried he was too young, but kept hearing from friends that he needed it for research at school. So they drafted a smartphone contract, restricting where, when, and how he could use his device, and warned him about inappropriate pictures, emails or notifications he might receive.
“Just let me know if that happens and we’ll handle it,” Gould told her son.
“Oh Mom,” she remembers him saying. “I’m sure it will be fine.”
Clearly it wasn’t. Gould’s son was repulsed by the photo, but he moved on quickly. The incident, however, was a warning sign: Gould realized she needed to prepare for the onslaught of technology that would define her son’s teenage years.
More than a decade after Apple released the iPhone, revolutionizing the way we communicate, the world is still reckoning with the ramifications. Because digital media and technology are evolving so rapidly, it’s increasingly hard for people to keep up with the changes—especially parents such as Gould. According to a 2018 Pew Research Center study, 95 percent of teens now have a smartphone or access to one, and close to half say they’re online almost constantly. Meanwhile, the amount of time young children spend on mobile devices tripled between 2013 and 2017, from 15 to 48 minutes a day, according to a report from the education and advocacy nonprofit Common Sense Media.
As smartphones and other technological changes transform our lives—bringing the world closer than ever before—these devices and platforms also pose dilemmas for parents and their children. From hackable smart toys, to social media that siphons our personal data, to laws that many believe are outdated, experts say children’s privacy is being violated—constantly. As Ariel Fox Johnson, senior counsel for policy and privacy at Common Sense Media, puts it: “Kids are growing up with most of their lives under surveillance.”
Navigating this territory can be intimidating, even unsettling, leaving parents feeling helpless. And there’s a lot at risk, namely children’s personal and financial security. But adults can help kids stay safe and smart online at every age—and ensure they grow up engaging in positive ways with the technologies that will define their generation.
The surveillance problems start at home. Parents are often unwitting enablers, experts say, exposing their kids to marketers, hackers, bullies, and even predators. Every fall, moms and dads let loose a deluge of back-to-school photos on social media, making names, ages, and schools—sometimes even favorite foods and pets (which might one day inspire passwords or security questions)—accessible to anyone who can see that parent’s post. The same goes for welcome-to-the-world pictures, which typically give away babies’ full names and dates of birth, making it easier for children’s identities to be stolen.
“Everyday digital acts by parents wind up jeopardizing kids’ digital data privacy and security,” says Leah Plunkett, an associate dean of the University of New Hampshire School of Law, and an expert on digital privacy and youth. “Those acts are typically done with the best of intentions, and almost all the time, they are perfectly legal.”
Smart toys are often culpable, too. In 2015, Mattel released Hello Barbie, an interactive, Wi-Fi-enabled doll that listened to kids, responded, and remembered conversations. When a child pressed a button on the toy, everything he or she said—likes, dislikes, answers to questions the doll asked—was sent to cloud servers. Mattel’s technology partner, ToyTalk (now PullString), stored and analyzed those recordings with voice-recognition software, then formulated “personalized” responses for Hello Barbie to deliver back to the child. Over time, the doll could learn a child’s name and interests.
Critics pounced. They warned that the smart doll not only hampered creative play, which is critical to child development, but could also give kids a false sense of safety with smart toys—and was itself vulnerable to data breaches and hackers. Experts also worried about the ramifications of a toy storing and analyzing children’s private conversations. Germany banned the doll, and a local publication nicknamed it “Stasi Barbie.” Still, Hello Barbie sold out in 2015, and while Mattel did not continue manufacturing it, the company developed more toys with similar technology, like Barbie’s Hello Dreamhouse—a two-story smart home that recognizes 100 voice commands, from opening and closing the front door to turning on disco lights for a party.
“We were the first to launch a product that had this child-friendly speech recognition, and with that came a lot of education for parents about what that meant,” a Mattel spokesperson said, adding that all the company’s products comply with the law and are verified by an independent company to make sure they protect children’s safety and security. The spokesperson also said that “new digital innovation products like Hello Barbie require parental consent to set up and place parents front and center to control the experience,” and that neither Mattel nor ToyTalk “use any data collected to advertise or market to children or anyone else.”
Over the last few years, smart toys have proliferated, and so have the privacy questions they raise. Last year, a security researcher realized that CloudPets, internet-connected stuffed animals that let parents and kids leave voice messages for each other, also left a trail of digital breadcrumbs for hackers, including logins, passwords, and recordings on an easily purloined database. Germany banned a smart doll called My Friend Cayla over concerns that hackers could listen in on children, even speak to them. Many companies argue that they collect data to provide and improve their services, Plunkett says, and that they protect or anonymize that data. But there are so many smart toys on the market that scoop up information using cameras, mics, and other functions that the FBI issued a PSA last year reminding consumers to “consider cyber security prior to introducing smart, interactive, internet-connected toys into their homes or trusted environments.”
“I’m waiting for the day a connected toy is being used in a custody battle,” says Johnson of Common Sense Media, “because I’m sure the kid will have said something to the doll or the doll overheard something.”
Even gadgets that are meant to simplify everyday life can be problematic. Some baby monitors are notoriously hackable, and smartwatches for children are rife with “serious security and privacy flaws,” according to a report from the Norwegian Consumer Council, a government agency focused on consumer protection. In 2015, a hacker accessed personal information belonging to nearly five million parents and more than 200,000 children who used VTech’s Learning Lodge app store, exposing names, emails, passwords, genders, and birth dates.
“Some people love Nest and Alexa,” says Ereni Markos, associate professor at Suffolk University and an expert in digital privacy and marketing, “but it’s important to understand how convenience and cost weigh off each other. Alexa playing your favorite music is convenient, but what is ‘she’ doing with all that information?”
Young people’s love of technology started long before Snapchat and Instagram. Here’s a brief history of the gadgets and platforms that have changed childhood.
Nintendo Entertainment System (NES) is released. Its massive success is attributed at least partly to the instantly classic Super Mario Bros.
Nokia releases the first consumer cell phone that enables users to send SMS text messages. Texting took a while to catch on. In 1995 the average text messaging customer was sending 0.4 texts per month.
AOL’s Instant Messenger is released, giving young people a whole new way to communicate with friends and strangers—as themselves and through made-up personas.
Google beta is launched. The site averaged 10,000 search queries a day in its first year. Today, Google processes more than 40,000 search queries every second.
Microsoft releases Xbox with its tremendously popular Halo franchise. The game also let teens chat and connect online.
The first YouTube video is uploaded. By 2011, users were uploading 48 hours of video every minute.
Facebook opens to all users over 13. What started as a social network for college students in 2004 grew to more than 100 million users by 2008. Today Facebook has 2.23 billion monthly active users.
Apple introduces the iPhone. Customers bought 1.4 million iPhones in its first year, and between 2007 and 2016 Apple sold more than one billion worldwide.
Instagram launches in 2010, followed by Snapchat in 2011. The rise of social media platforms transformed the way teens and tweens socialize and present themselves online. Today 72 percent of U.S. teens ages 13 to 17 say they use Instagram and 69 percent of teens say they use Snapchat.
Most people have a vague understanding that every move they make online is being tracked by data brokers who buy and sell that information for profit. But few people know about the thriving commercial marketplace for student data, and according to experts, it’s plagued by a lack of transparency, accessibility, and adequate governance. Schools collect an extraordinary amount of information, including students’ names, test scores, Social Security numbers, internet searches, even health and disciplinary data. And they are legally permitted to share some of this data with third parties (including government agencies and for-profit companies) without prior parental consent. A new study from Fordham University’s Center on Law and Information Policy found that data brokers can purchase student lists based on “ethnicity, affluence, religion, lifestyle, awkwardness, and even a perceived or predicted need for family planning services.”
“You wind up with potentially large amounts of sensitive information leaving the school ecosystem without anyone realizing it,” says Plunkett, whose book, Sharenthood: How the Digital Tech Habits of Parents, Teachers, and other Trusted Adults Harm Kids and Teens, will be out next fall. “We don’t have the legal system in place even to know what data brokers collect about us, let alone to see it or try to fix it.”
Experts are not exactly sure how all this information will be used today, tomorrow, or decades from now. Employers and admissions officers are already scrutinizing what young people post online. Could health care companies take advantage of medical data collected decades earlier via smart baby booties, which monitor heart rate, oxygen levels, and sleep? Could life insurance companies assess a 25-year-old’s risk based on information collected and aggregated by a data broker back when the applicant was a child? What if credit card companies start targeting people because they struggled with math in high school? “The more private digital data is available to future decision makers, the more they will start to use it,” Plunkett says. “Unless explicitly prohibited by law, regulation or policy, they will inevitably look to use it for really high-stakes, life-altering decisions.”
In 1995, before “tweeting” meant anything other than a bird’s warble—before Google even existed—Kathryn Montgomery and her husband, Jeff Chester, noticed something sinister on the internet. They’d recently founded the Center for Media Education, now the Center for Digital Democracy, a nonprofit focused on media policies for children. At a conference in New York about kids and the digital world, they realized that everyone was amped about marketing to children. “We could see exactly where it was going,” says Montgomery, professor emerita at American University’s School of Communication. She and Chester, the center’s executive director, went on to spearhead the research and advocacy that led to the only federal law protecting children’s online privacy.
Enacted in 1998, the Children’s Online Privacy Protection Act (COPPA) puts parents in control of what information websites, apps, and services aimed at kids can collect from children under 13. “I think it’s amazing we were able to get anything passed,” Montgomery admits. “What I wanted was a law that would protect children up to the age of 18, but I had strong opposition from the [technology] industry and the civil liberties communities.”
Despite updates, the law remains under-enforced according to some privacy experts, and it has failed to keep up with emerging digital technology. A 2017 study from the market research firm Smarty Pants found that 85 percent of 6-to-12-year-olds use YouTube daily, which is technically for users 13 and up. Another issue: companies may not always comply with the law. Both YouTube and the Google Play Store have been accused of violating COPPA. (Google, which owns YouTube, did not respond to a request for comment. But in April, when asked about allegations that thousands of children’s apps in the Google Play Store might be violating COPPA, the company told The Washington Post, “We’re taking the researchers’ report very seriously and looking into their findings.”) In September, New Mexico’s attorney general filed a lawsuit against app maker Tiny Lab Productions and a handful of its advertising partners, including Twitter and Google, for violating COPPA, claiming dozens of Android apps harvested children’s data and shared it with advertising and online tracking companies. According to The New York Times, Tiny Lab’s founder, Jonas Abromaitis, “said he believed he had followed the law and Google’s requirements, because the app asked for users’ ages and tracked those who identified as over 13.” Still, Google ended its relationship with Tiny Lab and removed its apps from the Play Store. A recent Times analysis of 20 children’s apps on Android and iOS found that several “sent data to tracking companies, potentially violating children’s privacy law.”
1. Keep computers and tablets in a public space.
It’s a lot easier to keep tabs on your kids’ online activity when they’re in the kitchen as opposed to their bedroom with the door closed. Create a public charging station in your home, a place where all devices (including yours) are charged and left overnight.
2. Draft an online conduct contract.
Set clear rules and consequences that everyone can agree on. Make sure your kids understand your expectations, whether they’re using a phone, a laptop, or a game console.
3. Start with the basics.
Teach kids basic ways to protect themselves without limiting their fun. Start with simple changes like opting out of location sharing, setting strong passwords, and covering computer and smartphone cameras when they’re not in use.
4. Be their friend online.
If you agree to let your tween use Instagram or Snapchat, make it clear that you’ll have access to the account. You’ll be a friend. You’ll know the password.
5. Don’t be afraid to ask questions.
Talk, educate, and listen. Be informed about what you bring into your home, and what devices and apps your kids are using. And be conscious of your own digital behavior. Don’t use your phone at the breakfast table unless you want the whole family to do the same.
The majority of Americans use social media today, but parents and teens gravitate to different platforms.
Many teens believe that tech companies manipulate users to spend more time on their devices.
Parents and kids aren’t reading the terms of service. No surprise there. But they’d have a better idea of what’s happening to their data if they did.
Most say it is “extremely important” for social media sites to ask permission before sharing or selling their personal information, yet almost never or only occasionally read the terms of service on social networking sites.
About eight in 10 teens (79%) have changed their privacy settings on a social networking site to limit what they share with others (86% of parents have done the same).
Most 13- to 17-year-olds check social media daily, and many teens who use social media say they have more than one account on the same platform, often to keep one account hidden from certain people.
Until the laws change, there are many ways parents can push back against shrewd marketers, data brokers, and all those hackable devices. At a basic level, experts say, be informed: make sure you understand (and activate) the privacy settings of any device your child uses and know how much information it’s collecting. If that feels overwhelming, the device may not belong in your home. Changing default passwords on your wireless network, smart toys, and other devices helps, as does covering computer and smartphone cameras when they’re not in use. Opting out of location sharing—on your own posts as well as your kids’—reduces access to personal information.
For children under four, the rule is simple: limit screen time—yours and theirs. There are so many developmental milestones (face-to-face interactions, social skills, language) that are more important for babies and toddlers to master. The American Academy of Pediatrics recommends no screen time other than video chats for children under 18 months. For kids ages two to five, it recommends limiting screen time to an hour or less each day of high-quality content. But screen time isn’t the only issue for little ones. The mere presence of a parent’s smartphone can be problematic. “If the parent has the phone face-up during playtime, that says the phone is a partner, and can affect the quality and depth of the interaction with the child,” says Dave Anderson, senior director of the ADHD and Behavior Disorders Center and director of programs at the Child Mind Institute in New York. “Everything kids learn about screens comes from the people in front of them.”
In elementary school, children start exploring the internet, and that’s a good thing, experts say. Common Sense Media recommends age-appropriate apps, movies, books, and TV shows, but parents should pay close attention. Last year, reports surfaced of children clicking their way to wildly inappropriate YouTube videos that twist kid-friendly storylines into something more menacing. Online games are a problem, too, because they have embedded chat functions. “The most frequent issue is not that there’s a predator but that the chat is not age-appropriate. Kids are saying swear words or things that are racist or sexist,” Anderson says. “Make sure that if an elementary school kid is playing a game, turn off the chat function. Try to monitor how they communicate with other users.” Also, watch with them. That way, parents can intervene, helping kids make sense of what they’re seeing and ensuring they’re only interacting with friends and classmates online. And before you hand over your smartphone to your young child so you can work or fix dinner, create a folder on the Home screen filled with age-appropriate apps you’ve personally researched.
As children enter middle school, experts say parents should continue setting good screen habits by defining boundaries and checking in often. That’s not always easy. Adolescents’ brains are wired for instant gratification, not long-term risk assessment, so they’re far more likely to post personal or inappropriate information. That’s why parents need to do the hypothetical thinking for them. One way to start is by asking your middle schooler what kinds of tech they’re interested in, and why. If you agree to let your tween use social media, make it clear that you’ll have access to the account. Create a charging station in your home where all devices (including yours) are charged and left overnight. Signing a smartphone contract isn’t a panacea, as the Goulds found out, but it does help set expectations for the entire family.
Markos, the digital privacy expert from Suffolk University, urges parents to take tech out of kids’ bedrooms. “They’ll be miserable about it, but that’s how you protect your kid. Be that parent,” she says. “No matter how many privacy controls you put on, they can bypass and get any content they wish for.”
Most of all, Anderson says, avoid unnecessary criticism of the kinds of media you do allow. No get-off-my-lawn whining about how things were better in your day, he adds. “Let them know you’re interested in their digital world.”
By high school, parents should continue keeping tabs on their teens’ digital lives, but experts caution them against spying (unless there’s a reason). “It’s like teaching your child to drive. You don’t just give them the keys and say, ‘Hey, go on the freeway,’” says Kaveri Subrahmanyam, chair of child and family studies at California State University, Los Angeles. “They get a permit. Then you have to drive 50 hours in the car with them. They are restricted.”
Eventually, she says, parents can ease up, letting their kids use technology on their own, always with an open line of communication. And never discount the importance of modeling the behavior you hope to see in young people. “If the parents are constantly on their phones, guess what? You don’t have to be Einstein to know that your child will do the same,” says Subrahmanyam. “If your behavior changes, your child’s will, too.”
For Gould, one of the biggest takeaways from what happened with the lewd photo is the importance of talking to your kids about technology. “I was almost glad that it happened,” she says of the incident. It was the moment she realized she had a lot to learn about her son’s digital life. Within the first four months after he received the disturbing photo, he used a smartphone app to make her car radio go haywire. He then used it to interfere with the mouse on laptops in the school library. He later disabled his teacher’s phone, then figured out how to skirt around parental controls on his own device. “We took his phone away for a year,” Gould says. “He was upset, but he kept violating the tenets of the contract.”
The next year—seventh grade—was intense, too. Gould and her son battled over whether he could play first-person shooter games like Call of Duty and Halo. “We wouldn’t let him even though all his friends were. These games are meant for 18 and up—he was 13!” she says. Then he discovered YouTube. When she learned he’d been watching a star named PewDiePie, she Googled him and found he’d posted videos with anti-Semitic imagery and used a racial slur. So she sat her son down for a talk. Soon after, she brought that conversation to his school and organized a viewing of Screenagers, a documentary about growing up in the digital age, for students and parents. “He was so embarrassed,” Gould says, “but so many parents had no idea what kids were doing with their phones.”
Now that her son is almost 15, Gould still checks his emails, texts, and browsing history, and reminds him to limit his screen time, but “he has more impulse control and is not as careless about what he does online,” she says. “We have a lot of family meetings and discussions, and I think he understands our doctrine. It’s clear to him what we find acceptable and unacceptable.”