
The Real State of Children’s Privacy

By Claire Quinn
For over 20 years, the United States Children’s Online Privacy Protection Act (COPPA) has been the global gold standard in children’s privacy protections. However, it covers only children under 13, offering no protections for teenagers between the ages of 13 and 17. When the Act was implemented, many disgruntled online services claimed its requirements were too onerous. Some simply added a “get out of jail free card” to their terms of service, stating that they were not for children under the age of 13. Weak “age gates” that children could easily circumvent gave young users access to services that were inappropriate and harmful, and that failed to protect their privacy. Many services have actual knowledge that children are using them but deliberately ignore it, treating kids as commodities and exploiting their data to generate vast profits. The FTC enforces COPPA, as can state attorneys general, but enforcement has been sporadic over the years. Children’s privacy has simply not been a priority, though FTC-approved COPPA Safe Harbors have battled to support the tiny percentage of online services that have joined their programs, opened themselves up to scrutiny and assessment, and worked to get it right for children.
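
To see why self-declared age gates are so weak, consider a minimal sketch of one (illustrative only; the function name and form are hypothetical, not any particular service’s implementation). The gate simply trusts whatever birth date a visitor types in, so a child can pass it by entering an earlier year.

```python
from datetime import date

COPPA_AGE_THRESHOLD = 13  # COPPA covers children under 13

def passes_age_gate(claimed_birth_date: date) -> bool:
    """A naive age gate: trusts whatever birth date the visitor enters.

    Illustrative only; not any particular service's implementation.
    """
    today = date.today()
    # Age in whole years, adjusting if this year's birthday hasn't happened yet.
    age = today.year - claimed_birth_date.year - (
        (today.month, today.day) < (claimed_birth_date.month, claimed_birth_date.day)
    )
    return age >= COPPA_AGE_THRESHOLD

# A 10-year-old who simply types a 1990 birth date sails straight through:
print(passes_age_gate(date(1990, 1, 1)))  # True
```

Nothing in the check verifies the claim; that is precisely the weakness described above.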

New and emerging technologies and online services have surged ahead at a phenomenal pace, leaving regulations behind and outdated. More children come online every day, and children make up a huge share of online users globally: UNICEF estimates that one in three internet users today is a child (under 18), and children are among the most vulnerable segments of the population. There is no “on/off switch” for today’s children; they are permanently on, constantly connected via devices that collect and process their data, from what they look at and for how long, to their eye movements, moods, and psychological profiles. The datafication of children starts early, sometimes before birth, with parents using pregnancy and health apps. By the time a child starts school, they have a sophisticated data profile that will shape their lives, often without their or their parents’ knowledge.

Hitting the headlines

Lack of privacy protections and abuse of children’s personal information have hit the headlines of late, sparking an outcry and increasing pressure on U.S. legislators to act. Meta’s (formerly Facebook’s) whistle-blower, Frances Haugen, lifted the lid late last year on the social platform’s exploitative practices and put the issue firmly in front of a wider audience.

“The company’s leadership knows ways to make Facebook and Instagram safer and won’t make the necessary changes because they have put their immense profits before people,” Haugen said in her written statement. “Their profit optimizing machine is generating self-harm and self-hate — especially for vulnerable groups, like teenage girls. These problems have been confirmed repeatedly by Facebook’s own internal research.”

The real state of children’s privacy would seem a sorry one. However, the tide is turning, with greater awareness among parents and the public, thanks to whistle-blowers and privacy advocacy groups, and a raft of legislation in the pipeline to protect children under the age of 18. It is a time of unprecedented change in the way the world approaches privacy for children, and that change is being driven from Europe and the U.S.

In 2018, the EU General Data Protection Regulation (GDPR) came into force. It was the first privacy regulation since COPPA to speak directly to children. The U.K.’s Data Protection Act 2018, which sits alongside the U.K. GDPR, included a requirement to introduce an age-appropriate design code, known as the Children’s Code. The Code has a statutory footing and is having a global impact. California has followed in the U.K.’s footsteps with its own version of the Children’s Code.

The U.K. Code came into force in September 2020, with a one-year transition period ending in September 2021. Around that time, Instagram, TikTok, and YouTube announced changes to their platforms that met some, though not all, of the Code’s 15 standards, while stating the changes had already been in the works.

TikTok turned off notifications for children past bedtime. Instagram disabled targeted advertisements for under-18s. YouTube turned off autoplay for teen users.

California has also gone some way toward addressing children’s privacy in the California Consumer Privacy Act (CCPA). Children between 13 and 16 years of age must affirmatively allow the sale of their personal information; for a child under 13, a parent or guardian must do so.
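
As a hypothetical sketch only (the function name and parameters are illustrative, not drawn from the statute’s text, which sets the bands at under 13, and at least 13 but under 16), the CCPA’s opt-in rule for minors can be read as a simple decision function:

```python
def may_sell_personal_info(age: int, minor_opted_in: bool, parent_opted_in: bool) -> bool:
    """Illustrative gate for selling a minor's personal information, CCPA-style.

    Under 13:              a parent or guardian must affirmatively opt in.
    At least 13, under 16: the minor must affirmatively opt in themselves.
    16 and over:           the general regime applies (opt-out, not modeled here).
    """
    if age < 13:
        return parent_opted_in
    if age < 16:
        return minor_opted_in
    return True  # still subject to the consumer's right to opt out

# A 14-year-old who has not opted in: the sale is off the table.
print(may_sell_personal_info(age=14, minor_opted_in=False, parent_opted_in=True))  # False
```

Of course, a gate like this is only as good as the age information behind it, which brings the discussion back to the weakness of self-declared age gates.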

Today, the FTC is completing its COPPA Rule Review, brought forward three years to address some of the latest challenges in the children’s space. In May, the FTC also issued a policy statement addressing EdTech providers that collect and process student data, discussed in more detail below.

At the end of July, two child and teen privacy bills moved forward following a meeting of the U.S. Senate Committee on Commerce, Science, and Transportation. The Children and Teens’ Online Privacy Protection Act (COPPA 2.0) and the Kids Online Safety Act (KOSA) will now go before the full Senate. The European Data Protection Board is also addressing children’s privacy and is expected to publish guidance in the coming months.

Pulling back the curtain

However, the actual issues around student privacy have not enjoyed such prominence, and today many EdTech providers continue to operate, knowingly and unknowingly, in ways that fail to protect children’s privacy and, at worst, abuse and exploit their personal information.

For bad actors, the complexity of complying with student digital privacy laws provides an excuse for ignoring them; for well-intentioned ones, that same complexity is a genuine challenge when offering valuable services to students and schools.

Much has changed in the education space during the past two years because of the COVID-19 pandemic and the need for children to learn online, at home, and in virtual classrooms. It has been a period of unprecedented change in the learning environment. In 2020, over 40 million U.S. K-12 students enrolled in distance learning and had to rely on online resources. Many online services not built for use by students and schools, such as Zoom for video classes, were suddenly pressed into service. Zoom found itself in an unexpected space and could not manage student use compliantly or safely; “Zoom bombing” hit the headlines in 2020 when students taking part in online lessons had their screens hijacked by bad actors posting inappropriate content.

https://www.census.gov/library/stories/2020/08/schooling-during-the-covid-19-pandemic.html

In March 2021, the federal government approved over $7 billion to help ensure students and teachers have the Internet connectivity and devices they need at home to continue learning and teaching.

https://www.commonsensemedia.org/kids-action/articles/huge-win-to-connect-kids-and-teachers-at-home

Rapid change brings risks both for students and for EdTech providers, which may have little understanding, experience, or resources to comply with the growing number of state privacy laws, or with the key federal laws: the Children’s Online Privacy Protection Act (COPPA) and the Family Educational Rights and Privacy Act (FERPA).

The consequences of not complying can include severe financial penalties as well as brand damage.

Over the past few years, EdTech providers have come under increased scrutiny. Google found itself in the firing line. New Mexico’s attorney general brought an action against the tech giant claiming its tools tracked students’ activities on their personal devices outside the classroom. It also faced a lawsuit last year from two students alleging it collected biometric data of thousands of students using its tools. Education platform Edmodo suffered a major data breach, which exposed millions of students’ and teachers’ information online, highlighting the need to prioritize security measures. Several senators wrote to EdTech companies asking them to provide more information on their data collection and use practices.

Navigating the different state laws and aligning their requirements in one online service is not only a challenge but can prove costly for companies. Add the difficulty of understanding FERPA, an act that came into force decades ago, before much of today’s technology existed, and COPPA’s education exception, which is not entirely clear on who provides consent for the collection and use of student data, and it is no wonder that many EdTech services are out of compliance.

It was not until 2017 that the FTC and the Department of Education held a Student Privacy and EdTech workshop to discuss how these regulations intersect and whether they needed updating. The two statutes intersect in schools, where teachers use education technology in the classroom. However, on many EdTech apps and websites, individuals who are not verified to be teachers can add children to “classrooms” where data and personal information are shared unchecked.

Schools, districts, and teachers do not have the resources (or expertise) to work out which services are privacy-safe or what those services do with the data they collect, from location information to behavior and preferences. Free services need to make money somehow, and “free” usually means the personal data a service collects will be sold to generate revenue. Student data is valuable to third parties, advertisers, and marketers.

Child privacy issues and concerns have grown in recent years, along with public awareness. This may have been behind the FTC’s decision to bring its COPPA Rule Review forward, and it would seem highly likely that student data will be addressed when it is finally published.

Why highly likely? Fast-forward to May of this year. The FTC issued a policy statement to address EdTech providers that collect and process student data. It clearly signaled a crackdown on companies illegally tracking children’s learning online and warned that “it is against the law to force parents and schools to surrender children’s privacy rights in order to access educational tools.”

https://www.ftc.gov/news-events/news/press-releases/2022/05/ftc-crack-down-companies-illegally-surveil-children-learning-online

The statement reiterates the key tenets of COPPA, including the prohibition against requiring children to provide more information than is reasonably needed to participate in an activity. It also reinforces that EdTech providers that collect personal information from a child with the school’s authorization are prohibited from using that information for any other commercial purpose, including marketing or advertising, and that providers must have reasonable security measures in place to protect children. The policy statement notes that companies that fail to follow the COPPA Rule could face civil penalties as well as new requirements and limitations on their business practices aimed at stopping unlawful conduct.

In other areas, advocacy groups are battling to bring in measures that help schools protect student data. The Student Data Privacy Consortium (SDPC), a non-profit, sent an open letter to the FTC on the recent policy statement, fearing the FTC did not go far enough to clarify the overlap between COPPA and FERPA.

The SDPC is behind the first model National Data Privacy Agreement (NDPA) for school districts to use with their technology service providers. Schools and EdTech companies have struggled to create data privacy agreements (DPAs) that adequately protect student data, as federal and differing state laws require specific contractual clauses and protections. EdTech providers are often overwhelmed by the number of agreements they must review, comprehend, and comply with, frequently without in-house expertise, if students and teachers are to use their services. The NDPA aims to streamline the process and is a step in the right direction.

Security has always been a challenge for online services and is pertinent to EdTech providers working with student data and records. Unauthorized disclosures could have real and harmful effects on a child.

In October 2021, the K-12 Cybersecurity Act was signed into law. It charges the director of the Cybersecurity and Infrastructure Security Agency (CISA) with bringing together a team, gathering stakeholder input from K-12 schools around the U.S. over a four-month period, and distilling that knowledge into a set of cybersecurity guidelines, followed by an online toolkit to help school districts strengthen their digital security. There is optimism that this legislation will help school districts protect against data breaches.

Learning is one of the most important aspects of a child’s life, and the responsible use of personal information by educational technology providers is vital to the process. However, the challenge of aligning regulations at the federal and state levels, the complexity of the requirements, the lack of clarity on how to implement controls to meet those requirements, and a lack of enforcement, set against the backdrop of growing numbers of students and schools working online, mean there is a long way to go before the situation is as it should be.

About the author

Claire Quinn, CIPP/E, is chief privacy officer and head of the KPA Program at PRIVO. She is a subject matter expert specializing in COPPA, the GDPR, student digital privacy, and child safety in the digital world. She has worked closely with major child-directed brands, third-party service providers, moderation companies, CEOP, and U.S. law enforcement. Claire also works directly with regulators, including the FTC and EU data protection authorities, and has over 25 years’ experience in the media industry.

Her expertise is often called upon by regulators and industry, and she regularly takes part in industry events and webinars. Claire sat on the FTC’s COPPA Rule review panel in Washington, D.C., and is due to speak on a child privacy panel at the International Association of Privacy Professionals (IAPP) conference in Brussels in 2022. She has also contributed to many articles and white papers.

Her previous experience includes launching B2C websites for United News & Media and serving as head of content and channels at Lycos U.K., where she helped launch Europe’s biggest chatroom. Claire was also chief of safety at WeeWorld.com, a U.S. virtual world and MMO with mobile apps for tweens and teens, and served as an ambassador for the U.K.’s Child Exploitation and Online Protection (CEOP) agency.