Tech Safety Directory
Laws & Rules Protecting Minors
Children's Online Privacy Protection Act (COPPA)

USA


The Children’s Online Privacy Protection Act (COPPA) was passed by Congress in 1998. COPPA required the Federal Trade Commission (FTC) to issue and enforce regulations concerning children’s online privacy. COPPA was designed to protect children under age 13 and to place parents in control of what information is collected from their young children online. Sites, apps, games, and other online services directed to children under 13 need parental consent before collecting personal information from those children. The COPPA Rule also applies to general-audience sites and apps that know they are collecting personal information from kids. Usually, kids are asked to provide a parent’s email address when registering on a site or app so the service can give notice of its data collection practices and obtain the proper level of parental consent.

California Consumer Privacy Act (CCPA)

USA (California)


Under the law, which went into effect January 1, 2020, Californians can demand that companies tell them what information has been collected about them, delete that information, and no longer sell it. The law extends extra protections to minors under 16, prohibiting companies from selling their data unless explicitly given permission.


California Privacy Rights Act (CPRA)

USA (California)

The CPRA amends and expands the California Consumer Privacy Act (CCPA), California’s current privacy law, which is itself nearly brand new. Most of the CPRA’s substantive provisions will not take effect until January 1, 2023; however, the CPRA’s expansion of the “Right to Know” affects personal information (PI) collected during the ramp-up period, on or after January 1, 2022. In short, the CPRA strengthens the rights of California residents, tightens business regulations on the use of PI, and establishes a new government agency for statewide data privacy enforcement, the California Privacy Protection Agency (CPPA), among other key changes to the Golden State’s data privacy regime.

It includes:
a. New criteria for which businesses are regulated;
b. New category of “sensitive personal information”;
c. New and expanded consumer privacy rights:

Brand-new rights
Right to Correction. Consumers may request any correction of their PI held by a business if that information is inaccurate.
Right to Opt Out of Automated Decision Making Technology. The CPRA authorizes regulations allowing consumers to opt out of the use of automated decision making technology, including “profiling,” in connection with decisions related to a consumer’s work performance, economic situation, health, personal preferences, interests, reliability, behavior, location or movements.
Right to Access Information About Automated Decision Making. The CPRA authorizes regulations allowing consumers to make access requests seeking meaningful information about the logic involved in the decision making processes and a description of the likely outcome based on that process.
Right to Restrict Sensitive PI. Consumers may limit the use and disclosure of sensitive PI for certain secondary purposes, including prohibiting businesses from disclosing sensitive PI to third parties, subject to certain exemptions.
Audit Obligations. The CPRA authorizes regulations that will require mandatory risk assessments and cybersecurity audits for high-risk activities. The risk assessments must be submitted to the newly established California Privacy Protection Agency (see below) on a “regular basis.”

Modified rights

Modified Right to Delete. Businesses are now required to notify third parties to delete any consumer PI bought or received, subject to some exceptions.
Expanded Right to Know. The PI that must be reflected in a “Right to Know” response is expanded to include, for valid requests, PI collected beyond the prior 12 months, if collected after January 1, 2022.
Expanded Right to Opt Out. The CCPA already grants consumers the right to opt out of the sale of their PI to third parties, which implicitly includes sensitive PI; however, the opt-out right now covers “sharing” of PI for cross-context behavioral advertising as outlined below.
Strengthened Opt-In Rights for Minors. Extends the opt-in right to explicitly include the sharing of PI for behavioral advertising purposes. As with the opt-out right, businesses must wait 12 months before asking a minor for consent to sell or share his or her PI after the minor has declined to provide it.
Expanded Right to Data Portability. Consumers may request that the business transmit specific pieces of PI to another entity, to the extent it is technically feasible for the business to provide the PI in a structured, commonly used and machine-readable format.

d. Directly regulates the sharing of PI for cross-context behavioral advertising
e. Creates a new privacy enforcement authority
f. Adopts certain GDPR principles
g. Service providers and contractors: The CPRA amends the definition of “service provider” and introduces “contractors,” a new category of recipients of PI who process PI made available to them by businesses pursuant to a written contract.
h. New consent standard
i. Data breaches and private right of action

General Data Protection Regulation (GDPR)

Europe (EU)

The GDPR went into effect May 25, 2018. The regulation focuses on providing data protection and privacy for all individuals within the European Union and all individuals whose data is processed by an EU controller, regardless of location. It also includes special protections for children’s data. Recital 38 protects young users because they may be less aware of the risks, consequences, and safeguards concerned with marketing. The GDPR sets the age of consent at 16, but individual member states may lower this as far as 13. A child below the age of consent cannot provide consent for themselves. When consent is the lawful basis for processing a child’s data, reasonable efforts are required to verify that the person giving consent is old enough to do so. Online services must obtain consent from the holder of parental responsibility for the child. View the Age of Digital Consent Map to see the age determined by each EU member state.

ICO’s Children’s Code

UK

"The Children’s Code (or the Age Appropriate Design Code) contains 15 standards that online services such as apps, online games, and web and social media sites need to follow. This ensures they are complying with their obligations under data protection law to protect children’s data online.
It came into force on 2 September 2020 with a 12 month transition period to give organisations time to prepare. The code applies to UK-based companies and non-UK companies who process the personal data of UK children."

Virginia Consumer Data Protection Act (CDPA)

US (Virginia)

The CDPA establishes rights for Virginia consumers to control how companies use their personal data, granting residents the rights to access, correct, delete, and know their personal information, and to opt out of its sale and its processing for targeted advertising, similar to the CCPA. The CDPA was signed into law on March 2, 2021, but it will not go into effect until January 1, 2023.

Personal Information Protection and Electronic Documents Act (PIPEDA)

Canada

"PIPEDA is Canada’s federal private sector privacy law. Organizations covered by PIPEDA must generally obtain an individual's consent when they collect, use or disclose that individual's personal information. People have the right to access their personal information held by an organization. They also have the right to challenge its accuracy.

Personal information can only be used for the purposes for which it was collected. If an organization is going to use it for another purpose, they must obtain consent again. Personal information must be protected by appropriate safeguards."

Online Safety Bill

UK

"The draft Online Safety Bill delivers the government’s manifesto commitment to make the UK the safest place in the world to be online while defending free expression. The Online Safety Bill is intended to protect children online and tackle some of the worst abuses on social media, including racist hate crimes. Ministers have added landmark new measures to the Bill to safeguard freedom of expression and democracy, ensuring necessary online protections do not lead to unnecessary censorship. The draft Bill marks a milestone in the UK Government’s fight to make the internet safe. The draft legislation imposes a duty of care on digital service providers to moderate user-generated content in a way that prevents users from being exposed to illegal and/or harmful content online. The draft Bill includes changes to put an end to harmful practices, while ushering in a new era of accountability and protections for democratic debate, including:

New additions to strengthen people’s rights to express themselves freely online, while protecting journalism and democratic political debate in the UK.
Further provisions to tackle prolific online scams such as romance fraud, which have seen people manipulated into sending money to fake identities on dating apps.
Social media sites, websites, apps and other services hosting user-generated content or allowing people to talk to others online must remove and limit the spread of illegal and harmful content such as child sexual abuse, terrorist material and suicide content.

Ofcom will be given the power to fine companies failing in a new duty of care up to £18 million or ten per cent of annual global turnover, whichever is higher, and have the power to block access to sites.

A new criminal offence for senior managers has been included as a deferred power. This could be introduced at a later date if tech firms don’t step up their efforts to improve safety."
Illinois Biometric Information Privacy Act (BIPA)

US (Illinois)

Under BIPA, a private entity cannot collect, capture, purchase, receive through trade, or otherwise obtain a person’s biometric identifier or biometric information without: (a) informing the subject in writing that a biometric identifier or biometric information is being collected or stored; (b) informing the subject in writing of the specific purpose and duration for which it is being collected, stored, and used; and (c) receiving the subject’s written consent. BIPA also requires private entities that possess biometric identifiers or biometric information to maintain a written, publicly available policy establishing a retention schedule and guidelines for destroying the data. The most significant aspect of BIPA is that it provides a private right of action for individuals harmed by BIPA violations, with statutory damages up to $1,000 for each negligent violation and up to $5,000 for each intentional or reckless violation. The statute itself does not contain a statute of limitations.

Family Educational Rights and Privacy Act (FERPA)

US

The Family Educational Rights and Privacy Act (FERPA) is a Federal law that protects the privacy of student education records. The law applies to all schools that receive funds under an applicable program of the U.S. Department of Education. FERPA gives parents certain rights with respect to their children's education records. These rights transfer to the student when he or she reaches the age of 18 or attends a school beyond the high school level. Students to whom the rights have transferred are "eligible students."
PPRA (Protection of Pupil Rights Amendment)

US

"The Protection of Pupil Rights Amendment (PPRA) is a federal law that affords certain rights to parents of minor students with regard to surveys that ask questions of a personal nature. Briefly, the law requires that schools obtain written consent from parents before minor students are required to participate in any U.S. Department of Education funded survey, analysis, or evaluation that reveals information concerning certain areas.

The No Child Left Behind Act of 2001 contains a major amendment to PPRA that gives parents more rights with regard to the surveying of minor students, the collection of information from students for marketing purposes, and certain non-emergency medical examinations. In addition, an eighth category of information was added to the law."
Student Online Personal Information Protection Act (“SOPIPA”)

US

SOPIPA is aimed at protecting the privacy and security of student data. The law is unique in that it puts responsibility for protecting student data directly on industry by expressly prohibiting education technology service providers from selling student data, using that information to advertise to students or their families, or "amassing a profile" on students to be used for noneducational purposes. In addition, the law requires online service providers to ensure that any data they collect is secure and to delete student information at the request of a school or district.

SOPIPA provides clear rules of the road to ensure children’s information isn’t exploited for commercial or harmful purposes, and it ensures that information stays out of the wrong hands. It also supports innovation and personalized learning, so schools and students can harness the benefits of technology. It makes the edtech companies that collect and handle students’ sensitive information responsible for compliance; it applies whether or not a contract is in place with a school; and it applies to apps, cloud-computing programs, and all manner of online edtech services. The law also addresses security procedures and practices for covered information in order to protect it from unauthorized access, destruction, use, modification, or disclosure.
California AB 1584, Education Code section 49073.1 – Privacy of Pupil Records: 3rd-Party Digital

US (California)

"(1) Gather or maintain only information that pertains directly to school safety or to pupil safety.
(2) Provide a pupil with access to any information about the pupil gathered or maintained by the school district, county office of education, or charter school that was obtained from social media, and an opportunity to correct or delete such information.
(3) (A) Destroy information gathered from social media and maintained in its records within one year after a pupil turns 18 years of age or within one year after the pupil is no longer enrolled in the school district, county office of education, or charter school, whichever occurs first.
(B) Notify each parent or guardian of a pupil subject to the program that the pupil’s information is being gathered from social media and that any information subject to this section maintained in the school district’s, county office of education’s, or charter school’s records with regard to the pupil shall be destroyed in accordance with subparagraph (A). The notification required by this subparagraph may be provided as part of the notification required pursuant to Section 48980. The notification shall include, but is not limited to, all of the following:
(i) An explanation of the process by which a pupil or a pupil’s parent or guardian may access the pupil’s records for examination of the information gathered or maintained pursuant to this section.
(ii) An explanation of the process by which a pupil or a pupil’s parent or guardian may request the removal of information or make corrections to information gathered or maintained pursuant to this section.
(C) If the school district, county office of education, or charter school contracts with a third party to gather information from social media on an enrolled pupil, require the contract to do all of the following:
(i) Prohibit the third party from using the information for purposes other than to satisfy the terms of the contract.
(ii) Prohibit the third party from selling or sharing the information with any person or entity other than the school district, county office of education, charter school, or the pupil or his or her parent or guardian.
(iii) Require the third party to destroy the information immediately upon satisfying the terms of the contract.
(iv) Require the third party, upon notice and a reasonable opportunity to act, to destroy information pertaining to a pupil when the pupil turns 18 years of age or is no longer enrolled in the school district, county office of education, or charter school, whichever occurs first. The school district, county office of education, or charter school shall provide notice to the third party when a pupil turns 18 years of age or is no longer enrolled in the school district, county office of education, or charter school. Notice provided pursuant to this clause shall not be used for any other purpose."
K-12 Cybersecurity Act of 2021

US

The K–12 Cybersecurity Act of 2021, the federal government’s first foray into K-12 cybersecurity, was passed into law in an effort to aid student data security. The law charges the director of the Cybersecurity and Infrastructure Security Agency (CISA) to bring together a team and gather appropriate stakeholder input from K-12 schools around the US over a four-month period, then consolidate that knowledge into a set of cybersecurity guidelines over the next two months, followed by the development of an online toolkit to assist school districts as they strengthen their digital security environment.

California Age-Appropriate Design Code Act (AADC)

US (California)


California has passed the bill for its Age-Appropriate Design Code Act (AADC). In the world of children’s privacy it is expected to have a global impact. Modeled on the UK’s Children’s Code, it requires privacy by design in all online services directed to children or that attract a large child audience, children being users under 18 years old.

To date the federal Children’s Online Privacy Protection Act (COPPA) has been the gold standard in the US, but the Code will bring additional requirements for services already complying with COPPA, which protects children 12 or under only. If the Code is signed into law by the state governor, as expected, sites, apps, platforms, metaverses, and connected devices will all need to comply by 2024. This may sound a long time off, but it is actually a relatively short window in terms of the fundamental changes some services will need to make to comply or face significant fines of up to $7,500 per affected child. It will be enforced by the state attorney general.

There is time to get into shape, but work should start at the design stage for any new services in build, or now for any online services that do not currently meet the requirements; it’s likely most won’t.

Here are some of the key requirements that will need to be addressed, at a high level:

  • Establish the age range of younger users to treat them appropriately.
  • Provide mechanisms for children to report their privacy concerns.
  • Provide age appropriate and clear privacy notices for children.
  • Algorithms that exploit children’s data to serve harmful content are prohibited.
  • Precise location tracking is prohibited unless necessary for the operation of the service.
  • Transparency on location tracking is required i.e., include clear messaging to a child that it is on.
  • Do not sell children’s data unless it is essential to the service and do not profile children to serve targeted ads.
  • Only use data for the purpose it was collected.
  • Ensure data minimization, if the data is not needed for a specific and legitimate purpose then don’t collect it.
Privacy Act 1988

Australia

The Privacy Act 1988 protects an individual’s personal information regardless of their age. It doesn’t specify an age after which an individual can make their own privacy decision. For their consent to be valid, an individual must have capacity to consent.

An organization or agency handling the personal information of an individual under the age of 18 must decide if the individual has the capacity to consent on a case-by-case basis. As a general rule, an individual under the age of 18 has the capacity to consent if they have the maturity to understand what’s being proposed. If they lack maturity, it may be appropriate for a parent or guardian to consent on their behalf.

If it’s not practical for an organization or agency to assess the capacity of individuals on a case-by-case basis, as a general rule, an organization or agency may assume an individual over the age of 15 has capacity, unless they’re unsure.

Review the Privacy Act 1988

Proposed Regulatory Changes

The American Data Privacy and Protection Act (ADPPA)

USA


The American Data Privacy and Protection Act (ADPPA) aims to give internet users more control over their personal data. If passed, the ADPPA will be the first comprehensive federal privacy law in the US. 

With regard to children, Section 205 calls out data protections for children and minors: covered entities are subject to additional requirements for covered data with respect to individuals under age 17. Targeted advertising to any individual under 17 is expressly prohibited where the covered entity has actual knowledge that the individual is under 17. Covered entities may not transfer the covered data of individuals between 13 and 17 years old to third parties without express affirmative consent, where the covered entity has actual knowledge the individual is between 13 and 17.

This section establishes a Youth Privacy and Marketing Division at the FTC, which shall be responsible for addressing privacy and marketing concerns with respect to children and minors. The division must submit annual reports to Congress and hire staff that includes experts in youth development, data protection, digital advertising, and data analytics.

This section also requires the FTC Inspector General to submit a report to Congress every two years analyzing the fairness and effectiveness of the safe harbor provisions in the Children’s Online Privacy Protection Act of 1998 (COPPA). These reports must be published on the FTC website.

Notably, there are several other important provisions with heightened or specific call-outs to children and minors throughout the legislation.

Other highlights of the bill include: 

  • A data minimization approach: companies will be allowed to collect and use users’ data only for 17 permitted purposes, including user authentication, fraud prevention, and online payments.
  • Stricter limitations on targeted ads: On top of the provisions mentioned above, the FTC will be responsible for creating standard opt-out methods that companies will be obliged to follow.
  • A ban on using sensitive data for targeted ads: this includes health information, precise geolocation details like a personal IP address, and private communications.

According to the ADPPA’s preemption principle, no state will be allowed to enforce its own regulations on the same privacy issues that the federal law covers. This would, de facto, supersede statutes like California’s Consumer Privacy Rights Act.

The Children and Teens’ Online Privacy Protection Act (COPPA 2.0)

USA


The Children and Teens’ Online Privacy Protection Act (COPPA 2.0) would:

  • Build on COPPA’s consent requirements by prohibiting internet companies from collecting personal information from users who are 13 to 15 years old without the user’s consent
  • Ban targeted advertising (as opposed to contextual advertising) directed at children 
  • Establish a “Digital Marketing Bill of Rights for Teens” that limits the collection of personal information of teens
  • Require an operator of a website, online service, online application, mobile application, or connected device that is directed to children or minors, or that is used or reasonably likely to be used by children or minors in a manner that involves the collection of their personal information, to obtain consent before collecting children’s and minors’ data
  • Create an “Eraser Button” for parents and kids by requiring companies to permit users to delete a child’s or teen’s personal information when technologically feasible
  • Establish a Youth Marketing and Privacy Division at the Federal Trade Commission (FTC)
  • Require online companies to explain the types of personal information collected, how that information is used and disclosed, and the policies for collection of personal information
  • Require that internet connected devices targeted toward children meet robust cyber security standards
  • Commission reports on the effectiveness of the COPPA safe harbor program.

For more information on the Children and Teens’ Online Privacy Protection Act (COPPA 2.0), click here.

The Kids Online Safety Act of 2022 (KOSA)

USA


  • Provides parents and kids safeguards and tools to protect kids’ experiences online: The bill requires social media platforms to provide minors with options to protect their information, disable addictive product features, and opt out of algorithmic recommendations, and requires platforms to enable the strongest settings by default. The bill also gives parents new controls to help support their children and spot harmful behaviors, including by providing children and parents with a dedicated channel to report harms to kids to the platform.
  • Creates accountability for social media’s harms to kids: The bill creates a duty for social media platforms to prevent and mitigate harms to minors, such as content promoting self-harm, suicide, eating disorders, substance abuse, and sexual exploitation. It also requires social media platforms to perform an annual independent audit assessing risks to minors, their compliance with this Act, and whether the platform is taking meaningful steps to prevent those harms.
  • Opens up black box algorithms: The bill provides academic researchers and non-profit organizations with access to critical datasets from social media platforms to foster research regarding harms to the safety and well-being of minors.
  • For more information about KOSA, click here.
Washington State Senate Bill 5813

USA (Washington)


In Washington State, Senator Reuven Carlyle and four co-sponsors introduced a bill that addresses several privacy and data protection matters, which includes provisions specifying collection and security practices for data from children and adolescent users. The bill identifies adolescents (ages 13 through 17) as a class separate from other children under 18 and has three main provisions: establishing consumer rights regarding data about children and adolescents, requiring consent for use of that data, and mandating a set of duties for businesses that collect and process that data.

The proposal also would require consent from parents of children, or from the adolescents themselves, to collect and process personal data, and further would require separate and express consent from an adolescent to sell their personal data or conduct targeted advertising. Finally, the bill would require businesses to conduct a data protection assessment, be transparent about collection and processing practices, secure personal data, and minimize data collection and retention.
Senate Bill 9563, the New York Child Data Privacy and Protection Act

USA (New York)


Senate Bill 9563, the New York Child Data Privacy and Protection Act, covers minors age 17 and under and would require data protection impact assessments, privacy-by-default settings and limits on children's data practices. The bill also calls for a ban on targeted advertising against children.

While similar to the new California Age-Appropriate Design Code, New York takes further steps by including a number of provisions meant to help families in the event of serious harms committed against kids online. A key aspect that sets the new bill apart from its predecessor is a stipulation requiring tech companies to provide parents with a way to notify them in case of emergencies, a sort of 911 for digital crimes.

Online Safety Bill

UK


The Bill introduces new rules for companies whose services host user-generated content such as images, videos and comments, or which allow UK users to talk with other people online through messaging, comments and forums. This includes:

  • the biggest and most popular social media platforms

  • sites such as forums and messaging apps, some online games, cloud storage and the most popular pornography sites

  • search engines, which play a significant role in enabling users to access harmful content

Sites which publish pornographic content will also be required under the legislation to ensure that children cannot access age-inappropriate material. Platforms which fail to protect people will need to answer to the regulator and could face fines of up to ten per cent of their revenues or, in the most serious cases, be blocked.

The regulator will have the powers necessary to take appropriate action against all companies in scope, no matter where they are based. This is essential given the global nature of the internet. Some services with user-generated content will be exempt from the new framework, including news websites, some retail services, some services used internally by businesses, and email services. In-scope services will also have a duty to bring in user empowerment tools, giving adult users more control over whom they interact with and the legal content they see, as well as the option to verify their identity.

Bill C-27 Consumer Privacy Protection Act (CPPA)

Canada

Bill C-27 introduces three new acts: the Consumer Privacy Protection Act (“CPPA”), the Personal Information and Data Protection Tribunal Act, and the Artificial Intelligence and Data Act (“AIDA”). The CPPA would replace the current Personal Information Protection and Electronic Documents Act (“PIPEDA”).

Bill C-27’s proposed Consumer Privacy Protection Act (CPPA) includes new protections for minors by requiring a higher standard of diligence and protection with respect to the collection and processing of their personal information.

Online Privacy Bill

Australia

The Privacy Legislation Amendment (Enhancing Online Privacy and Other Measures) Bill 2021 (the Online Privacy Bill) will give effect to the Australian Government’s commitment to strengthen the Privacy Act 1988. It enables the introduction of a binding Online Privacy code for social media and certain other online platforms, and increases penalties and enforcement measures.

The Online Privacy Bill addresses the pressing privacy challenges posed by social media and other online platforms. The Privacy Act Review seeks to build on the outcomes of the Online Privacy Bill to ensure that Australia's privacy law framework empowers consumers, protects their data and best serves the whole of the Australian economy.

For more information and the status of the bill, click here.

Submit an Additional Resource