Tech Safety Directory
Key Child Privacy Protections
Children's Online Privacy Protection Act (COPPA)

USA

 

The Children’s Online Privacy Protection Act (COPPA) was passed by Congress in 1998. COPPA required the Federal Trade Commission (FTC) to issue and enforce regulations concerning children’s online privacy. COPPA was designed to protect children under age 13 and to place parents in control over what information is collected from their young children online. Sites, apps, games, and other online services directed to children under 13 must obtain parental consent before collecting personal information from those children. The COPPA Rule also applies to general-audience sites and apps that know they are collecting personal information from children under 13. Typically, a child is asked to provide a parent’s email address when registering on a site or app so the service can give the parent notice of its data collection practices and obtain the appropriate level of parental consent.
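As an illustration of the notice-and-consent flow described above, here is a minimal sketch of how a child-directed service might gate registration on parental consent. Every name in it (RegistrationRequest, sendParentalConsentRequest, and so on) is hypothetical; COPPA does not prescribe a particular API, only the outcome of notice and verifiable consent.

```ts
// Hypothetical sketch of a COPPA-style registration gate.
// Nothing here is a prescribed COPPA interface; names and flow are illustrative.

const COPPA_AGE_THRESHOLD = 13;

interface RegistrationRequest {
  username: string;
  birthDate: Date;
  parentEmail?: string; // collected only to deliver notice and obtain consent
}

type RegistrationResult =
  | { status: "registered" }
  | { status: "pending-parental-consent" }
  | { status: "rejected"; reason: string };

function ageInYears(birthDate: Date, now: Date = new Date()): number {
  const age = now.getFullYear() - birthDate.getFullYear();
  const hadBirthday =
    now.getMonth() > birthDate.getMonth() ||
    (now.getMonth() === birthDate.getMonth() && now.getDate() >= birthDate.getDate());
  return hadBirthday ? age : age - 1;
}

// Placeholder for an email step that sends the direct notice and a consent link.
async function sendParentalConsentRequest(parentEmail: string, username: string): Promise<void> {
  console.log(`Sending COPPA notice and consent link for ${username} to ${parentEmail}`);
}

async function register(req: RegistrationRequest): Promise<RegistrationResult> {
  if (ageInYears(req.birthDate) >= COPPA_AGE_THRESHOLD) {
    // 13 or older: no COPPA parental consent required for this user.
    return { status: "registered" };
  }
  if (!req.parentEmail) {
    // Under 13: do not collect personal information until consent is obtained.
    return { status: "rejected", reason: "Parental email required for users under 13" };
  }
  await sendParentalConsentRequest(req.parentEmail, req.username);
  return { status: "pending-parental-consent" };
}

// Example usage:
register({ username: "kid123", birthDate: new Date(2015, 5, 1), parentEmail: "parent@example.com" })
  .then((r) => console.log(r.status)); // "pending-parental-consent"
```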

California Consumer Privacy Act (CCPA)

USA (California)

 

Under the law, which went into effect January 1, 2020, Californians can demand that companies tell them what information has been collected about them, delete that information, and stop selling it. The law extends extra protections to minors under 16, prohibiting companies from selling their personal information unless explicitly given permission.

California Privacy Rights Act (CPRA)

USA (California)

The CPRA amends and expands the California Consumer Privacy Act (CCPA), California’s current privacy law, which is itself nearly brand new. Most of the CPRA’s substantive provisions will not take effect until January 1, 2023. However, the CPRA’s expansion of the “Right to Know” affects personal information (PI) collected during the ramp-up period, on or after January 1, 2022. In short, the CPRA strengthens the rights of California residents, tightens business regulations on the use of PI, and establishes a new government agency for statewide data privacy enforcement, the California Privacy Protection Agency (CPPA), among other key changes to the Golden State’s data privacy regime.

 It includes:
a. New criteria for which businesses are regulated;
b. New category of “sensitive personal information”;
c. New and expanded consumer privacy rights:

Brand-new rights
Right to Correction. Consumers may request any correction of their PI held by a business if that information is inaccurate.
Right to Opt Out of Automated Decision Making Technology. The CPRA authorizes regulations allowing consumers to opt out of the use of automated decision making technology, including “profiling,” in connection with decisions related to a consumer’s work performance, economic situation, health, personal preferences, interests, reliability, behavior, location or movements.
Right to Access Information About Automated Decision Making. The CPRA authorizes regulations allowing consumers to make access requests seeking meaningful information about the logic involved in the decision making processes and a description of the likely outcome based on that process.
Right to Restrict Sensitive PI. Consumers may limit the use and disclosure of sensitive PI for certain secondary purposes, including prohibiting businesses from disclosing sensitive PI to third parties, subject to certain exemptions.
Audit Obligations. The CPRA authorizes regulations that will require mandatory risk assessments and cybersecurity audits for high-risk activities. The risk assessments must be submitted to the newly established California Privacy Protection Agency (see below) on a “regular basis.”

Modified rights

Modified Right to Delete. Businesses are now required to notify third parties to delete any consumer PI bought or received, subject to some exceptions.
Expanded Right to Know. The PI that must be reflected in a “Right to Know” response is expanded to include, for valid requests, PI collected beyond the prior 12 months, if collected after January 1, 2022.
Expanded Right to Opt Out. The CCPA already grants consumers the right to opt out of the sale of their PI to third parties, which implicitly includes sensitive PI; however, the opt-out right now covers “sharing” of PI for cross-context behavioral advertising as outlined below.
Strengthened Opt-In Rights for Minors. The CPRA extends the opt-in right to explicitly include the sharing of PI for behavioral advertising purposes. As with the opt-out right, businesses must wait 12 months before asking a minor for consent to sell or share his or her PI after the minor has declined to provide it.
Expanded Right to Data Portability. Consumers may request that the business transmit specific pieces of PI to another entity, to the extent it is technically feasible for the business to provide the PI in a structured, commonly used and machine-readable format.

d. Directly regulates the sharing of PI for cross-context behavioral advertising
e. Creates a new privacy enforcement authority
f. Adopts certain GDPR principles
g. Service providers and contractors: The CPRA amends the definition of “service provider” and introduces “contractors,” a new category of recipients of PI who process PI made available to them by businesses pursuant to a written contract.
i. New consent standard
j. Data breaches and private right of action
General Data Protection Regulation (GDPR)

Europe (EU)

The GDPR went into effect on May 25, 2018. The regulation provides data protection and privacy for all individuals within the European Union, and for individuals whose data is processed by an EU controller regardless of location. It also includes special protections for children’s data: Recital 38 recognizes that children merit specific protection because they may be less aware of the risks, consequences and safeguards involved, particularly with regard to marketing. The GDPR sets the age of digital consent at 16, but individual member states may lower it to as low as 13. A child below the age of consent cannot provide consent for themselves. When consent is the lawful basis for processing a child’s data, the online service must obtain consent from the holder of parental responsibility for the child and make reasonable efforts to verify that the person giving consent is old enough to do so. View the Age of Digital Consent Map to see the age determined by each EU member state.
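To make the member-state variation concrete, here is a minimal sketch of an Article 8-style check that decides whether parental authorisation is needed before relying on consent. The per-country ages below are illustrative examples only; verify each member state's current age of digital consent (for instance via the Age of Digital Consent Map mentioned above) before relying on them.

```ts
// Sketch of an Article 8 check: does processing a child's data on the basis of
// consent require authorisation by the holder of parental responsibility?
// The per-country values are illustrative; confirm current national law.

const GDPR_DEFAULT_CONSENT_AGE = 16; // Article 8 default; states may lower it to 13

const memberStateConsentAge: Record<string, number> = {
  DE: 16, // Germany
  FR: 15, // France
  ES: 14, // Spain
};

function consentAgeFor(countryCode: string): number {
  return memberStateConsentAge[countryCode] ?? GDPR_DEFAULT_CONSENT_AGE;
}

function needsParentalAuthorisation(userAge: number, countryCode: string): boolean {
  return userAge < consentAgeFor(countryCode);
}

console.log(needsParentalAuthorisation(14, "FR")); // true  (below the 15 threshold)
console.log(needsParentalAuthorisation(14, "ES")); // false (14 meets the threshold)
```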

ICO’s Children’s Code

UK

"The Children’s Code (or the Age Appropriate Design Code) contains 15 standards that online services such as apps, online games, and web and social media sites, need to follow. This ensures they are complying with the their obligations under data protection law to protect children’s data online.
It came into force on 2 September 2020 with a 12 month transition period to give organisations time to prepare. The code applies to UK-based companies and non-UK companies who process the personal data of UK children."


Personal Information Protection and Electronic Documents Act (PIPEDA)

Canada

"PIPEDA is Canada’s federal private sector privacy law. Organizations covered by PIPEDA must generally obtain an individual's consent when they collect, use or disclose that individual's personal information. People have the right to access their personal information held by an organization. They also have the right to challenge its accuracy.

Personal information can only be used for the purposes for which it was collected. If an organization is going to use it for another purpose, they must obtain consent again. Personal information must be protected by appropriate safeguards."

Illinois Biometric Information Privacy Act (BIPA)

US (Illinois)

Under BIPA, a private entity cannot collect, capture, purchase, receive through trade, or otherwise obtain a person’s biometric identifier or biometric information without: (a) informing the subject in writing that a biometric identifier or biometric information is being collected or stored; (b) informing the subject in writing of the specific purpose and duration for which it is being collected, stored and used; and (c) receiving the subject’s written consent. BIPA also requires private entities in possession of biometric identifiers or biometric information to develop a publicly available written policy establishing a retention schedule and guidelines for permanently destroying that data. The most significant aspect of BIPA is that it provides a private right of action for individuals harmed by BIPA violations, with statutory damages of up to $1,000 for each negligent violation and up to $5,000 for each intentional or reckless violation. The statute itself does not contain a statute of limitations.

FERPA

US

The Family Educational Rights and Privacy Act (FERPA) is a Federal law that protects the privacy of student education records. The law applies to all schools that receive funds under an applicable program of the U.S. Department of Education. FERPA gives parents certain rights with respect to their children's education records. These rights transfer to the student when he or she reaches the age of 18 or attends a school beyond the high school level. Students to whom the rights have transferred are "eligible students."
PPRA (Protection of Pupil Rights Amendment)

US

"The Protection of Pupil Rights Amendment (PPRA) is a federal law that affords certain rights to parents of minor students with regard to surveys that ask questions of a personal nature. Briefly, the law requires that schools obtain written consent from parents before minor students are required to participate in any U.S. Department of Education funded survey, analysis, or evaluation that reveals information concerning certain areas.

The No Child Left Behind Act of 2001 contains a major amendment to PPRA that gives parents more rights with regard to the surveying of minor students, the collection of information from students for marketing purposes, and certain non-emergency medical examinations. In addition, an eight category of information (*) was added to the law. "
Student Online Personal Information Protection Act (“SOPIPA”)

US (California)

The Student Online Personal Information Protection Act (“SOPIPA”) was passed in 2014 in California and went into effect in 2016. SOPIPA is considered by many to be the most comprehensive student data privacy legislation in the United States that specifically addresses the changing nature of technology usage in schools by putting responsibility for compliance on the edtech industry.

SOPIPA is aimed at protecting the privacy and security of student data. The law is unique in that it puts responsibility for protecting student data directly on industry by expressly prohibiting education technology service providers from selling student data, using that information to advertise to students or their families, or "amassing a profile" on students to be used for noneducational purposes. In addition, the law requires online service providers to ensure that any data they collect is secure and to delete student information at the request of a school or district.

SOPIPA provides clear rules of the road to ensure children's information isn't exploited for commercial or harmful purposes and stays out of the wrong hands, while still supporting innovation and personalized learning so schools and students can harness the benefits of technology. It makes the edtech companies that collect and handle students' sensitive information responsible for compliance; it applies whether or not a contract is in place with a school; and it applies to apps, cloud-computing programs, and all manner of online edtech services. The law also addresses security procedures and practices for covered information in order to protect it from unauthorized access, destruction, use, modification or disclosure.

California AB 1584, Education Code section 49073.1 – Privacy of Pupil Records: 3rd-Party Digital

US (California)

"(1) Gather or maintain only information that pertains directly to school safety or to pupil safety.
(2) Provide a pupil with access to any information about the pupil gathered or maintained by the school district, county office of education, or charter school that was obtained from social media, and an opportunity to correct or delete such information.
(3) (A) Destroy information gathered from social media and maintained in its records within one year after a pupil turns 18 years of age or within one year after the pupil is no longer enrolled in the school district, county office of education, or charter school, whichever occurs first.
(B) Notify each parent or guardian of a pupil subject to the program that the pupil’s information is being gathered from social media and that any information subject to this section maintained in the school district’s, county office of education’s, or charter school’s records with regard to the pupil shall be destroyed in accordance with subparagraph (A). The notification required by this subparagraph may be provided as part of the notification required pursuant to Section 48980. The notification shall include, but is not limited to, all of the following:
(i) An explanation of the process by which a pupil or a pupil’s parent or guardian may access the pupil’s records for examination of the information gathered or maintained pursuant to this section.
(ii) An explanation of the process by which a pupil or a pupil’s parent or guardian may request the removal of information or make corrections to information gathered or maintained pursuant to this section.
(C) If the school district, county office of education, or charter school contracts with a third party to gather information from social media on an enrolled pupil, require the contract to do all of the following:
(i) Prohibit the third party from using the information for purposes other than to satisfy the terms of the contract.
(ii) Prohibit the third party from selling or sharing the information with any person or entity other than the school district, county office of education, charter school, or the pupil or his or her parent or guardian.
(iii) Require the third party to destroy the information immediately upon satisfying the terms of the contract.
(iv) Require the third party, upon notice and a reasonable opportunity to act, to destroy information pertaining to a pupil when the pupil turns 18 years of age or is no longer enrolled in the school district, county office of education, or charter school, whichever occurs first. The school district, county office of education, or charter school shall provide notice to the third party when a pupil turns 18 years of age or is no longer enrolled in the school district, county office of education, or charter school. Notice provided pursuant to this clause shall not be used for any other purpose."
K-12 Cybersecurity Act of 2021

US

The K–12 Cybersecurity Act of 2021, the federal government’s first foray into K-12 cybersecurity, was passed into law in an effort to improve student data security. The law charges the director of the Cybersecurity and Infrastructure Security Agency (CISA) with gathering input from K-12 stakeholders around the US over a four-month period, consolidating that knowledge into a set of cybersecurity guidelines over the following two months, and then developing an online toolkit to assist school districts as they strengthen their digital security environment.

California Age-Appropriate Design Code Act (AADCA)

US (California)

 

California has passed the bill for its Age-Appropriate Design Code Act (AADC), which is expected to have a global impact on children’s privacy. Modeled on the UK’s Children’s Code, it requires privacy by design in all online services that are directed to children or attract a large child audience, with children defined as users under 18 years old.

To date, the federal Children’s Online Privacy Protection Act (COPPA) has been the gold standard in the US, but the Code will bring additional requirements for services already complying with COPPA, which protects only children under 13. If the Code is signed into law by the state governor, as is expected, sites, apps, platforms, metaverses and connected devices will all need to comply by 2024. This may sound a long time off, but it is actually a relatively short window given the fundamental changes some services will need to make to comply or face fines of up to $7,500 per affected child. It will be enforced by the state attorney general.

There is time to get into shape, but work should start now: at the design stage for any new services being built, and immediately for existing online services that do not yet meet the requirements (and most likely do not).

Here are some of the key requirements that will need to be addressed, at a high level (a minimal configuration sketch follows the list):

  • Establish the age range of younger users to treat them appropriately.
  • Provide mechanisms for children to report their privacy concerns.
  • Provide age appropriate and clear privacy notices for children.
  • Algorithms that exploit children’s data to serve harmful content are prohibited.
  • Precise location tracking is prohibited unless necessary for the operation of the service.
  • Transparency on location tracking is required i.e., include clear messaging to a child that it is on.
  • Do not sell children’s data unless it is essential to the service and do not profile children to serve targeted ads.
  • Only use data for the purpose it was collected.
  • Ensure data minimization, if the data is not needed for a specific and legitimate purpose then don’t collect it.
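Here is a minimal configuration sketch of what "high privacy by default" and data minimisation could look like for a child's account, assuming a hypothetical settings object; the AADC describes outcomes, not a particular implementation, so every name below is illustrative.

```ts
// Hypothetical default settings for an account believed to belong to a child.
// The AADC does not prescribe a schema; this only illustrates the direction of travel.

type AgeBand = "0-5" | "6-9" | "10-12" | "13-15" | "16-17" | "adult";

interface PrivacySettings {
  preciseGeolocation: boolean;          // off unless strictly necessary for the service
  geolocationIndicatorVisible: boolean; // if ever on, the child must be able to see that it is on
  profilingForTargetedAds: boolean;     // never on by default for children
  dataSaleAllowed: boolean;             // only if essential to the service
  collectOnlyFields: string[];          // data minimisation: named, purpose-bound fields
}

function defaultSettingsFor(ageBand: AgeBand): PrivacySettings {
  const isChild = ageBand !== "adult";
  return {
    preciseGeolocation: false,
    geolocationIndicatorVisible: true,
    profilingForTargetedAds: !isChild, // adults may later change this; children cannot
    dataSaleAllowed: false,
    collectOnlyFields: isChild ? ["displayName", "ageBand"] : ["displayName", "ageBand", "email"],
  };
}

console.log(defaultSettingsFor("10-12").profilingForTargetedAds); // false
```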
Privacy Act 1988

Australia

The Privacy Act 1988 protects an individual’s personal information regardless of their age. It doesn’t specify an age after which an individual can make their own privacy decision. For their consent to be valid, an individual must have capacity to consent.

An organization or agency handling the personal information of an individual under the age of 18 must decide if the individual has the capacity to consent on a case-by-case basis. As a general rule, an individual under the age of 18 has the capacity to consent if they have the maturity to understand what’s being proposed. If they lack maturity, it may be appropriate for a parent or guardian to consent on their behalf.

If it’s not practical for an organization or agency to assess the capacity of individuals on a case-by-case basis, as a general rule it may assume an individual over the age of 15 has capacity, unless there is something to suggest otherwise.

Review the Privacy Act 1988

Utah Social Media Regulation Act

US (Utah)

 

Utah Governor Spencer Cox signed the Social Media Regulation Act, which incorporates both H.B. 311 and S.B. 152, into law on March 23, 2023. The law takes effect on March 1, 2024, and holds social media companies accountable for ensuring that minors under 18 are protected online.

H.B. 311 Highlights:

  • Prohibits social media companies from using design features that cause a minor to become addicted to the platform.

  • Authorizes a private right of action to collect attorney fees and damages from a social media company for harm a minor incurs from using the company's social media platform, and creates a rebuttable presumption that harm and causation occurred in some circumstances.

S.B. 152 requires a social media company to (a compliance sketch follows the list):

  • Verify the age of a Utah resident seeking to maintain or open a social media account
  • Obtain the consent of a parent or guardian before a Utah resident under the age of 18 may maintain or open an account
  • Not permit a Utah resident to open an account if that person does not meet age requirements under state or federal law
  • Not allow direct messaging between a minor's account and certain other accounts
  • Not show the minor's account in search results
  • Not display advertising in the minor's account
  • Not collect, share, or use personal information from the account, with certain exceptions
  • Not target or suggest ads, accounts, or content to the minor
  • Limit the minor's hours of access, blocking access by default between 10:30 p.m. and 6:30 a.m., subject to adjustment by a parent or guardian
  • Provide a parent or guardian with a password or other means of accessing the account of a Utah resident under the age of 18, allowing the parent or guardian to view all posts the minor makes on the platform and all responses and messages sent to or by the minor
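Below is an illustrative sketch of the default restrictions S.B. 152 describes for a verified Utah minor's account, including the default 10:30 p.m. to 6:30 a.m. curfew window. The structure and names are hypothetical, not anything the statute prescribes.

```ts
// Illustrative defaults for a verified Utah minor's account under S.B. 152.
// Names and structure are hypothetical; only the restrictions mirror the law's list.

interface MinorAccountPolicy {
  showInSearchResults: boolean;
  displayAdvertising: boolean;
  allowTargetedSuggestions: boolean;
  allowDirectMessagesFromUnconnectedAccounts: boolean;
  curfew: { start: string; end: string }; // local time, adjustable by a parent or guardian
}

const defaultUtahMinorPolicy: MinorAccountPolicy = {
  showInSearchResults: false,
  displayAdvertising: false,
  allowTargetedSuggestions: false,
  allowDirectMessagesFromUnconnectedAccounts: false,
  curfew: { start: "22:30", end: "06:30" },
};

// Returns true if `now` falls inside the default 10:30 p.m. - 6:30 a.m. curfew window.
function isWithinCurfew(now: Date, policy: MinorAccountPolicy = defaultUtahMinorPolicy): boolean {
  const minutes = now.getHours() * 60 + now.getMinutes();
  const [sh, sm] = policy.curfew.start.split(":").map(Number);
  const [eh, em] = policy.curfew.end.split(":").map(Number);
  const start = sh * 60 + sm;
  const end = eh * 60 + em;
  // The window crosses midnight, so "inside" means after the start or before the end.
  return minutes >= start || minutes < end;
}

console.log(isWithinCurfew(new Date(2024, 2, 1, 23, 0))); // true  (11:00 p.m.)
console.log(isWithinCurfew(new Date(2024, 2, 1, 12, 0))); // false (noon)
```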
Virginia Consumer Data Protection Act (CDPA)

US (Virginia)

 

The Virginia Consumer Data Protection Act (CDPA) was introduced to the House of Delegates on January 1, 2021, and was signed into law by Governor Ralph Northam on March 2, 2021. The CDPA is scheduled to go into effect on January 1, 2023.

The CDPA became the second comprehensive data privacy law to be adopted in the US, after the CCPA. While the CCPA and CDPA share similarities when it comes to data privacy and protection, some important differences remain.

The CDPA currently applies to for-profit entities that (a sketch of this applicability test follows the list):

(i) conduct business in Virginia or offer products or services targeted to Virginia residents, and

(ii) control or process the personal data of at least 100,000 consumers, or

(iii) control or process the personal data of at least 25,000 consumers and derive more than 50% of gross revenue from the sale of personal data.
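The sketch below expresses that applicability test as a simple predicate. The field names are hypothetical; the thresholds are the 100,000-consumer, 25,000-consumer, and 50%-of-revenue figures from the text above.

```ts
// Sketch of the CDPA applicability test described above. Field names are
// hypothetical; the thresholds mirror the statute's 100,000 / 25,000 / 50% figures.

interface BusinessProfile {
  operatesInVirginiaOrTargetsResidents: boolean;
  consumersWhoseDataIsProcessed: number;
  shareOfRevenueFromSellingPersonalData: number; // 0.0 to 1.0
}

function cdpaApplies(b: BusinessProfile): boolean {
  if (!b.operatesInVirginiaOrTargetsResidents) return false;
  const largeScale = b.consumersWhoseDataIsProcessed >= 100_000;
  const dataSaleFocused =
    b.consumersWhoseDataIsProcessed >= 25_000 &&
    b.shareOfRevenueFromSellingPersonalData > 0.5;
  return largeScale || dataSaleFocused;
}

console.log(
  cdpaApplies({
    operatesInVirginiaOrTargetsResidents: true,
    consumersWhoseDataIsProcessed: 30_000,
    shareOfRevenueFromSellingPersonalData: 0.6,
  }),
); // true
```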

This regulation introduced the following consumer rights:

  • Right to know, access, and confirm
  • Right to deletion
  • Right to opt-out of sale (defined as the exchange of personal data for monetary consideration)
  • Right to opt-out of processing for targeted advertising
  • Right to opt-out of profiling
  • Right to nondiscrimination
  • Right to data portability
  • Right to rectification/correction

Children
Sensitive data receives greater protection and includes personal data collected from children. Businesses that comply with the verifiable parental consent requirements of the Children’s Online Privacy Protection Act are deemed compliant with the CDPA’s obligation to obtain parental consent.

Connecticut Data Privacy Act (CTDPA)

US (Connecticut)

 

The Connecticut Data Privacy Act (CTDPA), which will go into effect on July 1, 2023, is the fifth and latest comprehensive state consumer privacy law.

The CTDPA has many similarities with the consumer privacy laws passed by other states (California, Virginia, Colorado and Utah), but is most similar to the Virginia Consumer Data Protection Act (VCDPA) and the Colorado Privacy Act (CPA), which are more consumer-oriented.

The CTDPA applies to persons conducting business in Connecticut or producing products or services targeted to Connecticut residents, and who during the preceding calendar year either: 

  • Controlled or processed the personal data of 100,000 or more consumers, excluding personal data controlled or processed solely for the purpose of completing a payment transaction; or
  • Derived over 25 percent of their gross revenue from the sale of personal data and controlled or processed the personal data of 25,000 or more consumers.

 

In addition to requiring businesses to respond to consumer requests regarding their personal data described above, this law creates further affirmative obligations for businesses, including that they must: 

  • Minimize the collection of personal data and refrain from processing personal data for purposes not disclosed to the consumer (unless the business has otherwise obtained consumer consent); 
  • Establish and maintain reasonable technical and physical data security practices to protect personal data; and 
  • Provide Connecticut residents with a privacy notice describing the categories of personal data processed, the purposes of the processing, whether the entity shares or sells personal data with third parties, and how the consumer may exercise their rights to access, correct, delete, or opt out of the business’s use of personal data for targeted advertising or sale.

CTDPA and Data of Minors
Controllers and processors that comply with the requirements of the Children’s Online Privacy Protection Act (COPPA) are deemed compliant with the CTDPA’s parental consent requirements. A controller may not process personal data for purposes of selling it or for targeted advertising, without the consumer’s consent, when the controller knows the consumer is between 13 and 16 years old.

The CTDPA also mandated that the General Assembly will convene a task force to study available ways to "verify the age of a child who creates a social media account." 

This law does not create a private right of action for consumers; instead, it vests exclusive enforcement authority in the Connecticut Attorney General. During the first two years of implementation, the Attorney General must issue a notice of violation and permit the business an opportunity to cure the violation within 60 days of notice. Beginning in 2025, however, the opportunity to cure is no longer guaranteed.

The Data Protection Act 2018

UK

 

The Data Protection Act 2018 controls how your personal information is used by organizations, businesses or the government. The Data Protection Act 2018 is the UK’s implementation of the General Data Protection Regulation (GDPR). 

Everyone responsible for using personal data has to follow strict rules called ‘data protection principles’. They must make sure the information is: 

  • used fairly, lawfully and transparently 
  • used for specified, explicit purposes 
  • used in a way that is adequate, relevant and limited to only what is necessary 
  • accurate and, where necessary, kept up to date 
  • kept for no longer than is necessary 
  • handled in a way that ensures appropriate security, including protection against unlawful or unauthorised processing, access, loss, destruction or damage 

There is stronger legal protection for more sensitive information, such as: 

  • race 
  • ethnic background 
  • political opinions 
  • religious beliefs 
  • trade union membership 
  • genetics 
  • biometrics (where used for identification) 
  • health 
  • sex life or orientation 

There are separate safeguards for personal data relating to criminal convictions and offences. 

Your rights 

Under the Data Protection Act 2018, you have the right to find out what information the government and other organizations store about you. These include the right to: 

  • be informed about how your data is being used 
  • access personal data 
  • have incorrect data updated 
  • have data erased 
  • stop or restrict the processing of your data 
  • data portability (allowing you to get and reuse your data for different services) 
  • object to how your data is processed in certain circumstances 

You also have rights when an organization is using your personal data for: 

  • automated decision-making processes (without human involvement) 
  • profiling, for example to predict your behavior or interests 

Make a complaint  

If you think your data has been misused or that the organization holding it has not kept it secure, you should contact them and tell them.

 

If you’re unhappy with their response or if you need any advice you should contact the Information Commissioner’s Office (ICO). 

ICO
icocasework@ico.org.uk
Telephone: 0303 123 1113
Textphone: 01625 545860
Monday to Friday, 9am to 4:30pm

Find out about call charges 

 

Information Commissioner’s Office
Wycliffe House Water Lane
Wilmslow
Cheshire
SK9 5AF  

 

You can also chat online with an advisor

The ICO can investigate your claim and take action against anyone who’s misused personal data. 

You can also visit their website for information on how to make a data protection complaint

The Privacy and Electronic Communications Regulations (PECR) 

UK

 

The Privacy and Electronic Communications Regulations (PECR) sit alongside the Data Protection Act and the UK GDPR. They give people specific privacy rights in relation to electronic communications. There are specific rules on: 

  • marketing calls, emails, texts and faxes; 
  • cookies (and similar technologies); 
  • keeping communications services secure; and 
  • customer privacy as regards traffic and location data, itemized billing, line identification, and directory listings. 

 

The ICO has several ways of taking action to change the behavior of anyone who breaches PECR, including criminal prosecution, non-criminal enforcement and audit. The Information Commissioner can also serve a monetary penalty notice imposing a fine of up to £500,000, which can be issued against the organization or its directors.

These powers are not mutually exclusive. 

PECR restrict unsolicited marketing by phone, fax, email, text, or other electronic message. There are different rules for different types of communication. The rules are generally stricter for marketing to individuals than for marketing to companies. 

 

You will often need specific consent to send unsolicited direct marketing. The best way to obtain valid consent is to ask customers to tick opt-in boxes confirming they are happy to receive marketing calls, texts or emails from you. 
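As a sketch of what recording that opt-in consent might look like, the snippet below keeps the exact wording shown to the user alongside the choice and a timestamp. PECR does not define a schema; the field names here are illustrative only.

```ts
// Hypothetical record of an opt-in marketing consent, kept as evidence that the
// consent was specific, informed, and affirmative. PECR does not prescribe a format.

type MarketingChannel = "email" | "sms" | "phone";

interface ConsentRecord {
  userId: string;
  channel: MarketingChannel;
  optedIn: boolean;     // must reflect an affirmative, unticked-by-default choice
  wordingShown: string; // the exact statement the user agreed to
  recordedAt: string;   // ISO 8601 timestamp
}

function recordOptIn(userId: string, channel: MarketingChannel, wordingShown: string): ConsentRecord {
  return {
    userId,
    channel,
    optedIn: true,
    wordingShown,
    recordedAt: new Date().toISOString(),
  };
}

const consent = recordOptIn(
  "user-42",
  "email",
  "Yes, send me marketing emails about new products.",
);
console.log(consent.channel, consent.optedIn); // "email" true
```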

 

PECR have been amended a number of times. Click here for updates. 

 

Texas SCOPE

US (Texas)

 

The Securing Children Online through Parental Empowerment (SCOPE) Act (HB 18) applies to digital service providers with users who are known to be minors. For those users it limits the collection and use of personal data, restricts targeted advertising, requires tools that let parents manage a minor's account and privacy settings, and imposes duties to protect minors from harmful content.

Arkansas Social Media Safety Act

US (Arkansas)

 

The Social Media Safety Act (SB 396), signed in 2023, requires covered social media companies to verify the age of Arkansas users who open accounts and to obtain parental consent before a minor under 18 may hold an account.

Proposed Regulatory Changes

Kids PRIVACY Act

USA

 

The “Protecting the Information of our Vulnerable Children and Youth Act,” or the “Kids PRIVCY Act,” was introduced by U.S. Rep. Kathy Castor (FL-14) to strengthen the Children’s Online Privacy Protection Act (COPPA). The bill builds on COPPA's strengths, expands privacy protections for children and teenagers, and incorporates key elements of the UK's Age-Appropriate Design Code, including expansion of coverage to sites likely to be accessed by children and teenagers, a requirement for a Privacy and Security Impact Assessment, and direction to operators to make the best interests of children and teenagers a primary design consideration.

 

The legislation specifically strengthens privacy protections for children and teenagers by: 

  • Banning Companies from Providing Targeted Advertisements to Children and Teenagers: Prohibits companies from targeting children and teenagers based on their personal information and behavior. 
  • Considering Best Interests of Children and Teenagers: Requires an operator to make the best interests of children and teenagers a primary design consideration when designing its service. 
  • Requiring Opt-In Consent for all Individuals Under 18: Companies must obtain specific, informed, and unambiguous opt-in consent before collecting, retaining, selling, sharing, or using a young consumer or child’s personal information. 
  • Creating a Right to Access, Correct, and Delete Personal Information: Companies must provide individuals the opportunity to access, correct, or delete their personal information at any time. 
  • Protecting Additional Types of Information: Expands the type of information explicitly covered to include physical characteristics, biometric information, health information, education information, contents of messages and calls, browsing and search history, geolocation information, and latent audio or visual recordings. 
  • Requiring User-Friendly Privacy Policies: Companies must make publicly available privacy policies that are clear, easily understood, and written in plain and concise language. 
  • Creating a Protected Class of “Teenagers” Ages 13-17: For the first time in statute, the bill provides protection for teenagers 13-17, allowing them to control who collects their personal information and what companies can do with it. 
  • Expanding Coverage of Companies: Applies to all sites likely to be accessed by children and teens, not just child-directed services. 
  • Limiting Disclosure to Third Parties: The bill prohibits companies from sharing personal information without consent. Furthermore, it creates additional duties companies must comply with before disclosing any personal information with third parties. 
  • Requiring Reasonable Data Security Policies, Practices, and Procedures: Requires companies to have a written security policy, point of contact for information security management and processes to identify, assess, and mitigate vulnerabilities. 
  • Prohibiting Industry Self-Regulation: Repeals the dangerous safe harbor provisions that allow for lax enforcement and rubberstamping of potentially unlawful practices. 
  • Strengthening FTC Enforcement: Raises the maximum allowable civil penalty per violation by 50 percent and allows the FTC to pursue punitive damages. Also establishes a Youth Privacy and Marketing Division at the FTC. 
  • Providing for Parental Enforcement: Parents will be able to bring civil actions to help enforce the bill and any resulting regulations. 
  • Banning Forced Arbitration: In a much-needed reversal of current law, companies will no longer be able to force their consumers to waive their right to sue. 
The American Data Privacy and Protection Act (ADPPA)

USA

 


The American Data Privacy and Protection Act (ADPPA) aims to give internet users more control over their personal data. If passed, the ADPPA will be the first comprehensive federal privacy law in the US. 

With regard to children, Section 205 sets out data protections for children and minors: covered entities are subject to additional requirements for covered data with respect to individuals under age 17. Targeted advertising is expressly prohibited (depending on the bill version, either where covered entities have actual knowledge that an individual is under 17, or to any individual under 17). Covered entities may not transfer the covered data of individuals between 13 and 17 years old to third parties without express affirmative consent, where the covered entity has actual knowledge that the individual is between 13 and 17.

This section establishes a Youth Privacy and Marketing Division at the FTC, which shall be responsible for addressing privacy and marketing concerns with respect to children and minors. The division must submit annual reports to Congress and hire staff that includes experts in youth development, data protection, digital advertising, and data analytics.

This section also requires the FTC Inspector General to submit a report to Congress every two years analyzing the fairness and effectiveness of the safe harbor provisions in the Children’s Online Privacy Protection Act of 1998 (COPPA). These reports must be published on the FTC website.

Notably, there are several other important provisions with heightened or specific call-outs to children and minors throughout the legislation.

Other highlights of the bill include: 

  • A data minimization approach: Companies will be allowed to collect and use users' data only for 17 permitted purposes. These include user authentication, fraud prevention and online payments.
  • Stricter limitations on targeted ads: On top of the provisions mentioned above, the FTC will be responsible for creating standard opt-out methods that companies will be obliged to follow.
  • A ban on using sensitive data for targeted ads: This includes health information, precise geolocation details such as a personal IP address, and private communications. 

According to the ADPPA's preemption principle, no state would be allowed to enforce its own regulations on the same privacy issues the federal law covers. This would de facto supersede statutes like the California Privacy Rights Act.

The Children & Teen’s Online Privacy Protection Act (COPPA 2.0)

USA

 

The Children & Teen’s Online Privacy Protection Act (COPPA 2.0) would:

  • Build on COPPA’s consent requirements by prohibiting internet companies from collecting personal information from users who are 13 to 15 years old without the user’s consent
  • Ban targeted advertising (as opposed to contextual advertising) directed at children 
  • Establish a “Digital Marketing Bill of Rights for Teens” that limits the collection of personal information of teens
  • Require an operator of a website, online service, online application, mobile application, or connected device that is directed to children or minors, or that is used or reasonably likely to be used by children or minors in a manner involving the collection of their personal information, to obtain consent before collecting children's and minors' data
  • Create an “Eraser Button” for parents and kids by requiring companies to let users delete a child's or teen's personal information when technologically feasible (a minimal sketch follows this list)
  • Establish a Youth Marketing and Privacy Division at the Federal Trade Commission (FTC)
  • Require online companies to explain the types of personal information collected, how that information is used and disclosed, and the policies for collection of personal information
  • Require that internet-connected devices targeted toward children meet robust cybersecurity standards
  • Commission reports on the effectiveness of the COPPA safe harbor program.
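As an illustration of the "Eraser Button" idea, here is a hypothetical deletion handler that removes every record a service holds about one child or teen. The bill does not prescribe an interface; all names in this sketch are invented for the example.

```ts
// Hypothetical "Eraser Button" handler: a parent or young user asks the service
// to delete the personal information it holds about a child or teen.

interface StoredRecord {
  userId: string;
  field: string;
  value: string;
}

class PersonalDataStore {
  private records: StoredRecord[] = [];

  add(record: StoredRecord): void {
    this.records.push(record);
  }

  // Remove everything held about one user, returning how many records were erased.
  eraseAllFor(userId: string): number {
    const before = this.records.length;
    this.records = this.records.filter((r) => r.userId !== userId);
    return before - this.records.length;
  }
}

const store = new PersonalDataStore();
store.add({ userId: "teen-7", field: "searchHistory", value: "..." });
store.add({ userId: "teen-7", field: "geolocation", value: "..." });
store.add({ userId: "other", field: "email", value: "parent@example.com" });

console.log(store.eraseAllFor("teen-7")); // 2
```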

For more information on The Children & Teen’s Online Privacy Protection Act (COPPA 2.0) click here.

The Kids Online Safety Act of 2022 (KOSA) bill

USA

 

  • Provides parents and kids safeguards and tools to protect kids’ experiences online: The bill requires that social media platforms provide minors with options to protect their information, disable addictive product features, and opt out of algorithmic recommendations, and requires platforms to enable the strongest settings by default. The bill also gives parents new controls to help support their children and spot harmful behaviors, including a dedicated channel for children and parents to report harms to the platform. 
  • Creates accountability for social media’s harms to kids: The bill creates a duty for social media platforms to prevent and mitigate harms to minors, such as content promoting self-harm, suicide, eating disorders, substance abuse, and sexual exploitation. It also requires social media platforms to perform an annual independent audit assessing risks to minors, their compliance with this Act, and whether the platform is taking meaningful steps to prevent those harms. 
  • Opens up black box algorithms: The bill provides academic researchers and non-profit organizations with access to critical datasets from social media platforms to foster research regarding harms to the safety and well-being of minors.
  • For more information about KOSA, click here.

Washington State Senate Bill 5813

USA (Washington)

 

In Washington State, Senator Reuven Carlyle and four co-sponsors introduced a bill that addresses several privacy and data protection matters, which includes provisions specifying collection and security practices for data from children and adolescent users. The bill identifies adolescents (ages 13 through 17) as a class separate from other children under 18 and has three main provisions: establishing consumer rights regarding data about children and adolescents, requiring consent for use of that data, and mandating a set of duties for businesses that collect and process that data.

The proposal also would require consent from parents of children, or from adolescents themselves, to collect and process personal data, and would further require separate and express consent from an adolescent to sell their personal data or conduct targeted advertising. Finally, the bill would require businesses to conduct a data protection assessment, be transparent about collection and processing practices, secure personal data, and minimize data collection and retention.

Senate Bill 9563, the New York Child Data Privacy and Protection Act

USA (New York)

 

Senate Bill 9563, the New York Child Data Privacy and Protection Act, covers minors age 17 and under and would require data protection impact assessments, privacy-by-default settings and limits on children's data practices. The bill also calls for a ban on targeted advertising against children.

While similar to the new California Age-Appropriate Design Code, New York goes further by including a number of provisions meant to help families in the event of serious harms committed against kids online. A key aspect that sets the new bill apart from its predecessor is a stipulation requiring tech companies to provide parents with a way to notify them in case of emergencies, a sort of 911 for digital crimes.

Online Safety Bill

UK

 

The Bill introduces new rules for companies whose services host user-generated content such as images, videos and comments, or which allow UK users to talk with other people online through messaging, comments and forums. This includes:

  • the biggest and most popular social media platforms

  • sites such as forums and messaging apps, some online games, cloud storage and the most popular pornography sites

  • search engines, which play a significant role in enabling users to access harmful content

Sites which publish pornographic content will also be required under the legislation to ensure that children cannot access age-inappropriate material. Those platforms which fail to protect people will need to answer to the regulator and could face fines of up to ten per cent of their revenues or, in the most serious cases, being blocked.

The regulator will have the powers necessary to take appropriate action against all companies in scope, no matter where they are based. This is essential given the global nature of the internet. Some services with user-generated content will be exempt from the new framework, including news websites, some retail services, some services used internally by businesses and email services. The largest platforms will also have a duty to bring in user empowerment tools, giving adult users more control over whom they interact with and the legal content they see, as well as the option to verify their identity.

Bill C-27 Consumer Privacy Protection Act (CPPA)

Canada

Bill C-27 introduces three new acts: the Consumer Privacy Protection Act (“CPPA”), the Personal Information and Data Protection Tribunal Act, and the Artificial Intelligence and Data Act (“AIDA”). The CPPA would replace the privacy provisions of the current Personal Information Protection and Electronic Documents Act (“PIPEDA”).

Bill C-27’s proposed Consumer Privacy Protection Act (CPPA) includes new protections for minors by requiring a higher standard of diligence and protection with respect to the collection and processing of their personal information.

Online Privacy Bill

Australia

The Privacy Legislation Amendment (Enhancing Online Privacy and Other Measures) Bill 2021 (the Online Privacy Bill) will give effect to the Australian Government’s commitment to strengthen the Privacy Act 1988. It enables the introduction of a binding Online Privacy code for social media and certain other online platforms, and increases penalties and enforcement measures.

The Online Privacy Bill addresses the pressing privacy challenges posed by social media and other online platforms. The Privacy Act Review seeks to build on the outcomes of the Online Privacy Bill to ensure that Australia's privacy law framework empowers consumers, protects their data and best serves the whole of the Australian economy.

For more information and the status of the bill, click here.
