2019 – Another Historic Year for Children’s Privacy
ESRB Privacy Certified Blog | February 24, 2020
https://www.esrb.org/privacy-certified-blog/2019-another-historic-year-for-childrens-privacy-coppa/

2019 was truly a historic year for children’s privacy. Regulatory enforcement activity in the United States hit an all-time high. The Federal Trade Commission (FTC) surprised industry and the privacy community by embarking on a review of its Children’s Online Privacy Protection (COPPA) Rule three years ahead of schedule. U.S. lawmakers introduced legislation that, if passed, would reshape COPPA. Outside the United States, the United Kingdom’s Information Commissioner’s Office (ICO) has been driving the conversation with its game-changing Age Appropriate Design Code (Code), which is awaiting Parliament’s approval. And the world’s largest mobile storefronts—Apple and Google Play—have taken steps to be more protective of children’s privacy.

FTC Sets COPPA Record, Then Breaks It
The FTC remains the world’s top cop when it comes to children’s privacy. It began 2019 by breaking the record for the largest monetary penalty in a COPPA case when it agreed to a settlement with the operators of TikTok (f/k/a Musical.ly) for $5.7 million. The TikTok case signaled a more aggressive approach by the FTC, both in how it defines an online service “directed to children” and in how it applies COPPA’s “actual knowledge” standard.

The record set in TikTok, however, did not last long. In September 2019, the FTC and the New York Attorney General announced a COPPA settlement with Google and YouTube, which included a $136 million penalty paid to the FTC and a $34 million penalty paid to New York—either of which, on its own, would have been the largest monetary penalty ever in a COPPA case by a significant margin.

More significantly, the settlement required YouTube to materially change its business. Going forward, all YouTube channel owners must specify whether their channels are directed to children. If a channel is not child-directed, the channel owner must still identify individual videos that are child-directed. When content is identified as child-directed, YouTube turns off several features of the platform, including (i) the collection of personal data for behavioral advertising, and (ii) the ability for users to leave public comments.
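For readers thinking about how a platform might implement this kind of designation, the underlying rule can be sketched simply: if content is flagged as child-directed at either the channel or the individual-video level, data-dependent features are switched off. The sketch below is our own illustration in Python; the names and structure are hypothetical and do not reflect YouTube’s actual systems or API.

```python
from dataclasses import dataclass

@dataclass
class Content:
    """Hypothetical representation of a piece of platform content."""
    channel_is_child_directed: bool
    video_is_child_directed: bool = False

def data_features_enabled(content: Content) -> dict:
    """Disable data-dependent features when content is child-directed
    at either the channel level or the individual-video level."""
    child_directed = (content.channel_is_child_directed
                      or content.video_is_child_directed)
    return {
        "behavioral_advertising": not child_directed,
        "public_comments": not child_directed,
    }

# A video on a general-audience channel, individually flagged as child-directed,
# still loses both features:
flags = data_features_enabled(
    Content(channel_is_child_directed=False, video_is_child_directed=True)
)
# flags == {"behavioral_advertising": False, "public_comments": False}
```

The key design point mirrored here is that the per-video flag matters even when the channel as a whole is not child-directed, which is exactly the obligation the settlement imposed on channel owners.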

The FTC also settled a third COPPA case in 2019 against the operators of i-Dressup.com, a dress-up website that had already been shut down. While that case settled for the relatively modest sum of $35,000, it has important symbolic value insofar as it shows even small companies can land on the FTC’s radar.

FTC Solicits COPPA Comments
In addition to its enforcement activities, the FTC was extremely busy in 2019 soliciting public comments on its application of the COPPA Rule and hosting a COPPA workshop. The request for comments, which was published in July 2019, surprised the privacy community and industry because it occurred three years before the FTC’s typical 10-year cycle. The FTC received over 175,000 submissions by the December deadline, including one submission from us.

U.S. Lawmakers Seek to Update COPPA
In March 2019, Senators Markey and Hawley introduced what became known as COPPA 2.0. The bill would, among other things:

  • Extend COPPA protections to minors 13 to 15 years old;
  • Extend COPPA to operators of online services that have “constructive knowledge” they are collecting personal information from children or minors; and
  • Prohibit the use of children’s personal information for targeted marketing and place limits on the use of minors’ personal information for that purpose.

(Bonus material: In January 2020, lawmakers introduced two bills that would amend COPPA: The Protect Kids Act, which was introduced by Congressmen Tim Walberg and Bobby Rush, would, among other things, extend COPPA’s protections to all children under 16 years old. The PRIVCY Act, introduced by Congresswoman Kathy Castor, would essentially re-write COPPA. Among other things, it would extend protections to children under 18 years old and remove the concept of “child-directed” online services in favor of an “actual and constructive knowledge” standard.)

Outside the U.S., the ICO is Trying to Re-Define How Online Services Approach Children’s Privacy
In 2019, the ICO released its long-awaited proposal for an Age Appropriate Design Code—a set of 15 “standards” aimed at placing the best interests of the child above all other considerations for online services “likely to be accessed” by children under 18 years old. The standards would require, among other things, high-privacy default settings, communicating privacy disclosures and choices in ways that are appropriate to the ages of the children likely to access the online service, and eliminating uses of children’s data that are detrimental to their well-being.

Following the initial consultation period, in November 2019, the ICO revised the Code and submitted the final version to the Secretary of State. The Code now awaits approval by Parliament.

Storefronts Take Steps to Strengthen Children’s Privacy
Apple and Google Play also took steps to strengthen children’s privacy. In May 2019, for example, the FTC announced that Apple and Google Play had removed three dating apps from their storefronts after the FTC warned the apps were violating COPPA.

In addition, both Apple and Google Play revised their developer policies. On Google Play, developers must now identify the target audience for each of their apps. Apps targeted to children are subject to certain restrictions, including with respect to the types of ads served and the ad networks used. For its part, Apple has placed restrictions on the use of third-party analytics and advertising in apps directed to children.

Have more questions about recent developments in the area of children’s privacy? Feel free to reach out to us through our Contact page to learn more about our program. Be sure to follow us on Twitter and LinkedIn for more privacy-related updates.

ICO Publishes the Final Version of the Age Appropriate Design Code
ESRB Privacy Certified Blog | January 23, 2020
https://www.esrb.org/privacy-certified-blog/ico-publishes-the-final-version-of-the-age-appropriate-design-code/
On January 21, the UK’s Information Commissioner’s Office (ICO) published the final version of its Age Appropriate Design Code (the Code). The Code, which was first released for comment in April 2019, comprises 15 Standards that will impact the way in which companies assess the age of and risks to their users; the types of personal data they collect; how that data is used and shared; how they present privacy disclosures, tools, and choices to their users; and the overall design of their products and services. The overarching principle of the Code is that online products and services “likely to be accessed by children” under 18 years old must be designed with the best interests of those children in mind.

The Standards set forth in the final version of the Code are largely unchanged from the initial draft in April. However, after a long consultation period, the final version of the Code does reflect some important compromises by the ICO.

First, while the ICO makes clear that compliance with the Standards will be required, it has clarified that the additional 100+ pages of guidance in the Code is just that, guidance. Companies will have some flexibility to come up with their own methods to comply with the Standards. That said, companies would be shortsighted not to give proper weight to the ICO’s guidance.

Second, the initial draft of the Code essentially placed the burden on companies to prove their online products and services were not likely to be accessed by children. In the final version, the ICO clarifies that the analysis will likely depend on:

  • the nature and content of the service and whether [it] has particular appeal for children; and
  • the way in which the service is accessed and any measures [the company] put[s] in place to prevent children gaining access.

These factors allow companies far more flexibility than the presumptive approach taken in the initial draft, which will hopefully reduce the amount of unnecessary data collection done solely to confirm a user’s age.

Third, and related, the final version of the Code takes a risk-based, proportionate approach to age verification. Whether and how a company verifies a user’s age will depend on (i) the age range(s) of the users; (ii) the level of certainty the company has about the age range(s); and (iii) the risks the online products and services pose to those users. Under certain low-risk circumstances, for example, a traditional age gate, where a user’s self-declared age is accepted without verification, might be appropriate. In contrast, the initial draft of the Code seemingly banned traditional age gates, which would have required companies to employ more intrusive verification methods.

The Code still has some final hurdles to overcome, including approval by Parliament, and will begin with a 12-month transition period. Companies, however, will likely need all that time, and possibly more!

Have more questions about the Age Appropriate Design Code? Feel free to reach out to us through our Contact page to learn more about our program. Be sure to follow us on Twitter and LinkedIn for more privacy-related updates.
