[Billing Code: 6750-01S]

FEDERAL TRADE COMMISSION

16 CFR Part 312

[RIN 3084-AB20]

CHILDREN’S ONLINE PRIVACY PROTECTION RULE

AGENCY: Federal Trade Commission (“FTC” or “Commission”).

ACTION: Final rule amendments.

SUMMARY: The Commission amends the Children’s Online Privacy Protection Rule

(“COPPA Rule” or “Rule”), consistent with the requirements of the Children’s Online Privacy

Protection Act, to clarify the scope of the Rule and strengthen its protections for children’s

personal information, in light of changes in online technology since the Rule went into effect in

April 2000. The final amended Rule includes modifications to the definitions of operator,

personal information, and website or online service directed to children. The amended Rule also

updates the requirements set forth in the notice, parental consent, confidentiality and security,

and safe harbor provisions, and adds a new provision addressing data retention and deletion.

EFFECTIVE DATE: The amended Rule will become effective on July 1, 2013.

ADDRESSES: The complete public record of this proceeding will be available at www.ftc.gov.

Requests for paper copies of this amended Rule and Statement of Basis and Purpose (“SBP”)

should be sent to: Public Reference Branch, Federal Trade Commission, 600 Pennsylvania

Avenue, N.W., Room 130, Washington, D.C. 20580.

FOR FURTHER INFORMATION CONTACT: Phyllis H. Marcus or Mamie Kresses,

Attorneys, Division of Advertising Practices, Bureau of Consumer Protection, Federal Trade

[1] 2011 NPRM, 76 FR 59804, available at http://ftc.gov/os/2011/09/110915coppa.pdf.

[2] 2012 SNPRM, 77 FR 46643, available at http://ftc.gov/os/2012/08/120801copparule.pdf.

Commission, 600 Pennsylvania Avenue, NW, Washington, DC 20580, (202) 326-2854 or (202)

326-2070.

SUPPLEMENTARY INFORMATION:

Statement of Basis and Purpose

I. Overview and Background

A. Overview

This document states the basis and purpose for the Commission’s decision to adopt

certain amendments to the COPPA Rule that were proposed and published for public comment

on September 27, 2011 (“2011 NPRM”),[1] and supplemental amendments that were proposed and

published for public comment on August 6, 2012 (“2012 SNPRM”).[2] After careful review and

consideration of the entire rulemaking record, including public comments submitted by

interested parties, and based upon its experience in enforcing and administering the Rule, the

Commission has determined to adopt amendments to the COPPA Rule. These amendments to

the final Rule will help to ensure that COPPA continues to meet its originally stated goals to

minimize the collection of personal information from children and create a safer, more secure

online experience for them, even as online technologies, and children’s uses of such

technologies, evolve.

The final Rule amendments modify the definitions of operator to make clear that the

Rule covers an operator of a child-directed site or service where it integrates outside services,


such as plug-ins or advertising networks, that collect personal information from its visitors;

website or online service directed to children to clarify that the Rule covers a plug-in or ad

network when it has actual knowledge that it is collecting personal information through a child-

directed website or online service; website or online service directed to children to allow a

subset of child-directed sites and services to differentiate among users, and requiring such

properties to provide notice and obtain parental consent only for users who self-identify as under

age 13; personal information to include geolocation information and persistent identifiers that

can be used to recognize a user over time and across different websites or online services; and

support for internal operations to expand the list of defined activities.

The Rule amendments also streamline and clarify the direct notice requirements to ensure

that key information is presented to parents in a succinct “just-in-time” notice; expand the non-

exhaustive list of acceptable methods for obtaining prior verifiable parental consent; create two

new exceptions to the Rule’s notice and consent requirements; strengthen data security

protections by requiring operators to take reasonable steps to release children’s personal

information only to service providers and third parties who are capable of maintaining the

confidentiality, security, and integrity of such information; require reasonable data retention and

deletion procedures; strengthen the Commission’s oversight of self-regulatory safe harbor

programs; and institute voluntary pre-approval mechanisms for new consent methods and for

activities that support the internal operations of a website or online service.

B. Background

The COPPA Rule, 16 CFR Part 312, issued pursuant to the Children’s Online Privacy

Protection Act (“COPPA” or “COPPA statute”), 15 U.S.C. 6501 et seq., became effective on

April 21, 2000. The Rule imposes certain requirements on operators of websites or online

[3] See 16 CFR 312.3.

[4] See 16 CFR 312.7 and 312.8.

[5] See 16 CFR 312.10.

[6] See Request for Public Comment on the Federal Trade Commission’s Implementation of the Children’s Online Privacy Protection Rule (“2010 FRN”), 75 FR 17089 (Apr. 5, 2010).

[7] Id.

services directed to children under 13 years of age, and on operators of other websites or online

services that have actual knowledge that they are collecting personal information online from a

child under 13 years of age (collectively, “operators”). Among other things, the Rule requires

that operators provide notice to parents and obtain verifiable parental consent prior to collecting,

using, or disclosing personal information from children under 13 years of age.[3] The Rule also

requires operators to keep secure the information they collect from children, and prohibits them

from conditioning children’s participation in activities on the collection of more personal

information than is reasonably necessary to participate in such activities.[4] The Rule contains a

“safe harbor” provision enabling industry groups or others to submit to the Commission for

approval self-regulatory guidelines that would implement the Rule’s protections.[5]

The Commission initiated review of the COPPA Rule in April 2010 when it published a

document in the FEDERAL REGISTER seeking public comment on whether the rapid pace

of technological change in the online environment over the preceding five years warranted any

changes to the Rule.[6] The Commission’s request for public comment examined each aspect of

the COPPA Rule, posing 28 questions for the public’s consideration.[7] The Commission also

[8] Information about the June 2010 public roundtable is located at http://www.ftc.gov/bcp/workshops/coppa/index.shtml.

[9] Public comments in response to the Commission’s 2010 FRN are located at http://www.ftc.gov/os/comments/copparulerev2010/index.shtm. Comments cited herein to the Federal Register Notice are designated as such, and are identified by commenter name, comment number, and, where applicable, page number.

[10] See supra note 1.

[11] Public comments in response to the 2011 NPRM are located at http://www.ftc.gov/os/comments/copparulereview2011/. Comments cited herein to the 2011 NPRM are designated as such, and are identified by commenter name, comment number, and, where applicable, page number.

held a public roundtable to discuss in detail several of the areas where public comment was

sought.[8]

The Commission received 70 comments from industry representatives, advocacy groups,

academics, technologists, and individual members of the public in response to the April 5, 2010

request for public comment.[9] After reviewing the comments, the Commission issued the 2011

NPRM, which set forth several proposed changes to the COPPA Rule.[10] The Commission

received over 350 comments in response to the 2011 NPRM.[11] After reviewing these comments,

and based upon its experience in enforcing and administering the Rule, in the 2012 SNPRM, the

Commission sought additional public comment on a second set of proposed modifications to the

Rule.

The 2012 SNPRM proposed modifying the definitions of both operator and website or

online service directed to children to allocate and clarify the responsibilities under COPPA when

independent entities or third parties, e.g., advertising networks or downloadable software kits

(“plug-ins”), collect information from users through child-directed sites and services. In

addition, the 2012 SNPRM proposed to further modify the definition of website or online service

[12] Public comments in response to the 2012 SNPRM are available online at http://ftc.gov/os/comments/copparulereview2012/index.shtm. Comments cited herein to the SNPRM are designated as such, and are identified by commenter name, comment number, and, where applicable, page number.

[13] One commenter, Go Daddy, expressed concern that the definition of collects or collection is silent as to personal information acquired from children offline that is uploaded, stored, or distributed to third parties by operators. Go Daddy (comment 59, 2011 NPRM), at 2. However, Congress limited the scope of COPPA to information that an operator collects online from a child; COPPA does not govern information collected by an operator offline. See 15 U.S.C. 6501(8) (defining personal information as “individually identifiable information about an individual collected online. . . .”); 144 Cong. Rec. S11657 (Oct. 7, 1998) (Statement of Sen. Bryan) (“This is an online children’s privacy bill, and its reach is limited to information collected online from a child.”).

directed to children to permit websites or online services that are directed both to children and to

a broader audience to comply with COPPA without treating all users as children. The

Commission also proposed modifying the definition of screen or user name to cover only those

situations where a screen or user name functions in the same manner as online contact

information. Finally, the Commission proposed to further modify the revised definitions of

support for internal operations and persistent identifiers. The Commission received 99

comments in response to the 2012 SNPRM.[12] After reviewing these additional comments, the

Commission now announces this final amended COPPA Rule.

II. Modifications to the Rule

A. Section 312.2: Definitions

1. Definition of Collects or Collection

a. Collects or Collection, paragraph (a)

In the 2011 NPRM, the Commission proposed amending paragraph (a) to change the

phrase “requesting that children submit personal information online” to “requesting, prompting,

or encouraging a child to submit personal information online.” The proposal was to clarify that

the Rule covers the online collection of personal information both when an operator requires it for

participation in an online activity, and when an operator merely prompts or encourages a child to

provide such information.[13] The comments received divided roughly equally between support of

[14] See Institute for Public Representation (comment 71, 2011 NPRM), at 19; kidSAFE Seal Program (comment 81, 2011 NPRM), at 5; Alexandra Lang (comment 87, 2011 NPRM), at 1.

[15] NCTA (comment 113, 2011 NPRM), at 17-18.

[16] Id.

[17] See 16 CFR 312.2: “Collects or collection means the gathering of any personal information from a child by any means, including but not limited to . . .”

and opposition to the proposed change to paragraph (a). Those in favor cited the increased clarity

of the revised language as compared to the existing language.[14]

Several commenters opposed the revised language of paragraph (a). For example, the

National Cable and Telecommunications Association (“NCTA”) expressed concern that the

revised language suggests that “COPPA obligations are triggered even without the actual or

intended collection of personal information.”[15] NCTA asked the Commission to clarify that

“prompting” or “encouraging” does not trigger COPPA unless an operator actually collects

personal information from a child.[16]

The Rule defines collection as “the gathering of any personal information from a child by

any means,” and the terms “prompting” and “encouraging” are merely exemplars of the means by

which an operator gathers personal information from a child.[17] This change to the definition of

collects or collection is intended to clarify the longstanding Commission position that an operator

that provides a field or open forum for a child to enter personal information will not be shielded

Page 8: Children's Online Privacy Protection Rule

Several other commenters raised concern that the language “prompting, or18

encouraging” could make sites or services that post third-party “Like” or “Tweet This” buttonssubject to COPPA. See Association for Competitive Technology (comment 5, 2011 NPRM), at6; Direct Marketing Association (“DMA”) (comment 37, 2011 NPRM), at 6; see also AmericanAssociation of Advertising Agencies (comment 2, 2011 NPRM), at 2-3; Interactive AdvertisingBureau (“IAB”) (comment 73, 2011 NPRM), at 12. The collection of personal information byplug-ins on child-directed sites is addressed fully in the discussion regarding changes to thedefinition of operator. See Part II.A.4.a., infra.

Under the Rule, operators who offered services such as social networking, chat,19

and bulletin boards and who did not pre-strip (i.e., completely delete) such information weredeemed to have “disclosed” personal information under COPPA’s definition of disclosure. See16 CFR 312.2.

-8-

from liability merely because entry of personal information is not mandatory to participate in the

activity. It recognizes the reality that such an operator must have in place a system to provide

notice to and obtain consent from parents to deal with the moment when the information is

“gathered.”[18] Otherwise, once the child posts the personal information, it will be too late to

obtain parental consent.

After reviewing the comments, the Commission has decided to modify paragraph (a) of

the definition of collects or collection as proposed in the 2011 NPRM.

b. Collects or Collection, paragraph (b)

Section 312.2(b) of the Rule defines “collects or collection” to cover enabling children to

publicly post personal information (e.g., on social networking sites or on blogs), “except where

the operator deletes all individually identifiable information from postings by children before

they are made public, and also deletes such information from the operator’s records.”[19] This

exception, often referred to as the “100% deletion standard,” was designed to enable sites and

[20] See P. Marcus, Remarks from COPPA’s Exceptions to Parental Consent Panel at the Federal Trade Commission’s Roundtable: Protecting Kids’ Privacy Online 310 (June 2, 2010), available at http://www.ftc.gov/bcp/workshops/coppa/COPPARuleReview_Transcript.pdf.

[21] See 75 FR at 17090, Question 9.

[22] See Entertainment Software Association (“ESA”) (comment 20, 2010 FRN), at 13-14; R. Newton (comment 46, 2010 FRN), at 4; Privo, Inc. (comment 50, 2010 FRN), at 5; B. Szoka (comment 59, 2010 FRN), at 19; see also Wired Safety (comment 68, 2010 FRN), at 15.

services to make interactive content available to children, without providing parental notice and

obtaining consent, provided that all personal information was deleted prior to posting.[20]

The 2010 FRN sought comment on whether to change the 100% deletion standard,

whether automated systems used to review and post child content could meet this standard, and

whether the Commission had provided sufficient guidance on the deletion of personal

information.[21] In response, several commenters urged a new standard, arguing that the 100%

deletion standard, while well-intentioned, was an impediment to operators’ implementation of

sophisticated automated filtering technologies that may actually aid in the detection and removal

of personal information.[22]

In the 2011 NPRM, the Commission stated that the 100% deletion standard set an

unrealistic hurdle to operators’ implementation of automated filtering systems that could promote

engaging and appropriate online content for children, while ensuring strong privacy protections

by design. To address this, the Commission proposed replacing the 100% deletion standard with

a “reasonable measures” standard. Under this approach, an operator would not be deemed to

have collected personal information if it takes reasonable measures to delete all or virtually all

[23] See 76 FR at 59808.

[24] See Institute for Public Representation (comment 71, 2011 NPRM), at 19.

[25] See NCTA (comment 113, 2011 NPRM), at 8.

[26] DMA (comment 37, 2011 NPRM), at 7.

[27] See DMA id.; Institute for Public Representation (comment 71, 2011 NPRM), at 3; kidSAFE Seal Program (comment 81, 2011 NPRM), at 5; NCTA (comment 113, 2011 NPRM), at 8; Toy Industry Association (comment 163, 2011 NPRM), at 8.

[28] See TechFreedom (comment 159, 2011 NPRM), at 6.

personal information from a child’s postings before they are made public, and also to delete such

information from its records.[23]

Although the Institute for Public Representation raised concerns about the effectiveness of

automated filtering techniques,[24] most comments were resoundingly in favor of the “reasonable

measures” standard. For example, one commenter stated that the revised language would enable

the use of automated procedures that could provide “increased consistency and more effective

monitoring than human monitors,”[25] while another noted that it would open the door to “cost-

efficient and reliable means of monitoring children’s communications.”[26] Several commenters

noted that the proposed reasonable measures standard would likely encourage the creation of

more rich, interactive online content for children.[27] Another commenter noted that the revised

provision, by offering greater flexibility for technological solutions, should help minimize the

burden of COPPA on children’s free expression.[28]

The Commission is persuaded that the 100% deletion standard should be replaced with a

reasonable measures standard. The reasonable measures standard strikes the right balance in

ensuring that operators have effective, comprehensive measures in place to prevent public online

disclosure of children’s personal information and ensure its deletion from their records, while also

[29] 76 FR at 59808.

[30] Privacy Rights Clearinghouse indicated its belief that this change would give operators added incentive to notify parents of their information collection practices, particularly with regard to online tracking and behavioral advertising. See Privacy Rights Clearinghouse (comment 131, 2011 NPRM), at 2; see also Consumers Union (comment 29, 2011 NPRM), at 2; kidSAFE Seal Program (comment 81, 2011 NPRM), at 6.

[31] See DMA (comment 37, 2011 NPRM), at 9-10; IAB (comment 73, 2011 NPRM), at 12; NCTA (comment 113, 2011 NPRM), at 17-18; National Retail Federation (comment 114, 2011 NPRM), at 2-3; TechAmerica (comment 157, 2011 NPRM), at 5-6.

retaining the flexibility operators need to innovate and improve their mechanisms for detecting

and deleting such information. Therefore, the final Rule amends paragraph (b) of the definition of

collects or collection to adopt the reasonable measures standard proposed in the 2011 NPRM.

c. Collects or Collection, paragraph (c)

In the 2011 NPRM, the Commission proposed to modify paragraph (c) of the Rule’s

definition of collects or collection to clarify that it includes all means of passively collecting

personal information from children online, irrespective of the technology used. The Commission

sought to accomplish this by removing from the original definition the language “or use of any

identifying code linked to an individual, such as a cookie.”[29]

The Commission received several comments supporting,[30] and several comments

opposing,[31] this proposed change. Those opposing the change generally believed that this change

somehow expanded the definition of personal information. As support for their argument, these

commenters also referenced the Commission’s proposal to include persistent identifiers within

the definition of personal information.

The Commission believes that paragraph (c), as proposed in the 2011 NPRM, is

sufficiently understandable. The paragraph does nothing to alter the fact that the Rule covers

Page 12: Children's Online Privacy Protection Rule

See Part II.C.10.g., infra.32

See 2011 NPRM, 76 FR at 59809.33

The Commission intended this change to clarify what was meant by the terms34

release of personal information and support for the internal operations of the website or onlineservice, where those terms are referenced elsewhere in the Rule and are not directly connectedwith the terms disclose or disclosure.

See kidSAFE Seal Program (comment 81, 2011 NPRM), at 8 (“[P]aragraph (b)35

under the definition of “disclose or disclosure” should have the following opening clause: Subject to paragraph (b) under the definition of “collects or collection,” making personalinformation collected by an operator from a child publicly available . . . .”).

-12-

only the collection of personal information. Moreover, the final Rule’s exception for the limited

use of persistent identifiers to support internal operations – 312.5(c)(7) – clearly articulates the

specific criteria under which an operator will be exempt from the Rule’s notice and consent

requirements in connection with the passive collection of a persistent identifier.[32] Accordingly,

the Commission adopts the definition of collects or collection as proposed in the 2011 NPRM.

2. Definition of Disclose or Disclosure

In the 2011 NPRM, the Commission proposed making several minor modifications to

Section 312.2 of the Rule’s definition of disclosure, including broadening the title of the

definition to disclose or disclosure to clarify that in every instance in which the Rule refers to

instances where an operator “disclose[s]” information, the definition of disclosure shall apply.[33]

In addition, the Commission proposed moving the definitions of release of personal information

and support for the internal operations of the website or online service contained within the

definition of disclosure to make them stand-alone definitions within Section 312.2 of the Rule.[34]

One commenter asked the Commission to modify paragraph (b) of the proposed

definition by adding an opening clause linking it to the definition of collects or collection.[35]

[36] The Rule’s definition of personal information included the sub-category “an e-mail address or other online contact information, including but not limited to an instant messaging user identifier, or a screen name that reveals an individual’s e-mail address.” The 2011 NPRM proposed replacing that sub-category of personal information with online contact information.

[37] 76 FR at 59810.

While this commenter did not state its reasons for the proposed change, the Commission believes

that the language of paragraph (b) is sufficiently clear so as not to warrant making the change

suggested. Therefore, the Commission modifies the definition of disclose or disclosure as

proposed in the 2011 NPRM.

3. Definition of Online Contact Information

Section 312.2 of the Rule defines online contact information as “an e-mail address or any

other substantially similar identifier that permits direct contact with a person online.” The 2011

NPRM proposed clarifications to the definition to flag that the term broadly covers all identifiers

that permit direct contact with a person online and to ensure consistency between the definition

of online contact information and the use of that term within the definition of personal

information.[36] The proposed revised definition identified commonly used online identifiers,

including email addresses, instant messaging (“IM”) user identifiers, voice over Internet protocol

(“VOIP”) identifiers, and video chat user identifiers, while also clarifying that the list of

identifiers was non-exhaustive and would encompass other substantially similar identifiers that

permit direct contact with a person online.[37] The Commission received few comments

addressing this proposed change.

One commenter opposed the modification, asserting that IM, VOIP, and video chat user

identifiers do not function in the same way as email addresses. The commenter’s rationale for

[38] See DMA (comment 37, 2011 NPRM), at 11.

[39] kidSAFE Seal Program (comment 81, 2011 NPRM), at 7. Acknowledging the Commission’s position that cell phone numbers are outside of the statutory definition of online contact information, kidSAFE advocates for a statutory change, if needed, to enable mobile app operators, in particular, to reach parents using contact information “relevant to their ecosystem.”

[40] At the same time, the Commission believes it may be impractical to expect children to correctly distinguish between mobile and land-line phones when asked for their parents’ mobile numbers.

this argument was that not all IM identifiers reveal the IM system in use, which information is

needed to directly contact a user.[38] The Commission does not find this argument persuasive.

While an IM address may not reveal the IM program provider in every instance, it very often

does. Moreover, several IM programs allow users of different messenger programs to

communicate across different messaging platforms. Like email, instant messaging is a

communications tool that allows people to communicate one-to-one or in groups – sometimes in

a faster, more real-time fashion than through email. The Commission finds, therefore, that IM

identifiers provide a potent means to contact a child directly.

Another commenter asked the Commission to expand the definition of online contact

information to include mobile phone numbers. The commenter noted that, given the Rule’s

coverage of mobile apps and web-based text messaging programs, operators would benefit

greatly from collecting a parent’s mobile phone number (instead of an email address) in order to

initiate contact for notice and consent.[39] The Commission recognizes that including mobile

phone numbers within the definition of online contact information could provide operators with

a useful tool for initiating the parental notice process through either SMS text or a phone call. It

also recognizes that there may be advantages to parents for an operator to initiate contact via

SMS text – among them, that parents generally have their mobile phones with them and that

SMS text is simple and convenient.[40] However, the statute did not contemplate mobile phone

[41] Moreover, given that the final Rule’s definition of online contact information encompasses a broad, non-exhaustive list of online identifiers, operators will not be unduly burdened by the Commission’s determination that cell phone numbers are not online contact information.

[42] 2012 SNPRM, 77 FR at 46644. The Commission acknowledged that this decision reversed a previous policy choice to place the burden of notice and consent entirely upon the information collection entity.

numbers as a form of online contact information, and the Commission therefore has determined

not to include mobile phone numbers within the definition.[41] Thus, the final Rule adopts the

definition of online contact information as proposed in the 2011 NPRM.

4. Definitions of Operator and Website or Online Service Directed to Children

In the 2012 SNPRM, the Commission proposed modifying the definitions of both

operator and website or online service directed to children to allocate and clarify the

responsibilities under COPPA when independent entities or third parties, e.g., advertising

networks or downloadable plug-ins, collect information from users through child-directed sites

and services. Under the proposed revisions, the child-directed content provider would be strictly

liable for personal information collected by third parties through its site. The Commission

reasoned that, although the child-directed site or service may not own, control, or have access to

the personal information collected, such information is collected on its behalf due to the benefits

it receives by adding more attractive content, functionality, or advertising revenue. The

Commission also noted that the primary-content provider is in the best position to know that its

site or service is directed to children, and is appropriately positioned to give notice and obtain

consent.[42] By contrast, if the Commission failed to impose obligations on the content providers,

[43] In so doing, the Commission noted that it believed it could hold the information collection entity strictly liable for such collection because, when operating on child-directed properties, that portion of an otherwise general audience service could be deemed directed to children. 2012 SNPRM, 77 FR at 46644-46645.

[44] See, e.g., Facebook (comment 33, 2012 SNPRM), at 3-4.

[45] See Microsoft (comment 66, 2012 SNPRM), at 6; IAB (comment 49, 2012 SNPRM), at 5; DMA (comment 28, 2012 SNPRM), at 5.

[46] See, e.g., Institute for Public Representation (comment 52, 2012 SNPRM), at 20; Common Sense Media (comment 20, 2012 SNPRM), at 6.

there would be no incentive for child-directed content providers to police their sites or services,

and personal information would be collected from young children, thereby undermining

congressional intent. The Commission also proposed imputing the child-directed nature of the

content site to the entity collecting the personal information only if that entity knew or had

reason to know that it was collecting personal information through a child-directed site.[43]

Most of the comments opposed the Commission’s proposed modifications. Industry

comments challenged the Commission’s statutory authority for both changes and the breadth of

the language, and warned of the potential for adverse consequences. In essence, many industry

comments argued that the Commission may not apply COPPA where independent third parties

collect personal information through child-directed sites,[44] and that even if the Commission had

some authority, exercising it would be impractical because of the structure of the “online

ecosystem.”[45] Many privacy and children’s advocates agreed with the 2012 SNPRM proposal to

hold child-directed content providers strictly liable, but some expressed concern about holding

plug-ins and advertising networks to a lesser standard.[46]

[47] 15 U.S.C. 6501(2). The Rule’s definition of operator reflects the statutory language. See 16 CFR 312.2.

For the reasons discussed below, the Commission, with some modifications to the

proposed Rule language, will retain the strict liability standard for child-directed content

providers that allow other online services to collect personal information through their sites. The

Commission will deem a plug-in or other service to be a covered co-operator only where it has

actual knowledge that it is collecting information through a child-directed site.

a. Strict Liability for Child-Directed Content Sites: Definition of Operator

Implementing strict liability as described above requires modifying the current definition

of operator. The Rule, which mirrors the statutory language, defines operator in pertinent part,

as:

(A) any person who operates a website located on the Internet or an online service and who collects or maintains personal information from or about the users of or visitors to such website or online service, or on whose behalf such information is collected or maintained, where such website or online service is operated for commercial purposes, including any person offering products or services for sale through that website or online service, involving commerce . . . .[47]

In the 2012 SNPRM, the Commission proposed adding a proviso to that definition stating:

Personal information is collected or maintained on behalf of an operator where it is collected in the interest of, as a representative of, or for the benefit of, the operator.

[48] See, e.g., Application Developers Alliance (comment 5, 2012 SNPRM), at 3-4; Association of Competitive Technology (comment 7, 2012 SNPRM), at 4-5; IAB (comment 49, 2012 SNPRM), at 5-6; Online Publishers Association (comment 72, 2012 SNPRM), at 10-11; Magazine Publishers of America (comment 61, 2012 SNPRM), at 3-5; The Walt Disney Co. (comment 96, 2012 SNPRM), at 4-5; S. Weiner (comment 97, 2012 SNPRM), at 1-2; WiredSafety (comment 98, 2012 SNPRM), at 3.

[49] See DMA (comment 28, 2012 SNPRM), at 12; Internet Commerce Coalition (comment 53, 2012 SNPRM), at 5; TechAmerica (comment 87, 2012 SNPRM), at 2-3.

[50] See, e.g., Gibson, Dunn & Crutcher (comment 39, 2012 SNPRM), at 7-9; Facebook (comment 33, 2012 SNPRM), at 6 (entities acting primarily for their own benefit not considered to be acting on behalf of another party).

[51] See, e.g., Business Software Alliance (comment 12, 2012 SNPRM), at 2-4; Internet Commerce Coalition (comment 53, 2012 SNPRM), at 5; see also, e.g., IAB (comment 49, 2012 SNPRM), at 5; DMA (comment 28, 2012 SNPRM), at 6; Online Publishers Association (comment 72, 2012 SNPRM), at 10-11; The Walt Disney Co. (comment 96, 2012 SNPRM), at 3-5.

[52] See Center for Democracy & Technology (“CDT”) (comment 15, 2012 SNPRM), at 4-5; DMA (comment 28, 2012 SNPRM), at 5; Google (comment 41, 2012 SNPRM), at 3-4; Lynette Mattke (comment 63, 2012 SNPRM).

Industry, particularly online content publishers, including app developers, criticized this

proposed change.[48] Industry comments argued that the phrase “on whose behalf” in the statute

applies only to agents and service providers,[49] and that the Commission lacks the authority to

interpret the phrase more broadly to include any incidental benefit that results when two parties

enter a commercial transaction.[50] Many commenters pointed to an operator’s post-collection

responsibilities under COPPA, e.g., mandated data security and affording parents deletion rights,

as evidence that Congress intended to cover only those entities that control or have access to the

personal information.[51]

Commenters also raised a number of policy objections. Many argued that child-directed

properties, particularly small app developers, would face unreasonable compliance costs and that

the proposed revisions might choke off their monetization opportunities,[52] thus decreasing the

[53] See Google (comment 41, 2012 SNPRM), at 3; Application Developers Alliance (comment 5, 2012 SNPRM), at 5; Association for Competitive Technology (comment 6, 2012 SNPRM), at 5; The Walt Disney Co. (comment 96, 2012 SNPRM), at 4; ConnectSafely (comment 21, 2012 SNPRM), at 2.

[54] See Application Developers Alliance (comment 5, 2012 SNPRM), at 3; Online Publishers Association (comment 72, 2012 SNPRM), at 11; The Walt Disney Co. (comment 96, 2012 SNPRM), at 4; DMA (comment 28, 2012 SNPRM), at 4.

[55] See, e.g., Online Publishers Association (comment 72, 2012 SNPRM), at 11 (publisher should be entitled to rely on third party’s representations about its information practices); The Walt Disney Co. (comment 96, 2012 SNPRM), at 5 (operator of a site directed to children should be permitted to rely on the representations made by third parties regarding their personal information collection practices, as long as the operator has undertaken reasonable efforts to limit any unauthorized data collection); Internet Commerce Coalition (comment 53, 2012 SNPRM), at 6 (the Commission should state that operators whose sites or services are targeted to children should bind third party operators whom they know are collecting personal information through their sites or services to comply with COPPA with regard to that information collection).

[56] See Institute for Public Representation (comment 52, 2012 SNPRM), at 18-19; Common Sense Media (comment 20, 2012 SNPRM), at 4-6; EPIC (comment 31, 2012 SNPRM), at 5-6; Catholic Bishops (comment 92, 2012 SNPRM), at 3; CDT (comment 15, 2012 SNPRM), at 3.

incentive for developers to create engaging and educational content for children.[53] They also

argued that a strict liability standard is impractical given the current online ecosystem, which

does not rely on close working relationships and communication between content providers and

third parties that help monetize that content.[54] Some commenters urged the Commission to

consider a safe harbor for content providers that exercise some form of due diligence regarding

the information collection practices of plug-ins present on their site.[55]

Privacy organizations generally supported imposing strict liability on content providers.

They agreed with the Commission’s statement in the 2012 SNPRM that the first-party content

provider is in a position to control which plug-ins and software downloads it integrates into its

site and that it benefits by allowing information collection by such third parties.[56] They also

[57] See Institute for Public Representation (comment 52, 2012 SNPRM), at 19; Common Sense Media (comment 20, 2012 SNPRM), at 5.

[58] See CDT (comment 15, 2012 SNPRM), at 5; Apple (comment 4, 2012 SNPRM), at 3-4; Assert ID (comment 6, 2012 SNPRM), at 5.

[59] Although this issue is framed in terms of child-directed content providers integrating plug-ins or other online services into their sites because that is by far the most likely scenario, the same strict liability standard would apply to a general audience content provider that allows a plug-in to collect personal information from a specific user when the provider has actual knowledge the user is a child.

noted how unreasonable it would be for parents to try to decipher which entity might actually be

collecting data through the child-directed property.[57]

Finally, many commenters expressed concern that the language describing “on whose

behalf” reaches so broadly as to cover not only child-directed content sites, but also marketplace

platforms such as Apple’s iTunes App Store and Google’s Android market (now Google Play) if

they offered child-directed apps on their platforms.[58] These commenters urged the Commission

to revise the language of the Rule to exclude such platforms.

After considering the comments, the Commission retains a strict liability standard for

child-directed sites and services that allow other online services to collect personal information

through their sites.[59] The Commission disagrees with the views of commenters that this is

contrary to Congressional intent or the Commission’s statutory authority. The Commission does

not believe Congress intended the loophole advocated by many in industry: personal

information being collected from children through child-directed properties with no one

responsible for such collection.

[60] National Organization for Marriage v. Daluz, 654 F.3d 115, 121 (1st Cir. 2011) (statute requiring expenditure reports by independent PAC to the treasurer of the candidate “on whose behalf” the expenditure was made meant to the candidate who stands to benefit from the independent expenditure’s advocacy); accord American Postal Workers Union v. United States Postal Serv., 595 F. Supp. 1352 (D.D.C. 1984) (Postal Union’s activities held to be “on behalf of” a political campaign where evidence showed union was highly politicized, with goal of electing a particular candidate); Sedwick Claims Mgmt. Servs. v. Barrett Business Servs., Inc., 2007 WL 1053303 (D. Or. 2007) (noting that the 9th Circuit has interpreted the phrase “on behalf of” to include both “to the benefit of” and in a representative capacity); United States v. Dish Network, LLC, 2010 U.S. Dist. LEXIS 8957, 10 (C.D. Ill. Feb. 3, 2010) (reiterating the court’s previous opinion that the plain meaning of the phrases “on whose behalf” or “on behalf of” is an act by a representative of, or an act for the benefit of, another).

[61] Application Developers Alliance (comment 5, 2012 SNPRM), at 2; see also Gibson, Dunn & Crutcher (comment 39, 2012 SNPRM), at 7.

[62] Application Developers Alliance (comment 5, 2012 SNPRM), at 4.

Nor is the Commission persuaded by comments arguing that the phrase “on whose

behalf” must be read extremely narrowly, encompassing only an agency relationship. Case law

supports a broader interpretation of that phrase.[60] Even some commenters opposed to the

Commission’s interpretation have acknowledged that the Commission’s proposal is based on “an

accurate recognition that online content monetization is accomplished through a complex web of

inter-related activities by many parties,” and have noted that to act on behalf of another is to do

what that person would ordinarily do herself if she could.[61] That appears to be precisely the

reason many first-party content providers integrate these services. As one commenter pointed

out, content providers “have chosen to devote their resources to develop great content, and to let

partners help them monetize that content. In part, these app developers and publishers have

made this choice because collecting and handling children’s data internally would require them

to take on liability risk and spend compliance resources that they do not have.”[62] Moreover,

[63] Id.; see also Association for Competitive Technology (comment 7, 2012 SNPRM), at 5; see generally DMA (comment 28, 2012 SNPRM), at 5; Facebook (comment 33, 2012 SNPRM), at 3; Online Publishers Association (comment 72, 2012 SNPRM), at 11.

[64] Id.

content-providing sites and services often outsource the monetization of those sites “to partners”

because they do not have the desire to handle it themselves.[63]

In many cases, child-directed properties integrate plug-ins to enhance the functionality or

content of their properties or gain greater publicity through social media in an effort to drive

more traffic to their sites and services. Child-directed properties also may obtain direct

compensation or increased revenue from advertising networks or other plug-ins. These benefits

to child-directed properties are not merely incidental; as the comments point out, the benefits

may be crucial to their continued viability.[64]

The Commission recognizes the potential burden that strict liability places on child-

directed content providers, particularly small app developers. The Commission also appreciates

the potential for discouraging dynamic child-directed content. Nevertheless, when it enacted

COPPA, Congress imposed absolute requirements on child-directed sites and services regarding

restrictions on the collection of personal information; those requirements cannot be avoided

through outsourcing offerings to other operators in the online ecosystem. The Commission

believes that the potential burden on child-directed sites discussed by the commenters in

response to the 2012 SNPRM will be eased by the more limited definition of persistent

identifiers, the more expansive definition of support for internal operations adopted in the Final

Rule, and the newly-created exception to the Rule’s notice and parental consent requirements

[65] See Part II.A.5.b., infra (discussion of persistent identifiers and support of internal operations).

[66] The type of due diligence advocated ranged from essentially relying on a plug-in or advertising network’s privacy policy to requiring an affirmative contract. See, e.g., The Walt Disney Co. (comment 96, 2012 SNPRM), at 5 (operator should be able to rely on third party’s representations about its information collection practices, if operator makes reasonable efforts to limit unauthorized data collection); Gibson, Dunn & Crutcher (comment 39, 2012 SNPRM), at 23-24 (provide a safe harbor for operators that certify they do not receive, own, or control any personal information collected by third parties; alternatively, grant a safe harbor for operators that also certify they do not receive a specific benefit from the collection, or that obtain third party’s certification of COPPA compliance); Internet Commerce Coalition (comment 53, 2012 SNPRM), at 6-7 (provide a safe harbor for operators whose policies prohibit third party collection on their sites).

[67] See Common Sense Media (comment 20, 2012 SNPRM), at 4-5; EPIC (comment 31, 2012 SNPRM), at 6; Institute for Public Representation (comment 52, 2012 SNPRM), at 18-19.

[68] Some commenters, although not conceding the need to impose strict liability on any party, noted that if the burden needed to fall on either the primary content provider or the plug-in, it was better to place it on the party that controlled the child-directed nature of the content. See, e.g., CTIA (comment 24, 2012 SNPRM), at 8-9; CDT (comment 15, 2012 SNPRM), at 4-5. Not surprisingly, industry members primarily in the business of providing content did not share this view. See, e.g., Association for Competitive Technology (comment 7, 2012 SNPRM), at 4-5; Business Software Alliance (comment 12, 2012 SNPRM), at 2-4; Entertainment Software Association (comment 32, 2012 SNPRM), at 9; Online Publishers Association (comment 72, 2012 SNPRM), at 10-11; The Walt Disney Co. (comment 96, 2012 SNPRM), at 6.

that applies when an operator collects only a persistent identifier and only to support the

operator’s internal operations.[65]

The Commission considered including the “due-diligence” safe harbor for child-directed

content providers that many of the comments proposed.[66] Nevertheless, as many other

comments pointed out, it cannot be the responsibility of parents to try to pierce the complex

infrastructure of entities that may be collecting their children’s personal information through any

one site.[67] For child-directed properties, one entity, at least, must be strictly responsible for

providing parents notice and obtaining consent when personal information is collected through

that site. The Commission believes that the primary-content site or service is in the best position

to know which plug-ins it integrates into its site, and is also in the best position to give notice

and obtain consent from parents.[68] Although the Commission, in applying its prosecutorial

[69] This clarification to the term “on behalf of” is intended only to address platforms in instances where they function as a conduit to someone else’s content. Platforms may well wear multiple hats and are still responsible for complying with COPPA if they themselves collect personal information directly from children.

discretion, will consider the level of due diligence a primary-content site exercises, the

Commission will not provide a safe harbor from liability.

When it issued the 2012 SNPRM, the Commission never intended the language

describing “on whose behalf” to encompass platforms, such as Google Play or the App Store,

when such stores merely offer the public access to someone else’s child-directed content. In

these instances, the Commission meant the language to cover only those entities that designed

and controlled the content, i.e., the app developer or site owner. Accordingly, the Commission

has revised the language proposed in the 2012 SNPRM to clarify that personal information will

be deemed to be collected on behalf of an operator where it benefits by allowing another person

to collect personal information directly from users of such operator’s site or service, thereby

limiting the provision’s coverage to operators that design or control the child-directed content.[69]

Accordingly, the Final Rule shall read:

Personal information is collected or maintained on behalf of an operator when: (a) it is collected or maintained by an agent or service provider of the operator; or (b) the operator benefits by allowing another person to collect personal information directly from users of such operator’s website or online service.

[70] See Business Software Alliance (comment 12, 2012 SNPRM), at 4-5; Digital Advertising Alliance (comment 27, 2012 SNPRM), at 2; Google (comment 41, 2012 SNPRM), at 4; Internet Commerce Coalition (comment 53, 2012 SNPRM), at 7; Magazine Publishers of America (comment 61, 2012 SNPRM), at 8; Toy Industry Association (comment 89, 2012 SNPRM), at 10-11; see also ACLU (comment 3, 2012 SNPRM), at 2-3; TechAmerica (comment 87, 2012 SNPRM), at 3.

[71] See CDT (comment 15, 2012 SNPRM), at 2; CTIA (comment 24, 2012 SNPRM), at 10; Entertainment Software Association (comment 32, 2012 SNPRM), at 9; Marketing Research Association (comment 62, 2012 SNPRM), at 2; Tangman (comment 85, 2012 SNPRM).

b. Operators Collecting Personal Information Through Child-Directed Sites and Online Services: Moving to an Actual Knowledge Standard

In the 2012 SNPRM, the Commission proposed holding responsible as a co-operator any

site or online service that “knows or has reason to know” it is collecting personal information

through a host website or online service directed to children. Many commenters criticized this

standard. Industry comments contended that such a standard is contrary to the statutory mandate

that general audience services be liable only if they have actual knowledge they are collecting

information from a child.[70] They further argued that the standard is vague because it is

impossible to determine what type of notification would provide a “reason to know.” Thus, the

commenters argued that the standard triggers a duty to inquire.[71] In addition, commenters stated

that even after inquiring, it might be impossible to determine which sites are truly directed to

children (particularly in light of the Commission’s revised definition of website directed to

children to include those sites that are likely to attract a disproportionate percentage of children

[72] See DMA (comment 28, 2012 SNPRM), at 9; Magazine Publishers of America (comment 61, 2012 SNPRM), at 8; Menessec (comment 65, 2012 SNPRM); Privo (comment 76, 2012 SNPRM), at 8.

[73] See Common Sense Media (comment 20, 2012 SNPRM), at 6; Institute for Public Representation (comment 52, 2012 SNPRM), at 20-22.

[74] See Digital Advertising Alliance (comment 27, 2012 SNPRM), at 2; DMA (comment 28, 2012 SNPRM), at 8-9; Entertainment Software Association (comment 32, 2012 SNPRM), at 13-14.

under 13).[72] Conversely, many privacy advocates believed it is necessary to impose some duty

of inquiry, or even strict liability, on the entity collecting the personal information.[73]

After considering the comments, the Commission has decided that while it is appropriate

to hold an entity liable under COPPA for collecting personal information on websites or online

services directed to children, it is reasonable to hold such entity liable only where it has actual

knowledge that it is collecting personal information directly from users of a child-directed site or

service. In striking this balance by moving to an actual knowledge standard, the Commission

recognizes that this is still contrary to the position advocated by many industry comments: that a

plug-in or advertising network that collects personal information from users of both general

audience and child-directed sites must be treated monolithically as a general audience service,

liable only if it has actual knowledge that it is collecting personal information from a specific

child.[74] However, the COPPA statute also defines website or online service directed to children

to include “that portion of a commercial website or online service that is targeted to children.”

Where an operator of an otherwise general audience site or online service has actual knowledge

it is collecting personal information directly from users of a child-directed site, and continues to

collect that information, then, for purposes of the statute, it has effectively adopted that child-

[75] Similarly, when a behavioral advertising network offers age-based advertising segments that target children under 13, that portion of its service becomes an online service directed to children. Contra DMA (comment 28, 2012 SNPRM), at 12. The Commission also believes that narrowing the definition of persistent identifiers and further revisions to the definition of website or online service directed to children ease (although not entirely eliminate) many of the concerns expressed in industry comments. See, e.g., CDT (comment 15, 2012 SNPRM), at 3; Digital Advertising Alliance (comment 27, 2012 SNPRM), at 2; Entertainment Software Association (comment 32, 2012 SNPRM), at 14 (combination of reason to know standard and expanded definition of persistent identifiers creates an unworkable result).

[76] See Microsoft (comment 66, 2012 SNPRM), at 2; TRUSTe (comment 90, 2012 SNPRM), at 4; see also Association for Competitive Technology (comment 7, 2012 SNPRM), at 3-4; Google (comment 41, 2012 SNPRM), at 4; DMA (comment 28, 2012 SNPRM), at 7; Viacom (comment 95, 2012 SNPRM), at 8-9.

directed content as its own and that portion of its service may appropriately be deemed to be

directed to children.[75]

Commenters urged that, whatever standard the Commission ultimately adopts, it provide

guidance as to when a plug-in or advertising network would be deemed to have knowledge that it

is collecting information through a child-directed site or service.[76] Knowledge, by its very

nature, is a highly fact-specific inquiry. The Commission believes that the actual knowledge

standard it is adopting will likely be met in most cases when: (1) a child-directed content

provider (who will be strictly liable for any collection) directly communicates the child-directed

nature of its content to the other online service; or (2) a representative of the online service

recognizes the child-directed nature of the content. The Commission does not rule out that an

accumulation of other facts would be sufficient to establish actual knowledge, but those facts

would need to be analyzed carefully on a case-by-case basis.


See 16 CFR 312.2 (paragraph (c), definition of personal information). 77

2011 NPRM, 76 FR at 59810.78

Id.79

See DMA (comment 37, 2011 NPRM), at 15-16; ESA (comment 47, 2011 NPRM), at 9; NCTA (comment 113, 2011 NPRM), at 12; Scholastic (comment 144, 2011 NPRM), at 12; A. Thierer (comment 162, 2011 NPRM), at 6; TRUSTe (comment 164, 2011 NPRM), at 3; The Walt Disney Co. (comment 170, 2011 NPRM), at 21.80


5. Definition of Personal Information

a. Screen or User Names

The Rule defines personal information as including “a screen name that reveals an

individual’s e-mail address.” In the 2011 NPRM, the Commission proposed to modify this77

definition to include “a screen or user name where such screen or user name is used for functions

other than or in addition to support for the internal operations of the website or online service.” 78

The Commission intended this change to address scenarios in which a screen or user name could

be used by a child as a single credential to access multiple online properties, thereby permitting

him or her to be directly contacted online, regardless of whether the screen or user name

contained an email address. 79

Some commenters expressed concern that the Commission’s screen-name proposal

would unnecessarily inhibit functions that are important to the operation of child-directed

websites and online services. In response to this concern, the 2012 SNPRM proposed covering80

screen names as personal information only in those instances in which a screen or user name

rises to the level of online contact information. In such cases, the Commission reasoned, a


See 2011 NPRM, 76 FR at 59810 (proposed definition of online contact81

information).

See Common Sense Media (comment 20, 2012 SNPRM), at 7; Information Technology Industry Council (comment 51, 2012 SNPRM), at 2; Marketing Research Association (comment 62, 2012 SNPRM), at 3; Promotion Marketing Association (comment 77, 2012 SNPRM), at 8; TechAmerica (comment 87, 2012 SNPRM), at 5-6.82

See, e.g., Promotion Marketing Association, id.83

See DMA (comment 28, 2012 SNPRM), at 16; ESA (comment 32, 2012 SNPRM), at 5; kidSAFE Seal Program (comment 56, 2012 SNPRM), at 5; NCTA (comment 69, 2012 SNPRM), at 4-5; Online Publishers Association (comment 72, 2012 SNPRM), at 12; Toy Industry Association (comment 89, 2012 SNPRM), at 13; TRUSTe (comment 90, 2012 SNPRM), at 5-6.84


screen or user name functions much like an email address, an instant messaging identifier, or

“any other substantially similar identifier that permits direct contact with a person online.” 81

The Commission received a number of comments in support of this change from industry

associations and advocacy groups. Commenters recognized the change as providing operators82

with the flexibility to use screen or user names both for internal administrative purposes and

across affiliated sites, services, or platforms without requiring prior parental notification or

verifiable parental consent. 83

A number of commenters, however, despite clear language otherwise in the 2012

SNPRM, continued to express concern that the Commission’s proposed revision would limit

operators’ use of anonymized screen names in place of children’s real names in filtered chat,

moderated interactive forums, or as log-in credentials providing users with seamless access to

content across multiple platforms and devices. Some of these commenters urged the84

Commission to refine the definition further, for example, by explicitly recognizing that the use

of screen names for activities such as moderated chat will not be deemed as permitting “direct


See Online Publishers Association (comment 72, 2012 SNPRM), at 12; TRUSTe (comment 90, 2012 SNPRM), at 5-6.85

See kidSAFE Seal Program (comment 56, 2012 SNPRM), at 5.86

See ESA (comment 32, 2012 SNPRM), at 5.87

See Common Sense Media (comment 20, 2012 SNPRM), at 7.88


contact” with a child online and therefore will not require an operator using anonymous screen

names to notify parents or obtain their consent. Others suggested a return to the Commission’s85

original definition of screen or user names, i.e., only those that reveal an individual’s online

contact information (as newly defined). Yet others hoped to see the Commission carve out86

from the definition of screen or user name uses to support an operator’s internal operations (such

as using screen or user names to enable moderated or filtered chat and multiplayer game

modes).87

The Commission sees no need to qualify further the proposed description of screen or

user name. The description identifies precisely the form of direct, private, user-to-user contact

the Commission intends the Rule to cover – i.e., “online contact [that] can now be achieved via

several methods besides electronic mail.” The Commission believes the description permits88

operators to use anonymous screen and user names in place of individually identifiable

information, including use for content personalization, filtered chat, for public display on a

website or online service, or for operator-to-user communication via the screen or user name.

Moreover, the definition does not reach single log-in identifiers that permit children to transition

between devices or access related properties across multiple platforms. For these reasons, the

Commission modifies the definition of personal information, as proposed in the 2012 SNPRM,


See 16 CFR 312.2 of the existing Rule (paragraph (f), definition of personal89

information).

See 2011 NPRM, 76 FR at 59812 (proposed definition of personal information,90

paragraphs (g) and (h)).

Those comments are discussed in the 2012 SNPRM, 77 FR at 46647.91


to include “a screen or user name where it functions in the same manner as online contact

information, as defined in this Section.”

b. Persistent Identifiers and Support for Internal Operations

Persistent identifiers have long been covered by the COPPA Rule, but only where they

are associated with individually identifiable information. In the 2011 NPRM, and again in the89

2012 SNPRM, the Commission proposed broader Rule coverage of persistent identifiers.

First, in the 2011 NPRM, the Commission proposed covering persistent identifiers in two

scenarios – (1) where they are used for functions other than or in addition to support for the

internal operations of the website or online service, and (2) where they link the activities of a

child across different websites or online services. After receiving numerous comments on the90

proposed inclusion of persistent identifiers within the definition of personal information, the91

Commission refined its proposal in the 2012 SNPRM.

In the Commission’s refined proposal in the 2012 SNPRM, the definition of personal

information would include a persistent identifier “that can be used to recognize a user over time,

or across different websites or online services, where such persistent identifier is used for

functions other than or in addition to support for the internal operations of the website or online


Id.92

The proposed definition of support for internal operations was:93

[T]hose activities necessary to: (a) maintain or analyze the functioning of the Web site or online service; (b) perform network communications; (c) authenticate users of, or personalize the content on, the Web site or online service; (d) serve contextual advertising on the Web site or online service; (e) protect the security or integrity of the user, Web site, or online service; or (f) fulfill a request of a child as permitted by §§ 312.5(c)(3) and (4); so long as the information collected for the activities listed in (a)–(f) is not used or disclosed to contact a specific individual or for any other purpose.

Id. at 46648.

Contextual advertising is “the delivery of advertisements based upon a consumer’s current visit to a web page or a single search query, without the collection and retention of data about the consumer’s online activities over time.” See Preliminary FTC Staff Report, “Protecting Consumer Privacy in an Era of Rapid Change: A Proposed Framework for Businesses and Policymakers,” (Dec. 2010), at 55 n.134, available at http://ftc.gov/os/2010/12/101201privacyreport.pdf. Such advertising is more transparent and presents fewer privacy concerns as compared to the aggregation and use of data across sites and over time for marketing purposes. See id.94

For example, the term “personalize the content on the website or online service” was intended to permit operators to maintain user-driven preferences, such as game scores, or character choices in virtual worlds.95


service.” The Commission also proposed to set forth with greater specificity the types of92

permissible activities that would constitute support for internal operations. The proposed93

revision to this latter definition was intended to accomplish three goals: (1) to incorporate into

the Rule text many of the types of activities – user authentication, maintaining user preferences,

serving contextual advertisements, and protecting against fraud or theft – that the Commission94

initially discussed as permissible in the 2011 NPRM; (2) to specifically permit the collection of

persistent identifiers for functions related to site maintenance and analysis, and to perform

network communications that many commenters viewed as crucial to their ongoing operations;95


Id. 96

15 U.S.C. 6501(8)(F) defines personal information to include “any other identifier that the Commission determines permits the physical or online contacting of a specific individual.” See, e.g., Gibson Dunn & Crutcher (comment 39, 2012 SNPRM), at 20 (“This expansion of the definition of ‘personal information’ is inconsistent with the text of COPPA, which limits ‘personal information’ to categories of information that by themselves can be used to identify and contact a specific individual. Every category of information that COPPA enumerates – name, physical address, email address, telephone number, and Social Security number – as well as the catch-all for ‘any other identifier that the Commission determines permits the physical or online contacting of a specific individual,’ 15 U.S.C. § 6501(8)(A)-(F) – is information that makes it possible to identify and contact a specific individual”); see also Business Software Alliance (comment 12, 2012 SNPRM), at 5-6; CTIA (comment 24, 2012 SNPRM), at 14-17; Chappell (comment 18, 2012 SNPRM), at 1; DMA (comment 28, 2012 SNPRM), at 10; Facebook (comment 33, 2012 SNPRM), at 9; Information Technology Industry Council (comment 51, 2012 SNPRM), at 2; Internet Commerce Coalition (comment 53, 2012 SNPRM), at 11-13; Microsoft (comment 66, 2012 SNPRM), at 3; NetChoice (comment 70, 2012 SNPRM), at 7; TechFreedom (comment 88, 2012 SNPRM), at 5-6.97

See Application Developers Alliance (comment 5, 2012 SNPRM), at 6; Business Software Alliance (comment 12, 2012 SNPRM), at 6; Information Technology and Innovation Foundation (comment 50, 2012 SNPRM), at 6-7; NetChoice (comment 70, 2012 SNPRM), at 6.98


and (3) to make clear that none of the information collected may be used or disclosed to contact

a specific individual, including through the use of behavioral advertising.96

Most of the commenters who responded to the 2012 SNPRM opposed the Commission’s

refinement. Many continued to argue, as they had done in response to the 2011 NPRM, that

because persistent identifiers only permit contact with a device, not a specific individual, the

Commission was exceeding its statutory authority by defining them as personal information. 97

Others argued strenuously for the benefits to children, parents, operators, and commerce of

collecting anonymous information on, and delivering advertisements to, unknown or unnamed

users. Some commenters maintained that, to comply with COPPA’s notice and consent98


Facebook (comment 33, 2012 SNPRM), at 9-10; Google (comment 41, 2012 SNPRM), at 5; J. Holmes (comment 47, 2012 SNPRM).99

Association for Competitive Technology (comment 7, 2012 SNPRM), at 5; Business Software Alliance (comment 12, 2012 SNPRM), at 6-7; CTIA (comment 24, 2012 SNPRM), at 17-18; DMA (comment 28, 2012 SNPRM), at 10-12; Internet Commerce Coalition (comment 53, 2012 SNPRM), at 12; Microsoft (comment 66, 2012 SNPRM), at 3-5; NetChoice (comment 70, 2012 SNPRM), at 8-9.100

See DMA (comment 28, 2012 SNPRM), at 11 (warning that an exhaustive list is likely to have unintended consequences if companies are not afforded flexibility as technologies evolve); Digital Advertising Alliance (comment 27, 2012 SNPRM), at 3; Internet Commerce Coalition (comment 53, 2012 SNPRM), at 3-4, 12 (“[T]he definition of ‘support for the internal operations’ of a website is too narrow . . . . This list of ‘exempt’ collections is incomplete and risks quickly becoming outmoded.”); Magazine Publishers of America (comment 61, 2012 SNPRM), at 11; Online Publishers Association (comment 72, 2012 SNPRM), at 8; Promotion Marketing Association (comment 77, 2012 SNPRM), at 7; Computer and Communications Industry Association (comment 27, 2011 NPRM), at 4 (the exceptions are narrow and “immobile short of another rulemaking”).101

See, e.g., Association for Competitive Technology (comment 7, 2012 SNPRM), at 5; IAB (comment 49, 2012 SNPRM), at 4; TechFreedom (comment 88, 2012 SNPRM), at 11; Toy Industry Association (comment 89, 2012 SNPRM), at 15; Viacom Inc. (comment 95, 2012 SNPRM), at 13.102


requirements in the context of persistent identifiers, sites would be forced to collect more

personal information on their users, contrary to COPPA’s goals of data minimization. 99

Because the proposed definition of persistent identifiers ran hand-in-hand with the

proposed carve-out for permissible activities, most commenters also opined on the proposed

scope of the definition of support for internal operations. Unsurprisingly, these commenters100

urged the Commission to broaden the definition either to make the list of permissible activities

non-exhaustive, or to clarify that activities such as ensuring legal and regulatory compliance,101

intellectual property protection, payment and delivery functions, spam protection, statistical

reporting, optimization, frequency capping, de-bugging, market research, and advertising and

marketing more generally would not require parental notification and consent on COPPA-

covered sites or services. Other commenters expressed confusion about which entities102


CDT (comment 15, 2012 SNPRM), at 6-7; Google (comment 41, 2012 SNPRM),103

at 5; Toy Industry Association (comment 89, 2012 SNPRM), at 14.

Institute for Public Representation (comment 52, 2012 SNPRM), at 13.104

See CDT (comment 15, 2012 SNPRM), at 6 (“We do, however, agree with the Commission that behavioral targeting of children using unique identifiers should trigger COPPA compliance obligations”); Internet Commerce Coalition (comment 53, 2012 SNPRM), at 12; see also AT&T (comment 8, 2011 NPRM), at 7; Future of Privacy Forum (comment 55, 2011 NPRM), at 2; WiredTrust (comment 177, 2011 NPRM), at 9; Visa Inc. (comment 168, 2011 NPRM), at 2.105

See 2011 NPRM, 76 FR at 59811.106


operating on or through a property could take advantage of the support for internal operations

exemption. Children’s advocacy groups, by contrast, expressed fear that the proposed103

definition was already “so broad that it could exempt the collection of many persistent identifiers

used to facilitate targeted marketing.”104

Several commenters supported the Commission’s premise that the collection of certain

persistent identifiers permits the physical or online contacting of a specific individual, but asked

the Commission to take a different tack to regulating such identifiers. Rather than cover all

persistent identifiers and then carve out permissible uses, these commenters suggested a simpler

approach: the Commission should apply the Rule only to those persistent identifiers used for the

purposes of contacting a specific child, including through online behavioral advertising. 105

The Commission continues to believe that persistent identifiers permit the online

contacting of a specific individual. As the Commission stated in the 2011 NPRM, it is not

persuaded by arguments that persistent identifiers only permit the contacting of a device. This106

interpretation ignores the reality that, at any given moment, a specific individual is using that


See J. Bowman, “Real-time Bidding – How It Works and How To Use It,” Warc Exclusive (Feb. 2011), available at http://www.improvedigital.com/en/wp-content/uploads/2011/09/Warc-RTB-Feb11.pdf (“With real-time bidding, advertisers can decide to put a specific ad in front of a specific individual web user on a given site, bid for that impression and – if they win the bid – serve the ad, all in the time it takes for a page to load on the target consumer’s computer.”); L. Fisher, “eMarketer’s Guide to the Digital Advertising Ecosystem: Mapping the Display Advertising Purchase Paths and Ad Serving Process” (Oct. 2012), available at http://www.emarketer.com/Corporate/reports (media buyers can deliver personalized, impression-by-impression, ads based on what is known about individual viewer attributes, behaviors, and site context).107

15 U.S.C. 6501(8).108


device. Indeed, the whole premise underlying behavioral advertising is to serve an

advertisement based on the perceived preferences of the individual user.107

Nor is the Commission swayed by arguments noting that multiple individuals could be

using the same device. Multiple people often share the same phone number, the same home

address, and the same email address, yet Congress still classified these, standing alone, as

“individually identifiable information about an individual.” For these reasons, and the reasons108

stated in the 2011 NPRM, the Commission will retain persistent identifiers within the definition

of personal information.

However, the Commission recognizes that persistent identifiers are also used for a host of

functions that have little or nothing to do with contacting a specific individual, and that these

uses are fundamental to the smooth functioning of the Internet, the quality of the site or service,

and the individual user’s experience. It was for these reasons that the Commission proposed to

expand the definition of support for internal operations in the 2012 SNPRM.


See Toy Industry Association (comment 89, 2012 SNPRM), at 14; see also ESA109

(comment 32, 2012 SNPRM), at 8; NetChoice (comment 70, 2012 SNPRM), at 7-8.

This interpretation of affiliate relationships is consistent with prior Commission articulations. See FTC Report, Protecting Consumer Privacy in an Era of Rapid Change (March 2012), at 41-42, available at http://ftc.gov/os/2012/03/120326privacyreport.pdf (“The Commission maintains the view that affiliates are third parties, and a consumer choice mechanism is necessary unless the affiliate relationship is clear to consumers”); see also kidSAFE Seal Program (comment 56, 2012 SNPRM), at 5 (asking the Commission to clarify what is meant by the phrase “across different websites or online services” in the context of persistent identifiers).110


The Commission has determined to retain the approach suggested in the 2011 NPRM and

refined in the 2012 SNPRM, with certain revisions. First, the final Rule modifies the proposed

definition of persistent identifier to cover “a persistent identifier that can be used to recognize a

user over time and across different websites or online services.” This modification takes into

account concerns several commenters raised that using a persistent identifier within a site or

service over time serves an important function in conducting site performance assessments and

supporting intra-site preferences. However, in this context, not every website or service with a109

tangential relationship will be exempt – the term “different” means either sites or services that

are unrelated to each other, or sites or services where the affiliate relationship is not clear to the

user.110

Second, the Commission has determined that the carve-out for use of a persistent

identifier to provide support for the internal operations of a website or online service is better

articulated as a separate exception to the Rule’s requirements. For this reason, it has amended

Section 312.5(c) (“Exceptions to prior parental consent”) to add a new exception providing that

where an operator collects only a persistent identifier for the sole purpose of providing support

for its internal operations, the operator will have no notice or consent obligations under the Rule.


See, e.g., Digital Advertising Alliance (comment 27, 2012 SNPRM), at 3; DMA (comment 28, 2012 SNPRM), at 11; IAB (comment 73, 2011 NPRM), at 10-11; Magazine Publishers of America (comment 61, 2012 SNPRM), at 11; Microsoft (comment 66, 2012 SNPRM), at 5; Online Publishers Association (comment 123, 2011 NPRM), at 4-5; Viacom Inc. (comment 95, 2012 SNPRM), at 14.111


This is a change in organization, rather than a substantive change, from the Commission’s earlier

proposals.

In addition, in response to the arguments made in a number of comments, the

Commission has further modified the 2012 SNPRM proposed definition of support for internal

operations to add frequency capping of advertising and legal or regulatory compliance to the

permissible uses enumerated therein. The Commission declines to add certain other language111

proposed by commenters, such as intellectual property protection, payment and delivery

functions, spam protection, optimization, statistical reporting, or de-bugging, because it believes

that these functions are sufficiently covered by the definitional language permitting activities

that “maintain or analyze” the functions of the website or service, or protect the “security or

integrity” of the site or service. Under this revised definition, most of the activities that

commenters cite as important to the smooth and optimal operation of websites and

online services will be exempt from COPPA coverage.
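To make the interplay between the revised persistent identifier coverage and the support for internal operations exception concrete, the sketch below models the compliance logic described above in code. It is illustrative only, not legal guidance or text drawn from the Rule; the class name, the enumerated use labels, and the helper function are hypothetical simplifications of the enumerated activities discussed in this Part.

```python
# Illustrative sketch only; names and labels are hypothetical simplifications.
from dataclasses import dataclass

# Paraphrase of the uses enumerated in the amended definition of
# "support for internal operations" as discussed above.
INTERNAL_OPERATIONS_USES = {
    "maintain_or_analyze_service",
    "network_communications",
    "authenticate_or_personalize",
    "contextual_advertising",
    "security_or_integrity",
    "legal_or_regulatory_compliance",
    "frequency_capping",
    "fulfill_child_request",
}

@dataclass
class CollectionEvent:
    data_types: set[str]   # e.g., {"persistent_identifier"}
    uses: set[str]         # purposes for which the data is used

def requires_notice_and_consent(event: CollectionEvent) -> bool:
    """Model of the new 312.5(c) exception as described above: collecting ONLY a
    persistent identifier, used SOLELY for support for internal operations,
    triggers no notice or consent obligation; anything more does."""
    only_persistent_id = event.data_types == {"persistent_identifier"}
    only_internal_uses = bool(event.uses) and event.uses <= INTERNAL_OPERATIONS_USES
    return not (only_persistent_id and only_internal_uses)

# A persistent identifier used for contextual ads and frequency capping falls
# within the exception; using it for behavioral targeting does not.
print(requires_notice_and_consent(
    CollectionEvent({"persistent_identifier"}, {"contextual_advertising", "frequency_capping"})
))  # False
print(requires_notice_and_consent(
    CollectionEvent({"persistent_identifier"}, {"behavioral_advertising"})
))  # True
```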

The Commission also is cognizant that future technical innovation may result in

additional activities that websites or online services find necessary to support their internal

operations. Therefore, the Commission has created a voluntary process – new Section 312.12(b)

– whereby parties may request Commission approval of additional activities to be included

within the definition of support for internal operations. Any such request will be placed on the

public record for notice and comment, and the Commission will act on it within 120 days.


See EPIC (comment 31, 2012 SNPRM), at 9. The Commission disagrees with the contention by certain commenters that the word “necessary” is confusing and unduly restrictive. See Online Publishers Association (comment 72, 2012 SNPRM), at 9. In this context, the term means that an operator may collect a covered persistent identifier if it uses it for the purposes listed in the definition of support for internal operations. The operator need not demonstrate that collection of the identifier was the only means to perform the activity.112

144 Cong. Rec. S8482 (Statement of Sen. Bryan (1998)).113

See, e.g., Association for Competitive Technology (comment 7, 2012 SNPRM), at 5; IAB (comment 73, 2011 NPRM), at 11.114


The final amended language makes clear that operators may only engage in activities

“necessary” to support the covered functions. The Commission agrees with commenter EPIC

that “[t]he presence of the word ‘necessary’ [in the statute] . . . indicates that the use of persistent

identifiers is to be limited to the above activities, and that these activities are to be narrowly

construed.” Moreover, operators may not use persistent identifiers that fall within the Rule’s112

definition of personal information for any purposes other than those listed within the definition

of support for internal operations. Accordingly, the Rule will require operators to obtain

parental consent for the collection of persistent identifiers where used to track children over time

and across sites or services. Without parental consent, operators may not gather persistent

identifiers for the purpose of behaviorally targeting advertising to a specific child. They also

may not use persistent identifiers to amass a profile on an individual child user based on the

collection of such identifiers over time and across different websites in order to make decisions

or draw insights about that child, whether that information is used at the time of collection or

later.113

Several commenters sought clarification of whether a party’s status as a first party or a

third party would affect its ability to rely upon the support for internal operations definition. 114


See 2011 NPRM, 76 FR at 59813.115

Id.116


To the extent that a child-directed content site or service engages service providers to perform

functions encompassed by the definition of support for internal operations, those functions will

be covered as support for the content-provider’s internal operations. If a third party collecting

persistent identifiers is deemed an operator under the Rule (e.g., because it has actual knowledge

it is collecting personal information from users of a child-directed site or service, or it has actual

knowledge it is collecting personal information from a child through a general audience site or

service), that operator may rely on the Rule’s support for internal operations definition when it

uses persistent identifier information for functions that fall within it.

c. Photographs, Videos, and Audio Files

The Rule’s existing definition of personal information includes photographs only when

they are combined with “other information such that the combination permits physical or online

contacting.” Given the prevalence and popularity of posting photos, videos, and audio files

online, in the 2011 NPRM, the Commission reevaluated the privacy and safety implications of

such practices as they pertain to children. The Commission determined that the inherently

personal nature of photographs, and the fact that they may contain information such as

embedded geolocation data, or can be paired with facial recognition technology, makes them

identifiers that “permit the physical or online contacting of a specific individual.” The115

Commission found the same risks attendant with the online uploading of video and audio files. 116

Accordingly, the Commission proposed creating a new category within the definition of


Institute for Public Representation (comment 71, 2011 NPRM), at 33; 117

Privacy Rights Clearinghouse (comment 131, 2011 NPRM), at 2.

See DMA (comment 37, 2011 NPRM), at 17; Promotion Marketing Association (comment 133, 2011 NPRM), at 12; NCTA (comment 113, 2011 NPRM), at 16. Certain commenters interpreted the Commission’s proposal as inapplicable to user-generated content, but applicable to an operator’s own use of children’s images or voices. See CTIA (comment 32, 2011 NPRM), at 12; National Retail Federation (comment 114, 2011 NPRM), at 4; F. Page (comment 124, 2011 NPRM).118

See American Association of Advertising Agencies (comment 2, 2011 NPRM), at 4; Internet Commerce Coalition (comment 74, 2011 NPRM), at 5; Promotion Marketing Association (comment 133, 2011 NPRM), at 12; see also DMA (comment 37, 2011 NPRM), at 17.119


personal information covering a photograph, video, or audio file where such file contains a

child’s image or voice.

Some commenters supported this proposal. For example, the Institute for Public

Representation, on behalf of a group of children’s privacy advocates, stated that “[b]ecause

photographs, videos, and audio files can convey large amounts of information about children that

can make them more vulnerable to behavioral advertising, and possibly put their personal safety

at risk as well, these types of information should be included in the definition of personal

information.”117

Several commenters criticized the Commission’s proposal, claiming that the effect would be to

limit children’s participation in online activities involving “user-generated content.” Several118

commenters issued blanket statements that photos, videos, and audio files, in and of themselves,

do not permit operators to locate or contact a child. Other commenters stated that the119

Commission’s proposal is premature, arguing that facial recognition technologies are only in


See Intel Corp. (comment 72, 2011 NPRM), at 6-7; Motion Picture Association of120

America (“MPAA”) (comment 109, 2011 NPRM), at 13.

See Privo (comment 76, 2012 SNPRM), at 7; DMA (comment 37, 2011 NPRM), at 17-18; Promotion Marketing Association (comment 133, 2011 NPRM), at 12; WiredSafety (comment 177, 2011 NPRM), at 10.121

ESA (comment 47, 2011 NPRM), at 14 n.21; kidSAFE Seal Program (comment 81, 2011 NPRM), at 11.122

See WiredSafety (comment 177, 2011 NPRM), at 10 (“the risk of using a preteen’s clear image in still photos or in video formats is obvious”); see also Intel (comment 72, 2011 NPRM), at 7 (“we propose limiting the Commission’s new definition to ‘a photograph, video or audio file where such file contains a child’s image or voice which may reasonably allow identification of the child’”). The Commission believes that operators who choose to blur photographic images of children prior to posting such images would not be in violation of the Rule.123


their nascent stages. Finally, several commenters argued that the Commission should narrow120

the scope of its proposal, exempting from coverage photos, videos, or audio files that have been

prescreened to remove any metadata or other individually identifiable information. Others121

asked the Commission to carve out from coverage photos or videos where used to support

internal operations of a site or service. Commenter WiredSafety urged the Commission to 122

adopt a standard that would permit operators to blur images of children before uploading them,

thereby reducing the risks of exposure.123

The Commission does not dispute that uploading photos, videos, and audio files can be

entertaining for children. Yet, it is precisely the very personal nature of children’s photographic

images, videos, and voice recordings that leads the Commission to determine that such files meet

the standard for “personal information” set forth by Congress in the COPPA statute. That is, in

and of themselves, such files “permit the physical or online contacting of a specific


15 U.S.C. 6501(8)(F) (italics added).124

Privacy Rights Clearinghouse (comment 131, 2011 NPRM), at 2; see also TRUSTe (comment 164, 2011 NPRM), at 7 (“biometrics such as those provided in a photo, video or audio recording are personal information and greater protections need to be provided”).125

The Commission notes that this amendment would not apply to uploading photos or videos on general audience sites such as Facebook or YouTube, absent actual knowledge that the person uploading such files is a child.126

76 FR at 59813.127

Id. Adding new paragraph (j) to the definition of personal information in 16 CFR 312.2.128


individual.” As the Privacy Rights Clearinghouse stated, “[a]s facial recognition advances,124

photos and videos have the potential to be analyzed and used to target and potentially identify

individuals.” Given these risks, the Commission continues to believe it is entirely appropriate125

to require operators who offer young children the opportunity to upload photos, videos, or audio

files containing children’s images or voices to obtain parental consent beforehand. Therefore,126

the Commission adopts the modification of the definition of personal information regarding

photos, videos, and audio files as proposed in the 2011 NPRM, without qualification.

d. Geolocation Information

In the 2011 NPRM, the Commission stated that, in its view, existing paragraph (b) of the

definition of personal information already covered any geolocation information that provides

precise enough information to identify the name of a street and city or town. However,127

because geolocation information can be presented in a variety of formats (e.g., coordinates or a

map), and in some instances can be more precise than street name and name of city or town, the

Commission proposed making geolocation information a stand-alone category within the

definition of personal information.128


See AT&T (comment 8, 2011 NPRM), at 5; see also American Association of Advertising Agencies (comment 2, 2011 NPRM), at 4; CTIA (comment 32, 2011 NPRM), at 9; DMA (comment 37, 2011 NPRM), at 17; Promotion Marketing Association (comment 133, 2011 NPRM), at 13; Software & Information Industry Association (“SIIA”) (comment 150, 2011 NPRM), at 8; Verizon (comment 167, 2011 NPRM), at 6.129

See Internet Commerce Coalition (comment 74, 2011 NPRM), at 5; see also130

AT&T (comment 8, 2011 NPRM), at 5-6.

See, e.g., CTIA (comment 32, 2011 NPRM), at 9; Future of Privacy Forum (comment 55, 2011 NPRM), at 5; Verizon (comment 167, 2011 NPRM), at 6 (“Consistent with Congressional intent, geolocation information should be treated as personal information only when the data is tied to a specific individual.”).131

15 U.S.C. 6501(8)(B).132


Similar to the comments raised in response to the 2010 FRN, a number of commenters

opposed this change. These commenters argued that anonymous, technical geolocation

information, without the addition of any other identifier, was insufficient to contact an individual

child. The Internet Commerce Coalition stated that in identifying geolocation information129

“sufficient to identify a street name and name of city or town” as personal information, the

Commission has missed the key to what makes an address “personal,” namely the street

number. Accordingly, such commenters asked the Commission to clarify that geolocation130

information will only be deemed personal information if, when combined with some other

information or identifier, it would permit contacting an individual.131

These commenters overlook that the COPPA statute does not require the submission of a

street number to make address information “personal.” Nor is it limited to home address,

primary residence, or even a static address. Rather, Congress chose to use the words “or other

physical address, including street name and name of city or town.” This word choice not only132

permits the inclusion of precise mobile (i.e., moving) location information, it may very well


For this reason, the Commission finds those comments focusing on the potential to capture a large geographic area to be inapposite. See IAB (comment 73, 2011 NPRM), at 6 (“without an address or other additional data to identify a household or individual, a street name and city could encompass a large geographic area and as many as 1,000 households. For example, Sepulveda Boulevard, in the Los Angeles area, is over 40 miles long”).133

See Consumers Union (comment 29, 2011 NPRM), at 3; see also EPIC (comment 41, 2011 NPRM), at 8-9 (“As with IP addresses and user names, geolocation information can be used to track a particular device, which is usually linked to a particular individual.”).134

See American Association of Advertising Agencies (comment 2, 2011 NPRM), at 4; AT&T (comment 8, 2011 NPRM), at 6; DMA (comment 37, 2011 NPRM), at 17; Promotion Marketing Association (comment 133, 2011 NPRM), at 13; Verizon (comment 167, 2011 NPRM), at 6.135


mandate it. As commenter Consumers Union stated, “[s]ince a child’s physical address is133

already considered personal information under COPPA, geolocation data, which provides precise

information about a child’s whereabouts at a specific point in time, must also necessarily be

covered.” 134

In addition, the Commission disagrees with those commenters who state that geolocation

information, standing alone, does not permit the physical or online contacting of an individual

within the meaning of COPPA. Just as with persistent identifiers, the Commission rejects the135

notion that precise geolocation information allows only contact with a specific device, not the

individual using the device. By that same flawed reasoning, a home or mobile telephone number

would also only permit contact with a device.

Several commenters asked the Commission to refine the Rule’s coverage of geolocation

so that it targets particular uses. Commenter CTIA, citing photo-sharing services as an example,

asked that geolocation information embedded in metadata (as often is the case with digital


CTIA (comment 32, 2011 NPRM), at 9.136

kidSAFE Seal Program (comment 81, 2011 NPRM), at 11.137

TRUSTe (comment 164, 2011 NPRM), at 3.138

See 76 FR at 59813 n.87.139


photographs) be excluded from the Rule’s coverage. Arguing that there should be a legal136

difference between using geolocation information for convenience or to protect a child’s safety

and using it to market to a child, commenter kidSAFE Seal Program suggested that geolocation data only

be considered “personal information” when it is being used for marketing purposes. Finally,137

commenter TRUSTe asked that the Commission amend the definition to cover “precise

geolocation data that can be used to identify a child’s actual physical location at a given point in

time.” 138

The Commission sees no basis for making the suggested revisions. With respect to

excluding geolocation information in metadata, the Commission notes that in the 2011 NPRM, it

specifically cited such geolocation metadata as one of the bases for including photographs of

children within the definition of personal information. With respect to the comment from 139

kidSAFE Seal Program, the statute does not distinguish between information collected for

marketing as opposed to convenience; therefore, the Commission finds no basis for making such

a distinction for geolocation information. Finally, the Commission sees little to no practical

distinction between “geolocation data that can be used to identify a child’s actual physical

location at a given point in time” and geolocation information “sufficient to identify street name

and name of a city or town,” and it prefers to adhere to the statutory language. Accordingly, the

Commission modifies the definition of personal information as proposed in the 2011 NPRM,


See 2011 NPRM, 76 FR at 59804, 59809. The Commission originally proposed to define release of personal information as “the sharing, selling, renting, or any other means of providing personal information to any third party.” The Commission’s revised definition removes the phrase “or any other means of providing personal information” to avoid confusion and overlap with the second prong of the definition of disclosure governing an operator making personal information collected from a child publicly available, e.g., through a social network, a chat room, or a message board. See 16 CFR 312.2 (definition of disclosure).140

Id.141


and covered operators will be required to notify parents and obtain their consent prior to

collecting geolocation information from children.

6. Definition of Release of Personal Information

In the 2011 NPRM, the Commission proposed to define the term release of personal

information separately from the definition of disclosure, since the term applied to provisions of

the Rule that did not solely relate to disclosures. The Commission also proposed technical140

changes to clarify that the term “release of personal information” addresses business-to-business

uses of personal information, not public disclosures of personal information. The141

Commission received little comment on this issue and therefore adopts the proposed changes.

7. Definition of Website or Online Service Directed to Children

In the 2012 SNPRM, the Commission proposed revising the definition of website or

online service directed to children to allow a subset of sites falling within that category an option

not to treat all users as children. The proposed revision was sparked by a comment from The

Walt Disney Company that urged the Commission to recognize that sites and services directed to

children fall along a continuum and that those sites targeted to both children and others should

be permitted to differentiate among users. Noting that Disney’s suggestion in large measure


See ACLU (comment 3, 2012 SNPRM), at 3; Online Publishers Association142

(comment 72, 2012 SNPRM), at 4.

See DMA (comment 28, 2012 SNPRM), at 13-14; Institute for Public Representation (comment 52, 2012 SNPRM), at 25-27; Privo (comment 76, 2012 SNPRM), at 3; TechFreedom (comment 88, 2012 SNPRM), at 3; Toy Industry Association (comment 89, 2012 SNPRM), at 12; WiredTrust and WiredSafety (comment 98, 2012 SNPRM), at 3-4.143

See Facebook (comment 33, 2012 SNPRM), at 10; Viacom Inc. (comment 95, 2012 SNPRM), at 5.144


reflected the prosecutorial discretion already applied by the Commission in enforcing COPPA,

the Commission proposed revisions to implement this concept. The Commission received

numerous comments on this proposal. Although many commenters expressed support for the

concept, the proposed implementing language was criticized.

Paragraphs (a) and (b) of the SNPRM’s proposed revisions sought to define the subset of

sites directed to children that would still be required to treat all users as children: those that

knowingly target children under 13 as their primary audience, and those that, based on the

overall content of the site, are likely to attract children under 13 as their primary audience.

Paragraph (c) sought to describe those child-directed sites that would be permitted to age-screen

to differentiate among users – namely those sites that, based on overall content, are likely to

draw a disproportionate number of child users.

Although most commenters concurred that operators intentionally targeting children as

their primary audience should be covered as websites directed to children, some worried about142

the precise contours of the term “primary audience” and sought guidance as to percentage

thresholds.143 Some commenters also opposed any interpretation of COPPA that required child-

directed websites to presume all users are children.144


See, e.g., Online Publishers Association (comment 72, 2012 SNPRM), at 4 (“The plain meaning of ‘targeted’ in this context requires a deliberate selection of an audience of children.”).145

See 15 U.S.C. 6501(10)(A) (“The term ‘website or online service directed to children’ means – (i) a commercial website or online service that is targeted to children; or (ii) that portion of a commercial website or online service that is targeted to children.”).146

See ACLU (comment 3, 2012 SNPRM), at 4 (“paragraphs (a) and (b) of the147

proposed definition are largely noncontroversial”).

See, e.g., U.S. Conference of Catholic Bishops (comment 92, 2012 SNPRM), at 4.148


Many commenters argued that the Commission exceeded its authority by defining

website or online service directed to children based on criteria other than the sites’ intent to

target children. These commenters argued that Congress, by defining websites directed to

children as those “targeted” to children, was imposing a subjective intent requirement. The145

Commission disagrees. The Commission believes that if Congress had wanted to require

subjective intent on the part of an operator before its site or service could be deemed directed to

children, it would have done so explicitly. Intent cannot be the only scenario envisioned by146

Congress whereby a site would be deemed directed to children. Certainly, a website or online147

service that has the attributes, look, and feel of a property targeted to children under 13 will be

deemed to be a site or service directed to children, even if the operator were to claim that was

not its intent.

Paragraph (c) sought to describe those child-directed sites that would be permitted to age-

screen to differentiate among users, namely those sites that, based on overall content, are likely

to draw a disproportionate number of child users. While a handful of comments supported this

definition, for the most part, it was criticized by a spectrum of interests. On one side were148

advocates such as Common Sense Media, EPIC, and the Institute for Public Representation. These


Institute for Public Representation (comment 52, 2012 SNPRM), at (i).149

Common Sense Media (comment 20, 2012 SNPRM), at 9; EPIC (comment 31, 2012 SNPRM), at 4-5; Institute for Public Representation, supra note 149, at 27-28.150

See, e.g., P. Aftab (comment 1, 2012 SNPRM), at 6-7; NCTA (comment 69, 2012 SNPRM), at 14; Marketing Research Association (comment 62, 2012 SNPRM), at 2; NetChoice (comment 70, 2012 SNPRM), at 4-5; SIIA (comment 84, 2012 SNPRM), at 10.151


advocates argued that recognizing a category of sites and services directed to mixed audiences,

targeted both to young children and others, would undercut the other revisions the Commission

has proposed, thereby lessening privacy protections for children. Such advocates also argued149

that the proposed category might create incentives, or loopholes, for operators that currently

provide child-directed websites or services to claim their online properties are covered by

paragraph (c) of the definition and become exempt from COPPA by age-gating. 150

On the other side were a number of commenters who feared that the proposal would

significantly expand the range of websites and online services that fall within the ambit of

COPPA’s coverage, including both teen-oriented and general-audience sites and services that

incidentally appeal to children as well as adults. Much of this fear appears to have been driven

by the specific language the Commission proposed; that is, sites or services that, based on their

overall content, were “likely to attract an audience that includes a disproportionately large

percentage of children under age 13 as compared to the percentage of such children in the

general population.” Some argued that the use of the term “disproportionate” is vague,151


See, e.g., CDT (comment 15, 2012 SNPRM), at 7-10; Family Online Safety Institute (comment 34, 2012 SNPRM), at 3; Internet Commerce Coalition (comment 53, 2012 SNPRM), at 9; T. Mumford (comment 68, 2012 SNPRM); Online Publishers Association (comment 72, 2012 SNPRM), at 6; Viacom (comment 95, 2012 SNPRM), at 5.152

See, e.g., DMA (comment 28, 2012 SNPRM), at 14; Magazine Publishers of153

America (comment 61, 2012 SNPRM), at 6-7.

See CDT (comment 15, 2012 SNPRM), at 7.154

See ACLU (comment 3, 2012 SNPRM), at 5; DMA (comment 28, 2012 SNPRM), at 14-15; Magazine Publishers of America (comment 61, 2012 SNPRM), at 8; Toy Industry Association (comment 89, 2012 SNPRM), at 7, 11.155

Entertainment Software Association (comment 32, 2012 SNPRM), at 2; Online Publishers Association (comment 72, 2012 SNPRM), at 7-8; Viacom Inc. (comment 95, 2012 SNPRM), at 6.156


potentially unconstitutional, or unduly expansive, or that it otherwise constitutes an unlawful shift152 153 from the statute’s actual knowledge standard for general audience sites to one of constructive knowledge.154 Many worried that the Commission’s proposal would lead to widespread age-

screening, or more intensive age-verification, across the entire body of websites and online

services located on the Internet. Other commenters suggested that the Commission implement155

this approach through a safe harbor, not by revising a definition.156

The comments reflect a misunderstanding of the purpose and effect of the change

proposed in the 2012 SNPRM. The Commission did not intend to expand the reach of the Rule

to additional sites and services, but rather to create a new compliance option for a subset of

websites and online services already considered directed to children under the Rule’s totality of

the circumstances standard.

To make clear that it will look to the totality of the circumstances to determine whether a

site or service is directed to children (whether as its primary audience or otherwise), the


2011 NPRM, 76 FR at 59814. 157

See DMA (comment 37, 2011 NPRM), at 18-19; MPAA (comment 109, 2011 NPRM), at 19.158

See Verizon (comment 167, 2011 NPRM), at 10.159

See SIIA (comment 150, 2011 NPRM), at 9.160


Commission has revised and reordered the definition of website or online service directed to

children as follows. Paragraph (a) of the definition contains the original Rule language setting

forth several factors the Commission will consider in determining whether a site or service is

directed to children. In addition, paragraph (a) amends this list of criteria to add musical content,

the presence of child celebrities, and celebrities who appeal to children, as the Commission

originally proposed in the 2011 NPRM. Although some commenters expressed concern that157

these additional factors might capture general audience sites, produce inconsistent results, or158 159

be overly broad (since musicians and celebrities often appeal both to adults and children), the160

Commission believes that these concerns are unfounded. The Commission reiterates that these

factors are some among many that the Commission will consider in assessing whether a site or

service is directed to children, and that no single factor will predominate over another in this

assessment.

Paragraph (b) of the definition sets forth the actual knowledge standard for plug-ins or ad

networks, as discussed in Part II.A.4.b herein, whereby a plug-in, ad network, or other property

is covered as a website or online service directed to children under the Rule when it has actual

knowledge that it is collecting personal information directly from users of a child-directed

website or online service.


See 2012 SNPRM, 77 FR at 46646.161

The Commission intends the word “primary” to have its common meaning, i.e., something that stands first in rank, importance, or value. This must be determined by the totality of the circumstances and not through a precise audience threshold cut-off. See definition of “primary.” Merriam-Webster.com (2012), available at http://www.merriam-webster.com (last accessed Nov. 5, 2012).162

P. Aftab (comment 1, 2012 SNPRM), at 5; Facebook (comment 33, 2012 SNPRM), at 12-13; Future of Privacy Forum (comment 37, 2012 SNPRM), at 8.163


The Commission amends paragraph (c) of the definition to clarify when a child-directed

site would be permitted to age-screen to differentiate among users. This paragraph codifies the

Commission’s intention to first apply its “totality of the circumstances” standard to determine

whether any website or online service falling under paragraph (c) is directed to children. The

Commission then will assess whether children under age 13 are the primary audience for the site

or service. Paragraph (c) codifies that a site or service that is directed to children, but that does

not target children as its primary audience, may use an age screen in order to apply all of

COPPA’s protections only to visitors who self-identify as under age 13. As the Commission

stated in the 2012 SNPRM, at that point, the operator will be deemed to have actual knowledge

that such users are under 13 and must obtain appropriate parental consent before collecting any

personal information from them and must also comply with all other aspects of the Rule. 161
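As an illustration of the mechanics described above, the sketch below shows how a mixed-audience, child-directed service might branch on a neutrally collected date of birth, treating self-identified under-13 users as known children. It is a simplified, hypothetical example rather than a compliance recipe; the function names and return values are invented, and a real implementation would also need a neutral age screen and a full verifiable parental consent workflow.

```python
# Illustrative sketch only; function names and return strings are hypothetical.
from datetime import date

def is_under_13(birth_year: int, birth_month: int, birth_day: int) -> bool:
    """Compute age from a neutrally requested date of birth."""
    today = date.today()
    age = today.year - birth_year - (
        (today.month, today.day) < (birth_month, birth_day)
    )
    return age < 13

def handle_new_user(birth_year: int, birth_month: int, birth_day: int) -> str:
    # For a mixed-audience, child-directed service: users who self-identify as
    # under 13 are treated as known children, so personal information may not
    # be collected until verifiable parental consent is obtained; other users
    # proceed under the service's general privacy practices.
    if is_under_13(birth_year, birth_month, birth_day):
        return "hold_collection_pending_parental_consent"
    return "proceed_as_general_audience_user"

print(handle_new_user(2015, 6, 1))  # a visitor who self-identifies as under 13
print(handle_new_user(1990, 6, 1))  # an adult visitor
```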

The Commission retains its longstanding position that child-directed sites or services

whose primary target audience is children must continue to presume all users are children and to

provide COPPA protections accordingly. Some commenters contend that the Commission162

should permit this presumption to be rebutted, even on sites primarily targeting children, by the

use of a simple age screen that distinguishes child users from other users. Although the163

Commission is now permitting this on sites or services that target children only as a secondary


See DMA (comment 28, 2012 SNPRM), at 8 (an operator’s choice of content164

serves as a proxy for knowledge that its users are primarily children under 13).

See 2011 NPRM, 76 FR at 59816.165


audience or to a lesser degree, the Commission believes adopting this standard for all child-

directed sites would virtually nullify the statutory distinction between “actual knowledge” sites

and those directed to children, creating a de facto actual knowledge standard for all operators.164

Finally, paragraph (d) of the definition restates the statutory proviso that a site or service

will not be deemed to be child-directed where it simply links to a child-directed property.

B. Section 312.4: Notice

1. Direct Notice to a Parent

In the 2011 NPRM, the Commission proposed refining the Rule requirements for the

direct notice to ensure a more effective “just-in-time” message to parents about an operator’s

information practices. As such, the Commission proposed to reorganize and standardize the165

direct notice requirement to set forth the precise items of information that must be disclosed in

each type of direct notice the Rule requires. The proposed revised language of § 312.4 specified,

in each instance where the Rule requires direct notice, the precise information that operators

must provide to parents regarding the items of personal information the operator already has

obtained from the child (generally, the parent’s online contact information either alone or

together with the child’s online contact information); the purpose of the notification; action that

the parent must or may take; and what use, if any, the operator will make of the personal

information collected. The proposed revisions also were intended to make clear that each form


Id.166

See EPIC (comment 41, 2011 NPRM), at 9; Institute for Public Representation (comment 71, 2011 NPRM), at 40-41; kidSAFE Seal Program (comment 81, 2011 NPRM), at 12; NCTA (comment 113, 2011 NPRM), at 22.167

AssertID (comment 6, 2012 SNPRM), at 2.168

IAB (comment 73, 2011 NPRM), at 13.169


of direct notice must provide a hyperlink to the operator’s online notice of information

practices. 166

In general, commenters supported the Commission’s proposed changes as providing

greater clarity and simplicity to otherwise difficult-to-understand statements. These changes167

were viewed as especially important in an era of children’s intense engagement with mobile

applications accessed through a third-party app store and where an online notice might not be as

readily accessible. Only one commenter objected to the concept of placing greater emphasis168

on the direct, rather than the online, notice, stating that the changes would unduly necessitate

lengthy direct notices and would prove overwhelming for parents and challenging to implement

in the mobile environment.169

The Commission also proposed adding a paragraph setting out the contours of a new

direct notice in situations where an operator voluntarily chooses to collect a parent’s online

contact information from a child in order to provide parental notice about a child’s participation

in a website or online service that does not otherwise collect, use, or disclose children’s personal

information. The Commission’s proposal for a voluntary direct notice in situations where an

operator does not otherwise collect, use, or disclose personal information from a child garnered

170 N. Savitt (comment 142, 2011 NPRM), at 2.
171 H. Valetk (comment 166, 2011 NPRM), at 3.
172 TRUSTe (comment 164, 2011 NPRM), at 10.
173 Lifelock (comment 93, 2011 NPRM), at 1.
174 For example, to be considered by the various Commission-approved COPPA safe harbor programs.
175 N. Savitt (comment 142, 2011 NPRM), at 2.

very little attention. Only one commenter sought clarification of the specific language the

Commission proposed. 170

Several commenters urged the Commission to use the occasion of the Rule review to

develop a model COPPA direct notice form that operators voluntarily could adopt,171 to mandate

that such notifications be optimized for the particular devices on which they are displayed,172 or

to implement a website rating system.173 The Commission believes that these suggestions are

better suited as “best practices” rather than as additions to the text of the Rule.174

The Commission has determined to retain in the final Rule the modifications proposed in

the 2011 NPRM. However, the Commission has reorganized the paragraphs to provide a better

flow and guidance for operators, and has clarified that the voluntary direct notice provision

described above is, indeed, voluntary for operators who choose to use it.175

2. Notice on the Website or Online Service

In the 2011 NPRM, the Commission proposed several changes to the Rule’s online notice

requirement. First, the Commission proposed requiring all operators collecting, using, or

176 Id.
177 Institute for Public Representation (comment 71, 2011 NPRM), at 38-39.
178 See Facebook (comment 50, 2011 NPRM), at 9; NCTA (comment 113, 2011 NPRM), at 22; Toy Industry Association (comment 89, 2012 SNPRM), at 6.
179 IAB (comment 73, 2011 NPRM), at 12.
180 DMA (comment 37, 2011 NPRM), at 20.
181 kidSAFE Seal Program (comment 81, 2011 NPRM), at 12 (“Would this rule apply to one-time joint sponsors of a promotion who co-collect information on a website?”).

disclosing information on a website or online service to provide contact information, including,

at a minimum, the operator’s name, physical address, telephone number, and email address. 176

This proposal marked a change from the existing Rule’s proviso that such operators could

designate one operator to serve as the point of contact.

With the exception of the Institute for Public Representation,177 commenters who spoke

to the issue opposed mandating that the online notice list all operators. Some objected to the

sheer volume of potentially confusing information this would present to parents,178 and stated

that the proposal provided no additional consumer benefit to parents, given that the existing Rule

implies that the single operator designee should be prepared to “respond to all inquiries from

parents concerning the operators’ privacy policies and use of children’s information.”179 Some

also spoke to the burden on the primary operator of having to maintain a current list of all

applicable operators’ contact information,180 and expressed confusion as to which operators

needed to be listed. 181

The Commission believes that a requirement for the primary operator to provide specific,

current, contact information for every operator that collects information on or through its website

182 76 FR at 59815.
183 Id.
184 Institute for Public Representation (comment 71, 2011 NPRM), at 40.

or service has the potential to confuse parents, for whom such online notices are intended to be

accessible and useful. After considering the comments, the Commission has determined to

retain the Rule’s “single operator designee” proviso; that is, an operator will be required to list

all operators collecting or maintaining personal information from children through the website or

online service, but need only list the contact information for the one operator who will be

responsible for responding to parents’ inquiries.

In the 2011 NPRM, the Commission also proposed eliminating the Rule’s current lengthy

– yet potentially under-inclusive – recitation of an operator’s information collection, use, and

disclosure practices in favor of a simple statement of: (1) what information the operator collects

from children, including whether the website or online service enables a child to make personal

information publicly available; (2) how the operator uses such information; and (3) the

operator’s disclosure practices for such information.182 As a part of this revision, the

Commission proposed removing the required statement that the operator may not condition a

child’s participation in an activity on the child’s disclosing more personal information than is

reasonably necessary to participate in such activity.183 This proposal was opposed by the

Institute for Public Representation, which views the statement as a way to educate parents as to

whether or not the operator actually complies with data minimization principles.184 This

organization also asked the Commission to require operators to disclose information to parents

185 Id.
186 See 2011 NPRM, 76 FR at 59815 (“In the Commission’s experience, this blanket statement, often parroted verbatim in operators’ privacy policies, detracts from the key information of operators’ actual information practices, and yields little value to a parent trying to determine whether to permit a child’s participation.”).
187 Id.

on how the data they collect is secured from potential breaches.185 The Commission has

considered this input but nevertheless adopts both of these changes in the final Rule.

The Commission sees great value for parents of streamlined online notices and continues

to believe that the removal of extraneous information from such notices will further this goal. 186

Accordingly, the Commission modifies the Rule as proposed in the 2011 NPRM to remove an

operator’s recitation in its online notice that it will not condition a child’s participation on the

provision of more information than is necessary. Again, however, the substantive requirement

of § 312.7 remains in place.187 In addition, and again in the interest of streamlining the online

notices, the Commission declines to require operators to explain the measures they take to

protect children’s data. Nevertheless, the Rule’s enhanced provisions on confidentiality and data

security will help protect data collected from children online.

Finally, focusing on the part of the Commission’s proposal that would require operators

of general audience sites or services that have separate children’s areas to post links to their

notices of children’s information practices on the home or landing page or screen of the

children’s area, the Toy Industry Association asked the Commission to forgo mandating links in

any location where mobile apps can be purchased or downloaded because, in their view,

changing commercial relationships may make it difficult to frequently update privacy policies in

188 Toy Industry Association (comment 163, 2011 NPRM), at 4.
189 FTC Staff Report, “Mobile Apps for Kids: Disclosures Still Not Making the Grade” (Dec. 2012), at 7 (“Mobile Apps for Kids II Report”), available at http://www.ftc.gov/os/2012/12/121210mobilekidsappreport.pdf (noting that “information provided prior to download is most useful in parents’ decision-making since, once an app is downloaded, the parent already may have paid for the app and the app already may be collecting and disclosing the child’s information to third parties”).
190 Paragraph (a) of § 312.5 reads:
(1) An operator is required to obtain verifiable parental consent before any collection, use, and/or disclosure of personal information from children, including consent to any material change in the collection, use, and/or disclosure practices to which the parent has previously consented.
(2) An operator must give the parent the option to consent to the collection and use of the child’s personal information without consenting to disclosure of his or her personal information to third parties.
191 15 U.S.C. 6501(9).

apps marketplaces.188 The final amended Rule does not mandate the posting of such information

at the point of purchase but rather on the app’s home or landing screen. However, the

Commission does see a substantial benefit in providing greater transparency about the data

practices and interactive features of child-directed apps at the point of purchase and encourages

it as a best practice. 189

C. Section 312.5: Parental Consent

A central element of COPPA is its requirement that operators seeking to collect, use, or

disclose personal information from children first obtain verifiable parental consent.190

“Verifiable parental consent” is defined in the statute as “any reasonable effort (taking into

consideration available technology), including a request for authorization for future collection,

use, and disclosure, described in the notice.”191 Accordingly, the Rule requires that operators:

192 See 16 CFR 312.5(b).
193 Paragraph (b)(2) also sets out the sliding scale “email plus” method for obtaining parental consent in the instance where an operator collects a child’s personal information only for internal use. The Commission’s determination to retain the email plus method is discussed in Part II.C.7, infra.
194 See Federal Trade Commission’s Roundtable: Protecting Kids’ Privacy Online at 195, 208-71 (June 2, 2010), available at http://www.ftc.gov/bcp/workshops/coppa/COPPARuleReview_Transcript.pdf.
195 See DMA (comment 17, 2010 FRN), at 10, 12; Microsoft (comment 39, 2010 FRN), at 7; Toy Industry Association, Inc. (comment 63, 2010 FRN), at 3; WiredSafety.org (comment 68, 2010 FRN), at 18.

must make reasonable efforts to obtain verifiable parental consent, taking into consideration available technology. Any method to obtain verifiable parental consent must be reasonably calculated in light of available technology to ensure that the person providing consent is the child’s parent. § 312.5(b)(1).

The Rule sets forth a non-exhaustive list of methods that meet the standard of verifiable

parental consent.192 Specifically, paragraph (b)(2) states:

Methods to obtain verifiable parental consent that satisfy the requirements of this paragraph include: providing a consent form to be signed by the parent and returned to the operator by postal mail or facsimile; requiring a parent to use a credit card in connection with a transaction; having a parent call a toll-free telephone number staffed by trained personnel; using a digital certificate that uses public key technology; and using e-mail accompanied by a PIN or password obtained through one of the verification methods listed in this paragraph.193

Participants at the Commission’s June 2, 2010 COPPA roundtable194 and commenters to

the 2010 FRN generally agreed that, while no one method provides complete certainty that the

operator has reached and obtained consent from a parent, the methods listed in the Rule continue

to have utility for operators and should be retained. 195

196 See, e.g., BOKU (comment 5, 2010 FRN); DMA (comment 17, 2010 FRN), at 11-12; EchoSign, Inc. (comment 18, 2010 FRN); ESA (comment 20, 2010 FRN), at 7-9; Facebook (comment 22, 2010 FRN), at 2; J. Hiller (comment 27, 2010 FRN), at 447-50; M. Hoal (comment 30, 2010 FRN); Microsoft (comment 39, 2010 FRN), at 4; MPAA (comment 42, 2010 FRN), at 12; RelyID (comment 53, 2010 FRN), at 3; TRUSTe (comment 64, 2010 FRN), at 3; H. Valetk (comment 66, 2010 FRN), at 6; WiredSafety.org (comment 68, 2010 FRN), at 7; S. Wittlief (comment 69, 2010 FRN).
197 See BOKU (comment 5, 2010 FRN); ESA (comment 20, 2010 FRN), at 11-12; TRUSTe (comment 64, 2010 FRN), at 3; H. Valetk (comment 66, 2010 FRN), at 6-7.
198 See WiredSafety.org (comment 68, 2010 FRN), at 24 (noting that operators are considering employing online financial accounts, such as iTunes, for parental consent).
199 See ESA (comment 20, 2010 FRN), at 9-10; Microsoft (comment 39, 2010 FRN), at 7.
200 See ESA (comment 20, 2010 FRN), at 12; Janine Hiller (comment 27, 2010 FRN), at 447.
201 See DMA (comment 17, 2010 FRN), at 12; EchoSign (comment 18, 2010 FRN); ESA (comment 20, 2010 FRN), at 10; Toy Industry Association (comment 63, 2010 FRN), at 11.

A number of commenters urged the Commission to expand the list of acceptable

mechanisms to incorporate newer technologies, or to otherwise modernize or simplify the Rule’s

mechanisms for parental consent.196 Suggested methods of obtaining parental consent included

sending a text message to the parent’s mobile phone number,197 offering online payment services

other than credit cards,198 offering parental controls in gaming consoles,199 offering a centralized

parental consent mechanism or parental opt-in list,200 and permitting electronic signatures.201

In the 2011 NPRM, the Commission announced its determination that the record was

sufficient to justify certain proposed mechanisms, but insufficient to adopt others. The 2011

NPRM proposed several significant changes to the mechanisms of verifiable parental consent set

forth in paragraph (b) of § 312.5, including: adding several newly recognized mechanisms for

202 See Yahoo! (comment 80, 2011 NPRM), at 4; DMA (comment 37, 2011 NPRM), at 23; kidSAFE Seal Program (comment 81, 2011 NPRM), at 16; NCTA (comment 113, 2011 NPRM), at 9; Facebook (comment 50, 2011 NPRM), at 8-9.
203 See K. Dennis (comment 34, 2011 NPRM), at 2; A. Thierer (comment 162, 2011 NPRM), at 9; R. Newton (comment 118, 2011 NPRM).

parental consent; eliminating the sliding scale approach to parental consent; and adding two new

processes for evaluation and pre-clearance of parental consent mechanisms.

1. Electronic Scans and Video Verification

In the 2011 NPRM, the Commission proposed including electronically scanned versions

of signed parental consent forms and the use of video verification methods among the Rule’s

non-exhaustive list of acceptable consent mechanisms. The proposal received support from

several commenters, including Yahoo!, the DMA, kidSAFE Seal Program, the NCTA, and

Facebook.202 Other commenters expressed reservations about whether these new methods would

offer practical, economical, or scalable solutions for operators. 203

As stated in the 2011 NPRM, the Commission finds that electronic scans and video

conferencing are functionally equivalent to the written and oral methods of parental consent

originally recognized by the Commission in 1999. It does not find the concerns of some

commenters, that operators are not likely to widely adopt these methods, a sufficient reason to

exclude them from the Rule. The list of consent mechanisms is not exhaustive and operators

remain free to choose the ones most appropriate to their individual business models. Therefore,

Section 312.5(b) of the final Rule includes electronic scans of signed consent forms and video-

conferencing as acceptable methods for verifiable parental consent.

204 See application of Privo, Inc. to become a Commission-approved COPPA safe harbor program (Mar. 2004), available at http://www.ftc.gov/os/2004/04/privoapp.pdf, at 25.
205 The COPPA statute itself lists Social Security number among the items considered to be personal information. See 16 CFR 312.2. In other contexts, driver’s licenses and social security numbers, among other things, have traditionally been considered by Commission staff to be personal, or sensitive, as well. See FTC Staff Report, “Self-Regulatory Principles for Online Behavioral Advertising” (Feb. 2009), at 20 n.47, 42, 44, available at http://www.ftc.gov/os/2009/02/P085400behavadreport.pdf.
206 The use of a driver’s license to verify a parent, while not specifically enumerated in the Final Rule as an approved method of parental consent, was addressed in the Statement of Basis and Purpose in connection with a discussion of the methods to verify the identity of parents who seek access to their children’s personal information under § 312.6(a)(3) of the Rule. See 1999 Statement of Basis and Purpose, 64 FR at 59905. There, the Commission concluded that the use of a driver’s license was an acceptable method of parental verification.

2. Government-Issued Identification

The Commission also proposed in the 2011 NPRM to allow operators to collect a form of

government-issued identification – such as a driver’s license, or a segment of the parent’s Social

Security number – from the parent, and to verify the parent’s identity by checking this

identification against databases of such information, provided that the parent’s identification is

deleted from the operator’s records promptly after such verification is complete. Some operators

already use this method of obtaining parental consent, and it is one of several available

verification methods offered by the COPPA safe harbor program Privo.204 In the NPRM, the

Commission stated its recognition that information such as Social Security number, driver’s

license number, or another record of government-issued identification is sensitive data.205 In

permitting operators to use government-issued identification as an approved method of parental

verification, the Commission emphasized the importance of limiting the collection of such

identification information to only those segments of information needed to verify the data. 206

For example, the Commission noted that the last four digits of a person’s Social Security number

207 See, e.g., Privo, Inc., “Request for Safe Harbor Approval by the Federal Trade Commission for Privo, Inc.’s Privacy Assurance Program under Section 312.10 of the Children’s Online Privacy Protection Rule,” 25 (Mar. 3, 2004), available at http://www.ftc.gov/os/2004/04/privoapp.pdf.
208 For instance, Facebook commented that this mechanism achieves the delicate balance of making it easy for the parent to provide consent, while making it difficult for the child to pose as the parent; when combined with responsible data disposal practices, this method also protects the parent’s information against unauthorized use or disclosure. See Facebook (comment 50, 2011 NPRM), at 9; see also kidSAFE Seal Program (comment 81, 2011 NPRM), at 16.
209 Intel and the Marketing Research Association cautioned the Commission to avoid sending mixed messages about using such sensitive information while at the same time advising operators to adhere to principles of data minimization. Intel (comment 72, 2011 NPRM), at 7; Marketing Research Association (comment 97, 2011 NPRM), at 3.
210 See Institute for Public Representation (comment 71, 2011 NPRM), at 42; see also TechFreedom (comment 159, 2011 NPRM), at 8 (requiring users to go through an age verification process would lead to a loss of personal privacy); New York Intellectual Property Law Association (comment 117, 2011 NPRM), at 3 (parents’ privacy rights should not needlessly be put at risk in order to protect their children’s privacy).

are commonly used by verification services to confirm a person’s identity.207 The Commission

also stated its belief that the requirement that operators immediately delete parents’ government-

issued identification information upon completion of the verification process provides further

protection against operators’ unnecessary retention, use, or potential compromise of such

information. Commenters in favor of adding this mechanism pointed out that using available

technology to check a driver’s license number or partial Social Security number reasonably

ensures that the person providing consent is the parent. 208

Other commenters expressed concern that allowing operators to collect sensitive

government identification information from parents raises serious privacy implications.209 Many

commenters opined that the serious risks to parents’ privacy outweighed the benefits of the

proposal.210 Some further argued that normalizing the use of this sensitive data for such a

211 See CDT (comment 17, 2011 NPRM), at 9; A. Thierer (comment 162, 2011 NPRM), at 8.
212 kidSAFE Seal Program asked the Commission to consider whether operators can retain parents’ verification information as proof that the verification occurred. See kidSAFE Seal Program (comment 81, 2011 NPRM), at 16. With regard to credit card information or government-issued identifiers, the Commission would consider whether an operator had retained a sufficiently truncated portion of the data as to make it recognizable to the parent but unusable for any other purpose.

purpose would diminish users’ alertness against identity theft schemes and other potentially

nefarious uses.211

As the federal agency at the forefront of improving privacy protections for consumers,

the Commission is sensitive to the privacy concerns raised by the comments. The Commission

is also aware that both operators and parents benefit from having a choice of several acceptable

methods for verifiable parental consent. Moreover, the Commission is not compelling any

operator to use this method. The Commission believes that, on balance, government-issued ID

provides a reliable and simple means of verifying that the person providing consent is likely to

be the parent, and that the requirement that operators delete such data immediately upon

verification substantially minimizes the privacy risk associated with that collection. Therefore,

the Commission adopts this method among the Rule’s non-exhaustive list of acceptable consent

methods.212
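A hedged sketch of how an operator might implement the government-issued identification method: collect only the segment of the identifier needed for the check, verify it against a third-party service, and delete it promptly once verification is complete. The `IdentityLookup` interface and its `matches` call are hypothetical; the Rule does not specify any particular verification vendor or API.

```python
from datetime import datetime, timezone
from typing import Protocol

class IdentityLookup(Protocol):
    """Hypothetical client for a third-party identity-verification database."""
    def matches(self, name: str, ssn_last4: str) -> bool: ...

def verify_parent(name: str, ssn_last4: str, lookup: IdentityLookup) -> dict:
    """Collect only the segment needed for the check (here, the last four
    digits of a Social Security number), verify it against a database, and
    discard it promptly once verification is complete."""
    if len(ssn_last4) != 4 or not ssn_last4.isdigit():
        raise ValueError("expected exactly four digits")

    verified = lookup.matches(name=name, ssn_last4=ssn_last4)

    # Keep only a minimal record that verification occurred; do not retain the
    # identifier itself (purge any stored or logged copies as well).
    record = {"verified": verified, "verified_at": datetime.now(timezone.utc).isoformat()}
    del ssn_last4  # drop the in-memory reference
    return record
```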

3. Credit Cards

The 2011 NPRM also proposed including the term “monetary” to modify “transaction” in

connection with use of a credit card to verify parental consent. This added language was

intended to make clear the Commission’s long-standing position that the Rule limits use of a

213 See 71 FR at 13247, 13253, 13254 (Mar. 15, 2006) (requirement that the credit card be used in connection with a transaction provides extra reliability because parents obtain a transaction record, which is notice of the purported consent, and can withdraw consent if improperly given); Fed. Trade Comm’n, Frequently Asked Questions about the Children’s Online Privacy Protection Rule, Question 33, available at http://www.ftc.gov/privacy/coppafaqs.shtm#consent.
214 But see Part II.C.4., infra. Several comments note that some alternative payment systems, such as the use of a username and password in the iTunes store, afford equal notice and protections to parents for both paid and unpaid transactions by providing the primary account holder with a separate, contemporaneous notification of each discrete transaction.

credit card as a method of parental consent to situations involving actual monetary

transactions.213 The Commission received one comment specifically addressing this proposed

language; EPIC supported the change as correctly limiting the circumstances under which credit

cards can be used as verification. The final Rule incorporates this change, stating “credit card in

connection with a monetary transaction.” 214

4. Alternative Online Payment Systems

At the outset of the Rule review, the Commission sought comment on whether to

consider modifying the Rule to include alternative online payment systems, in addition to credit

cards, as an acceptable means of verifying parental consent in connection with a monetary

transaction. The Commission stated in the 2011 NPRM that, at such time, the record was

insufficient to support a proposal to permit the use of alternative online payment systems for this

purpose. The NPRM also indicated that the Commission was mindful of the potential for

children’s easy access to, and use of, alternative forms of payments (such as gift cards, debit

cards, and online accounts). Thus, the Commission welcomed further discussion of the risks and

benefits of using electronic payment methods as a consent mechanism.

215 See, e.g., Association for Competitive Technology (comment 5, 2011 NPRM), at 7; DMA (comment 37, 2011 NPRM), at 23; eBay (comment 40, 2011 NPRM), at 3-4; kidSAFE (comment 81, 2011 NPRM), at 16; Scholastic (comment 144, 2011 NPRM), at 9-10.
216 Other commenters similarly urged that the Rule permit the use of alternate payment systems, where such systems are tied to a valid credit card account, require the user to enter a password, and provide the primary account holder with clear notification of each transaction through email confirmation. See Association for Competitive Technology (comment 5, 2011 NPRM), at 7; kidSAFE (comment 81, 2011 NPRM), at 16; see also eBay (comment 40, 2011 NPRM), at 3-4 (indicating its interest in leveraging PayPal business model to implement a youth account program directly linking children’s accounts to verified parent accounts).

Several commenters to the 2011 NPRM asked the Commission to reconsider its position

that online payment systems are not yet reliable enough to provide verifiable parental consent,

arguing that certain online payment options can meet the same stringent criteria as credit

cards.215 In particular, Scholastic stressed the importance to operators, particularly in the context

of digital apps and other downloadable content, of providing customers the flexibility to use

various convenient electronic payment methods. Scholastic urged the Commission to amend the

Rule to provide that payment methods other than credit cards, such as debit cards and electronic

payment systems, can satisfy the Rule’s consent mechanism requirements if they provide

separate notification of each discrete monetary transaction to the primary account holder. 216

The Commission, upon review of all of the relevant comments, is persuaded that it

should allow the use of other payment systems, in addition to credit cards, provided that any

such payment system can meet the same stringent criteria as a credit card. As Scholastic

articulated in its comment, the Rule should allow operators to use any electronic or online

payment system as an acceptable means of obtaining verifiable parental consent in connection

with a monetary transaction where (just as with a credit card) the payment system is used in

conjunction with a direct notice meeting the requirements of § 312.4(c) and the operator

217 See DMA (comment 17, 2010 FRN), at 12; EchoSign (comment 18, 2010 FRN); ESA (comment 20, 2010 FRN), at 10; Toy Industry Association (comment 63, 2010 FRN), at 11. For instance, the ESA proposed that the Commission incorporate a “sign and send” method, given that numerous commonly available devices allow users to input data by touching or writing on the device’s screen.
218 See Electronic Signatures in Global and National Commerce Act, 15 U.S.C. 7006(5).

provides notification of each discrete monetary transaction to the primary account holder.

Accordingly, § 312.5(b)(2) of the final Rule includes the following language “requiring a parent,

in connection with a monetary transaction, to use a credit card, debit card, or other online

payment system that provides notification of each discrete transaction to the primary account

holder.”
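The notification requirement in this language can be pictured as follows. This Python sketch is illustrative only; `payment_gateway.charge` and `send_email` are hypothetical stand-ins for an operator’s payment processor and mail system.

```python
from dataclasses import dataclass

@dataclass
class Transaction:
    parent_email: str
    amount_cents: int
    description: str

def send_email(to: str, subject: str, body: str) -> None:
    # Stand-in for the operator's mail system (hypothetical).
    print(f"To: {to}\nSubject: {subject}\n{body}\n")

def charge_and_notify(txn: Transaction, payment_gateway) -> None:
    """Charge the parent's chosen payment method and send the primary account
    holder a notification of this discrete monetary transaction.
    `payment_gateway.charge` is a hypothetical processor call."""
    payment_gateway.charge(txn.parent_email, txn.amount_cents, txn.description)
    send_email(
        to=txn.parent_email,
        subject="Notice of transaction",
        body=(
            f"A charge of ${txn.amount_cents / 100:.2f} was made for "
            f"{txn.description}. If you did not authorize it, please contact us."
        ),
    )
```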

5. Electronic or Digital Signatures

In response to the 2010 FRN, several commenters recommended that the Commission

accept electronic or digital signatures as a form of verifiable consent.217 In the 2011 NPRM, the

Commission concluded that the term “electronic signature” has many meanings, ranging from

“an electronic sound, symbol, or process, attached to or logically associated with a contract or

other record and executed or adopted by a person with the intent to sign the record,”218 to an

electronic image of the stylized script associated with a person. The Commission determined

that electronic signatures, without more indicia of reliability, were problematic in the context of

219 See 2011 NPRM at 59818. (The Commission indicated several concerns about allowing electronic signatures, including that, given the proliferation of mobile devices among children and the ease with which children could sign and return an on-screen consent, such mechanisms may not “ensure that the person providing consent is the child’s parent.” The Commission also noted that, although the law recognizes electronic signatures for the assertion that an individual signed a document, they do not necessarily confirm the underlying identity of the individual signing the document).
220 See, e.g., DMA (comment 37, 2011 NPRM), at 23 (Congress passed ESIGN Act over a decade ago and consumers prefer completing transactions online with digital signatures over using cumbersome offline processes); ESA (comment 47, 2011 NPRM), at 22-23 (electronic sign-and-send method meets the statutory standard of “reasonably calculated, in light of available technology, to ensure that the person providing consent is the child’s parent,” while accommodating parents’ use of tablet, mobile device, and small-screen technologies lacking computer peripherals such as printers or scanners); TechFreedom (comment 159, 2011 NPRM), at 8 (urging Commission to promote development of solutions such as electronic signatures now, rather than wait for next Rule revision).
221 While the Commission recognizes that some children also may circumvent the Rule’s parental notice and consent mechanisms by signing and sending parental consent forms through mail, fax, or electronic scan, it believes these methods clearly are not as simple for the child as using a computer or handheld device to instantly pen and send a signature.

COPPA’s verifiable parental consent requirement.219 The NPRM welcomed further comment on

how to enhance the reliability of these convenient methods.

In commenting on the 2011 NPRM, several commenters asked the FTC to reconsider the

utility of electronic signatures in the online world.220 The Commission has determined not to

include electronic or digital signatures within the non-exhaustive list of acceptable consent

mechanisms provided for in § 312.5, given the great variability in the reliability of mechanisms

that may fall under this description. For instance, the Commission believes that simple digital

signatures, which only entail the use of a finger or stylus to complete a consent form, provide too

easy a means for children to bypass a site or service’s parental consent process, and thus do not

meet the statutory standard of “reasonably calculated, in light of available technology, to ensure

that the person providing consent is the child’s parent.”221 However, the Rule would not prohibit

222 See ESA (comment 20, 2010 FRN), at 4; Microsoft (comment 39, 2010 FRN), at 7.
223 2011 NPRM, 76 FR 59818 (Sept. 27, 2011), available at http://ftc.gov/os/2011/09/110915coppa.pdf.

an operator’s acceptance of a digitally signed consent form where the signature provides other

indicia of reliability that the signer is an adult, such as an icon, certificate, or seal of authenticity

that accompanies the signature. At the same time, the Commission does not seek to limit or

proscribe other types of digital signatures that may also meet the statutory standard. For these

reasons, digital or electronic signatures are not included within the Rule’s non-exhaustive list of

parental consent mechanisms.

6. Platform Methods of Parental Consent

In response to the 2010 FRN, several commenters asked the Commission to consider

whether, and in what circumstances, parental control features in game consoles, and presumably

other devices, could be used to provide notice to parents and obtain verified consent under

COPPA.222 In the 2011 NPRM, the Commission acknowledged that parental control features can

offer parents a great deal of control over a child’s user experience and can serve as a complement

to COPPA’s parental consent requirements. However, the Commission concluded that, at that

time, it did not appear that any such systems were adequately designed to comply with COPPA,

and that the record was insufficient for it to determine whether a hypothetical parental consent

mechanism would meet COPPA’s verifiable parental consent standard. The Commission, in the

2011 NPRM, encouraged continued exploration of the concept of using parental controls in

gaming consoles and other devices to notify parents and obtain their prior verifiable consent.223

224 The Commission notes that Privo, Inc., one of the approved COPPA safe harbors, offers the option to its members to have Privo administer notice and consent programs for member operators.
225 See, e.g., P. Aftab (comment 1, 2012 SNPRM), at 7; Association for Competitive Technology (comment 5, 2011 NPRM), at 7-8 and (comment 7, 2012 SNPRM), at 8; Computer and Communications Industry Association (“CCIA”) (comment 27, 2011 NPRM), at 7-8; CDT (comment 15, 2012 SNPRM), at 5-6; Connect Safely (comment 21, 2012 SNPRM), at 3; ESA (comment 47, 2011 NPRM), at 21-26; Facebook (comment 33, 2012 SNPRM), at 18-20; Future of Privacy Forum (comment 55, 2011 NPRM), at 5-6 and (comment 37, 2012 SNPRM), at 3-6; Microsoft (comment 107, 2011 NPRM), at 13-15 and (comment 66, 2012 SNPRM), at 6; Novachi, Inc. (comment 119, 2011 NPRM); SIIA (comment 150, 2011 NPRM), at 10-12; TechFreedom (comment 159, 2011 NPRM), at 7 and (comment 88, 2012 SNPRM), at 13; The Walt Disney Co. (comment 170, 2011 NPRM), at 17-19.
226 See, e.g., Association for Competitive Technology (comment 5, 2011 NPRM), at 7-8 and (comment 7, 2012 SNPRM), at 8; CCIA (comment 27, 2011 NPRM), at 7-8; Facebook (comment 33, 2012 SNPRM), at 18-20; Future of Privacy Forum (comment 55, 2011 NPRM), at 5-6 and (comment 37, 2012 SNPRM), at 3-6; Microsoft (comment 107, 2011 NPRM), at 13-15 and (comment 66, 2012 SNPRM), at 13; SIIA (comment 150, 2011 NPRM), at 10-12. Future of Privacy Forum’s 2012 comment included proposed Rule language. See also NetChoice (comment 70, 2012 SNPRM), at 12 (proposing Rule language to clarify that COPPA allows for the use of common consent mechanisms).

In response to both the 2011 NPRM and the 2012 SNPRM, numerous stakeholders,

including several platform providers, website and app developers, and child and privacy

advocates, asked the Commission to consider modifications to the Rule to make clear that

operators can choose to use a common mechanism – administered by a platform, gaming

console, device manufacturer, COPPA safe harbor program, or other entity224 – for the purpose

of providing notice and obtaining parental consent for multiple operators simultaneously. 225

Commenters offered a variety of proposals. For instance, several commenters envisioned

that platform providers could provide a general notice and obtain consent to collect personal

information for those purposes specified in the general notice, and that app developers wanting

to collect or use information in ways differing from the general notice would need to

independently provide a second separate notice to parents and obtain their consent.226 Facebook

227 Facebook (comment 33, 2012 SNPRM), at 18-19.
228 The Walt Disney Co. (comment 170, 2011 NPRM), at 18.
229 ESA contemplates that the platforms would provide a notice “that makes it clear that the child’s personal information will be disclosed to third-party game publishers and application providers who may collect, use, and disclose such information through the console or handheld in order to provide a joint or related service,” and that parental consent “might be effective across any of the console or handheld maker’s related video game platforms and websites clearly referenced in the console or handheld maker’s privacy policy.” ESA (comment 47, 2011 NPRM), at 26. Other proposals for common consent mechanisms included outsourcing the process to identity management services, which operators could access through open technology standards. See Novachi (comment 119, 2011 NPRM). CDT acknowledged the potential utility of platform-based outsourcing of notice and consent, provided that the Commission required additional safeguards for common consent mechanisms, including parental controls for the ongoing management of consent. CDT (comment 15, 2012 SNPRM), at 5-6.

proposed that operators may also use such common consent mechanisms to meet other COPPA

obligations, such as providing parental access to children’s data collected by operators.227 The

Walt Disney Company proposed two possible mechanisms: a “‘Kids Privacy Portal’ – through

which parents can express privacy preferences in one place for multiple online activities,” or a

joint agreement between the platform operator and application providers “that determines how

data will be collected and used, and how parents exercise control.”228 The Entertainment

Software Association (“ESA”) proposed a similar program for video game platforms whereby

consoles or hand-held device makers could leverage their existing parental controls

technologies.229

Commenters cited several potential benefits of common consent mechanisms, including:

(1) encouraging the development of interactive content for children by easing the burden

individualized notice and consent places on operators, especially in the context of mobile

230 See, e.g., CCIA (comment 27, 2011 NPRM), at 7-8 (stating that platform-based consent programs would “promote COPPA’s goals” by encouraging developers “who do not have the resources to independently acquire verifiable parental consent” to create content and services for children); see also ConnectSafely.org (comment 21, 2012 SNPRM), at 3; P. Aftab (comment 1, 2012 SNPRM), at 7; Tech Freedom (comment 159, 2011 NPRM), at 7.
231 For example, Microsoft stated that common consent mechanisms “would benefit parents because requiring each third party separately to obtain parental consent could be confusing, overwhelming, and costly for parents.” Microsoft (comment 66, 2012 SNPRM), at 6.
232 Microsoft, id.; see also CCIA (comment 27, 2011 NPRM), at 8; Facebook (comment 33, 2012 SNPRM), at 19 (“A rule that enables operators to leverage a common platform for notice and consent would substantially advance the Commission’s goal of ensuring that parents receive clear, understandable, and manageable information; it would also minimize the practical and economic costs to parents as a result of multiple consent requests.”); TechAmerica (comment 87, 2012 SNPRM), at 8.
233 CDT (comment 15, 2012 SNPRM), at 6.

apps;230 (2) focusing parental attention on one streamlined notice rather than on multiple,

confusing, notices;231 and (3) promoting privacy by eliminating the need for each of these other

operators to separately collect online contact information from the child in order to obtain

parental consent.232 The Center for Democracy and Technology acknowledges that, while not all

parents may want to delegate to platforms the authority to get consent on behalf of individual

operators, “others may want to empower their kids to share and obtain information through

certain applications without being forced to sign off on every interaction with a new web

service.” 233

The Commission believes that common consent mechanisms, such as a platform, gaming

console, or a COPPA safe harbor program, hold potential for the efficient administration of

notice and consent for multiple operators. A well-designed common mechanism could benefit

operators (especially smaller ones) and parents alike if it offers a proper means for providing

notice and obtaining verifiable parental consent, as well as ongoing controls for parents to

234 Under the system proposed by the Future of Privacy Forum, parents would be apprised of a common set of information practices to which they could consent on an aggregate basis, then would receive individualized notices for additional practices that go beyond those outlined in the common notice. The platform would also ensure that parents have access to easy mechanisms through which to retract their consent to the child’s use of any particular site or service. Future of Privacy Forum (comment 37, 2012 SNPRM), at 4-6.
235 As noted in note 224, supra, one such common consent mechanism is currently provided by an approved COPPA safe harbor, and there may be others already in operation as well.
236 The Commission would want to explore further the difficulties of making sure the notice accurately reflects each individual operator’s information practices; how to provide parents with a means to access the operator’s privacy policy with regard to information collected from children; and giving parents controls sufficient to refuse to permit an operator’s further use or future collection of their child’s personal information, and to direct the operator to delete the child’s personal information and/or disable the child’s account with that operator.

manage their children’s accounts.234 The Commission believes that such methods could greatly

simplify operators’ and parents’ abilities to protect children’s privacy.
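One way to picture a common consent mechanism of the kind commenters describe is an interface through which an operator can check whether a parent has already consented, via the platform, to the practices the operator intends to engage in. The sketch below is speculative; no such API is defined by the Rule, and the `ConsentPlatform` protocol and scope model are assumptions of this example.

```python
from typing import Protocol, Set

class ConsentPlatform(Protocol):
    """Hypothetical common consent mechanism (platform, console, or safe
    harbor program); the Rule does not define any such API."""
    def consented_scopes(self, child_account_id: str, operator_id: str) -> Set[str]: ...

def covered_by_platform_consent(platform: ConsentPlatform, child_account_id: str,
                                operator_id: str, needed_scopes: Set[str]) -> bool:
    """True when the practices a parent has already consented to through the
    common mechanism cover everything this operator intends to do."""
    granted = platform.consented_scopes(child_account_id, operator_id)
    return needed_scopes <= granted
```

If the scopes already consented to do not cover an operator’s intended practices, the operator would, as several commenters proposed, fall back to providing its own notice and obtaining consent independently.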

Despite the potential benefits, the Commission declines, at this time, to adopt a specific

provision for the following reasons. First, even without an express reference in the Rule to such

a process, nothing forecloses operators from using a common consent mechanism so long as it

meets the Rule’s basic notice and consent requirements.235 Second, the Commission did not

specifically seek comment on this precise issue; nor has it proposed any language in either the

NPRM or the SNPRM to address this point. Accordingly, the Commission is reluctant to adopt

specific language without the benefit of notice and comment on such language to explore all

potential legal and practical challenges of using a common consent mechanism.236 Finally, the

Commission believes that parties interested in using a common consent mechanism have the

option to participate in the voluntary Commission approval process set forth in Section 312.5(3)

237 See Part II.C.8., infra.
238 See 2010 Rule Review, supra note 6, at 17091.
239 The sliding scale approach was adopted in the Rule in response to comments that stated that internal uses of information, such as marketing to children, presented less risk than external disclosures of the information to third parties or through public postings. See 1999 Statement of Basis and Purpose, 64 FR at 59901. Other internal uses of children’s personal information may include sweepstakes, prize promotions, child-directed fan clubs, birthday clubs, and the provision of coupons.
240 The Commission notes that, assuming an operator has obtained a parent’s mobile phone number from the parent in response to the first email, confirmation of a parent’s consent may be done via an SMS or MMS text to the parent.
241 By contrast, for uses of personal information that involve disclosing the information to the public or third parties, the Rule requires operators to use more reliable methods of obtaining verifiable parental consent, including but not limited to those identified in § 312.5(b)(1).

of the final Rule.237 That process would enable the Commission to evaluate, and other interested

parties to publicly comment upon, such proposals in an effort to bring to market sound and

practical solutions that will serve a broad base of operators.

7. The Sliding Scale (“Email Plus”) Method

In conducting the Rule review, the Commission sought comment on whether the sliding

scale set forth in § 312.5(b)(2) remains a viable approach to verifiable parental consent.238 Under

the sliding scale, an operator, when collecting personal information only for its internal use, may

obtain verifiable parental consent through an email from the parent, so long as the email is

coupled with an additional step.239 Such an additional step has included obtaining a postal

address or telephone number from the parent and confirming the parent’s consent by letter or

telephone call, or sending a delayed confirmatory email to the parent after receiving consent. 240

The purpose of the additional step is to provide greater assurance that the person providing

consent is, in fact, the parent. This consent method is often called “email plus.” 241
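The email plus flow, using the delayed confirmatory email variant described above, might look like the following. This is an illustrative sketch only; the mail helper is a hypothetical stand-in, and, per footnote 240, the follow-up could instead be an SMS or MMS text where the operator has obtained the parent’s mobile number.

```python
import uuid

def send_email(to: str, subject: str, body: str) -> None:
    # Stand-in for the operator's mail system (hypothetical).
    print(f"To: {to}\nSubject: {subject}\n{body}\n")

def request_email_plus_consent(parent_email: str) -> str:
    """Step 1: email the parent a consent request (with the direct notice) and
    a confirmation code the parent returns to grant consent."""
    token = uuid.uuid4().hex
    send_email(parent_email, "Consent requested for your child's account",
               f"Please reply to confirm consent. Confirmation code: {token}")
    return token

def send_delayed_confirmation(parent_email: str) -> None:
    """Step 2 (the 'plus'): after consent is received, send a delayed
    confirmatory message so a parent who never gave consent can withdraw it."""
    send_email(parent_email, "Confirming the consent we received",
               "We have recorded your consent. If you did not provide it, reply to revoke it.")
```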

242 64 FR at 59902 (“[E]mail alone does not satisfy the COPPA because it is easily subject to circumvention by children.”).
243 See id. at 59901 (“The Commission believes it is appropriate to balance the costs imposed by a method against the risks associated with the intended uses of the information collected. Weighing all of these factors in light of the record, the Commission is persuaded that temporary use of a ‘sliding scale’ is an appropriate way to implement the requirements of the COPPA until secure electronic methods become more available and affordable.”).
244 See 71 FR at 13247, 13255, 13254 (Mar. 15, 2006).
245 See WiredSafety.org (comment 68, 2010 FRN), at 21 (“We all assumed [email plus] would be phased out once digital signatures became broadly used. But when new authentication models and technologies failed to gain in parental adoption, it was continued and is in broad use for one reason – it’s simple.”).

In adopting the sliding scale approach in 1999, the Commission recognized that the email

plus method was not as reliable as the other enumerated methods of verifiable parental

consent.242 However, it believed that this lower cost option was acceptable as a temporary

option, in place until the Commission determined that more reliable (and affordable) consent

methods had adequately developed.243 In 2006, the Commission extended use of the sliding

scale indefinitely, stating that the agency would continue to monitor technological developments

and modify the Rule should an acceptable electronic consent technology develop. 244

Email plus has enjoyed wide appeal among operators, who credit its simplicity.245 The

Commission sought comment in response to the 2010 FRN and at the June 2010 public

roundtable on whether to retain email plus in the final Rule. Numerous commenters to the 2010

FRN, including associations who represent operators, supported the continued retention of this

246 See R. Newton, Remarks from Emerging Parental Verification Access and Methods Panel at the Federal Trade Commission’s Roundtable: Protecting Kids’ Privacy Online at 211-13 (June 2, 2010), available at http://www.ftc.gov/bcp/workshops/coppa/COPPARuleReview_Transcript.pdf; DMA (comment 17, 2010 FRN), at 10; IAB (comment 34, 2010 FRN), at 2; R. Newton (comment 46, 2010 FRN), at 3; PMA (comment 51, 2010 FRN), at 4-5; Toy Industry Association, Inc. (comment 63, 2010 FRN), at 8.
247 See Privo, Inc. (comment 50, 2010 FRN), at 5 (“the presentation of a verified email is much less reliable if there is virtually no proofing or analyzing that goes on to determine who the email belongs to”); RelyId (comment 53, 2010 FRN), at 3 (“The email plus mechanism does not obtain verifiable parental consent at all. It simply does not ensure that a parent ‘authorizes’ anything required by the COPPA statute. The main problem with this approach is that the child can create an email address to act as the supposed parent’s email address, send the email from that address, and receive the confirmatory email at that address.”); see also D. Tayloe and P. Spaeth, Remarks from Federal Trade Commission’s Roundtable: Protecting Kids’ Privacy Online, at 215-17 (email plus is very unreliable).

method as a low-cost means to obtain parents’ consent.246 At the same time, several

commenters, including safe harbor programs and proponents of new parental consent

mechanisms, challenged the method’s reliability, given that operators have no real way of

determining whether the email address a child provides is that of the parent, and there is no

requirement that the parent’s email response to the operator contain any additional information

providing assurance that it is from a parent. 247

In the 2011 NPRM, the Commission proposed eliminating email plus as a means of

obtaining parental consent. The Commission considered whether operators’ continued reliance

on email plus may have inhibited the development of more reliable methods of obtaining

verifiable parental consent. The Commission also made clear that, although internal uses may

pose a lower risk of misuse of children’s personal information than the sharing or public

disclosure of such information, all collections of children’s information merit strong verifiable

parental consent.

248 See K. Dennis, AssertID (comment 34, 2011 NPRM), at 2; AssertID (comment 6, 2012 SNPRM), at 1; TRUSTe (comment 164, 2011 NPRM), at 11; EPIC (comment 41, 2011 NPRM), at 9; Institute for Public Representation (comment 71, 2011 NPRM), at 41; S. Leff, WhooGoo (comment 60, 2012 SNPRM).
249 See AssertID, supra note 248; Institute for Public Representation, supra note 248.
250 See, e.g., American Association of Advertising Agencies (comment 2, 2011 NPRM); Association of Educational Publishers (comment 7, 2011 NPRM); ATT (comment 8, 2011 NPRM); d. boyd (comment 13, 2011 NPRM); DMA (comment 37, 2011 NPRM); ESA (comment 47, 2011 NPRM); Internet Commerce Coalition (comment 74, 2011 NPRM); kidSAFE Seal Program (comment 81, 2011 NPRM); Magazine Publishers of America (comment 61, 2012 SNPRM); Marketing Research Association (comment 97, 2011 NPRM); R. Newton (comment 118, 2011 NPRM); N. Savitt (comment 142, 2011 NPRM); Scholastic (comment 144, 2011 NPRM).
251 See, e.g., Association of Educational Publishers (comment 7, 2011 NPRM), at 1 (email plus is an effective way to balance parental involvement with children’s freedom to pursue educational experiences online); Scholastic (comment 144, 2011 NPRM), at 3 (email plus strikes a balance between the ease of getting consent and low safety risk to children from internal use of their data); Toy Industry Association (comment 163, 2011 NPRM), at 4-5 (similar cost-effective and efficient technologies to replace this method have not yet been developed); NCTA (comment 113, 2011 NPRM), at 20 (termination of email plus will have negative consequences and leave operators with no viable alternative); Privo (comment 132, 2011 NPRM), at 2 (email plus is a reasonable approach that can be understood by all constituents); d. boyd (comment 13, 2011 NPRM), at 5-6 (email plus imposes fewer burdens on families, particularly low-income and immigrant families, than other available mechanisms); DMA (comment 37, 2011 NPRM), at 21 (elimination of email plus would create economic challenges in a difficult economic time).

Several commenters supported the Commission’s proposal to eliminate email plus.

These commenters opined that children can easily circumvent email plus and thus, that it is not

sufficiently effective to meet the statutory requirement of being reasonably calculated to ensure

that it is the parent providing consent.248 Some of these commenters also echoed the

Commission’s concern that operators’ continued reliance on email plus is a disincentive to

innovation. 249

A majority of the comments, however, strongly urged the Commission to retain email

plus.250 Several commenters indicated that email plus remains a widely used and valuable tool

for communicating with parents and obtaining consent. These commenters maintained that

email plus is easy for companies and parents to use, easy to understand, effective, and

affordable.251 In addition, several commenters expressed concern that other approved methods

252 See Association for Competitive Technology (comment 7, 2012 SNPRM), at 6 (FTC should not remove easy to understand email plus without finding ways to make parental consent simpler); Toy Industry Association (comment 89, 2012 SNPRM), at 15 (the alternatives to email plus are not likely to be useful, effective, or cost-effective); see also American Association of Advertising Agencies (comment 2, 2011 NPRM), at 2 (this could result in a major reduction in parental consents obtained, solely due to burdensomeness of process); Association of Educational Publishers (comment 7, 2011 NPRM), at 2 (methods such as print, fax, or scan impede timely access to online resources; requiring credit cards or identification imposes barriers that may alienate parents; and other mechanisms impose financial costs on operators that may result in less free content); ESA (comment 47, 2011 NPRM), at 17-18 (requiring other methods of consent will make it harder to offer children robust content; no public benefit in requiring operators to make the costly changeover to other mechanisms); Scholastic (comment 144, 2011 NPRM), at 5-6 (credit card use is not an option for Scholastic, which offers free services; existing options are cumbersome and slow for parents and operators, and newly proposed options are less privacy protective, affordable, or accessible than email plus); TechFreedom (comment 159, 2011 NPRM), at 7-8 (making parental consent more difficult to obtain would disproportionately burden smaller players in the market and retard new entry); Wired Trust (comment 177, 2011 NPRM), at 5 (eliminating email plus will likely result in reduction in innovative and valuable online features for children).
253 See d. boyd (comment 13, 2011 NPRM), at 6 (no data to suggest that children are evading email plus more than other consent mechanisms); Scholastic (comment 144, 2011 NPRM), at 8 (no evidence that proposed methods are significantly more reliable); see also kidSAFE Seal Program (comment 81, 2011 NPRM), at 13-14 (the Commission has not shown any harm to children due to use of email plus); SIIA (comment 150, 2011 NPRM), at 12-13 (proposing that only a small percentage of children are likely to falsify parental consent).
254 See, e.g., ACT (comment 7, 2012 SNPRM), at 6; Internet Commerce Coalition (comment 74, 2011 NPRM), at 5; Marketing Research Association (comment 97, 2011 NPRM), at 3; A. Thierer (comment 162, 2011 NPRM), at 7; WiredTrust (comment 177, 2011 NPRM), at 5.

for obtaining consent would impose significant burdens on operators and parents. 252

Commenters also questioned whether other methods for verifiable parental consent are any more

reliable than email plus.253 Finally, several commenters challenged the FTC’s assumption that

eliminating email plus would spur further innovation in parental consent mechanisms. 254

255 See 16 CFR 312.5(b)(1).

The Commission is persuaded by the weight of the comments that email plus, although

imperfect, remains a valued and cost-effective consent mechanism for certain operators.

Accordingly, the final Rule retains email plus as an acceptable consent method for operators

collecting personal information only for internal use. Nevertheless, the Commission continues

to believe that email plus is less reliable than other methods of consent, and is concerned that,

twelve years after COPPA became effective, so many operators rely upon what was supposed to

be a temporary option. The Commission is also concerned about perpetuating for much longer a

distinction between internal and external uses of personal information that the COPPA statute

does not make. Thus, the Commission strongly encourages industry to innovate to create

additional useful mechanisms as quickly as possible.

8. Voluntary Process for Commission Approval of Parental Consent Mechanisms

Under the Rule, methods to obtain verifiable parental consent “must be reasonably

calculated, in light of available technology, to ensure that the person providing consent is the

child’s parent.”255 The Rule thus provides operators with the opportunity to craft consent

mechanisms that meet this standard but otherwise are not enumerated in paragraph (b)(2) of §

312.5. Nevertheless, the recent Rule review process revealed that, whether out of concern for

potential liability, ease of implementation, or lack of technological developments, operators have

256 The June 2, 2010 Roundtable and the public comments reflect a tension between operators’ desire for new methods of parental verification and their hesitation to adopt consent mechanisms other than those specifically enumerated in the Rule. See Remarks from Federal Trade Commission’s Roundtable: Protecting Kids’ Privacy Online at 226-27 (June 2, 2010), available at http://www.ftc.gov/bcp/workshops/coppa/COPPARuleReview_Transcript.pdf; CDT (comment 8, 2010 FRN), at 3 (“innovation in developing procedures to obtain parental consent has been limited as websites choose to use the methods suggested by the FTC out of fear that a more innovative method could lead to liability”).

been reluctant to utilize consent methods other than those specifically set forth in the Rule.256 As

a result, little technical innovation in the area of parental consent has occurred.

To encourage the development of new consent mechanisms, and to provide transparency

regarding consent mechanisms that may be proposed, the Commission in the 2011 NPRM

proposed establishing a process in the Rule through which parties may, on a voluntary basis,

seek Commission approval of a particular consent mechanism. Applicants who seek such

approval would be required to present a detailed description of the proposed parental consent

mechanism, together with an analysis of how the mechanism meets the requirements of §

312.5(b)(1) of the Rule. The Commission would publish the application in the FEDERAL

REGISTER for public comment, and approve or deny the applicant’s request in writing within

180 days of its filing.

The NPRM stated the Commission’s belief that this new approval process, aided by

public input, would allow the Commission to give careful consideration, on a case-by-case basis,

to new forms of obtaining consent as they develop in the marketplace. The Commission also

noted that the new process would increase transparency by publicizing approvals or rejections of

particular consent mechanisms, and should encourage operators who may previously have been

257 See CCIA (comment 27, 2011 NPRM), at 6 (voluntary approval mechanism is an “excellent step” to encourage innovation, provide assurance to potential operators, and ensure parents’ participation); Yahoo! (comment 180, 2011 NPRM), at 4 (streamlined approval process for new mechanisms is critical to encouraging innovation); see also Consumers Union (comment 29, 2011 NPRM), at 5; FOSI (comment 51, 2011 NPRM), at 7; kidSAFE Seal Program (comment 81, 2011 NPRM), at 16.

258 See, e.g., CCIA (comment 27, 2011 NPRM), at 6 (process must be completed more quickly in order to be useful to industry); Facebook (comment 50, 2011 NPRM), at 14 (Commission’s extensive experience with COPPA should enable its more expeditious approval or disapproval of new mechanisms).

259 See, e.g., CCIA (comment 27, 2011 NPRM), at 6 (while public comment is important, the Commission should consider “an alternate private track” for consent mechanisms involving proprietary technology or a competitive advantage); Facebook (comment 50, 2011 NPRM), at 15 (public comment requirement could negatively affect economic incentives for innovation where rival operators might be able to copy the mechanism).

260 Facebook (comment 50, 2011 NPRM), at 15.

tentative about exploring technological advancements to come forward and share them with the

Commission and the public.

The Commission received several comments expressing support for the concept of a

voluntary Commission approval process for new consent mechanisms.257 At the same time,

several commenters that supported the concept also opined that the 180-day approval period was

too lengthy and would likely discourage use of the program.258 Commenters also expressed

concerns that applications for approval would be subject to public comment.259 One commenter

asked the Commission instead to consider publicly releasing a letter explaining the

Commission’s decision to approve or disapprove a mechanism and thereby signaling what is an

acceptable consent mechanism, without causing undue delay or risking the disclosure of

proprietary information. 260

261 DMA (comment 37, 2011 NPRM), at 24.

One commenter opposed to the voluntary approval process asserted that it would be ultra

vires to the COPPA statute and would create a de facto requirement for FTC approval of any

new consent mechanisms, thereby discouraging operators from developing or using new means

not formally approved by the Commission.261 The Commission does not believe that offering

operators the opportunity to apply for a voluntary approval process will either de facto create an

additional COPPA requirement or chill innovation. This is just one more option available to

operators.

The Commission also is persuaded by the comments requesting that it shorten the 180-

day approval period. Accordingly, the final Rule’s provision for Commission approval of

parental consent mechanisms provides that the Commission shall issue a written determination

within 120 days of the filing of the request. The Commission anticipates that some commenters

will find that this time period also is longer than desired; however, it sets a reasonable time

frame in which to solicit public comment and carefully determine whether a consent mechanism

is sufficiently well-designed to fulfill the Rule’s requirements.

The Commission has determined not to alter the requirement that the proposed

mechanisms undergo public review and comment. This is an important component of the

approval process. Moreover, just as the Commission has done for COPPA safe harbor

applicants, it would permit those entities that voluntarily seek approval of consent mechanisms

to seek confidential treatment for those portions of their applications that they believe warrant

trade secret protection. In the event an applicant is not comfortable with the Commission’s

262 See MPAA (comment 42, 2010 FRN), at 12; Rebecca Newton (comment 46, 2010 FRN), at 2; Privo (comment 50, 2010 FRN), at 2; PMA (comment 51, 2010 FRN), at 5; B. Szoka (comment 59, 2010 FRN), Szoka Responses to Questions for the Record, at 56; TRUSTe (comment 64, 2010 FRN), at 3; see also generally WiredSafety.org (comment 68, 2010 FRN), at 31-32.

determination as to which materials will be placed on the public record, it will be free to

withdraw the proposal from the approval process.

Accordingly, the Commission has amended the Rule to institute this voluntary approval

process. For ease of organization, the Commission has created a new section – 312.12

(“Voluntary Commission Approval Processes”) – to encompass both this approval process and

the process for approval of additional activities under the support for internal operations

definition.

9. Safe Harbor Approval of Parental Consent Mechanisms

Several commenters urged the Commission to permit Commission-approved safe harbor

programs to serve as laboratories for developing new consent mechanisms.262 The Commission

stated its agreement in the 2011 NPRM that establishing such a system may aid the pace of

development in this area. The Commission also stated that, given the measures proposed to

strengthen Commission oversight of safe harbor programs, allowing safe harbors to approve new

consent mechanisms would not result in the loosening of COPPA’s standards for parental

consent. Thus, the 2011 NPRM included a proposed Rule provision stating that operators

participating in a Commission-approved safe harbor program may use any parental consent

mechanism deemed by the safe harbor program to meet the general consent standard set forth in

§ 312.5(b)(1). Although one commenter expressed concern that this would lead to a “race to the

263 CommonSense Media (comment 26, 2011 NPRM), at 16 (raising concern that safe harbor providers may “race to the bottom” to offer operators low-cost consent programs with low standards of verifiable consent, unless the Commission requires safe harbors to publicly disclose their approvals and report them to the FTC).

264 See, e.g., eBay (comment 40, 2011 NPRM), at 4; kidSAFE Seal Program (comment 81, 2011 NPRM), at 16; TRUSTe (comment 164, 2011 NPRM), at 11 (noting cost benefit to operators to get early review of mechanism at design or wireframe stage).

265 See 15 U.S.C. 6502(b)(2); 16 CFR 312.5(c).

266 The Act and Rule currently permit the collection of limited personal information for the purposes of: (1) obtaining verified parental consent; (2) providing parents with a right to opt-out of an operator’s use of a child’s email address for multiple contacts of the child; and (3) to protect a child’s safety on a website or online service. See 15 U.S.C. 6502(b)(2); 16 CFR 312.5(c)(1)-(5).

bottom” by safe harbor programs,263 most of the comments were favorable.264 Moreover, the

Commission believes its added oversight will prevent any “race to the bottom” efforts.

Accordingly, the Commission adopts this provision unchanged from its September 2011

proposal.

10. Exceptions to Prior Parental Consent

The COPPA statute and the Rule address five fact patterns under which an operator may

collect limited pieces of personal information from children prior to, or sometimes without,

obtaining parental consent.265 These exceptions permit operators to communicate with the child

to initiate the parental consent process, respond to the child once or multiple times, and protect

the safety of the child or the integrity of the website.266 The 2011 NPRM proposed minor

changes to these exceptions and the addition of one new exception.

267 N. Savitt (comment 142, 2011 NPRM), at 2; see also kidSAFE Seal Program (comment 81, 2011 NPRM), at 17 (this exception should also allow the collection of a child’s online contact information to enable the operator to notify the child that the parent has consented).

268 15 U.S.C. 6502(b)(2)(B).

269 See Part II.B.1., supra (discussing the parallel correction to § 312.4(c)(1) (direct notice to a parent required under § 312.5(c)(1))).

a. Section 312.5(c)(1)

The Rule’s first exception, § 312.5(c)(1), permits an operator to collect “the name or

online contact information of a parent or child” to be used for the sole purpose of obtaining

parental consent. In view of the limited purpose of the exception – to reach the parent to initiate

the consent process – the Commission proposed in the 2011 NPRM to limit the information

collection under this exception to the parent’s online contact information only. However, as one

commenter pointed out,267 the COPPA statute expressly provides that, under this exception, an

operator can collect “the name or online contact information of a parent or child.”268

Accordingly, the Commission retains § 312.5(c)(1) allowing for the collection of the

name or online contact information of the parent or child in order to initiate the notice and

consent process.269

b. Section 312.5(c)(2)

The 2011 NPRM proposed adding one additional exception to parental consent in order

to give operators the option to collect a parent’s online contact information for the purpose of

providing notice to, or updating, the parent about a child’s participation in a website or online

270 At least a few online virtual worlds directed to very young children already follow this practice. Because the Rule did not include such an exception, these operators technically were in violation of COPPA.

271 See, e.g., DMA (comment 37, 2011 NPRM), at 26; kidSAFE Seal Program (comment 81, 2011 NPRM), at 17-18; N. Savitt (comment 142, 2011 NPRM), at 2.

272 See N. Savitt (comment 142, 2011 NPRM), at 2 (proposing that the exception clearly indicate that providing such notice is optional); kidSAFE (comment 81, 2011 NPRM), at 18 (seeking clarification that parent’s online contact information is linkable to child’s account for updating purposes).

service that does not otherwise collect, use, or disclose children’s personal information.270 The

proposed exception, numbered 312.5(c)(2), provided that the parent’s online contact information

may not be used for any other purpose, disclosed, or combined with any other information

collected from the child. The Commission indicated its belief that collecting a parent’s online

contact information for the limited purpose of notifying the parent of a child’s online activities in

a site or service that does not otherwise collect personal information is reasonable and should be

encouraged.

The few comments addressing this proposed additional exception generally supported

it.271 Certain commenters recommended minor clarifications, such as adding language to

indicate that the notice is voluntary and that operators can link a parent’s email address to the

child’s account.272 Upon consideration of the commenters’ suggestions, the Commission has

made minor changes to the language of this exception to clarify that its use is voluntary and that

operators can use the exception to provide notice and subsequent updates to parents. The

Commission did not find that clarification is needed to enable operators to link the parent’s email

to the child’s account. Therefore, § 312.5(c)(2) of the final Rule permits the collection of a

parent’s online contact information to provide voluntary notice to, and subsequently update the

273 Section 312.4(c)(2) of the final Rule sets out the direct notice requirements under this exception. See Part II.B.1., supra.

274 See Promotion Marketing Association (comment 133, 2011 NPRM), at 5-6.

parent about, the child’s participation in a website or online service that does not otherwise

collect, use, or disclose children’s personal information, where the parent’s contact information

is not used or disclosed for any other purpose.273

c. Section 312.5(c)(3) (One-Time Use Exception)

Section 312.5(c)(2) of the Rule provides that an operator is not required to provide notice

to a parent or obtain consent where the operator has collected online contact information from a

child for the sole purpose of responding on a one-time basis to a child’s request, and then deletes

the information. The 2011 NPRM proposed a minor change to the language of the one-time use

exception, stating that the exception would apply where the operator collected a child’s online

contact information for such purpose. One commenter pointed out that the Rule language,

“online contact information from a child,” is taken directly from the COPPA statute. The

commenter also expressed concern that the Commission’s proposed change to the language may

prevent operators from offering several popular one-time use activities under this exception. 274

In proposing this minor change, the Commission did not intend to further constrict the

permissible uses of online contact information under the one-time-use exception (such as

notifications regarding a contest or sweepstakes, homework help, birthday messages, forward-to-

a-friend emails, or other similar communications). The Commission is persuaded, therefore, to

retain the existing language in § 312.5(c)(3) permitting the collection of online contact

information from a child.

275 Under this exception, the Rule requires the operator only to provide the parent the opportunity to opt-out of granting consent, rather than requiring it to obtain opt-in consent.

276 See DMA (comment 37, 2011 NPRM), at 25-26.

277 See 15 U.S.C. 6502(b)(2)(C) (statute requires operator to “use reasonable efforts to provide a parent notice”).

d. Section 312.5(c)(4) (Multiple Use Exception)

The Rule provides that an operator may notify a parent via email or postal address that it

has collected a child’s online contact information to contact a child multiple times (for instance,

to provide the child with a newsletter or other periodic communication).275 The 2011 NPRM

proposed revising the multiple contacts exception to allow for the collection of a child’s and a

parent’s online contact information; and to strike the collection of postal address on the basis that

it is now outmoded for this use. Although one commenter argued that postal address continues

to provide a reasonable means of contacting the parent,276 the Commission believes that the

revised provision provides operators with a sufficient and practical means of contacting a parent

in connection with the multiple use exception. The Commission also notes that the collection of

postal address for the purpose of providing notice to a parent is not specifically provided for in

the COPPA statute or elsewhere in the Rule’s notice requirements.277 Therefore, the language of

§ 312.5(c)(4), as proposed in the 2011 NPRM, is hereby adopted in the final Rule.

e. Section 312.5(c)(5) (Child Safety Exception)

The 2011 NPRM proposed minor changes to the language of the child safety exception to

state the purpose of the exception up-front, and to make clear that the operator can collect both

the child’s and the parent’s online contact information where it is necessary to protect the safety

278 kidSAFE Seal Program (comment 81, 2011 NPRM), at 18.

of the child and where the information is not used for any other purpose. The Commission

received one comment recommending that the Rule also allow for the collection of the parent’s

name, which the commenter believes may aid in contacting the parent, if necessary.278 The

Commission recognizes that the circumstances under which the child-safety exception becomes

important may vary significantly. As such, the Commission is persuaded to further modify this

exception to allow for collection of the parent’s name, given that the exception is available only

where necessary to protect the safety of a child and where such information is not used or

disclosed for any purpose unrelated to the child’s safety. Section 312.5(c)(5) of the final Rule

therefore provides that an operator can collect a child’s and a parent’s name and online contact

information, to protect the safety of a child, where such information is not used or disclosed for

any purpose unrelated to the child’s safety.

f. Section 312.5(c)(6) (Security of the Site or Service Exception)

The final Rule incorporates the language of the existing Rule, with only minor, non-substantive

changes to sentence structure.

g. Section 312.5(c)(7) (Persistent Identifier Used to Support Internal Operations Exception)

As described in Section II.C.5.b. above, the final Rule creates an exception for the

collection of a persistent identifier, and no other personal information, where used solely to

provide support for the internal operations of the website or online service. Where these criteria

are met, the operator will have no notice or consent obligations under this exception.


h. Section 312.5(c)(8) (Operator Covered Under Paragraph (b) of Definition of Website or Online Service Directed to Children Collects a Persistent Identifier from a Previously Registered User)

Paragraph (b) of the definition of website or online service directed to children sets forth

the actual knowledge standard for plug-ins under the Rule. The Commission is providing for a

new, narrow, exception to the Rule’s notice and consent requirements for such an operator where

it collects a persistent identifier, and no other personal information, from a user who

affirmatively interacts with the operator and whose previous registration with that operator

indicates that such user is not a child. The Commission has determined that, in this limited

circumstance where an operator has already age-screened a user on its own website or online

service, and such user has self-identified as being over the age of 12, the burden of requiring that

operator to assume that this same user is a child outweighs any benefit that might come from

providing notice and obtaining consent before collecting the persistent identifier in this instance.

This exception only applies if the user affirmatively interacts with the operator’s online service

(e.g., by clicking on a plug-in), and does not apply if the online service otherwise passively

collects personal information from the user while he or she is on another site or service.

D. Section 312.8: Confidentiality, Security, and Integrity of Personal Information Collected From Children

In the 2011 NPRM, the Commission proposed amending § 312.8 to strengthen the

provision requiring operators to maintain the confidentiality, security, and integrity of personal

information collected from children. Specifically, the Commission proposed adding a

requirement that operators take reasonable measures to ensure that any service provider or third

279 See 2011 NPRM, 76 FR at 59821. The Rule was silent on the data security obligations of third parties. However, the online notice provision in the Rule required operators to state in their privacy policies whether they disclose personal information to third parties, and if so, whether those third parties have agreed to maintain the confidentiality, security, and integrity of the personal information they obtain from the operator. See § 312.4(b)(2)(iv) of the Rule.

280 EPIC (comment 41, 2011 NPRM), at 10-11; see also H. Valetk (comment 166, 2011 NPRM), at 2.

281 CDT (comment 17, 2011 NPRM), at 2.

282 Privacy Rights Clearinghouse (comment 131, 2011 NPRM), at 2.

283 Marketing Research Association (comment 97, 2011 NPRM), at 4.

284 DMA (comment 37, 2011 NPRM), at 26.

party to whom they release children’s personal information has in place reasonable procedures to

protect the confidentiality, security, and integrity of such personal information. 279

The Commission received a number of comments in support of its proposal. EPIC

asserted, “[third-party data collectors] are the ‘least cost avoiders’ and can more efficiently

protect the data in their possession than could the data subjects who have transferred control over

their personal information.”280 The CDT found the proposal to be a “sensible requirement that

third-party operators put in place reasonable security procedures.”281 And the Privacy Rights

Clearinghouse stated, “the proposed revision . . . would enhance consumer trust and reduce the

likelihood that data will be mishandled when disclosed to an outside party.”282

Several commenters opposed the Commission’s proposal outright, finding it to be unduly

onerous on small businesses283 or ultra vires to the statute.284 The Commission finds this

opposition unpersuasive. The requirement that operators take reasonable care to release

children’s personal information only to entities that will keep it secure flows directly from the

285 15 U.S.C. 6502(b)(1)(D).

286 See Facebook (comment 50, 2011 NPRM), at 15-16 (“The current definition of third party in Section 312.1 sweeps so broadly that it also encompasses other users who can view content or receive communications from the child – including, for example, the child’s relatives or classmates. Under the proposed amendment, operators would be obligated to take reasonable measures to ensure that these relatives and classmates have ‘reasonable procedures’ in place to protect the child’s personal information”); CDT (comment 17, 2011 NPRM), at 2 (“consistent with the Commission’s goal of addressing business-to-business data sharing, the Commission should make it clear that these additional data security requirements apply only to other FTC-regulated entities with which the operator has a contractual relationship”).

287 See 2011 NPRM, 76 FR at 59809.

288 IAB (comment 73, 2011 NPRM), at 14 (“The IAB is concerned that these requirements, if finalized, would create a risk of liability to companies based on highly subjective standards and on third party activities”); MPAA (comment 109, 2011 NPRM), at 16-17 (“the proposed requirement that operators take measures sufficient to ensure compliance by vendors and other third parties might be misapplied to make operators the effective guarantors of those measures. As a practical matter, no business is in a position to exercise the same degree of control over another, independent business as it can exercise over its own operations.”).

statutory requirement that covered operators “establish and maintain reasonable procedures to

protect the confidentiality, security, and integrity of personal information collected from

children.”285

Several commenters asked the Commission to consider narrowing the proposal so that it

applies only to third parties with whom the operator has a contractual relationship, rather than to

all third parties, given the breadth of the Rule’s definition of third party.286 These concerns are

obviated by the Commission’s proposal in the 2011 NPRM to narrow the definition of release to

include only business-to-business disclosures, and not the sort of open-to-the-public disclosures

that worry the commenters.287

Other commenters expressed concern with the Commission’s use of the words

“reasonable measures” and “ensure” in the proposed revised language, stating that such phrases

are too subjective to be workable and set an impossible-to-reach standard.288 Requiring

289 See, e.g., In the Matter of Compete, Inc., FTC File No. 102 3155 (proposed consent order) (Oct. 29, 2012), available at http://www.ftc.gov/os/caselist/1023155/121022competeincagreeorder.pdf; In the Matter of Franklin’s Budget Car Sales, Inc., FTC Docket No. C-4371 (consent order) (Oct. 3, 2012), available at http://ftc.gov/os/caselist/1023094/121026franklinautomalldo.pdf; In the Matter of EPN, Inc., FTC Docket No. C-4370 (consent order) (Oct. 3, 2012), available at http://ftc.gov/os/caselist/1123143/121026epndo.pdf; In the Matter of Upromise, Inc., FTC Docket No. C-4351 (consent order) (Apr. 3, 2012), available at http://www.ftc.gov/os/caselist/1023116/120403upromisedo.pdf.

290 15 U.S.C. 6502(b)(1)(D).

291 Facebook (comment 50, 2011 NPRM), at 16; MPAA (comment 109, 2011 NPRM), at 16-17.

operators to use “reasonable measures” both to establish their own data protection programs and

to evaluate the programs of others has long been the standard the Commission employs in the

context of its data security actions, and provides companies with the flexibility necessary to

effectuate strong data privacy programs.289 Importantly, the reasonable measures standard is the

one set by Congress for operators’ confidentiality, security, and integrity measures in the

COPPA statute.290

The Commission finds merit, however, in the concerns expressed about the difficulty

operators may face in “ensuring” that any service provider or any third party to whom they release

children’s personal information has in place reasonable procedures to protect the confidentiality,

security, and integrity of children’s personal information.291 The Motion Picture Association of

America (“MPAA”) urged the Commission to take the approach adopted in the Safeguards Rule

implemented under the Gramm-Leach-Bliley Act. Entities covered by the Safeguards Rule are

required to take “reasonable steps to select and retain service providers that are capable of

292 16 CFR 314.4(d).

293 See 76 FR at 59822.

maintaining appropriate safeguards for the customer information at issue” and to “requir[e]

service providers by contract to implement and maintain such safeguards.”292

After reviewing these comments, the Commission has decided to modify the standard

required when an operator releases children’s personal information to service providers and third

parties. Operators must inquire about entities’ data security capabilities and, either by contract

or otherwise, receive assurances from such entities about how they will treat the personal

information they receive. They will not be required to “ensure” that those entities secure the

information absolutely.

Accordingly, the revised confidentiality, security, and integrity provision (§ 312.8) reads:

The operator must establish and maintain reasonable procedures to protect the confidentiality, security, and integrity of personal information collected from children. The operator must also take reasonable steps to release children’s personal information only to service providers and third parties who are capable of maintaining the confidentiality, security and integrity of such information, and who provide assurances that they will maintain the information in such a manner.

E. Section 312.10: Data Retention and Deletion Requirements

In the 2011 NPRM, the Commission proposed adding a data retention and deletion

provision (new Section 312.10).293 The general tenet of data security, that deleting unneeded

information is an integral part of any reasonable data security strategy (discussed in the

Commission’s 1999 COPPA Rulemaking), informed the Commission’s rationale for this new

294 See 1999 Notice of Proposed Rulemaking, 64 FR at 22750, 22758-59 (“The Commission encourages operators to establish reasonable procedures for the destruction of personal information once it is no longer necessary for the fulfillment of the purpose for which it was collected. Timely elimination of data is the ultimate protection against misuse or unauthorized disclosure.”).

295 See 15 U.S.C. 6502(b)(1)(D).

296 EPIC (comment 41, 2011 NPRM), at 4-5; Institute for Public Representation (comment 71, 2011 NPRM), at 42-43; Sarah Kirchner (comment 82, 2011 NPRM); Privacy Rights Clearinghouse (comment 131, 2011 NPRM), at 2-3.

297 Institute for Public Representation, supra note 296, at 42-43.

298 See EPIC (comment 41, 2011 NPRM), at 12; Privacy Rights Clearinghouse (comment 131, 2011 NPRM), at 2-3.

provision.294 In addition, the new proposed provision flowed from the statutory authority granted

in COPPA for regulations requiring operators to establish and maintain reasonable procedures to

protect the confidentiality, security, and integrity of personal information collected from

children.295

The Commission received support for its data retention and deletion proposal from

several consumer groups and an individual commenter.296 The Institute for Public

Representation stated that, without such a provision, operators have no incentive to eliminate

children’s personal information and may retain it indefinitely.297 Other supporters mentioned

that a data retention and deletion requirement works in tandem with the Rule’s requirement that

data be kept confidential and secure, and has the added benefit of reducing the risk and impact of

data breaches.298

299 American Association of Advertising Agencies (comment 2, 2011 NPRM), at 3; DMA (comment 37, 2011 NPRM), at 27; NCTA (comment 113, 2011 NPRM), at 21; National Retail Federation (comment 114, 2011 NPRM), at 4; TRUSTe (comment 164, 2011 NPRM), at 11-12; Yahoo! (comment 180, 2011 NPRM), at 15-16.

300 See DMA (comment 37, 2011 NPRM), at 26; Yahoo! (comment 180, 2011 NPRM), at 15.

301 See National Retail Federation (comment 114, 2011 NPRM), at 4; TRUSTe (comment 164, 2011 NPRM), at 12.

302 For this reason, the Commission declines to adopt the Institute for Public Representation’s request that it require companies to delete children’s personal information within three months. See Institute for Public Representation (comment 71, 2011 NPRM), at 43.

Other commenters, primarily industry members, opposed the addition of a data retention

and deletion provision, stating that it was unnecessary, vague, and unduly prescriptive.299 These

commenters especially objected to the combination of the data retention and deletion provision

with the proposed expansion of the definition of personal information to include persistent

identifiers. They asserted that the proposed deletion requirement would require companies to

delete non-personally identifiable information, such as data used for website and marketing

analytics.300

The Commission chose the phrases “for only as long as is reasonably necessary” and

“reasonable measures” to avoid the very rigidity about which commenters opposing this

provision complain.301 Such terms permit operators to determine their own data retention needs

and data deletion capabilities, without the Commission dictating specific time-frames or data

destruction practices.302

While this new provision may require operators to give additional thought to notions of

data retention and deletion, it should not add significantly to operators’ burden. The existing

Rule already prohibits operators from conditioning a child’s participation in an activity on the

303 16 CFR 312.7.

304 16 CFR 312.8.

305 See 15 U.S.C. 6503.

306 See 2011 NPRM, 76 FR at 59822 (citing the 1999 Statement of Basis and Purpose, 64 FR at 59906).

child disclosing more personal information than is reasonably necessary to participate. 303

Operators also must establish and maintain reasonable procedures to protect the confidentiality,

security, and integrity of personal information collected from children.304 This new data

retention and deletion provision, Section 312.10, requires operators to anticipate the reasonable

lifetime of the personal information they collect from children, and apply the same concepts of

data security to its disposal as they are required to do with regard to its collection and

maintenance.

Therefore, the Commission adopts Section 312.10 as originally proposed, without

change from its 2011 proposal.

F. Section 312.11: Safe Harbors

The COPPA statute established a “safe harbor” for participants in Commission-approved

COPPA self-regulatory programs.305 As noted in the 2011 NPRM, with the safe harbor

provision, Congress intended to encourage industry members and other groups to develop their

own COPPA oversight programs, thereby promoting efficiency and flexibility in complying with

COPPA’s substantive provisions.306 COPPA’s safe harbor provision also was intended to reward

operators’ good faith efforts to comply with COPPA. The Rule therefore provides that operators

fully complying with an approved safe harbor program will be “deemed to be in compliance”

307 See 16 CFR 312.10(a) and (b)(4).

308 See 2011 NPRM, 76 FR at 59822-24.

309 CARU (comment 20, 2011 NPRM); Entertainment Software Rating Board (“ESRB”) (comment 48, 2011 NPRM); Privo (comment 132, 2011 NPRM); TRUSTe (comment 164, 2011 NPRM).

310 DMA (comment 37, 2011 NPRM); IAB (comment 73, 2011 NPRM); kidSAFE Seal Program (comment 81, 2011 NPRM).

311 See, e.g., CARU (comment 20, 2011 NPRM), at 2 (“In general, CARU believes that most of the proposed modifications will not only strengthen the safe harbor program, but will facilitate and enhance the Commission’s named goals of reliability, accountability, transparency and sustainability.”).

with the Rule for purposes of enforcement. In lieu of formal enforcement actions, such operators

instead are subject first to the safe harbor program’s own review and disciplinary procedures.307

In the 2011 NPRM, the Commission proposed several significant substantive changes to

the Rule’s safe harbor provision to strengthen the Commission’s oversight of participating safe

harbor programs. The proposed changes include a requirement that applicants seeking

Commission approval of self-regulatory guidelines submit comprehensive information about

their capability to run an effective safe harbor program. The changes also establish more

rigorous baseline oversight by Commission-approved safe harbor programs of their members. In

addition, the changes require Commission-approved safe harbor programs to submit periodic

reports to the Commission. The Commission also proposed certain structural and linguistic

changes to increase the clarity of the Rule’s safe harbor provision.308

The Commission received several comments regarding the proposed changes, including

comments from all four of the COPPA safe harbor programs the Commission had approved by

2011,309 as well as from several other industry associations.310 With the exception of a few areas

discussed below, commenters favorably viewed the Commission’s proposed revisions.311 First,

312 CARU (comment 20, 2011 NPRM), at 3; ESRB (comment 48, 2011 NPRM), at 2; kidSAFE Seal Program (comment 81, 2011 NPRM), at 20; TRUSTe (comment 164, 2011 NPRM), at 12.

313 See, e.g., kidSAFE Seal Program (comment 81, 2011 NPRM), at 20 (“KSP supports this change and believes more detailed information during the application process will give the FTC greater comfort regarding the operations of safe harbor programs”); see also CARU (comment 20, 2011 NPRM), at 3; ESRB (comment 48, 2011 NPRM), at 3; TRUSTe (comment 164, 2011 NPRM), at 13. One commenter sought assurance that such materials will be treated confidentially. kidSAFE Seal Program (comment 81, 2011 NPRM), at 20. Safe harbor applicants may designate materials as “confidential,” and the Commission will apply the same standards of confidentiality to such materials as it does to other voluntary submissions. See 15 U.S.C. 46(f) and 57b-2, and the Commission’s Rules of Practice 4.10-4.11, 16 CFR 4.10-4.11.

among commenters who mentioned them, there was uniform support for the proposed revised

criteria for approval of self-regulatory guidelines, which would mandate that (at a minimum)

safe harbor programs conduct annual, comprehensive reviews of each of their members’

information practices.312 Accordingly, the Commission retains paragraph (b)(2) (“Criteria for

approval of self-regulatory guidelines”) without change from its 2011 proposal.

In paragraph (c) (“Request for Commission approval of self-regulatory program

guidelines”), the Commission proposed requiring applicants to explain in detail their business

model and their technological capabilities and mechanisms for initial and continuing assessment

of subject operators’ fitness for membership in the safe harbor program. Again, commenters

who mentioned it uniformly supported this change.313 Accordingly, the Commission revises

paragraph (c) (“Request for Commission approval of self-regulatory program guidelines”)

without change from its 2011 proposal.

314 The proposed change would have required safe harbor programs to submit periodic reports – within one year after the revised Rule goes into effect and every eighteen months thereafter – of the results of the independent audits under revised paragraph (b)(2) and of any disciplinary actions taken against member operators. See 2011 NPRM, 76 FR at 59823.

315 See CARU (comment 20, 2011 NPRM), at 3 (“Much of the value of self-regulation is that issues can be handled quickly and effectively. The reporting of ‘any’ action taken against a website operator may have a chilling effect on website operators’ willingness to raise compliance issues themselves”); DMA (comment 37, 2011 NPRM), at 26 (“Based on feedback from our members, the DMA has reason to believe that this revision would decrease interest and participation in the safe harbor programs in contravention of the Commission’s goal of increasing safe harbor participation”); see also ESRB (comment 48, 2011 NPRM), at 4; IAB (comment 73, 2011 NPRM), at 14; kidSAFE Seal Program (comment 81, 2011 NPRM), at 20; Privo (comment 132, 2011 NPRM), at 8; TRUSTe (comment 164, 2011 NPRM), at 13.

The response to the 2011 proposal for periodic reporting by safe harbors to the

Commission (paragraph (d)) was more ambivalent.314 While commenters generally supported

stronger Commission oversight of safe harbor activities post-approval, they were concerned that

a requirement forcing safe harbors to “name names” of violative member operators would chill

the programs’ abilities to recruit and retain members, and generally would be counter to notions

of self-regulation. 315

The Commission continues to believe that there is great value in receiving regular reports

from its approved safe harbor programs. It is persuaded, however, that these reports need not

name the member operators who were subject to a safe harbor’s annual comprehensive review.

Rather, the Commission has revised paragraph (d) to permit safe harbors to submit a report to the

Commission containing an aggregated summary of the results of the independent assessments

conducted under paragraph (b)(2). In addition, to simplify matters, the Commission has changed

the required reporting period to an annual requirement rather than one occurring every eighteen

316 The kidSAFE Seal Program also sought to limit the Rule’s reporting requirements to “material” descriptions of disciplinary action taken against member operators (paragraph (d)(1)), “reasonable” Commission requests for additional information (paragraph (d)(2)), and “material” consumer complaints (paragraph (d)(3)). See kidSAFE Seal Program (comment 81, 2011 NPRM), at 21. The Commission believes that such limitations are unnecessary and that the wording of the requirements in revised paragraph (d) will not be overly burdensome for compliance by safe harbor programs.

months after the first annual report.316 Therefore, the Commission amends paragraph (d) of the

safe harbor provision so that it reads:

(d) Reporting and recordkeeping requirements. Approved safe harbor programs shall:

(1) By July 1, 2014, and annually thereafter, submit a report to the Commission containing, at a minimum, an aggregated summary of the results of the independent assessments conducted under paragraph (b)(2), a description of any disciplinary action taken against any subject operator under paragraph (b)(3), and a description of any approvals of member operators’ use of a parental consent mechanism, pursuant to § 312.5(b)(4);

(2) Promptly respond to Commission requests for additional information; and

(3) Maintain for a period not less than three years, and upon request make available to the Commission for inspection and copying:

(i) Consumer complaints alleging violations of the guidelines by subject operators;

(ii) Records of disciplinary actions taken against subject operators; and

(iii) Results of the independent assessments of subject operators’ compliance required under paragraph (b)(2).

317 5 U.S.C. 601-612.

318 See 5 U.S.C. 603-04.

319 See 5 U.S.C. 605.

III. Final Regulatory Flexibility Act Analysis

The Regulatory Flexibility Act of 1980 (“RFA”)317 requires a description and analysis of

proposed and final Rules that will have significant economic impact on a substantial number of

small entities. The RFA requires an agency to provide an Initial Regulatory Flexibility Analysis

(“IRFA”) with the proposed Rule, and a Final Regulatory Flexibility Analysis (“FRFA”), if any,

with the final Rule.318 The Commission is not required to make such analyses if a Rule would

not have such an economic effect.319 As described below, the Commission anticipates the final

Rule amendments will result in more websites and online services being subject to the Rule and

to the Rule’s disclosure and other compliance requirements. As discussed in Part IV.C, below,

the Commission believes that a high proportion of operators of websites and online services

potentially affected by these revisions are small entities as defined by the RFA.

As described in Part I.B above, in September 2011, the Commission issued a Notice of

Proposed Rulemaking setting forth proposed changes to the Commission’s COPPA Rule. The

Commission issued a Supplemental Notice of Proposed Rulemaking in August 2012 in which

the Commission proposed additional and alternative changes to the Rule. In both the 2011

NPRM and 2012 SNPRM, the Commission published IRFAs and requested public comment on

the impact on small businesses of its proposed Rule amendments. The Commission received

approximately 450 comments, combined, on the changes proposed in the 2011 NPRM and the

2012 SNPRM. Numerous comments expressed general concern that the proposed revisions

320 See, e.g., D. Russell-Pinson (comment 81, 2012 SNPRM), at 1; Ahmed Siddiqui (comment 83, 2012 SNPRM), at 1; Mindy Douglas (comment 29, 2012 SNPRM), at 1; Karen Robertson (comment 80, 2012 SNPRM), at 1; R. Newton (comment 118, 2011 NPRM), at 1.

321 See DMA (comment 37, 2011 NPRM), at 17; National Cable & Telecommunications Association (comment 113, 2011 NPRM), at 15-16.

would impose costs on businesses, including small businesses;320 few comments discussed the

specific types of costs that the proposed revisions might impose, or attempted to quantify the

costs or support their comments with empirical data.

In the 2011 NPRM and 2012 SNPRM, the Commission proposed modifications to the

Rule in the following five areas: Definitions, Notice, Parental Consent, Confidentiality and

Security of Children’s Personal Information, and Safe Harbor Programs. The Commission

proposed modifications to the definitions of operator, personal information, support for internal

operations, and website or online service directed to children. Among other things, the

proposed definition of personal information was revised to include persistent identifiers where

they are used for purposes other than support for internal operations, and to include screen and

user names where they function as online contact information. In addition, the Commission

proposed adding a new Section to the Rule regarding data retention and deletion.

The Commission shares the concern many commenters expressed that operators be

afforded enough time to implement changes necessary for them to comply with the final Rule

amendments.321 Accordingly, the final Rule will go into effect on July 1, 2013.

A. Need for and Objectives of the Final Rule Amendments

The objectives of the final Rule amendments are to update the Rule to ensure that

children’s online privacy continues to be protected, as directed by Congress, even as new online


technologies evolve, and to clarify existing obligations for operators under the Rule. The legal

basis for the final Rule amendments is the Children’s Online Privacy Protection Act, 15 U.S.C.

6501 et seq.

B. Significant Issues Raised by Public Comments, Summary of the Agency’s Assessment of These Issues, and Changes, if Any, Made in Response to Such Comments

In the IRFAs, the Commission sought comment regarding the impact of the proposed

COPPA Rule amendments and any alternatives the Commission should consider, with a specific

focus on the effect of the Rule on small entities. As discussed above, the Commission received

hundreds of comments in response to the rule amendments proposed in the NPRM and SNPRM.

The most significant issues raised by the public comments, including comments addressing the

impacts on small businesses, are set forth below. While the Commission received numerous

comments about the compliance burdens and costs of the rules, the Commission did not receive

much quantifiable information about the nature of the compliance burdens. The Commission

has taken the costs and burdens of compliance into consideration in adopting these amendments.

(1) Definitions

Definition of Collects or Collection

As described above in Part II.A.1.b., the Commission proposed amendments to the Rule

provision that allows sites and services to make interactive content available to children, without

providing parental notice and obtaining consent, if all personal information is deleted prior to

posting. The Commission proposed replacing this 100% deletion standard with a “reasonable

measures” standard to further enable sites and services to make interactive content available to

322 See, e.g., Application Developers Alliance (comment 5, 2012 SNPRM), at 3-5; Association for Competitive Technology (comment 7, 2012 SNPRM), at 3-5; Center for Democracy & Technology (“CDT”) (comment 15, 2012 SNPRM), at 4-5; DMA (comment 28, 2012 SNPRM), at 5, 17; J. Garrett (comment 38, 2012 SNPRM), at 1; L. Mattke (comment 63, 2012 SNPRM); S. Weiner (comment 97, 2012 SNPRM), at 1-2.

children, without providing parental notice and obtaining consent, thereby reducing burdens on

operators. Most comments favored the “reasonable measures” standard, and the Commission

has adopted it.

Definitions of Operator and Website or Online Service Directed to Children

As discussed above in Part II.A.4., the Commission’s proposed rule changes clarify the

responsibilities under COPPA when independent entities or third parties, e.g., advertising

networks or downloadable plug-ins, collect information from users through child-directed sites

and services. Under the proposed revisions, the child-directed content provider would be strictly

liable for personal information collected from its users by third parties. The Commission also

proposed imputing the child-directed nature of the content site to the entity collecting the

personal information if that entity knew or had reason to know that it was collecting personal

information through a child-directed site. Most of the comments opposed the Commission’s

proposed modifications. Some of these commenters asserted that the proposed revisions would

impracticably subject new entities to the Rule and its compliance costs.322

With some modifications to the proposed Rule language, the Commission has retained

the proposed strict liability standard for child-directed content providers that allow third parties

to collect personal information from users of the child-directed sites, as discussed in Part

II.A.5.b. The Commission recognizes the potential burden that strict liability places on child-

directed content providers, particularly small app developers, but believes that the potential


burden will be eased by the changes to the definitions of persistent identifier and support for

internal operations adopted in the Final Rule, as well as the exception to notice and parental

consent – § 312.5(c)(7) – where an operator collects only a persistent identifier, used solely to support

its internal operations. Further, in light of the comments received, the Commission revised the

language proposed in the 2012 SNPRM to clarify that the language describing “on whose

behalf” does not encompass platforms, such as Google Play or the App Store, that offer access to

someone else’s child-directed content. Also in light of the comments received, the Commission

deemed third-party plug-ins to be co-operators only where they have actual knowledge that they

are collecting personal information from users of a child-directed site. This change will likely

substantially reduce the number of operators of third-party plug-ins, many of whom are small

businesses, who must comply with the Rule in comparison to the proposal in the 2012 SNPRM.

In response to comments requesting it, the Commission is also providing guidance in Part

II.A.4.b. above as to when it believes this “actual knowledge” standard will likely be met.

Definition of Online Contact Information

The Commission proposed clarifications to the definition of online contact information to

flag that the term broadly covers all identifiers that permit direct contact with a person online

and to ensure consistency between the definition of online contact information and the use of

that term within the definition of personal information. The proposed revised definition

identified commonly used online identifiers, including email addresses, instant messaging

(“IM”) user identifiers, voice over Internet protocol (“VOIP”) identifiers, and video chat user

identifiers, while also clarifying that the list of identifiers was non-exhaustive. This amendment,

which serves to clarify the definition, should not increase operators’ burden.


Definition of Personal Information

a. Screen or User Names

As described above, the Commission in the 2011 NPRM proposed modifications to the

inclusion of screen names in the definition of personal information. Numerous commenters

expressed concern that the Commission’s screen-name proposal would unnecessarily inhibit

functions that are important to the operation of child-directed websites and online services. In

response to this concern, the 2012 SNPRM proposed covering screen names as personal

information only in those instances in which a screen or user name rises to the level of online

contact information. As discussed in Part II.A.5.a., the Commission has adopted the proposal in

the SNPRM. The revision permits operators to use anonymous screen and user names in place

of individually identifiable information, including use for content personalization, filtered chat,

for public display on a website or online service, or for operator-to-user communication via the

screen or user name. Moreover, the definition does not reach single log-in identifiers that permit

children to transition between devices or access related properties across multiple platforms.

Thus, the provision for screen or user names does not create any additional compliance burden

for operators.

b. Persistent Identifiers and Support for Internal Operations

In the 2011 NPRM, and again in the 2012 SNPRM, the Commission proposed

broadening the definition of personal information to include persistent identifiers, except where

used to support the internal operations of the site or service. Numerous commenters opposed the

inclusion of persistent identifiers, while others sought to broaden the definition of support for

internal operations to allow for more covered uses of persistent identifiers. Some commenters

323 Facebook (comment 33, 2012 SNPRM), at 9-10; Google (comment 41, 2012 SNPRM), at 5; J. Holmes (comment 47, 2012 SNPRM).

maintained that, to comply with COPPA’s notice and consent requirements in the context of

persistent identifiers, sites would be burdened to collect more personal information on their

users, which is also contrary to COPPA’s goals of data minimization.323 As set forth in Part

II.A.5.b, the Commission believes that persistent identifiers permit the online contacting of a

specific individual and thus are personal information. However, the Commission recognizes that

including persistent identifiers within the definition of personal information may impose a

burden on some operators to provide notice to parents and obtain consent under circumstances

where they previously had no COPPA obligation. The Commission also recognizes that

persistent identifiers are used for a host of functions that are unrelated to contacting a specific

individual and fundamental to the smooth functioning of the Internet, the quality of the site or

service, and the individual user’s experience. Thus, the final Rule further restricts the proposed

definition of persistent identifiers to “a persistent identifier that can be used to recognize a user

over time and across different websites or online services, where such persistent identifier is

used for functions other than or in addition to support for the internal operations of the website

or online service.” (Emphasis added.) The final Rule also modifies the definition of support for

internal operations to broaden the list of activities covered within this category. As a result of

these modifications, fewer uses of persistent identifiers will be covered in the Final Rule than in

the proposals, thereby resulting in fewer operators being subject to the final Rule amendments.

[Footnote 324] See National Cable & Telecommunications Association (comment 113, 2011 NPRM), at 16; Wired Trust (comment 177, 2011 NPRM), at 10; Toy Industry Association (comment 163, 2011 NPRM), at 14; Privo (comment 132, 2011 NPRM), at 7; see also Center for Democracy and Technology (comment 17, 2011 NPRM), at 7-8.

c. Photographs, Videos, and Audio Files

In the 2011 NPRM, the Commission proposed creating a new category within the

definition of personal information covering a photograph, video, or audio file where such file

contains a child’s image or voice. Some commenters supported this proposal; others were

critical. The latter claimed that the proposal’s effect would limit children’s participation in

online activities involving “user-generated content,” that photos, videos, and/or audio files, in

and of themselves, do not permit operators to locate or contact a child, or that the Commission’s

proposal is premature.[324] The Commission determined, as discussed in Part II.A.5.c, that such

files meet the standard for "personal information" set forth in the COPPA statute. While the Commission recognizes that defining personal information to include photos, videos, and/or audio files may affect a limited number of operators, this change is warranted given the inherently personal nature of this content.

d. Geolocation Information

In the 2011 NPRM, the Commission stated that, in its view, existing paragraph (b) of the

definition of personal information already covered any geolocation information that provides

precise enough information to identify the name of a street and city or town. To make this clear,

the Commission has made geolocation information a stand-alone category within the definition

of personal information. Thus, this amendment should impose little or no additional burden on

operators.


Definition of Website or Online Service Directed to Children

In the 2012 SNPRM, the Commission proposed revising the definition of website or

online service directed to children to allow a subset of sites falling within that category an option

not to treat all users as children. However, several commenters expressed concern and confusion

that the proposed amendment would expand COPPA’s reach to sites or services not previously

covered under the definition of website directed to children, and thus would be likely to impose

COPPA’s burdens on operators not previously covered by the Rule. The Commission has

clarified in Part II.A.7 that it did not intend to expand the reach of the Rule to additional sites

and services through the proposed revision, but rather to create a new compliance option for a

subset of websites and online services already considered directed to children under the Rule’s

totality of the circumstances standard. The Commission also clarified when a child-directed site

would be permitted to age-screen to differentiate among users, thereby providing further

guidance to businesses. This amendment will ease compliance burdens on operators of sites or

services that qualify to age-screen their visitors. In addition, the Commission has made further

clarifying edits to the definition of website or online service directed to children to incorporate

the “actual knowledge” standard for plug-ins or ad networks, as discussed above.

(2) Section 312.4: Notice

Direct Notice to a Parent

The Commission proposed refining the Rule requirements for the direct notice to ensure

a more effective “just-in-time” message to parents about an operator’s information practices.

Commenters generally supported the Commission’s proposed changes as providing greater

clarity and simplicity to otherwise difficult-to-understand statements. The Commission adopted


the proposed modification but, in light of suggestions in the comments, reorganized the

paragraphs to provide better flow and clearer guidance for operators.

Notice on the Website or Online Service

The Commission proposed to change the Rule’s online notice provision to require all

operators collecting, using, or disclosing information on a website or online service to provide

contact information, including, at a minimum, the operator’s name, physical address, telephone

number, and email address. This proposal marked a change from the existing Rule’s “single

operator designee” proviso that such operators could designate one operator to serve as the point

of contact. Almost all commenters who spoke to the issue opposed mandating that the online

notice list all operators. Among the varied reasons cited in opposition to this change was the

potential burden on operators. After considering the comments, the Commission has determined

to retain the Rule’s “single operator designee” proviso.

(3) Section 312.5: Parental Consent

Based on input the Commission received at its June 2, 2010 COPPA roundtable and

comments to the 2010 FRN, in the 2011 NPRM the Commission proposed several significant

changes to the mechanisms of verifiable parental consent set forth in paragraph (b) of § 312.5.

These included recognizing electronic scans of signed consent forms, video conferencing,

government-issued ID, and a credit card in connection with a monetary transaction as additional

mechanisms for operators to obtain parental consent. In response to comments, the Commission

also adopted amendments to allow the use of other payment systems, in addition to credit cards,

in connection with a monetary transaction as verifiable parental consent, provided that any such


payment system notifies the primary account holder of each discrete transaction. These changes

provide operators with further flexibility in complying with the Rule.

The Commission also proposed eliminating the sliding scale (“email plus”) approach to

parental consent for operators collecting personal information only for internal use. As

discussed in Part II.C.7, most commenters urged the Commission to retain email plus, in part

because they asserted it is more affordable and less burdensome for operators to use than other

approved methods for obtaining consent. Persuaded by the weight of the comments, the

Commission retained email plus as an acceptable consent method for internal use of personal

information, thereby providing operators with the choice of a mechanism many deem useful and

affordable.

Finally, the Commission also added two new voluntary processes for evaluation and pre-

clearance of parental consent mechanisms: use of an FTC preapproval process and use of a safe

harbor program for such purpose. The availability of these voluntary pre-clearance mechanisms

may provide benefits to participating operators in reducing the burden associated with the start-

up of a new COPPA compliance mechanism.

(4) Section 312.8: Confidentiality, Security, and Integrity of Personal Information Collected from Children

In 2011, the Commission proposed amending § 312.8 of the Rule to require that

operators take reasonable measures to ensure that any service provider or third party to whom

they release children’s personal information has in place reasonable procedures to protect the

confidentiality, security, and integrity of such personal information. Although many

commenters supported this proposal, some raised concerns about the language “reasonable

[Footnote 325] See, e.g., DMA (comment 37, 2011 NPRM), at 27; Toy Industry Association (comment 163, 2011 NPRM), at 16-17.

measures” and “ensure.” Other commenters opposed the requirement as unduly onerous on

small businesses. The Commission found merit in the concerns expressed about the difficulty

operators may face in “ensuring” that any service provider or third party has in place reasonable

confidentiality and security procedures. Thus, the Commission has lessened the burden on

operators that would have been imposed by the earlier proposal by requiring operators to take

reasonable steps to release personal information only to service providers and third parties

capable of maintaining it securely.

(5) Section 312.10: Data Retention and Deletion Requirements

The Commission also has added a data retention and deletion provision (new Section

312.10) to the Rule to require operators to anticipate the reasonable lifetime of the personal

information they collect from children, and apply the same concepts of data security to its

disposal as they are required to do with regard to its collection and maintenance. While several

commenters supported this provision, others objected to it as unnecessary, vague, or unduly prescriptive.[325] These commenters especially objected to the burden imposed by the

combination of the data retention and deletion provision with the proposed expansion of the

definition of personal information to include persistent identifiers. The Commission believes

these concerns are not warranted in light of the language of the final Rule amendments, and that

this requirement should not add significantly to operators’ burdens.


(6) Section 312.11: Safe Harbors

The Commission proposed changing the Rule’s safe harbor provision to strengthen the

Commission’s oversight of participating safe harbor programs. Among other things, the

Commission proposed requiring those programs to submit periodic reports to the Commission.

Commenters generally viewed the proposed revisions favorably, but expressed concern that the

proposed language requiring safe harbors to name violative member operators would chill

participation in the programs. Heeding these concerns, the Commission will not require regular

reports from approved safe harbor programs to name the member operators who were subject to

a safe harbor’s annual comprehensive review. The final Rule amendments instead will require

safe harbor programs to submit an aggregated summary of the results of the annual,

comprehensive reviews of each of their members’ information practices. These amendments

ensure the effectiveness of the safe harbor programs upon which numerous operators rely for

assistance in their compliance with COPPA.

C. Description and Estimate of the Number of Small Entities Subject to the Final Rule or Explanation Why No Estimate Is Available

The revised definitions in the Final Rule will affect operators of websites and online

services directed to children, as well as those operators that have actual knowledge that they are

collecting personal information from children. The Final Rule amendments will impose costs on

entities that are “operators” under the Rule. The Commission staff is unaware of any

comprehensive empirical evidence concerning the number of operators subject to the Rule.

However, based on the public comments received and the modifications adopted here, the

Commission staff estimates that approximately 2,910 existing operators may be subject to the

[Footnote 326] See U.S. Small Business Administration Table of Small Business Size Standards Matched to North American Industry Classification System Codes, available at http://www.sba.gov/sites/default/files/files/Size_Standards_Table.pdf.

[Footnote 327] Association for Competitive Technology (comment 7, 2012 SNPRM), at 2 (ACT's research "found that 87% of educational apps are created by companies qualifying as 'small' by SBA guidelines"). ACT gave only limited information about how it calculated this figure.

Rule’s requirements and that there will be approximately 280 new operators per year for a

prospective three-year period.

Under the Small Business Size Standards issued by the Small Business Administration,

“Internet publishing and broadcasting and web search portals” qualify as small businesses if they

have fewer than 500 employees.[326] Consistent with the estimate set forth in the 2012 SNPRM,

Commission staff estimates that approximately 85-90% of operators potentially subject to the

Rule qualify as small entities. The Commission staff bases this estimate on its experience in this

area, which includes its law enforcement activities, discussions with industry members, privacy

professionals, and advocates, and oversight of COPPA safe harbor programs. This estimate is

also consistent with the sole comment that attempted to quantify how many operators are small

entities.[327]

D. Description of the Projected Reporting, Recordkeeping, and Other Compliance Requirements of the Final Rule Amendments, Including an Estimate of the Classes of Small Entities Which Will Be Subject to the Rule and the Type of Professional Skills That Will Be Necessary to Comply

The final Rule amendments will likely increase certain disclosure and other compliance

requirements for covered operators. In particular, the requirement that the direct notice to

parents include more specific details about an operator’s information collection practices,

pursuant to a revised § 312.4 (Notice), would impose a one-time cost on operators. The addition


of language in § 312.8 (confidentiality, security, and integrity of personal information collected

from children) will require operators to “take reasonable steps” to release children’s personal

information only to third parties capable of maintaining its confidentiality, security, and

integrity, and who provide assurances that they will do so. The final Rule amendments contain

additional reporting requirements for entities voluntarily seeking approval to be a COPPA safe

harbor self-regulatory program, and additional compliance requirements for all Commission-

approved safe harbor programs. Each of these improvements to the Rule may entail some added

cost burden to operators, including those that qualify as small entities, but the Commission has

considered these burdens and responded to commenters as described in Part III.C., above.

The revisions to the Rule’s definitions will also likely increase the number of operators

subject to the final Rule amendments’ disclosure and other compliance requirements. In

particular, the revised definition of operator will cover additional child-directed websites and

online services that choose to integrate plug-ins or advertising networks that collect personal

information from visitors. Similarly, the addition of paragraph (b) to the definition of website or

online service directed to children, which clarifies that the Rule covers a website or online

service that has actual knowledge that it is collecting personal information directly from users of

a website or online service directed to children, will potentially cover additional websites and

online services. These amendments may entail some added cost burden to operators, including

those that qualify as small entities; however, as described above, other final Rule amendments

will ease the burdens on operators and facilitate compliance.

The estimated burden imposed by these modifications to the Rule’s definitions is

discussed in the Paperwork Reduction Act section of this document, and there should be no


difference in that burden as applied to small businesses. While the Rule’s compliance

obligations apply equally to all entities subject to the Rule, it is unclear whether the economic

burden on small entities will be the same as or greater than the burden on other entities. That

determination would depend upon a particular entity’s compliance costs, some of which may be

largely fixed for all entities (e.g., website programming) and others of which may be variable (e.g.,

choosing to operate a family friendly website or online service), and the entity’s income or profit

from operation of the website or online service (e.g., membership fees) or from related sources

(e.g., revenue from marketing to children through the site or service). As explained in the

Paperwork Reduction Act section, in order to comply with the Rule’s requirements, operators

will require the professional skills of legal (lawyers or similar professionals) and technical (e.g.,

computer programmers) personnel. As explained earlier, the Commission staff estimates that

there are approximately 2,910 website or online services that would qualify as operators under

the final Rule amendments, that there will be approximately 280 new operators per year for a

three-year period, and that approximately 85-90% of all such operators would qualify as small

entities under the SBA’s Small Business Size standards.

E. Steps the Agency Has Taken to Minimize Any Significant Economic Impact on Small Entities, Consistent with the Stated Objectives of the Applicable Statute

In drafting the amendments to the Rule’s definitions, the Commission has attempted to

avoid unduly burdensome requirements for all entities, including small businesses. The

Commission believes that the final Rule amendments will advance the goal of children’s online

privacy in accordance with COPPA. For each of the modifications, the Commission has taken

[Footnote 328] See, e.g., United States v. RockYou, Inc., No. 3:12-cv-01487-SI (N.D. Cal., entered Mar. 27, 2012); United States v. Godwin, No. 1:11-cv-03846-JOF (N.D. Ga., entered Feb. 1, 2012); United States v. W3 Innovations, LLC, No. CV-11-03958 (N.D. Cal., filed Aug. 12, 2011); United States v. Industrious Kid, Inc., No. CV-08-0639 (N.D. Cal., filed Jan. 28, 2008); United States v. Xanga.com, Inc., No. 06-CIV-6853 (S.D.N.Y., entered Sept. 11, 2006); United States v. Bonzi Software, Inc., No. CV-04-1048 (C.D. Cal., filed Feb. 17, 2004); United States v. Looksmart, Ltd., No. 01-605-A (E.D. Va., filed Apr. 18, 2001); United States v. Bigmailbox.Com, Inc., No. 01-606-B (E.D. Va., filed Apr. 18, 2001).

into account the concerns evidenced by the record. On balance, the Commission believes that

the benefits to children and their parents outweigh the costs of implementation to industry.

The Commission has considered, but has decided not to propose, an exemption for small

businesses. The primary purpose of COPPA is to protect children’s online privacy by requiring

verifiable parental consent before an operator collects personal information. The record and the

Commission’s enforcement experience have shown that the threats to children’s privacy are just

as great, if not greater, from small businesses or even individuals than from large businesses.[328]

Accordingly, an exemption for small businesses would undermine the very purpose of the statute

and Rule.

Nonetheless, the Commission has taken care in developing the final Rule amendments to

set performance standards that regulated entities must achieve, but provide them with the

flexibility to select the most appropriate, cost-effective technologies to achieve COPPA's objectives. For example, the Commission has retained the standard that verifiable parental

consent may be obtained via any means reasonably calculated, in light of available technology,

to ensure that the person providing consent is the child’s parent. The new requirements for

maintaining the security of children’s personal information and deleting such information when

no longer needed do not mandate any specific means to accomplish those objectives. The

Commission has adopted the “reasonable measures” standard enabling operators to use


competent filtering technologies to prevent children from publicly disclosing personal

information, which the Commission believes will make it easier for operators to avoid the

collection of children’s personal information. The new definition of support for internal

operations is intended to provide operators with the flexibility to collect and use personal

information for purposes consistent with ordinary operation, enhancement, or security measures.

Moreover, the changes to website or online service directed to children should provide greater

flexibility to “family friendly” sites and services in developing mechanisms to provide the

COPPA protections to child visitors.

IV. Paperwork Reduction Act

The existing Rule contains recordkeeping, disclosure, and reporting requirements that

constitute “information collection requirements” as defined by 5 CFR 1320.3(c) under the OMB

regulations that implement the Paperwork Reduction Act (“PRA”), as amended, 44 U.S.C. 3501

et seq. OMB has approved the Rule’s existing information collection requirements through July

31, 2014. In accordance with the PRA, the Commission is seeking OMB approval of the final

Rule amendments under OMB Control No. 3084-0117. The disclosure, recordkeeping, and

reporting requirements under the final Rule amendments discussed above constitute “collections

of information” for purposes of the PRA.

Upon publication of the 2011 NPRM and the 2012 SNPRM, the FTC submitted the

proposed Rule amendments and a Supporting Statement to OMB. In response, OMB filed

comments (dated October 27, 2011 and August 10, 2012) indicating that it was withholding

approval pending the Commission’s examination of the public comments in response to the 2011

NPRM and 2012 SNPRM. The remainder of this section sets forth a revised PRA analysis,

[Footnote 329] 44 U.S.C. 3502(11). In determining whether information will have "practical utility," OMB will consider "whether the agency demonstrates actual timely use for the information either to carry out its functions or make it available to third-parties or the public, either directly or by means of a third-party or public posting, notification, labeling, or similar disclosure requirement, for the use of persons who have an interest in entities or transactions over which the agency has jurisdiction." 5 CFR 1320.3(l).

factoring in relevant public comments and the Commission’s resulting or self-initiated changes

to the proposed Rule.

A. Practical Utility

According to the PRA, “practical utility” is “the ability of an agency to use information,

particularly the capability to process such information in a timely and useful fashion."[329] The

Commission has maximized the practical utility of the new disclosure (notice) and reporting

requirements contained in the final Rule amendments, consistent with the requirements of

COPPA.

(1) Disclosure Requirements

The final Rule amendments to Section 312.4(c) more clearly articulate the specific

information that operators’ direct notices to parents must include about their information

collection and use practices. The succinct, “just-in-time” notices will present key information to

parents to better enable them to determine whether to permit their children to provide personal

information online, seek access from a website or online service operator to review their

children’s personal information, and object to any further collection, maintenance, or use of such

information. The final Rule amendments to the definitions of operator and website or online

service directed to children in Section 312.2 will better ensure that parents are provided notice

when a child-directed site or service chooses to integrate into its property other services that

[Footnote 330] 2011 NPRM, 76 FR at 59815.

[Footnote 331] See id.

collect visitors’ personal information. For example, the final Rule amendment to the definition

of operator clarifies that child-directed websites that do not collect personal information from

users, but that employ downloadable software plug-ins or permit other entities, such as

advertising networks, to collect personal information directly from their users, are covered

operators with responsibility for providing parental notice and obtaining consent. Additionally,

the changes to the definition of website or online service directed to children, among other

things, will clarify that the Rule covers a plug-in or ad network where it has actual knowledge

that it is collecting personal information directly from users of a child-directed website or online

service.

To avoid obscuring the most meaningful, material information for consumers, however,

the Commission removed a previously proposed requirement, set forth in the 2011 NPRM, that

all operators collecting, using, or disclosing information on a website or online service must

provide contact information.[330] The Commission retained the existing Rule's proviso that such

operators could designate one operator to serve as the point of contact. For the same reason, the

Commission has streamlined the Rule’s online notice requirement to require a simple statement

of: (1) what information the operator collects from children, including whether the website or

online service enables a child to make personal information publicly available; (2) how the

operator uses such information; and (3) the operator’s disclosure practices for such

information.[331] As a part of this revision, the Commission also removed the required statement

[Footnote 332] See id.

that the operator may not condition a child’s participation in an activity on the child’s disclosure

of more personal information than is reasonably necessary to participate in such activity.[332]

(2) Reporting Requirements

As stated above, the Commission believes that there is great value in receiving annual

reports from its approved safe harbor programs. Obtaining this information (in addition to the

Commission’s right to access program records) will better ensure that all safe harbor programs

keep sufficient records and that the Commission is routinely apprised of key information about

the safe harbors’ programs and membership oversight. Further, requiring annual reports to

include a description of any safe harbor approvals of new parental consent mechanisms will

inform the Commission of the emergence of new feasible parental consent mechanisms for

operators. Additionally, the final Rule amendments impose more stringent requirements for safe

harbor applicants’ submissions to the Commission to better ensure that applicants are capable of

administering effective safe harbor programs.

Thus, given the justifications stated above for the amended disclosure and reporting

requirements, the final Rule amendments will have significant practical utility.

[Footnote 333] Id. at 59826.

B. Explanation of Estimated Incremental Burden Under the Final Rule Amendments

1. Disclosure: 69,000 hours (for new and existing operators, combined)

2. Reporting: 720 hours (one-time burden, annualized, and recurring)

3. Labor Costs: $21,508,900

4. Non-Labor/Capital Costs: $0

Estimating PRA burden of the final Rule amendments’ requirements depends on various

factors, including the number of firms operating websites or online services directed to children

or having actual knowledge that they are collecting or maintaining personal information from

children, and the number of such firms that collect persistent identifiers for something other than

support for the internal operations of their websites or online services.

In its 2011 NPRM PRA analysis, FTC staff estimated that there were then approximately

2,000 operators subject to the Rule. Staff additionally stated its belief that the number of

operators subject to the Rule would not change significantly as a result of the revision to the definition of personal information proposed in the 2011 NPRM.[333] Staff believed that

altering that definition would potentially increase the number of operators, but that the increase

would be offset by other proposed modifications. These offsets included provisions allowing the

use of persistent identifiers to support the internal operations of a website or online service, and

permitting the use of “reasonable measures,” such as automated filtering, to strip out personal

information before posting children’s content in interactive venues. The 2011 NPRM PRA

analysis also assumed that some operators of websites or online services will adjust their

information collection practices so that they will not be collecting personal information from

[Footnote 334] Id.

[Footnote 335] Id.

[Footnote 336] Under the PRA, agencies may seek from OMB a maximum three-year clearance for a collection of information. 44 U.S.C. 3507(g).

[Footnote 337] Likewise, no comments were received in response to the February 9, 2011 and May 31, 2011 Federal Register notices (76 FR 7211 and 76 FR 31334, respectively, available at http://www.gpo.gov/fdsys/pkg/FR-2011-02-09/pdf/2011-2904.pdf and http://www.gpo.gov/fdsys/pkg/FR-2011-05-31/pdf/2011-13357.pdf) seeking comment on the information requirements associated with the existing COPPA Rule and the FTC burden estimates for them. These notices included the Commission staff estimate that roughly 100 new web entrants each year will fall within the Rule's coverage.

[Footnote 338] 2011 NPRM, 76 FR at 59826; accord 76 FR 7211 at 7213 and 76 FR at 31335.

[Footnote 339] 2012 SNPRM, 77 FR at 46650.

children.[334] In the 2011 NPRM PRA analysis, staff estimated that approximately 100 new

operators per year[335] (over a prospective three-year OMB clearance[336]) of websites or online

services would likely be covered by the Rule through the proposed modifications. No comments

filed in response to the 2011 NPRM took direct issue with these estimates.[337] Commission staff

also estimated that no more than one safe harbor applicant will submit a request within the next

three years, and this estimate has not been contested.[338]

In its 2012 SNPRM PRA analysis, staff stated that the proposed modifications to the

Rule would change the definitions of operator and website or online service directed to children,

potentially increasing the number of operators subject to the Rule. Staff added, however, that

the proposed amendments to the definitions of support for internal operations and website or

online service directed to children should offset some of the effects of these other definitional expansions.[339] The 2012 SNPRM PRA analysis also assumed that some operators of websites or

online services would adjust their information collection practices so that they would not be

[Footnote 340] Id.

[Footnote 341] Id.

[Footnote 342] Id.

[Footnote 343] Commenter Association for Competitive Technology therefore is mistaken in asserting that the "FTC has estimated 500 existing education app makers will be affected by the proposed rule, and an additional 125 newly affected entities each successive year." Association for Competitive Technology (comment 7, 2012 SNPRM), at 2. The Commission's previous PRA analyses did not specifically estimate numbers of "education app makers," and the commenter did not account for the Commission's 2011 NPRM estimate of 2,000 existing entities.

[Footnote 344] Under the existing OMB clearance for the pre-amended Rule, however, the FTC had already accounted for an estimated 100 new operators each requiring approximately 60 hours to comply with the Rule. See 76 FR at 7211, 7212 (Feb. 9, 2011); 76 FR at 31334, 31335 (May 31, 2011). Thus, to avoid double-counting what has already been submitted to OMB and cleared, the ensuing calculations for new operators' disclosure burden account strictly for the difference between the revised population estimate (280) and the currently cleared estimate (100), i.e., 180 additional new operators.

collecting personal information from children.[340] Based on those assumptions, FTC staff

estimated that, in addition to the 2,000 existing operators already covered by the Rule (per the

2011 NPRM PRA analysis), there would be approximately 500 existing operators of websites or

online services likely to be newly covered due to the proposed modifications.[341] Staff also

estimated that 125 additional new operators per year (over a prospective three-year clearance)

would be covered by the Rule through the proposed modifications. That was incremental to the

previously cleared FTC estimate of 100 new operators per year for the then existing Rule.[342] The

FTC’s 2011 NPRM and 2012 SNPRM analyses thus cumulatively accounted for an estimated

2,500 existing operators and 225 new operators each year that would be subject to the proposed

Rule amendments.[343]

Given the public comments received, the Commission now estimates, as detailed further

below, that the final Rule amendments will cover 2,910 existing operators of websites or online

services and 280 new operators per year.[344] These groups of covered operators would generally

[Footnote 345] Association for Competitive Technology (comment 7, 2012 SNPRM), at 2-3; S. Weiner (comment 97, 2012 SNPRM), at 1-2; J. Garrett (comment 38, 2012 SNPRM), at 1; see also DMA (comment 28, 2012 SNPRM), at 17.

[Footnote 346] Association for Competitive Technology (comment 7, 2012 SNPRM), at 2.

[Footnote 347] Id. ("Unlike the game sector, where one developer may have several applications in the top 100, Educational Apps tended to be much closer to a one-to-one ratio between app and creator at 1.54 apps per developer.").

[Footnote 348] Id. ACT's comment does not describe the methodology it used to categorize apps as being directed to children under 13.

[Footnote 349] Id. at 2-3.

consist of certain traditional website operators, mobile app developers, plug-in developers, and

advertising networks.

Existing Operators

The Commission received several comments directed to its estimates of the number of

existing operators, all of which assert that the Commission significantly underestimated these

numbers.[345] The Association for Competitive Technology ("ACT") cited data showing that as of

September 2012, there were approximately 74,000 “education” apps in the iTunes App Store,

and 30,000 in the Android market.[346] Based on its review of "top" apps, ACT calculated a ratio

of 1.54 apps per developer of "education" apps in the iTunes App Store,[347] and that

approximately 60% of apps in this category were directed to children under 13.[348] Based on this

information, ACT calculated that approximately 28,800 app developers would be “potentially

affected” by the proposed modifications to the Rule set forth in the 2011 NPRM and 2012

SNPRM.[349] One commenter, the moderator of an online group called "Parents With Apps,"

[Footnote 350] S. Weiner (comment 97, 2012 SNPRM), at 1-2.

[Footnote 351] J. Garrett (comment 38, 2012 SNPRM), at 1.

[Footnote 352] "App Store Metrics," 148Apps.biz (accessed Nov. 14, 2012), available at http://148apps.biz/app-store-metrics; "Android Statistic Top Categories," AppBrain (accessed Nov. 15, 2012), available at http://www.appbrain.com/stats/android-market-app-categories.

[Footnote 353] Although there are other mobile app platforms and distribution channels, the Commission believes that the education, games, and entertainment categories in the iTunes App Store and the Google Play store adequately approximate the relevant universe of unique mobile app developers whose apps may be directed to children under 13.

stated that the group has more than 1,400 small developers of family-friendly apps as

members.[350] Another commenter stated that the Silicon Valley Apps for Kids Meetup group had

“well over 500 members” as of September 2012, and that “the kids app market is incredibly

vibrant with thousands of developers, over 500 of which” are group members.351

Per the industry information source cited by ACT, the Commission believes that as of

November 2012, there were approximately 75,000 education apps in the iTunes App Store and

approximately 33,000 education apps in the Android market.[352] ACT's comment appears to

suggest that it would be reasonable for the Commission to base its PRA estimate of the number

of existing operators subject to the final Rule amendments on the number of “Education” app

developers. The Commission agrees that developer activity in the “Education” category, to the

extent it can be discerned through publicly available information, is a useful starting point for

estimating the number of mobile app developers whose activities may bring them within

coverage of the final Rule amendments. As discussed below, the Commission also looks to

information about “Education” apps in the Google Play store, and apps in the game and

entertainment categories in both the iTunes App Store and Google Play, as a basis for its

estimates for this PRA analysis.[353]

[Footnote 354] In estimating this percentage (and similar percentages throughout this section) for purposes of the PRA analysis, the Commission's staff attempted to err on the side of inclusion to count any apps that were likely to be used by children, whether independently or with parents' assistance. To ensure a generous accounting of operators potentially subject to the Rule, this estimate included, for example, even toddler apps unlikely to be used by children themselves without direct parental assistance.

[Footnote 355] See Mobile Apps for Kids II Report, at 9-10, supra note 189.

Similar to what appears to have been ACT’s methodology, Commission staff reviewed a

list, generated using the desktop version of iTunes, of the Top 200 Paid and Top 200 Free

“Education” apps in the iTunes App Store as of early November 2012. Based on the titles and a

prima facie review of the apps’ descriptions, staff believes that approximately 56% of them may

be directed to children under 13.[354] Averaging this figure and ACT's 60% calculation, FTC staff

estimates that 58% of “Education” Apps in the iTunes App Store may be directed to children

under 13, meaning that 43,500 of those 75,000 “Education” apps may be directed to children

under 13. To determine a ratio for the Education apps for the Android platform, Commission

staff reviewed listings of the Top 216 Paid and Top 216 Free “Education” apps in the Google

Play store as of mid-November 2012. Staff believes that approximately 42% of them may be

directed to children under 13; 42% of 33,000 apps yields 13,860 apps that may be directed to

children under 13. Adding these projected totals together yields 57,360 such apps for both

platforms, combined.

It is unreasonable to assume, however, that all apps directed to children under 13 collect personal information from children, or that no developers collect persistent identifiers solely in support of their internal operations. Data from the Mobile Apps for Kids II Report indicate that

about 59% of the apps surveyed transmit device identification or other persistent identifiers to their developers.[355] However, it is not clear how many of those app developers would be using

[Footnote 356] See L. Akemann (comment 2, 2012 SNPRM), at 1; DMA (comment 37, 2011 NPRM), at 7, 14; Scholastic (comment 144, 2011 NPRM), at 13-14; TRUSTe (comment 164, 2011 NPRM), at 5.

[Footnote 357] See Mobile Apps for Kids II Report, at 5-6, 10, supra note 189 (14 of 400 apps tested transmitted the mobile device's geolocation or phone number). These apps also transmitted device identification.

those persistent identifiers in a way that would fall within the final Rule’s amended definition of

personal information. Indeed, the Commission believes, based on the comments received, that

many developers would use such persistent identifiers to support internal operations as defined

in the final Rule amendments and not for other purposes, such as behavioral advertising directed

to children.[356] Furthermore, the Commission believes that some mobile app developers, like

some other operators of websites or online services, will adjust their information collection

practices so that they will not be collecting personal information from children. The data in the

staff report do suggest, however, that approximately 3.5% of apps directed to children under 13

could be collecting location information or a device’s phone number, thus making their

developers more likely to be covered by the final Rule amendments.[357] The Commission

believes it is reasonable to assume that an additional 1.5% of those apps could be collecting

other personal information, including transmitting persistent identifiers to developers (or their

partners) to use in ways that implicate COPPA. This results in an estimate of 5% of apps that

may be directed to children under 13, i.e., approximately 2,870 apps, that operate in ways that

implicate the final Rule amendments.

[Footnote 358] The Commission believes it is reasonable to assume, as ACT appears to, that developers responsible for multiple apps directed to children under 13 will typically have a single set of privacy practices, a single privacy policy to describe them, and will develop a single method of disclosing the information required by the final Rule amendments. Any marginal increase in developer burdens addressed in this PRA analysis arising from developers publishing additional apps is therefore not likely to be significant.

[Footnote 359] This appears to be a larger universe of data than ACT consulted in generating its education-apps-to-developer ratio of 1.54. See Association for Competitive Technology (comment 7, 2012 SNPRM), at 2. Data from the industry source ACT cites indicate a more general apps-to-developer ratio of approximately 3.8 apps per developer of iTunes App Store apps. See "App Store Metrics," 148Apps.biz (accessed Nov. 14, 2012), available at http://148apps.biz/app-store-metrics (727,938 Total Active Apps; 191,366 Active Publishers in the U.S. App Store).

[Footnote 360] See Mobile Apps for Kids II Report, at 26, supra note 189 (approximately 1.6% of developers of apps studied developed apps for both Android and iOS); FTC Staff, Mobile Apps for Kids: Current Privacy Disclosures are Disappointing, at 8-9 (Feb. 2012), available at http://www.ftc.gov/os/2012/02/120216mobile_apps_kids.pdf (approximately 2.7% of developers of apps studied developed apps for both Android and iOS). Averaging these two percentages indicates developer overlap of approximately 2.2%.

To estimate the number of developers responsible for these apps,[358] Commission staff

used the “Browse” function in iTunes, to generate a list of 6,000 apps in the “Education”

category. Sorting that list by “Genre” generates a list of approximately 3,300 apps for which

“Education” was listed as the “Genre.” Approximately 1,800 developers were listed in

connection with these apps. Dividing 3,300 apps by 1,800 developers yields an iTunes

education-apps-per-developer ratio of approximately 1.83,[359] and the Commission assumes this

ratio would apply for Android apps, as well. Assuming a 1.83 education-apps-to-developer

ratio, it appears that approximately 1,570 developers (2,870 ÷ 1.83) are responsible for apps

directed to children under 13 that operate in ways likely to implicate the final Rule amendments.

At least one more adjustment to this total of approximately 1,570 potentially affected

developers is warranted, however. Commission staff’s research for its two Mobile Apps for

Kids reports indicates that approximately 2.2% of developers of apps that may be directed to children under 13 develop apps for both iOS and Android.[360] To avoid double-counting

[Footnote 361] "App Store Metrics," 148Apps.biz (accessed Nov. 14, 2012), available at http://148apps.biz/app-store-metrics.

[Footnote 362] See note 357, supra.

developers that develop for both platforms, the Commission subtracts 18 developers from the

total (i.e., 1,570 x 2.2% = 34.54; 35 ÷ 2 = 17.5), leaving approximately 1,552 potentially affected

developers of iOS and Android education apps that may be directed to children under 13.
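The chain of figures above can be replayed in a few lines. The following Python sketch is illustrative only; it simply restates values already given in the text (the category totals, the 58% and 42% child-directed shares, the 5% assumption, the 1.83 apps-per-developer ratio, and the 2.2% cross-platform overlap), and the variable names are not the Commission's.

    # Replay of the education-app developer estimate described above (all figures from the text).
    itunes_education_apps = 75_000
    android_education_apps = 33_000

    child_directed = 0.58 * itunes_education_apps + 0.42 * android_education_apps
    # 43,500 + 13,860 = 57,360 education apps that may be directed to children under 13

    implicated_apps = 0.05 * child_directed        # ~2,868, rounded in the text to 2,870
    developers = 2_870 / 1.83                      # ~1,568, rounded in the text to 1,570
    overlap = round(1_570 * 0.022) / 2             # 34.54 -> 35; 35 / 2 = 17.5, treated as ~18
    unique_developers = 1_570 - 18                 # 1,552 education-app developers

    print(round(child_directed), round(implicated_apps), round(developers), unique_developers)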

The Commission believes it is also reasonable to add to this total existing developers of

game and entertainment apps directed to children under 13. Commission staff reviewed a list,

generated using the desktop version of iTunes, of the Top 200 Paid and Top 200 Free “Game”

apps in the iTunes App Store as of mid-November 2012. Staff believes that approximately 7%

of them may be directed to children under 13. Publicly available industry data show that

approximately 131,000 game apps were available in the iTunes App Store as of mid-November

2012;[361] thus, approximately 9,170 of those apps may be directed to children under 13.

Assuming 5% of those apps operate in ways that bring their developers within the ambit of the

final Rule amendments,[362] at a general app-to-developer ratio of 3.8 apps per developer, this

yields approximately 120 developers (9,170 x .05 = 458.5; 458.5 ÷ 3.8 = 120.66). Commission

staff observed that approximately 35% of developers of games that may be directed to children

under the age of 13 also develop similar education apps. Thus, of the aforementioned 120

developers, 65% would not already have been counted in the previous tally of educational app

developers. This calculation yields an estimate of approximately 78 additional developers of

[Footnote 363] "App Store Metrics," 148Apps.biz (accessed Nov. 14, 2012), available at http://148apps.biz/app-store-metrics.

iTunes games apps primarily directed to children under 13 that likely are covered by the final

Rule amendments.

Performing a similar calculation for iTunes “Entertainment” app developers yields few

additional existing developers that are likely to be covered. Commission staff reviewed a list,

generated using the desktop version of iTunes, of the Top 200 Paid and Top 200 Free

“Entertainment” apps in the iTunes App Store as of mid-November 2012. Staff believes that

approximately 2.5% of them may be directed to children under 13. Publicly available industry

data show that approximately 67,600 “Entertainment” apps were available in the iTunes App

Store as of mid-November 2012;[363] thus, approximately 1,690 of those apps may be directed to

children under 13. Assuming 5% of those apps operate in ways that bring their developers

within the ambit of the final Rule amendments, at a general app-to-developer ratio of 3.8 apps

per developer, this yields approximately 22 developers (1,690 x .05 = 84.5; 84.5 ÷ 3.8 = 22.24).

Commission staff observed that approximately 84% of developers of “Entertainment” apps that

may be directed to children under the age of 13 also develop similar education and game apps.

Thus, of the aforementioned 22 developers, 16% would not already have been counted in the

previous tally of educational and games app developers. This calculation yields an estimate of

approximately 4 additional developers of iTunes entertainment apps primarily directed to

children under 13 that likely are covered by the final Rule amendments.

To account for Android “Games” apps, Commission staff reviewed listings of the Top

216 Paid and Top 216 Free “Games” apps in the Google Play store as of mid-November 2012.

Staff believes that approximately 3% of them may be directed to children under 13. Three

[Footnote 364] "Android Statistic Top Categories," AppBrain (accessed Nov. 15, 2012), available at http://www.appbrain.com/stats/android-market-app-categories (total calculated by adding the number of apps in each "Games" subcategory).

[Footnote 365] Id.

percent of 75,000[364] apps yields about 2,250 Android "Games" apps that may be directed to

children under 13. Assuming 5% of those apps operate in ways that bring their developers

within the ambit of the final Rule amendments, at a general app-to-developer ratio of 3.8 apps

per developer, this yields approximately 30 developers (2,250 x .05 = 112.5; 112.5 ÷ 3.8 = 29.6).

Assuming that, as Commission staff observed in the iTunes App Store, approximately 35% of

developers of games that may be directed to children under the age of 13 also develop similar

education apps, 65% of the aforementioned 30 developers would not already have been counted

in the previous tally of educational app developers. This calculation yields an estimate of

approximately 19 additional developers of Android games apps primarily directed to children

under 13 that likely are covered by the final Rule amendments.

Similarly, for Android “Entertainment” apps, Commission staff reviewed listings of the

Top 216 Paid and Top 216 Free “Entertainment” apps in the Google Play store as of mid-

November 2012. Staff believes that approximately 2% of them may be directed to children

under 13. Two percent of 67,000[365] apps yields about 1,340 Android "Entertainment" apps that

may be directed to children under 13. Assuming 5% of those apps operate in ways that bring

their developers within the ambit of the final Rule amendments, at a general app-to-developer

ratio of 3.8 apps per developer, this yields approximately 18 developers (1,340 x .05 = 67; 67 ÷

3.8 = 17.63). Assuming that, as Commission staff observed with regard to the iTunes App Store,

approximately 84% of developers of entertainment apps that may be directed to children under

[Footnote 366] See 2011 NPRM, 76 FR at 59812, 59813; 2012 SNPRM, 77 FR at 46649.

the age of 13 also develop similar education and game apps, 16% of the aforementioned 18

developers would not already have been counted in the prior tally of educational and game app

developers. This calculation yields an estimate of approximately 3 additional developers of

Android entertainment apps primarily directed to children under 13 that likely are covered by the

final Rule amendments.
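Each of the four game and entertainment categories above follows the same arithmetic pattern. The Python sketch below, again illustrative only, replays the published figures (category sizes, the estimated child-directed shares, the 5% assumption, the 3.8 apps-per-developer ratio, and the 65% or 16% shares of developers not already counted elsewhere).

    # Replay of the per-category developer estimates described above (all figures from the text).
    RATIO = 3.8          # general apps-per-developer ratio
    IMPLICATED = 0.05    # share of child-directed apps assumed to implicate the amended Rule

    categories = {
        # name: (total apps, share directed to children under 13, share not already counted)
        "iTunes Games":          (131_000, 0.07,  0.65),
        "iTunes Entertainment":  ( 67_600, 0.025, 0.16),
        "Android Games":         ( 75_000, 0.03,  0.65),
        "Android Entertainment": ( 67_000, 0.02,  0.16),
    }

    for name, (total, child_share, new_share) in categories.items():
        child_apps = total * child_share               # e.g., 131,000 x 7% = 9,170
        developers = child_apps * IMPLICATED / RATIO   # e.g., 458.5 / 3.8 = 120.66
        additional = developers * new_share            # e.g., ~120 x 65% = ~78
        print(name, round(child_apps), round(developers), round(additional))
    # Prints approximately 78, 4, 19, and 3 additional developers, matching the text.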

Thus, the FTC estimates that approximately 1,660 mobile app developers (1,552 for

iTunes and Android education apps + 78 for iTunes games apps + 4 for iTunes entertainment

apps + 19 for Android games apps + 3 for Android entertainment apps = 1,656) are existing

operators of websites or online services that will be covered by the final Rule amendments. The

FTC’s 2011 NPRM PRA estimate of 2,000 existing operators already covered by the Rule and

its 2012 SNPRM PRA estimate of 500 newly covered existing operators,[366] however, already

partially accounted for these mobile app developers because these estimates covered all types of

operators subject to COPPA, including mobile app developers. As discussed above, comments

on the FTC staff’s estimate of the number of existing operators focused almost entirely on an

asserted understatement of the number of mobile app developers that would be covered by the

final Rule amendments. The estimate otherwise was not contested. Thus, the total numbers of

mobile app developers set forth herein must be substituted for the total (unspecified) number of

mobile app developers subsumed within the 2011 NPRM and 2012 SNPRM PRA estimates.

The Commission believes it is reasonable to substitute the above-noted estimate of 1,660

mobile app developers for half, i.e., 1,250, of the 2,500 existing operators previously estimated

to be “covered” and “newly covered” by the 2011 NPRM and 2012 SNPRM PRA estimates.

[Footnote 367] Disclosure burdens do not increase when taking into account plug-in developers and advertising networks with actual knowledge because the burden will fall on either the primary-content site or the plug-in, but need not fall on both. They can choose to allocate the burden between them. The Commission has chosen to account for the burden via the primary-content site or service because it would generally be the party in the best position to give notice and obtain consent from parents.

[Footnote 368] S. Weiner (comment 97, 2012 SNPRM), at 1-2.

Based on its experience, the Commission believes that half, if not more, of the existing

operators currently covered by the Rule already develop or publish mobile apps. The remaining

1,250 operators would account for traditional website and other online service providers that are

not mobile app developers, as well as plug-in developers and advertising networks that could be

covered by the “actual knowledge” standard. Thus, combining these totals (1,660 + 1,250)367

yields a total of 2,910 existing operators of websites or online services that would likely be

covered by the final Rule amendments.
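As a quick check, the combination just described reduces to two sums (all figures from the text; rounding 1,656 up to 1,660 follows the text's own usage).

    mobile_app_developers = 1_552 + 78 + 4 + 19 + 3    # = 1,656, rounded in the text to 1,660
    existing_operators = 1_660 + 1_250                 # app developers plus other covered operators
    print(mobile_app_developers, existing_operators)   # 1656 2910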

New Operators

The Commission received one comment asserting that it significantly underestimated the number of new operators per year that will be covered by the proposed Rule amendments. The commenter, the moderator of an online group called "Parents With Apps," stated that this group of more than 1,400 small developers of family-friendly apps grows by at least 100 new developers every six months.[368] This would constitute an annual growth rate of

nearly 15% (200 new developers per year divided by 1,400 developers in the group = 0.1429).

Although the Commission believes this rate of increase is due, at least in part, to increased

awareness among developers of the group’s existence rather than growth in the number of new

developers, the Commission concludes it is reasonable to incorporate this information into its

revised estimate. Assuming a base number of 1,660 existing mobile app developers estimated to

[Footnote 369] See also Association for Competitive Technology (comment 5, 2011 SNPRM), at 2 ("total unique apps across all platforms continue to grow beyond the one million mark" since Apple's 2008 launch of its App Store; "[t]he mobile app marketplace has grown to a five billion dollar industry from scratch in less than four years.").

[Footnote 370] Bureau of Labor Statistics, U.S. Department of Labor, Occupational Outlook Handbook, 2012-13 Edition, Software Developers, http://www.bls.gov/ooh/computer-and-information-technology/software-developers.htm (visited November 16, 2012).

be covered by the final Rule amendments, a 15% growth rate would yield, year-over-year after

three years, an additional 864 new developers, or approximately 290 per year averaged over a

prospective three-year clearance (1,660 x 1.15 = 1,909; 1,909 x 1.15 = 2,195; 2,195 x 1.15 =

2,524; 2,524 - 1,660 = 864; 864 ÷ 3 = 288).[369]

Bureau of Labor Statistics (“BLS”) projections suggest a much more modest rate of

growth. BLS has projected that employment of software application developers will increase

28% between 2010 and 2020.[370] Assuming 10% of that total 28% growth would occur each year

of the ten-year period, and a base number of 1,660 existing mobile app developers, one can

derive an increase of approximately 46 (1,660 x 0.028 = 46.48) new mobile app developers per

year on average that will be covered by the final Rule amendments. Combining the average

based on the annual growth rate of Parents With Apps and that based on the BLS software

application developer growth projection yields an increase of approximately 168 (290 + 46 =

336; 336 ÷ 2 = 168) new mobile app developers per year on average that will be covered by the

proposed Rule amendments.

As with its previous estimates of existing developers, mobile app developers were

already included in the Commission’s 2011 NPRM PRA estimate of 100 new operators and the

Commission’s 2012 SNPRM PRA estimate of 125 additional new operators per year. As noted

above, the Commission’s 2011 NPRM and 2012 SNPRM PRA estimates of new operators were

[Footnote 371] See note 342, supra.

contested only as they relate to their estimation of new mobile app developers. Thus, the total

number of new mobile app developers set forth herein should replace the total (unspecified)

number of new mobile app developers subsumed within the 2011 NPRM and 2012 SNPRM

PRA estimates.

The Commission believes it is reasonable to substitute the above-noted estimate of 168

mobile app developers for half, i.e., 113, of the 225 new operators previously estimated to be

covered by the 2011 NPRM and 2012 SNPRM PRA estimates. The remainder of the prior

estimates would account for new website and other online service providers other than new

mobile app developers, as well as new plug-in developers and advertising networks that could be

covered by the “actual knowledge” standard. Thus, combining these totals (168 + 113 = 281)

yields a total of approximately 280 new operators per year (over a prospective three-year

clearance) of websites or online services that would likely be covered by the final Rule

amendments. Given that the FTC’s existing clearance already accounts for an estimate of 100

new operators,[371] the incremental calculation for additional OMB clearance is 180 new operators

x 60 hours each = 10,800 hours.
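A minimal sketch of this incremental-burden arithmetic, using only the figures stated above (variable names are illustrative):

new_mobile_app_devs = 168          # combined estimate derived above
other_new_operators = 113          # half of the 225 new operators previously estimated
total_new_operators = new_mobile_app_devs + other_new_operators    # 281, treated as ~280 in the text

already_cleared = 100              # new operators covered by the FTC's existing OMB clearance
incremental_operators = 280 - already_cleared                      # 180
incremental_hours = incremental_operators * 60                     # 10,800 hours
print(total_new_operators, incremental_operators, incremental_hours)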

C. Recordkeeping

Under the PRA, the term “recordkeeping requirement” means “a requirement imposed by

or for an agency on persons to maintain specified records, including a requirement to

(A) retain such records; (B) notify third parties, the Federal Government, or the public of the

existence of such records; (C) disclose such records to third parties, the Federal Government, or

the public; or (D) report to third parties, the Federal Government, or the public regarding such

[Footnote 372: Under 5 CFR 1320.3(b)(2), OMB excludes from the definition of PRA “burden” the time and financial resources needed to comply with agency-imposed recordkeeping, disclosure, or reporting requirements that customarily would be undertaken independently in the normal course of business. Thus, on further reflection, the FTC has determined not to include recordkeeping costs for safe harbors as it did in the 2011 NPRM PRA analysis.]

[Footnote 373: See N. Savitt (comment 142, 2011 NPRM), at 1; NCTA (comment 113, 2011 NPRM), at 23-24.]

[Footnote 374: TIA contends that in the 2012 SNPRM, the Commission “disregarded the empirical economic input” regarding compliance costs that TIA had submitted in response to the 2011 NPRM, including hour and labor cost estimates. Toy Industry Association (comment 89, 2012 SNPRM), at 16. Although the Commission did not discuss TIA’s 2011 comments in the SNPRM – which focused on the potential incremental compliance cost changes that the Commission anticipated would flow from certain newly proposed Rule amendments – it has considered TIA’s 2011 and 2012 comments on compliance costs as discussed herein.]

records.” The final amendments do not affect the Rule’s existing recordkeeping requirements.

Moreover, FTC staff believes that most of the records listed in the Rule’s pre-existing safe

harbor recordkeeping provisions consist of documentation that such parties have kept in the

ordinary course of business irrespective of the Rule.[372] Any incremental burden, such as that for

maintaining the results of independent assessments under section 312.11(d), would be, in staff’s

view, marginal.

D. Disclosure Hours

(1) New Operators’ Disclosure Burden

Under the existing OMB clearance for the Rule, the FTC has estimated that new

operators will each spend approximately 60 hours to craft a privacy policy, design mechanisms

to provide the required online privacy notice and, where applicable, direct notice to parents in

order to obtain verifiable consent. Several commenters noted that this 60-hour estimate failed to

take into account accurate costs of compliance with the Rule, but they did not provide the

Commission with empirical data or specific evidence on the number of hours such activities

require.[373] The Toy Industry Association (“TIA”) asserts that the Commission underestimated[374]

[Footnote 375: Toy Industry Association (comment 89, 2012 SNPRM), at 16-17; Toy Industry Association (comment 163, 2011 NPRM), at 17-18; see also DMA (comment 28, 2012 SNPRM), at 17.]

[Footnote 376: Toy Industry Association (comment 163, 2011 NPRM), at 18.]

[Footnote 377: Id. at 17. Also with specific regard to potential costs associated with obtaining and verifying parental consent, TIA estimates that dedicating employees specifically to this task would, if the FTC were to require a “scanned form type of control regime,” require additional salary and benefit costs. Id. at 18.]

[Footnote 378: Id. at 17.]

[Footnote 379: Id. at 18.]

the number of hours shown in the 2011 NPRM and 2012 SNPRM PRA calculations,[375] and that

“[d]epending on the FTC’s final revisions to the COPPA Rule, the time it takes to implement

technological changes could more than triple the Commission’s 60-hour estimate.”[376] These

assertions appear to be based primarily on TIA’s concern that the FTC’s estimate did not include

costs “of ‘ensuring’ security procedures of third parties, securing deletion, managing parental

consents, or updating policies to disclose changes in ‘operators.’ In addition, the FTC seems to

reference only top level domains and, as such, its estimates for implementation of new verifiable

parental consent requirements are very low.”[377] TIA states that “the additional processes and

procedures mandated under the revised proposed Rule will potentially include privacy policy

and operational changes, with related resource-intensive measures, such as organizational

management and employee training.”[378] Moreover, TIA suggests that changes proposed in the

2011 NPRM to the treatment of screen or user names would entail “enormous” costs that the

FTC did not quantify.[379]

[Footnote 380: See Part II.D., supra. As for the “reasonable steps” requirement, the time and financial resources operators devote to this task would likely be incurred, anyway, in the normal course of their seeking to preserve the security of children’s data conveyed to those third parties. To reiterate, PRA “burden” does not include effort expended in the ordinary course of business independent of a regulatory requirement. 5 CFR 1320.3(b)(2). See also Toy Industry Association (comment 163, 2011 NPRM), at 16 (“Operators regularly investigate agents, service providers, and business partners to assure that they will responsibly maintain the security and confidentiality of children’s data . . . .”).]

[Footnote 381: See Part II.B.2, supra.]

[Footnote 382: See Part II.C.7, supra. Furthermore, the requirement to obtain parental consent is not a collection of information under the PRA.]

[Footnote 383: See Part II.A.5.a, supra. This change also appears to moot NCTA’s concern that operators would be faced with substantial costs if “forced to redesign” websites to eliminate the use of unique screen or user names. NCTA (comment 113, 2011 NPRM), at 23 n.69.]

[Footnote 384: TIA also cites the potential cost of needing to “develop communication tools and respond to complaints from parents who may mistakenly believe that companies are altering data collection practices. . . .” Toy Industry Association (comment 163, 2011 NPRM), at 18. This speculative cost does not relate to any “information collection requirement” in the final Rule amendments.]

Substantially all of TIA’s concerns about understated burden estimates relate to proposed

requirements that the Commission has ultimately determined not to adopt. For example, the

final Rule amendments do not require operators to “ensure” that third parties secure information,

but that they “take reasonable steps” to release children’s information only to third parties

capable of maintaining it securely and provide assurances that they will do so.[380] The

Commission is not eliminating the “single operator designee” proviso of the Rule’s online notice

requirement.[381] It is not eliminating email plus as an acceptable consent method for operators

collecting personal information only for internal use.[382] The Commission determined to treat

screen names as personal information only in those instances in which a screen or user name

rises to the level of online contact information.[383] Thus, in the Commission’s view, TIA’s

proposed increase to the above-noted estimate of 60 hours for compliance is not warranted.[384]

[Footnote 385: TIA states that this first-year cost associated with compliance should not be “amortized” over three years. Toy Industry Association (comment 89, 2012 SNPRM), at 17. As stated supra note 336, however, agencies may seek up to three years of clearance from OMB, and this is what the FTC routinely does for rulemakings. Moreover, OMB seeks estimates of annual burden (reflective of the clearance period sought). See 5 CFR 1320.5(a)(1)(iv)(B).]

Applying, then, the 60 hours estimate to the portion of new operators not accounted for in

the FTC’s previously cleared burden totals yields a cumulative total of 10,800 hours (180 new

operators x 60 hours each).

(2) Existing Operators’ Disclosure Burden

The final Rule amendments will not impose ongoing incremental disclosure time per

entity, but, as noted above, would result in an estimated 2,910 existing operators covered by the

Rule. These entities will have a one-time burden to re-design their existing privacy policies and

direct notice procedures that would not carry over to the second and third years of a prospective

three-year OMB clearance under the PRA. Commission staff believes that an existing operator’s

time to make these changes would be no more than that for a new entrant crafting its online and

direct notices for the first time, i.e., 60 hours. Annualized over three years of a prospective

clearance, this amounts to 20 hours ((60 hours + 0 + 0) ÷ 3) per year.[385] Aggregated for the

estimated 2,910 existing operators that would be subject to the Rule, annualized disclosure

burden would be 58,200 hours per year.
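For reference, the annualization above can be expressed as a short illustrative Python sketch (figures are those stated in the text; variable names are illustrative):

existing_operators = 2_910
one_time_hours = 60                         # same estimate as for a new entrant
annualized_hours = one_time_hours / 3       # 20 hours per year over a three-year clearance
aggregate_hours = existing_operators * annualized_hours    # 58,200 hours per year
print(annualized_hours, aggregate_hours)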

E. Reporting Hours

The final Rule amendments do not impose reporting requirements on operators; they do,

however, for safe harbor programs. Under the FTC’s already cleared estimates, pre-

amendments, staff projected that each new safe harbor program applicant would require 265

[Footnote 386: 76 FR at 7211, 7212 (Feb. 9, 2011); 76 FR at 31334, 31335 (May 31, 2011). These safe harbor reporting hour estimates have not been contested. For PRA purposes, annualized over the course of three years of clearance, this averages roughly 100 hours per year, given that the 265 hours is a one-time, not recurring, expenditure of time for an applicant.]

hours to prepare and submit its safe harbor proposal.[386] The final Rule amendments, however,

require a safe harbor applicant to submit a more detailed proposal than what the Rule, prior to

such amendments, mandated. Existing safe harbor programs will thus need to submit a revised

application and new safe harbor applicants will have to provide greater detail than they would

have under the original Rule. The FTC estimates this added information will entail

approximately 60 additional hours for each new, and each existing, safe harbor to prepare.

Accordingly, for this added one-time preparation, the aggregate incremental burden is 60 hours

for the projected one new safe harbor program per three-year clearance cycle and 300 hours,

cumulatively, for the five existing safe harbor programs. Annualized for an average single year

per three-year clearance, this amounts to 20 hours for one new safe harbor program, and 100

hours for the existing five safe harbor programs; thus, cumulatively, the burden is 120 hours.

The final Rule amendments require safe harbor programs to audit their members at least

annually and to submit periodic reports to the Commission on the aggregate results of these

member audits. As such, this will increase currently cleared burden estimates pertaining to safe

harbor applicants. The burden for conducting member audits and preparing these reports likely

will vary for each safe harbor program depending on the number of members. Commission staff

estimates that conducting audits and preparing reports will require approximately 100 hours per

program per year. Aggregated for one new (100 hours) and five existing (500 hours) safe harbor

programs, this amounts to an increased disclosure burden of 600 hours per year. Accordingly,

the annualized reporting burden for one new and five existing safe harbor applicants to provide

[Footnote 387: See 76 FR at 7211, 7212-7213 (Feb. 9, 2011); 76 FR at 31334, 31335 n.1 (May 31, 2011) (FTC notices for renewing OMB clearance for the COPPA Rule).]

[Footnote 388: As explained in the 2012 SNPRM, “[t]he estimated rate of $180 is roughly midway between [BLS] mean hourly wages for lawyers ($62.74) in the most recent annual compilation available online [as of August 2012] and what Commission staff believes more generally reflects hourly attorney costs ($300) associated with Commission information collection activities.” 77 FR at 46651, n.54. This estimated rate was an upward revision of the Commission’s estimate of $150 per hour used in the 2011 NPRM. See 76 FR at 59827 n.204 and accompanying text. The estimated mean hourly wages for technical labor support ($42) is based on an average of the salaries for computer programmers, software developers, information security analysts, and web developers as reported by the BLS. See National Occupational and Wages - May 2011, available at http://www.bls.gov/news.release/archives/ocwage_03272012.pdf.]

[Footnote 389: Toy Industry Association (comment 89, 2012 SNPRM), at 16; Toy Industry Association (comment 163, 2011 NPRM), at 17.]

the added information required (120 hours) and to conduct audits and prepare reports (600

hours) is 720 hours, cumulatively.
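A minimal sketch of the safe harbor reporting-hour totals described above (figures are those stated in the text; variable names are illustrative):

new_programs, existing_programs = 1, 5

# One-time 60-hour burden for the more detailed application, annualized over three years
annualized_new = (new_programs * 60) / 3                 # 20 hours
annualized_existing = (existing_programs * 60) / 3       # 100 hours
application_total = annualized_new + annualized_existing # 120 hours

# Recurring burden to audit members and prepare aggregate reports (100 hours/program/year)
audit_total = 100 * (new_programs + existing_programs)   # 600 hours

print(application_total + audit_total)                   # 720 hours per year, cumulatively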

F. Labor Costs

(1) Disclosure

The Commission assumes that the time spent on compliance for new operators and

existing operators covered by the final Rule amendments would be apportioned five to one

between legal (lawyers or similar professionals) and technical (e.g., computer programmers,

software developers, and information security analysts) personnel.[387] In the 2012 SNPRM, based

on BLS compiled data, FTC staff assumed for compliance cost estimates a mean hourly rate of

$180 for legal assistance and $42 for technical labor support.[388] These estimates were challenged

in the comments.

TIA asserts that the Commission underestimates the labor rate for lawyers used in the

Commission’s 2011 NPRM and 2012 SNPRM compliance cost calculations. Given the389

[Footnote 390: Toy Industry Association (comment 163, 2011 NPRM), at 17. See also NCTA (comment 113, 2011 NPRM), at 23 n.70 (“NCTA members typically consult with attorneys who specialize in data privacy and security laws and whose average rates are 2-3 times the Commission’s [2011 NPRM] estimates [of $150 per hour].”).]

[Footnote 391: Toy Industry Association (comment 89, 2012 SNPRM), at 18.]

[Footnote 392: Id., at 10 (citation omitted).]

comments received, the Commission believes it appropriate to increase the estimated mean

hourly rate of $180 for legal assistance used in certain of the Commission’s 2011 NPRM and

2012 SNPRM compliance cost calculations. TIA stated in its 2011 comment that the “average

rates” of “specialized attorneys who understand children’s privacy and data security laws” with

whom its members typically consult are “2-3 times the Commission’s estimates” of $150 per

hour set forth in the 2011 NPRM.[390] TIA reiterated this information in its 2012 comment[391] and

added: “According to The National Law Journal’s 2011 annual billing survey, the average

hourly firm-wide billing rate (which combines partner and associate rates) ranges from $236 to

$633, not taking into account any area of specialization.”[392] While the Commission believes

TIA’s information provides useful reference points, it does not provide an adequate basis for

estimating an hourly rate for lawyers for compliance cost calculation purposes.

As an initial matter, the Commission notes that TIA has cited a range of average hourly

rates that its members pay for counsel, not a single average hourly rate, and it did not submit the

underlying data upon which those average rate calculations were based. The range of average

hourly rates TIA stated that its members typically pay (i.e., $300-$450 per hour) may include

some unusually high or low billing rates that have too much influence on the arithmetic means

[Footnote 393: See Federal Judicial Center, Reference Manual on Scientific Evidence (3rd Ed.), David H. Kaye and David A. Freedman, Reference Guide on Statistics at 238 (“[t]he mean takes account of all the data – it involves the total of all the numbers; however, particularly with small datasets, a few unusually large or small observations may have too much influence on the mean.”).]

[Footnote 394: Toy Industry Association (comment 89, 2012 SNPRM), at 19. Fifty-one law firms supplied the average rate information used in the survey’s tabulation, “A nationwide sampling of law firm billing rates,” to which the TIA appears to refer.]

[Footnote 395: The Commission recognizes that many attorneys who specialize in COPPA compliance and data security law often work at large law firms located in major metropolitan areas. However, just as the nature of online technology and the mobile marketplace allow operators to live almost anywhere, see Association for Competitive Technology (comment 5, 2011 NPRM), at 2 (the “nature of this industry allows developers to live almost anywhere”), it also allows them to seek the counsel of competent lawyers practicing anywhere in the United States.]

for those averages to be representative of the rates operators are likely to have to pay.[393] Without

more information about the distribution of the underlying rates factored into each average, or the

distribution of the averages within the cited range, TIA’s information is of limited value.

Likewise, as TIA’s comments appear to implicitly recognize, routine COPPA compliance

counseling would likely be performed by a mix of attorneys billed at a range of hourly rates.

Unfortunately, the information submitted in TIA’s comments does not indicate how that

workload is typically apportioned as between “high-level partner[s]” whose “support” is required

for “complex” COPPA compliance matters and other, less senior, attorneys at a law firm. The

National Law Journal survey the TIA cites is also a useful reference point, but it is a non-

scientific survey of the nation’s 250 largest law firms[394] that are located predominantly in major

metropolitan areas.[395] Beyond the range of average hourly firm-wide billing rates that TIA cites,

the survey states that the average firm-wide billing rate (partners and associates) in 2011 was

$403, the average partner rate was $482, and the average associate rate was $303.

[Footnote 396: Cf. Civil Division of the United States Attorney’s Office for the District of Columbia, United States Attorney’s Office, District of Columbia, Laffey Matrix – 2003-2013, available at http://www.justice.gov/usao/dc/divisions/Laffey_Matrix_2003-2013.pdf (updated “Laffey Matrix” for calculating “reasonable” attorneys fees in suits in which fee shifting is authorized can be evidence of prevailing market rates for litigation counsel in the Washington, DC area; rates in table range from $245 per hour for most junior associates to $505 per hour for most senior partners).]

[Footnote 397: Toy Industry Association (comment 89, 2012 SNPRM), at 18.]

The Commission believes it reasonable to assume that the workload among law firm

partners and associates for COPPA compliance questions could be competently addressed and

efficiently distributed among attorneys at varying levels of seniority, but would be weighted

most heavily to more junior attorneys. Thus, assuming an apportionment of two-thirds of such

work is done by associates, and one-third by partners, a weighted average tied to the average

firm-wide associate and average firm-wide partner rates, respectively, in the National Law

Journal 2011 survey would be about $365 per hour. The Commission believes that this rate –

which is very near the mean of TIA’s stated range of purported hourly rates that its members

typically pay to engage counsel for COPPA compliance questions – is an appropriate measure to

calculate the cost of legal assistance for operators to comply with the final Rule amendments.[396]
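The weighted-average rate can be reproduced in a brief illustrative Python sketch using the National Law Journal 2011 figures cited above (the two-thirds/one-third split is the assumed apportionment described in the text):

associate_rate = 303       # average firm-wide associate rate (2011 survey)
partner_rate = 482         # average firm-wide partner rate (2011 survey)

# Assumed apportionment: two-thirds of compliance work to associates, one-third to partners
weighted_rate = (2 / 3) * associate_rate + (1 / 3) * partner_rate
print(round(weighted_rate))    # ~363, i.e., about $365 per hour as used in the text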

TIA also states that the 2012 SNPRM estimate of $42 per hour for technical support is

too low, and that engaging expert technical personnel can, on average, involve hourly costs that

range from $72 to $108.[397] Similar to TIA’s hours estimate, discussed above, the Commission

believes that TIA’s estimate may have been based on implementing requirements that,

ultimately, the Commission has determined not to adopt. For example, technical personnel will

not need to “ensure” the security procedures of third parties; operators that have been eligible to

use email plus for parental consents will not be required to implement new systems to replace it.


It is unclear whether TIA’s estimate for technical support is based on the types of disclosure-

related tasks that the final Rule amendments would actually require, other tasks that the final

Rule amendments would not require, or non-disclosure tasks not covered by the PRA.

Moreover, unlike its estimate for lawyer assistance, TIA’s estimates for technical labor are not

accompanied by an adequate explanation of why estimates for technical support drawn from

BLS statistics are not an appropriate basis for the FTC’s PRA analysis. Accordingly, the

Commission believes it is reasonable to retain the 2012 SNPRM estimate of $42 per hour for

technical assistance based on BLS data.

Thus, for the 180 new operators per year not previously accounted for under the FTC’s

currently cleared estimates, 10,800 cumulative disclosure hours would be composed of 9,000

hours of legal assistance and 1,800 hours of technical support. Applied to hourly rates of $365

and $42, respectively, associated labor costs for the 180 new operators potentially subject to the

proposed amendments would be $3,360,600 (i.e., $3,285,000 for legal support plus $75,600 for

technical support).

Similarly, for the estimated 2,910 existing operators covered by the final Rule

amendments, 58,200 cumulative disclosure hours would consist of 48,500 hours of legal

assistance and 9,700 hours for technical support. Applied at hourly rates of $365 and $42,

respectively, associated labor costs would total $18,109,900 (i.e., $17,702,500 for legal support

plus $407,400 for technical support). Cumulatively, estimated labor costs for new and existing

operators subject to the final Rule amendments total $21,470,500.
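A minimal sketch of this labor-cost arithmetic, under the five-to-one apportionment and hourly rates stated above (variable names are illustrative):

legal_rate, tech_rate = 365, 42            # dollars per hour
legal_share, tech_share = 5 / 6, 1 / 6     # five-to-one apportionment of hours

def labor_cost(total_hours):
    # Split the hours between legal and technical personnel and price each share
    return total_hours * legal_share * legal_rate + total_hours * tech_share * tech_rate

new_operator_cost = labor_cost(10_800)       # $3,285,000 + $75,600 = $3,360,600
existing_operator_cost = labor_cost(58_200)  # $17,702,500 + $407,400 = $18,109,900
print(new_operator_cost + existing_operator_cost)   # $21,470,500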

[Footnote 398: Based on Commission staff’s experience with previously approved safe harbor programs, staff anticipates that most of the legal tasks associated with safe harbor programs will be performed by in-house counsel. Cf. Toy Industry Association (comment 89, 2012 SNPRM), at 19 (regional BLS statistics for lawyer wages can support estimates of the level of in-house legal support likely to be required on an ongoing basis). Moreover, no comments were received in response to the February 9, 2011 and May 31, 2011 Federal Register notices (76 FR at 7211 and 76 FR at 31334, respectively, available at http://www.gpo.gov/fdsys/pkg/FR-2011-02-09/pdf/2011-2904.pdf and http://www.gpo.gov/fdsys/pkg/FR-2011-05-31/pdf/2011-13357.pdf), which assumed a labor rate of $150 per hour for lawyers or similar professionals to prepare and submit a new safe harbor application. Nor was that challenged in the comments responding to the 2011 NPRM.]

[Footnote 399: See Bureau of Labor Statistics National Compensation Survey: Occupational Earnings in the United States, 2010, at Table 3, available at http://www.bls.gov/ncs/ocs/sp/nctb1477.pdf. This rate has not been contested.]

(2) Reporting

The Commission staff assumes that the tasks to prepare augmented safe harbor program

applications occasioned by the final Rule amendments will be performed primarily by lawyers,

at a mean labor rate of $180 an hour.[398] Thus, applied to an assumed industry total of 120 hours

per year for this task, incremental associated yearly labor costs would total $21,600.

The Commission staff assumes periodic reports will be prepared by compliance officers,

at a labor rate of $28 per hour.[399] Applied to an assumed industry total of 600 hours per year for

this task, associated yearly labor costs would be $16,800.

Cumulatively, labor costs for the above-noted reporting requirements total approximately

$38,400 per year.
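The reporting labor-cost arithmetic above can likewise be restated in a short illustrative Python sketch (figures are those stated in the text):

application_hours, lawyer_rate = 120, 180        # augmented applications, prepared by lawyers
audit_hours, compliance_officer_rate = 600, 28   # member audits and periodic reports

application_cost = application_hours * lawyer_rate          # $21,600
audit_cost = audit_hours * compliance_officer_rate          # $16,800
print(application_cost + audit_cost)                        # $38,400 per year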

[Footnote 400: NCTA commented that the Commission failed to consider costs “related to redeveloping child-directed websites” that operators would be “forced” to incur as a result of the proposed Rule amendments, including for “new equipment and software required by the expanded regulatory regime.” NCTA (comment 113, 2011 NPRM), at 23. Similarly, TIA commented that the proposed Rule amendments would entail “increased monetary costs with respect to technology acquisition and implementation. . . .” Toy Industry Association (comment 163, 2011 NPRM), at 17. These comments, however, do not specify projected costs or which Rule amendments would entail the asserted costs.]

G. Non-Labor/Capital Costs

Because both operators and safe harbor programs will already be equipped with the

computer equipment and software necessary to comply with the Rule’s new notice requirements,

the final Rule amendments should not impose any additional capital or other non-labor costs.[400]

V. Revised Rule

List of Subjects in 16 CFR Part 312

Children, Communications, Consumer Protection, Electronic Mail, E-mail, Internet, Online

Service, Privacy, Record Retention, Safety, Science and Technology, Trade Practices, Website,

Youth.

Accordingly, for the reasons stated above, the Federal Trade Commission revises Part

312 of Title 16 of the Code of Federal Regulations to read as follows:

PART 312 – CHILDREN’S ONLINE PRIVACY PROTECTION RULE

Sec.

312.1 Scope of regulations in this part.

312.2 Definitions.

312.3 Regulation of unfair or deceptive acts or practices in connection with the collection, use, and/or disclosure of personal information from and about children on the Internet.

312.4 Notice.

312.5 Parental consent.

312.6 Right of parent to review personal information provided by a child.


312.7 Prohibition against conditioning a child’s participation on collection of personal information.

312.8 Confidentiality, security, and integrity of personal information collected from children.

312.9 Enforcement.

312.10 Data retention and deletion requirements.

312.11 Safe harbor programs.

312.12 Voluntary Commission Approval Processes.

312.13 Severability.

AUTHORITY: 15 U.S.C. 6501-6508

§ 312.1 Scope of regulations in this part.

This part implements the Children’s Online Privacy Protection Act of 1998, (15 U.S.C. 6501, et seq.,) which prohibits unfair or deceptive acts or practices in connection with the collection, use, and/or disclosure of personal information from and about children on the Internet.

§ 312.2 Definitions.

Child means an individual under the age of 13.

Collects or collection means the gathering of any personal information from a child by any means, including but not limited to:

(a) Requesting, prompting, or encouraging a child to submit personal information online;

(b) Enabling a child to make personal information publicly available in identifiable form. An operator shall not be considered to have collected personal information under this paragraph if it takes reasonable measures to delete all or virtually all personal information from a child’s postings before they are made public and also to delete such information from its records; or

(c) Passive tracking of a child online.

Commission means the Federal Trade Commission.


Delete means to remove personal information such that it is not maintained in retrievable form and cannot be retrieved in the normal course of business.

Disclose or disclosure means, with respect to personal information:

(a) The release of personal information collected by an operator from a child in identifiable form for any purpose, except where an operator provides such information to a person who provides support for the internal operations of the website or online service; and

(b) Making personal information collected by an operator from a child publicly available in identifiable form by any means, including but not limited to a public posting through the Internet, or through a personal home page or screen posted on a website or online service; a pen pal service; an electronic mail service; a message board; or a chat room.

Federal agency means an agency, as that term is defined in Section 551(1) of title 5, United States Code.

Internet means collectively the myriad of computer and telecommunications facilities, including equipment and operating software, which comprise the interconnected world-wide network of networks that employ the Transmission Control Protocol/Internet Protocol, or any predecessor or successor protocols to such protocol, to communicate information of all kinds by wire, radio, or other methods of transmission.

Online contact information means an email address or any other substantially similar identifier that permits direct contact with a person online, including but not limited to, an instant messaging user identifier, a voice over internet protocol (VOIP) identifier, or a video chat user identifier.

Operator means any person who operates a website located on the Internet or an online service and who collects or maintains personal information from or about the users of or visitors to such website or online service, or on whose behalf such information is collected or maintained, or offers products or services for sale through that website or online service, where such website or online service is operated for commercial purposes involving commerce:


(a) Among the several States or with 1 or more foreign nations;

(b) In any territory of the United States or in the District of Columbia, or between any such territory and

(1) Another such territory, or

(2) Any State or foreign nation; or

(c) Between the District of Columbia and any State, territory, or foreign nation. This definition does not include any nonprofit entity that would otherwise be exempt from coverage under Section 5 of the Federal Trade Commission Act (15 U.S.C. 45).

Personal information is collected or maintained on behalf of an operator when: (a) it is collected or maintained by an agent or service provider of the operator; or (b) the operator benefits by allowing another person to collect personal information directly from users of such website or online service.

Parent includes a legal guardian.

Person means any individual, partnership, corporation, trust, estate, cooperative, association, or other entity.

Personal information means individually identifiable information about an individual collected online, including:

(a) A first and last name;

(b) A home or other physical address including street name and name of a city or town;

(c) Online contact information as defined in this section;

(d) A screen or user name where it functions in the same manner as online contact information, as defined in this section;

(e) A telephone number;

(f) A Social Security number;

(g) A persistent identifier that can be used to recognize a user over time and across different websites or online services. Such persistent identifier includes, but is not limited to, a customer number held in a cookie, an Internet Protocol (IP) address, a processor or device serial number, or unique device identifier;

(h) A photograph, video, or audio file where such file contains a child’s image or voice;


(i) Geolocation information sufficient to identify street name and name of a city or town; or

(j) Information concerning the child or the parents of that child that the operator collects online from the child and combines with an identifier described in this definition.

Release of personal information means the sharing, selling, renting, or transfer of personal information to any third party.

Support for the internal operations of the website or online service means those activities necessary to:

(a) maintain or analyze the functioning of the website or online service;

(b) perform network communications;

(c) authenticate users of, or personalize the content on, the website or online service;

(d) serve contextual advertising on the website or online service or cap the frequency of advertising;

(e) protect the security or integrity of the user, website, or online service;

(f) ensure legal or regulatory compliance; or

(g) fulfill a request of a child as permitted by §§ 312.5(c)(3) and (4);

so long as the information collected for the activities listed in paragraphs (a)-(g) is not used or disclosed to contact a specific individual, including through behavioral advertising, to amass a profile on a specific individual, or for any other purpose.

Third party means any person who is not:

(a) An operator with respect to the collection or maintenance of personal information on the website or online service; or

(b) A person who provides support for the internal operations of the website or online service and who does not use or disclose information protected under this part for any other purpose.

Obtaining verifiable consent means making any reasonable effort (taking into consideration available technology) to ensure that before personal information is collected from a child, a parent of the child:


(a) Receives notice of the operator’s personal information collection, use, and disclosure practices; and

(b) Authorizes any collection, use, and/or disclosure of the personal information.

Website or online service directed to children means a commercial website or online service, or portion thereof, that is targeted to children.

(a) In determining whether a website or online service, or a portion thereof, is directed to children, the Commission will consider its subject matter, visual content, use of animated characters or child-oriented activities and incentives, music or other audio content, age of models, presence of child celebrities or celebrities who appeal to children, language or other characteristics of the website or online service, as well as whether advertising promoting or appearing on the website or online service is directed to children. The Commission will also consider competent and reliable empirical evidence regarding audience composition, and evidence regarding the intended audience.

(b) A website or online service shall be deemed directed to children when it has actual knowledge that it is collecting personal information directly from users of another website or online service directed to children.

(c) A website or online service that is directed to children under the criteria set forth in (a) above, but that does not target children as its primary audience, shall not be deemed directed to children if it: (i) does not collect personal information from any visitor prior to collecting age information; and (ii) prevents the collection, use, or disclosure of personal information from visitors who identify themselves as under age 13 without first complying with the notice and parental consent provisions of this part.

(d) A website or online service shall not be deemed directed to children solely because it refers or links to a commercial website or online service directed to children by using information location tools, including a directory, index, reference, pointer, or hypertext link.

§ 312.3 Regulation of unfair or deceptive acts or practices in connection with the collection, use, and/or disclosure of personal information from and about children on the Internet.

General requirements. It shall be unlawful for any operator of a website or online service directed to children, or any operator that has actual knowledge that it is collecting or maintaining personal information from a child, to collect personal information from a child in a manner that violates the regulations prescribed under this part. Generally, under this part, an operator must:

(a) Provide notice on the website or online service of what information it collects from children, how it uses such information, and its disclosure practices for such information (§ 312.4(b));


(b) Obtain verifiable parental consent prior to any collection, use, and/or disclosure of personal information from children (§ 312.5);

(c) Provide a reasonable means for a parent to review the personal information collected from a child and to refuse to permit its further use or maintenance (§ 312.6);

(d) Not condition a child’s participation in a game, the offering of a prize, or another activity on the child disclosing more personal information than is reasonably necessary to participate in such activity (§ 312.7); and

(e) Establish and maintain reasonable procedures to protect the confidentiality, security, and integrity of personal information collected from children (§ 312.8).

§ 312.4 Notice.

(a) General principles of notice. It shall be the obligation of the operator to provide notice and obtain verifiable parental consent prior to collecting, using, or disclosing personal information from children. Such notice must be clearly and understandably written, complete, and must contain no unrelated, confusing, or contradictory materials.

(b) Direct notice to the parent. An operator must make reasonable efforts, taking into account available technology, to ensure that a parent of a child receives direct notice of the operator’s practices with regard to the collection, use, or disclosure of personal information from children, including notice of any material change in the collection, use, or disclosure practices to which the parent has previously consented.

(c) Content of the direct notice to the parent.

(1) Content of the direct notice to the parent under § 312.5(c)(1) (Notice to Obtain Parent’s Affirmative Consent to the Collection, Use, or Disclosure of a Child’s Personal Information). This direct notice shall set forth:

(i) That the operator has collected the parent’s online contact information from the child, and, if such is the case, the name of the child or the parent, in order to obtain the parent’s consent;

(ii) That the parent’s consent is required for the collection, use, or disclosure of such information, and that the operator will not collect, use, or disclose any personal information from the child if the parent does not provide such consent;

(iii) The additional items of personal information the operator intends to collect from the child, or the potential opportunities for the disclosure of personal information, should the parent provide consent;

(iv) A hyperlink to the operator’s online notice of its information practices required under § 312.4(d);

(v) The means by which the parent can provide verifiable consent to the collection, use, and disclosure of the information; and


(vi) That if the parent does not provide consent within a reasonable time from the date the direct notice was sent, the operator will delete the parent’s online contact information from its records.

(2) Content of the direct notice to the parent under § 312.5(c)(2) (Voluntary Notice to Parent of a Child’s Online Activities Not Involving the Collection, Use or Disclosure of Personal Information). Where an operator chooses to notify a parent of a child’s participation in a website or online service, and where such site or service does not collect any personal information other than the parent’s online contact information, the direct notice shall set forth:

(i) That the operator has collected the parent’s online contact information from the child in order to provide notice to, and subsequently update the parent about, a child’s participation in a website or online service that does not otherwise collect, use, or disclose children’s personal information;

(ii) That the parent’s online contact information will not be used or disclosed for any other purpose;

(iii) That the parent may refuse to permit the child’s participation in the website or online service and may require the deletion of the parent’s online contact information, and how the parent can do so; and

(iv) A hyperlink to the operator’s online notice of its information practices required under § 312.4(d).

(3) Content of the direct notice to the parent under § 312.5(c)(4) (Notice to a Parent of Operator’s Intent to Communicate with the Child Multiple Times). This direct notice shall set forth:

(i) That the operator has collected the child’s online contact information from the child in order to provide multiple online communications to the child;

(ii) That the operator has collected the parent’s online contact information from the child in order to notify the parent that the child has registered to receive multiple online communications from the operator;

(iii) That the online contact information collected from the child will not be used for any other purpose, disclosed, or combined with any other information collected from the child;

(iv) That the parent may refuse to permit further contact with the child and require the deletion of the parent’s and child’s online contact information, and how the parent can do so;


(v) That if the parent fails to respond to this direct notice, the operator may use the online contact information collected from the child for the purpose stated in the direct notice; and

(vi) A hyperlink to the operator’s online notice of its information practices required under § 312.4(d).

(4) Content of the direct notice to the parent required under § 312.5(c)(5) (Notice to a Parent In Order to Protect a Child’s Safety). This direct notice shall set forth:

(i) That the operator has collected the name and the online contact information of the child and the parent in order to protect the safety of a child;

(ii) That the information will not be used or disclosed for any purpose unrelated to the child’s safety;

(iii) That the parent may refuse to permit the use, and require the deletion, of the information collected, and how the parent can do so;

(iv) That if the parent fails to respond to this direct notice, the operator may use the information for the purpose stated in the direct notice; and

(v) A hyperlink to the operator’s online notice of its information practices required under § 312.4(d).

(d) Notice on the website or online service. In addition to the direct notice to the parent, an operator must post a prominent and clearly labeled link to an online notice of its information practices with regard to children on the home or landing page or screen of its website or online service, and, at each area of the website or online service where personal information is collected from children. The link must be in close proximity to the requests for information in each such area. An operator of a general audience website or online service that has a separate children’s area must post a link to a notice of its information practices with regard to children on the home or landing page or screen of the children’s area. To be complete, the online notice of the website or online service’s information practices must state the following:

(1) The name, address, telephone number, and e-mail address of all operators collecting or maintaining personal information from children through the website or online service. Provided that: the operators of a website or online service may list the name, address, phone number, and e-mail address of one operator who will respond to all inquiries from parents concerning the operators’ privacy policies and use of children’s information, as long as the names of all the operators collecting or maintaining personal information from children through the website or online service are also listed in the notice;


(2) A description of what information the operator collects from children, including whether the website or online service enables a child to make personal information publicly available; how the operator uses such information; and, the operator’s disclosure practices for such information; and

(3) That the parent can review or have deleted the child’s personal information, and refuse to permit further collection or use of the child’s information, and state the procedures for doing so.

§ 312.5 Parental consent.

(a) General requirements.

(1) An operator is required to obtain verifiable parental consent before any collection, use, or disclosure of personal information from children, including consent to any material change in the collection, use, or disclosure practices to which the parent has previously consented.

(2) An operator must give the parent the option to consent to the collection and use of the child’s personal information without consenting to disclosure of his or her personal information to third parties.

(b) Methods for verifiable parental consent.

(1) An operator must make reasonable efforts to obtain verifiable parental consent, taking into consideration available technology. Any method to obtain verifiable parental consent must be reasonably calculated, in light of available technology, to ensure that the person providing consent is the child’s parent.

(2) Existing methods to obtain verifiable parental consent that satisfy the

requirements of this paragraph include:

(i) Providing a consent form to be signed by the parent and returned to the operator by postal mail, facsimile, or electronic scan;

(ii) Requiring a parent, in connection with a monetary transaction, to use a credit card, debit card, or other online payment system that provides notification of each discrete transaction to the primary account holder;

(iii) Having a parent call a toll-free telephone number staffed by trained personnel;

(iv) Having a parent connect to trained personnel via video-conference;

(v) Verifying a parent’s identity by checking a form of government-issued identification against databases of such information, where


the parent’s identification is deleted by the operator from itsrecords promptly after such verification is complete; or

(vi) Provided that, an operator that does not “disclose” (as defined by § 312.2) children’s personal information, may use an email coupled with additional steps to provide assurances that the person providing the consent is the parent. Such additional steps include: sending a confirmatory email to the parent following receipt of consent, or obtaining a postal address or telephone number from the parent and confirming the parent’s consent by letter or telephone call. An operator that uses this method must provide notice that the parent can revoke any consent given in response to the earlier email.

(3) Safe harbor approval of parental consent methods. A safe harbor program approved by the Commission under § 312.11 may approve its member operators’ use of a parental consent method not currently enumerated in paragraph (b)(2) where the safe harbor program determines that such parental consent method meets the requirements of paragraph (b)(1).

(c) Exceptions to prior parental consent. Verifiable parental consent is required prior to any collection, use, or disclosure of personal information from a child except as set forth in this paragraph:

(1) Where the sole purpose of collecting the name or online contact

information of the parent or child is to provide notice and obtain parental consent under § 312.4(c)(1). If the operator has not obtained parental consent after a reasonable time from the date of the information collection, the operator must delete such information from its records;

(2) Where the purpose of collecting a parent’s online contact information is to provide voluntary notice to, and subsequently update the parent about, the child’s participation in a website or online service that does not otherwise collect, use, or disclose children’s personal information. In such cases, the parent’s online contact information may not be used or disclosed for any other purpose. In such cases, the operator must make reasonable efforts, taking into consideration available technology, to ensure that the parent receives notice as described in § 312.4(c)(2);

(3) Where the sole purpose of collecting online contact information from a child is to respond directly on a one-time basis to a specific request from the child, and where such information is not used to re-contact the child or for any other purpose, is not disclosed, and is deleted by the operator from its records promptly after responding to the child’s request;

(4) Where the purpose of collecting a child’s and a parent’s online contact information is to respond directly more than once to the child’s specific request, and where such information is not used for any other purpose, disclosed, or combined with any other information collected from the child. In such cases, the operator must make reasonable efforts, taking


into consideration available technology, to ensure that the parent receives notice as described in § 312.4(c)(3). An operator will not be deemed to have made reasonable efforts to ensure that a parent receives notice where the notice to the parent was unable to be delivered;

(5) Where the purpose of collecting a child’s and a parent’s name and online contact information, is to protect the safety of a child, and where such information is not used or disclosed for any purpose unrelated to the child’s safety. In such cases, the operator must make reasonable efforts, taking into consideration available technology, to provide a parent with notice as described in § 312.4(c)(4);

(6) Where the purpose of collecting a child’s name and online contact information is to:

(i) protect the security or integrity of its website or online service;

(ii) take precautions against liability;

(iii) respond to judicial process; or

(iv) to the extent permitted under other provisions of law, to provide information to law enforcement agencies or for an investigation on a matter related to public safety; and where such information is not be used for any other purpose;

(7) Where an operator collects a persistent identifier and no other personal information and such identifier is used for the sole purpose of providing support for the internal operations of the website or online service. In such case, there also shall be no obligation to provide notice under § 312.4; or

(8) Where an operator covered under paragraph (b) of the definition of website or online service directed to children collects a persistent identifier and no other personal information from a user who affirmatively interacts with the operator and whose previous registration with that operator indicates that such user is not a child. In such case, there also shall be no obligation to provide notice under § 312.4.

§ 312.6 Right of parent to review personal information provided by a child.

(a) Upon request of a parent whose child has provided personal information to a website or online service, the operator of that website or online service is required to provide to that parent the following:

(1) A description of the specific types or categories of personal information collected from children by the operator, such as name, address, telephone number, e-mail address, hobbies, and extracurricular activities;

(2) The opportunity at any time to refuse to permit the operator’s further use or future online collection of personal information from


that child, and to direct the operator to delete the child’s personal information; and

(3) Notwithstanding any other provision of law, a means of reviewing any personal information collected from the child. The means employed by the operator to carry out this provision must:

(i) Ensure that the requestor is a parent of that child, taking into account available technology; and

(ii) Not be unduly burdensome to the parent.

(b) Neither an operator nor the operator’s agent shall be held liable under any Federal or State law for any disclosure made in good faith and following reasonable procedures in responding to a request for disclosure of personal information under this section.

(c) Subject to the limitations set forth in § 312.7, an operator may terminate any service provided to a child whose parent has refused, under paragraph (a)(2) of this section, to permit the operator’s further use or collection of personal information from his or her child or has directed the operator to delete the child’s personal information.

§ 312.7 Prohibition against conditioning a child’s participation on collection of personal information.

An operator is prohibited from conditioning a child’s participation in a game, the offering of a prize, or another activity on the child’s disclosing more personal information than is reasonably necessary to participate in such activity.

§ 312.8 Confidentiality, security, and integrity of personal information collected from children.

The operator must establish and maintain reasonable procedures to protect the confidentiality, security, and integrity of personal information collected from children. The operator must also take reasonable steps to release children’s personal information only to service providers and third parties who are capable of maintaining the confidentiality, security and integrity of such information, and who provide assurances that they will maintain the information in such a manner.

§ 312.9 Enforcement.

Subject to §§ 6503 and 6505 of the Children’s Online Privacy Protection Act of 1998, a violation of a regulation prescribed under section 6502 (a) of this Act shall be treated as a violation of a rule defining an unfair or deceptive act or practice prescribed under Section 18(a)(1)(B) of the Federal Trade Commission Act (15 U.S.C. 57a(a)(1)(B)).


§ 312.10 Data retention and deletion requirements.

An operator of a website or online service shall retain personal information collected online from a child for only as long as is reasonably necessary to fulfill the purpose for which the information was collected. The operator must delete such information using reasonable measures to protect against unauthorized access to, or use of, the information in connection with its deletion.

§ 312.11 Safe harbor programs.

(a) In general. Industry groups or other persons may apply to the Commission for approval of self-regulatory program guidelines (“safe harbor programs”). The application shall be filed with the Commission’s Office of the Secretary. The Commission will publish in the FEDERAL REGISTER a document seeking public comment on the application. The Commission shall issue a written determination within 180 days of the filing of the application.

(b) Criteria for approval of self-regulatory program guidelines. Proposed safe harbor programs must demonstrate that they meet the following performance standards:

(1) Program requirements that ensure operators subject to the self-regulatory program guidelines (“subject operators”) provide substantially the same or greater protections for children as those contained in §§ 312.2 through 312.8, and 312.10.

(2) An effective, mandatory mechanism for the independent assessment of subject operators’ compliance with the self-regulatory program guidelines. At a minimum, this mechanism must include a comprehensive review by the safe harbor program, to be conducted not less than annually, of each subject operator’s information policies, practices, and representations. The assessment mechanism required under this paragraph can be provided by an independent enforcement program, such as a seal program.

(3) Disciplinary actions for subject operators’ non-compliance with self-regulatory program guidelines. This performance standard may be satisfied by:

(i) Mandatory, public reporting of any action taken against subject operators by the industry group issuing the self-regulatory guidelines;

(ii) Consumer redress;

(iii) Voluntary payments to the United States Treasury in connection with an industry-directed program for violators of the self-regulatory guidelines;

(iv) Referral to the Commission of operators who engage in a pattern or practice of violating the self-regulatory guidelines; or

(v) Any other equally effective action.

(c) Request for Commission approval of self-regulatory program guidelines. A proposed safe harbor program’s request for approval shall be accompanied by the following:

(1) A detailed explanation of the applicant’s business model, and the technological capabilities and mechanisms that will be used for initial and continuing assessment of subject operators’ fitness for membership in the safe harbor program;

(2) A copy of the full text of the guidelines for which approval is sought and any accompanying commentary;

(3) A comparison of each provision of §§ 312.2 through 312.8, and 312.10 with the corresponding provisions of the guidelines; and

(4) A statement explaining:

(i) how the self-regulatory program guidelines, including the applicable assessment mechanisms, meet the requirements of this part; and

(ii) how the assessment mechanisms and compliance consequences required under paragraphs (b)(2) and (b)(3) provide effective enforcement of the requirements of this part.

(d) Reporting and recordkeeping requirements. Approved safe harbor programs shall:

(1) By July 1, 2014, and annually thereafter, submit a report to the Commission containing, at a minimum, an aggregated summary of the results of the independent assessments conducted under paragraph (b)(2), a description of any disciplinary action taken against any subject operator under paragraph (b)(3), and a description of any approvals of member operators’ use of a parental consent mechanism, pursuant to § 312.5(b)(4);

(2) Promptly respond to Commission requests for additional information; and

(3) Maintain for a period not less than three years, and upon request make available to the Commission for inspection and copying:

(i) Consumer complaints alleging violations of the guidelines by subject operators;

(ii) Records of disciplinary actions taken against subject operators; and

(iii) Results of the independent assessments of subject operators’ compliance required under paragraph (b)(2).

(e) Post-approval modifications to self-regulatory program guidelines. Approved safe harbor programs must submit proposed changes to their guidelines for review and approval by the Commission in the manner required for initial approval of guidelines under paragraph (c)(2). The statement required under paragraph (c)(4) must describe how the proposed changes affect existing provisions of the guidelines.

(f) Revocation of approval of self-regulatory program guidelines. The Commission reserves the right to revoke any approval granted under this section if at any time it determines that the approved self-regulatory program guidelines or their implementation do not meet the requirements of this part. Safe harbor programs that were approved prior to the publication of the Final Rule amendments must, by March 1, 2013, submit proposed modifications to their guidelines that would bring them into compliance with such amendments, or their approval shall be revoked.

(g) Operators’ participation in a safe harbor program. An operator will be deemed to be in compliance with the requirements of §§ 312.2 through 312.8, and 312.10 if that operator complies with Commission-approved safe harbor program guidelines. In considering whether to initiate an investigation or bring an enforcement action against a subject operator for violations of this part, the Commission will take into account the history of the subject operator’s participation in the safe harbor program, whether the subject operator has taken action to remedy such non-compliance, and whether the operator’s non-compliance resulted in any one of the disciplinary actions set forth in paragraph (b)(3).

§ 312.12 Voluntary Commission Approval Processes.

(a) Parental consent methods. An interested party may file a written request for Commission approval of parental consent methods not currently enumerated in § 312.5(b). To be considered for approval, a party must provide a detailed description of the proposed parental consent methods, together with an analysis of how the methods meet § 312.5(b)(1). The request shall be filed with the Commission’s Office of the Secretary. The Commission will publish in the FEDERAL REGISTER a document seeking public comment on the request. The Commission shall issue a written determination within 120 days of the filing of the request; and

(b) Support for internal operations of the website or online service. An interested party may file a written request for Commission approval of additional activities to be included within the definition of support for internal operations. To be considered for approval, a party must provide a detailed justification why such activities should be deemed support for internal operations, and an analysis of their potential effects on children’s online privacy. The request shall be filed with the Commission’s Office of the Secretary. The Commission will publish in the FEDERAL REGISTER a document seeking public comment on the request. The Commission shall issue a written determination within 120 days of the filing of the request.

§ 312.13 Severability.

The provisions of this part are separate and severable from one another. If any provision is stayed or determined to be invalid, it is the Commission’s intention that the remaining provisions shall continue in effect.

By direction of the Commission, Commissioner Rosch abstaining, and Commissioner Ohlhausen dissenting.

Donald S. Clark

Secretary