
Review of the Safe Harbor soon?

Posted by fgilbert on March 27th, 2014

In a short statement following the EU-US summit held in Brussels earlier this week, Herman Van Rompuy, President of the European Council, announced on March 27, 2014, that the United States and the European Union have agreed to take steps to address the concerns caused by last year's revelations about the US NSA surveillance programs and to restore trust.

He indicated that, with respect to commercial use of personal data, the United States “have agreed to a review of the so-called Safe Harbour framework” to ensure transparency and legal certainty. In addition, with respect to government access to personal data, the parties will “negotiate an umbrella agreement on data protection by this summer, based on equal treatment of EU and US citizens.”

The full text of Mr. Van Rompuy's statement is available at http://www.consilium.europa.eu/uedocs/cms_data/docs/pressdata/en/ec/141919.pdf

 

New FTC COPPA Rule will better protect 21st century children

Posted by fgilbert on December 19th, 2012

The Federal Trade Commission's final updated COPPA Rule, published this morning (December 19, 2012), brings online child protection into the 21st century. While most of the high-level requirements, which stem directly from the Children's Online Privacy Protection Act (COPPA), remain unchanged, the updated Rule contains references to modern technologies such as geolocation, plug-ins and mobile apps, and to modern methods of financing websites, such as behavioral targeting. It also takes into account more than ten years of practice and attempts to address some of the shortcomings and complexities of the prior rule. For example, the new Rule requires better accountability from Safe Harbor programs, which will have to audit their members annually and report annually to the FTC on the outcome of these reviews. It also requires better accountability from companies. Companies that release children's personal information to third parties, whether service providers or otherwise, will be responsible for ensuring that these third parties are capable of protecting the confidentiality, security, and integrity of children's personal information, and that they actually do provide these protections when handling the children's information in their custody.

 

More covered entities

The new definition of “operator” now also covers a website or online service directed to children that integrates outside services, such as plug-ins or ad networks. The new definition of “website or online service directed to children” will also include a plug-in or ad network that has actual knowledge that it is collecting personal information through a child-directed website or service.

 

More personal information protected

The definition of personal information is expanded to include:

  • Geolocation information
  • Photos, videos, and audio files that contain a child’s image or voice
  • Persistent identifiers, such as IP address or mobile device IDs, that can be used to recognize a user over time and across different websites or online services.

 

More permitted activities

Conversely, more activities are specifically permitted. These include contextual advertising, frequency capping, legal compliance, site analysis, and network communications. However, this does not include behavioral advertising. Parental consent is required when using or disclosing information to contact a specific person or develop a profile on that person.
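The Rule itself is technology-neutral, but a minimal sketch (in Python, with hypothetical names) may help illustrate why frequency capping can operate without parental consent: the persistent identifier is used only to count how many times a given ad has been shown to a browser, not to build a profile of the user.

```python
# Hypothetical sketch of frequency capping: a persistent identifier is used
# only to count impressions of an ad for a given browser, not to profile it.
from collections import defaultdict

MAX_IMPRESSIONS = 3  # cap chosen purely for illustration

# impression_counts[(persistent_id, ad_id)] -> times the ad was shown to that browser
impression_counts = defaultdict(int)

def should_serve_ad(persistent_id: str, ad_id: str) -> bool:
    """Return True if the ad may still be shown to this browser."""
    return impression_counts[(persistent_id, ad_id)] < MAX_IMPRESSIONS

def record_impression(persistent_id: str, ad_id: str) -> None:
    """Record one impression; no other data about the user is kept."""
    impression_counts[(persistent_id, ad_id)] += 1

# Example: the first three impressions are served, the fourth is suppressed.
for _ in range(4):
    if should_serve_ad("cookie-abc123", "ad-42"):
        record_impression("cookie-abc123", "ad-42")

print(impression_counts[("cookie-abc123", "ad-42")])  # -> 3, the cap
```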

 

New forms of disclosure

The Rule still requires a direct notice to parents in addition to the online notice of information practices, but it streamlines what websites or services must disclose in the online privacy statements describing their information practices.

 

New forms of parental consent

The new Rule offers more ways in which parents can communicate their consent. These additional means include electronic scans of signed parental consent forms (in addition to mail and fax), videoconferencing, use of a government-issued ID, and use of online payment systems (other than credit or debit cards) that provide notification of each discrete transaction to the primary account holder.

 

Stronger security and confidentiality

While operators continue to be responsible for protecting the confidentiality, security and integrity of children’s information, they will be required, in addition, to ensure, before releasing information to service providers and third parties, that these entities are capable of maintaining the confidentiality, security, and integrity of the information. They will be responsible for obtaining assurances that these measures will be maintained.

 

New limited retention and disposal rules

Operators will be expected to retain personal information collected online from a child for only as long as reasonably necessary to fulfill the purpose for which the information was collected. They will also be required to delete such information by using reasonable measures to protect against unauthorized access to, or use of, the information in connection with its deletion.
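The Rule does not prescribe any particular technique, but as a rough illustration, a retention limit of this kind could be enforced with a scheduled purge job along the following lines; the table layout and the one-year retention period are assumptions made for the example.

```python
# Hypothetical sketch: purge records collected from children once they are
# older than a retention period. Schema and period are illustrative only.
import sqlite3
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=365)  # illustrative retention period

def purge_expired_child_records(db_path: str) -> int:
    """Delete child records that have outlived the purpose for which they were collected."""
    cutoff = (datetime.now(timezone.utc) - RETENTION).isoformat()
    with sqlite3.connect(db_path) as conn:  # the context manager commits on success
        cur = conn.execute(
            "DELETE FROM child_records WHERE collected_at < ?", (cutoff,)
        )
    return cur.rowcount  # number of records disposed of
```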

 

New monitoring and reporting requirements

The new Rule strengthens the FTC’s oversight of safe harbor programs. Safe harbor programs will be required to arrange for annual assessments of operators’ compliance with the program guidelines, and to provide the FTC with an annual report of the aggregated results of these independent assessments.

 

Compete web analytics under FTC supervision for 20 years

Posted by fgilbert on October 22nd, 2012

The Federal Trade Commission has published a proposed settlement with Compete, Inc., a web analytics company, for violation of Section 5 of the FTC Act in connection with its collection, use, and lack of protection of personal information (including some highly sensitive information).

Compete uses tracking software to collect data on the browsing behavior of millions of consumers. Then, it uses the data to generate reports, which it sells to clients who want to improve their website traffic and sales.

According to the FTC, consumers were invited to join a “Consumer Input Panel,” which was promoted using ads that pointed consumers to a Compete website, www.consumerinput.com. Compete told consumers that by joining the “Panel” they could win rewards while sharing their opinions about products and services. It also promised that consumers who installed the Compete Toolbar (from compete.com) could have “instant access” to data about the websites they visited.

Compete did not disclose to consumers that it would collect detailed information such as information they provided in making purchases, not just “the web pages you visit.” Once installed, the Compete tracking component operated in the background, and automatically collected information that consumers entered into websites, such as usernames, passwords, search terms, credit card and financial account information, security codes and expiration dates, and Social Security Numbers.

In addition, Compete represented to consumers that their personal information would be removed from the data it collected before transmitting it to its servers, and that it would take reasonable security measures to protect against unauthorized access to, alteration, disclosure, or destruction of personal information.

The FTC accused Compete of violating federal law by using web-tracking software that collected personal data without disclosing the extent of the collection, and by failing to honor the promises it made to protect the collected personal data: it did not provide reasonable and appropriate data security; it transmitted sensitive information from secure websites in readable text; it failed to design and implement reasonable safeguards to protect consumers’ data; and it failed to use readily available measures to mitigate the risks to consumers’ data.

The proposed settlement order would require Compete and its licensees to:

  • Fully disclose what information they collect;
  • Obtain consumers’ express consent before collecting any data from Compete software downloaded onto consumers’ computers;
  • Delete or anonymize the consumer data it already has collected; and
  • Provide directions to consumers for uninstalling its software.

In addition, the settlement bars misrepresentations about the company’s privacy and data security practices and requires that it implement a comprehensive information security program with independent third-party audits every two years for 20 years. A copy of the proposed consent decree with Compete is available at: http://www.ftc.gov/os/caselist/1023155/121022competeincagreeorder.pdf

Compete also licensed its web-tracking software to other companies. Upromise, one of Compete’s licensees, settled similar FTC charges earlier this year. The final consent order is available at: http://www.ftc.gov/os/caselist/1023116/120403upromisedo.pdf.


FTC v. Google V2.0 – Lessons Learned

Posted by fgilbert on August 13th, 2012

The Federal Trade Commission has published its long-awaited Proposed Consent Order with Google to close its second investigation into Google’s practices (Google 2). Under the proposed document, Google would agree to pay a record $22.5 million civil penalty to settle charges that it misrepresented to users of Apple’s Safari browser that it would not place tracking cookies on their browsers or serve them targeted ads. It would also have to disable all tracking cookies that it had said it would not place on consumers’ computers, and report to the FTC by March 8, 2014 on how it has complied with this remediation requirement.

Google 2 Unique Aspects

Unlike most consent orders published by the FTC, the Google 2 Consent Order does not primarily address the actual violations of privacy promises made to consumers. Rather, it addresses the fact that Google’s activities allegedly violated a prior settlement with the FTC, dated October 2011 (Google 1).

As such, beyond evidencing the FTC’s ongoing efforts to ensure that companies live up to the privacy promises that they make to consumers, Google 2 clearly shows that the FTC takes seriously the commitments that it requires from companies that it has previously investigated. When an FTC consent decree requires a 20-year commitment to abide by certain practices, the FTC may, indeed, return and ensure that the obligations outlined in the consent decree are met.

Privacy Promises are made everywhere

A significant aspect of the proposed Google 2 Consent Order and related Complaint is that privacy promises are made in numerous places beyond a company’s online privacy statement. They are found, as well, in other representations made by the company, such as its regulatory filings or its marketing and promotional documents. In the Google 1 enforcement action, the FTC looked at the promises and representations made in Google’s Safe Harbor self-certification filings. In the Google 2 enforcement action, the FTC looked at the promises and representations made in Google’s statements that it complied with the Self-Regulatory Code of Conduct of the Network Advertising Initiative (NAI).

Misrepresentation of compliance with NAI Code

In the third count of the FTC Complaint in Google 2, the FTC focuses on Google’s representation that it adheres to, or complies with the NAI Self-Regulatory Code of Conduct. The alleged violation of this representation allows the FTC to claim that Google violated its obligation under Google 1 to not “misrepresent the extent to which it complies with, or participates in, a privacy, security, or other compliance program sponsored by the government or any other entity”.

Evolution of the FTC Common Law

Google 2 shows a clear evolution of the FTC “Common Law” of Privacy. As the concept of privacy compliance evolves, the nature of the FTC’s investigations becomes more refined and more expansive. In its prior cases, the FTC first focused on violations of companies’ privacy promises made in their public Privacy Statements. Then, more recently, in several consent orders - including Google 1 - the FTC expanded the scope of its enforcement actions to include violations of the Safe Harbor Principles outlined by the US Department of Commerce and the EU Commission. Now, with Google 2, the FTC again expands the scope of its enforcement actions to include potential violations of representations of compliance with the NAI Self-Regulatory Code of Conduct. This trend is likely to continue, and in future cases we should expect to see the FTC’s investigations expand to verifying compliance with statements that a company follows other self-regulatory industry standards.

What Consequences for Businesses?

Companies often use their membership in industry groups or privacy programs as a way to show their values and to express their commitment to certain standards of practice. This was the case for Google with the Safe Harbor program of the Department of Commerce and the European Union (Google 1), and with the Network Advertising Initiative (Google 2).

These promises to comply with the rules of a privacy program are not just statements made for marketing purposes. The public reads them, and so do the FTC and other regulators.

Privacy programs such as the Safe Harbor or the NAI Code have specific rules.  As shown in the Google 1 and Google 2 cases, failure to comply with the rules, principles and codes of conducts associated with membership in these programs could be fatal.

If the disclosures made are not consistent with the actual practices and procedures, such a deficiency would expose the company to claims of unfair and deceptive practices or, as in the case of Google, to substantial fines for failure to comply with an existing consent decree barring future misrepresentations.

If your company makes promises or statements about its privacy – or security – practices, remember and remind your staff that these representations may have significant consequences, and may create a minefield if not attended to properly; and

  • Look for these representations everywhere, and not just in the official company Privacy Statement; for example, look at the filings and self-certification statements, the cookie disclosures, the marketing or sales material, the advertisements;
  • Periodically compare ALL promises that your business makes with what each of your products, services, applications, technologies, devices, cookies, tags, etc. in existence or in development actually does;
  • Educate your IT, IS, Marketing, Communications, Sales, and Legal teams about the importance of working together and coordinating efforts, so that those who develop statements and disclosures about the company’s policies and values fully understand, and are aware of, all features and capabilities of the products or services that others in the company are designing and developing;
  • If your company claims that it is a member of a self-regulatory or other privacy compliance program, make sure that you understand the rules, codes of conduct or principles of these programs or industry standards; and ensure that the representations of your company’s compliance with these rules, codes of conduct, principles are accurate, clear and up-to-date;
  • Ensure that ALL of your company’s products and services comply and are consistent with ALL of the promises made by, or on behalf of, the company in ALL of its statements, policies, disclosures, and marketing materials, at ALL times.

Proposed Changes to FTC COPPA Rule

Posted by fgilbert on August 1st, 2012

The FTC has issued a Notice of Proposed Rulemaking (NPRM) seeking comments on proposed changes to the COPPA Regulations. These changes are intended to take into account the evolution of web technologies, such as plug-ins and the use of third-party cookies and ad networks; they would also clarify some of the requirements for websites that contain child-oriented material that may appeal to both parents and children. This new NPRM pertains to changes to the COPPA Regulation that diverge from those the FTC presented in its September 2011 proposal.

  • Expansion of the definitions of “operator” and “website or service directed to children”

The proposed changes to the definitions of “operator” and “website or online service directed to children” would clarify that an operator that integrates the services of third parties that collect personal information from visitors of its site or service would itself be considered a covered “operator” under the Rule. Further, an ad network or plug-in would also be subject to COPPA if it knows or has reason to know that it is collecting personal information through a child-directed site or service.

  • Clarification of the definition of “personal information”

The proposed change to the definition of “personal information” would make it clear that a persistent identifier – e.g., a persistent cookie – would be deemed “personal information” subject to the Rule if it can be used to recognize a user over time or across different sites or services.

However, the use of tracking technologies or identifiers for authenticating users, improving navigation, performing site analysis, maintaining user preferences, serving contextual ads, and protecting against fraud and theft would not be considered the collection of “personal information,” provided the collected data is not used or shared to contact a specific individual, e.g., for behaviorally targeted advertising.

  • Mixed audience websites

The proposed changes would also clarify that mixed audience websites that contain child-oriented content and whose audience includes both young children and others, including parents, would be allowed to age-screen all visitors in order to provide COPPA’s protections only to users under age 13. However, those child-directed sites or services that knowingly target children under 13 as their primary audience, or whose overall content is likely to attract children under age 13 as their primary audience, would still be required to treat all users as children.
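By way of illustration only, the age-screening logic for a mixed audience site might look something like the following sketch; the flow names are hypothetical, and a real implementation would ask the age question in a neutral way and apply COPPA's notice-and-consent requirements in the under-13 branch.

```python
# Hypothetical sketch of a neutral age screen on a mixed-audience site:
# visitors who report an age under 13 get the COPPA flow (parental notice
# and consent before any personal information is collected); others do not.
from datetime import date
from typing import Optional

COPPA_AGE = 13

def age_from_birthdate(birthdate: date, today: Optional[date] = None) -> int:
    """Compute age in whole years."""
    today = today or date.today()
    return today.year - birthdate.year - (
        (today.month, today.day) < (birthdate.month, birthdate.day)
    )

def route_visitor(birthdate: date) -> str:
    """Return which registration flow the visitor should see."""
    if age_from_birthdate(birthdate) < COPPA_AGE:
        # COPPA flow: collect a parent's contact information and obtain
        # verifiable consent before collecting the child's personal information.
        return "coppa_flow"
    return "standard_flow"

# Example: a visitor born June 1, 2015 is routed to the COPPA flow.
print(route_visitor(date(2015, 6, 1)))
```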

  • Text of the Notice of Proposed Rule Making

The text of the Notice of Proposed Rule Making is available at http://www.ftc.gov/os/2012/08/120801copparule.pdf

Remove any P2P Filesharing Software from your Network

Posted by fgilbert on June 7th, 2012

Remove any P2P filesharing software from your network or be prepared to enter into a 20-year relationship with the Federal Trade Commission. This is what will happen to EPN, Inc., a debt collection business based in Provo, Utah, and to Franklin’s Budget Car Sales, Inc., a car dealership in Statesboro, Georgia. In both cases, the P2P software caused sensitive personal information of thousands of consumers to be accessible to users of other computers connected to the same peer-to-peer network.

On June 7, 2012, the FTC published proposed settlement agreements with these two businesses because they had allowed peer-to-peer file sharing software to be installed on their network.

The FTC case against EPN, Inc. alleges that the lack of security measures at the company allowed the company’s COO to install P2P file-sharing software on the company’s network. As a result, sensitive information, including Social Security numbers, health insurance numbers, and medical diagnosis codes of 3,800 hospital patients, was made available to any computer connected to the P2P network.

The case against Franklin’s Budget Car Sales, Inc. alleges that the installation of P2P software on the company’s network resulted in sensitive financial information of 95,000 consumers, such as names, addresses, Social Security numbers, dates of birth, and driver’s license numbers, being made available on the P2P network.

In both cases, the companies were charged with failure to observe commonly used best practices:

  • Failure to have an appropriate information security plan;
  • Failure to assess risks to the consumer information collected and stored online;
  • Failure to use reasonable measures to ensure the security of the network, such as scanning its networks to identify any P2P file-sharing applications operating on them (a rough sketch of such a scan appears after this list);
  • Failure to adopt policies to prevent or limit unauthorized disclosure of information;
  • Failure to prevent, detect and investigate unauthorized access to personal information on the company’s networks;
  • Failure to adequately train employees;
  • Failure to employ reasonable measures to respond to unauthorized access to personal information.
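The FTC complaints do not prescribe a specific scanning tool, but the network-scanning step mentioned in the list above could be approximated with something like the following sketch, which flags hosts that accept connections on ports commonly associated with P2P file-sharing clients; the port list and address range are illustrative assumptions.

```python
# Hypothetical sketch: flag hosts that are listening on ports commonly used
# by P2P file-sharing clients. Port list and address range are illustrative.
import socket

P2P_PORTS = {6346, 6347, 6881, 6889, 4662}  # e.g., Gnutella, BitTorrent, eDonkey

def host_has_p2p_port(host: str, timeout: float = 0.5) -> bool:
    """Return True if the host accepts a connection on a known P2P port."""
    for port in P2P_PORTS:
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            continue  # port closed, filtered, or host unreachable
    return False

if __name__ == "__main__":
    # Scan a small private address range (assumed for the example).
    for last_octet in range(1, 255):
        host = f"192.168.1.{last_octet}"
        if host_has_p2p_port(host):
            print(f"Possible P2P client listening on {host}")
```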

Failure to implement reasonable and appropriate data security measures as described above was an unfair act or practice and violated federal law, namely Section 5 of the FTC Act. In addition, Franklin’s Budget Car Sales, as a “financial institution” subject to the Gramm-Leach-Bliley Act (GLBA), was found to have violated both the GLBA Safeguards Rule and Privacy Rule by failing to provide annual privacy notices and a mechanism by which consumers could opt out of information sharing with third parties.

The proposed consent order against EPN and Franklin would require the companies to establish and maintain comprehensive information security programs, and cease any misrepresentation about their data handling practices. The settlement orders with the two companies are substantially similar. They:

  • Bar any future misrepresentations about the privacy, security, confidentiality, and integrity of any personal information;
  • Require the companies to establish and maintain a comprehensive information security program; and
  • Require the companies to undergo data security audits by independent auditors every other year for 20 years.

As always with FTC consent orders, each violation of such an order may result in a civil penalty of up to $16,000.

 


FTC v. Myspace

Posted by fgilbert on May 8th, 2012

On May 8, 2012, Myspace agreed to settle Federal Trade Commission charges that it misrepresented its protection of users’ personal information.

The two major issues at stake were misrepresentation of privacy practices and misrepresentation of compliance with Safe Harbor principles.

Misrepresentation of Privacy Practices

Myspace assigns a persistent unique identifier, called a “Friend ID,” to each profile created on Myspace. A user’s profile may publicly display the user’s name, age, gender, picture, hobbies, interests, and lists of users’ friends. 

The Myspace privacy policy promised that it would not share a user’s personally identifiable information, or use such information in a way that was inconsistent with the purpose for which it was submitted, without prior notice to, and consent from, the user. It also promised that the information used to customize ads would not individually identify users to third parties, and that Myspace would not share users’ non-anonymized browsing activity.

The FTC charged that Myspace provided advertisers with the Friend ID of users who were viewing particular pages on the site. Advertisers could use the Friend ID to locate a user’s Myspace profile and obtain personal information publicly available on the profile. Advertisers also could combine the user’s real name and other personal information with additional information to link broader web-browsing activity to a specific individual.

Misrepresentation of Compliance with Safe Harbor Principles

Myspace certified that it complied with the U.S.-EU Safe Harbor principles, which include a requirement that consumers be given notice of how their information will be used and the choice to opt out.

The FTC alleged that the way in which Myspace handled personal information was inconsistent with its representations of compliance with the Safe Harbor principles.

Proposed Settlement

The proposed settlement order would:

  • Bar Myspace from misrepresenting the extent to which it protects the privacy of users’ personal information;
  • Bar Myspace from misrepresenting the extent to which it belongs to or complies with any privacy, security or other compliance program, including the U.S.-EU Safe Harbor Framework;
  • Require Myspace to establish a comprehensive privacy program designed to protect consumers’ information;
  • Require Myspace to obtain biennial assessments of its privacy program by independent, third-party auditors for 20 years; and
  • Expose Myspace to a civil penalty of up to $16,000 for each future violation, if any, of the consent order.

The proposed settlement is open for comments; it will be finalized and will become effective after the end of the comment period.

 


Mobile App Privacy Webinar on April 19, 2012

Posted by fgilbert on April 17th, 2012

On Thursday, April 19, 2012, at 10am PT / 1pm ET, I will be moderating and presenting at a one-hour webinar organized by the Practising Law Institute: “A New Era for Mobile Apps? What Companies Should Know to Respond to Recent Mobile Privacy Initiatives”.

The webinar will start with an overview of the technologies and ecosystem that surround the operation and use of mobile applications, presented by Chris Conley, Technology and Civil Liberties Attorney, ACLU Northern California (San Francisco).

Patricia Poss, Chief, BCP Mobile Technology Unit, Federal Trade Commission (Washington DC), will then comment on the two reports recently published by the Federal Trade Commission: “Mobile Apps for Children” (February 2012) and the final report “Protecting Consumer Privacy in an Era of Rapid Change” (March 2012), both of which lay out a framework for mobile players.

I will follow with an overview of the recent agreement between the California State Attorney General and six major publishers of mobile apps, which sets up basic rules and structures for the publication and enforcement of mobile app privacy policies, and the Consumer Privacy Bill of Rights, which was unveiled by the White House in February 2012.  I will end with suggestions for implementing privacy principles in the mobile world.

To register for this webinar, please visit the PLI website.

 

FTC issues Report on Kids Privacy & Mobile Apps

Posted by fgilbert on February 16th, 2012

On February 16, 2012, the FTC released a new Report on Privacy issues in Mobile Apps. There are good lessons to be drawn from the document, both for mobile apps developers and for companies that operate websites. What is true for mobile apps is generally also true for websites.

Among other things, the report recommends:

  • Everyone – stores, developers and third parties providing services – should play an active role in providing key information to parents.
  • Information about data practices should be provided in simple and short disclosures.
  • It should be clear whether the app connects with social media.
  • It should be clear whether it contains ads.
  • Third parties that collect data also should disclose their privacy practices.
  • App stores also should take responsibility for ensuring that parents have basic information.

The full report is available at: http://www.ftc.gov/opa/2012/02/mobileapps_kids.shtm


Never too Small to Face an FTC COPPA Action

Posted by fgilbert on November 9th, 2011

Some companies think that, because they are small, they can fly under the radar and need not worry about compliance. They should rethink their analysis of their legal risks after the recent FTC action against a small social networking site.

On November 8, 2011, the FTC announced a proposed settlement with the social networking site www.skidekids.com, which collected personal information from children without obtaining prior parental consent, in violation of COPPA, and made false statements in its website privacy notice, in violation of the FTC Act.

In this case, the personal information of 5,600 children was illegally collected. This is far fewer records than in some of the recent FTC COPPA enforcement actions: the 2006 action against Xanga revealed that Xanga had collected 1.7 million records; the 2008 action against Sony revealed that Sony had collected 30,000 records; and the 2011 action against W3 Innovations identified 50,000 illegally collected records.

The Problem

The social networking site Skid-e-kids targeted children ages 7-14 and allowed them to register, create and update profile information, create public posts, upload pictures and videos, send messages to other Skid-e-kids members, and “friend” them.

According to the FTC complaint, the website owner – a sole proprietor – was charged with:

  • Failing to provide sufficient notice of its personal data handling practices on its website;
  • Failing to provide direct notice to parents about these practices; and
  • Failing to obtain verifiable parental consent.

In addition, these practices were found to be misleading and deceptive, which in turn was deemed to violate Section 5 of the FTC Act.

The site’s online privacy statement claimed that the site required child users to provide a parent’s valid email address in order to register on the website, and that it used this information to send the parent a message that could be used to activate the Skid-e-kids account, to notify the parent about its privacy practices, and to send the parent communications about features of the site.

According to the FTC, however, Skid-e-kids actually registered children on the website without collecting a parent’s email address or obtaining permission for their children to participate. Children who registered were able to provide personal information, including their date of birth, email address, first and last name, and city.
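For illustration only, the registration flow that the privacy statement described (collect a parent's email address and hold the account until the parent activates it) might be sketched roughly as follows; the function names, in-memory storage, and mail helper are all hypothetical.

```python
# Hypothetical sketch of the registration flow the privacy statement described:
# a child's account is created only after a parent's email address is provided
# and the parent follows an activation link. Names and the mail helper are made up.
import secrets

pending_accounts = {}   # activation_token -> registration data
active_accounts = {}    # username -> registration data

def send_email(to_address: str, subject: str, body: str) -> None:
    """Placeholder for a real mail sender."""
    print(f"To: {to_address}\nSubject: {subject}\n\n{body}")

def register_child(username: str, parent_email: str) -> None:
    """Hold the registration until a parent activates it."""
    if not parent_email:
        raise ValueError("A parent's email address is required to register.")
    token = secrets.token_urlsafe(16)
    pending_accounts[token] = {"username": username, "parent_email": parent_email}
    send_email(
        parent_email,
        "Activate your child's account",
        f"Review our privacy practices, then activate the account: /activate/{token}",
    )

def activate(token: str) -> None:
    """Called when the parent follows the activation link."""
    registration = pending_accounts.pop(token)
    active_accounts[registration["username"]] = registration
```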

The Proposed Settlement

The proposed Consent Decree and Settlement Order against Jones O. Godwin, sole owner of the site www.skidekids.com is available at http://www.ftc.gov/os/caselist/1123033/111108skidekidsorder.pdf. The proposed settlement would:

  • Bar Skid-e-Kids from future violations of COPPA and misrepresentations about the collection and use of children’s information;
  • Require the deletion of all information collected from children in violation of the COPPA Rule;
  • Require that the site post a clear and conspicuous link to www.onguardonline.gov, the FTC site focusing on the protection of children’s privacy, and that the site privacy statement as well as the privacy notice for parents also contain a reference to the On Guard Online site;
  • Require that, for 5 years, the company engage qualified privacy professionals to conduct annual assessments of the effectiveness of its privacy controls, or become a member in good standing of a COPPA Safe Harbor program approved by the FTC;
  • Require that, for 8 years, records be kept to demonstrate compliance with the above.

A lenient fine … subject to probation

An interesting aspect of the proposed settlement is that it, in effect, imposes only a $1,000 fine on the defendant. The fine is to be paid within five days of the entry of the order. However, if Skid-e-Kids fails to comply with some of the requirements of the settlement, it will have to pay the full $100,000 fine that is provided for in the settlement.

Specifically, a $100,000 fine will be assessed if:

  • The defendant fails (a) to have an initial and then annual privacy assessments (for a total of 5 assessments) conducted by a qualified professional approved by the FTC, identifying the privacy controls that have been implemented and how, and certifying that the controls are sufficiently effective; or (b) to become a member in good standing of a COPPA Safe Harbor program approved by the FTC for 5 years; or
  • The disclosures made about the defendant’s financial condition are materially inaccurate or contain material misrepresentations.

The Lesson for Sites with Children’s Content

This new case is a reminder that the COPPA Rule contains specific requirements that must be followed, no matter the size of the site, when intending to collect children’s personal information. The COPPA Rule defines procedures and processes that must be followed rigorously.

Among other things, the COPPA Rule requires websites that are directed to children, and general audience websites that have actual knowledge that they are collecting children’s information, to:

  • Place on its website a conspicuous link to its privacy statement;
  • Provide specified information in the website privacy statement, describe in clear terms what personal information of children is collected and how it is used, and explain what rights children and parents have to review and delete this information;
  • Provide a notice directly to the parents, which must include the website privacy statement, and inform the parents that their consent is required for the collection and use of the children’s information by the site, and how their consent can be obtained;
  • Obtain verifiable consent from the parents before collecting or using the children’s information;
  • Give parents the option to agree to the collection and use of the children’s information without agreeing to the disclosure of this information to third parties.

In addition, we suggest also including, clearly and conspicuously, a notice inviting the user to visit the On Guard Online website of the Federal Trade Commission for tips on protecting children’s privacy online (www.onguardonline.gov/topics/kids-privacy.aspx): (a) in the website privacy statement; (b) in the notice to parents; and (c) at each location where personal information is collected.