
FTC Guidance – Six Steps Towards More Secure Cloud Computing

Posted by fgilbert on June 28th, 2020

The June 15, 2020 FTC blog post, titled Six Steps Towards More Secure Cloud Computing, provides a concise, valuable checklist for businesses that use, or intend to use, cloud services to make that use safer. The document is a reminder of the basic golden rules of data security when using a third-party service provider.

  • Security is your responsibility
  • Take regular inventories; know what data you have, and where it is
  • Don’t store what you don’t need
  • Take advantage of the security features offered by your cloud service provider to meet your own security obligations
  • Evaluate the risks to the data, and carefully use the controls offered by your CSP
  • Make good use of encryption
  • Stay alert; security is a never-ending quest.


Facebook : Record Settlement

Posted by fgilbert on July 12th, 2019

Facebook might be required to pay a $5 billion fine to the Federal Trade Commission to settle the investigation of the Cambridge Analytica data scandal, according to a report published by Bloomberg Law.

Customarily, these settlements are published for consultation, and become final several weeks or months later.

Social Networking App to pay $5.7 M Fine in COPPA Case

Posted by fgilbert on February 27th, 2019

On February 27, 2019, the operators of the video social networking app now known as TikTok agreed to pay a $5.7 million fine to settle allegations by the Federal Trade Commission that the company illegally collected personal information from children.[1] This amount is the largest ever obtained by the FTC in a children’s privacy case.
The app was widely used: since 2014, 65 million accounts have been registered in the United States and more than 200 million worldwide. The complaint[2] noted that the app’s operators were aware that a significant percentage of users were younger than 13 and had received thousands of complaints from parents that their children under 13 had created accounts.
In its complaint, the FTC alleged that the company violated COPPA and the COPPA Rule[3] by failing to notify parents about the collection and use of personal information from users under 13, failing to obtain parental consent before such collection and use, and failing to delete personal information at parents’ request.
To register, users had to provide first and last name, user name, a short biography, and a profile picture, as well as an email address and phone number. The app allowed users to create short videos lip-syncing to music and share those videos with other users.  It also allowed users to interact with other users by commenting on their videos and sending direct messages.
According to the complaint, user accounts were public by default; a child’s profile bio, username, picture, and videos could be seen by other users. While the site allowed users to change their default setting from public to private so that only approved users could follow them, users’ profile pictures and bios remained public, and users could still send them direct messages. The complaint noted that there had been public reports of adults trying to contact children through the app.
Among other things, the settlement includes a $5.7 million fine, an obligation to take offline all videos made by children under the age of 13, and an ongoing obligation to comply with COPPA.[4]

[1] Settlement Order available at:
[2] Complaint available at:
[3] COPPA Rule available at:
[4] Settlement Order available at:

Yelp to pay $450,000 penalty for COPPA violation

Posted by fgilbert on September 17th, 2014

The Federal Trade Commission has announced a proposed settlement with Yelp, Inc. for COPPA violations. The FTC alleged that, for five years, Yelp illegally collected and used the personal information of children under 13 who registered on its mobile app service.

According to the FTC complaint, Yelp collected personal information from children through the Yelp app without first notifying parents and obtaining their consent. The Yelp app registration process required individuals to provide their date of birth. Several thousand registrants provided a date of birth showing they were under 13 years old. Even though it had knowledge that these registrants were children, Yelp did not follow the requirements of the COPPA Rule and collected their personal information without proper notice to, and consent from, their parents. Information collected included name, e-mail address, geolocation, and any other information that these children posted on Yelp. In addition, the complaint alleges that Yelp did not adequately test its app to ensure that users under 13 were prohibited from registering.

Under the terms of the proposed settlement agreement, among other things, Yelp must:

  • pay a $450,000 civil penalty;
  • delete information it collected from individuals who stated they were 13 or younger at the time they registered for the service; and
  • submit a compliance report to the FTC in one year outlining its COPPA compliance program.

In a separate action, the FTC alleged that TinyCo also improperly collected children’s information in violation of COPPA. Under the settlement agreement between TinyCo and the FTC, TinyCo will pay a $300,000 civil penalty.

The FTC Recommends Data Broker Legislation

Posted by fgilbert on May 27th, 2014

The Federal Trade Commission (FTC) is calling for legislation to shed some light on data brokers’ practices and give consumers some control over the use of their personal information. In its 110-page report, “Data Brokers: A Call for Transparency and Accountability”, published on May 27, 2014, the FTC outlines the content of the legislation that it is recommending to enable consumers to learn of the existence and activities of data brokers and to have reasonable access to information about them held by data brokers.

This report is the result of an 18-month study of the practices of nine data brokers – Acxiom, CoreLogic, Datalogix, eBureau, ID Analytics, Intelius, PeekYou, Rapleaf, and Recorded Future. The study started in December 2012, when the FTC served orders on these data brokers, requiring them to provide information about their collection and use of consumers’ personal data.

In its Data Broker Report, the FTC observes that data brokers collect and store billions of data elements covering nearly every U.S. consumer. The extent of consumer profiling is such that these data brokers know the minute details of consumers’ everyday lives, such as income, socioeconomic status, political and religious affiliations, online and in-store purchases, social media activity, and magazine subscriptions. The ability to create such detailed and precise profiles creates significant privacy concerns. According to the FTC Data Broker Report, one of the data brokers studied holds 700 billion data elements, and another adds more than 3 billion new data points to its database each month.

In most cases, data is collected behind the scenes, without consumer knowledge. The FTC Data Broker Report notes that personal data often passes through multiple layers of data brokers who share data with each other. Data brokers combine online and offline data, which may result in potentially sensitive inferences such as those related to ethnicity, income, religion, political leanings, age, or health conditions, such as pregnancy, diabetes, or high cholesterol. Many of the purposes for which data is collected pose risks to consumers, such as unanticipated secondary uses of the data. For instance, data collected to offer discounts to potential purchasers of motorcycles could also be interpreted by an insurance provider as a sign of risky behavior, resulting in an increase in life insurance premiums. Some data brokers unnecessarily store data about consumers indefinitely, which may create security risks, in addition to the privacy risks described above.

The FTC Data Broker Report recommends that Congress enact legislation to require the following:

For brokers that provide marketing products:

  • The creation of a centralized mechanism, such as an Internet portal, where data brokers can identify themselves, describe their information collection and use practices, and provide links to access and opt-out tools;
  • Data brokers to give consumers access to their data, including any sensitive data, at a reasonable level of detail;
  • Data brokers to inform consumers that they derive certain inferences from raw data;
  • Data brokers to disclose the names and/or categories of their data sources, to enable consumers to correct wrong information with the original source;
  • Consumer-facing entities (e.g., retailers) to provide prominent notice to consumers when they share information with data brokers, along with the ability to opt-out of such sharing; and
  • Consumer-facing entities to obtain consumers’ affirmative express consent before collecting and sharing sensitive information with data brokers.

For brokers that provide “risk mitigation” products:

  • When a consumer-facing company uses a data broker’s risk mitigation product to assist in the decision-making process, that company would have to identify the information on which it relied when it decided to limit a consumer’s ability to complete a transaction;
  • Data brokers to allow consumers to access the information used and to correct it, as appropriate.

 For brokers that provide “people search” products:

  • Data brokers to allow consumers to access their own information;
  • Data brokers to allow consumers to opt-out of having the information included in a people search product;
  • Data brokers to disclose the original sources of the information so consumers can correct it;
  • Data brokers to disclose any limitations of an opt-out feature. 

What the FTC Data Broker Report means for data brokers and others

For the past few years, the Federal Trade Commission has monitored, and attempted to guide, online behavioral advertising and behavioral targeting. However, while it has repeatedly asked the advertising industry to self-regulate its practices, it has not suggested, much less outlined, proposed legislation.

With its 18-month evaluation of the data broker industry, and the issuance of its Data Broker Report on May 27, 2014, the Federal Trade Commission increases the pressure. This time, without asking for self-regulation, the FTC calls directly for legislation requiring transparency and accountability from data brokers and the availability of access and correction rights for consumers. This is an important step, which may also provide guidance in related areas.

In its Data Broker Report, the Federal Trade Commission limited the scope of its initiative to the use of big data by data brokers, i.e. entities that collect and process data for resale or licensing purposes. It did not address the use of big data by non-brokers – entities that are using the new, sophisticated tools available from big data technologies to mine a wide range of data about their own customers that they have accumulated over the years. While limiting its focus to a segment of the big data users, the FTC made a powerful call for legislation, and provided very specific direction on the principles that should be addressed in that legislation.

The FTC Data Broker Report is a major milestone compared with the recent White House Big Data Report (May 2014). The White House report suggested legislation based on the White House Consumer Privacy Bill of Rights (February 2012), but it did not identify with specificity the elements that this legislation should address or contain, nor did it explain how legislation based on the Consumer Privacy Bill of Rights would address the specific and unique issues raised by data brokers’ use of big data technologies and techniques.

The FTC Data Broker Report, on the other hand, provides a blueprint for legislation that focuses on the unique issues raised by the massive collection of personal data. The principles outlined by the FTC are more directly usable, more practicable, and more pragmatic. They are also better adapted to the idiosyncrasies of the world of data brokers, where all uses of data are secondary uses that were not anticipated – and probably not disclosed – in the privacy disclosures of the customer-facing companies that collected the data in the first place. Thus, it would be much easier to act upon the call for action and draft legislative text.

It should be further noted that, while the FTC Data Broker Report is limited to a specific market, the ideas that it submits to the U.S. legislature could easily be expanded or extrapolated to all users of big data, i.e., entities other than data brokers that use big data techniques and massive computing power for their internal purposes. Thus, entities other than data brokers that process large amounts of data with the intent of producing personal profiles, or of inferring personal interests, practices, or other characteristics of individuals, should consider evaluating the guidance provided in the FTC Data Broker Report – in addition to that provided in the White House Big Data Report – when trying to anticipate the direction that laws, regulations, and enforcement might take in the next few years with respect to secondary uses of personal data.

The FTC Data Broker Report is published at:




Review of the Safe Harbor soon?

Posted by fgilbert on March 27th, 2014

In a short statement following the EU-US summit held in Brussels earlier this week, Herman Van Rompuy, President of the European Council, announced on March 27, 2014, that the United States and the European Union have agreed to take steps to address the concerns caused by last year’s revelations regarding the U.S. NSA surveillance programs, and to restore trust.

He indicated that, with respect to commercial use of personal data, the United States “have agreed to a review of the so-called Safe Harbour framework” to ensure transparency and legal certainty. In addition, with respect to government access to personal data, the parties will “negotiate an umbrella agreement on data protection by this summer, based on equal treatment of EU and US citizens.”

The full text of Mr. Van Rompuy’s statement is available at


New FTC COPPA Rule will better protect 21st century children

Posted by fgilbert on December 19th, 2012

The Federal Trade Commission’s final updated COPPA Rule, published this morning (December 19, 2012), brings child protection online into the 21st century. While most of the high-level requirements, which stem directly from the Children’s Online Privacy Protection Act (COPPA), remain unchanged, the updated Rule contains references to modern technologies such as geolocation, plug-ins, and mobile apps, and to modern methods of financing websites, such as behavioral targeting. It also takes into account more than ten years of practice and attempts to address some of the shortcomings and complexities of the prior rule. For example, the new Rule requires better accountability from Safe Harbor programs, which will have to annually audit their members and report annually to the FTC on the outcome of these annual reviews. It also requires better accountability from companies. Companies that release children’s personal information to third-party service providers or other third parties will be responsible for ensuring that these third parties are capable of protecting the confidentiality, security, and integrity of children’s personal information, and that they actually provide these protections when handling the children’s information in their custody.


More covered entities

The new definition of “operator” now also covers websites or online services directed to children that integrate outside services, such as a plug-in or ad network.  The new definition of “website or online service” will also include plug-ins and ad networks that have actual knowledge that they are collecting personal information through a child-directed website or service.


More personal information protected

The definition of personal information is expanded to include:

  • Geolocation information
  • Photos, videos, and audio files that contain a child’s image or voice
  • Persistent identifiers, such as IP address or mobile device IDs, that can be used to recognize a user over time and across different websites or online services.


More permitted activities

Conversely, more activities are specifically permitted. These include contextual advertising, frequency capping, legal compliance, site analysis, and network communications. However, this does not include behavioral advertising. Parental consent is required when using or disclosing information to contact a specific person or develop a profile on that person.


New form of disclosures

The Rule still requires a direct notice to parents in addition to the online notice of information practices, but it streamlines what websites or services must disclose in the online privacy statements describing their information practices.


New forms of parental consent

The new Rule offers more ways in which parents can communicate their consent. These additional means include electronic scans of signed parental consent forms (in addition to mail and fax), videoconferencing, use of government-issued ID, and use of online payment systems (other than credit or debit cards) that provide notification of each discrete transaction to the primary account holder.


Stronger security and confidentiality

While operators continue to be responsible for protecting the confidentiality, security and integrity of children’s information, they will be required, in addition, to ensure, before releasing information to service providers and third parties, that these entities are capable of maintaining the confidentiality, security, and integrity of the information. They will be responsible for obtaining assurances that these measures will be maintained.


New limited retention and disposal rules

Operators will be expected to retain personal information collected online from a child for only as long as reasonably necessary to fulfill the purpose for which the information was collected. They will also be required to delete such information by using reasonable measures to protect against unauthorized access to, or use of, the information in connection with its deletion.


New monitoring and reporting requirements

The new Rule strengthens the FTC’s oversight of safe harbor programs. Safe harbor programs will be required to arrange for annual assessments of operators’ compliance with the program guidelines, and to provide the FTC with an annual report of the aggregated results of these independent assessments.


Compete web analytics under FTC supervision for 20 years

Posted by fgilbert on October 22nd, 2012

The Federal Trade Commission has published a proposed settlement with Compete, Inc., a web analytics company, for violation of Section 5 of the FTC Act in connection with its collection, use, and lack of protection of personal information (including some highly sensitive information).

Compete uses tracking software to collect data on the browsing behavior of millions of consumers. Then, it uses the data to generate reports, which it sells to clients who want to improve their website traffic and sales.

According to the FTC, consumers were invited to join a “Consumer Input Panel,” which was promoted using ads that pointed consumers to a Compete website. Compete told consumers that by joining the “Panel” they could win rewards while sharing their opinions about products and services. It also promised that consumers who installed the Compete Toolbar could have “instant access” to data about the websites they visited.

Compete did not disclose to consumers that it would collect detailed information such as information they provided in making purchases, not just “the web pages you visit.” Once installed, the Compete tracking component operated in the background, and automatically collected information that consumers entered into websites, such as usernames, passwords, search terms, credit card and financial account information, security codes and expiration dates, and Social Security Numbers.

In addition, Compete represented to consumers that their personal information would be removed from the data it collected before transmitting it to its servers, and that it would take reasonable security measures to protect against unauthorized access to, alteration, disclosure, or destruction of personal information.

The FTC accused Compete of violating federal law by using web-tracking software that collected personal data without disclosing the extent of the collection, and by failing to honor the promises it made to protect the collected personal data: not providing reasonable and appropriate data security; transmitting sensitive information from secure websites in readable text; failing to design and implement reasonable safeguards to protect consumers’ data; and failing to use readily available measures to mitigate the risks to consumers’ data.

The proposed settlement order would require Compete and its licensees to:

  • Fully disclose what information they collect;
  • Obtain consumers’ express consent before collecting any data from Compete software downloaded onto consumers’ computers;
  • Delete or anonymize the consumer data it already has collected; and
  • Provide directions to consumers for uninstalling its software.

In addition, the settlement bars misrepresentations about the company’s privacy and data security practices and requires that it implement a comprehensive information security program with independent third-party audits every two years for 20 years. A copy of the proposed consent decree with Compete is available at:

Compete also licensed its web-tracking software to other companies. Upromise, one of Compete’s licensees, settled similar FTC charges earlier this year. The final consent order is available at:


FTC v. Google V2.0 – Lessons Learned

Posted by fgilbert on August 13th, 2012

The Federal Trade Commission has published its long-awaited Proposed Consent Order with Google to close its second investigation into Google’s practices (Google 2). Under the proposed document, Google would agree to pay a record $22.5 million civil penalty to settle charges that it misrepresented to users of Apple’s Safari browser that it would not place tracking cookies on their browser or serve targeted ads. It would also have to disable all tracking cookies that it had said it would not place on consumers’ computers, and report to the FTC by March 8, 2014 on how it has complied with this remediation requirement.

Google 2 Unique Aspects

Unlike most consent orders published by the FTC, the Google 2 Consent Order does not primarily address the actual violations of privacy promises made. Rather, it addresses the fact that Google’s activities allegedly violate a prior settlement with the FTC, dated October 2011 (Google 1).

As such, beyond evidencing the FTC’s ongoing efforts to ensure that companies live up to the privacy promises that they make to consumers, Google 2 clearly shows that the FTC takes seriously the commitments that it requires from companies that it has previously investigated. When an FTC consent decree requires a 20-year commitment to abide by certain practices, the FTC may, indeed, return and ensure that the obligations outlined in the consent decree are met.

Privacy Promises are made everywhere

A significant aspect of the proposed Google 2 Consent Order and related Complaint is that privacy promises are made in numerous places beyond a company’s online privacy statement. They are also found in other representations made by the company, such as its regulatory filings or its marketing and promotional documents. In the Google 1 enforcement action, the FTC looked at the promises and representations made in Google’s Safe Harbor self-certification filings. In the Google 2 enforcement action, the FTC looked at the promises and representations made in Google’s statements that it complied with the Self-Regulatory Code of Conduct of the Network Advertising Initiative (NAI).

Misrepresentation of compliance with NAI Code

In the third count of the FTC Complaint in Google 2, the FTC focuses on Google’s representation that it adheres to, or complies with the NAI Self-Regulatory Code of Conduct. The alleged violation of this representation allows the FTC to claim that Google violated its obligation under Google 1 to not “misrepresent the extent to which it complies with, or participates in, a privacy, security, or other compliance program sponsored by the government or any other entity”.

Evolution of the FTC Common Law

Google 2 shows a clear evolution of the FTC “Common Law” of Privacy. As the concept of privacy compliance evolves, the nature of the FTC’s investigations becomes more refined and more expansive. In its prior cases, the FTC first focused on violations of companies’ privacy promises made in their public Privacy Statements. Then, more recently, in several consent orders – including Google 1 – the FTC expanded the scope of its enforcement actions to include violations of the Safe Harbor Principles outlined by the US Department of Commerce and the EU Commission. Now, with Google 2, the FTC expands again the scope of its enforcement actions to include potential violations of representations of compliance with the NAI Self-Regulatory Code of Conduct. This trend is likely to continue, and in future cases we should expect to see the FTC expand its investigations to verify compliance with statements that a company follows other self-regulatory industry standards.

What this means for businesses

Companies often use their membership in industry groups or privacy programs as a way to show their values, and to express their commitment to certain standards of practice. This was the case for Google with the Safe Harbor of the Department of Commerce and the European Union (Google 1), and with the Network Advertising Initiative (Google 2).

These promises to comply with the rules of a privacy program are not just statements made for marketing purposes. The public reads them, and so do the FTC and other regulators.

Privacy programs such as the Safe Harbor or the NAI Code have specific rules.  As shown in the Google 1 and Google 2 cases, failure to comply with the rules, principles and codes of conducts associated with membership in these programs could be fatal.

If the disclosures made are not consistent with the actual practices and procedures, such deficiency would expose the company to claims of unfair and deceptive practice; or in the case of Google, to substantial fines for failure to comply with an existing consent decree barring future misrepresentation.

If your company makes promises or statements about its privacy – or security – practices, remember, and remind your staff, that these representations may have significant consequences and may create a minefield if not attended to properly:

  • Look for these representations everywhere, and not just in the official company Privacy Statement; for example, look at the filings and self-certification statements, the cookie disclosures, the marketing or sales material, the advertisements;
  • Periodically compare ALL promises that your business makes with what each of your products, services, applications, technologies, devices, cookies, tags, etc. in existence or in development actually does;
  • Educate your IT, IS, Marketing, Communications, Sales, and Legal teams about the importance of working together and coordinating efforts, so that those who develop statements and disclosures about the company’s policies and values fully understand, and are aware of, all features and capabilities of the products or services that others in the company are designing and developing;
  • If your company claims that it is a member of a self-regulatory or other privacy compliance program, make sure that you understand the rules, codes of conduct or principles of these programs or industry standards; and ensure that the representations of your company’s compliance with these rules, codes of conduct, principles are accurate, clear and up-to-date;
  • Ensure that ALL of your company’s products and services comply, and are consistent, with ALL of the promises made by, or on behalf of, the company in ALL of its statements, policies, disclosures, and marketing materials, at ALL times.

FTC v. Google 2012 – Misrepresentation of Compliance with NAI Code a Key Element

Posted by fgilbert on August 9th, 2012

Google was hit with a $22.5 million penalty as a result of an investigation by the Federal Trade Commission covering Google’s practices with users of the Safari browser. A very interesting aspect of this new case against Google (Google 2) is that it raises the issue of Google’s violation of the Self-Regulatory Code of Conduct of the Network Advertising Initiative (NAI Code). This is an interesting evolution in the history of FTC enforcement. At first, the FTC focused on violations of privacy promises made in Privacy Statements; then it went on to pursue violations of the Safe Harbor Principles. In this new iteration, the FTC attacks misrepresentation of compliance with an industry standard.

Misrepresentation of user’s ability to control collection or use of personal data

Two elements distinguish this case (Google 2) from most of the FTC’s prior enforcement actions. One is that the large fine results not directly from the actual violations of privacy promises made in Google’s privacy policy, but rather from the fact that Google’s activities were found to violate a prior settlement with the FTC, dated October 2011 (Google 1).

In Google 1, Google promised not to misrepresent:

  • (a) The purposes for which it collects and uses personal information;
  • (b) The extent to which users may exercise control over the collection, use and disclosure of personal information; and
  • (c) The extent to which it complies with, or participates in, a privacy, security, or other compliance program sponsored by the government or any other entity.

According to the FTC complaint in Google 2, Google represented to Safari users that it would not place third-party advertising cookies on the browsers of Safari users who had not changed the default browser setting (which, by default, blocked third-party cookies) and that it would not collect or use information about users’ web-browsing activity. These representations were found to be false by the FTC, resulting in a violation of Google’s obligation under Google 1 (see paragraph (b) in the bulleted list above).

Misrepresentation of compliance with NAI Code

The second, and more interesting element of the Google 2 decision, is the FTC analysis of Google’s representation that it adheres to, or complies with the Self-Regulatory Code of Conduct of the Network Advertising Initiative (NAI Code). In the third count of the FTC Complaint in Google 2, the FTC focuses on Google’s alleged violation of the NAI Code.

This alleged violation allows the FTC to show that Google violated its obligation under Google 1 to not “misrepresent the extent to which it complies with, or participates in, a privacy, security, or other compliance program sponsored by the government or any other entity” (see the requirement under (c) in the bulleted list above). The FTC found that the representation of Google’s compliance with the NAI Code was false, and thus violated its obligation in Google 1 not to make any misrepresentation about following compliance programs.

Evolution of the FTC Common Law

Google 2 shows an interesting evolution of the FTC “Common Law.” In its prior cases, the FTC first focused on violations of companies’ privacy promises made in their public Privacy Statements. Then, in several consent orders published in 2011, including Google 1, the FTC expanded the scope of its enforcement action to violations of the Safe Harbor of the US Department of Commerce and the EU Commission. Now, with Google 2, the FTC expands again the scope of its enforcement action to include, as well, violation of Industry Standards such as the NAI Code.

What this means for businesses

The Google 2 Consent Order has significant implications for all businesses.

Companies often use their membership in industry groups as a way to show their values, and to express their commitment to certain standards of practice. Beware which industry group or program you join; understand its rules. As a member of that group or program, you must abide by its code of conduct, rules, or principles. Make sure that you do, and that all aspects of your business comply with these rules.

When a business publicizes its membership in an industry group or a self-regulatory program, it also publicly represents that it complies with the rules or principles of that group or program – for example, those of the Safe Harbor (as was the case in Google 1) or those of the NAI (as was the case in Google 2). Remember that these representations may have significant consequences, and may create a minefield if not attended to properly. To stay out of trouble, the company must also make sure that these representations are accurate, and that it abides by these promises at all times, and with respect to all of its products.

When a company makes a public commitment to abide by certain rules, it must make sure that it does comply with these rules; otherwise, it is exposed to an unfair and deceptive practices action. Make sure that you periodically compare ALL promises your business makes with what ALL of your products, services, applications, and technologies actually do.