Yelp to pay $450,000 penalty for COPPA violation

Posted by fgilbert on September 17th, 2014

The Federal Trade Commission has announced a proposed settlement with Yelp, Inc. for COPPA violations. The FTC alleged that, for five years, Yelp illegally collected and used the personal information of children under 13 who registered on its mobile app service.

According to the FTC complaint, Yelp collected personal information from children through the Yelp app without first notifying parents and obtaining their consent. The Yelp app registration process required individuals to provide their date of birth. Several thousand registrants provided a date of birth showing they were under 13 years old. Even though it knew that these registrants were children, Yelp did not follow the requirements of the COPPA Rule and collected their personal information without proper notice to, and consent from, their parents. The information collected included name, e-mail address, geolocation, and any other information that these children posted on Yelp. In addition, the complaint alleges that Yelp did not adequately test its app to ensure that users under 13 were prohibited from registering.
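The screening failure alleged here is easy to illustrate. Below is a minimal sketch, in Python, of the date-of-birth check the complaint says was missing or untested. The function names and flow are hypothetical, not Yelp's actual code, and a real COPPA-compliant flow would route under-13 registrants into a parental notice-and-consent process rather than simply registering them.

```python
from datetime import date

COPPA_AGE_THRESHOLD = 13  # COPPA protects children under 13

def age_on(dob: date, today: date) -> int:
    """Return the registrant's age in whole years as of `today`."""
    years = today.year - dob.year
    # If the birthday has not yet occurred this year, subtract one.
    if (today.month, today.day) < (dob.month, dob.day):
        years -= 1
    return years

def screen_registration(dob: date, today: date) -> str:
    """Hypothetical age gate for an app registration flow.

    Registrants 13 or older proceed normally; younger registrants must
    trigger COPPA's parental notice-and-consent requirements instead of
    being silently registered.
    """
    if age_on(dob, today) < COPPA_AGE_THRESHOLD:
        return "require_parental_consent"
    return "proceed"

# A registrant born in 2005 signing up in 2014 is under 13:
assert screen_registration(date(2005, 6, 1), date(2014, 9, 17)) == "require_parental_consent"
```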

Under the terms of the proposed settlement agreement, among other things, Yelp must:

  • pay a $450,000 civil penalty;
  • delete information it collected from individuals who stated they were 13 or younger at the time they registered for the service; and
  • submit a compliance report to the FTC in one year outlining its COPPA compliance program.

In a separate action, the FTC alleged that TinyCo also improperly collected children's information in violation of COPPA. Under the settlement agreement between TinyCo and the FTC, TinyCo will pay a $300,000 civil penalty.

The FTC Recommends Data Broker Legislation

Posted by fgilbert on May 27th, 2014

The Federal Trade Commission (FTC) is calling for legislation to shed light on data brokers’ practices and give consumers some control over the use of their personal information. In its 110-page report, “Data Brokers: A Call for Transparency and Accountability”, published on May 27, 2014, the FTC outlines the content of the legislation it is recommending, which would enable consumers to learn of the existence and activities of data brokers and to have reasonable access to the information about themselves that data brokers hold.

This report is the result of an 18-month study of the practices of nine data brokers – Acxiom, CoreLogic, Datalogix, eBureau, ID Analytics, Intelius, PeekYou, Rapleaf, and Recorded Future. The study started in December 2012, when the FTC served orders on these data brokers, requiring them to provide information about their collection and use of consumers’ personal data.

In its Data Broker Report, the FTC observes that data brokers collect and store billions of data elements covering nearly every U.S. consumer. The extent of consumer profiling is such that data brokers know the minute details of consumers’ everyday lives, such as income, socioeconomic status, political and religious affiliations, online and in-store purchases, social media activity, and magazine subscriptions. The ability to create such detailed and precise profiles raises significant privacy concerns. According to the report, one of the data brokers studied holds 700 billion data elements, and another adds more than 3 billion new data points to its database each month.

In most cases, data is collected behind the scenes, without consumers’ knowledge. The FTC Data Broker Report notes that personal data often passes through multiple layers of data brokers who share data with each other. Data brokers combine online and offline data, which may support potentially sensitive inferences, such as those related to ethnicity, income, religion, political leanings, age, or health conditions such as pregnancy, diabetes, or high cholesterol. Many of the purposes for which data is collected pose risks to consumers, such as unanticipated secondary uses of the data. For instance, data collected to offer discounts to potential purchasers of motorcycles could also be interpreted by an insurance provider as a sign of risky behavior, resulting in higher life insurance premiums. Some data brokers unnecessarily store data about consumers indefinitely, which may create security risks in addition to the privacy risks described above.

The FTC Data Broker Report recommends that Congress enact legislation to require the following:

For brokers that provide marketing products:

  • The creation of a centralized mechanism, such as an Internet portal, where data brokers can identify themselves, describe their information collection and use practices, and provide links to access and opt-out tools;
  • Data brokers to give consumers access to their data, including any sensitive data, at a reasonable level of detail;
  • Data brokers to inform consumers that they derive certain inferences from raw data;
  • Data brokers to disclose the names and/or categories of their data sources, to enable consumers to correct wrong information with the original source;
  • Consumer-facing entities (e.g., retailers) to provide prominent notice to consumers when they share information with data brokers, along with the ability to opt out of such sharing; and
  • Consumer-facing entities to obtain consumers’ affirmative express consent before collecting and sharing sensitive information with data brokers.

For brokers that provide “risk mitigation” products:

  • When a consumer-facing company uses a data broker’s risk mitigation product to assist in its decision-making process, that company would have to identify the information on which it relied when it decided to limit a consumer’s ability to complete a transaction;
  • Data brokers to allow consumers to access the information used and to correct it, as appropriate.

For brokers that provide “people search” products:

  • Data brokers to allow consumers to access their own information;
  • Data brokers to allow consumers to opt out of having their information included in a people search product;
  • Data brokers to disclose the original sources of the information so consumers can correct it;
  • Data brokers to disclose any limitations of an opt-out feature. 

What the FTC Data Broker Report means for data brokers and others

For the past few years, the Federal Trade Commission has monitored, and attempted to guide, online behavioral advertising and behavioral targeting. However, while it has repeatedly asked the advertising industry to self-regulate its practices, it has not suggested, much less outlined, proposed legislation.

With its 18-month evaluation of the data broker industry, and the issuance of its Data Broker Report on May 27, 2014, the Federal Trade Commission increases the pressure. This time, without asking for self-regulation, the FTC calls directly for legislation requiring transparency and accountability from data brokers and the availability of access and correction rights for consumers. This is an important step, which may also provide guidance in related areas.

In its Data Broker Report, the Federal Trade Commission limited the scope of its initiative to the use of big data by data brokers, i.e., entities that collect and process data for resale or licensing purposes. It did not address the use of big data by non-brokers – entities that use the new, sophisticated tools available from big data technologies to mine the wide range of data about their own customers that they have accumulated over the years. While limiting its focus to one segment of big data users, the FTC made a powerful call for legislation and provided very specific direction on the principles that the legislation should address.

The FTC Data Broker Report is a major milestone compared with the recent White House Big Data Report (May 2014). The White House report suggested legislation based on the White House Consumer Privacy Bill of Rights (February 2012), but it did not identify with specificity the elements that such legislation should address or contain, and it did not explain how legislation based on the Consumer Privacy Bill of Rights would address the specific and unique issues raised by data brokers’ use of big data technologies and techniques.

The FTC Data Broker Report, on the other hand, provides a blueprint for legislation that focuses on the unique issues raised by the massive collection of personal data. The principles outlined by the FTC are more directly usable, more practicable, and more pragmatic. They are also better adapted to the idiosyncrasies of the world of data brokers, where all uses of data are secondary uses that were not anticipated – and probably not disclosed – in the privacy disclosures of the customer-facing companies that collected the data in the first place. Thus, it would be much easier to act upon this call for action and draft legislative text.

It should be further noted that, while the FTC Data Broker Report is limited to a specific market, the ideas that it submits to Congress could easily be extended to all users of big data, i.e., entities other than data brokers that use big data techniques and massive computing power for their internal purposes. Thus, entities other than data brokers that process large amounts of data with the intent of producing personal profiles, or of inferring personal interests, practices, or other characteristics of individuals, should consider evaluating the guidance provided in the FTC Data Broker Report – in addition to that provided in the White House Big Data Report – when trying to anticipate the direction that laws, regulations, and enforcement might take in the next few years with respect to secondary uses of personal data.

The FTC Data Broker Report is published at: http://www.ftc.gov/news-events/press-releases/2014/05/ftc-recommends-congress-require-data-broker-industry-be-more


Review of the Safe Harbor soon?

Posted by fgilbert on March 27th, 2014

In a short statement following the EU-US summit held in Brussels earlier this week, Herman Van Rompuy, President of the European Council, announced on March 27, 2014, that the United States and the European Union have agreed to take steps to address the concerns caused by last year’s revelations about the U.S. NSA surveillance programs, and to restore trust.

He indicated that, with respect to commercial use of personal data, the United States “have agreed to a review of the so-called Safe Harbour framework” to ensure transparency and legal certainty. In addition, with respect to government access to personal data, the parties will “negotiate an umbrella agreement on data protection by this summer, based on equal treatment of EU and US citizens.”

The full text of Mr. Van Rompuy’s statement is available at http://www.consilium.europa.eu/uedocs/cms_data/docs/pressdata/en/ec/141919.pdf


New FTC COPPA Rule will better protect 21st century children

Posted by fgilbert on December 19th, 2012

The Federal Trade Commission’s final updated COPPA Rule, published this morning (December 19, 2012), brings online child protection into the 21st century. While most of the high-level requirements, which stem directly from the Children’s Online Privacy Protection Act (COPPA), remain unchanged, the updated Rule contains references to modern technologies such as geolocation, plug-ins, and mobile apps, and to modern methods of financing websites, such as behavioral targeting. It also takes into account more than ten years of practice and attempts to address some of the shortcomings and complexities of the prior rule. For example, the new Rule requires better accountability from Safe Harbor programs, which will have to audit their members annually and report annually to the FTC on the outcome of these reviews. It also requires better accountability from companies. Companies that release children’s personal information to third parties, whether service providers or others, will be responsible for ensuring that these third parties are capable of protecting the confidentiality, security, and integrity of children’s personal information, and that they actually provide these protections when handling the children’s information in their custody.


More covered entities

The new definition of “operator” now also covers a website or online service directed to children that integrates outside services, such as a plug-in or an ad network. The new definition of “website or online service” also includes a plug-in or ad network that has actual knowledge that it is collecting personal information through a child-directed website or service.


More personal information protected

The definition of personal information is expanded to include:

  • Geolocation information
  • Photos, videos, and audio files that contain a child’s image or voice
  • Persistent identifiers, such as IP addresses or mobile device IDs, that can be used to recognize a user over time and across different websites or online services.


More permitted activities

Conversely, more activities are specifically permitted. These include contextual advertising, frequency capping, legal compliance, site analysis, and network communications. However, this does not include behavioral advertising: parental consent is required when using or disclosing information to contact a specific person or to develop a profile on that person.


New form of disclosures

The Rule still requires a direct notice to parents in addition to the online notice of information practices, but it streamlines what websites or services must disclose in the online privacy statements describing their information practices.


New forms of parental consent

The new Rule offers more ways in which parents can communicate their consent. These additional means include electronic scans of signed parental consent forms (in addition to mail and fax), videoconferencing, use of government-issued ID, and use of online payment systems (other than credit or debit cards) that provide notification of each discrete transaction to the primary account holder.


Stronger security and confidentiality

While operators continue to be responsible for protecting the confidentiality, security and integrity of children’s information, they will be required, in addition, to ensure, before releasing information to service providers and third parties, that these entities are capable of maintaining the confidentiality, security, and integrity of the information. They will be responsible for obtaining assurances that these measures will be maintained.


New limited retention and disposal rules

Operators will be expected to retain personal information collected online from a child for only as long as reasonably necessary to fulfill the purpose for which the information was collected. They will also be required to delete such information by using reasonable measures to protect against unauthorized access to, or use of, the information in connection with its deletion.
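As a concrete illustration, here is a minimal Python sketch of a retention-and-disposal job under these principles. The one-year window, the `store` object, and its `expired()` and `secure_delete()` methods are hypothetical stand-ins, not anything prescribed by the Rule; the appropriate retention period depends on the purpose for which the data was collected.

```python
from datetime import datetime, timedelta

# Hypothetical retention window; the Rule itself only says "as long as
# reasonably necessary" to fulfill the purpose of collection.
RETENTION_PERIOD = timedelta(days=365)

def purge_expired_child_records(store, now: datetime) -> int:
    """Delete child records older than the retention window.

    `store` is a hypothetical data-access object. Disposal must use
    reasonable measures against unauthorized access during deletion,
    e.g., hard deletion rather than a soft-delete flag, and removal
    from backups and replicas as well.
    """
    cutoff = now - RETENTION_PERIOD
    deleted = 0
    for record in store.expired(cutoff):   # records collected before `cutoff`
        store.secure_delete(record.id)     # irreversible, access-protected delete
        deleted += 1
    return deleted
```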


New monitoring and reporting requirements

The new Rule strengthens the FTC’s oversight of safe harbor programs. Safe harbor programs will be required to arrange for annual assessments of operators’ compliance with the program guidelines, and to provide the FTC with an annual report of the aggregated results of these independent assessments.


Compete web analytics under FTC supervision for 20 years

Posted by fgilbert on October 22nd, 2012

The Federal Trade Commission has published a proposed settlement with Compete, Inc., a web analytics company, for violation of Section 5 of the FTC Act in connection with its collection, use, and lack of protection of personal information (including some highly sensitive information).

Compete uses tracking software to collect data on the browsing behavior of millions of consumers. Then, it uses the data to generate reports, which it sells to clients who want to improve their website traffic and sales.

According to the FTC, consumers were invited to join a “Consumer Input Panel,” which was promoted using ads that pointed consumers to a Compete website, www.consumerinput.com. Compete told consumers that by joining the “Panel” they could win rewards while sharing their opinions about products and services. It also promised that consumers who installed the Compete Toolbar (from compete.com) could have “instant access” to data about the websites they visited.

Compete did not disclose to consumers that it would collect detailed information such as information they provided in making purchases, not just “the web pages you visit.” Once installed, the Compete tracking component operated in the background, and automatically collected information that consumers entered into websites, such as usernames, passwords, search terms, credit card and financial account information, security codes and expiration dates, and Social Security Numbers.

In addition, Compete represented to consumers that their personal information would be removed from the data it collected before transmission to its servers, and that it would take reasonable security measures to protect against unauthorized access to, alteration, disclosure, or destruction of personal information.

The FTC accused Compete of violating federal law by using web-tracking software that collects personal data without disclosing the extent of the collection, and by failing to honor the promises it made to protect the collected personal data: not providing reasonable and appropriate data security; transmitting sensitive information from secure websites in readable text; failing to design and implement reasonable safeguards to protect consumers’ data; and failing to use readily available measures to mitigate the risks to consumers’ data.

The proposed settlement order would require Compete and its licensees to:

  • Fully disclose what information they collect;
  • Obtain consumers’ express consent before collecting any data from Compete software downloaded onto consumers’ computers;
  • Delete or anonymize the consumer data they have already collected; and
  • Provide directions to consumers for uninstalling their software.

In addition, the settlement bars misrepresentations about the company’s privacy and data security practices and requires that it implement a comprehensive information security program with independent third-party audits every two years for 20 years. A copy of the proposed consent decree with Compete is available at: http://www.ftc.gov/os/caselist/1023155/121022competeincagreeorder.pdf

Compete also licensed its web-tracking software to other companies. Upromise, one of Compete’s licensees, settled similar FTC charges earlier this year. The final consent order is available at: http://www.ftc.gov/os/caselist/1023116/120403upromisedo.pdf.


FTC v. Google V2.0 – Lessons Learned

Posted by fgilbert on August 13th, 2012

The Federal Trade Commission has published its long-awaited Proposed Consent Order with Google, closing its second investigation into Google’s practices (Google 2). Under the proposed document, Google would agree to pay a record $22.5 million civil penalty to settle charges that it misrepresented to users of Apple’s Safari browser that it would not place tracking cookies on their browsers or serve them targeted ads. It would also have to disable all tracking cookies that it had said it would not place on consumers’ computers, and report to the FTC by March 8, 2014 on how it has complied with this remediation requirement.

Google 2 Unique Aspects

Unlike most consent orders published by the FTC, the Google 2 Consent Order does not primarily address actual violations of privacy promises. Rather, it addresses the fact that Google’s activities allegedly violated a prior settlement with the FTC, dated October 2011 (Google 1).

As such, beyond evidencing the FTC’s ongoing efforts to ensure that companies live up to the privacy promises that they make to consumers, Google 2 clearly shows that the FTC takes seriously the commitments that it requires from companies it has previously investigated. When an FTC consent decree requires a 20-year commitment to abide by certain practices, the FTC may, indeed, return to ensure that the obligations outlined in the consent decree are met.

Privacy Promises are made everywhere

A significant aspect of the proposed Google 2 Consent Order and related Complaint is that privacy promises are made in numerous places beyond a company’s online privacy statement. They are found, as well, in other representations made by the company, such as its regulatory filings or its marketing or promotional documents. In the Google 1 enforcement action, the FTC looked at the promises and representations made in Google’s Safe Harbor self-certification filings. In the Google 2 enforcement action, the FTC looked at the promises and representations made in Google’s statements that it complied with the Self-Regulatory Code of Conduct of the Network Advertising Initiative (NAI).

Misrepresentation of compliance with NAI Code

In the third count of the FTC Complaint in Google 2, the FTC focuses on Google’s representation that it adheres to, or complies with, the NAI Self-Regulatory Code of Conduct. The alleged violation of this representation allows the FTC to claim that Google violated its obligation under Google 1 not to “misrepresent the extent to which it complies with, or participates in, a privacy, security, or other compliance program sponsored by the government or any other entity”.

Evolution of the FTC Common Law

Google 2 shows a clear evolution of the FTC “Common Law” of privacy. As the concept of privacy compliance evolves, the nature of the FTC’s investigations becomes more refined and more expansive. In its earlier cases, the FTC focused on violations of the privacy promises companies made in their public privacy statements. More recently, in several consent orders – including Google 1 – the FTC expanded the scope of its enforcement actions to include violations of the Safe Harbor Principles outlined by the US Department of Commerce and the EU Commission. Now, with Google 2, the FTC again expands the scope of its enforcement actions to include potential violations of representations of compliance with the NAI Self-Regulatory Code of Conduct. This trend is likely to continue, and in future cases we should expect the FTC to expand its investigations to verify compliance with statements that a company follows other self-regulatory industry standards.

What consequences for businesses

Companies often use their membership in industry groups or privacy programs as a way to show their values and to express their commitment to certain standards of practice. This was the case for Google with the Safe Harbor program of the Department of Commerce and the European Union (Google 1), and with the Network Advertising Initiative (Google 2).

These promises to comply with the rules of a privacy program are not just statements made for marketing purposes. The public reads them, and so do the FTC and other regulators.

Privacy programs such as the Safe Harbor or the NAI Code have specific rules.  As shown in the Google 1 and Google 2 cases, failure to comply with the rules, principles and codes of conducts associated with membership in these programs could be fatal.

If the disclosures made are not consistent with actual practices and procedures, the deficiency exposes the company to claims of unfair and deceptive practices or, as in the case of Google, to substantial fines for failure to comply with an existing consent decree barring future misrepresentations.

If your company makes promises or statements about its privacy – or security – practices, remember, and remind your staff, that these representations may have significant consequences and may create a minefield if not attended to properly:

  • Look for these representations everywhere, and not just in the official company Privacy Statement; for example, look at the filings and self-certification statements, the cookie disclosures, the marketing or sales material, the advertisements;
  • Periodically compare ALL promises that your business makes with what each of your products, services, applications, technologies, devices, cookies, tags, etc. in existence or in development actually does;
  • Educate your IT, IS, Marketing, Communications, Sales, and Legal teams about the importance of working together and coordinating efforts, so that those who develop statements and disclosures about the company’s policies and values fully understand, and are aware of, all features and capabilities of the products or services that others in the company are designing and developing;
  • If your company claims that it is a member of a self-regulatory or other privacy compliance program, make sure that you understand the rules, codes of conduct or principles of these programs or industry standards; and ensure that the representations of your company’s compliance with these rules, codes of conduct, principles are accurate, clear and up-to-date;
  • Ensure that ALL of your company’s products and services comply, and are consistent, with ALL of the promises made by, or on behalf of, the company in ALL of its statements, policies, disclosures, and marketing materials, at ALL times.

Proposed Changes to FTC COPPA Rule

Posted by fgilbert on August 1st, 2012

The FTC has issued an NPRM seeking comments on proposed changes to the COPPA Regulations. These changes are intended to take into account the evolution of web technologies, such as plug-ins and the use of third-party cookies and ad networks; they would also clarify some of the requirements for websites that contain child-oriented material that may appeal to both parents and children. This new NPRM pertains to changes to the COPPA Regulation that diverge from the changes that the FTC previously presented in its September 2011 proposal.

  • Expansion of the definitions of “operator” and “website or service directed to children”

The proposed changes to the definitions of “operator” and “website or online service directed to children” would clarify that an operator that integrates the services of third parties that collect personal information from visitors of its site or service would itself be considered a covered “operator” under the Rule. Further, an ad network or plug-in would also be subject to COPPA if it knows or has reason to know that it is collecting personal information through a child-directed site or service.

  • Clarification of the definition of “personal information”

The proposed change to the definition of “personal information” would make it clear that a persistent identifier – e.g., a persistent cookie – would be deemed “personal information” subject to the Rule if it can be used to recognize a user over time or across different sites or services.

However, the use of tracking technologies or identifiers for authenticating users, improving navigation, performing site analysis, maintaining user preferences, serving contextual ads, or protecting against fraud and theft would not be considered the collection of “personal information,” as long as the collected data is not used or shared to contact a specific individual, e.g., for behaviorally targeted advertising.
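To make the distinction concrete, here is a minimal Python sketch of the kind of persistent identifier the proposed definition targets: a long-lived cookie that recognizes the same visitor across sessions. The cookie name and expiry are illustrative assumptions, not anything specified in the NPRM.

```python
import uuid
from http.cookies import SimpleCookie

ONE_YEAR = 60 * 60 * 24 * 365  # long-lived, so it survives the browser session

def visitor_id(request_cookie_header: str, response: SimpleCookie) -> str:
    """Return a persistent visitor ID, minting one on the first visit.

    Because this identifier recognizes the same user over time and across
    visits, it would be "personal information" under the proposed Rule when
    collected on a child-directed site, unless used only for the support
    activities described above (site analysis, frequency capping, etc.).
    """
    cookies = SimpleCookie(request_cookie_header)
    if "visitor_id" in cookies:
        return cookies["visitor_id"].value  # returning visitor recognized
    vid = uuid.uuid4().hex
    response["visitor_id"] = vid
    response["visitor_id"]["max-age"] = ONE_YEAR
    return vid
```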

  • Mixed audience websites

The proposed changes would also clarify that mixed-audience websites – those that contain child-oriented content and whose audience includes both young children and others, including parents – would be allowed to age-screen all visitors in order to provide COPPA’s protections only to users under age 13. However, child-directed sites or services that knowingly target children under 13 as their primary audience, or whose overall content is likely to attract children under age 13 as their primary audience, would still be required to treat all users as children.

  • Text of the Notice of Proposed Rule Making

The text of the Notice of Proposed Rule Making is available at http://www.ftc.gov/os/2012/08/120801copparule.pdf

Remove any P2P Filesharing Software from your Network

Posted by fgilbert on June 7th, 2012

Remove any P2P file-sharing software from your network, or be prepared to enter into a 20-year relationship with the Federal Trade Commission. This is what will happen to EPN, Inc., a debt collection business based in Provo, Utah, and to Franklin’s Budget Car Sales, Inc., a car dealership in Statesboro, Georgia. In both cases, the P2P software caused sensitive personal information of thousands of consumers to be accessible to users of other computers connected to the same peer-to-peer network.

On June 7, 2012, the FTC published proposed settlement agreements with these two businesses because they had allowed peer-to-peer file sharing software to be installed on their network.

The FTC case against EPN, Inc. alleges that the lack of security measures at the company allowed the company’s COO to install P2P file-sharing software on the company’s network. As a result, sensitive information of 3,800 hospital patients, including Social Security numbers, health insurance numbers, and medical diagnosis codes, was available to any computer connected to the P2P network.

The case against Franklin’s Budget Car Sales, Inc. alleges that the installation of P2P software on the company’s network resulted in sensitive financial information of 95,000 consumers, such as names, addresses, Social Security numbers, dates of birth, and driver’s license numbers, being made available on the P2P network.

In both cases, the companies were charged with failure to observe commonly used best practices:

  • Failure to have an appropriate information security plan;
  • Failure to assess risks to the consumer information collected and stored online;
  • Failure to use reasonable measures to ensure the security of the network, such as scanning it to identify any P2P file-sharing applications operating on it (a minimal scan sketch follows this list);
  • Failure to adopt policies to prevent or limit unauthorized disclosure of information;
  • Failure to prevent, detect and investigate unauthorized access to personal information on the company’s networks;
  • Failure to adequately train employees;
  • Failure to employ reasonable measures to respond to unauthorized access to personal information.
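As a rough illustration of the network-scanning measure mentioned above, the Python sketch below checks internal hosts for TCP listeners on ports commonly associated with P2P file-sharing clients. The port list and host addresses are illustrative assumptions; real compliance tooling would also inventory installed software and inspect traffic, since P2P clients can use arbitrary ports.

```python
import socket

# Default ports of some well-known P2P clients (illustrative, not
# exhaustive): Gnutella (6346-6347), eDonkey (4662), BitTorrent (6881-6889).
P2P_PORTS = [6346, 6347, 4662, *range(6881, 6890)]

def open_p2p_ports(host: str, timeout: float = 0.5) -> list[int]:
    """Return the P2P-associated ports accepting TCP connections on `host`."""
    found = []
    for port in P2P_PORTS:
        try:
            with socket.create_connection((host, port), timeout=timeout):
                found.append(port)
        except OSError:
            pass  # closed, filtered, or unreachable
    return found

if __name__ == "__main__":
    for host in ("10.0.0.12", "10.0.0.13"):  # hypothetical internal hosts
        hits = open_p2p_ports(host)
        if hits:
            print(f"{host}: possible P2P software listening on ports {hits}")
```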

Failure to implement reasonable and appropriate data security measures as described above was an unfair act or practice and violated federal law, namely Section 5 of the FTC Act. In addition, Franklin’s Budget Car Sales, as a “financial institution” subject to the Gramm-Leach-Bliley Act (GLBA), was also charged with violating both the GLBA Safeguards Rule and the Privacy Rule by failing to provide annual privacy notices and a mechanism by which consumers could opt out of information sharing with third parties.

The proposed consent order against EPN and Franklin would require the companies to establish and maintain comprehensive information security programs, and cease any misrepresentation about their data handling practices. The settlement orders with the two companies are substantially similar. They:

  • Bar any future misrepresentations about the privacy, security, confidentiality, and integrity of any personal information;
  • Require the companies to establish and maintain a comprehensive information security program; and
  • Require the companies to undergo data security audits by independent auditors every other year for 20 years.

As always with FTC consent orders, each violation of such an order may result in a civil penalty of up to $16,000.



FTC v. Myspace

Posted by fgilbert on May 8th, 2012

On May 8, 2012, Myspace agreed to settle Federal Trade Commission charges that it misrepresented its protection of users’ personal information.

The two major issues at stake were misrepresentation of privacy practices and misrepresentation of compliance with the Safe Harbor principles.

Misrepresentation of Privacy Practices

Myspace assigns a persistent unique identifier, called a “Friend ID,” to each profile created on Myspace. A user’s profile may publicly display the user’s name, age, gender, picture, hobbies, interests, and list of friends.

The Myspace privacy policy promised that Myspace would not share a user’s personally identifiable information, or use such information in a way that was inconsistent with the purpose for which it was submitted, without prior notice to, and consent from, the user. It also promised that the information used to customize ads would not individually identify users to third parties, and that Myspace would not share users’ browsing activity in non-anonymized form.

The FTC charged that Myspace provided advertisers with the Friend ID of users who were viewing particular pages on the site. Advertisers could use the Friend ID to locate a user’s Myspace profile and obtain personal information publicly available on the profile. Advertisers also could combine the user’s real name and other personal information with additional information to link broader web-browsing activity to a specific individual.

Misrepresentation of Compliance with Safe Harbor Principles

Myspace certified that it complied with the U.S.-EU Safe Harbor principles, which include a requirement that consumers be given notice of how their information will be used and the choice to opt out.

The FTC alleged that the way in which Myspace handled personal information was inconsistent with its representations of compliance with the Safe Harbor principles.

Proposed Settlement

The proposed settlement order would:

  • Bar Myspace from misrepresenting the extent to which it protects the privacy of users’ personal information;
  • Bar Myspace from misrepresenting the extent to which it belongs to or complies with any privacy, security, or other compliance program, including the U.S.-EU Safe Harbor Framework;
  • Require Myspace to establish a comprehensive privacy program designed to protect consumers’ information;
  • Require Myspace to obtain biennial assessments of its privacy program by independent, third-party auditors for 20 years; and
  • Expose Myspace to a civil penalty of up to $16,000 for each future violation, if any, of the consent order.

The proposed settlement is open for comments; it will be finalized and will become effective after the end of the comment period.



Mobile App Privacy Webinar on April 19, 2012

Posted by fgilbert on April 17th, 2012

On Thursday, April 19, 2012, at 10am PT / 1pm ET, I will be moderating and presenting at a one-hour webinar organized by the Practising Law Institute: “A New Era for Mobile Apps? What Companies Should Know to Respond to Recent Mobile Privacy Initiatives”.

The webinar will start with an overview of the technologies and ecosystem that surround the operation and use of mobile applications, presented by Chris Conley, Technology and Civil Liberties Attorney, ACLU of Northern California (San Francisco).

Patricia Poss, Chief, BCP Mobile Technology Unit, Federal Trade Commission (Washington, DC), will then comment on two reports recently published by the Federal Trade Commission: “Mobile Apps for Children” (February 2012) and the final report “Protecting Consumer Privacy in an Era of Rapid Change” (March 2012), which together lay out a framework for mobile players.

I will follow with an overview of the recent agreement between the California State Attorney General and six major publishers of mobile apps, which sets up basic rules and structures for the publication and enforcement of mobile app privacy policies, and the Consumer Privacy Bill of Rights, which was unveiled by the White House in February 2012.  I will end with suggestions for implementing privacy principles in the mobile world.

To register for this webinar, please visit the PLI website.