Archive for May, 2014

The FTC Recommends Data Broker Legislation

Posted by fgilbert on May 27th, 2014

The Federal Trade Commission (FTC) is calling for legislation to shed some light on data brokers’ practices and give consumers some control over the use of their personal information. In its 110-page report, “Data Brokers: A Call for Transparency and Accountability”, published on May 27, 2014, the FTC outlines the legislation it recommends to enable consumers to learn of the existence and activities of data brokers and to have reasonable access to the information that data brokers hold about them.

This report is the result of an 18-month study of the practices of nine data brokers – Acxiom, CoreLogic, Datalogix, eBureau, ID Analytics, Intelius, PeekYou, Rapleaf, and Recorded Future. The study started in December 2012, when the FTC served orders on these data brokers, requiring them to provide information about their collection and use of consumers’ personal data.

In its Data Broker Report, the FTC observes that data brokers collect and store billions of data elements covering nearly every U.S. consumer. The extent of consumer profiling is such that these data brokers know the minute details of consumers’ everyday lives, such as income, socioeconomic status, political and religious affiliations, online and in-store purchases, social media activity, and magazine subscriptions. The ability to create such detailed and precise profiles creates significant privacy concerns. According to the FTC Data Broker Report, one of the data brokers studied holds 700 billion data elements, and another adds more than 3 billion new data points to its database each month.

In most cases, data is collected behind the scenes, without consumer knowledge. The FTC Data Broker Report notes that personal data often passes through multiple layers of data brokers that share data with each other. Data brokers combine online and offline data, which may result in potentially sensitive inferences, such as those related to ethnicity, income, religion, political leanings, age, or health conditions, such as pregnancy, diabetes, or high cholesterol. Many of the purposes for which data is collected pose risks to consumers, such as unanticipated secondary uses of the data. For instance, data collected to offer discounts to potential purchasers of motorcycles could also be interpreted by an insurance provider as a sign of risky behavior, resulting in an increase in life insurance premiums. Some data brokers unnecessarily store data about consumers indefinitely, which may create security risks, in addition to the privacy risks described above.

The FTC Data Broker Report recommends that Congress enact legislation to require the following:

For brokers that provide marketing products:

  • The creation of a centralized mechanism, such as an Internet portal, where data brokers can identify themselves, describe their information collection and use practices, and provide links to access and opt-out tools;
  • Data brokers to give consumers access to their data, including any sensitive data, at a reasonable level of detail;
  • Data brokers to inform consumers that they derive certain inferences from raw data;
  • Data brokers to disclose the names and/or categories of their data sources, to enable consumers to correct wrong information with the original source;
  • Consumer-facing entities (e.g., retailers) to provide prominent notice to consumers when they share information with data brokers, along with the ability to opt-out of such sharing; and
  • Consumer-facing entities to obtain consumers’ affirmative express consent before collecting and sharing sensitive information with data brokers.

For brokers that provide “risk mitigation” products:

  • When a consumer-facing company uses a data broker’s risk mitigation product to assist in the decision-making process, that company would have to identify the information on which it relied when it decided to limit a consumer’s ability to complete a transaction;
  • Data brokers to allow consumers to access the information used and to correct it, as appropriate.

For brokers that provide “people search” products:

  • Data brokers to allow consumers to access their own information;
  • Data brokers to allow consumers to opt-out of having the information included in a people search product;
  • Data brokers to disclose the original sources of the information so consumers can correct it;
  • Data brokers to disclose any limitations of an opt-out feature. 

What the FTC Data Broker Report means for data brokers and others

For the past few years, the Federal Trade Commission has monitored, and attempted to guide, online behavioral advertising and behavioral targeting. However, while it has repeatedly requested that the advertising industry self-regulate its practices, it has neither suggested nor outlined proposed legislation.

With its 18-month evaluation of the data broker industry, and the issuance of its Data Broker Report on May 27, 2014, the Federal Trade Commission increases the pressure. This time, without asking for self-regulation, the FTC calls directly for legislation requiring transparency and accountability from data brokers and the availability of access and correction rights for consumers. This is an important step, which may also provide guidance in related areas.

In its Data Broker Report, the Federal Trade Commission limited the scope of its initiative to the use of big data by data brokers, i.e. entities that collect and process data for resale or licensing purposes. It did not address the use of big data by non-brokers – entities that are using the new, sophisticated tools available from big data technologies to mine a wide range of data about their own customers that they have accumulated over the years. While limiting its focus to a segment of the big data users, the FTC made a powerful call for legislation, and provided very specific direction on the principles that should be addressed in that legislation.

The FTC Data Broker Report is a major milestone compared with the recent White House Big Data Report (May 2014). The White House report suggested legislation based on the White House Consumer Privacy Bill of Rights (February 2012), but it did not identify with specificity the elements that such legislation should address or contain, nor did it explain how legislation based on the Consumer Privacy Bill of Rights would address the specific and unique issues raised by data brokers’ use of big data technologies and techniques.

The FTC Data Broker Report, on the other hand, provides a blueprint for legislation that focuses on the unique issues raised by the massive collection of personal data. The principles outlined by the FTC are more directly useable, more practicable, and more pragmatic. They are also better adapted to the idiosyncrasies of the world of data brokers, where all uses of data are secondary uses that were not anticipated – and probably not disclosed – in the privacy disclosures of the customer-facing companies that collected the data in the first place. Thus, it would be much easier to act upon this call for action and draft legislative text.

It should be further noted that, while the FTC Data Broker Report is limited to a specific market, the ideas that it submits to U.S. legislators could easily be expanded or extrapolated to all users of big data, i.e., those entities other than data brokers that use big data techniques and massive computing power for their internal purposes. Thus, entities other than data brokers that process large amounts of data with the intent of producing personal profiles or inferring personal interests, practices, or other characteristics of individuals should consider evaluating the guidance provided in both the FTC Data Broker Report and the White House Big Data Report when trying to anticipate the direction that laws, regulations, and enforcement might take in the next few years with respect to the secondary uses of personal data.

The FTC Data Broker Report is published at: http://www.ftc.gov/news-events/press-releases/2014/05/ftc-recommends-congress-require-data-broker-industry-be-more


Article 29 Working Party Supports ECJ “Right to be Forgotten” Ruling

Posted by fgilbert on May 23rd, 2014

In a May 23, 2014 press release, the Article 29 Working Party (WP29) has indicated that it welcomes the May 13, 2014 ruling of the European Court of Justice (ECJ), which recognizes a “right to be forgotten” for individuals.

The WP29 also announced that it is planning a discussion among the EU data protection authorities at its upcoming plenary meeting on June 3-4, 2014 to analyze the consequences of the ECJ ruling. The WP29 indicated that it intends to develop guidelines to help build a common approach of EU data protection authorities on the implementation of this ECJ ruling. It is hoped that these guidelines will help clarify the criteria to be used when evaluating a data subject’s request to “be forgotten” against the public’s interest in having access to information.

The ECJ was requested to rule on a data subject’s right to obtain the deletion of links to certain search results. In its May 13, 2014 ruling, the ECJ concluded that web users have the right to request directly from the search engine the deletion of links to web pages containing information that breaches their rights under the Directive, even if the publication of the information on the web pages in question is in itself lawful.

The ECJ noted, however, that while the rights to privacy and to the protection of personal data set forth in the EU Charter of Fundamental Rights override the search engine’s economic interest, they are not absolute; the right to deletion of information will have to be assessed on a case-by-case basis depending on the nature of the information in question, on its sensitivity for the data subject, and on the interest of the public in having access to that information, considering in particular the role played by the data subject in public life.

This decision has significant consequences both for search engines and for the public. Search engines will have to incur costs in responding to individual requests to block unwanted links. Since the publication of the ruling, they have already been flooded by takedown requests from a wide range of individuals. To follow the ruling, they will have to assess and balance, on a case-by-case basis, the individual’s right to be forgotten against the public’s right to information. If links are blocked, the public might be deprived of information that could be relevant, useful, or necessary in making decisions.

In addition to the above, the ECJ ruling addresses two important issues that have been of great concern to companies that operate their websites on a worldwide basis. First, the ECJ ruling adopts a wide interpretation of the notion of “establishment” for determining the applicability of EU Directive 95/46/EC and national law to a company when the processing of personal data is carried out in the context of the activities of a subsidiary on the territory of a Member State, set up to promote and sell advertising space in that Member State. This is likely to influence national courts in the European Economic Area into asserting broad jurisdiction over companies based on their promotion and advertising activities.

The other important position taken in the May 13, 2014 ruling is the clarification of the concepts of “data processing” and “controller” in the context of the processing of personal data by search engines. So far, numerous companies that view themselves as service providers, such as search engines or cloud service providers, have argued that they were only data processors and that third parties were the data controllers. In its May 13, 2014 ruling, the ECJ determined that search engine providers are data controllers when they automatically index information published online and provide such information to web users according to a particular order of preference.

The May 13, 2014 ECJ ruling is a very important decision. It is likely to have significant consequences in many areas of the data protection field, and beyond. It may also affect the current discussions regarding a “right to be forgotten” or a “right to erasure” in the proposed EU Data Protection Regulation.

This post was also published by The Computer & Internet Lawyer (August 2014)  Volume 31, Number 8, page 18 (Wolters Kluwer publisher).

White House Big Data Report

Posted by fgilbert on May 7th, 2014


Big data tools offer astonishing and powerful opportunities to unlock previously inaccessible insights from new and existing data sets. Large amounts of data are processed through new techniques and technologies, dissecting the digital footprints individuals leave behind and revealing a surprising number of personal details. As a result, big data analytics have the potential to eclipse longstanding civil rights protections in how personal information is used in housing, credit, employment, health, education, and the marketplace.

The White House Big Data Report, published on May 1, 2014, suggests structures and safeguards to avoid negative or harmful consequences for individuals. The general theme of the Report and its recommendations centers on finding responsible uses of big data for the benefit of individuals, respecting their privacy and intimacy, and setting up better structures, disclosures, or technologies to allow for these new uses. The Report identifies five areas of focus: protecting privacy, preventing discrimination, ensuring responsible use of information by government agencies, harnessing data as a public resource, and using big data to enhance learning opportunities. The Report concludes with policy recommendations, including advancing the Consumer Privacy Bill of Rights; passing national data breach legislation; amending the Electronic Communications Privacy Act; expanding technical expertise to stop discrimination; extending privacy protections to non-U.S. persons; and ensuring that data collected on students in school are used for educational purposes.

Many of the proposed initiatives would be translated into new laws and regulations, which are likely to create obstacles and compliance requirements for businesses. These recommendations are likely to affect the way in which companies operate, how and why they collect data, and what uses they make of it.

Law Enforcement Powers

With the recent revelations on the extensive use of personal information by US and foreign government agencies, it is not surprising that the Report would recommend clarifying law enforcement’s powers and role. For example, government use of lawfully acquired commercial data should be evaluated to ensure consistency with the country’s values. Federal agencies should implement best practices for institutional protocols and mechanisms to ensure the controlled use and secure storage of data. Law enforcement use of predictive analytics should receive careful policy review. Federal agencies with expertise in privacy and data practices should provide technical assistance to state, local, and other federal law enforcement agencies seeking to deploy big data techniques.

Electronic Communications Privacy Act

There is no doubt that the Electronic Communications Privacy Act, almost 30 years old, is out of sync with the reality of today’s cloud services, texting, social media, and other means of communication that did not exist or were in their infancy in 1986. Consistent with the numerous initiatives already in progress, the Report recommends amending the ECPA to provide the same protection for online, digital content as that afforded in the physical world.

National Privacy Legislation

The Report recommends the adoption of a national data privacy law that would incorporate the principles laid out in the White House Consumer Privacy Bill of Rights. The Department of Commerce would be tasked with drafting legislative text implementing the Consumer Privacy Bill of Rights for submission by the President to Congress.

Notice and Consent

The traditional concepts of notice and consent, which have been a key requirement in all data protection regimes, may no longer be sufficient to protect personal privacy. The Report recognizes that notice and consent would be incompatible with the way big data functions, because it would block new, non-obvious, unexpectedly powerful uses of data. Thus, new criteria for access to and processing of data would have to be developed.

“Do Not Track”

“Do not track” has been at the forefront of numerous government and private initiatives in the US, at the state and federal levels, and internationally as well. Numerous obstacles are delaying implementation. Concurrently, companies are resisting the implementation of “do not track” to preserve their ability to analyze usage data to understand their market. The Report recommends strengthening “do not track” tools, technologies, and mechanisms to address the growing array of technologies available for recording individual actions, behavior, and location data across a range of services and devices.

Data Brokers

Data brokers have been the subject of intense scrutiny in the past few months, including several initiatives by the Federal Trade Commission, alleging violation of the US Fair Credit Reporting Act. The Report encourages the data broker industry to build a portal where data brokers would disclose their data practices and provide methods for consumers to better control the collection and use of their information and to opt-out of certain marketing uses. The Report suggestions might help sanitize or curb certain aggressive practices.

National Data Breach Legislation

More than ten years after California passed the first security breach disclosure law, federal legislators have not been able to pass a law that would provide uniformity. As a result, companies have to deal with 47 different state laws. The Report supports passing a national data breach law that would impose reasonable time periods for notification, minimize interference with law enforcement investigations, and potentially prioritize notification about large, damaging incidents over less significant incidents.

Global Privacy Frameworks

After having been the target of much criticism for its practices and its perceived lack of “adequate protection”, the United States is now stepping up its efforts to communicate with other world powers and attempting to establish, and participate in, bridges between the different privacy and data protection regimes, such as through its initiatives as part of the Asia Pacific Economic Cooperation (APEC).

The Report encourages the US Departments of State and Commerce to engage with the European Union, APEC, the Organization for Economic Cooperation and Development (OECD), and other stakeholders to evaluate how existing and proposed policy frameworks address big data. It recommends strengthening the U.S.-European Union Safe Harbor Framework and encourages more countries and companies to join the APEC Cross Border Privacy Rules system. It also promotes collaboration on data flows between the United States, Europe, and Asia through efforts to align Europe’s system of Binding Corporate Rules and the APEC CBPR system.

Discrimination

Big data may create tools or information that may lead to discrimination. The Report recommends that civil rights and consumer protection agencies expand their technical expertise and identify practices and outcomes that may have a discriminatory impact on protected classes, and develop a plan for investigating and resolving violations of law.

Protections for Non-U.S. Persons

Cloud computing and other technologies allow the presence, on US servers, of information generated by non-U.S. persons, and intended to be used outside the United States. The Report recommends that the 1974 Privacy Act be applied to non-U.S. persons where practicable, or that alternative privacy policies that apply appropriate and meaningful protection be applied to personal information regardless of a person’s nationality.

Conclusion

Big data has the potential for numerous positive developments, such as in the health or education areas. However, big data analytics and technologies – especially when combined with new means of collecting personal information such as sensors, wearable technologies, smart grid, or Internet of Things devices – create the potential for new uses of data. Some of these uses may be invasive and erode privacy rights. Structures are needed to help preserve intimacy and protect personal lives. The White House Big Data Report is an important step in the right direction, but it cannot remain just a report. The next steps will be crucial. The suggestions in the Report need to be taken further, analyzed in more depth, and distilled into practical, pragmatic steps, to help establish a workable balance between the different players and the different goals.