
Use of Cloud Computing in a Law Office

Posted by fgilbert on October 10th, 2013


Attorneys and law firms are increasingly interested in taking advantage of the proliferation of cloud computing services in their law practices. For example, they might wish to use web-based email to interact with their clients, or subscribe to customer relationship management (CRM) services offered as Software as a Service (SaaS) to manage their customer and prospect lists. They may also be tempted to store documents in the many storage services that are offered at no charge. New options emerge every day, as more applications are developed and marketed.

However, while cloud services present significant advantages, the use of cloud computing services by attorneys and law firms presents unique challenges due to the ethical rules to which attorneys are subject. In addition to ethical concerns, services provided in a cloud computing environment present a number of technical, physical, and contractual risks. Cloud computing agreements should be reviewed carefully before venturing into this new, complex form of outsourcing.

The Advantages of Cloud Computing

Cloud computing offers so many advantages that it is difficult to resist the temptation. Many services can be obtained at very low cost; in many cases, they may even be offered free of charge. Thus, it may be less expensive for a law firm to acquire these services from a cloud provider than to run and maintain an application on its own server, on its own premises. Maintenance is usually included in the offering, so there may be no need to worry about keeping up with updates, as they are installed automatically. The services are accessible from anywhere, a feature of great interest to attorneys who work long hours and may take advantage of the remote access capability to telecommute if needed. Altogether, cloud computing requires less in-house expertise, capability, and infrastructure, which may result in significant savings.

Cloud computing services may provide flexibility. As these services are often sold on demand, a law firm may take advantage of the elasticity to purchase as little as it needs on a regular basis, knowing that it can quickly ramp up and add storage, computing capability, or a few new features if the need arises.

Cloud computing may also provide increased stability and security. Reputable cloud providers usually employ the most up-to-date, sophisticated security measures. Their experienced, adequately trained staff excels at implementing security measures that take into account the current trends. They have access to sophisticated tools to monitor unauthorized access to the systems or manage permissions. These entities also have the ability to put in place sophisticated disaster recovery and business continuity features that are likely to be more powerful and effective than those that a small or lean law practice could implement.

However, entrusting data to cloud providers is not without danger. For instance, a large cloud provider known for servicing prestigious customers might also be the target of cyber attacks aimed at disrupting these customers’ operations or accessing their critical data. In addition, attorneys are subject to stringent ethical rules that may hamper their ability to use certain types of cloud services for certain purposes or with certain categories of data.

Ethical Rules

Before starting a search for cloud services that would make your practice so much more efficient, you should first determine whether the Ethical Rules that apply to your profession would allow your law firm to use cloud services. Ethical rules vary from one jurisdiction to another, but they tend to follow some common general principles.

Competence, Confidentiality

Most Ethical Rules that apply to attorneys contain a duty of competence and a duty of confidentiality. Will the professionals who use the new cloud-based program be sufficiently proficient, able to log in and out of the system and to save or annotate documents in a manner that does not put the confidentiality or integrity of the data at risk?

Duty to Supervise

The Ethical Rules may also contain a duty to supervise and may require an attorney who assigns work or responsibilities to a non-attorney (e.g., the cloud provider) to make reasonable efforts to ensure that the third party’s conduct is compatible with the attorney’s professional obligations.

Duty to Safeguard Client Data

Attorneys are also generally required to keep client property, such as files, information, and documents, appropriately safeguarded. Would a law firm be able to ensure proper safekeeping of its clients’ files if these files were stored in a cloud? Certain cloud services may host the data of several customers on the same server. Would this co-location be deemed an “appropriate safeguard”?

Further, the cloud provider may have structured its network so that the servers are spread throughout the world. Keep in mind that a foreign country would be likely to assert jurisdiction over any server located within its territory. These countries are also likely to have adopted different laws or standards with respect to third party or government access to data, confidentiality, or data ownership.

Duty to Communicate with Client

Finally, Ethical Rules for attorneys may contain a duty to communicate with clients. Would this duty require an attorney or law firm to promptly inform clients of any decision to store the clients’ data in a third party’s cloud and to seek their consent?

Given the potential application of these and other ethical rules, it would be prudent for attorneys and law firms that contemplate the use of cloud computing services to review carefully the ethical rules that apply to their profession in their region, and to review, as applicable, any opinion or guidance published by the authority that regulates their profession.

How to Manage Cloud Computing Risk

Numerous precautions and measures can be taken by attorneys to reduce their exposure to legal, commercial, and reputational risk in connection with the use of cloud services.

Internal Due Diligence

Before stepping into the cloud, you should conduct internal due diligence to determine the potential obstacles or constraints that might prohibit or restrict the use of cloud services by your law firm. For example, you should review the ethical rules that might apply to your organization, as discussed above. You should also determine whether the law firm or any of its professionals has entered into a confidentiality agreement or data use agreement that might restrict the transfer of data to third parties, even if these third parties are service providers. You should also determine whether the proposed use of a cloud service or host would require the prior consent of your clients.

Keep in mind, as well, that some data might be so sensitive or confidential that they should not be transferred to the cloud, or that the transfer might require significant precautions. This might be the case, for example, for files that pertain to high-stakes mergers or acquisitions.

External Due Diligence; Contracts

Make sure that you understand the particular application or service you are contemplating purchasing. How will the servers be used to process your data? While it is important to involve your information technology team, you should understand how the service will operate, where the servers will be located, whether your data will be co-located with other customers’ data, and how your data will be protected from intrusion or disasters. Ensure that the service will be reliable and easy to use by everyone at the law firm. Conduct appropriate due diligence on the proposed vendor and the proposed applications. Check references. Conduct online searches and/or call current clients to evaluate the vendor’s reputation.

You should also review the proposed contract carefully, even if you are told that it is not negotiable. First, it might actually be possible to negotiate changes. And even if it is not, you should understand the consequences and implications of the engagement you are making. Pay special attention to the disclaimers of liability, confidentiality, intellectual property, and security provisions.

Continuous Access to Data

Service outages happen regularly. It is important to ensure that the cloud service will provide alternative access to data, such as by switching to a server located in a different region if an outage affects a specific data center. The service provider should have in place a robust disaster recovery plan that alleviates the effect of outages.

Consider backing up your data to an alternative system or a second cloud provider, to ensure that you will be able to access the data in the event of an outage in the vendor’s facility or network, or in the event of a natural or other disaster.

Ensure that you have the ability to change providers when it becomes necessary or desirable to do so. Keep in mind, however, that while it may be feasible to move from one hosting service to another, changing applications, such as a customer relationship management system, is likely to be impossible, or very costly.

Many cloud contracts provide that, in the event of an outage, the customer will be refunded the portion of its monthly fee that corresponds to the duration of the outage. Be realistic about the actual effect of such a provision. The refund might be insignificant compared to the inconvenience, the loss of business, and the loss of data availability. For example, what would you do if you were in the middle of a trial or closing an acquisition, and suddenly the needed data were unavailable due to an outage or other force majeure event?
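The arithmetic behind such a clause is easy to sketch. The Python snippet below is illustrative only; the fee, the outage duration, and the pro-rata formula are assumptions for the example, not terms of any actual contract:

```python
# Illustrative sketch of a pro-rata outage refund clause.
# The numbers are hypothetical, not taken from any real agreement.

def prorated_refund(monthly_fee: float, outage_hours: float,
                    hours_in_month: float = 730.0) -> float:
    """Refund the portion of the monthly fee corresponding to the outage."""
    return round(monthly_fee * (outage_hours / hours_in_month), 2)

# A 12-hour outage on a $500/month service refunds only about $8.22,
# regardless of the business lost during those 12 hours.
print(prorated_refund(500.0, 12.0))  # 8.22
```

The point the sketch makes is the one in the paragraph above: the contractual remedy is proportional to the fee, not to the harm.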

Security, Security Breaches

Ensure that the data will be appropriately protected from unauthorized access or modification. Specific steps may be required, such as the installation of firewalls, access limitations, encryption, strong passwords or other authentication measures, and an electronic audit trail to monitor access to the data. Ensure that you are informed of any security breach that affects the data your law firm uploads to the cloud. You may have a legal and/or ethical obligation to inform your clients and regulators about an incident affecting these data. Negotiate compensation or indemnification by the service provider if a breach is caused by the cloud provider, whether affirmatively or through its negligence or its failure to maintain agreed-upon safeguards or reasonable security measures.
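Two of the measures mentioned above, access limitations and an electronic audit trail, can be illustrated with a minimal sketch. The roles, permissions, and document names below are hypothetical, and a real system would persist the log securely rather than keep it in memory:

```python
# Minimal sketch: role-based access limitation plus an audit trail.
# Roles, permissions, and file names are hypothetical examples.
import datetime

AUDIT_LOG = []
PERMISSIONS = {"partner": {"read", "write"}, "paralegal": {"read"}}

def access_document(user: str, role: str, document: str, action: str) -> bool:
    allowed = action in PERMISSIONS.get(role, set())
    # Every attempt, allowed or denied, is recorded for later review.
    AUDIT_LOG.append({
        "when": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "user": user, "document": document,
        "action": action, "allowed": allowed,
    })
    return allowed

print(access_document("alice", "partner", "merger_memo.docx", "write"))   # True
print(access_document("bob", "paralegal", "merger_memo.docx", "write"))   # False
```

The design choice worth noting is that denied attempts are logged too; an audit trail that records only successful access cannot help reconstruct an intrusion attempt.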

Data Ownership

Beware of obscure or confusing clauses that might give the cloud provider ownership of the data stored on its servers, or of the metadata associated with the access to or processing of your law firm’s or clients’ data. Ensure that the contracts with the service provider(s) acknowledge that the data are owned by the law firm and/or its clients, and not by the cloud provider.


Exit Strategy

Anticipate the need to terminate the service. Have an exit strategy in place so that the law firm may change its provider when it becomes necessary or desirable to do so.


Staff Training

Train your own staff and the professionals who will use the cloud services or products, and obtain their written agreement to comply with your security measures and those recommended by the cloud provider, such as the use of strong passwords and the prohibition on sharing passwords.


There is no doubt that cloud computing is here to stay and that companies will gradually move most of their data to the cloud. However, transferring the physical custody of one’s data to a third party does not relieve an organization of its legal obligations to protect these data, ensure their adequate security and integrity, limit their use to specific purposes, or ensure their availability. Thus, any company should carefully consider the pros and cons, as well as the consequences, of the use of cloud services. For lawyers and law firms, these concerns are compounded by others that stem from the specific ethical rules that govern the profession. Before venturing into the cloud, lawyers and law firms must evaluate the effect of the relevant rules of ethics to which they are subject, identify the categories of data that may be processed or stored in the cloud, and take the other measures necessary to ensure that they will be able to fulfill all of their legal and ethical duties to their clients.

How to address cybersecurity threats in medical devices

Posted by fgilbert on June 24th, 2013

The FDA has published for comment a draft guidance intended to assist the health industry in identifying and addressing cybersecurity threats in medical devices. Medical devices are frequently used to collect patients’ vital signs. The information is then transferred to a database within the medical office, or in the cloud, for further processing. For instance, a diabetic patient may be equipped with a device that collects blood samples and sends the information to a cloud-based service that makes a diagnosis, determines the right dosage of a drug, and sets the time at which the dosage should be administered to the patient.

To accomplish this, the medical device takes advantage of wireless, network, and Internet connections to exchange device-related health information collected from patients with a remote service or practitioner. The transmittal of patient information to remote computing facilities and its storage in a cloud can raise significant cybersecurity concerns. The interception and unauthorized use, modification, or deletion of critical patient information could have deadly consequences.

The draft guidance provides recommendations to consider and identifies documentation to be provided in FDA medical device premarket submissions in order to assure effective cybersecurity management and reduce the risk of compromise. Not surprisingly, the guidance recommends that engineers and manufacturers develop security controls to maintain the confidentiality, integrity, and availability of the information collected from the patient and transmitted to the medical cloud that stores and processes the information.

The draft guidance suggests the use of “cybersecurity by design”, a concept similar to that of “privacy by design,” to bake into the design of the medical devices and the equipment connected to these devices, the much-needed security features that could ensure more robust and efficient mitigation of cybersecurity risks.

The proposed guideline outlines the steps to be used for this purpose and stresses the importance of documenting the different steps taken:

  • Conduct a risk analysis and develop a management plan as part of the risk analysis;
  • Identify the assets at risk, the potential threats to these assets and the related vulnerabilities;
  • Assess the impact of the threats and vulnerabilities on the device functionality;
  • Assess the likelihood that a vulnerability might be exploited;
  • Determine the risk levels and suitable mitigation strategies;
  • Assess residual risk, and define risk acceptance criteria.
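The scoring step of such an analysis is often implemented as a simple likelihood-times-impact matrix. The sketch below is a hedged illustration of that approach; the scales and the acceptance threshold are assumptions for the example, not values taken from the FDA draft guidance:

```python
# Illustrative risk scoring: likelihood x impact against an acceptance
# criterion. Scales and threshold are assumed, not FDA-specified.

LIKELIHOOD = {"low": 1, "medium": 2, "high": 3}
IMPACT = {"minor": 1, "serious": 2, "critical": 3}
ACCEPTABLE_RISK = 3  # assumed risk acceptance criterion

def risk_level(likelihood: str, impact: str) -> int:
    return LIKELIHOOD[likelihood] * IMPACT[impact]

def needs_mitigation(likelihood: str, impact: str) -> bool:
    return risk_level(likelihood, impact) > ACCEPTABLE_RISK

# A low-likelihood but critical threat scores 1 * 3 = 3: within the criterion.
# A high-likelihood, critical threat scores 3 * 3 = 9: mitigation required.
print(needs_mitigation("low", "critical"))   # False
print(needs_mitigation("high", "critical"))  # True
```

Whatever the scales chosen, the documentation point in the guidance still applies: the criterion and each score should be recorded so the residual-risk decision can be traced.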

As always, the issue is one of balance: balancing the universe of threats against the probability of a security breach. Factors to be taken into account include the type of medical device, the environment in which it is used, the type and probability of the risks to which it is exposed, and the probable risks to patients from a security breach. In addition, the guidance recommends that manufacturers carefully consider the balance between cybersecurity safeguards and the usability of the device in its intended environment of use (e.g., home use vs. healthcare facility use) to ensure that the security capabilities are appropriate for the intended users.

The FDA draft guidance recommends that medical device manufacturers should be prepared to provide justification for the security features chosen and consider appropriate security controls for their medical devices including, but not limited to:

  • Limit access to trusted users only;
  • Ensure trusted content;
  • Use fail-safe and recovery features.

The proposed guidance also identifies the type of documentation that should be developed in preparation for premarket submission filed with the FDA. This information includes:

  • Hazard analysis, mitigations, and design considerations pertaining to intentional and unintentional cybersecurity risks associated with the device;
  • Traceability matrix that links the cybersecurity controls to the cybersecurity risks that were considered;
  • Systematic plan for providing validated updates and patches to operating systems or medical device software;
  • Documentation to demonstrate that the device will be provided to purchasers and users free of malware; and instructions for use and product specifications related to recommended antivirus software and/or firewall use appropriate for the environment of use.

The Draft Guidance is available at


New York State launches investigation of top insurance companies’ cybersecurity practices. Who’s next?

Posted by fgilbert on June 4th, 2013

The State of New York has launched an inquiry into the steps taken by the largest insurance companies to keep their customers and companies safe from cyber threats. This is the second inquiry of its kind. Earlier this year, a similar investigation targeted the cybersecurity practices of New York-based financial institutions.

On May 28, 2013, the New York Department of Financial Services (DFS) issued letters pursuant to Section 308 of the New York Insurance Law (“308 Letters”) to 31 of the country’s largest insurance companies, requesting information on the policies and procedures they have in place to protect health, personal and financial records in their custody against cyber attacks.

Among other things, the 308 Letters request:

  • Information on any cyber attacks to which the company has been subject in the past three years;
  • The cyber security safeguards that the company has put in place;
  • The company’s information technology management policies;
  • The amount of funds and other resources that are dedicated to cyber security;
  • The company’s governance and internal control policies related to cyber security.

The insurance companies will have a short period to respond to the questionnaire.  For further detail see Press Release of the New York Governor’s Office.

It is not clear what the State of New York will do with the information collected from the responses to this inquiry, but this initiative is likely to be followed with great interest by other state insurance and financial industry regulators. Indeed, both insurance and financial services institutions collect, process, and retain a significant amount of highly sensitive personal information about prospective, current, and past customers.

Companies in the insurance or financial services sectors, as well as their respective service providers, should take the time to review their risk assessments, policies and procedures, especially with a focus on evaluating whether they adequately address known vulnerabilities, meet the current “best practices” standards, and are keeping up with the most recent technologies and forms of cyber attacks.

Hot Issues in Data Privacy and Security

Posted by fgilbert on April 22nd, 2013

Data privacy and security issues, laws, and regulations are published, modified, and superseded at a rapid pace around the world. The past ten years, in particular, have seen a significant uptick in the number of laws and regulations that address data privacy or security on all continents. On March 1, 2013, a program held at Santa Clara University’s Markkula Center for Applied Ethics, titled “Hot Issues in Global Privacy and Security”, featured attorneys practicing on all continents who provided an update on the privacy, security, and data protection laws in their respective countries.

The second half of the program featured a panel moderated by Francoise Gilbert, where the chief privacy counsel of McAfee, Symantec and VMWare talked about how to drive a global privacy and security program in multinational organizations.

Videos of the program are available by clicking here.

The program was the second part of a two-day series of events. The first event was held in San Francisco on February 28, 2013, and was sponsored by Box, Inc. and the Cloud Security Alliance. This program focused on US and foreign government access to cloud data and started with an overview of the laws that regulate US government access to data, presented by Francoise Gilbert. A panel featuring European and North American attorneys followed; they discussed the equivalent laws in effect in their respective countries. The program concluded with a presentation by the general counsel of Box, Inc., who spoke about the way in which his company responds to government requests for access to stored data.

Videos of the program are available by clicking here.

Article 29 Working Party’s Opinion on Mobile App Privacy

Posted by fgilbert on March 15th, 2013

On March 14, 2013, the European Union’s Article 29 Working Party published its opinion on the unique privacy and data protection issues faced by applications used on mobile devices. The 30-page opinion provides an analysis of the technical and legal issues, and concludes with a series of recommendations to app developers, platform developers, equipment manufacturers, and third parties.

In many respects, this new opinion of the Article 29 Working Party is very similar to the document that the Federal Trade Commission published recently on the same topic. It addresses many themes also found in the FTC documents regarding the use of mobile applications in general, or mobile applications directed to children.

The Article 29 Opinion WP 202 provides two series of recommendations for application developers. The first set of recommendations is in fact a recitation of the general principles set forth in the proposed Data Protection Regulation, but adapted to the specific context of the mobile world, with references to location data, unique device identifiers, and SMS. There are also references to other modern concepts, such as privacy by design, also found in the proposed Data Protection Regulation, but absent from Directive 95/46/EC, the directive currently in effect.

The second set of recommendations to application developers includes specific guidance on the actions to be taken.  These include:

  • Adopting appropriate measures that address the risks to the data;
  • Informing users about security breaches;
  • Telling users what types of data are collected or accessed on the device, how long the data are retained, and what security measures are used to protect these data;
  • Developing tools to enable users to decide how long their data should be retained, based on their specific preferences and contexts, rather than offering pre-defined retention terms;
  • Including information in their privacy policy dedicated to European users;
  • Developing and implementing simple but secure online access tools for users, without collecting additional excessive personal data;
  • Developing, in cooperation with OS and device manufacturers and others, innovative solutions to adequately inform users on mobile devices, such as through layered information notices combined with meaningful icons.

The remainder of the recommendations is addressed to app stores, OS and device manufacturers, and third parties.

The protection of children reappears as a common theme in the recommendations to the different players in the mobile market. Each set of recommendations provided in WP 202 stresses that these players should limit their collection of information from children, refrain from processing children’s data for behavioral advertising purposes, and refrain from using their access to a child’s account to collect data about the child’s relatives or friends.

FTC v. Google V2.0 – Lessons Learned

Posted by fgilbert on August 13th, 2012

The Federal Trade Commission has published its long-awaited Proposed Consent Order with Google to close its second investigation into Google’s practices (Google 2). Under the proposed document, Google would agree to pay a record $22.5 million civil penalty to settle charges that it misrepresented to users of Apple’s Safari browser that it would not place tracking cookies on their browsers or serve targeted ads. It would also have to disable all tracking cookies that it had said it would not place on consumers’ computers, and report to the FTC by March 8, 2014 on how it has complied with this remediation requirement.

Google 2 Unique Aspects

Unlike most consent orders published by the FTC, the Google 2 Consent Order does not primarily address actual violations of privacy promises. Rather, it addresses the fact that Google’s activities allegedly violated a prior settlement with the FTC, dated October 2011 (Google 1).

As such, beyond evidencing the FTC’s ongoing efforts to ensure that companies live up to the privacy promises they make to consumers, Google 2 clearly shows that the FTC takes seriously the commitments it requires from companies it has previously investigated. When an FTC consent decree requires a 20-year commitment to abide by certain practices, the FTC may, indeed, return and ensure that the obligations outlined in the consent decree are met.

Privacy Promises Are Made Everywhere

A significant aspect of the proposed Google 2 Consent Order and related Complaint is that privacy promises are made in numerous places beyond a company’s online privacy statement. They are found, as well, in other representations made by the company, such as its regulatory filings or its marketing or promotional documents. In the Google 1 enforcement action, the FTC looked at the promises and representations made in Google’s Safe Harbor self-certification filings. In the Google 2 enforcement action, the FTC looked at the promises and representations made in Google’s statements that it complied with the Self-Regulatory Code of Conduct of the Network Advertising Initiative (NAI).

Misrepresentation of compliance with NAI Code

In the third count of the FTC Complaint in Google 2, the FTC focuses on Google’s representation that it adheres to, or complies with the NAI Self-Regulatory Code of Conduct. The alleged violation of this representation allows the FTC to claim that Google violated its obligation under Google 1 to not “misrepresent the extent to which it complies with, or participates in, a privacy, security, or other compliance program sponsored by the government or any other entity”.

Evolution of the FTC Common Law

Google 2 shows a clear evolution of the FTC “Common Law” of privacy. As the concept of privacy compliance evolves, the nature of the FTC’s investigations becomes more refined and more expansive. In its prior cases, the FTC first focused on violations of companies’ privacy promises made in their public Privacy Statements. Then, more recently, in several consent orders, including Google 1, the FTC expanded the scope of its enforcement actions to include violations of the Safe Harbor Principles outlined by the US Department of Commerce and the EU Commission. Now, with Google 2, the FTC expands again the scope of its enforcement actions to include potential violations of representations of compliance with the NAI Self-Regulatory Code of Conduct. This trend is likely to continue, and in future cases we should expect to see FTC investigations expand into verifying compliance with statements that a company follows other self-regulatory industry standards.

What Consequences for Businesses?

Companies often use their membership in industry groups or privacy programs as a way to show their values, and to express their commitment to certain standards of practice. This was the case for Google with the Safe Harbor of Department of Commerce and of the European Union (Google 1), and with the Network Advertising Initiative (Google 2).

These promises to comply with the rules of a privacy program are not just statements made for marketing purposes. The public reads them, and so do the FTC and other regulators.

Privacy programs such as the Safe Harbor or the NAI Code have specific rules.  As shown in the Google 1 and Google 2 cases, failure to comply with the rules, principles and codes of conducts associated with membership in these programs could be fatal.

If the disclosures made are not consistent with the actual practices and procedures, such a deficiency would expose the company to claims of unfair and deceptive practices, or, in the case of Google, to substantial fines for failure to comply with an existing consent decree barring future misrepresentation.

If your company makes promises or statements about its privacy or security practices, remember and remind your staff that these representations may have significant consequences, and may create a minefield if not attended to properly:

  • Look for these representations everywhere, and not just in the official company Privacy Statement; for example, look at the filings and self-certification statements, the cookie disclosures, the marketing or sales material, the advertisements;
  • Periodically compare ALL promises that your business makes with what each of your products, services, applications, technologies, devices, cookies, tags, etc. in existence or in development actually does;
  • Educate your IT, IS, Marketing, Communications, Sales, and Legal teams about the importance of working together, and of coordinating efforts so that those who develop statements and disclosures about the company’s policies and values fully understand, and are aware of, all features and capabilities of the products or services that others in the company are designing and developing;
  • If your company claims that it is a member of a self-regulatory or other privacy compliance program, make sure that you understand the rules, codes of conduct or principles of these programs or industry standards; and ensure that the representations of your company’s compliance with these rules, codes of conduct, principles are accurate, clear and up-to-date;
  • Ensure that ALL of your company’s products and services comply and are consistent, at ALL times, with ALL of the promises made by, or on behalf of, the company in ALL of its statements, policies, disclosures, and marketing materials.

FTC v. Google 2012 – Misrepresentation of Compliance with NAI Code a Key Element

Posted by fgilbert on August 9th, 2012

Google was hit with a $22.5 million penalty as a result of an investigation by the Federal Trade Commission covering Google’s practices with users of the Safari browser. A very interesting aspect of this new case against Google (Google 2) is that it raises the issue of Google’s violation of the Self-Regulatory Code of Conduct of the Network Advertising Initiative (NAI Code). This is an interesting evolution in the history of FTC rulings. At first, the FTC focused on violations of privacy promises made in Privacy Statements; then it went on to pursue violations of the Safe Harbor Principles. In this new iteration, the FTC attacks misrepresentation of compliance with an industry standard.

Misrepresentation of user’s ability to control collection or use of personal data

Two elements distinguish this case (Google 2) from most of the prior enforcement actions of the FTC. One is that the large fine results not directly from actual violations of privacy promises made in Google’s privacy policy, but rather from the fact that Google’s activities were found to violate a prior settlement with the FTC, dated October 2011 (Google 1).

In Google 1, Google promised not to misrepresent:

  • (a) The purposes for which it collects and uses personal information;
  • (b) The extent to which users may exercise control over the collection, use and disclosure of personal information; and
  • (c) The extent to which it complies with, or participates in, a privacy, security, or other compliance program sponsored by the government or any other entity.

According to the FTC complaint in Google 2, Google represented to Safari users that it would not place third-party advertising cookies on the browsers of Safari users who had not changed the default browser setting (which, by default, blocked third-party cookies) and that it would not collect or use information about users’ web-browsing activity. These representations were found to be false by the FTC, resulting in a violation of Google’s obligations under Google 1 (see paragraph (b) in the bulleted list above).

Misrepresentation of compliance with NAI Code

The second, and more interesting, element of the Google 2 decision is the FTC’s analysis of Google’s representation that it adheres to, or complies with, the Self-Regulatory Code of Conduct of the Network Advertising Initiative (NAI Code). In the third count of the FTC complaint in Google 2, the FTC focuses on Google’s alleged violation of the NAI Code.

This alleged violation allows the FTC to show that Google violated its obligation under Google 1 not to “misrepresent the extent to which it complies with, or participates in, a privacy, security, or other compliance program sponsored by the government or any other entity” (see requirement (c) in the bulleted list above). The FTC found that the representation of Google’s compliance with the NAI Code was false, and thus violated Google’s obligation under Google 1 not to make any misrepresentation about following compliance programs.

Evolution of the FTC Common Law

Google 2 shows an interesting evolution of the FTC “Common Law.” In its prior cases, the FTC first focused on violations of companies’ privacy promises made in their public privacy statements. Then, in several consent orders published in 2011, including Google 1, the FTC expanded the scope of its enforcement actions to violations of the Safe Harbor program of the US Department of Commerce and the European Commission. Now, with Google 2, the FTC again expands the scope of its enforcement actions to include violations of industry standards such as the NAI Code.

What this means for businesses

The Google 2 Consent Order has significant implications for all businesses.

Companies often use their membership in industry groups as a way to show their values and to express their commitment to certain standards of practice. Be careful about which industry group or program you join, and understand its rules. As a member of that group or program, you must abide by its code of conduct, rules, or principles. Make sure that you do, and that all aspects of your business comply with these rules.

When a business publicizes its membership in an industry group or a self-regulatory program, it also publicly represents that it complies with the rules or principles of that group or program, such as those of the Safe Harbor (as was the case in Google 1) or those of the NAI (as was the case in Google 2). Remember that these representations may have significant consequences, and may create a minefield if not attended to properly. To stay out of trouble, the company must make sure that these representations are accurate, and that it abides by these promises at all times and with respect to all of its products.

When a company makes a public commitment to abide by certain rules, it must make sure that it does comply with these rules; otherwise, it is exposed to an unfair and deceptive practices action. Make sure that you periodically compare ALL the promises your business makes with what ALL of your products, services, applications, and technologies actually do.

Mobile App Privacy Webinar on April 19, 2012

Posted by fgilbert on April 17th, 2012

On Thursday, April 19, 2012, at 10 am PT / 1 pm ET, I will be moderating and presenting at a one-hour webinar organized by the Practising Law Institute: “A New Era for Mobile Apps? What Companies Should Know to Respond to Recent Mobile Privacy Initiatives”.

The webinar will start with an overview of the technologies and ecosystem that surround the operation and use of mobile applications, presented by Chris Conley, Technology and Civil Liberties Attorney, ACLU Northern California (San Francisco).

Patricia Poss, Chief, BCP Mobile Technology Unit, Federal Trade Commission (Washington DC), will then comment on the two reports recently published by the Federal Trade Commission: “Mobile Apps for Children” (February 2012) and the final report “Protecting Consumer Privacy in an Era of Rapid Change” (March 2012), which both lay out a framework for mobile players.

I will follow with an overview of the recent agreement between the California State Attorney General and six major publishers of mobile apps, which sets up basic rules and structures for the publication and enforcement of mobile app privacy policies, and the Consumer Privacy Bill of Rights, which was unveiled by the White House in February 2012.  I will end with suggestions for implementing privacy principles in the mobile world.

To register for this webinar, please visit the PLI website.


Never too Small to Face an FTC COPPA Action

Posted by fgilbert on November 9th, 2011

Some companies think that, because they are small, they can fly under the radar and need not worry about compliance. They should rethink their analysis of their legal risks after the recent FTC action against a small social networking site.

On November 8, 2011, the FTC announced a proposed settlement with the social networking site Skid-e-kids, which collected personal information from children without obtaining prior parental consent, in violation of COPPA, and made false statements in its website privacy notice, in violation of the FTC Act.

In this case, the personal information of 5,600 children was illegally collected, far fewer records than in some of the recent FTC COPPA enforcement actions. For example, the 2006 action against Xanga revealed that Xanga had collected 1.7 million records; the 2008 action against Sony identified 30,000 records; and the 2011 action against W3 Innovations identified 50,000 illegally collected records.

The Problem

The social networking site Skid-e-kids targeted children ages 7-14 and allowed them to register, create and update profile information, create public posts, upload pictures and videos, send messages to other Skid-e-kids members, and “friend” them.

According to the FTC complaint, the website owner – a sole proprietor – was charged with:

  • Failing to provide sufficient notice of its personal data handling practices on its website;
  • Failing to provide direct notice to parents about these practices; and
  • Failing to obtain verifiable parental consent.

In addition, these practices were found to be misleading and deceptive, which in turn was deemed to violate Section 5 of the FTC Act.

The site’s online privacy statement claimed that the site requires child users to provide a parent’s valid email address in order to register on the website, and that it uses this information to send parents a message that can be used to activate the Skid-e-kids account, to notify parents about its privacy practices, and to send parents communications about features of the site.

According to the FTC, however, Skid-e-kids actually registered children on the website without collecting a parent’s email address or obtaining parental permission for the children to participate. Children who registered were able to provide personal information, including their date of birth, email address, first and last name, and city.

The Proposed Settlement

The proposed Consent Decree and Settlement Order is directed against Jones O. Godwin, sole owner of the site. The proposed settlement would:

  • Bar Skid-e-kids from future violations of COPPA and misrepresentations about the collection and use of children’s information;
  • Require the deletion of all information collected from children in violation of the COPPA Rule;
  • Require that the site post a clear and conspicuous link to On Guard Online, the FTC site focusing on the protection of children’s privacy, and that the site’s privacy statement as well as the privacy notice for parents also contain a reference to the On Guard Online site;
  • Require that, for 5 years, the company engage qualified privacy professionals to conduct annual assessments of the effectiveness of its privacy controls, or become a member in good standing of a COPPA Safe Harbor program approved by the FTC; and
  • Require that, for 8 years, records be kept to demonstrate compliance with the above.

A lenient fine … subject to probation

An interesting aspect of the proposed settlement is that it, in effect, imposes only a $1,000 fine on the defendant. The fine is to be paid within five days of the entry of the order. However, if Skid-e-kids fails to comply with certain requirements of the settlement, it will have to pay the full $100,000 fine that is provided for in the settlement.

Specifically, a $100,000 fine will be assessed if:

  • The defendant fails (a) to have an initial and annual privacy assessments (for a total of 5 annual assessments) conducted by a qualified professional approved by the FTC, identifying the privacy controls that have been implemented and how they have been implemented, and certifying that the controls are sufficiently effective; or (b) to become a member in good standing of a COPPA Safe Harbor program approved by the FTC for 5 years; or
  • The disclosures made about the defendant’s financial condition are materially inaccurate or contain material misrepresentations.

The Lesson for Sites with Children’s Content

This new case is a reminder that the COPPA Rule contains specific requirements that must be followed, no matter the size of the site, when intending to collect children’s personal information. The COPPA Rule defines procedures and processes that must be followed rigorously.

Among other things, the COPPA Rule requires websites that are directed to children, and general audience websites that have actual knowledge that they are collecting children’s information, to:

  • Place on their websites a conspicuous link to the privacy statement;
  • Provide specified information in the website privacy statement, describing in clear terms what personal information of children is collected and how it is used, and explaining what rights children and parents have to review and delete this information;
  • Provide a notice directly to parents, which must include the website privacy statement, inform parents that their consent is required for the collection and use of the children’s information by the site, and explain how their consent can be obtained;
  • Obtain verifiable consent from the parents before collecting or using the children’s information;
  • Give parents the option to agree to the collection and use of the children’s information without agreeing to the disclosure of this information to third parties.

In addition, we suggest also including, clearly and conspicuously, (a) in the website privacy statement; (b) in the notice to parents; and (c) at each location where personal information is collected, a notice that invites the user to visit the On Guard Online website of the Federal Trade Commission for tips on protecting children’s privacy online.




New EU Directive on Consumer Rights Affects Website Terms

Posted by fgilbert on November 9th, 2011

In late October 2011, the European Council of Ministers formally adopted the new EU Consumer Rights Directive. The new Directive will drastically affect the rules that apply to online shopping. Numerous provisions will also apply to both the online and the offline markets.

Scope of the Consumer Rights Directive

The Directive is intended to protect “consumers,” i.e., all natural persons who are acting for purposes that are outside their trade, business, craft, or profession. It creates new obligations for “traders,” a broad term that encompasses all categories of persons who sell products or services. The Directive defines the term “trader” as any natural or legal person who is acting, directly or indirectly, for purposes relating to his or its trade, business, craft, or profession in relation to contracts covered by the Directive. These contracts include: sales contracts, service contracts, distance contracts, off-premises contracts, and public auction contracts that are concluded between a trader and a consumer.

There are numerous exceptions, such as contracts for healthcare services, financial services, the construction of new buildings, package travel, and passenger transport services, as well as contracts concluded by means of automatic vending machines.

Effect on US Companies

US companies that operate websites that sell to European customers, as well as their affiliates who make direct sales to EU consumers, must start evaluating the numerous consequences that the implementation of the Directive on Consumer Rights will have on their operations. The consequences include:

  • Practical consequences: The Directive introduces a new way of doing things. Thus, there will be a need to adapt the existing processes, procedures, and interactions with the customer to the new order. Forms and purchase orders will have to be revised.
  • Logistics: The Directive encourages returns. Under the new regime, customers will have 14 days to change their minds and return the purchased goods. Thus, the rate of returns will increase. Logistics will have to change to allow companies to handle a heavier rate of returns.
  • Financial consequences: Merchants and traders will have to bear more costs. For example, hotline services will be permitted to charge only the actual telephone rate for phone calls.
  • Rewrite of terms: Terms of sale will have to be clearer and more explicit. For example, additional charges must be clearly explained, or the customer will not bear these charges. Thus, new terms will have to be drafted in order to communicate better with customers.

Overview of the changes

The Directive will require extensive changes in the Consumer Protection Laws of the Member States, including changes to implement the following requirements:

  • Pre-ticked boxes on websites will be banned

Pre-ticked boxes will be banned, so that consumers do not inadvertently get charged for options or services that they did not intend to purchase. Currently, consumers are frequently forced to untick these boxes if they do not want extra services.

  • Price transparency will be increased

Consumers will not have to pay charges or other costs if they were not properly informed before they place an order. Traders will be required to disclose the total cost of the product or service, as well as any extra fees.

  • Hidden charges and costs on the Internet will be eliminated

Consumers will be required to explicitly confirm that they understand that they have to pay a price. This measure is expected to prevent hidden charges and costs that arise when companies try to trick consumers into paying for “free services,” such as horoscopes or recipes.

  • Surcharges for the use of hotlines prohibited

Traders who operate telephone hotlines allowing the consumer to contact them in relation to the contract will not be able to charge more than the basic telephone rate for the telephone calls.

  • Surcharges for the use of credit cards prohibited

Traders will not be able to charge consumers more for paying by credit card (or other means of payment) than what it actually costs the trader to offer such means of payment.

  • Better consumer protection in relation to digital products

Information on digital content will have to be clearer, including about its compatibility with hardware and software and the application of any technical protection measures, for example digital rights management applications, which limit the right of consumers to make copies of the content.

  • 14 Days to change one’s mind on a purchase

Consumers will be able to return goods that they purchased if they change their minds within 14 calendar days. This extends the current return period by 7 days. In addition, if a seller has not clearly informed the customer about the right to return the goods, the return period will be extended to a year.

The 14-day return period will start from the moment the consumer receives the goods. The rules will apply to Internet, phone, and mail order sales, as well as to sales outside shops (e.g., on the consumer’s doorstep, in the street, at a home party, or during an excursion organized by the trader).

The right of withdrawal is extended to online auctions, such as eBay. However, the ability to return goods bought in auctions will be limited to goods bought from a professional seller. In the case of digital content, such as music or video downloads, consumers will have a right to withdraw from purchases of digital content only up until the moment the actual downloading process begins.

  • Better refund rights

Traders will be required to refund consumers for the product within 14 days of the withdrawal. This includes the costs of delivery. In general, the trader will bear the risk for any damage to goods during transportation, until the consumer takes possession of the goods.

  • Clearer information on who pays for returning goods must be provided

Traders who want the consumer to bear the cost of returning goods after changing their mind will be required to clearly inform consumers of this requirement beforehand. Otherwise, the traders will have to pay for the return themselves.

At a minimum, they will have to clearly give, before the purchase, an estimate of the maximum costs of returning bulky goods (e.g. a sofa) bought on the Internet or through mail order.

  • Common rules will apply throughout the European Union

A single set of rules for distance contracts (sales by phone, post, or Internet) and off-premises contracts (sales away from a company’s premises, such as in the street or on the doorstep) will apply throughout the European Union. Standard forms will be used, such as a form to comply with the information requirements on the right of withdrawal.

Implementation in the national laws

The EU Member States will have two years to implement the Directive into their national laws. The deadline for implementation will be computed from the date of publication of the Directive in the Official Journal of the European Union.

Based on experience with the implementation of other directives, we can expect that several EU countries will have implemented the Consumer Rights Directive by the end of 2013, with the remainder following in subsequent years. As always, the manner in which each country implements the Directive will be crucial. If the Member States diverge in their interpretations of the Directive, websites that reach customers across borders will have to juggle these discrepancies.

Relations with existing directives

The Directive on Consumer Rights will replace the current Directive 97/7/EC on the protection of consumers in respect of distance contracts and the current Directive 85/577/EEC on the protection of consumers in respect of contracts negotiated away from business premises.

However, Directive 1999/44/EC on certain aspects of the sale of consumer goods and associated guarantees and Directive 93/13/EEC on unfair terms in consumer contracts will remain in force.