
Privacy v. Data Protection. What is the Difference?

Posted by fgilbert on October 1st, 2014

I recently participated in a discussion about the difference between “privacy” and “data protection.” My response was “it depends.” It depends on the country. It may also depend on other factors.

When some countries use the term “privacy,” they may mean the same thing or refer to the same principles as what other countries identify as “data protection.” In other countries, “data protection” may be used to mean “information security” and to overlap only slightly with “privacy.” In this case, the term “data protection” may encompass more than just the protection of personal information (but only through security measures). It may cover as well the protection of confidential or valuable information, trade secrets, know-how, or similar information assets.

In the extensive research I conducted when writing my two-volume treatise, Global Privacy and Security Law, which provides an in-depth analysis of the laws of about 70 countries on all continents, I noticed that the use of the terms “privacy” and “data protection” varies from country to country. It may depend on the language spoken in that particular country. It may depend on the region where the country is located.

While in the United States the term “privacy” seems to prevail when identifying the rules and practices regarding the collection, use and processing of personal information, outside the United States, the term “data protection” tends to be more widely used than “privacy.” Among other things, this might be due to the idiosyncrasies of the languages spoken in the respective countries, as explained below.

— “Data Protection” Outside the United States

Throughout the world, “data protection” is frequently used to designate what American privacy professionals call “privacy”, i.e., the rules and practices regarding the handling of personal information or personal data, such as the concepts of notice, consent, choice, purpose, security, etc.

Europe

In Europe, “data protection” is a key term used, among other things, to designate the agencies or individuals supervising the handling of personal information. The 1995 EU Data Protection Directive identifies these agencies as “Data Protection Supervisory Authorities.” See, e.g., 1995 EU Data Protection Directive, Article 28, defining the “Data Protection Supervisory Authority,” the agency that regulates and oversees the handling of personal data in an EU Member State. The individuals responsible for the handling of personal information within a company – a role similar to, but different from, that of the American Chief Privacy Officer – are designated as “Data Protection Officials.” See, e.g., 1995 EU Data Protection Directive, Article 18(2) and Article 19.

Asia

Outside Europe, the term “data protection” is also frequently used to designate activities that Americans would describe as “privacy” centric. In Asia, for example, the laws of Malaysia, Singapore, and Taiwan are each named “Personal Data Protection Act.” The law of Japan is called the “Act on the Protection of Personal Information.” South Korea’s laws, APICNU and the recent Personal Information Protection Act, also use the term “data protection.”

Africa

African countries also use the concept of “data protection” rather than “privacy.” South Africa named its new law the “Protection of Personal Information Act.” Tunisia and Morocco also named their privacy laws “law relating to the protection of individuals with respect to the processing of personal data.”

Americas

In the Americas, Canada’s PIPEDA stands for Personal Information Protection and Electronic Documents Act. The new Mexican law is called “Ley Federal de Protección de Datos Personales.”

— “Privacy” in Foreign Laws

On the other hand, the term “privacy” is seldom used to identify foreign laws or regimes dealing with the protection of personal information. There are, however, a few examples of the use of the term “privacy” outside the United States. APEC used the term “privacy” for its 2004 “APEC Privacy Framework.” The law of the Philippines is called the “Data Privacy Act.”

— Translations of “Privacy”

When analyzing which term is used to address the protection of personal data throughout the world, it is also important to keep in mind that the word “privacy” (as understood in the United States) does not exist in some languages.

French

It is very difficult to translate “privacy” into French. There is no such word in French, even though the French are highly private and very much concerned about the protection of their personal information. If you look for a translation, you will find that “privacy” is translated into French as “intimité,” which is inaccurate, or very narrow. The French “intimité” is actually equivalent to “intimacy” in English and has little to do with the US concept of “privacy” or “information privacy.” Indeed, the French law of 2004 does not refer to “intimacy” but is titled “Act relating to the protection of individuals with regard to the processing of personal data.”

Spanish

There is a similar disconnect when translating “privacy” into Spanish: the usual rendering is “privacidad,” which has a meaning closer to intimacy, remoteness, or isolation. Unsurprisingly, the Spanish data privacy law is named the “Organic Law on the Protection of Personal Data.” The term “privacidad” is not used.

 

— Data Protection as “Security”

On the other hand, in the US, the term “privacy” seems to prevail. We commonly refer to HIPAA or COPPA as “privacy laws.”

What about “data protection”? I have noticed that many US information security professionals tend to use the term “data protection” to mean protecting the security of information, i.e., the protection of the integrity and availability of data. In this case, they do not distinguish the protection of personal data from the protection of company data, because from a security standpoint, the same tools may apply to both types of data. In other circles, the terms “information security,” “data security,” and “cybersecurity” are frequently used as well.

— Online Searches

Finally, if you are based in the US, and you run an online search for “data protection”, you will see that the search results either provide links to “security” products (e.g. in my case, a link to McAfee Data Protection product that prevents data loss and leakage) or links to foreign laws dealing with what Americans call “privacy”, (e.g. in my case, a link to Guide to Data Protection from the UK Information Commissioner’s Office).

Verizon to pay $7.4 million to settle FCC privacy enforcement action

Posted by fgilbert on September 8th, 2014

The Enforcement Bureau of the Federal Communications Commission (FCC) reached a $7.4 million settlement with Verizon on September 3, 2014, after an investigation into the company’s use of customers’ personal information for marketing purposes. This $7.4 million fine is the largest such payment in FCC history for settling an investigation related solely to the privacy of phone customers’ personal information.

Section 222 of the Communications Act, entitled “Privacy of Customer Information,” imposes a duty on every telecommunications carrier to protect the “proprietary information” of its customers. These obligations are further clarified in the Customer Proprietary Network Information Rules (CPNI Rules) of the FCC.

Among other things, phone companies are prohibited from accessing or using certain personal information except in limited circumstances. To be able to use customers’ information for certain marketing purposes, phone companies must obtain their customers’ approval through an opt-in or an opt-out process. When that opt-out process fails, the phone company must report the problem to the FCC within five business days.

The FCC investigation found that, beginning in 2006, and continuing for seven years thereafter, Verizon failed to notify approximately two million new customers, in their welcome letters or on their first invoices, of their privacy rights, including how to opt out of having their personal information used in marketing campaigns. Further, Verizon failed to discover this deficiency until September 2012, and failed to notify the FCC until January 2013, over four months later.

Verizon represented that it took remediation steps following discovery of the problem, including sending opt-out notices, banning all marketing, and implementing a new program to place a CPNI opt-out notice on every invoice, each month, for all potentially affected customers (consumers and small- and medium-size business customers).

In addition to the $7.4 million fine, to be paid to the US Treasury, Verizon will be required to improve its privacy practices, including, among other things, to:

  • Designate a senior corporate manager to serve as compliance manager responsible for implementing and administering Verizon’s compliance plan;
  • Notify all Verizon directors, officers, managers and employees of the terms of the consent order;
  • Establish operating procedures to ensure compliance with the consent order;
  • Develop and distribute a compliance manual regarding the handling of customer information;
  • Establish a compliance training program;
  • Notify customers of their opt-out rights on every bill;
  • Monitor and test its billing system and opt-out notice process on a monthly basis, to ensure that customers are receiving appropriate notices;
  • Report any detected problem to the FCC within 5 business days;
  • Report any non-compliance to the FCC within 30 calendar days.

Several of the compliance obligations listed above terminate three years after the date of the Consent Decree.

The Federal Trade Commission is only one of the federal agencies charged with the protection of personal information. Several agencies have sectoral responsibilities, as well. As discussed above, Section 222 of the Federal Communications Act and the related CPNI Rules, contain important provisions regarding the privacy of the personal information of phone users. These provisions are enforced by the Federal Communications Commission.

 

How to address cybersecurity threats in medical devices

Posted by fgilbert on June 24th, 2013

The FDA has published for comment a draft guidance that is intended to assist the health industry in identifying and addressing cybersecurity threats in medical devices. Indeed, medical devices are frequently used to collect patients’ vital signs. The information is then transferred to a database within the medical office or in the cloud for further processing. For instance, a diabetic patient may be equipped with a device that collects blood samples and sends the information to a cloud-based service that makes a diagnosis, determines the right dosage of a drug, and sets the time at which the dosage should be administered to the patient.

To accomplish this, the medical device takes advantage of wireless, network, and Internet connections to exchange device-related health information collected from patients with a remote service or practitioner. The transmittal of patient information to remote computing facilities and its storage in a cloud can raise significant cybersecurity concerns. The interception and unauthorized use, modification, or deletion of critical patient information could have deadly consequences.

The draft guidance provides recommendations to consider and identifies documentation to be provided in FDA medical device premarket submissions in order to assure effective cybersecurity management and reduce the risk of compromise. Not surprisingly, the guidance recommends that engineers and manufacturers develop security controls to maintain the confidentiality, integrity, and availability of the information collected from the patient and transmitted to the medical cloud where it is stored and processed.

The draft guidance suggests the use of “cybersecurity by design”, a concept similar to that of “privacy by design,” to bake into the design of the medical devices and the equipment connected to these devices, the much-needed security features that could ensure more robust and efficient mitigation of cybersecurity risks.

The proposed guideline outlines the steps to be used for this purpose and stresses the importance of documenting the different steps taken:

  • Conduct a risk analysis and develop a management plan as part of the risk analysis;
  • Identify the assets at risk, the potential threats to these assets and the related vulnerabilities;
  • Assess the impact of the threats and vulnerabilities on the device functionality;
  • Assess the likelihood that a vulnerability might be exploited;
  • Determine the risk levels and suitable mitigation strategies;
  • Assess residual risk, and define risk acceptance criteria.
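The guidance leaves the scoring mechanics to the manufacturer. One common way to operationalize the steps above is a simple qualitative likelihood-impact risk matrix; the sketch below is purely illustrative, and its scales, labels, and acceptance threshold are assumptions rather than values taken from the FDA draft guidance.

```python
# Illustrative qualitative risk matrix for a device vulnerability.
# The 1-3 scales, the level labels, and the acceptance threshold are
# assumptions for demonstration, not values prescribed by the FDA.

LEVELS = ["low", "medium", "high"]

def risk_level(likelihood: int, impact: int) -> str:
    """Map likelihood and impact scores (each 1-3) to a risk level."""
    score = likelihood * impact  # ranges from 1 to 9
    if score >= 6:
        return "high"
    if score >= 3:
        return "medium"
    return "low"

def acceptable(level: str, acceptance_threshold: str = "medium") -> bool:
    """Residual risk is acceptable if at or below the defined threshold."""
    return LEVELS.index(level) <= LEVELS.index(acceptance_threshold)

# Example: moderate likelihood (2) combined with severe patient impact (3)
# scores 6, i.e., "high" risk, which exceeds the acceptance threshold
# and would call for further mitigation before the risk is accepted.
level = risk_level(2, 3)
print(level, acceptable(level))
```

The point of writing the matrix down, in whatever form, is the documentation requirement the guidance stresses: each vulnerability, its scores, the mitigation applied, and the residual-risk decision become part of the premarket submission record.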

As always, the issue is one of balance: balancing the universe of threats against the probability of a security breach. Factors to be taken into account would include the type of medical device, the environment in which it is used, the type and probability of the risks to which it is exposed, and the probable risks to patients from a security breach. In addition, the guidance recommends that manufacturers carefully consider the balance between cybersecurity safeguards and the usability of the device in its intended environment of use (e.g., home use vs. healthcare facility use) to ensure that the security capabilities are appropriate for the intended users.

The FDA draft guidance recommends that medical device manufacturers should be prepared to provide justification for the security features chosen and consider appropriate security controls for their medical devices including, but not limited to:

  • Limit access to trusted users only;
  • Ensure trusted content;
  • Use fail-safe and recovery features.

The proposed guidance also identifies the type of documentation that should be developed in preparation for premarket submission filed with the FDA. This information includes:

  • Hazard analysis, mitigations, and design considerations pertaining to intentional and unintentional cybersecurity risks associated with the device;
  • Traceability matrix that links the cybersecurity controls to the cybersecurity risks that were considered;
  • Systematic plan for providing validated updates and patches to operating systems or medical device software;
  • Documentation to demonstrate that the device will be provided to purchasers and users free of malware; and instructions for use and product specifications related to recommended anti­virus software and/or firewall use appropriate for the environment of use.

The Draft Guidance is available at http://www.fda.gov/downloads/MedicalDevices/DeviceRegulationandGuidance/GuidanceDocuments/UCM356190.pdf

 

New York State launches investigation of top insurance companies’ cybersecurity practices. Who’s next?

Posted by fgilbert on June 4th, 2013

The State of New York has launched an inquiry into the steps taken by the largest insurance companies to keep their customers and companies safe from cyber threats. This is the second inquiry of this kind. Earlier this year, a similar investigation targeted the cybersecurity practices of New York-based financial institutions.

On May 28, 2013, the New York Department of Financial Services (DFS) issued letters pursuant to Section 308 of the New York Insurance Law (“308 Letters”) to 31 of the country’s largest insurance companies, requesting information on the policies and procedures they have in place to protect health, personal and financial records in their custody against cyber attacks.

Among other things, the 308 Letters request:

  • Information on any cyber attacks to which the company has been subject in the past three years;
  • The cyber security safeguards that the company has put in place;
  • The company’s information technology management policies;
  • The amount of funds and other resources that are dedicated to cyber security;
  • The company’s governance and internal control policies related to cyber security.

The insurance companies will have a short period to respond to the questionnaire.  For further detail see Press Release of the New York Governor’s Office.

It is not clear what the State of New York will do with the information collected from the responses to this inquiry, but it is certain that this initiative is likely to be followed with great interest by other State Insurance and Financial industry regulators.  Indeed, both the insurance and financial services institutions collect, process and retain a significant amount of highly sensitive personal information about prospective, current and past customers.

Companies in the insurance or financial services sectors, as well as their respective service providers, should take the time to review their risk assessments, policies and procedures, especially with a focus on evaluating whether they adequately address known vulnerabilities, meet the current “best practices” standards, and are keeping up with the most recent technologies and forms of cyber attacks.

562-page HIPAA/HITECH Final Rule Published

Posted by fgilbert on January 17th, 2013

A 562-page, unofficial version of the final HIPAA/HITECH Rule was posted today. The final version of the document (“the 2013 Rule”) is scheduled to be published on January 25, 2013 at http://federalregister.gov/a/2013-01073.

This 2013 Rule becomes effective on March 26, 2013. Covered entities and business associates must comply by September 23, 2013.

This 2013 Rule is comprised of four final rules that are intended to update the existing HIPAA Privacy, Security, and Enforcement Rules to strengthen privacy and security protections for health information, improve enforcement, and implement the provisions of the HITECH Act (enacted in 2009) and the Genetic Information Nondiscrimination Act of 2008 (GINA).

Most of these changes are not a surprise. Some were clearly required by the HITECH Act, for example the increased duties and responsibilities of the Business Associates. Others had been identified in prior interim versions of the Rules. Other changes result from a clear intent to simplify procedures, and reduce redundancies and unnecessary burdens.

Nevertheless, it will take time to decipher and analyze the 562 pages of the 2013 Rule (in PDF format). Further observations will be published later on this blog. In the meantime, some of the changes or additions are listed below.

Business Associates

Under the 2013 Rule, business associates of covered entities become directly liable for compliance with certain requirements of the HIPAA Privacy Rule and most provisions of the Security Rule. Further, the definition of business associates is slightly expanded to include additional entities and intermediaries. In addition, subcontractors of business associates are also included. Portions of the Privacy Rule are modified, as well, to identify clearly which provisions of the Privacy Rule apply to business associates.

Limit to Use for Marketing and Resale

The 2013 Rule strengthens the limitations on the use and disclosure of protected health information for marketing and fundraising purposes. It prohibits the sale of protected health information without individual authorization.

Increased Rights for Individuals

Individuals are granted the right to receive electronic copies of their health information. They also have the right to prohibit disclosures concerning their treatment to a health plan if the individual has paid out of pocket in full for this treatment. In addition, the 2013 Rule grants family members or others the right of access to decedent information.

Privacy Notices

The 2013 Rule requires modifications to a covered entity’s notice of privacy practices, such as to add statements regarding uses and disclosures that require authorization, fund raising, and an individual’s right to opt-out of receiving these communications. The privacy notice must also inform individuals of their right to restrict disclosure of protected information when the individual pays out of pocket in full for a health service or health care item. The notice of privacy practices must be redistributed after these changes to the Rule have been implemented.

Security Rule

Portions of the Security Rule are modified as well. There are technical changes; for example, expanding obligations that apply to employees to also cover non-employees who operate in a quasi-employee capacity, such as volunteers. Provisions that duplicate certain provisions of the Privacy Rule are removed.

Breach Notification

The 2013 Final Rule also replaces the current version of the Breach Notification Rule under the HITECH Act, originally published in 2009. The most important change is the replacement of the “harm” threshold with a more objective standard.

Genetic Information

The 2013 Final Rule implements the Genetic Information Nondiscrimination Act of 2008 (GINA) into the HIPAA Privacy Rule by prohibiting most health plans from using or disclosing genetic information for underwriting purposes.

Enforcement

The modified HIPAA Enforcement Rule incorporates the increased and tiered civil money penalty structure provided by the HITECH Act. Other modifications include the addition of provisions addressing enforcement of noncompliance with the HIPAA Rules due to willful neglect, and a change to the definition of “reasonable cause” in connection with violations of the Rule.

White House Unveils Consumer Privacy Bill of Rights

Posted by fgilbert on February 22nd, 2012

On February 23, 2012, the White House unveiled its proposal for a Consumer Privacy Bill of Rights as part of its Framework for Protecting Privacy and Promoting Innovation in the Global Digital Economy. The Framework consists of four key elements: a Consumer Privacy Bill of Rights; a stakeholder-driven process to specify how the principles in the Consumer Privacy Bill of Rights apply in particular business contexts; strong enforcement by the Federal Trade Commission; and a commitment to increase interoperability between the US privacy framework and those of the international partners of the United States.

 

Overview

The Consumer Privacy Bill of Rights is intended to provide a baseline of clear protections for consumers online and greater certainty for companies. The Administration indicates that it will encourage stakeholders to implement the Consumer Privacy Bill of Rights through codes of conduct and will support Federal legislation that adopts the principles of the Consumer Privacy Bill of Rights.

Broad Definition of “Personal Data”

The proposed Consumer Privacy Bill of Rights defines “personal data” as any data or aggregations of data that are linkable to a specific individual, a definition that is very similar to that which is used by the European Union Data Protection Directive and the proposed EU Data Protection Regulation. “Personal data” may also include data that are linked to a specific computer or other device.

Seven Principles

The Consumer Privacy Bill of Rights is a comprehensive statement of the rights that consumers should expect, and the obligations to which companies should commit.  It applies the well-known Fair Information Practice Principles (FIPPs) to today’s interactive and highly interconnected environment. Seven fundamental rights for consumers are identified:

  • Individual Control: Control over what personal data companies collect and how they use the data.
  • Transparency: Easily understandable and accessible information about privacy and security practices.
  • Respect for Context: Collection, use, and disclosure of personal data in a manner that is consistent with the context in which consumers provide the data.
  • Security: Secure and responsible handling of personal data.
  • Access and Accuracy: Right to access and correction of personal data.
  • Focused Collection: Limitation on the collection and retention of personal data.
  • Accountability: Use of appropriate measures to assure adherence to the Consumer Privacy Bill of Rights.

Next Steps and Other Key Concepts

In addition to the Consumer Privacy Bill of Rights, the White House Framework for Protecting Privacy and Promoting Innovation in the Global Digital Economy, identifies three key objectives:

  • Fostering Multi-stakeholder Processes to Develop Enforceable Codes of Conduct

The Framework outlines a process to produce enforceable codes of conduct that implement the Consumer Privacy Bill of Rights. The Commerce Department National Telecommunications and Information Administration (NTIA) will convene open industry and privacy advocates to develop enforceable codes of conduct that implement the Consumer Privacy Bill of Rights for specific industry sectors.

The Administration will also work with Congress to enact comprehensive privacy legislation based on the rights outlined in the Consumer Privacy Bill of Rights to promote trust in the digital economy and extend baseline privacy protections to commercial sectors that existing federal privacy laws do not cover.

  • Strengthening FTC and State AG Enforcement

The Administration encourages Congress to provide the FTC and State Attorneys General with specific authority to enforce the Consumer Privacy Bill of Rights.

  • Improving Global Interoperability

The Administration is aware that US companies doing business on the global internet depend on the free flow of information across borders.  To this end, the Framework lays the groundwork for increasing interoperability between the US data privacy framework and that of its global trading partners, as a means to provide consistent, low-barrier rules for personal data in the user-driven and decentralized Internet environment. Two key principles are promoted:  mutual recognition and enforcement cooperation.

Scope of the Bill of Rights too Narrow to Meet Other Countries’ Laws

If the Consumer Privacy Bill of Rights and the ideas outlined in the Framework are implemented, US companies will have clearer guidelines on how they should handle personal data online. However, the scope of the document is too narrow to provide a uniform protection to all types of personal data, whether or not they are collected as part of a consumer relationship online.

While this document – and the proposed implementation – may solve some of the issues associated with the collection of consumer information online, it is not clear how it would apply to other forms of collection or use of personal information. For example, it is likely that personal information collected or used in the context of employment would not be covered. It is also not clear whether the rules would cover information collected in connection with B-to-B relationships, such as when a company collects the personal information of prospective customers’ employees in the context of CRM systems.

Thus, while the proposed seven principles would create a data protection framework a little closer to that in effect in more than 60 countries on all continents, a substantial gap would still remain between the US regime and data protection laws elsewhere if personal information such as information collected as part of employment or of a business relationship is not covered by a clear set of data protection principles as well.

 

Full Text of Consumer Privacy Bill of Rights

The full text of the Consumer Privacy Bill of Rights follows:

The Consumer Privacy Bill of Rights applies to personal data, which means any data, including aggregations of data, which is linkable to a specific individual. Personal data may include data that is linked to a specific computer or other device. The Administration supports Federal legislation that adopts the principles of the Consumer Privacy Bill of Rights. Even without legislation, the Administration will convene multistakeholder processes that use these rights as a template for codes of conduct that are enforceable by the Federal Trade Commission. These elements—the Consumer Privacy Bill of Rights, codes of conduct, and strong enforcement—will increase interoperability between the U.S. consumer data privacy framework and those of our international partners.

1. Individual Control: Consumers have a right to exercise control over what personal data companies collect from them and how they use it.
  • Companies should provide consumers appropriate control over the personal data that consumers share with others and over how companies collect, use, or disclose personal data.
  • Companies should enable these choices by providing consumers with easily used and accessible mechanisms that reflect the scale, scope, and sensitivity of the personal data that they collect, use, or disclose, as well as the sensitivity of the uses they make of personal data.
  • Companies should offer consumers clear and simple choices, presented at times and in ways that enable consumers to make meaningful decisions about personal data collection, use, and disclosure.
  • Companies should offer consumers means to withdraw or limit consent that are as accessible and easily used as the methods for granting consent in the first place.
2. Transparency: Consumers have a right to easily understandable and accessible information about privacy and security practices.

At times and in places that are most useful to enabling consumers to gain a meaningful understanding of privacy risks and the ability to exercise Individual Control, companies should provide clear descriptions of:

  • What personal data they collect,
  • Why they need the data,
  • How they will use it,
  • When they will delete the data or de-identify it from consumers, and
  • Whether and for what purposes they may share personal data with third parties.
3. Respect for Context: Consumers have a right to expect that companies will collect, use, and disclose personal data in ways that are consistent with the context in which consumers provide the data.
  • Companies should limit their use and disclosure of personal data to those purposes that are consistent with both the relationship that they have with consumers and the context in which consumers originally disclosed the data, unless required by law to do otherwise.
  • If companies will use or disclose personal data for other purposes, they should provide heightened Transparency and Individual Control by disclosing these other purposes in a manner that is prominent and easily actionable by consumers at the time of data collection.
  • If, subsequent to collection, companies decide to use or disclose personal data for purposes that are inconsistent with the context in which the data was disclosed, they must provide heightened measures of Transparency and Individual Choice.
  • Finally, the age and familiarity with technology of consumers who engage with a company are important elements of context.
  • Companies should fulfill the obligations under this principle in ways that are appropriate for the age and sophistication of consumers. In particular, the principles in the Consumer Privacy Bill of Rights may require greater protections for personal data obtained from children and teenagers than for adults.

4. Security: Consumers have a right to secure and responsible handling of personal data.

Companies should assess the privacy and security risks associated with their personal data practices and maintain reasonable safeguards to control risks such as loss; unauthorized access, use, destruction, or modification; and improper disclosure.

5. Access and Accuracy: Consumers have a right to access and correct personal data in usable formats, in a manner that is appropriate to the sensitivity of the data and the risk of adverse consequences to consumers if the data is inaccurate.
  • Companies should use reasonable measures to ensure they maintain accurate personal data.
  • Companies also should provide consumers with reasonable access to personal data that they collect or maintain about them, as well as the appropriate means and opportunity to correct inaccurate data or request its deletion or use limitation.
  • Companies that handle personal data should construe this principle in a manner consistent with freedom of expression and freedom of the press.
  • In determining what measures they may use to maintain accuracy and to provide access, correction, deletion, or suppression capabilities to consumers, companies may also consider the scale, scope, and sensitivity of the personal data that they collect or maintain and the likelihood that its use may expose consumers to financial, physical, or other material harm.

6. Focused Collection: Consumers have a right to reasonable limits on the personal data that companies collect and retain.

  • Companies should collect only as much personal data as they need to accomplish purposes specified under the Respect for Context principle.
  • Companies should securely dispose of or de-identify personal data once they no longer need it, unless they are under a legal obligation to do otherwise.
7. Accountability: Consumers have a right to have personal data handled by companies with appropriate measures in place to assure they adhere to the Consumer Privacy Bill of Rights.
  • Companies should be accountable to enforcement authorities and consumers for adhering to these principles.
  • Companies also should hold employees responsible for adhering to these principles.
  • To achieve this end, companies should train their employees as appropriate to handle personal data consistently with these principles and regularly evaluate their performance in this regard.
  • Where appropriate, companies should conduct full audits.
  • Companies that disclose personal data to third parties should at a minimum ensure that the recipients are under enforceable contractual obligations to adhere to these principles, unless they are required by law to do otherwise.

New Regime for Mobile Apps

Posted by fgilbert on February 22nd, 2012
The California Attorney General has unveiled a recent agreement with mobile app providers, including Google, Facebook, Hewlett-Packard, and Research in Motion/BlackBerry, in which the largest mobile app providers have committed to ensuring that mobile app purchasers have access to a clear, conspicuous privacy policy before they download an app from their sites.
The actual agreement is provided at: http://oag.ca.gov/news/press_release?id=2630

Don’t forget the March 1, 2012 Deadline for Compliance with the Massachusetts Security Regulation 201 CMR 17.00

Posted by fgilbert on February 7th, 2012

Businesses that receive, maintain, process, or have access to personal information of Massachusetts residents are required to comply with the Massachusetts Security Regulation, 201 CMR 17.00. The Regulation requires businesses to implement and comply with a comprehensive Written Information Security Plan, or WISP, in order to protect certain categories of personal information about employees, customers, prospects, business contacts, and other third parties.

A first implementation deadline of March 1, 2010 required all businesses subject to the Regulation to adopt WISPs for their operations. The Regulation contains a second deadline. By March 1, 2012, covered businesses must have updated all service provider contracts that were entered into before March 1, 2010, in order to require that these service providers also maintain appropriate security measures to protect this personal information. Thus, by March 1, 2012, companies that are subject to the Massachusetts Regulation will have to be fully compliant, both with respect to their own operations and with respect to their contracts and interactions with their service providers.

Highlights of the Regulation are provided below.

Only certain categories of personal information are covered. The requirement applies only to the protection of a person’s first and last name (or first initial and last name) combined with any of the following:

  • Social Security number
  • Driver’s license number
  • State-issued ID card number
  • Financial account number (such as a bank account or insurance account number)
  • Credit or debit card number

The Regulation requires a business that owns or licenses this protected personal information in paper or electronic form to develop, implement, and maintain a comprehensive written information security program. Companies that fail to implement such a program may be subject to a $5,000 civil penalty for each violation.

The requirements include, among other things:

  • Designating one or more employees to maintain the comprehensive information security program;
  • Development and implementation of a comprehensive written information security program that contains administrative, technical, and physical safeguards appropriate to the size, scope, and type of business;
  • Special security measures for computer systems and wireless systems;
  • Secure user authentication and access controls;
  • Encryption of all records and files containing personal information that will travel across public networks or that are transmitted wirelessly;
  • Use of reasonably up-to-date versions of system security agent software that includes malware protection and reasonably up-to-date patches and virus definitions;
  • Security monitoring and intrusion detection;
  • Ongoing monitoring of the company’s compliance with the information security program;
  • Ongoing employee training;
  • Use of reasonable steps to select and retain third-party service providers that are capable of maintaining appropriate security measures to protect personal information consistent with the Regulation and any applicable federal regulations;
  • Written contracts with these service providers;
  • Reviewing the scope of the information security measures at least annually, or whenever there is a material change in business practices that may reasonably implicate the security or integrity of records containing personal information.

For a detailed analysis of the Massachusetts Security Regulation 201 CMR 17.00, click here.