Data Privacy: A Compliance Blind Spot

1 Introduction

What should be planned for when implementing a new computerized system, such as a clinical trial database?

For clinical computerized systems, compliance goes beyond Good Clinical Practice (GCP), because these systems frequently process “privacy relevant” data. Controls required by Data Privacy regulations include encryption and restricted access, along with informed consent.

Data Privacy comprises legal frameworks that require specific controls for information systems. Challenges with Data Privacy regulation, when compared to Good “x” Practice (GxP), include:

  • There is little clear guidance on what is required
  • The scope of Data Privacy is often not clear
  • Many clinical systems have a multinational or global footprint, requiring data movement across national borders, and the proliferation of privacy laws in different countries can have complex, and sometimes conflicting, implications.

This Concept Paper aims to highlight where Data Privacy regulations could apply, and the requirements for system implementation arising from those regulations.

2 What is private data?

If a computerized system contains personal data, then the system is within scope of Data Privacy frameworks:

‘personal data’ shall mean any information relating to an identified or identifiable natural person (‘data subject’); an identifiable person is one who can be identified, directly or indirectly, in particular by reference to an identification number or to one or more factors specific to his physical, physiological, mental, economic, cultural or social identity; [1]

Following from this, it can be deduced that a single attribute such as height may be personal data, but it is only privacy relevant if the data subject can be identified using Personally Identifiable Information (PII). Examples of personally identifiable information include:

  • Person’s name
  • A social security or national identity number
  • Employee ID
  • A Patient/Subject ID (even if coded/blinded)
  • Attributes that, in combination, can uniquely identify a person

The last two examples can cause significant discussion.

It is sometimes assumed that if data is de-identified and uses only a coded (i.e., blinded) Subject ID, the actual person cannot be identified, so the related system will not be privacy relevant. The counter argument (and the legal interpretation) is that data linking the Subject ID to an actual person exists somewhere. Even though that data may be held in a different location (such as an investigator site) and may be difficult to obtain, it is available.

Similarly, single data attributes of a person may not in themselves identify the individual, but sufficient attributes in combination can point to a specific individual, e.g., Date of Birth plus Gender plus Initials plus Zip Code/Postal Code.
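This combination effect can be made concrete with a small sketch. The following Python snippet (using hypothetical records and the quasi-identifiers named above) counts how many records share each combination of attributes; any combination matched by exactly one record effectively singles out an individual, even though no single attribute does so on its own.

```python
# Minimal sketch with hypothetical data: count how many records share each
# combination of quasi-identifiers. A combination held by only one record
# effectively identifies that person.
from collections import Counter

records = [
    {"dob": "1984-03-07", "gender": "F", "initials": "JS", "zip": "10115"},
    {"dob": "1984-03-07", "gender": "F", "initials": "MK", "zip": "10115"},
    {"dob": "1990-11-21", "gender": "M", "initials": "JS", "zip": "20095"},
]

quasi_ids = ("dob", "gender", "initials", "zip")
counts = Counter(tuple(r[q] for q in quasi_ids) for r in records)

# Combinations matched by a single record single out one individual.
unique = [combo for combo, n in counts.items() if n == 1]
print(len(unique))  # here every combination is unique, so 3
```

Note that in this toy dataset no single attribute is unique (two records share a date of birth, two share initials), yet every four-attribute combination is.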

However, the use of “Honest Brokers” can remove the burden of Data Privacy regulation from a system that holds only coded patient data by ensuring that a coded ID remains separated from patient identity.
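The separation described above can be sketched as follows. This is a minimal illustration with assumed names (`broker_link`, `clinical_record`, `reidentify` are hypothetical): the table linking Subject ID to identity is held only by the honest broker, so the clinical system holds pseudonymous data and cannot re-identify subjects on its own.

```python
# Minimal sketch of the honest-broker pattern: the link between a coded
# Subject ID and a real identity lives in a separate store that the
# clinical system cannot access.

# Held only by the honest broker (e.g., at the investigator site):
broker_link = {"SUBJ-0042": "Jane Smith"}

# What the clinical system stores: coded data only, no direct identifiers.
clinical_record = {"subject_id": "SUBJ-0042", "height_cm": 172, "visit": 3}

def reidentify(subject_id, link_table):
    """Only a party holding the link table can re-identify a subject."""
    return link_table.get(subject_id)

# The clinical system holds clinical_record but not broker_link, so the
# record stays pseudonymous; the broker can re-identify when required.
print(reidentify("SUBJ-0042", broker_link))  # prints: Jane Smith
```

The design point is that privacy relevance follows the linking data: as long as `broker_link` never reaches the clinical system, the coded record on its own does not identify the subject.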

Examples of personal data:

  • Last Name
  • First Name
  • Initials
  • Work Address
  • Home Address
  • IP Address
  • Place of Work
  • Work telephone, fax, and/or e-mail
  • Home telephone, fax, and/or e-mail
  • Date of Birth
  • Place of Birth
  • ID Card Number
  • Gender
  • Race
  • Religion
  • Political Affiliation
  • Civil Status
  • Name of Spouse
  • Birth Dates of Children
  • Photograph
  • Health Data
  • Curriculum Vitae

Depending on the country of jurisdiction, these data may be considered sensitive personal information, which requires a higher level of control; race, religion, and political affiliation, for example, are considered sensitive in many countries. Health data are also considered sensitive in many countries, and these data are often collected in the context of clinical trials.

Data Privacy covers not only data on patients/trial subjects, but also data for employees, investigators, third-party suppliers, and other companies.

Read more by downloading Data Privacy: A Compliance Blind Spot (Published: June 2017).