REVIEW ARTICLE
Year : 2015  |  Volume : 2  |  Issue : 1  |  Page : 4-8

Quality in clinical research activities: Role of institution/clinical trial site


Pratibha Pereira
Department of Medicine, JSS Medical College, Mysore, Karnataka, India

Date of Web Publication: 8-Jul-2015

Correspondence Address:
Dr. Pratibha Pereira
38/D, 2nd Cross, Krishna Jayalaxmipuram, Mysore, Karnataka
India

Source of Support: None, Conflict of Interest: None


DOI: 10.4103/2319-1880.160232

  Abstract 

There is an increasing focus on having quality systems in place during the planning stages of clinical trials. Such systems require the development and implementation of standards for each step. Although this does not impose something totally new on clinical research, a systematic approach will produce a more reliable and useful end product: high-quality data obtained without compromising the protection of human subjects' rights and welfare. A suggested quality system with standards for each step is addressed in this article. The organization measures and, when necessary, improves compliance with organizational policies and procedures and with applicable laws, regulations, codes, and guidance. It also measures and, when necessary, improves the quality, effectiveness, and efficiency of the clinical trial process to protect the research participant.

Keywords: Accreditation, clinical research, quality


How to cite this article:
Pereira P. Quality in clinical research activities: Role of institution/clinical trial site. J Nat Accred Board Hosp Healthcare Providers 2015;2:4-8

How to cite this URL:
Pereira P. Quality in clinical research activities: Role of institution/clinical trial site. J Nat Accred Board Hosp Healthcare Providers [serial online] 2015 [cited 2019 Oct 17];2:4-8. Available from: http://www.nabh.ind.in/text.asp?2015/2/1/4/160232


  Introduction


Clinical research is a multistage, highly controlled process in medical science that tests the effect of a drug, device, or biological product on human subjects. It demands exact science and impeccable quality to protect the human research participant. The data generated must be reliable, as human lives can depend on decisions such as drug dosage.

The clinical trial site has an important role in maintaining this quality, because trials are conducted, data are generated, subjects are recruited, and serious adverse events (SAEs) are reported and managed at the site. Stringent compliance with good clinical practice (GCP), rules, and regulations, and adherence to the Declaration of Helsinki, require evaluation and oversight by the institution through quality assurance (QA) and quality control (QC) programs. [1]

These programs should include conducting internal (self) audits, actively implementing and revising standard operating procedures (SOPs), recording protocol deviations, and initiating procedures to correct any shortcomings and prevent their recurrence. In addition to conducting routine self-audits, exemplary clinical trial sites should also submit to external audits to provide independent verification of the operations of their QA program. External audits can point out weaknesses in a QA program and provide suggestions for corrective action, which can improve the integrity of the trial site. [2]

Implementation of an effective QA system relies on the proper education of the entire trial site staff. [3]

Defining and measuring quality in the clinical trial setting can be difficult. The most important variables involved are the ever-unpredictable human element and data capture.

There are two goals for quality improvement activities in clinical research:

  • To ensure compliance with all rules and regulations, organizational SOPs and policies, and professional standards.
  • To identify strengths, weaknesses, and opportunities to improve the quality, efficiency, and effectiveness of the research process.


QC, as per International Conference on Harmonisation (ICH) GCP section 1.47, is defined as the operational techniques and activities undertaken within the QA system to verify that the requirements for quality of trial-related activities have been fulfilled.

QC in clinical research helps to identify inconsistencies, deviations, and noncompliance. It is a verification process and is the main responsibility of the investigator.

QA (ICH GCP section 1.46) comprises all planned and systematic actions that are established to ensure that the trial is performed and the data are generated, recorded, and reported in compliance with GCP and regulatory requirements. Whereas QC detects problems, QA helps to prevent them; it is a preventive process. [9] This is the responsibility of the institution and not only of the sponsor.

Staff are often familiar with sponsor audits and/or regulatory audits. However, QA programs at the investigator site are less frequently seen. Establishing such a program is not arduous and should be done promptly if none exists. [1]

A third element of the quality of research is the introduction of an independent and objective audit of the QA/QC system and its outcomes.

Corrective and preventive action

Potential problems should be anticipated, and steps should be taken to avoid them. Nevertheless, problems will inevitably arise, and the discovery of a problem should trigger swift corrective action and the development of a plan to prevent recurrences. A reevaluation of the system should be performed to ascertain how the problem occurred. Documentation of these actions is necessary to avoid any questions from an auditor.

Among the most widely used tools for continuous improvement is a four-step quality model, the plan-do-check-act (PDCA) cycle, also known as the Deming cycle or Shewhart cycle. [3] Other widely used methods of continuous improvement are Six Sigma, Lean, and total quality management (QM). The following elements are central to quality in clinical research (a minimal sketch of one PDCA cycle follows this list):

  • Quality management system (QMS) in clinical research.
  • QA.
  • QC.
  • Trial protocol … data collection and analysis … trial report.
  • Training procedures.
  • Types of errors or deficiencies.

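The PDCA cycle can be made concrete by attaching it to a single site quality measure. The following is a minimal, hypothetical sketch: the measure, target, and numbers are illustrative assumptions, not taken from this article.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class QualityMeasure:
    name: str
    target: float                       # Plan: acceptable upper limit for the measure (assumed)
    observed: List[float] = field(default_factory=list)

    def record(self, value: float) -> None:
        """Do: collect the measurement for the current cycle."""
        self.observed.append(value)

    def check(self) -> bool:
        """Check: is the latest observation within the planned target?"""
        return bool(self.observed) and self.observed[-1] <= self.target

    def act(self) -> str:
        """Act: decide the follow-up for the next cycle."""
        if self.check():
            return "Standardize the current procedure and continue monitoring."
        return "Open corrective and preventive action; retrain staff; re-measure next month."

# One PDCA cycle for a hypothetical site measure
deviations = QualityMeasure(name="protocol deviations per 100 visits", target=2.0)
deviations.record(3.5)                            # Do
print(deviations.name, "->", deviations.act())    # Check + Act
```

Each subsequent month repeats the same record/check/act steps, so the loop, rather than a single audit, drives the improvement.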

Deficiencies detected by auditing of research sites can be classified into three categories: procedural, significant, and critical. Procedural deficiencies are typically administrative errors and may include events such as documentation not being filed in the right place, which may reflect staff members not fully understanding the processes. These findings do not necessarily represent a departure from GCP and proper documentation practices. Significant deficiencies may represent a departure from GCP, but do not invalidate the data. An example of a significant finding is a test performed outside the protocol-required window, which needs to be reported as a protocol deviation. Procedural and significant deficiencies are findings in which the subjects' rights, safety, and welfare have not been compromised, and data integrity has not been compromised. Critical deficiencies are the most severe and occur when major departures from GCP and the protocol are discovered. These findings may have affected the subjects' rights, safety, and welfare, or their willingness to participate; data integrity may also be affected. Clearly, prevention of any of these errors, particularly critical ones, is the aim of good QA procedures and training. [4]
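
As an illustration only, this three-way grading could be applied to audit findings as in the following sketch; the individual findings listed are hypothetical examples, not findings reported in the article.

```python
from collections import Counter

# Each finding is (description, category); the categories follow the text above:
# "procedural", "significant", or "critical". The findings themselves are invented.
findings = [
    ("Delegation log filed in the wrong section of the site file", "procedural"),
    ("Laboratory test performed outside the protocol-required window", "significant"),
    ("Study procedure performed before informed consent was signed", "critical"),
]

tally = Counter(category for _, category in findings)
print("Findings by category:", dict(tally))

# Critical findings may affect subjects' rights, safety, welfare, or data integrity,
# so they are escalated immediately rather than held for the periodic QA report.
for description, category in findings:
    if category == "critical":
        print("Escalate:", description)
```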

Quality plan

The quality plan describes how QC and QA processes will be applied throughout the clinical trial.

It describes the improvement measures taken to increase quality and efficiency. The institution also has policies and procedures for conducting regular training of research staff and personnel on the Human Research Protection Program (HRPP) plan and conducts internal audits on a regular basis. The institution has and follows a quality improvement plan that periodically assesses the quality, efficiency, and effectiveness of the QMS in research activities through internal audits. [5]

The institution needs to apply QA and quality improvement methods to the HRPP. Various approaches exist (a formal approach need not be adopted; Kaizen, Six Sigma, and others are mentioned only for background). The general approach is to track problems and errors, set targets for performance (which helps identify problems and errors), implement interventions to improve, and then measure whether the improvements have worked. The Association for the Accreditation of Human Research Protection Programs starts with some minimal requirements: have a goal, have a measure, and indicate how the information obtained will be used.

For example, QA methods applied to conflict of interest would verify that:

  • Disclosure requirements were met.
  • Management plans were developed for identified conflicts
  • Management plans are being executed.


Identified conflicts have been adequately managed, reduced and/or eliminated.

Quality standards are important in clinical research to safeguard and promote the health and welfare of human research subjects by assuring that their rights and wellbeing are protected, providing timely review of human research projects, and facilitating excellence in research. The institution exercises oversight over research protection by supporting an effective monitoring and compliance program to ensure that research is conducted in compliance with ethical principles, applicable laws and regulations, and institutional policies and procedures: [6],[7]

  • Ensuring compliance of quality activities and the effectiveness of the QMS as per standard operating procedures, conducting periodic assessments, and investigating complaints and noncompliance.
  • Taking corrective and preventive actions.
  • Setting monthly QA goals and performing regular weekly QA checks to verify that the goals are met.


Therefore, QC and QA together constitute the key quality systems that form part of the QM of a clinical trial at the site.

Quality assurance audits

  • Trial/project-specific QA audits: audit of the protocol, case report form (CRF), and investigator's brochure (IB), and audits of compliance with data collection and patient selection.


System audits

  • A selected process plus all related activities, resources, organization, documents (including SOPs and records), facilities, and equipment.
  • System audits include audits of electronic systems, the monitoring system, the investigational product storage system, and document management and archiving.


Benefit of quality management system

  • It supports continuous quality improvement.
  • It helps implement recommendations.


Quality assurance audits

  • Quality activities follow the plan, do, check, act cycle.
  • The clinical trial process is kept in line with SOPs.



  Role of an Institution/Organization


  1. The organization conducts audits or surveys, or uses other methods, to assess compliance with organizational policies and procedures and with applicable laws, regulations, codes, and guidance.
  2. The organization makes improvements to increase compliance when necessary.
  3. The organization ensures that adherence to rules and regulations is maintained by regular assessment of compliance against important indicators. The data collected to measure compliance are analyzed and, based on the analysis, corrective action can be taken to improve compliance.

    E.g., compliance with the consent process.
  4. The organization identifies strengths and weaknesses of the HRPP and makes improvements, when necessary, to increase the quality, efficiency, and effectiveness of the program. The institution also has policies and procedures for conducting regular training for research staff and personnel on the HRPP plan and conducts internal audits on a regular basis.


Measures are needed to assess and improve the quality and efficiency of research activities through a regular monitoring process.

The Ethics Committee and the principal investigator can conduct regular monitoring:

  • For the Ethics Committee:
    • Total number of actions (e.g., new studies, continuing reviews, modifications, exemptions etc.).
    • Time from board meeting to approval of minutes by the chair (target = 90% ≤ 4 days).
    • Member attendance (target = at least 10 of 12 meetings/year).
    • Member reviewer comments submitted prior to meetings (target = 90% within 2 days of meetings).
    • Informed consent document readability using the Flesch-Kincaid instrument (target = reading ease ≥ 45 or grade level ≤ eighth grade).
    • For new protocols, the mean number of days from submission to review at meeting*.
    • For new protocols, the mean number of days from submission to approval*.
    • For new protocols reviewed using the expedited procedure, the mean number of days from submission to review*.
    • For new protocols reviewed using the expedited procedure, the mean number of days from submission to approval*.
    • Set up a process for identifying goals for these measures.
    • Perform a basic descriptive analysis of the data on a monthly basis using a spreadsheet program; this forms part of the monitoring (a minimal sketch of such an analysis follows this list).

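A minimal sketch of such a monthly analysis is shown below. The metric names and targets come from the list above; the input data, counts, and member labels are illustrative assumptions, and the Flesch reading ease formula is the standard published one rather than anything specified in the article.

```python
from statistics import mean

# Days from board meeting to approval of minutes by the chair (one value per meeting this month)
minutes_approval_days = [2, 3, 5, 1]
pct_within_4 = 100 * sum(d <= 4 for d in minutes_approval_days) / len(minutes_approval_days)
print(f"Minutes approved within 4 days: {pct_within_4:.0f}% (target >= 90%)")

# Member attendance: meetings attended out of meetings held so far this year (assumed counts)
meetings_held = 12
for member, attended in {"Member A": 10, "Member B": 8}.items():
    print(f"{member}: {attended}/{meetings_held} meetings (target >= 10/12)")

# Informed consent readability: Flesch reading ease (target >= 45)
def flesch_reading_ease(words: int, sentences: int, syllables: int) -> float:
    return 206.835 - 1.015 * (words / sentences) - 84.6 * (syllables / words)

print(f"Consent reading ease: {flesch_reading_ease(words=1200, sentences=80, syllables=1900):.1f}")

# New protocols: mean number of days from submission to review at a meeting (assumed values)
submission_to_review_days = [12, 9, 15]
print(f"Mean days, submission to review: {mean(submission_to_review_days):.1f}")
```
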

The Ethics Committee can meet at least quarterly to discuss the results and plan program improvements.

Results are reported to leadership on a quarterly basis. The program uses the information to design and implement improvement plans:

  • A goal is an outcome you want your organization to achieve and is described in a way that specifies the outcome and indicates a measure to evaluate success. Consider these examples:
    1. Eighty percent of initial applications are complete when submitted to staff and do not have to be returned because they are incomplete.
    2. The convened Institutional Review Board has a quorum at 100% of its meetings during the next year.
    3. Ninety percent of approval letters in the next 2 months contain the information specified in the policy.


The key point is that the goal should describe a target outcome, so you know when you have succeeded, and milestones for evaluating progress. The same person might be responsible for implementing each goal, or different groups might be assigned this responsibility. The key is to specify who does this, how they will analyze the data, and what actions they will take with the information they obtain (a minimal sketch of one such goal record follows the note below).

Here is an example that describes:

  • At least one goal/measure of quality, efficiency, and effectiveness.
  • A description of the method (monthly measures).
  • A description of what corrective actions will be taken.


* Note that you do not need to have more than one goal and one measure (the example below has multiple measures).
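
Purely as an illustration of the "goal, measure, responsible party, action" pattern described above, the following sketch records one such goal and evaluates a month of assumed data against it. The goal mirrors the first example in the list; the responsible party and the numbers are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class QualityGoal:
    description: str
    target_pct: float     # e.g., 80 means at least 80% should meet the goal
    responsible: str      # who analyzes the data and acts on the result (assumed role)

    def evaluate(self, met: int, total: int) -> str:
        """Compare one month of observed data against the target and say what happens next."""
        observed = 100 * met / total
        if observed >= self.target_pct:
            return f"{observed:.0f}% >= {self.target_pct:.0f}%: goal met; continue monitoring."
        return (f"{observed:.0f}% < {self.target_pct:.0f}%: goal missed; "
                f"{self.responsible} to implement corrective action.")

goal = QualityGoal(
    description="Initial applications complete when submitted to staff",
    target_pct=80,
    responsible="Ethics Committee secretariat",
)
# This month (assumed): 30 of 40 applications were complete on submission
print(goal.description, "->", goal.evaluate(met=30, total=40))
```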


  Areas of Monitoring for the Principal Investigator


Identify high-risk trials and conduct end-to-end monitoring of compliance with:

  1. Adherence to protocol.
  2. Consent process.
  3. Data capture.
  4. Drug accountability.


Key areas where audits can be conducted by the organization

  1. Protocol review: QC personnel provide an independent review of the approved protocol.
  2. Data capture: comparison of the CRF with the protocol to ensure that it is designed to collect all the necessary data.
  3. Informed consent document/form: it must contain all the essential elements as per regulation.
  4. Study conduct:
    • Compliance with the protocol.
    • Accountability of the investigational medicinal product.
    • Time devoted by the investigator.
  5. Source document verification.
  6. Compliance with the requirements of all regulations, e.g., SAE reporting, adherence to SOPs and terms of reference.
  7. Ethics Committee.


Operational quality control or quality control plan

A QC plan is prepared for each key operational stage of the study and defines the standards against which QC will be conducted. [8]

For example, monitoring of subject consent in high-risk trials.
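
One way to write such a plan down is as a simple mapping of stage to standard, check, and frequency. This is a sketch only; the stages, standards, and frequencies below are assumptions for illustration, not requirements stated in the article.

```python
# Minimal operational QC plan: each key stage of the study is mapped to the standard
# against which QC is performed, the check used, and its frequency (all assumed).
qc_plan = {
    "informed consent (high-risk trial)": {
        "standard": "Consent signed and dated before any study procedure; all required elements present",
        "check": "Review every signed consent form against the approved template",
        "frequency": "weekly",
    },
    "data capture": {
        "standard": "CRF entries match the source documents",
        "check": "Source document verification on a sample of CRFs",
        "frequency": "monthly",
    },
    "investigational product": {
        "standard": "Accountability log reconciles with dispensing and return records",
        "check": "Count stock and reconcile the log",
        "frequency": "monthly",
    },
}

for stage, spec in qc_plan.items():
    print(f"{stage}: {spec['check']} ({spec['frequency']}); standard: {spec['standard']}")
```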

Different aspects of managing quality in the clinical field are:

  1. The definition of a two-stage quality system: it distinguishes between QC activities, which reside with the operational units, and auditing activities performed by an independent QA group, which assesses the efficiency and integrity of the control systems established to ensure quality.
  2. The multidisciplinary picture of a clinical study: every member of the operational staff contributes to quality through a chain, using the "quality in/quality out" concept; for example, one dedicated person of the research team performs monthly monitoring of all trial master files at the site.
  3. The review of the main responsibilities of key personnel (coordinator, investigator) in a GCP environment, to show that quality is achievable by involving appropriately qualified staff.
  4. The need for proactive control systems to reduce the risk of error and to increase credibility and confidence in building quality, e.g., monitoring or verifying the consent process to ensure that the subject has understood the consent document and that participation is voluntary.
  5. The role and objectives of a QA group, e.g., the role of the institution in conducting internal audits.



  Need of the Hour


  • The head of the institution, with the support of the Ethics Committee and the investigator, must ensure that compliance with rules and regulations is maintained by regular assessment of measures of compliance.
  • There must be a quality team that collects the data needed to measure compliance.

    The data must be analyzed; the quality planning aspects reviewed include, for example:

    • Training of personnel at different levels.
    • Trial specific activities.
    • Other supporting activities and equipment, wherever applicable.
    • Identification and preparation of applicable quality records.
    • Other relevant documents.


Continual quality improvement plan and internal audit

There must be a mechanism to continually improve the effectiveness of the QMS and its performance through the use of the following tools:

  • Internal quality audit.
  • Subject feedback.
  • Corrective/preventive action.


Internal audit/monitoring must be conducted at regular intervals, the findings must be analyzed, and corrective action must be taken to improve compliance with GCP and applicable rules and regulations.

The organization identifies areas for improvement, when necessary, to increase the quality, efficiency, and effectiveness of trial conduct. It must measure and, when necessary, improve the quality, effectiveness, and efficiency of research activities.

An example would be auditing the basic elements of disclosure in the consent document:

  • Areas of audit.
  • Process audits.


Internal audits for quality control and external audits for quality assurance

System audit

  • Adverse event (AE) handling.
  • Staff training.
  • SOP.
  • Equipment records.
  • Facilities.
  • Procedures inspection.
  • Archiving.


E.g., adverse drug reaction (ADR) handling:

  • All staff are aware of the AE reporting procedure.
  • SOPs are available for reporting AEs.
  • Documentation demonstrates timely and satisfactory handling of AEs.
  • Regulatory requirements are fulfilled.



  Assessing the Effectiveness of the Quality System


  1. Anticipate errors

    Look for all the ways a process could fail and then make improvements to ensure that it does not.
  2. Procedures

    Develop clear systems and procedures, e.g., a clear procedure for the consent process, conflict of interest, and documentation of drug accountability.
  3. Train

    Ensure that staff are trained for the tasks they perform.
  4. Validate

    All operations must be validated.
  5. Avoid short-cuts

    Follow SOPs.
  6. Control change

    Uncontrolled change can cause noncompliance.
  7. Challenge what you do

    Regularly check that systems meet applicable standards.


Financial support and sponsorship

Nil.

Conflict of interest

There are no conflicts of interest.

 
  References

1. FDA Regulations Relating to Good Clinical Practice and Clinical Trials. Available from: http://www.fda.gov/ScienceResearch/SpecialTopics/RunningClinicalTrials/ucm114928.htm. [Last accessed on 2009 Jun 21].
2. Quality assurance and educational standards for clinical trial sites. J Oncol Pract 2008;4:280-2.
3. Quality assurance and educational standards for clinical trial sites. J Oncol 2015;11:11-3.
4. Baigent C, Harrell FE, Buyse M, Emberson JR, Altman DG. Ensuring trial validity by data quality assurance and diversification of monitoring methods. Clin Trials 2008;5:49-55.
5. Quality assurance and educational standards for clinical trial sites. J Oncol Pract 2008;4:280-2.
6. Davis J, Nolan V, Woodcock J, Estabrook R, editors. Assuring Data Quality and Validity in Clinical Trials for Regulatory Decision Making. A Workshop Report. Washington, DC: Institute of Medicine, National Academies Press; 1999.
7. International Conference on Harmonisation of Technical Requirements for Registration of Pharmaceuticals for Human Use (ICH). E3: Structure and Content of Clinical Study Reports; November 1995.
8. Fukushima M. Gan To Kagaku Ryoho 1996;23:172-82. Review. Japanese. Available from: http://www.ncbi.nlm.nih.gov/pubmed/8611045.
9. U.S. Department of Health and Human Services, Food and Drug Administration, Center for Drug Evaluation and Research (CDER), Center for Biologics Evaluation and Research (CBER). ICH guidance; April 1996, p. 5.


