PD Handbook

Clinical Competency Committee (CCC) and Program Evaluation Committee (PEC)

Last updated: January 27, 2020


  1. Eliza Slama MD, MPH, Surgery Resident, Saint Agnes Hospital, Baltimore, MD
  2. Isam Hamdallah MD, Associate Program Director, General Surgery, Saint Agnes Hospital, Baltimore, MD
  3. Gopal Kowdley MD PhD, Program Director and CMIO, Saint Agnes Hospital, Baltimore, MD


By definition, the Clinical Competency Committee (CCC) is a “required body comprising three or more members of the active teaching faculty who is advisory to the program director and reviews the progress of all residents in the program” (ACGME, 2016). The goal of the CCC is to ensure that resident physicians are meeting the Milestones developed by the ACGME in a progressive fashion. The CCC also aims to identify and address resident deficiencies and to provide feedback that supports earlier remediation and resolution. 

Topic 1: Clinical Competency Committee

ACGME principles:

V.A.3. A Clinical Competency Committee must be appointed by the program director. 

V.A.3.a) At a minimum, the Clinical Competency Committee must include three members of the program faculty, at least one of whom is a core faculty member. 

V.A.3.a).(1) Additional members must be faculty members from the same program or other programs, or other health professionals who have extensive contact and experience with the program’s residents. 

V.A.3.b) The Clinical Competency Committee must: 

V.A.3.b).(1) review all resident evaluations at least semi-annually; 

V.A.3.b).(2) determine each resident’s progress on achievement of the specialty-specific Milestones; and, 

V.A.3.b).(3) meet prior to the residents’ semi-annual evaluations and advise the program director regarding each resident’s progress. 

Details (Evidence/data)


The CCC must be appointed by the Program Director. The committee must have at least three members, all of whom are program faculty, with at least one core faculty member. Additional members, both physician and non-physician, may be included; they may or may not be from the same program, and they can offer different relational perspectives. 

  • Anecdotal Key: In our medium-sized community teaching facility, the CCC comprises 7 members: 3 core faculty, 3 other physicians, and 1 physician assistant. We find it beneficial to obtain multiple perspectives, on different levels, when assessing residents. Larger programs may benefit from subcommittees within the CCC to evaluate residents effectively.

The degree of Program Director involvement in the CCC is at the discretion of the committee and varies with the characteristics of the program. Regardless, the Program Director holds final responsibility for the program and for the evaluation and promotion of each resident.

  • Anecdotal Key: In our program, the Program Director acts as a facilitator within the CCC, providing input only when needed. The PD usually assigns roles and reviews the order of the discussion session, making sure that the committee does not get sidetracked. 

The CCC should continue to develop each year along with the changing residency program. You will find what works and what does not for your specific program. In addition, you can seek support from your DIO, from other programs within and outside your institution, and from the ACGME. 


Requirements of the CCC, as outlined above, include:

  • Semi-annual meetings (at least), recommended at six-month intervals
    • The meeting must occur prior to the residents’ semi-annual evaluations
    • Anecdotal Key: We meet every 6 months, in November and May. This timing gives residents, attending physicians, and ancillary staff enough observation time to complete the required evaluations, and gives us enough time to report our Milestone results to ADS.
  • Each resident’s progress should be determined through observation of the Milestones. Milestones are specialty-specific, objective parameters for assessing resident education, stratified by year to provide a clear perspective. 
    • For General Surgery Milestones please see:


    • For additional information on Milestones please see:


    • Milestones are not intended to be the sole source for resident evaluation. Programs can consider using other evaluation forms as needed.
      • Anecdotal Key: At our program, we use not only the Milestones laid out by the ACGME but also evaluations of residents by other residents, by attending physicians (both clinical and operative), and by medical students.  


  • Resident feedback:
    • Feedback to the resident is a requirement and can improve the resident’s overall learning experience.
    • Factors for “High-Quality Feedback” according to the ACGME:
      • Timeliness: CCC results are given to the resident soon after the meeting.
      • Specificity: give specific details instead of vague statements.
      • Balance of positive and negative: give both reinforcing and corrective feedback.
      • Learner reaction and reflection: allow the resident to react to and reflect on the feedback.
      • Individualized learning plans: creating an individualized learning plan for each resident is key.
  • Milestones are reported to the ACGME website at mid-year and at the end of the year; the program coordinator inputs the data. These submission dates correspond to reporting deadlines in January and June.
  • Evidence: effective meeting strategies include:
    • Group discussions (Schwind, 2004; Hemmer, 2000; Thomas, 2011)
    • Structured discussions (e.g., order of speaking, multiple perspectives, weighing benefits, etc.) (Hauer, 2016)
    • Group leaders who prompt elaboration and information exchange (Hauer, 2016)
    • A shared committee understanding of the purpose and expectations of resident performance (Hauer, 2016)
    • Sharing written assessments rather than relying on memory (Hauer, 2016)
    • Avoiding bias, such as basing an opinion on a single instance, not accepting resident change or improvement, being influenced by the group, or judging based on emotion; the CCC should continue to watch for bias and learn how to overcome it (Dickey, 2017)
  • Anecdotal Key: We keep track of our resident rotations and ask for input from CCC members from each of the services the residents rotated on (minimizing recall bias). This is structured discussion built around group discussion, with the PD keeping the discussion on topic. For our CCC, all evaluations are provided in packet form to the members for each resident. Some committees assign specific residents to specific members for Milestone assignment; we do not, given the relatively small size of our program.
  • Additional resources/links/examples:
    • For a complete guidebook provided by the ACGME, please see: https://www.acgme.org/Portals/0/ACGMEClinicalCompetencyCommitteeGuidebook.pdf 

Topic 2: Program Evaluation Committee


The goal of the Program Evaluation Committee (PEC) is to oversee the curriculum and its development within a program. It should plan and develop educational activities and serve as an avenue for self-evaluation, improvement, and further educational endeavors.

ACGME Principles

V.C. Program Evaluation and Improvement 

V.C.1. The program director must appoint the Program Evaluation Committee to conduct and document the Annual Program Evaluation as part of the program’s continuous improvement process. 

V.C.1.a) The Program Evaluation Committee must be composed of at least two program faculty members, at least one of whom is a core faculty member, and at least one resident. 

V.C.1.b) Program Evaluation Committee responsibilities must include: 

V.C.1.b).(1) acting as an advisor to the program director, through program oversight; 

V.C.1.b).(2) review of the program’s self-determined goals and progress toward meeting them;

V.C.1.b).(3) guiding ongoing program improvement, including development of new goals, based upon outcomes; and, 

V.C.1.b).(4) review of the current operating environment to identify strengths, challenges, opportunities, and threats as related to the program’s mission and aims. 



The PEC is appointed by the Program Director and consists of at least two faculty members, at least one of whom is a core faculty member, and at least one resident. 

  • Anecdotal Key: At our program, the PEC is composed of 4 members: 2 core faculty and our chief residents. When we identify specific areas of concern, we split the PEC into goal-specific subcommittees, as suggested by responses to the ACGME annual program surveys; for example, a faculty improvement subcommittee.  


The PEC advises the Program Director by reviewing the program’s goals and the progress toward accomplishing them. It also creates new goals for the program and outlines the program’s strengths, challenges, opportunities, and threats. The summation of its work should be reflected in the Annual Program Evaluation (APE). 

The following should be considered in the PEC assessment:

  • Curriculum
  • Outcomes from previous APE(s)
  • ACGME letters of notification (e.g., citations, areas for improvement, and comments)
  • Quality and safety of patient care
  • Aggregate resident and faculty:
    • Well-being
    • Recruitment and retention
    • Workforce diversity
    • Engagement in quality improvement and patient safety
    • Scholarly activity 
    • ACGME Resident and Faculty Surveys
    • Written evaluation of the program
  • Aggregate resident:
    • Achievement and Milestones
    • In-training examinations 
    • Board pass and certification rates
    • Graduate performance
  • Aggregate faculty:
    • Evaluation
    • Professional development

The annual review must:

  • Be distributed to, and discussed with, the members of the teaching faculty and the residents
  • Be submitted to the DIO

The program must complete a self-study prior to its 10-year Accreditation Site Visit, and this must be submitted to the DIO. The Self-Study is a longitudinal assessment of the program and its learning environment. 


Evidence-based data suggest the following strategies for an effective PEC:

  • Resident feedback and involvement within the PEC and the APE (Lypson, 2016)
  • Streamlining of resources including the use of online documents and surveys (Lypson, 2016)
  • Creating a PEC that is an appropriate size for your program and in which each member plays a vital role (Jordan, 2016) 
  • Anecdotal Key: Our program holds PEC meetings every 3 months to check progress in areas identified for improvement. Our main PEC meeting, which sets the agenda for the upcoming year, occurs at our residency retreat, where the members of the PEC formulate tasks for the year.
  • Additional resources/links/examples: (Added to respective section)


ACGME Glossary of Terms. July 1, 2013. Accessed July 27, 2016: https://www.acgme.org/Portals/0/PDFs/ab_ACGMEglossary.pdf. 

Dickey CC, Thomas C, Feroze U, Nakshabandi F, Cannon B. Cognitive Demands and Bias: Challenges Facing Competency Committees. J Grad Med Educ. 2017 Apr;9(2):162-164.

Hauer KE, ten Cate O, Boscardin CK, Iobst W, Holmboe ES, Chesluk B, Baron RB, O’Sullivan PS, Ensuring Resident Competence: A Narrative Review of the Literature on Group Decision Making to Inform the Work of Clinical Competency Committees J Grad Med Educ. 2016 May; 8(2): 156–164. 

Hemmer PA, Hawkins R, Jackson JL, Pangaro LN. Assessing how well three evaluation methods detect deficiencies in medical students’ professionalism in two settings of an internal medicine clerkship. Acad Med. 2000;75:167-73. 

Jordan K, et al. The Program Evaluation Committee Handbook: From Annual Program Evaluation to Self-Study. HCPro Inc; 2016.

Lypson ML, Prince MEP, Kasten S, et al. Optimizing the post-graduate institutional program evaluation process. BMC Med Ed. 2016; 16:65 

Schwind CJ, Williams RG, Boehler ML, Dunnington GL. Do individual attendings’ post-rotation performance ratings detect residents’ clinical performance deficiencies? Acad Med. 2004 May;79(5):453-7. 

Thomas MR, Beckman TJ, Mauck KF, Cha SS, Thomas KG. Group assessments of resident physicians improve reliability and decrease halo error. J Gen Intern Med. 2011; 26: 759-64.