ORIGINAL ARTICLE
Year: 2019 | Volume: 63 | Issue: 4 | Page: 277-281
Development, validation and use of appropriate assessment tools for certification of entrustable professional activities in community medicine to produce a competent postgraduate: A pilot study
Saurabh Rambiharilal Shrivastava1, Thomas V Chacko2, Shital Bhandary3, Prateek Saurabh Shrivastava4
1 Vice-Principal Curriculum, Department of Community Medicine, Member of the Medical Education Unit and Institute Research Council, Shri Sathya Sai Medical College and Research Institute, Sri Balaji Vidyapeeth-Deemed to be University, Kancheepuram, Tamil Nadu, India 2 Dean Medical Education and Professor, Department of Community Medicine, Believers Church Medical College and Hospital, Thiruvalla, Kerala, India 3 Associate Professor, Department of Community Medicine, PAHS School of Medicine, Lalitpur, Nepal 4 Professor, Department of Community Medicine, Shri Sathya Sai Medical College and Research Institute, Sri Balaji Vidyapeeth–Deemed to be University, Kancheepuram, Tamil Nadu, India
Date of Web Publication: 18-Dec-2019
Correspondence Address: Dr. Prateek Saurabh Shrivastava Professor, Department of Community Medicine, Shri Sathya Sai Medical College and Research Institute, Sri Balaji Vidyapeeth (SBV) – Deemed to be University, Tiruporur - Guduvancherry Main Road, Ammapettai, Nellikuppam, Chengalpattu, Kancheepuram, Tamil Nadu - 603 108 India
Source of Support: None, Conflict of Interest: None
DOI: 10.4103/ijph.IJPH_45_19
Abstract
Background: Adoption of competency-based medical education (CBME) is the need of the hour. Objectives: The objectives of the study were to develop and validate appropriate assessment tools for the community medicine entrustable professional activities (EPAs) and to assess the usefulness of the validated tools in the assessment of postgraduate (PG) students. Methods: An interventional study was done over 14 months in the department of community medicine. After sensitization of faculty members and PGs, three EPAs were selected through consensus among faculty members, along with appropriate assessment tools: the mini-clinical evaluation exercise (Mini-CEX), case-based discussion (CBD), and direct observation of procedural skills (DOPS). Rubrics of milestones were formulated for the selected tools, and the designed tools were validated. The three validated tools were then used for quarterly assessment. Results: The item content validity index for all three assessment tools was 1, while the scale content validity index was 1 for Mini-CEX and CBD and 0.87 for DOPS. Three PG students were each assessed thrice using the validated tools for the three selected EPAs. The PGs opined that assessment using rubrics made their tasks specific, while the faculty members were quite satisfied with the assessment process as it removed subjectivity. Conclusions: The developed tools for the selected EPAs were found to have a substantial level of both face validity and content validity. The tools were also found to be useful for periodic assessment in workplace settings and acceptable to both PG students and internal/external faculty members.
Keywords: Community medicine, competency-based medical education, India, postgraduate students
How to cite this article: Shrivastava SR, Chacko TV, Bhandary S, Shrivastava PS. Development, validation and use of appropriate assessment tools for certification of entrustable professional activities in community medicine to produce a competent postgraduate: A pilot study. Indian J Public Health 2019;63:277-81
Introduction
The ultimate goal of medical education is to train aspiring medical students in such a way that they are empowered and efficient enough to respond to the health needs of the general population.[1],[2] However, the current medical education system relies on a curriculum that is subject-centered and time-oriented.[1] The teaching–learning activities and the assessment methods predominantly target the knowledge domain, while attitude and skill acquisition during the course remain neglected.[1] As a result, graduates have knowledge about medicine but often lack the skills required in clinical settings.[1] Moreover, most of the exams conducted are summative in nature and offer limited scope for feedback.[1]
Competency-based medical education (CBME) has been advocated to neutralize the demerits of the traditional form of medical education, as it is more accountable, flexible, and learner-centered.[1],[2],[3],[4] Entrustable professional activities (EPAs) refer to tasks that can be entrusted to a learner for unsupervised execution once sufficient competence has been attained.[2],[3] Further, the proposed assessments are formative in nature and criterion-referenced, as the performance of one student is never compared with that of other students.[2],[3],[4] Mapping EPAs to the relevant competencies and milestones is the key to ensuring that the framed activities are entrusted to students in a meaningful way.[1],[2],[3],[4] These EPAs should be designed separately for each subject by the subject experts in collaboration and should be assessed periodically to check the progress of the learners.[1],[2],[3],[4]
CBME has been adopted in the authors' parent institute for the postgraduate course in community medicine since 2016, and 50 subject-specific EPAs have been formulated. These EPAs were framed after discussion and deliberation between faculty members of two medical colleges. However, in the first year of implementation, it was realized that the faculty and postgraduate (PG) students were not knowledgeable enough for the transition to CBME, and that there was neither a systematic assessment framework nor valid and reliable tools in the assessment process. In addition, the element of reflection by PGs was not explored. Acknowledging these problems and the opportunity to bridge the existing gaps, the current study was planned with the objectives of developing and validating appropriate assessment tools for subject-specific EPAs and assessing the usefulness of the validated tools in the assessment of PG students.
Materials and Methods
It was an interventional study done over 14 months (September 2017–October 2018) in the Department of Community Medicine of Shri Sathya Sai Medical College and Research Institute, Kancheepuram.
First, the departmental faculty and PGs were sensitized about CBME, EPAs, assessment in CBME, and the need for the study. Subsequently, three of the 50 EPAs formulated for the specialty were selected on a pilot basis through consensus among faculty members, with a plan to expand the assessment to other EPAs in the coming years.
This was followed by the selection of an appropriate assessment tool for each of the three EPAs, namely:
- Obtains and documents relevant history for noncommunicable diseases: mini-clinical evaluation exercise (Mini-CEX)
- The clinical management of diarrhea cases using the Integrated Management of Neonatal and Childhood Illness (IMNCI) standards: case-based discussion (CBD)
- The assessment of nutritional status using anthropometry among children <5 years of age: direct observation of procedural skills (DOPS).[5]
Thereafter, three PG students were oriented to the clinico-social case pro forma, the management of diarrhea cases as per IMNCI standards, and the various classifications used to grade malnutrition. In addition, demonstration sessions were conducted to familiarize them with the various anthropometry methods. For the three selected EPAs, the benchmarking levels were set at 100% through consensus among the department colleagues. A plan for the assessment of the PGs was also designed, and it was decided that each of the three PG students would be assessed nine times (viz., three times for each of the three EPAs, at quarterly intervals).
This was followed by the formulation of rubrics for all three EPAs under the various descriptors mentioned in the selected tools. Subsequently, face and content validation of the assessment tools was done with the help of subject experts (both internal and external). Based on the comments received, the assessment tools were appropriately modified. These validated tools [Annexure 1], [Annexure 2], [Annexure 3] were used by the faculty members for the assessment and certification of the PG students in the three selected EPAs.


Statistical analysis
A five-point Likert scale, where 1 was "strongly disagree" and 5 was "strongly agree," was used for face and content validation, and the item-level and scale-level content validity indices were calculated.
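The content validity indices referred to above can be computed from the expert ratings in a straightforward way: an item's index (I-CVI) is the proportion of experts rating it 4 or 5, and the scale-level index with universal agreement (S-CVI/UA) is the proportion of items on which every expert agreed. The following is an illustrative sketch, not the authors' code, and the ratings shown are hypothetical:

```python
# Illustrative sketch of content validity index calculation.
# Ratings are on a 5-point Likert scale; 4 or 5 counts as agreement.

def item_cvi(ratings):
    """I-CVI: proportion of experts rating the item 4 or 5."""
    return sum(1 for r in ratings if r >= 4) / len(ratings)

def scale_cvi_ua(item_ratings):
    """S-CVI/UA: proportion of items with universal agreement (I-CVI == 1)."""
    cvis = [item_cvi(r) for r in item_ratings]
    return sum(1 for c in cvis if c == 1.0) / len(cvis)

# Hypothetical ratings from five experts for four items:
items = [
    [5, 5, 4, 5, 5],  # all experts agree
    [5, 4, 5, 5, 4],  # all experts agree
    [5, 3, 5, 4, 5],  # one expert rated 3 (neutral)
    [4, 5, 5, 5, 5],  # all experts agree
]
print([item_cvi(r) for r in items])  # [1.0, 1.0, 0.8, 1.0]
print(scale_cvi_ua(items))           # 0.75 (3 of 4 items at full agreement)
```

On this logic, the DOPS result reported later (universal agreement on 28 of 32 items) yields a scale-level index of 28/32 = 0.87.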
Ethical considerations
Institutional Ethics Committee clearance was obtained (IEC No. 2017/341, dated August 31, 2017). Consent was obtained from the study participants (PG students) before enrolling them in the study.
Results
The results of the pilot study are grouped into the following subheadings:
Views of postgraduate students about sensitization session
What was good about the session? The sensitization on CBME, EPAs, and assessment process using appropriate tools gave them an idea about what they are supposed to do in the course of community medicine. PGs also reported that the assessment process in itself can act as a self-assessment tool.
How could it have been made better? One of the PGs perceived workplace-based assessment to be the same as the objective structured clinical examination and needed more clarity. The PGs also asked for a hands-on demonstration for a better understanding of the entire process. These needs were addressed in the orientation session before the start of the study.
Development of assessment plan rubric for three tools
The rubric of milestones was prepared for various descriptors as mentioned in the pre-validated assessment tools.[5] The designed rubrics have been depicted in [Annexure 1], [Annexure 2], [Annexure 3].
Presentation of findings of the validation process
Face validation
For face validation, the designed assessment tools (Mini-CEX, DOPS, and CBD), along with their rubrics of milestones, were shared with five subject experts. The scope and purpose of the study were explained to these experts to ensure that they had a good understanding of the study. The experts were asked to answer the following questions and record their responses on a 5-point Likert scale, where 1 was "strongly disagree" and 5 was "strongly agree":
- Are the descriptors mentioned in the tool relevant to the EPA being assessed?
- Do the rubrics of milestones developed for each descriptor cover all the areas of the EPA being assessed? If not, kindly list the areas to be incorporated
- Do the rubrics of milestones record a gradual progression in rating from 1 to 9? If not, please suggest a rearrangement
- Are there any items which may end up in measuring the same variable?
- Is the language used in the questionnaire clearly understandable, or are there any technical terms that are difficult to understand?
Four subject experts recorded 5 (strongly agree) for each of the parameters, while one expert raised a concern about the gradual progression in the degrees of professionalism and graded that descriptor as 3 (neutral).
The experts thoroughly reviewed the questionnaire and gave the following suggestions:
General remarks:
- Even though the assessment tool had a 9-point rating scale, rubrics were initially designed for only 5 points; it was therefore suggested that subjectivity would remain and that rubrics of milestones should be framed for all 9 points
- The designed tools will be extremely useful for the assessment of the selected EPAs (internal and external subject experts)
- Use only those attributes under professionalism that can be systematically measured.
Mini-clinical evaluation exercise
The subject experts suggested removing the rubric pertaining to “confidentiality” under the descriptor “Physical examination,” as it is repeated under “Professionalism” and to elaborate “possessing leadership qualities” under “Communication Skills.”
Case-based discussion
It was suggested to spell out IMNCI in full, to incorporate understanding of the importance of dehydration assessment, and to add informing the patient about the appropriate "National Health Programs and Government Schemes" for the patient's welfare.
Direct observation of procedural skills
The experts pointed out that the rubrics designed for obtaining informed consent in Grades 8 and 9 were almost the same, and that "possessing leadership qualities" under "overall clinical judgment" should be elaborated. The tools [Annexure 1], [Annexure 2], [Annexure 3] were modified based on the comments received from all the subject experts.
Content validation
Copies of the assessment tools were sent to seven subject experts, other than those who participated in the face validation, to assess the content validity of the instrument. Only five experts responded to the invitation. These experts were asked to rate each item on a five-point Likert scale, where 1 was "strongly disagree" and 5 was "strongly agree." A total of four parameters were assessed, namely readability, clarity, comprehensiveness, and relevance.
Item content validity index: 1 for all three tools (Mini-CEX, DOPS, and CBD); all experts strongly agreed.
Scale content validity index: 1 for Mini-CEX and CBD (universal agreement) and 0.87 for DOPS (universal agreement on 28 of 32 items).
Assessment findings
Three PG students were assessed in the outpatient department, the wards, and the department using the validated assessment tools [Annexure 1], [Annexure 2], [Annexure 3]. Two rounds of assessments were done, and each PG student was assessed by 3–4 faculty members for each of the selected EPAs. The assessment was done on real patients, and based on the observations during the performance of the task, appropriate and constructive feedback was given to the student. After each such session, the PG student and the involved faculty member sat together to develop a shared plan for ensuring the professional growth of the student in all three domains of learning. Subsequently, the PG students reflected on the entire process of each assessment and wrote down what they actually learnt during that specific interaction and their experiences of the same.
All three PG students showed a progressive increase in their performance in all three selected EPAs over the two rounds of formative assessment. From the assessors' perspective, the entire process was systematic and clear, and there was minimal scope for assessor bias when the prepared rubrics were used for the assessment. Corroborating this, the observation and scoring by the faculty members were found to be consistent.
Postgraduate students' views about assessment in competency-based medical education
- Prior to use of the tools: no rationale for grading my performance; no specific comments were given to improve on mistakes; no scope for interaction with faculty members other than the PG guide
- After using the tools (all three PGs were assessed thrice with the validated assessment tools for all three EPAs): more aware of the attributes examined by assessors during the examination; enlightened about the knowledge and skills required for the respective EPAs; assessment using rubrics makes the assigned task very precise and systematic; the feedback given by the assessor helps me improve; and the development of a shared plan helps bridge the existing gaps.
Discussion
CBME has been acknowledged as the need of the hour owing to its associated benefits. Indeed, a number of institutes have implemented CBME within their setup to train medical students in areas such as public health and epidemiology or family medicine residency programs.[6],[7] The process of implementing CBME is not simple: it requires extensive planning, curricular reforms, support from the administration, training of faculty members, and orientation of the medical residents.[8],[9]
In the current study of the implementation of CBME, the first step was to sensitize the departmental faculty and PGs about CBME, EPAs, assessment in CBME, and the need for the study. The findings of different studies have indicated that if we want to change the culture of medical training, we have to address all the stakeholders.[8],[9],[10] Because faculty members are often resistant to change, we addressed their needs first, before starting the study. Time and again, the importance of faculty members in the successful implementation of CBME programs across different specialties has been emphasized.[7],[10]
On a similar note, the PG students were also sensitized about the entire process, the selected EPAs, and the expectations of them. Their views were obtained about the sensitization session, and all the identified areas were addressed. It is of immense importance to obtain the views of the target students, whether undergraduate or PG, so that the planned approach can be modified based on their needs. This becomes extremely important because CBME is quite different from the traditional mode of education delivery, and the administrators and even the faculty members have to assess the perspectives and preparedness of the students.[10]
In the current study, prevalidated and appropriate assessment tools released by the Association for Medical Education in Europe were employed.[5] The advantage of using these tools was that the descriptors were already validated, and we had to validate only the designed rubrics of milestones for each of the descriptors. The rubrics for each of the three assessment tools were designed [Annexure 1], [Annexure 2], [Annexure 3] and were subjected to external and internal face and content validation. These designed milestones helped the faculty members assess the progress of the PG students, while for the residents they acted as a self-assessment tool.[11],[12]
The assessment process in CBME is quite unique, and thus, to attain the desired objectives, the assessment has to be authentic and formative in nature.[13],[14] In the present study, after each assessment, constructive feedback was given to the PG student, followed by the development of a shared plan to improve the identified areas of weakness. Moreover, the students also reflected on the entire assessment process and what they had learnt, to consolidate deep learning. The feedback obtained played a significant part in improving the students' knowledge, attitudes, skills, and approach toward patients. Another study revealed similar outcomes for undergraduate medical students.[15] In addition, the faculty members were quite satisfied with the assessment process, as it removed subjectivity and gave them an explicit guideline for assessing the students.
The strength of the present study is that, at present, not many institutes across India have started CBME within their setup, and thus the findings will be of immense utility to those institutes that plan to start it in the near future. At the same time, it is a major step toward ensuring that the community medicine doctors produced are competent health professionals who will be able to meet the needs of the community while being globally relevant. The limitation of the study was that it involved only three EPAs in the subject; however, as it was a pilot study and we were learning throughout the process, it made sense to initiate the process on a small scale. There is a definite plan to expand the assessment process to other EPAs in the coming days.
Conclusion
CBME is the need of the hour, and it has been adopted for the community medicine PG residents in the institute. Three EPAs were selected for certification, and for each one of them, one workplace-based assessment tool was selected. Rubrics of milestones were developed, and the tool was subsequently validated. These tools were then used for periodic assessment in workplace settings and both PG students and internal/external faculty members found the tools to be useful.
Acknowledgment
This Project was done as a part of the FAIMER Fellowship under the guidance of Faculty Members of PSG – FAIMER Regional Institute. The authors thank Dr. Sitanshu Sekhar Kar, JIPMER, Puducherry and Dr. Praveen Kulkarni, JSS, Mysore for their guidance for the successful conduction of the project.
Financial support and sponsorship
Nil.
Conflicts of interest
There are no conflicts of interest.
References
1. | Shah N, Desai C, Jorwekar G, Badyal D, Singh T. Competency-based medical education: An overview and application in pharmacology. Indian J Pharmacol 2016;48:S5-9. |
2. | Touchie C, ten Cate O. The promise, perils, problems and progress of competency-based medical education. Med Educ 2016;50:93-100. |
3. | Ten Cate O, Billett S. Competency-based medical education: Origins, perspectives and potentialities. Med Educ 2014;48:325-32. |
4. | Levine MF, Shorten G. Competency-based medical education: Its time has arrived. Can J Anaesth 2016;63:802-6. |
5. | Norcini J, Burch V. Workplace-based assessment as an educational tool: AMEE guide no 31. Med Teach 2007;29:855-71. |
6. | Dankner R, Gabbay U, Leibovici L, Sadeh M, Sadetzki S. Implementation of a competency-based medical education approach in public health and epidemiology training of medical students. Isr J Health Policy Res 2018;7:13. |
7. | Schultz K, Griffiths J. Implementing competency-based medical education in a postgraduate family medicine residency training program: A stepwise approach, facilitating factors, and processes or steps that would have been helpful. Acad Med 2016;91:685-9. |
8. | Shrivastava SR, Shrivastava PS, Ramasamy J. Development of a module to successfully implement competency based medical education program in an institute. Muller J Med Sci Res 2018;9:27-9. [Full text] |
9. | Nousiainen MT, Caverzagie KJ, Ferguson PC, Frank JR, ICBME Collaborators. Implementing competency-based medical education: What changes in curricular structure and processes are needed? Med Teach 2017;39:594-8. |
10. | Walsh A, Koppula S, Antao V, Bethune C, Cameron S, Cavett T, et al. Preparing teachers for competency-based medical education: Fundamental teaching activities. Med Teach 2018;40:80-5. |
11. | Johnston C. Residents prepare for switch to competency-based medical education. CMAJ 2013;185:1029. |
12. | Holmboe ES, Call S, Ficalora RD. Milestones and competency-based medical education in internal medicine. JAMA Intern Med 2016;176:1601-2. |
13. | Humphrey-Murto S, Wood TJ, Ross S, Tavares W, Kvern B, Sidhu R, et al. Assessment pearls for competency-based medical education. J Grad Med Educ 2017;9:688-91. |
14. | Lockyer J, Carraccio C, Chan MK, Hart D, Smee S, Touchie C, et al. Core principles of assessment in competency-based medical education. Med Teach 2017;39:609-16. |
15. | Kerdijk W, Snoek JW, van Hell EA, Cohen-Schotanus J. The effect of implementing undergraduate competency-based medical education on students' knowledge acquisition, clinical performance and perceived preparedness for practice: A comparative study. BMC Med Educ 2013;13:76. |
This article has been cited by
1. Mourik SL, Roos EJ, Goverde AJ, Wood PL. The new pan-European post-specialty training curriculum in Paediatric and Adolescent Gynaecology. Eur J Obstet Gynecol Reprod Biol 2021;258:152.
2. Liu L, Jiang Z, Qi X, Xie A, Wu H, Cheng H, et al. An update on current EPAs in graduate medical education: A scoping review. Med Educ Online 2021;26(1).
3. Shrivastava SR, Shrivastava PS. Utilization of work place based assessment tools in medical education: Potential challenges and solutions. Res Dev Med Educ 2020;9(1):8.