
Evaluation of the Potential of National Sharing of a Unified Progress Test Among Colleges of Pharmacy in the Kingdom of Saudi Arabia

Authors: Albekairy AM, Obaidat AA, Alsharidah MS, Alqasomi AA, Alsayari AS, Albarraq AA, Aljabri AM, Alrasheedy AA, Alsuwayt BH, Aldhubiab BE, Almaliki FA, Alrobaian MM, Aref MA, Altwaijry NA, Alotaibi NH, Alkahtani SA, Bahashwan SA, Alahmadi YA

Received 8 September 2021

Accepted for publication 3 December 2021

Published 16 December 2021, Volume 2021:12, Pages 1465–1475

DOI https://doi.org/10.2147/AMEP.S337266




Abdulkareem M Albekairy,1 Aiman A Obaidat,1 Mansour S Alsharidah,2 Abdulmajeed A Alqasomi,3 Abdulrhman S Alsayari,4 Ahmad A Albarraq,5 Ahmad M Aljabri,6 Alian A Alrasheedy,7 Bader H Alsuwayt,8 Bandar E Aldhubiab,9 Faisal A Almaliki,10 Majed M Alrobaian,11 Mohammad A Aref,12 Najla A Altwaijry,13 Nasser H Alotaibi,14 Saad A Alkahtani,15 Saleh A Bahashwan,16 Yaser A Alahmadi16

1College of Pharmacy, King Saud bin Abdulaziz University for Health Sciences, Riyadh, Saudi Arabia; 2College of Medicine, Qassim University, Buraydah, Saudi Arabia; 3College of Pharmacy, Qassim University, Buraydah, Saudi Arabia; 4College of Pharmacy, King Khalid University, Abha, Saudi Arabia; 5College of Pharmacy, Jazan University, Jazan, Saudi Arabia; 6College of Pharmacy, University of Tabuk, Tabuk, Saudi Arabia; 7College of Pharmacy, Qassim University, Unaizah, Saudi Arabia; 8College of Pharmacy, Northern Border University, Rafha, Saudi Arabia; 9College of Clinical Pharmacy, King Faisal University, Al Hofuf, Saudi Arabia; 10College of Pharmacy, Umm Al-Qura University, Makkah, Saudi Arabia; 11College of Pharmacy, Taif University, Taif, Saudi Arabia; 12College of Pharmacy, Albaha University, Albaha, Saudi Arabia; 13College of Pharmacy, Princess Nourah Bint Abdulrahman University, Riyadh, Saudi Arabia; 14College of Pharmacy, Jouf University, Aljouf, Saudi Arabia; 15College of Pharmacy, Najran University, Najran, Saudi Arabia; 16College of Pharmacy, Taibah University, Madina, Saudi Arabia

Correspondence: Aiman A Obaidat
College of Pharmacy, King Saud bin Abdulaziz University for Health Sciences, P.O. Box 3660, Riyadh, 11481, Saudi Arabia
Email [email protected]

Background: With the expansion of pharmacy education in Saudi Arabia, there is a pressing need to maintain quality assurance in pharmacy programs using a range of assessment tools. The progress test is a formative assessment tool that can provide information to all stakeholders. This study evaluated the results of a unified progress test shared among 15 colleges of pharmacy.
Methods: The progress test was composed of 100 multiple choice questions (MCQs), of which 30% covered basic pharmaceutical sciences and 70% covered pharmacy practice. The questions were collected from all 15 colleges of pharmacy that participated in the test. The test was administered online to all undergraduate students in the professional programs of these colleges.
Results: The overall attendance rate was 80% of the total number of students enrolled in the participating colleges. Mean scores in basic pharmaceutical sciences were relatively higher than in pharmacy practice. Across colleges, the assessment results for the unified program learning outcomes were higher in the knowledge and skills domains than in the competence domain. There was a significant increase in mean scores as students progressed through the years of the professional program. No correlation was found between the mean scores in the test and the cumulative grade point average (cGPA) of students at any level.
Conclusion: The results indicated growth and retention of knowledge and skills as students progressed through the years of the professional program, with consistent results among the participating colleges. Sharing a unified test proved to be a valuable tool for the colleges of pharmacy for benchmarking and curriculum improvement. In addition, it could serve to evaluate students' learning and harmonize the knowledge and skills gained by students at different institutions.

Keywords: progress test, formative test, assessment, learning outcomes, pharmacy education, Saudi Arabia

Introduction

Pharmacy education in the Kingdom of Saudi Arabia (KSA) has witnessed an exponential expansion since 2001. This was driven by a government initiative to train more pharmacists in order to meet national needs and the demands of the country's growing population. Currently, there are 22 government colleges of pharmacy in addition to seven private colleges.1 All of these colleges offer a Doctor of Pharmacy (Pharm.D.) program, and a few still offer a Bachelor of Science (B.Sc.) in pharmacy.

In order to maintain high quality education and for accreditation purposes, the Saudi Education and Training Evaluation Commission (ETEC), through its affiliated National Center for Academic Accreditation and Evaluation (NCAAA), has developed the Saudi Arabian Qualifications Framework (SAQF), a document that integrates education, training and employment in a unified system. All educational institutions in the KSA must register themselves, as well as the qualifications/programs they offer, against a set of standards provided by this document. This is the first and mandatory step for institution and program accreditation, and there are separate sets of standards for the institution and for the program.2 The SAQF also specifies the domains for program learning outcomes (PLOs), which are categorized into knowledge, skills and competence. The competence domain covers the areas of autonomy and responsibility, practice, and attributes. Each Pharm.D. program has a set of PLOs that satisfy these domains, and consequently, the course learning outcomes (CLOs) for each course in the curriculum have to be logically mapped to the program's PLOs. Under the above-mentioned domains, the colleges of pharmacy across the KSA share a unified set of 18 PLOs for their Pharm.D. programs.

With this increase in the number of colleges of pharmacy, there is a crucial need to maintain quality assurance in their programs, which will provide information for all stakeholders to improve curricula and enhance faculty and facilities.3 This requires assessment tools that evaluate students' learning and harmonize the knowledge and skills gained from such programs at different institutions.4 Summative assessment has traditionally been the common approach to assess learning and to assure the competence of pharmacy graduates for external stakeholders.5 More recently, the impact of formative assessment on students' learning has been recognized by educators.6,7 The progress test is an approach that has been shown to be effective for both formative and summative assessment of student achievement. It is a longitudinal assessment strategy that is applied periodically to all students of a program, with the expectation that the percentage of correct answers increases as students advance.8,9 The progress test can measure deep, long-term learning and allows early detection of underperforming students.10 The Universities of Maastricht and Missouri were the first to develop and introduce the progress test back in the 1970s.11,12 Since then, it has been practiced in various health care programs around the world, with most emphasis in medicine. The progress test can therefore serve as a formative assessment tool for student achievement and as a tool for evaluation of the curriculum.13

There is a Council of Deans for all the colleges of pharmacy in the KSA, composed of the deans of these colleges and chaired by one of them on a rotating basis. Briefly, the function of this council is to implement strategic planning for the colleges of pharmacy and to foster cooperation on all matters related to the profession of pharmacy, including the number of pharmacy graduates and their prospective work opportunities. In the past academic year (2020–2021), the Council of Deans launched an initiative to administer a unified progress test to all undergraduate students enrolled in these colleges. The test was intended to target the entire student population in each college at all levels of the professional program, ie, professional year 1 (P1) to professional year 4 (P4). It was agreed to conduct the test annually thereafter to evaluate students and curricula, and to serve as a reliable benchmarking tool for accreditation applications.

The aim of this study is to report on the utilization of an annual progress test for pharmacy students in the KSA and to present the results of the test for the academic year 2020–2021 in order to evaluate the performance of students at different levels of the program.

Methodology and Design

Rationale

Although the current Pharm.D. curricula are based on competency education in this field, there are still challenges in the assessment of students and curricula that have not yet been fully resolved. Therefore, it is worth studying the contribution of a unified progress test, administered in a significant number of pharmacy colleges, to the assessment of the knowledge and skills gained and retained by students over time, as well as to identifying specific areas of concern in the curriculum that may need re-evaluation and improvement.

Study Design

This task was led by the College of Pharmacy (COP) at King Saud bin Abdulaziz University for Health Sciences (KSAU-HS) in Riyadh, KSA. Initially, the number of colleges willing to participate in the test was identified. Each college's dean was asked to instruct faculty members, according to their specialties, to prepare and submit a minimum of 30 questions in basic pharmaceutical sciences and another 50 questions covering the areas of pharmacy practice. The question items were multiple choice questions (MCQs) with four response options each and had to be mapped to the PLOs. The MCQs collected from all colleges were reviewed by a panel of faculty members in basic pharmaceutical sciences and pharmacy practice against specified inclusion criteria and then saved in a question bank created for this purpose. Based on the inclusion/exclusion criteria listed below, the total number of questions entered in the question bank was 472, with an average of 3–4 questions mapped to each PLO. To ensure balanced coverage of the 18 unified PLOs, 2–3 MCQs were selected to assess each PLO, and each PLO was assessed by the average percentage of students' correct responses on these questions. Since skills and competences cannot be evaluated directly by MCQs, we used MCQ items designed to measure cognitive skills, application of guidelines and competence-relevant knowledge. The test was prepared by selecting 100 MCQs from this question bank; 30% covered basic pharmaceutical sciences and the remaining 70% covered different areas of pharmacy practice. The test was designed to be administered over a period of 2 hours.
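
As a minimal sketch only, the following Python snippet illustrates how such a blueprint-driven selection (30% basic sciences, 70% practice, with coverage spread over the 18 unified PLOs) could be scripted. The question-bank structure and field names (discipline, plos) are hypothetical illustrations, not the study's actual tooling; item writing and review were performed by the faculty panel.

    import random
    from collections import defaultdict

    # Hypothetical question bank: each item has an id, a discipline tag
    # ("basic" or "practice") and the unified PLO code(s) it is mapped to.
    question_bank = [
        {"id": i,
         "discipline": "basic" if i % 3 == 0 else "practice",
         "plos": [f"PLO{(i % 18) + 1}"]}
        for i in range(1, 473)  # 472 reviewed items, as reported in the study
    ]

    def assemble_test(bank, n_basic=30, n_practice=70):
        """Select 100 MCQs (30% basic sciences, 70% pharmacy practice) while
        spreading the selection over the 18 unified PLOs."""
        per_plo = defaultdict(int)  # how many selected items cover each PLO

        def pick(pool, n):
            pool, chosen = list(pool), []
            random.shuffle(pool)
            for _ in range(n):
                # Greedily take the item whose PLO(s) are least represented so far.
                pool.sort(key=lambda q: min(per_plo[p] for p in q["plos"]))
                item = pool.pop(0)
                chosen.append(item)
                for p in item["plos"]:
                    per_plo[p] += 1
            return chosen

        basic = [q for q in bank if q["discipline"] == "basic"]
        practice = [q for q in bank if q["discipline"] == "practice"]
        return pick(basic, n_basic) + pick(practice, n_practice)

    test_form = assemble_test(question_bank)
    print(len(test_form), "items selected;",
          "PLOs covered:", len({p for q in test_form for p in q["plos"]}))

The sketch only shows how the 30/70 discipline split and the 2–3 items per PLO interact as selection constraints; any real implementation would draw on the reviewed bank rather than synthetic items.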

The test was conducted during week 11 of the first semester of the academic year 2020–2021. A unified date and time were set during that week to be the most convenient for students of all batches and to have them available for a 2-hour period to take the test. The date and time were announced to the students only 2 days before the test. They were also informed that attendance was mandatory and that they should take the test without any prior preparation or studying. The test was administered online in all participating colleges, and the students were informed how to access it at the specified time. Only students enrolled in the professional programs were targeted. For the purpose of reporting the results, colleges are identified by codes (C1–C15) without disclosing the names of the colleges or universities. No IRB review or approval was required for this study since it does not involve experimentation on humans or the collection of biological samples from participants.

Inclusion Criteria

A question item was accepted into the question bank if it was:

- In the form of an MCQ.

- Clearly addressing a specific topic in either basic pharmaceutical sciences or pharmacy practice.

- Logically mapped to one or more PLOs.

Exclusion Criteria

A question item was rejected and not entered into the question bank if one or more of the following applied:

- Not in the form of an MCQ

- Vague, without clear identification of the topic addressed

- Not mapped to one or more PLOs

- Fewer or more than four response options

- MCQs with response options such as “All of the above” or “None of the above”

Statistical Analysis

Descriptive statistics were applied wherever appropriate: numbers and percentages for categorical variables, and means and standard deviations for continuous variables. One-way analysis of variance (ANOVA) was used to determine the level of significance. A linear regression model was constructed to find the correlation coefficient and a 95% confidence interval (CI) in order to investigate whether there is a significant correlation between the students' average cumulative grade point average (cGPA) and their mean scores in the progress test. The Pearson correlation coefficient and its p-value were calculated for each batch of students, with a p-value < 0.05 considered significant.
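
The paper does not state which analysis software was used. As an illustration only, with hypothetical column names and simplified per-student data, the descriptive statistics, one-way ANOVA and per-level Pearson/regression analyses described above could be computed along the following lines with standard scientific Python tools.

    import numpy as np
    import pandas as pd
    from scipy import stats

    # Hypothetical per-student records: professional level, test score (%), cGPA.
    rng = np.random.default_rng(0)
    df = pd.DataFrame({
        "level": rng.choice(["P1", "P2", "P3", "P4"], size=400),
        "score": rng.normal(45, 10, size=400),
        "cgpa": rng.normal(3.5, 0.8, size=400).clip(1, 5),
    })

    # Descriptive statistics: mean and standard deviation of scores per level.
    print(df.groupby("level")["score"].agg(["mean", "std"]))

    # One-way ANOVA comparing mean scores across the four professional levels.
    groups = [g["score"].to_numpy() for _, g in df.groupby("level")]
    f_stat, p_anova = stats.f_oneway(*groups)
    print(f"ANOVA: F = {f_stat:.2f}, p = {p_anova:.4f}")

    # Pearson correlation and simple linear regression between cGPA and score
    # for each level, with p < 0.05 taken as significant.
    for level, g in df.groupby("level"):
        r, p = stats.pearsonr(g["cgpa"], g["score"])
        reg = stats.linregress(g["cgpa"], g["score"])
        print(f"{level}: r = {r:.3f}, p = {p:.3f}, slope = {reg.slope:.2f}")

A 95% CI for the regression slope can be derived from the standard error returned by linregress; the point of the sketch is simply that each reported p-value and CI corresponds to a routine computation in any statistics package.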

Results

Fifteen of the 22 government colleges of pharmacy in the Kingdom agreed to participate in the progress test. Table 1 summarizes these colleges' programs, the number of students enrolled in each program and the number of students who attended the test. Pharm.D. programs are offered in 11 of these colleges, while 2 offer both Pharm.D. and B.Sc. programs at the same time. However, in one of these colleges (C12), the B.Sc. program is in its final year of phasing out, and only 3 of the 431 students who attended the test were B.Sc. students, which is negligible. Thus, colleges C1–C12 are treated as colleges that offer the Pharm.D. program only. College C13 offers a B.Sc. program and has just started a Pharm.D. program, and it is therefore treated as one of the colleges that offer a B.Sc. program. Two colleges still offer B.Sc. programs only (C14 and C15). Thus, colleges C13–C15 are treated as colleges that offer the B.Sc. program only.

Table 1 Program(s) Offered by Each Participating College, the Total Number of Enrolled Students and the Number of Students Who Attended the Progress Test

The total number of students enrolled in all participating colleges is 5364, and 4321 students attended the test, corresponding to an overall attendance rate of about 80%. Table 1 also shows the attendance percentage for each college, which ranged from 59% to slightly above 97%.

Figure 1 shows the mean scores in the progress test for male and female students from each college. College C9 offers the Pharm.D. program to female students only, while C14 offers the B.Sc. program to male students only. Figure 2 shows the mean scores of the students in the questions on basic pharmaceutical sciences and pharmacy practice in the colleges that offer the Pharm.D. program only (C1–C12). In the majority of the colleges, the mean scores in basic pharmaceutical sciences were relatively higher than the scores in pharmacy practice. The same data, stratified by professional level in the colleges that offer the Pharm.D. program, are shown in Table 2. The results show a gradual increase in mean scores as students progress through the Pharm.D. program. Figure 3 shows the same comparison, as total mean scores, for the colleges that offer the B.Sc. program only (C13–C15), with a similar trend: mean scores are higher in the questions covering basic pharmaceutical sciences than in the pharmacy practice disciplines. Figure 4 illustrates the total mean scores for each professional year of the Pharm.D. students, computed as the combined average scores in basic pharmaceutical sciences and pharmacy practice for each professional year in the Pharm.D. programs of colleges C1–C12.

Table 2 Mean Scores (% ± SD) of Students in the Questions on Basic Pharmaceutical Sciences and Pharmacy Practice for Each Professional Level (P1 – P4) in the Colleges That Offer Pharm.D. Program

Figure 1 Mean scores (%) of all students in the professional programs as reported by the colleges that participated in the progress test.

Figure 2 Mean scores (%) of students in the questions on basic pharmaceutical sciences and pharmacy practice in the colleges that offer Pharm.D. program.

Figure 3 Mean scores (%) of students in the questions on basic pharmaceutical sciences and pharmacy practice in the colleges that offer B.Sc. programs (Data on pharmacy practice questions were not reported by C14).

Figure 4 Combined mean scores (%) of all students in the professional years (P1 – P4) of the Pharm.D. program for colleges C1 – C12.

Table 3 summarizes the assessment results of the unified PLOs among the colleges of pharmacy in the KSA. The table presents the assessment results for both Pharm.D. and B.Sc. students based on the progress test results obtained from the participating colleges. In general, and for both Pharm.D. and B.Sc. programs, the assessment results are higher in the knowledge and skills domains than in the competence domain with its sub-domains of autonomy and responsibility, practice and attributes. For the B.Sc. program in particular, the assessment results in the knowledge and skills domains are significantly higher than in the competence domain, and the B.Sc. students' scores in the competence domain are significantly lower than those of the Pharm.D. students. The assessment result for each PLO was calculated by computing the mean score of the students in all questions mapped to that particular PLO. Although the unified PLOs are designed for the Pharm.D. program, these assessment results were calculated based on the progress test results of students enrolled in both the Pharm.D. and B.Sc. programs for the sake of comparison.
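
As a rough sketch of this aggregation (the table layout and names below, such as responses and question_to_plo, are hypothetical), the PLO-level assessment results could be derived from item-level responses as follows:

    import pandas as pd

    # Hypothetical item-level results: one row per (student, question) response.
    responses = pd.DataFrame({
        "student": ["s1", "s1", "s2", "s2", "s3", "s3"],
        "program": ["PharmD", "PharmD", "PharmD", "PharmD", "BSc", "BSc"],
        "question": ["q1", "q2", "q1", "q2", "q1", "q2"],
        "correct": [1, 0, 1, 1, 0, 1],
    })
    # Hypothetical mapping of each question to the unified PLO it assesses.
    question_to_plo = {"q1": "PLO3", "q2": "PLO10"}

    responses["plo"] = responses["question"].map(question_to_plo)

    # Assessment result per PLO = mean percentage of correct answers over all
    # questions mapped to that PLO, reported separately for each program.
    plo_results = (responses.groupby(["program", "plo"])["correct"]
                   .mean().mul(100).round(1).unstack("program"))
    print(plo_results)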

Table 3 Unified Program Learning Outcomes (PLOs) Among the Colleges of Pharmacy in the KSA and Their Assessment Based on the Students’ Results in the Progress Test for Both Pharm.D. and B.Sc. Students. The Last Column Shows the PLOs Assessment Results Based Only on the P4 (Pharm.D.) Students’ Scores

The last column in Table 3 summarizes the assessment results of the unified PLOs based only on the mean scores of the P4 Pharm.D. students, calculated as the combined results from colleges C1–C12. A gradual increase in the assessment results of the PLOs can be clearly observed toward the competence domain and its sub-domains; the scores of the P4 Pharm.D. students in these PLOs are higher than in the knowledge and skills domains.

Table 4 shows the correlation between the average cGPA of students at each professional level of the Pharm.D. program and their mean scores in the progress test. The results indicated no significant correlation, as the CI for each level is very wide and the p-values are much higher than 0.05.

Table 4 Correlation Between the Average cGPA and Mean Scores in the Progress Test for Each Professional Level (P1 – P4) of Students in the Pharm.D. Program

Discussion

The total number of students who attended the progress test was 4321 from all 15 pharmacy colleges that participated in the test. This number includes Pharm.D. and B.Sc. students and constitutes a satisfactory sample size from which to draw conclusions based on this test. College C12 has a B.Sc. program in its final year of phasing out; the majority of the students who attended the test from this college were Pharm.D. students, with a total of 428 students. College C13 has just started a Pharm.D. program with only 25 students and still offers a B.Sc. program in which the majority of its students are enrolled. Therefore, the data obtained from this college were considered for the B.Sc. program only, since 374 of its students attended the test and only 25 of them were Pharm.D. students. Thus, for the purpose of data treatment, colleges C1–C12 are treated as offering the Pharm.D. program and colleges C13–C15 as offering the B.Sc. program only.

A comparison between the mean scores of male and female students is shown in Figure 1. The male students' average score was 40.75 ± 5.40 and the female students' average score was 41.64 ± 5.61; one-way single-factor ANOVA gave a p-value of 0.314, indicating no significant difference between the mean scores of the two groups. Furthermore, the mean scores of male and female students across colleges showed a very strong positive correlation with high statistical significance (Pearson correlation coefficient r = 0.871, p = 0.0001). This also supports that the results of male and female students are highly correlated and not significantly different from each other. Therefore, in the discussion that follows, the results are treated as combined results regardless of gender.

In the majority of the colleges that offer the Pharm.D. program (C1–C12), the mean scores of the students in basic pharmaceutical sciences were relatively higher than their mean scores in the questions addressing pharmacy practice disciplines (Figure 2). One-way single-factor ANOVA gave a p-value of 0.0094, indicating a significant difference between the mean scores in basic pharmaceutical sciences and pharmacy practice. The average of the mean scores in basic pharmaceutical sciences was 46.29 ± 8.92, while in pharmacy practice it was 36.57 ± 7.78. Detailed results for each professional year of the Pharm.D. program across all colleges, as a combined average ± SD, are presented in Table 2 and show the same trend observed in Figure 2. The difference between the mean scores in basic pharmaceutical sciences and pharmacy practice was statistically significant for P1 and P2, with p-values of 0.011 and 0.018, respectively. However, the p-values for P3 and P4 were close to 0.05, indicating no significant difference between the mean scores of these two professional levels in basic pharmaceutical sciences and pharmacy practice. A similar trend was observed for the B.Sc. students (C13–C15), as shown in Figure 3, where the scores in basic pharmaceutical sciences are higher than in pharmacy practice; the mean score in basic pharmaceutical sciences was 57.30 ± 1.84 and in pharmacy practice 38.48 ± 6.60, with a p-value of 0.015 indicating a significant difference. This trend could be attributed to two factors. First, 30% of the questions in the test covered basic pharmaceutical sciences and the remaining 70% addressed different areas of pharmacy practice; P1 and P2 students are expected to score better in basic pharmaceutical sciences since they are still studying these sciences in the first two years of the professional program and have not yet been exposed to clinical and pharmacy practice courses. Second, the major concepts of the knowledge and skills gained from basic sciences courses are usually repeated and re-emphasized throughout the professional years of the program. A similar trend was observed in a study that compared students' knowledge and skills in pharmacology and pharmacotherapy.14 This trend is also expected in colleges that offer the B.Sc. program, as the primary focus of such a curriculum is mainly on basic pharmaceutical sciences; this is clear for the colleges that offer the B.Sc. program (Figure 3). Another issue that can be raised here, in relation to the lower scores in the areas of pharmacy practice, is that the test was based only on MCQs. In practice, as students progress through the professional program, the learning outcomes mainly related to the competence domain are usually evaluated by many other assessment tools such as assignments, case and topic discussions, research projects, seminars and student portfolios during clinical rotations. Some programs even utilize additional assessment tools such as the Objective Structured Practical Examination (OSPE) and the Objective Structured Clinical Examination (OSCE). Thus, including all these tools in a progress test would be difficult and tedious to execute.

Investigating the overall results indicated a clear increase in mean scores from P1 to P4 among all participating colleges: P1 mean scores were the lowest and P4 mean scores were the highest, as illustrated in Figure 4. This indicates growth and maintenance of the knowledge and skills gained by the students through the years of schooling in the professional program. Further investigation of the results presented in Figure 4 showed that applying ANOVA to the mean scores in basic pharmaceutical sciences gave a p-value of 0.0665, indicating no significant difference between the professional levels (P1–P4). A similar analysis of the mean scores in pharmacy practice gave a p-value of 0.00053, indicating a significant difference between the professional levels. This supports the observation that students' knowledge and skills in pharmacy practice grow and improve significantly as they progress through the Pharm.D. curriculum, while they maintain a similar level of knowledge and skills in basic pharmaceutical sciences throughout the program.

The assessment of the unified PLOs among the participating colleges (Table 3) showed relatively higher scores in the questions addressing the knowledge and skills domains than in the competence domain. This is consistent with the trend of higher scores in basic pharmaceutical sciences. The learning outcomes of basic pharmaceutical sciences courses are more likely to be mapped to the knowledge and skills domains rather than to competence, which is usually more related to advanced clinical courses and experiential training. Again, this could also be attributed to the repetition and emphasis of the basic concepts related to knowledge and skills at all levels of the professional program. As shown in Table 3, the assessment results for the unified PLOs based on the progress test results of B.Sc. students are comparable to those of Pharm.D. students in the knowledge and skills domains but much lower in the competence domain. One-way single-factor ANOVA comparing B.Sc. and Pharm.D. students' achievement in the competence domain gave a p-value of 5.99 × 10⁻⁸, indicating a highly significant difference between them. This can be attributed to the nature of B.Sc. programs, which are pharmaceutical product-oriented rather than patient-oriented.15 The contribution of basic pharmaceutical sciences courses in such programs is much higher than that of pharmacy practice and clinical pharmacy courses, in addition to the lack of experiential training. Therefore, B.Sc. students are expected to score lower than Pharm.D. students in the competence domain, which is usually more closely linked to pharmacy practice and clinical pharmacy courses.

The last column in Table 3 shows the assessment of the unified PLOs based only on the progress test results of the P4 Pharm.D. students from the colleges that offer the Pharm.D. program (C1–C12). Their achievement in the questions addressing the competence domain is higher than the combined results of all students (P1–P4), with a p-value of 1.52 × 10⁻⁵. This is expected, as these are senior students in the final year of the Pharm.D. program who have been through almost all phases of the curriculum designed to achieve the expected learning outcomes.

As shown in Table 4, no meaningful correlation was found between the average cGPA and the mean scores of the students in the progress test. This could be attributed to the fact that the cGPA is calculated from a series of periodic summative assessments every semester, which are unrelated to the progress test as a formative tool.16 Besides, students usually sit for the progress test without any prior preparation, and some of them may not take it seriously even though attendance was mandatory. The progress test is also a comprehensive test, and students in the initial levels of the program are therefore not expected to correctly answer questions intended for higher levels.

This progress test has shown very good potential to be shared among colleges of pharmacy in the KSA, especially colleges that offer the Pharm.D. program, since the SAQF learning domains are mainly designed for such a professional program. In addition, content validity has been established and maintained in line with a blueprint for the question items included in the test. The progress test was made up of 100 MCQs with four response options each. Based on our analysis of the results, we suggest adding a fifth option of “I do not know” for students to select; this is intended to minimize guessing, since some questions will inevitably be difficult for some students to answer.

The progress test can therefore provide significant information about several aspects of students and curricula, in addition to peer comparison among pharmacy colleges.17 Our test showed that the students retained the knowledge and skills they gained as they progressed through the Pharm.D. curriculum, as reflected by the higher mean scores of senior students compared with juniors. A shared progress test had already been implemented over the past three academic years by three colleges (C1, C3 and C4). This experience was very effective and served as one of the key performance indicators (KPIs) for benchmarking and accreditation. Our study indicates that this experience can be expanded to a larger number of colleges in the KSA, since they share many unified Pharm.D. PLOs based on the SAQF domains.

Conclusion

This study showed the great potential of sharing a unified progress test at the national level among the colleges of pharmacy in the KSA. The results were consistent across colleges, with a clear increase in mean scores as students progressed through the professional program. This indicates that the progress test, administered annually, is a valuable tool for measuring students' gain and retention of knowledge and skills over the professional years. In addition, sharing a unified test could serve as an excellent tool for benchmarking and accreditation. Since the SAQF domains are mainly designed for the Pharm.D. program, it is recommended that pharmacy colleges still offering a B.Sc. program fully transition to the Pharm.D. program.

Disclosure

The authors report no conflicts of interest in this work.

References

1. Alhamoudi A, Alnattah A. Pharmacy education in Saudi Arabia: the past, the present, and the future. Curr Pharm Teach Learn. 2018;10(1):54–60. doi:10.1016/j.cptl.2017.09.014

2. Available from: https://etec.gov.sa/Products and Services/NCAAA/Program Accreditation/Program Accreditation Standards 2018. Accessed December 8, 2021.

3. Verhoeven BH, Verwijnen GM, Scherpbier AJJA, van der Vleuten CPM. Growth of medical knowledge. Med Educ. 2002;36(8):711–717. doi:10.1046/j.1365-2923.2002.01268.x

4. Nash R, Chalmers L, Brown N, Jackson S, Peterson G. An international review of the use of competency standards in undergraduate pharmacy education. Pharm Educ. 2015;15:131–141.

5. Peeters MJ. Targeting assessment for learning within pharmacy education. Am J Pharm Educ. 2017;81(8):6243. doi:10.5688/ajpe6243

6. Sturpe D. Objective structured clinical examination in Doctor of Pharmacy programs in the United States. Am J Pharm Educ. 2010;74:1–6. doi:10.5688/aj7408148

7. Schuwirth L, van der Vleuten CP. Programmatic assessment: from assessment of learning to assessment for learning. Med Teacher. 2011;33:478–485. doi:10.3109/0142159X.2011.565828

8. Vantini I, Benini L. Models of learning, training and progress evaluation of medical students. Clin Chim Acta. 2008;393(1):13–16. doi:10.1016/j.cca.2008.03.015

9. Medina M. Does competency-based education have a role in academic pharmacy in the United States? Pharmacy. 2017;5(1):13–18. doi:10.3390/pharmacy5010013

10. Van der veken J, Valcke M, De Maeseneer J, Schuwirth L, Derese A. Impact on knowledge acquisition of the transition from a conventional to an integrated contextual medical curriculum. Med Educ. 2009;43(7):704–713. doi:10.1111/j.1365-2923.2009.03397.x

11. Arnold L, Willoughby TL. The quarterly profile examination. Acad Med. 1990;65(8):515–516. doi:10.1097/00001888-199008000-00005

12. Van der Vleuten CPM, Verwijnen GM, Wijnen WHFW. Fifteen years of experience with progress testing in a problem-based learning curriculum. Med Teacher. 1996;18(2):103–109. doi:10.3109/01421599609034142

13. Al-Alwan I, Al-Moamary M, Al-Attas N, et al. The progress test as a diagnostic tool for a new PBL curriculum. Educ Health. 2011;24(3):493–502.

14. Keijsers C, Brouwers J, de Wildt D, et al. A comparison of medical and pharmacy students’ knowledge and skills of pharmacology and pharmacotherapy. Br J Clin Pharmacol. 2014;78(4):781–788. doi:10.1111/bcp.12396

15. Supapaan T, Low BY, Wongpoowarak P, Moolasarn S, Anderson C. A transition from BPharm to the PharmD degree in five selected countries. Pharm Pract (Granada). 2019;16(3):1611. doi:10.18549/PharmPract.2019.3.1611

16. Schuwirth L, van der Vleuten C. The use of progress testing. Perspect Med Educ. 2012;1(1):24–30. doi:10.1007/s40037-012-0007-2

17. Plaza CM. Progress examinations in pharmacy education. Am J Pharm Educ. 2007;71(4):Article 66. doi:10.5688/aj710466
