Journal of Multidisciplinary Healthcare, Volume 14

Relationship Between Research Culture and Research Activity of Medical Doctors: A Survey and Audit


Received 8 May 2021

Accepted for publication 21 July 2021

Published 10 August 2021 Volume 2021:14 Pages 2137–2150



Editor who approved publication: Dr Scott Fraser

Caitlin Brandenburg,1,2 Christy Noble,1,3,4 Rachel Wenke,1,5 Ian Hughes,1 Anthony Barrett,1 Jeremy Wellwood,1 Sharon Mickan1,2

1Clinical Governance, Education and Research, Gold Coast Hospital and Health Service, Southport, QLD, Australia; 2Faculty of Health Sciences and Medicine, Bond University, Gold Coast, QLD, Australia; 3Faculty of Medicine, University of Queensland, St Lucia, QLD, Australia; 4School of Medicine, Griffith University, Southport, QLD, Australia; 5School of Allied Health Sciences, Griffith University, Southport, QLD, Australia

Correspondence: Caitlin Brandenburg Email [email protected]

Purpose: To describe the research capacity and culture, and research activity (publications and new projects) of medical doctors across a health service and determine if the research activity of specialty groups correlated with their self-reported “team” level research capacity and culture.
Methods: Cross-sectional, observational survey and audit of medical doctors at a tertiary health service in Queensland. The validated Research Capacity and Culture (RCC) survey was used to measure self-reported research capacity/culture at organisation, team and individual levels, and the presence of barriers and facilitators to research. An audit of publications and ethically approved research projects was used to determine research activity.
Results: Approximately 10% of medical doctors completed the survey (n=124). Overall, median scores on the RCC were 5 out of 10 for organisational level, 5.5 for specialty level, and 6 for individual level capacity and culture; however, specialty-level scores varied significantly between specialty groups (range 3.1–7.8). Over 80% of participants reported lack of time and other work roles taking priority as barriers to research. One project was commenced per year for every 12.5 doctors employed in the health service, and one article was published for every 7.5. There was a positive association between a team’s number of publications and projects and their self-reported research capacity and culture on the RCC. This association was stronger for publications.
Conclusion: Health service research capacity building interventions may need a tailored approach for different specialty teams to accommodate varying baselines of capacity and activity. When evaluating these initiatives, a combination of research activity and subjective self-report measures may be complementary.

Keywords: research culture, research activity, health service, hospital, medical, doctor

Introduction

Clinicians play a key role in shaping research agendas, generating research questions, and conducting research that enhances rapid translation of findings.1 Studies have found that health services whose clinicians conduct more research tend to have lower mortality rates, greater organizational efficiency, better staff retention and higher patient and staff satisfaction.2–4 Additionally, a key recommendation to improve the estimated 85% avoidable waste in health research is the increased involvement of health service-embedded clinicians in driving research agendas.1,5

In Australia, building research capacity of clinicians embedded within healthcare services has been identified as a key priority for health research,6 and most recently as one of the 12 priorities for the $20 billion Medical Research Future Fund.7 However, with increasing pressure on public health systems to meet activity targets and provide services to a growing and aging population, finding the time, money and resources to conduct research is challenging. Medical doctors are the second-largest health profession in Australia,8 and have an important role in many types of health research from bench to bedside. However, only 7% of the medical workforce report active involvement in research, and it has been argued this is decreasing.9

A key facilitator of building research capacity is an in-depth understanding of context-specific barriers and facilitators, research culture, and the levels of and drivers for current research activity. In health services in particular, arguments have been made that traditional research output measures, such as publications, may need to be replaced or supplemented by process measures.10 However, there have been very few studies comparing different forms of measurement in this context.

In Australia, the Research Capacity and Culture (RCC) survey has frequently been used to understand the current level of research culture and engagement of a profession, most commonly in Allied Health,11–13 but also in medical professions.14–16 Studies using this tool have found differences in research culture between teams, and findings have been used to inform tailored development of research capacity building strategies.11,13 One study in Allied Health also found that there was no association between a team’s RCC score and their research activity; however, publication and project outputs were low (0–4 publications and 0–8 new projects per team), which limited interpretation.13

In line with the increasing focus on improving health professional research capacity, there is a need to robustly measure and evaluate outcomes of research capacity building initiatives. There is currently little understanding of the relationship between validated measures of research culture and capacity, and traditional research activity and output measures.

Thus, this study aimed to:

  • describe the research capacity and culture of medical doctors across a health service using the Research Capacity and Culture tool
  • describe the research activity, in terms of publications and projects, of medical doctors across a health service
  • determine whether “team” level research capacity and culture on the Research Capacity and Culture tool is significantly different between specialty groups, and whether this score is associated with actual research activity

Materials and Methods

This cross-sectional, observational study collected data using a survey and audit. Ethics approval was obtained from the Gold Coast Hospital and Health Service Human Research Ethics Committee (HREC/10/QGC/177).

Setting

Gold Coast Health (GCH) is a publicly funded tertiary health service located in South-East Queensland, Australia. The service includes two hospital facilities of 750 and 403 beds respectively, as well as outpatient and community-based services. Research is not currently routinely included in medical role descriptions, and engagement in research is variable across the organisation. GCH’s 2019–2022 research strategy has a focus on growing clinician research capacity and establishing a sustainable research culture.17 Strategic incentives and support for research include an annual grant scheme; access to small-scale grants for conference presentations and open access journal fees; an annual research week; a centralised Clinical Trials Unit; and a Research Council and Research Subcommittee of the Board to oversee implementation of the research strategy.17 Considerable work has also been done to increase Allied Health research engagement in the organisation,18 with the RCC tool used across multiple years to measure improvements.13,19 This current study aims to build on this success in the medical stream.

Participants

Convenience sampling was used for the RCC survey. All medical doctors employed by GCH were invited to participate. “Medical doctors” included all practitioners in roles requiring a medical degree (MBBS, MD or equivalent), including physicians, surgeons, anesthetists, radiologists and others. Data for the audit were gathered from institutional records on publications and projects.

Survey Tool

The Research Capacity and Culture (RCC) tool is a validated questionnaire which measures indicators of research capacity and culture across three levels – organization, team and individual.8 The tool includes 52 questions (items) on self-reported success or skill in research: 18 at the organisational level, 19 at the team level and 15 at the individual level. Each item is scored on a scale of 1–10 (10 being the highest possible level of skill or success), with an additional “unsure” option. The RCC also asks respondents to select applicable barriers and motivators to participating in research from a list of 18 of each, and prompts them to add additional barriers/motivators if desired. Lastly, the tool includes a standard set of demographic questions.

Small modifications were made to the tool to better fit the target population, and additional demographic questions were included (eg participant’s facility within GCH). For this study, participants were asked to reflect on their Medical College specialty (eg cardiology, general surgery) when answering questions about their “team”. This clarification was made because participants could otherwise have interpreted “team” to mean their multispecialty or multidisciplinary team. The tool was administered via a secure online platform (SurveyMonkey®).

Data Collection

Active promotion and recruitment for the survey occurred in two stages: the first in January and February 2019, followed by a break whilst the organization transitioned to an integrated electronic medical record, then a second promotion in July and August 2019. Potential participants were provided with an electronic participant information sheet and gave voluntary informed consent prior to participation. The survey took respondents approximately 15 minutes to complete.

Two key audits were performed to collect data on the publications and new projects of doctors in the health service for the calendar years 2018 and 2019. Data was collected for both years because, given the distributed timing of survey recruitment, both years of data were considered relevant to doctors’ RCC responses. For the purposes of the audit, new projects were defined as projects that obtained health service governance approval to proceed within that year. Health service-maintained research governance approval and publication databases were used to identify projects and publications involving medical doctors from the health service. Where necessary, original articles were accessed, or authors/investigators were contacted to clarify information. Information on the number of doctors and FTE status was extracted from internal institutional records.

Data Analysis

Quantitative survey data analysis was performed using Stata 15 (StataCorp, College Station, TX, USA). Survey and audit data were analysed descriptively, using frequencies, percentages, medians and interquartile ranges. “Unsure” responses on the RCC were not included in the analyses, but the percentage of “unsure” responses is presented for each item. A one-way analysis of variance (ANOVA) with post hoc Scheffé tests was used to determine whether there were differences between specialty groups in the mean of all RCC item scores at the “team” level. Linear regression was used to investigate the relationships between these means and the publications and projects per full-time equivalent (FTE) for each specialty group. Analysis of open-ended survey responses was conducted using inductive qualitative content analysis,20 in which core meanings were derived from the text and grouped into themes. This analysis was completed by a single researcher with experience in health-related qualitative research (CB). Themes were discussed with a subset of the team with qualitative expertise (CB, CN, SM) to reach consensus.
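The between-group comparison described above can be sketched as follows. This is an illustrative example only: the study itself used Stata, and the group scores below are invented for demonstration. The sketch computes the one-way ANOVA F statistic by hand for hypothetical per-respondent mean "team" level RCC scores from three specialty groups.

```python
# Illustrative sketch (hypothetical data) of a one-way ANOVA, mirroring the
# between-group comparison of mean "team" level RCC scores. The study used
# Stata; the scores here are invented and carry no relation to the results.

def one_way_anova_f(groups):
    """Return (F, df_between, df_within) for a list of lists of scores."""
    all_scores = [x for g in groups for x in g]
    n_total = len(all_scores)
    k = len(groups)
    grand_mean = sum(all_scores) / n_total

    # Between-group sum of squares: spread of group means around the grand mean
    ss_between = sum(
        len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups
    )
    # Within-group sum of squares: spread of scores around their own group mean
    ss_within = sum(
        sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups
    )

    df_between, df_within = k - 1, n_total - k
    f_stat = (ss_between / df_between) / (ss_within / df_within)
    return f_stat, df_between, df_within

# Hypothetical mean RCC "team" scores for respondents in three specialty groups
groups = [
    [7.5, 8.0, 7.9, 7.8],  # high-scoring group
    [3.0, 3.4, 2.9, 3.2],  # low-scoring group
    [5.1, 5.5, 4.8, 5.4],  # mid-scoring group
]
f_stat, df_b, df_w = one_way_anova_f(groups)
print(f"F({df_b}, {df_w}) = {f_stat:.1f}")
```

A large F relative to its degrees of freedom indicates that between-group differences in mean score are large compared with within-group variability, which is what motivates the post hoc pairwise comparisons reported in the Results.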

Results

Survey: Quantitative Results

In total, 225 participants consented to complete the survey. Of these, 96 provided incomplete responses and 129 completed the entire survey. Five of the complete responses were excluded: four were not from doctors (allied health/nursing/midwifery) and one was a duplicate response. In total, data from 124 survey responses were available for analysis, representing 10.1% of the health service’s estimated total medical workforce.

Participant characteristics are shown in Table 1. The sample had almost equal numbers of males (49.2%) and females (46.8%). Consultants were overrepresented in the sample, making up 72.5% of respondents, whereas they make up slightly less than half of doctors in the health service. Conversely, registrars (12.1% of respondents) and junior doctors (12.9%) were under-represented. Research was part of the role description of 34.7% of respondents, and was not for 42.7%, while almost a quarter (22.6%) of respondents were unsure.

Table 1 Demographic Information and Professional Qualifications

The median score for research capacity and culture on the RCC was 5 at the organization level, 5.5 at the team level, and 6 at the individual level. Higher scores on the 10-point scale indicate a more positive perception of research culture and capacity. There was a high rate of “unsure” responses for the organization (18.6% of all item responses) and team (14.7%) levels, and a low rate for individual level (2.8%). “Unsure” responses relating to external funding, applications for scholarships/degrees, mechanisms for monitoring research quality, consumer involvement, and research software made up over a fifth of responses at both the organisational and team level. At the organisational level only, ensuring the availability of career pathways and having a policy/plan for research development also resulted in over 20% “unsure” responses.

Median scores for each item at the organizational level ranged between 3 and 7 (Table 2). Key strengths of the organisation’s research culture were “promotes clinical practice based on evidence” (median=7), “engages external partners (eg universities) in research” (6), “supports the peer-reviewed publication of research” (6) and “has regular forums to present research findings” (6). Lowest success was reported for “ensures staff career pathways are available in research” (3) and “has funds, equipment or admin to support research” (4).

Table 2 Median Score for Organisation Level RCC Items, Arranged in Descending Order

Median scores for items at the team level ranged from 3 to 7 (Table 3). Key strengths of team-level research culture reflected the organizational level, including “conducts research activities which are relevant to practice” (7), “supports peer-reviewed publication of research” (7) and “supports a multidisciplinary approach to research” (7). Lowest success was reported for “has funds, equipment or admin to support research” (3) and “has incentives and support for research mentoring activities” (3).

Table 3 Median Scores for Team Level RCC Items, Arranged in Descending Order

Median scores for items at the individual level ranged from 3.5 to 7 (Table 4). Strengths in individual research success were related to evidence-based practice (EBP) skills, including “finding relevant literature” (7), “integrating research findings into practice” (7) and “critically reviewing the literature” (7). The lowest rated item was “securing research funding” (3.5).

Table 4 Median Scores for Individual Level RCC Items, Arranged in Descending Order

Table 5 shows the most commonly reported barriers and motivators by percent of total survey respondents. “Lack of time for doing research” and “Other work roles take priority” were the most commonly reported barriers, identified by 83.9% and 82.3% of respondents, respectively. In decreasing order of frequency, other common barriers were the “lack of funds for research” (57.3%), “desire for work/life balance” (45.2%), “lack of a coordinated approach to research” (45.2%), and “lack of skills for research” (45.2%). Only 4% of respondents reported “not interested in research” as a barrier. The most commonly reported motivators were “to develop skills” (70.2%), followed closely by “increased job satisfaction” (66.9%). Other common motivators included “problem identified that needs changing” (58.9%), “career advancement” (58.1%) and “to keep the brain stimulated” (57.3%).

Table 5 Reported Frequency of Personal Barriers and Motivators to Conducting Research, Arranged in Descending Order

Survey: Qualitative Results

Twenty-nine percent (n=36) of respondents provided free text responses to the question “Do you have any final comments or suggestions about the survey or research in general?”. Responses averaged 34 words in length. The four main themes are summarised in Table 6. The most common theme (n=21) was expansion on barriers to doing research, which was mostly reflective of the quantitative results. A substantial portion (n=16) described the tension for change,21 emphasising a need for improvement in research engagement, positing that the health service was behind others in this respect, and outlining frustrations with a perceived lack of focus on and support for research in the service. Others (n=11) suggested potential strategies for improving the research culture at GCH, including dedicated time for research, research support staff, making it easier to link in with potential projects, and improved research culture and planning in the health service. A small number of respondents (n=4) outlined their perceptions of the benefits of increased research engagement as justification for the need for change.

Table 6 Codes from Qualitative Analysis of Final Free Text Question

Project Audit

There were 266 research projects that received health service governance approval from January 2018 to December 2019. Of these, 74.1% (n=197) had a medical doctor from the health service as an investigator. Data was not collected on investigators who were not affiliated with the health service. For every 12.5 doctors employed in the health service, one project was commenced each year.

Table 7 displays further details on these 197 projects. The health service was the only site in 50.8% of the projects, the lead site of a multisite study in 6.6%, and a non-lead site on a multisite study in 42.6%. Most (80.2%) projects involved doctors from a single specialty in the health service, while 19.8% involved doctors from multiple specialties (eg cardiology and rheumatology). A third of projects involved collaboration with other professional streams, such as nursing and Allied Health.

Table 7 Characteristics of Publications and Projects from Medical Doctors in 2018 and 2019

Publication Audit

There were 479 total publications that included health service staff between January 2018 and December 2019. More than two thirds (68.3%; n=327) of these publications had a medical doctor from the health service as an author (data was not collected about authors not employed in the health service). For every 7.5 doctors employed in the health service, one article was published each year.

Table 7 displays further details on these 327 publications. There were around 160 unique authors each year (approximately 13% of GCH doctors), and just over a quarter of these published in both years. In 2018, 72.6% published a single paper, 13.7% published two papers, and only 1.2% published over ten papers. These results were similar in 2019.

A medical doctor from the health service was first or last author in 53.2% of the publications, and 13.8% involved authors from another professional stream in the health service, such as nursing or Allied Health. The most common publication type was primary research (59.9%), followed by other article types like letters or opinions (20.2%) and case reports (12.8%).

Relationship Between Survey Results and Research Activity for Specialty Groups

“Team” level scores on the RCC were summarised into 7 broad specialty groups (Table 8), which were (in no particular order): emergency medicine; surgery and intensive care; anaesthetics; psychiatry; paediatrics, obstetrics and gynaecology; and 2 physician groups split according to the division of the health service in which the specialty was placed. One response could not be categorised into a specialty group. Specialty groups were anonymised as Groups 1–7 in accordance with this study’s ethical approval.

Table 8 Mean Scores for Team Level RCC Items and Publication and Projects per FTE, Separated by Broad Specialty Group, Arranged in Descending Order

Differences were identified in the mean total-item RCC scores at the “team” level between specialty groups (P<0.0001). Group 5 had the highest mean score (7.8) which was greater than Group 6 (3.1, P<0.0001), Group 2 (4.0, P<0.0001), Group 3 (4.6, P=0.010), and Group 4 (5.2, P=0.036). Group 6 had the lowest mean score (3.1) and was lower than Group 7 (6.0, P=0.003) and Group 1 (6.4, P=0.006) as well as Group 5.

Linear regression showed a relationship between mean “team” level RCC score and both projects (Figure 1) and publications (Figure 2) per FTE. Each additional project or publication per FTE corresponded to an increase of 5.3 (95% CI 2.0–8.6; P=0.002) or 6.6 (95% CI 4.8–8.5; P<0.0001) in the mean “team” level RCC score, respectively. It was estimated that of the total variance in “team” level RCC score, 29.95% was associated with variance in publications per FTE and 7.91% in projects per FTE.
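The mechanics of this regression can be illustrated with an ordinary least squares fit computed by hand. The sketch below is hypothetical: the study used Stata, and the seven (x, y) pairs are invented to show how a slope and the share of variance explained (R²) are obtained; they do not reproduce the estimates reported above.

```python
# Illustrative sketch (hypothetical numbers) of simple ordinary least squares,
# as used to relate a team's publications per FTE (x) to its mean "team" level
# RCC score (y). The data points below are invented for demonstration only.

def simple_ols(xs, ys):
    """Return (slope, intercept, r_squared) for the model y ~ a + b*x."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sxx = sum((x - mean_x) ** 2 for x in xs)
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    # R^2: proportion of variance in y associated with variance in x
    ss_res = sum((y - (intercept + slope * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - mean_y) ** 2 for y in ys)
    r_squared = 1 - ss_res / ss_tot
    return slope, intercept, r_squared

pubs_per_fte = [0.05, 0.10, 0.15, 0.20, 0.30, 0.40, 0.55]  # hypothetical
mean_rcc = [3.1, 4.0, 4.6, 5.2, 6.0, 6.4, 7.8]             # hypothetical
slope, intercept, r2 = simple_ols(pubs_per_fte, mean_rcc)
print(f"slope = {slope:.2f} RCC points per publication/FTE, R^2 = {r2:.2f}")
```

The slope is read the same way as the reported estimates: the expected change in a team's mean RCC score per additional publication (or project) per FTE, while R² corresponds to the percentages of "team" level score variance quoted above.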

Figure 1 Relationship between Projects per FTE of teams and the teams’ mean RCC scores.

Figure 2 Relationship between Publications per FTE of teams and teams’ mean RCC scores.

Discussion

The research capacity and culture of medical doctors at individual, team and organisational levels was moderate, with medians of 6, 5.5, and 5, respectively. This level of research culture and capacity, and a pattern of team and individual RCC scores being higher than organisational-level scores, is reflected in similar studies including medical doctors in other Australian health services.15,16

The pattern and spread of results between the highest- and lowest-rated items found in this study is also broadly reflected in the results of these other studies.14,15 This includes the general trend of Evidence Based Practice-related items scoring more positively than pure research-related items, and items relating to funding, career pathways and incentives for research activity consistently scoring the lowest.22 This reflects progress towards the commonly cited goal that all clinicians should understand/use research, while fewer will participate in or lead it.23 Lack of time and other work roles taking priority as the most common barriers to undertaking research are also consistent across the literature for clinicians within health services.12–15

A 2017 study using the RCC with Allied Health professionals in the same health service returned scores consistently 1–3 points higher for the organisation level, with an overall median of 7.19 Reasons for this are unclear, as it is unlikely the organisation’s research culture has changed so significantly in that time. The Allied Health clinicians also scored their team-level success slightly higher, with a median of 6, and their individual capability lower, with a median of 5.19 Notably, both studies also found a high level of “unsure” responses, especially at the organization level, implying that the health service could improve communication and promotion of institutional research supports and initiatives.19

Consistent with the Allied Health study, differences were identified between specialty groups at the team level, with mean scores varying from 3.1 to 7.8. Other research has also shown that different teams within the same health service may have different barriers, motivations and levels of research capacity and current activity.11,13,19,24,25 Literature on research capacity building in health settings has found that team-based approaches are likely to be most effective, as they allow strategies to be tailored to specific needs.11,24–26 Tools like the RCC are a useful way for health services to capture their teams’ current research engagement and needs.

This study found that medical doctors were the largest producers of research projects and publications within the health service. Over a third of projects involved collaborations with other professions like Allied Health and nursing, reflecting the importance of the multidisciplinary team in modern models of care.27 Due to differences in how studies measure and report research activity,28 it is difficult to compare research activity with other health services.29,30 Available international literature on research outputs from medical clinicians usually focuses on specific specialties, often further limited to registrars and academic doctors, rather than whole-of-health-service measurement.16,31–35 However, one self-report survey found that for every 12.8 Australian physiotherapists employed in tertiary facilities, one article was published per year, compared with one for every 7.5 doctors in this study.36

We identified an association between a specialty group’s research activity and their self-reported, subjective research culture and capacity. Publications demonstrated a stronger association than projects, likely because the two highest scoring groups on the RCC (1 and 5) had relatively few projects. Further investigation showed that one of these groups had a high proportion of multiphase, multisite and complex projects. This indicates that the number of projects may be a poorer indicator than the number of publications, as a simple project like a retrospective chart audit is counted equivalently to a multisite, multiphase interventional trial. The same can also be true of publications; however, complex projects will likely result in multiple publications, which helps offset this effect.

Previous research has called into question the utility of research activity and output measures, such as publications, for measuring clinician research capacity. As producing research is not the core role of clinicians, an improvement in research culture is likely to have a significant time lag before a measurable increase in outputs is realized.10,12,37,38 Some authors have argued that self-report measures of research capacity and culture should be combined with traditional research activity measures.12,13 However, this is the first study to identify that the two types of measure are associated, adding further weight to this argument.

Limitations

Limitations to the study were the low (10.1%) response rate for the survey and the fact that the sample was not random. Other studies using the RCC have achieved both lower14,15 and higher12,13,16 response rates. Selection bias may also affect results, as it is likely that those interested in research were more likely to respond to the survey. This may also explain the overrepresentation of consultants compared to junior doctors. A poor response rate from junior doctors has been found in other surveys of research culture.24 Due to this, the results may be more likely to reflect the opinions of senior doctors, and should be interpreted with caution for the junior doctor and registrar populations.

Future Directions

Locally, results from this study were fed back to key strategic groups at Gold Coast Health in February 2020; however, the COVID-19 pandemic, staffing changes, and an organisational restructure have slowed the translation of this work. Nevertheless, funding has been obtained for a Knowledge Translation study to identify and implement evidence-based strategies for increasing medical engagement in research. The findings of this current study are being used to inform this work, provide a baseline measure, and help localise and tailor potential strategies.

In terms of wider implications, the findings of this work may help inform the approach to research capacity building in similar settings. This study has demonstrated that research capacity and culture vary widely between teams within an organisation, and comparison with other studies shows that it varies between organisations. While this means that the results should not be directly applied to other settings, there are common patterns, and this work adds to the literature on the research capacity and culture of medical professionals15,16 to provide a set of results for comparison. More importantly, however, there are few studies reporting data on health service-wide research activity, in contrast to the plethora of research describing the research outputs of universities.39 As health services and clinicians are increasingly called upon to be producers, not just consumers, of research,3,4 systematic study of their research activity is essential.

The link between RCC results and actual research activity of teams should also be further explored, especially in longitudinal research. It would be valuable to determine whether there is delay between improvements in a team’s scores on the RCC survey, and translation into increased research activity in the form of new projects. This would serve to delineate whether a self-report measure like the RCC can reflect improvements in research capacity and culture more rapidly, thus being a more sensitive measure to change than traditional research output measures.

Conclusion

This study reinforces the most significant challenge faced in supporting clinical research within a tertiary health context governed by activity-based funding allocations: providing medical staff time to engage in research activities. New approaches to address this in an increasingly constrained fiscal environment are needed. This study also demonstrated significant differences between teams’ reported research activity and culture, indicating that research capacity building initiatives may need to be tailored to specialty groups. Objective activity measures, particularly publications, were shown to be associated with a team’s self-rated research capacity and culture. A combination of subjective process and objective activity outcome measures may therefore be complementary when measuring the impact of research capacity building initiatives. The results of the RCC in this study are intended to identify areas for improvement and provide a baseline for multi-faceted and tailored research capacity building programs in the health service.

Acknowledgments

We would like to sincerely thank all the Gold Coast Health staff who participated in or helped to disseminate this research.

Author Contributions

All authors made significant contribution to the work reported, either in the conception, study design, execution, acquisition of data, analysis and interpretation, or in all these areas; took part in drafting, revising or critically reviewing the article; have agreed on the journal to which the article has been submitted; and agree to be accountable for all aspects of the work.

Funding

This research did not receive any specific grant from funding agencies in the public, commercial, or not-for-profit sectors. Costs associated with open access publication have been funded by the Gold Coast Health Study, Education and Research Trust Account.

Disclosure

The authors declare no conflicts of interest for this work and that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

References

