Published on 27.02.15 in Vol 4, No 1 (2015): Jan-Mar


    Original Paper

    Competency-Based Assessment for Clinical Supervisors: Design-Based Research on a Web-Delivered Program

    1School of Public Health and Nutrition, Bruce ACT, Australia

    2Griffith Health Institute, Griffith University, Gold Coast, Australia

    3School of Public Health and Nutrition, Faculty of Health, University of Canberra, Bruce ACT, Australia

    Corresponding Author:

    Rachel Bacon, BSc, MSc (Nutr & Diet)

    School of Public Health and Nutrition

    Faculty of Health

    University of Canberra

    Bruce ACT, 2601

    Australia

    Phone: 61 (0) 2 6201 5274

    Fax: 61 (0) 2 6201 5727

    Email:


    ABSTRACT

    Background: Clinicians need to be supported by universities to use credible and defensible assessment practices during student placements. Web-based delivery of clinical education in student assessment offers professional development regardless of the geographical location of placement sites.

    Objective: This paper explores the potential for a video-based constructivist Web-based program to support site supervisors in their assessments of student dietitians during clinical placements.

    Methods: This project was undertaken as design-based research in two stages. Stage 1 describes the research consultation, development of the prototype, and formative feedback. In Stage 2, the program was pilot-tested and evaluated by a purposeful sample of nine clinical supervisors. Data generated as a result of user participation during the pilot test are reported. Users’ experiences with the program were also explored via interviews (six in a focus group and three individually). The interviews were transcribed verbatim and thematic analysis was conducted from a pedagogical perspective using van Manen’s highlighting approach.

    Results: This research succeeded in developing a Web-based program, “Feed our Future”, that increased supervisors’ confidence with their competency-based assessments of students on clinical placements. Three pedagogical themes emerged: constructivist design supports transformative Web-based learning; videos make abstract concepts tangible; and accessibility, usability, and pedagogy are interdependent.

    Conclusions: Web-based programs, such as Feed our Future, offer a viable means for universities to support clinical supervisors in their assessment practices during clinical placements. A design-based research approach offers a practical process for such Web-based tool development, highlighting pedagogical barriers for planning purposes.

    JMIR Res Protoc 2015;4(1):e26

    doi:10.2196/resprot.3893

    KEYWORDS



    Introduction

    Support for Supervisors to Assess Clinical Competence

    Within the dietetics profession, students are required to complete 20 weeks of placement, with half of that time spent in developing and demonstrating competence in individual case management [1]. The assessment of the clinical competence of student dietitians is a shared responsibility between the university and the health sector [1], with the assessments made by site supervisors during clinical placements providing a key source of evidence of student competence [2]. The difficulties faced by site supervisors in assessing student performances during clinical placements are clearly reported in the literature [3,4]. Clinicians therefore need to be supported by universities to use credible and defensible assessment practices [5]; however, the geographical distribution of placement sites prohibits face-to-face education of all supervisors.

    Web-Based Delivery

    Web-based delivery of education to support clinical supervisors has been successfully used by the professions of medicine, nursing, and physiotherapy [6-9]. The Web-based mode transcends geographical and time constraints [10] and may be more accessible to clinicians, particularly those in rural or community-based settings who may be sole practitioners within a multidisciplinary team [11]. Web-based delivery provides an efficient means to share resources and avoid duplication [12]. Professional development delivered via the Web has been shown to achieve equivalent outcomes (satisfaction, knowledge retention, and change in practice) when compared to face-to-face delivery [13,14].

    Pedagogy

    When developing a Web-based learning program, both the discipline-specific content and the learning process need to be considered. Constructivist pedagogy, in which learners construct their own meaning by forming connections through collaboration and reflection between their prior knowledge and new experiences (authentic real-world problems), has been recommended for Web-based delivery [15,16]. Collaboration can be supported within a virtual community using a central online discussion forum [17]. This learner-centered approach to Web-based education allows participants to be independent self-paced learners and to select learning content in a way that meets their learning style [16,17]. Rowe and Rafferty [18] have demonstrated improved user engagement with Web-based learning by self-regulated learning strategies such as activation of prior knowledge, self-monitoring, and reflections. There is evidence to suggest video-based learning material may improve learner engagement [9,19-21]. Effective Web-based delivery must also consider the usability and accessibility of the program [22,23].

    Objective

    Programs to educate supervisors in the use of more credible and defensible assessment practices are currently non-existent. This paper explores the potential for a Web- and video-based constructivist tool to support clinical supervisors to use credible and defensible assessment practices during clinical placements. The program aims to use authentic video-based learning material and metacognitive activities such as self-monitoring and reflection to support clinical supervisors to transform their assessment practices. The study also considers the interdependence between pedagogy, usability, and accessibility.


    Methods

    Design-Based Research

    The Web-based program “Feed our Future” was developed using a design-based research approach adapted from Wang and Hannafin [24]. This approach has been used in the design of technology-enhanced learning environments for the way in which it advances design, research, and practice concurrently [25]. Design-based research addresses a practical problem in context, is informed by theory, and is refined through an iterative process of formative feedback and reflection in consultation with participants [25]. In the final stage of product development in design-based research, the intervention is pilot-tested and evaluated. This stage is then used to inform final revisions of the program [24].

    Stage 1: Program Development

    In October 2012, research consultation and initial development of the program commenced concurrently.

    Research Consultation

    Research presented in the publication, “Credible and defensible assessment of entry-level clinical competence: Insights from a modified Delphi study” [26], informed the development of the professional content of the program. This research was conducted with a panel of experienced clinical supervisors (potential end-users) and explored the issues of judgment and subjectivity in the assessment of health professional competence. The paper includes a focused literature review on credible and defensible competency-based assessment practices including the need for a shared definition of competence [27], clearly defined standards [28], a global approach to assessment [29], consideration of the learning context [30,31], multiple sources of evidence [32], and the need for an interpretive community of assessors [33].

    Development of the Prototype

    An interview with Professor Sue Ash, a member of the original taskforce that developed the dietetic competency standards in 1994 [34] who also participated in their subsequent reviews [35,36], was recorded as expert opinion. This recording was incorporated into the program to provide clarity on the definition and application of the competency standards within the dietetics profession.

    Evidence suggests that resources to support assessments such as visual representation of entry-level performance may increase the consistency of supervisor assessments [37]. As an outcome from the research consultation, 11 video recordings of authentic dietetic student-client consultations were produced for the program (mean duration 60 minutes; residential aged care and outpatient settings), with corresponding assessments of each student’s performance made by the panel of experienced clinical supervisors [26].

    Information technology (IT) expertise was sought from an academic to select an appropriate delivery platform. Consideration was given to budget and timeline, security, usability, incorporation of different file types (particularly video recordings), and the capacity to provide feedback to participants on their learning.

    The first prototype of Feed our Future was completed within 8 months using the website builder WIX as the delivery platform. The planned learning outcomes for the program were for supervisors (1) to feel more confident in their approach to assessment, and (2) to use credible and defensible competency-based assessment practices. The program comprised four learning modules, each including questions to consider, problem-based learning and self-monitoring activities, key concepts, and suggested readings. A pre-program quiz, a post-program quiz, a discussion forum, and a practice capstone module were included.

    Formative Feedback

    Feedback on Feed our Future was obtained from several sources during program development. An advisory group comprising industry, academic, student, consumer, and regulatory representatives provided direction on the research and the development of the Web-based program. Potential end users trialed the prototype and provided feedback via a market stall/booth established at the Annual National Conference of the Dietitians Association of Australia (DAA) in May 2013. The DAA’s Board of Directors also reviewed the program.

    Stage 2: Pilot Test and Evaluation

    Participants

    A purposeful sample of nine dietitians located in a variety of health care sites and involved in the University of Canberra’s clinical placement program was invited, via email, to participate in this study. These potential end-users were provided with access to the password-protected website and asked to pilot-test the program over a 4-week period. The Human Research Ethics Committee (HREC) of the University of Canberra approved the study protocol (12-209) that conformed to the provisions of the Declaration of Helsinki.

    Data Generated From Feed Our Future

    Data generated as a result of user participation during the pilot test, including participation rates and outcomes from the pre-test, discussion forum, multiple choice quiz, and post-test, were reviewed. In the pre-test and post-test, participants were asked to (1) rate their level of confidence with assessing a student’s competence during his/her clinical placement using a 10-point scale (1=not at all confident; 10=extremely confident), (2) rate a student’s performance as observed from a video recording (the method of assessment is described elsewhere [26]), and (3) provide a qualitative description of how they would ensure their assessment of a student’s competence during his/her clinical placement was credible and defensible. Content analysis, informed by the focused literature on credible and defensible assessment practices described in the Research Consultation section [26], was used to analyze the qualitative responses from the discussion forum.
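    As a rough illustration of how the 10-point confidence ratings from the pre-test and post-test can be summarized, the sketch below computes the number of respondents, the mean, and the range for a set of ratings. The ratings shown are invented for illustration and are not the pilot data.

```python
from statistics import mean

# Hypothetical confidence ratings on the 10-point scale
# (1=not at all confident; 10=extremely confident), one per user.
# These values are illustrative only, not the pilot data.
pre_test_ratings = [2, 4, 5, 6, 6, 7, 8, 9]

# Summary statistics of the kind reported in the Results section.
summary = {
    "n": len(pre_test_ratings),
    "mean": mean(pre_test_ratings),
    "range": (min(pre_test_ratings), max(pre_test_ratings)),
}
print(summary)
```

    The same summary would be computed for the post-test ratings, allowing a before-and-after comparison of supervisor confidence.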

    Qualitative Evaluation

    User experiences during the pilot test were explored using an interpretivist qualitative approach. During the pilot test, users were invited to reflect on a series of questions to be discussed at a later interview. Interviews were held in a focus group for those who could attend (n=6) and in the format of individual interviews, via telephone, for the remainder (n=3). Focus groups were chosen to make use of group dynamics to stimulate discussion in a secure environment [38]. The focus group and individual interviews were facilitated by the primary researcher and began with a scripted introduction outlining the research and ethical considerations. Users provided informed signed written consent that included permission for their interview to be audio-recorded. In the focus group session, a research assistant was employed as a scribe.

    The interview questions were developed by the first author in consultation with LW and MJ and covered (1) the overall experience of using the Feed Our Future program, (2) what they learned, (3) whether and in what way their thinking had been challenged, (4) whether it had prompted them to change the way they assessed students on their clinical placement, and (5) suggestions to improve the program. Users were also asked to describe their workplace and experience with supervising and assessing students up to the time of viewing the program.

    Interviews were audio-recorded, transcribed verbatim by two researchers, and cross-checked for accuracy to maintain the integrity of user responses. Transcripts were analyzed independently by the primary researcher and one research assistant, with themes highlighted using van Manen’s highlighting approach to thematic analysis [39]. Pedagogical themes arising from the focus group and individual interviews were compared and found to be similar enough to pool. Exemplar quotations illustrating each theme were identified.
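    The pooling of independently coded transcripts described above can be sketched, in highly simplified form, as a comparison of two analysts’ theme assignments: matching assignments are pooled and disagreements are flagged for discussion. The excerpt identifiers and theme labels below are invented for illustration and do not correspond to the study data.

```python
# Hypothetical theme assignments from two independent analysts;
# excerpt IDs and theme labels are invented for illustration.
coder_a = {"excerpt1": "constructivist design",
           "excerpt2": "videos",
           "excerpt3": "usability"}
coder_b = {"excerpt1": "constructivist design",
           "excerpt2": "videos",
           "excerpt3": "accessibility"}

# Assignments both analysts agree on are pooled directly.
pooled = {k: v for k, v in coder_a.items() if coder_b.get(k) == v}

# Disagreements are flagged for joint review before themes are finalized.
flagged = sorted(k for k in coder_a if coder_b.get(k) != coder_a[k])

agreement = len(pooled) / len(coder_a)
```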


    Results

    Stage 1: Program Development

    Formative Feedback

    Table 1 presents the formative feedback generated from the advisory committee, dietitians at the DAA National Conference, and the DAA Board of Directors.

    Table 1. Formative feedback and subsequent refinement to the program.

    Stage 2: Pilot Test and Evaluation

    Participants

    Of the nine users who pilot-tested Feed our Future, two were from rural and seven from urban locations; four worked in hospitals and five in community settings; five were experienced supervisors, two reported some experience, and two had little or no experience with supervising students.

    Data Generated From Feed Our Future

    Data generated by participants after pilot-testing the program Feed our Future are presented in Tables 2-4. In the pre-test, the mean confidence level for users with their assessment approach (using a 10-point scale: 1=not at all confident; 10=extremely confident) was 5.75 (range 2-9). In the pre-test, only five out of eight users rated the student performance, as observed from the video recording, in a similar way to the panel of experienced supervisors (see Table 3). In their qualitative responses, only some concepts supporting credible and defensible competency-based assessment practices were identified by the users (see Table 4).

    Although technical issues prevented some users from participating, the discussion forum was used for introductions and to share learning and reflections. The average score achieved by users on the multiple choice quiz was 86%. Technical issues delayed participants’ completion of the program, and hence no results are available from the post-test.

    Table 2. Data generated from Feed our Future: participation.
    Table 3. Data generated from Feed our Future: pre-test results, Question 2: assessment rating of student’s performance by users.
    Table 4. Data generated from Feed our Future: pre-test results, Question 3: content analysis from qualitative responses informed by the focused literature review [26].
    Qualitative Evaluation

    The analysis of interview transcripts revealed three pedagogical themes: (1) constructivist design supports transformative online learning, (2) videos make abstract concepts tangible, and (3) accessibility, usability, and pedagogy are interdependent.

    Theme 1: Constructivist Design Supports Transformative Online Learning

    Although the post-test was not completed by users due to technical issues, qualitative feedback from the focus group and personal interviews showed an increase in user confidence as demonstrated by this exemplar quote:

    From doing this, I now feel like I would be able to confidently have a final clinical placement student.
    [Focus Group User # 3]

    The constructivist design assisted users to apply their learning as demonstrated by the exemplar quotes in Table 5. The program enabled users to compare their assessments of an individual student performance with those made by a panel of experienced supervisors. As one user commented:

    We can use this process for moderation, if we have a number of different supervisors that watch a particular video, we could use it to make sure that our assessments are similar…
    [Personal Interview User #9]

    Through participation in the program, users achieved consensus in their understanding of entry-level performance.

    Table 5. Constructivist design supports transformative online learning.
    Theme 2: Videos Make Abstract Concepts Tangible

    Users supported the use of video-based learning material:

    you can read about it, but actually seeing the videos of an assessment, and knowing where they sit on the scale [from novice to expert]…you know we always want to see stuff in action.
    [Focus Group User #3]

    They commented that prior to completing the program they had found the learning content “difficult to apply” and “frustrating at times”. The users found that the video representations of the authentic student-patient consultations allowed their understanding of “entry-level” competence to become more tangible.

    I really liked the videos that when, at the end of them would show the scale of where the students were, like from the beginning to the end.
    [Focus Group Participant #2]

    As demonstrated by Figure 1, organizing the videos on a scale helped the supervisors to distinguish between novice, intermediate, and entry-level student performances.

    Figure 1. Visual representation of competency development using videos.
    Theme 3: Accessibility, Usability, and Pedagogy Are Interdependent

    IT access and capacity at some worksites limited engagement with the program:

    I’m computer literate but not really up-to-speed with some technological advances I suppose. I was a bit frustrated with some of those things…I suppose once I get annoyed with something I’m not inclined to go back.
    [Personal Interview User #7]

    Table 6 summarizes accessibility and usability barriers experienced by users and presents revisions made to improve the program.

    Product Release and Dissemination

    Table 7 presents the final learning content for Feed our Future. Figures 2-4 present screenshots of the final interface.

    Table 6. Program features: barriers and solutions.
    Table 7. Learning content included in Feed our Future.
    Figure 2. Final interface home page.
    Figure 3. Final interface learning modules.
    Figure 4. Final interface practice modules.

    Discussion

    Principal Results

    This paper describes the development of the first research- and Web-based learning program to support clinical supervisors in their assessments of student dietitian competence during clinical placements. This case example demonstrates the value of a design-based research and consultative approach to developing a program. The use of a Web-based mode has the potential to disseminate expertise and research findings nationally, overcoming geographical and time boundaries, in the provision of continuing professional development to health practitioners who assess student performance.

    Comparisons With Prior Work

    The results of the pilot test supported the pedagogical design of Feed our Future. The program encouraged independent self-paced learning and catered to different learning styles, as recommended by Ng’ambi and Lombe [16]. Through the use of authentic student-client consultations, problem-based learning activities, and reflections, participants demonstrated new understandings that aligned with the program’s learning objectives. Seo and Engelhard [9] achieved similar results with their constructivist Web-based continuing education program for physiotherapy supervisors, with their participants perceiving improvements in the quality of their clinical education skills and practices. The approach used in this study highlights the interdependence of pedagogical, usability, and accessibility considerations [22], with the iterative process and the end-user involvement facilitating the identification of barriers to effective educational outcomes.

    Participants in the pilot test found the video recordings of student-client consultations to be helpful in learning about competency-assessment practices. Clinical vignettes in traditional face-to-face learning programs have been used to assist supervisors to gain a shared understanding of entry-level competence in physiotherapy [37]. When used in Web-based delivery, videos have been shown to help engage students and improve learning outcomes [20,21,40]. Maloney and colleagues [19] found learners preferred videos in comparison to other learning materials made available through a Web-based resource repository. Developing a Web-based program incorporating a large number of videos (n=20) was technically challenging for Feed our Future. The decision to edit and divide the videos was driven by network capacity limitations, but Guo’s research [41] suggests that short (6-9 minute) videos also have pedagogical advantages.

    Consistent with the findings of Cook and Steinert [14], users appreciated material that was relevant, well-organized, and had clear expectations including time commitments. The participation rates for the discussion forum in this study were low despite the fact that other studies have identified conversational discussion and social bonding as key factors for successful Web-based education [14]. This feature is also key to constructivist pedagogy [16] and aligns with the notion of an interpretive community of assessors [33]. Possible solutions to address the lack of engagement with discussions may include more active moderation on the forum, blended Web-based learning with face-to-face contact, a social media approach that conforms to workplace security restrictions, or more assistance with technical problems [14].

    Technical barriers experienced in the pilot-testing of Feed our Future, such as IT incompatibilities between organizations’ infrastructure, software, and Internet browsers, security restrictions, and bandwidth limitations, are not unique [42]. Universities typically have few security restrictions and are able to use programs such as YouTube to achieve positive learning outcomes [43]. Awareness that this freedom may not be available in some health settings is required if effective Web-based programs are to be available for use by clinical supervisors working in these settings.

    Feed our Future, like many programs [44,45], was developed on a limited budget. Lack of IT expertise, infrastructure, and associated software limited the development of this program. Two years was required to complete the design-based research approach. Despite its advantages, the development of Web-based programs is more labor intensive than that of face-to-face delivery [44,45].

    Limitations

    The research-based design and national consultation used for the development of this program were robust. The sample size and the qualitative design of the pilot test and evaluation, although consistent with similar studies [46,47], do not support generalization of the results. Rather, these findings have been used to inform and improve the innovative product. Because there was no comparison with other modes of delivery, conclusions cannot be drawn as to whether Web-based delivery was the option preferred by clinical supervisors. The design-based research approach, however, offers supporting evidence for Web-based pedagogical approaches [25]. Further research is required to measure whether the learning of participants translated into actual changes in their competency-based assessment practices, and to determine the uptake of the program nationally.

    Conclusion

    Web-based programs, such as Feed our Future, offer a viable solution for universities to provide professional development to geographically dispersed clinical supervisors in preparation for their students’ clinical placements. A design-based research approach offers a practical process for Web-based tool development.

    Acknowledgments

    This research was made possible due to funding made available by Health Workforce Australia, an Australian Government Initiative, as part of the 2012 National Clinical Supervision Fellowship Initiative, through a Fellowship awarded to Rachel Bacon. This research is supported by the University of Canberra as part of a doctoral thesis.

    We would like to thank the research participants, Professor Sue Ash, the University of Canberra’s Centre for Teaching and Learning, Online Support Service, the inSPIRE Centre for their IT support, and Amy Haughey and Emma Agnew for their contributions as research assistants.

    Conflicts of Interest

    None declared.

    References

    1. Dietitians Association of Australia (DAA). DAA Manual for Accreditation of Dietetics Education Programs version 1.2, 2010. Canberra; 2011.   URL: http://daa.asn.au/wp-content/uploads/2011/03/DAA-accreditation-manual_v1.2_Oct-2011.pdf [accessed 2014-09-25] [WebCite Cache]
    2. Bacon R, Williams L, Grealish L. Aged care facilities and primary health-care clinics provide appropriate settings for dietetic students to demonstrate individual case management clinical competence. Nutrition & Dietetics 2014 Nov 17 (forthcoming). [CrossRef]
    3. Epstein RM, Hundert EM. Defining and assessing professional competence. JAMA 2002 Jan 9;287(2):226-235. [Medline]
    4. Palermo C, Capra S, Ash S, Beck E, Truby H, Jolly B. Professional competence standards, learning outcomes and assessment: designing a valid strategy for nutrition and dietetics. Sydney: Office for Learning and Teaching, Australian Government; 2014.   URL: http:/​/www.​olt.gov.au/​project-professional-competence-standards-learning-outcomes-and-assessment-designing-valid-strategy- [accessed 2014-09-25] [WebCite Cache]
    5. Govaerts M, van der Vleuten CP. Validity in work-based assessment: expanding our horizons. Med Educ 2013 Dec;47(12):1164-1174. [CrossRef] [Medline]
    6. Wearne S, Greenhill J, Berryman C, Sweet L, Tietz L. An online course in clinical education - experiences of Australian clinicians. Aust Fam Physician 2011 Dec;40(12):1000-1003 [FREE Full text] [Medline]
    7. Zahner SJ, Tipple SM, Rather ML, Schendzielos C. Supporting nurse preceptors through online continuing education. J Contin Educ Nurs 2009 Oct;40(10):468-474. [CrossRef] [Medline]
    8. McColgan K, Rice C. An online training resource for clinical supervision. Nurs Stand 2012;26(24):35-39. [CrossRef] [Medline]
    9. Seo KK, Engelhard C. Using the constructivist tridimensional design model for online continuing education for health care clinical faculty. American Journal of Distance Education 2014 Mar 06;28(1):39-50. [CrossRef]
    10. Huckstadt A, Hayes K. Evaluation of interactive online courses for advanced practice nurses. J Am Acad Nurse Pract 2005 Mar;17(3):85-89. [Medline]
    11. Brown L, Williams L, Capra S. Going rural but not staying long: recruitment and retention issues for the rural dietetics workforce in Australia. Nutr Diet 2010;67(4):294-302. [CrossRef]
    12. Steinert Y. Faculty development in the new millennium: key challenges and future directions. Med Teach 2000 Jan;22(1):44-50. [CrossRef]
    13. Maloney S, Haas R, Keating JL, Molloy E, Jolly B, Sims J, et al. Effectiveness of Web-based versus face-to-face delivery of education in prescription of falls-prevention exercise to health professionals: randomized trial. J Med Internet Res 2011;13(4):e116 [FREE Full text] [CrossRef] [Medline]
    14. Cook DA, Steinert Y. Online learning for faculty development: a review of the literature. Med Teach 2013 Nov;35(11):930-937. [CrossRef] [Medline]
    15. Bangert AW. The development and validation of the student evaluation of online teaching effectiveness. Computers in the Schools 2008 Jul 04;25(1-2):25-47. [CrossRef]
    16. Ng’ambi D, Lombe A. Using podcasting to facilitate student learning: A constructivist perspective. Educ Tech Soc 2012;15(4):181-192 [FREE Full text]
    17. Park J. Designing education online: Learning delivery and evaluation. iJADE 2011;30(2):176-187. [CrossRef]
    18. Rowe F, Rafferty J. Instructional design interventions for supporting self-regulated learning: enhancing academic outcomes in postsecondary e-learning environments. Journal of Online Learning and Teaching 2013;9(4):590-601 [FREE Full text]
    19. Maloney S, Chamberlain M, Morrison S, Kotsanas G, Keating JL, Ilic D. Health professional learner attitudes and use of digital learning resources. J Med Internet Res 2013;15(1):e7 [FREE Full text] [CrossRef] [Medline]
    20. Azer SA, Algrain HA, AlKhelaif RA, AlEshaiwi SM. Evaluation of the educational value of YouTube videos about physical examination of the cardiovascular and respiratory systems. J Med Internet Res 2013;15(11):e241 [FREE Full text] [CrossRef] [Medline]
    21. Chen H, Hu Z, Zheng X, Yuan Z, Xu Z, Yuan L, et al. Effectiveness of YouTube as a source of medical information on heart transplantation. Interact J Med Res 2013;2(2):e28 [FREE Full text] [CrossRef] [Medline]
    22. Ardito C, Costabile F, Marsico MD, Lanzilotti R, Levialdi S, Roselli T, et al. An approach to usability evaluation of e-learning applications. Univ Access Inf Soc 2005 Dec 8;4(3):270-283. [CrossRef]
    23. Fisher E, Wright V. Improving online course design through usability testing. Journal of Online Learning and Teaching 2010;6(1):228-245 [FREE Full text]
    24. Wang F, Hannafin M. Design-based research and technology-enhanced learning environments. Edu Res Dev 2005 Dec;53(4):5-23. [CrossRef]
    25. Anderson T, Shattuck J. Design-based research: A decade of progress in education research? Educational Researcher 2012 Feb 03;41(1):16-25. [CrossRef]
    26. Bacon R, Williams L, Grealish L. Credible and defensible assessments of entry-level clinical competency: insights from a modified Delphi study. FoHPE: In Press 2015 (forthcoming).
    27. Brownie S, Bahnisch M, Thomas J. University of Queensland Node of the Australian Health Workforce Institute in partnership with Health Workforce Australia, editor. Adelaide, Australia; 2011. Exploring the literature: competency-based education and competency-based career frameworks: Deliverable fulfilling part of the requirements for NHPRC projects 4 and 5 regarding frameworks for competency-based education, training and health career frameworks   URL: https://www.hwa.gov.au/sites/uploads/national-competency-literature-review-20120410.pdf [accessed 2014-12-22] [WebCite Cache]
    28. Gonczi A. Competency based assessment in the professions in Australia. Assess Educ: Princ Pol Pract 1994;1(1):27-44. [CrossRef]
    29. Govaerts MJ, van der Vleuten CP, Schuwirth LW. Optimising the reproducibility of a performance-based assessment test in midwifery education. Adv Health Sci Educ Theory Pract 2002;7(2):133-145. [Medline]
    30. McAllister S, Lincoln M, Ferguson A. Issues in developing valid assessments of speech pathology students’ performance in the workplace. Int J Lang Commun Disord 2010;45(1):1-14. [CrossRef]
    31. Johnsson M, Hager P. Navigating the wilderness of becoming professional. Journal of Workplace Learning 2008 Sep 12;20(7/8):526-536. [CrossRef]
    32. Schuwirth LW, van der Vleuten CP. ABC of learning and teaching in medicine: written assessment. BMJ 2003;326:643-645. [CrossRef]
    33. Govaerts M, van der Vleuten CP. Validity in work-based assessment: expanding our horizons. Med Educ 2013 Dec;47(12):1164-1174. [CrossRef] [Medline]
    34. Dietitians Association of Australia (DAA). The national competency standards for entry-level dietitians. DAA: The Education and Accreditation Manual, Appendix 4 (Australian National University Archives) 1994.
    35. Phillips S, Ash S, Tapsell L. Relevance of the competency standards to entry-level dietetic practice. Australian Journal of Nutrition and Dietetics 2000;57(4):198-207 [FREE Full text]
    36. Ash S, Dowding K, Phillips S. Mixed methods research approach to the development and review of competency standards for dietitians. Nutr Diet 2011;68(4):305-315. [CrossRef]
    37. Dalton M, Keating J, Davidson M. Melbourne: Australian Learning and Teaching Council. The Assessment of Physiotherapy Practice (APP) Instrument Clinical Educator Resource Manual   URL: http://www.clinedaus.org.au/files/resources/2012_app_resource_manual_1.pdf [accessed 2014-12-22] [WebCite Cache]
    38. Barbour RS. Making sense of focus groups. Med Educ 2005 Jul;39(7):742-750. [CrossRef] [Medline]
    39. van Manen M. Practising phenomenological writing. Pheno Ped 1984;2(1):36-68 [FREE Full text]
    40. McKenna L, Boyle M, Palermo C, Molloy E, Williams B, Brown T. Promoting interprofessional understandings through online learning: a qualitative examination. Nurs Health Sci 2014 Sep;16(3):321-326. [CrossRef] [Medline]
    41. Guo P, Kim J, Rubin R. How video production affects student engagement: an empirical study of MOOC videos. L@S 2014:41-50. [CrossRef]
    42. Park J. Designing education online: learning delivery and evaluation. iJADE 2011;30(2):176-187. [CrossRef]
    43. Kent M. Changing the conversation: Facebook as a venue for online class discussion in higher education. Journal of Online Learning and Teaching 2013;6(1):546-565 [FREE Full text]
    44. Britt R. Online education: a survey of faculty and students. Radiol Technol 2006;77(3):183-190. [Medline]
    45. Kowalczyk K. Perceived barriers to online education by radiologic science educators. Radiol Technol 2014;85(5):486-493. [Medline]
    46. McLeod PJ, Brawer J, Steinert Y, Chalk C, McLeod A. A pilot study designed to acquaint medical educators with basic pedagogic principles. Med Teach 2008;34(1):92-93. [CrossRef]
    47. Gray KM, Clarke K, Alzougool B, Hines C, Tidhar G, Frukhtman F. Internet protocol television for personalized home-based health information: design-based research on a diabetes education system. JMIR Res Protoc 2014;3(1):e13 [FREE Full text] [CrossRef] [Medline]


    Abbreviations

    DAA: Dietitians Association of Australia
    IT: information technology


    Edited by G Eysenbach; submitted 25.09.14; peer-reviewed by T Chan; comments to author 09.12.14; revised version received 29.12.14; accepted 14.01.15; published 27.02.15

    ©Rachel Bacon, Lauren Therese Williams, Laurie Grealish, Maggie Jamieson. Originally published in JMIR Research Protocols (http://www.researchprotocols.org), 27.02.2015.

    This is an open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Research Protocols, is properly cited. The complete bibliographic information, a link to the original publication on http://www.researchprotocols.org, as well as this copyright and license information must be included.