Title: Development and validation of a Student-Centred Evaluation Framework for Environmental Vocational Education and Training Courses derived from Biggs' 3P Model and Kirkpatrick's Four Levels Evaluation Model.
Author: Draper, Fiona J.
Keywords: Environmental vocational education and training
Continuing education and training
Biggs' 3P Model
Kirkpatrick's Four Levels Evaluation Model
The University of Bradford theses are licensed under a Creative Commons Licence.
Institution: University of Bradford
Department: School of Engineering, Design and Technology
Abstract: Individuals and organisations need to do much more if sustainable development is to be achieved. Appropriate environmental vocational education and training (EVET) is essential for current decision makers. Crucial decisions need to be made before the present generation of school and college students achieve significant positions of authority. An increasing range of EVET courses and course providers is available within the UK. However, availability is not synonymous with suitability for either the attendee and/or his/her (future) employer. Previous research indicates that, as a component of lifelong learning, EVET courses, and the methods used to evaluate them, should be student-centred. This thesis describes the development and validation of a new student-centred evaluation framework. Preliminary literature reviews identified six fundamental issues which needed to be addressed. Existing academically productive evaluation models were examined and critically appraised in the context of these problems. The output from this process was used to develop a bespoke research methodology. Empirical research on four commercial EVET programmes revealed distinct personal, teaching and work-based presage factors which influenced course attendance, individual learning and subsequent organisational learning. Modified versions of Biggs' 3P Model and Kirkpatrick's Four Levels Evaluation Model were shown to provide an effective student-centred evaluation framework for EVET courses. Additional critical elements pertaining to course utility and the student's long(er)-term retention of knowledge/skills were derived from previous research by Alliger et al. (1997). Work-based presage factors and the student's return on expectation were added as a direct consequence of this research. The resultant new framework, the Presage-Product Evaluation Framework, was positively received during an independent validation.
This confirmed, inter alia, that the framework should also be capable of adaptation for use with other VET courses. Recommendations for additional research focus on the need to demonstrate this through further empirical studies.
Related items (by title, author, creator and subject):
Palliative curriculum re-imagined: A critical evaluation of the UK Palliative Medicine Syllabus. Abel, J.; Kellehear, Allan (2018-05). The UK Palliative Medicine Syllabus is critically evaluated to assess its relationship and relevance to contemporary palliative care policy and direction. Three criteria are employed for this review: (1) relevance to non-cancer dying, ageing, caregivers, and bereaved populations; (2) uptake and adoption of well-being models of public health alongside traditional illness and disease models of clinical understanding; and (3) uptake and integration of public health insights and methodologies for social support. We conclude that the current syllabus falls dramatically short on all three criteria. Suggestions are made for future consultation and revision.
Education and Security: Design and Evaluation Tools for Deliberate Disease Risks Mitigation. Whitby, Simon M.; Mancini, Guilio M. (University of Bradford, Faculty of Social Sciences, 2016). This thesis addresses the role of education to mitigate the risks of deliberate disease, including biological weapons. Specifically, it aims to analyse how education was constructed as a potential instrument to mitigate specific security risks; if and how education could impact on risks; and how the effectiveness of education as a risk mitigation measure could be improved. The research framework combines concepts of security, risk and education within a general constructionist approach. Securitization is used to analyse attempts to construct education as a tool to mitigate specific security risks; risk assessment is used to identify and characterize risk scenarios and potential for risk mitigation; and instructional design and evaluation models are used for the design and evaluation of education. The thesis contends that education has been constructed as a mitigation tool for what were presented as urgent security risks of deliberate disease. Nine attempted securitization moves are identified and assessed. Improved competences identified in four thematic areas, and built with education, can mitigate risks in specific scenarios via impacting factors that primarily influence risk likelihood. The thesis presents several examples of achieved learning objectives, and tools that can be useful to evaluate behavioural and risk impacts, though empirical results on these levels are still scarce. Design and evaluation tools, illustrated through a large amount of original and pre-existing data from a range of countries and contexts, are presented that can improve the effectiveness of education as a deliberate disease risks mitigation measure.
Evaluation of Dementia Training for Staff in Acute Hospital Settings. Smythe, A.; Jenkins, C.; Harries, M.; Atkins, S.; Miller, J.; Wright, J.; Wheeler, N.; Dee, P.; Bentham, P.; Oyebode, Jan R. (2014-02-28). The development, pilot and evaluation of a brief psychosocial training intervention (BPTI) for staff working with people with dementia in an acute hospital setting are described. The project had two phases. Phase one involved adapting an existing competency framework and developing the BPTI using focus groups. For the pilot and evaluation, in phase two, a mixed methods approach was adopted using self-administered standardised questionnaires and qualitative interviews. Qualitative analysis suggested that delivering skills-based training can develop communication, problem-solving and self-directed learning skills; benefit staff in terms of increased knowledge, skills and confidence; and be problematic in the clinical area in terms of time, organisation and the physical environment. These factors must be taken into consideration when delivering training. These changes were not reflected in the quantitative results, and measures were not always sensitive to changes in this setting. Definitive conclusions cannot be drawn about the efficacy of the intervention, due to the contradictory outcomes between the quantitative and qualitative data. Further developments and research are required to explore how staff and organisations can be supported to deliver the best possible care.