Seminar room A – Monday 15:30-16:30
One hour paper
School of Social Science and Law/Dept. for Learning Development, Teesside University
Matt Portas is a Principal Lecturer in Sport Science and is a University Teaching Fellow with responsibility for leading teaching and learning innovation. His particular area of interest is supporting students in transition.
Authors: Portas M, Anderson J, Massey, B (student researcher), Davison L (student researcher), O’Hare L
Students’ feedback on their learning experiences plays a vital role in shaping the future of contemporary Higher Education institutions. Indeed, the National Student Survey (NSS) not only informs institutions like ours about what we do well and what needs to be improved, but also acts as a metric by which we are benchmarked against other UK Higher Education institutions. Such information can help us deliver learning that is appropriate to students’ level of study. It may also assist us in supporting students’ transition between levels of learning.
At Teesside University, the Level 4 and Level 5 end-of-stage surveys (Year 1 and Year 2 of a three-year degree for full-time students) provide a potentially powerful tool for analysing current students’ experiences while we still have time to improve their learning experience with us. These surveys are based on the UK National Student Survey that Year 3 full-time students complete, and they can help us to understand what matters most to students so that we can target interventions where they will have maximum impact specific to their level of learning. However, to date, systematic and thorough analysis of these data has been sparse. Here we report our initial findings from a survey that replicated the NSS questions with full-time Year 1 and Year 2 students (N = 827) and offer some directions for further investigation. The data were analysed by relating questions about teaching, assessment and feedback, academic support, organisation and management, learning resources, and personal development to a question that asked students to rate their overall satisfaction with their experience.
We then identified higher-scoring and lower-scoring areas, and whether each area had a higher or lower impact on overall satisfaction. At both Year 1 and Year 2, students rated ‘the course is well organised and is running smoothly’ and ‘staff have made the subject interesting’ as lower scoring but higher impact. Strategically, these are the areas our institution may wish to focus on developing to improve students’ experiences. At Year 1, higher scores with higher impact were given to personal development and assessment arrangements; these also scored highly at Year 2 but had lower impact on overall experience. Learning resources were higher scoring but of lower impact at both years, while questions around feedback were lower scoring but also of lower impact. Further results will be discussed in the presentation. In addition, the team consider the reasons why particular aspects of the student experience appear to impact overall satisfaction, and explore verbatim comments for additional insight.
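The score/impact classification described above can be illustrated as a simple importance-performance quadrant analysis. The sketch below is not the study’s actual method or data: it uses invented 1–5 ratings, takes an item’s ‘score’ as its mean rating and its ‘impact’ as its Pearson correlation with overall satisfaction, and splits items at the medians of each axis — all assumptions for illustration only.

```python
# Minimal, hypothetical sketch of a score/impact quadrant analysis.
# NOT the study's data or exact method: scores, item names, and the
# median-split rule are illustrative assumptions.
import statistics


def pearson(xs, ys):
    """Pearson correlation between two equal-length rating lists."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)


def quadrants(item_responses, overall):
    """Classify each item as higher/lower score and higher/lower impact.

    'Score' = mean rating of the item; 'impact' = correlation of the
    item's ratings with overall satisfaction; each axis is split at
    the median across items.
    """
    scores = {k: statistics.mean(v) for k, v in item_responses.items()}
    impacts = {k: pearson(v, overall) for k, v in item_responses.items()}
    ms = statistics.median(scores.values())
    mi = statistics.median(impacts.values())
    return {
        k: (("higher" if scores[k] >= ms else "lower") + " score, "
            + ("higher" if impacts[k] >= mi else "lower") + " impact")
        for k in item_responses
    }


# Invented 1-5 ratings from eight hypothetical respondents.
overall = [4, 3, 5, 2, 4, 3, 5, 4]
items = {
    "course well organised": [3, 2, 4, 1, 3, 2, 4, 3],
    "staff made subject interesting": [4, 2, 5, 2, 3, 2, 5, 3],
    "learning resources": [5, 4, 5, 4, 5, 4, 5, 5],
    "feedback promptness": [3, 3, 2, 3, 3, 2, 3, 3],
}
print(quadrants(items, overall))
```

With this toy data, ‘course well organised’ lands in the lower-score, higher-impact quadrant — the strategically important cell the abstract highlights — but the output depends entirely on the invented ratings, not on the survey reported here.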
This more sophisticated approach to analysing end-of-year student experience data helps us to identify more clearly what students relate to most when deciding whether their overall experience is positive. With these data we can work with our students to help shape their learning experience while they are still studying with us, and before they complete the publicly available NSS at the end of Year 3.