I’ve been conceptualising and refining my first experiment (VR1) since around March. I’m now at the point where I have a full methodology, most of my instruments, and my participant information documents, and my plan received ethical approval in August. I’m hoping to run the experiment in Easter term 2024. The experiment focuses on the impact of interactivity and presence on learning outcomes, and will incorporate both validated quantitative survey instruments and semi-structured interviews.
I chose to do a mixed-methods study for two reasons. Firstly, since this is my first experiment, I’m not entirely sure how many participants I will be able to access, and working remotely makes recruitment harder still, so there is a good chance I will not get enough participants for a representative sample. Secondly, there is a dearth of literature in this area that asks participants about their qualitative experience. The survey instruments that are used are generally quantised qualitative measures: they ask people to rate their feelings on a Likert scale. To me, this lacks the context and nuance that could be gained from engaging with participants qualitatively.
Since August, I’ve been planning the development of the VR intervention I will use in this experiment. I have a base project plan, which I have now turned into a task backlog so that I can start developing. I don’t imagine this will be a smooth and linear process, but I’m hoping I have allowed myself enough time to make something engaging and interesting for participants to use.
Research questions & hypotheses
- Do increased levels of interactivity result in greater feelings of presence in an immersive VR learning environment compared to a lower-interactivity intervention?
- Do feelings of presence correlate positively with learning outcomes?
- How do interactivity and presence impact the learner’s experience?
This study will focus on interactivity as the independent variable, and assess its impact on presence, cognitive load and knowledge.
H1: Highly interactive content will have a positive effect on presence.
H2: Highly interactive content will have a negative effect on extraneous environmental cognitive load.
H3: Self-reported feelings of presence will have a positive effect on learning outcomes.
Further information
I presented my methodology at the Durham University Science Faculty PGR conference in September 2023. I have made the poster and related information and references available on this site.
About the VR intervention
The subject of the VR app I’m creating is the pH scale, chosen to limit the difficulty of the subject matter and because I have contacts in the Chemistry department who can help shape the material and give feedback as it develops. I’ve chosen to pitch it at non-science students, to prevent those with greater science expertise from skewing the results.
The app takes a scenario-driven approach and will include a pedagogical agent. In the high-interactivity version, users will take on a lab role, guided by the agent character: they will answer questions, be given material to support their learning, and perform activities designed to develop their understanding of pH.
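To make that a little more concrete, here is a minimal sketch of how the two interactivity conditions might be wired up in Unity. The class and field names, and the idea of swapping interactive props for passive ones, are my illustrative assumptions rather than the finished implementation.

```csharp
using UnityEngine;

// Hypothetical sketch: switch the scene between the high- and
// low-interactivity conditions by enabling one set of objects and
// disabling the other. All names here are illustrative.
public enum InteractivityCondition { High, Low }

public class ConditionSetup : MonoBehaviour
{
    [SerializeField] private InteractivityCondition condition = InteractivityCondition.High;

    // Props the participant can pick up and use (high-interactivity condition).
    [SerializeField] private GameObject[] interactiveProps;

    // Passive equivalents the participant only watches (low-interactivity condition).
    [SerializeField] private GameObject[] passiveProps;

    private void Awake()
    {
        bool high = condition == InteractivityCondition.High;

        foreach (var prop in interactiveProps)
            prop.SetActive(high);

        foreach (var prop in passiveProps)
            prop.SetActive(!high);
    }
}
```

In practice the condition switch would probably also drive the agent’s dialogue and the activity flow, but the underlying pattern of enabling one set of scene objects per condition would stay the same.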
Development process
The experience will be developed in Unity, and the target device is the Oculus Quest 2. We have a number of these available within my department (DCAD), and I have my own device for use during development and testing.
I’m using Codecks – a project management and planning tool designed for game development – to track my tasks and milestones for the project. I came across it while looking for something sufficiently visual and flexible to keep me on track.
I have been doing some unrelated test development work in Unity over the past few months, and have picked up some useful skills with Unity itself and the VR framework I’m using (VRIF). It hasn’t been an especially smooth ride, but it has been very rewarding, and I don’t anticipate many problems with the grab mechanics and basic movement. What I haven’t done much with yet is rigging, characters, animation and multiple object/character states. I have made a habit of producing devlog videos of my work so far, both to chart my progress and to document my working methods and problem-solving notes, and I will be posting devlogs for this project going forward.
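As an example of the kind of object-state work still ahead of me, here is a rough sketch of a pH indicator strip that changes colour when it touches a liquid. It uses only stock Unity components; the tags, colours and class name are placeholder assumptions, not code from the actual project.

```csharp
using UnityEngine;

// Hypothetical sketch of a multi-state object: an indicator strip that
// changes colour when dipped in an acidic or alkaline liquid. Tags and
// colours are placeholder values. Unity trigger events require a
// trigger collider on the liquid and a Rigidbody on one of the objects.
public class IndicatorStrip : MonoBehaviour
{
    private enum StripState { Neutral, Acid, Alkali }

    private StripState state = StripState.Neutral;
    private Renderer stripRenderer;

    private void Awake()
    {
        stripRenderer = GetComponent<Renderer>();
    }

    private void OnTriggerEnter(Collider other)
    {
        // Liquid volumes in the scene are assumed to be tagged "Acid" or "Alkali".
        if (other.CompareTag("Acid"))
            SetState(StripState.Acid, Color.red);
        else if (other.CompareTag("Alkali"))
            SetState(StripState.Alkali, Color.blue);
    }

    private void SetState(StripState newState, Color colour)
    {
        if (state == newState) return; // already showing this result

        state = newState;
        stripRenderer.material.color = colour;
    }
}
```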