In this update, I’ve spent time improving the user experience inside the scene. I’m putting a lot of work and thought into this scene because almost everything I’m adding now will be used in later scenes as well. I’m front-loading the development to avoid having to repeat or revise work later in the process.
New things this time:
- Particle effects, sound effects and background audio
- An anti-chucking/butterfingers solution
- Settings interface
I’ve added a particle effect and audio to the sample delivery tube, sound effects to the water tube, and more in various other places. I found a great source of free sound effects at Pixabay. I had initially recorded my own sound samples for the sample bottles and the desk/monitor grab effects, which worked well, but for anything more mechanical, or anywhere it’s hard to drag a microphone, I had to look elsewhere. So far, Pixabay’s range has included something workable for everything else I’ve needed, including the background music I selected.
When thinking about the pedagogic design of this app, because it is a learning experience rather than a ‘game’ in the usual sense, there are definitely arguments for not having any background audio. Mayer would suggest that this is an unnecessary distraction, and by and large, I would agree. However, every environment has a background audio quality to it – a quiet room, a lab, an office, etc. All of these come with an auditory sense of place. A VR environment without that audio texture feels dead to me.
I looked for something calming and ambient to provide that sense of space and place I felt was missing, and settled on Ambient Wave 48 (Tribute), a gentle soundscape that lacks large variations in volume and loops easily. I have this background audio set up as a 2D source at a low default volume.
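For reference, setting a track up as a quiet, looping 2D source is just a matter of a few AudioSource properties. A minimal sketch (the component name and the 0.2 volume are my own placeholders, not the project’s actual values):

```csharp
using UnityEngine;

// Sketch: configure an AudioSource as a looping, non-spatial background track.
[RequireComponent(typeof(AudioSource))]
public class AmbientAudio : MonoBehaviour
{
    void Start()
    {
        var src = GetComponent<AudioSource>();
        src.loop = true;
        src.spatialBlend = 0f;  // 0 = fully 2D, no positional audio
        src.volume = 0.2f;      // low default; could be exposed via settings
        src.Play();
    }
}
```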
All these sound effects mean that it’s time to think about settings and how a user can customise their experience. I decided I wanted a practical solution that would feel more interactive to the user, so I went back to the desk design and modelled a cutout section and physical button interface. The interface panel rises up out of the desk when the settings button is pressed, and the controls are grabbable box sliders. The panel folds and sinks back down when dismissed. I have added an ‘audio’ selection button for the moment, on the basis that further settings screens are likely to be needed.
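To make a grabbable box slider drive a volume value, its travel along one local axis can be normalised to a 0–1 range. A rough sketch, assuming the handle is physically constrained to slide between two local X limits (the names SliderVolume, handle, minX and maxX are hypothetical):

```csharp
using UnityEngine;

// Hypothetical sketch: maps a grabbable slider handle's local X position
// onto an AudioSource volume. Assumes the handle can only move between
// minX and maxX on the panel's local X axis.
public class SliderVolume : MonoBehaviour
{
    public Transform handle;    // the grabbable box slider
    public AudioSource target;  // e.g. the background music source
    public float minX = -0.1f;  // local-space travel limits
    public float maxX = 0.1f;

    void Update()
    {
        // Normalise the handle position to 0..1 and apply it as volume.
        float t = Mathf.InverseLerp(minX, maxX, handle.localPosition.x);
        target.volume = t;
    }
}
```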
Because this game uses physics for grabbable objects, when a user drops something on the floor (or throws it away), I needed a way of returning objects to the user.
I created box colliders parented to the desk object for the area under the player, and for the left, right, top, front and back of the player area, and applied the trigger zone script I’ve used elsewhere, which detects when objects with a certain tag are in those zones.
Each box collider also has a ReturnToSnapZone script attached, which returns the dropped object to its nominated snap zone if the object is both in the trigger zone and NOT held.
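As a rough sketch of how that check might look in a Unity trigger volume (the GrabbableState component and its IsHeld flag are placeholders for however the project actually tracks “currently held” — the real ReturnToSnapZone script may differ):

```csharp
using UnityEngine;

// Hypothetical sketch of the return-to-snap-zone idea: a trigger volume
// that sends tagged, un-held objects back to a nominated snap zone.
public class ReturnToSnapZone : MonoBehaviour
{
    public string grabbableTag = "Grabbable"; // assumed tag name
    public Transform snapZone;                // where dropped objects return to

    void OnTriggerStay(Collider other)
    {
        if (!other.CompareTag(grabbableTag)) return;

        // Placeholder: skip anything the player is still holding.
        var grab = other.GetComponent<GrabbableState>();
        if (grab != null && grab.IsHeld) return;

        // Kill any residual motion, then snap the object back.
        var rb = other.attachedRigidbody;
        if (rb != null)
        {
            rb.velocity = Vector3.zero;
            rb.angularVelocity = Vector3.zero;
        }
        other.transform.SetPositionAndRotation(snapZone.position, snapZone.rotation);
    }
}
```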
I could have made all the grabbable objects kinematic on drop, a mechanic used in other games, but I wanted it to feel more like interacting with real objects, which don’t just stay where you leave them when you drop them. I think this is more in keeping with the idea of skill development inherent in this experience – the next scenes will look at using tools to conduct sample testing.
Next time:
- Make the volume settings functional
- Add other settings pages (e.g. accessibility – captions on/off, audio pitch/stereo adjustment)
- Integrate captions
- Player settings/save game
- Final script/videos for scene 1