The first part of the development of the game has been about setting up the VR environment and laying out some of the basic interactable objects. Some of the assets came from an asset pack I had previously purchased from the Unity Asset Store; the sample bottles and desks were mostly ready to go. I do find that these assets often need some adjustment before they are fully usable. For the sample bottles, I split the label section out, as it was originally part of the main mesh. The desks worked fine as they were, but I added bumpers around all sides to stop objects from easily falling off the surface.
For the environment, I modelled a basic hemisphere and flipped the normals. Flipping the normals makes the inner surface the front face, so the texture renders on the inside of the dome; Unity culls back faces by default to save rendering work. A simple floor plane acts as the ground, and a texture applied to each object finishes the base room.
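I did the flip in Blender, but roughly the same thing can be done with a small Unity script if you prefer to keep the dome as a standard sphere. This is just a sketch of the idea, not code from the project:

```csharp
using UnityEngine;

// Sketch: turn a sphere/hemisphere 'inside out' so its inner surface renders.
// Attach to the dome object; assumes it has a MeshFilter.
[RequireComponent(typeof(MeshFilter))]
public class FlipNormals : MonoBehaviour
{
    void Awake()
    {
        Mesh mesh = GetComponent<MeshFilter>().mesh;

        // Invert every normal so lighting treats the inner surface as the front face.
        Vector3[] normals = mesh.normals;
        for (int i = 0; i < normals.Length; i++)
            normals[i] = -normals[i];
        mesh.normals = normals;

        // Reverse the winding order of each triangle. Unity culls back faces by
        // default, so without this the inner faces would still be invisible.
        int[] triangles = mesh.triangles;
        for (int i = 0; i < triangles.Length; i += 3)
        {
            int temp = triangles[i];
            triangles[i] = triangles[i + 2];
            triangles[i + 2] = temp;
        }
        mesh.triangles = triangles;
    }
}
```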
Other modelling at this point includes the sample tray, the monitor display for the AI character and the ‘holographic projector’, which will be used to give the user objects.
Most of the modelling is fairly straightforward and uses basic textures. I’m using the free Mesh Machine add-on for Blender, and KitOps with the paid Magic Design asset pack. This makes it easy to add complex elements such as the lens assembly for the projector.
Bugs
I’m trying to log bugs and fix them as I go, which has so far been fairly straightforward. The main issue in this section was that the text on the labels rendered in front of everything else, including the player’s hands. I solved this by adding the TextMeshPro component to a child game object of the one holding the Canvas component. I only noticed this bug while editing the DevLog video, so it is visible throughout the final video.
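For anyone hitting the same thing, the working setup ends up roughly like the sketch below. I actually built this in the editor rather than in code, so the names and values here are only illustrative:

```csharp
using TMPro;
using UnityEngine;

// Sketch of the label hierarchy that fixed the draw-order problem.
// Attach to the label object that holds the world-space Canvas.
[RequireComponent(typeof(Canvas))]
public class LabelTextSetup : MonoBehaviour
{
    void Start()
    {
        Canvas canvas = GetComponent<Canvas>();
        canvas.renderMode = RenderMode.WorldSpace;

        // The text component goes on a *child* of the Canvas object, not on the
        // Canvas object itself, so it no longer draws in front of everything.
        GameObject textObject = new GameObject("LabelText");
        textObject.transform.SetParent(transform, false);

        TextMeshProUGUI label = textObject.AddComponent<TextMeshProUGUI>();
        label.text = "Sample 01";                       // placeholder text
        label.fontSize = 0.5f;                          // world-space canvases need small sizes
        label.alignment = TextAlignmentOptions.Center;
    }
}
```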
The other issue I came across, which I had seen before but forgotten about, was the importance of assigning grabbables to the correct layer and making sure physics interactions are switched off between the player and the grabbables. Not doing this can result in the player being pushed upwards whenever they are effectively standing on top of a grabbable: a sort of unintentional flying.
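The layer assignment itself happens in the editor, but switching the physics off between the two layers can also be done with a single call. This assumes layers named "Player" and "Grabbable" (yours may differ), and the same thing can be set by hand in the Layer Collision Matrix under Project Settings > Physics:

```csharp
using UnityEngine;

// Sketch: stop the player's collider reacting to grabbable objects.
// Assumes layers called "Player" and "Grabbable" exist in the project.
public class CollisionSetup : MonoBehaviour
{
    void Awake()
    {
        int playerLayer = LayerMask.NameToLayer("Player");
        int grabbableLayer = LayerMask.NameToLayer("Grabbable");

        // Equivalent to unticking the pair in the Physics Layer Collision Matrix.
        // Without this, standing over a grabbable can launch the player upwards.
        Physics.IgnoreLayerCollision(playerLayer, grabbableLayer, true);
    }
}
```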
The final issue, which I need to fix next, is to add additional grab points for the left-hand controller. At the moment, using the left controller results in the object snapping to the back of the hand model.
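I haven’t implemented this yet, but the rough plan, assuming the XR Interaction Toolkit’s XRGrabInteractable is doing the grabbing, is to swap the attach transform depending on which hand is about to pick the object up. Something along these lines, where the left/right attach transforms and the name check are purely hypothetical:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Sketch of the planned fix: pick a different attach point for each hand.
// Assumes XR Interaction Toolkit 2.x and child transforms assigned in the inspector.
[RequireComponent(typeof(XRGrabInteractable))]
public class HandedAttachPoint : MonoBehaviour
{
    public Transform leftAttach;   // hypothetical child transform for the left hand
    public Transform rightAttach;  // hypothetical child transform for the right hand

    void Awake()
    {
        var grab = GetComponent<XRGrabInteractable>();

        // Swap the attach transform while the hand is hovering, before the grab
        // actually happens. The name check is a crude way to detect handedness.
        grab.hoverEntered.AddListener(args =>
        {
            bool leftHand = args.interactorObject.transform.name.Contains("Left");
            grab.attachTransform = leftHand ? leftAttach : rightAttach;
        });
    }
}
```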
Do you have a tutorial or course on how to use the table to zoom left or right, as I see in some devlogs?
It would be very useful for my final project.
Hi Kevin
Sorry, I didn’t notice your comment before. Can you explain what you mean by the zoom effect? I don’t have what I would call a zoom feature in this app. In this project I move around the space in VR by grabbing the table, and I also have a standard joystick-controlled movement option that I use sometimes (usually called smooth locomotion in VR). What’s your project about?
Thanks
Nic