
Day 37 of 100 Days of VR: Adding Basic VR Changes Into Our FPS

14 Nov 2017
How to add basic VR changes into our FPS

Welcome to Day 37! Today, things are finally going to get serious in working with VR!

Currently, there are a lot of problems with the app from when we launch it. Some are just limitations and others are actual problems.

However, before we start to go in and fix everything that we encountered yesterday, today we’re going to add the Google VR SDK into our game.

Today, we’re going to:

  1. Set up wireless debugging so we can both debug and receive console print statements from our game
  2. Remove the old scripts that we’ll replace with VR
  3. Add the Google VR SDK into our game
  4. Set up a simple Event Trigger

Today, we’re going to start working in VR, so let’s not waste time and get to it!

Step 1: Setting Up Wireless Debugging/Consoles

Before we do anything, one of the most important things we should do is set up remote debugging, or at the very least the ability to have our console statements sent to Unity untethered.

Currently, in Development mode, we only get console logs from our Android device if our phone is connected to our computer.

Being tethered by a cable quickly becomes limiting if we need to do something like spin around in a circle.

To fix this, we’re going to set up wireless debugging where our phone can send data remotely to Unity.

We’re going to follow Unity’s documentation on attaching a MonoDevelop Debugger to an Android Device.

The instructions are straightforward, so I’ll just leave the link here.

In our current state, because we lost the ability to restart the game from inside the game, we must rebuild and run every single time we want to see the console output wirelessly.

However, when we re-add our ability to restart the game, wireless debugging will be more useful.
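For reference, the wireless part of that setup boils down to a couple of adb commands, run while the phone is still plugged in over USB (the IP address below is just an example; use your phone’s address from its Wi-Fi settings):

    adb tcpip 5555
    adb connect 192.168.0.10:5555

Once adb connect succeeds, we can unplug the cable; with a Development Build (and Script Debugging enabled, if we want breakpoints), Unity’s Console and the script debugger can then attach to the phone over Wi-Fi, as described in the linked documentation.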

Image 1

Step 2: Removing Old Scripts that Need VR Replacements

It’s finally time to start working in Unity!

Before we do anything, let’s think about what the Google VR SDK gives us and what we must remove from our current system because it conflicts with the SDK.

The main things that the Google VR SDK provides are:

  1. The ability to move the camera with our head
  2. Its own Raycasting system

What we need to remove from our game is:

  1. The ability to move our character
  2. The ability to move our camera
  3. The ability to shoot
  4. The crosshair UI

Luckily for us, this process is going to be fast and easy.

First, let’s remove our ability to move:

  1. In our game hierarchy, select the Player game object.
  2. Select the little cog on the top right-hand corner of the Player Controller (Script) and select Remove Component

Next, let’s get rid of the game following our mouse.

  1. Select Player > Main Camera
  2. Remove our Mouse Camera Controller (Script) Component

After that, let’s get rid of the shooting script. We’re going to come back later and re-purpose this script, but that’s going to be for a different day:

  1. Select Player > Main Camera > MachineGun_00
  2. Disable the Player Shooting Controller (Script). We’re still going to need this.

Finally, let’s get rid of the crosshair. As you recall, when we add the VR SDK, we get a gaze prefab that already adds a cursor in for us.

  1. Select HUD > Crosshair and delete it from our hierarchy.

When we’re done, we’ll have a completely unplayable game! Yay….
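As a side note, if you’d rather do the same cleanup from code instead of clicking through the Inspector, a rough sketch might look like the following. The class names (PlayerController, MouseCameraController, PlayerShootingController) are my guesses based on the Inspector labels above, so adjust them to whatever your scripts are actually called:

    using UnityEngine;

    // Rough code equivalent of the Inspector cleanup above.
    // The component class names are assumptions based on the Inspector labels;
    // rename them to match your own scripts.
    public class VrCleanup : MonoBehaviour
    {
        void Awake()
        {
            GameObject player = GameObject.Find("Player");
            Destroy(player.GetComponent<PlayerController>());              // no more keyboard movement

            Camera cam = Camera.main;
            Destroy(cam.GetComponent<MouseCameraController>());            // no more mouse look

            Transform gun = cam.transform.Find("MachineGun_00");
            gun.GetComponent<PlayerShootingController>().enabled = false;  // keep it, just disabled

            Destroy(GameObject.Find("HUD/Crosshair"));                     // the VR reticle replaces this
        }
    }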

Image 2

Step 3: Adding the Google VR SDK in

Recalling from the Google Cardboard demo, for our game, we’ll need to add:

  1. GvrEditorEmulator – to simulate head movement
  2. GvrEventSystem – to use Google’s Event System for dealing with raycasting
  3. GvrReticlePointer – for our gaze cursor
  4. GvrPointerPhysicsRaycaster – The Raycaster that GoogleVR uses to hit other objects

The setup for this will also be very straightforward.

  1. Drag the GvrEditorEmulator prefab from Assets > GoogleVR > Prefabs > GvrEditorEmulator into the hierarchy
  2. Drag the GvrEventSystem prefab from Assets > GoogleVR > Prefabs > EventSystem into the hierarchy
  3. Drag the GvrReticlePointer prefab from Assets > GoogleVR > Prefabs > Cardboard to be a child of Main Camera
  4. Select GvrPointerPhysicsRaycaster.cs from Assets > GoogleVR > Scripts and attach it to our Main Camera.

When we’re done, we’ll have something like this:

Image 3
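If you prefer to attach the raycaster from code rather than dragging the script onto the camera, a minimal sketch (assuming the GoogleVR SDK is already imported into the project) would be:

    using UnityEngine;

    // Attaches GoogleVR's physics raycaster to the main camera at startup,
    // mirroring step 4 above. GvrPointerPhysicsRaycaster ships with the GoogleVR SDK.
    public class AttachGvrRaycaster : MonoBehaviour
    {
        void Awake()
        {
            Camera cam = Camera.main;
            if (cam.GetComponent<GvrPointerPhysicsRaycaster>() == null)
            {
                cam.gameObject.AddComponent<GvrPointerPhysicsRaycaster>();
            }
        }
    }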

Now with these prefabs and scripts in, we can rotate and look around our game by holding Alt.

We can also shoot raycasts with our VR raycaster; however, right now we don’t have an Event Trigger set up on our enemies to detect when they get hit.

Let’s do that!

Step 4: Setting Up an Event Trigger

Before we end today, I want to make a simple event trigger that allows us to be able to defeat an enemy.

Luckily for us, we already have the function available to us! Specifically, inside our Enemy Health script, we have a method that we call to damage an enemy.
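If you don’t remember what it looks like, it’s roughly along these lines. This is only a simplified recollection: the Health field, starting value, and Death() body are stand-ins, and your actual script from the earlier days may differ.

    using UnityEngine;

    // Simplified recollection of the relevant part of the EnemyHealth script;
    // field names, values, and the death logic here are stand-ins.
    public class EnemyHealth : MonoBehaviour
    {
        public float Health = 2f;          // stand-in starting health

        public void TakeDamage(float damage)
        {
            if (Health <= 0) { return; }   // already dead, ignore further hits

            Health -= damage;
            if (Health <= 0)
            {
                Death();
            }
        }

        private void Death()
        {
            // placeholder for the real death animation/cleanup
            gameObject.SetActive(false);
        }
    }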

Let’s set this up. We want to get something like this:

Image 4

For now, we’re only going to change our Knight enemy. Here’s what we’re going to do:

  1. Select our Knight prefab in Assets > Prefab > Knight
  2. Add an Event Trigger Component to our prefab.
  3. Click Add New Event Type to select what type of event we want to listen for
  4. Select PointerClick
  5. Now click + to add the object we want to access the scripts of
  6. Drag our Knight Prefab into the empty Object slot
  7. Then we need to select the function to call: EnemyHealth > TakeDamage(float)
  8. Set the float value we pass in as 1

When we play our game now, if our gaze cursor focuses on an enemy and we click, we’ll shoot it!

There are a lot of things that we’re still missing, like the push-back effect, but we can start focusing on the rest of that tomorrow!

Now let’s do that to the rest of our prefabs: Bandit and Zombie!
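If repeating those Inspector clicks for every enemy prefab gets tedious, the same PointerClick entry can also be built from code at runtime. Here’s a rough sketch; the class name EnemyClickTarget is just something I made up for illustration, and it assumes the EnemyHealth script with TakeDamage(float) is on the same game object:

    using UnityEngine;
    using UnityEngine.EventSystems;

    // Hypothetical helper that builds the same PointerClick Event Trigger we
    // just created in the Inspector, so each enemy prefab wires itself up.
    public class EnemyClickTarget : MonoBehaviour
    {
        [SerializeField] private float damagePerClick = 1f;   // same value we typed into the Inspector

        void Awake()
        {
            EnemyHealth health = GetComponent<EnemyHealth>();

            EventTrigger trigger = gameObject.AddComponent<EventTrigger>();
            EventTrigger.Entry entry = new EventTrigger.Entry
            {
                eventID = EventTriggerType.PointerClick
            };
            entry.callback.AddListener(eventData => health.TakeDamage(damagePerClick));
            trigger.triggers.Add(entry);
        }
    }

Either approach works, so use whichever you find easier to maintain; the Inspector setup is what we’ll stick with for the rest of the series.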

Conclusion

There we have it! Our first dive into doing some work with VR. It turns out that, right now, there’s a lot less code to write; instead, most of the work is putting prefabs and scripts in the right places so our game works.

Either way, we now have a game that is playable. Tomorrow, we’re going to discuss what changes we should make for a better VR experience, or at the very least to get it back to being as good as it was before we tried to VR-ify it!

Phew, it’s been a long day, I’ll see you all tomorrow on day 38!

Day 36 | 100 Days of VR | Day 38



License

This article, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)


Written By
United States
Joshua is a passionate software developer working in the Seattle area. He also has experience with developing web and mobile applications, having spent years working with them.

Joshua now finds his spare coding time spent deep in the trenches of VR, working with the newest hardware and technologies. He posts about what he learns on his personal site, where he talks mostly about Unity development, though he also covers other programming topics that he finds interesting.

When not working with technology, Joshua also enjoys learning about real estate investment, doing physical activities like running, tennis, and kendo, and having a blast with his buddies playing video games.
