Don’t know how to develop VR games? Unity 5.3 official VR tutorial is released - Series 2


Laying the foundation

To prepare for developing VR applications with Unity, we must first check whether our computer's hardware and software meet the requirements. In short: the graphics card must be an NVIDIA GTX 970 or AMD R9 290 or better, the CPU an Intel i5-4590 or better, with at least 8 GB of RAM, two USB 3.0 ports, one USB 2.0 port, and one HDMI 1.3 output.

Operating system support is limited: Mac and Linux are not supported, only Windows 7, Windows 8, and Windows 10.

Of course, you also need to upgrade your graphics card driver to the latest version.

Official Oculus compatibility check tool: oculus.us5.list-manage.com

For details on computer requirements, please refer to: Oculus Rift will be available for pre-order tomorrow, is your computer configuration sufficient? (includes the official check tool address)

Once this (admittedly expensive) groundwork is complete, it's time to install Unity. Remember to connect and power on the DK2 before opening Unity. Before continuing, open the Oculus Configuration Utility and check that the Demo Scene works; note that you may need to set up a new user in the Oculus Configuration Utility before running the Demo Scene.

Create your first VR project

Next, we will use Unity to build a simple VR demo: viewing a cube through a VR headset. If you want to study more VR examples, download the VR Samples project (Asset Store) mentioned in the previous tutorial.

Step 1.

Open Unity and create a new empty project.

To clarify, the Unity version used here is 5.3.1f1; a newer version may be available by the time you read this tutorial.

Step 2.

In Unity's menu, select File - Build Settings and choose PC, Mac & Linux Standalone.

Step 3.

Create a new cube in the scene: select GameObject - 3D Object - Cube from the menu, then use the Translate tool to place the cube in front of the default Main Camera.
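
If you prefer to set this up from code rather than in the editor, a minimal sketch like the following does the same thing (the 2-unit distance is just an illustrative value):

    using UnityEngine;

    // Creates a cube and places it in front of the main camera at startup.
    // Attach to any GameObject in the scene.
    public class SpawnCubeInFront : MonoBehaviour
    {
        void Start()
        {
            GameObject cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
            Transform cam = Camera.main.transform;
            cube.transform.position = cam.position + cam.forward * 2f;  // 2 units ahead, illustrative
        }
    }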

Step 4.

Save the scene (File - Save Scene, or the Ctrl+S shortcut).

Step 5.

In the menu, select Edit - Project Settings - Player, and in the Other Settings section check "Virtual Reality Supported".
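
This setting can also be toggled from script at runtime through the VRSettings class, which is handy if you want to offer a non-VR fallback. A minimal sketch, assuming "Virtual Reality Supported" is already checked in Player Settings (the key binding is just an example):

    using UnityEngine;
    using UnityEngine.VR;

    // Toggles VR rendering at runtime. Requires "Virtual Reality Supported"
    // to be checked in Player Settings for VRSettings.enabled to take effect.
    public class VRToggle : MonoBehaviour
    {
        void Update()
        {
            if (Input.GetKeyDown(KeyCode.V))
                VRSettings.enabled = !VRSettings.enabled;
        }
    }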

Step 6.

Click the Play button on the Unity interface to enter Play mode.

If everything went well, you should now be able to see the scene through the DK2. Look around and the camera in Unity will automatically react to the position and rotation of the DK2.

What if something goes wrong?

If you don't see the expected scene in the DK2, check the following:

  1. Make sure the DK2 was connected and powered on before you opened the Unity project.
  2. Open the Oculus Configuration Utility that comes with Oculus and check that the Demo Scene works properly.
  3. Update your graphics card driver to the latest version.
  4. Make sure the latest Oculus Runtime (0.8 or higher) is installed on your computer.

Of course, if you still have questions, you can join the discussion in the forum.

Some useful information about VR development:

While VR app development is very similar to standard Unity app development, there are some differences to be aware of.

1. Frame rate displayed in the editor

When you preview your project in the editor, note that the experience may lag, because the computer has to render the same content twice. So when actually testing the project, it is best to build an executable version and experience it on the target device.

2. Camera Movement

Note that you cannot move the VR camera directly in Unity, because the headset tracking overwrites the camera's position and rotation each frame. If you want to adjust the camera's position and rotation, make sure the camera is a child of another GameObject, and move that parent object instead.

For more information on this, see the Flyer and Maze scenes in the VR Samples project.
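
As a minimal illustration of the parenting approach (the rig setup and speed value below are illustrative, not taken from the VR Samples):

    using UnityEngine;

    // Attach to an empty parent GameObject that has the Main Camera as a
    // child. Moving the parent moves the VR camera, while the headset's
    // tracking is still applied on top as the camera's local pose.
    public class CameraRigMover : MonoBehaviour
    {
        [SerializeField] private float m_Speed = 2f;  // movement speed, illustrative

        void Update()
        {
            // Move the rig forward; the child camera follows.
            transform.position += transform.forward * m_Speed * Time.deltaTime;
        }
    }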

3. Camera Node

The left and right eye cameras are not created as separate objects by Unity. If you need the positions of these eye nodes during development, you must use the InputTracking class.

If you want to get the position of each eye in the scene (for example, for testing), attach the following sample script to the camera.

4. C# Script

    using UnityEngine;
    using UnityEngine.VR;

    public class UpdateEyeAnchors : MonoBehaviour
    {
        GameObject[] eyes = new GameObject[2];
        string[] eyeAnchorNames = { "LeftEyeAnchor", "RightEyeAnchor" };

        void Update()
        {
            for (int i = 0; i < 2; ++i)
            {
                // If the eye anchor is no longer a child of us, don't use it
                if (eyes[i] != null && eyes[i].transform.parent != transform)
                {
                    eyes[i] = null;
                }

                // If we don't have an eye anchor, try to find one or create one
                if (eyes[i] == null)
                {
                    Transform t = transform.Find(eyeAnchorNames[i]);
                    if (t)
                        eyes[i] = t.gameObject;

                    if (eyes[i] == null)
                    {
                        eyes[i] = new GameObject(eyeAnchorNames[i]);
                        eyes[i].transform.parent = gameObject.transform;
                    }
                }

                // Update the eye transform (VRNode.LeftEye == 0, VRNode.RightEye == 1)
                eyes[i].transform.localPosition = InputTracking.GetLocalPosition((VRNode)i);
                eyes[i].transform.localRotation = InputTracking.GetLocalRotation((VRNode)i);
            }
        }
    }

5. Image Effects in VR

Using many image effects in a VR project is a luxury. Since the same scene has to be rendered twice (once per eye), many of the commonly used image effects are too expensive for VR applications and will seriously hurt the frame rate.

Because VR places the user's eyes inside the virtual space, some image effects simply don't make sense in VR. For example, depth of field, blur, and lens flare effects don't make sense because we don't see them that way in the real world. However, if VR headsets support eye tracking in the future, depth of field may become meaningful.

There are some effects you can consider using though: anti-aliasing is useful (especially given the low resolution of some headsets), color grading is useful (see this link for more on this: Color Grading with Unity and the Asset Store), and Bloom can be useful for some games.

However, before using any effect, it is best to test it in-game first to see if it actually works.

Unity itself ships with many image effects (Assets - Import Package - Effects), and the Asset Store offers many more, such as Colorful, Chromatica, Amplify Color, and others.
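
As a cheaper alternative to a post-process anti-aliasing effect, you can also enable hardware MSAA through Unity's quality settings (MSAA only applies with the forward rendering path). A minimal sketch, using 4x as one common sample count:

    using UnityEngine;

    // Enables 4x hardware MSAA, usually cheaper in VR than a post-process
    // anti-aliasing image effect. Valid values are 0, 2, 4, and 8.
    public class EnableMSAA : MonoBehaviour
    {
        void Start()
        {
            QualitySettings.antiAliasing = 4;
        }
    }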

6. Render Scale

Depending on the complexity of your VR application and the hardware it runs on, you may want to change the render scale setting. It adjusts the texel-to-pixel ratio before lens correction, letting you trade game performance for image sharpness.

This setting must be changed through code (for reference: http://unity3d.com/cn). You can change the render scale with the following script:

    using UnityEngine;
    using System.Collections;
    using UnityEngine.VR;

    namespace VRStandardAssets.Examples
    {
        public class ExampleRenderScale : MonoBehaviour
        {
            // The render scale. Higher numbers = better quality, but trades performance
            [SerializeField] private float m_RenderScale = 1f;

            void Start()
            {
                VRSettings.renderScale = m_RenderScale;
            }
        }
    }

For this setting, you can refer to our VR Samples; the specific example is the Scenes/Examples/RenderScale scene. The setting is also applied in the MainMenu scene.

The effect of changing the render scale: Unity's default render scale is 1.0. Raising it to 1.5 gives a visibly sharper image, while lowering it to 0.5 produces very obvious pixelation.

Depending on your game, you can lower the render scale to improve performance, or raise it to sharpen the image at the cost of performance.
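
If you want to experiment with this trade-off directly in the headset, a small test-only sketch like the following (not from the VR Samples; the key bindings and clamp range are illustrative) adjusts the render scale at runtime:

    using UnityEngine;
    using UnityEngine.VR;

    // Test helper: raise or lower the render scale with the arrow keys so
    // you can compare sharpness and frame rate live in the headset.
    public class RenderScaleTester : MonoBehaviour
    {
        void Update()
        {
            if (Input.GetKeyDown(KeyCode.UpArrow))
                VRSettings.renderScale = Mathf.Clamp(VRSettings.renderScale + 0.1f, 0.5f, 2f);
            if (Input.GetKeyDown(KeyCode.DownArrow))
                VRSettings.renderScale = Mathf.Clamp(VRSettings.renderScale - 0.1f, 0.5f, 2f);
        }
    }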

Well, now you know how to integrate VR into a Unity project, how to handle camera movement in VR, and how the use of image effects differs from non-VR games.
