The next generation of VR gaming on PS5

Sony Interactive Entertainment’s next-gen VR headset, the PlayStation® VR2 (PS VR2), launches today, and we’re excited to share the latest tools you can use to build for this innovative platform. In this post, we’ll cover two aspects of developing for PS VR2: graphics and inputs.

PS VR2 and graphics

Example of the graphics potential with PS VR2’s 4K resolution

PS VR2 leverages PS5’s next-gen computing and graphics power to help you create stunning, high-performing VR games. You can target a 4K-equivalent resolution and run titles at 60Hz, 90Hz, or 120Hz, which should be achievable on PS5 using some of the techniques discussed below.

First, let’s start off with render pipelines. We recommend the Universal Render Pipeline (URP) for most VR developers because URP will be our first render pipeline to support some of PS VR2’s unique features, such as foveated rendering and gaze tracking. You can also use a custom Scriptable Render Pipeline (SRP), the Built-in Render Pipeline, or the High Definition Render Pipeline (HDRP), but be aware that some features, like foveated rendering, are only available in URP for the time being.

Overall, URP is a great match for VR games. It’s flexible, straightforward to use, and customizable. It also works well if you’re building for multiple platforms, including all-in-one VR devices. You can make your own custom render pipeline using SRP, and we provide an extensive C# API that allows you to implement any renderer your games require.
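If you do go the custom route, the SRP C# API only asks for a RenderPipelineAsset that creates a RenderPipeline. Here is a minimal sketch (the class names are our own, and a shipping pipeline would also cull and draw scene renderers) that clears each camera’s target and draws the skybox:

using UnityEngine;
using UnityEngine.Rendering;

// A bare-bones pipeline asset: Unity calls CreatePipeline to instantiate the pipeline.
[CreateAssetMenu(menuName = "Rendering/MinimalPipelineAsset")]
public class MinimalPipelineAsset : RenderPipelineAsset
{
    protected override RenderPipeline CreatePipeline() => new MinimalPipeline();
}

public class MinimalPipeline : RenderPipeline
{
    protected override void Render(ScriptableRenderContext context, Camera[] cameras)
    {
        foreach (var camera in cameras)
        {
            context.SetupCameraProperties(camera);

            // Clear the camera target, then draw the skybox; a real pipeline
            // would perform culling and draw renderers here as well.
            var cmd = new CommandBuffer();
            cmd.ClearRenderTarget(true, true, Color.black);
            context.ExecuteCommandBuffer(cmd);
            cmd.Release();

            context.DrawSkybox(camera);
            context.Submit();
        }
    }
}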

The PS5 also provides advances on the GPU side through its new NGGC Graphics API. With NGGC, we’ve taken advantage of the optimization technologies available on PS5 in ways that were previously unavailable. We’ve also rearchitected our rendering backend to allow efficient utilization across multiple cores while improving GPU state transitions. This adds up to more efficient rendering in terms of CPU, GPU, and memory usage while offering the same visual results, without any reauthoring of assets or game code.

Foveated rendering

Example of foveated rendering in use in PS VR2

PS VR2 supports a technique known as foveated rendering, which improves GPU performance by reducing image quality in the player’s peripheral vision. By decreasing the GPU rendering workload required for a given scene, it frees headroom you can spend on better overall visual fidelity.

PS VR2’s hardware can go a step further by using eye tracking to optimize GPU rendering. By projecting the eye-gaze information into screen space, you’re able to render at high quality in the precise screen area where a player is looking.

The intent of foveated rendering and eye tracking is to keep the image quality high in those parts of the image deemed important, while smoothly fading to lower resolutions in areas of the image considered to be less important. This means that you can reduce the size of some render targets while still keeping quality where you want it.

We’ve removed the complexity of setting this up in Unity, while still giving you enough control over the degree of foveation to balance image quality against GPU performance for your specific requirements.
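As a rough illustration of that control, here is a minimal sketch, assuming Unity 2022.2’s XRDisplaySubsystem foveation properties and an XR plug-in that supports them (the component name is our own):

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

public class FoveationController : MonoBehaviour
{
    // 0 = foveation off, 1 = strongest foveation; tune against your image-quality budget.
    [Range(0f, 1f)] public float foveationLevel = 1f;

    void Start()
    {
        var displays = new List<XRDisplaySubsystem>();
        SubsystemManager.GetSubsystems(displays);

        foreach (var display in displays)
        {
            display.foveatedRenderingLevel = foveationLevel;

            // Let the platform steer the high-detail region with eye-gaze data where supported.
            display.foveatedRenderingFlags = XRDisplaySubsystem.FoveatedRenderingFlags.GazeAllowed;
        }
    }
}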

With this capability, foveated rendering on PS VR2 can be up to 2.5x faster, without any perceptual loss compared to equivalent image quality through standard stereo rendering. We’ve also seen gains of up to 3.6x when foveated rendering is combined with eye tracking. (Note that these figures represent ideal performance increases, measured on a Unity demo; numbers will vary based on your game.)

Foveated rendering on PS VR2 can bring a massive reduction in GPU usage while producing the same perceptual quality – and, combined with eye tracking technology, the performance gains are even better.

Input controls for PS VR2

PS VR2

Beyond graphics performance, eye tracking also unlocks a new input method. You can use it to let players select items from menus, start interactions with NPCs, use in-world tools, and more. Eye tracking could even become a focal point of your gameplay mechanics.

Eye tracking works in much the same way as other XR input devices. You have access to the components for eye gaze, a combination of position and rotation across both eyes that defines a place in the virtual world, which you can use to tell where the player is currently looking.

In addition to basic pose information, you will also have access to pupil diameter and blinking states for both of the player’s eyes. Combining these with the pose, you can start to form your own ideas around gameplay and interaction to more deeply engage with players.
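For instance, here is a sketch that reads per-eye openness (a reasonable blink proxy) through the standard UnityEngine.XR.Eyes feature. Note that pupil diameter is not part of the common Eyes struct, so we assume platforms surface it through their own feature usages:

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

public class EyeStateReader : MonoBehaviour
{
    void Update()
    {
        var devices = new List<InputDevice>();
        InputDevices.GetDevicesWithCharacteristics(InputDeviceCharacteristics.EyeTracking, devices);

        foreach (var device in devices)
        {
            if (device.TryGetFeatureValue(CommonUsages.eyesData, out Eyes eyes) &&
                eyes.TryGetLeftEyeOpenAmount(out float leftOpen) &&
                eyes.TryGetRightEyeOpenAmount(out float rightOpen))
            {
                // Treat both eyes being nearly closed as a blink; drive gameplay from it here.
                bool blinking = leftOpen < 0.1f && rightOpen < 0.1f;
            }
        }
    }
}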

Here is an example of a simple gaze-based reticle:

using UnityEngine;

public class GazeReticle : MonoBehaviour
{
    Vector3 m_GazePosition;
    Vector3 m_GazeForward;
    Quaternion m_GazeRotation;

    // In this script, gazeTracker is driven by a TrackedPoseDriver component wired up to the
    // Position/Rotation of the eyeGazePosition and eyeGazeRotation from the PS VR2 headset
    // in the Input System Actions.
    public GameObject gazeTracker;

    void Update()
    {
        m_GazePosition = gazeTracker.transform.position;
        m_GazeRotation = gazeTracker.transform.rotation;
        m_GazeForward = m_GazeRotation * Vector3.forward;

        // Snap the reticle to the first surface the gaze ray hits within 10 meters.
        if (Physics.Raycast(m_GazePosition, m_GazeForward, out RaycastHit hitInfo, 10.0f))
        {
            transform.position = hitInfo.point;
        }
    }
}

You’ll see a gaze tracker object that drives the movement of the reticle in this script, using the same TrackedPoseDriver that we might use to track any other XR controller in Unity. This one just happens to be tied to the eyeGazePosition and eyeGazeRotation set up in our Input System Action Map. There is also a TrackedPoseProvider specifically made to handle eye tracking if you are planning to use the more traditional Unity input methods.

Sense Controllers

The new Sense Controllers are available for Unity developers and include unique features found only on PS VR2.

Finger touch detection uses capacitive touch to detect when a player’s fingers are resting on the buttons without actually pressing them. These controls are available on all the primary buttons and thumbsticks, so you can use them to drive more natural gestures with players’ hands during gameplay. You could also, in a more basic approach, drive a hand model to enable players to “see” where their fingers are when they look at the controller. This can really help players stay focused and immersed in an experience without having to lift the headset up or feel around for a specific button.
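As an illustration, the sketch below polls touch states through the standard XR input usages. We’re assuming here that the Sense controllers map finger rest detection onto primaryTouch, secondaryTouch, and primary2DAxisTouch; the exact mapping is ours, not confirmed:

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

public class FingerTouchReader : MonoBehaviour
{
    void Update()
    {
        var devices = new List<InputDevice>();
        InputDevices.GetDevicesWithCharacteristics(
            InputDeviceCharacteristics.HeldInHand | InputDeviceCharacteristics.Controller, devices);

        foreach (var device in devices)
        {
            // Capacitive touch reads as a bool per control: resting finger = true, lifted = false.
            device.TryGetFeatureValue(CommonUsages.primaryTouch, out bool primaryTouched);
            device.TryGetFeatureValue(CommonUsages.secondaryTouch, out bool secondaryTouched);
            device.TryGetFeatureValue(CommonUsages.primary2DAxisTouch, out bool stickTouched);

            // For example, relax the matching finger on a hand model when its control isn't touched.
        }
    }
}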

PS VR2 uses inside-out tracking technology for the new system, giving you six degrees of freedom (6DoF) tracking for both the headset and controllers. You can now use most, if not all, of the standard Unity XR stack, making it easier to develop your games for broader platform reach. For the controllers themselves, we have exposed these input controls through both the traditional Unity Input Manager and the newer Input System package.

In addition to eye tracking and controller input, PS VR2’s SDK also allows full control over PS VR2 Sense technology haptics. This includes audio-based haptic feedback to provide a deeper experience for players, as well as more traditional vibration support. The new controllers also include the same Adaptive Triggers available with PS5 DualSense controllers, meaning you can program the triggers with different styles of feedback based on game context. In addition to controller-based haptics, PS VR2 adds headset feedback, allowing you to give players adjustable vibration in the headset. This could be used to alert players of an event, or combined with audio to add more realistic sensation to experiences.
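On the controller side, a basic vibration pulse can go through Unity’s standard XR haptics API; audio-based haptics and headset feedback are platform features outside this generic path. A minimal sketch:

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

public class HapticPulse : MonoBehaviour
{
    // Send a single impulse to the right-hand controller, e.g. on a successful grab.
    public void PulseRightHand(float amplitude = 0.5f, float duration = 0.1f)
    {
        var devices = new List<InputDevice>();
        InputDevices.GetDevicesAtXRNode(XRNode.RightHand, devices);

        foreach (var device in devices)
        {
            // Only send the impulse if the device advertises impulse-style haptics.
            if (device.TryGetHapticCapabilities(out HapticCapabilities caps) && caps.supportsImpulse)
            {
                device.SendHapticImpulse(0u, amplitude, duration);
            }
        }
    }
}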

We have worked hard to give you flexibility when it comes to integrating PS VR2 input and haptics into your games. With a combination of tracking improvements and a standard Unity XR SDK for PS VR2, you can leverage the full Unity XR stack, including things like the XR Interaction Toolkit, other XR SDK-dependent assets, Unity Asset Store packages, and other packages available through the Unity Package Manager.

These features allow you to explore new forms of gameplay and worldbuilding.

Get started with PS VR2

PS VR2 is available today, and you can build for it with Unity 2021 LTS and later. Foveated rendering requires Unity 2022.2 and later. We’re excited to see what this headset means for the VR industry as it unlocks a new level of performance and input controls for you to build even more immersive and exciting experiences.

You’ll need an active Unity Pro subscription (or a Preferred Platform license key provided by the respective platform holder) to access these build modules, which are distributed through the PlayStation developer forums. Register here to become a PlayStation developer.

Share your PS VR2 and PS5 games with us using the #MadewithUnity hashtag on social media. Have questions? Registered PlayStation developers can connect with us in the Unity forum on the PlayStation developer site.
