We’re thrilled to share some exciting news with you. Wolvic is about to transform how you interact with the web in a VR environment with the introduction of eye tracking support! Starting with the just-released v1.7.0 on the Gecko backend and the highly anticipated v1.0 release on the Chromium backend, you’ll be able to control the browser pointer just by looking at what you want to interact with. While this feature is still being refined, it’s a fantastic start, and we can’t wait for you to try it out.

What Eye Tracking Brings to Wolvic

Eye tracking is one of those technologies that feels like magic when you first use it. Imagine reading a webpage in VR and being able to click on links just by looking at them. That’s the kind of mind-reading, seamless, natural interaction we’re aiming for with this new feature.

Here’s how it works: with the help of specialized hardware in the VR headset, Wolvic tracks where you’re looking and moves the pointer accordingly. So wherever your eyes go, the pointer follows. It’s that simple! This makes it easier to navigate websites and interact with web elements, without relying so heavily on physical controllers.
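Conceptually, mapping a gaze pose to a browser pointer is a ray–plane intersection: the runtime reports a gaze origin and direction, and the browser intersects that ray with the plane of the virtual web window to obtain 2D pointer coordinates. Here is a minimal, self-contained sketch of that math (illustrative only, not Wolvic’s actual code):

```cpp
#include <array>
#include <cmath>
#include <optional>

// Minimal 3D vector helpers; stand-ins for the engine's math types.
using Vec3 = std::array<double, 3>;

static Vec3 sub(const Vec3& a, const Vec3& b) {
  return {a[0] - b[0], a[1] - b[1], a[2] - b[2]};
}
static double dot(const Vec3& a, const Vec3& b) {
  return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
}

// Intersect a gaze ray (origin + direction) with the plane of a virtual
// browser window, returning 2D pointer coordinates in the window's basis.
// Returns nothing when the user is looking away from the window.
std::optional<std::array<double, 2>> GazeToPointer(
    const Vec3& rayOrigin, const Vec3& rayDir,
    const Vec3& windowOrigin,   // a corner of the window
    const Vec3& windowRight,    // unit vector along the window's width
    const Vec3& windowUp,       // unit vector along the window's height
    const Vec3& windowNormal) { // unit normal of the window plane
  const double denom = dot(rayDir, windowNormal);
  if (std::fabs(denom) < 1e-9) return std::nullopt;  // ray parallel to plane
  const double t = dot(sub(windowOrigin, rayOrigin), windowNormal) / denom;
  if (t < 0.0) return std::nullopt;                  // window is behind the user
  const Vec3 hit = {rayOrigin[0] + t * rayDir[0],
                    rayOrigin[1] + t * rayDir[1],
                    rayOrigin[2] + t * rayDir[2]};
  const Vec3 local = sub(hit, windowOrigin);
  return std::array<double, 2>{dot(local, windowRight), dot(local, windowUp)};
}
```

A real browser would then clamp those coordinates to the window bounds and convert them to CSS pixels, but the core of “wherever your eyes go, the pointer follows” is just this projection, run once per frame.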

Why It’s Exciting

We know eye tracking isn’t perfect yet—it’s still a work in progress. But even in its early stages, it’s a game-changer for browsing in VR. We’re starting to close the gap between you and the content, making everything feel more intuitive and engaging. Sure, we’re not quite at Vision Pro’s level, but we’re heading in the right direction, and we’re excited to take you along for the ride.

Having a well-calibrated device is crucial for eye tracking. Since the technology relies on accurately detecting where you’re looking, even a slight misalignment can lead to frustrating experiences, like the pointer not landing exactly where you want it to. Note that this is not exclusive to Wolvic: accurate calibration is also a must for Apple’s Vision Pro, for example. If you find Wolvic’s eye tracking more frustrating than rewarding, consider recalibrating your device’s eye tracking settings (in the future, extensions like XR_ML_user_calibration could be used to automatically determine how well the device is calibrated).

Using Eye Tracking Alongside Your Usual Controls

One thing we’re really proud of with this update is how well eye tracking integrates with the tools you’re already using in Wolvic. Whether you prefer hand tracking or physical controllers (or your device only supports one of them), eye tracking is designed to work smoothly with your setup.

  • Hand Tracking: If you’ve already got hand tracking enabled, you can use your gaze to move the pointer and your hands to click, scroll, and interact with content. Pinching acts as a click, and pinching while moving your hand up or down lets you scroll.

  • Physical Controllers: For those who love their controllers, just guide the pointer with your eyes, then use your controller’s buttons for actions like clicking, grabbing, or scrolling.

This flexibility means you can mix and match how you control Wolvic, making the browsing experience truly your own.
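For the curious, pinch-as-click and pinch-drag-as-scroll can be distinguished from just two measurements: the thumb–index distance and how far the hand has travelled vertically while pinched. The sketch below shows one plausible way to classify them; the thresholds (`kPinchDistanceM`, `kScrollDeadzoneM`) are illustrative assumptions, not Wolvic’s actual values:

```cpp
#include <cmath>

// Hypothetical thresholds; real runtimes tune these per device.
constexpr double kPinchDistanceM = 0.02;   // thumb-index gap treated as a pinch
constexpr double kScrollDeadzoneM = 0.01;  // vertical travel before a pinch becomes a scroll

enum class Gesture { None, Click, Scroll };

// Classify a hand-tracking frame: a pinch alone is a click; a pinch
// combined with vertical hand movement beyond a deadzone is a scroll.
Gesture ClassifyGesture(double thumbIndexDistM, double verticalTravelM) {
  if (thumbIndexDistM > kPinchDistanceM) return Gesture::None;
  if (std::fabs(verticalTravelM) > kScrollDeadzoneM) return Gesture::Scroll;
  return Gesture::Click;
}
```

The deadzone matters: without it, the tiny hand jitter that accompanies every pinch would turn intended clicks into accidental scrolls.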

Getting Started with Eye Tracking

Ready to give eye tracking a try? You’ll need a VR headset that supports eye tracking, and then all you have to do is update to Wolvic’s latest version (v1.7.0 for the Gecko backend or v1.0 for the Chromium backend). After updating, you can enable eye tracking in Settings -> Controller Options as shown in the picture, and start exploring this new way of interacting with the web.

Controller Options dialog with the new setting to enable eye tracking navigation

So far we have successfully tested it with the following devices:

  • Pico 4E¹
  • MagicLeap 2

The Juicy Details

If you are not familiar with OpenXR, you can skip this section. For XR API geeks, “supports eye tracking” really means supporting the XR_EXT_eye_gaze_interaction OpenXR extension. Wolvic’s multi-backend architecture still supports different SDKs, but eye tracking was implemented only in the OpenXR backend (very likely the one you’re using now).

OpenXR makes it easy to get eye gaze input, much like retrieving input from controllers or hand tracking. It introduces a new interaction profile, /interaction_profiles/ext/eye_gaze_interaction, for the .../input/gaze_ext/pose input source. This means you can use the same xrLocateSpace() function that you use to track the headset’s position to also track where users are looking. We won’t dive too deep into the technical details here, but if you’re interested, you can check out the sample code in the OpenXR specification or look at Wolvic’s implementation for more information.

When we set out to add eye tracking support to Wolvic, we knew it wouldn’t be as simple as just plugging it into the existing controller framework. Eye tracking doesn’t provide a virtual controller—just a pointer. Unlike physical controllers or hand tracking, where you can click buttons, squeeze triggers, or use joysticks, eye tracking alone can’t handle these actions effectively. Imagine trying to click by blinking; it sounds good in theory, but given that the average person blinks around 15 times a minute without thinking about it, it’s hardly practical.

Scrolling presented its own set of challenges. Typically, users scroll by pointing at an area, clicking, holding, and dragging—actions that are intuitive with a physical controller or hand tracking. But with eye tracking, this approach falls apart. You’d need to look at the scrollable area, click to engage, and then move your eyes up or down to scroll. Not only does this make it impossible to read while scrolling, but it also leads to significant eye fatigue.

To make scrolling more comfortable, we took a different approach. When Wolvic detects that you’re about to scroll, it temporarily turns off² eye tracking so you can scroll like you normally would with a controller or hand. This way, you don’t have to worry about straining your eyes or losing your place while reading.
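In other words, the “turning off” is really a switch of aim source: while a scroll gesture is in progress, the gaze-provided aim is ignored and the controller/hand pose drives the pointer instead. A minimal sketch of that selection logic, with stand-in types (not Wolvic’s actual code):

```cpp
enum class AimSource { EyeGaze, Hand };

struct InputState {
  bool eyeTrackingEnabled = false;
  bool scrollGestureActive = false;  // pinch-drag or trigger-drag in progress
};

// Pick which pose drives the pointer this frame. Eye tracking is never
// actually stopped; its aim is simply ignored in favor of the
// controller/hand pose for as long as the scroll gesture lasts.
AimSource SelectAimSource(const InputState& s) {
  if (!s.eyeTrackingEnabled) return AimSource::Hand;
  return s.scrollGestureActive ? AimSource::Hand : AimSource::EyeGaze;
}
```

Because the switch happens per frame, the pointer snaps back to the gaze position the moment the scroll gesture ends, so the user never has to re-aim.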

Risks of Eye Tracking Technologies

While eye tracking opens up new possibilities for interacting with the web, it also comes with certain risks, particularly around privacy and user data. For instance, eye tracking can potentially be used for fingerprinting or surveillance by tracking where you look and how long you linger on certain elements.

We want to be transparent that, at this stage, Wolvic has not yet implemented mitigations for these risks due to limited time and resources. For now, whenever you try to enable eye tracking, Wolvic shows a prominent alert dialog to ensure you’re aware of these potential privacy concerns.

Alert dialog warning about eye tracking technology risks

Our goal is to develop robust protections similar to those found in Safari for Vision Pro, which, for example, prevents hover events triggered by eye gaze from being sent, helping to avoid unwanted tracking. Achieving this will require significant investment and, likely, the support of more partners within the Wolvic project. In the meantime, we encourage you to use eye tracking carefully and to stay informed about the privacy implications.

Be Part of the Future of VR Browsing

We’re so excited to see how you use eye tracking in Wolvic. It’s a big step toward making VR browsing feel more natural, and while we know there’s still work to do, we think you’re going to love it. Your feedback will be super valuable as we continue to refine and improve this feature.

You can download or update Wolvic from the stores, or grab it directly from our downloads page and sideload it. Give eye tracking a try, and let us know what you think!

Conclusion

With eye tracking, Wolvic is pushing the boundaries of what VR browsing can be. This is just the beginning, and we’re excited to keep innovating with your help. Stay tuned for more updates, and happy browsing!

It’s easy to think that because a project is open source, it’s free in every sense of the word. But in reality, developing and maintaining something like this takes a lot of time, energy, and yes, money. We can’t just assume that there will always be people around to keep it going. If we want Wolvic to survive, we need more people and organizations to step up and invest in its future. It’s a collective effort, and with more hands on deck, we can ensure it continues to grow and benefit everyone involved.


  1. The ‘E’ for Enterprise is important, as the consumer Pico 4 does not support eye tracking. ↩︎

  2. Eye tracking is not actually turned off; we just ignore the aim provided by eye tracking and use the poses of the controllers/hands to scroll. ↩︎