Well, the Vive Pro with eye tracker is out



I know, that’s what my reference to “whack-a-mole” was about (the demo game @ CES)
Using it as an input device is what the hardware vendor will deliver you “for free” with the device, but incorporating the HMD-specific things for foveated rendering or distortion correction needs a lot of good (fast) code from Pimax (besides the money for the 6000 free units). So coding this part might be of lower priority than other things because it’s still too expensive as a free giveaway; the official version might be that it’s still in R&D, but the reason it’s that way might be (indirectly) financial.
At least they can now have a look at how HTC does it (the XTAL and StarVR One are not that common)

As with hand tracking, that will take much longer than most expect. The thing we will see first (imho) is foveated rendering (we have the fixed version already; the spot with high resolution “only” needs to move around, as a 2nd step)


I freaking love the Fixed Foveated rendering in my Pimax.


The right time to release is when all the backers are gone and don’t respond to the email about confirming their address; then Pimax won’t have to spend money sending most of these out. This only costs Pimax money, since we are supposed to get them for free.


Unless Pimax is using multi-res shading for the fixed foveated rendering, they shouldn’t have too many issues to worry about. Literally all they have to do is dynamically adjust the shading region to where the eye is looking. As seen on Nvidia’s website, it’s very simple. Aliasing might be a problem, though; the fixed foveated rendering we have now already suffers from that issue. Even if they don’t add it, when Pimax goes open source and we can access the rendering pipeline, it shouldn’t be very hard to add at all. People could probably just piggyback off the fixed foveated rendering code to get it working.
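The “just move the high-res spot to follow the gaze” idea can be sketched in a few lines. This is a toy illustration only, not Pimax’s or Nvidia’s actual API: it assumes gaze arrives as normalized (x, y) in [0, 1] and just computes where the full-resolution window should sit, clamped to the frame.

```python
# Toy sketch: move the high-resolution foveation window to follow the gaze.
# Assumes gaze comes in as normalized (x, y) in [0, 1]; all names here are
# made up for illustration.

def foveation_window(gaze_x, gaze_y, frame_w, frame_h, inner_frac=0.25):
    """Return (left, top, right, bottom) of the full-res region in pixels,
    centered on the gaze point and clamped to stay inside the frame."""
    half_w = int(frame_w * inner_frac / 2)
    half_h = int(frame_h * inner_frac / 2)
    cx = int(gaze_x * frame_w)
    cy = int(gaze_y * frame_h)
    left = max(0, min(cx - half_w, frame_w - 2 * half_w))
    top = max(0, min(cy - half_h, frame_h - 2 * half_h))
    return (left, top, left + 2 * half_w, top + 2 * half_h)

# Gaze at the center of a 2560x1440 eye buffer keeps the window centered:
print(foveation_window(0.5, 0.5, 2560, 1440))  # (960, 540, 1600, 900)
```

Every frame you’d re-run this with fresh gaze data and feed the window to whatever variable-rate shading mechanism the pipeline exposes; the fixed-foveated case is just this function with the gaze pinned to (0.5, 0.5).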


Could indeed be used to point with the eyes (Macross Missile >__<), but it could also be used for interaction with NPCs, or even to time events to when the user blinks (jumpscare incoming).
As for the rendering, apart from easing the rendering load, you could also play with depth of field: the system would finally know where you are actually looking and where it should blur, instead of deciding where you should be looking.
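That gaze-driven depth-of-field idea boils down to: sample the depth under the gaze point, then blur each pixel in proportion to how far its depth is from that focal depth. A minimal sketch, assuming a plain depth buffer in meters; the function and its parameters are illustrative, not any engine’s real API:

```python
# Toy sketch of gaze-driven depth of field: the depth under the gaze point
# becomes the focal plane, and blur grows with distance from it.
# All names and numbers are illustrative assumptions.

def blur_radius(pixel_depth, focal_depth, max_radius=8.0, falloff=2.0):
    """Blur radius in pixels: 0 at the focal plane, growing with the depth
    gap, capped at max_radius."""
    gap = abs(pixel_depth - focal_depth)
    return min(max_radius, falloff * gap)

depth_buffer = [1.0, 2.0, 5.0, 10.0]   # meters, one value per "pixel"
focal = depth_buffer[1]                # depth sampled under the gaze point
radii = [blur_radius(d, focal) for d in depth_buffer]
print(radii)  # [2.0, 0.0, 6.0, 8.0]
```

The pixel at the focal depth stays sharp while nearer and farther pixels get progressively larger blur kernels; a real implementation would do this per pixel in a post-process shader rather than in Python.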

For that, I suggest you look at the FOVE headset promotional campaign video.


does it work ok?
I only have a 1080, so no idea how it stands up


Fove has been around for a long time. Not sure if they’ve updated to a new headset


As far as I remember, they delivered their headset in 2017, then went MIA.
Despite a couple of exclusive partnerships (like the Sword Art Online app), the product didn’t sell, and I think they have now moved to the professional market instead of the consumer one.

Still, I was just illustrating the benefit of eye tracking through their video.


Indeed. I think Fove was also looking at trying to license their tech to other HMDs.


HTC has officially released its foveated rendering SDK for the Vive Pro Eye. If now isn’t the time for Pimax to jump into the scene with their eye tracking module, then when?


Reducing bandwidth with wireless sounds simple enough. Just render part of the frame in full res, and the rest in low res. Then you send both parts of the frame to the wireless adapter, which then combines the two parts into one frame. Boom, you’ve successfully reduced bandwidth by a ton.
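A quick back-of-the-envelope check of that transport idea: send a small full-res gaze region plus a downscaled copy of the whole frame, and compare against sending the whole frame at full res. The frame size, region fractions, and bytes-per-pixel below are illustrative assumptions, not Pimax’s actual numbers:

```python
# Rough bandwidth comparison: full frame vs. foveated transport
# (full-res fovea crop + half-res periphery). Illustrative numbers only.

def bandwidth(frame_w, frame_h, fovea_frac=0.25, periphery_scale=0.5, bpp=3):
    """Return (full-frame bytes, foveated-transport bytes) per frame."""
    full = frame_w * frame_h * bpp
    fovea = int(frame_w * fovea_frac) * int(frame_h * fovea_frac) * bpp
    periphery = (int(frame_w * periphery_scale)
                 * int(frame_h * periphery_scale) * bpp)
    return full, fovea + periphery

full, foveated = bandwidth(2560, 1440)
print(full, foveated)  # 11059200 3456000
```

With these particular fractions the foveated stream is about 31% of the full frame, so the savings are real, though the receiver still has to composite the two layers, and the gaze region has to arrive with low enough latency that the seam stays outside the fovea.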