Brainwarp, skipping an eye vs Interlacing


Any method could be used if the gfx drivers support it; it’s easy and nothing to worry too much about. The simplest way would be to render every single full frame, then send the even lines to the left panel and the odd lines to the right panel, at virtually no cost to the gfx card at all.
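A toy sketch of that even/odd line split (plain Python, treating a “frame” as just a list of rows; in reality the drivers or scan-out hardware would do this, not application code):

```python
# Toy illustration: route even scanlines to the left panel and odd
# scanlines to the right panel. A "frame" here is just a list of rows.
def split_interlaced(frame):
    left_panel = [row for i, row in enumerate(frame) if i % 2 == 0]   # even lines
    right_panel = [row for i, row in enumerate(frame) if i % 2 == 1]  # odd lines
    return left_panel, right_panel

frame = [f"row{i}" for i in range(8)]
left, right = split_interlaced(frame)
print(left)   # rows 0, 2, 4, 6
print(right)  # rows 1, 3, 5, 7
```

The point being that the split itself is trivial; the hard part is everything downstream of it.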

But with any method like these, including Brainwarp, there will always be some perceived image tearing, glitching and frame jumping, because the panels aren’t fast enough to mask the anomalies, unless you have a panel that can show the image at 200Hz without a single missed frame/line or partial screen draw.

All that is needed is faster panels. Most high-end gfx cards are already capable of pushing up to 700fps, except that of course such a fast render would be a total waste of computing power; anything over 120 would almost cancel the problem…


Aye, but then you are rendering a full frame for each camera (to keep it stereoscopic) just to get those odd/even lines with the current rendering pipeline, so it would not save on performance. We would need a mod to the game engine that could do this. Then we could get all creative and render a fixed noise pattern (or every other pixel) instead, so that we don’t have solid odd or even lines of nothing in one eye.
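A hypothetical sketch of that “every other pixel” idea: a checkerboard mask per eye, so neither eye ever has whole blank lines (this is purely illustrative; no current engine exposes anything like it):

```python
# Hypothetical checkerboard mask: each eye only shades the pixels
# whose (x + y) parity matches its own, so the two eyes' masks
# tile the screen with no solid blank lines in either one.
def checkerboard_mask(width, height, eye):
    # eye = 0 shades "even"-parity pixels, eye = 1 the "odd" ones
    return [[(x + y) % 2 == eye for x in range(width)] for y in range(height)]

left = checkerboard_mask(4, 4, 0)
right = checkerboard_mask(4, 4, 1)
# Together the two masks cover every pixel exactly once.
```

Each eye would then only shade half its pixels per frame, which is the hoped-for saving, at the cost of engine-level changes.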


So we just need our PCs to render at 180fps, with one side blank, alternating, and you wouldn’t see the blank space? Where’s my 1180ti???


Even just a GTX 1060 with G-Sync can do that… rendering power is different from the card’s capacity to push frames to the display, please note.

You just need a panel that can keep up with the fast screen refreshes the card is pumping out… the VR panels announced for next year will do, provided they finally remove the crappy logic and bandwidth restrictions panels have had until now.


Depends if you just want to try something that has been tried and used in every conceivable manner already, and accept a slightly different perceived anomaly on the screen (and train your eyes/brain to avoid noticing it and being pissed off about it), or take the easy way and demand better panels that don’t make you feel sick, completely solving the problem once and for all :smiley:


Or you can simply take it as rendering interleaved right-eye/left-eye scenes. Consider the original picture Pimax put on KS:

Brainwarp is a set of technologies Pimax use in the VR headsets.

For example, 8K renders and displays image in a sequence. i.e. For each time, only one eye can see a 4K image. Pimax 8K renders a single 4K image at 150/180 times per second, but users perceive a complete 8K at 150/180 Hz with high frame rate.

Brainwarp boosts refresh rate, reduces latency and decreases GPU pressure for a smoother VR experience.

It suggests that in one scenario (left side of the picture) both the left and right eye are rendered each time, while the right side of the image suggests that only one eye, either the left or the right, is rendered, thus effectively saving half of the rendering effort. But Pimax also claim that the timeframe T1–T2 actually lasts 1/180th (1/150th) of a second, effectively twice the target refresh rate.

Either you run the headset at 90Hz refresh with both eyes in sync, thus rendering 90 double-WQHD scenes per second, or you render 180 single-WQHD scenes, interleaved. In both scenarios you render 2 × 90 = 180 WQHD scenes per second. No gain in performance.
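Spelling that arithmetic out (WQHD = 2560 × 1440 here; the pixel counts are my choice, the logic is the post’s):

```python
WQHD = 2560 * 1440  # pixels in one eye's scene

# Scenario A: 90 Hz refresh, both eyes rendered in sync
pixels_sync = 90 * 2 * WQHD

# Scenario B: 180 Hz, one eye per refresh, interleaved
pixels_interleaved = 180 * 1 * WQHD

# Identical rendering load per second either way.
assert pixels_sync == pixels_interleaved
```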

If the rendering of the left and right eye in (interleaved) Brainwarp happens at the same time point, then you may face continuity problems using Brainwarp, as one eye will be systematically delayed behind the other.

If the rendering of the left and the right eye in (interleaved) Brainwarp happens at different time points (delayed by T2 − T1), then you get correct continuity, but you can have problems with the stereoscopy, because the left and right images will never match the same scene. On top of that, you would need a modified engine to go this route, and you would lose the stereoscopic rendering advantage of the first case.

Pimax never explained how exactly Brainwarp should work, but neither case mentioned above really helps their case.


[quote=“D3Pixel, post:1, topic:7416”]

So what happens to the eye that is being skipped? Does it hold the previous frame or blank it?

If it holds the previous frame, then doesn’t that add two frames of latency to the motion-to-photon time? That would cause a strange effect in fast motion, wouldn’t it?

If it displays a black frame, then wouldn’t that create a strobe effect?
[/quote]

The HMD should already be showing black most of the time, with resulting strobing, if Pimax have done things correctly.

The strobing, or rather the mitigation of it, was a significant factor in why 90fps was settled on by Valve and Oculus as a lower refresh and frame rate threshold.

The HTC Vive and Oculus Rift both do this, under the moniker “low persistence”, in order to avoid the judder you mention, which one gets from sample-and-hold, and which is greatly exacerbated when the picture moves along with the turning of one’s head.

You still get persistence-of-vision discrete unblurred ghosts of previous frames (wave your laser pointers around quickly in the SteamVR dashboard environment, for an easy example, and observe the trail of beams), but at least there is no information that directly conflicts with the optical flow your brain would expect from your movement (e.g. the image staying put when it should pan as you turn).

Brainwarp should just interleave itself into this already existing strobing. Whilst you have doubled the frame update rate for both eyes added together to 180 (…or 160, rather), the refresh rate is still only 90 (80) per eye.

If you were to interlace the images, you would need to A) make the displays really refresh at 180Hz, if possible, rather than just offsetting their VBlank, and B) render 180 frames per second for both eyes, not just one. Nothing would truly be gained, other than that your transport traffic would be halved, thanks to throwing away half the rendered lines (one could conceivably render them at half height, of course, with some offset trickery, or at least rasterise them full height and only shade half).
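The transport-traffic point can be made concrete; assuming 2560 × 1440 per eye at 24 bits per pixel (numbers of my choosing, just to show the halving):

```python
FRAME_BITS = 2560 * 1440 * 24  # one full eye frame at 24 bpp

# True 180 Hz, full frames sent to both displays:
full_rate_traffic = 2 * 180 * FRAME_BITS

# Same 180 Hz, but half the lines discarded before transport:
interlaced_traffic = 2 * 180 * (FRAME_BITS // 2)

# Link traffic is halved...
assert interlaced_traffic * 2 == full_rate_traffic
# ...but the GPU still rendered 180 full stereo frames either way.
```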

All this is of course best if you can provide 180 real frames every second. If one has to synthesise frames, the positive effect is reduced, as usual.

( As a silly little side note: There is this little thing that’s been rattling about inside my cranium… So; many LEDs can function as a photodiode as well. Since we already have this long blank period between flashes of global-refresh imagery, I wonder if some display manufacturer could make one that reverses function during these periods: flash an IR LED, and capture a couple of frames of the user’s eye, collimated by the very same lens that helps you see the image — the most aligned eye tracking one could hope for, skipping some difficult steps. :7 )


I don’t think you understand. Each of the two panels never surpasses 80fps. They run in half phase of each other. In your stereo overlap area, it will be nearly indistinguishable from 160fps.
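A minimal sketch of that half-phase timing (timestamps in milliseconds; my own framing of the post’s claim, using 80Hz panels):

```python
# Two panels at 80 Hz, offset by half a refresh period.
period = 1000 / 80  # 12.5 ms per panel refresh

left_updates = [i * period for i in range(8)]
right_updates = [i * period + period / 2 for i in range(8)]

# Merged, the eye pair sees an update every 6.25 ms,
# i.e. 160 update "events" per second across both panels,
# even though each panel individually never exceeds 80 Hz.
merged = sorted(left_updates + right_updates)
gaps = [round(b - a, 4) for a, b in zip(merged, merged[1:])]
assert set(gaps) == {6.25}
```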


Yep, but this is just because you have to keep to the panel’s limits; this only applies to current LCDs, and the methods being used are just adaptations to that…

But it will never result in a true 180fps image, just a “perceived” sense of it…


That half-phase is exactly what I just described. That’s only half the design, though. If original frames are not being alternated, the motion will not be 180fps smooth.


As your visual cortex cannot interpret the alternating data at such a high frequency independently for each eye, because the brain combines the two signals into one image, the visual data should look a lot denser if it’s done in any of the ways I mentioned, and not just be dismissed as a pile of useless nothing hype.


True. Your periphery is very sensitive to motion. However, the center of your vision (your fovea) is actually more sensitive to framerate. You’re also more sensitive to framerate if you’ve been drinking coffee! Here’s a video which explains that…

The whole video is interesting, but the part about your fovea starts at 12:05 and the framerate difference is at 12:46. At 14:15 he talks about closing 1 eye vs. blanking its image, which is important for BrainWarp. He discusses framerate perception at 6:06.

Note that the “not needing 4K resolution” claim isn’t true for VR, since the lenses magnify the view and you can indeed see pixels.


Thank you…


Are you a neurophysiologist?? Because there are studies and patented systems affirming the contrary…

Other than that you’re welcome…


You wrote yours whilst I was still writing mine :slight_smile:


would be interesting to try it , with a packet of aspirin on hand


I thought this too. How do you explain their acceptance of the “Oculus Go” refresh rate? Has their ASW/ATW matured to the point where they can use a lower Hz with no visible discomfort?

To be honest, I would not want Brainwarp enabled if I can hit the max refresh of the display. I would only want it to kick in if frames are constantly dropping below 80, so not 80/160, but just sustaining 80 no matter what. Unless Brainwarping to 160fps has a positive effect. But then, this is not done on other HMDs; is it a fallback solution only?

This is why we have all these software enhancements in the first place: to overcome the limitations of the hardware. It is all very interesting, coming up with software tricks that most of us humans don’t notice. There is always room to improve, too.

@neal_white_iii sorry, must have hit reply to you while quoting @jojon lol


That image is what is making people think that there is a blank image when it’s alternating L/R.


An easy way to see for yourself that the problem is still there is the UFO tests:

Try the ghosting test, plus the other tests.

But you need to use a 144Hz G-Sync or FreeSync gaming monitor, set it to maximum refresh and run the ghosting test at that speed; you can also overclock the monitor to higher refresh rates (but caution!).

Either way, even if the resulting animation looks smoother than on most standard 75Hz monitors, there will always be some ghosting, blur or artifacts, because even those top gaming-monitor panels struggle to keep up with the constant frame rates required, despite the tricks being used to (supposedly) achieve them.


[quote=“D3Pixel, post:37, topic:7416”]
I thought this too. How do you explain their acceptance of the “Oculus Go” refresh? Has their ASW/ATW matured to the point they can use slower displays with no visible discomfort?[/quote]

Those are separate issues altogether. ATW/ASW is for when your rendering cannot keep up with the refresh rate of the display (which never stops to wait), and you have to tween “fake” frames to fill in the blanks, as a means of meeting its delivery demands. It doesn’t affect refresh in any way, but it does mean motion in the imagery may hitch here and there when active; still better than just repeating the last frame, though.

(As an aside: I believe in Oculus’ case, it is likely ATW may always be on, even when hitting the framerate target, in order to compensate for the user’s last bit of motion after the frame started rendering.)

The Oculus Go… People are probably just generally not quite as badly off with 60/75 as one may think. It is probably situation-dependent, with many technical and bio-/psychological factors playing in – the nervous system is a fickle thing. :7

People have been using shutter glasses with 120Hz TV sets, which leaves you with much the same 60Hz flicker, without much complaining that I have heard of (yes, I realise the situation is very different, in many ways, from a closed HMD) – I remember when we had that exact same technology back in standard-definition days (even fields to the left, odd to the right, at 30fps (…or 25 here in Europe)); that could be unpleasant :7.

Umm… I don’t see that it makes all that much sense under 80, personally; hitting 80 (or at least having really, really good frame synthesis bringing you up there) is pretty much a prerequisite for it to work. It is a trick for making things appear even smoother when you have already hit the refresh-rate ceiling – not a mitigation technique for when dropping frames.

It is a bit difficult to speak about, though, or get a half-secure grok handle on, because Pimax sometimes appear to use the term as a blanket label for all their tricks lumped together, including ATW/ASW equivalents, rather than only for the specific technique of interleaving refreshes between the two displays, which muddies the waters considerably.