Brainwarp Software Progress?


You clearly do not get what Brainwarp is supposed to do.
You are talking about 3D, where each eye gets the right picture at the same moment. Of course that works. But I clearly see the flickering with 60 Hz (120 Hz) 3D Vision, so 3D Vision would definitely improve with higher refresh rates. But the core limit holds further development back, and NVIDIA is not ready to put enough resources into 3D gaming. We are clearly a minority.

Brainwarp wants to delay the frames between the eyes, and with that comes the corresponding problem. So this has to be fixed (cheated) somehow…


mr.muu has a point. At least to me, the alternating-frames technique does not sound so good, especially when the refresh rate is so low.
At least I can spot the flicker in active 3D; it's like a super-fast strobe light to my visual cortex.
Over hours it gets less annoying, but it's still annoying as fuck.
The only way I can enjoy 3D is a constant image stream to both eyes simultaneously: either a passive 3D TV or a dual-projector setup.


Brainwarp is probably basically like MP3: everybody says it sounds exactly the same as uncompressed audio.
Sadly, it does not, ESPECIALLY when you crank up the volume. The louder you get, the more the difference shows, sadly.
There are no golden shortcuts, ever. Everything is a compromise, at least of some sort.


I say screw the fancy controllers and pour resources into making Brainwarp and their ASW equivalent great.


Well excuse me then.

I never noticed flickering in 3D at 120 Hz.
Maybe you are just very sensitive in this matter?

But when you see those people test the Pimax 8K v2 or even v3, why don't they get sick? I remember someone posted about a guy who was very sensitive to such issues, and he had no problems with the Pimax 8K versus the Vive/Rift.

Everyone is different. For some people, maybe VR is not their thing no matter how much they want it?
Some people get sick on a boat, but most people don't. Those who get sick on the boat can't demand that the boat float without hitting the waves up and down, imho. Make your own conclusions; if it doesn't work for you, sell it. No harm done. You will get your money back, since you paid less than official retail when it hits.


People who tried v2 or v3 will not see flickering; VR 3D does not use shutters to blank the eye - except the Pimax 4K :smiling_imp:

So please let‘s not compare apples (3D gaming) with oranges (Brainwarp) and stay on topic.

I am pretty certain that so far Brainwarp is not developed and active, so nobody who tested v2 or v3 experienced it. And again: I am a fan of Pimax and not trolling. With my (limited) engineering background, it just does not make sense to me right now, and I am curious about how it will work…


While it's true that nobody sees flickering, and it's probably not even worth mentioning, I don't think 3D gaming (VorpX, TriDef 3D) is off topic.
Like I've said before, hopefully Brainwarp will be natively compatible with VorpX and use all 180 frames from a game, instead of VorpX reducing it to 90. It's important that Ralf, the owner of VorpX, listens to and communicates with the Pimax team, so they can achieve the true potential of this system.


I’m not trying to troll, but while you are correct about 3D monitors with LCD shutter glasses, you’ve missed a key point about Brainwarp: The left and right eye images are shifted in “game time”.

That is, a 3D monitor flips between two images which are rendered at the same tick of the game's clock. Therefore, any spatial differences are solely due to parallax.

With Brainwarp, the left and right eye images are no longer synchronized. Assuming that the game is rendering at 180 Hz, the left and right eye images will be calculated at different game clock ticks; they will be separated by 0.0056 sec. While that's not much time, it might be noticeable (depending on the game), which would lead to spatial anomalies.

For example, a fast-moving missile might move a significant distance in that short amount of time, in which case the left and right eye images would show the missile offset by the traveled distance in addition to the parallax. That would mean that the missile would apparently be at the wrong distance. In a first-person game, turning quickly might cause similar spatial anomalies.

In practice, I don’t think this would be noticeable in most cases, but it might be an issue for some fast-action games. It would be a bigger issue if the game were only rendering at 60 Hz, because a slower game clock-rate would allow more time to pass between the image rendering for each eye.
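As a rough illustration of the timing involved (purely hypothetical object speeds, not anything Pimax has published), the inter-eye offset and the resulting displacement can be computed directly:

```python
# Illustrative arithmetic only: how far a fast-moving object travels during
# the inter-eye time offset that an alternating-eye scheme would introduce.
# The speeds below are made-up examples.

def inter_eye_offset_s(render_hz: float) -> float:
    """Time between the left-eye and right-eye renders, in seconds."""
    return 1.0 / render_hz

def travel_m(speed_m_s: float, render_hz: float) -> float:
    """Distance an object moves between the two eye images."""
    return speed_m_s * inter_eye_offset_s(render_hz)

if __name__ == "__main__":
    for hz in (180.0, 60.0):
        for speed in (10.0, 300.0):  # a runner vs. a fast missile
            print(f"{hz:5.0f} Hz, {speed:5.0f} m/s -> "
                  f"{travel_m(speed, hz) * 100:.1f} cm between eyes")
```

At 180 Hz a 300 m/s missile shifts about 1.7 m between the two eye images; at 60 Hz it's triple that, which matches the point above about slower game clocks making it worse.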


I am in wait-and-see mode, and since we are getting our HMDs before anyone else, we are going to be the beta testers, not just the first customers.

I hope for an enjoyable and quite possibly bumpy ride.

I also think/hope it's not as simple as how 3D monitors do their stereoscopy; it could be the silver bullet for high-res VR, or something most will try not to rely on, like ASW.


VR without Brainwarp is much the same as what 3D monitors do. Each eye sees a slightly different image with parallax offsets. I think Brainwarp has the potential to be better than ASW, for most games, since there’s no image manipulation. Each frame is exactly what was drawn by the game itself.

Depending on how Brainwarp is implemented, it's possible that people might perceive stuttering, instead of spatial anomalies.

You’re correct, as “beta testers”, we’ll get to see Brainwarp first. If there are weird artifacts, we can always disable it.


My concern about Brainwarp is, if both frames are rendered at the same time and there’s a delay in sending one of them, it just seems like a bad idea. At 90 Hz that introduces an additional 11 ms of lag. Their marketing material specifically mentions that Brainwarp reduces lag, so that can’t be how it works.

Here's the image they provide, which shows frames rendered in an alternating sequence. It looks like frames are being rendered and sent without delay, but only to one eye at a time. If this is the case, my concern is what the non-rendering display is doing while it's waiting. If it is displaying no image, it essentially works like shutter glasses which are only open 50% of the time, which means a 50% reduction in perceived brightness.

If the non-rendering display is still showing the last image that was sent to it, then that will create a mismatch between what your eyes are seeing. That could be troublesome for fast-moving objects that are seen by both eyes, as each eye will be reporting a different position of the same object. Imagine trying to look at a spinning wheel, for example, when each eye reports a different position of the spokes.
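To put a rough number on the spinning-wheel example (my own hypothetical figures, assuming a 1/180 s gap between the images the two eyes hold):

```python
# Hypothetical illustration: how far a wheel's spokes rotate during the
# 1/180 s gap between the images shown to the two eyes.

def spoke_mismatch_deg(rpm: float, inter_eye_s: float = 1.0 / 180.0) -> float:
    """Degrees of rotation that occur between the two eye images."""
    deg_per_s = rpm / 60.0 * 360.0
    return deg_per_s * inter_eye_s

# A car wheel at highway speed spins at very roughly 1000 rpm.
print(f"{spoke_mismatch_deg(1000.0):.1f} degrees")  # ~33.3 degrees
```

Tens of degrees of spoke rotation between the eyes would indeed be a very visible mismatch, if the stale-image interpretation is correct.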


According to this picture, they need native support from the game's rendering engine, because they need to render only one frame per timeslot. This is so far not possible - afaik.

In between the frames there is a backlight blank, so no picture (or black) is visible while the panel switches the liquid crystals for the next picture. Remember that each Tx stands for 1/180 s, and therefore the display needs two timeslots to display a picture.
The engine then also needs to support updating the physics, AI, etc. at 180 Hz, so that also needs processing power and time…
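Taking the diagram at face value (my assumptions: each timeslot Tx is 1/180 s, and a panel holds a picture for two timeslots), the timing budget works out as follows:

```python
# Sketch of the implied timing budget, assuming Tx = 1/180 s and
# two timeslots per displayed picture, as described above.

TIMESLOT_S = 1.0 / 180.0        # one Tx
PANEL_HOLD_S = 2 * TIMESLOT_S   # two timeslots per displayed picture

print(f"timeslot:           {TIMESLOT_S * 1000:.2f} ms")
print(f"panel hold:         {PANEL_HOLD_S * 1000:.2f} ms "
      f"(each panel effectively refreshes at {1 / PANEL_HOLD_S:.0f} Hz)")
print(f"engine tick budget: {TIMESLOT_S * 1000:.2f} ms per tick at 180 Hz")
```

So each panel still only refreshes at 90 Hz, but the engine would have to finish physics, AI, and rendering for one eye in about 5.6 ms, every timeslot.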

This picture is exactly why I think Brainwarp is a good idea, but right now, because of the lack of support from the rendering engines, not possible.

Only if there is native support in the rendering engine does this picture make sense… at least to me, but I could be very wrong. I hope I am wrong…


What strikes me is that we have a thread that's full of what I think are meaningful theories and concerns, yet we have not seen a single reply from anyone associated with the project. It makes me think this is a subject they aren't ready to talk about.

Personally, I am hoping my system will not need Brainwarp or any other helper tool, but it would sure be reassuring to know that there was software with feasible solutions to performance issues.


I would like to point out one more potential benefit of the alternate-screen refresh technique (Brainwarp). From a pure GPU-load perspective, comparing rendering a single screen at 180 FPS against rendering both screens at 90 FPS (to match the theoretical, for now, 180/90 Hz), my understanding is that you would actually lessen the GPU workload by rendering half the pixels at double the FPS. Here's my explanation:

In normal mode, the 8K version for example renders games at 5120x1440@90FPS; for VR you also have to render the same scene from 2 viewpoints (1 per eye) to get the stereoscopic 3D effect VR is known for. That is additional work for the GPU. With "Brainwarp" you need to render only 1 view at a time, since during any one timeslot (1/180 of a second) only 1 eye gets an image. So we can render 2560x1440@180FPS monoscopically. As for how much of a performance bump you may get, Oculus already has research about this.

As for whether this is done at the per-game level or not, I don't have the best knowledge of the rendering pipeline, but I read somewhere, when Valve announced that LCD is now viable for use in high-end VR HMDs, that there are hardware and software improvements that enable such a statement. My understanding is that making LCDs low-persistence is a hardware thing, and some technique similar to Brainwarp is the software "trick" they refer to. This will require a lot of testing and validation, which I don't think Pimax has the manpower to achieve (and Valve does :slight_smile: ). Just speculating here, but interesting to someone other than myself, I hope :wink:
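For what it's worth, a back-of-the-envelope check on the raw fill rate (using the resolutions quoted above) shows the two modes shade the same number of pixels per second; any saving would therefore have to come from rendering one viewpoint per timeslot instead of two (scene submission, culling, per-view CPU work), which is the part the Oculus research would address:

```python
# Pixels shaded per second, comparing both-eyes-at-90 vs one-eye-at-180.
# This deliberately ignores per-frame CPU/geometry overhead, which is
# where any real saving would have to come from.

def pixels_per_second(width: int, height: int, fps: int) -> int:
    return width * height * fps

both_eyes = pixels_per_second(5120, 1440, 90)   # both panels, 90 FPS
one_eye   = pixels_per_second(2560, 1440, 180)  # one panel per slot, 180 FPS

print(both_eyes, one_eye, both_eyes == one_eye)  # identical raw fill rate
```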


Even if your system doesn't need Brainwarp, it still enhances visuals.


How so, Aesop? Does it aid by allowing higher supersampling of the image, or something like that?


No, it doubles the perceived framerate by actually doubling it, but renders a single eye instead of two eyes per frame. Whether they can make this work universally for Unity, Unreal, and Source 2 is yet to be seen. I really hope so; it's a very clever way to make your HMD feel like it's super-powered.
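A toy sketch of the alternating schedule being described (my own illustration, not Pimax code): at 180 renders per second, each eye individually still receives 90 new images per second, but the headset as a whole updates every 1/180 s.

```python
# Toy model of an alternating-eye frame schedule: one eye per frame.

def brainwarp_schedule(num_frames: int):
    """Yield (frame_index, eye) pairs, alternating eyes each frame."""
    for i in range(num_frames):
        yield i, "left" if i % 2 == 0 else "right"

frames = list(brainwarp_schedule(6))
left  = [i for i, eye in frames if eye == "left"]
right = [i for i, eye in frames if eye == "right"]
print(frames)
print(left, right)  # [0, 2, 4] [1, 3, 5] -- each eye gets half the frames
```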


Okay, I wasn't familiar with this functionality. Yes, it sounds like a really big deal if they can actually get it out. I just haven't seen anything other than theory, and I'm starting to think they don't have it working yet. I really hope it comes together as planned.


When comparing Brainwarp to 3D shutter glasses, or even the Pimax 4K, one has to remember that those systems are trying to give us 3D vision based on a single screen. Where the Pimax 8K differs is that each eye gets its own screen.

The effect of strobing shutters will not apply per se; instead, each eye is fed a separately rendered image at a minimum of 60 Hz to a maximum of 90 Hz, with a slight offset to the other eye. How this affects our perception of the 3D image is anyone's guess, but it is not the same as a single screen with shutter glasses. Our eyes work as 2 input sources that are processed into a single view field.


Right, ideally, we won’t need Brainwarp, but right now, mainstream video cards aren’t really powerful enough to drive VR at high quality settings. Things will be different in the future. Until then, I hope Brainwarp works without noticeable artifacts (like stuttering, weird depth issues, or headaches).

FYI - Recent NVIDIA cards (the 10x0 series) can draw both left and right images much faster than the cost of drawing 2 separate frames, due to "render target" optimizations, which allow the same scene to be drawn from up to 6 different camera positions without having to resubmit the scene data. Brainwarp will therefore be most useful to people like me, who have a GTX 980Ti.
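As a rough cost model of why that kind of multi-view optimization helps (the millisecond figures here are entirely made up, just to show the shape of the saving): the scene-submission cost is paid once instead of once per eye, while per-view shading is unchanged.

```python
# Hypothetical cost model: single-pass multi-view vs. two separate passes.
# The millisecond figures are invented for illustration only.

def frame_cost_ms(submit_ms: float, shade_ms_per_view: float,
                  views: int, single_pass: bool) -> float:
    """Total frame cost: scene submissions plus per-view shading."""
    submits = 1 if single_pass else views
    return submits * submit_ms + views * shade_ms_per_view

print(frame_cost_ms(3.0, 4.0, 2, single_pass=False))  # 14.0 (two full passes)
print(frame_cost_ms(3.0, 4.0, 2, single_pass=True))   # 11.0 (one submission)
```

On older cards without this optimization, every eye pays the full submission cost, which is why an alternating scheme like Brainwarp could matter more there.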