The HMD should already be showing black most of the time (with resulting strobing), if Pimax have done things correctly.
The strobing, or rather the mitigation of it, was a significant factor in why Valve and Oculus settled on 90fps as a lower refresh and frame rate threshold.
The HTC Vive and Oculus Rift both do this, under the moniker "low persistence", in order to avoid the judder you mention, which one gets from sample-and-hold displays, and which is greatly exacerbated when the picture moves along with the turning of one's head.
You still get persistence-of-vision ghosts: discrete, unblurred afterimages of previous frames (wave your laser pointers around quickly in the SteamVR dashboard environment, for an easy example, and observe the trail of beams), but at least there is no information that directly conflicts with the optical flow your brain would expect from your movement (e.g. the image staying put, when it should pan, as you turn).
Brainwarp should just interleave itself into this already existing strobing. Whilst the update rate for both eyes added together is doubled, to 180 (...or 160, rather), the refresh rate is still only 90 (80) per eye.
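To make the distinction concrete, here is a small illustrative sketch (my own assumption of how such alternation would be timed, not Pimax's actual implementation): each eye's low-persistence flashes stay at 90Hz, but offsetting the two eyes by half a period means *some* eye updates every 1/180th of a second.

```python
# Hypothetical "Brainwarp"-style alternation: per-eye rate stays at 90 Hz,
# the right eye's flashes are merely offset by half a refresh period.

EYE_HZ = 90.0              # assumed per-eye refresh rate
PERIOD = 1.0 / EYE_HZ      # ~11.1 ms between flashes for one eye

def flash_times(eye, n):
    """Timestamps (seconds) of the first n low-persistence flashes for an eye."""
    offset = 0.0 if eye == "left" else PERIOD / 2
    return [offset + i * PERIOD for i in range(n)]

events = sorted(flash_times("left", 4) + flash_times("right", 4))
gaps = [b - a for a, b in zip(events, events[1:])]

# Combined, an update arrives every ~5.56 ms (180 Hz), yet each eye
# individually still only refreshes every ~11.1 ms (90 Hz).
print([round(g * 1000, 2) for g in gaps])  # → [5.56, 5.56, 5.56, 5.56, 5.56, 5.56, 5.56]
```

The same arithmetic with an 80Hz panel gives 160 combined updates per second, matching the figures above.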
If you were to interlace the images, you would need to A) make the displays truly refresh at 180Hz, if possible, rather than just offsetting their VBlanks, and B) render 180 frames per second for both eyes - not just one. Nothing would truly have been gained, other than that your transport traffic would have been halved, thanks to throwing away half the rendered lines (one could conceivably render them at half height, of course, with some offset trickery, or at least rasterise them at full height and only shade half the lines).
All this is of course best if you can provide 180 real frames every second. If one has to synthesise frames, the positive effect is reduced, as usual.
(As a silly little side note: there is this little thing that's been rattling about inside my cranium... So: many LEDs can function as a photodiode as well. Since we already have this long blank period between the flashes of globally refreshed imagery, I wonder if some display manufacturer could make one that reverses function during these periods: flash an IR LED, and capture a couple of frames of the user's eye, collimated by the very same lens that helps you see the image - the most aligned eye tracking one could hope for, skipping some difficult steps. :7 )