The Pimax 8K already uses two 4K screens, but because of the bandwidth and performance limits of current GPUs, they aren’t running at native resolution. Instead, a chip in the headset upscales 1440p to 4K. This is okay, but it results in some softening of the image, meaning that things like small text won’t look as readable as they would if the screens were running at native resolution. We already know that Pimax is working with other companies to bring eye tracking to the 8K. Wouldn’t foveated rendering solve both the bandwidth and performance issues, making the Pimax 8K X unneeded?
No. Foveated Rendering is not a panacea.
The 8K will never be able to render at a resolution higher than 2560x1440, due to cable bandwidth and the scaler chip. The 8KX could theoretically support Foveated Rendering at the panels’ full 4K res. Foveated Rendering would speed up the time it takes to generate the image on the graphics card (by reducing the quality of the outer area), but it would still need to send two full left/right images down the cable(s) to the headset. That means that even if FR were enabled for both the 8K and the 8KX, the 8KX would look a lot better (due to the higher native res).
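To put rough numbers on the difference, here’s a back-of-the-envelope sketch in Python. The 90 Hz refresh, 24 bits per pixel, and the assumption of uncompressed video with no blanking overhead are my own illustrative assumptions, not Pimax specs:

```python
# Back-of-the-envelope link bandwidth for the 8K's upscaled input
# vs. the 8KX's native input. Assumptions (illustrative only):
# 90 Hz refresh, 24 bits per pixel, uncompressed, no blanking overhead.

def gbps(width, height, hz=90, bpp=24):
    """Raw video bandwidth in gigabits per second for one panel."""
    return width * height * hz * bpp / 1e9

per_eye_8k = gbps(2560, 1440)    # upscaled input per eye on the 8K
per_eye_8kx = gbps(3840, 2160)   # native input per eye on the 8KX

print(f"8K  input, both eyes: {2 * per_eye_8k:.1f} Gbps")
print(f"8KX input, both eyes: {2 * per_eye_8kx:.1f} Gbps")
```

Under these assumptions the native 4K signal needs more than twice the raw bandwidth of the 1440p signal, which lines up with the 8KX needing two cables.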
Theoretically, the 8K could be upgraded to act like an 8KX, but you’d have to replace the circuit board in the headset.
Foveated rendering reduces cable bandwidth. Also, we don’t really know what the scaler is doing. Perhaps it would be possible to bypass the scaler completely with software.
Not necessarily. Unless the destination device has dedicated hardware and/or firmware to decode a non-standard image frame, the full data must be sent. Currently any available headset will need to transfer complete frames. The only savings is graphics compute time, not bandwidth.
As new hardware becomes available, this will likely change, but for now, there will be no bandwidth savings.
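To illustrate the “compute savings, not bandwidth savings” point: the frame scanned out to the headset is a fixed size, even if the GPU shaded most of it at reduced quality. The 10% fovea and quarter-density periphery below are assumed numbers for illustration, not from any real headset:

```python
# The headset expects a complete, fixed-size frame, so the bytes on the
# cable don't change with foveated rendering today. Only the GPU-side
# shading work shrinks. All ratios below are illustrative assumptions.

W, H, BPP = 3840, 2160, 3          # one 4K eye, 24-bit color

full_frame_bytes = W * H * BPP     # what actually goes over the wire

# GPU-side work (pixels shaded) can still drop a lot with foveation:
fovea = int(0.1 * W * H)           # assume ~10% of pixels at full quality
periphery = int(0.9 * W * H) // 4  # rest shaded at quarter density
shaded = fovea + periphery

print(f"bytes on the cable : {full_frame_bytes:,} (unchanged)")
print(f"pixels shaded      : {shaded:,} of {W * H:,} "
      f"({100 * shaded / (W * H):.0f}% of full)")
```

So the GPU might shade only about a third of the pixels, while the cable still carries every byte of a full frame.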
We know that the scaler chip is very limited in its capabilities, compared to the chip in a regular monitor. It will only support scaling from 2560x1440 to 3840x2160, as far as I know. There was some discussion about being able to scale a 1920x1080 input image, but I haven’t seen any confirmation from Pimax that 1080p will be added. Their main focus has been trying to get it to run at 90 FPS, not adding enhancements.
In addition, there’s still the need to get the signal to the headset. The 8KX has 2 video cables for a reason. Foveated Rendering won’t change that.
If the scaler were bypassed, you would be limited by the bridge chip’s output.
The 8K X will use 2 bridge chips to be able to supply native panel res.
From my understanding, without the scaler, a lower res input may result in the image being centered with black borders.
Monitors & TVs support non-native resolutions by scaling the input to the native panel res.
Indeed, we haven’t heard official news on the scaler supporting 1080p/eye. The Pimax 4K supports 1080p & 1440p input & looks quite good.
I wonder if people will just mod their 8k into an 8kx somehow xD
From your own words I see how it would make the 8K obsolete but not the 8K X…
Quite some time ago, I asked Pimax to consider offering an official DIY upgrade package w/ circuit board(s) and a second video cable. I got no reply, which isn’t all that surprising.
Maybe they’ll offer a way to upgrade. Like you pay them a fee, ship the headset to them, and they upgrade it and ship it back.
Ah I just found this on the forums from PimaxVr “one bridge chip can support one 4K panel, two 4K panel need two bridge chip, that’s why 8K X need 2*DP1.4 (two chips). we do such comparative test base on our internal 4K plus headset prototype.”
Well, Pimax stated earlier that the 8K can be upgraded to an 8KX.
@pegon, can you please post a link to the source of that info? I’ve been watching for something like that and somehow missed it.
According to Google, one of the goals they are seeking with foveated rendering is to reduce bandwidth requirements, which would be astronomical for retina-resolution VR. However, I think rendering the same frame at different resolutions in different regions would indeed require some kind of hardware support that is not available now.
What about Brainwarp? From what I’ve understood, it only uses one screen at a time and alternates between the two. Wouldn’t foveated rendering work in that case? Pimax says that the 8K X will have 2 bridge chips because one bridge chip couldn’t handle both screens at native resolution. But if only one screen is running at any given time (like how Brainwarp does it), shouldn’t one bridge chip be able to handle that one screen?
I’ve always imagined bandwidth optimization through foveated rendering as a dual stream: one stream a low-res blurred version of the whole view, plus a high-res circle, which a chip would combine into one view or something. So basically a combination of the 8K and 8KX technology: an upscaler chip for the low-quality full-screen image, and a one-to-one pixel, super-high-res small focus area rendered at the desired spot, composited together on the high-res-capable screen.
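A rough estimate of what that dual-stream idea could save, sketched in Python. The 1280x720 low-res stream and 960x960 foveal inset are made-up numbers for illustration; no shipping headset works this way today:

```python
# Rough estimate of a "dual stream" foveated transmission scheme:
# send a low-res whole-view stream plus a small full-res inset, and
# let a chip in the headset composite them. Sizes are assumptions.

def pixels(w, h):
    return w * h

native = pixels(3840, 2160)     # full 4K eye frame
low_res = pixels(1280, 720)     # blurred whole-view stream (assumed size)
inset = pixels(960, 960)        # bounding box of the full-res foveal circle

foveated_stream = low_res + inset
print(f"foveated transmission: {foveated_stream / native:.0%} "
      f"of native bandwidth")
```

Under these assumptions the two streams together carry only about a fifth of the pixels of a native frame, which is why this kind of scheme needs decode/composite hardware on the headset side.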
It would be counterproductive to take all that FR-reduced pixel data on the GPU and then resample it to fill the native panel while still on the GPU. If you move that process to a chip in the HMD, the data bandwidth is vastly reduced too. This is why FR is touted for mobile/portable devices: the reduced bandwidth (power) demands improve battery life.
That is what I read anyway.
If you search for “Foveated rendering bandwidth” there are many articles making this claim.
Look at this article:
Find the heading “Foveated Transmission” for a visual representation of how that works.
As for the 8K-X, let’s hope that the driver for the HMD can recognize a foveated frame vs. a full frame, and that a software/firmware patch will be built to process it as the technology matures.
Sorry Neal, I can’t remember exactly. But it was stated around the time when they were showing the prototype last year. The 8KX upgrade is just removing the scaling chip and installing different cables.
If the 8K were to be upgraded into the 8K X, it would require removing or disabling the scaling chip, as well as adding another bridge chip to the headset.