Wouldn't foveated rendering leave the Pimax 8K-X obsolete?


#21

The 1440p input resolution will be just fine.
The major advantage of the final 4K-per-eye screens is that they completely eliminate the screen-door effect.


#22

It’ll be okay, but it won’t be optimal for things like using the desktop, unless the hardware scaler works really well.


#23

As I understood it at the time, there was no need for a new chip. The 8K and 8K-X have the same screens; since the 8K-X runs them at native resolution, a new scaler chip simply isn’t needed.


#24

That’s… fine for you, but a major advantage of 4K over 1440p is, wait for it, the 4K.


#25

It’s hard to say without understanding more about how the bridge chip is set up and how it works.

While a single bridge chip driving one screen can do 4K, connecting two displays to the bridge chip likely uses different soldered connections and puts the chip into dual-display mode.

@crony and @Cr4zyJ are likely to have a better understanding of bridge-chip configurations, having worked on HMD building.


#26

I do remember it as well. The proposed process was said to be sending the headset back to be upgraded.


#27

Even if they were in dual-display mode it should still work, because with Brainwarp only one screen is ever running at a time. Brainwarp basically confirms that the panels can be driven one at a time.
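
To illustrate the idea, here’s a toy sketch in Python; the alternation matches Brainwarp as described above, but the function names are hypothetical stand-ins, not Pimax’s actual pipeline:

```python
# Toy model of Brainwarp-style alternation: each cycle renders and
# presents a single eye, so only one panel is being driven at any moment.
# render_eye and present are hypothetical stand-ins, not a real API.
def brainwarp_loop(render_eye, present, n_cycles=4):
    for cycle in range(n_cycles):
        eye = "left" if cycle % 2 == 0 else "right"
        frame = render_eye(eye)   # GPU renders only this eye's view
        present(eye, frame)       # only this eye's panel gets a new frame

brainwarp_loop(lambda eye: f"<{eye} frame>",
               lambda eye, f: print(f"driving {eye} panel with {f}"))
```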


#28

The 8K-X uses dual bridge chips & no scaler.


#29

I don’t see anything holding it back from running natively as long as Brainwarp is enabled.


#30

2.5K (legit) QHD = 2560 × 1440
4K (false) UHD = 3840 × 2160
Difference: 1280 × 720

Yes, 4K is bigger, but as you can see there is not that much to upscale.

So yes, native 4K is better, but not as big a jump as one might think.
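
A quick check of those numbers (plain Python, just the arithmetic):

```python
# QHD -> UHD is 1.5x per axis, which works out to 2.25x in pixel count.
qhd_w, qhd_h = 2560, 1440
uhd_w, uhd_h = 3840, 2160

print(uhd_w / qhd_w, uhd_h / qhd_h)           # 1.5 1.5 (per-axis factor)
print((uhd_w * uhd_h) / (qhd_w * qhd_h))      # 2.25 (total pixel factor)
```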


#31

For things like making out small details such as text, it could be a noticeable difference.


#32

Yes, but it is likely only to work with the 1440p input and not the 4K input, as the chip is set up in dual-screen mode. That’s why I said @crony and @Cr4zyJ are likely to have better info on your question.


#33

Precisely. It’s not like it’s 1080p upscaled to 4K.

It’s only a 1.5× factor we’re talking about (3840 / 2560 = 1.5 per axis), not 2×.


#34

Well, that’s subjective. I still see it as a great improvement.


#35

I sense the need for yet another blind test. :smirk::joy:


#36

I ended up not having the time or resources over the past year to get into development like I planned. Sorry, but this question is beyond my skill set :frowning:


#37

Well, I did say they may know. :beers::sunglasses::+1::sparkles:


#38

[quote=“D3Pixel, post:18, topic:6297”]It would be counterproductive to take all that FR-reduced pixel data on the GPU and then resample it to fill the native panel while still on the GPU. You move that process to the HMD chip, and thus data bandwidth is vastly reduced too. This is why FR is touted for mobile/portable devices: it improves battery life due to reduced bandwidth (power) demands.

That is what I read anyway.

If you search for “Foveated rendering bandwidth” there are many articles making this claim.[/quote]
Yes, but I don’t think that’s applicable to the 8K. The way Google is doing it (based on an article I read previously) is to take the lower-res “background” image, increase the height of the bitmap, and pack the data for the fovea image into the new area.

To do this, you need something on the receiving end that can unpack that composite image into something displayable, and a simple scaler chip cannot do that. The primary intent of foveated rendering is to reduce the load on the GPU; any bandwidth savings are just “icing on the cake”.
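
As a rough illustration of that packing idea (my reading of the article; the exact resolutions and layout here are made-up assumptions):

```python
# A low-res "background" covers the full FOV, and a full-res "fovea"
# patch is appended below it in the same bitmap; HMD-side logic would
# then have to composite the two. Sizes are illustrative only.
bg_w, bg_h = 1920, 1080      # background at reduced resolution
fov_w, fov_h = 1280, 1280    # full-res patch around the gaze point
full_w, full_h = 3840, 2160  # a naive full-res frame, for comparison

packed_px = bg_w * bg_h + fov_w * fov_h
full_px = full_w * full_h
print(f"packed: {packed_px / 1e6:.2f} Mpx vs full: {full_px / 1e6:.2f} Mpx "
      f"-> {100 * packed_px / full_px:.0f}% of the link bandwidth")
```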


#39

Me neither. This is all in relation to the 8K-X, which I certainly hope they consider, because nothing (demoed) can run it at 90 Hz; it is waiting for exactly this kind of advancement. The 2× DP solution they have now is a brute-force solution which will hopefully work, but a full FR implementation plus eye tracking is the breadwinner.
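
For context on why brute force needs two cables, some back-of-envelope numbers (assuming 24 bpp, 90 Hz, and ignoring blanking overhead; one DisplayPort 1.4 link carries roughly 25.9 Gbit/s of payload):

```python
# Raw pixel bandwidth for native 4K per eye at 90 Hz. Both eyes together
# exceed what a single DP 1.4 link (~25.9 Gbit/s effective) can carry
# uncompressed, hence the dual-cable approach.
def gbit_per_s(width, height, hz, bpp=24):
    return width * height * hz * bpp / 1e9

per_eye = gbit_per_s(3840, 2160, 90)
print(f"per eye:   {per_eye:.1f} Gbit/s")      # ~17.9
print(f"both eyes: {2 * per_eye:.1f} Gbit/s")  # ~35.8
```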


#40

True, but I don’t think Foveated Rendering is quite ready for “prime time”.

Unless the video drivers have automatic support for it (like antialiasing), games and apps will have to add support for it themselves. Even if NVIDIA or AMD supported it in their drivers, it’s unlikely both would, at least in the near term. That means some games or video cards would NOT be able to use FR on the 8K-X, ergo the solution needs to be brute force.