I think streaming would be the only way to go for this, at least currently. We don't even have consumer-available 8K-rated cables, and 4K hardware is still maturing, so how does anyone expect to drive these 16K displays? If, however, you ran the entire game in the cloud, then processing power stops being a concern, since you can keep throwing scalable resources at it, and advanced techniques like real-time global illumination and ray tracing become easily achievable. Then all you have to do is compress the output and feed it back to the user over the network. The only issue then is the latency the network introduces. This also assumes users have the bandwidth to receive such content. As of right now, even where I am in the Bay Area, access to gigabit fiber is limited, and I can only imagine that that's the kind of connection that would be the minimum to support such a hefty video feed. But who knows, maybe even 10 Gbit will be needed, and good luck finding that from a public utility around here. Not to mention the cost of the equipment that would let your home network run at those speeds.
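A rough back-of-envelope supports the gigabit-plus guess. The numbers below (16K as 15360×8640, 24 bits per pixel, 90 Hz, and a ~100:1 codec ratio) are illustrative assumptions, not measured figures:

```python
# Back-of-envelope bandwidth estimate for a streamed 16K@90Hz feed.
# All numbers are assumptions: 16K taken as 15360x8640, 24 bpp,
# and ~100:1 for a modern video codec (HEVC/AV1-class, optimistic).
width, height = 15360, 8640
fps = 90
bits_per_pixel = 24

raw_gbps = width * height * bits_per_pixel * fps / 1e9
compressed_gbps = raw_gbps / 100  # assumed 100:1 compression

print(f"raw:        {raw_gbps:.1f} Gbit/s")
print(f"compressed: {compressed_gbps:.1f} Gbit/s")
```

Even with aggressive compression the feed lands in the multi-gigabit range, which is why 10 Gbit doesn't sound crazy.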
I think it’s more about the reduction of SDE (screen-door effect) than about rendering 16K per eye …
I think they would render 2K×2K per eye, which would be doable and give a pretty good image … but that’s of course just speculation …
These panels might make it onto something like the Rift CV3 or something, who knows. Exciting, but it’s going to take a while.
Also, the only way things like this will be usable on any near-term timetable is with in-HMD upsampling.
Which, given enough resolution to start with, can come very close to being as good as the real deal.
Just as an example: most movies released in 4K are just upsampled 2K (practically 1080p), and a TV with good upscaling makes 1080p Blu-ray look very good.
And for instance, the 8K’s base input resolution of 1440p is an even better starting point.
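The idea above can be sketched in a few lines. This is a nearest-neighbour 2× upscale, the crudest form of what an in-HMD scaler would do (real scaler chips use much better filters than this):

```python
# Minimal nearest-neighbour 2x upscale: duplicate every column, then
# every row. A toy stand-in for the filtering an in-HMD scaler chip does.
def upscale_2x(image):
    """image: list of rows of pixel values; returns a 2x-larger image."""
    out = []
    for row in image:
        stretched = [p for p in row for _ in (0, 1)]  # duplicate columns
        out.append(stretched)
        out.append(list(stretched))                   # duplicate rows
    return out

small = [[1, 2],
         [3, 4]]
print(upscale_2x(small))
# → [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```

The point is that each output pixel depends only on a tiny neighbourhood of the input, which is why dedicated scaler hardware can run this kind of operation with very little added latency.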
Yeah, that makes sense, but wouldn’t upsampling introduce additional latency?
Yeah, of course, that’s the main problem as far as I can see it, but by the sounds of it, that is becoming so negligible as to be meaningless, even for VR.
We would most likely get scaling chips fast enough not to matter long before we have GPUs capable of natively rendering 20K@100Hz.
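To put a number on that gap, here's a quick pixel-throughput comparison. The 20K resolution is assumed as 19200×10800, and 4K as a plausible rendered resolution for a near-term GPU:

```python
# Pixel throughput for native 20K@100Hz vs rendering at 4K@100Hz and
# letting a scaler fill the panel. 20K assumed as 19200x10800.
native = 19200 * 10800 * 100   # pixels/second the GPU would have to shade
rendered = 3840 * 2160 * 100   # pixels/second when rendering at 4K instead

print(f"native 20K: {native / 1e9:.1f} Gpix/s")
print(f"render 4K:  {rendered / 1e9:.2f} Gpix/s ({native // rendered}x less)")
```

A 25× reduction in shaded pixels is the kind of headroom that makes upscaling the practical path long before native rendering catches up.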
I don’t suppose Pimax would give us numbers for motion-to-photon latency on the 8K and 5K?
I suspect that, going forward, we really shouldn’t bother with hard resolution targets at all: rendered resolution will scale as the game allows and then be upscaled to the panel’s native resolution, and it’s the pixels per inch that get interesting.
For regular flat screens it could be amusing to have a myriad of panels, as suggested by the Samsung LED wall screen.
If you want a bigger screen, plug in some more panels and you get that much more screen area without sacrificing pixel density.
Or they could be using Texas Instruments DMDs and then using XPR for pixel “upscaling”, in essence a screenless display not unlike the Avegant Glyph.