Best GPU for Pimax 8K VR?


#385

This is the world we are living in:
http://www.usdebtclock.org/index.html


#386

I actually still have my good ol’ 980Ti with an i7-6700K. I am planning to upgrade to a 2080Ti, if the increase in performance over the 1080Ti is substantial enough to justify dishing out 500,- more for it. If not, if it turns out to be a meager 15% for our use cases, I will consider buying a 1080Ti and waiting for the next generation, which some hope will arrive sooner than the usual 1.5-2 year gap we are used to.

It all really depends on the combination of a) the demands of the 8K or 5K+ and b) the benchmarks of the 2080Ti compared to the 1080Ti in VR and in 4K scenarios (both of them are relevant, because each of them misses a part of the equation we will face with the 8K). As Sweviver has already ordered a 2080Ti, I am particularly interested in his findings, because he will be able to bring both of them together. But then again, we still may not see the true picture by the end of this month due to the missing Windows drivers, right? In that sense I am glad that my Pimax headset will probably only arrive around the end of November, so I have a bit of time to wait for well-founded conclusions on the question of which GPU to choose.

BTW: is there a reason I would really need to consider moving to an i9 CPU? I believe the real strain will be solely on the GPU, so my CPU should not become a bottleneck any time soon, right?


#387

No. Not really. I just want my new rig to be as modern as possible, leapfrogging many generations. But I understand your question. An i9 might well not be required to have an awesome VR experience, and skipping it does save some money needed elsewhere :slight_smile:


#388

There is always the question of why the new “multi-view rendering” announced with the RTX line cannot work on the old 1080. Is it just Nvidia trying to force customers to upgrade, or is there really a hardware requirement met only by the new RTX cards? Besides, from what I read, it seems that even the “stereo-view rendering” announced with the GTX 10xx line never really took off with the devs. I tried to find anything about this feature in the Unreal Engine docs, and it seems UE supports something like that, but only on mobile platforms, not on Windows. If anyone knows more, feel free to correct me.
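For what it’s worth, the “stereo-view rendering” of that era surfaces to developers as the GL_OVR_multiview extension, and as far as I can tell this is also the path UE’s mobile multi-view option uses, which would explain why it only shows up in the mobile docs. Here is a minimal sketch of what the developer-side opt-in looks like, assuming an OpenGL ES 3.x context with the extension present; the FBO and texture handles are placeholders:

```c
#include <GLES3/gl3.h>
#include <GLES2/gl2ext.h>  /* declares glFramebufferTextureMultiviewOVR
                              (or load it through your GL loader) */

/* Both eye views live in one 2D array texture; a single draw call is
 * broadcast by the driver to every attached layer instead of the app
 * issuing one draw per eye. */
static void setup_multiview_fbo(GLuint fbo, GLuint colorArrayTex)
{
    glBindFramebuffer(GL_DRAW_FRAMEBUFFER, fbo);
    glFramebufferTextureMultiviewOVR(GL_DRAW_FRAMEBUFFER,
                                     GL_COLOR_ATTACHMENT0,
                                     colorArrayTex,
                                     0,   /* mip level       */
                                     0,   /* base view index */
                                     2);  /* number of views */
}

/* Vertex shader: gl_ViewID_OVR selects the per-eye matrix, so geometry is
 * submitted once. In the original OVR_multiview only gl_Position may
 * depend on the view ID; OVR_multiview2 lifts that restriction for the
 * other vertex outputs. */
static const char *kMultiviewVS =
    "#version 300 es\n"
    "#extension GL_OVR_multiview : require\n"
    "layout(num_views = 2) in;\n"
    "uniform mat4 u_viewProj[2];\n"
    "in vec4 a_position;\n"
    "void main() {\n"
    "    gl_Position = u_viewProj[gl_ViewID_OVR] * a_position;\n"
    "}\n";
```

Nothing in this opt-in is RTX-specific, which is exactly why the question stands: the differences between generations appear to be in how many views are supported and which attributes may vary per view, not in whether the extension exists at all.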

Concerning the CPU, I would say it looks like VR still depends on IPC and therefore on execution latency. Traditionally this field was dominated by Intel, and the i7-7700(K)/8700(K) seemed to be the best bet. Only now things are getting complicated. First, the recent security patches for side-channel attacks on Intel seem to be taking their toll on performance; second, AMD is getting close on the IPC level with Ryzen (though it is not there yet).

A year ago I would have bought Intel; this year I am not so sure. The Ryzen 2700X seems like a good contender. It is not as fast as Intel on a single thread, but it offers more cores (at least until the i7-9700(K) is released) and also a better upgrade path, as all recent Intel chips and chipsets look like end-of-line products. Even if the i7-9700 turns out to be a beast (with a probably huge TDP), I still consider getting a Ryzen 2700X now, with the upgrade path to Zen 2 next year (if it turns out the 2700X cannot handle it), the more future-proof choice.

From that perspective, extensive benchmarks of the new graphics cards combined with the top-of-the-line CPUs from both AMD and Intel will be an important factor in what to choose. Will the CPU limit the cards, or vice versa? Will the single-thread performance of Intel i7 chips still dominate the rendering latency? Some of the important info is still missing.


#389

The problem is that the magic doesn’t work until developers decide to program for it, and devs want to make money, so they often target the most affordable products. Based on the Steam hardware survey, you’re talking 1060-class GPUs for most people.

That’s not to say these rendering technologies won’t help, but they’re going to take a while to reach saturation in the market. A lot of those rendering tricks can already be done on existing video cards, giving maybe a 40% gain, and again only if the software is programmed to take advantage of them.


#390

@xunshu I do think it would be a very good idea to look into the render-optimizing technologies from vendors like AMD and Nvidia that didn’t see wide adoption on the 10-series or Vega cards, such as VRWorks and LiquidVR, as well as the optimizations for Unreal, Unity, and other engines that are currently exclusive to mobile VR devices like the Oculus Go, and see whether Pimax can integrate those mobile-specific rendering features into their SDK on the PC side, for ease of access for developers.

I think it’s incredibly foolish that so many developers have spent so much time optimizing VR for mobile devices, but have not taken the time to make those same features and optimizations available on the PC.

In effect, it’s people wasting resources they don’t have to, resources that could otherwise be used to much better effect.

With a headset like the 8K, you want every resource at your disposal to make the device perform better across hardware configurations.


#391

@Sweviver: do you have an ETA for the delivery of your RTX 2080Ti? As you can guess, we are very interested to see how its performance with the 8K/5K+ compares to the 1080Ti, and in addition whether that makes a difference for the decision to go for the 8K or the 5K+.
I would hope that the additional power of the 2080Ti may push the 8K harder and make it deliver a better picture than the 5K+.


#392

Another VRWorks feature nobody will implement.


#393

How many features already exist that have not yet been implemented :confused:


#394

I preordered from the first batch here in Sweden.
They say all pre-orders arrive Sep 20th, but let’s hope it comes earlier :slight_smile:


#395

Multi-view rendering was just as hyped for the 10 series; they’re claiming the 20 series does more views (apparently StarVR requires 4), but the relevance of that is questionable considering the 10 series was marketed with 8 views (two eyes times 4 panes for lens-matched shading). The real question is why this is still a new feature another generation later.


#397

Maybe because, as long as they see people are asleep and just reacting to the hype like Apple fanboys, they can keep selling with the same tricks?.. :smirk:


#398

The “new” feature removes viewport positioning restrictions from the previous version.


#399

Now I’m curious what these restrictions are. They don’t seem evident in OpenGL multiview, where the restriction is that only vertex positions (which by the transform chain include the perspective; the view matrix includes position and direction) are allowed to be view-dependent. AIUI the whole feature was designed to support multiple viewpoints (thus positions) in the first place, though it did allow for limited numbers of views.

Found something in Vulkan: VkPhysicalDeviceMultiviewPerViewAttributesPropertiesNVX adds a flag to indicate hardware that only permits the X component of the position to be view-dependent. So that rules out our friend from Glove and Boots. More to the point, that restriction would seem to imply the render panes are forced to both point forward and be coplanar, unlike the Pimax 8K and InfinitEye families. It’s also no help at all for CAVE systems.
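If anyone wants to check what their own card reports, that flag can be read through the standard properties chain. A minimal sketch, assuming a VkPhysicalDevice is already in hand and the VK_NVX_multiview_per_view_attributes device extension is available:

```c
#include <stdio.h>
#include <vulkan/vulkan.h>

/* Query the NVX per-view-attributes property discussed above. Per the spec,
 * perViewPositionAllComponents is VK_TRUE when all components of the
 * position may differ between views; VK_FALSE means only the X component
 * may vary, i.e. the forward-facing coplanar-panes restriction. */
static void check_per_view_position(VkPhysicalDevice gpu)
{
    VkPhysicalDeviceMultiviewPerViewAttributesPropertiesNVX nvx = {
        .sType = VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_MULTIVIEW_PER_VIEW_ATTRIBUTES_PROPERTIES_NVX,
    };
    VkPhysicalDeviceProperties2 props = {
        .sType = VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_PROPERTIES_2,
        .pNext = &nvx,
    };

    /* Core in Vulkan 1.1; use the KHR variant on 1.0 implementations. */
    vkGetPhysicalDeviceProperties2(gpu, &props);

    printf("per-view position, all components: %s\n",
           nvx.perViewPositionAllComponents
               ? "yes (canted panes possible)"
               : "no (X offset only)");
}
```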

Is this another case of Nvidia just doing things wrong for years, like their broken triple buffering?


#400

Hey guys, what are the differences between the different versions/models of the 2080Ti from Nvidia, Asus, EVGA, Gigabyte, MSI, Pin, and Zotac?


#401

Usually just the cooling solutions on them. Some will come with a better base overclock than others, but that is about it. I’m usually an ASUS guy myself, as I have been buying their parts for 20 years, but I just bought EVGA because I understand their warranty is the best among the group.


#402

All of Nvidia’s proprietary technology requires devs to implement it. That’s why I told @xunshu to see about integrating these rendering optimizations at an SDK level for Pimax.


#403

@MarcoBalletta We won’t know this time until the Nvidia review embargo expires. Traditionally, Nvidia-sold cards were solid but had less aggressive cooling and clock speeds. This generation, however, Nvidia claims that their Founders Edition cards will be clocked higher than the third-party ones. If I had to guess, Asus / EVGA / MSI / etc. will still have cards that outperform the Nvidia Founders Edition (especially the water-cooled one from Inno3D), but I wouldn’t pre-order anything until the reviews are out.


#404

Thank you for the replies!


#405

Here is @sweviver’s current MSI 1080 Ti Gaming X compared to other 1080 Ti cards.