1080ti SLI in VR


Sure. ED is probably the worst-case scenario: lots of skinny repeating polygons, and the rumble of your engines acts as a minor up/down vibration, which causes everything to crawl. One of the things I hope VR hardware will do is allow super-sampling above 2. I know that’s an option; the question is: will my (future) 1180 Ti be able to handle SS of 3+?


As to the topic: is there an easy way (besides GPU usage in Task Manager and the NVIDIA VR tool) to get a hint of my FPS in VR?
Edit: it’s FCAT I meant


I’m still not completely on top of the subject of SS, although many people here have helped me with it. But if you use SS in-game (if supported) plus Steam’s, you should theoretically be able to achieve 3+; whether it will run smoothly is of course a different story.
As a matter of fact, I tried a not-very-demanding scenario and it worked well: I played a 2D movie with 3x internal SS plus 2.5x Steam SS while maintaining 90 fps.
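To get a feel for why only light scenes survive that combination, here is a minimal sketch of how stacked supersampling factors multiply into a render resolution. It assumes both sliders act as linear (per-axis) multipliers; whether a given slider is per-axis or per-area varies by runtime and version, so the numbers are illustrative, and the 1080x1200 per-eye panel size is just the Vive/Rift-era example.

```python
import math

# Sketch: stacked supersampling, assuming both the in-game and the
# Steam/runtime settings are linear (per-axis) multipliers. Treat the
# exact semantics of each slider as an assumption, not documented fact.

def render_resolution(base_w, base_h, ingame_ss, runtime_ss):
    """Return the pixel dimensions actually rendered per eye."""
    scale = ingame_ss * runtime_ss          # combined linear factor
    return round(base_w * scale), round(base_h * scale)

# Hypothetical Vive/Rift-era panel: 1080x1200 per eye; 3x in-game + 2.5x Steam
w, h = render_resolution(1080, 1200, 3.0, 2.5)
print(w, h)  # 8100 9000
```

A 7.5x linear factor means over 56x the pixels per eye, which is why it only holds 90 fps in undemanding content like a flat movie screen.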


I said resolution, but it’s true for other things like shadows, textures, etc. At some point, if it’s too ugly, I prefer to play on my monitor. I also have a 4K monitor and a 1080 Ti. Just because a game runs smoothly in 4K on my monitor doesn’t mean it would be OK in VR.


True; with foveated rendering, typically we enhance where the eyes are looking, or the main binocular overlap.

What might be interesting is to render the binocular overlap at, say, normal resolution, and render the periphery at, say, increased SS.


Agreed, but we shouldn’t really call it SLI or CF, as that was alternate-frame rendering and the new concept is per-eye. Multi-GPU VR really needs a new bling label. Lol


Of course; my point being, if it is possible to gain back HUGE amounts of FPS playing in 4K by toying with those settings, I speculate that even a fraction of these won-back frames could lead to a better VR experience. Sometimes 5 frames decide whether something is playable for me. Before FreeSync there were a lot of games that I could not play without vsync, which is usually very demanding. Lots of my friends have no problem turning vsync off, even in games like Borderlands; in Borderlands with vsync turned off, it took 30 minutes max and I started feeling sick. So I always needed to make up for turning vsync on in certain titles where it mattered, and gain FPS back by turning off less important stuff. Sometimes games are even preconfigured with NVIDIA-oriented settings, and it takes some time to figure these settings out if you’re on AMD; e.g. MHBO on in For Honor took 30+ FPS.


Nvidia also has that free VR Funhouse to showcase Nvidia bling.


Still, does anyone have a reliable source on the current state of microstutters? Yes, I get that it won’t matter to some people, but it will matter to me if this problem still exists when using SLI or CrossFire.


With VR mGPU it’s hard to say (though there are likely articles and user reports).

With, say, the base idea of two GPUs, one per eye, you’re not as likely to get microstuttering, since it’s not rendering alternate frames as with standard SLI/CF. Rendering the same frame, but from each eye’s perspective, should in theory be much more stable, as I understand it.
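The contrast between the two scheduling ideas can be sketched in a few lines. This is purely illustrative pseudologic, not any real driver API: it just shows why alternate-frame rendering exposes per-GPU timing jitter as frame-pacing jitter, while the per-eye scheme ties both GPUs to the same frame.

```python
# Sketch: how frames map to GPUs in classic AFR-SLI vs the per-eye
# multi-GPU idea discussed above. Illustrative only; no real driver
# scheduling API is modeled here.

def afr_schedule(frames, n_gpus=2):
    """Alternate-frame rendering: consecutive frames go to different GPUs,
    so any per-GPU timing variation shows up as frame-to-frame jitter."""
    return [(f, f % n_gpus) for f in frames]

def per_eye_schedule(frames):
    """Per-eye multi-GPU: both GPUs work on the SAME frame, one eye each,
    so pacing is governed by a single synchronized present per frame."""
    return [(f, ("left->GPU0", "right->GPU1")) for f in frames]

print(afr_schedule(range(4)))   # [(0, 0), (1, 1), (2, 0), (3, 1)]
print(per_eye_schedule(range(2)))
```

In the AFR list, frames 0 and 1 come from different GPUs, so a slow GPU delays every second frame (microstutter); in the per-eye list, each frame completes only when both eyes do, which is the stability argument made above.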


Played it - but not much fun


That’s what I would hope - we might need to get doc ok fixed on the subject :+1:t3::stuck_out_tongue_winking_eye:


[quote=“noro, post:64, topic:6085”]
…But if you use SS ingame (if supported) + steam you should theoretically be able to achieve 3+ whether it will be working smoothly is of course a different story.
[/quote]

Yes: Elite has an internal “HMD Quality” setting that stacks on top of one’s SteamVR/Oculus one. SteamVR at “5” (an area multiplier, so about 2.24x linear) and “HMDQ” at “2.0” results in 4.47 times supersampling.

But… at that point, the compositor’s texture filter is not taking anywhere near enough samples from the rendered image for the extra resolution to do any good for antialiasing purposes (it’ll be skipping every second texel, pretty much).

What one can do (and every Elite Dangerous player with a 1080 Ti owes it to themselves to check out just once) is, whilst keeping the product of VR runtime supersampling and “HMD Quality” around 2.0, to use the game’s “Supersampling” setting on top of the lower-level one. This renders a larger image, just like the others, but it is downsampled before being handed over to the VR runtime: you render at twice the size, per “HMD Quality”, then twice again, for “Supersampling”; after rendering, the output is scaled down to the first doubled size, which the VR runtime then uses to the fullest. This way, almost every rendered pixel goes into the final output, and few are simply thrown away. (And if you have SteamVR using its older texture-filtering method — it’s an option — you still benefit from a slight bit of aliasing that makes things look far sharper and more detailed: you see more detail by moving your head, which lets your brain do the same kind of thing some old scanners used to offer as an option, where they’d do a second-pass scan with the sensor array shifted half a pixel to the side.)

x4.0 is impractical for actual play, of course, given the massive workload; and in-game “Supersampling” should always be left at 1.0, with precedence given to VR runtime SS / “HMD Quality”, which makes better use of the extra resolution (seeing as it takes into account the predistortion for the lenses, and does not constitute an extra “generation” of reductions).

(Unfortunately the game also streams down to lower mipmap levels when one supersamples this much, even if one has the VRAM to spare. Nonetheless, it still makes things render more nicely. :7 )
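The factor stacking above can be put into numbers. This sketch assumes, as the post implies, that SteamVR’s slider is an area multiplier (so its linear factor is the square root) while “HMD Quality” and the in-game “Supersampling” are linear multipliers; those semantics are my assumption, matched to the post’s 4.47 figure.

```python
import math

# Sketch of the supersampling factor stacking described above.
# Assumption: SteamVR SS is an AREA multiplier; "HMD Quality" and the
# in-game "Supersampling" setting are linear multipliers.

def linear_ss(steamvr_area, hmd_quality):
    """Combined linear supersampling factor from the two runtime-level knobs."""
    return math.sqrt(steamvr_area) * hmd_quality

print(round(linear_ss(5, 2.0), 2))  # 4.47 -- matches the post's figure

# The one-time experiment: keep runtime SS x "HMD Quality" around 2.0,
# then add in-game SS on top. The game renders at the full product but
# downsamples by the in-game factor before handing off to the runtime.
runtime_linear = 2.0                        # runtime SS x HMD Quality
ingame_ss = 2.0
rendered = runtime_linear * ingame_ss       # 4.0x linear render
handed_to_runtime = rendered / ingame_ss    # 2.0x after the game's downsample
print(rendered, handed_to_runtime)          # 4.0 2.0
```

So the compositor only ever sees a 2.0x image, but that image was built from a 4.0x render, which is why almost every rendered pixel contributes to the final output.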


When playing games on an NVidia card on a monitor, try turning vsync off in-game and set vsync to FAST in the NVidia control panel. You’ll usually get a higher framerate with no tearing (at the expense of more video memory usage for a second back buffer).


So in fact VR already implements an SLI-style technique for rendering games; it’s now just up to the game developers to support this feature or not.
I don’t see any reason why this feature isn’t being supported more. I mean, AAA games like Fallout 4 are held back by performance issues, and improving them would improve sales as well.


Well, let’s face it: good for a short time, and good that it’s free. Wouldn’t be a must-buy for sure. Lol


Multi-GPU setups are still a niche market. With DX12 & Vulkan we may see more adoption, but only time will tell. If Nvidia did like AMD and supported more GPUs in mGPU, we might see more adoption — e.g. 1060 SLI.


So few people buy SLI rigs that it’s not a huge win for the game developers; plus they have to buy 2x cards (and SLI-capable motherboards) for at least some of the programmers and testers, in addition to the extra development time. That’s a lot of extra effort and cash for very little payback.


No one is doing multi-GPU Vulkan or DX12, because it takes more effort and money from the developers than letting Nvidia tweak their drivers with SLI — and Nvidia is killing SLI with DX11 anyway. So don’t get your hopes up for multi-GPU VR; the titles that support it will be few and far between unless something really drastic happens. I (and my dual-GPU setup) hope a miracle happens and proves me wrong.


Wow, thanks, I will try that. I haven’t had performance issues I needed to make up for since the 1080 Ti, though. Sadly, the FreeSync support of my monitor is useless now. I actually liked the option in the Radeon software to set an FPS limit target for each game.
Call me crazy, but I have the feeling that too many frames beyond the monitor’s Hz are not optimal either. OK, clearly it’s better to drop frames than to repeat frames or make some up, but still.
I had always had AMD and Matrox cards, with a GTX 770 briefly in between, and I don’t have the knack for NVIDIA software yet. Yes, there are guides, but I figure it’s different with every game. I hope I’ll catch up, because the amount of improvement achievable through tweaking AMD settings was considerable.