Yeah, it's too bad Nvidia is so greedy and won't support royalty-free tech standards. This is where AMD has been interesting. They created FreeSync and Lightning Bolt (a royalty-free variant of Thunderbolt).
Yeah, I have a FreeSync monitor, which is ignored by Nvidia cards. The vsync FAST option works really well and doesn't drop you from 60 fps to 30 fps. Instead, I usually find I'm running at 100+ fps, with slowdowns in the 40s and 50s.
Apart from that, I'm still very curious about my question, which is actually bullseye on topic for once and hasn't been answered, although it seems some of you have experience with SLI.
And I'm talking old-school SLI, not some probable future technology that makes a lot of sense but may never actually be introduced…
SLI is known (at least from what I've read) for microstutter, so:
- is this still a relevant issue in 2D games?
- for those of you with actual first-hand experience with SLI (as it exists NOW) in VR: do these issues carry over, or even get worse?
If it's no longer a problem, a boost of at least 30% in certain games could be a valid option for some.
So do you have a 144 Hz monitor? What are the extra frames good for if they can't be displayed in time anyway?
I've not managed to get SLI working with Skyrim, although there are people out there who claim they have. I don't think any game I've tried in VR has taken advantage of my dual Titan X setup. Pancake games that support SLI are a bit hit and miss regarding micro-stutter.
Can anyone confirm Fallout 4 VR improving fps and/or latency when run off two cards?
I have a second 1080 Ti in my sim rig (didn't sim-race for long) and would like to improve my fps in Fallout 4 VR, which I play nowadays, by putting it in my SLI-capable MoBo.
I have to confess that I am blown away by FO4VR. The visuals are the best I have ever seen. I feel presence most of my time in-game and do not want to leave the Commonwealth anymore. I play with almost everything at max (except Objects), so I have a little stutter here and there (i7-7700K, 1080 Ti @ 120% power limit, Odyssey WMR). That does not bother me much, but more fps are always wanted…
No, I have a 60 Hz monitor. Yes, I can only see a maximum of 60 frames per second, but excess frames are simply dropped.
The advantage is increased responsiveness for most games. This is particularly true when your system can’t quite keep up with 60 fps. In that case, I’ll be running at ~55 fps instead of 30 fps (which is what happens when vsync is on). That’s a huge improvement!
Essentially, you get framerates similar to vsync OFF, but with NO tearing artifacts.
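To make the "excess frames are simply dropped" point concrete, here's a toy model I threw together (my own numbers and simplification, not Nvidia's actual driver logic): the GPU renders at an assumed 100 fps, and at each 60 Hz scanout the panel shows whichever frame finished most recently, so the rest never reach the screen.

```python
RENDER_FPS = 100   # assumed GPU render rate with vsync FAST (frames/s)
DISPLAY_HZ = 60    # panel refresh rate

def fast_sync_shown(seconds=1):
    """Count distinct frames that actually reach a 60 Hz screen."""
    shown = set()
    for k in range(1, DISPLAY_HZ * seconds + 1):
        # newest frame completed by the k-th scanout (integer math, no float drift)
        newest = (k * RENDER_FPS) // DISPLAY_HZ
        shown.add(newest)
    return len(shown)

rendered = RENDER_FPS                     # frames produced in one second
displayed = fast_sync_shown()             # frames the panel can actually show
print(rendered, displayed)                # 100 rendered, 60 displayed
print(rendered - displayed, "dropped")    # the surplus is silently discarded
```

The point of the toy model: the renderer never waits on the display (so input stays fresh, unlike classic vsync, which halves to 30 fps when a frame misses the 16.7 ms budget), while the panel always gets one complete frame per refresh, hence no tearing.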
Responsiveness seems reasonable. I usually try to aim for the max refresh rate to avoid drops, if the max is higher. Hope you get what I mean. So Fast Sync is not messing with latency? BTW, sometimes I have the feeling that I get h-sync issues. Need to sort out why, but your suggestion may help there as well.
Yes, even when vsync ON would not throttle me to 30 fps, games feel more responsive (than running a constant 60 Hz). That makes sense, since according to Fraps (or in-game fps display), I’m getting 100+ fps (in some games, it’s over 200 fps).
Since you have an Nvidia SLI setup, maybe check out Nvidia VR Funhouse; it's free on Steam.
OK, I'll check that out. It may be different with Nvidia. I had better results limiting the frames on AMD cards in many games. Haven't checked responsiveness though, just motion-flow-wise. Had the feeling that sometimes odd frames were dropped.
Have you tried foveated rendering before? If not, how do you know that low res in the periphery is a no-no? I thought foveated rendering would give you wide FOV as well… why not? It takes far fewer resources because it doesn't render the entire field at full quality. It only renders at full resolution where you look and focus, and you're tricked into believing that everything else is clear, because wherever you focus is clear.
Nope, haven't tried foveated rendering, only Tobii eye tracking, which was cool but had noticeable delay. I'm sure foveated rendering will be efficient and part of the future, but for now, while the technology is so young, it would have been much easier to just add some kind of SLI support to VR game engines to get the framerates we need. After all, foveated rendering is all about getting better performance… If we can keep it simple with raw SLI power, without sacrificing FOV and adding input delay with eye tracking, then I prefer SLI all day long… at least for now.
Adhawk seemed better: much faster and cheaper. Pimax said they were talking to Adhawk's people, but nothing more has been heard about it.
Yeah, but I wonder how this could affect the wide 8K FOV. In my opinion foveated rendering is a good idea for wide-FOV headsets, as a lot of the non-focused area is rendered in vain. But I would never sacrifice the wide FOV for a 120–130 degree limitation because of added eye-tracking cameras. Decreasing FOV also defeats the whole idea of foveated rendering…
It's not that it reduces the FOV at all; it just improves the definition where you're looking and/or reduces it where you're not, where the eye already sees much less clearly anyway. The gain in quality and performance is enormous.
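Back-of-the-envelope arithmetic shows why the performance gain can be so large. All the numbers below are my own illustrative assumptions (panel resolution, fovea size, periphery scale), not anything Pimax or a GPU vendor has published:

```python
import math

W, H = 3840, 2160          # assumed per-eye panel resolution
FOVEA_FRAC = 0.2           # foveal circle diameter as a fraction of screen height (assumed)
PERIPH_SCALE = 0.25        # linear resolution scale outside the fovea (assumed)

full = W * H                                            # pixels shaded without foveation
fovea_px = math.pi * (FOVEA_FRAC * H / 2) ** 2          # full-res foveal disc
periph_px = (full - fovea_px) * PERIPH_SCALE ** 2       # down-scaled periphery
shaded = fovea_px + periph_px

print(f"shaded fraction: {shaded / full:.1%}")          # under 10% with these numbers
```

With these (generous) assumptions you shade well under a tenth of the pixels, which is why the rendered FOV can stay as wide as ever: the periphery is still drawn, just at lower density.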
I think it should come standard on the Pimax 8K, not as a module.
I get that the logical chain of thought would prefer a stronger engine if the car is too slow, but if that's not possible, making the car lighter is the next obvious choice to get faster, and it may still prove handy once stronger engines are available.
Anyway, added latency is indeed a bad trade-off (though it may be software/driver related), and so is reducing FOV, if it's significant (probably due to mechanical issues, the cameras covering the lenses?).
Curious how long it's going to take until this tech proves useful.
I'm not saying foveated rendering itself reduces FOV. But to use foveated rendering you need eye tracking. Unless you can track the eyes from inside the lenses (which I haven't seen anyone do yet), you will need a ring of sensors/cameras at the edges of the lenses, next to your eyes. That's what makes me think it could reduce FOV. I have a hard time seeing the 8K lenses not losing FOV with those rings on top.
That said, I hope they can prove me wrong.
Just a tiny chip at the end of each lens.
The interesting idea for a regular multi-GPU setup would be render priorities within a single frame, i.e. background on GPU 0, foreground on GPU 1.