Good explanation; what you say could be the case. But when I tried the V2 at the time, I noticed exactly that kind of sudden blurriness on the outside rim of the (then different) lenses. So I would not be so sure it is a non-issue (though it is not a huge issue if a workaround is simple).
People already saw the distortion at CES, without recording through any camera. It’s not clear to me whether it’s a reflection, a distortion, or a rendering failure.
I agree Pimax needs to fix the edge distortion.
However, in person, it might not be as bad an issue as first appears. With continued use, distortions may indeed be “ignorable”. I wear “progressive multifocal” glasses. At first the distortion is horrible, but after a few days, my brain adjusts and I can switch between pairs of glasses (with differing multifocal prescriptions) without issue. (My brain near-instantly adjusts to the very different visual input). Actually, the brain can even adjust to upside down images:
[quote]"In the 1890s, psychologist George M. Stratton conducted experiments in which he tested the theory of perceptual adaptation. In one experiment, he wore reversing glasses for 21½ hours over three days. After removing the glasses, “normal vision was restored instantaneously and without any disturbance in the natural appearance or position of objects.”
In a later experiment, Stratton wore the glasses for eight whole days. By day four, the images seen through the instrument were still upside down. However, on day five, images appeared upright until he concentrated on them; then they became inverted again. By having to concentrate on his vision to turn it upside down again, especially when he knew images were hitting his retinas in the opposite orientation as normal, Stratton deduced his brain had adapted to the changes in vision.
Stratton also conducted experiments where he wore glasses that altered his visual field by 45°. His brain was able to adapt to the change and perceive the world as normal. Also, the field can be altered making the subject see the world upside down. But, as the brain adjusts to the change, the world appears “normal.”"[/quote]
Here’s a link: https://en.wikipedia.org/wiki/Neural_adaptation
Fun fact: The image formed on your eye’s retina is actually upside down, due to the eye’s lens. Your brain inverts it, so you see the world normally. That means the upside-down glasses are actually “fixing” the eye’s optics, so that the retinal image is right-side up.
I do have one concern: if there is distortion, it’s possible that Pimax’s team might inadvertently adapt to it, so they might not even be able to see the issue after a while.
I disagree, because with programming and open source, the work is given freely. People can try it and hate it, and all they lose is their time. Beyond that, the lead programmer can be a moron and no one will care so long as his code works. There is no need or care for political correctness, because in that environment the code wins; if people don’t like the way it goes, they modify it and write their own version.
In a Kickstarter environment there is a customer/provider relationship, and a standard degree of business decorum is required. If political correctness stifles creativity, then Pimax should also stop being politically correct and start telling some backers where to shove their crap demands or requests… (because truly some of them are that bad). Political correctness (or the lack thereof) should be a two-way street; it should not work in only one direction.
Reducing FOV is not the answer; innovation in engineering is. If you want a reduced FOV, get a Vive or Rift.
“either you fix it or reduce the fov”
Sorry I didn’t mean to offend you, I agree we can add that as a third option:
@xunshu, if you can’t fix it please recruit aesopfabled.
Agreed. You can see the same thing in the Rift if you move your eyes back a little bit, though it’s not as prominent.
Well, watch the whole video: a university conducted a study on the more or less free-for-all debate over the PC debate.
@xunshu, it has been mentioned that an NVIDIA GTX 1080 was used for the v5 video, but can you share the other specifications of the computer? CPU, memory…
I did watch the entire video. The study you mentioned is based on a particular scenario. I am trying to point out that the survey results came out as they did because the participants were peers and partners, so each viewpoint had equal weight and sway on the outcome.
A founder/backer relationship is resoundingly different. If you were to apply the removal of PC here, then Pimax should act like Linus Torvalds and just say F you to those who don’t agree with them or their philosophy… I don’t see that working. What you are trying to do is apply the PC concept to only one side, i.e. the backers, when the philosophy is intended to apply to all parties equally. Anyway, we are drifting off topic, so I’ll end my point there.
I just did a test: if you move the headset left or right you can see the lens edges, and it’s exactly the same effect. As soon as I open my other eye, it disappears.
I am really not worried about that.
It should be fixed indeed. That said, it will be reintroduced later with foveated rendering.
I really hope that foveated rendering gets adopted as soon as possible. It will have a huge impact on PC requirements and immersion. I hope that developers are already implementing it in their code, as an option.
It is really pointless to complain to Pimax about the distortion shown beyond a 180-degree FOV. The fact that they can take a viewport frustum from a game and extend it beyond 180 degrees at all is wizardry on their part as a company. No AAA game draws a camera angle beyond 180 degrees. That extra 10 degrees to the left and right of your FOV should, for all intents and purposes, not exist.
Gotta disagree with you. Or are you saying that everyone currently playing Elite Dangerous on a triple-monitor cockpit setup is just hallucinating or lying to themselves?
Please check your facts before you post something so wildly untrue and completely refutable.
Number of screens and FOV are not related; the FOV of the in-game camera and the screen space it is drawn to are completely independent.
Keep in mind this is not something implemented by game developers, because with no market for it, it’s not worth the investment. It is done engine-side; without access to an SDK you have no hope of seeing it.
It is not so magical, actually. While your total FOV for both eyes together is 200-ish degrees, the most peripheral parts are visible only to the eye on their respective side, so the frustum set up for the camera serving each eye is only something like 150 degrees.
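The geometry here can be sketched in a few lines of Python. A standard perspective camera projects onto a flat plane whose half-width is tan(fov/2), which diverges as the FOV approaches 180 degrees, so no single planar frustum can render 180 degrees; two narrower per-eye frustums canted outward can still cover ~200 degrees combined. (The specific angles below are illustrative assumptions, not Pimax’s published specs.)

```python
import math

def halfwidth(fov_deg):
    """Half-width of the projection plane at unit distance for a planar
    perspective frustum; diverges as the FOV approaches 180 degrees."""
    return math.tan(math.radians(fov_deg) / 2.0)

# The plane width blows up long before 180 degrees is reached.
for fov in (90, 150, 170, 179, 179.9):
    print(f"{fov:>6}: {halfwidth(fov):14.2f}")

# Assumed illustrative numbers: each eye's camera covers ~150 degrees
# and is canted outward by ~25 degrees, so the two frustums overlap in
# the middle and together span ~200 degrees horizontally, while no
# single frustum ever reaches 180.
per_eye_fov = 150.0
cant_angle = 25.0
combined_fov = per_eye_fov + 2 * cant_angle
print("combined horizontal coverage:", combined_fov)
```

This is also why canted dual-display designs can exceed what any one in-game camera could draw: each eye gets its own sub-180-degree frustum.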
BIG lag at 0:17 in the Arizona Sunshine footage with the Pimax 8K v5 prototype.
No lag is visible in the through-the-lens scenes when not shooting.
How can someone play with such issues?
What lag?
You will have to describe it better.
The game freezes for about 0.5 seconds. Check the muzzle flash on the monitor; I hate those freezes while playing.