Radeon VII Support



Nvidia actually is one of the first to receive new AMD GPUs.


But I think @geoffvader is right.

All we have today regarding AMD Vega and the Pimax 8K is lackluster experience reports and some claims on this forum that PimaxVR and AMD are working on it (I forget where that was; it might have been one of your posts?)

With regard to the new AMD Radeon VII, a statement from PimaxVR on the likelihood of upcoming fixes would be really appreciated. It's in a comparable price/value position to the latest Nvidia cards, comes with different advantages, and has all the power needed for today's VR experiences with a Pimax 8K. That makes it a very relevant option, but we really need some more confidence before shelling out that 700+ bucks.

@xunshu can you help us with some more info on that matter?


Yes, the AMD Vega driver has been messed up since its early release, as 4K owners know.

My contact at AMD said they are aware their driver has some issues, since they have been able to work on it with a Pimax headset over the last 2 months. A user has reported good gains; he had to switch from Auto GPU to a manual SS override.

Truth is, AMD GPUs are often not properly utilized, as demonstrated by consoles and games made for AMD GPUs. For example, 2x 7950 ran my 4K fine. My R9 390 can run the 8K, though I don't recommend it as it's limited in how many titles it can handle; but the point is, it's being utilized well.


This. Hopefully continued development in Vulkan and drivers will eventually allow AMD cards to be equally utilized. It's not just GPUs; AMD has been hurt in CPU performance as well. Level1 has done a great job of showing how the Windows kernel has been negatively affecting Threadripper. Intel and Nvidia have enjoyed control of the market for way too long. I am really looking forward to the continuation of this positive momentum and really hope my next GPU is AMD-based.


Long live Linux! :beers::sunglasses::+1::sparkles:


It seems it will be a solid 20 FPS faster than a Vega 64, it's going to be available for $699 in contrast to the $850 that many 2080s are going for, it comes with 3 free games, and I honestly don't want to give Nvidia any more of my money. That company has been so anti-consumer that it's incredible. The 2080 is actually slower than the 1080 Ti, and the card that is 30% faster is going for more than $1200. If someone buys Nvidia right now, they're sending a message that they are OK paying through the nose for mediocrity.


300W is nothing :stuck_out_tongue: Try a 295X2


I can't speak to what prices are like in your area, but here, 2080s are going for right around the same price this looks like it will be released at. With most VR games favouring Nvidia cards, you'd have to really hate supersampling to go with AMD (assuming they even fix their drivers to work with the Pimax). 1080 Tis are going for even less.


It's all nice discussing the pros and cons of new GPUs from both camps.
However, this thread is about getting PimaxVR and AMD to act and fix the current software issue that makes Vegas underperform below expectations, whereas with other VR gear, or even other Radeon cards, that is not the case.

While the Vega 56, like the GTX 1070, might not really be appropriate for the Pimax 8K, the new Vega variant certainly is.


I currently have a 3 GB 1060; it's a good card. That said, Nvidia is playing every anti-consumer card they have right now, and folks are buying into it like lemmings. Nvidia can kiss both sides of my rear. In my area a 2080 is about $800-$900.

Right now they're even stealing mindshare credit for FreeSync in the name of "G-Sync Compatible," while they have bashed generic adaptive sync tech for years as subpar, talking up their proprietary, expensive G-Sync module garbage.

The 2080 Ti is only 30% faster than the 1080 Ti, and at almost double the price. I don't want to buy a used Pascal card when just a couple of months ago I could have bought one new. I would like to buy a new one without useless proprietary b******* features that 12 games will use.

DLSS is technically just fancy upscaling (giving Nvidia a convenient excuse to cut ROPs, TMUs, and memory bandwidth, all while calling that a 4K card), and until ray tracing becomes fully-fledged path tracing, a la Otoy, it's really just a gimmick.

AMD is already better at compute-oriented tasks than Nvidia is, because their GCN graphics architecture has been preparing for compute-oriented tasks way longer.

It wouldn't surprise me one bit if AMD could literally push a driver update and enable ray tracing when the feature becomes market-dominant.

We'll see how the Radeon VII performs in compute, but if it can approach a Titan V, it will be able to ray trace without RT cores and tensor cores, at least at the level of a 2070.

The first GPU I ever owned was an Nvidia GPU, and I haven’t had a Radeon card since it was ATI.

I agree 100% that Pimax needs to get on the stick with AMD and fix those drivers, because I'm buying a Radeon VII.

And here we have it: as of March 15th, 2019, ray-traced reflections running at 4K 30 Hz on a Vega 56!

@Heliosurge @Lillo


@VRGIMP27 :+1::+1: Impressive bro !!

This is a perfect example of what an accurate analysis of the tech offered by the companies can lead a correctly informed consumer to discern…

If we applied this in our everyday thinking, especially regarding technology and consumer products, most if not all of the junk-producing and unethical companies would go out of business in no time, or at least change to avoid complete failure, like what is happening to Nvidia right now…

AMD has shown a path: changing their ways toward a better, broader, and more ethical way to do business and produce tech, with a closer eye on the future needs of the community, and this alone should be wholeheartedly encouraged.

And for those looking for a G-Sync compatible monitor, I think one way to ensure compatibility is to choose one of the newer FreeSync 2 monitors, since these have the required panel specs and hardware to go even higher than Nvidia monitors, up to 200 Hz; that would more than cover the compatibility requirements.


That looks like you are comparing waterblocked/top-tier AIB card prices to AMD's reference model price; the 2080 is on Newegg from $699 and includes 2 games too. I mean, if you are already doing that kind of mental gymnastics to justify the Vega, it seems you are pretty dead set on getting one regardless of whether it even works for your use case or not.

I’ve waited over a year in the past for AMD to fix a driver bug before getting fed up and swapping back to nvidia, so I hope it works out better for you with this one.


I have a Vega 64 and it is running my Pimax 5K+ perfectly fine.

PiTool rendering quality at 1, Steam supersampling at 60%-150% depending on the game. It is giving similar performance to a GTX 1070 Ti or 1080.
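To see why those two sliders interact the way they do, here is a rough sketch of the arithmetic. The base render target, and the assumption that PiTool's quality setting scales each axis linearly while SteamVR's supersampling percentage scales total pixel count (so each axis scales by its square root), are illustrative assumptions, not official numbers:

```python
# Rough sketch of how PiTool quality and SteamVR supersampling might combine.
# ASSUMPTIONS (not official): PiTool "rendering quality" scales each axis
# linearly; SteamVR SS percentages scale the pixel COUNT, i.e. sqrt() per axis.
import math

def render_target(base_w, base_h, pitool_quality, steam_ss):
    """Approximate per-eye render resolution under the assumptions above."""
    axis_scale = pitool_quality * math.sqrt(steam_ss)
    return round(base_w * axis_scale), round(base_h * axis_scale)

# Hypothetical base render target per eye; the real value depends on the
# headset, FOV setting, and distortion profile.
BASE_W, BASE_H = 2560, 1440

for ss in (0.6, 1.0, 1.5):
    w, h = render_target(BASE_W, BASE_H, pitool_quality=1.0, steam_ss=ss)
    print(f"SS {ss:.0%}: {w}x{h} ({w * h / 1e6:.1f} MPix per eye)")
```

The point of the sketch: going from 60% to 150% SS is a 2.5x jump in pixels rendered per eye, which is why the right percentage is so game-dependent.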


Wow, that's some positive news I didn't expect. So you have a reference model with what kind of CPU?
Was that on small/normal/wide FOV? How do you know it's in the 1070/1080 ballpark?

I wonder what the issue is with those users reporting 20 FPS with Vega, and I hope the community and PimaxVR/AMD can shed some light on that (!?!)

So what is the current status of Vega GPU support for Pimax8k?


Prior to my 1070 Max-Q laptop I was running an AMD 390. I don't really know what driver issues people talk about, but that was a good card, and the AMD drivers never caused Windows 10 to crash, unlike the great Nvidia drivers I run for the 1070.

I will build another desktop down the track and it will be most likely AMD all the way as both Intel and Nvidia have bad market practices.

My 27" 1440p FreeSync BenQ monitor runs relatively well with the 1070, utilising full screen with V-sync enabled. Frames in IL-2 vary from 80 to 140 with no issues, so I am not sure what "G-Sync Compatible" aka FreeSync will bring to the table.

Anyway, this card refresh from AMD is a placeholder until AMD gets Navi-based cards out the door later this year. I wish them the best in that, and hope they do to Nvidia what they have done to Intel in the CPU sphere.


The Vega 56 and 64 were starved for bandwidth, and it slowed the cards down. With 16 GB of HBM2 I think they are trying to alleviate that.


When I first ran a game (IL-2 BoS) after getting my Pimax I was also getting 20-25 FPS. My first thought was that Pimax had not sorted this crap, and it was reported months ago.

It was after messing with settings that I saw Steam SS was at 200% and it was killing performance. I had also forgotten that I had set the in-game graphics to ultra for 2D gaming rather than VR, which is more demanding.

Dropping settings to medium/high and the SS to realistic levels gave much more expected results.

I have a Ryzen 2700X with a mild overclock using PB2 (Precision Boost 2).



I don't think it was a bandwidth problem, or a memory quantity matter; Vega 56 and 64 had over 400 GB/s of bandwidth to spare on a 2048-bit HBM2 bus, almost on par with the GTX 1080 series. It was probably a combination of software optimization and the fact that AMD made only a partial jump to the new memory architecture without optimizing well enough to work with it. In simpler words, they had not optimized memory access well enough to fully exploit it. The Radeon VII now has a 4096-bit HBM2 bus (the world's first GPU with it, according to AMD) and close to 1 TB/s of bandwidth to spare, plus some memory access optimizations strictly adapted to HBM-type memory that should address the previous problems, provided they adapt the driver side well enough for it.
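The bandwidth figures quoted above can be sanity-checked with back-of-envelope math: peak bandwidth is bus width (in bytes) times per-pin data rate. The per-pin rates below are my assumptions for these cards, not official spec-sheet values:

```python
# Back-of-envelope memory bandwidth check for the figures quoted above.
# bandwidth (GB/s) = (bus_width_bits / 8) * per-pin data rate (Gbps)
def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s for a given bus width and pin rate."""
    return bus_width_bits / 8 * data_rate_gbps

# Assumed HBM2 per-pin data rates (illustrative, not official):
vega64 = bandwidth_gbs(2048, 1.89)   # 2048-bit bus -> roughly 484 GB/s
radeon7 = bandwidth_gbs(4096, 2.0)   # 4096-bit bus -> 1024 GB/s (~1 TB/s)
print(f"Vega 64:    ~{vega64:.0f} GB/s")
print(f"Radeon VII: ~{radeon7:.0f} GB/s")
```

Under those assumed pin rates, doubling the bus width is what pushes the Radeon VII from the ~480 GB/s class to roughly 1 TB/s, matching the numbers in the post.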

It however remains to be seen how the card will perform in the real world, and how close AMD's announced performance slides are to reality; but if they hold up, this is going to be the fastest compute- and ray-tracing-oriented card on the market again, a crown Radeon cards have held multiple times in past years.


The real question is: have they optimized the power delivery well enough to actually hit the peak boost clocks and stay there?

Every video I have seen where the card is undervolted with power limit set to max has seen good gains.