Radeon VII Support


#1

Can some PimaxVR staff (@Pimax-Support @xunshu) comment on upcoming progress on AMD Radeon support?
I would like to buy a Radeon VII (the Vega 2 16GB GPU) for gaming right when it launches on February 7th.

Unfortunately, lately there have been only complaints and no solutions regarding Radeon Vega performance with the Pimax 8K…


#2

Maybe @mixedrealityTV or a Pimax member at CES can ask AMD.

@xunshu @PimaxVR


#3

A 16GB framebuffer is huge. Great for Pimax if they can get it working.

If we are extremely conservative about Radeon VII's gains, this card will perform comfortably where an overclocked non-Ti GTX 1080 does in all games. Why do I say that? It's based on Gamers Nexus' overclock of a first-gen Vega with a 242% power table mod: in that video it traded blows with a 2070 in some titles and was ahead of a 1080. Radeon VII is essentially a die shrink, so I don't see it overcoming the usual drawbacks.

That said, I’m buying this over BS RTX any day!

The trouble with Vega is maintaining high clocks without hitting thermal and power limits. I also want this card because it's an awesome deal at $699.

16GB of HBM2? A terabyte per second of memory bandwidth? Yay!
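For anyone wondering where the "1 TB/s" figure comes from, it falls straight out of the HBM2 bus width times the per-pin data rate. The stack count, bus width, and data rate below are the commonly reported Radeon VII figures, assumed here for illustration:

```python
# Back-of-the-envelope HBM2 bandwidth estimate.
# Assumed Radeon VII figures: 4 stacks x 1024-bit interface, 2.0 Gbps per pin.
stacks = 4
bits_per_stack = 1024
gbps_per_pin = 2.0

# Total bus width in bits, times data rate, divided by 8 to convert bits -> bytes.
bandwidth_gb_s = stacks * bits_per_stack * gbps_per_pin / 8
print(bandwidth_gb_s)  # 1024.0 GB/s, i.e. ~1 TB/s
```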


#4

I don't see how it won't flop unless it can do better than a 1080 Ti/2080 in the majority of games. I'm as sick of the Nvidia hegemony as anyone else, but there's obviously no way a card worse than the 2080 will sell at the same price point. And so far, according to Lisa Su, it's only about 30% better than the Vega 64 (roughly GTX 1080 class), which I believe puts it just below the 1080 Ti/2080.


#5

AMD cards are often not taken advantage of properly. My old R9 390, for example, is finally seeing some interesting gains, as it can run the 5K+ and 8K (I don't really recommend it, though).

Interestingly, Nvidia is finally going to support FreeSync, and AMD is dominating CES.


#6

It's going to take 300W or more to power this beast!
https://www.anandtech.com/show/13832/amd-radeon-vii-high-end-7nm-february-7th-for-699


#7

They could have priced it lower; there's no need to make us pay for 16GB of memory.


#8

Judging by how AMD configures these cards out of the box (i.e. more voltage than is needed), I'm hoping that once it's dialed in it will consume less power.


#9

The biggest surprise to me is that AnandTech and VideoCardz claim it has double the ROPs compared to current Vega GPUs: 128 instead of 64. The Vega 56 already showed that it had a better shader/ROP ratio and was more efficient; with 128 ROPs, the bottleneck for costly per-pixel operations should be history.

If that holds true, then combined with the insane 1 TB/s memory bandwidth, not to mention the 16GB of VRAM, it would make this the best bet for the very high resolution rendering and supersampling that VR needs.

Given the price point, on paper the new Radeon VII should be better suited for VR than a 2080 or even a 2080 Ti, since ray tracing and DLSS don't apply here. In theory it's actually on par with the new Ti in peak compute (FLOPS), and the extra ROPs and memory bandwidth should offer the performance to better maintain 90 FPS at VR resolutions.

But it all comes down to the software stack and how well it performs in games and VR. So that is my question for @Pimax-Support @PimaxVR: what about the software optimization PimaxVR is supposedly working on with AMD's driver developers?
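To put rough numbers on the 90 FPS argument above: a quick sketch of required pixel throughput versus the theoretical peak fill rate of 128 ROPs. The panel resolution and boost clock below are illustrative assumptions, not measured specs, and real frames involve many passes, overdraw, and shading work far heavier than a single ROP write per pixel, so treat this only as an upper-bound sanity check:

```python
# Pixels per second needed to feed both eyes at 90 FPS,
# versus theoretical peak ROP fill rate.
fps = 90
eye_w, eye_h = 3840, 2160            # assumed per-eye panel resolution (Pimax "8K" class)
required_gpix_s = 2 * eye_w * eye_h * fps / 1e9   # both eyes, one write per pixel

rops = 128                           # the ROP count claimed by AnandTech/VideoCardz
clock_ghz = 1.8                      # assumed boost clock
peak_fill_gpix_s = rops * clock_ghz  # theoretical single-pass peak fill rate

print(round(required_gpix_s, 2), peak_fill_gpix_s)  # ~1.49 vs 230.4 Gpix/s
```

The headroom ratio is the interesting part: raw fill rate is not the limiter at a single write per pixel, which is why the expensive per-pixel shading and blending passes are where the doubled ROPs would actually matter.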


#10

Be very careful: Nvidia supports variable rate shading, multi-projection acceleration, texture space shading acceleration, and AI-based filling, which AMD doesn't seem to have and which are extremely useful for VR.

Nvidia has been working closely with Pimax, and unless it was all about Brainwarp, there is a chance they were also working on foveated rendering and high-FOV rendering together with Pimax for the upcoming eye-tracking module. In that scenario AMD would not benefit from it. It's a risk to take into account.


#11

It's awesome, and not even just for VR. Look at all of the 4K 120Hz panels at CES this year.

I don't even care if it's slightly slower than Nvidia's chips; with those specs at that price, it's got me excited.

I have a 1080p 144Hz panel. Nice to know I will be able to supersample like crazy.
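Supersampling a 1080p target really is cheap on VRAM, for what it's worth. A minimal sketch, assuming a single RGBA8 color buffer (no depth buffer or MSAA counted) and a per-axis supersample factor:

```python
# VRAM cost of one RGBA8 color buffer at a per-axis supersample factor.
# Illustrative sketch; textures and other buffers dominate real usage.
def target_mib(width, height, ss, bytes_per_pixel=4):
    """Size in MiB of a color target rendered at (width*ss) x (height*ss)."""
    return width * ss * height * ss * bytes_per_pixel / 2**20

for ss in (1.0, 1.5, 2.0, 4.0):
    print(ss, round(target_mib(1920, 1080, ss), 1))
# 1.0 -> 7.9 MiB, 1.5 -> 17.8, 2.0 -> 31.6, 4.0 -> 126.6
```

Even 4x per-axis supersampling stays barely over 100 MiB for the buffer itself, so a 16GB card leaves the rest for assets.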


#12

Any sources on that? I have a FreeSync monitor, but I had to turn FreeSync off when I switched from AMD to Nvidia, so that would be awesome.
Or is it the other way around?


#13

They are going to support adaptive sync on some monitors; they are calling it "G-Sync Compatible". There will be a setting in the Nvidia Control Panel when the update goes live.


#14

I'm on a 4K 75Hz 21:9 monitor, and I do not regret not having 1080p at 144Hz, because I work a lot on my PC. But for gaming I figured 1080p at 144Hz would have been the better choice, so the new monitors look very promising.
I think I'll wait for the follow-up generation though, as my current monitor is still fairly new and okay.


#15

As said, it's all about the software stack, and we need a comment from PimaxVR on this.

Even the last GPU generation had VRWorks for Nvidia and LiquidVR for AMD, and those claimed hardware features haven't shown a relevant impact to this day. Today is what counts; in 6-12 months there will be new GPUs again… and again… and again…


#16

There are only 10 FreeSync monitors (out of 400 being tested) that are "G-Sync Compatible", and all of them cost about as much as a G-Sync monitor.


#17

That's only half the story. Those are certified to meet certain hardware criteria, such as brightness (nits), contrast, and others, that Nvidia deemed "good enough" to allow a "G-Sync Compatible" logo.

But the option can be turned on via the driver for pretty much any monitor that supports adaptive sync.


#18

It doesn't have the GPU grunt to make use of 16GB of memory, so really, what is the point of paying for memory you won't be able to use?

Most VR games seem to have much better support for Nvidia cards as well, and then there is the AMD driver issue, which means you can't even really use an AMD card with the Pimax right now. I don't see why you would get this new Vega card over a 1080 Ti, or even a 2080, which seems to have more utility for the foreseeable future.


#19

Look at @Lillo's recent posts. He posted an article on Nvidia adding FreeSync support:

"Nvidia at last opens up: G-Sync will be available on FreeSync monitors"


#20

My R9 390 had the grunt for its 8GB of memory. It's more that devs haven't done proper support in the past.

Just look at how powerful the AMD-based gaming consoles are with proper utilization.