Radeon VII Support

#41

Well, they still have time until the February 2019 retail launch, and partners will probably optimize that side on their own, plus add their own cooling solutions.


#42

Now one news site, ExtremeTech, claims it’s only 64 ROPs, but that doesn’t translate into huge performance gains…


#43

The gist with this new card from AMD is this: it probably doesn’t have 128 ROPs.

Yes, you are getting roughly equivalent performance to Nvidia’s last-gen top-tier card, i.e. the 1080 Ti, and yes, the price is the same as the 1080 Ti’s was at its launch.

However, you are getting Vega on 7 nm, a new node (which means there has actually been an improvement to the core Vega architecture when comparing gen-1 Vega to gen-2, and it’s probably more expensive for AMD to make).

Turing, by contrast, is not aiming for architectural improvement except on the highest-tier card, but seeks pared-down performance equivalency combined with niche features. On the 2080 you get less RAM, a narrower bus, less throughput, the same performance, and the same or greater price.

  1. Price did not increase to the consumer relative to the performance gain on Vega II. You are paying what a new 1080 Ti would have cost, or what the clearly cut-down 2080 currently costs for equivalent performance. (Nvidia is trying to sell us less architectural improvement as if it were more, at a steeper price. AMD’s price is consistent with the performance you are getting relative to last gen, but that is a net improvement in AMD’s own product stack.)

When Nvidia switches to 7 nm, you can bank on the prices being $50-$100 more expensive for performance that will be equivalent to this generation’s “higher-end” cards, the 2080 and 2080 Ti.

  2. It seems likely that AMD could devise software solutions to roughly approximate Nvidia’s proprietary software.

  3. Vega offers a frame buffer that eclipses all but Nvidia’s $3,000-and-up Quadros (see the sketch right after this list for a quick way to confirm the VRAM size on your own card).
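
If anyone wants to sanity-check that frame buffer claim, here is a minimal sketch (nothing AMD- or Pimax-specific, just the standard DXGI adapter enumeration on Windows) that prints each GPU’s dedicated VRAM:

```cpp
// Minimal sketch: enumerate GPUs via DXGI and print their dedicated VRAM.
// Uses only the standard Windows DXGI API; link against dxgi.lib.
#include <dxgi.h>
#include <cstdio>

int main() {
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
        return 1;

    IDXGIAdapter* adapter = nullptr;
    for (UINT i = 0; factory->EnumAdapters(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC desc = {};
        adapter->GetDesc(&desc);
        // DedicatedVideoMemory is the on-board frame buffer in bytes;
        // a Radeon VII should report roughly 16 GB of HBM2 here.
        wprintf(L"%s: %llu MB dedicated VRAM\n", desc.Description,
                (unsigned long long)(desc.DedicatedVideoMemory / (1024 * 1024)));
        adapter->Release();
    }
    factory->Release();
    return 0;
}
```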

Nvidia has better driver support that offers immediate gains, but AMD aims for longevity in the true sense.

The Fury X, for example, trades blows with the 1070/1070 Ti in Doom (2016), even though it was a generation old by the time Pascal launched.

AMD may be having driver issues, but you can tell they are a company doing the best it can, considering they have had two mediocre generations of GPU hardware launches.


#44

So anybody got a Radeon VII for the Pimax?


#45

Very interested in hearing about that 16 GB of HBM2 :heart_eyes: though it may need some time for the drivers to mature.


#46

The drivers only have issues with temperature management in gaming rather than in server-compute scenarios. Besides that, the game optimizations should be Vega 10-like for the most part.

I’d really like to see some true VR FPS numbers. In some 5K-resolution benchmarks it looks really good.
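
If anyone wants to grab those numbers themselves instead of relying on flat-screen benchmarks, SteamVR exposes per-frame compositor timing that works the same for AMD and nVidia cards. A minimal sketch (assuming openvr.h from the OpenVR SDK and an already-running SteamVR session; this is generic SteamVR, nothing Pimax-specific):

```cpp
// Minimal sketch: read real headset frame timings from the SteamVR compositor.
#include <openvr.h>
#include <cstdio>

int main() {
    vr::EVRInitError err = vr::VRInitError_None;
    vr::VR_Init(&err, vr::VRApplication_Background); // attach to running SteamVR
    if (err != vr::VRInitError_None) {
        std::printf("SteamVR not running (error %d)\n", (int)err);
        return 1;
    }

    vr::Compositor_FrameTiming timing = {};
    timing.m_nSize = sizeof(vr::Compositor_FrameTiming); // required by the API
    if (vr::VRCompositor()->GetFrameTiming(&timing, 0)) {
        std::printf("frame %u: GPU render %.2f ms, presents %u, dropped %u\n",
                    timing.m_nFrameIndex, timing.m_flTotalRenderGpuMs,
                    timing.m_nNumFramePresents, timing.m_nNumDroppedFrames);
    }

    vr::VR_Shutdown();
    return 0;
}
```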


#47

My Radeon VII just arrived the day before my 5K+.

Unfortunately, both are still in their packaging, as they are in Tennessee and I am currently attending to medical issues in Michigan. Additionally, since I already have a Vega 64 (two, in fact, along with a 56) that are water-cooled to help manage heat, I would prefer to get this card water-cooled from the beginning; once I find and acquire a full-cover water block, I will install it and report back with my findings.

Hope this helps.


#48

Sorry to hear about your Michigan trip; wish you well.

But awesome to hear on your deliveries. :beers::sunglasses::+1::sparkles:


#49

I’m actually pretty happy with my Vega 64’s performance on my Pimax 5K+.
My Radeon VII is ordered, but I don’t expect it to arrive before March…
Since the Radeon VII is more or less a beefed-up Vega chip, I don’t expect there is much to change in the drivers. My guess is that it will just work.


#50

From what I hear, the performance of the VII doesn’t compete with the RTX cards, unfortunately.

Meaning it’s no worthwhile alternative for us, and NVidia will keep their prices in dreamland. This just plainly sucks. I need a better card; my 980 Ti is surviving, but I haven’t even started Project Cars 2 or Fallout with it yet, knowing it will be a slide show. I’m running out of time, but I really have a hard time paying a 50% surcharge just because some blokes at NVidia’s executive level can’t reset their brains from the mining Eldorado they enjoyed for a couple of years and thought we would just shrug our shoulders and pay them an extra $500 premium for no other reason than getting them that next fat bonus.

Normally I don’t write that sort of stuff, but this kind of thinking among executives is something that has astonished (and annoyed) me more than once in corporate life. Reality out there doesn’t care what you wish for, and just because you want your new toy (another house near South Beach, a fancy supercar, whatever it is) doesn’t mean people will just spend 50% more on an article they are used to replacing every 3-4 years.
NVidia’s Q4 was poor? Go figure.


#51

Still need to see tests in Pimax headsets to get an idea of how well it compares.


#52

The problem isn’t really the prices; the problem is the disappointing performance of the 20xx GPUs. The RTX 2080 Ti wasn’t really designed to be a gamer card; it’s the Titan version, renamed. (If you look at the video card specs, you will see that the 2080 Ti actually uses a different GPU chip than the 2080.) The 2080 is really the 2080 Ti, and so on.

Because the 20xx performance was so pitiful, nVidia renamed their products downward. If you compare the prices of the 2070 to the 1080, the 2080 to the 1080 Ti, and the 2080 Ti to the 10xx-based Titan, you will see that the prices make some sort of sense. The performance problem meant that no one would upgrade, since there’s not much performance improvement with the 2080 compared to the 1080 Ti.

NVidia dropped the ball on this and gamers are the ones to pay the price, literally!


#53

You are correct there. They basically changed the naming scheme to make up for the fact that the cards didn’t have much more rasterizing performance.

Maybe if they had focused on the rasterizing cores (instead of the AI/RT cores), the gaming performance would have been strong enough to justify the prices at their “traditional” values. The AI/RT cores should have been added to the Titan/Quadro lines.


#54

Don’t wanna disturb your fake news party, but an RTX 2080 is cheaper than the GTX 1080 I bought 2.5 years ago. It’s 30% faster; that’s a normal increase. But continue with your little circle of fun, I don’t care.


#59

STOP IT ALL

This thread is about Radeon support for PimaxVR.

Now grab your Nvidia posts and move along, dammit!


#72

Did anyone try this with the Pimax?


#74

This thread is about information and news regarding Radeon driver support and performance of a Radeon VII when used for PimaxVR.

Any discussion with jacked-up arguments about buying hardware is completely off topic. If you have nothing to add on topic, please leave and be so kind as to delete your garbage; you can still make your own topics about hardware buying recommendations and put it there.

Thank you.


#77

Has anybody using a Radeon card discovered a working Smart Smoothing function, aka Brainwarp?

@Sean.Huang: Only Fixed Foveated Rendering is RTX-exclusive at the moment, and GTX would be next; when will Radeon Polaris/Vega be supported by FFR?


#78

My most recent understanding is that there is absolutely no artificial frame-increase solution for AMD cards, irrespective of whether it’s WMR, SteamVR or Oculus.

If this still stands true, I doubt Pimax will be able to provide what the other big three haven’t been able to achieve on the software side.
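
For what it’s worth, the claim is at least testable on an AMD card: SteamVR’s frame timing reports whether a given frame was reprojected or synthesized by motion smoothing. A minimal sketch (assuming an openvr.h recent enough to define VRCompositor_ReprojectionMotion; this only covers SteamVR, not WMR or Oculus):

```cpp
// Minimal sketch: ask the SteamVR compositor whether the most recent frame
// was reprojected or motion-smoothed (i.e. artificially generated).
#include <openvr.h>
#include <cstdio>

int main() {
    vr::EVRInitError err = vr::VRInitError_None;
    vr::VR_Init(&err, vr::VRApplication_Background); // attach to running SteamVR
    if (err != vr::VRInitError_None) return 1;

    vr::Compositor_FrameTiming t = {};
    t.m_nSize = sizeof(t); // the compositor rejects the call without this
    if (vr::VRCompositor()->GetFrameTiming(&t, 0)) {
        bool async  = (t.m_nReprojectionFlags & vr::VRCompositor_ReprojectionAsync)  != 0;
        bool motion = (t.m_nReprojectionFlags & vr::VRCompositor_ReprojectionMotion) != 0;
        std::printf("async reprojection: %s, motion smoothing: %s\n",
                    async ? "yes" : "no", motion ? "yes" : "no");
    }
    vr::VR_Shutdown();
    return 0;
}
```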

Also, from what I understand (and this should probably be expanded on once the announced big update hits around February 18th), Pimax is using nVidia’s Turing-specific Variable Rate Shading (VRS) functionality of the RTX series to provide their current FFR solution. If this is true, only hardware that actually has that physical capability will be able to use it, akin to the VR-specific improvements between Pascal and Turing, such as support for rendering multiple viewports in the same rendering pass.
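
For reference, hardware VRS is a capability an application can query directly, which also shows why a software fallback for older cards isn’t trivial. Pimax’s actual integration presumably goes through nVidia’s proprietary driver API rather than this path, but the vendor-neutral D3D12 check looks roughly like this (a sketch, assuming a Windows 10 1903+ SDK that defines D3D12_FEATURE_D3D12_OPTIONS6):

```cpp
// Minimal sketch: query whether the GPU exposes hardware Variable Rate
// Shading, the capability a Turing-only FFR path would rely on.
#include <d3d12.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

int main() {
    ID3D12Device* device = nullptr;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 __uuidof(ID3D12Device), (void**)&device)))
        return 1;

    D3D12_FEATURE_DATA_D3D12_OPTIONS6 opts = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6,
                                              &opts, sizeof(opts)))) {
        switch (opts.VariableShadingRateTier) {
        case D3D12_VARIABLE_SHADING_RATE_TIER_NOT_SUPPORTED:
            std::puts("No hardware VRS (e.g. Pascal, Polaris, Vega)"); break;
        case D3D12_VARIABLE_SHADING_RATE_TIER_1:
            std::puts("VRS tier 1: per-draw shading rate"); break;
        case D3D12_VARIABLE_SHADING_RATE_TIER_2:
            std::puts("VRS tier 2: screen-space shading-rate image (Turing)"); break;
        }
    }
    device->Release();
    return 0;
}
```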


#79

For the Oculus Rift CV1, the TimeWarp/SpaceWarp feature started working about 2-3 weeks later than on nVidia’s cards and is still working (it also did on my DK2 back then).
When @Sean.Huang explicitly posts that something specific is only supported by RTX cards, the assumption is that, when no such information is given, a feature works for the whole DirectX stack of cards, just like PiTool itself.