9900K. Potential saviour for 8K?


#41

Yep, I do. The risk he was talking about is the fact that EPYC, up until Zen 2, hadn't earned several certifications (the kind that take years of steady supply to gain). Not being certified by, say, Cisco can be an automatic perceived high risk to a good chunk of companies and a complete disqualifier. Zen 2 is going to be the first set of EPYC chips to be certified by such organizations.

The thing you have to understand as well is that these corporations do not care about initial cost. They care about density and cost of ownership, both of which AMD has huge advantages in:

Density: AMD can fit 128 cores and 256 threads on a single motherboard with the current gen (next gen is pretty much guaranteed to be 256/512, and according to leaks maybe more…), with a whack ton more memory (4TB per socket x 4 sockets) and a monstrous number of PCIe lanes at 128 x 4 (something VERY important to data-centric services such as Dropbox… which has now completely switched over to EPYC). Intel can only fit 96 cores and 192 threads on a single board, with 3TB of memory per socket x 4 sockets and 36 PCIe lanes each (other Xeons have more PCIe lanes, up to 48, but don't have as many cores).

Cost of ownership: AMD's processors as-is already offer pretty significant power savings. The higher density allows for fewer servers, which leads to lower electricity usage and lower HVAC cooling bills. On top of that, even at the same number of physical processors, AMD's are still more efficient, with the highest-end current EPYC processor having a 180W TDP and the Intel processor above having a 220W TDP. Add on top of this 7nm having a 60% per-clock power efficiency advantage over 12nm LP (current EPYC is on 14nm), and suddenly Zen 2 is poised to be an insanely efficient and cool chip, with insane density and insane short- and long-term cost savings. Insane!
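
Just to tally the two points above, here's a quick back-of-envelope using the figures quoted in this post, taken at face value (the $0.10/kWh electricity price and the 2x cooling multiplier are illustrative assumptions on my part, not real data-center numbers):

```python
# Per-4-socket-board density and rough yearly power cost, using the
# figures quoted above. kWh price and cooling multiplier are assumptions.
SOCKETS = 4
KWH_PRICE = 0.10        # assumed $/kWh
COOLING_FACTOR = 2.0    # rule of thumb: cooling roughly doubles power cost
HOURS_PER_YEAR = 24 * 365

# name: (cores, threads, TB RAM, PCIe lanes, TDP watts) -- all per socket
platforms = {
    "EPYC (current gen)": (32, 64, 4, 128, 180),
    "Xeon (top 4S part)": (24, 48, 3, 36, 220),
}
for name, (cores, threads, mem_tb, lanes, tdp) in platforms.items():
    kwh = tdp / 1000 * HOURS_PER_YEAR * SOCKETS
    cost = kwh * KWH_PRICE * COOLING_FACTOR
    print(f"{name}: {cores * SOCKETS}c/{threads * SOCKETS}t, "
          f"{mem_tb * SOCKETS} TB RAM, {lanes * SOCKETS} PCIe lanes, "
          f"~${cost:,.0f}/year in power+cooling at TDP")
```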

As a bonus…
Security: Now that EPYC is earning the same certifications as Intel, you have a more even playing field where processors can compete on the content of their character rather than the color of their advertising. Intel has several low-level exploits that cost performance to fix, whereas AMD is now the inherently more secure chip (even with Intel's fixes). AMD is affected by some of these exploits too, but they require local, in-person admin access to the machine, and by that point the attacker has already won and can do scarier stuff than Spectre or Meltdown. That means several organizations are going to choose AMD over Intel simply because it's more secure. Intel isn't able to completely fix this issue without creating an entirely new architecture, which they haven't done in over a decade. That's why so many chips spanning so far back are affected. And remember how long it takes to create a new architecture? 6-8 years. Meaning if Intel started when this stuff broke, they have 5-7 years to go.
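
As an aside, if you want to see which of these mitigations (and their performance costs) your own machine is actually running, Linux exposes them through a standard sysfs directory. A minimal sketch:

```python
#!/usr/bin/env python3
# Dump the kernel's view of CPU vulnerability mitigations (Linux only).
# Each file under this standard sysfs directory is named after an exploit
# (meltdown, spectre_v1, spectre_v2, ...) and reports the active mitigation.
from pathlib import Path

vuln_dir = Path("/sys/devices/system/cpu/vulnerabilities")
if vuln_dir.is_dir():
    for f in sorted(vuln_dir.iterdir()):
        print(f"{f.name:25s} {f.read_text().strip()}")
else:
    print("No vulnerabilities directory - kernel predates these reports.")
```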


#42

I hope you are right and that AMD does come out with a better CPU than Intel within the next year. It would definitely be good to have the competition. I can't even remember the last AMD chip I bought. I would love to see similar competition on the GPU front again as well.


#43

I would love to see similar competition on the GPU front again as well

Unfortunately that won't be happening. Nvidia has released the 20 series on 12nm LP now in order to do what they do best: repeatedly charging the high-end consumers (see: Titan X, XP, Xp). By releasing it now they give AMD the opportunity to release Navi on 7nm, to which Nvidia will respond by immediately releasing the 21 series on 7nm. It will be a huge leap, sure, but Navi will not compete with 7nm Turing - at least not in the metric we all here on the Pimax forums care about (the highest end possible). I see Navi as hitting 1080Ti-level performance for a decent cost ($350? $400?). What AMD needs to do to get a Ryzen-style comeback for RTG is embrace the chiplet architecture there as well, which Nvidia had not started looking into last I checked. AMD is poised to release a chiplet GPU within 2-3 generations, optimistically. Unfortunately we're going to enter a bit of a dark age in graphics as we did in CPU power. Maybe not as long, but it's going to happen (see: 20 series). Fortunately for us VR players foveated rendering will ease the pain significantly, but for our pancake cousins the news is not good.


#44

Wow.

I can’t believe I watched all of that but it really helped me.

You see, anyone could see that Athlon destroyed Intel, and yet Intel was still ahead. I couldn't understand why Intel was everywhere even though AMD was so much better. And whilst many of my friends had Intel, in the same way that they chose to wear Nike etc. (and I hate fashion), I went with what I thought was best and built my own AMD systems. It's weird, because I always thought they're not as good, or there's something wrong with them, or surely they would be market leaders. This was, in my mind, proven when they vanished and Intel took over. But something just didn't feel right. Why were AMD, who were obviously far better at CPU creation in my mind, suddenly so crap? I just assumed they used some flash-in-the-pan, one-hit-wonder, short-lived tech that temporarily beat Intel, but they probably did it the wrong way, it wasn't sustainable, they chose the wrong tech path and then just "lost their way". I just assumed AMD were crap. Then when there was all this hype about Zen I got temporarily excited, only to see it wasn't as good as Intel after all. I assumed AMD had lost their way, relying on multiple cores instead of actually making good chips.

But oh my, this video is such a revelation. The poor guys at AMD were and still are far better innovators, just as I initially believed, and I now feel my twenty-something younger self is totally vindicated for believing that. It all makes so much sense. Something was just wrong and I couldn't work it out. Fuck Intel.

Having said fuck Intel, though, I'm no fanboy of anyone, and whilst I have principles, at the end of the day you want the fastest CPU, and if the 9900K still beats Zen 2 for gaming I'll get that. But I will almost certainly wait if I can after learning this; that's the least I owe AMD. And despite not being a fanboy, I bloody well hope AMD have turned the corner and will destroy Intel.


#45

You don't think AMD's upcoming 7nm will beat RTX?

Or are you saying it will, but the advantage will only be short-lived?

And there's a chap here who for some reason refuses to buy Nvidia. They're not dodgy like Intel, are they? (Although I'm aware all corporations are dodgy to some degree.)


#46

Yeah, there are two main reasons AMD lost to Intel in single-core performance:

  1. Intel's 14nm node is actually really wonderful and very well refined. As I said earlier, the fewer errors in a chip, the better it performs, and Intel has been on 14nm for so long that the process is extremely mature. It allows them to clock higher than GloFo 14nm or 12nm LP. But Intel has hit the limit of it; it's just an endless money pit trying to improve it further now. And if they can't get off of it, that's when they're in trouble.

  2. Those issues I mentioned earlier impact single-core performance a decent amount.

AMD actually beats Intel in multi-core performance; the decision for AMD to release 8c/16t as the first mainstream Zen chips was more to force Intel off their high horse and to get game developers to wake up. And they're not garbage cores like FX, they're very high quality. For multi-threaded loads Intel can't touch AMD at the moment. But very few games support more than 4 cores. Now, a few years later, AAA and indie titles are starting to support more. The new Tomb Raider game, Battlefield V, the Egyptian Assassin's Creed game and the new one, etc. all support higher core counts and thus perform better on AMD processors. Game devs are waking up, and the i5 of yesteryear won't cut it anymore. Shout out to @StonebrickStudios for having the only VR game I know of that scales linearly up to 32 cores.
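
To put a number on why linear scaling to 32 cores matters, here's a quick Amdahl's-law sketch (the 95% parallel fraction is an assumed figure for illustration, not a measurement from any particular game):

```python
# Amdahl's law: speedup(n) = 1 / ((1 - p) + p / n), where p is the
# fraction of the frame's work that can run in parallel.
# p = 0.95 is an illustrative assumption, not a measured figure.
def speedup(p: float, n_cores: int) -> float:
    return 1.0 / ((1.0 - p) + p / n_cores)

for n in (4, 8, 16, 32):
    print(f"{n:2d} cores: {speedup(0.95, n):4.1f}x")
# Even at 95% parallel, 32 cores only buys ~12.6x. Scaling linearly to
# 32 cores (as in the shout-out above) means almost nothing serial is
# left on the critical path.
```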


#47

7nm Navi will compete decently with the RTX 20 series (not including the 2080Ti), but yes, it will be short-lived once the 21 series arrives, which Nvidia is no doubt already sampling.

Nvidia is pretty dodgy. In my opinion, not nearly to the same degree as Intel (though they REALLY started to push it with the GeForce Partner Program, where they pressured OEMs into reserving brands like STRIX, ROG, etc. for Nvidia cards to make AMD cards seem like knockoffs. They backed off after community backlash). You can see a video here on Nvidia by the same guy you just watched; it's also worth a watch, even if a little long:

Nvidia's biggest shady move, IMO, was the GameWorks stuff they pushed hard to have implemented around 2012, which added an absurd amount of tessellation on purpose in order to sabotage the performance of AMD cards. It actually hit both AMD and Nvidia cards in performance, but it obviously hit AMD cards a lot harder. They were willing to give their customers a worse experience so that they could benchmark higher and sell more cards.
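
For a sense of why piling on tessellation is such effective sabotage, here's a rough sketch of the scaling (the ~2N² triangles-per-quad-patch figure is an approximation, and the factors are illustrative):

```python
# Rough triangle-count scaling for quad-patch tessellation: a tessellation
# factor of N subdivides each patch into roughly 2 * N^2 triangles, so
# cranking the factor is a quadratic (not linear) cost. Illustrative only.
def tris_per_patch(factor: int) -> int:
    return 2 * factor * factor

for f in (8, 16, 64):
    print(f"factor {f:2d}: ~{tris_per_patch(f):,} triangles per patch")
# Factor 64 vs 16 is a 16x jump in geometry work -- invisible at normal
# viewing distances, but it hammers GPUs with weaker tessellation engines.
```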

Nvidia also runs aggressive guerrilla marketing to perpetuate the myth that Nvidia drivers are more stable than AMD drivers, which simply isn't true. In fact, Nvidia has twice pushed out updates that straight-up killed cards, and an independent study conducted a few months back found that AMD drivers were on average 11% more stable than their Nvidia counterparts. There's also the fun fact that Nvidia's GeForce Experience collects analytics and information on your system (and I believe filenames, IIRC) and sends them to Nvidia's servers… to do… something… Here's a video on their guerrilla marketing: https://youtu.be/dE-YM_3YBm0

Last fun fact for this post: Jensen Huang (CEO of Nvidia) and Lisa Su (CEO of AMD) are relatives (reportedly first cousins, once removed)! I bet that makes for wonderful Thanksgiving dinners.


#48

@Serinity whoa, fantastic information here. I was completely clueless about the current AMD status, so I'm glad to hear all this. My motherboard and CPU are the weakest links of my rig, so I'll be sure to watch out for Zen 2 next year and wait a bit before upgrading.


#49

I watched it too. And thinking back to my Athlon days and BSODs everywhere (probably my own lack of PC-building skill), I just stuck with Intel the next time.

But that video!
Makes me want to root for AMD.

So all we need now is for game developers to implement proper multi-threading support. I mean, if everybody ends up with cheaper AMD chips that have large core counts, then developers will start to optimize for them. Not to mention that it's very useful in business apps like rendering and post-production. And AMD dominates the console market, don't they? So I would expect ports would already use multi-core/threaded optimizations.

I also found the part about Intel having too much control over PCI interesting - they could have Nvidia by the balls, especially as Intel has announced they will work on their own dedicated GPUs.

Anyway, I would like to see AMD take the crown after watching that video.


#50

For some reason most flight sims are CPU hungry, so there's a use case for a faster CPU in VR right there.


#51

Yep, I don't buy the GPU-bottleneck myth anymore.

CPU definitely helps.


#52

You might like / hate this video:

Intel and AMD could technically lock Nvidia out of their CPU platforms. Intel and Nvidia really hate each other, which is why Intel partnered with AMD to bring Radeon graphics to Intel processors and steal sales from Nvidia.


#53

That’s possible. But that’s independent of resolution. So if a game struggles to reach 90 FPS on the Pimax 8K because of the CPU, then it’ll also struggle to reach 90 FPS on other headsets like the Vive or Rift.


#54

Hmm, not so sure that is a solid argument. If flight sims render or calculate anything on the CPU, then lowering the resolution lowers the workload. At the end of the day, they are just not optimized for parallel processing on the GPU. The same applies to creative apps when rendering effects: many are moving routines to the GPU, but more are still CPU-bound, and rendering motion blur, post effects, etc. can take forever at 4K compared to 1080p.


#55

Since the end of the Pentium 4 debacle and the advent of Core 2 (12 years ago), Intel has had no competition from AMD. AMD can blame themselves for Bulldozer and for dragging that design on for so long (or for coming up with Zen so late).


#56

Intel dragged AMD through litigation for 15 years. AMD's funds were exhausted and they were not able to compete in the traditional sense (Bulldozer was actually designed mostly by a computer and not by humans; Ryzen was designed by humans. Think auto-routing, if you're familiar with PCB design). I suggest you give the first video I recommended, on Intel, a watch as well.


#57

It's a solid argument. Physics and AI require the same amount of computation regardless of resolution. Much of the rendering load is the same too: the number of polygons typically does not depend on resolution. But it's also true that regular headsets have trouble with both IL-2 and DCS.
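
To make that concrete, here's a toy frame-time model (all the millisecond costs are invented for illustration): CPU work like physics, AI, and draw submission is a fixed cost per frame, while GPU shading scales with pixel count, so dropping resolution only helps once the GPU term dominates.

```python
# Toy frame-time model: frame time = max(CPU time, GPU time), assuming
# CPU and GPU work overlap. All costs below are illustrative, not measured.
CPU_MS = 13.0          # physics, AI, draw submission: resolution-independent
GPU_MS_AT_1080P = 8.0  # shading cost at 1920x1080, scales with pixel count

def frame_ms(width: int, height: int) -> float:
    gpu_ms = GPU_MS_AT_1080P * (width * height) / (1920 * 1080)
    return max(CPU_MS, gpu_ms)

for name, w, h in [("Rift/Vive per eye", 1080, 1200),
                   ("Pimax 8K per eye", 3840, 2160)]:
    print(f"{name}: {frame_ms(w, h):.1f} ms -> {1000 / frame_ms(w, h):.0f} FPS")
# If CPU_MS > 11.1 ms, the sim can't hold 90 FPS at ANY resolution.
```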


#58

I've asked a load of admitted AMD fanboys, and most of them think it's unlikely that Zen 2 will beat the 9900K for games, due to latency and other issues. They say Zen 2 is a better CPU overall (they reckon AMD already beats Intel), but for extra FPS in VR they are not so convinced. Not the thoughts I was hoping for. Guess I'll have to wait for benchmarks.


#59

Always wait for benchmarks and third-party reviews.


#60

Given that 11.3 is around the corner and is said to bring good performance improvements, I think we worry too much about hardware, especially given that it can cost an awful lot of money for little gain - say, 1080Ti to 2080Ti.
If Vulkan performs as expected we should see at least a 30% performance improvement, depending on the individual build; I expect the greater the CPU bottleneck and the more powerful the GPU, the more a system will benefit.

I'm hoping advancements such as DLSS and foveated rendering for VR specifically can also help push up frame rates, so that wider-FOV and higher-res HMDs are usable with X-Plane.