9900K. Potential saviour for 8K?


#61

Different servers, different needs.

Where AMD wins the most is when it can reduce the total number of sockets needed for the same performance.

So the total CPU performance stays the same, but the amount of everything else needed is much lower.


#62

The rendering is done on the GPU and always has been. The resolution is only relevant for the rendering. I can't think of any line of code that would run slower or faster on the CPU depending on the render resolution. I've never heard of any game that does post-processing on the CPU. The CPU doesn't even have access to the frame buffer; it's stored in VRAM. It would be extremely inefficient, and difficult to even program, unless you're doing the entire render on the CPU. In that case you don't use the GPU at all, for example when rendering a picture in a professional program, which usually takes minutes.
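A minimal sketch of that point (hypothetical functions and costs, purely for illustration): the CPU-side work per frame (game logic, physics, submitting draw calls) never touches individual pixels, so it doesn't scale with resolution, while the GPU-side shading cost does.

```cpp
#include <cstdio>

// Hypothetical per-frame costs, for illustration only.
// CPU work: game logic, physics, draw-call submission; none of it
// touches individual pixels, so it is independent of resolution.
double cpuFrameMs() { return 6.0; }

// GPU work: shading scales roughly with the number of pixels rendered.
double gpuFrameMs(int width, int height) {
    const double msPerMegapixel = 4.0;  // made-up constant
    return (width * height / 1.0e6) * msPerMegapixel;
}

int main() {
    const int resolutions[][2] = { {1920, 1080}, {2560, 1440}, {3840, 2160} };
    for (const auto& r : resolutions) {
        // CPU cost is the same on every row; only the GPU cost moves.
        std::printf("%dx%d: CPU %.1f ms, GPU %.1f ms\n",
                    r[0], r[1], cpuFrameMs(), gpuFrameMs(r[0], r[1]));
    }
}
```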


#63

This is why you should wait for reviews before dropping hundreds of currency units on hardware…

Edit: acknowledging the irony of me backing several kickstarters here…


#64

I was going to link this, thanks for doing it.

Yeah, I'd be interested to see the benefits confirmed by genuine reviewers.


#65

This video could be interesting for anyone considering Ryzen (instead of i7/i9) and RTX 2080 (Ti).


#66

I'm glad to see AMD getting some love in this thread. Thank you @Serinity for bringing some much-needed perspective. I think it would be foolish to rush out and buy Intel right now; wait for benchmarks, people. Zen 2 is right around the corner, and Lisa Su is scheduled to speak at CES 2019 in January. Maybe there will be more info revealed then.


#67

Interesting, if not surprising, but good to see the 2080 Ti checked. We already know AMD currently isn't as good as Intel in games.

I suspect Zen 2 won't beat the 9900K, but let's wait for honest benchmarks of both. Very exciting times. Go AMD!!!


#68

Are you sure they weren’t talking about Ryzen 2, rather than Zen 2?

Ryzen 1000 = Zen
Ryzen 2000 = Zen+
Ryzen 3000 = Zen 2

The 2700X is Zen+. Yes, of course the 9900K is going to beat the 2700X; it's Intel on 14nm vs AMD on 12nm LP (an inferior node).

Zen 2 is on 7nm and addresses the latency issues. In my eyes it's pretty much impossible for Intel to beat it. 7nm is a humongous leap, and again, Zen 2 was made to compete with Intel's 10nm server parts (which never came).

The problem is everyone thinks that Zen 2 is another incremental upgrade, as Zen+ was to Zen; if that were true, Intel would still beat AMD. But that's simply not the case: this is an entirely new architecture on a much smaller node.

For the first time in AMD's history, Lisa Su will be presenting at CES. AMD's engineers were flabbergasted when they heard that Zen 2 would be competing against Intel's 14nm. Lisa said that Rome (EPYC Zen 2) is ahead of schedule.

We’re in for a treat, boys.


#69

I believe this was not the point of the video. Intel was/is better at high FPS and (relatively) low resolution, but as the resolution approaches 4K even the 2080 Ti becomes a bottleneck. The Pimax HMD needs a level of rendering performance that exceeds 4K in 2D games, so the chances are there won't be much of a difference in performance with either Ryzen or i7/i9; not because Ryzen gets better at higher resolutions, but because even the 2080 Ti isn't fast enough (at this resolution) to keep up with either CPU.

So if, for whatever reason, someone is thinking about getting a Ryzen for Pimax and pairing it with a 2080 or 2080 Ti, they shouldn't lose much, if anything, compared to an Intel platform, as the difference shouldn't be significant in this particular application.

One good reason for going with Ryzen now is if you believe that Zen 2 will finally catch up with Intel even on the latency/IPC front and you plan to upgrade to Zen 2 later (next year). That upgrade might not even be necessary, though, as long as there's no GPU more powerful than the 2080 Ti.
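A back-of-the-envelope way to see this, assuming CPU and GPU frame preparation overlap as in a typical pipelined renderer (all numbers made up for illustration): effective frame time is roughly the slower of the two stages, so once the GPU dominates, swapping CPUs barely moves the FPS.

```cpp
#include <algorithm>
#include <cstdio>

// Rough pipelined-frame model: throughput is limited by the slower stage.
double fps(double cpuMs, double gpuMs) {
    return 1000.0 / std::max(cpuMs, gpuMs);
}

int main() {
    // Hypothetical 2080 Ti cost at Pimax-class resolution.
    const double gpuMs = 18.0;
    // Two hypothetical CPUs: a faster one at 7 ms/frame, a slower one at 9 ms/frame.
    std::printf("fast CPU: %.1f FPS\n", fps(7.0, gpuMs));  // ~55.6 FPS
    std::printf("slow CPU: %.1f FPS\n", fps(9.0, gpuMs));  // ~55.6 FPS; GPU-bound either way
}
```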


#70

For me, the computing sphere is littered with Mafia-style tactics, where a dominant player locks out the competition, stifling innovation and real choice for the end user.

Be it Microsoft with their dodgy OEM install deals preventing better OS alternatives (e.g. BeOS in the '90s, and Linux) from getting pre-installed on configured computer systems, or Intel and Nvidia doing the same to their competitors AMD and VIA.

Unfortunately, corporations in the West act like psychopaths, and without a strong body politic to police their shady behaviour, humanity is held back technologically. We see this in VR too, with Oculus and Vive, who release tech that captures the market and then sit on it, milking it for the most they can. It is the reason why, when I saw what Pimax was doing with their Kickstarter and the 8K, I became a backer.

Sure, it has been a bumpy ride for the 8K, but as an informed consumer and one who likes to see technological progress, I have been happy to show support for what they are doing.

Simulations like the Il2 BoX series and DCS can be very CPU-limited due to their use of DirectX 11 and all the AI/physics modelling computation they demand. I know that in the BoS series, AI aircraft use the same flight model as the player, so when you enter a battle with many aircraft involved, especially twin-engined aircraft, your frames drop considerably. Not that I complain; I remember the UFO behaviour of AI flight models in the original Il2 series, and I would never want a return to that. The AI/physics modelling just needs to be better distributed over multi-core CPUs.
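As a rough sketch of what "better distributed" could look like (this is not how Il2 or DCS actually work; all names and numbers here are hypothetical): independent per-aircraft flight-model updates can be fanned out across hardware threads, for example with std::async. Real engines use job systems rather than spawning a thread per task, but the idea is the same.

```cpp
#include <cstdio>
#include <future>
#include <vector>

struct Aircraft { double altitude = 1000.0; double speed = 120.0; };

// Hypothetical flight-model step for one aircraft; in a real sim this is
// the expensive AI/physics work being discussed.
void updateFlightModel(Aircraft& a, double dt) {
    a.altitude += a.speed * 0.1 * dt;  // placeholder maths
}

int main() {
    std::vector<Aircraft> fleet(64);
    const double dt = 1.0 / 60.0;

    // Fan the independent updates out across hardware threads instead of
    // running them all on one core.
    std::vector<std::future<void>> jobs;
    for (auto& a : fleet)
        jobs.push_back(std::async(std::launch::async, updateFlightModel,
                                  std::ref(a), dt));
    for (auto& j : jobs) j.get();

    std::printf("updated %zu aircraft in parallel\n", fleet.size());
}
```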

If you guys are interested in OS tech, check out Haiku OS, which is the free successor to BeOS: an OS that in the late '90s was doing multiple video streams, software 3D and audio simultaneously without slowing down or dropping frames on standard PC hardware. The OS was designed from the ground up for media use and put the Windows 9x and NT variants, as well as the Mac, to shame. Haiku OS is an open rewrite of BeOS which has been in development for 10 years by a small group of dedicated programmers, and they have just released their beta.


#71

So if I don't want to overclock and won't play flight sims, then the 8700K is OK and I don't need to wait for the 9900K?


#72

8 logical cores? In this case that means 8 physical cores, since there's no hyper-threading.


#73

Yeah, lol I misspoke


#74

8 physical and logical cores technically, lol
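For anyone who wants to check their own machine: standard C++ can report the logical core count directly; the physical count needs an OS-specific query, which is only noted in a comment here.

```cpp
#include <cstdio>
#include <thread>

int main() {
    // Logical cores = what the OS schedules threads on. With hyper-threading
    // off, as on the 9700K, logical == physical. Getting the physical count
    // needs an OS-specific API (e.g. GetLogicalProcessorInformation on Windows).
    unsigned logical = std::thread::hardware_concurrency();
    std::printf("logical cores: %u\n", logical);
}
```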


#75

Yes. I'm still running my trusty old Z97-based i7 4770K with a 1080Ti (previously a 780, a 970 and a 1080) that I bought back in 2014. Overclocked to 4.1GHz with a Corsair H110 water cooler, it's been 100% flawless and has dutifully handled everything I've thrown at it. I really think I picked it up during a cost/performance sweet spot.

As such I've been completely underwhelmed with Intel's offerings since then, and now they have the balls to pull hyperthreading from the i7 (IIANM) and make it an i9 "feature".

I have been debating a Threadripper for a while but I’m more likely to go “clean break” and build up a Zen 2 based system with a 2xxx series Nvidia.


#76

I plan to keep my i7 4770 for a while longer. If my framerates appear too limited with a 2080Ti in Elite Dangerous, I'll upgrade. The 2080Ti will empty my "upgrade budget" for a while, so waiting makes a lot of sense, especially considering that Intel will be fixing some known security vulnerabilities in their CPUs in the relatively near future.


#77

Yeah, the reason it's held up so well is that Intel has done barely anything to improve on it. Most of the changes they've made that have impacted processor performance have been in process refinement.

The reason Intel pulled hyperthreading from the i7 wasn't actually an anti-consumer move; it was to drive up single-core performance as high as they could possibly get it. They're terrified of Zen 2, which is also why, for the first time since 2nd gen, we've got a soldered, gold-plated IHS. They're trying to drive up clock speed as high as possible to stay relevant.

Threadripper would really only benefit you if you're a streamer or have a habit of rendering videos whilst playing VR. The only exception to this would be cyubeVR, which can take advantage of all cores and threads quite beautifully. There might be more, but all I can think of are games that support 4- or 8-core utilization.


#78

OK, so this whole thread is pointless then if the 2080Ti will be the bottleneck. But I don't believe that anyway. The CPU plays a part even if you're "bottlenecked" at the GPU. It's not that straightforward or linear, in my opinion.
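One way to make that concrete (all numbers made up for illustration): average frame times can look purely GPU-bound while occasional CPU spikes still set the pace on the worst frames, which is exactly where stutter and the 1% lows come from, and exactly what a faster CPU would help with.

```cpp
#include <algorithm>
#include <cstdio>

int main() {
    const double gpuMs = 18.0;  // steady hypothetical GPU cost
    // Hypothetical CPU frame costs: usually well under the GPU cost, but
    // with occasional spikes (AI burst, asset load, etc.). 10 frames total.
    const double cpuMs[] = { 9, 9, 9, 25, 9, 9, 9, 30, 9, 9 };

    double worst = 0.0, total = 0.0;
    for (double c : cpuMs) {
        double frame = std::max(c, gpuMs);  // slower stage sets the pace
        worst = std::max(worst, frame);
        total += frame;
    }
    // Average looks GPU-bound (~19.9 ms), but the worst frames (25, 30 ms)
    // are CPU spikes: the stutter a faster CPU would reduce.
    std::printf("avg frame %.1f ms, worst frame %.1f ms\n", total / 10.0, worst);
}
```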


#79

You may be right. Perhaps even AMD's own fanboys don't fully appreciate the improvement. I look forward to benchmarks.


#80

So if framerates are limited even with a 2080Ti, you'll upgrade the CPU to boost framerates. So you don't believe in a linear bottleneck where the GPU bottlenecks first. Neither do I, not these days (even for non-flight sims).