9900K. Potential saviour for 8K?


#81

Let me describe what “not that straightforward” means here. The GPU spends around 11ms building a frame to be displayed. The CPU simultaneously prepares the work for the next frame. They synchronize during a window perhaps 10µs wide. That synchronization is affected by CPU performance on the order of around 4ns or so, but far more by bus clocks, possible switch latencies, and more than anything by power states. A CPU that is actively waiting for the event will react far better than one that was asleep. So CPU tuning helps (which is why we have e.g. game mode), but as far as VR timing goes, GPU performance decides at least a hundred times as much of the frame time as the CPU does. CPU tuning can’t get you more than a percent of improvement.
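A quick back-of-the-envelope sketch in Python, using nothing but the rough estimates above (not measurements), shows the scale of that gap:

```python
# Back-of-the-envelope figures from the estimates above (not measurements).
gpu_frame_ms   = 11.0      # time the GPU spends building one frame
sync_window_ms = 0.010     # the ~10 µs CPU/GPU synchronization window

ratio = gpu_frame_ms / sync_window_ms
print(f"GPU work vs. sync window: ~{ratio:.0f}x")   # ~1100x, so the GPU dominates the frame
```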

There are workloads that behave differently, of course. Those are the ones with CPU loads so malfunctioning that they cannot perform decently on any VR system (nor, arguably, outside VR - it’s the “decently” line that shifts, not the performance). Apparently quite a few sims fit in that bracket. Given the level of simulation they actually do, this is downright embarrassing. Simulations that really are heavy workloads, by contrast, are run on GPUs and rarely in real time, because GPUs are also a hundred times or more faster at such work than CPUs.


#82

It is not pointless. You asked whether the 9900K may save the 8K, which most here understood as asking whether the 9900K can improve rendering speed. It can and will for sure, but the amount of improvement will depend on which CPU you upgrade from and on which games you play.

The video I posted and commented on shows just four games, which are representative neither of VR games nor of pancake games in general. They just represent slightly different takes: e.g. RoTR was optimized by/for AMD and it shows even on the CPU side, SoTR was not optimized for AMD (maybe for Nvidia?), and FarCry is clearly not optimized at all. Metro, I do not know.

The only interesting point in this benchmark is that at a certain resolution (which translates to fill rate/rasterization power) both CPUs are “fast enough”, as the GPU already takes longer than either CPU needs. That does not have to be true for all games; maybe flight sims do need more CPU power (or are badly optimized and fall into the FarCry category), and then you will still see an improvement in VR as well.

The idea is that if the game can utilize CPU power smartly (which IL2 and DCS do not seem to do), the rendering speed (refresh rate) will be determined by the GPU. At a certain scene complexity and target resolution the GPU becomes the bottleneck even if you use an RTX 2080 Ti, and when that happens the small difference in CPU performance will not matter, i.e. it will not matter whether you have an 8700K, a 9900K or even a Ryzen 2700X.
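As a minimal sketch of that argument (the frame times below are made up purely for illustration, not benchmarks):

```python
# The CPU prepares frame N+1 while the GPU renders frame N,
# so the slower of the two sets the frame time.
def frame_time_ms(cpu_ms: float, gpu_ms: float) -> float:
    return max(cpu_ms, gpu_ms)

gpu_ms = 11.0  # hypothetical 2080 Ti frame time at a high target resolution
for cpu, cpu_ms in [("8700K", 6.0), ("9900K", 5.5), ("Ryzen 2700X", 7.0)]:
    print(cpu, frame_time_ms(cpu_ms, gpu_ms), "ms")  # 11.0 ms for all three: GPU-bound
```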

I am curious how @SweViver’s benchmarks turn out once he gets his 9900K, but it will also depend on whether he runs it with his “old” 1080 Ti or has already got a 2080 Ti.


#83

It’s awesome, isn’t it?


#84

I’m on a 2700K with a 1080 Ti


#85

I’m running a 6600K at 4.4 GHz and a 1080 Ti… staying with this setup till I get the Pimax and will reevaluate at that time…


#86

The annoying thing here is that even if AMD provide competition, I’ll still need Nvidia for my G-Sync monitors to work properly. Likewise, if I had Freesync monitors, Nvidia cards wouldn’t help. I realise Nvidia is at fault for this, but new monitors + a new card would be pretty expensive.

I just have to hope AMD does provoke a strong 21 series from Nvidia.


#87

#88

Blimey, this guy really hates Intel. And I feel sorry for any Americans here trying to understand him. But yeah, it’s really eye-opening.


#89

Definitely eye-opening


#90

Truly, it depends on HOW limited the framerates are. I’m certain that my 980 Ti is the biggest bottleneck in my system. Once I’ve upgraded that, I’ll decide what my next upgrade will be. I have a system which I built from scratch and have continually upgraded over the years: Motherboard and CPU, SSDs and hard drives, several video cards, power supply, add-on cards for USB 3.0, etc. I generally budget ~$500-$1000 per year on upgrades. A 2080 Ti will eat all of that and more. (Last year’s budget went to the Pimax Kickstarter.)


#91

Which is unfortunate. I have a 4K Freesync monitor (the GSync ones were too expensive at the time). It would be great if nVidia would support Freesync too.


#92

Not pointless, but the cost-performance ratio is even crappier than with the 2080 Ti.
I upgraded my i7 3770K @ 4.6 GHz to an 8700K @ 5.2 GHz and it helped gaming with the Vive.

There is not much difference in FPS at first. Where this shines in VR is the min FPS: when the CPU misses a tick you get a frame drop. You hardly notice this in normal gaming, but in VR it is something you want to avoid.
As long as you’ve got something half decent for a CPU, I still advise investing your money in the best GPU you can afford first. The performance gain is much, much higher.
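A rough illustration of why a single missed tick hurts so much, assuming a 90 Hz headset (the workload numbers are illustrative, not measured):

```python
import math

refresh_hz = 90
budget_ms = 1000 / refresh_hz              # ~11.1 ms per frame at 90 Hz

def displayed_frame_ms(work_ms: float) -> float:
    # A frame that misses its deadline is held until the next refresh,
    # so the viewer sees the previous image for one extra interval.
    return math.ceil(work_ms / budget_ms) * budget_ms

print(round(displayed_frame_ms(10.5), 1))  # 11.1 -> smooth 90 FPS
print(round(displayed_frame_ms(11.5), 1))  # 22.2 -> effectively 45 FPS for that frame
```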


#93

He’s pretty damn fair about all 3 of them though. He was calling out VEGA as being too little too late something like 11 months before release. He does an insane amount of research and is amazing at speculation.

He also has some of the best sources in the industry; he was the first to have the specs of the RTX cards.

Main takeaway for those who don’t want to watch the video: Intel has a history of relationships like the one it has with PT (the group who benchmarked these processors). They practically own PT, so they essentially paid themselves to do the benchmarks.

And that’s not the first time they’ve done something like this (seriously, just watch the video).


#94

Yeah, that’s actually why the new Nvidia cards don’t have the newest HDMI spec: the new spec includes adaptive sync (Freesync) as part of the standard.

Nvidia could snap their fingers and have freesync working on current-gen cards as is, and Level1Techs actually got freesync working natively on the cards but couldn’t release anything due to fear of legal action.


#95

Wasn’t Freesync already included in DisplayPort, and doesn’t Nvidia block it even there?

I would love to punish Nvidia for this bs, but I don’t want to end up hurting my own interests. As it stands, I’ll have to wait until AMD/Intel provide an equivalent or superior GPU, AND either someone figures out a way of forcing G-Sync monitors to play nice with AMD cards or my monitors become redundant, before I can get out from under Nvidia’s thumb.


#96

I see that as unlikely, since G-sync is actually a hardware solution. The reason you have to pay $200-$300 more for a G-sync monitor than for an equivalent Freesync one is that you are paying the NVidia tax for a G-sync chip.

It is currently possible to use Freesync on NVidia cards because it is a software solution but it is a bit janky. You also need two monitors and some form of AMD graphics (usually an APU), so it isn’t usually the best option.


#97

Yeah probably. I hate being locked in though :confused: I wish these adaptive framerate solutions had come from a third party, because I don’t see anything changing for a long time.


#98

I just wish all the major engines and game studios would simply make use of the power we already have. I could cry seeing that most games are still stuck on 1 (ONE!!!) core, or maybe 2. Most games don’t use any of the advanced features the green or red team offer (like VRWorks or SLI on the green side), and only a few games fully use DX12 features or the power of Vulkan.
I am sure we could save ourselves quite a few dollars if games used all the potential our hardware has had for the last couple of years.
We should create an “Unleash the Power!” initiative that supports multi-CPU and multi-GPU usage and advanced API features – but I guess nobody would care.


#99

Well yes, most games are multi-platform and not PC exclusive, so no extra effort goes into the PC versions.
I have always been impressed with Croteam (devs of Serious Sam and The Talos Principle); they seem to implement a lot of stuff in their game engine and you can really tweak their games easily.
They’re oldskool PC nerds.


#100

Yeah, I guess that is one of the problems (cross-platform), but games like ARMA or DCS also care little for all the cores or multi-GPU.
Croteam should start training other studios; they really have it going with the tech!