That was my thought too. I was shocked (no pun intended). I’m always careful, but not extreme (no gloves or ground wires).
Can someone with RTX knowledge please explain the revolutionary thing called gigarays?
| GPU | Memory | Memory with NVLink | Ray Tracing | CUDA Cores | Tensor Cores |
|---|---|---|---|---|---|
| Quadro RTX 8000 | 48GB | 96GB | 10 GigaRays/sec | 4,608 | 576 |
| Quadro RTX 6000 | 24GB | 48GB | 10 GigaRays/sec | 4,608 | 576 |
| Quadro RTX 5000 | 16GB | 32GB | 6 GigaRays/sec | 3,072 | 384 |
With availability starting in Q4 2018, the professional-focused Quadro RTX GPUs are priced at $10,000, $6,300, and $2,300 for the RTX 8000, 6000, and 5000 respectively.
Billions (10^9) of rays cast per second. A ray is a 3D line (vector) drawn from the viewpoint to an object (or bounced off an object) by a ray-traced renderer.
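To put "GigaRays" in perspective, here's a back-of-the-envelope ray budget. The resolution and frame rate below are my own assumed targets, not figures from NVIDIA:

```python
# How far does 10 GigaRays/sec stretch? Assumed workload: 4K at 90 fps.
rays_per_sec = 10e9
width, height, fps = 3840, 2160, 90

pixels_per_frame = width * height            # 8,294,400 pixels
rays_per_frame = rays_per_sec / fps          # ~111 million rays per frame
rays_per_pixel = rays_per_frame / pixels_per_frame

print(round(rays_per_pixel, 1))              # prints 13.4
```

So even the top Quadro only gets roughly a dozen rays per pixel per frame at that workload, which is why the cards pair ray tracing with conventional rasterization ("hybrid rendering") rather than tracing everything.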
“Turing is NVIDIA’s most important innovation in computer graphics in more than a decade,” said Jensen Huang, founder and CEO of NVIDIA, speaking at the SIGGRAPH 2018 conference. “Hybrid rendering will change the industry, opening up amazing possibilities that enhance our lives with more beautiful designs, richer entertainment and more interactive experiences. The arrival of real-time ray tracing is the Holy Grail of our industry.”
Please explain the Holy Grail part, not the amount part; I know giga is a lot.
Why is it the Holy Grail?
Yeah, I remember those days. I had a 60 MB 5.25" double-height drive. Lol
I had the big whopper 1 MB Trident video card.
Because it’s so time-consuming to do (for an entire image). Years ago, I used POV-Ray to do some fairly simple ray-traced animations. It took between 2 and 12 hours per frame at 320x240 resolution. The reason to use ray tracing is that it looks very good, since it can model reflections, refraction, and high-quality lighting and shadows.
Here’s a POV-Ray image which took 4.5 days to render… http://hof.povray.org/pebbles.html
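The core operation a ray tracer like POV-Ray repeats billions of times is testing a ray against scene geometry. A minimal toy sketch of the simplest such test, ray-sphere intersection (not POV-Ray's actual code, just the standard quadratic-formula approach):

```python
import math

def hit_sphere(origin, direction, center, radius):
    """Return the nearest hit distance t along the ray, or None on a miss.

    Solves |origin + t*direction - center|^2 = radius^2 for t,
    which is a quadratic a*t^2 + b*t + c = 0.
    """
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None                          # ray misses the sphere
    return (-b - math.sqrt(disc)) / (2 * a)  # nearest intersection

# A ray from the viewpoint looking down -z at a unit sphere 5 units away:
t = hit_sphere((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0)
print(t)  # prints 4.0
```

Every pixel needs at least one of these tests per object (or per bounding volume), and each reflection or refraction bounce spawns more rays, which is why frames used to take hours.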
If game devs use something like this https://www.onmsft.com/news/new-microsoft-research-video-demos-flashback-vr-technology#!middle-east-africa we could potentially run games on the 8K with reasonable FPS, but this would only work for games where you cannot interact directly with the environment, since it is essentially pre-rendered.
Now I am really worried.
I sent them an email asking if my GTX 1080 will be good enough for demanding games like Project CARS etc…
They replied: “We will discuss your graphics card specs with the team and come back to you…”
Maybe the GTX 1080 is also not enough…
I think it depends on what you feel is “acceptable” quality settings in-game. Low quality video settings will probably run “ok”.
Personally, I plan to buy a third-party overclocked RTX 2080Ti as soon as they are available. In the meantime, I’ll scrape by with a 980Ti.
Indeed, as Neil said, it will depend on quality settings and such.
Consider games like Doom 3 and Crysis; these games challenged GPUs for years and were used for benchmarking.
A 1080Ti cannot run the more demanding games at full settings on the Rift.
People are dreaming if they think they can run these games on high settings on an 8K with current-gen cards. That was never going to happen without something like foveated rendering. And Nvidia’s talk on the Quadro mentioning a chip that will be used for foveated rendering suggests that even when foveated rendering is a consumer reality, you will still need to buy their new card, as the implementation will rely on this added chip. Capitalism at its finest.
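For anyone unfamiliar with foveated rendering: the idea is to shade at full resolution only where the eye is looking and progressively coarser toward the periphery. A toy sketch of the budget logic, with made-up eccentricity thresholds and a made-up pixels-per-degree figure (real headsets use eye tracking and tuned falloff curves):

```python
import math

def shading_rate(px, py, gaze, ppd=20):
    """Return the shading block size (1 = every pixel, 4 = one shade per 4x4).

    Thresholds and pixels-per-degree (ppd) are illustrative assumptions.
    """
    dist_px = math.hypot(px - gaze[0], py - gaze[1])
    ecc_deg = dist_px / ppd          # rough eccentricity from gaze point
    if ecc_deg < 5:
        return 1                     # fovea: full shading rate
    if ecc_deg < 15:
        return 2                     # mid-periphery: 2x2 blocks
    return 4                         # far periphery: 4x4 blocks

print(shading_rate(960, 540, gaze=(960, 540)))  # prints 1 (at the gaze point)
print(shading_rate(0, 0, gaze=(960, 540)))      # prints 4 (far corner)
```

Shading a 4x4 block with one sample cuts that region’s fragment work by roughly 16x, which is where the big savings for high-resolution headsets would come from.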
But you will still find yourself with what you wanted, only not for free.
Hmm, that’s not entirely true. I think it’s similar to the ray-tracing units: it’s not that you won’t be able to do it at all with older hardware, it’s just that the upcoming hardware will do it much better.
This report claims that the 2080 and 2080Ti would both be released this month. I can hardly believe that, to be honest (not only because the first photo shows a card featuring GTX rather than RTX).
Baah, that’s just guesswork and Photoshop, tbh. We just need to wait until Monday to know for sure. Let’s ignore rumors and guesswork for three more days, yeah?
Rumor has it they go on pre-order next Monday (the 2080).
They will also have the new VirtualLink connector. I wish the Pimax 8K would have it as well.
I found the same message on a trusted Dutch technology site, and they seem to put a lot of trust in this leak by videocardz.
It is also assumed that the RTX 2080 is more like a 1070 in terms of price/performance and the RTX 2080Ti is more like a 1080; the true xx80Ti would perhaps come later in the form of an RTX 2290 or some such.
So in other words, don’t expect the 2080Ti to be the final card of the Turing generation (as you might expect, since the 1080Ti was).
If the rumors about the meager 8% performance increase of the 2080 over the 1080Ti are true, what you say would make quite some sense. Nvidia has no competition, and the mining business is probably pretty much history, so they play foul with gamers and increase the pricing for the next gen, simply relabeling the 2070 as the 2080 and the 2080 as the 2080Ti. Perhaps the Turing chip allows them to release an even more powerful version later, so they can come out with a 2080Ti version at a later stage.
[quote=“Axacuatl, post:215, topic:4218, full:true”]
This report claims that the 2080 and 2080Ti would both be released this month. I can hardly believe that, to be honest…[/quote]
That would be great! I’d love to have a 2080Ti, even if I have to wait for my 8K. My 980Ti isn’t really powerful enough to drive my 4K monitor using high quality game settings.
…provided it is not just a 2080 with the pricing of a 2080Ti… Wait till we get the benchmarks; those will reveal their true nature.