Toms hardware: Nvidia GTX 1180 possible July Release


#21

There’s no holdup. Nvidia is just sitting on the cards so that they can sell off their old stock. AMD doesn’t have a truly competitive card at the moment, so there’s no pressure on Nvidia to release a new one. There are even rumors that Nvidia might wait until NOVEMBER to release a new 7nm-based GPU rather than ship a new card this summer.

I sure hope that’s not the case.


#22

Agreed, but I do understand them. The need to upgrade video cards will be over in a few iterations, I guess. I remember when sound cards got better every year, and to get better sound you needed to upgrade every year. Then good 16-bit sound cards emerged and the need to upgrade was pretty much gone. In a couple of years it will be the same with graphics cards: I don’t see much use for cards that can render beyond 2x 16k screens.

Of course there will still be demand from other segments, like AI, but I’d say the gaming industry is somewhat of a dead end for them already, so they try to postpone that moment as long as they can. In fact, I already hear ‘normal’ (non-VR) gamers saying that they’re not even interested in the next-gen video cards; for them the current gen is already pretty much good enough. Who wants more than 4k rendering for normal (non-VR) gaming? So from a business perspective you can’t blame them. Yet for us VR freaks it s*cks big time, of course.


#23

You are right! 640K should be more than enough for anybody. :slight_smile:


#24

Bill Gates made that statement. :beers::joy::+1::sparkles:


#25

He did. It’s hard to predict how much memory is enough; that’s clear by now :slight_smile: But in other hardware areas we’ve reached the point where it’s just ‘good enough’, like the sound cards I mentioned, but also think of mobile phone display resolution (1440p), for example. And for normal (non-VR) gaming I think we’re now reaching that limit too. Like I said, I was reading a topic on a Dutch hardware site regarding the upcoming NVIDIA cards, and I read quite a few replies where people literally said they were not even interested in the next-gen cards because they felt current hardware was simply good enough. So I bet these gaming card manufacturers will try to slow that process as much as possible.


#26

The days of fun: memory managers, and trying to get enough free base memory by loading drivers & such into high memory, just to get Doom working, and classics like the Wing Commander series in SVGA graphics with top-end 1 MB video cards. :beers::sunglasses::+1::sparkles:


#27

Game devs will help to some degree to push graphical needs if new cards are released. But yeah, graphics are starting to level out in terms of realism, which of course leads to what you say: if today’s graphics are “good enuff” for the majority, it gets harder to push newer GPU technologies.

Gaming consoles likewise are getting quite strong in capability & quality of experience.


#28

Quarterdeck Expanded Memory Manager (QEMM) :slight_smile:


#29

Indeed, and they also had DESQview, a text-based multitasking program.


#30

I have yet to see a game I would call convincingly realistic-looking. It is all too easy to pop in stuff that tanks your performance, and a lot of development time goes into balancing, restricting, and simplifying things to make games render at decent frame rates. (And then, just wait until raytracing becomes more prevalent.)

What I could see taking load off one’s GPU is cloud rendering. Latency is a major issue with that, which is doubly problematic with VR, but maybe it could assist the local host by “pre-digesting” some things that you’ll use over several frames.


#31

I believe there is still a lot coming for graphics (cards) that is just starting to come around the corner: raytracing, precision, and AI, for example, but also physics and so on. There is still a lot of pretending going on to make games look real, and that can be done more realistically. I agree we are at a point where things seem quite realistic, but there is still a lot of ground to cover to get there, and as also mentioned, VR resolution and refresh rates will crank that demand up for quite a while.


#32

I agree…shareholders want progress and profit…


#33

No, he didn’t say that.


#34

If you’re referring to Billy Gates, he certainly did say it. Lol


#35

Can we just stop this silly myth? No, he didn’t.

Please provide the source of the quote, I am sure that you will learn something searching for it.


#36

It’s no myth; it was during the glory days of DOS.

Same with when he was part of the OS/2 Warp team and was asked about Windows development: he said there would be no point, as you would just have OS/2.

Here’s the 640k proof.

https://quoteinvestigator.com/2011/09/08/640k-enough/

https://en.m.wikiquote.org/wiki/Talk:Bill_Gates

Now stop making the rest of us feel old. :beers::joy::+1::sparkles:


#37

Did you even read the pages you linked?

From your first link
“Since Gates has denied the quotation and the evidence is not compelling I would not attribute it to him at this time. Thanks for this difficult interesting question”

You provided two links, both saying he did not say it.


#38

He has denied many things, like trying to steal Stacker’s drive compression software. (That’s why DoubleSpace was removed in MS-DOS 6.21 & later returned as DriveSpace in 6.22.)

Sure, it was taken out of context, and as presented it was not said during an official speech. Most likely it’s like the misgivings around his falling-out with Steve Jobs, where Bill even warned him. Lol


#39

And then there’s the whole MS-DOS rip-off of CP/M, although I think it was never proven that Gates indeed ripped it off.


#40

Truthfully, still not as bad as the people who created the Y2K bug by not thinking ahead and using a four-digit year. Lol
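For anyone who never dealt with it: the two-digit-year failure mode is easy to demonstrate in a few lines. A minimal Python sketch (the sample values are made up for illustration):

```python
# Y2K in miniature: years stored as two-digit strings.
years = ["98", "99", "00", "01"]  # meaning 1998, 1999, 2000, 2001

# Sorting treats 2000 ("00") as earlier than 1998 ("98").
print(sorted(years))  # ['00', '01', '98', '99']

# Arithmetic breaks the same way: someone born in "65" (1965)
# gets a negative age once the "current" year wraps to "00" (2000).
age = int("00") - int("65")
print(age)  # -65, instead of 35
```

The fix, of course, was simply storing the full four-digit year so ordering and subtraction work as expected.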

Now, I do know for a 100% fact that in Windows 3.1 & prior, the Calculator gave 2.01 - 2 = 0. This was fixed in 3.11. It just goes to show that Microsoft, like Intel’s first Pentiums, had issues with math. :beers::joy::+1::sparkles:
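As an aside, that old calculator bug touches a pitfall that’s still with us: 2.01 has no exact binary floating-point representation, so the subtraction leaves a tiny residue, and a display layer that handles it badly can show 0. A quick Python sketch of the underlying arithmetic (my own illustration, not the actual Windows code):

```python
# 2.01 cannot be stored exactly as a binary double, so the
# subtraction does not yield exactly 0.01.
diff = 2.01 - 2
print(diff)          # slightly off from 0.01
print(diff == 0.01)  # False

# A display layer that mishandles that residue can show 0;
# rounding to a sensible number of digits recovers the answer.
print(round(diff, 10))  # 0.01
```

This is ordinary IEEE 754 behavior, not a hardware defect like the Pentium FDIV bug, but it catches naive display code in exactly the same embarrassing way.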