NVIDIA GTC Keynote, RTX 40, AI and Metaverse

Yeah, forgot about that. Having just changed my monitor to something above 60 Hz, it really opened my eyes to how the sky really is the limit in terms of what load you can put on your card. Not that you'll notice anything above a certain point, but theoretically you can have your GPU render at 200+ fps, if you've got the power.

I keep hearing competitive FPS players say they notice differences even at 240 fps, but somehow I find that really hard to believe. I guess it's also tied into how the game engine handles logic and networking on the client and server side, especially in multiplayer games played over the internet.
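For what it's worth, the raw frame-time numbers show why the returns diminish so quickly. This is nothing more than the 1000 / fps conversion:

```python
# Frame time in milliseconds for a few frame rates: 1000 ms / fps
for fps in (60, 120, 144, 240, 360):
    print(f"{fps:>3} fps -> {1000 / fps:5.2f} ms per frame")

# Going from 60 to 120 fps shaves off ~8.3 ms per frame, while going
# from 240 to 360 fps only saves ~1.4 ms, which is why each step up
# gets harder and harder to actually perceive.
```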
 
Why not just get a 30xx series card? They're dropping in price fast, and they'll probably be significantly cheaper in November after the 4080 launches.

I was planning to get a 4080 before this announcement, but there's no way I'm paying that much. I'll grab a 3080 Ti in a few months for $500-$600.
The US is so cheap. The cheapest 3070 Ti on a reputable site here is around £650, and the MSI one, which is a brand I would trust, is £690, or about $780.


Can't see that drop to $500-600 for a 3080 Ti happening anytime soon here :(
 
How should I feel, having paid around 1200 euros for the 3080 Ti plus shipping? :(
Fucking country I live in.

The issue with these cards is that there are very few games with such high requirements. Unless you're running at crazy resolutions, most of the time they're really overpowered for no apparent reason. Currently the only one that can truly bring my card to its knees is RDR2.

So even if you take the "2x-4x faster than the strongest last-gen cards" claim at face value, which I wouldn't really (since it's likely not pure performance but trickery using the DLSS tech), there's not much out there to make use of these. I'm thinking we need the first generation of Unreal Engine 5 games to really see these cards put to the test. That's probably the moment I'll even start considering an upgrade.
Requirements, no, probably not.

For me, though, I struggle to maintain satisfactory frame rates in several games with my 3080 at 4K resolution. I don't know if that's what you consider a "crazy" resolution - but I would say it's a common one among enthusiasts.

As we've talked about before, my own minimum FPS level is around 80 FPS - and I'm not just talking average FPS, because many games will dip well below average FPS during combat or similarly stressful situations.
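For context on how those dips are usually quantified: reviewers tend to report a "1% low" alongside the average, computed from frame times. A rough sketch with made-up numbers, just to show how an average can hide the stutter (the frame times below are hypothetical, not from any real capture):

```python
import numpy as np

# Hypothetical capture: mostly smooth ~10 ms frames with a few
# 30 ms spikes during "combat". Not real data.
frame_times_ms = np.array([10.0] * 95 + [30.0] * 5)

avg_fps = 1000 / frame_times_ms.mean()
one_pct_low_fps = 1000 / np.percentile(frame_times_ms, 99)  # 99th-percentile frame time

print(f"average: {avg_fps:.0f} fps, 1% low: {one_pct_low_fps:.0f} fps")
# -> average ~91 fps, 1% low ~33 fps
```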

The only way I can maintain decent FPS in many modern games is because of DLSS technology - so I can't agree at all that these cards aren't (nearly) a necessity for someone like myself.

But they're definitely not 1200 USD level appealing.
 
For people who don't want to watch the long Keynote video, here's an article with a short transcript of the RTX Remix part.

Personally I would still need to see the quality of the experience before being sold on that. I don't know what you people think of the current ray-tracing games; the only one I've tried looked more accurate with the lighting effects, but it was also more blurry overall. Maybe it was an exception; I'll need to try other demos or games to get a good idea.


And here's the link to NVIDIA's detailed article (with image comparisons):

 
I wonder what games really need this kind of firepower.
MSFS 2020 (civil aviation sim) and DCS (military aviation sim) are good examples, but I think the sim market is not very large - still, NVIDIA mentioned MSFS. It's possible to turn down the features to get better performance, but the resulting experience really feels underwhelming.

For DCS it's mainly a lack of optimization; they are still using an old API which doesn't take full advantage of the current hardware (and when I say current, that includes the generation before RTX).
 
Requirements, no, probably not.

For me, though, I struggle to maintain satisfactory frame rates in several games with my 3080 at 4K resolution. I don't know if that's what you consider a "crazy" resolution - but I would say it's a common one among enthusiasts.

As we've talked about before, my own minimum FPS level is around 80 FPS - and I'm not just talking average FPS, because many games will dip well below average FPS during combat or similarly stressful situations.

The only way I can maintain decent FPS in many modern games is because of DLSS technology - so I can't agree at all that these cards aren't (nearly) a necessity for someone like myself.

But they're definitely not 1200 USD level appealing.
Yeah, I meant more along the lines that, for me at least, it seems very frivolous to pour that amount of money into hardware when there are maybe 2-5 games max that really need that sort of hardware.

I feel I pretty much splurged with this last upgrade of mine, and while it's nice and all to see it all run very smoothly, I wouldn't call it a smart move. It was very much on a whim.

I don't know, maybe I'm having a bit of buyer's remorse. Not that I would give up what I got. So, I guess I'm a bit hypocritical :D
 
Why not just get a 30xx series card? They're dropping in price fast, and they'll probably be significantly cheaper in November after the 4080 launches.

I was planning to get a 4080 before this announcement, but there's no way I'm paying that much. I'll grab a 3080 Ti in a few months for $500-$600.
I would like, for example, an RTX 3060 Ti... but where I live, the Philippines, they're still charging the equivalent of around $600. The cheapest RTX 3070 Ti is around the equivalent of $800. Unfortunately, with the topsy-turvy pricing here, the AMD stuff is more expensive still! Rampant price-fixing too. No competition.
 
I would like, for example, an RTX 3060 Ti... but where I live, the Philippines, they're still charging the equivalent of around $600. The cheapest RTX 3070 Ti is around the equivalent of $800. Unfortunately, with the topsy-turvy pricing here, the AMD stuff is more expensive still! Rampant price-fixing too. No competition.
I'm surprised to hear that. I thought most countries in that region were on the cheaper side when it comes to computer hardware. I can get a 3060 Ti for $450 here and a 3070 Ti for $630.
 
It seems DLSS3 might come to previous RTX generations:


It's impressive to see how they can lower the latency with this technique. I haven't read the long version, but I doubt it includes the monitor latency; it would be foolish of them.

There could be a very interesting system in which the user only has the upscaling feature, on a cheaper and less power-hungry card, while the heavy work of 3D rendering is done on NVIDIA's side, or on a server farm installed closer to the user.

The users wouldn't have to upgrade their card often, if at all; only the servers would need to. NVIDIA could shrink their chips on both the client and server sides - hence increasing yields and lowering prices significantly. And they could truly develop the cloud-service side of their business. That would also reduce the streaming bandwidth significantly compared to sending full-resolution frames, and the latency of course, which is important for games.
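To put a very rough number on the bandwidth part (the bitrates below are assumptions, not measured figures, and real codecs don't scale perfectly with pixel count):

```python
# Compare streaming a full 4K frame vs a 1080p frame that the client
# upscales locally. Assumes bitrate scales roughly with pixel count,
# which is only a first-order approximation.
full_res = 3840 * 2160      # pixels in a 4K frame
reduced_res = 1920 * 1080   # pixels in a 1080p frame

print(f"4K carries {full_res / reduced_res:.0f}x the pixels of 1080p")  # -> 4x

# So if a 4K/60 stream takes ~40 Mbps, the 1080p/60 stream that the
# client upscales locally would take roughly ~10 Mbps.
```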

But I'm sure they thought about it, it seems obvious when looking at the evolution of their technology.

Oh, and that would definitely shut the door on the blockchain problem.
 
I guess getting yourself a modern GPU is going to be like getting a Tesla.

Much like the Tesla stretch - we're going to have gamers doing the Nvidia stretch :)

But that's accusatory - as Nvidia is merely trying to change the world for the better!
 
It seems DLSS3 might come to previous RTX generations:


It's impressive to see how they can lower the latency with this technique. I haven't read the long version, but I doubt it includes the monitor latency; it would be foolish of them.

There could be a very interesting system in which the user only has the upscaling feature, on a cheaper and less power-hungry card, while the heavy work of 3D rendering is done on NVIDIA's side, or on a server farm installed closer to the user.

The users wouldn't have to upgrade their card often, if at all; only the servers would need to. NVIDIA could shrink their chips on both the client and server sides - hence increasing yields and lowering prices significantly. And they could truly develop the cloud-service side of their business. That would also reduce the streaming bandwidth significantly compared to sending full-resolution frames, and the latency of course, which is important for games.

But I'm sure they thought about it, it seems obvious when looking at the evolution of their technology.

Oh, and that would definitely shut the door on the blockchain problem.
Why would Nvidia do any of that? They'd be cutting into their own profits.
 
Why would Nvidia do any of that? They'd be cutting into their own profits.
Because it offers more flexibility in the solutions / products. But I honestly have no idea if that's more profitable; that's above my pay grade. It just doesn't seem an unreasonable segmentation to me.

They may sell more cards if they are cheaper, and still find a reason for people to upgrade them now and then, for example more powerful neural net implementations or higher resolutions.

There's an increasing demand for AI chips because they are being used in more applications, e.g. image recognition, image enhancement, automotive. So NVIDIA (or AMD, or Intel) may find just the right formula that's compatible with that demand too - something people could use without all the attached rendering pipeline when they don't need it. This is not just imaginary; where I work we were in that specific situation not long ago.

They may earn more from the service, which is not an uncommon pattern. They've already started it; this would be an enhancement, maybe optional for the AI-card owners.

Actually, they could even continue to produce the complete cards at the same time, because some people will always prefer to own the whole system and not rely on a service.

The other half, the rendering pipeline, should interest other customers too - that was already the case before the RTX series.
 
PCGamer breaks down why the 4080 is a shit deal.

When Nvidia launched the existing Ampere generation and the RTX 30-series roughly two years ago, the RTX 3080 series board was a slightly cut-down version of the RTX 3090 using the same GA102 chip and with around 80% of the functional units of its bigger sibling. In turn, the RTX 3070 used the next-tier GA104 GPU and delivered in the region of 55% of the hardware of the RTX 3090.

Now compare that with the new Ada Lovelace series. The RTX 4080 12GB uses the AD104 chip and offers just 45% of the functional units of the RTX 4090. To give one obvious example, it packs well under half the shaders of the RTX 4090—7,680 compared with 16,384. For the RTX 3080 versus the RTX 3090, it was 8,704 shaders compared with 10,496. That's 80% of the shader count.

In terms of its relationship with the RTX 4090, the new RTX 4080 12GB is more akin to the RTX 3060 Ti with its 4,864 shaders. Except the RTX 3060 Ti at least had a 256-bit memory bus. The RTX 4080 12GB only has a 192-bit bus. Oh, and the RTX 4080 is $900.
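A quick sanity check of those percentages from the shader counts quoted in the article (ignoring that Ada and Ampere shaders aren't directly comparable):

```python
# Shader counts as quoted in the PCGamer piece
shaders = {
    "RTX 3090": 10496,
    "RTX 3080": 8704,
    "RTX 4090": 16384,
    "RTX 4080 12GB": 7680,
    "RTX 3060 Ti": 4864,
}

print(f"3080 / 3090:      {shaders['RTX 3080'] / shaders['RTX 3090']:.0%}")       # ~83%
print(f"4080 12GB / 4090: {shaders['RTX 4080 12GB'] / shaders['RTX 4090']:.0%}")  # ~47%
print(f"3060 Ti / 3090:   {shaders['RTX 3060 Ti'] / shaders['RTX 3090']:.0%}")    # ~46%
```

Relative to its generation's flagship, the 4080 12GB really does sit about where the 3060 Ti sat last time around.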
 
What's the catch exactly? It's still less expensive than an RTX 3090 Ti and delivers 100-200% of its performance, depending on the app. I see that the raw specs don't scale linearly with the price - is that the problem?
 
Where are you seeing 200% of the performance of a 3090 Ti?
In the "Up to 4x performance" tiny graph, for MSFS in the "current games" category on the left. Well, maybe a little under 2x but not far.

It's surprising because I'd expect a game like that to handle a lot of textures. Of course we all know that those graphs always show the product under the best possible angle, perhaps it was recorded with the camera looking down at the ground from 1 m high.
 
PCGamer breaks down why the 4080 is a shit deal.
Yeah, they're right.

I actually wrote most of that stuff then decided I just don't care enough to post it and went with "Still nothing to play".

It was such a boring demo. It just felt like nothing but bullshit marketing. The misleading new naming scheme, the lack of next-gen games to show off, the focus on RT and DLSS. So boring.

Just about all I got out of it was confirming the 3GHz OC (4090) leak was likely true. We already knew the specs from the leaks. We just assumed the "12GB" would be called 4070.

But, you know, they are nice clocks. They're good cards.

But the fact remains that MOST gamers play on their phones, and then we have another 6 years, at least, of 10 TFLOP consoles, so no one is gonna be targeting games at these cards. I suppose it's good in a way. My 1070 looks like it will run everything at 1080p for a long time yet. If you wanna be a graphics whore and spend big for a higher res then gg to you.

But DLSS isn't even native 4K so enjoy your upscaled 1440p or whatever. :)
 
In the "Up to 4x performance" tiny graph, for MSFS in the "current games" category on the left. Well, maybe a little under 2x but not far.

It's surprising because I'd expect a game like that to handle a lot of textures. Of course we all know that those graphs always show the product under the best possible angle, perhaps it was recorded with the camera looking down at the ground from 1 m high.
Nvidia's PR spiels always reference performance from a very specific angle. I'm sure there will be a handful of games where, using ray tracing + DLSS 3, you will get a significant boost.

I think he's correct here though where he says the vast majority of games won't see a big difference going from a higher-end 30xx series card to a 4080.
And no, before you suggest it, the higher clocks of Ada Lovelace over Ampere don't make up for this. They would if the RTX 4080 clocked something like 50% higher than the RTX 4090. But, instead, we're expecting the gap to be a few percentage points. All of which means the RTX 4080 series looks like it will very probably suck for regular rasterized games rather than those that use lots of fancy ray-tracing effects. In other words, the vast majority of games.

Hard to believe, but the RTX 4080 12GB has fewer shaders than the RTX 3080 and a much narrower memory bus. According to the best information we have, it will probably have fewer ROPs and fewer texture units, too. Yes, it has a much higher clock and Ada Lovelace's shader cores aren't directly comparable. But here's the rub. If you think the RTX 4080 12GB is going to be a big leap over the RTX 3080 in most games, just as the RTX 3080 was over the RTX 2080, you're in for a big old letdown.
 
Nvidia's PR spiels always reference performance from a very specific angle. I'm sure there will be a handful of games where, using ray tracing + DLSS 3, you will get a significant boost.

I think he's correct here though where he says the vast majority of games won't see a big difference going from a higher-end 30xx series card to a 4080.
If DLSS 3 comes to the RTX 30 series, the difference will be even less dramatic.

Looking again at the 3090 Ti, I see it has twice the memory of the 4080 12GB, so that may be interesting for demanding games. 24 GB is a lot, though; I don't see many games that would use that much, even in the near future.

On the other hand, the RTX 40 series consumes noticeably less power since they moved to a better process node: from a 450 W TDP (!) down to 285 W for the two cards mentioned above. If I had to choose between the two, this alone would make me think twice.
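Putting a figure on that 165 W gap (the electricity price and gaming hours below are assumptions - adjust for your own case):

```python
# Rough running-cost difference between the two TDPs mentioned above.
# 0.30 EUR/kWh and 20 h/week at full load are assumptions, not real data.
tdp_3090_ti_w = 450
tdp_4080_12gb_w = 285
hours_per_week = 20
price_per_kwh = 0.30  # EUR

delta_kwh_per_year = (tdp_3090_ti_w - tdp_4080_12gb_w) / 1000 * hours_per_week * 52
print(f"~{delta_kwh_per_year:.0f} kWh per year difference, "
      f"~{delta_kwh_per_year * price_per_kwh:.0f} EUR per year at full load")
```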

Anyway, let's wait until they're released to see their relative prices and observed performance.
 