NVIDIA to Stick to Monolithic GPU Dies for its GeForce "Blackwell" Generation (2024)


Thursday, May 30th 2024


by btarunr
Discuss (30 Comments)

NVIDIA's GeForce "Blackwell" generation of gaming GPUs will stick to traditional monolithic dies. The company will not build its next generation of chips as either disaggregated devices or multi-chip modules. Kopite7kimi, a reliable source for NVIDIA leaks, says that the largest GPU in the generation, the "GB202," is a physically monolithic design. The GB202 is expected to power the flagship GeForce RTX 5090 (the RTX 4090 successor), and if NVIDIA is sticking to traditional chip design for its largest GPU, it's unlikely that the smaller GPUs will be any different.

In contrast, AMD began building disaggregated devices with its current RDNA 3 generation, with its top two chips, the "Navi 31" and "Navi 32," being disaggregated designs. An interesting rumor suggests that team red's RDNA 4 generation will see a transition from disaggregated chips to multi-chip modules—packages that contain multiple fully-integrated GPU dies. Back in the green camp, NVIDIA is expected to use an advanced 4 nm-class node for its GeForce "Blackwell" GPUs.

Sources: kopite7kimi (Twitter), HXL (Twitter)

Add your own comment
#1
stimpy88

So, if it's an old-fashioned monolithic die, then it's only half of the GB202 they have shown for the datacenter, as that's two dies glued together. So does that mean the 5090 will have a GB203 die?

Looks like the consumers are getting the shaft yet again, unless the 5080 gets the GB203 die as well, I suppose.

The performance uplift over Lovelace will be interesting with this series, as to me it sounds like a lot of overclocking is going to be needed to bring the big performance gains to these cards. Maybe 600w+ 5090s will be a thing, and 350w 5080s etc...?

#2
Broken Processor
stimpy88: So, if it's an old-fashioned monolithic die, then it's only half of the GB202 they have shown for the datacenter. So that means the 5090 will be a GB203 die?

Looks like the consumers are getting the shaft yet again, unless the 5080 gets the GB203 die as well, I suppose.

The performance uplift over Lovelace will be interesting with this series, as to me it sounds like a lot of overclocking is needed to bring big gains to these cards. Maybe 600w 5090s are a thing, and 350w 5080s etc...?

It's always safe to assume the customer is getting shafted by Nvidia these days.
I used to be so excited for new generations of GPUs and CPUs, but I just can't anymore; the pricing is leaving such a bad taste in my mouth that I just buy peripherals or add to my water cooling. Honestly, consoles never looked better.

#3
Jomale

No time for gluing chiplets, because of big AI sales.

#4
x4it3n

The real performance increase will probably come from the 512-bit bus with GDDR7, since the 4090 is memory-bandwidth starved! (And Blackwell will probably be much better in RT/PT too.)
But if the rumors are true, Nvidia might go back to one generation of GPUs per year (just like it used to be with GTX GPUs back then).

#5
pavle

Good choice; on-chip communication is still the fastest compared to mosaic structures.

#6
wolf

Performance Enthusiast

All that bothers me is the price, performance, specs, features and power consumption. How the sausage is made is below the waterline to a large extent, unless it's going to significantly impact the above mentioned considerations. For all the hoo haa about RDNA3 being chiplets, it didn't seem to matter all that much, at least not in a positive way for the consumers.

#7
Wirko

The Cerebras Wafer-Scale Engine is also monolithic.

#8
N/A

GB202 is likely two GB203s mirrored and linked by a 10 TB/s bus in the middle, already designed with the halved high-NA reticle size in mind, even though there's no need yet, since nodes all the way down to 16 angstroms don't employ high-NA yet. Whether it's the full EUV reticle that gets halved from 858 to 429 is unclear, but they are ready for next gen.

#9
BorisDG

Blackwell is looking more like a stop-gap chip/generation (like Apple's M3/A17 Pro) than anything else.

#10
SOAREVERSOR
stimpy88: So, if it's an old-fashioned monolithic die, then it's only half of the GB202 they have shown for the datacenter, as that's two dies glued together. So does that mean the 5090 will have a GB203 die?

Looks like the consumers are getting the shaft yet again, unless the 5080 gets the GB203 die as well, I suppose.

The performance uplift over Lovelace will be interesting with this series, as to me it sounds like a lot of overclocking is going to be needed to bring the big performance gains to these cards. Maybe 600w+ 5090s will be a thing, and 350w 5080s etc...?

If you want the bigger chip, pay up for it. This is like saying drivers are getting screwed because you can't pay for a Ferrari.

#11
AnarchoPrimitiv
wolf: All that bothers me is the price, performance, specs, features and power consumption. How the sausage is made is below the waterline to a large extent, unless it's going to significantly impact the above mentioned considerations. For all the hoo haa about RDNA3 being chiplets, it didn't seem to matter all that much, at least not in a positive way for the consumers.

Remember how the Radeon Fury X with HBM wasn't a "big deal"? Now look at HBM... it's so valuable it can't be used in consumer cards. The first application of a new technology isn't always guaranteed to make a big splash, but eventually it catches on to the point where we don't understand how we survived without it. Chiplets WILL happen, it's inevitable.

#12
Tahagomizer

One has to remember that nVidia is now an AI/datacenter company. They don't really care about yields all that much because they still have the GPU market to eat up partially damaged chips. With how fast their main market is moving, they might have concluded that farting around with disaggregated architecture is just too slow.

#13
Denver

And water is wet... It was always so obvious.

#14
watzupken
Broken Processor: It's always safe to assume the customer is getting shafted by Nvidia these days.
I used to be so excited for new generations of GPUs and CPUs, but I just can't anymore; the pricing is leaving such a bad taste in my mouth that I just buy peripherals or add to my water cooling. Honestly, consoles never looked better.

"These days," or has it always been like this? Like, Ampere for the data center was produced on TSMC 7 nm, but the retail customers'/gamers' version was produced by Samsung on a more mature node. Ultimately, data center solutions are the high-margin products and will get the best, so it is not surprising.

#15
AusWolf
An interesting rumor suggests that team red's RDNA 4 generation will see a transition from disaggregated chips to multi-chip modules—packages that contain multiple fully-integrated GPU dies.

Actually, I've read the opposite somewhere - that RDNA 4 will be monolithic, and only RDNA 5 will return to chiplets. We'll see, I guess.

As for Blackwell, I guess Nvidia isn't ready to risk accepting the shortcomings of the chiplet design, or isn't willing to iron them out (high idle power being the key one). Playing it safe makes sense.

#16
Prima.Vera
pavle: Good choice; on-chip communication is still the fastest compared to mosaic structures.

Yes, but the yields are at least 2x lower than for a die half its size ;)
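The intuition behind that yield claim can be sketched with the classic Poisson defect-density model (a simplification of what fabs actually use, such as Murphy or negative-binomial models, and the defect density below is purely illustrative):

```python
import math

def poisson_yield(die_area_cm2: float, defects_per_cm2: float) -> float:
    """Fraction of defect-free dies under the simple Poisson yield model."""
    return math.exp(-die_area_cm2 * defects_per_cm2)

d0 = 0.1                     # defects per cm^2 -- illustrative, not a real fab figure
small, big = 3.0, 6.0        # cm^2: a die, and one of twice the area
y_small = poisson_yield(small, d0)
y_big = poisson_yield(big, d0)
# Doubling the area squares the per-die yield: exp(-2*A*d0) == exp(-A*d0)**2.
# The big die also gives half as many candidates per wafer, so the good-die
# count per wafer drops by more than the yield ratio alone.
print(f"{y_small:.2f} {y_big:.2f}")
```

Under this model a die of twice the area always yields worse per die, and the gap widens as defect density rises, which is roughly the trade-off being argued here.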

SOAREVERSOR: If you want the bigger chip, pay up for it. This is like saying drivers are getting screwed because you can't pay for a Ferrari.

Again comparing video cards with cars :laugh: :laugh: :laugh: :slap::banghead:
Again, you don't change your car every 1 or 2 years just because a new, faster model comes into existence. The best comparison you can make is with smartphones. However, only suckers buy what is essentially the same phone again (let's call it the iPhone 14, when the 3-year-old model is exactly the same in all possible ways) :))

#17
stimpy88
x4it3n: The real performance increase will probably come from the 512-bit bus with GDDR7, since the 4090 is memory-bandwidth starved! (And Blackwell will probably be much better in RT/PT too.)
But if the rumors are true, Nvidia might go back to one generation of GPUs per year (just like it used to be with GTX GPUs back then).

I'll believe a 512-bit bus when I actually see it, even in a so-called $2000+ "gamer card".

#18
evernessince
pavle: Good choice; on-chip communication is still the fastest compared to mosaic structures.

CoWoS is able to provide 8.6 TB/s of bandwidth: en.wikichip.org/wiki/tsmc/cowos

That was circa 2022; the current implementation on Blackwell could be even better.

On-chip communication has a scaling issue wherein routing more and more data lines throughout the chip becomes increasingly difficult to do in an ideal manner as complexity increases. The University of Toronto published a paper studying the latency of CPUs with and without an interposer, which demonstrated that as CPU core count increases, so too does the benefit of having an active interposer.

#19
Minus Infinity

What's the surprise? It's still on 4 nm and on low-NA lithography with a ~850 mm² die-size limit. They'll only move to MCM when they migrate to high-NA EUV, which means much smaller die sizes. If it's TSMC, that's not until at least 2027 with the A16 node, IIRC.

#20
Craptacular
Broken Processor: It's always safe to assume the customer is getting shafted by Nvidia these days.
I used to be so excited for new generations of GPUs and CPUs, but I just can't anymore; the pricing is leaving such a bad taste in my mouth that I just buy peripherals or add to my water cooling. Honestly, consoles never looked better.

I would say CPU prices are really good. I remember the days when flagship consumer CPUs, such as the X6800 Core 2 Duo, were 1000 bucks; adjusted for inflation, that is $1,400. Today, flagship prices are basically 700 dollars.

#21
Broken Processor
Craptacular: I would say CPU prices are really good. I remember the days when flagship consumer CPUs, such as the X6800 Core 2 Duo, were 1000 bucks; adjusted for inflation, that is $1,400. Today, flagship prices are basically 700 dollars.

CPU prices are good, but yeah, I was thinking more of motherboard prices being a joke; well, for the X and Z series. B-series boards are not bad at all, until you start wanting features like a debug display. Motherboards are very carefully segmented to give very little until you go up in price, but they are still cladding them in armour at every opportunity instead of offering better sound, more M.2 slots, or better VRMs; it blows my mind. And why are decent dual-DIMM boards so expensive? The current motherboard situation is a joke. I've got no problem putting down money on a board; I've spent $600+ on HEDT boards in the past, but it's got to be value. So when I see motherboard manufacturers selling boards at HEDT pricing for a lot less, clad in worthless armour instead of quality components, yeah, it makes me think these companies take me for a fool. But hey, at least they got plenty of RGB for me to stare at while I try to work out why they won't boot.

#22
KsushaTeaKisa

I'm sorry for the off-topic, but when should we expect the 5070? Tbh, I'm not sure if I should wait for the 5070 or buy the 4070 Ti Super.

#23
Prima.Vera
KsushaTeaKisa: I'm sorry for the off-topic, but when should we expect the 5070? Tbh, I'm not sure if I should wait for the 5070 or buy the 4070 Ti Super.

Next year.

#24
AusWolf
Broken Processor: CPU prices are good, but yeah, I was thinking more of motherboard prices being a joke; well, for the X and Z series. B-series boards are not bad at all, until you start wanting features like a debug display. Motherboards are very carefully segmented to give very little until you go up in price, but they are still cladding them in armour at every opportunity instead of offering better sound, more M.2 slots, or better VRMs; it blows my mind. And why are decent dual-DIMM boards so expensive? The current motherboard situation is a joke. I've got no problem putting down money on a board; I've spent $600+ on HEDT boards in the past, but it's got to be value. So when I see motherboard manufacturers selling boards at HEDT pricing for a lot less, clad in worthless armour instead of quality components, yeah, it makes me think these companies take me for a fool. But hey, at least they got plenty of RGB for me to stare at while I try to work out why they won't boot.

There's no reason for 95% of home users to choose anything other than a $200 B-series motherboard. Prices of Z and X series are artificially inflated to make you think you're getting 2-3x value when you're not.

#25
ARF
KsushaTeaKisa: I'm sorry for the off-topic, but when should we expect the 5070? Tbh, I'm not sure if I should wait for the 5070 or buy the 4070 Ti Super.

Why not a Radeon with more VRAM? The RX 7900 XT has 20 GB, and the RX 7800 XT has 16 GB.
I guess right now is a bad moment for buying, because the new generation is coming soon.
