I believe GAA is the next-gen transistor technology for leading-edge nodes. Samsung is the first foundry to do GAA at 3nm, while TSMC is sticking with FinFET for its 3nm. It'll be interesting to see how 3nm FinFET compares to 3nm GAA.
https://www.anandtech.com/show/16041/where-are-my-gaafets-ts...
edit:
>After that, transistor structures begin to change. Samsung and TSMC are manufacturing chips at 7nm and 5nm based on today’s finFETs. Samsung will move to nanosheet FETs at 3nm. Intel is also developing GAA technology. TSMC plans to extend finFETs to 3nm, and then will migrate to nanosheet FETs at 2nm around 2024.
https://semiengineering.com/the-increasingly-uneven-race-to-...
Samsung Foundry has a history of over-promising and under-delivering. This 3nm launch will likely be similar to their EUV node, which they claimed was an industry first but wasn't shipping in any real volume. So arguably they are not lying, but it is marketing spin.
TSMC is an extremely pragmatic company. Either it works or it doesn't; there is no need to save face. No need to push for an industry-first GAA or FinFET, just push for whatever works within the timeframe with respect to yield and cost. What is the point of having the best tech in 2022 if they can't produce it in enough volume for any of their customers?
That is not to say Samsung Foundry is evil or anything; they are pushing very, very hard to catch up to TSMC and stay competitive in the market (look at what happened to GlobalFoundries). And now Intel is coming. Pat Gelsinger seems to be doing all the right things.
Listening to their earnings call, their CEO really pushed aside quick profits (such as raising prices) in favor of long-term stability and keeping their existing clients happy. The investors wanted TSMC to raise chip prices and expand their operations.
What is the point of having the best tech in 2021 if all capacity is allocated to a single customer?
Intel has loads of cash and would pay that money in a second, but I suspect that most other companies would rather hold off a bit longer in exchange for much cheaper products.
Comparing apples to oranges, and doing a comparison of taste vs. size.
The device may well be awesome, but the first ICs using it, not so much.
The biggest advancements below 14nm were in the metal layers, not so much in the device, process, or materials.
Even if Samsung produces a better device design, they will still have to catch up to TSMC in many, many other areas.
Only a single-digit number of people on this planet will ever know the exact measurements of FinFET vs. GAAFET.
But one thing is for sure: Samsung saying that they pioneered a new device ahead of TSMC does indeed sound very impressive to a certain category of big-co people, regardless of actual performance.
Even if you pay [€$]1000+ for one of their Smart Phones, you can look forward to:
* Non-removable cruft, as if they were a telecom and you were on a contract and hadn't just handed over a grand
* ...like a confusingly-similar-looking competitor to Google Contacts that will upload your info to their servers
* GDPR? LOL
* A hardware button on the side of your phone, located just below the volume-down button and easy to press accidentally, that is hard-coded and unconfigurable and launches their AI assistant, Bixby. Don't want to use Bixby? Tough shit. Nothing you can do about it.
* Constant badgering via the phone's native notifications to sign up for "Samsung Members", a social media platform. No, you can't turn that off.
* Other, similar bullshit.
3nm? These are such sketchy practices that I can't imagine they don't also affect, say, their high-end TVs (they would totally monitor your house and show you advertisements).
Seriously, avoid that company. No, paying for their high-end options will not insulate you from their nonsense.
My Galaxy Buds Plus are pretty good and have unparalleled battery life - but you can’t use the companion app on Android because it won’t work unless you give it access to your contacts.
My Samsung TV is quite snappy and, besides my model being a special edition that doesn’t come with Bluetooth and them not specifying it anywhere, it’s actually pretty alright. Cold-boots quickly, has a snappy UI, theoretically comes with all the smart features you want ... but it’s full of ads the moment you enable internet access, plus you know the spying allegations. I guess I’ll still have to figure out proper firewall rules.
I’m going to guess that their other appliances are similar. Pretty good hardware, pretty good software underpinnings, just severely held back by some anti-consumer software decisions.
Apple does something similar. Every now and then I get a notification about "try TV+/Arcade/Music for x months".
Or, a few years back when Wallet launched, daily notifications to add my card... in a country that doesn't support Apple Wallet.
That button is actually my favorite feature of the phone and it'll be very hard to give up. Obviously I'm not using it for Bixby. I have it mapped to play/pause on long press, toggle flashlight on double press, and as a secondary lock/unlock button on single press.
https://play.google.com/store/apps/details?id=com.jamworks.b...
Works fine on my S10, ymmv.
Edit: And yes, the pre-installed crap is super annoying. You can remove some, but not all, of it via adb; it's a hassle. Samsung is hardly the only offender in this regard, though they may be among the worst (aside from Google and Apple, which get a free pass).
You can turn off sync. Same as Google.
Samsung software isn't worse than Apple or Google IMO.
It's a stretch to blame that on Samsung; process generations described in nanometres haven't been based on actual component size for years now, by any manufacturer.
Then again, the nanometre figures aren't always indicative of performance and don't matter in any practical sense when actually using a computing device, so maybe it's not that bad.
Dishonest marketing, though? Most certainly.
You still need to segment chips into low/medium/high power when comparing them, though, as it's generally not possible to just take a very high perf/watt low-power part and scale up the same design with the same level of efficiency.
To be fair, I read that the Xnm labelling has been pretty much pure marketing since 45nm: https://en.wikichip.org/wiki/technology_node#Meaning_lost
QLED
OLED
In this case,
Samsung's 3nm = Intel's 7nm
I am still waiting for a standard based on transistor density numbers!
Intel 4 was estimated at up to 200M transistors/mm². I don't think we have exact numbers, since Intel only released figures for their previous 10nm plan, which was heavily revised for Tiger Lake IIRC. I think Intel 3 is a variant of Intel 4, so Samsung's 3GAA will presumably be similar to Intel 3.
edit: slides of the 3nm conference yesterday https://twitter.com/stshank/status/1445924295121592321/photo...
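For what it's worth, such a standard has been proposed: Mark Bohr of Intel suggested (back in 2017) a density metric that weights the transistor density of a 2-input NAND cell and a scan flip-flop. A minimal sketch in Python; the cell areas and the flip-flop transistor count below are made-up placeholders for illustration, not real standard-cell library data:

```python
def bohr_density(nand2_area_um2: float, sff_area_um2: float) -> float:
    """Bohr's proposed density metric, in transistors per mm^2:
    0.6 * (NAND2 transistor density) + 0.4 * (scan flip-flop density)."""
    NAND2_TRANSISTORS = 4    # a 2-input CMOS NAND gate has 4 transistors
    SFF_TRANSISTORS = 30     # rough scan flip-flop count (an assumption)
    nand2_density = NAND2_TRANSISTORS / (nand2_area_um2 * 1e-6)  # per mm^2
    sff_density = SFF_TRANSISTORS / (sff_area_um2 * 1e-6)        # per mm^2
    return 0.6 * nand2_density + 0.4 * sff_density

# Placeholder cell areas (um^2), purely illustrative:
print(f"{bohr_density(0.02, 0.15) / 1e6:.0f} MTr/mm^2")
```

The point of the weighted mix is that logic density isn't captured by one cell alone: NAND2 stands in for simple combinational logic and the scan flip-flop for the bulkier sequential cells.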
then there was windows. have you actually heard an elite hipster tell you, a professional highly paid tech guy, that mac is superior because windows is behind on operating systems? not because unix is better. no, because windows was "8" and mac was "10." X is the roman numeral for "10" you see, and 10 is a later version than 8. So ms ended up skipping windows 9.
There was a defined standard for process sizes. A bunch of unethical marketing scammers used half-truths to scam people. It worked, because people can't even figure out a hamburger. And now the people who followed the long-established standard have to switch to the scammer's standard, because they still need that hamburger guy's money.
Here's the problem with customers... You need their money.
Intel has renamed its nodes, so the industry has now pretty much standardised on naming, where the 3nm nodes from Samsung, TSMC and Intel will all have similar transistor density (but not similar performance or other characteristics).
Officially Intel doesn't use 4nm or 3nm; they call it Intel 4 or Intel 3. But for the sake of easier comparison, most people still say Intel 4nm.
Creating chips with an accuracy of 1nm does not mean 1nm transistors.
But 1nm sounds good as marketing.
Some vertical distances, e.g. thicknesses, are indeed only a few nm, but the process node name has always referred strictly to horizontal distances (i.e. parallel to the wafer surface), which are determined by lithography. Those are at least 10 times larger, in the range of 25 nm to 60 nm for modern processes.
There are a few horizontal distances that are not determined by lithography, i.e. they do not correspond to anything drawn on the mask; like the vertical dimensions, they are determined by etch or diffusion rates. Those do not count for naming processes either, because you could have such a distance of only, say, 5 nm even in a 180 nm process. Distances not determined by lithography do not influence the potential density of a circuit, only certain electrical characteristics.
It’s allowed, as many, many, many other bad things in this world are.
See, you rarely ever see so much marketing money spent on an industrial service; Hollywood-level graphics at an obscure industry event keynote, like here, would've been more laughable than noteworthy 10 years ago.
Without these cash piles, there is no way to finance new fabs, and SEL is fighting for its survival here. Once you are out of the race in the semi industry, you can never catch up.
I wonder if that is some kind of cultural shift that is taking place that started around 2009, or if it's always been like this and I just never noticed.
BMW model numbers used to more or less accurately reflect engine sizes, not anymore, it's just numbers now.
2G, 3G, 4G used to mean something, not anymore.
I could add a remark about the federal reserve, but... I'll just stay away from that. Don't want to be too edgy/turn this into a political discussion (I just think it's interesting from a cultural perspective).
It's like we collectively decided that "it's just numbers, man."
You can get arbitrarily small at the cost of an exploding mask count, e.g. double patterning needs 2X masks, but quad patterning needs 8X. Octuple patterning is completely impractical.
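As a toy illustration of how pitch shrinks while process complexity grows: assuming each self-aligned spacer step halves the printable pitch, and taking ~76 nm as a rough single-exposure baseline for 193 nm immersion lithography (both of these numbers are assumptions, not any fab's real figures):

```python
# Toy model: each self-aligned patterning step halves the printable pitch,
# while process complexity (extra deposition/etch/mask passes) keeps growing.
BASE_PITCH_NM = 76  # assumed single-exposure limit of 193 nm immersion litho

def pitch_after_patterning(steps: int) -> float:
    """Achievable pitch after `steps` spacer-based pitch-halving steps
    (0 = single exposure, 1 = SADP, 2 = SAQP, 3 = SAOP)."""
    return BASE_PITCH_NM / (2 ** steps)

for steps, name in enumerate(["single exposure", "SADP", "SAQP", "SAOP"]):
    print(f"{name:>15}: ~{pitch_after_patterning(steps):.0f} nm pitch")
```

Each halving buys a smaller pitch, but the added steps compound cost and yield risk, which is why the trade-off stops making sense well before the math does.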
Quantum effects are moderately visible and have to be counteracted at 5nm. I heard some rumours about 7nm, but cannot confirm that any countermeasures were taken to avoid quantum effects there.
It should be noted that "3nm" is now a purely commercial label with no relationship to the size of the transistors on the die.
Gate leakage is the phenomenon of quantum tunneling through the gate dielectric barrier and started appearing as gate dielectrics became thinner and thinner. Gate leakage was mitigated by moving to higher k dielectrics (from silicon dioxide, SiO2, to more exotic materials that include other elements such as Hafnium).
Higher-k dielectrics allow the same capacitance per unit area and channel control with a physically thicker gate dielectric compared to plain SiO2, reducing gate leakage. This technology change came along with metal gates (which used to be polysilicon) and was a combined advance that Intel incorporated a few years before TSMC, IIRC circa 2008.
This is a circuit designer's perspective. Someone who actually understands device physics and material properties can chime in to correct me.
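To make the "same capacitance, thicker film" point concrete: gate capacitance per unit area is C = k·ε0/t, so matching the capacitance of SiO2 with a higher-k material allows a proportionally thicker physical film. A back-of-the-envelope sketch with textbook k values; the 1.2 nm SiO2 thickness is just an assumed example of a leaky late-node oxide:

```python
# Sketch: same gate capacitance per area, C = k * eps0 / t, achieved with a
# physically thicker high-k film. k values are textbook approximations.
EPS0 = 8.854e-12  # vacuum permittivity, F/m

K_SIO2 = 3.9   # relative permittivity of SiO2
K_HFO2 = 25.0  # approximate relative permittivity of HfO2

t_sio2 = 1.2e-9  # assumed SiO2 gate oxide thickness (very leaky at ~1 nm)
c_target = K_SIO2 * EPS0 / t_sio2  # capacitance per area to match, F/m^2

# HfO2 thickness that gives the same capacitance per area:
t_hfo2 = K_HFO2 * EPS0 / c_target

print(f"SiO2 {t_sio2*1e9:.1f} nm -> HfO2 {t_hfo2*1e9:.1f} nm for same C/area")
```

Since tunneling leakage falls off roughly exponentially with physical thickness, that factor-of-~6 thicker film is what buys the dramatic reduction in gate leakage.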
Off the back of this episode, I can't help but feel TSMC's near-monopoly needs disrupting by players like Samsung, given the geopolitical tension between Taiwan and China.
https://www.tomshardware.com/news/samsung-foundry-to-produce...
Is transistor density the best measure of chip competitiveness these days?
With theoretical organic one-electron designs you can build a transistor out of 4 carbon atoms. But nobody knows how to mass-produce those.
We are somewhere in the middle: modern designs did break the 10nm barrier, but are not nearly as small as the numbers you see around.