All this as the OP glorifies AMD's engineering and grit-based culture for driving through all those tough missteps and missed opportunities.
To expand on that, I really do feel AMD has a great engineering culture, but they keep falling into the same traps. They do not invest strongly enough in software support or vendor relationships. Neither of these necessitates the more evil monopolistic practices of vendor lock-in and proprietary, non-free (as in libre) software. If they can navigate that without turning evil, they'd be a company for the ages.
And I can't close without mad respect to Dr. Lisa Su for her admirable leadership, itself bookworthy. Also, quick fact: she and Jensen are cousins!
Lest we forget, the Intel IPC advantages over comparable AMD CPUs were due to some shortcuts that exposed major vulnerabilities in Intel CPUs made from ~2011 to 2019. I’d be curious to see how a Spectre- and Meltdown-patched Intel CPU fares against its AMD competitor NOW. Some of the performance hits were brutal: 20%+ in some workloads.
Nvidia was pushing AMD out of the GPU market back when GPUs were effectively only used for gaming and while GameWorks was predatory, you can’t really blame them for having the cooler-running, quieter, more energy-efficient GPUs going back to the Maxwell line (GTX 9x0). CUDA didn’t screw AMD until recently… but in 2014, people were picking Nvidia because the GPUs were considerably “better”. AMD had the best bang for buck back then, but you’d have more power consumption and heat output, and the drivers tended to be buggy. The bugs would be fixed, but it really sucked for people trying to play games on release day.
For like 8 years their drivers on Linux were a nightmare and AMD could have come in and done better.
AMD eventually did, while Nvidia's drivers remained a nightmare almost until today. But sure, AMD could have done it sooner.
Nowadays almost nobody cares about OpenCL.
https://news.ycombinator.com/item?id=40790924
That was OK for the CPU turnaround, but on the GPU front it completely shut them out of the first rounds of the AI party and maybe a trillion in market cap.
I'm really surprised AMD isn't throwing a whole bunch of money at emulating CUDA. If they could "just" make CUDA work on AMD cards, it feels like Nvidia's position would be severely weakened.
Kind of like how Valve invested heavily into Proton and now gaming on Linux is pretty much fine.
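For what it's worth, AMD's HIP layer already covers much of this at the source level: the runtime API is a near-mechanical rename of CUDA's, and tools like hipify automate most of the translation. A minimal sketch of how close the two look (the kernel and names here are purely illustrative, not any particular AMD porting flow):

    // Illustrative only: a trivial CUDA kernel, with comments showing the
    // HIP renames a hipify-style tool would apply. Assumes the standard
    // CUDA and HIP runtime APIs.
    #include <cuda_runtime.h>   // HIP: #include <hip/hip_runtime.h>
    #include <cstdio>

    __global__ void saxpy(int n, float a, const float* x, float* y) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;   // identical in HIP
        if (i < n) y[i] = a * x[i] + y[i];
    }

    int main() {
        const int n = 1 << 20;
        float *x, *y;
        cudaMalloc(&x, n * sizeof(float));               // HIP: hipMalloc
        cudaMalloc(&y, n * sizeof(float));
        saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);  // same launch syntax
        cudaDeviceSynchronize();                         // HIP: hipDeviceSynchronize
        cudaFree(x);                                     // HIP: hipFree
        cudaFree(y);
        printf("done\n");
        return 0;
    }

The hard part isn't the source-level rename; it's running already-shipped CUDA binaries, matching the library ecosystem (cuBLAS, cuDNN, NCCL), and years of driver polish, which is what makes a Proton-style shim so much harder here.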
What were their shady practices and lock-in policies?
Personally, I have no issue with "ends justify the means"-style thinking as a blanket rule; often it's perfectly appropriate.
I would argue it is, in this case, where Nvidia was playing the game by the rules. If there is an issue with how they played, then the government should change the rules.
The people in power in the US don't want that though.
This takeaway was a little odd to me in the context of 2008. I had been an AMD stalwart in my PCs since about 2000 (Athlon Thunderbird), but IIRC in 2008 Intel had the better processor. Better single core performance, better performance/watt, and I think AMD processors tended to have stability issues around this time. I remember I built a PC in 2009 with a Core processor for these reasons.
Obviously this is a niche market (gaming PC) perspective. But I don't think it was so clear cut.
Personally, I’ve always liked Intel for stability reasons. Running Intel chipsets and CPUs, I’ve just had fewer issues. I’m an enthusiast, so I do spend more than I should on both Intel and AMD rather frequently… but now, I’m hungry for an Ampere system. My wallet is crying.
That happened well after 2008, with the advent of Zen, chiplet-based tech, and better perf/W.
Given Nvidia's track record I'd sooner imagine them just slacking off and overcharging more for lack of competition. I wish AMD would actually compete with them on GPUs (for graphics, not AI). Interestingly Intel seems to be trying to work up to that now.
Nvidia being able to take a trailing-node strategy during the Turing/Ampere years, running a full node behind RDNA1/2 on dirt-cheap Samsung crap and last-gen TSMC 16FF/12FFN while still fighting AMD to a standstill on efficiency, is entirely the result of AMD slacking off.
AMD themselves have said they slacked off. Lost focus, is the quote.
Practicality beats purity 100% of the time. This echoes "Worse is better".
Not understanding the importance of GPUs in 2006, or of being first-to-market, while confusing OpenGL with OpenCL (twice), survival bias (BELIEVE IN YOUR VISION)…
> AMD (NASDAQ:AMD) today announced revenue for the first quarter of 2024 of $5.5 billion, gross margin of 47%, operating income of $36 million, net income of $123 million and diluted earnings per share of $0.07.
https://ir.amd.com/news-events/press-releases/detail/1192/am...
And Nvidia had revenue of $22.1 billion in Q4 2024, a gross margin of 76%, operating income of $13.6B, net income of $12.2B, and diluted earnings per share of $4.93. https://investor.nvidia.com/news/press-release-details/2024/...
It's interesting that they see such a monopoly as something that would bring costs down. It seems more to me like competing with AMD does much more to keep Nvidia's costs down (if they can be described as "down") than combining resources would.
Is that a fair statement to make, given ~20 years have passed?
What does this mean? I thought neither has any “fab” (manufacturing) facilities.
Imagine the wealth destruction if they had merged way back then! I don't love the way mergers are regulated today but I do feel like preventing companies from growing too big through mergers is desirable.
I get arguments that maybe one fab is better than the other, but what about all of them combined? All of our modern chipmaking capability all at once.
Nvidia has no factories. You can ship their output on a USB flash drive. Valuation: ~3.1T.
Intel, TSMC, and Samsung have all the factories. Every modern chip made on Earth comes from this circle. Combined valuation: ~1.1T.
This is simple napkin math for this arbitrary retail investor. I don't know when the music will stop but it absolutely will.
It's impossible to overstate the advantages that CUDA, its documentation, toolchain, and Nsight software provide to outside developers.
The closest thing I've seen to Nsight Systems is Intel's VTune. But that's just one piece of a much larger puzzle, and last I checked, VTune was only for Intel CPUs.
AFAICT, Nvidia's software seriously reduces the ramp-up time for new developers to write kernels or apps that make good use of the available hardware.
E.g., nsys-ui (like VTune) recognizes anomalous profile results, and makes solid suggestions for next steps. I don't know of other software that does this (well), although maybe I'm just uninformed.
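To make that concrete: part of what the toolchain buys you is that a couple of NVTX range calls make your own program phases show up on the Nsight Systems timeline alongside the kernel and memcpy rows. A rough sketch, assuming the standard NVTX C API shipped with the CUDA toolkit (the kernel and range name are made up for illustration):

    // Illustrative only: NVTX ranges let nsys attribute time to
    // application-defined phases.
    #include <cuda_runtime.h>
    #include <nvToolsExt.h>   // link with -lnvToolsExt

    __global__ void scale(float* data, int n, float s) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) data[i] *= s;
    }

    int main() {
        const int n = 1 << 20;
        float* d;
        cudaMalloc(&d, n * sizeof(float));

        nvtxRangePushA("scale_phase");   // shows up as a named range in nsys-ui
        scale<<<(n + 255) / 256, 256>>>(d, n, 3.0f);
        cudaDeviceSynchronize();
        nvtxRangePop();

        cudaFree(d);
        return 0;
    }

Profile with something like "nsys profile ./app" and open the report in nsys-ui; the "scale_phase" range lines up against the kernel launch and whatever stalls surround it.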
DGX is a complete data center where Nvidia is the supplier of everything themselves:
- CPU+GPU from Nvidia
- Rack from Nvidia
- Interconnects + networking from Nvidia
- SW from OS to application framework from Nvidia
The only thing Nvidia really needs partners for with DGX is memory (RAM + SSD).
One reason Nvidia's margins are so high is that they provide the whole data center, while the competition has to split margins across vendors (AMD/Intel + SMCI/Dell + Broadcom/Arista + Cray/HPE).
Don't believe me? Do you believe nvda?
There are many companies working on alternatives at the moment, but it will be a while until Nvidia can be replaced.
that's gonna take a while
I remember reading that on places like the Register, but they kept the second A, so DAAMIT.
I'm sure it means engineering, but I've never seen that abbreviation. He mentioned he's from India; is that where this comes from, or is it just an individual quirk?
I wonder how many companies had this problem.
I should've BET 50% of my portfolio.
So, long story short is that most engineers, especially ones as fanboyish as this, are wildly out of place in decision making and can't see the forest for the trees?
It doesn't seem that surprising.
My experience is rather that people who are passionate about engineering simply have a very different "taste" in hardware and buying decisions than other groups. So they see the forest insanely well, but they see very different paths through this forest than other people (say analysts or the general population) do.
Vaguely interesting side note: Yandex found that from poor search terms very easily, and Google abjectly failed to. I hope Google is tracking how frequently people use their engine to find Yandex, while remembering how Bing was mostly used to find Google, and maybe the death of Yahoo.
Is there anybody here who has access to a B200 NVL72 with working external NVLink switches and wants to share non-marketing impressions?
it is wild the way AMD engineers can't stop themselves from throwing stones, even with 20 years of distance and even when their entire product strategy in 2024 now rides on gluing together these cores.
people forget that Intel saying that AMD was gluing together a bunch of cores comes after years of AMD fans whining that Intel was gluing together a bunch of cores - that was always an insult to Intel users that pentium D wasn't a real chip, that core2quad wasn't a real chip (not like quadfather, that's a real quad-core platform!). And you see that play out here, this guy is still salty that Intel was the first to glue together some chips in 2002 or whatever!
and the first time AMD did it, they rightfully took some heat for doing it... especially since Naples was a dreadful product. Rome was a completely different league; Naples really was glued-together garbage in comparison to Rome or to a monolithic chip. You can argue that (like DLSS 1.0) maybe there was a vision or approach there that people were missing, but people were correct that Naples was a dogshit product that suffered from its glued-together nature. Even consumer Ryzen was a real mixed bag; vendors basically took one look at Naples and decided to give AMD 2 more years to cook. People were still so wound up about it that they sent death threats to GamersNexus over the “i7 in production, i5 in gaming” verdict, which frankly was already quite generous given the performance.
frankly I find it very instructive to go back and read through some of the article titles and excerpts on SemiAccurate, because it's almost unthinkable now how blindly tribal things were, but this shit is how people thought 10 years ago. Pentium D is bad because it's glued together! Core 2 Quad is bad because it's glued together! And that from the actual engineers who have the perspective and the understanding to know what they're looking at and judge the merits, with 20 years of retrospect and distance! Just look at what the discourse of that time was like...
https://www.semiaccurate.com/tag/nvidia/page/6/
"NVIDIA plays games with GM204"
"how much will a GM204 card cost you!?"
"Why mantle API will outlive DX12 [as a private playground for API development outside the need for standardization with MS or Khronos]"
"GP100 shows that NVIDIA is over four years behind AMD in advanced packaging"
"NVIDIA profits are up in a fragile way".
like why are amd people like this? inside the company and out. It’s childish. None of the other brands' engineers are out clowning on twitter (frank azor? chris hook? etc), none of the other fans are sending death threats when their brand’s product isn’t good. Like you wanna make a $10 bet over it???