There's a bunch of teams there with three-letter acronyms whose origins have been totally forgotten. Like, nobody knows what LTQ or ASR stands for, or what purpose they have. When you're an intern, you tend to think that the higher-ups know what they're doing, but if you ask for an explanation, you will soon conclude that they don't know either.
People were not working hard enough. At the time, Intel's dominance was supreme. They should have been picking up on niche ideas like GPUs and mobile chips; it would have been cheap and adjacent to what they already had. Instead, all I heard at meetings was laughter at the little guys who are now all bigger than Intel. Even my friend in the VC division couldn't get the bosses to see what was happening. People would spend their whole day just having coffee with random colleagues and making a couple of slides. It's nice to relax sometimes, but when I was there it was way too much of that. There was just way too much fat in the business.
I still have friends there who stayed on. They tell me not to come, and are now wondering how to do the first job search of their professional lives. A couple have moved very recently.
It's very odd that the guy who was famous for saying what upper management should do (set culture) ended up building a culture that has completely failed.
I knew a lot of people who got jobs like this after college. I was so very jealous at the time. I was working in a company that was nice, but also wasn’t afraid to tell people when they weren’t meeting expectations. Some of my friends were at companies where they weren’t expected to “ramp up” for the first year. One person I know read “The Four Hour Work Week” and talked his company into letting him work remote, then started traveling the world. He would brag that his entire job was “telling the engineers what to do” and it took him an hour a day because he did it all through email in one sitting.
Years passed, economies evolved, and now it’s harder to get a job. Companies start looking for dead weight and discover people doing jobs that barely contribute, if at all.
A tech company near me looked at their VPN logs (required to interact with their internal services and do any dev work) and discovered a lot of engineers who were only connecting a couple times per month.
By then it’s hard to turn it around. It’s not easy to take people who have become so comfortable not working that the entire idea of urgency is a foreign concept. Ask for some task that should only take an hour or two and they’ll say they’ll have it by early next week. Any request turns into a series of meetings, which have to be scheduled with all participants, which means they can’t start discussing it until Bob is back from vacation next week, so they might have an idea of what’s required by end of month.
At some point you can’t turn it around without making big changes to the people involved. There’s too much accumulated inertia and habit. You have to reorg at minimum and bring in new management, while also making it clear to everyone that their performance is now actually being noticed. It’s hard.
With Intel, I’ve also heard from some ex-employees who left because pay was lagging. Companies with low expectations can feel like they’re getting away with low pay because many people will keep an easy job despite the low pay. It masks the problem, for a while.
It sounds like you blame their own lack of effort for losing their jobs. Like, if they would have worked harder, it wouldn't be them on the line.
But the reality is, they did not let the corporations take advantage of them. They turned the tables, had a good work-life balance, and got paid for it. Yes, maybe it cost them their job. But at the same time, they had one for years, and for many of them it probably meant they were ready for a change anyway.
Ultimately, happiness is a personal measure; what fulfills you is your own call, and the way the people you talk about worked may not be your preference. But it does not sound like they made a poor choice.
I worked my ass off for 20 years. I'm an expert in the field I work in, but after being skipped for raises three years running I said fuck it and put my personal life in front of everything else. I wake up when I want, start work when I want, work way less than I should. I still don't get a raise, but all my peers and my manager keep telling me what a great job I do. Now I'm slacking hard, but why should I feel bad when hard work isn't valued? That my boss and peers are happy is a positive thing, but I wouldn't be too concerned if they were less so.
I think the thing that's not obvious to young people is that choices that seem good at any given time may turn out to be poor choices further down the line. The guy who traveled the world while working one hour a day telling engineers what to do over email probably had a great young adulthood. It sounds like he paid for it later, though, by getting laid off and having difficulty finding another job.
This doesn't mean that those who worked their asses off didn't get screwed over, but on average they probably did better professionally - and by proxy, financially.
I think some people with 'cushy' jobs don't take on this same mentality, perhaps overestimating the security of their current job. “telling the engineers what to do” is not a good starting point and the answers to follow-up questions had better be pretty detailed and convincing.
I also interviewed someone who was, like me, a year out of school, same major, roughly the same job title, at our first post-undergrad jobs. I was thrown into the deep end and learning a lot. He was buying software licenses. I commend him for sticking it out for a bit but also realizing it was a bad fit.
Is it? Everywhere I've worked, upper management talks big about the culture, but their talking points are rarely applied to the company.
Like when Facebook says something like "we value your privacy"
Sort of like that Twilight Zone episode. The aliens come and convince us they are here to serve man. "Here, if you don't believe us look at our book called 'To Serve Man.'"
Finally one of the humans translates it and discovers it's a cookbook.
https://en.m.wikipedia.org/wiki/To_Serve_Man_(The_Twilight_Z...
Or worse, where I am their talking points about the culture they want ONLY applies to the company and not themselves. (In-office requirements, how the office is laid out, etc.)
I left because I was working on a machine learning project that was a "solution in search of a problem;" and I spent too much time working alone. I was very early in my career and felt like I just wasn't learning enough from my peers.
Overall, I felt like Intel was a positive experience. I do think their biggest problem was that they had too many lifers and didn't have enough "healthy turnover." Almost everyone there started at the beginning of their career, so everyone who was mid-to-late career didn't understand what the rest of the industry was doing.
They are the poster child for "we have a monopoly so we don't have to innovate or even maintain competence". Mind you, how much worse must things be at AMD that they're not winning the x64 war? Eventually the "PC" market is going to get run over by ARM like everything else. Especially now there's a Windows on ARM with proper backwards compatibility.
(although something is very odd with drivers on Windows-ARM, if anyone knows the full story on how to get .inf based 'drivers' working it would be genuinely helpful)
Windows on ARM is still largely ignored, everyone on the consumer level is more than happy with current Intel/AMD offerings.
Every single attempt to sell Windows ARM systems has been more or less a flop, including the recent CoPilot+ PCs.
The Windows developer community also largely ignores Windows on ARM; there's rarely actual business value in supporting yet another ISA across development, CI/CD pipelines, and QA.
Only Apple gets to play the vertical integration game, the our-way-or-go-away attitude, and they are the sole survivor of home-computer vertical integration only because they got lucky when the banks were already knocking on the door.
> This is a very Apple-centric point of view.
Which also isn't great for Apple. I mean, they're lagging Microsoft now. We've all felt this coming, right? The M series was great, but it's hard to think of much innovation after Jobs. I mean... things got smaller and thinner? That's so exciting... now can we fix the very basic apps I use every day that have almost trivially fixable bugs?

In a way, Pantheon feels weirdly accurate. People not actually knowing what to do. Just riding on momentum and looking for the easiest problem to solve (thinner, and extracting more money from those making your product better) because the concern is next quarter, not next year, not the next five years. What's the point of having "fuck-you money" if you never say "fuck you"?
They shouldn't be. Apple's chips changed the game so much that it was a no-brainer for me to choose them when I bought a new laptop - PCs just couldn't compete with that compute and battery life. Anyone with a decent enough budget is not even considering Windows.
I don't think any power user will be happy with Intel/AMD any more.
https://www.alltechnerd.com/amd-captures-17-more-cpu-market-...
> Despite Intel still holding the lead with 56.3% of systems validated through CPU-Z, AMD is closing in, now claiming 43.7% of the market.
That's probably the strongest misstatement I've heard this week. If anything, AMD has been the x86-64 leader for several years now.
Why are you thinking AMD aren't winning?
https://www.tomshardware.com/pc-components/cpus/amd-records-...
I wouldn't say that Microsoft's Prism x86 to ARM translation layer has been anywhere near as successful as Rosetta was at running the majority of legacy software on day one.
Prism is improving, but the Windows on ARM reviews are still bringing up x86 software that didn't work at all.
Dealing with MCUs for projects, RISC-V Espressif chips and boards are no-brainers now; I buy big bags of ESP32 boards from Seeed. I get some free ARM boards at work, which are neat - I always love playing with MCUs - but they're relatively power-hungry and expensive without a lot to show for it. I'm either using a ~$6 ESP32 board or a ~$1 ATTiny in a DIP package for home/fun projects. ESP32s are starting to show up in consumer electronics I find, too, along with the relatively pared-down ESP8266s which I'm not as fond of, though I can still flash them easily over USB-TTL at least, so whatever.
In the SBC space, ARM is competing with x86. RISC-V exists but only really for enthusiasts. RISC-V may start making inroads here soon. I picked up some Radxa Rock 2F boards (using ARM-based Rockchips) for ~$12 shipped a few months ago, they run Debian, and these have been fantastic for projects (though now ~impossible to source the cheap 1GB variant of). It's difficult to imagine it being worth getting involved in this nightmarishly competitive space, though obviously some still do. Most seem to try finding some obscure niche to justify a high markup.
In many workloads, it's more the GPU that matters. I need an MMU, a PCIe slot, and driver support. Most of us don't really need these outlandishly complex and CPU-centric $100+ ATX motherboards, or even CPU/RAM sockets/slots; just solder it on. -Like, how often do people even upgrade the CPU on a motherboard anymore? I'm more liable to throw the whole thing out because it doesn't have any 10PB/s 240GW USB9 quantum ports, so cut materials, decrease surface area, lower cost, and make it disposable.
At least in R&D, from the angle I saw it. Clearly, being stingy wasn't a universal problem: heavy buybacks, ludicrous M&A (in foresight and hindsight), and that $180k average salary in the article sounds completely divorced from the snapshot impression that I got. I don't know what gives, was R&D "salary optimized" to a degree that other parts of the business weren't? Did the numbers change at some point but the culture was already rotten and cynical? Or did I see noise and mistake it for signal? Dunno.
In another world I'd love to have been part of the fight to make 10nm work (or whatever needed doing) rather than working on something that doesn't fully use my skills or in my private opinion contribute as much to humanity, but my employer pays me and respects my time and doesn't steer their business into every iceberg in the ocean, and in the end those things are more important.
In R&D management, this is an extremely well-known problem with an extremely well-known solution: use the oversupply to be selective rather than cheap. The fact that they chose to be cheap rather than selective is managerial incompetence of the highest order. They had one job, and they blew it. "Selective" doesn't even mean that the rating system has to be perfect or even good, it just has to equilibrate supply and demand without shredding morale. Even a lottery would suffice for this purpose.
The mountain of money for Intel has always been in server chips, as those are their high-margin parts. While they make a lot of money on consumer laptops and desktops, it's nowhere near the amount they have traditionally made on their server-oriented chips.
I don't think Intel is likely to come out of this state without something extremely radical happening, and every time they try to do something that could be radical it never has enough time to gestate to work, it always ends up abandoned.
Monopoly and Bureaucracy. That is basically what government is. It is kind of sad reading Intel was like that even in 2005.
Arithmetic Shift Right? (I kid, of course, but seeing a team name that _might_ correspond to an assembly instruction, in a post about Intel, amused me.)
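(Continuing the joke: ASR is the ARM mnemonic; x86 spells it SAR. What distinguishes it from a logical shift is sign extension, which a few lines of Python can illustrate, since Python's `>>` on its native ints happens to behave arithmetically. The helper names here are just for illustration:)

```python
def asr(x: int, n: int) -> int:
    """Arithmetic shift right: the sign is preserved (rounds toward -infinity)."""
    return x >> n  # Python's >> on ints is already arithmetic

def lsr8(x: int, n: int) -> int:
    """Logical shift right on an 8-bit value: zeros shift in from the left."""
    return (x & 0xFF) >> n

print(asr(-8, 1))   # -4   (sign bit extended)
print(lsr8(-8, 1))  # 124  (-8 is 0xF8 in 8 bits; 0xF8 >> 1 = 0x7C)
```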
I didn’t end up taking the job.
I never really knew what happened to that division.
This is the big risk we all took when we moved away from the Bay Area to work remotely. You arbitrage the COL difference and come out ahead big time, but it might be very hard to make the same salary locally if you can't find a remote job.
Best to make some hay while the sun is shining.
I've heard on this forum of a tactic Intel employed where they broke off some people into a subsidiary, dissolved the subsidiary, and then offered to rehire them with the caveat: Oops, the pension you were promised is now gone. Then Intel's foundry business started failing. Oops!!
Sounds pretty empathetic to me. I’m guessing he also has empathy for Wall St and his shareholders. Ultimately Intel has no choice but to either grow or downsize and the former hasn’t materialized. They’re losing market share and revenue and if they keep that up they will be empathizing with their creditors and the bank.
Snark aside, did Intel management take any cuts, even symbolic ones to show they are in it together?
The newer "web 2.0" companies (and I mean even Google and Amazon) opened shop in more affluent places.
The "older" companies were manufacturers. Even places like Mountain View and San Jose were the working-class towns with HP factories and semiconductor plants. The concentration of engineering talent (HP/Intel/Apple/Atari) is what created the affluence, especially after manufacturing itself was outsourced globally.
The newer Web 2.0 companies don't make physical things; they make software. Their most critical infrastructure isn't a factory but a dense network of developers. They go to the Bay Area, Seattle, etc., because that's where the network is. For the parts of their business that don't require that network, like customer service, they locate in less expensive regions, just as PayPal did with Nebraska. They were even the second largest employer in Nebraska iirc.
But modern skilled workers know how risky it is to put down roots in a place where they only have a couple employment options. So companies struggle to attract talent to remote areas and end up needing to hire in places that already have an established pool of skilled labor, which is typically in the cities and more affluent areas of the state or country.
In this case, the lack of employment options means many of the engineers laid off by Intel will end up needing to uproot their families' lives and move to a new city or state to find a new employer who can pay for their skills.
I was contracting out at Intel Jones Farm campus in Hillsboro in 2004 and I'd walk around the (then) new neighborhood there by the campus and I distinctly recall thinking "What if something were to happen to Intel in, say 25 or 30 years? What would happen to these neighborhoods?" It was just kind of a thought experiment at the time, but now it seems like we're going to find out.
The $180k figure is also inflated. Most folks being laid off don't make over $100k.
No, it's not. Not even close.
They were getting paid "California salaries in Colorado" (well, really Massachusetts salaries but popular sayings don't have to be completely accurate) and lots of people had virtual mansions on senior tech salaries (plus probably stock options?).
Then DEC imploded and there were almost no other options for hundreds of storage engineers. Knew a lot of people who had their houses foreclosed because so many were flooding the market at the same time.
I suspect most of those folks did not "come from" the bay area in the first place.
Overall my 5000 ft view, was the culture was very different from FAANG or a Bay Area Tech company. If the Bay Area approach is high ownership and high accountability, Intel was much more process driven and low ownership. They even tracked hours worked for engineers in Oregon.
I think it speaks to common challenges when hiring managers are disconnected from the work, degrees and resumes are worthless, and turnover is difficult.
In many companies, team leads don't have a role in hiring or firing the employees working for them.
The sad thing is they acquired the Basis smartwatch and destroyed it, leaving only Garmin as a developer of dedicated activity trackers. I considered getting a Basis and am obviously glad I didn't.
But Apple bought the company recently. I worry that whatever made the product great will go away post acquisition. Whether or not Apple keeps working on it at the same level of quality is anyone's guess. Or maybe they'll integrate the best features into their free Photos app and ditch the rest. Or something else entirely.
I can't think of any examples where acquisitions make a product better. But dozens where the product was killed immediately, or suffered a long slow death.
With Apple it's harder for me to know. How do former Dark Sky users feel about the Weather app? I think it has all the features? How about Shazam, which I never used before it became an iOS feature? TestFlight retained its identity. Beats by Dre headsets did too, though Beats Music I think became Apple Music in a way.
There are many acquisitions that lead to better products.
Hedge funds also hire physicists and mechanical engineers
James Hamilton the "mechanic"... with EE and CS degrees and time at IBM and Microsoft. Dave Clark the "musician" (undergrad)... and an MBA focused on logistics. Jeff Wilke the "chemist"... who worked on process optimization at Honeywell and on supply chains at Andersen.
So sure, you might as well say DeSantis is an SDE intern figuring out software deployments, Vosshall is an amateur aircraft EE, or Marc Brooker is some foreign radar engineer.
Signed, some newspaper dude who was an AWS PE doing edge networking and operations.
It maps 1:1 with the computer science, but chemical engineering as a discipline has more robust design heuristics that don't really have common equivalents in software, even though they are equally applicable. Chemical engineering is extremely allergic to any brittleness in architecture, since there that's a massive liability, whereas software tends to just accept it because "what's the worst that could happen."
Graph theory originated in Chemistry. Not Computer Science.
Musicians know harmonics and indirectly lots of cyclical travel stuff. And waves.
The good car mechanics I know are scary smart.
also I was sorta laid off by the current Intel CEO from my last startup!
Also laying off incompetent managers alone won't solve the problem of having hired the wrong people
In comparison:
Nvidia 36,000
AMD 28,000
Qualcomm 49,000
Texas Instruments 34,000
Broadcom 37,000
It is obvious that Intel is ridiculously overstaffed.
TSMC is a fab, not a chip designer. And NV makes GPUs and small scale SoCs like the ones in the Nintendo Switch and automotive (IIRC the Tegra SoC that powered the Switch 1 literally was an automotive chip that they repurposed).
That's quite different from what Intel makes: CPUs that power much of the world's compute capacity for laptops, PCs, and servers; wireless chips (Bluetooth + WiFi); their own GPU line...
The only true comparison is TSMC, but it only does chip manufacturing, not chip design/development.
So Nvidia + TSMC would probably be a fair comparison.
It felt to me like the people at the top were clueless, and so were hoping these hires would help give them an idea which direction to steer the ship.
Of course, mostly what he found was how out of touch the executives at Xerox were with what their employees were actually doing in practice. The executives thought of the technicians who repaired copiers almost as monkeys who were just supposed to follow a script prepared by the engineers. Meanwhile, the technicians thought of themselves as engineers who needed to understand the machines in order to be successful, so they frequently spent hours reverse engineering the machines and the documentation to work out the underlying principles on which the machines worked. The most successful technicians had both soft skills for dealing with customers and selling upgrades and supplies, and engineering skills for diagnosing broken hardware and actually getting it fixed correctly. It seems that none of the sales, engineering, or executive staff at Xerox liked hearing about any of it.
Yes, I remember contracting at Intel in 2006 and the Anthropologists were at one end of the building we were in. Their area was a lot different than the engineering areas. Lots of art, sitting around in circles, etc. I remember asking about what was up over there "Those are the anthropologists".
https://www.nteu.au/News_Articles/Media_Releases/Staff_lose_...
I didn't use Word to create my resume and if they can't deal with a PDF that was their problem.
> Probably so they can make changes behind your back
Nope, I don't consent to that.
> Or, less cynically, so they can more easily copy/paste stuff into their HRM tool
Their HRM tool should support PDFs if they are competent. They should also be able to read my resume with their own eyes. If not I consider the company not a good fit for me.
They missed out on buying Nvidia, and in the last five years they netted $30B but also spent $30B on stock buybacks. So they could still have $30B, but they chose to prop up their stock instead.
All of those workers will move. There aren't any jobs in the Portland area. Downtown is vacant and still expensive and the startup scene has dwindled.
I am curious about this, moving out of Portland -> Seattle myself. For software I see it, but for hardware, it feels like there's a kind of inertia / concentration that still benefits staying and fixing. It seems like shedding a large chunk of their workforce is on the path to righting the ship. It also feels like chips are too important an asset to discard. I'm skeptical they'll merely bleed out, even if the current phase is quite chaotic. Also frankly Portland area doesn't have enough high tech careers to replace them (and the income tax that goes with it), I feel the state would likely incentivize them staying / growing almost whatever it takes.
This is all a hot take with little insight, other than being a tech person currently living in the Portland area.
Those who liked it stayed at Intel because it is the only company that literally operates at every level of the tech stack.
From sand to JSON.
When I think of people that went into Tech 20+ years ago, this choice of work was a vocation. Not saying they were all pleasant, but they were all largely invested.
At some point Tech became a safe, lucrative profession, for people who say things like 'life is more than work. Nobody is required to like what they do.', like the managers from Intel.
(J.R.R. Tolkien, The Silmarillion)
The difference is that the psychologists and the philosophers agree with me over the long term. Being work-obsessed at age 40+ when you have other aspects of life worth exploring is simply mental illness.
Did you ever even consider that such people have other things to do?
Yes, I understand the argument that Intel management screwed up for too long and this is the market at work, but that ignores the geopolitical risks of what we're going to end up with. Forming some kind of consortium to keep Intel fabs running (and new ones built) could also include completely changing the management of the company.
I don't buy this. I think the primary problem was mismanagement especially in the 2008 to 2020 timeframe. Too many bean counter CEOs during that period who did not understand the need to constantly invest in SOTA fabs.
Ergo policy should have been that X percent of chips be made on US shores. Wups
They are not.
The CHIPS Act was a whole lot of hot air. It passed in 2022, and Intel did not receive any money from it until the end of 2024.
Intel is also likely going to lose hundreds of millions in incentives from Oregon for failure to meet hiring objectives, but they have a while to do that.
> China will take the reigns in 2027
As I understand it, the best fab tech is TSMC (Taiwan) and Samsung (Korea). Do you really expect China can surpass both in only two years? It seems unlikely, as they don't have access to high-end fab equipment from ASML.

Edited to add: this was not the point being made, I am aware. Just my thoughts on the matter.
It's not unprecedented. When a company's business contracts, shrinking is exactly the right thing to do, not to mention that it's forced on them anyway.
"The Global Data Center Chip Market size is expected to be worth around USD 57.9 Billion by 2033, from USD 14.3 Billion in 2023, growing at a CAGR of 15.0% during the forecast period from 2024 to 2033."
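For what it's worth, the arithmetic in that forecast is internally consistent: $14.3B compounding at 15% per year over the ten years from 2023 to 2033 lands almost exactly on $57.9B. A quick sanity check:

```python
# CAGR sanity check on the quoted forecast (USD billions).
start, cagr, years = 14.3, 0.15, 10  # 2023 baseline, 15% CAGR, 2023 -> 2033
end = start * (1 + cagr) ** years
print(round(end, 1))  # 57.9, matching the quoted 2033 figure
```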
He even stated the following in "Only the Paranoid Survive": One, don’t differentiate without a difference. Don’t introduce improvements whose only purpose is to give you an advantage over your competitor without giving your customer a substantial advantage. The personal computer industry is characterized by well-chronicled failures when manufacturers, ostensibly motivated by a desire to make “a better PC,” departed from the mainstream standard. But goodness in a PC was inseparable from compatibility, so “a better PC” that was different turned out to be a technological oxymoron.
One might think Itanium goes against that.
"Compilers just need to keep up" was Intel's marketing apologia, not reality.
0:26:21 BC: But that was lost after Andy left. That was lost; that part of the culture went away.
0:26:27 PE: Who succeeded him?
0:26:28 BC: Craig Barrett.
0:26:29 PE: Right. Were you still there when that happened, when did Grove leave?
0:26:34 BC: Grove stopped being the president in January 1998.
0:26:40 PE: Yes.
0:26:41 BC: And that's when Craig Barrett took over.
0:26:43 PE: And what changed at that point?
0:26:46 BC: Well, Craig's not Andy, I mean he had a different way of thinking and doing things. Craig, I don't want it to sound cynical, but I always sound cynical when I talk about him because I had such a bumpy relationship with him. I don't think he felt like he needed anything I could tell him, and it wasn't just me, I wasn't taking this personally. I never once got the same feeling I got with Andy that my inputs were being seriously and politely considered, and then a decision would be made that included my inputs.
0:27:21 PE: Yes.
0:27:22 BC: That never happened. Instead, for example, five Intel fellows including me went to visit Craig Barrett in June of '98 with the same Itanium story, that Itanium was not going to be able to deliver what was being promised. "The positioning of Itanium relative to the x86 line is wrong, because x86 is going to be better than you think and Itanium is going to be worse, and they're going to meet in the middle. We're being forced to put a gap in the product lines between Itanium and x86 to try to boost the prospects for Itanium. There's a gap there now that AMD is going to drive a truck through. What do you think they're going to hit? They're going to go right after that hole," which in fact they did. It didn't take any deep insight to see all of these things, but Craig essentially got really mad at us, kicked us out of his office, and said (and this is a direct quote) "I don't pay you to bring me bad news, I pay you to go make my plans work out."
0:28:22 PE: Gee.
0:28:25 BC: So.
0:28:25 PE: Yeah, he's the polar opposite.
0:28:26 BC: So at that point he stood up and walked out, and to the back of his head I said, "Well, that's just great, Craig. You ignored the message and shot the messengers. I'll never be back. No matter how strong a message I've got that you need to hear, I'll never bring it to you now."
0:28:38 PE: Yeah.
0:28:40 BC: It's not rewardable behavior. It was sad, a culture change in the company that was not a good one, and there was no way I could fix it. If it had been Andy doing something that I thought was dumb, I'd go and see him and say "Andy, what you're doing is dumb," and maybe I'd convince him, maybe I wouldn't. But as soon as you close that door, it is a dictatorship. You can't vote the guy out of office anymore, you can't reach him. There's no communication channel.
I keep wondering why Andy didn't see that before nominating him?
Smells like corporate bulimia.
When I worked/lived in the Bay Area there was a sense that corporations, and residents of the Bay Area, were moving to Oregon because it was cheaper … but still close enough to Silicon Valley. (Apropos of nothing really.)
If companies have extra cash on hand, don't we want them to invest it and hire? The alternatives are stock buybacks or just sitting on the cash.
Obviously every bet is not going to pan out, but hiring even on the margin is probably good.
No. Hiring should be a long-term strategic investment, not something you do whenever you have extra cash lying around. If you needed the extra people you should have been trying to hire them already, and if you don't then you shouldn't hire them now.
If I'm a shareowner, if the company doesn't have any intelligent ideas on how to spend my money, they should send it back to me as a dividend, or buy me out (share buyback).
Please don't waste my money trying to build some immortal empire as a shrine to the CEO's ambition.
When corporations just invest because they have money, there is a gigantic agency problem, and executives have a tendency to burn shareholder value on vanity projects and fancier headquarters.
Stock buybacks are exactly what I want wealthy companies to be doing with money they don't have a high expected ROI for.
* they've done about $152B in stock buybacks since 1990 https://www.intc.com/stock-info/dividends-and-buybacks. I think... ~$108B in the last decade.
* during the same time period they fell behind TSMC and SEC in semiconductor fab, missed the boat on mobile (couldn't really capture the market for either smartphone or tablet CPUs), and are missing the boat w/AI training https://www.hpcwire.com/2025/07/14/intel-officially-throws-i...
Discussion of Intel's buyback behavior as excessive and wasteful was also picked up on during all the discussion of CHIPs subsidies last year: https://news.ycombinator.com/item?id=39849727 see also https://ips-dc.org/report-maximizing-the-benefits-of-the-chi...
I think we've become too complacent/accepting of corporations just laying off employees with what amounts to a shrug.
But big picture I disagree. We kind of need creative destruction in an economy - we need to be able to lay off people in horse-buggy industries so that they can be hired to make Model T's. We're better off focusing on our social safety net and having a job market that encourages some amount of movement between careers.
Treating the employer/employee relationship like some life-long commitment sounds like pure hell. It is a transaction. I don't want it to be anything more than that.
It does, though, doesn't it? Divorce is so common that marriage no longer has the permanence you imply it does.
It is another significant flaw in the "capitalist" (i.e., publicly traded corporate) system: it incentivizes all the various financial shenanigans that generate false stock performance to enrich the C-suite.
It's a different state and a 9-10 hour drive away; in what sense is it close?
Note that these were NOT executive jets for C-suite, these were for all employees who had meetings at other locations (at least according to people I've met since I moved to AZ a few years ago to be near my in-laws).
I don't doubt you but most people would not find a 2-hour one-way commute pleasant.
Different strokes for different folks.
Definitely not close as in "commute close".
Maybe more like "close to feeling the same as the Bay Area"?
(You can believe Portlanders hated Californians that moved up there. Or so I've been told.)
They still do, only it's not really Portlanders anymore; it's all the smaller cities that hate them. Why? A couple of reasons: they came in and paid over asking price for housing, driving up prices across the board, so those working for local non-conglomerates have a hard time affording housing. And then they vote contrary to how the locals do (locals, I might add, who didn't have any problem with how things were run before, even if their "betters" felt they were "backwards").
Basically, they end up burying the local culture and replacing it with California.
Also, no sales tax!
PS – as someone who spent hundreds of hours on Glider PRO as a kid, thank you!
The asserted problem: labor force/expense is too high, or at least higher than is now thought necessary.
The (IMO) core problem: measuring professional success/skill primarily by the size of the team a person manages.
The asserted solution: AI replacing labor to reduce inflated labor costs/pools.
While there is some inherent benefit to reducing team sizes back down into allegedly functionally sized units, there is a lack of accountability and understanding as to why that's beneficial. It seems to be done either because of the lofty promise of AI (which I'm critical of), or out of a more brutalist/myopic drive to make the big labor-cost number smaller to increase margin/reduce expenses. To be clear, while I'm a critic of AI, I fully acknowledge it can absolutely be helpful in many instances. The problem is that people are learning the wrong lessons from this, because they've misidentified the issue and why the force reduction allegedly works/appears to be working.
Obviously, YMMV on a case-by-case/team/company basis, but Intel is known for being guilty of "Bigger = Better" when it comes to team size, and their new CEO acknowledged this somewhat with their "Bureaucracy kills innovation" speech [0].
That said, what may be good for the company (even if done for the right reasons) can still hurt the communities it built up and that now depend on it.
0: https://www.inc.com/kit-eaton/in-just-3-words-intels-new-ceo...
Just off the top of my head I can think of a dozen or so.
They're just not in tech for the most part. I can think of 2 breweries, a standardization company, two machine shops. And those are just within a mile or so of my house.
I think a lot of folks work at Intel in order to get out of tech. That's basically what I'm doing, lol. Work enough to save and get out of tech.
Tech is also expensive to start up in. So it makes sense that a lot of the intel-driven businesses would be non-tech.
"Ex-Intel executives raise $21.5 million for RISC-V chip startup":
https://www.aheadcomputing.com/
I believe the founding team is all in Oregon - and mostly all ex-Intel.
https://www.oregonlive.com/silicon-forest/2025/06/top-resear...
Those are two of the biggest economies in the world pumping loads of money and people into that engine to try to get it started, and they are still just starting to make some progress. That's not to say there aren't great ideas out there that we're missing which could make it all easier and cheaper, but a small team is definitely going to have one hell of a time making stuff at the nanometer scale without billions and billions of dollars behind them. Software startups are easy, but hardware is hard. Massive hardware and microscopic hardware are harder.
I'm not saying it _should_ or _must_ be that way, just that it is.
https://www.littler.com/news-analysis/asap/california-reache...
Has this been tested? Why would an Oregon court care about what a California law says it can and cannot do?
> Intel CEO says it's "too late" for them to catch up with AI
Everyone non-technical was hired. Everyone with a strong ability was seen as difficult, and kicked out.
Housing costs in the Bay Area are soul-crushing, but they do motivate people to work on the highest value projects because complacency just doesn't usually work if you're trying to buy a house. And so I wonder, if Intel had kept their workforce mostly in California, could they have stayed a dominant force in computing?
Also, Oregon is a terrible state to invest in as a business, especially one that is looking to pay high salaries.
Intel was in the business of selling the most cutting edge, technologically advanced products in the world, but they didn’t want to pay enough for the best people.