There are a bunch of teams there with three-letter acronyms whose origins have been totally forgotten. Like, nobody knows what LTQ or ASR stand for, or what purpose those teams serve. When you're an intern, you tend to think that the higher-ups know what they're doing, but if you ask for an explanation, you will soon conclude that they don't know either.
People were not working hard enough. At the time, Intel's dominance was supreme. They should have been picking up on niche ideas like GPUs and mobile chips; it would have been cheap and adjacent to what they already had. Instead, all I heard at the meetings was laughing at the little guys who are now all bigger than Intel. Even my friend in the VC division couldn't get the bosses to see what was happening. People would spend their whole day just having coffee with random colleagues and making a couple of slides. It's nice to relax sometimes, but when I was there it was way too much of that. There was just way too much fat in the business.
I still have friends there who stayed on. They tell me not to come, and are now wondering how to do the first job search of their professional lives. A couple have moved very recently.
It's very odd that the guy who was famous for saying what upper management should do (set culture) ended up building a culture that has completely failed.
I knew a lot of people who got jobs like this after college. I was so very jealous at the time. I was working in a company that was nice, but also wasn’t afraid to tell people when they weren’t meeting expectations. Some of my friends were at companies where they weren’t expected to “ramp up” for the first year. One person I know read “The Four Hour Work Week” and talked his company into letting him work remote, then started traveling the world. He would brag that his entire job was “telling the engineers what to do” and it took him an hour a day because he did it all through email in one sitting.
Years passed, economies evolved, and now it's harder to get a job. Companies start looking for dead weight and discover people doing jobs that barely contribute, if at all.
A tech company near me looked at their VPN logs (required to interact with their internal services and do any dev work) and discovered a lot of engineers who were only connecting a couple times per month.
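Purely as an illustration (the log format, field names, and threshold here are all invented), that kind of audit amounts to counting distinct connection days per user:

    # Hypothetical sketch: count distinct VPN-connection days per user over a quarter.
    # Log format and field names are invented for illustration.
    import csv
    from collections import defaultdict

    days_seen = defaultdict(set)
    with open("vpn_connections.csv") as f:  # assumed columns: user, timestamp (ISO 8601)
        for row in csv.DictReader(f):
            days_seen[row["user"]].add(row["timestamp"][:10])  # keep just YYYY-MM-DD

    # Flag anyone who connected on fewer than ~2 days a month over the window.
    for user, days in sorted(days_seen.items(), key=lambda kv: len(kv[1])):
        if len(days) < 6:
            print(f"{user}: connected on only {len(days)} distinct days")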
By then it’s hard to turn it around. It’s not easy to restore urgency in people who have become so comfortable not working that the entire idea of it is a foreign concept. Ask for some task that should only take an hour or two and they’ll say they’ll have it by early next week. Any request turns into a series of meetings, which have to be scheduled with all participants, which means they can’t start discussing it until Bob is back from vacation next week, so they might have an idea of what’s required by end of month.
At some point you can’t turn it around without making big changes to the people involved. There’s too much accumulated inertia and habit. You have to reorg at minimum and bring in new management, while also making it clear to everyone that their performance is now actually being noticed. It’s hard.
With Intel, I’ve also heard from some ex-employees who left because pay was lagging. Companies with low expectations can feel like they’re getting away with low pay because many people will keep an easy job despite the low pay. It masks the problem, for a while.
I think some people with 'cushy' jobs never take on this mentality, perhaps overestimating the security of their current job. “Telling the engineers what to do” is not a good starting point in an interview, and the answers to the follow-up questions had better be pretty detailed and convincing.
I also interviewed someone like this; we were both a year out of school, same major, roughly the same job title, at our first post-undergrad jobs. I was thrown into the deep end and learning a lot. He was buying software licenses. I commend him on sticking it out for a bit, but also on realizing it was a bad fit.
It sounds like you blame their own lack of effort for losing their jobs. Like, if they had worked harder, it wouldn't be them on the line.
But the reality is, they did not let the corporations take advantage of them. They turned the tables, had good work-life balance, and got paid for it. Yes, maybe it cost them their job. But at the same time, they had one for years, and for many people that would have meant they were ready for a change anyway.
Ultimately, happiness is a personal measure; what fulfills you is your own affair, and the way the people you describe worked may not be to your preference. But it does not sound like they made a poor choice.
I worked my ass off for 20 years. I'm an expert in the field I work in, but when I had been skipped for raises three years running, I said fuck it and put my personal life in front of everything else. I wake up when I want, start my work when I want, work way less than I should. I still don't get a raise, but all my peers and my manager continue to tell me what a great job I do. Now I'm slacking hard, but why should I feel bad when hard work is not valued? That my boss and peers are happy is a positive thing, but I would not concern myself much if they were less so.
I think the thing that's not obvious to young people is that choices that seem good at any given time may turn out to be poor choices further down the line. The guy who traveled the world while working one hour a day telling engineers what to do over email probably had a great young adulthood. It sounds like he paid for it later, though, by getting laid off and having difficulty finding another job.
This doesn't mean that those who worked their asses off didn't get screwed over, but on average they probably did better professionally - and by proxy, financially.
It’s one thing if someone is iron willed enough to make productive use of their new free time.
It’s different if they use it to play video games and sleep.
Most people, if left to their own devices, will do the latter.
We can say what we want about a hard, challenging job, but it forces us to work and learn. Thus, at the end of it, we have the benefit of that working and learning.
The better question is not “How little work can I get away with doing?” but rather “What will I have at the end of this work?”
Is it? Everywhere I've worked, upper management talks big about the culture, but their talking points are rarely applied to the company.
Like when Facebook says something like "we value your privacy"
Sort of like that Twilight Zone episode. The aliens come and convince us they are here to serve man. "Here, if you don't believe us look at our book called 'To Serve Man.'"
Finally one of the humans translates it and discovers it's a cookbook.
https://en.m.wikipedia.org/wiki/To_Serve_Man_(The_Twilight_Z...
Or worse, where I am their talking points about the culture they want ONLY applies to the company and not themselves. (In-office requirements, how the office is laid out, etc.)
I left because I was working on a machine learning project that was a "solution in search of a problem;" and I spent too much time working alone. I was very early in my career and felt like I just wasn't learning enough from my peers.
Overall, I felt like Intel was a positive experience. I do think their biggest problem was that they had too many lifers and didn't have enough "healthy turnover." Almost everyone there started at the beginning of their career, and thus everyone who was mid-to-late career didn't understand what the rest of the industry was doing.
They are the poster child for "we have a monopoly so we don't have to innovate or even maintain competence". Mind you, how much worse must things be at AMD that they're not winning the x64 war? Eventually the "PC" market is going to get run over by ARM like everything else, especially now that there's a Windows on ARM with proper backwards compatibility.
(although something is very odd with drivers on Windows-ARM, if anyone knows the full story on how to get .inf based 'drivers' working it would be genuinely helpful)
Windows on ARM is still largely ignored, everyone on the consumer level is more than happy with current Intel/AMD offerings.
Every single attempt to sell Windows ARM systems has been more or less a flop, including the recent CoPilot+ PCs.
The Windows developer community also largely ignores Windows on ARM; there is rarely actual business value in supporting yet another ISA across development, CI/CD pipelines, and QA.
Only Apple gets to play the vertical integration game, the our-way-or-go-away attitude, and they are the survivors of home computer vertical integration only because they got lucky when the banks were already knocking on the door.
> This is a very Apple-centric point of view.
Which also isn't great for Apple. I mean, they're lagging Microsoft now. We've all felt this coming, right? The M series was great, but it's hard to think of more innovation after Jobs. I mean... things got smaller/thinner? That's so exciting... now can we fix the very basic apps I use every day that have almost trivially fixable bugs?

In a way, Pantheon feels weirdly accurate. People not actually knowing what to do. Just riding on momentum and looking for the easiest problem to solve (thinner, and extract more money from those making your product better) because the concern is next quarter, not next year, not the next 5 years. What's the point of having "fuck-you money" if you never say "fuck you"?
They have plenty of money to burn, but unless they make their systems more affordable to the common man who doesn't live on a tier-1-country salary, they will eventually become the iPhone/iPad company.
There is no longer Apple hardware for servers, and given the way the Mac Pro has been dealt with, it is clear that the professional desktop workstation is also not a market they care about any longer, if the only PCIe slots on the Studio are for audio cards.
So it doesn't matter how great the M chips are if they don't have products in their portfolio that people care about buying: the world instead runs Windows/Linux/BSD systems on servers, and mostly Windows on consumer hardware (70% worldwide market share).
Those are my numbers.
If you prefer something more official,
https://www.pcworld.com/article/2816617/microsofts-copilot-g...
Those still exist?
Media Market, Carrefour, Publico, Worten, Cool Blue, FNAC,...
They shouldn't be. Apple's chips changed the game so much that it was a no-brainer for me to choose them when I bought a new laptop - PCs just couldn't compete with that compute and battery life. Anyone with a decent enough budget is not even considering Windows.
I don't think any power user will be happy with Intel/AMD any more.
As for laptops, maybe when there is something ARM-based able to compete with Razer laptops for e-sports.
Snapdragon chips ain't it.
Uhm, no. This is so totally dependent on your use case. I use my home box MOSTLY for gaming; it's just better on Windows. I also want a box I can upgrade. I never need to carry it with me.
Apple isn't even in the consideration space for me for that.
For work I don't have a choice, but the M[1-4] machines are _good enough_ for that; the battery life is nice, but I'm not mobile that often. I don't use its screen, mouse, or keyboard, so don't care there. The OS is close enough to familiar unixen that I like it fine, but WSL2 on Windows or native linux would be MORE THAN FINE, and closer to my deployment environment so would be better at way less cost, but our IT dept. doesn't want to support it so I get what I get.
Don't you mean on x86?
Windows on ARM is no more suitable for running legacy x86 games at full performance than anyone else's OS on an ARM chip.
https://www.alltechnerd.com/amd-captures-17-more-cpu-market-...
> Despite Intel still holding the lead with 56.3% of systems validated through CPU-Z, AMD is closing in, now claiming 43.7% of the market.
The Core Ultra CPUs are an absolute joke for gaming, often being beaten even by the 14th gen CPUs. The Core Ultras had a major performance regression in L3 cache speed which destroyed gaming performance.
Games love large cache sizes. The Ryzen 9700X has the same number of cores and, if anything, a faster boost clock than the 9800X3D, yet the 9800X3D comes out on top purely because it has 96 MB of L3 cache compared to 32 MB. If Intel had put out an i9-14900K with 96 MB of L3 cache, it'd probably come out on top.
That's probably the strongest misstatement I've heard this week. At least, it seems AMD has been the x86-64 leader for several years now.
Why are you thinking AMD aren't winning?
https://www.tomshardware.com/pc-components/cpus/amd-records-...
I don't think I even know anyone (including in businesses) who buy Intel for anything any more, and it's been that way for a few years.
I think Epyc will get there within a few years, it has great momentum and Intel's response has been pretty weak.
EDIT: Though I'm leaning towards satire.
I wouldn't say that Microsoft's Prism x86 to ARM translation layer has been anywhere near as successful as Rosetta was at running the majority of legacy software on day one.
Prism is improving, but the Windows on ARM reviews are still bringing up x86 software that didn't work at all.
Dealing with MCUs for projects, RISC-V Espressif chips and boards are no-brainers now; I buy big bags of ESP32 boards from Seeed. I get some free ARM boards at work, which are neat - I always love playing with MCUs - but they're relatively power-hungry and expensive without a lot to show for it. I'm either using a ~$6 ESP32 board or a ~$1 ATTiny in a DIP package for home/fun projects. ESP32s are starting to show up in consumer electronics I find, too, along with the relatively pared-down ESP8266s which I'm not as fond of, though I can still flash them easily over USB-TTL at least, so whatever.
In the SBC space, ARM is competing with x86. RISC-V exists but only really for enthusiasts. RISC-V may start making inroads here soon. I picked up some Radxa Rock 2F boards (using ARM-based Rockchips) for ~$12 shipped a few months ago, they run Debian, and these have been fantastic for projects (though now ~impossible to source the cheap 1GB variant of). It's difficult to imagine it being worth getting involved in this nightmarishly competitive space, though obviously some still do. Most seem to try finding some obscure niche to justify a high markup.
In many workloads, it's more the GPU that matters. I need an MMU, a PCIe slot, and driver support. Most of us don't really need these outlandishly complex and CPU-centric $100+ ATX motherboards, or even CPU/RAM sockets/slots; just solder it on. Like, how often do people even upgrade the CPU on a motherboard anymore? I'm more liable to throw the whole thing out because it doesn't have any 10PB/s 240GW USB9 quantum ports, so cut materials, decrease surface area, lower cost, and make it disposable.
At least in R&D, from the angle I saw it. Clearly, being stingy wasn't a universal problem: heavy buybacks, ludicrous M&A (in foresight and hindsight), and that $180k average salary in the article sounds completely divorced from the snapshot impression that I got. I don't know what gives, was R&D "salary optimized" to a degree that other parts of the business weren't? Did the numbers change at some point but the culture was already rotten and cynical? Or did I see noise and mistake it for signal? Dunno.
In another world I'd love to have been part of the fight to make 10nm work (or whatever needed doing) rather than working on something that doesn't fully use my skills or in my private opinion contribute as much to humanity, but my employer pays me and respects my time and doesn't steer their business into every iceberg in the ocean, and in the end those things are more important.
In R&D management, this is an extremely well-known problem with an extremely well-known solution: use the oversupply to be selective rather than cheap. The fact that they chose to be cheap rather than selective is managerial incompetence of the highest order. They had one job, and they blew it. "Selective" doesn't even mean that the rating system has to be perfect or even good, it just has to equilibrate supply and demand without shredding morale. Even a lottery would suffice for this purpose.
The mountain of money for Intel has always been in server chips, as those are their high-margin chipsets. While they make a lot of money on consumer laptops and desktops, it's nowhere near the amount they have traditionally made on their server-oriented chipsets.
I don't think Intel is likely to come out of this state without something extremely radical happening, and every time they try to do something that could be radical it never has enough time to gestate to work, it always ends up abandoned.
Monopoly and Bureaucracy. That is basically what government is. It is kind of sad reading Intel was like that even in 2005.
Arithmetic Shift Right? (I kid, of course, but seeing a team name that _might_ correspond to an assembly instruction, in a post about Intel, amused me.)
I didn’t end up taking the job.
I never really knew what happened to that division.
This is the big risk we all took when we moved away from the Bay Area to work remotely. You arbitrage the COL difference and come out ahead big time, but it might be very hard to make the same salary locally if you can't find a remote job.
Best to make some hay while the sun is shining.
I've heard on this forum of a tactic Intel employed where they broke off some people into a subsidiary, dissolved the subsidiary, and then offered to rehire them with the caveat: Oops, the pension you were promised is now gone. Then Intel's foundry business started failing. Oops!!
Sounds pretty empathetic to me. I’m guessing he also has empathy for Wall St and his shareholders. Ultimately Intel has no choice but to either grow or downsize and the former hasn’t materialized. They’re losing market share and revenue and if they keep that up they will be empathizing with their creditors and the bank.
Snark aside, did Intel management take any cuts, even symbolic ones to show they are in it together?
The leadership has empathy for both sets and it's emotionally mind numbing.
The newer "web 2.0" companies (and I mean, even Google and Amazon) opened shop in more affluent places
The "older" companies were manufacturers. Even places like Mountain View and San Jose were the working-class towns with HP factories and semiconductor plants. The concentration of engineering talent (HP/Intel/Apple/Atari) is what created the affluence, especially after manufacturing itself was outsourced globally.
The newer Web 2.0 companies don't make physical things; they make software. Their most critical infrastructure isn't a factory but a dense network of developers. They go to the Bay Area, Seattle, etc., because that's where the network is. For the parts of their business that don't require that network, like customer service, they locate in less expensive regions, just as PayPal did with Nebraska. They were even the second largest employer in Nebraska iirc.
But modern skilled workers know how risky it is to put down roots in a place where they only have a couple employment options. So companies struggle to attract talent to remote areas and end up needing to hire in places that already have an established pool of skilled labor, which is typically in the cities and more affluent areas of the state or country.
In this case, the lack of employment options means many of the engineers laid off by Intel will end up needing to uproot their families' lives and move to a new city or state to find a new employer who can pay for their skills.
I remember the first time I was sent to the Bay Area for training. I was excited to see this City of Mountain View I'd heard so much about; to explore its city nightlife and enjoy the view of the mountain. My boss had to let me down gently :) "Mountain View in Europe would be called a village", he said.
I don’t know how quickly we’ll find the political will to break that since everyone who owns property in a city has a financial incentive to keep prices artificially high. Removing density restrictions helps by making redevelopment financially advantageous for individuals but the degree of uncertainty we have now is going to slow that down, too.
The big problem with changes like this (which I support, btw) is that the changes get immediately reflected in land prices, which means that you basically can only put the maximum number of units on the land, which tends to increase prices.
If you build enough, this doesn't happen but I don't think any western urban area is anywhere close to that point.
But when I think of “suburbia” I think of a series of housing developments, strip malls, and golf courses just off all major highways/roads. Cul-de-sacs as opposed to a grid pattern. Generally hostile to pedestrians getting from residential to commercial and business areas. Only part of the Valley is like this, mostly the richer areas toward 280, such as Los Altos, Portola Valley, and Cupertino.
I guess this is reflective of US/Europe suburbia. From my (Irish) perspective, the valley is clearly suburbia given the density. I'll never forget taking the caltrain from Palo Alto to SF and seeing basically low-density housing with sporadic strips of shops. That would be clearly suburban to me (but obviously other people's opinions will differ).
Plenty of large corporations have headquarters in suburbs (where the rich execs want mansions) but in a close enough commute to a major city where more of the employees want to live.
You don't need a big home space when any cafe can become your living room, any restaurant your kitchen, dining room and wait staff, and any park your professionally tended garden.
Your choice of entertainment, especially live entertainment, is mainly limited by your willingness to keep up with what's going on, and not by the sparse calendar of touring acts.
Metropolises are fantastic places to live, especially when you are comfortable spending money to expand your space on demand.
It doesn’t have to be this way, though. Tokyo is a vibrant metropolis that is also relatively affordable by global standards. The key to this is sensible housing policy that doesn’t inflate the cost of living to oppressive levels.
What keeps me in the Bay Area besides being tenure-track are proximity to family, the acceptance of multiculturalism, and (as an academic) California’s support for academia in a national political climate that has become hostile to academia. But financially I wish the price of rent, food, and other necessities weren’t so oppressive.
That's a very optimistic perspective, which I somewhat envy. It makes sense in a way, assuming your rent is negligible, but when you're paying out the ass for an apartment, having the privilege of paying short-term rent in the form of coffees and pretzels for a shared proper living room doesn't sound great...
Insane. Enjoy your pod. And all the crime too.
Maybe it is not the lifestyle for you, but I don't find it "insane" to want plenty of stuff available within walking distance. And you can't have that without high population density and, yeah, "pods".
> You don't need half an acre to sleep.
Where do I put my vegetable garden and fruit trees?
How can I relax on my back porch listening to the birds and creek?
How do I get away from all the hustle and bustle of dense city living?
> plenty of stuff available within walking distance. And you can't have that without high population density and, yeah, "pods".
Exhaust, brake dust, sirens, litter, concrete jungle, noisy neighbors with thin walls, massive crowds/traffic... there are tradeoffs to living in a dense city.
I think the Pareto optimal living conditions in the US today are the suburbs bordering the rural outskirts. Especially now with WFH prevalence.
Some cities provide shared gardens, but it is a niche thing.
> How can I relax on my back porch listening to the birds and creek?
Most cities have parks.
> How do I get away from all the hustle and bustle of dense city living?
You go to the countryside.
But sure, if you like these things, maybe living in a city is not the best. The thing is that there are solutions. In the same way that if you live in the countryside, you can still go to cultural and sport events, fancy restaurants and bars, but it will take a trip, and you also have to consider that driving and drinking is not great. If you love these things, maybe you should live in a city instead.
I am not a fan of suburbs, as there is essentially nothing you can do without driving. For me, it would be my last choice, as it tends to be expensive compared to the countryside, and not as calm. An intermediate choice that can make sense, but not Pareto optimal.
Half an acre in the middle of nowhere, where you need to drive everywhere, or a pod within walking distance of everything you need, all services and entertainment?
The answer depends on your lifestyle.
I grew up in a rural area, with a decent-sized house and garden, surrounded by nature. It had its pros and cons. I’ve now spent a couple of decades living in large cities, and I love it. I personally don’t need all that private space, I enjoy having many people from all over the world around me and the endless options for culture, food and entertainment. When I get older still, my priorities may change again, or maybe they won’t.
I don’t think you’re insane for making other choices than I have, but you seem to lack both imagination and empathy if you can’t wrap your head around why not everyone feels the way you do.
HN is largely comprised of people who are disproportionate beneficiaries of modern society, so it's not a surprise that people here develop a pathological ideology that goes as far as valuing the trappings of modern society. It is like in Huxley: the purpose of soma and orgy-porgies is to keep people from noticing how fucked up their world is. It is fucked up to live inside pods stacked and crammed up to the clouds, so what is the soma?
All the "vibrant" amenities available in the city. But why would you want to work on a laptop in a cafe when you can work on a quad monitor in underwear looking out over your own literal fiefdom? The amenities are substitutes not the real thing. You shouldn't be fooled.
I will say that Canadian suburbia (at least Vancouver area) is subtly different from US suburbia in a way that makes it vastly more livable. My parents house is right next to a river trail with mostly ungroomed foliage, still somewhat walkable for groceries and restaurants, and has passable public transit.
Obviously that would be nice, but not even HNers are rich enough to build rural villas in city centres.
I have an acre, and maintaining it is quite a nice alternative to computers and screen-based entertainment. It's just another form of exercise, basically.
I gladly pay a lot of money to be in the center of things.
Yes there are people who live in houses in or near the city. Most people live in pods. We're talking about the "influx of young ambitious people" remember?
At the very least, even if you can afford it, to own a home in the city for most people means putting off FIRE which I think is poor decision making.
The high COL is proof that people enjoy it and are voting with their wallets. It’s not as though everyone here is ignorant of the existence of Oklahoma. I’ve lived there-ish. I pay extra not to have to anymore.
You really think if high paying jobs were more spread out we'd see this insane Calhoun-like concentration of people in geographically tiny areas?
WFH could have changed that equation and made life better for everyone. Too bad we can't have nice things.
I was contracting out at Intel Jones Farm campus in Hillsboro in 2004 and I'd walk around the (then) new neighborhood there by the campus and I distinctly recall thinking "What if something were to happen to Intel in, say 25 or 30 years? What would happen to these neighborhoods?" It was just kind of a thought experiment at the time, but now it seems like we're going to find out.
The $180k figure is also inflated. Most folks being laid off don't make over $100k.
They were getting paid "California salaries in Colorado" (well, really Massachusetts salaries but popular sayings don't have to be completely accurate) and lots of people had virtual mansions on senior tech salaries (plus probably stock options?).
Then DEC imploded and there were almost no other options for hundreds of storage engineers. Knew a lot of people who had their houses foreclosed because so many were flooding the market at the same time.
I suspect most of those folks did not "come from" the bay area in the first place.
Overall, my 5,000-foot view was that the culture was very different from FAANG or a Bay Area tech company. If the Bay Area approach is high ownership and high accountability, Intel was much more process-driven and low ownership. They even tracked hours worked for engineers in Oregon.
I think it speaks to common challenges when hiring managers are disconnected from the work, degrees and resumes are worthless, and turnover is difficult.
In many companies, team leads don't have a role in the hiring or firing of the employees working for them.
The sad thing is they acquired the Basis smartwatch and destroyed it, leaving only Garmin as a developer of dedicated activity trackers. I considered getting a Basis and am obviously glad I didn't.
But Apple bought the company recently. I worry that whatever made the product great will go away post acquisition. Whether or not Apple keeps working on it at the same level of quality is anyone's guess. Or maybe they'll integrate the best features into their free Photos app and ditch the rest. Or something else entirely.
I can't think of any examples where acquisitions make a product better. But dozens where the product was killed immediately, or suffered a long slow death.
With Apple it's harder for me to know. How do former Dark Sky users feel about the Weather app? I think it has all the features? How about Shazam, which I never used before it became an iOS feature? TestFlight retained its identity. Beats by Dre headsets did too, though Beats Music I think became Apple Music in a way.
Something like Minecraft for an example - the existing established customer base with perpetual license was not justification for buying it. The value Microsoft saw was around things like DLC content and cosmetics, and subscription revenue through server hosting.
From what I have observed, one could say that everything Apple acquires is an acqui-hire first: there is a product they want to ship, and they are trying to find a delivery-focused team to help them with that.
If the company has already built a similar product and had it hit the market, that's great! It means they are getting a team that has delivered successfully and may even have a significant head start toward Apple's MVP. It likely also means the team will have a fair bit of autonomy (and often retain their brand).
DarkSky's product in that light wasn't their app. It was their work on localized weather models and their weather API.
Apple's Weather App doesn't look like DarkSky, but AFAICT you could rebuild the DarkSky app on the WeatherKit REST API (including features like historical weather, and supporting alternative platforms like Android).
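For what it's worth, here is a rough sketch of what that might look like against the REST API; the endpoint shape, dataSets values, and response fields are my recollection of Apple's WeatherKit docs, so treat them as assumptions (auth is a developer-signed JWT, elided here):

    # Hedged sketch of a WeatherKit REST call; URL shape and parameters are
    # assumptions from memory of Apple's docs, not verified against them.
    import requests

    JWT = "..."  # signed WeatherKit token from an Apple developer account
    lat, lon = 37.33, -122.01
    resp = requests.get(
        f"https://weatherkit.apple.com/api/v1/weather/en/{lat}/{lon}",
        params={"dataSets": "currentWeather,forecastDaily",
                "timezone": "America/Los_Angeles"},
        headers={"Authorization": f"Bearer {JWT}"},
    )
    resp.raise_for_status()
    print(resp.json()["currentWeather"]["temperature"])  # presumably Celsius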
For starters, they split the community between Bedrock and Java. And while a Minecraft copy leveraging C++ was a good idea, it seems they mostly made the split to justify adding heavy monetization for skins and world maps to Bedrock. (Maybe they feared backlash if they did that to the OG Java version?) This monetization seems to have killed people's appetite for hobby-project mods and maps.
Likewise, it's clear that the intended demographic of their marketing has become much younger. Judging from the mob votes and the type of things that go into updates, what's added is far less deep; updates are now more of a social-media stunt ("Llamas in Minecraft, look how goofy they are!").
I recently started a 1.7.10 modded world, and was surprised to see just how much stuff was already there. The only newer vanilla things that I found I missed were bees and slime blocks.
Maybe it's nostalgia, but this version feels nicer, like it's cohesive, and respects me as a player more.
There are many acquisitions that lead to better products.
They're more lucrative for creators/streamers and have further reach but the platform experience is noticeably worse.
But there's also hundreds of examples of the opposite happening: Successful products being bought by a big company and then killed post acquisition.
We probably won't know which camp Pixelmator will fall into for a few years yet.
Not to mention all the topics that have been soft-banned, because one algorithm flags those videos as not monetizable, and the next algorithm decides that only showing or recommending videos that can show ads results in the most ad revenue.
I don't think YouTube is clearly better or worse than it was before acquisition, and maybe an independent YouTube would have walked the same path. It is simply a very different platform that was ship-of-Theseus'd.
Follow some channels like Practical Engineering or Veritasium ... both good quality, information dense. Yes, decent production values, but that's not a bad thing at all in my book.
I'm tossing up between Pixelmator and Affinity Photo.
The main changes were integration of Apple's AI stuff and improved VoiceOver support. Nothing earth-shattering but it's still active.
Hedge funds also hire physicists and mechanical engineers.
James Hamilton, the “mechanic” … with EE & CS degrees and time at IBM and MS. Dave Clark, the “musician” (undergrad) … and an MBA focused on logistics. Jeff Wilke, the “chemist” … who worked on process optimization at Honeywell and supply chains at Andersen.
So sure, might as well say DeSantis is an SDE intern figuring out software deployments, Vosshall is an amateur aircraft EE, or Marc Brooker is some foreign radar engineer.
Signed, some newspaper dude who was an AWS PE doing edge networking and operations.
It maps 1:1 with the computer science, but chemical engineering as a discipline has more robust design heuristics that don't really have common equivalents in software, even though they are equally applicable. Chemical engineering is extremely allergic to any brittleness in architecture; that's a massive liability, whereas software tends to just accept it because "what's the worst that could happen".
I studied chemical engineering after I was already working in software, so I did it backward.
Ultimately it is all about how strict the hiring pipeline is to the credentials vs potential.
Graph theory originated in Chemistry. Not Computer Science.
Musicians know harmonics and indirectly lots of cyclical travel stuff. And waves.
The good car mechanics I know are scary smart.
Most trace it back to Euler when he considered the problem of Seven Bridges of Konigsberg https://en.wikipedia.org/wiki/Seven_Bridges_of_K%C3%B6nigsbe...
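Euler's argument reduces to counting vertex degrees: a connected multigraph has a walk crossing every edge exactly once only if at most two vertices have odd degree, and all four Königsberg land masses have odd degree. A quick sketch (the land-mass labels are mine):

    # Euler's Seven Bridges argument as a degree count. An Eulerian walk exists
    # in a connected multigraph iff at most two vertices have odd degree;
    # Königsberg's four land masses all have odd degree, so no such walk exists.
    from collections import Counter

    bridges = [("A", "B"), ("A", "B"), ("A", "C"), ("A", "C"),
               ("A", "D"), ("B", "D"), ("C", "D")]  # A = island, B/C = banks, D = east

    degree = Counter()
    for u, v in bridges:
        degree[u] += 1
        degree[v] += 1

    odd = [v for v, d in degree.items() if d % 2 == 1]
    print(dict(degree), "odd-degree vertices:", odd)  # degrees 5,3,3,3 -> all odd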
also I was sorta laid off by the current Intel CEO from my last startup!
If you are honest and generous with people, they aren't mad that you made a mistake and let them go. It's companies that try to give 2 weeks + 1 week per year of severance that are making a mistake, not the entire concept of layoffs.
(Without delving into the systemic reasons that layoffs are inevitable of course. If the system was different, they wouldn't have to happen, but we live in this system at the moment.)
Nobody can predict market conditions or technological advances.
If you don’t change course (mission, people), the company will likely fail, and then everyone is out of a job, while shareholders, pensioners, and 401k-holding laypeople lose money.
I do think that leadership is not held accountable enough for their mistakes and failures.
I disagree that the workers are the ones who should have the power to fire management unless they are shareholders. I think this should (and it does) fall upon the board and the shareholders. If the workers are shareholders, all the better.
Regardless, it's clear the current system needs work.
The “If your name is Farmer, you’re a farmer” mentality, but as a self-selected euphemism. “I trained as a software engineer and that’s what I am for 50 years! Dag gubmint trynna terk my jerb!”
Service-economy role-play is the root of the brain-dead job life we're all suffering through.
Also, laying off incompetent managers alone won't solve the problem of having hired the wrong people.
> I heard from a friend who works for Intel that he doesn't know why he was hired in the first place; his PhD was in a completely different domain, the objectives of the project were remote to his skills, and he told me this is what his entire team was made of. Seems like a lot of bloat present in this company, and it makes sense they feel the way forward is layoffs.
In comparison:
Nvidia 36,000
AMD 28,000
Qualcomm 49,000
Texas Instruments 34,000
Broadcom 37,000
It is obvious that Intel is ridiculously overstaffed.
TSMC is a fab, not a chip designer. And NV makes GPUs and small scale SoCs like the ones in the Nintendo Switch and automotive (IIRC the Tegra SoC that powered the Switch 1 literally was an automotive chip that they repurposed).
That's quite the difference from what Intel makes: CPUs that power a lot of the world's compute capacity for laptops, PCs, and servers; wireless chips (Bluetooth + WiFi); their own GPU line...
Tegra was designed for mobile devices like smartphones. The automotive part came later and isn’t particularly relevant. Intel also makes low power SoCs for mobile devices, e.g. Atom.
Last time I heard that name was well over a decade ago, for crappy "netbook" devices. Just looked it up: the last Atom CPU was released in 2013, per Wikipedia [1]. They might still make them for embedded computing purposes with very long life cycles, but I have no idea at what volume.
The only true comparison is TSMC, but it only does chip manufacturing, not chip design/development.
So Nvidia + TSMC would probably be a fair comparison.
It felt to me like the people at the top were clueless, and so were hoping these hires would help give them an idea which direction to steer the ship.
Of course, mostly what he found was how out of touch the executives at Xerox were with what their employees were actually doing in practice. The executives thought of the technicians who repaired copiers almost as monkeys who were just supposed to follow a script prepared by the engineers. Meanwhile the technicians thought of themselves as engineers who needed to understand the machines in order to be successful, so they frequently spent hours reverse engineering the machines and the documentation to work out the underlying principles on which the machines worked. The most successful technicians had both soft skills for dealing with customers and selling upgrades and supplies as well as engineering skills for diagnosing broken hardware and actually getting it fixed correctly. It seems that none of the sales, engineering, or executives at Xerox liked hearing about any of it.
Yes, I remember contracting at Intel in 2006 and the Anthropologists were at one end of the building we were in. Their area was a lot different than the engineering areas. Lots of art, sitting around in circles, etc. I remember asking about what was up over there "Those are the anthropologists".
https://www.nteu.au/News_Articles/Media_Releases/Staff_lose_...
I didn't use Word to create my resume and if they can't deal with a PDF that was their problem.
> Probably so they can make changes behind your back
Nope, I don't consent to that.
> Or, less cynically, so they can more easily copy/paste stuff into their HRM tool
Their HRM tool should support PDFs if they are competent. They should also be able to read my resume with their own eyes. If not I consider the company not a good fit for me.
They missed on buying Nvidia and in the last 5 years they have netted 30b but also spent 30b on stock buybacks. So they could still have 30b, but they chose to manipulate their stock instead.
All of those workers will move. There aren't any jobs in the Portland area. Downtown is vacant and still expensive and the startup scene has dwindled.
I am curious about this, moving out of Portland -> Seattle myself. For software I see it, but for hardware, it feels like there's a kind of inertia / concentration that still benefits staying and fixing. It seems like shedding a large chunk of their workforce is on the path to righting the ship. It also feels like chips are too important an asset to discard. I'm skeptical they'll merely bleed out, even if the current phase is quite chaotic. Also frankly Portland area doesn't have enough high tech careers to replace them (and the income tax that goes with it), I feel the state would likely incentivize them staying / growing almost whatever it takes.
This is all a hot take with little insight, other than being a tech person currently living in the Portland area.
Those who liked it stayed at Intel because it is the only company which literally operates at every level of the tech stack.
From sand to JSON.
When I think of people who went into tech 20+ years ago, this choice of work was a vocation. Not saying they were all pleasant, but they were all largely invested.
At some point, tech became a safe, lucrative profession for people who say things like 'life is more than work; nobody is required to like what they do', like the managers from Intel.
(J.R.R. Tolkien, The Silmarillion)
The difference is that the psychologists and the philosophers agree with me over the long term. Being work-obsessed at age 40+ when you have other aspects of life worth exploring is simply mental illness.
Did you ever even consider that such people have other things to do?
It makes sense that people don't want to work with others who try to do as shitty a job as possible without being fired, fucking over whoever and whatever happens to be collateral damage.
Being an obsessive company man is not the only alternative, and certainly not what they were suggesting. I'm not sure why you thought it was being advocated for.
> I'm not sure why you thought it was being advocated for.

Aren't you? I thought I was clear. Let me know if you need this explained.
He even stated the following in "Only the Paranoid Survive": One, don’t differentiate without a difference. Don’t introduce improvements whose only purpose is to give you an advantage over your competitor without giving your customer a substantial advantage. The personal computer industry is characterized by well-chronicled failures when manufacturers, ostensibly motivated by a desire to make “a better PC,” departed from the mainstream standard. But goodness in a PC was inseparable from compatibility, so “a better PC” that was different turned out to be a technological oxymoron.
One might think Itanium goes against that.
"Compilers just need to keep up" was Intel's marketing apologia, not reality.
You have to admit, though, that the EPIC (Explicitly Parallel Instruction Computing) model was quite innovative. The philosophy influenced the LLVM project, and some of the principles are used in GPUs and AI accelerator chips, even if hardware-based dynamic scheduling won the game.
" 0:26:21 BC: But that was lost after Andy left. That was lost, that part of the culture went away.
0:26:27 PE: Who succeeded him?
0:26:28 BC: Craig Barrett.
0:26:29 PE: Right. Were you still there when that happened, when did Grove leave?
0:26:34 BC: Grove stopped being the president in January 1998.
0:26:40 PE: Yes.
0:26:41 BC: And that's when Craig Barrett took over. 0:26:43 PE: And what changed at point?
0:26:46 BC: Well Craig's not Andy, I mean he had a different way of thinking and doing things, Craig, I don't want it to sound cynical but I always sound cynical when I talk about him because I had such a bumpy relationship with him. I don't think he felt like he needed anything I could tell him, and it wasn't just me, I wasn't taking this personally. I never once got the same feeling I got with Andy that my inputs were being seriously and politely considered, and then a decision would be made that included my inputs. 0:27:21 PE: Yes.
0:27:22 BC: That never happened. Instead, for example five Intel fellows including me went to visit Craig Barrett in June of 98 with the same Itanium story, that Itanium was not going to be able to deliver what was being promised. The positioning of Itanium relative to the x86 line is wrong, because x86 is going to better than you think and Itanium is going to be worse and they're going to meet in the middle. We're being forced to put a gap in the product lines between Itanium and x86 to try to boost the prospects for Itanium. There's a gap there now that AMD is going to drive a truck through, they're going to, what do you think they're going to hit, they're going to go right after that hole" which in fact they did. It didn't take any deep insight to see all of these things, but Craig essentially got really mad at us, kicked us out of his office and said (and this is a direct quote) "I don't pay you to bring me bad news, I pay you to go make my plans work out".
0:28:22 PE: Gee. 0:28:25 BC: So. 0:28:25 PE: Yeah he's polar opposite. 0:28:26 BC: So and he, and at that point he stood up and walked out and to back of his head. I said, "Well that's just great Craig. You ignored the message and shot the messengers. I'll never be back no matter how strong of a message I've got that you need to hear, I'll never bring it to you now.” 0:28:38 PE: Yeah.
0:28:40 BC: It's not rewardable behavior. It was sad, a culture change in the company that was not a good one and there was no way I could fix it. If it had been Andy doing something that I thought was dumb, I'd go and see him and say "Andy what you're doing is dumb", and maybe I'd convince, maybe I wouldn't. But as soon as you close that door, it is a dictatorship. You can't vote the guy out of office anymore, you can't reach him. There's no communication channel."
I keep wondering why Andy didn't see that before nominating him.
It's not unprecedented: when companies' businesses contract, shrinking is exactly the right thing to do, not to mention that it's forced on them anyway.
"The Global Data Center Chip Market size is expected to be worth around USD 57.9 Billion by 2033, from USD 14.3 Billion in 2023, growing at a CAGR of 15.0% during the forecast period from 2024 to 2033."
Yes, I understand the argument that Intel management screwed up for too long and this is the market at work, but that ignores the geopolitical risks of what we're going to end up with. Forming some kind of consortium to keep Intel fabs running (and new ones built) could also include completely changing the management of the company.
I don't buy this. I think the primary problem was mismanagement especially in the 2008 to 2020 timeframe. Too many bean counter CEOs during that period who did not understand the need to constantly invest in SOTA fabs.
> Too many bean counter CEOs during that period who did not understand the need to constantly invest in SOTA fabs.
I am not here to defend Intel, but I don't think this is the correct interpretation of events. Basically, Intel failed in their fab process R&D to keep up with TSMC and Samsung, and that is not for lack of effort or money. Since their fab process R&D was going so poorly, Intel slowed down their fab construction rate. This makes good business sense to me. The truth appears to be that Intel's fab process got beat fair and square by TSMC and Samsung. Intel absolutely flubbed some nodes, and bad employee execution was a part of it.
But management has consistently tried to tell customers what they want/need. Intel has a history of developing products with no customer base or pulling out of markets too early. Neither of those are the responsibility of low level employees. That's higher management.
One of the big concerns about the GPU division is "Will Intel keep going long enough for this to matter or will they pull out the second there's an issue?"
Ergo, policy should have been that X percent of chips be made on US shores. Wups.
They are not.
The CHIPS Act was a whole lot of hot air. It passed in '22, and Intel did not receive any money from it until the end of '24.
Intel is also likely going to lose hundreds of millions in incentives from Oregon for failure to meet hiring objectives, but they have a while to do that.
> China will take the reins in 2027
As I understand it, the best fab tech is TSMC (Taiwan) and Samsung (Korea). Do you really expect China can surpass both in only two years? It seems unlikely, as they don't have access to high-end fab equipment from ASML.

Edited to add: this was not the point being made, I am aware. Just my thoughts on the matter.
Smells like corporate bulimia.
When I worked/lived in the Bay Area there was a sense that corporations, and residents of the Bay Area, were moving to Oregon because it was cheaper … but still close enough to Silicon Valley. (Apropos of nothing really.)
If companies have extra cash on hand, don't we want them to invest it and hire? The alternatives are stock buybacks or just sitting on the cash.
Obviously every bet is not going to pan out, but hiring even on the margin is probably good.
No. Hiring should be a long-term strategic investment, not something you do whenever you have extra cash lying around. If you needed the extra people you should have been trying to hire them already, and if you don't then you shouldn't hire them now.
If I'm a shareowner, if the company doesn't have any intelligent ideas on how to spend my money, they should send it back to me as a dividend, or buy me out (share buyback).
Please don't waste my money trying to build some immortal empire as a shrine to the CEO's ambition.
I'd rather they accept reality, slim down to size, focus on making good Xeons again, and stop acting like they have a monopoly they no longer have.
So far, LBT seems to have the right idea.
Everyone else (AMD, GlobalFoundries, Samsung, Intel) doesn't seem to be making an enormous amount of money.
The margins are all in the best chips. Less than best is commoditized.
When corporations just invest because they have money, there is a gigantic agency problem, and executives have a tendency to burn shareholder value on vanity projects and fancier headquarters.
Stock buybacks are exactly what I want wealthy companies to be doing with money they don't have a high expected ROI for.
* they've done about $152B in stock buybacks since 1990 https://www.intc.com/stock-info/dividends-and-buybacks. I think... ~$108B in the last decade.
* during the same time period they fell behind TSMC and SEC in semiconductor fab, missed the boat on mobile (couldn't really capture the market for either smartphone or tablet CPUs), and are missing the boat w/ AI training https://www.hpcwire.com/2025/07/14/intel-officially-throws-i...
Discussion of Intel's buyback behavior as excessive and wasteful also came up during all the discussion of CHIPS subsidies last year: https://news.ycombinator.com/item?id=39849727 see also https://ips-dc.org/report-maximizing-the-benefits-of-the-chi...
The existence of markets Intel didn't dominate does not, to me, imply that it would have been a good use of resources to throw (more) money at the markets they didn't dominate. Not every company is good at every business, even if they dominate some seemingly related market.
There's also the matter that dividends are meant to be long-term and recurring. So it's not great for one-time windfalls.
This anti-stock-buyback meme is silly. It's like people who are against shorting stocks. Companies list on the stock exchange in order to sell their own stock to raise capital. If they have excess capital, absolutely they should be able to buy back their stock, and buy other companies' stock if they see it as undervalued, too.
A great case for seeing the absurdity of it is Intel, which did stock buybacks for almost a decade to push its stock price up while flailing around and losing its edge. If it had been paying high dividends while flailing around, major shareholders would have been asking why the fuck they were paying dividends while the business lost competitiveness; but by doing stock buybacks, it kept investors "happy" so they could jump ship and let the company fail on its own.
Stock buybacks have perverse incentives: everyone responsible for keeping the company in check gets a fat paycheck from buybacks (executives, major investors, etc.), all financed by sucking the coffers dry. The buybacks at Intel just made the company as a whole lose money; they bought back stock when it was high, and it has only dipped since then (on a 10-year window).
The idea that the stock market can only be used to flow shares in one direction has no merit. If you want to regulate executive compensation do that with direct clear regulation on executive compensation, not via some indirect rule change on the stock market.
It's not about regulating executive compensation, it's to close a gap that was opened and only led to poorer decision making at the executive/board level, there's no advantage to the company. It's a stupid instrument with no reason to exist except to return money to shareholders in a way they can avoid taxation events.
The fact that c-suites authorize buybacks largely to boost the stock price in order to trigger their own performance bonuses tied to the stock price only highlights that point.
If you did something even remotely similar, you would be prosecuted for fraud, because it's fraud.
1) Wrongful or criminal deception intended to result in financial or personal gain.
2) A person or thing intended to deceive others, typically by unjustifiably claiming or being credited with accomplishments or qualities.
The problem, though, is that the incentive structure is such that none of the involved parties has any disincentive, let alone an adversarial incentive, to end the practice, let alone standing to do anything legally, short of sabotaging their own stock value.
It's a totally perverse and corrupted incentive structure, similar to why neither Trump nor Biden, neither Democrats nor Republicans, have any real will or interest in ... none of the involved parties has any interest in revealing the rot and corruption, and all parties involved have every incentive to keep it all under wraps, suppressed, covered up, and distracted from.
In some ways, a civil activist organization could in fact buy a single share of one of the most egregious buyback-driven stock-price-inflating corporations and sue it for fraud and deception, but the suit would have to come with a claim of market manipulation via fraudulent interference with the price-discovery process, similar to a light version of cornering the market through restriction of supply, i.e., cartel behavior.
If there is any fraud, it would be in having performance bonuses tied to the individual stock price rather than to market cap. But blaming the buyback itself is short-sighted.
> performance bonuses tied to individual stock price
It is pretty common for the board to set up stock-price targets for the CEO, then pay large bonuses (cash or shares) for beating the targets.

They are in effect more like the guys who stand around a cup-and-ball scam to make it look like there's action and winners, and to make you think you could do better.
A buyback is a removal of the security from the market, not participation in the market.
It’s like people buying their own books to drive up sales in order to get on lists and promote more book sales, and who then supply the market with the books they bought once the price has been artificially elevated and has become sticky.
You may not like hearing that and it’s clearly not the mainstream street preferred narrative, but that’s what it is.
There are mechanisms that are commonly employed to REDUCE the price of the stock (e.g., a stock split), and nobody bats an eye at that. Buying back stock is a reasonable way to employ cash reserves, and it protects the firm from exposure to foreign-exchange and inflation risks.
I will agree with you that the way executive bonuses are structured can be a perverse incentive that borders on fraud. But blaming the buyback of stocks itself, isn't grounded in any inherent economic misdeed.
There is zero fraud implied or even suggested by stock buybacks. They are heavily-publicized-in-advance returns of capital to shareholders. That's it. The sales are often offset by the creation of new stock via RSUs, and in that case just reduce the dilution intrinsic to RSUs.
Shareholders want executives to be incentive-aligned to reduce agency problems. Stock based compensation furthers that goal. If a manager doesn't think they have a better use of spare capital than returning it to shareholders, returning the capital is exactly what shareholders want. There's nothing nefarious here.
Again, it’s like the Epstein narrative; the right and good thing is to release the information, but the whole system, both sides and most in between, have a vested interest of one kind or another in keeping it under wraps. We know there was organized human trafficking, sexual slavery, abuse, rape, and various types of master-race-level racial eugenics associated with out-of-control “intelligence” agencies, including ones working for foreign governments against the American government. And everyone is just covering it up, because everyone is implicated or mentally compromised, and the “peasants” likely don’t have the organized power to change that.
That is also all documented; in fact, it was documented for how many years? 10? 15? 20?
Ever hear of a guy called Madoff? He sure made off with the money of sophisticated, smart people for several decades.
Don’t lie to yourself about the confidence in the system.
Investors in a corporation don't want individual teams to spend money "just because it was budgeted, even if we didn't have a good thing to spend it on".
I, as a manager of a team at a corporation, of course have a partially adversarial relationship with investor goals; I want my team to be happy, in part because happier teams often are more productive, but in large part also because it's just nice to spend my work life working side by side with people who are enjoying their perks.
If my entertainment budget directly (or even partially) reduced my team's bonus pool, that would be crappy for team cohesion, but it would probably make me think more carefully about taking everyone out to lunch.
I think we've become too complacent/accepting of corporations just laying off employees with what amounts to a shrug.
But big picture, I disagree. We need creative destruction in an economy: we need to be able to lay off people in horse-buggy industries so that they can be hired to make Model Ts. We're better off focusing on our social safety net and having a job market that encourages some amount of transit between careers.
Treating the employer/employee relationship like some life-long commitment sounds like pure hell. It is a transaction. I don't want it to be anything more than that.
It does though, doesn't it? Divorce is so common that marriage no longer feels like it has the permanence you imply it does.
As that's all pretty basic... maybe you don't understand divorce well enough to say whether or not it fits the analogy.
But as you call it a "decree", and I only know one legal system that does (Texas), we're probably from different backgrounds here.
So let's just flip this. Why have you been asked, and what were they asking to see? Proof you are divorced? Did you claim something you were awarded by the judge and have to show proof of it? If someone is asking for that kind of information, did you not ask why they needed it and get an answer? It would be much easier if you just explained your experience, as it is the outlier here.
Your employer enjoys your data without your say so [0].
EU, UK [1], AU [2], NZ... They have to ask. So they do.
[0] https://www.dshs.texas.gov/vital-statistics/marriage-divorce...
[1] https://www.equifax.co.uk/Products/learning-centre/marriage-...
[2] https://www.auscheck.gov.au/what-we-do/background-checks
It is another significant flaw in the "capitalist", i.e., publicly traded, corporate system, which incentivizes all the various financial shenanigans that generate false stock performance to enrich the C-suite.
If you reinvest it into the stock, you've had to pay taxes on the dividend amount, so you've lost vs a buyback.
If you want to spend money and your stocks don't issue dividends, you just sell some of your shares. Selling $X of shares will almost always generate less taxable income than receiving $X of dividends, because some of the sale is a return of capital; so again, if you take $X out of the holdings, the dividend loses versus a buyback plus selling $X of shares.
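A toy comparison with made-up figures (the tax rate and cost-basis fraction are assumptions) shows the gap:

    # Toy comparison (hypothetical rates): tax owed when extracting $10,000
    # via a dividend vs. by selling shares after a buyback.
    payout = 10_000
    tax_rate = 0.15           # assumed rate on qualified dividends and LTCG alike
    basis_fraction = 0.40     # assumed: 40% of sale proceeds is return of capital

    dividend_tax = payout * tax_rate                      # $1,500: whole payout taxable
    sale_tax = payout * (1 - basis_fraction) * tax_rate   # $900: only the gain taxable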
Any time you want that to happen for you, simply press the sell button for some percentage.
Why would you want the company to decide the timing and the percentage for you?
It's a different state and a 9-10 hour drive away; in what sense is it close?
Note that these were NOT executive jets for C-suite, these were for all employees who had meetings at other locations (at least according to people I've met since I moved to AZ a few years ago to be near my in-laws).
Coming out of San Jose, the plane would enter this corkscrew to gain altitude. I guess to avoid SFO airspace.
I would often see high level executives on the same plane.
Ha, you're right:
https://www.planespotters.net/photo/1640095/n486hf-intel-air...
https://www.planespotters.net/photo/355961/n386ch-intel-air-...
There's even a 286: https://www.planespotters.net/photo/1738295/n286sj-intel-air...
I don't doubt you but most people would not find a 2-hour one-way commute pleasant.
Anyway, a 2-hour commute is normal in the Bay Area. I used to do the South Bay to San Francisco Caltrain daily... Flying is better. Free drinks, for one thing.
For a 4 bedroom house, Zillow gives me around 7-8k a month in rent for SF and around 4k for SJ, so you'll at least break even and have a better quality of life for your family even if you won't see them much during the week.
Anyway: the fact that a simple adequate house for a family runs at $4k in the US is ... maddening. 1,500 sqft is around 140 m²; a comparable house in Berlin (one of Germany's hottest markets!) runs ~€2,500. How anyone is supposed to live in the US when not on a typical techbro salary, I have no idea.
Different strokes for different folks.
Definitely not close as in "commute close".
Maybe more like "close to feeling the same as the Bay Area"?
(You can believe Portlanders hated Californians who moved up there. Or so I've been told.)
They still do, only it's not really Portlanders anymore; it's all the smaller cities that hate them. Why? A couple of reasons: they came in and paid over asking price for housing, driving up prices across the board so that those working for local non-conglomerates have a hard time affording housing. And then they vote contrary to how the locals do (locals, I might add, who didn't have any problem with how things were run before, even if their "betters" felt they were "backwards").
Basically, they end up burying the local culture and replacing it with California.
Also, no sales tax!
PS – as someone who spent hundreds of hours on Glider PRO as a kid, thank you!
The asserted problem: Labor force/expense is too high, or at least higher than is now thought necessary.
The (IMO) core problem: Measuring professional success/skill primarily by the size of the team a person manages.
The asserted solution: AI replacing Labor to reduce inflated labor costs/pools.
While there is some inherent benefit to reducing team sizes back down to allegedly functionally sized units, there is a lack of accountability and understanding as to why that's beneficial, as it seems to be done either due to the lofty promise of AI (which I'm critical of) or via a more brutalist/myopic approach of merely trying to make the big labor-cost number smaller to increase margin/reduce expenses. To be clear, while I'm a critic of AI, I fully acknowledge it can absolutely be helpful in many instances. The problem is that people are learning the wrong lessons from this, as they've improperly identified the issue and why the force reduction allegedly/appears to be working.
Obviously, YMMV on a case-by-case/team/company basis, but Intel is known for being guilty of "Bigger = Better" when it comes to team size, and their new CEO acknowledged this somewhat with their "Bureaucracy kills innovation" speech [0].
That said, what may be good for the company (even if done for the right reasons) can still hurt the communities it built/depend on it.
0: https://www.inc.com/kit-eaton/in-just-3-words-intels-new-ceo...
Just off the top of my head I can think of a dozen or so.
They're just not in tech, for the most part. I can think of two breweries, a standardization company, and two machine shops, and those are just within a mile or so of my house.
I think a lot of folks work at Intel in order to get out of tech. That's basically what I'm doing, lol. Work enough to save and get out of tech.
Tech is also expensive to start up in. So it makes sense that a lot of the intel-driven businesses would be non-tech.
"Ex-Intel executives raise $21.5 million for RISC-V chip startup":
https://www.aheadcomputing.com/
I believe the founding team is all in Oregon - and mostly all ex-Intel.
https://www.oregonlive.com/silicon-forest/2025/06/top-resear...
I think what also hampers Oregon is that there isn't much non-Intel investment in R&D in the Portland region, compared to the Bay Area – there used to be a graduate institute funded by Tek et al., but that never got sustained. [1] The local academic medical research center is well regarded and otherwise wouldn't have trouble attracting talent if it weren't for the salaries.
They merged with OHSU, but it turned out OHSU was about as broke as they were, so most of the CS faculty migrated en masse to PSU and took all their grad students (myself included) along with them. (It turns out grants generally go to the principal investigator, not the school, so if your advisor moves schools, their funding goes with them.)
Yeah, this is going to be the ultimate deciding factor. When local companies don't pay enough to live on and Bay Area companies are paying up to 10x+ the compensation (for AI roles), people are going to make the move.
I knew the end was in sight when it became company policy that no employee stay past 6PM without HR approval due to the danger.
One of the underappreciated things about the bay area is that, while it is very suburban, there are several respectably sized downtown cores -- Mountain View, Palo Alto, Redwood City, and of course the Big Kahuna - San Francisco -- all connected by relatively speedy (and from what I understand, much speedier now) rail.
As a result, basically every West Coast city absolutely destroyed itself and will take at least a decade or more to recover… if they ever really do.
Portland specifically is lagging a bit, but it is on an upward trajectory (with a few big things still to fix).
I suspect the person denigrating the West Coast cities is doing so from their basement in East Prolapse, Kentucky, as a way to rationalize their life choices.
The city, the county, and the state all spend increasing proportions on debt service, and the rewards for earning a lot of money are much less than the neighboring state.
The total fertility rate is also one of the lowest in the country. And Portland lacks a flagship university to bring in young talent.
It might be a nice place to visit during the summer months, but I don’t foresee many high paying jobs or highly profitable businesses being made there.
That being said, as someone who took a few graduate stats courses at PSU, I don’t think the library would have been the right place anyway - it always couched itself as a commuter access school that relied heavily on transfers from community college, and the library reflected that. It’s definitely not a well-resourced research institution, and with the retrenchment of federal funding for research and financial aid I’m not sure what it’ll focus on.
Great place though, definitely might end up there one day
I am a critical person, and Portland currently deserves much criticism if it is to fulfill its potential. No one got anywhere by patting themselves on the back and reassuring themselves they had already made it.
But we all agree the state is seriously going to have to think hard about how to attract new business to both the I-5 corridor and rural areas, whether it's through investing in OSU upstream or attracting more downstream manufacturing jobs.
Those are two of the biggest economies in the world pumping loads of money and people into that engine to try to get it started and still just starting to make some progress. That's not to say there aren't great ideas out there that we're missing which could make it all easier and cheaper, but a small team is definitely going to have one hell of a time making stuff on a nanometer scale without billions and billions of dollars behind them. Software startups are easy, but hardware is hard. Massive hardware and microscopic hardware are harder.
I'm not saying it _should_ or _must_ be that way, just that it is.
https://www.littler.com/news-analysis/asap/california-reache...
Has this been tested? Why would an Oregon court care about what a California law says it can and cannot do?
It’s absolutely possible to be a fugitive in one state from another. It’s also typically quite expensive and stressful.
If I were on the other side of this fight, I’d use the opponent’s absence from Oregon courts to get civil judgements in my favour and then use the federal credit system to collect.
Absent any superseding federal law, a contract with clauses that are unenforceable in California is not going to be enforced in California.
Since the breach of contract occurred in California, the claimant would have legal standing.
Everyone non-technical was hired. Everyone with a strong ability was seen as difficult, and kicked out.
Housing costs in the Bay Area are soul-crushing, but they do motivate people to work on the highest value projects because complacency just doesn't usually work if you're trying to buy a house. And so I wonder, if Intel had kept their workforce mostly in California, could they have stayed a dominant force in computing?
> Intel CEO says it's "too late" for them to catch up with AI
Intel actually made a decent video card that sells above MSRP: Battlemage. They can easily advance it into more powerful GPUs.
Gelsinger understood that. The current MBA empty suit doesn't.
Also, Oregon is a terrible state to invest in as a business, especially one that is looking to pay high salaries.
Intel was in the business of selling the most cutting edge, technologically advanced products in the world, but they didn’t want to pay enough for the best people.