For the average person with average needs, there is no difference between, for example, a $100 Dell Latitude E5530 from 10+ years ago and a $600 Best Buy low-end Dell laptop from today, so long as the Latitude has been modestly upgraded with 8GB of RAM and a small, used SSD. Its 3rd generation i5 is more than enough to do anything they need. It even runs Windows 11 just fine, so long as you inform the customer about the need to manually install feature updates.
For the general public, buying new computers is an expensive scam that contributes massively to waste. The machines I refurbish would typically have been thrown out or 'recycled' (stripped for precious metals in an expensive process) if not for my intervention. There's no reason for this except number-go-up greed, and it should stop.
I doubt the average person knows how to, or is willing to, manually install feature updates to keep running Windows 11 on an unsupported laptop. Refurbishing is great, but I'm not sure how much more you can get out of a 10+ year old platform. I think the sweet spot is a 3-6 year old platform, where a refurbished unit will be a decent bit cheaper but still have a good bit of life left.
The point that others have made about business laptops vs consumer laptops is also salient. Most of what I am refurbishing is business-grade and therefore has held up quite well in terms of build quality.
I do also do quite a bit of business in the ~4-6 year old machine world, but that's a different demographic of customer from my average.
It's hard for me to recommend most ~$500 Windows laptops when they skimp out on those things to lean into specs, while older-model Apple Silicon MacBook Airs are just a bit pricier but absolutely deliver on quality-of-life.
Gamers and power users of course shunned them for so long saying, "you could get a better laptop for half the price!" but it's a testament to how good the build quality was that the full force of tech enthusiasts telling everyone not to buy it wasn't enough to sway people away.
Everywhere but the low end, the point has become mostly moot these days: Apple has beefy specs now, and mid-to-high-range Dells and ThinkPads have good build quality and QoL. I think speaker quality is the most noticeable difference between Apple and Dell, where Dell just doesn't value it as anything other than an afterthought.
I've had two Intel MBPs cook themselves to death within two years.
The best laptop I've ever used was a ThinkPad X1 Carbon and I'll die on that hill - I find that Apple's build quality for a lot of stuff is ridiculously overrated and on par with similarly priced PCs. The lack of cooling is awful, and the extra survivability of MacBooks is helped a lot by their status as shiny prestigious toys for people who don't care that much about using their computers.
Obviously plenty of people get real work done on them, but I'm not sure that they're actually more reliable for the price.
Doesn't that come with an anemic 256GB HD expected to hold both OS and user software?
In the meantime, you can buy cheap miniPCs with Celeron/i5/Ryzen5 with 16GB of RAM expandable to 32GB and 500GB HD with multiple SSD expansion slots for less than $300.
Good refurb definitely should have an SSD and battery at least in good condition.
They can be, but there's an inflection point of age. For ~400 USD you can get an all-E-core i3-N305/512GB SSD/8GB RAM/1080p laptop - which is about on-par for performance with a midrange 4-core CPU from the final 14nm mobile chips (Comet Lake, 2019). With the N305 you get notably lower power draw under load.
Battery can be changed easily, memory can be replaced in case of failure or need to upgrade.
It doesn't support Windows 11, but it happily runs 10, a browser and the entire Office suite. It's built in a plastic/aluminum chassis, so it's reasonably sturdy, and the keyboard isn't as soft as low-end plastic keyboards.
The value of such a laptop is lower than a low-end laptop's (if not nearly $0), but it's much snappier.
Battery life is one of the biggest issues, and there isn't a good way around it. Replacement non-OEM batteries are extremely variable (and often pretty poor) in quality.
Also, it probably does support Windows 11, as long as you're OK with manual installation of the once-a-year feature updates.
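For anyone wondering what the manual part looks like: the usual route is to mount the new feature-update ISO and run its setup.exe after setting the registry value Microsoft documented for skipping the TPM 2.0/CPU check. A minimal sketch, assuming an elevated Python prompt on the target machine (the value name is the one I've seen documented; double-check before relying on it):

    # Minimal sketch: set the documented bypass value, then run setup.exe
    # from a mounted Windows 11 ISO to do the in-place feature update.
    # Requires an elevated (Administrator) prompt; Windows-only stdlib.
    import winreg

    KEY_PATH = r"SYSTEM\Setup\MoSetup"

    with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                            winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, "AllowUpgradesWithUnsupportedTPMOrCPU",
                          0, winreg.REG_DWORD, 1)

    print("Bypass value set; now run setup.exe from the new ISO.")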
I tried.
My current garage computer, an i7-8700 with 64GB and UHD 630 graphics, cannot play a YouTube video without dropping frames if the "Ambient Mode" YouTube option (which adds a glowing drop shadow behind the video playback rectangle) is turned on.
A Raspberry Pi can.
Smooth 1080p video playback using contemporary video codecs is a mandatory task for the average person with average needs.
I like old computers. My home's NTP server is a Digital AlphaStation 200 4/233 from 1995 running Digital UNIX 4.
I have so many old machines people think I am a little off, which I am. One of my coffee tables is a Sun Ultra 450.
You will hear no arguments from me about keeping a system running as long as is practical.
As is practical.
A user with only $100 to spend is better served by a Raspberry Pi.
Anyone who needs the computer to improve their education, further their life, or earn money, needs something that gives them the least amount of friction they can afford.
And a Latitude E5530 is nothing but friction. From poor battery life to abysmal screen to slow performance to OS workarounds: friction.
Most of my buyers buy one machine and are done, but a substantial subset are return customers. Return business is all about the usual stuff: Reputation, level of service, quality of the product. I have a few "regulars" who have bought 4+ machines from me, usually hobbyists or small business owners with a specific need. Having done this for a couple of years, I also now get return customers semi-frequently who bought something a while ago, like it, remember me, and seek me out again.
I don't offer warranties or guarantees on items, but I'm aggressively inexpensive compared to the competition, and I operate on an informal "if I broke it, I fix it" policy. If I told you something was working and then later it breaks, I either fix it, swap it out, or replace the machine.
I have a five-star reputation on Facebook Marketplace and 99.6% positive feedback on eBay as a result of this policy, so it works decently well.
All my XP product listings have a disclaimer that mentions XP is unsupported and shouldn't be used for everyday computing; I have on occasion had to steer away customers looking for a general-purpose machine to something else. The typical customer who does buy an XP machine is either doing retrogaming, running legacy hardware (industrial control systems and medical equipment are common), or has some other XP-specific need.
I have relationships with a few recyclers and also volunteer my time with a nonprofit that does refurbishing; I frequently buy the machines they don't have time to deal with for low prices.
Recyclers have a huge advantage in getting their machines for free; I am too small-time to do that effectively. However, I take on the repair jobs they're not willing to do, so I get a pretty good deal on the inventory they have.
I have a 16GB M1 MacBook Air that feels very snappy for now, but with the pile of Electron apps I run daily (Visual Studio Code, Discord, Slack, Signal, 1Password, Plexamp, Google Messages for Web) and some misc other accessory apps (WhatsApp, Teams, Microsoft To Do, Excel, OneNote)... along with a browser, I'm basically running out of memory with my set of startup apps before I even open anything for "heavy" work.
I was also going to say that Teams needs to go in the Electron list, but apparently they switched to Edge WebView2, so thanks for the second lesson! On macOS, that still means they install another Chromium browser per app, sigh...
(BTW, I'm not sure how you installed Google Messages for Web, but if you use Chrome it's just a PWA, meaning two fewer JS runtimes being installed.)
I took a quick look at my computer and I knew Obsidian was Electron... And now I see the Google Drive app seems to be shipping its own browser too(?)
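If you're curious which of your macOS apps ship their own browser runtime, a rough sketch is to look for the bundled framework inside each app (assumes apps live in /Applications and keep the framework at the usual top-level spot; apps that nest it inside helper bundles will be missed):

    # Rough sketch: list /Applications apps that bundle a Chromium-based
    # runtime (Electron or CEF). This shallow check misses apps that
    # hide the framework inside nested helper bundles.
    from pathlib import Path

    MARKERS = (
        "Electron Framework.framework",
        "Chromium Embedded Framework.framework",
    )

    for app in sorted(Path("/Applications").glob("*.app")):
        frameworks = app / "Contents" / "Frameworks"
        found = [m for m in MARKERS if (frameworks / m).exists()]
        if found:
            print(f"{app.name}: {', '.join(found)}")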
And a bunch of those aren't the same in the web version - Signal, WhatsApp, VS Code, 1Password and Plexamp are all Electron and are either unavailable or functionally not useful for me on the web.
Case in point, people have managed to run a bit of the modern internet on Amiga. Enough to be useful. But all of it? Forget about it.
Yep, that was me back in the day; I thought the same! :-)
This is so true. I wish MacBook didn’t have such shallow keyboards or I’d be all in. Maybe it’s improved recently but at one point it was like typing on a table. Always loved the travel of the ThinkPad keyboards.
I’ve got both, have used both for webdev work, but compared to a modern laptop, the screens suck, the video is underpowered for 4K monitors, the SSD interface is slow, and the trackpads are awful. On the bright side, trackpoints. (But I’m using an M13 w/ trackpoint so….) They’re also heavy, and battery life is almost long enough to work the whole way on my bus commute, one way.
Even if you don't use Linux, they typically come with a Windows Pro license built in.
For around $100 and some manual work you can upgrade to a 2K-resolution screen (say a B140QAN02.0), or heck, even 4K if you don't mind spending twice the laptop's worth on that.
Absolutely love that laptop; tricked out mine with 64GB of RAM (absolutely overkill, but hey, I could) and an X1E glass trackpad, and it's been my main dev laptop for the last 5-6 years.
A Wi-Fi 6E upgrade is somewhere on the roadmap for me as well.
Basically the last real big jump in performance was the SSD. I have used M1-M3 MacBooks at work. While they are faster, it wasn't that much faster compared to the switch from HDD to SSD. Even on-device voice dictation and other AI features worked pretty well.
As many have stated, software is getting slower. Security and all the other requirements will likely put more burden on your machine. So there may be a need to upgrade this 2015 machine in the future, but as far as I'm concerned, most of that has to do with memory rather than CPU performance. I could have a 2015 quad-core MacBook Pro with 32GB and I'm sure it would last me till 2030.
ARM and Qualcomm have both caught up to Apple in CPU performance. Oryon and Cortex X725 are now within ~12% (IIRC) in IPC, or even similar if you ignore a small type of workload. X730 and Oryon 2 are both expected to eliminate or even exceed that gap. Unless the A19 / M5 pulls some other magic trick, we have basically made high CPU performance a commodity.
If there was one thing I would fix about the software industry with a magic wand it’s this constant need to be on a buying treadmill because software developers refuse to support legacy hardware.
That is grinding to a halt. Chips are making only modest performance gains with each new fabrication node, and the time between nodes is stretching to 3 years. Not only that, but it looks like GAA FETs at 1-2 nm (a marketing name) are close to the end of the road.
Software is going to have to stop getting more bloated, and may have to become more efficient as people want to run it on smaller devices.
It depends on the software usage. If you're not running CPU-demanding tasks like rendering videos in Adobe Premiere, Blender 3D, etc., then very old PCs will continue to work fine.
The desktop computer I'm typing this comment on is a 10-year-old Intel i7-5820K 3.3GHz PC. Back in 2014, I maxed it out at 64GB RAM, but I took half out and reduced it to 32GB. I use it daily for VS2022, VSCode, VMware, MS Excel.
I also help maintain a desktop for my 80-year-old friend. Her computer is a 15-year-old i7-950 3.06GHz. That computer from 2009 runs Windows 10, and she uses it daily for Chrome browsing, YouTube videos (including 4K), Amazon shopping, and Mozilla Thunderbird email.
It's possible that Windows 11 with its TPM requirement may finally force a hardware upgrade of those dinosaurs but I read there are hacks to get around that.
I could definitely see how buying a new high-end PC today could last ~15 more years for typical consumers. On the other hand, the power users who want to run the latest LLM locally with 600-watt graphics cards that will be obsolete in a year are a different story. Today's NVIDIA 5090 with 32GB RAM may be too small to run the next latest & greatest LLMs for those who want to stay on the bleeding edge.
EDIT REPLY: >Why did you take half the RAM out?
It was a long story that I left out. The motherboard was unstable with all 64GB of RAM in it. It would lock up with RAM corruption after a few hours. Finding the root cause of this took several days of trial & error, swapping the 8 RAM sticks and running MEMCHECK on multi-hour scans. After testing and going through the process of elimination, it turned out that none of the RAM sticks had defects. The defect was the motherboard itself. Take any 4 of the 8 RAM sticks out so it's 32GB, and everything is super stable.
I was just mentioning the 32 GB RAM without all that backstory to emphasize that I've gone 10 years without being at the more "future-proof" 64 GB.
This is what killed a lot of computers in my company's laboratories.
Having said that my 8GB MacBook Air runs the unit tests for my current project four times faster than my 2018 i7 Mac. I will upgrade within a couple of years.
I mean, depends on what software you use. There's a pretty sizeable and growing ecosystem of people who put a lot of thought into performance. Just look at tooling like ripgrep, some of the newer terminals people have been working on, recently I came across a pretty nice neovim plugin where someone had written their own custom SIMD fuzzy string matcher (https://github.com/saghen/frizbee). There's some pretty admirable effort people put into performance these days.
I think speed of your setup is mostly limited by how willing you are to look for better alternatives.
I have an old Windows 7 laptop and the newest versions of the Chromium browser (Supermium fork, for legacy Windows compatibility) run far faster than any versions of Firefox or Internet Explorer ever did.
I doubt websites load faster. Statistics show that modern websites load slower on modern hardware than old websites did on the hardware of their day. I don't see why it would be different on Windows 7.
I build computers to last - the specs were high-end at the time, and have been upgraded over the years (video card, RAID controller, SSD's, etc). Even though it's getting long in the tooth, the box is still reasonably performant today.
It's highly customized; the case sports thoughtful additions like sound-dampening foam, bespoke brackets for additional cooling fans (all Noctua of course), hardware thermostats & monitoring LCD, interior lighting that activates when you open a panel even if the machine is off (makes it a pleasure to work with when under a desk), etc.
Choices that really panned out well include: InfiniBand (this was back when 10G NICs were stupid-expensive, but eBay was flooded with great second-hand Mellanox cards off universities), Areca (their RAID controllers and arrays were so easily upgradeable across generations), ECC RAM everywhere, and an external PCI-E expander (six x16 slots just weren't enough).
It has in the range of 1000 software titles installed, countless of them used regularly (guess I'm somewhat a jack of all trades). Specialized diagnostics and tooling track and isolate changes made by software, which has helped manage things and prevent bloat accretion. I periodically run benchmarks to ensure metrics like bootup time, disk transfers, etc. still match out-of-the-box numbers.
When you have to install and configure that many apps, migration is a real pain, which motivates longevity (and a collateral reduction of e-waste).
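For anyone wanting to do a similar periodic spot check, here's a rough sketch of a sequential-write benchmark (the path and size are placeholders; use a size well beyond your RAM if you want cache effects out of the picture):

    # Rough sketch of a sequential-write spot check; compare the number
    # against what the drive did when new. TEST_FILE and SIZE_MB are
    # placeholders -- point them at the disk you care about.
    import os
    import time

    TEST_FILE = "bench.tmp"        # placeholder path on the disk under test
    SIZE_MB = 1024                 # total data written: 1 GiB
    CHUNK = b"\0" * (1024 * 1024)  # 1 MiB per write

    start = time.perf_counter()
    with open(TEST_FILE, "wb", buffering=0) as f:
        for _ in range(SIZE_MB):
            f.write(CHUNK)
        os.fsync(f.fileno())       # make sure the data actually hit the disk
    elapsed = time.perf_counter() - start

    print(f"sequential write: {SIZE_MB / elapsed:.0f} MB/s")
    os.remove(TEST_FILE)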
Out of interest - what do you use the extra slots for? At most I can think of:
- NIC
- GPU
- NVMe
- HBA
- Maybe old protocols like FireWire/SCSI/GPIB
I am also expecting to reuse my current daily drivers (like I did before) as backups or auxiliary machines. My laptop keyboard has some loose keys and my phone screen started to die, but they still have a lot of compute to give.
If only we could have a bigger percentage of people that thought the same way. Then we might be able to get away from the insanity of marketing for new New NEW when what you have will do. Maybe these huge “tech” companies will be taken down a peg into more sane valuation territories. Maybe we’ll stop with the mounting piles of e-waste driven by the advertisers pushing FOMO of not having the shiniest.
A guy can dream though.
I figure I’ll slow my pace of upgrades even more than I have now and when the software becomes yet a larger pile of bloated nonsense shat out by clueless developers than it already is, I’ll switch back to writing letters.
The problem is that it doesn't. Software developers get the latest hardware, either because they like it ("it's my job") or because their company pays for it. As a result, they write software that works on their hardware, which is obviously more forgiving in terms of performance. Eventually, everybody has to update their hardware because their current hardware can't load a simple website or a chat app.
I can see a huge difference when loading websites on my work computer vs my personal computer. Just last month my weather forecast app was updated and became literally unusable on my phone. Of course I can't use the old one anymore, so I don't have access to the public national weather forecast app. It works great on more modern phones though... showing exactly the same data as the previous app, but... I guess it looks more modern?
Developing on high-end laptops should definitely not be an excuse to deliver slow software, and in the teams I work in, we do pay attention to performance. You are right though, a lot of software is a lot slower than it should be and my opinion is that the reason is often developers that lack fairly basic knowledge about data structures, algorithms, databases, latency,... One could say that time pressure on the project could also play a role, but I strongly believe that lack of knowledge plays a much bigger role.
Now, aside from that, also keep in mind that users (or the product owner) become more and more demanding about what software can and should do (deservedly or not). The more a piece of software must do, the more complex the code becomes and the more difficult it becomes to keep it in a good state.
Lastly, in my humble opinion, the lowest range budget laptops are simply not worth buying, even for less demanding users. I think that most users on a low budget would be better off with a second-hand middle or high range laptop for the same price. (I am talking here about laptops that people expect to run Windows on, no experience with Chromebooks.)
I disagree. For all my life, customers have been asking for as much as they can imagine. Customers wanted flying cars long before they wanted the latest iPhone.
The thing that changed is that we realised that if we write lower quality software that has more features (useful or not), customers buy that (because they are generally not competent to judge the quality, but they can count the features). So the goal now is to have more features.
> I think that most users on a low budget would be better off with a second-hand
Which is exactly the problem we are talking about: you are pushing for people to get newer hardware. You just say that poorer people should get the equivalent of newer hardware for the poors. But people on a budget would actually be better off if they could keep their hardware longer.
Can't have your cake and eat it too. It's not all laziness. How long did it take to get Doom to run on a toaster? ;)
I am genuinely trying, but I am finding it hard to find modern software that qualifies for those words.
Is Slack "super cool and useful"? Is Word/Excel a lot cooler and more useful than... well honestly 20 years ago? Does Microsoft Teams qualify for that? Facebook? Instagram?
I don't think that more powerful hardware allows developers to write "cooler" and "more useful" stuff. What it allows is to write more, faster. Since the early 2010s, it feels like we've specialized in writing worse software, but writing a lot more of it.
That is literally how modern capitalist consumer economies work. The whole system is based on the assumption of more people buying more things they don't need, computers and otherwise.
Our society is that way intentionally.
A single NVMe SSD can now push over 10GB/s.
Main memory bandwidth is now over 100GB/s on midrange hardware.
At 512GB/s for an x16 link, that is 32GB/s per lane (x1). If SSDs stay with x4 links, that is a physical-layer throughput of 128GB/s.
https://pcisig.com/blog/pcie%C2%AE-70-specification-version-...
https://www.youtube.com/watch?v=55NAeEwEqtQ
PCIe 6 is half of that, but we probably won't see anything in the consumer space until 2026 or 2027. That is probably two Ryzen releases from now. So really, Zen 7 ships for Christmas of 2028 along with PCIe 6 and we get x4 NVMe SSDs with 64GB/s of bandwidth.
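Spelling out the per-lane arithmetic from the x16 figures above (raw link bandwidth; real-world throughput loses a bit to encoding and protocol overhead):

    # Per-lane and x4 numbers derived from the x16 figures quoted above.
    GEN_X16_GBPS = {"PCIe 7.0": 512, "PCIe 6.0": 256}

    for gen, x16 in GEN_X16_GBPS.items():
        per_lane = x16 / 16
        print(f"{gen}: {per_lane:.0f} GB/s per lane, "
              f"{per_lane * 4:.0f} GB/s for an x4 SSD")
    # PCIe 7.0: 32 GB/s per lane, 128 GB/s for an x4 SSD
    # PCIe 6.0: 16 GB/s per lane, 64 GB/s for an x4 SSD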
Me back in 2005 would have thought this setup was science fiction.
Alan Kay has talked about this many times, when a new technology comes around most people just see it as doing the same thing you could already do just faster, rather than enabling entirely new ways of doing something.
By all metrics the web is a slow buggy mess, but it's inherently different from a set of manpages and email addresses. While it's true that you don't "need" to do anything are you sure throughout the next 30 years that you will have no usecase for a local L*M as one example?
For daily drivers I agree with the author; fresh machines can last almost a decade. Browser, productivity, etc. now work well on a phone, so they should work well on a 6-8 year old PC (if you can avoid Windows). My 2013 Mac Air runs Mint Linux very well and does all this.
A separate server lets you decouple needs, get specific hardware, and have a simpler, independent update cycle. Here too, buying refurbished enterprise hardware should last a local user 5-8 years, again avoiding buying new…
Demanding gaming is the exception to this idea, but I only play games from 5 years ago, as my backlog is huge and I mostly buy on Steam deals. No more day-1 patches for me.
If they can tell it’s faster then certainly a technical person like myself can.
And also, that was an incredibly cheap upgrade. In 12 years they went from one $600 computer to another $600 computer. That’s right, the new one was the same price, so cheaper than the original after inflation; they’ve paid $50 a year to compute, and that’s on the world’s most premium brand of computers.
Sure, you don’t need to upgrade anything. And for now, the Ryzen 3600 is a fantastic “old” processor, it runs my game server and it’s certainly capable.
But it’s not like you wouldn’t notice a far better experience someday in the future with an upgrade.
I expect to replace the desktop components in a few years when something breaks. Broken CPUs due to age are extremely rare, but mainboards with bad contacts for memory are pretty common, I've seen a lot that don't work that well after 8-10 years. I don't expect a desktop PC to work forever, the PSU will break in 10 years anyway, the SSD will reach write limit (I did a few already). But right now performance is not a concern.
That said, I enjoy the hardware improvements happening, because it allowed me to go from that huge full tower desktop with multiple GPUs and water cooling to doing everything I need in my life, pretty much, from a 14" M1 Max Macbook Pro. I replaced a huge, power-hungry device, with something that's tiny, portable, and can be powered off USB.
For me, I am not quite ready to replace my M1 Max MBP with an M4, but I am likely going to bump to an M5, simply for performance when editing photos in Lightroom / DxO Photo Lab, but that's more of a recent requirement. Before I got this M1 Max MBP, I had a 2015 15" MBP that worked just fine for 7 years, and would have kept going if it didn't have a bulging battery and I decided to upgrade rather than repair it. I may just stick to my M1 Max MBP in the end, I can be patient.
> The report about the cost of planned obsolescence by the European Environmental Bureau [7] makes the scale of the problem very clear. For laptops and similar computers, manufacturing, distribution and disposal account for 52% of their Global Warming Potential (i.e. the amount of CO₂-equivalent emissions caused). For mobile phones, this is 72%. The report calculates that the lifetime of these devices should be at least 25 years to limit their Global Warming Potential.
https://wimvanderbauwhede.codeberg.page/articles/frugal-comp...
This is for consumer devices btw, probably not if you operate some server farm with high occupancy (steady load on all hardware)
My 2011 i5 desktop is still happily chugging away as a build server, home storage, and remote host. But oh yes, it will have to be nuked, thanks to MSFT policies.
Certainly there are people with significant performance needs. For the rest of us there's a Mac Mini or an iPad.
Ultimately the Mac mini was a better purchase, delivering performance unknown to Intel platforms (using only a tenth of the energy).
There is always new tech. Local LLMs and other high processing intensive things might be a thing people want. Not directly, but it may enable things they want. More viral TikTok videos. Maybe some kind of health monitoring. Maybe AR will finally get a compelling use case if it can identify everything in your field of view but it requires serious computing power. Maybe AR 3D movies where the characters show up in your house and adapt to your living room. Siri might suck, but lots of people want a "Star Trek" computer that actually understands them.
The point is not any specific example. Rather, it's that there's always something around the corner that needs more computing power. I have no idea what it will be, but I'm confident something will appear.
But here's the thing: in the 1990s, people pretty much needed to upgrade regardless of what they were doing. Sure, you had a few holdouts. These were people who would continue to use Wordstar and had no interest in exchanging documents with people who use that newfangled Microsoft Word. These people were the exception rather than the rule, since most people wanted to be able to share their documents, get onto the internet, or any other number of things. Chances are, they also had multiple reasons to upgrade.
The situation is quite different today. You can get away without upgrading because most of the software, if not all of the software, you need will run just fine on an old PC. As for the other stuff, maybe you'll have one or two reasons to upgrade. Is that enough to justify it? The answer is going to depend upon the person, and the actual task they need to complete. For most people though, I would suggest that they don't feel the same compulsion to upgrade their computer.
I suspect at some point the new "YouTube" (3D volumetric video, holodeck, or something) will come out; it will be as popular as YouTube and as much of a "must-have" feature, such that 95% of the population will want a computer that can do this new thing, and today's computers won't be able to do it.
Linux can extend the life for a while, until that too becomes unsupported.
Does anyone use those other than spam?
I guess IDEs and even iOS are shipping them, albeit far from SOTA in usefulness. The low latency on iOS is noticeable, though.
All that cost less than a typical PC I’d build in the late 90s.
Even as a power user who codes, I can’t imagine what I’d need more for, unless I want to train AIs.
I don't even use that system much because my M1 Pro macbook can do almost all the same things.
"software gets slower to counteract hardware getting faster" is mostly true, but what's more true is that "software gets slower to counteract the developer's hardware getting faster". Devs (or their employers) aren't feeling too compelled to upgrade, and so they don't, and so software is staying fast(ish). Apple's annoying RAM-upgrade pricing is likely helping here, too.
(By the way, I've diverted my hardware-upgrade itch into photography gear)
1. Spare parts: RAM will fail (it's 1666 MHz), keyboards wear out (I've got one spare left), etc
2. Support wanes for some old hardware. I already can't update NVIDIA driver past a certain release (I'm on Linux.)
Sooner or later I'll have to buy something new just to be able to read my screen or to cope with a failed irreplaceable part.
It's a coin toss whether I go Linux or Windows 11 once 10 becomes unusable.
Which means MS is forcing people like me to either buy a few new computers or finally commit to Linux.
Feels like game devs have come out of their covid slumber and decided it’s time to jack up requirements
I recently did an interim upgrade (5800X3D and 3090) so will try to hang on for a few more years.
I got off the Apple train because my first iPhone got too sluggish for me to want to use at 2 years old when there was an OS update.
Then the pain is finding a home for my old PC.
I heard about a guy on Facebook who builds and configures PCs for free (free labor, not free parts). He only does a couple each year. That sounds like a pretty fun hobby.
Today, £1100 will buy you a MacBook Air with an 8-core CPU, 16GB RAM and a 256GB SSD
In 2015, £1100 bought me a desktop PC with a quad-core 4GHz CPU, 32GB RAM and a 250GB SSD
(Yes, obviously, there's been inflation, comparing a laptop to a desktop is a little unfair, the newer machine's RAM will be faster, one includes a built-in screen, etc etc etc)
I still deal with 20 minute compile times. Let me know when that drops to 10 seconds.
I'm reminded of the dead parrot sketch - this thing wouldn't "voom" if I put four million volts through it.
I fired up my old iPod the other day to get tunes in my shop. Hasn’t been updated in 7 years and still has all the music I like. Doesn’t come with a subscription. Still has genius playlists. Remember those? they were great. I’m so glad Apple hasn’t thought to or isn’t able to brick it.
The biggest improvement in computer speed for the last decade or so comes from more RAM, and much as Apple wants you to think it is, it ain’t scarce.
A five-year-old laptop with 16GB of RAM is totally fine for everything I ever do (32 if I want to splurge!). I mean, unless you want to run Windows 11 for some unknown reason. Which, to be perfectly clear, I do not suggest for anyone ever. Did you know Linux can run security updates in like 45s? Windows wants to download a 6GB patch and spend 45 minutes replacing your entire operating system every month. That alone should be enough to get everyone to nope.
Given how much junk we (over)produce as a species, buying retail for a lot of this stuff just doesn't make sense unless you need it immediately for business or work purposes.
So... yeah, I tend to agree.
Many of the new machines are actually worse, e.g. 3770K @77W vs. 14900K @125W/253W. That isn't to say they're not also faster, but if you actually use it you're burning more watts.
and yet I'm still tempted to upgrade every year :D
Bit zippier (not screaming), but it does have native support for Apple "Intelligence."
I was waiting for the M4Max/Ultra Studio, but, y'know, I realized that I have no need for that.
This has been working fine, for a couple of months. I suspect that I won't be replacing it, for a few years.
I probably will need to get a new iPhone, and maybe iPad, sometime in the next year or so (also for Apple Intelligence stuff), but I'm in no hurry.