I will never need to buy a new computer again
82mhz.net | 108 points by ecliptik 9 hours ago
I am typing this on my early 2015 MacBook Pro, a dual-core i5 with 8GB of memory. For 99% of usage on Chrome and Firefox it is fine. (After 12 years of using Safari, it is still the worst browser for handling many tabs.)
Basically the last real big jump in performance was the SSD. I have used M1-M3 MacBooks at work; while they are faster, the difference wasn't as dramatic as the switch from HDD to SSD. Even on-device Voice Dictation and other AI features worked pretty well.
As many have stated, software is getting slower. Security and all the other requirements will likely put more burden on your machine, so there may be a need to upgrade this 2015 machine in the future, but as far as I am concerned most of that has to do with memory rather than CPU performance. If I had a 2015 quad-core MacBook Pro with 32GB, I am sure it would last me till 2030.
ARM and Qualcomm have both caught up to Apple in CPU performance. Oryon and Cortex X725 are now within ~12% (IIRC) of Apple's IPC, or even on par if you ignore a small class of workloads. The X730 and Oryon 2 are both expected to eliminate or even exceed that gap. Unless A19 / M5 pulls some other magic tricks, we have basically made high CPU performance a commodity.
My daily driver is also a (16GB) 2015 MBP and the only reason I’m starting to consider an upgrade is because of the scary “unsupported OS” warnings that brew is starting to show. I don’t really look forward to the day when that warning changes from “things could break at any time” to “your system is now broken”.
I've mentioned this in other threads, but I run a small side business refurbishing and selling old laptops. One element of my work is saving retro machines for retrocomputing, old hardware interfaces, etc., but I also refurbish and sell for general use.
For the average person with average needs, there is no difference between, for example, a $100 Dell Latitude E5530 from 10+ years ago and a $600 Best Buy low-end Dell laptop from today, so long as the Latitude has been modestly upgraded with 8GB of RAM and a small, used SSD. Its 3rd generation i5 is more than enough to do anything they need. It even runs Windows 11 just fine, so long as you inform the customer about the need to manually install feature updates.
For the general public, buying new computers is an expensive scam that contributes massively to waste. The machines I refurbish would typically have been thrown out or 'recycled' (stripped for precious metals in an expensive process) if not for my intervention. There's no reason for this except number-go-up greed, and it should stop.
I'd argue that new, low-end laptops are in the $300-$400 range. Most people would be better served by a new laptop, instead of a decade old refurb. Sure, basic tasks might not need additional processing power, but things like better battery life, higher resolution screens, fast solid state drives, better webcams, and network adapters supporting newer wifi/bluetooth standards are things the average person would notice and benefit from.
I doubt the average person knows how to, or is willing to, manually install feature updates to keep running Windows 11 on an unsupported laptop. Refurbishing is great, but I'm not sure how much more you can get out of a 10+ year old platform. I think the sweet spot is a 3-6 year old platform, where a refurbished unit will be a decent bit cheaper but still have a good bit of life left.
I think the use case you imagine most of my customers have is not the one they have. Most of my customers need a laptop for a handful of "can't do it on the phone" things that they do occasionally - taxes, bookkeeping, a Zoom call here and there. They're not daily driving it like you or I would. Another large plurality need a Chromebook-like device for school (I often install ChromeOS Flex on the lower-end machines, if it's compatible, to achieve this, and sell them for $40-50 each).
The point that others have made about business laptops vs consumer laptops is also salient. Most of what I am refurbishing is business-grade and therefore has held up quite well in terms of build quality.
I do also do quite a bit of business in the ~4-6 year old machine world, but that's a different demographic of customer from my average.
For laptops specifically, the technical specs don't matter for most use cases, but the "quality-of-life" things absolutely do: screen resolution and brightness, keyboard and trackpad comfort, and battery life.
It's hard for me to recommend most ~$500 Windows laptops when they skimp out on those things to lean into specs, while older-model Apple Silicon MacBook Airs are just a bit pricier but absolutely deliver on quality-of-life.
Yep, Apple likely got a bunch of lifetime customers during the decade long period they spent not leaning into specs in favor of putting every dollar into quality of life.
Gamers and power users of course shunned them for so long saying, "you could get a better laptop for half the price!" but it's a testament to how good the build quality was that the full force of tech enthusiasts telling everyone not to buy it wasn't enough to sway people away.
Everywhere but the low end the point has become kind of moot these days for the most part, Apple has beefy specs now and mid-high range Dells and Thinkpads have good build quality and QoL. I think speaker quality is the most noticeable difference between Apple and Dell where Dell just doesn't value it as anything other than an afterthought.
I'd argue that there's no better value right now for a basic computer than the Mac Mini M4 standard for $500 (been on sale for this price 2-3 times at various places since release, and it's the standard Education price at Apple store).
Refurbished laptops can be superior to cheap Best Buy laptops. These old machines are often more solidly built, have better keyboards, and may even have better screens (FYI, brand-new laptops with cheap 1366x768 TN screens are still being manufactured).
A good refurb should at least have an SSD and a battery in good condition.
> Refurbished laptops can be superior to cheap Best Buy laptops.
They can be, but there's an inflection point of age. For ~400 USD you can get an all-E-core i3-N305/512GB SSD/8GB RAM/1080p laptop - which is about on-par for performance with a midrange 4-core CPU from the final 14nm mobile chips (Comet Lake, 2019). With the N305 you get notably lower power draw under load.
My 10-year-old laptop has an FHD screen and half a TB of SSD. Battery life is not as great as today's laptops', but that's a tradeoff my family is willing to make, because they transport the laptop but rarely use it on battery.
The battery can be changed easily, and the memory can be replaced in case of failure or upgraded if needed.
It doesn't support Windows 11, but it happily runs 10, a browser, and the entire Office suite. It's built in a plastic/aluminum chassis, so it's fairly sturdy, and the keyboard is not mushy like low-end plastic keyboards.
Such a laptop is worth far less than a low-end new one (if not nearly $0), but it's much snappier.
The value of that laptop is definitely not $0. It's probably $50-150 depending on the specifics of the machine.
Battery life is one of the biggest issues there isn't a good way around. Replacement non-OEM batteries are extremely variable (and often pretty poor) in quality.
Also, it probably does support Windows 11, as long as you're OK with manual installation of the once-a-year feature updates.
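For anyone curious, the workaround usually mentioned for in-place upgrades on officially unsupported hardware is a single registry value that Microsoft itself has documented (at your own risk; it bypasses the CPU and TPM 2.0 checks, and Microsoft warns such machines aren't guaranteed to receive updates):

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\Setup\MoSetup]
"AllowUpgradesWithUnsupportedTPMOrCPU"=dword:00000001
```

Even with that key set, feature updates don't arrive automatically through Windows Update, which is the once-a-year manual step mentioned above (running the Installation Assistant or mounting the ISO).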
I only buy second-hand laptops for myself, but I think the power/efficiency gains of modern CPUs should not be overlooked. The average person doesn't need a strong CPU, but the laptop should not burn their lap or run out of battery while they browse Facebook.
Oddly enough, very few folks I know use their laptops without being plugged in most of the time. I know I rarely do.
And, refurbished business laptops tend to have better keyboards than consumer grade laptops, as well as a better build quality (generally speaking).
If you don't mind me asking: how did you first build your client-base? And even now, how do people know that you sell those products?
I started out with an eBay store, and expanded to Facebook Marketplace and Craigslist. I'd say about 50% of my sales are on eBay, 40% on FBM, and the remaining 10% through Craigslist - but it changes a lot based on time of year and what the product is. Windows XP machines almost exclusively sell through eBay; more expensive machines almost always sell through FBM.
Most of my buyers buy one machine and are done, but a substantial subset are return customers. Return business is all about the usual stuff: Reputation, level of service, quality of the product. I have a few "regulars" who have bought 4+ machines from me, usually hobbyists or small business owners with a specific need. Having done this for a couple of years, I also now get return customers semi-frequently who bought something a while ago, like it, remember me, and seek me out again.
I don't offer warranties or guarantees on items, but I'm aggressively inexpensive compared to the competition, and I operate on an informal "if I broke it, I fix it" policy. If I told you something was working and then later it breaks, I either fix it, swap it out, or replace the machine.
I have a five-star reputation on Facebook Marketplace and 99.6% positive feedback on eBay as a result of this policy, so it works decently well.
Where do you buy the old machines from?
Mostly from local recyclers; also sometimes from folks on Craigslist who are looking to sell instead of recycle. Prices range dramatically based on the machine but $10 or less for a Windows XP machine is normal, $15-20 for a 4th-6th gen i3/i5, and it goes up from there.
I have relationships with a few recyclers and also volunteer my time with a nonprofit that also offers refurbishing; I frequently buy the machines they don't have time to deal with for low prices.
Recyclers have a huge advantage in getting their machines for free; I am too small-time to do that effectively. However, I take on the repair jobs they're not willing to do, so I get a pretty good deal on the inventory they have.
I feel like the average person doesn't even use a computer/laptop anymore. Smartphone + TV covers most cases; an iPad if you are a bit more advanced.
Exactly - but every now and then, they might need one for something, and a $60-100 laptop will do fine in that case.
I splurged for an M4 Pro Mini w/ 64GB, 5TB between internal and external SSDs, then it's over to a ~30TB NAS if I need more.
All that cost less than a typical PC I’d build in the late 90s.
Even as a power user who codes, I can’t imagine what I’d need more for, unless I want to train AIs.
Give it enough time and a new version of Slack will eventually grind to a halt on your puny 2019 CPU.
Will run into memory pressure long before then.
I have a 16GB M1 MacBook Air that feels very snappy for now, but with the pile of Electron apps I run daily (Visual Studio Code, Discord, Slack, Signal, 1Password, Plexamp, Google Messages for Web) and some misc other accessory apps (WhatsApp, Teams, Microsoft To Do, Excel, OneNote)... along with a browser, I'm basically running out of memory with my set of startup apps before I even open anything for "heavy" work.
Why not use those as web apps, though? They do exactly the same thing.
I've tried with Slack/Discord/Teams, and have found the web apps to not be particularly workable for my use. The Electron apps have better notifications, and I typically have dozens to hundreds of tabs open, which doesn't work well with keeping a handful of important tabs active. (Both active mentally/visually, and not put to sleep by the browser to save on resources.)
And a bunch of those aren't the same in the web version - Signal, WhatsApp, VS Code, 1Password and Plexamp are all Electron and are either unavailable or functionally not useful for me on the web.
Another idea is to use a separate browser or profile just for those web apps that need notifications and other settings you wouldn't want mixed up with your daily driver.
IDK what's wrong with notifications for you. For tabs, I pin the core ones (i.e. Slack, Harvest, Asana) to stay in front, and use tab groups for each project/ticket I work on for longer-term organization.
You could write a new client, but at some point you will run into the issue that they start adding features, faster and heavier and more mandatory, than you can keep up with by optimizing the software to run on your potato.
Case in point, people have managed to run a bit of the modern internet on Amiga. Enough to be useful. But all of it? Forget about it.
I'm sure you're right, but it's a sad state of affairs. I mean, is this what Marc Andreessen meant when he stated that famous phrase about software eating the world or something? That is, that software will get so crappy, so bloated, that it will simply engulf the world like the Blob from that old horror/sci-fi movie? I'm sure it's not; I'm just being silly, of course. And I know it's the old fogey in me, but I sure hope more lo-fi efforts like Gemini, tilde communities, etc. prosper and create other similar creativity... Because with all the processing power that exists in the world, among the things that slow our machines to a crawl is a freaking chat app??? Seems like such a waste. (Not blaming you, of course; I simply went off on a tangent from my frustration with bloatware like Slack, etc.)
You're right. Just look at the devolution of Windows by generation. As everyone on HN comments, Windows 11 is terrible compared to Windows 10, compared to Windows 7, etc. (bloat, unnecessary apps, intrusiveness, and so on). Advertising? WTF? Back when we were running Windows 3.11, we all thought: imagine if we had a faster computer?!
Is Slack that bad? I've only used it for Bernie organizing in 2020, but it worked fine on my (at the time) 9-year-old PC.
I have a T430s which was handed down to me by my boss 10 years ago. It has an i5 from that era and 8GB of RAM. I'm still doing web app development on it, same as 10 years ago, and I don't feel any need to change it. I'm actually afraid there is no better laptop I could replace it with when mine dies. Also, I can't imagine a better keyboard :(
The T420 keyboard is far better than the chiclet one on the T430s. It's a proper mechanical keyboard.
I've got both and have used both for webdev work, but compared to a modern laptop: the screens suck, the video is underpowered for 4K monitors, the SSD interface is slow, and the trackpads are awful. On the bright side, trackpoints. (But I'm using an M13 w/ trackpoint, so....) They're also heavy, and battery life is almost long enough to work the whole way on my bus commute, one way.
I have a T440p with 16 GB of RAM and a T480 with 40 GB of RAM (that can be extended again to 64 GB) and I'm also pretty happy with both and worried about the future of computing. I might stockpile a few of them!
Same, a T460 and a T480, I think. Upgraded to an SSD and maxed out the RAM. My only quibble is the max 1980x??? screen resolution. ThinkPads are so cheap when they come back from their leasing contracts; not sure why more people aren't excited about them.
Even if you don't use Linux, they typically come with a Windows Pro license built in.
You can upgrade the screen too, at least on the t480
For around $100 and some manual work you can upgrade to a 2K panel, say a B140QAN02.0, or heck, even 4K if you don't mind spending twice the laptop's worth on that.
Absolutely love that laptop; tricked out mine with 64GB of RAM (absolutely overkill, but hey, I could) and an X1E glass trackpad, and it's been my main dev laptop for the last 5-6 years.
A Wi-Fi 6E upgrade is also somewhere on the roadmap for me.
> ”T430…no better keyboard”
This is so true. I wish MacBook didn’t have such shallow keyboards or I’d be all in. Maybe it’s improved recently but at one point it was like typing on a table. Always loved the travel of the ThinkPad keyboards.
ThinkPad keyboards also got worse (flatter) every couple of generations, sadly. My laptop with the best keyboard is an R50e from 2004. The keyboards are still nice compared to most current ones, but...
He’s forgetting that software keeps getting slower. Forever and ever. With new hardware comes new expectations for hardware by software vendors.
>> With new hardware comes new expectations for hardware by software vendors.
That is grinding to a halt. Chips are making only modest performance gains with each new fabrication node, and the time between nodes is stretching to 3 years. Not only that, but it looks like GAA FETs at 1-2 nm (a marketing name) are close to the end of the road.
Software is going to have to stop getting more bloated, and may have to become more efficient as people want to run it on smaller devices.
You would hope that, but instead you end up paying for megabytes of data and thousands of lines of javascript to deliver adtech when all you wanted was to read a 1kb article or blog post on your phone.
Maybe this is also why we're seeing the rise of more statically typed compiled languages like Go and Rust. I've successfully run Rust on mobile and it's great, very snappy compared to web apps.
The rise? Those kinds of languages have always been here and widespread. If anything you’re seeing the tapering off of the rise in other languages (JS predominantly) that took place over the last 15 years or so.
Yes, that's what I meant, the usage of languages like Ruby (for Rails) and Python (backend, not for AI work) is dropping and static languages are rising.
>He’s forgetting that software keeps getting slower.
It depends on the software usage. If you're not running CPU-demanding tasks like rendering videos in Adobe Premiere, Blender 3D, etc., then very old PCs will continue to work fine.
The desktop computer I'm typing this comment on is a 10-year-old Intel i7-5820K 3.3GHz PC. Back in 2014, I maxed it out at 64GB of RAM, but I later took half out and reduced it to 32GB. I use it daily for VS2022, VSCode, VMware, and MS Excel.
I also help maintain a desktop for my 80-year-old friend. Her computer is a 15-year-old i7-950 at 3.06GHz. That computer from 2009 runs Windows 10, and she uses it daily for Chrome browsing, YouTube videos (including 4K), Amazon shopping, and Mozilla Thunderbird email.
It's possible that Windows 11 with its TPM requirement may finally force a hardware upgrade of those dinosaurs but I read there are hacks to get around that.
I could definitely see how buying a new high-end pc today will last ~15 more years for typical consumers. On the other hand, the power users who want to run the latest LLM locally with 600-watt graphics cards that will be obsolete in a year will be a different story. Today's NVIDIA 5090 with 32GB RAM may be too small to run the next latest & greatest LLMs for those who want to stay on the bleeding edge.
EDIT REPLY: >Why did you take half the RAM out?
It was a long story that I left out. The motherboard was unstable with all 64GB of RAM in it; it would lock up with RAM corruption after a few hours. Finding the root cause took several days of trial and error, swapping the 8 RAM sticks around and running MEMCHECK on multi-hour scans. After testing and process of elimination, it turned out that none of the RAM sticks had defects; the defect was the motherboard itself. Take any 4 of the 8 RAM sticks out so it's at 32GB, and everything is super stable.
I was just mentioning the 32 GB RAM without all that backstory to emphasize that I've gone 10 years without being at the more "future-proof" 64 GB.
>It's possible that Windows 11 with its TPM requirement may finally force a hardware upgrade of those dinosaurs but I read there are hacks to get around that.
This is what killed a lot of computers in my company's laboratories.
Nitpicking, but you can run Blender on dirt-cheap specs. I ran it on a $60 Chromebook and got a couple of good renders out of it; even did some VFX.
The only upgrade I made on my PC from 2014 was replacing an HDD with an SSD once those became affordable. It's still perfectly fine for the web, office, and dev work, and will likely last the rest of the decade.
I always think the Core 2 Duo was an inflection point in terms of performance. Before that every new operating system release seemed slow on even the most modern hardware. After it was fine.
Having said that my 8GB MacBook Air runs the unit tests for my current project four times faster than my 2018 i7 Mac. I will upgrade within a couple of years.
>He’s forgetting that software keeps getting slower
I mean, depends on what software you use. There's a pretty sizeable and growing ecosystem of people who put a lot of thought into performance. Just look at tooling like ripgrep, some of the newer terminals people have been working on, recently I came across a pretty nice neovim plugin where someone had written their own custom SIMD fuzzy string matcher (https://github.com/saghen/frizbee). There's some pretty admirable effort people put into performance these days.
I think speed of your setup is mostly limited by how willing you are to look for better alternatives.
Not all software gets slower.
I have an old Windows 7 laptop and the newest versions of the Chromium browser (Supermium fork, for legacy Windows compatibility) run far faster than any versions of Firefox or Internet Explorer ever did.
Websites are software. Most websites get slower.
Technically websites are markup. Web apps are software and their frameworks are getting faster.
Adding features makes software slower; that's an inevitable pain of progress. Suggesting we should stop improving things is ridiculous.
What do you consider as "Chromium running faster"? Like loading Chromium or closing a tab?
I doubt websites load faster. Statistics show that modern websites load slower on modern hardware than old websites did on their respective hardware. I don't see why it would be different on Windows 7.
Rust code apps are super fast on my linux laptop from 10 years ago too.
The machine I'm typing this on is a 'whitebox' build I put together in 2010.
I build computers to last - the specs were high-end at the time, and have been upgraded over the years (video card, RAID controller, SSD's, etc). Even though it's getting long in the tooth, the box is still reasonably performant today.
It's highly customized; the case sports thoughtful additions like sound-dampening foam, bespoke brackets for additional cooling fans (all Noctua of course), hardware thermostats & monitoring LCD, interior lighting that activates when you open a panel even if the machine is off (makes it a pleasure to work with when under a desk), etc.
Choices that really panned out well include: InfiniBand (this was back when 10G NICs were stupid-expensive, but eBay was flooded with great second-hand Mellanox cards off universities), Areca (their RAID controllers and arrays were so easily upgradeable across generations), ECC RAM everywhere, and an external PCI-E expander (six x16 slots just weren't enough).
It has in the range of 1000 software titles installed, countless of them used regularly (guess I'm somewhat a jack of all trades). Specialized diagnostics and tooling track and isolate changes made by software, which has helped manage things and prevent bloat accretion. I periodically run benchmarks to ensure metrics like bootup time, disk transfers, etc. still match out-of-the-box numbers.
When you have to install and configure that many apps, migration is a real pain, which motivates longevity (and a collateral reduction of e-waste).
> and an external PCI-E expander (six x16 slots just weren't enough).
Out of interest - what do you use the extra slots for? At most I can think of:
- NIC
- GPU
- NVMe
- HBA
- Maybe old protocols like FireWire/SCSI/GPIB
You can also buy second-hand to save another 50-80% when you do upgrades due to something catastrophically breaking. I got a used but very good quality mid-tier Ryzen laptop for $200 from a few generations ago, added 32GB of memory and an NVMe drive, and it's an absurdly good computer for dev work.
The 1080 Ti from 2017 still handles modern AAA games really well; I was able to play and finish Cyberpunk 2077 with no issues. Really a remarkable feat of engineering for its time.
I have the same card (+8700K) and I agree, it plays without issues and at decent frame rates... without ray tracing. Have you seen those videos? It looks amazing. There's a reason that Nvidia used it to demo their 50-series cards at CES. I'm waiting to play Phantom Liberty until I can play it with some kind of ray tracing.
I have never had a computer that was ever even close to being fast enough, and i doubt i ever will. I guess it depends on what you use them for.
I still deal with 20 minute compile times. Let me know when that drops to 10 seconds.
My almost-12-year-old laptop is still OK for my job. I like its 3-button touchpad, and touchpads have no buttons nowadays. I could keep it going for a long time yet but for two things:
1. Spare parts: the RAM will fail (it's 1666 MHz), keyboards wear out (I've got one spare left), etc.
2. Support wanes for some old hardware. I already can't update NVIDIA driver past a certain release (I'm on Linux.)
Sooner or later I'll have to buy something new just to be able to read my screen or to cope with a failed irreplaceable part.
This is simply not true. UI speed isn't increasing, because of systemic bloat (The Great EnFattening), but the throughput gains are immense.
A single NVMe SSD can now push over 10GB/s.
Main memory bandwidth is now over 100GB/s on midrange hardware.
I thought 7GB/s was the max, but you're right! Looks like I need to upgrade to PCIe 5.0!
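If you want to sanity-check throughput numbers like these on your own machine, here's a rough sketch (Python, purely for a ballpark; a real benchmark would use fio with O_DIRECT to bypass the page cache, so treat the printed figure as optimistic):

```python
# Quick-and-dirty sequential write throughput check.
# Writes 256 MiB in 1 MiB chunks to a temp file, then fsyncs so the
# data actually reaches the disk before we stop the clock.
import os
import tempfile
import time

size = 256 * 1024 * 1024          # 256 MiB total
chunk = b"\0" * (1024 * 1024)     # 1 MiB per write

with tempfile.NamedTemporaryFile(delete=False) as f:
    start = time.perf_counter()
    for _ in range(size // len(chunk)):
        f.write(chunk)
    f.flush()
    os.fsync(f.fileno())          # force data out of the page cache
    elapsed = time.perf_counter() - start

os.unlink(f.name)
print(f"{size / elapsed / 1e9:.2f} GB/s sequential write")
```

Reads are a separate story (and harder to measure honestly, since the file you just wrote is still cached), which is why dedicated tools exist.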
My experience so far is that you can get around 7 years of heavy usage from a premium product. It doesn't matter how much maintenance or care you have (I treat mine like it owes me money now, but I've been careful before), that's how far it goes without disappointing you.
I am also expecting to reuse my current daily drivers (like I did before) as backups or auxiliary machines. My laptop keyboard has some loose keys and my phone screen started to die, but they still have a lot of compute to give.
Energy consumption is nowadays the reason for a CPU update.
Unless you have a CPU from 2000, the energy savings probably aren't worth having a new one produced:
> The report about the cost of planned obsolescence by the European Environmental Bureau [7] makes the scale of the problem very clear. For laptops and similar computers, manufacturing, distribution and disposal account for 52% of their Global Warming Potential (i.e. the amount of CO₂-equivalent emissions caused). For mobile phones, this is 72%. The report calculates that the lifetime of these devices should be at least 25 years to limit their Global Warming Potential.
https://wimvanderbauwhede.codeberg.page/articles/frugal-comp...
This is for consumer devices btw, probably not if you operate some server farm with high occupancy (steady load on all hardware)
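The report's point can be made concrete with back-of-envelope arithmetic. Every input below is an illustrative assumption of mine (plausible embodied footprint, typical power draws, average grid intensity), not a figure from the EEB report:

```python
# Back-of-envelope: how long until a more efficient new laptop "pays back"
# the CO2e emitted to manufacture it? All inputs are assumptions.
EMBODIED_NEW_KG = 200.0   # assumed CO2e (kg) to manufacture the new laptop
OLD_POWER_W = 45.0        # assumed average draw of the old machine
NEW_POWER_W = 15.0        # assumed average draw of the new machine
HOURS_PER_YEAR = 2000.0   # assumed usage hours per year
GRID_KG_PER_KWH = 0.4     # assumed grid carbon intensity (kg CO2e/kWh)

saved_kwh_per_year = (OLD_POWER_W - NEW_POWER_W) * HOURS_PER_YEAR / 1000.0
saved_kg_per_year = saved_kwh_per_year * GRID_KG_PER_KWH
break_even_years = EMBODIED_NEW_KG / saved_kg_per_year
print(f"break-even after {break_even_years:.1f} years")  # → break-even after 8.3 years
```

Under these assumptions the replacement only breaks even after roughly 8 years of use, which is why the report argues for decades-long lifetimes rather than frequent upgrades.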
When I upgraded 5 years ago, general mechanical failure without available replacement parts was the driving factor, but energy consumption was high on my list. A light laptop with a long battery life is something that never used to exist, and it definitely improves my quality of life. If battery life at a low weight cost doubles in the next 5-10 yrs I'll probably upgrade again even if the machine is usable.
It's often a corporation's need for new revenue and security that keeps the upgrade machine marching on. Just look at this TPM / Windows 10 upgrade issue.
My 2011 i5 desktop is still happily chugging away as a build server, home storage, and remote host. But oh yes, it will have to be nuked, thanks to MSFT policies.
Not just Microsoft: look at Google's policy for app developers to target newer and newer devices, keep updating apps with no material changes besides this obsolescence, just to be allowed to stay in the Play Store
I just bought all new parts for an SFF build with a 9800X3D and 5090. So I guess this is a timely post. I know it's overkill for my needs but I love building PCs - been doing it since I was 12, and it was my first job (working at PC Club in SoCal).
My main machine is a 3.8GHz 8-core Ryzen, 64GB RAM, GTX 1070 GPU. Bought and self-built in ~2018, and still seems pretty good for development and the odd game. Even the 1TB Toshiba SSD claims to be healthy according to the monitoring app. It just zips along and copes with everything, and I've never felt any temptation to upgrade anything.
Me back in 2005 would have thought this setup was science fiction.
Now that there are great GPU-accelerated remote desktop options, I mostly just remote into more powerful machines. Even from a country away, the on-screen performance is almost like sitting at the machine, and as a bonus I don't hear every fan on my laptop going crazy. I've been a happy Parsec.app user for a while, but there are many other options (e.g. RustDesk has this).
I've been waiting for this to get good enough. Can any of these apps do passthrough of USB/webcam?
Looks like it's not supported in RustDesk or Parsec, but there are other tools that will do it [1].
Well, newer laptops are built with poor-quality plastics, so the hinge will break after 2-3 years. Older models are beasts, though. Even the budget Dell Inspiron 3520 (2013, ~$400) is still running fine as a YouTube streaming machine.
I just got rid of my 2013 Microsoft Surface Pro. It was still being used daily in my workshop, 11 years old. Core i5 processor, running Windows 10. I only got rid of it because the battery decided to become a spicy pillow one night, expanding until it cracked open the case and pushed out most of the touchscreen.
"640k is all you need!"
There is always new tech. Local LLMs and other high processing intensive things might be a thing people want. Not directly, but it may enable things they want. More viral TikTok videos. Maybe some kind of health monitoring. Maybe AR will finally get a compelling use case if it can identify everything in your field of view but it requires serious computing power. Maybe AR 3D movies where the characters show up in your house and adapt to your living room. Siri might suck, but lots of people want a "Star Trek" computer that actually understands them.
The point is not any specific example. Rather, it's that there's always something around the corner that needs more computing power. I have no idea what it will be, but I'm confident something will appear.
I'd argue lack of security updates makes older yet capable systems untrustworthy. You just can't be sure they won't be infected at some point.
Linux can extend the life for a while, until that too becomes unsupported.
The author pretty much acknowledges that with the YouTuber example. And yeah, you can create a fairly long list of other reasons why people may want to upgrade.
But here's the thing: in the 1990s, people pretty much needed to upgrade regardless of what they were doing. Sure, you had a few holdouts. These were people who would continue to use Wordstar and had no interest in exchanging documents with people who use that newfangled Microsoft Word. These people were the exception rather than the rule, since most people wanted to be able to share their documents, get onto the internet, or any other number of things. Chances are, they also had multiple reasons to upgrade.
The situation is quite different today. You can get away without upgrading because most of the software, if not all of the software, you need will run just fine on an old PC. As for the other stuff, maybe you'll have one or two reasons to upgrade. Is that enough to justify it? The answer is going to depend upon the person, and the actual task they need to complete. For most people though, I would suggest that they don't feel the same compulsion to upgrade their computer.
I agree there hasn't been a super strong reason to upgrade in the last 10 years. I just disagree with "I will never need to buy a new computer again".
I suspect at some point the new "YouTube" (3D volumetric video, holodeck, or something) will come out, it will be as popular as YouTube and such a "must-have" feature that 95% of the population will want a computer that can do this new thing, and today's computers won't be able to do it.
> Local LLMs
Does anyone use those for anything other than spam?
I guess IDEs and even iOS are shipping them, albeit far from the SOTA in usefulness. The low latency on iOS is noticeable tho.
Yep, I have a Lenovo E420 (I think?) that I bought when I graduated in 2011. If I replace it, it will be due to things like the USB ports not working, not due to processing power being insufficient. I don't game anymore, I can watch video on it, I can use the Internet and word processing. What does one DO with a high powered processor?
I felt like that from 2010 or thereabouts until two years ago. There wasn't really a use case for having a very fast machine. Now with LLMs and SD, I think there is a use case that happily absorbs any compute I can buy, just like first-person shooters in the '90s.
I'm still using my Dell XPS 7100 from 2009. It has an AMD Phenom II X6 1045T. Only upgraded the GPU over the years. Added more RAM and an SSD. It still works like a charm. Even the original keyboard it came with is the same. No need to upgrade to anything new as of yet.
I have a 3950x in my desktop and I feel exactly the same way. I have the upgrade itch but I can't justify it in cost/benefit terms.
I don't even use that system much because my M1 Pro macbook can do almost all the same things.
"software gets slower to counteract hardware getting faster" is mostly true, but what's more true is that "software gets slower to counteract the developer's hardware getting faster". Devs (or their employers) aren't feeling too compelled to upgrade, and so they don't, and so software is staying fast(ish). Apple's annoying RAM-upgrade pricing is likely helping here, too.
(By the way, I've diverted my hardware-upgrade itch into photography gear)
I thought the same about my 2020 Ryzen, until I started working with the Unity editor two months ago.
I'm reminded of the dead parrot sketch - this thing wouldn't "voom" if I put four million volts through it.
For me, the fun is spec'ing and building new PCs. I wish I could do it every year.
Then the pain is finding a home for my old PC.
I heard about a guy on Facebook who builds and configures PCs for free (free labor, not free parts). He only does a couple each year. That sounds like a pretty fun hobby.
As long as your computer runs a browser and a terminal emulator, it is more than fine.
Even when it takes 5 minutes from turning on the computer to loading a page on the browser?
A computer won't necessarily be slow just because it's old.
Today, £1100 will buy you a Macbook Air with an 8-core CPU, 16GB RAM and a 256GB SSD
In 2015, £1100 bought me a desktop PC with a quad-core 4GHz CPU, 32GB RAM and a 250GB SSD
(Yes, obviously, there's been inflation, comparing a laptop to a desktop is a little unfair, the newer machine's RAM will be faster, one includes a built-in screen, etc etc etc)
You’re probably not on Windows 10, because that makes sure you need a new computer. Personally I wouldn’t say I’d never need a new computer, but my i5 bought 11 years ago does everything I need (as a software developer) other than playing back 4K video recorded on an iPhone.
Which means MS is forcing people like me to either buy a few new computers or to finally commit to Linux.
Linux too eventually drops support for older hardware, though with certain LTS distros it can be as much as ten more years
After starting to run some LLM models locally, I would like a faster CPU, maybe with dedicated "AI cores" or whatever they are called. But the old CPU still works, and I'd need a new motherboard, RAM, ... So I'll probably keep using my PC until one of the parts finally gives up.
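For what it's worth, my back-of-envelope math suggests RAM is the bigger constraint than CPU for running models locally. This is a rough sketch with illustrative numbers (quantization levels and the overhead figure are my assumptions, not benchmarks):

```python
# Approximate RAM needed to hold a quantized model's weights, plus a
# flat allowance for KV cache and runtime buffers. Illustrative only.

def model_ram_gb(params_billions, bits_per_weight, overhead_gb=1.0):
    """Weight memory in GB for a quantized model, plus an overhead guess."""
    weight_gb = params_billions * 1e9 * bits_per_weight / 8 / 1e9
    return weight_gb + overhead_gb

# A 7B model at 4-bit quantization squeezes into 8 GB of RAM;
# 13B is tight; 70B needs a machine most of us don't have.
print(round(model_ram_gb(7, 4), 1))   # 4.5
print(round(model_ram_gb(13, 4), 1))  # 7.5
print(round(model_ram_gb(70, 4), 1))  # 36.0
```

So an older box with a RAM upgrade can plausibly run small models; the CPU just makes it slow rather than impossible.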
Electron would love to have a word with you.
Having to configure a new machine has put me off upgrading stuff. If I have to spend time doing that, it takes away from more important things. Obviously for major QOL improvements eg. Eye-level laptop screen, it's worth changing, but I've found very little that fits that metric.
That is why I love NixOS - migrating machines is trivial, as long as you are OK with using Linux, or with some finagling on macOS.
That's one of the reasons I'm happy I'm on a 4 year old Galaxy phone and it feels like new. Doesn't feel slow, battery life is great. I hope it lasts another 4 years.
I got off the Apple train because my first iPhone got too sluggish for me to want to use at 2 years old when there was an OS update.
4 years isn't that old for a phone though, not anymore at least. I put one smartphone away after 3 years because a chip on the motherboard broke, but feel like that's the exception (never heard of this failure mode from anyone else) and most phones should last 4+ years if you don't get unlucky with the battery (I can't identify why some batteries are good as new after 6 years whereas others can barely hold a charge after 2 — but anyway, those are replaceable)
I buy almost all tech refurbished / recertified - it's shocking how cheap off-lease equipment goes for on Newegg / Backmarket. Amazon has fantastic warehouse deals sometimes too, you just have to be patient and check occasionally. Especially with desktops, buy everything 'base' 4-8 years out of date, upgrade memory + storage, look for refurb monitor and peripheral deals, etc. Hell, I've gotten completely free shit (expensive stuff!) multiple times from Amazon because of UPS/FedEx screwing up or fudging deliveries! And don't discount Ebay either, sometimes pallets of crap end up on there trying to flog 100+ 'whatevers' at bananas prices, as well as finding weird/rare Chinese replicas of discontinued OEM hardware.
Given how much junk we (over)produce as a species, buying retail for a lot of this stuff just doesn't make sense unless you need it immediately for business or work purposes.
My only beef with this has to do with power usage, which can make some of the older computers just not worth it.
The one other use case that will need better hardware is gaming. Compiling is also always better when faster. Running LLMs locally will also benefit from new hardware, though I guess there is almost never a use case for those.
They may need to buy a new server though. (Site is down at the time of writing this)
If I don't build a new PC, what will my new 5090 run on?
I think I used my 2010 laptop for eight years. Upgrades: 120GiB (GB?) SSD.
I game so I had to upgrade my GPU and CPU but I'm going to ride Windows 10 for as long as I can.
It's a coin toss whether I go Linux or Windows 11 once 10 becomes unusable.
Energy consumption is a reason for a CPU upgrade.
(FYI: You've posted this comment twice, some seconds apart. I've responded to the older one)
Not really. The era of "modern efficient CPUs" started some 10-15 years ago. Under light loads, Ivy Bridge or Haswell is going to have a similar thermal profile to modern machines.
Many of the new machines are actually worse, e.g. 3770K @77W vs. 14900K @125W/253W. That isn't to say they're not also faster, but if you actually use it you're burning more watts.
This feels like a load of baloney to me. My most non-technical web browsing Microsoft Word-using relatives went from a 2012 Mac mini to an M4 Mac mini and the difference was night and day. They were extremely enthusiastic about the end result.
If they can tell it’s faster then certainly a technical person like myself can.
And also, that was an incredibly cheap upgrade. In 12 years they went from one $600 computer to another $600. That’s right, the new one was the same price, so cheaper than the original after inflation, they’ve paid $50 a year to compute, and that’s on the world’s most premium brand of computers.
Sure, you don’t need to upgrade anything. And for now, the Ryzen 3600 is a fantastic “old” processor, it runs my game server and it’s certainly capable.
But it’s not like you wouldn’t notice a far better experience someday in the future with an upgrade.
Don't worry, we software developers are going to ruin the software with AI features so that you will need to upgrade to a Ryzen AI Max+ 395 just to run an editor.
My 4 year old personal desktop PC agrees with him in full. My shitty work laptop that the big corpo paid less than $400 for, it screams every second for an upgrade: it takes 10 seconds to open VSCode without any project vs less than 2 seconds and it can barely paint the external monitors when moving a browser window or resizing it. It is also 4 years old.
I expect to replace the desktop components in a few years when something breaks. Broken CPUs due to age are extremely rare, but mainboards with bad contacts for memory are pretty common; I've seen a lot that don't work that well after 8-10 years. I don't expect a desktop PC to work forever: the PSU will break in 10 years anyway, and the SSD will reach its write limit (I've hit that a few times already). But right now performance is not a concern.
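On the SSD write limit: a quick sanity check suggests it's rarely the first thing to go for typical desktop use. The TBW rating and daily write volume below are made-up but typical numbers, not specs for any particular drive:

```python
# Years until an SSD reaches its rated write endurance,
# given a TBW (terabytes-written) rating and daily write volume.

def years_to_wearout(tbw_rating, gb_written_per_day):
    return tbw_rating * 1000 / gb_written_per_day / 365

# A 1 TB consumer SSD rated for 600 TBW, writing 50 GB every day:
print(round(years_to_wearout(600, 50), 1))  # ~32.9 years
```

Heavy workloads (builds, VMs, swap thrashing on low-RAM machines) can burn through that far faster, which probably explains the drives I've managed to kill.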
> But here’s the thing: I don’t need it. I don’t have a single usecase for which I would need this much processing power. In fact, I could still use that i5 from 2011 and it would do everything I want it to do perfectly fine. I didn’t need to upgrade, I just wanted to.
If only we could have a bigger percentage of people that thought the same way. Then we might be able to get away from the insanity of marketing for new New NEW when what you have will do. Maybe these huge “tech” companies will be taken down a peg into more sane valuation territories. Maybe we’ll stop with the mounting piles of e-waste driven by the advertisers pushing FOMO of not having the shiniest.
A guy can dream though.
I figure I’ll slow my pace of upgrades even more than I have now and when the software becomes yet a larger pile of bloated nonsense shat out by clueless developers than it already is, I’ll switch back to writing letters.
> when what you have will do
The problem is that it doesn't. Software developers get the latest hardware, either because they like it ("it's my job") or because their company pays for it. As a result, they write software that works on their hardware, which is obviously more forgiving in terms of performance. Eventually, everybody has to update their hardware because their current hardware can't load a simple website or a chat app.
I can see a huge difference when loading website on my work computer vs my personal computer. Just last month my weather forecast app was updated and became literally unusable on my phone. Of course I can't use the old one anymore, so I don't have access to the public national weather forecast app. It works great on more modern phones though... showing exactly the same data as the previous app, but... I guess it looks more modern?
Not to mention QA machines tend to be a lot cleaner than the real life machines running the app.
These software “engineers” that insist on the latest-specced toys are part of the problem. By not living up to their imagined title and actually engineering within constraints (constrained hardware performance which would beget more efficient software), they’re just punting and saying “oops, -I- didn’t do this”. But they’re not engineers and never will be until they take some responsibility for being a partial cause for the current mess.
As a software engineer, I insist on giving developers high-end laptops. The reason is very simple: a lot of development environments are very heavy to run, and developers should not waste time on their development tools running slowly. I also don't want developers to disable tools that are meant to keep an eye on the quality of the code. High-end laptops generally serve well for development for up to 5 years.
Developing on high-end laptops should definitely not be an excuse to deliver slow software, and in the teams I work in, we do pay attention to performance. You are right though, a lot of software is a lot slower than it should be and my opinion is that the reason is often developers that lack fairly basic knowledge about data structures, algorithms, databases, latency,... One could say that time pressure on the project could also play a role, but I strongly believe that lack of knowledge plays a much bigger role.
Now, aside from that, also keep in mind that users (or the product owner) become more and more demanding about what software can and should do (deservedly or not). The more a piece of software must do, the more complex the code becomes and the more difficult it becomes to keep it in a good state.
Lastly, in my humble opinion, the lowest range budget laptops are simply not worth buying, even for less demanding users. I think that most users on a low budget would be better off with a second-hand middle or high range laptop for the same price. (I am talking here about laptops that people expect to run Windows on, no experience with Chromebooks.)
> users (or the product owner) become more and more demanding
I disagree. For all my life, customers have been asking for as much as they can imagine. Customers wanted flying cars long before they wanted the latest iPhone.
The thing that changed is that we realised that if we write lower quality software that has more features (useful or not), customers buy that (because they are generally not competent to judge the quality, but they can count the features). So the goal now is to have more features.
> I think that most users on a low budget would be better off with a second-hand
Which is exactly the problem we are talking about: you are pushing for people to get newer hardware. You just say that poorer people should get the equivalent of newer hardware for the poors. But people on a budget would actually be better off if they could keep their hardware longer.
To a point yes. However remember that it takes time and effort to optimize the software down. And if you write it for slower hardware from the start it will be less capable and/or creative. Might be pretty cool still, or even useful. Just not quite as cool and useful.
Can't have your cake and eat it too. It's not all laziness. How long did it take to get Doom to run on a toaster? ;)
> Just not quite as cool and useful.
I am genuinely trying, but I am finding it hard to find modern software that qualifies for those words.
Is Slack "super cool and useful"? Is Word/Excel a lot cooler and more useful than... well honestly 20 years ago? Does Microsoft Teams qualify for that? Facebook? Instagram?
I don't think that more powerful hardware allows developer to write "cooler" and "more useful" stuff. What it allows is to write more, faster. Since the early 2010s, it feels like we specialized in writing worse software, but writing a lot more of it.
I'm doing my part. I'm writing this on a ~10 year old computer at home and my machine at work is ~9 years old. Both absolutely capable of doing what I need.
>Then we might be able to get away from the insanity of marketing for new New NEW when what you have will do.
That is literally how modern capitalist consumer economies work. The whole system is based on the assumption of more people buying more things they don't need, computers and otherwise.
Our society is that way intentionally.
Consumer Capitalism is neither driven nor perturbed by environmentally and clock-conscientious nerds.
I'm in this camp, perhaps a little more extreme: my daily driver laptop is a ThinkPad X230T, which I think is from 2011 or 2012. It is separate from a home lab - which I don't currently have, but for which I'll use hardware from a few years ago if I ever need one again. The only thing that can kill older hardware is software bloat - honestly, the web is the biggest culprit here.
My daily computer is a 7-year-old laptop. The battery has its issues, but it's still powerful enough for everything I do – except games, for which I have a refurbished 5-year-old desktop.
So... yeah, I tend to agree.
I just got a new M4Pro Mini (Apple). It replaces my M1Max 14-inch MBPro.
Bit zippier (not screaming), but it does have native support for Apple "Intelligence."
I was waiting for the M4Max/Ultra Studio, but, y'know, I realized that I have no need for that.
This has been working fine, for a couple of months. I suspect that I won't be replacing it, for a few years.
I probably will need to get a new iPhone, and maybe iPad, sometime in the next year or so (also for Apple Intelligence stuff), but I'm in no hurry.