While moving from one nest to another (we’re lemmings here; RP it a bit), I realized I still have all the computers I ever bought or assembled, except for those that broke beyond any hope of repair.
Some are no longer used daily, but all of them work, and being at a point in life where everything and anything in the nest needs to have a purpose or a function got me thinking about what actually renders a computer useless or truly obsolete.
I became even more aware of this because I’m in the market to assemble a new machine, and I’m seeing used ones - 3 or 4 years old - being sold at what can be considered store price, with specs capable of running newly released games.
Meanwhile, I’m looking at the two LGA 775 motherboards I have and considering how hard I can push them before they spontaneously combust, just to get some use out of them, even if only as a typewriter.
So, per the title, what makes a computer obsolete or simply unusable to you?
Addition
So I felt it necessary to update the post and list the main reasons surfacing for what renders a machine obsolete/unusable:
- energy consumption
overall, and consumption vs. computational power
- no practical use
Linux rules!
- the space it takes up
When you can no longer find a use for it.
I have a 12-year-old MacBook Pro at home with some Linux installed that runs perfectly.
Still, I have absolutely no use for it. There’s not much it can be used for other than browsing the web. And for that I have lighter devices with much better screens, so I prefer those anytime.
Media server? NAS? Use it to run your sprinkler system?
I have one of these and swapped the old HDD for an SSD, and it’s like a brand new machine. It’s still stuck on 10.13, but as a netbook and N64 emulator it’s great.
When the space, time, or power it requires is no longer a good trade for the task it completes.
I live in Asia, so the space something physically takes up is often the biggest cost. The footprint of my house is like 25 square meters, so if I want to keep a bunch of older computers around, I’m going to need to rent a bigger house.
My time has also grown more expensive over the years.
A very blunt answer in my specific case: the moment it can no longer serve as a DNS server. Which is a very low bar.
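To illustrate how low that bar is, here’s a toy sketch of the job (assuming Python and a hypothetical upstream at 1.1.1.1; a real setup would use something like dnsmasq or unbound, and this has no caching or error handling):

```python
# Toy UDP DNS forwarder - only to show how computationally trivial the job is.
# Not a real resolver: no caching, no TCP fallback, no concurrency.
# Binding port 53 needs root/admin privileges.
import socket

UPSTREAM = ("1.1.1.1", 53)  # hypothetical upstream resolver

listener = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
listener.bind(("0.0.0.0", 53))

while True:
    query, client = listener.recvfrom(512)  # classic DNS-over-UDP message size
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as upstream:
        upstream.sendto(query, UPSTREAM)
        reply, _ = upstream.recvfrom(4096)
    listener.sendto(reply, client)           # relay the answer unchanged
```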
Running a computer with the sole purpose of having a DNS server seems like a huge waste of power, regardless of how little power that computer uses.
Your argument is correct on its own, but it misses all the possible variations. Using locally generated renewables mostly defeats it, and not discarding the machine means less e-waste. If you’re trying to be environmentally friendly, remember:
- Reduce
- Reuse
- Recycle
In that order. Since I cannot reduce the number of computers I have already obtained, the next best thing is to reuse them. When that is no longer sensible, recycling is the third-best thing.
As of today, locally produced renewable power is a scarce resource. Any use of it competes with all other possible uses, and the uses that don’t get enough local renewable power draw it from somewhere else. So the waste argument still holds, unless maybe you already have solar panels on your house generating surplus power.
Power usage I guess.
If the cost of running a 2500K with a 790 exceeds the cost of switching to a cheap newer machine (not necessarily a brand-new one) over some two to three years, then that’s already a sign the old PC is dead.
How long are you running such a machine realistically, though?
I’m in Germany, so very expensive power, and my single person household costs me about 600€/year in power for everything. And I’m working from home, so about 100W baseload for 8-10h a day.
Unless a machine is running really long and doing something significantly more than idling, power usage is almost irrelevant.
I have this “rule,” which might be a bit dated, that 1 watt of continuous draw costs roughly 1€ a year (and it’s only getting worse).
So over, say, 5 years (a somewhat reasonable lifespan today, I think), your 180-watt PC used 8h/day would cost 300€ in power.
An older PC with a power-hungry GPU could use 400 watts => 666€.
A ThinkPad (OK, it doesn’t have a gaming GPU) would be more like 50€, and a good used one can be had for 200-300€.
You can also get a 4th- to 8th-gen Dell tower for 40-140€, add a cheap GPU, and you’ll have a Roblox or even Minecraft PC.
If you buy a brand new PC, then no, it most probably won’t be an economical investment in terms of power use. But old PCs suck (draw) power, and one day it’s probably economically viable to swap them for a more recent one.
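If you want to check that arithmetic, here’s a quick sketch of the rule (Python; the 1€ per always-on watt per year figure is the rule of thumb above, not an actual tariff, and the 30W laptop draw is back-figured from the 50€ number):

```python
# Running-cost estimate based on the "1 continuous watt ~ 1 EUR/year" rule of thumb.
EUR_PER_WATT_YEAR = 1.0  # scale this up as electricity prices rise

def running_cost(watts: float, hours_per_day: float, years: float) -> float:
    """Cost in EUR of drawing `watts` for `hours_per_day`, every day, over `years`."""
    return watts * (hours_per_day / 24) * EUR_PER_WATT_YEAR * years

print(running_cost(180, 8, 5))  # 180W gaming PC, 8h/day, 5 years -> 300.0
print(running_cost(400, 8, 5))  # power-hungry older build       -> ~666.7
print(running_cost(30, 8, 5))   # ~30W ThinkPad-class laptop     -> 50.0
```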
Which reasonable PC uses 400W?
BTW 600€ a year? You think that is expensive?
Is it 600€ just for your computers?
Read the comment again.
Power usage is a massive one for me. I go by £1/W/Year for consumption of always-on devices. (I think it’s more like £3/W/Year now!)
If the new 20W server can do the same work as the 100W server and will cost me less over 2 years including the purchase price, then the old server is obsolete.
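The same break-even logic in a short sketch (assuming the £1/W/year figure and a hypothetical £120 purchase price for the new server; swap in 3.0 for current prices):

```python
# Total-cost-of-ownership comparison for always-on servers, per the rule above.
GBP_PER_WATT_YEAR = 1.0  # try 3.0 for current prices

def total_cost(watts: float, years: float, purchase: float = 0.0) -> float:
    """Purchase price plus running cost for an always-on device."""
    return purchase + watts * GBP_PER_WATT_YEAR * years

old = total_cost(100, years=2)                 # keep the old server: 200
new = total_cost(20, years=2, purchase=120.0)  # buy and run the new one: 160
print("old server is obsolete" if new < old else "keep the old one")
```

At these numbers, any replacement under £160 pays for itself within the two years: (100 − 20) W × £1/W/year × 2 years.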
IMO a computer is obsolete when it can no longer run any desired programs. My laptop, for example, has outlived my much beefier desktop, since the laptop is basically just used for web stuff while my desktop is used for gaming, development, and the like. Gaming especially has seen its hardware demands increase significantly over the years, so a gaming PC might be rendered obsolete much faster than something used for the web. My old gaming PC that was rendered obsolete I repurposed as a server, and it works well for its new purpose and will probably live for a couple of years still.
So there isn’t any concrete limit at which you can say a computer has become obsolete. It is more a subjective assessment of whether the computer can fulfill its tasks to a satisfactory degree.
You should only get rid of computers when your home, your parents’ home, and your parents’ garage have all run out of space. My parents’ garage used to be an industrial building and is about as big as the house, so it can fit many ancient computers.
oh boy… i ask myself this a lot. i frequent the puppylinux community and dudes are out there slanging 32-bit computers with sub-1GB RAM all the time… much like others have echoed, the answer seems to be when the computer dies.
It’s all very arbitrary and depends on the individual’s definition of a computer.
Ultimately it does, I think, come down to practicality. Can I still use this thing to get what I need to do done, and can I still do it securely?
The security part can be more or less important depending on computer, as well. If you’re a Mac person, your machine may be obsolete as soon as Apple decides to stop giving you security updates. If you’re a Linux person, you can probably maintain a secure system easily on 10-15 year old hardware.
I would say when it becomes too slow for even basic tasks like browsing the web, or running an up-to-date operating system.
Today, I would say the bar is around 3,000-4,000 points on cpubenchmark for the CPU, 8GB of RAM, and an SSD.
You could definitely get a usable computer with less. I have a Pentium II PC that works great and can even connect to the Internet. But software today is far more bloated and inefficient than it used to be; such an old machine would be useful only if you don’t do anything computationally intensive and don’t need to run any modern software.
But something I forgot to mention about old hardware is that it allows you to run old software, old games… and there’s also the nostalgia of Windows XP, or Windows 98, the early web. They remind me of a simpler time…
For me, it’s hardware failure. If it’s damaged enough to be uncomfortable to use, it’s done. Similarly, if it can’t run a modern browser decently.
I just ditched a >10-year-old laptop that I used as a server. The display was off most of the time, and the battery offered some backup power. In its last months I couldn’t even use the power button; I had to take the mobo battery out and connect the laptop without it in order to turn it on. The touchpad wasn’t working either. The OS hard drive was failing, but that got replaced. I’m sure the thing still works fine, but I can’t find the right flex cables to connect the power button and touchpad to the mobo. Guess it’s going to the trash soon.
It can’t run Doom. But seriously, I question the “practical use” bit, not because it’s wrong but because it’s so completely situational. If you want it for a business, you probably need to beat AWS prices, but if you’re just goofing off, a replica of the Zuse Z1 is actually a substantial upgrade over an old XP desktop, just because of the huge cool factor. If you have some sort of basic but fairly practical personal need, the cutoff for usable will be somewhere in between.
In your situation, I’d figure out how many you want, and then keep the n best ones by your reckoning.
Shoutout to !retrocomputing@lemmy.sdf.org
The weird thing is that we’re currently at a point where even very old machines are perfectly usable if you’re not playing modern games.
My main computer is an i5-4670 (or something like that); it’s almost 10 years old, but for my Firefox/VS Code/Docker workload it’s pretty much as good as my M1 MacBook. Sure, some tasks take a second longer, but not annoyingly long.
This comment you’re reading brought to you by a laptop from Obama’s first term.
13-year-old kids on Twitter.
Pretty much the software you run on it and the support behind it. And for now, energy consumption, but I can imagine that 100 years from now that won’t be a factor anymore.
But that probably falls under “no practical use.”
I mean, with the proper software, you can still automate your house with a Commodore 64 or browse the web with an Amiga.
Imma need to see the Commodore 64 smart house now.
I moved to a laptop as my main system for portability, and I’m really enjoying the reduction in my power bill compared to my previous Threadripper 1950X build.
When it’s slow, and when opening recent programs locks it up and shuts it down. I’ve changed computers twice in 25 years for that reason. I think the first time was for GTA III. Fortunately, there are no more games worth changing a PC for.