Soon Twitter will complete its transition to Nazi Bar of Social Media.
And for those on the other side of the Atlantic, there are several computer shops that will just put a computer together for you without an OS.
Here’s a random example of a “configure your own computer” page from a computer shop in France. In this one the OS (“Système d’exploitation”) is not included and you have to pay extra for it.
In my experience with custom assemblies like this the OS is never included.
When I lived in the UK I even used one of these kinds of stores at some point to get a custom notebook.
It’s basically an “assemble your own computer” service for people who don’t know how to do it and aren’t confident enough to try (understandable, given that the parts of a whole desktop PC add up to at least €1000, so there’s generally some fear of fucking it up if you’ve never done it before).
I don’t think the sensors really matter for a server but the rest makes some sense.
Still, 80 bucks will quite literally buy you a mini PC (a really crummy one, granted) which can run more server tasks, because it has as much or more memory and storage and isn’t hindered by an Android OS layer sitting there doing nothing useful, and which is absolutely 100% under your control because it boots into your OS of choice.
Half of that will buy you a crummy SBC which in practice probably has as much capability to run server tasks as that OnePlus (it’s weaker, but doesn’t have Android there eating up resources), though in my experience those things tend to be a bit finicky.
I don’t think it’s actually worth it to spend $80 on a used phone to use as a server (unless you do need UPS-like features or built-in mobile network access), since you quite literally have better options brand new for that money, but if you have one around it can make sense, even if it’s a bit more work getting it going and it’s not fully under your control (unless we’re talking about something you can jailbreak and install Oxygen or Lineage on, so a Pixel would probably be a better choice).
That said, there is a certain technical elegance in the whole notion of repurposing an Android Phone to be a home server.
I think you’re seriously overestimating the technical prowess of the average law enforcement officer…
Except the price, which is much lower for the SBC, and way lower still if one uses one of the lower-end Orange Pi or Banana Pi SBCs.
Also, you can put Linux on the SBCs (which always come unlocked) and hence do way more with them as servers than if you have to use Android as the OS.
I mean, I can get it if people with the technical chops, a love for technical challenges and an old, pretty much worthless Android phone configure it as a server if only because “why not?!”, but it’s not exactly a great option considering that a 40-buck SBC can do the same, only better, more easily and with far more possibilities (given that it will be running Linux rather than Android).
PS: Actually, somebody below mentioned mobile network connectivity, which, thinking about it, would be a good reason to use an old Android phone as a server, since it has built-in support for 3G (unless it’s quite old) whilst the SBC needs it added to it, which might be a problem for the cheaper SBCs (just wondering how I would go about doing it - I think you need to connect a USB dongle to it and it has to be something compatible with Armbian Linux).
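From what I can tell, the usual route is a USB 3G/4G dongle driven by ModemManager, roughly along these lines - just a sketch, and the modem index and APN below are placeholders for whatever your hardware and carrier actually use:

```
# check the dongle even shows up on the USB bus
lsusb

# install ModemManager (Armbian is Debian/Ubuntu based)
sudo apt install modemmanager

# list the modems ModemManager detected, then enable and connect the first one
mmcli -L
sudo mmcli -m 0 --enable
sudo mmcli -m 0 --simple-connect="apn=your.carrier.apn"
```

In practice it might be simpler to just let NetworkManager manage the connection, but those are the moving parts.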
I’m truly, totally, completely shocked … that Windows is still being used on the server side.
Same here, and for me too it was gaming holding me back, though I mostly buy my games via GoG and hence use Lutris, and I’ve had a pretty low rate of games that won’t work at all (and, curiously, one of them which won’t work in Steam works fine if I use a pirated version with Lutris), though maybe 1/3 require some tweaking to work properly.
It’s also interesting that by gaming on Linux with Lutris I can make it safer and protect my privacy, because Lutris lets me do things like run the game inside a firejail sandbox, which I have set up as the default for all games, including disabling network access for the game.
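For anyone curious, it boils down to very little: Lutris has a command prefix field in its system options where you can put a wrapper command, and the end result is that the game gets launched wrapped in something like this (the paths and the wine call are just placeholders for whatever your game actually uses):

```
# run the game inside a firejail sandbox with networking disabled
firejail --net=none wine "/path/to/game.exe"

# optionally also hide the real home directory by giving the game a private one
# (the directory has to exist beforehand)
firejail --net=none --private=~/Games/sandbox-home wine "/path/to/game.exe"
```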
Still have the Windows partition around just in case, though the only time I booted it in the last several months was to clean up some of the stuff to free one of the disks to make it a dedicated Linux disk.
It’s not what makes them money, so they don’t really have the business incentive to maximize hardware sales that leads to relentlessly pushing out new versions of hardware that are barely better than the last one, plus all manner of tricks for the early obsolescence of older devices (things like deliberate OS and app under-performance, and even incompatibility with older versions of the hardware).
Also, in the big picture of gaming the Steam Deck is tiny and in its early stages, so business-wise it’s not the time to go down a strategy of relentless new hardware versions and enshittification - quite the opposite.
Absolutely, they’re doing the right thing, and as the right thing aligns with their business objectives it’s a bit of wishful thinking to claim it’s because they care so much about their customers as people.
Dried chillies are pretty easy to find in some parts of Europe, just not in others.
I can tell you for sure they’re a thing in Portugal.
Making a mistake once in a while on something one does all the time is to be expected - even somebody with a 0.1% rate of mistakes will fuck up once in a while if they do something with high enough frequency (do something a thousand times at a 0.1% mistake rate and the odds of at least one slip are already around 63%), especially if they’re too time-constrained to validate.
Making a mistake on something you do just once, such as setting up the process for pushing virus definition files to millions of computers in such a way that they’re not checked in-house before they go into production, is a 100% rate of mistakes.
A 0.1% rate of mistakes is generally not incompetence (it depends on how simple the process is and how much you’re paying for that person’s work), whilst a rate of 100% definitely is.
The point being that those designing processes, who have lots of time to do it, check it and cross-check it, and who generally only do it once per place they work (maybe twice), really have no excuse for failing the one thing they had to do with all the time in the world, whilst those who do the same thing again and again under strict time constraints definitely have a valid excuse for making a mistake once in a blue moon.
If your system depends on a human never making a mistake, your system is shit.
It’s not by chance that, for example, accountants have since forever had something they call reconciliation, where the transaction data entered from invoices and the like gets cross-checked against something produced differently, for example bank account transactions - their system is designed with the expectation that humans make mistakes, hence there’s a cross-check process to catch them.
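In software terms the same idea is trivial to sketch: two records of the same thing produced independently, and anything that only shows up in one of them gets flagged for a human to look at. A toy example with two hypothetical CSV files of transactions:

```
# sort both lists of transactions (comm needs sorted input)
sort invoices.csv > invoices.sorted
sort bank_statement.csv > bank.sorted

# print only the lines that appear in one file but not the other -
# those are the entries somebody needs to go and check by hand
comm -3 invoices.sorted bank.sorted
```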
Clearly Crowdstrike did not have a secondary part of the process designed to validate what’s produced by the primary (in software development that would usually be Integration Testing), so their process was shit.
Blaming the human who made a mistake for essentially being human and hence making mistakes, rather than blaming the process around them for not having been designed to catch human failure and stop it from having nasty consequences, is the kind of simplistic, ignorant “logic” that only somebody who has never worked on making anything that has to be reliable could have.
My bet, from decades of working in the industry, is that some higher-up in Crowdstrike didn’t want to pay for the manpower needed for a secondary process checking the primary one before pushing stuff out to production because “it’s never needed”, and then the one time it was needed it wasn’t there, things blew up massively, and here we are today.
Yeah, the tools are still there to figure out the low level shit, information on it has never been this easy to come by and bright people who are interested will still get there.
However, growing up during a time when you were forced to figure out the low-level details of tech merely to get stuff to work does mean that if you were into tech back then you definitely became a bit of a hacker (in the traditional sense of the word), whilst what people often consider being into tech now is mainly spending money on shiny toys where everything is already done for you.
Most people who consider themselves “into tech” don’t really understand it to any significant depth, because they never had to; only the few who actually want to understand it at that level, enough to invest the time into learning it, do.
I’m pretty sure the same effect happened in the early days vs later days of other tech, such as cars.
More generally: delegate anything critical to a 3rd party and you’ve just put your business at the mercy of the quality (or lack thereof) of their own business processes, which you do not control - especially dangerous in the current era of “as cheap as possible” hiring practices.
Having been in IT for almost 3 decades, a lesson I learned long ago, and which I’ve also been applying to my own things (such as having my own domain for my e-mail address rather than using something like Google), is that you should avoid as much as possible having your mission-critical or hard-to-replace stuff dependent on a 3rd party, especially if the dependency is live (i.e. actively connected rather than just buying and installing their software).
I’ve managed to avoid quite a lot of the recent enshittification exactly because I’ve been playing it safe in this domain for 2 decades.
Also, a lot of people were “on call” to handle any problems when the year changed, so the few problems that had passed unnoticed when doing the fixes and did pop up when the year changed got solved a lot faster than they normally would.
Having worked in making software for almost 3 decades, including in finance both before and after the 2008 crash, I can say that this blind reliance on algorithms for law enforcement and victim protection scares the hell out of me.
An algorithm is just an encoding of whatever the people who made it think will happen: it’s like using those actual people directly, only worse, because by necessity an algorithm has a fixed set of input parameters and can’t just ask more questions when something “smells fishy” the way a person would.
Also, making judgements by “entering something in a form” has a tendency to close people’s thinking - instead of pondering on it and using their intuition to, for example, notice from the way people are talking that they’re understating the gravity of the situation, people filling in a form tend to do it mindlessly, like a box-ticking exercise - and that’s not even going into the whole “as long as I just fill in the form my ass is covered” effect that comes with delegating responsibility to the algorithm, which leads people to play it safe and not dispute the results even when their instincts say otherwise.
For anybody who has experience with modelling, with using computer algorithms within human processes, and with how users actually treat such things (the “computer says” effect), this shit really is scary on many levels.
It certainly was CHUGGING ALONG nicely.
Rebrand GitHub as an MMO where people fight for code dominance.
The whole increasing concentration of wealth and fall in median quality of life can be traced back to basically each individual of the Owner Class thinking that somebody else will keep the system going by employing people and paying them well enough so that they keep on buying stuff.
The whole thing is pretty much a Tragedy of the Commons as defined in game theory, only instead of a shared grazing commons that would be fine if just one person had a few more sheep than they should (but gets overgrazed, and then everybody loses, if more people have a few more sheep than they should), we have the economic system.
Historically, one of the big reasons for the invariable appearance of some kind of social construct above the individual with the ability to make decisions for the group and force individuals to comply (from the “council of elders” all the way to modern democracy) is exactly to stop people, driven by pure selfishness, from “overgrazing” the various “commons” we have and ending up destroying the whole thing for everybody - if you have one or two doing it the “commons” can handle it, but too many and you get a tragedy.
And here we are after 4 decades of Neoliberalism, whose entire purpose was to reduce the power of the entities that make decisions for the good of the group, oversee the commons and stop individuals from overexploiting it, so it’s not at all surprising that we’re seeing various common systems starting to collapse due to over-exploitation.
I’m pretty certain that whichever societies end up dominant next will not be those which embraced Neoliberalism the most, as those will be the ones with the most collapsed systems, and that stuff takes a lot of time to recover. Plus, the very people who overexploited those systems to the point of collapse will do all they can to avoid having to stop what they’ve been doing, which gave them so much personal upside, and they’ve basically bought politics in the West, so there is no actual will among the power elites to do it (there’s a will to get the upsides of a well-functioning society but no will to make the needed concessions themselves, only for somebody else to make them, which is exactly the mindset that, when not stamped out by some kind of oversight entity, causes the problem in the first place).
Mandatory random cavity searches.
It’s the only way to keep society safe!
Recently I’ve been playing Airline Tycoon Deluxe, Sims 3, Battle Brothers, Kerbal Space Program and Prey.
I think the newest is Prey, from 2017.
Airline Tycoon Deluxe is from 1998 and still fun (at the beginning at least - eventually you just make tons of money, use it to do more of the same to make even more money, and it stops being fun). It helps that it’s a 2D game and the fun is in the management mechanics rather than in anything visual.
By the way, they all run on Linux, though I had to literally pirate the Sims 3 to get it to work even though I own the game.