What a shower of twats. Don’t block the request in that case, just redirect it to your local server that returns a 1x1 transparent png for all requests.
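If you fancy rolling that yourself, a minimal Python sketch - the address and port here are my choice, point your blocker’s redirect (or an /etc/hosts entry plus a port redirect) at it:

```python
# Serve a freshly-built 1x1 transparent PNG for every request.
import struct
import zlib
from http.server import BaseHTTPRequestHandler, HTTPServer

def chunk(ctype: bytes, data: bytes) -> bytes:
    # A PNG chunk is: big-endian length, type, data, CRC32 of type+data.
    return (struct.pack(">I", len(data)) + ctype + data
            + struct.pack(">I", zlib.crc32(ctype + data)))

ihdr = struct.pack(">IIBBBBB", 1, 1, 8, 6, 0, 0, 0)  # 1x1, 8-bit RGBA
idat = zlib.compress(b"\x00" + b"\x00\x00\x00\x00")  # filter byte + one clear pixel
PIXEL = (b"\x89PNG\r\n\x1a\n" + chunk(b"IHDR", ihdr)
         + chunk(b"IDAT", idat) + chunk(b"IEND", b""))

class PixelHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Same answer whatever the path: one transparent pixel.
        self.send_response(200)
        self.send_header("Content-Type", "image/png")
        self.send_header("Content-Length", str(len(PIXEL)))
        self.end_headers()
        self.wfile.write(PIXEL)

    def log_message(self, *args):
        pass  # the whole point is not caring what was asked for

# 8080 to avoid needing root; redirect port 80 to it or run privileged.
HTTPServer(("127.0.0.1", 8080), PixelHandler).serve_forever()
```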
Invested in a water cooler setup back when I had a Bulldozer chip, which was near essential. Now on a Ryzen, and getting it to exceed about 35 degrees is very difficult. Been very good for long-term stability of my desktop - all the niggling hard disk issues seem to just go away when they’re not subjected to such thermal cycling any more.
Fantastic chips.
Original release was fantastic - super-smooth, high resolution, “how I remember it on the N64”, ie. not true to the original at all, but like the impression it made on me as a youngster. Super-dark storyline and minimal hand-holding make it very unusual for a Zelda game, absolutely stunning bit of porting work. Everything you could possibly want in 2024.
And then I got to the down-the-well bit, and that can just fuck off. New release needs a dedicated keybinding to skip that time-wasting shit, could bind it to a mouse button just so that there’s no difficulty finding it when the time comes.
IVEBEENUSINGTHISKEYBORDFORWHOLEMONTHNDMMOREEFFICIENTTHNIVEEVERBEENBEFORE
No ‘a’, so it’s perfect for ordering some piss.
Clicking the ‘Activate’ link prompts you to enter your shoe size and postal address, so that you may receive a shark plush toy and your own pair of The Socks.
Impressive, since “network effects” are what keeps people on a platform. Why move off Xitter or FB when everyone’s on there, and not on the new place? Keep moving a significant fraction of a million people every week, and pretty soon, it’ll be where everyone is.
My partner, who is very non-technical, signed up for a BlueSky account as well this week: “all the teacher blogs have declared that they are moving over”. Looks like everyone has had enough.
Most of the laptops I’ve had open lately have had about the top third be the motherboard and the bottom two-thirds be battery, with maybe some ports and speakers tucked down the side. So I’d expect that list of replacements to include the battery, too.
I might check whether the hard drive survived - a decent M.2 is small, expensive and reusable - and maybe the RAM if it’s not soldered in.
Writing this on a Tuxedo Pulse 14 gen 3 - great laptop, flawless Linux support, and a very capable coding workstation. Perfect for a bit of eg. Disco Elysium or Crusader Kings 3 on the go, but it’s no gaming machine; it has a lot of pixels for a Radeon 780M to push. They do have a list of gaming laptops, though, if you wanted a speciality machine?
Cheaper for now, since venture capitalist cash is paying to keep those extremely expensive servers running. The AI experiments at my work (automatically generating documentation) have got about an 80% reject rate - sometimes they’re not right, sometimes they’re not even wrong - and once you count the time spent reviewing it all, it’s not really an improvement over just doing the work yourself.
No doubt there are places where AI makes sense; a lot of those places seem to be in enhancing the output of someone who is already very skilled. So let’s see how “cheaper” works out.
PS3 most certainly had a separate GPU - it was based on the GeForce 7800 GTX. Console GPUs tend to be a little faster than their desktop equivalents, as they share the same memory. Rather than the CPU having to send eg. model updates across a bus to update what the GPU is going to draw in the next frame, it can change the values directly in the GPU memory. And of course, the CPU can read the GPU framebuffer and make tweaks to it - that’s incredibly slow on desktop PCs, but console games can do things like tone mapping whenever they like, and it’s been a big problem for the RPCS3 developers to make that kind of thing run quickly.
The Cell cores are a bit more like the ‘tensor’ cores that you’d get on an AI GPU than a full-blown CPU core. They can’t speak to RAM directly, just exchange data between themselves - the main CPU has to copy data in and out of them, and schedule any jobs that must run on them; they can’t do either themselves. They’re also a lot more limited in what they can do than a main CPU core, but they are very, very fast at what they can do.
If you are doing the kind of calculations where you’ve a small amount of data that needs a lot of repetitive maths done on it, they’re ideal. Bitcoin mining or crypto breaking for instance - set them up, let them go, check in on them occasionally. The main CPU acts as an orchestrator, keeping all the cell cores filled up with work to do and processing the end results. But if that’s not what you’re trying to do, then they’re borderline useless, and that’s a problem for the PS3, because most of its processing power is tied up in those cores.
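The shape of it, as a rough Python sketch - threads standing in for the six SPEs a PS3 game actually got to use (the real thing would be C against IBM’s SDK; this is just the traffic pattern):

```python
# Orchestrator pattern: the workers can't fetch their own jobs, so the
# main core keeps their queue topped up and collects the results.
from queue import Queue
from threading import Thread

def spe_worker(inbox: Queue, outbox: Queue) -> None:
    # Stand-in for an SPE: a small chunk of data, lots of repetitive maths.
    while (job := inbox.get()) is not None:
        outbox.put(sum(x * x for x in job))  # the "heavy" kernel

inbox, outbox = Queue(maxsize=8), Queue()
workers = [Thread(target=spe_worker, args=(inbox, outbox)) for _ in range(6)]
for w in workers:
    w.start()

# The main core's job: copy data in, schedule work, gather the results.
jobs = [list(range(n, n + 1000)) for n in range(100)]
for job in jobs:
    inbox.put(job)      # keep the cores fed
for _ in workers:
    inbox.put(None)     # tell each worker to stop
results = [outbox.get() for _ in jobs]
print(len(results), "results collected")
```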
Some games have a somewhat predictable workload where offloading makes sense. Got some particle effects - say, some smoke that needs complicated fluid-and-gravity simulations run before the end result is copied to the GPU? Maybe your main villain has a very dramatic cape that they like to twirl, and you need to run that cloth simulation separately from everything else you’re doing? Problem is, working out what you can and can’t offload is a massive pain in the ass; it takes a lot of developer time to optimise, when really you’d want the design team implementing that kind of thing; and slightly newer GPUs are a lot more programmable, able to do the simpler versions of those calculations both faster and far more in parallel.
The Cell processor turned out to be an evolutionary dead end. The resources needed to work on it (expensive developer time) just didn’t make sense for a gaming machine. The things it was better at are things it still wasn’t quite good enough at - modern GPUs are Bitcoin monsters, far exceeding what the Cell can do, and if you’re really serious about crypto breaking then you probably have your own ASICs. Lots of identical, fast CPU cores are what developers want to work with - they’re much easier to reason about.
Yes, because it doesn’t do as much to protect you from data corruption.
If you have a use case where a barely-measurable increase in speed is essential, but not so essential that you wouldn’t just pay for more RAM to keep it in cache, and also it doesn’t matter if you get the wrong answer because you’ve not noticed the disk is failing, and you can afford to lose everything in the case of a power cut, then sure, use a legacy filesystem. Otherwise, use a modern one.
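The trick the modern ones pull is simple enough to sketch. A toy Python model of checksum-on-read - not how any real filesystem lays things out, just the idea:

```python
# Toy model: store a checksum next to each block and verify it on
# every read, so bit rot raises an error instead of returning garbage.
import hashlib

def write_block(path: str, data: bytes) -> None:
    with open(path, "wb") as f:
        f.write(hashlib.sha256(data).digest() + data)

def read_block(path: str) -> bytes:
    with open(path, "rb") as f:
        digest, data = f.read(32), f.read()
    if hashlib.sha256(data).digest() != digest:
        raise IOError(f"{path}: checksum mismatch - the disk is lying to you")
    return data
```

ZFS and Btrfs do essentially this per block, with the checksums held in the metadata tree - which is how they can tell you the disk returned rubbish instead of silently handing it to your application.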
I think when Disney demands an internally-hosted version of your product, the sales team tells engineering that they’ll provide one, and marks the price up accordingly. That kind of thing doesn’t appear on the external listing for everyone else.
Man alive, I thought that Mozilla had been doing their own Personal Package Archives so that we didn’t have to deal with Ubuntu packaging it as a Snap anymore, which makes this doubly disappointing.
Needs an endless repeating loop in there, plus one slang word spelled out in ridiculous furneticccc fashion, otherwise it’s just not Joyce. AIs just have no appreciation of great art.
time for brekkie again, bit of a walk, wanked off on the beach, got burrrrrluckesaaaid with a bunch of prozzies while me wife cucked me and back home in
The kernel option is mitigations=off, if you want to try adding it to your Grub command line. From the testing I’ve done, it provides no benefits whatsoever - no more frames in games, compilation runs no quicker, battery life on a laptop is no better.
https://wiki.archlinux.org/title/Improving_performance#Turn_off_CPU_exploit_mitigations
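If you want to check what your kernel is actually mitigating (before and after), it publishes a per-vulnerability summary in sysfs - a quick Python sketch, assuming a vaguely recent kernel:

```python
# Print the kernel's own report of each CPU vulnerability and the
# mitigation currently applied; with mitigations=off most of these
# should read "Vulnerable".
from pathlib import Path

VULNS = Path("/sys/devices/system/cpu/vulnerabilities")
for entry in sorted(VULNS.iterdir()):
    print(f"{entry.name:28} {entry.read_text().strip()}")
```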
If you made memory access lines twice as wide, they’d take up more space. More space means (a) chips run slower, because it takes time for the electricity to get there, and (b) they’d be bigger and more expensive.
The main problem with 32-bit, as others have noticed, is that that’s not really very much RAM. CPUs do addition and subtraction the way we were taught at school - ‘carry the one’ - and they’ve an overflow bit that’s set when your sum doesn’t fit in the columns. On 8-bit CPUs, we were always checking that bit when adding up large numbers. On 64-bit CPUs, we can deal with truly massive numbers anyway, and since they’re so fast at doing sums and usually waiting for memory, it’s barely a hassle.
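To make the carry business concrete, a quick bit of Python - its integers never overflow, so you have to mask the result to fake a fixed-width register:

```python
# 3 billion and 2 billion each fit in an unsigned 32-bit register,
# but their sum doesn't - the carry out of bit 31 gets lost.
a, b = 3_000_000_000, 2_000_000_000

print((a + b) & 0xFFFFFFFF)          # 705032704 - wrapped around
print((a + b) & 0xFFFFFFFFFFFFFFFF)  # 5000000000 - fits fine in 64 bits
```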
Moving to 128-bit would give us a truly minuscule, probably unmeasurable, benefit in exchange for significant downsides. We could make them, but it would be pointless.
Got this installed on all my work machines - if you’re wanting to stick a screenshot on Jira or Slack with a couple of arrows, wavy lines, or a bit blurred out then it’s dead quick and has just the functionality that you need. Yes, it’s simple and lacks a lot of ‘power tools’. Sometimes that’s just what you need, tho.
“emerges from a brand you’ve probably never heard of”
Writing this on a Tuxedo Pulse 14 / gen 3 as we speak. Great little laptop. I’d wanted something with a few more pixels than my previous machine, and there’s a massive jump from bog-standard 1080p to extremely expensive 4K screens. Three megapixel screen at a premium-but-not-insane price, compiles code like a champion, makes an extremely competent job of 3D gaming, came with Linux and runs it all perfectly.
“Tuxedo Linux”, which is their in-house distro, is Ubuntu + KDE Plasma. Seemed absolutely fine, although I replaced it with Arch btw since that’s more my style. Presumably they’re using Debian for the ARM support on this new one? This one runs pretty cold most of the time, but you definitely know that you’ve got a 54W processor in a very thin mobile device when you try eg. playing simulation games - it gets a bit warm on the knees. “Not x64” would be a deal-breaker for my work, but for most uses the added battery life would be more valuable than the inconvenience.
At first, the air of mystery suggested hidden depths, but increasing exposure revealed that it was all just woven from thin air and would routinely come out with increasingly absurd stories that contradicted previous statements.
Also, the X-Files.