shaves the sphere down with a sculptor’s knife
There. 3.1416. Not perfectly round but it’ll bake in the oven just fine.
I remember reading that from 2021 to 2023, LLMs generated more text than all humans had published combined, so arguably, genuinely human-generated text is going to become a rarity
I’m sorry; AI was trained on the sole sum of human knowledge… if the perfect human being is by nature some variant of a psychopath, then perhaps the bias exists in the training data, and not the machine?
How can we create a perfect, moral human being out of the soup we currently have? I personally think it’s a miracle that sociopathy is the lowest of the neurological disorders our thinking machines have developed.
You were right
Will you sign my petition?
Convinced a long distance friend to change their major from Acupuncture to Computer Science before they ruined their life.
They’re doing better than I am, now.
That seems to require a level of foresight and planning that most corporations don’t have. That’s almost like a blueprint for failure when some middle manager changes the scope of a project with a hard coded time limit, IMO.
Anyone interested in not-agile development? Maybe we can call it “Ship it when it’s ready” lol
Ancient developer here / not really a coder, but what the hell is “Agile” software development?
Is it some kind of pseudonym for pushing buggy, untested code to a production server or something?
Like a speed run category for software development?
I’m an AI Developer.
TLDR: CUDA.
Getting ROCm to work properly is like herding cats.
You need a custom implementation for the specific operating system, the driver version must be locked and compatible, especially with a Workstation / WRX card, the Pro drivers are especially prone to breaking, you need the specific dependencies to be compiled for your variant of hipBLAS, or ZLUDA, and if that doesn’t work, you need ONNX transition graphs, but then you find out PyTorch doesn’t support ONNX unless it’s 1.2.0, which breaks another dependency of X-Transformers, which then breaks because that version of hipBLAS is incompatible with that older version of Python and …
Inhales
And THEN MAYBE it’ll work at 85% of the speed of CUDA. If it doesn’t crash first due to an arbitrary error such as CUDA_UNIMPLEMENTED_FUNCTION_HALF.
You get the picture. On Nvidia, it’s click, open, CUDA working? Yes? Done. You don’t spend 120 hours fucking around and recompiling for your specific use case.
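To give a flavor of the version-pinning dance described above, here’s a minimal sketch of the kind of compatibility table you end up maintaining by hand on ROCm. The version pairs below are illustrative placeholders, not an official ROCm/PyTorch support matrix:

```python
# Illustrative only: these version pairs are placeholders, NOT an
# official ROCm/PyTorch support matrix -- check AMD's docs for real pins.
ROCM_TORCH_COMPAT = {
    "5.7": {"torch": "2.2", "py_min": 8, "py_max": 11},  # Python 3.8-3.11
    "6.0": {"torch": "2.3", "py_min": 8, "py_max": 12},  # Python 3.8-3.12
}

def compatible(rocm: str, torch: str, py_minor: int) -> bool:
    """Check one (ROCm, PyTorch, Python 3.x) triple against the pin table."""
    entry = ROCM_TORCH_COMPAT.get(rocm)
    if entry is None:
        return False
    return entry["torch"] == torch and entry["py_min"] <= py_minor <= entry["py_max"]
```

On CUDA you rarely think about any of this; on ROCm, one mismatched entry in a table like this is the difference between a working build and a week of recompiling.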
Try using a 1-bit LLM to test the article’s claim.
The perplexity loss is staggering. It’s like 75% of the accuracy lost, or more: it effectively turns a 30-billion-parameter model into something that performs like a 7-billion-parameter model.
Highly recommended that you try to replicate their results.
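For a toy picture of why 1-bit quantization hurts, here’s a plain-Python sketch of sign binarization with a mean-absolute-value scale (the classic BinaryConnect-style convention, not necessarily what the article’s model uses):

```python
def binarize(weights):
    """Quantize each weight to +/-scale, where scale = mean(|w|).

    A classic binary-network scheme, shown only to illustrate how much
    per-weight information a 1-bit format throws away.
    """
    scale = sum(abs(w) for w in weights) / len(weights)
    return [scale if w >= 0 else -scale for w in weights]

w = [0.9, -0.05, 0.4, -0.7]
q = binarize(w)
# Every weight collapses to one of exactly two values; all magnitude
# detail (0.9 vs 0.05) is gone, keeping only the sign.
```

Each weight keeps 1 bit of information instead of 16 or 32, which is where the perplexity hit comes from.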
I remember some interview with Warren where they were talking about the idea for the game and it was like “What if it was every single conspiracy theory, but they were all true?”
Well it turns out that makes for a pretty compelling story but also far too many of those ended up coming true, lol
Sure, I’ll try OpenSUSE!
Tumbleweed is a bit of a spooky name for a distro, implying that a gentle breeze sends it tumbling, but y’know
Linux Mint, as someone suggested, I ran a long time ago in college on an ancient laptop, and it’s an extremely stable OS, similar to Windows 2000 Pro. I can’t remember it crashing or freezing even once on me, and the ThinkPad T42 has an anemic processor, which I ran with the Conservative governor.
I’m actually a little scared of running Linux on modern, fast hardware.
How is multi-GPU driver support?
My main machine is a 900-TFLOPS compute monster (4 GPUs) running ROCm on Windows, and the last time I tried Manjaro on my desktop, it seized up for unknown reasons.
I’ve got asynchronous monitors - 1440p@165Hz main display and 4K@85Hz flipped vertical for a side monitor. Occasionally, I plug in a projector which is 1080p, mirrored to the 4K, but flipped horizontal.
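For what it’s worth, a layout like that is expressible under X11 with xrandr; this is just a sketch of the commands, and the output names (DP-0, DP-1, HDMI-0) are placeholders — check `xrandr --query` for the real ones on your machine:

```shell
# Main 1440p@165Hz display, with the 4K@85Hz panel rotated vertical beside it.
# Output names are placeholders; substitute the ones `xrandr --query` reports.
xrandr --output DP-0 --mode 2560x1440 --rate 165 --primary \
       --output DP-1 --mode 3840x2160 --rate 85 --rotate left --right-of DP-0

# When the 1080p projector is attached: mirror it against the 4K output,
# flipped horizontally.
xrandr --output HDMI-0 --mode 1920x1080 --same-as DP-1 --reflect x
```

Under Wayland the equivalent lives in the compositor’s own display settings instead, which may be part of why Manjaro behaved differently from the Debian+KDE box.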
I’m not sure what I’d done wrong, because it works perfectly on my 11-year-old Z575 (Debian+KDE there).
What distro would you recommend for an extremely fast/high RAM machine? I’ve got 128GB of main system memory, and 4TB of M.2 for a system disk running at 7.6 gigabytes/second actual/real-world RW I/O.
I had something like this happen at a corp I once worked at. The CTO said they were going to outsource their entire datacenter and support staff to India.
I literally laughed in his face and, obviously, got fired (always have 6-8 months of salary as an emergency fund, ahem).
I won’t name the company but when half the Internet went down and a few major services? Yeah, it was that asshat driving and running between the datacenters realizing people in Bangladesh can’t do shit for you physically.
It’s like that graph: “Say we want to fuck around at a level 8, we follow this axis, and we’re going to find out at around a level 7 or 8”
I’m somewhat partial to the Telvanni mushroom kingdom (the idea of, hey, here’s an acorn, go GROW your house) but Balmora has always held a special place in my heart for being the first “big city” I’ve felt in a video game.
The transition to the Ashlands and seeing an entirely different biome / grasslands / plains was also pretty incredible.
Ald’ruhn’s Capitol was also novel in design with the redundant rope bridges built on the inside of the shell of a gigantic upturned horseshoe crab.
Vivec’s cool but it’s only possible because of a demi-god’s literal meddling around with the terrain, and it’s too easy to get lost.
Caldera’s also nice, as well as Pelagiad.
I know I just named like ten places but Morrowind’s got a lot of diversity and biomes.
I bought a Radeon 9800 Pro for my 13th birthday.
I tell you, people kept telling me that I was wasting my life in front of a computer – but I lived an entire fucking lifetime in Morrowind, to the age of 92.
I must have walked every single square meter of Vvardenfell, and this was before major walkthroughs existed.
What, is the game really that good?
Card games have never appealed to me personally
The answer was to disable FreeSync. FreeSync was causing the stuttering with SAM turned on.
squints
That says, “PHILIPS DVD+R”
So we’re looking at a 4.7GB model, or just a hair under the tiniest, most incredibly optimized implementation of <INSERT_MODEL_NAME_HERE>