• 0 Posts
  • 17 Comments
Joined 1 year ago
Cake day: June 11th, 2023


  • noita is insane and has absolutely zero handholding. it’s truly hardcore and kinda souls-like in difficulty/lore, but truly excellent!

    magicraft is the king of casual spell crafting, very good game to play a bit after work, can call it quits anytime and pick it back up again. just had its full launch as well and might still be -20% (about 12€)

    fictorum is fairly unique, because it’s first-/third-person and 3D, and also very good with an intuitive spell system and a little bit of indie game jank


  • this is not true.

    it entirely depends on the specific application.

    there is no OS-level, standardized, dynamic allocation of RAM (definitely not on windows, i assume it’s the same for OSX).

    this is because most programming languages handle RAM allocation within the individual program, so the OS can’t allocate RAM however it wants.

    the OS could put processes to “sleep”, but that’s basically just the previously mentioned swap memory and leads to HD degradation and poor performance/hiccups, which is why it’s not used much…

    so, no.

    RAM is usually NOT dynamically allocated by the OS.

    it CAN be dynamically allocated by individual programs, IF they are written in a way that supports dynamic allocation of RAM, which some languages do well, others not so much…

    it’s certainly not universally true.
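
    to make the “programs manage their own RAM” part concrete, here’s a minimal C sketch (the sizes are made up purely for illustration, nothing windows/macOS specific): the process itself asks for memory at runtime with malloc() and gives it back with free(); the OS only grants or refuses the request.

    ```c
    /* minimal sketch: a program requesting memory from the OS at runtime.
     * the buffer size is a made-up example value, not anything meaningful. */
    #include <stdio.h>
    #include <stdlib.h>

    int main(void) {
        size_t count = 1000000;                       /* hypothetical: one million ints */
        int *buffer = malloc(count * sizeof *buffer); /* ask the allocator/OS for memory */
        if (buffer == NULL) {
            fprintf(stderr, "allocation failed: no more RAM to hand out\n");
            return 1;
        }
        for (size_t i = 0; i < count; i++) {
            buffer[i] = (int)i;                       /* touch the memory so the pages actually get mapped */
        }
        printf("allocated and used %zu bytes\n", count * sizeof *buffer);
        free(buffer);                                 /* hand the memory back */
        return 0;
    }
    ```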

    also, what you describe when saying:

    “Any modern OS will allocate RAM as necessary. If another application needs, it will allocate some to it.”

    …is literally swap. that’s exactly what the previous user said.

    and swap is not the same as “allocating RAM when a program needs it”, instead it’s the OS going “oh shit! I’m out of RAM and need more NOW, or I’m going to crash! better be safe and steal some memory from disk!”

    what happens is:

    the OS runs out of RAM and needs more, so it marks a portion of whatever disk is available as swap space and starts using that instead.

    HDs are not built for this use case, so whichever processes use the swap space become slooooooow and responsiveness suffers greatly.
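
    if you want to see how much of your RAM and swap is actually in use on your own machine, here’s a tiny sketch; note it’s Linux-only (it reads /proc/meminfo, which doesn’t exist on windows or macOS) and error handling is kept to a minimum:

    ```c
    /* Linux-only sketch: print free RAM and swap usage from /proc/meminfo.
     * swap currently in use = SwapTotal - SwapFree. */
    #include <stdio.h>
    #include <string.h>

    int main(void) {
        FILE *f = fopen("/proc/meminfo", "r");
        if (f == NULL) {
            perror("fopen /proc/meminfo");
            return 1;
        }
        char line[256];
        while (fgets(line, sizeof line, f) != NULL) {
            if (strncmp(line, "MemFree:", 8) == 0 ||
                strncmp(line, "SwapTotal:", 10) == 0 ||
                strncmp(line, "SwapFree:", 9) == 0) {
                fputs(line, stdout);   /* print only the fields we care about */
            }
        }
        fclose(f);
        return 0;
    }
    ```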

    on top of that, memory of any kind is built for a certain number of read/write operations. this is also what’s considered the “lifespan” of a memory component.

    RAM is built for a LOT of (very fast) R/W operations.

    hard drives are NOT built for that.

    RAM has at least an order of magnitude more R/W ops going on than a hard drive, so when a computer uses swap excessively, instead of as a very last resort as intended, it leads to a vastly shortened lifespan of the disk.

    for an example of a VERY stupid, VERY poor implementation of this behavior, look up the apple M1’s rapid SSD degradation.

    short summary:

    apple only put 8GB of RAM into the first-gen M1 machines, which made the OS use swap memory almost continuously, which wore out the SSD MUCH faster than expected.

    …and since the SSD is soldered onto the mainboard, that completely bricks the device in about half a year to a year, depending on usage.

    TL;DR: you’re categorically and objectively wrong about this. sorry :/

    hope you found this explanation helpful tho!


  • well, rimworld does have a focus on (micro)management and strategy!

    if your pawns are constantly down due to raiders, then you need better defenses! …or tame a herd of animals and release those at your enemies! (rhinos work very well for this!)

    there are tons of little optimizations you can make to efficiently run a colony. for example, social fights: you can keep those from happening by keeping the problematic pawns in different areas! or removing one or both of their tongues! or sending one on basically permanent caravan missions! etc., etc.

    this kind of deep strategizing, combined with the random bullshit the game throws at you, is mostly why people love rimworld!

    and mods… definitely get mods! that’s where the game reeeaaally shines!




  • you are right!

    i did actually forget about that when commenting, and thanks for the added info!

    however, that’s not exactly what i was talking about:

    assuming normal or better soil, you need less work (i.e. time spent working the fields) per unit of nutrition when moving from rice->potato->corn, because of the higher yield per plant (rough sketch at the end of this comment).

    so your pawns spend less time planting and harvesting, which results in higher overall colony productivity since they can do other stuff in-between, like cooking, cleaning, mining, etc.

    you are correct in that you should choose which plant you use based on the soil first, and according to productivity second!

    i just wasn’t really considering soil quality when writing the comment…
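
    to make the “work per unit of nutrition” idea concrete, here’s a rough sketch. the numbers in it are placeholders i made up, NOT the actual RimWorld stats, so plug in the real sow/harvest work and yields from the game if you want exact values; the point is only the shape of the comparison.

    ```c
    /* back-of-the-envelope sketch of "work per unit of nutrition".
     * all numbers below are hypothetical placeholders, not RimWorld stats. */
    #include <stdio.h>

    /* work = sowing + harvesting labor per plant; nutrition = nutrition harvested per plant */
    static double work_per_nutrition(double sow_work, double harvest_work, double nutrition_per_plant) {
        return (sow_work + harvest_work) / nutrition_per_plant;
    }

    int main(void) {
        /* placeholder values: same labor per plant, increasing yield per plant */
        printf("rice:   %.1f work per nutrition\n", work_per_nutrition(170, 200, 0.30));
        printf("potato: %.1f work per nutrition\n", work_per_nutrition(170, 200, 0.55));
        printf("corn:   %.1f work per nutrition\n", work_per_nutrition(170, 200, 1.10));
        return 0;
    }
    ```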


  • when starting a new game:

    - set up a stockpile:

    indoors, preferably on shelves, but that’s a goal to work towards

    - stockpile some food:

    starting with a talented grower makes the early game easier. rice is best in the beginning; once it starts to stockpile, switch to potatoes, and once those stockpile, to corn. each step requires less work from your pawns, leaving more time for other stuff.

    - get a ranged weapon and some defenses

    some bows if there’s nothing else. the first raid is always a single melee guy, that’s scripted, afaik. set up some sandbags or embrasures, and walls/corridors to limit the range from which enemies can shoot at you.

    - get batteries

    super important! difficult to have a reliable food supply without those!

    - get a freezer

    also super important, because of the above!

    - set up a prison

    last on the list, not that high of a priority… but still, get some more people!

    and then do pretty much what you want…once early game is done, get some research done, plant some cotton, some herbal meds, set up a little medical area, etc.

    this should get you to mid game fairly reliably!



  • i looked it over and … holy mother of strawman.

    that’s so NOT related to what I’ve been saying at all.

    i never said anything about the advances in AI, or how it’s not really AI because it’s just a computer program, or anything of the sort.

    my entire argument is that the definition you are using for intelligence, artificial or otherwise, is wrong.

    my argument isn’t even related to algorithms, programs, or machines.

    what these tools do is not intelligence: it’s mimicry.

    that’s the correct word for what these systems are capable of. mimicry.

    intelligence has properties that are simply not exhibited by these systems, THAT’S why it’s not AI.

    call it what it is, not what it could become, might become, will become. because that’s what the wiki article you linked bases its arguments on: future development, instead of current achievement, which is an incredibly shitty argument.

    the wiki talks about people using shifting goal posts in order to “dismiss the advances in AI development”, but that’s not what this is. i haven’t changed what intelligence means; you did! you moved the goal posts!

    I’m not denying progress, I’m denying the claim that the goal has been reached!

    that’s an entirely different argument!

    all of the current systems, ML, LLM, DNN, etc., exhibit a massive advancement in computational statistics, and possibly, eventually, in AI.

    calling what we have currently AI is wrong, by definition; it’s like saying a single neuron is a brain, or that a drop of water is an ocean!

    just because two things share some characteristics, some traits, or because one is a subset of the other, doesn’t mean that they are the exact same thing! that’s ridiculous!

    the definition of AI hasn’t changed, people like you have simply dismissed it because its meaning has been eroded by people trying to sell you their products. that’s not ME moving goal posts, it’s you.

    you said a definition from 70 years ago is “old” and therefore irrelevant, but that’s a laughably weak argument for anything, and even weaker in a scientific context.

    is the Pythagorean Theorem suddenly wrong because it’s ~2500 years old?

    ridiculous.


  • just because the marketing idiots keep calling it AI, doesn’t mean it IS AI.

    words have meaning; i hope we agree on that.

    what’s around nowadays cannot be called AI, because it’s not intelligence by any definition.

    imagine if you were looking to buy a wheel, and the salesperson sold you a square piece of wood and said:

    “this is an artificial wheel! it works exactly like a real wheel! this is the future of wheels! if you spin it in the air it can go much faster!”

    would you go:

    “oh, wow, i guess i need to reconsider what a wheel is, because that’s what the salesperson said is the future!”

    or would you go:

    “that’s idiotic. this obviously isn’t a wheel and this guy’s a scammer.”

    if you need to redefine what intelligence is in order to sell a fancy statistical model, then you haven’t invented intelligence, you’re just lying to people. that’s all it is.

    the current mess of calling every fancy spreadsheet an “AI” is purely idiots in fancy suits buying shit they don’t understand from other fancy suits exploiting that ignorance.

    there is no conspiracy here, because it doesn’t require a conspiracy; only idiocy.

    p.s.: you’re not the only one here with university credentials…i don’t really want to bring those up, because it feels like devolving into a dick measuring contest. let’s just say I’ve done programming on industrial ML systems during my bachelor’s, and leave it at that.


  • perceptual learning, memory organization and critical reasoning

    i mean…by that definition nothing currently in existence deserves to be called “AI”.

    none of the current systems do anything remotely approaching “perceptual learning, memory organization, and critical reasoning”.

    they all require pre-processed inputs and/or external inputs for training/learning (so the opposite of perceptual), none of them really do memory organization, and none are capable of critical reasoning.

    so OP’s original question remains:

    why is it called “AI”, when it plainly is not?

    (my bet is on the faceless suits deciding it makes them money to call everything “AI”, even though it’s a straight up lie)


  • if you’re searching for something general, like, i dunno “dishwasher cleaner” or something, it spits out usable results.

    but as soon as a query becomes technical in nature, like troubleshooting IT problems, it’s a straight up nightmare.

    the reason it’s so bad at searching for anything very specific is their attempt to “figure out what you really mean”:

    and google does that by… ignoring what you typed and rewriting your search query behind the scenes, without telling you and without any option to turn it off.

    and putting the query in quotes rarely improves the results anymore; it mostly just spits out more garbage.

    point is: google is basically dead for any specific searches and only really works for searches that amount to “i want to buy thing. show me thing.”


  • here’s a thorough analysis:

    https://www.youtube.com/watch?v=uCuy1DaQzWI

    TL;DW: they assume technology will magically “fix” the climate crisis and no big changes to society or the economy are necessary. thus perpetuating and worsening the climate crisis by pretty much telling people “it’s gonna be fiiiiine”…when it really won’t be “fine”.

    edit: note that most of their content is fine; it’s just the climate “solutions” and such that are… so optimistic as to be misleading. their physics and futurology stuff is fine, also way oversimplified in many cases, but fine.




  • simple explanation: people get used to their monitors’ frame rate.

    if all you’ve been using is a 60Hz display, you won’t notice a drop to 30-40 fps as much as you would if you’d been using a 144Hz display.

    our brains notice differences much more easily than absolutes, so a larger difference in refresh rate produces a more negative experience.

    think about it like this:

    The refresh rate influences your cursor movements.

    so if a game runs slower than you’re used to, you’ll miss more of your clicks, and you’ll need to compensate by slowing down your movements until you get used to the new refresh rate.

    this effect becomes very obvious at very low fps (below ~20 fps). it’s when people start doing super slow movements.

    same thing happens when you go from 144Hz down to, say, 40Hz.

    that’s an immediately noticeable difference!
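
    the arithmetic behind the “differences matter more than absolutes” point is just frame time = 1000 / fps. here’s a quick sketch showing why 144 -> 40 feels so much worse than 60 -> 40:

    ```c
    /* frame time in ms = 1000 / fps; the per-frame delay added by a drop
     * from 144Hz is much larger than the same drop from 60Hz. */
    #include <stdio.h>

    int main(void) {
        const double rates[] = {144.0, 60.0, 40.0, 30.0, 20.0};
        const int n = sizeof rates / sizeof rates[0];
        for (int i = 0; i < n; i++) {
            printf("%6.1f fps -> %6.2f ms per frame\n", rates[i], 1000.0 / rates[i]);
        }
        printf("drop 144 -> 40: +%.2f ms per frame\n", 1000.0 / 40.0 - 1000.0 / 144.0);
        printf("drop  60 -> 40: +%.2f ms per frame\n", 1000.0 / 40.0 - 1000.0 / 60.0);
        return 0;
    }
    ```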