Personally, I've seen this behavior a few times in real life, often with worrying implications. Generously, I'd like to believe these people use extruded text as a place to start thinking from, but in practice it seems to me that they tend to use it as a thought-terminating behavior.

IRL, I find it kind of insulting, especially if I'm talking to people who should know better or if they hand me extruded stuff instead of the work they were supposed to do.

Online it’s just sort of harmless reply-guy stuff usually.

Many people simply straight-up believe LLMs to be genie-like figures, as they are advertised and written about in the "tech" rags. That bums me out in sort of the same way really uncritical religiosity bums me out.

HBU?

  • Strider@lemmy.world
    A friend of mine who works in tech and knows very well what 'AI' is, is a big fan. He runs his own bots and such for personal use and thinks he has the situation under control.

    Meanwhile, he relies more and more on the 'benefits'.

    My fear is that he won't be aware of how the LLM-interpreted output might change him; it's kind of a deal-with-the-devil situation.

    I hope I am wrong.