Personally seen this behavior a few times in real life, often with worrying implications. Generously I’d like to believe these people use extruded text as a place to start thinking from, but in practice it seems to me that they tend to use extruded text as a thought-terminating behavior.
IRL, I find it kind of insulting, especially if I’m talking to people who should know better or if they hand me extruded stuff instead of work they were supposed to do.
Online it’s just sort of harmless reply-guy stuff usually.
Many people simply straight-up believe LLMs to be genie-like figures, as they are advertised and written about in the “tech” rags. That bums me out, sort of in the same way really uncritical religiosity bums me out.
HBU?
The worst thing is when you see the AI summary repeated word for word on content farm sites that appear in the result list. You know that just reinforces the AI summary’s validity for some users.
This propagates fake or wrong solutions to common tech problems too, which is obnoxious.