Google search failed to find even a Hollywood movie, even after an hour of attempts. I don’t really care about the movie, but I am terrified by the prospect that Google has now ceased to function on this basic level. Why is this happening?
I understand the explanations about SEO and other stuff like spam content. But why are there NO relevant results at all?
I wouldn’t mind having to start wading through results at page 2 or even 10, but now it utterly fails to find even the most basic things.
Things you could find on the first attempt just a year ago are now effectively hidden.
To me, the entire internet has functionally vanished. I cannot access anything that I am searching for. It might as well not exist at all.
Has anybody found a way around this?
Is this on purpose? Is this an attack on the free internet, herding people to just the top 5 sites like Facebook, YouTube, TikTok, and so forth?
Are there search engines that still work?
Funny enough, GPT is where I’m going for searches like this now. Whenever my search query doesn’t pull the answer up with one or two clicks, I head to GPT and it finds the info for me.
*makes up the info for you.
You can ask it for sources etc. now; it actually does the searching for you instead of making shit up.
By definition, everything it does is “making shit up”. Sometimes that shit is useful, sometimes not. Citations aren’t going to magically fix that, because it’s baked into how a generative AI based on an LLM works.
I always have it provide sources and I vet them, same as I do with Wikipedia. And it hasn’t been wrong yet about whether a movie has a post-credits scene, and now I don’t have to read through all those shitty-ass articles that bury the lede somewhere after a shit ‘review’ of the movie.
It’s a very solid tool when used correctly, and GPT-4 is head and shoulders above 3.5.
The same tool made up references to seemingly real legal cases that never existed.
K
You have a brain, right? If you ask it for low-water-pressure shaving tips, I think it would be pretty easy to tell if it’s suggesting nonsense.
The problem is that you’ll start trusting it based on a few examples where it was correct, and then you’ll be burned by a seemingly correct answer that is actually wrong. I tried testing it with simple science and engineering questions and it was garbage.
Interesting, I’ve had the total opposite experience. GPT-4 is reasonable more often than not. I don’t find the “it’s sometimes wrong” argument very compelling because the same is true for 99% of other information sources. I’ve always had to use critical thinking when looking for answers online anyway.