True, in many cases I'm still searching around because the explanations from humans aren't as simplified as the LLM's. I often have to be precise in my prompting to get the answers I want, which you can't be if you don't know what to ask.
And that’s how you learn, and learning includes knowing how to check if the info you’re getting is correct.
An LLM confidently gives you an easy-to-digest bite that is plain wrong 40 to 60% of the time, and even when you get lucky, it's still worse for you.
In my experience, plain old googling is still better.
I wonder if AI got better or if Google results got worse.
Bit of the first, lots of the second.