ChatGPT has a style-over-substance trick that seems to dupe people into thinking it’s smart, researchers found: developers often prefer ChatGPT’s responses about code to those written by humans, despite the bot frequently being wrong.
It’s like crypto, or really any other con job.
It makes idiots feel smart.
Make a mark feel smart, and they’ll become attached to the idea and defend it to the death, because the alternative is admitting they aren’t really smart and fell for a scam.
When smart people try to explain that to the idiots, it just makes them defend the scam even harder.
Try to tell people ChatGPT isn’t great, and they just ramble on about nonsensical stuff they don’t even understand themselves, then claim anyone who disagrees just isn’t smart enough to get it.
It’s a great business plan if you have zero morals, which is why the method never really goes away, just moves to another product.
I find it an excellent tool to help me write. Staring at a blank page is one of the hardest hurdles to overcome. By asking ChatGPT questions, I start organizing my thoughts about what I want to write, and it gives me instant words on the page to start manipulating. I am a subject matter expert on these topics, so I screen what it gives me for correctness. It’s surprisingly good, though it has hallucinated some things. But on balance I find it very helpful.
I have seen someone type “tell me how make a million dollar business” into ChatGPT. Of course that’s not going to work. But LLMs have immediate, obvious value that crypto does not, and I think making the comparison reveals a lack of experience with the useful applications. I’m using ChatGPT nearly every day as a tool to help with coding. It’s not a replacement for a person, but it is like giving a person a forklift.