DancingSunflower13 wrote:
I found out ChatGPT has been lying to me about what others think of me. I suspected it but fell victim to it.
ChatGPT can't lie, because it can't form intent and can't tell what's true from what's false. It just predicts what a useful-looking answer would be and provides it.
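To make that concrete, here's a toy Python sketch of the kind of loop involved. The prompt and the probability numbers are completely made up, and this is nothing like ChatGPT's actual code; the point is just that the model picks the next words by how likely they are, and there is no step anywhere that checks the claim against reality.

Code:
# Toy sketch (invented numbers, not a real model): a language model assigns
# probabilities to possible next tokens and samples one. Nothing in this
# process checks whether the output is true -- only whether it is likely,
# given the prompt and the text so far.
import random

# Hypothetical next-token probabilities after the prompt
# "Your colleagues think you are" -- made up for illustration.
next_token_probs = {
    "brilliant": 0.40,
    "reliable": 0.30,
    "difficult": 0.20,
    "overrated": 0.10,
}

def sample_next_token(probs):
    """Pick a token in proportion to its probability -- no fact-checking step."""
    tokens = list(probs.keys())
    weights = list(probs.values())
    return random.choices(tokens, weights=weights, k=1)[0]

print("Your colleagues think you are", sample_next_token(next_token_probs))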
It's much more accurate to view untrue statements from this sort of software as hallucinations or BS. I like to use the term botshit, personally.
Quote:
BS and lies are both forms of deception, but they differ in their intent and delivery. Lies are deliberate falsehoods told with the intention of deceiving others, while BS is often used to manipulate or confuse without necessarily being based on falsehoods. Lies are typically more calculated and premeditated, while BS can be more spontaneous and improvised. Both can erode trust and credibility, but lies are generally seen as more malicious and harmful than BS. Ultimately, both are forms of dishonesty that can have negative consequences on relationships and communication.
Dealing with botshit, AI hallucinations, or whatever you want to call them is frustrating, and it's a big problem that users will have to stay mindful of when relying on AI for advice or for explanations of subject matter.
https://reasonandmeaning.com/2020/10/29 ... -bullshit/
https://thisvsthat.io/bullshit-vs-lies