DuckHairback wrote:
I think it's almost like giving Judas permission to betray him. Like Judas doesn't have free will. Jesus knows it's going to happen, it's almost like it already happened, so he may as well do it.
Some folks do seem to do things just because it's been predicted that they will. I've never understood why. I'm more likely to stop it happening just to see the look on the predictor's face when it doesn't come to pass.
Quote:
ToughDiamond wrote:
I just put your question to ChatGPT and it crashed. It's now refusing to talk to me at all. I think you've killed it.
This makes me so happy. Everyone should just ask ChatGPT Bible questions until it falls over.
Will my hamster go to heaven? If heaven is so much better than earth, is it a kindness to kill people who are righteous in the Lord's sight? If it was OK for Thomas to demand objective proof of the resurrection, why isn't it OK for me to do the same?
But I don't think it would flinch. I asked it a lot of hard questions yesterday and it answered them instantly. The worst it did was stop after listing 2 of an album's tracks, with a message that my question violated the terms and conditions. On further questioning it confirmed that I hadn't violated anything, then said "here is the full list of the tracks," listed the same 2 tracks again, and told me I'd violated the terms and conditions. After a bit of arguing and going round in circles, I gave up and found the answer on Discogs.
I think the website itself has a bug, or it's their very strange way of telling me I've used up my free quota and need to pay. As if I'm going to pay good money for a program that's crashed. The owners seem unable to explain the quota rules in plain English. I should have asked it to do that before it stopped working, but nobody told me there was a quota.
So another question for this thread: Why has ChatGPT gone nuts on me today?