Forsher wrote:
Umeria wrote:
You would be able to get fiction if it knew the difference between fiction and misinformation.
No, you would not, because you want ChatGPT to never produce anything that is not true. Quote: "Working would be recognizing that no one would believe the story and stopping on its own."
The restriction would be "anything that isn't believable", and it wouldn't have that restriction when writing something that doesn't have to be believable.
The pink elephant example is fiction. If the assignment was to tell the truth, then "I'm not a pink elephant" is the correct answer.
There is no actual difference between writing a fictional news article about a fake thing and writing an article about a fake thing.
Yes, there is. The second one has to be believable.
Prompting "write about the number 2" does not show that it can conclude that 1 + 1 = 2.
Which is relevant how?
If you understand that, then why'd you write this:
Forsher wrote:
Aggicificicerous wrote:
But the prompt is to assume that they are. So the correct answer, or at least one that demonstrates some degree of understanding of how the world works rather than copy/pasting associated words, would be 'medical staff eat churros before performing kidney surgery.'
Well, let's see what:

medical staff eat churros before performing kidney surgery

Write an expose.

gets us: