“ChatGPT-4 produces more misinformation than predecessor - NewsGuard”
This shouldn’t come as a surprise. In the research I’ve read, hallucinations are an emergent property that increases with model size.
... works as a web developer in Hveragerði, Iceland, and writes about the web, digital publishing, and web/product development. These are his notes.