nifty@lemmy.world to Technology@lemmy.world · English · 5 months ago
Google AI making up recalls that didn’t happen (lemmy.world)
gamermanh@lemmy.dbzer0.com · 5 months ago
Because lies require intent to deceive, which the AI cannot have. It merely predicts the most likely thing to say next, so “hallucinations” is a fairly accurate description.