nifty@lemmy.world to Technology@lemmy.world · English · 1 month ago
Google AI making up recalls that didn’t happen (lemmy.world)
Shardikprime@lemmy.world · English · 1 month ago
I mean, LLMs are not meant to give exact information. Do people ever read up on the stuff they use?
mint_tamas@lemmy.world · English · 1 month ago
Theoretically, what would the utility of AI summaries in Google Search be, if not getting exact information?
Malfeasant@lemmy.world · English · 1 month ago
Steering your eyes toward ads, of course, what a silly question.
Patch@feddit.uk · English · 1 month ago
This feels like something you should go tell Google about rather than the rest of us. They’re the ones who have embedded LLM-generated answers to random search queries.