• 0 Posts
  • 9 Comments
Joined 1 year ago
Cake day: June 9th, 2023


  • Your statement that there is no way of fact-checking is not 100% correct, as developers have found ways to ground LLMs, e.g., by prepending context pulled from "real-time" sources of truth (such as search engines). That data is then incorporated into the prompt as context. Admittedly this is a bit of a workaround and not baked into the LLM itself, but it can be quite accurate for many use cases.
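
    To illustrate the idea, here is a minimal sketch of that grounding pattern, not tied to any particular framework. `search_web` and `complete` are hypothetical stand-ins for whatever retrieval source and LLM API you actually use:

    ```python
    def search_web(query: str, max_results: int = 3) -> list[str]:
        """Hypothetical retrieval step: return text snippets from a
        'real-time' source of truth, e.g. a search engine API."""
        raise NotImplementedError

    def complete(prompt: str) -> str:
        """Hypothetical call to the LLM completion endpoint."""
        raise NotImplementedError

    def grounded_answer(question: str) -> str:
        # 1. Pull fresh context from outside the model.
        snippets = search_web(question)
        context = "\n".join(f"- {s}" for s in snippets)

        # 2. Prepend that context to the user's question so the model
        #    answers from the supplied facts rather than memory alone.
        prompt = (
            "Answer the question using only the context below.\n"
            f"Context:\n{context}\n\n"
            f"Question: {question}\n"
        )
        return complete(prompt)
    ```

    The key point is that the grounding happens in the prompt, outside the model's weights, which is why it can stay current even though the LLM itself is frozen.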