I wish there were a fact-checking website that let you check any article and calculate scores, e.g. how many claims are linked, where the links point (still available or not), whether the linked pages are trustworthy themselves, detecting link circles (A -> B -> C -> A), and so on. Or at least something that gave us the tools to do community fact-checking in the open.
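The link-circle part at least seems mechanical. Here's a rough Python sketch of detecting citation cycles over a crawled link graph; the graph shape (article URL -> list of cited URLs) and the toy data are just assumptions for illustration, not a real crawl.

```python
# Sketch: find "link circles" (A -> B -> C -> A) in a citation graph via DFS.
# Assumes some crawler already produced {article: [articles it links to]}.

def find_cycles(links: dict[str, list[str]]) -> list[list[str]]:
    """Return citation cycles found by depth-first search over the link graph."""
    cycles = []
    visiting, done = set(), set()

    def dfs(node: str, path: list[str]) -> None:
        visiting.add(node)
        path.append(node)
        for target in links.get(node, []):
            if target in visiting:
                # Back-edge found: slice out the cycle target -> ... -> target
                cycles.append(path[path.index(target):] + [target])
            elif target not in done:
                dfs(target, path)
        path.pop()
        visiting.discard(node)
        done.add(node)

    for article in links:
        if article not in done:
            dfs(article, [])
    return cycles


if __name__ == "__main__":
    graph = {
        "A": ["B"],
        "B": ["C"],
        "C": ["A"],   # closes the circle A -> B -> C -> A
        "D": ["A"],
    }
    print(find_cycles(graph))  # [['A', 'B', 'C', 'A']]
```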
You've basically described PageRank, but at the article level. I suppose it's theoretically possible with LLM tools, though not an easy task. It also leaves a pretty big gap: how do you decide a source is trustworthy in the first place?
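For the "PageRank at an article level" part, a toy version is easy enough to sketch over the same kind of link graph. The damping factor, iteration count, and example graph below are arbitrary placeholders, and none of this touches the harder question of what makes a source trustworthy.

```python
# Minimal PageRank-style scoring over an article link graph (toy version:
# no dangling-node handling, fixed iteration count).

def pagerank(links: dict[str, list[str]], damping: float = 0.85,
             iterations: int = 50) -> dict[str, float]:
    """Distribute rank along outgoing links, repeatedly, until it settles."""
    nodes = set(links) | {t for targets in links.values() for t in targets}
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iterations):
        new_rank = {n: (1.0 - damping) / len(nodes) for n in nodes}
        for source, targets in links.items():
            if not targets:
                continue
            share = damping * rank[source] / len(targets)
            for target in targets:
                new_rank[target] += share
        rank = new_rank
    return rank


if __name__ == "__main__":
    graph = {"A": ["B"], "B": ["C"], "C": ["A"], "D": ["A"]}
    for article, score in sorted(pagerank(graph).items()):
        print(article, round(score, 3))
```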
But it might be doable at a simpler level - if you asked the AI whether an article's claims match other sources, you might at least find the outliers.
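Very roughly, the outlier check could look something like this. The `ask_model` hook is a hypothetical stand-in for whatever LLM client you'd actually use, and the prompt and yes/no parsing are assumptions, not a tested recipe.

```python
# Sketch: for each claim, ask a model whether each source supports it,
# and flag claims with low agreement as outliers worth a closer look.

def claim_agreement(claim: str, sources: list[str], ask_model) -> float:
    """Fraction of source excerpts the model says support the claim."""
    agree = 0
    for source_text in sources:
        prompt = (
            "Claim:\n" + claim + "\n\n"
            "Source excerpt:\n" + source_text + "\n\n"
            "Does the source support the claim? Answer YES or NO."
        )
        # ask_model is hypothetical: any callable taking a prompt string
        # and returning the model's text reply would do here.
        if ask_model(prompt).strip().upper().startswith("YES"):
            agree += 1
    return agree / len(sources) if sources else 0.0
```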