LLMs are a decent tech if you know their limitations and actually want to use them. This shit fulfils neither condition: it's going to hallucinate wildly, and gullible muppets will swallow the hallucinations like caviar, because they don't even know what's going on.