An LLM is an LLM: a transformer model generating likely output from a dataset.
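To be concrete about what "generating likely output" means, here's a toy sketch (the vocabulary and logits are made up, not from any real model): the transformer produces a score for every token in its vocabulary, those scores become a probability distribution, and the next token is a draw from that distribution. There's no lookup into a source document anywhere in that loop.

```python
import numpy as np

# Toy illustration only: a made-up vocabulary and hard-coded logits.
# In a real transformer the logits come out of the network's final layer
# after it processes the prompt; here they're invented for the example.
vocab = ["Paris", "London", "Berlin", "the", "a"]
logits = np.array([4.0, 2.5, 2.0, 0.5, 0.1])

# Softmax turns raw scores into a probability distribution over the vocab.
probs = np.exp(logits - logits.max())
probs /= probs.sum()

# The "answer" is just a sample from that distribution: likely text,
# not a retrieved fact. Run it enough times and you'll get "London".
rng = np.random.default_rng()
next_token = rng.choice(vocab, p=probs)
print(dict(zip(vocab, probs.round(3))), "->", next_token)
```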
I hate all this analogy stuff people keep resorting to. The thing does what it does, and trying to understand it by analogy is being used disingenuously to push all sorts of misinformation-filled agendas.
It’s not about “trust”, it’s about how the output you’re being given is generated, and therefore what kinds of outputs are useful for which applications.
The answer is fairly narrow, particularly compared to how it’s being marketed. It absolutely, 100% isn’t a search engine, though. And even when it’s plugged into a search engine and acting as a summarization layer, it’s pretty terrible and very likely to distort output that anybody who has been near a computer in the past thirty years could parse faster at a glance.