Stop calling GPT AI
That’s the inaccurate name everyone’s settled on. Kinda like how “sentient” is widely used to mean “sapient” despite being two different things.
I made a smartass comment earlier comparing AI to fire, but it’s really my favorite metaphor for it - and it extends to this issue. Depending on how you define it, fire seems to meet the requirements for being alive. It tends to come up in the same conversations that question whether a virus is alive. I think it’s fair to think of LLMs (particularly the current implementations) as intelligent - just in the same way we think of fire or a virus as alive: having many of the characteristics of intelligence, but being a step removed.
That is an extremely apt parallel!
(I’m stealing it)
How is it not AI? Just because it’s not AGI doesn’t mean it’s not AI. AI encompasses a lot of things.
This article is about Gemini, not GPT. The generic term is LLM: Large Language Model.