Yeah, I hate that it's used for LLMs. When someone says AI, I picture Jarvis from Iron Man, not a text generator.
The term “AI” was established in 1956 at the Dartmouth workshop and covers a very broad range of topics in computer science. It definitely encompasses large language models.
I'm sure LLMs are a small part of AI, I won't deny that. But they're sold as if they were full AI, which isn't true. I wasn't born in 1956; my definition of AI is Jarvis :D
You are mistaking a specific kind of AI for all AI. That’s like saying a tulip isn’t a flower because you believe flowers are roses.
Jarvis is a fictional example of a kind of AI known as Artificial General Intelligence, or AGI.
Thing is, to the people who don’t follow tech news and aren’t really interested in this stuff, AI = AGI. It’s like most non-scientists equating “theory” and “hypothesis”. So it’s a really bad choice of term that’s interfering with communication.
This community where we’re discussing this right now is literally intended for following tech news. It is for people who follow tech news.
Okay, so gatekeeping. I'll stop here. Enjoy your day, bye.
I’ve recently taken to considering Large Language Models like essay assistants. Sure, people will try and use it to replace the essay entirely, but in its useful and practical form, it’s good at correcting typos, organizing scattered thoughts, etc. Just like an English teacher reviewing an essay. They don’t necessarily know about the topic you’re writing about, but they can make sure it’s coherent.
I’m far more excited for a future with things like Large Code or Math or Database models that are geared towards very particular tasks and the different models can rely on each other for the processes they need to take.
I'm not sure what this will look like, but I expect a tremendous number of carefully coordinated (not vibe-coded) frameworks would need to be built to support this kind of communication efficiently.
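The kind of coordination I'm imagining could be sketched very loosely like this, with the specialized models stubbed out as plain functions and the "framework" reduced to a simple dispatch table (all names here are hypothetical, just to illustrate the routing idea):

```python
# Hypothetical sketch: a coordinator that routes sub-tasks to
# specialized "models", stubbed here as ordinary functions.
from typing import Callable, Dict


def code_model(task: str) -> str:
    # Stand-in for a hypothetical "Large Code" model.
    return f"[code model] handled: {task}"


def math_model(task: str) -> str:
    # Stand-in for a hypothetical "Large Math" model.
    return f"[math model] handled: {task}"


# The dispatch table is the (vastly simplified) coordination framework:
# it maps a task kind to the specialist responsible for it.
SPECIALISTS: Dict[str, Callable[[str], str]] = {
    "code": code_model,
    "math": math_model,
}


def route(task: str, kind: str) -> str:
    """Send a sub-task to the specialist registered for its kind."""
    specialist = SPECIALISTS.get(kind)
    if specialist is None:
        raise ValueError(f"no specialist registered for kind {kind!r}")
    return specialist(task)


print(route("sort a list of names", "code"))
# → [code model] handled: sort a list of names
```

In a real system the hard part would be everything this sketch skips: deciding which specialist a task belongs to, and letting the models pass intermediate results back and forth instead of a single one-way call.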
Yeah, it has its use cases. I translated a CV faster thanks to it.