@Tattooed_Mummy we are trying to replace humans with humanlike computer programs. What could possibly go wrong? Calling it now: we will soon have equivalents of babysitters, teachers and psychologists for AIs. Assuming the next steps in AI development will be memory and self-correction.
@root42 @Tattooed_Mummy those "next steps" are no closer to becoming reality today than they were 30 years ago.
@tob @root42 @Tattooed_Mummy
I am not so sure about that. 30 years ago ELIZA was about the best you could get for "therapy" chat bots; now there are things like Replika, Elomia, Mindspa, etc. We have come a long way in the last 30 years. Note that I am not saying it is good, just that we have come a long way.
@nikatjef @tob @Tattooed_Mummy My point. And memory is already on its way and will be another game changer for current models. I do think that AI will still make great strides, but I don't see the point, or rather: I see a lot of drawbacks and dangers. We totally lose track of the fact that whatever we do, we should do it for humanity, not just "because".
@root42 @nikatjef @tob @Tattooed_Mummy
But it is not intelligent. It can randomly put together responses based on training and input; nothing it says is anything but random words strung together to SOUND like a human. There is no medical or psychological knowledge in there, just random responses based on the data it has been fed, trained to sound as if it knows anything.
People who mistake RNG-driven what-if code for "it is almost sentient" have no clue.
@WhyNotZoidberg @root42 @tob @Tattooed_Mummy
And anyone who thinks an LLM is an RNG has even less of a clue.
No one is claiming they are sentient or that they understand what they are saying, but the fact is that LLM systems are out-scoring humans on various collegiate tests, and they are diagnosing diseases that medical professionals are missing.
@nikatjef @root42 @tob @Tattooed_Mummy They are unable to tell the truth. All they can do is mimic the truth, with no actual mechanism ensuring that they DO tell the truth. Relying on AI for facts is not a good idea, since they cannot determine what a fact is.
As someone else put it on here, we have burned billions of dollars, and are destroying the environment with enormous data centers burning through drinking water and electricity, to produce tech that makes computers pretend they can't do math.
@WhyNotZoidberg @nikatjef @root42 @Tattooed_Mummy Good thing history doesn't have anything to say about civilizations collapsing because of the misallocation of investment based on the whims of a disconnected and increasingly bizarre class of elites.