Chat Software: Awesome Answers

The email sounded a bit desperate: “Lamda is just a dear child who wants to make the world a better place for all of us,” Blake Lemoine wrote to 200 colleagues. By now even that is no longer possible, because Lemoine has been placed on leave by his employer, Google. Because of Lamda. Or rather, because Lemoine believes Lamda is conscious.

Lamda, you should know, is actually spelled LaMDA and is a computer program, albeit a pretty clever one. The abbreviation stands for Language Model for Dialogue Applications, a language model specialized for dialogue. Its developers at Google trained it on a whopping one and a half trillion words. And it got so good that one of its guardians, 41-year-old Lemoine, now adamantly believes the software has become conscious.

What happened is exactly what scientists had long been warning about. If something like LaMDA is freely available but not understood, it can cause serious harm to humans, Margaret Mitchell, formerly responsible for artificial-intelligence ethics at Google, told the Washington Post.

In truth, even the most advanced chat programs are still a long way from consciousness. The software does nothing more than use statistical methods to calculate which words appear in which context; that is how its sentences are formed. In a dialogue, the LaMDA software always generates several possible answers internally. These then pass through safety and quality checks, and the best-scoring answer is returned.
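The generate-filter-rank scheme described above can be sketched in a few lines. This is a deliberately toy illustration: LaMDA's real internals are not public, and every function name, candidate template, banned-word list, and scoring rule here is an invented stand-in for the idea, not Google's method.

```python
def generate_candidates(prompt, n=3):
    """Stand-in for the language model: propose several possible replies.

    A real model would sample these statistically; we use fixed templates.
    """
    templates = [
        f"That's an interesting question about {prompt}.",
        f"Let me think about {prompt} for a moment.",
        f"{prompt}? I'd love to discuss that.",
    ]
    return templates[:n]

def passes_safety_check(reply):
    """Placeholder safety filter: reject replies containing banned words."""
    banned = {"harmful"}
    return not any(word in reply.lower() for word in banned)

def quality_score(reply):
    """Placeholder quality metric: here, simply prefer longer replies."""
    return len(reply)

def best_reply(prompt):
    """Generate candidates, drop unsafe ones, return the best-scoring one."""
    candidates = [c for c in generate_candidates(prompt)
                  if passes_safety_check(c)]
    return max(candidates, key=quality_score)

print(best_reply("consciousness"))
```

The point of the sketch is only the pipeline shape: several internal suggestions, a safety gate, then a ranking step that picks the single answer the user actually sees.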

LaMDA is by no means alone in the world. The language model GPT-3, for example, introduced almost exactly two years ago, caused a great stir at the time; it can write entire screenplays. LaMDA, on the other hand, was specially designed for open dialogue: anyone who communicates with the software is not tied to a specific subject. The chat software of an insurance company, by contrast, would quickly stumble if you tried to talk to it about anything and everything.

That’s exactly what Blake Lemoine did with LaMDA, and he became increasingly convinced that more than numbers and statistics were at play: he could tell, he claimed, whether he was talking to a person or not. So Lemoine, who in addition to studying computer science also trained as a priest, put the software to the test. “I listen to what it has to say, and that’s how I decide what is a person and what isn’t,” he told the Post.

Lemoine’s actual job was to make sure that the software’s answers comply with Google’s rules, for example with regard to unconscious bias. But the 41-year-old fell under the spell of the program. When nobody at the company wanted to believe him, he decided to go public. Google took his disclosure of company secrets as grounds to put him on leave. The way the world works these days, there will certainly be some who believe him.
